[[PageOutline]]

◢ <[wiki:III121201/Lab3 實作三]> | <[wiki:III121201 回課程大綱]> ▲ | <[wiki:III121201/Lab5 實作五]> ◣

= 實作四 Lab 4 =

{{{
#!html
<div style="text-align: center;"><big style="font-weight: bold;"><big>HDFS 叢集環境操作練習<br/>HDFS full distributed mode in practice</big></big></div>
}}}

{{{
#!text
以下練習,請連線至 hadoop.classcloud.org 操作。底下的 hXXXX 等於您的用戶名稱。
For the following exercises, please log in to hadoop.classcloud.org. In the examples below, hXXXX stands for your username.
}}}

* https://hadoop.classcloud.org
== Content 1: HDFS Shell 基本操作 ==
== Content 1: Basic HDFS Shell Commands ==

=== 1.1 瀏覽你的 HDFS 目錄 ===
=== 1.1 Browsing Your HDFS Folder ===

{{{
~$ hadoop fs -ls
Found 1 items
drwxr-xr-x - hXXXX supergroup 0 2010-01-24 17:23 /user/hXXXX/tmp
~$ hadoop fs -lsr
drwxr-xr-x - hXXXX supergroup 0 2010-01-24 17:23 /user/hXXXX/tmp
}}}
| 32 | |
=== 1.2 上傳資料到 HDFS 目錄 ===
=== 1.2 Upload Files or Folder to HDFS ===

* 上傳 Upload

{{{
~$ hadoop fs -put /etc/hadoop/conf input
}}}

* 檢查 Check

{{{
~$ hadoop fs -ls
Found 2 items
drwxr-xr-x - hXXXX supergroup 0 2011-04-19 09:16 /user/hXXXX/input
drwxr-xr-x - hXXXX supergroup 0 2010-01-24 17:23 /user/hXXXX/tmp
~$ hadoop fs -ls input
Found 25 items
-rw-r--r-- 2 hXXXX supergroup 321 2011-04-19 09:16 /user/hXXXX/input/README
-rw-r--r-- 2 hXXXX supergroup 3936 2011-04-19 09:16 /user/hXXXX/input/capacity-scheduler.xml
-rw-r--r-- 2 hXXXX supergroup 196 2011-04-19 09:16 /user/hXXXX/input/commons-logging.properties
(.... skip ....)
}}}
| 56 | |
=== 1.3 下載 HDFS 的資料到本地目錄 ===
=== 1.3 Download HDFS Files or Folder to Local ===

* 下載 Download

{{{
~$ hadoop fs -get input fromHDFS
}}}

* 檢查 Check

{{{
~$ ls -al | grep fromHDFS
drwxr-xr-x 2 hXXXX hXXXX 4096 2011-04-19 09:18 fromHDFS
~$ ls -al fromHDFS
總計 160
drwxr-xr-x 2 hXXXX hXXXX 4096 2011-04-19 09:18 .
drwx--x--x 3 hXXXX hXXXX 4096 2011-04-19 09:18 ..
-rw-r--r-- 1 hXXXX hXXXX 3936 2011-04-19 09:18 capacity-scheduler.xml
-rw-r--r-- 1 hXXXX hXXXX 196 2011-04-19 09:18 commons-logging.properties
-rw-r--r-- 1 hXXXX hXXXX 535 2011-04-19 09:18 configuration.xsl
(.... skip ....)
~$ diff /etc/hadoop/conf fromHDFS/
}}}
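The trailing `diff` above prints nothing, and that silence is the success signal: `diff` exits with status 0 and produces no output when both sides are identical. A minimal local sketch of the same check (hypothetical directory names, no cluster needed):

```shell
# Build two identical directory trees, then compare them recursively.
mkdir -p dirA dirB
echo "same content" > dirA/f
echo "same content" > dirB/f

# No output plus exit status 0 means the two trees match exactly.
diff -r dirA dirB
echo $?   # prints: 0
```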
| 80 | |
=== 1.4 刪除檔案 ===
=== 1.4 Remove Files or Folder ===

{{{
~$ hadoop fs -ls input/masters
Found 1 items
-rw-r--r-- 2 hXXXX supergroup 10 2011-04-19 09:16 /user/hXXXX/input/masters
~$ hadoop fs -rm input/masters
Deleted hdfs://hadoop.nchc.org.tw/user/hXXXX/input/masters
}}}

=== 1.5 直接看檔案 ===
=== 1.5 Browse Files Directly ===

{{{
~$ hadoop fs -ls input/slaves
Found 1 items
-rw-r--r-- 2 hXXXX supergroup 10 2011-04-19 09:16 /user/hXXXX/input/slaves
~$ hadoop fs -cat input/slaves
localhost
}}}
| 102 | |
=== 1.6 更多指令操作 ===
=== 1.6 More Commands -- Help Message ===

{{{
hXXXX@hadoop:~$ hadoop fs

Usage: java FsShell
          [-ls <path>]
          [-lsr <path>]
          [-du <path>]
          [-dus <path>]
          [-count[-q] <path>]
          [-mv <src> <dst>]
          [-cp <src> <dst>]
          [-rm <path>]
          [-rmr <path>]
          [-expunge]
          [-put <localsrc> ... <dst>]
          [-copyFromLocal <localsrc> ... <dst>]
          [-moveFromLocal <localsrc> ... <dst>]
          [-get [-ignoreCrc] [-crc] <src> <localdst>]
          [-getmerge <src> <localdst> [addnl]]
          [-cat <src>]
          [-text <src>]
          [-copyToLocal [-ignoreCrc] [-crc] <src> <localdst>]
          [-moveToLocal [-crc] <src> <localdst>]
          [-mkdir <path>]
          [-setrep [-R] [-w] <rep> <path/file>]
          [-touchz <path>]
          [-test -[ezd] <path>]
          [-stat [format] <path>]
          [-tail [-f] <file>]
          [-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
          [-chown [-R] [OWNER][:[GROUP]] PATH...]
          [-chgrp [-R] GROUP PATH...]
          [-help [cmd]]

Generic options supported are
-conf <configuration file> specify an application configuration file
-D <property=value> use value for given property
-fs <local|namenode:port> specify a namenode
-jt <local|jobtracker:port> specify a job tracker
-files <comma separated list of files> specify comma separated files to be copied to the map reduce cluster
-libjars <comma separated list of jars> specify comma separated jar files to include in the classpath.
-archives <comma separated list of archives> specify comma separated archives to be unarchived on the compute machines.
The general command line syntax is
hadoop command [genericOptions] [commandOptions]
}}}
| 151 | |
== Content 2: 使用網頁 GUI 瀏覽資訊 ==
== Content 2: Use Web GUI to Browse HDFS ==

* [http://hadoop.nchc.org.tw:50030 JobTracker Web Interface]
* [http://hadoop.nchc.org.tw:50070 NameNode Web Interface]
| 157 | |
== Content 3: 更多 HDFS Shell 的用法 ==
== Content 3: More about HDFS Shell ==

* hadoop fs <args>,下面則列出 <args> 的用法[[BR]]The following are examples of hadoop fs related commands.
* 以下操作預設的目錄在 /user/<$username>/ 下[[BR]]By default, your working directory is /user/<$username>/.
{{{
$ hadoop fs -ls input
Found 25 items
-rw-r--r-- 2 hXXXX supergroup 321 2011-04-19 09:16 /user/hXXXX/input/README
-rw-r--r-- 2 hXXXX supergroup 3936 2011-04-19 09:16 /user/hXXXX/input/capacity-scheduler.xml
-rw-r--r-- 2 hXXXX supergroup 196 2011-04-19 09:16 /user/hXXXX/input/commons-logging.properties
(.... skip ....)
}}}
* 完整的路徑則是 '''hdfs://node:port/path''' 如:[[BR]]Alternatively, you can give an __''absolute path''__ such as '''hdfs://node:port/path''':
{{{
$ hadoop fs -ls hdfs://hadoop.nchc.org.tw/user/hXXXX/input
Found 25 items
-rw-r--r-- 2 hXXXX supergroup 321 2011-04-19 09:16 /user/hXXXX/input/README
-rw-r--r-- 2 hXXXX supergroup 3936 2011-04-19 09:16 /user/hXXXX/input/capacity-scheduler.xml
-rw-r--r-- 2 hXXXX supergroup 196 2011-04-19 09:16 /user/hXXXX/input/commons-logging.properties
(.... skip ....)
}}}
| 180 | |
=== -cat ===

* 將路徑指定文件的內容輸出到 STDOUT [[BR]] Print the content of the given file to STDOUT
{{{
$ hadoop fs -cat input/hadoop-env.sh
}}}
| 187 | |
=== -chgrp ===

* 改變文件所屬的組 [[BR]] Change the '''group''' of a given file or folder
{{{
$ hadoop fs -ls
Found 2 items
drwxr-xr-x - hXXXX supergroup 0 2011-04-19 09:16 /user/hXXXX/input
drwxr-xr-x - hXXXX supergroup 0 2010-01-24 17:23 /user/hXXXX/tmp
$ hadoop fs -chgrp -R ${USER} input
$ hadoop fs -ls
Found 2 items
drwxr-xr-x - hXXXX hXXXX 0 2011-04-19 09:21 /user/hXXXX/input
drwxr-xr-x - hXXXX supergroup 0 2010-01-24 17:23 /user/hXXXX/tmp
}}}
| 202 | |
=== -chmod ===

* 改變文件的權限 [[BR]] Change the '''read and write permissions''' of a given file or folder
{{{
$ hadoop fs -ls
Found 2 items
drwxr-xr-x - hXXXX hXXXX 0 2011-04-19 09:21 /user/hXXXX/input
drwxr-xr-x - hXXXX supergroup 0 2010-01-24 17:23 /user/hXXXX/tmp
$ hadoop fs -chmod -R 777 input
$ hadoop fs -ls
Found 2 items
drwxrwxrwx - hXXXX hXXXX 0 2011-04-19 09:21 /user/hXXXX/input
drwxr-xr-x - hXXXX supergroup 0 2010-01-24 17:23 /user/hXXXX/tmp
}}}
| 217 | |
=== -chown ===

* 改變文件的擁有者 [[BR]] Change the '''owner''' of a given file or folder
{{{
$ hadoop fs -chown -R ${USER} input
}}}
* 注意:因為在 hadoop.nchc.org.tw 上您沒有管理者權限,因此若要改成其他使用者時,會看到類似以下的錯誤訊息:
* Note: Since you don't have superuser permission on hadoop.nchc.org.tw, changing the owner to another user produces an error message like the following:
{{{
$ hadoop fs -chown -R h1000 input
chown: changing ownership of 'hdfs://hadoop.nchc.org.tw/user/hXXXX/input':org.apache.hadoop.security.AccessControlException: Non-super user cannot change owner.
}}}
| 230 | |
=== -copyFromLocal, -put ===

* 從 local 放檔案到 HDFS [[BR]] Both commands copy a given file or folder from the local filesystem to HDFS
{{{
$ hadoop fs -copyFromLocal /etc/hadoop/conf dfs_input
}}}

=== -copyToLocal, -get ===

* 把 HDFS 上的檔案下載到 local [[BR]] Both commands copy a given file or folder from HDFS to the local filesystem
{{{
$ hadoop fs -copyToLocal dfs_input input1
}}}

=== -cp ===

* 將文件從 hdfs 原本路徑複製到 hdfs 目標路徑 [[BR]] Copy a given file or folder from an HDFS source path to an HDFS target path
{{{
$ hadoop fs -cp input input1
}}}
| 251 | |
=== -du ===

* 顯示目錄中所有文件的大小 [[BR]] Display the size of each file in the given folder
{{{
$ hadoop fs -du input
Found 24 items
321 hdfs://hadoop.nchc.org.tw/user/hXXXX/input/README
3936 hdfs://hadoop.nchc.org.tw/user/hXXXX/input/capacity-scheduler.xml
196 hdfs://hadoop.nchc.org.tw/user/hXXXX/input/commons-logging.properties
( .... skip .... )
}}}

=== -dus ===

* 顯示該目錄/文件的總大小 [[BR]] Display the total size of the given folder or file
{{{
$ hadoop fs -dus input
hdfs://hadoop.nchc.org.tw/user/hXXXX/input 84218
}}}

=== -expunge ===

* 清空垃圾桶 [[BR]] Empty the trash
{{{
$ hadoop fs -expunge
}}}
| 278 | |
=== -getmerge ===

* 將來源目錄 <src> 下所有的文件都集合到本地端一個 <localdst> 檔案內 [[BR]] Merge all files in the HDFS source folder <src> into one local file <localdst>
{{{
$ hadoop fs -getmerge <src> <localdst>
}}}
{{{
$ mkdir -p in1
$ echo "this is one; " >> in1/input
$ echo "this is two; " >> in1/input2
$ hadoop fs -put in1 in1
$ hadoop fs -getmerge in1 merge.txt
$ cat ./merge.txt
}}}
* 您應該會看到類似底下的結果:[[BR]]You should see results like this:
{{{
this is one; 
this is two; 
}}}
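Conceptually, `-getmerge` is a distributed `cat`: it concatenates every file under the HDFS source folder into one local file. The same merge.txt content can be reproduced with plain local shell (no cluster needed), recreating the two demo files from above:

```shell
# Recreate the two small input files used in the example above.
mkdir -p in1
echo "this is one; " > in1/input
echo "this is two; " > in1/input2

# Locally, concatenating the folder's files mirrors what -getmerge does on HDFS.
cat in1/input in1/input2 > merge.txt
cat merge.txt
```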
| 298 | |
=== -ls ===

* 列出文件或目錄的資訊 [[BR]] List files and folders
* 文件名 <副本數> 文件大小 修改日期 修改時間 權限 用戶ID 組ID [[BR]] <file name> <replication> <size> <modified date> <modified time> <permission> <user id> <group id>
* 目錄名 <dir> 修改日期 修改時間 權限 用戶ID 組ID [[BR]] <folder name> <modified date> <modified time> <permission> <user id> <group id>
{{{
$ hadoop fs -ls
Found 5 items
drwxr-xr-x - hXXXX supergroup 0 2011-04-19 09:32 /user/hXXXX/dfs_input
drwxr-xr-x - hXXXX supergroup 0 2011-04-19 09:34 /user/hXXXX/in1
drwxrwxrwx - hXXXX hXXXX 0 2011-04-19 09:21 /user/hXXXX/input
drwxr-xr-x - hXXXX supergroup 0 2011-04-19 09:33 /user/hXXXX/input1
drwxr-xr-x - hXXXX supergroup 0 2010-01-24 17:23 /user/hXXXX/tmp
}}}

=== -lsr ===

* ls 命令的遞迴版本 [[BR]] The recursive version of the ls command
{{{
$ hadoop fs -lsr in1
-rw-r--r-- 2 hXXXX supergroup 14 2011-04-19 09:34 /user/hXXXX/in1/input
-rw-r--r-- 2 hXXXX supergroup 14 2011-04-19 09:34 /user/hXXXX/in1/input2
}}}
| 322 | |
=== -mkdir ===

* 建立資料夾 [[BR]] Create directories
{{{
$ hadoop fs -mkdir a b c
}}}

=== -moveFromLocal ===

* 將 local 端的資料夾剪下移動到 hdfs 上 [[BR]] Move local files or folders to HDFS (the local copies are deleted)
{{{
$ hadoop fs -moveFromLocal in1 in2
}}}

=== -mv ===

* 更改資料的名稱 [[BR]] Move or rename files and folders
{{{
$ hadoop fs -mv in2 in3
}}}
| 343 | |
=== -rm ===

* 刪除指定的檔案(不可資料夾)[[BR]] Remove given files (not folders)
{{{
$ hadoop fs -rm in1/input
Deleted hdfs://hadoop.nchc.org.tw/user/hXXXX/in1/input
}}}

=== -rmr ===

* 遞迴刪除資料夾(包含在內的所有檔案) [[BR]] Recursively remove given files and folders
{{{
$ hadoop fs -rmr a b c dfs_input in3 input input1
Deleted hdfs://hadoop.nchc.org.tw/user/hXXXX/a
Deleted hdfs://hadoop.nchc.org.tw/user/hXXXX/b
Deleted hdfs://hadoop.nchc.org.tw/user/hXXXX/c
Deleted hdfs://hadoop.nchc.org.tw/user/hXXXX/dfs_input
Deleted hdfs://hadoop.nchc.org.tw/user/hXXXX/in3
Deleted hdfs://hadoop.nchc.org.tw/user/hXXXX/input
Deleted hdfs://hadoop.nchc.org.tw/user/hXXXX/input1
}}}
| 364 | |
=== -setrep ===

* 設定副本係數 [[BR]] Set the replication factor of given files or folders
{{{
$ hadoop fs -setrep [-R] [-w] <rep> <path/file>
}}}
{{{
$ hadoop fs -setrep -w 2 -R in1
Replication 2 set: hdfs://hadoop.nchc.org.tw/user/hXXXX/in1/input2
Waiting for hdfs://hadoop.nchc.org.tw/user/hXXXX/in1/input2 ... done
}}}
| 376 | |
=== -stat ===

* 印出時間資訊 [[BR]] Print the modification time of a given file or folder
{{{
$ hadoop fs -stat in1
2011-04-19 09:34:49
}}}

=== -tail ===

* 將文件的最後 1K 內容輸出 [[BR]] Display the last 1KB of the given file
* 用法 Usage
{{{
hadoop fs -tail [-f] 檔案 (-f 參數會持續顯示被 append 上的新內容)
hadoop fs -tail [-f] <path/file> (-f keeps following the file and prints newly appended content)
}}}
{{{
$ hadoop fs -tail in1/input2
this is two; 
}}}
| 396 | |
=== -test ===

* 測試檔案,-e 檢查文件是否存在(0=存在, 1=否),-z 檢查文件是否為空(0=空, 1=不為空),-d 檢查是否為目錄(0=是目錄, 1=否) [[BR]] test files or folders [[BR]] -e : check if the file or folder exists ( 0 = exists , 1 = not ) [[BR]] -z : check if the file is empty ( 0 = empty , 1 = not ) [[BR]] -d : check if the given path is a folder ( 0 = folder , 1 = not )
* 要用 echo $? 來看回傳值為 0 or 1 [[BR]] Use '''echo $?''' to read the return value
* 用法 Usage
{{{
$ hadoop fs -test -[ezd] URI
}}}

{{{
$ hadoop fs -test -e in1/input2
$ echo $?
0
$ hadoop fs -test -z in1/input3
$ echo $?
1
$ hadoop fs -test -d in1/input2
$ echo $?
1
}}}
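These return values follow the standard shell convention: exit status 0 means the test passed, non-zero means it failed. The local POSIX `test` command behaves the same way, which makes for a quick cluster-free comparison (hypothetical local file names):

```shell
touch exists.txt                  # create a local file that exists

test -e exists.txt;   echo $?     # prints 0: the file exists
test -e no_such_file; echo $?     # prints 1: the file does not exist
test -d exists.txt;   echo $?     # prints 1: it is a regular file, not a directory
```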
| 417 | |
=== -text ===

* 將檔案(如壓縮檔、TextRecordInputStream)輸出為純文字格式 [[BR]] Output a given file (e.g. a compressed file or TextRecordInputStream) as plain text to STDOUT
{{{
$ hadoop fs -text <src>
}}}
{{{
$ gzip merge.txt
$ hadoop fs -put merge.txt.gz .
$ hadoop fs -text merge.txt.gz
11/04/19 09:54:16 INFO util.NativeCodeLoader: Loaded the native-hadoop library
11/04/19 09:54:16 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
this is one; 
this is two; 
}}}
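`-text` chooses a decompression codec from the file extension, so the `.gz` file above is run through the gzip codec before printing. The decompression step itself can be sketched locally with `gzip`/`gunzip` (hypothetical demo file name, no cluster needed):

```shell
# Compress a small file, then stream it back to STDOUT decompressed --
# the same transformation `hadoop fs -text file.gz` applies on the HDFS side.
echo "this is one; " > demo.txt
gzip -f demo.txt            # replaces demo.txt with demo.txt.gz
gunzip -c demo.txt.gz       # prints: this is one;
```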
* ps : 目前沒有支援 zip 的函式庫 [[BR]] Note: zip files are not supported yet, so -text on a zip archive prints raw bytes:
{{{
$ gunzip merge.txt.gz
$ zip merge.zip merge.txt
$ hadoop fs -put merge.zip .
$ hadoop fs -text merge.zip
PK�N�>E73 merge.txtUT ���Mq��Mux
��+��,V���Tk�(��<�PK�N�>E73 ��merge.txtUT���Mux
��PKOY
}}}
| 443 | |
=== -touchz ===

* 建立一個空文件 [[BR]] Create an empty file
{{{
$ hadoop fs -touchz in1/kk
$ hadoop fs -test -z in1/kk
$ echo $?
0
}}}
| 453 | |
----

* 您可以用以下指令把以上練習產生的暫存目錄與檔案清除:[[BR]]You can clean up the temporary folders and files with the following commands:
{{{
~$ hadoop fs -rmr in1 merge.txt.gz merge.zip
~$ rm -rf input1/ fromHDFS/ merge.zip
}}}