[[PageOutline]]
{{{
#!html
<div style="text-align: center;"><big
 style="font-weight: bold;"><big><big> Hadoop 0.20 Program Development </big></big></big></div>
<div style="text-align: center;"> <big>eclipse plugin + Makefile</big> </div>
}}}

 = 0. Preface =
 * Developing for Hadoop involves a lot of object-oriented Java (inheritance, interface classes), and the correct classpath has to be on the build path; otherwise writing Hadoop programs is just typing practice...
 * Handling code this complex in a vim-like editor can turn into a nightmare, so developing in Eclipse together with the mapreduce plugin gets far more done with far less effort.
 * If you are carrying over the system from Exercise 1, you can jump straight to '''2. Creating the Project'''.

 == 0.1 Environment ==
 * ubuntu 8.10
 * sun-java-6
 * eclipse 3.4.2
 * hadoop 0.20.2
 == 0.2 Directory Layout ==
 * User: waue
 * User home directory: /home/hadooper
 * Project directory: /home/hadooper/workspace
 * Hadoop directory: /opt/hadoop

 = 1. Installation =

Your installation does not have to match this exactly; it is only for reference. As long as Java, Hadoop, and Eclipse are installed and you know where your own paths are, you are fine.

 == 1.1. Installing Java ==

First, install the basic Java packages:

{{{
$ sudo apt-get install java-common sun-java6-bin sun-java6-jdk sun-java6-jre
}}}
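A quick check that the JDK is actually in place (a minimal sketch; the exact version string depends on your package release):

{{{
$ java -version    # should report a Sun/HotSpot 1.6.0_xx runtime
$ javac -version   # confirms the JDK, not just the JRE, is installed
}}}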
                      
 == 1.1.1. Installing sun-java6-doc ==

 1. Download the javadoc archive (jdk-6u10-docs.zip) from the
 [https://cds.sun.com/is-bin/INTERSHOP.enfinity/WFS/CDS-CDS_Developer-Site/en_US/-/USD/ViewProductDetail-Start?ProductRef=jdk-6u10-docs-oth-JPR@CDS-CDS_Developer download page]
[[Image(wiki:waue:2009:0617:1-1.png)]]
[[Image(wiki:0428Hadoop_Lab1:hadoop_administration.png)]]

 2. After downloading, put the file under /tmp/

 3. Run

{{{
$ sudo apt-get install sun-java6-doc
}}}
                      
 == 1.2. Installing and Configuring ssh ==

{{{
$ sudo apt-get install ssh
$ ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
$ ssh localhost
}}}

If `ssh localhost` logs you in without asking for a password, the setup is correct.
                      
 == 1.3. Installing Hadoop ==

Install Hadoop 0.20 under /opt/ and name the directory hadoop:

{{{
$ cd ~
$ wget http://apache.ntu.edu.tw/hadoop/core/hadoop-0.20.2/hadoop-0.20.2.tar.gz
$ tar zxvf hadoop-0.20.2.tar.gz
$ sudo mv hadoop-0.20.2 /opt/
$ sudo chown -R waue:waue /opt/hadoop-0.20.2
$ sudo ln -sf /opt/hadoop-0.20.2 /opt/hadoop
}}}

 * Edit /opt/hadoop/conf/hadoop-env.sh

{{{
#!sh
export JAVA_HOME=/usr/lib/jvm/java-6-sun
export HADOOP_HOME=/opt/hadoop
export PATH=$PATH:/opt/hadoop/bin
}}}
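Because hadoop-env.sh above also extends PATH, it has to be sourced in the current shell before the `hadoop` command becomes visible. A quick check (a sketch):

{{{
$ source /opt/hadoop/conf/hadoop-env.sh
$ echo $JAVA_HOME     # should print /usr/lib/jvm/java-6-sun
$ hadoop version      # should report Hadoop 0.20.2
}}}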
                      
 * Edit /opt/hadoop/conf/core-site.xml

{{{
#!sh
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/tmp/hadoop/hadoop-${user.name}</value>
  </property>
</configuration>
}}}

 * Edit /opt/hadoop/conf/hdfs-site.xml

{{{
#!sh
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
}}}

 * Edit /opt/hadoop/conf/mapred-site.xml

{{{
#!sh
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
}}}
                      
 * Start Hadoop
{{{
$ cd /opt/hadoop
$ source /opt/hadoop/conf/hadoop-env.sh
$ hadoop namenode -format
$ start-all.sh
$ hadoop fs -put conf input
$ hadoop fs -ls
}}}

 * If no error messages appear, everything is working; a quick way to double-check is sketched below
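A minimal sanity check, assuming the JDK's `jps` tool is on the PATH (the exact process list can vary slightly):

{{{
$ jps
# expect NameNode, DataNode, SecondaryNameNode, JobTracker and TaskTracker to be listed
$ hadoop fs -ls input
# should list the conf files that were just uploaded
}}}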
                      
 == 1.4. Installing Eclipse ==

 * Two ways to get the archive:
   * Option 1: download [http://www.eclipse.org/downloads/download.php?file=/eclipse/downloads/drops/R-3.4.2-200902111700/eclipse-SDK-3.4.2-linux-gtk.tar.gz eclipse SDK 3.4.2 Classic] and put the file in your home directory
   * Option 2: paste these commands
{{{
$ cd ~
$ wget http://ftp.cs.pu.edu.tw/pub/eclipse/eclipse/downloads/drops/R-3.4.2-200902111700/eclipse-SDK-3.4.2-linux-gtk.tar.gz
}}}

 * Once the Eclipse archive is in your home directory, run the following:

{{{
$ cd ~
$ tar -zxvf eclipse-SDK-3.4.2-linux-gtk.tar.gz
$ sudo mv eclipse /opt
$ sudo ln -sf /opt/eclipse/eclipse /usr/local/bin/
}}}
                      
 = 2. Creating the Project =

== 2.1 Installing the Hadoop Eclipse Plugin ==

 * Copy the Hadoop 0.20.2 Eclipse plugin into the Eclipse plugins directory

{{{
$ cd /opt/hadoop
$ sudo cp /opt/hadoop/contrib/eclipse-plugin/hadoop-0.20.2-eclipse-plugin.jar /opt/eclipse/plugins
}}}

{{{
$ sudo vim /opt/eclipse/eclipse.ini
}}}

 * eclipse.ini contents for reference (optional)

{{{
#!sh
-startup
plugins/org.eclipse.equinox.launcher_1.0.101.R34x_v20081125.jar
--launcher.library
plugins/org.eclipse.equinox.launcher.gtk.linux.x86_1.0.101.R34x_v20080805
-showsplash
org.eclipse.platform
--launcher.XXMaxPermSize
512m
-vmargs
-Xms40m
-Xmx512m
}}}
                      
== 2.2 Starting Eclipse ==

 * Launch Eclipse

{{{
$ eclipse &
}}}

On first start it asks where to put the workspace; here we keep the default.

[[Image(wiki:waue:2009:0617:2-1.png)]]

-------

'''Note: from here on, the steps are performed in the Eclipse GUI.'''

-------
                      
== 2.3 Selecting the Perspective ==

|| window -> || open pers.. -> || other.. -> || map/reduce||

[[Image(wiki:waue:2009:0617:win-open-other.png)]]

-------

Select the Map/Reduce perspective
[[Image(wiki:waue:2009:0617:2-2.png)]]

---------

The interface after switching to the Map/Reduce perspective
[[Image(wiki:waue:2009:0617:2-3.png)]]

--------
                      
== 2.4 Creating the Project ==

 || file ->  || new ->  || project ->  || Map/Reduce ->  || Map/Reduce Project -> ||  next ||
[[Image(wiki:waue:2009:0617:file-new-project.png)]]

--------

Creating the mapreduce project (1)

[[Image(wiki:waue:2009:0617:2-4.png)]]

-----------

Creating the mapreduce project (2)
{{{
#!sh
project name -> enter: icas (any name will do)
use default hadoop -> Configure Hadoop install directory... -> enter: "/opt/hadoop" -> ok
Finish
}}}

[[Image(wiki:waue:2009:0617:2-4-2.png)]]

--------------
                      
== 2.5 Configuring the Project ==

Since the icas project was just created, Eclipse now shows it in the left-hand pane. Right-click that folder and choose properties.

--------------

 Step1. Right-click the project and open properties for the detailed settings

[[Image(wiki:waue:2009:0617:2-5.png)]]

----------

 Step2. The project's detailed settings page

Hadoop javadoc settings (1)
[[Image(wiki:waue:2009:0617:2-5-1.png)]]

 * java Build Path -> Libraries -> hadoop-0.20.2-ant.jar
 * java Build Path -> Libraries -> hadoop-0.20.2-core.jar
 * java Build Path -> Libraries -> hadoop-0.20.2-tools.jar
   * The settings for hadoop-0.20.2-core.jar are shown below; configure the other jars the same way

{{{
#!sh
source ...-> enter: /opt/hadoop-0.20.2/src/core
javadoc ...-> enter: file:/opt/hadoop/docs/api/
}}}

------------
 Step3. Hadoop javadoc settings, after they are done (2)
[[Image(wiki:waue:2009:0617:2-5-2.png)]]

------------
 Step4. Javadoc settings for Java itself (3)

 * javadoc location -> enter: file:/usr/lib/jvm/java-6-sun/docs/api/

[[Image(wiki:waue:2009:0617:2-5-3.png)]]

-----
After the settings are done, return to the main Eclipse window
                      
== 2.6 Connecting to the Hadoop Server ==

--------
 Step1. In the "Map/Reduce Locations" tab at the bottom right (the yellow elephant icon), click the blue elephant icon to the right of the gear:
[[Image(wiki:waue:2009:0617:2-6.png)]]

-------------
 Step2. Configure the connection between Eclipse and Hadoop (2)
[[Image(wiki:waue:2009:0617:2-6-1.png)]]

{{{
#!sh
Location Name -> enter: hadoop  (any name will do)
Map/Reduce Master -> Host -> enter: localhost
Map/Reduce Master -> Port -> enter: 9001
DFS Master -> Port -> enter: 9000
Finish
}}}
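These hosts and ports have to match what was written into the configuration files in section 1.3; a quick way to re-read them from a shell (a sketch using plain grep):

{{{
$ grep -A 1 "fs.default.name" /opt/hadoop/conf/core-site.xml
$ grep -A 1 "mapred.job.tracker" /opt/hadoop/conf/mapred-site.xml
}}}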
                      
----------------

Once the location is saved, an extra blue elephant appears at the bottom, and expanding the folder on the left shows the file structure inside HDFS
[[Image(wiki:waue:2009:0617:2-6-2.png)]]
-------------
                      
 = 3. Writing the Sample Program =

 * The icas project was created in Eclipse earlier, so its directory is:
   * /home/hadooper/workspace/icas
 * Inside that directory there are two folders:
   * src : holds the program source code
   * bin : holds the compiled class files
 * Keeping sources and compiled files apart helps a lot when packaging a jar later (see the sketch right after this list)
 * Here we will write one sample program: WordCount

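A minimal sketch of that later packaging step, once the classes exist (the jar file name and the output directory below are arbitrary choices for illustration, not something the project requires):

{{{
$ cd ~/workspace/icas
$ jar -cvf wordcount.jar -C bin/ .
$ hadoop jar wordcount.jar Sample.WordCount input wc-output
}}}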
                      
 == 3.1 mapper.java ==

 1. new

 || File ->  || new ->  || mapper ||
[[Image(wiki:waue:2009:0617:file-new-mapper.png)]]

-----------

 2. create

[[Image(wiki:waue:2009:0617:3-1.png)]]
{{{
#!sh
source folder -> enter: icas/src
Package : Sample
Name -> : mapper
}}}
----------
                      
 3. modify

{{{
#!java
package Sample;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class mapper extends Mapper<Object, Text, Text, IntWritable> {

        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context)
                        throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                        word.set(itr.nextToken());
                        context.write(word, one);
                }
        }
}
}}}

After mapper.java has been created, paste in the code above
[[Image(wiki:waue:2009:0617:3-2.png)]]

------------
                      
== 3.2 reducer.java ==

 1. new

 * File -> new -> reducer
[[Image(wiki:waue:2009:0617:file-new-reducer.png)]]

-------
 2. create
[[Image(wiki:waue:2009:0617:3-3.png)]]

{{{
#!sh
source folder -> enter: icas/src
Package : Sample
Name -> : reducer
}}}

-----------
                      
 3. modify

{{{
#!java
package Sample;

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class reducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                        throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable val : values) {
                        sum += val.get();
                }
                result.set(sum);
                context.write(key, result);
        }
}
}}}

 * File -> new -> Map/Reduce Driver
[[Image(wiki:waue:2009:0617:file-new-mr-driver.png)]]
----------
                      
== 3.3 WordCount.java (main function) ==

 1. new

Create WordCount.java; this file drives the mapper and the reducer, so choose Map/Reduce Driver
[[Image(wiki:waue:2009:0617:3-4.png)]]
------------

 2. create

{{{
#!sh
source folder -> enter: icas/src
Package : Sample
Name -> : WordCount
}}}

-------
                      
 3. modify

{{{
#!java
package Sample;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class WordCount {

        public static void main(String[] args) throws Exception {
                Configuration conf = new Configuration();
                String[] otherArgs = new GenericOptionsParser(conf, args)
                                .getRemainingArgs();
                if (otherArgs.length != 2) {
                        System.err.println("Usage: wordcount <in> <out>");
                        System.exit(2);
                }
                Job job = new Job(conf, "word count");
                job.setJarByClass(WordCount.class);
                job.setMapperClass(mapper.class);

                job.setCombinerClass(reducer.class);
                job.setReducerClass(reducer.class);
                job.setOutputKeyClass(Text.class);
                job.setOutputValueClass(IntWritable.class);
                FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
                FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
                System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
}
}}}

Once all three files are written and saved, the program is complete
[[Image(wiki:waue:2009:0617:3-5.png)]]

-------
                      
 * With all three files saved, both src and bin under the icas project now contain files; check from the command line:

{{{
$ cd workspace/icas
$ ls src/Sample/
mapper.java  reducer.java  WordCount.java
$ ls bin/Sample/
mapper.class  reducer.class  WordCount.class
}}}
                      
 = 4. Testing the Sample Program =

 * Right-click WordCount.java -> run as -> run on Hadoop (checking the result from the shell is sketched below)

[[Image(wiki:waue:2009:0617:run-on-hadoop.png)]]

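The driver takes two arguments, <in> and <out>. Assuming the run configuration passed the input directory uploaded in section 1.3 and an output directory named output, the result can be inspected from the shell once the job finishes (a sketch; part-r-00000 is the usual file name when there is a single reducer):

{{{
$ hadoop fs -ls output
$ hadoop fs -cat output/part-r-00000 | head
}}}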
                      
= 5. Conclusion =

 * Paired with Eclipse, we can develop for Hadoop much more efficiently
 * Hadoop 0.20 changed both the API and the configuration compared with earlier versions, so for setting up the Hadoop environment read the [http://hadoop.apache.org/core/docs/r0.20.2/quickstart.html hadoop 0.20 quickstart]; for how to use the [http://hadoop.apache.org/core/docs/r0.20.2/api/ hadoop 0.20] API, the code under /opt/hadoop/src/examples/ is a good source of first ideas
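For experimenting with that example code, the pre-built examples jar shipped with the release can also be run directly (a sketch; the output directory name is arbitrary and must not already exist):

{{{
$ cd /opt/hadoop
$ hadoop jar hadoop-0.20.2-examples.jar wordcount input example-output
$ hadoop fs -cat example-output/part-r-00000 | head
}}}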