[[PageOutline]]
{{{
#!html
<div style="text-align: center;"><big
style="font-weight: bold;"><big><big> Hadoop 0.20 Program Development </big></big></big></div>
<div style="text-align: center;"> <big>eclipse plugin + Makefile</big> </div>
}}}
= 0. Preface =
 * Developing for Hadoop involves a good deal of object-oriented syntax, including inheritance and interface classes, and the correct classpath must be imported; without these, writing Hadoop programs amounts to typing practice... (see the sketch after this list)
 * Working on code this complex in a vim-style editor can turn into a nightmare, so developing in Eclipse together with the mapreduce plugin gives much better mileage.
 * Back in the Hadoop 0.16~0.19 days the author tried the various plugins, and every plugin version had problems large and small, e.g. the hadoop plugin failing to work at all. The most stable combination tested so far is '''Eclipse 3.3.2 + the Hadoop 0.18~0.20 plugin'''.
 * Note: Eclipse 3.4~3.5 with the hadoop-eclipse-plugin (on the Ubuntu platform) cannot "run on hadoop"!
 * Last Update: 2010/09/24

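 * To make that concrete, here is a minimal sketch (added for illustration; the class name MyMapper is a placeholder, not tutorial code) of the inheritance a 0.20-era mapper uses:

{{{
#!java
// A mapper subclasses the generic Mapper base class from the 0.20
// org.apache.hadoop.mapreduce API; getting these types and the
// classpath right is exactly what the Eclipse plugin helps with.
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MyMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
  @Override
  public void map(LongWritable key, Text value, Context context)
      throws java.io.IOException, InterruptedException {
    // Emit one (line, 1) pair per input record; a real mapper does more.
    context.write(new Text(value.toString()), new IntWritable(1));
  }
}
}}}
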
== 0.1 Environment ==
 * Ubuntu 10.04
 * sun-java-6
 * Eclipse 3.3.2
 * hadoop 0.20.2
== 0.2 Directory Layout ==
 * User: waue
 * User home directory: /home/hadooper
 * Project directory: /home/hadooper/workspace
 * Hadoop directory: /opt/hadoop
= 1. Installation =

The installation does not have to match this exactly; it is provided for reference only. As long as java, hadoop, and eclipse are installed and you know your own paths, you are fine.

== 1.1. Installing java ==

First install the basic java packages:

{{{
$ sudo apt-get install java-common sun-java6-bin sun-java6-jdk sun-java6-jre
}}}

=== 1.1.1. Installing sun-java6-doc ===

 1. Download the javadoc archive (jdk-6u10-docs.zip) from the [https://cds.sun.com/is-bin/INTERSHOP.enfinity/WFS/CDS-CDS_Developer-Site/en_US/-/USD/ViewProductDetail-Start?ProductRef=jdk-6u10-docs-oth-JPR@CDS-CDS_Developer download page]
[[Image(wiki:waue/2009/0617:1-1.png)]]

 2. After downloading, put the file under /tmp/

 3. Run:

{{{
$ sudo apt-get install sun-java6-doc
}}}

== 1.2. Installing and configuring ssh ==

{{{
$ sudo apt-get install ssh
$ ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
$ ssh localhost
}}}

If running ssh localhost does not prompt for a password, the setup is correct.

== 1.3. Installing hadoop ==

Install hadoop 0.20 under /opt/ and name the directory hadoop:

{{{
$ cd ~
$ wget http://apache.ntu.edu.tw/hadoop/core/hadoop-0.20.2/hadoop-0.20.2.tar.gz
$ tar zxvf hadoop-0.20.2.tar.gz
$ sudo mv hadoop-0.20.2 /opt/
$ sudo chown -R waue:waue /opt/hadoop-0.20.2
$ sudo ln -sf /opt/hadoop-0.20.2 /opt/hadoop
}}}

 * Edit /opt/hadoop/conf/hadoop-env.sh

{{{
#!sh
export JAVA_HOME=/usr/lib/jvm/java-6-sun
export HADOOP_HOME=/opt/hadoop
export PATH=$PATH:/opt/hadoop/bin
}}}

 * Edit /opt/hadoop/conf/core-site.xml

{{{
#!xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/tmp/hadoop/hadoop-${user.name}</value>
  </property>
</configuration>
}}}

 * Edit /opt/hadoop/conf/mapred-site.xml

{{{
#!xml
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
}}}

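 * These values are what every Hadoop client program reads at runtime. As a minimal sketch (added for illustration; it assumes the conf directory above is on the classpath, and the class name ShowConf is hypothetical), a Java program can inspect them like this:

{{{
#!java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

// Hypothetical helper: prints the cluster settings configured above.
public class ShowConf {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();  // loads core-site.xml etc.
    // fs.default.name tells FileSystem.get() which namenode to contact.
    System.out.println("fs.default.name    = " + conf.get("fs.default.name"));
    // mapred.job.tracker tells job clients where to submit jobs.
    System.out.println("mapred.job.tracker = " + conf.get("mapred.job.tracker"));
    FileSystem fs = FileSystem.get(conf);      // hdfs://localhost:9000
    System.out.println("working dir        = " + fs.getWorkingDirectory());
  }
}
}}}
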
 * Start hadoop
{{{
$ cd /opt/hadoop
$ source /opt/hadoop/conf/hadoop-env.sh
$ hadoop namenode -format
$ start-all.sh
$ hadoop fs -put conf input
$ hadoop fs -ls
}}}

 * If no error messages appear, everything is in order.

== 1.4. Installing eclipse ==

 * Download the Eclipse archive to your home directory, then run the following commands:

{{{
$ cd ~
$ wget http://secuse.nchc.org.tw/class/eclipse-SDK-3.3.2-linux-gtk.tar.gz
$ tar -zxvf eclipse-SDK-3.3.2-linux-gtk.tar.gz
$ sudo mv eclipse /opt
$ sudo ln -sf /opt/eclipse/eclipse /usr/local/bin/
}}}

= 2. Creating a Project =

== 2.1 Installing the hadoop eclipse plugin ==

 * Copy the hadoop 0.20.2 eclipse plugin into eclipse:

{{{
$ cd /opt/hadoop
$ sudo cp /opt/hadoop/contrib/eclipse-plugin/hadoop-0.20.2-eclipse-plugin.jar /opt/eclipse/plugins
}}}

 * Then edit eclipse.ini:

{{{
$ sudo vim /opt/eclipse/eclipse.ini
}}}

 * eclipse.ini contents for reference (optional):

{{{
-showsplash
org.eclipse.platform
-vmargs
-Xms80m
-Xmx512m
}}}

== 2.2 Starting eclipse ==

 * Launch eclipse:

{{{
$ eclipse &
}}}

On first start it asks where to put the workspace; here we use the default.

[[Image(wiki:waue/2009/0617:2-1.png)]]

-------

'''PS: the steps from here on are performed in the eclipse interface'''

-------

== 2.3 Selecting the Perspective ==

|| Window -> || Open Perspective -> || Other... -> || Map/Reduce ||

[[Image(wiki:waue/2009/0617:win-open-other.png)]]

-------

Selecting the Map/Reduce perspective
[[Image(wiki:waue/2009/0617:2-2.png)]]

---------

The interface after switching to the Map/Reduce perspective
[[Image(wiki:waue/2009/0617:2-3.png)]]

--------

== 2.4 Creating the Project ==

|| File -> || New -> || Project -> || Map/Reduce -> || Map/Reduce Project -> || Next ||
[[Image(wiki:waue/2009/0617:file-new-project.png)]]

--------

Creating the mapreduce project (1)

[[Image(wiki:waue/2009/0617:2-4.png)]]

-----------

Creating the mapreduce project (2)
{{{
#!sh
Project name -> enter: icas (any name will do)
Use default hadoop -> Configure Hadoop install... -> enter: "/opt/hadoop" -> OK
Finish
}}}

[[Image(wiki:waue/2009/0617:2-4-2.png)]]


--------------

== 2.5 Configuring the Project ==

The icas project we just created now appears in the left-hand pane. Right-click the project folder and choose Properties.

--------------

Step 1. Right-click the project and open Properties for the detailed settings

[[Image(wiki:waue/2009/0617:2-5.png)]]

----------

Step 2. The project's detailed settings page

Setting up the hadoop javadoc (1)
[[Image(wiki:waue/2009/0617:2-5-1.png)]]

 * Java Build Path -> Libraries -> hadoop-0.20.2-ant.jar
 * Java Build Path -> Libraries -> hadoop-0.20.2-core.jar
 * Java Build Path -> Libraries -> hadoop-0.20.2-tools.jar
 * The settings for hadoop-0.20.2-core.jar are shown below; the others follow the same pattern:

{{{
#!sh
source ...  -> enter: /opt/hadoop-0.20.2/src/
javadoc ... -> enter: file:/opt/hadoop/docs/api/
}}}

------------
Step 3. After the hadoop javadoc is set up (2)
[[Image(wiki:waue/2009/0617:2-5-2.png)]]

------------
Step 4. Setting up the javadoc for java itself (3)

 * Javadoc location -> enter: file:/usr/lib/jvm/java-6-sun/docs/api/

[[Image(wiki:waue/2009/0617:2-5-3.png)]]

-----
When done, return to the main eclipse window.


== 2.6 Connecting to the hadoop server ==

--------
Step 1. In the "Map/Reduce Locations" tab at the bottom right (yellow elephant icon), click the blue elephant icon next to the gear:
[[Image(wiki:waue/2009/0617:2-6.png)]]

-------------
Step 2. Configure the connection between eclipse and hadoop (2)
[[Image(wiki:waue/2009/0617:2-6-1.png)]]

{{{
#!sh
Location Name -> enter: hadoop (any name will do)
Map/Reduce Master -> Host -> enter: localhost
Map/Reduce Master -> Port -> enter: 9001
DFS Master -> Port -> enter: 9000
Finish
}}}

These values must match the mapred.job.tracker (localhost:9001) and fs.default.name (hdfs://localhost:9000) settings from section 1.3.
----------------

Once configured, a blue elephant appears below, and expanding the folder on the left shows the file structure inside hdfs
[[Image(wiki:waue/2009/0617:2-6-2.png)]]
-------------

= 3. Writing the Example Program =

 * The icas project opened earlier in eclipse lives in:
   * /home/hadooper/workspace/icas
 * This directory contains two folders:
   * src : holds the program source code
   * bin : holds the compiled class files
 * Keeping sources and compiled files separate like this helps a lot when producing the jar file later
 * Here we write an example program: WordCount (a sketch follows below)

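 * Below is a minimal WordCount sketch following the standard Hadoop 0.20 mapreduce-API pattern (reconstructed here for reference; it is a sketch, not code taken from this page). Save it as WordCount.java under src/:

{{{
#!java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emit (word, 1) for every token in each input line.
  public static class TokenizerMapper
      extends Mapper<LongWritable, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer: sum the counts collected for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = new Job(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
}}}

 * To try it, right-click WordCount.java -> Run As -> Run on Hadoop, giving the input directory uploaded in section 1.3 and an output path (e.g. input output01) as the arguments.
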
= 5. Conclusion =

 * With eclipse, we can develop for hadoop much more efficiently
 * The hadoop 0.20 API and configuration differ somewhat from earlier versions, so for setting up the hadoop environment see the [http://hadoop.apache.org/core/docs/r0.20.0/quickstart.html hadoop 0.20 quickstart]; for how to use the [http://hadoop.apache.org/core/docs/r0.20.0/api/ hadoop 0.20] API, the code under /opt/hadoop/src/examples/ provides a good first impression