Apparently 755 permissions are not enough for the hadoop.tmp.dir directory:
chmod -R 777 /tmp/hadoop
For Hadoop 0.20 and later: point hadoop.tmp.dir in core-site.xml at /var/hadoop
sudo mkdir /var/hadoop
sudo chown USERNAME:USERNAME /var/hadoop
sudo chmod 777 /var/hadoop
When Hadoop hits a problem and will not start, usually the only way out is to re-run namenode -format, and even then problems can remain:
/opt/hadoop/bin/stop-all.sh
rm -rf /var/hadoop/*
rm -rf /tmp/hadoop*
rm -rf /opt/hadoop/logs/*
/opt/hadoop/bin/hadoop namenode -format
/opt/hadoop/bin/start-all.sh
/opt/hadoop/bin/stop-all.sh
--------
Setting up single-node Hadoop on CentOS..
Set up passwordless SSH login (with several machines doing the computation, you would otherwise have to type a password for every one of them); a sketch of the commands follows below.
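A minimal sketch of the passwordless login setup, assuming the default key location and a local sshd (the empty -P passphrase is what makes the login prompt-free):
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
//ssh localhost should now log in without asking for a password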
Install the Java JDK (the JRE is included) and switch the default from OpenJDK to the Sun JDK.
//Change the default JDK
#alternatives --install /usr/bin/java java /usr/java/jdk1.6.0_22/bin/java 1
#alternatives --install /usr/bin/javac javac /usr/java/jdk1.6.0_22/bin/javac 1
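Registering the JDK with --install is not enough on its own; the Sun JDK still has to be selected as the default. One way, using the same alternatives tool (a sketch):
#alternatives --config java
#alternatives --config javac
//pick the /usr/java/jdk1.6.0_22 entries at the interactive prompt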
//Check the version
java -version
//Make sure they run correctly
java or javac
Download and install Hadoop (version 0.20.2); the rough steps are sketched below.
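The download and unpack steps look roughly like this; the archive.apache.org URL and the /opt/hadoop target path are assumptions (adjust USERNAME as above):
wget http://archive.apache.org/dist/hadoop/core/hadoop-0.20.2/hadoop-0.20.2.tar.gz
tar zxvf hadoop-0.20.2.tar.gz
sudo mv hadoop-0.20.2 /opt/hadoop
sudo chown -R USERNAME:USERNAME /opt/hadoop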
Modify four configuration files: hadoop-env.sh, core-site.xml, hdfs-site.xml and mapred-site.xml.
The simplest possible settings are given below.
//hadoop-env.sh (only JAVA_HOME needs to be set here; the path matches the JDK installed above)
export JAVA_HOME=/usr/java/jdk1.6.0_22
//core-site.xml
<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
<property>
<name>hadoop.tmp.dir</name>
<value>/var/hadoop/hadoop-${user.name}</value>
</property>
</configuration>
//hdfs-site.xml
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
</configuration>
//mapred-site.xml
<configuration>
<property>
<name>mapred.job.tracker</name>
<value>localhost:9001</value>
</property>
</configuration>
//Format the NameNode (only the NameNode needs to be formatted)
/opt/hadoop/bin/hadoop namenode -format
//Start Hadoop (NameNode, DataNode, SecondaryNameNode, JobTracker and TaskTracker)
/opt/hadoop/bin/start-all.sh
//Check that everything is running
Open http://localhost:50030, 50060 and 50070 and look at the results.
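A quicker sanity check than the web pages is jps from the Sun JDK, which lists the running Java daemons; on a working single-node setup all five Hadoop processes should show up:
jps
//expected: NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker (plus Jps itself)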
What a nightmare~
I had to fix more than ten errors before it would run; from yesterday afternoon until now it has taken a full day. I will tidy up the notes and post them later!
1. localhost:50030 and 50060 would not come up~
2.Exception in thread "main" java.lang.IllegalArgumentException: n must be positive
Hadoop Common HADOOP-6766, Apache patch:
https://issues.apache.org/jira/browse/HADOOP-6766
3.INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 0 time(s).
4. 'File could only be replicated to 0 nodes, instead of 1' in Hadoop
5. ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
Exception in thread "main" org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist:
The input files were never put into HDFS; remove the stale input directory first:
hadoop fs -rmr input
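If the InvalidInputException simply means the input was never copied into HDFS, re-creating it looks roughly like this; using the conf directory as sample input and the wordcount example jar are illustrative assumptions:
/opt/hadoop/bin/hadoop fs -put /opt/hadoop/conf input
/opt/hadoop/bin/hadoop fs -ls input
/opt/hadoop/bin/hadoop jar /opt/hadoop/hadoop-0.20.2-examples.jar wordcount input output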