Hadoop (2.6.0) pseudo-distributed mode setup


---》Installing Ubuntu in VMware

Set the root password
http://jingyan.baidu.com/article/5225f26b0ac250e6fb09084e.html
VMware Tools
After installing it you get copy/paste between host and guest plus shared folders; very handy!
http://jingyan.baidu.com/article/1974b289b813dcf4b1f77411.html
http://blog.csdn.net/saint_bxg/article/details/6911243
ssh
http://jingyan.baidu.com/article/9c69d48fb9fd7b13c8024e6b.html
ftp
http://jingyan.baidu.com/article/67508eb4d6c4fd9ccb1ce470.html
jdk
Ubuntu does not natively support the rpm format; download the tar.gz package instead
http://jingyan.baidu.com/article/5d368d1e12a1af3f60c0570a.html
http://blog.csdn.net/xiaoxiaoxuewen/article/details/7550176

---》Hadoop (2.6.0) pseudo-distributed mode setup
* Install Java
* Install ssh
* Set up passwordless ssh
First check whether you can ssh to localhost without entering a password:
$ ssh localhost
If you cannot log in to localhost over ssh without a password, run the following:
$ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
$ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
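If ssh still prompts for a password after this, the usual culprit is file permissions. Below is a sketch of the same key setup (using rsa instead of dsa, which newer OpenSSH versions prefer; dsa works the same way in this era) together with the permission fixes, run against a throwaway directory in place of the real ~/.ssh so it is safe to try anywhere:

```shell
# Generate a passwordless key pair and install the public key, using a
# scratch directory to stand in for ~/.ssh (illustrative only).
sshdir=$(mktemp -d)
ssh-keygen -t rsa -P '' -f "$sshdir/id_rsa" -q
cat "$sshdir/id_rsa.pub" >> "$sshdir/authorized_keys"
chmod 700 "$sshdir"                  # sshd requires a private .ssh directory
chmod 600 "$sshdir/authorized_keys"  # group/world-writable files are rejected
                                     # when sshd's StrictModes is on (default)
ls -l "$sshdir"
```

For the real setup, apply the same chmod 700 / chmod 600 to ~/.ssh and ~/.ssh/authorized_keys.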

* Configure Hadoop
(in 2.6.0 the configuration files live under etc/hadoop/, not conf/ as in Hadoop 1.x)
etc/hadoop/core-site.xml:
<configuration>
     <property>
         <name>fs.default.name</name>
         <value>hdfs://localhost:9000</value>
     </property>
</configuration>
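A side note: fs.default.name is the Hadoop 1.x name for this property; 2.x deprecates it in favor of fs.defaultFS (the old name still works but logs a deprecation warning). The 2.x-style equivalent would be:

```xml
<configuration>
     <property>
         <name>fs.defaultFS</name>
         <value>hdfs://localhost:9000</value>
     </property>
</configuration>
```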

etc/hadoop/hdfs-site.xml:
<configuration>
     <property>
         <name>dfs.replication</name>
         <value>1</value>
     </property>
     <property>
         <name>hadoop.tmp.dir</name>
         <value>/home/xxx/mysoft/hadoop_tmp</value>
     </property>
</configuration>
(hadoop.tmp.dir conventionally goes in core-site.xml, but Hadoop merges all the *-site.xml files, so it also takes effect here)

etc/hadoop/mapred-site.xml (copy it from mapred-site.xml.template if it does not exist):
<configuration>
     <property>
         <name>mapred.job.tracker</name>
         <value>localhost:9001</value>
     </property>
</configuration>
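Note that mapred.job.tracker is a Hadoop 1.x property; the JobTracker no longer exists in 2.x, so this setting is effectively ignored. Under YARN (whose daemons start-all.sh launches) the usual replacement is:

```xml
<!-- etc/hadoop/mapred-site.xml: run MapReduce jobs on YARN -->
<configuration>
     <property>
         <name>mapreduce.framework.name</name>
         <value>yarn</value>
     </property>
</configuration>
```

Without it, jobs run in the local runner even though the YARN daemons are up; the wordcount example below produces the same result either way.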

* Format the namenode
hadoop namenode -format
(deprecated in 2.x in favor of hdfs namenode -format; both work in 2.6.0)
* Start the Hadoop daemons
xxx@xxx:~/mysoft/hadoop-2.6.0/sbin$ ./start-all.sh
(also deprecated in 2.x; it simply runs start-dfs.sh and start-yarn.sh, which can be invoked separately instead)
* Verify the running processes
xxx@xxx:~/mysoft/hadoop-2.6.0/share/hadoop/mapreduce$ jps
8002 Jps
7092 NodeManager
6180 NameNode
6405 DataNode
6696 SecondaryNameNode
6861 ResourceManager

* Test
Create two input files
xxx@xxx:~/tmp$ echo "Hello World Bye World" > file01 
xxx@xxx:~/tmp$ echo "Hello Hadoop Goodbye Hadoop" > file02

Copy them into HDFS (the first attempt below fails because the /input directory does not exist yet):
xxx@xxx:~/tmp$ hadoop fs -copyFromLocal file0* /input
copyFromLocal: `/input': No such file or directory
xxx@xxx:~/tmp$ hadoop fs -mkdir /input
xxx@xxx:~/tmp$ hadoop fs -ls /input
xxx@xxx:~/tmp$ hadoop fs -copyFromLocal file0* /input
xxx@xxx:~/tmp$ hadoop fs -ls /input
Found 2 items
-rw-r--r--   1 xxx supergroup         22 2016-01-26 14:40 /input/file01
-rw-r--r--   1 xxx supergroup         28 2016-01-26 14:40 /input/file02

Run the wordcount example
xxx@xxx:~/mysoft/hadoop-2.6.0/share/hadoop/mapreduce$ hadoop jar hadoop-mapreduce-examples-2.6.0.jar wordcount /input/ /output

View the results
xxx@xxx:~/mysoft/hadoop-2.6.0/share/hadoop/mapreduce$ hadoop fs -ls /output
Found 2 items
-rw-r--r--   1 xxx supergroup          0 2016-01-26 14:44 /output/_SUCCESS
-rw-r--r--   1 xxx supergroup         41 2016-01-26 14:44 /output/part-r-00000
xxx@xxx:~/mysoft/hadoop-2.6.0/share/hadoop/mapreduce$ hadoop fs -cat /output/part-r-00000
Bye     1
Goodbye 1
Hadoop  2
Hello   2
World   2
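As a sanity check, the same counts can be reproduced without Hadoop using plain shell tools (the two files are recreated in a scratch directory here so the snippet stands alone):

```shell
# Recreate the two input files and count words with coreutils;
# the result should match the wordcount output above.
dir=$(mktemp -d)
echo "Hello World Bye World"       > "$dir/file01"
echo "Hello Hadoop Goodbye Hadoop" > "$dir/file02"
cat "$dir/file01" "$dir/file02" | tr ' ' '\n' | sort | uniq -c
# One line per word: Bye 1, Goodbye 1, Hadoop 2, Hello 2, World 2
# (uniq -c prints the count before the word)
```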


References:
http://www.aboutyun.com/thread-6487-1-1.html
http://www.linuxidc.com/Linux/2014-08/105915p4.htm
http://blog.csdn.net/mlzhu007/article/details/8462615

Reposted from ynp.iteye.com/blog/2274380