Preparing SSH for the fully distributed hosts


1. Delete all files under ~/.ssh on every host.

2. Generate a key pair on host s250:

     $>ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa

3. Remotely copy s250's public key file id_rsa.pub to every host (including s250 itself, as the commands below show), placing it at /home/centos/.ssh/authorized_keys:

    $>scp id_rsa.pub centos@s250:/home/centos/.ssh/authorized_keys

    $>scp id_rsa.pub centos@s251:/home/centos/.ssh/authorized_keys

    $>scp id_rsa.pub centos@s252:/home/centos/.ssh/authorized_keys
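The three scp commands above can be collapsed into one loop. A minimal sketch (hostnames follow the commands above; `RUN` defaults to `echo` so it dry-runs by default — set `RUN=` empty to actually copy):

```shell
#!/bin/sh
# Dry-run sketch of step 3: print (or run) the scp command for each host.
RUN=${RUN:-echo}
for host in s250 s251 s252; do
  $RUN scp ~/.ssh/id_rsa.pub "centos@${host}:/home/centos/.ssh/authorized_keys"
done
```

Note that this overwrites authorized_keys on each target; with only one key in play that is exactly what this setup wants.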

4. Edit the fully distributed configuration files:   cd /soft/hadoop/etc/hadoop

[core-site.xml]      localhost ---> s200  (master hostname)

[hdfs-site.xml]      1 ---> 3             (number of slaves, used as the replication factor)

[mapred-site.xml]  unchanged

[yarn-site.xml]     localhost ---> s200

[slaves]

s201

s202

s203

Modify [hadoop-env.sh] so JAVA_HOME takes effect regardless of the working directory:

export JAVA_HOME=/soft/jdk

...
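The arrow notes above correspond to property edits roughly like the following. This is a sketch: the property names (fs.defaultFS, dfs.replication, yarn.resourcemanager.hostname) are the standard Hadoop ones, and the values follow the hostnames used in this post.

```xml
<!-- core-site.xml: point the default filesystem at the master s200 -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://s200/</value>
  </property>
</configuration>

<!-- hdfs-site.xml: replication 1 ---> 3 -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
</configuration>

<!-- yarn-site.xml: run the ResourceManager on s200 -->
<configuration>
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>s200</value>
  </property>
</configuration>
```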

5. Distribute the configuration (remotely copy the edited `full` config directory):   cd /soft/hadoop/etc

$>scp -r full centos@s201:/soft/hadoop/etc

$>scp -r full centos@s202:/soft/hadoop/etc

...

6. Remotely delete the symbolic link and recreate it (use absolute paths wherever possible):

  ssh s201 rm /soft/hadoop/etc/hadoop

  ssh s202 rm /soft/hadoop/etc/hadoop

  ...

  ssh s201 ln -s /soft/hadoop/etc/full /soft/hadoop/etc/hadoop

  ssh s202 ln -s /soft/hadoop/etc/full /soft/hadoop/etc/hadoop

  ...
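The point of step 6 is that `rm` on a symlink removes only the link, never the directory it points to, so relinking is safe. A local sketch using a throwaway directory in place of the remote `ssh sNNN` commands:

```shell
#!/bin/sh
# Local demonstration of the remove-and-relink step.
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/etc/full"
ln -s "$tmp/etc/old" "$tmp/etc/hadoop"   # stand-in for the old link
rm "$tmp/etc/hadoop"                     # deletes the link, not its target
ln -s "$tmp/etc/full" "$tmp/etc/hadoop"  # relink to the full config
readlink "$tmp/etc/hadoop"               # prints .../etc/full
rm -rf "$tmp"
```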

7. Delete the temporary files:

 $>sudo rm -rf /tmp/*

8. Delete the Hadoop logs:

 $>ssh s201 rm -rf /soft/hadoop/logs/*

9. Format the file system:

 $>hadoop namenode -format        # on newer Hadoop versions: hdfs namenode -format

10. Start the daemons:

 $>start-all.sh
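After start-all.sh, it is worth confirming the daemons actually came up on every node with `jps`. A sketch (hostnames follow the config above; `RUN` defaults to `echo` for a dry run — set `RUN=` empty to really ssh):

```shell
#!/bin/sh
# Dry-run sketch: list the Java daemons on each node via jps over ssh.
RUN=${RUN:-echo}
for host in s200 s201 s202 s203; do
  echo "== $host =="
  $RUN ssh "$host" jps
done
```

On the master you would expect NameNode and ResourceManager; on the workers, DataNode and NodeManager.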

scp: remote copy

cp: local copy


Reposted from www.cnblogs.com/Vowzhou/p/10144322.html