Hadoop fully distributed preparations

Hadoop fully distributed operation (key points)

(1) Prepare 3 machines (firewall off, static IP, host name set)
(2) Install the JDK
(3) Configure environment variables
(4) Install Hadoop
(5) Configure environment variables
(6) Configure the cluster
(7) Start nodes individually (single-point start)
(8) Configure SSH passwordless login
(9) Start the cluster as a group and test it

Each virtual machine should pass the following checks:
host name: ping the current host name
IP: ifconfig
firewall: service iptables status
file permissions: ll /opt
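The checks above can be run together; a sketch, assuming the CentOS 6 `service iptables` command and host names from this guide:

```shell
# Pre-flight checks for each node; adjust the expected host name to hadoop101/102/103.
echo "host name: $(hostname)"
ping -c 1 "$(hostname)" || echo "WARN: host name does not resolve"
ifconfig 2>/dev/null || ip addr || true                     # verify the static IP
service iptables status 2>/dev/null || echo "iptables service not found (may already be off)"
ls -l /opt || true                                          # check ownership of /opt
```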

scp (secure copy): securely copies files from server 1 to server 2.

command:

scp     -r     $pdir/$fname    $user@hadoop$host:$pdir/$fname
command   recursive   source file path/name     destination user@host:destination path/name

Operate as the dev1 user. For
example:

cd /opt/module
scp -r /opt/module  dev1@hadoop102:/opt/
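The copy to the other nodes can be scripted; the loop below only previews the commands (the dev1 user and hadoop10x host names are the ones used in this guide):

```shell
# Preview the scp commands for distributing /opt/module to the other nodes.
# Remove the echo to actually copy.
SRC=/opt/module
for host in hadoop102 hadoop103; do
  echo "scp -r $SRC dev1@$host:/opt/"
done
```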

Repeat this on hadoop101, hadoop102, and hadoop103,
then change the ownership of the copied files:

chown dev:dev1 -R /opt/module

Authorization succeeded.
If the permissions are insufficient, edit the sudoers file:
vi  /etc/sudoers

Below the existing line root ALL=(ALL) ALL (around line 91), add an equivalent line for the dev1 user. Changes to /etc/sudoers take effect immediately, so no reload is needed.
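The added entry typically mirrors the root line; a sketch, assuming the dev1 user from this guide:

```
## Allow root to run any commands anywhere
root    ALL=(ALL)       ALL
dev1    ALL=(ALL)       ALL
```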

rsync is a fast synchronization tool. It is faster than scp because it skips files that are already identical on the destination. Its advantages are speed, avoiding recopying identical content, and support for symbolic links.

rsync    -rvl       $pdir/$fname $user@hadoop$host:$pdir/$fname
command   option flags   source file path/name     destination user@host:destination path/name

For example:

rsync -rvl /opt/software/ dev1@hadoop102:/opt/software

Origin blog.csdn.net/houzhicongone/article/details/114704137