Hadoop study notes (to be sorted out)

Installation steps:
1. Install the virtual machine operating system and do the preparatory work (one VM can be installed and then cloned).
2. Set the hostname and the hosts file on each virtual machine (see the sketch after this list).
3. Create the user group and user.
4. Configure the virtual machine network so that the virtual machines and the host can ping each other.
5. Install the JDK and configure the environment variables; check that the configuration works.
6. Configure SSH for passwordless login between nodes; verify with the ssh node1 / ssh node2 commands.
7. Configure Hadoop on the master and copy the Hadoop directory to the worker nodes.
8. Configure the environment variables, start Hadoop, check that the installation succeeded, and run wordcount to verify (see the smoke test after this list).
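The lines below are only an illustration of what steps 2 and 8 produce: the IP addresses are placeholders for whatever the virtual machine network assigns, and the path to the examples jar depends on the Hadoop version (here HADOOP_HOME=/opt/hadoop as configured below).

#Example /etc/hosts entries on every node (placeholder addresses)
192.168.1.100 master
192.168.1.101 node1
192.168.1.102 node2

#Wordcount smoke test, run after Hadoop has started
hdfs dfs -mkdir -p /input
hdfs dfs -put /opt/hadoop/etc/hadoop/*.xml /input
hadoop jar /opt/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar wordcount /input /output
hdfs dfs -cat /output/part-r-00000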
#Environment variable configuration
export HADOOP_HOME=/opt/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
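To apply the variables and confirm they are visible, assuming the two export lines above were added to ~/.bashrc (the exact profile file is an assumption):

#Reload the shell profile and confirm hadoop is on the PATH
source ~/.bashrc
hadoop version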
#SSH passwordless login configuration

#Install the SSH server and confirm it is running
yum -y install openssh-server  
ps -e | grep ssh  
ssh localhost

#generate key pair
ssh-keygen -t rsa
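When scripting the setup, the prompts can be skipped with an empty passphrase; this non-interactive form is optional and equivalent to accepting the defaults:

#Optional non-interactive variant (empty passphrase, default key file)
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa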

#On the master, append the public key to authorized_keys
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
#From the .ssh directory on the master, copy authorized_keys to the worker nodes
scp authorized_keys hadoop@node1:~/.ssh/
scp authorized_keys hadoop@node2:~/.ssh/

#On every machine, set the permissions of the authorized_keys file
chmod 600 ~/.ssh/authorized_keys
#Verify passwordless login from the master
ssh node1
ssh node2
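As an alternative to copying authorized_keys by hand, OpenSSH's ssh-copy-id appends the local public key on the remote host and fixes the permissions; the hadoop@node1/node2 targets assume the same user and hostnames used above:

#Optional alternative: distribute the key with ssh-copy-id
ssh-copy-id hadoop@node1
ssh-copy-id hadoop@node2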

 
