1. Goal
1.1 Run Eclipse in the graphical window
1.2 Complete the Eclipse and plug-in installation
1.3 Install the JDK
2. Preliminary preparation
2.1 Install the system
2.2 Create the user on each node
2.2.1 Create the user (and group)
(These commands can be run either as an ordinary user or as root. An ordinary user must prefix them with sudo; the root user does not need to.)
The -g 285 option assumes a group with GID 285 already exists; if it does not, create it first (assumed here to also be named angel):
$ sudo groupadd -g 285 angel
$ sudo useradd -u 285 -g 285 -m -s /bin/bash angel
The user ID is 285, the group ID is 285, and the user name is angel.
2.2.2 Add angel to the sudo group and set its password
sudo gpasswd -a angel sudo
This adds the angel user to the sudo group.
sudo passwd angel
Set the password to 123.
2.2.3 Switch to the angel user
su - angel
password: 123
At this point, the angel user has been created successfully.
2.3 Java installation
2.3.1 Create the /app directory on each node and modify its file attributes
2.3.1.1 Create the /app directory
As the angel user, run:
sudo mkdir /app
2.3.1.2 Modify the /app file attributes
sudo chown -R angel:angel /app
2.3.2 Edit the JDK environment variables on all nodes
vi /home/angel/.profile
Add two lines at the end of the file.
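The original does not show the two lines. A typical pair, assuming the JDK is unpacked to /app/jdk1.8.0_261 as in the later steps, would be:

```shell
# Assumed JDK location; adjust if your unpacked directory differs.
export JAVA_HOME=/app/jdk1.8.0_261
export PATH=$JAVA_HOME/bin:$PATH
```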
2.3.3 Make the JDK environment variables take effect on all nodes
source /home/angel/.profile
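To illustrate why source is needed (demonstrated with a throwaway file in /tmp, not the real .profile): source runs the file in the current shell, so the exported variables remain visible afterwards, whereas running the file as a script would not affect the current shell.

```shell
# Demo only: write a fake profile, source it, and read the variable back.
echo 'export DEMO_JAVA_HOME=/app/jdk1.8.0_261' > /tmp/demo_profile
source /tmp/demo_profile
echo "$DEMO_JAVA_HOME"   # prints: /app/jdk1.8.0_261
```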
2.3.4 Upload the JDK tarball to the angel user's home directory
Use the WinSCP tool, logging in as the root user.
2.3.5 Unpack the JDK tarball into the /app directory
cd /app
tar xzvf /home/angel/jdk-8u261-linux-x64.tar.gz -C /app
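Since -C already tells tar where to extract, the cd above is not strictly required. A throwaway sanity check of the -C flag in /tmp (the file names here are made up for the demo):

```shell
# Build a tiny archive, then extract it into a different directory with -C.
mkdir -p /tmp/tar_demo/src /tmp/tar_demo/dst
echo hello > /tmp/tar_demo/src/file.txt
tar czf /tmp/tar_demo/pkg.tar.gz -C /tmp/tar_demo/src file.txt
tar xzf /tmp/tar_demo/pkg.tar.gz -C /tmp/tar_demo/dst
cat /tmp/tar_demo/dst/file.txt   # prints: hello
```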
2.3.6 Test
java -version
javac -version
At this point, Java has been installed successfully.
3 Start the Hadoop cluster
3.1 Start HDFS
start-dfs.sh
3.2 Start YARN
start-yarn.sh
3.3 Start the JobHistoryServer
mr-jobhistory-daemon.sh start historyserver
(By default, its web UI listens on port 19888.)
3.4 View the processes
3.4.1 master node
jps
(With a typical configuration you should see NameNode, SecondaryNameNode, ResourceManager, and JobHistoryServer.)
3.4.2 slave1 and slave2 nodes
jps
(With a typical configuration you should see DataNode and NodeManager.)
At this point, the Hadoop cluster has been started successfully!
4 Upload Eclipse
4.1 Use WinSCP to upload the Eclipse tarball to the /root directory
eclipse-java-oxygen-3-linux-gtk-x86_64.tar.gz
4.2 Unpack Eclipse into the /app directory
cd /app
tar xvzf /root/eclipse-java-oxygen-3-linux-gtk-x86_64.tar.gz
5 Upload hadoop-eclipse-plugin
5.1 Use WinSCP to put hadoop-eclipse-plugin-2.6.0.jar in the /app/eclipse/plugins/ directory
5.2 Check whether the upload succeeded
cd /app/eclipse/plugins/
ll hadoop*
If the plug-in jar is listed, the upload was successful!
6 Download hadoop-2.8.5
Download hadoop-2.8.5 from the master node into the /app directory on the desktop node:
scp -r angel@master:/app/hadoop-2.8.5 /app
Because passwordless SSH is not set up between the desktop and master nodes, you will be prompted for the password (you can avoid this by copying the desktop node's public key to master with ssh-copy-id).
Check in the terminal whether the download succeeded.
At this point, hadoop-2.8.5 has been downloaded successfully.
7 Modify file attributes
cd /app
sudo chown -R angel:angel eclipse/ hadoop-2.8.5/ jdk1.8.0_261/
Check whether the modification succeeded:
ll
At this point, the file attributes have been modified successfully!
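Beyond eyeballing the ll output, ownership can be checked directly with stat. A demo on a throwaway file in /tmp (on the cluster you would point stat at /app/eclipse, /app/hadoop-2.8.5, etc., expecting angel:angel):

```shell
# Create a file and print its owner:group with GNU stat.
touch /tmp/owner_demo
stat -c '%U:%G' /tmp/owner_demo
```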
8 Start Eclipse
Prerequisites: Eclipse must be started on the desktop node, as the angel user, after the machine has been restarted, and with the Hadoop cluster running.
If you cannot log in as the angel user, see the solution linked below.
Solution
Click Launch.
In the Eclipse window, open Window > Preferences; a "Hadoop Map/Reduce" option on the left side of the dialog means the plug-in is installed successfully!
In the Eclipse window, open File > New > Project; a "Map/Reduce Project" option means the installation succeeded. At this point, the MapReduce platform has been built successfully!