Previously I installed Hadoop by following the detailed tutorial from Lin Ziyu of Xiamen University. I ran into so many problems and couldn't figure them out, which caused a lot of trouble later when I followed the HBase official website for the standalone configuration. I'm writing it all down now so I don't make the same mistakes again.
First, a word about which online tutorial to use. The official website's tutorial is very terse. For the Hadoop installation, for example, it assumes you have already: created a hadoop user and given it the right permissions; done everything as the hadoop user (otherwise things go wrong); updated apt; extracted the tarball; configured JAVA_HOME; and configured passwordless SSH login. But I'm a complete beginner and didn't understand any of this, so I had to look up a lot of material. These are all Linux basics, and I need to remember them.
First, create the hadoop user
1.1 Create a hadoop user that can log in, and automatically create its home directory
sudo useradd -m hadoop
-m automatically creates the user's home directory; if you leave it out, logging in as the new user will fail.
-s specifies the shell used after login. The default is /bin/bash, so you can leave it unspecified.
For the other options, see this detailed useradd reference:
https://www.cnblogs.com/irisrain/p/4324593.html
1.2 Set a password. You can simply set it to hadoop; enter the password twice as prompted.
sudo passwd hadoop
sudo lets an ordinary user run a command with root privileges; if you are already root, you can leave it off.
1.3 Give the hadoop user administrator privileges, to make deployment easier
sudo adduser hadoop sudo
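To confirm the group change took effect, you can list a user's groups with id. A small illustration (on your machine you would run it against the hadoop user; here it just checks the current user):

```shell
# id -nG prints a user's group names; with no argument it checks the
# current user. After "sudo adduser hadoop sudo", "id -nG hadoop" should
# list "sudo" among the groups (you may need to log out and back in first,
# since group membership is refreshed at login).
id -nG
```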
Second, update apt
Later we will use apt to install software (such as the ssh server); if you don't update first, some packages may fail to install.
sudo apt-get update
Third, install SSH and configure passwordless login
Compared with the official website's tutorial, the Xiamen University tutorial says to install ssh-server (first make sure your Linux system has SSH installed; Ubuntu generally comes with the ssh client by default, so we still need to install the ssh server manually).
The official website just says to install ssh (why? I don't know, but I trust the official website hhh).
So the installation process is:
sudo apt-get install ssh
sudo apt-get install pdsh
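One pdsh note I'll add here (not from this tutorial, but a problem commonly reported with Hadoop 3): pdsh defaults to rsh as its remote command type, which can make Hadoop's start scripts fail with connection errors. Telling it to use ssh instead, e.g. in ~/.bashrc, is a common fix:

```shell
# ~/.bashrc — make pdsh use ssh instead of its rsh default
export PDSH_RCMD_TYPE=ssh
```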
To configure passwordless login:
First check whether you can already log in without a password:
ssh localhost
If not, use ssh-keygen to generate a key and add it to the authorized keys (following the official website):
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys  # add the key to the authorized list
chmod 0600 ~/.ssh/authorized_keys  # restrict the file to owner read/write only; sshd refuses to trust an authorized_keys file that other users could modify
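To see what that chmod actually does, here is a tiny demo on a throwaway file (mode 600 means owner read/write, no access at all for group or others):

```shell
# Create a temp file, tighten it to mode 0600, and show the resulting mode.
f=$(mktemp)
chmod 0600 "$f"
stat -c '%a' "$f"   # prints 600: owner read/write, nothing for group/others
rm -f "$f"
```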
Fourth, install the Java environment
4.1 Install Java the foolproof way with apt
Following the teacher's approach: first run java -version. It wasn't installed, and the system suggested some apt packages to install (a headless Java among them); I then installed openjdk-8-jdk:
apt install openjdk-8-jdk
Java 11 seems to have pitfalls; Java 8 works fine on Hadoop 3, so 8 is the stable choice.
4.2 Edit the environment variable configuration file of the currently logged-in user
vim ~/.bashrc
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
JAVA_HOME points to the JDK installation path; that directory contains folders such as lib, bin, and jre.
If you don't remember where your Java is, run whereis java and then follow the directories; the first entry that isn't a symlink is your Java's real location!
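Instead of eyeballing folders, you can also let readlink chase the symlinks for you: readlink -f "$(which java)" prints the real java binary, and JAVA_HOME is the directory two levels above it. A self-contained demo of the symlink-chasing part (the file names here are made up):

```shell
# Build a two-level symlink chain in a temp dir, then resolve it —
# the same way "readlink -f $(which java)" resolves /usr/bin/java.
tmp=$(readlink -f "$(mktemp -d)")
echo real > "$tmp/real-binary"
ln -s "$tmp/real-binary" "$tmp/link1"
ln -s "$tmp/link1" "$tmp/link2"
readlink -f "$tmp/link2"   # prints the path of real-binary
rm -rf "$tmp"
```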
4.3 Make the environment variables take effect immediately
Run the following command:
source ~/.bashrc
4.4 Check the result
After running the commands above, verify that the settings are correct:
For example, test with java -version. At this point the Java environment is successfully installed, and you can move on to the Hadoop installation.
echo $JAVA_HOME              # check the variable's value
$JAVA_HOME/bin/java -version # same as running java -version directly
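The two checks above can be tied together in a small sanity-check sketch (it assumes the JAVA_HOME path from section 4.2; adjust it to your own system):

```shell
# Check that JAVA_HOME points at a real JDK before trusting it.
JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64   # assumed path; change if needed
if [ -x "$JAVA_HOME/bin/java" ]; then
  "$JAVA_HOME/bin/java" -version
else
  echo "JAVA_HOME does not point at a JDK: $JAVA_HOME"
fi
```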