Hadoop data warehouse: building and initializing Hadoop

As the grid user, decompress the Hadoop installation package and modify the corresponding configuration files:

  • core-site.xml
  • hdfs-site.xml
  • yarn-site.xml
  • mapred-site.xml
  • hadoop-env.sh
  • yarn-env.sh
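For example, a minimal core-site.xml might look like the following. This is only a sketch: the master hostname, port, and tmp directory are assumptions to adapt to your own cluster.

```xml
<configuration>
  <!-- fs.defaultFS: address of the HDFS NameNode; "master" is a placeholder hostname -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://master:9000</value>
  </property>
  <!-- hadoop.tmp.dir: base directory for Hadoop's working files -->
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/grid/hadoop-3.1.0/tmp</value>
  </property>
</configuration>
```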

After modifying the files, copy the Hadoop directory from the master node to the other three slave nodes:

  • scp -r ./hadoop-3.1.0 <slave node IP>:/home/grid
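The copy to all slave nodes can be done in one loop; a sketch, where the hostnames slave1..slave3 are assumptions to replace with your own. It echoes each command first as a dry run; uncomment the real scp line to actually copy.

```shell
# Hypothetical slave hostnames; substitute your own
nodes="slave1 slave2 slave3"
for node in $nodes; do
  echo "scp -r ./hadoop-3.1.0 grid@${node}:/home/grid"   # dry run: print the command
  # scp -r ./hadoop-3.1.0 "grid@${node}:/home/grid"      # uncomment to copy for real
done
```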

Then, as root, add the following environment variables to /etc/profile on each of the four machines:

export  JAVA_HOME=/usr/java/jdk1.8.0_171
export  CLASSPATH=.:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export  HADOOP_HOME=/home/grid/hadoop-3.1.0
export  HADOOP_COMMON_HOME=$HADOOP_HOME
export  HADOOP_HDFS_HOME=$HADOOP_HOME
export  HADOOP_MAPRED_HOME=$HADOOP_HOME
export  HADOOP_YARN_HOME=$HADOOP_HOME
export  HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export  PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HADOOP_HOME/lib
export  HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export  HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
export  LD_LIBRARY_PATH=$HADOOP_HOME/lib/native

Make the environment variables take effect:

source /etc/profile

Then run the following commands as the grid user on the master node:

# Format HDFS
hdfs namenode -format
# If "successfully formatted" appears in the output, initialization succeeded
# Start HDFS
start-dfs.sh
# Start YARN
start-yarn.sh

Run jps to view the processes; if the following processes are present, startup succeeded:

21680 NodeManager
21539 ResourceManager
20883 NameNode
25623 Jps
21225 SecondaryNameNode
21023 DataNode
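This jps check can be automated. A minimal sketch, where check_daemons is a hypothetical helper (not part of Hadoop) that is fed the output of jps:

```shell
# Hypothetical helper: given jps output, report which expected daemons are present.
check_daemons() {
  procs="$1"
  for d in NameNode DataNode SecondaryNameNode ResourceManager NodeManager; do
    if echo "$procs" | grep -qw "$d"; then   # -w: whole word, so NameNode != SecondaryNameNode
      echo "$d: OK"
    else
      echo "$d: MISSING"
    fi
  done
}
# On the cluster: check_daemons "$(jps)"
```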

Two problems were encountered along the way:

1. start-dfs.sh produced the following warning:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Re-running in debug mode shows the detailed error information:

DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
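Debug output like the line above can be enabled by raising Hadoop's console log level before re-running the failing command; HADOOP_ROOT_LOGGER is the standard environment variable the Hadoop launcher scripts honor:

```shell
# Raise Hadoop's console log level to DEBUG for subsequent commands
export HADOOP_ROOT_LOGGER=DEBUG,console
# then re-run, e.g.: start-dfs.sh   (now emits DEBUG lines such as the one above)
```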

The fix followed ligt0610's post: https://blog.csdn.net/ligt0610/article/details/47757013

That is, add the following to hadoop-env.sh:

export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/native/"

Re-run start-dfs.sh: the warning no longer appears and the command succeeds.

2. The NameNode could not be viewed through the web interface.

After some investigation: in Hadoop 3, the NameNode web UI port changed from 50070 to 9870.

Accessing the new port works. Success!
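If needed, the web UI address can also be pinned explicitly in hdfs-site.xml; dfs.namenode.http-address is the standard Hadoop 3 property, and the bind address below is an assumption:

```xml
<!-- hdfs-site.xml: bind the NameNode web UI explicitly to port 9870 -->
<property>
  <name>dfs.namenode.http-address</name>
  <value>0.0.0.0:9870</value>
</property>
```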

 

20190616

Recording a new problem encountered today while experimenting.

When executing

# Format HDFS
hdfs namenode -format

the following error appeared:

sh: 5: hdfs: not found

The suspicion was that no shell type was specified when the user was created with useradd, so the default /bin/sh was used, which caused the command not to be found.

The steps:

1. View the current user's shell type with the command: echo $SHELL

2. Change the current user's default login shell: chsh -s /bin/bash username, or use the usermod -s /bin/bash username command

After this, the default shell becomes /bin/bash.
---------------------
Author: suifengshiyu
Source: CSDN
Original: https://blog.csdn.net/suifengshiyu/article/details/40952771
Copyright: this is the blogger's original article; please include a link to the post when reproducing it.
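The diagnosis above comes down to reading the shell field of the user's /etc/passwd entry. A minimal sketch, where login_shell is a hypothetical helper (not a standard command):

```shell
# Hypothetical helper: extract the login shell (the last ':'-separated field)
# from an /etc/passwd-style line.
login_shell() {
  echo "$1" | awk -F: '{print $NF}'
}
# On a real system: login_shell "$(grep '^grid:' /etc/passwd)"
```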

This did not resolve the problem; possible causes are still being ruled out.


Source: www.cnblogs.com/hipth/p/9042522.html