Solution: cause of the "Unable to load native-hadoop library for your platform" warning when installing Hadoop

Right after installing Hadoop, the following warning appeared:

[root@ctOS ~]# /usr/local/hadoop/sbin/start-dfs.sh
20/06/04 08:22:37 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: namenode running as process 19541. Stop it first.
localhost: datanode running as process 19667. Stop it first.
Starting secondary namenodes [0.0.0.0]
0.0.0.0: secondarynamenode running as process 19839. Stop it first.
20/06/04 08:22:40 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Meaning: this is a WARN-level message. Hadoop could not load the native library for this platform, so it falls back to the built-in Java classes where applicable.

At this point, use the jps command to check whether the NameNode and DataNode started normally.

[root@ctOS ~]# jps
19667 DataNode
19541 NameNode
19097 Jps
19839 SecondaryNameNode
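Before changing anything, it can help to confirm what Hadoop itself reports about its native libraries. Hadoop ships a `checknative` diagnostic (available in Hadoop 2.x and later) that lists which native components load; the path below assumes the /usr/local/hadoop install location used in this article.

```shell
# Ask Hadoop which native libraries it can load
# (path assumes the /usr/local/hadoop install used above)
/usr/local/hadoop/bin/hadoop checknative -a
# Each component (hadoop, zlib, snappy, ...) is reported as true or false;
# "hadoop: false" confirms the native library is not being picked up.
```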

Possible reason:
The native Hadoop library shipped by Apache is 32-bit, which causes problems on a 64-bit server, so you need a 64-bit build, either compiled yourself or downloaded.
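You can verify this diagnosis with standard tools by comparing the server architecture against the architecture of the bundled library. The library filename below is the usual one but may differ by Hadoop version.

```shell
# Print the server architecture (x86_64 means 64-bit)
uname -m

# Inspect the bundled native library (filename may vary by Hadoop version)
file /usr/local/hadoop/lib/native/libhadoop.so.1.0.0
# "ELF 32-bit LSB shared object" on a 64-bit server indicates the mismatch
# described above; a matching build reports "ELF 64-bit LSB shared object".
```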
Solution
1. Find the 64-bit lib package matching your Hadoop version; you can compile it yourself.
2. Or download a corresponding pre-compiled version online
(such as http://dl.bintray.com/sequenceiq/sequenceiq-bin/).
Unzip the prepared 64-bit lib package into the lib/native and lib directories of the Hadoop installation directory:

[root@ctOS ~]# tar -xvf hadoop-native-64-2.7.0.tar -C /usr/local/hadoop/lib/native

[root@ctOS ~]# tar -xvf hadoop-native-64-2.7.0.tar -C /usr/local/hadoop/lib
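After extracting, it is worth confirming that the 64-bit libraries actually landed in place. A quick check, assuming the paths used above:

```shell
# List the extracted native libraries and confirm they are 64-bit builds
ls -l /usr/local/hadoop/lib/native/
file /usr/local/hadoop/lib/native/libhadoop.so* | grep '64-bit'
```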

Add the environment variables:

[root@ctOS ~]# vim /etc/profile
Add the following lines to /etc/profile:
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
 
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

Make the environment variables take effect:

[root@ctOS ~]# source /etc/profile 
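Note that `source /etc/profile` only affects the current shell; other open sessions need to re-source the file or log in again. You can confirm the variables are set before retrying:

```shell
# Verify the variables are visible in this shell
echo "$HADOOP_COMMON_LIB_NATIVE_DIR"
echo "$HADOOP_OPTS"
```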

Run /usr/local/hadoop/sbin/start-dfs.sh again:

[root@ctOS native]# /usr/local/hadoop/sbin/start-dfs.sh
Starting namenodes on [localhost]
localhost: namenode running as process 19541. Stop it first.
localhost: datanode running as process 19667. Stop it first.
Starting secondary namenodes [0.0.0.0]
0.0.0.0: secondarynamenode running as process 19839. Stop it first.

The NativeCodeLoader warning no longer appears. (The "Stop it first" lines are expected, since the daemons are still running from the earlier start.)

Hadoop installation reference: https://www.cnblogs.com/StarZhai/p/11712074.html

Origin blog.csdn.net/weixin_46251846/article/details/106537957