Hadoop: GLIBC_2.14 not found

During a Hadoop 2.7.2 install on CentOS 6.5, the error GLIBC_2.14 not found is reported. The prebuilt native-hadoop library (libhadoop.so) was linked against glibc 2.14, while CentOS 6.5 ships glibc 2.12, so the dynamic loader rejects it. The fix below builds and installs glibc 2.14 from source.
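To see exactly which glibc symbol versions the prebuilt native library requires, you can dump its versioned symbol strings. The path below assumes a default tarball install under /usr/hadoop, and the listed versions are illustrative; adjust both to your layout:
[hadoop@master001 ~]$ strings /usr/hadoop/lib/native/libhadoop.so.1.0.0 | grep GLIBC_
GLIBC_2.2.5
GLIBC_2.14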

Check the system's current libc version:
[hadoop@master001 native]$ ll /lib64/libc.so.6
lrwxrwxrwx. 1 root root 12 Apr 14 16:14 /lib64/libc.so.6 -> libc-2.12.so
The symlink shows version 2.12.
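As a cross-check, the dynamic linker and the package database report the same version. The exact RPM release string below is only an example; it depends on your CentOS patch level:
[hadoop@master001 native]$ ldd --version | head -1
ldd (GNU libc) 2.12
[hadoop@master001 native]$ rpm -q glibc
glibc-2.12-1.132.el6.x86_64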
From http://ftp.gnu.org/gnu/glibc/ download:
glibc-2.14.tar.bz2
glibc-linuxthreads-2.5.tar.bz2
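If the host has outbound network access, both archives can be fetched straight from that mirror into the download directory. I am assuming /home/hadoop/software here, to match the tar commands that follow:
[hadoop@master001 software]$ wget http://ftp.gnu.org/gnu/glibc/glibc-2.14.tar.bz2
[hadoop@master001 software]$ wget http://ftp.gnu.org/gnu/glibc/glibc-linuxthreads-2.5.tar.bz2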
[hadoop@master001 native]$ tar -jxvf /home/hadoop/software/glibc-2.14.tar.bz2
[hadoop@master001 native]$ cd glibc-2.14/
[hadoop@master001 glibc-2.14]$ tar -jxvf /home/hadoop/software/glibc-linuxthreads-2.5.tar.bz2
[hadoop@master001 glibc-2.14]$ cd ..    # must go back to the parent directory: glibc refuses to configure inside its own source tree
[hadoop@master001 native]$ export CFLAGS="-g -O2"           # keep the optimization flag, otherwise the build fails: glibc cannot be compiled without -O
[hadoop@master001 native]$ ./glibc-2.14/configure --prefix=/usr --disable-profile --enable-add-ons --with-headers=/usr/include --with-binutils=/usr/bin
[hadoop@master001 native]$ make        # compile; this takes a long time and may fail; if it does, fix the error and rerun make
[hadoop@master001 native]$ sudo make install   # install; must be run as root
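For reference, the glibc documentation recommends a dedicated build directory rather than configuring from the source tree's parent. An equivalent run with the same flags (my layout, not the original author's) would be:
[hadoop@master001 native]$ mkdir glibc-build && cd glibc-build
[hadoop@master001 glibc-build]$ ../glibc-2.14/configure --prefix=/usr --disable-profile --enable-add-ons --with-headers=/usr/include --with-binutils=/usr/bin
[hadoop@master001 glibc-build]$ make
[hadoop@master001 glibc-build]$ sudo make install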
# verify that the version was upgraded
[hadoop@master001 native]$ ll /lib64/libc.so.6
lrwxrwxrwx 1 root root 12 Jun 25 02:07 /lib64/libc.so.6 -> libc-2.14.so # now shows 2.14
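As an extra sanity check, libc.so.6 can be executed directly and glibc prints its own version banner (output trimmed to the first line; exact wording may vary between builds):
[hadoop@master001 native]$ /lib64/libc.so.6 | head -1
GNU C Library stable release version 2.14, by Roland McGrath et al.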
Enable debug output:
[hadoop@master001 native]$ export HADOOP_ROOT_LOGGER=DEBUG,console
# The "Loaded the native-hadoop library" line in the output below shows the native library now loads cleanly; the "No such file or directory" at the end only means the test file is absent, not a library problem
[hadoop@master001 native]$ hadoop fs -text /test/data/origz/access.log.gz
15/06/25 02:10:01 DEBUG util.Shell: setsid exited with exit code 0
15/06/25 02:10:01 DEBUG conf.Configuration: parsing URL jar:file:/usr/hadoop/share/hadoop/common/hadoop-common-2.5.2.jar!/core-default.xml
15/06/25 02:10:01 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@71be98f5
15/06/25 02:10:01 DEBUG conf.Configuration: parsing URL file:/usr/hadoop/etc/hadoop/core-site.xml
15/06/25 02:10:01 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@97e1986
15/06/25 02:10:02 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
15/06/25 02:10:02 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
15/06/25 02:10:02 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])
15/06/25 02:10:02 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
15/06/25 02:10:02 DEBUG security.Groups:  Creating new Groups object
15/06/25 02:10:02 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
15/06/25 02:10:02 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
15/06/25 02:10:02 DEBUG security.JniBasedUnixGroupsMapping: Using JniBasedUnixGroupsMapping for Group resolution
15/06/25 02:10:02 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMapping
15/06/25 02:10:02 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
15/06/25 02:10:02 DEBUG security.UserGroupInformation: hadoop login
15/06/25 02:10:02 DEBUG security.UserGroupInformation: hadoop login commit
15/06/25 02:10:02 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: hadoop
15/06/25 02:10:02 DEBUG security.UserGroupInformation: UGI loginUser:hadoop (auth:SIMPLE)
15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
15/06/25 02:10:03 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
15/06/25 02:10:03 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@501edcf1
15/06/25 02:10:03 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@16e7dcfd
15/06/25 02:10:04 DEBUG unix.DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$1@7e499e08: starting with interruptCheckPeriodMs = 60000
15/06/25 02:10:04 DEBUG shortcircuit.DomainSocketFactory: Both short-circuit local reads and UNIX domain socket are disabled.
15/06/25 02:10:04 DEBUG ipc.Client: The ping interval is 60000 ms.
15/06/25 02:10:04 DEBUG ipc.Client: Connecting to master001/192.168.75.155:8020
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop sending #0
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop: starting, having connections 1
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop got value #0
15/06/25 02:10:04 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 122ms
text: `/test/data/origz/access.log.gz': No such file or directory
15/06/25 02:10:04 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@16e7dcfd
15/06/25 02:10:04 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@16e7dcfd
15/06/25 02:10:04 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@16e7dcfd
15/06/25 02:10:04 DEBUG ipc.Client: Stopping client
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop: closed
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop: stopped, remaining connections 0
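Besides reading the DEBUG log, Hadoop 2.x ships a built-in checker: hadoop checknative -a prints one line per native component and exits non-zero if anything fails to load. The paths below are what I would expect from a /usr/hadoop install, not captured output:
[hadoop@master001 native]$ hadoop checknative -a
Native library checking:
hadoop:  true /usr/hadoop/lib/native/libhadoop.so.1.0.0
zlib:    true /lib64/libz.so.1
The key line is "hadoop: true"; before the glibc upgrade it would report "hadoop: false".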
==============================
After finishing, restart the cluster:
[hadoop@master001 ~]$ sh /usr/hadoop/sbin/start-dfs.sh
[hadoop@master001 ~]$ sh /usr/hadoop/sbin/start-yarn.sh
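To confirm the daemons actually came up, jps on the master should list the HDFS and YARN master processes. The PIDs below are illustrative, and the exact set depends on how roles are spread across nodes:
[hadoop@master001 ~]$ jps
2481 NameNode
2679 SecondaryNameNode
2837 ResourceManager
3190 Jps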
[hadoop@master001 ~]$ hadoop fs -ls /
[hadoop@master001 ~]$ hadoop fs -mkdir /usr
[hadoop@master001 ~]$ hadoop fs -ls /
Found 1 items
drwxr-xr-x   - hadoop supergroup          0 2015-06-25 02:27 /usr

Reposted from feilong2483.iteye.com/blog/2308633