Copyright notice: this is an original article by the author and may not be reposted without permission. https://blog.csdn.net/qq_35744460/article/details/89039838
Compiling hadoop-2.6.0-cdh5.7.0 with native and compression support
Environment setup
Baidu Cloud link for the required software:
Link: https://pan.baidu.com/s/10QWQsdKi4wNHoCmV_dA39Q
Extraction code: bmic
Build requirements from BUILDING.txt:
Requirements:
* Unix System
* JDK 1.7+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel ( if compiling native hadoop-pipes )
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
Building distributions:
Create binary distribution without native code and without documentation:
$ mvn package -Pdist -DskipTests -Dtar
Create binary distribution with native code and with documentation:
$ mvn package -Pdist,native,docs -DskipTests -Dtar
Create source distribution:
$ mvn package -Psrc -DskipTests
Create source and binary distributions with native code and documentation:
$ mvn package -Pdist,native,docs,src -DskipTests -Dtar
Create a local staging version of the website (in /tmp/hadoop-site)
$ mvn clean site; mvn site:stage -DstagingDirectory=/tmp/hadoop-site
Installing Java
Use JDK 1.7. Although BUILDING.txt says 1.7+, the build fails with errors under JDK 1.8.
tar -zxvf jdk1.7.0_45.tar.gz -C /usr/java/
vi ~/.bash_profile
export JAVA_HOME=/usr/java/jdk1.7.0_45
export PATH=$JAVA_HOME/bin:$PATH
[hadoop@hadoop001 java]$ source ~/.bash_profile
[hadoop@hadoop001 home]$ java -version
java version "1.7.0_45"
Java(TM) SE Runtime Environment (build 1.7.0_45-b18)
Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)
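Because a stray JDK that differs from JAVA_HOME is a common cause of confusing build failures, it can be worth verifying that the `java` on PATH is the one under JAVA_HOME. A minimal sketch (the helper name is my own, not part of the original steps):

```shell
# Hypothetical sanity check: confirm the java resolved via PATH
# is the one under $JAVA_HOME before starting the build.
check_java_home() {
  if [ "$(command -v java)" = "$JAVA_HOME/bin/java" ]; then
    echo "OK"
  else
    echo "WARN: PATH resolves java to $(command -v java), expected $JAVA_HOME/bin/java"
  fi
}
```

Run `check_java_home` after sourcing ~/.bash_profile; a WARN here usually means another JDK earlier on PATH is shadowing the one you configured.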
Installing Maven
[root@hadoop001 app]# yum install -y unzip zip
[root@hadoop001 app]# cd /usr
[root@hadoop001 usr]# mkdir maven
[root@hadoop001 app]# unzip -d /usr/maven/ apache-maven-3.3.9-bin.zip
**Maven environment variables** (append to ~/.bash_profile):
export MAVEN_HOME=/usr/maven/apache-maven-3.3.9
export PATH=$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
[hadoop@hadoop001 java]$ source ~/.bash_profile
[hadoop@hadoop001 java]$ mvn -v
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-10T16:41:47+00:00)
Maven home: /usr/maven/apache-maven-3.3.9
Java version: 1.8.0_45, vendor: Oracle Corporation
Java home: /usr/java/jdk1.8.0_45/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.10.0-514.el7.x86_64", arch: "amd64", family: "unix"
Edit conf/settings.xml to switch the mirror to Aliyun, and configure the local repository path:
<localRepository>/usr/maven/repo</localRepository>
<mirrors>
<mirror>
<id>alimaven</id>
<name>aliyun maven</name>
<url>http://maven.aliyun.com/nexus/content/groups/public/</url>
<mirrorOf>central</mirrorOf>
</mirror>
</mirrors>
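To confirm the mirror edit actually landed in settings.xml, a quick grep helps; this helper is my own sketch, not from the original post:

```shell
# Hypothetical helper: verify settings.xml contains the Aliyun mirror URL.
check_mirror() {  # usage: check_mirror /path/to/settings.xml
  if grep -q 'maven.aliyun.com' "$1"; then
    echo "aliyun mirror configured"
  else
    echo "aliyun mirror missing"
  fi
}
```

For the layout above: `check_mirror /usr/maven/apache-maven-3.3.9/conf/settings.xml`.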
Installing build dependencies
[root@hadoop001 ~]# yum install -y gcc gcc-c++ make cmake
[root@hadoop001 ~]# yum -y install autoconf automake libtool curl
[root@hadoop001 ~]# yum install -y openssl openssl-devel svn ncurses-devel zlib-devel libtool
[root@hadoop001 ~]# yum install -y snappy snappy-devel bzip2 bzip2-devel lzo lzo-devel lzop autoconf automake
Installing Protocol Buffers 2.5.0
[root@hadoop001 app]# tar -zxvf protobuf-2.5.0.tar.gz -C /home/hadoop/lib/
Specify the install prefix; you do not need to create the directory yourself, it is created automatically during installation:
[root@hadoop001 protobuf-2.5.0]# ./configure --prefix=/usr/protobuf
[root@hadoop001 protobuf-2.5.0]# make && make install
Append to ~/.bash_profile:
export MAVEN_HOME=/usr/maven/apache-maven-3.3.9
export PROTOBUF_HOME=/usr/protobuf
export PATH=$PROTOBUF_HOME/bin:$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
[hadoop@hadoop001 java]$ protoc --version
libprotoc 2.5.0
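The Hadoop build checks for protoc 2.5.0 exactly and aborts on a mismatch, so it pays to verify before kicking off the long build. A small guard (the helper is my own sketch):

```shell
# Hypothetical guard: confirm protoc reports exactly the required version.
check_protoc_version() {  # usage: check_protoc_version 2.5.0
  actual="$(protoc --version 2>/dev/null | awk '{print $2}')"
  if [ "$actual" = "$1" ]; then
    echo "protoc $actual OK"
  else
    echo "protoc mismatch: found '$actual', need $1"
  fi
}
```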
Installing Findbugs
[root@hadoop001 app]# unzip -d /home/hadoop/lib/ findbugs-1.3.9.zip
[hadoop@hadoop001 java]$ vi ~/.bash_profile
export FINDBUGS_HOME=/home/hadoop/lib/findbugs-1.3.9
export PATH=$FINDBUGS_HOME/bin:$PROTOBUF_HOME/bin:$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
[hadoop@hadoop001 java]$ source ~/.bash_profile
[hadoop@hadoop001 java]$ findbugs -version
1.3.9
Compiling
Unpack the source tarball:
[root@hadoop001 app]# tar -zxvf hadoop-2.6.0-cdh5.7.0-src.tar.gz -C /home/hadoop/source/
Start the build (the -X flag turns on Maven's debug output):
[hadoop@hadoop001 hadoop-2.6.0-cdh5.7.0]$ mvn -X clean package -Pdist,native -DskipTests -Dtar
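Since the build runs for a long time, it helps to keep a log, e.g. by appending `2>&1 | tee build.log` to the mvn command, and then pull the verdict out afterwards. The helper below is my own sketch (the log file name is an assumption):

```shell
# Hypothetical helper: extract the first BUILD SUCCESS/FAILURE line
# from a saved Maven build log.
build_result() {  # usage: build_result build.log
  grep -m1 -E 'BUILD (SUCCESS|FAILURE)' "$1" || echo "no result line found"
}
```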
The build takes quite a while; be patient. Once it succeeds, inspect the build output:
[hadoop@hadoop001 hadoop-dist]$ cd target/
[hadoop@hadoop001 target]$ ll
total 565848
drwxrwxr-x. 2 hadoop hadoop 28 Apr 5 01:17 antrun
drwxrwxr-x. 3 hadoop hadoop 22 Apr 5 01:17 classes
-rw-rw-r--. 1 hadoop hadoop 1998 Apr 5 01:17 dist-layout-stitching.sh
-rw-rw-r--. 1 hadoop hadoop 690 Apr 5 01:17 dist-tar-stitching.sh
drwxrwxr-x. 9 hadoop hadoop 149 Apr 5 01:17 hadoop-2.6.0-cdh5.7.0
-rw-rw-r--. 1 hadoop hadoop 192393404 Apr 5 01:17 hadoop-2.6.0-cdh5.7.0.tar.gz
-rw-rw-r--. 1 hadoop hadoop 7314 Apr 5 01:17 hadoop-dist-2.6.0-cdh5.7.0.jar
-rw-rw-r--. 1 hadoop hadoop 386994546 Apr 5 01:18 hadoop-dist-2.6.0-cdh5.7.0-javadoc.jar
-rw-rw-r--. 1 hadoop hadoop 4856 Apr 5 01:17 hadoop-dist-2.6.0-cdh5.7.0-sources.jar
-rw-rw-r--. 1 hadoop hadoop 4856 Apr 5 01:17 hadoop-dist-2.6.0-cdh5.7.0-test-sources.jar
drwxrwxr-x. 2 hadoop hadoop 51 Apr 5 01:17 javadoc-bundle-options
drwxrwxr-x. 2 hadoop hadoop 28 Apr 5 01:17 maven-archiver
drwxrwxr-x. 3 hadoop hadoop 22 Apr 5 01:17 maven-shared-archive-resources
drwxrwxr-x. 3 hadoop hadoop 22 Apr 5 01:17 test-classes
drwxrwxr-x. 2 hadoop hadoop 6 Apr 5 01:17 test-dir
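Before deploying the tarball anywhere, it is worth a quick check that it really ships the native library. This helper is my own sketch; the tarball name comes from the listing above:

```shell
# Hypothetical check: does the built distribution tarball contain
# the native libhadoop shared library?
check_native_packaged() {  # usage: check_native_packaged hadoop-2.6.0-cdh5.7.0.tar.gz
  if tar -tzf "$1" | grep -q 'lib/native/libhadoop\.so'; then
    echo "native lib packaged"
  else
    echo "native lib missing"
  fi
}
```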
Verifying native support
Before compiling, a check shows no compression support:
[hadoop@hadoop001 bin]$ ./hadoop checknative -a
19/04/02 23:23:38 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Native library checking:
hadoop: false
zlib: false
snappy: false
lz4: false
bzip2: false
openssl: false
19/04/02 23:23:39 INFO util.ExitUtil: Exiting with status 1
After compiling:
[hadoop@hadoop001 bin]$ ./hadoop checknative -a
19/04/05 07:58:50 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library system-native
19/04/05 07:58:50 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/lib/native/libhadoop.so.1.0.0
zlib: true /lib64/libz.so.1
snappy: true /lib64/libsnappy.so.1
lz4: true revision:99
bzip2: true /lib64/libbz2.so.1
openssl: true /lib64/libcrypto.so
Every check reports true: the build supports native libraries and compression.
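The checknative output can also be verified mechanically rather than by eye. A sketch of my own (not part of the original post) that reads the output on stdin and flags any library still reporting false:

```shell
# Hypothetical filter: pipe `hadoop checknative -a` output through this;
# it flags any line of the form "name: false".
check_all_native() {  # usage: ./hadoop checknative -a 2>&1 | check_all_native
  if grep -Eq ':[[:space:]]+false'; then
    echo "some native libraries missing"
  else
    echo "all native libraries available"
  fi
}
```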
Summary
A few small pitfalls hit during environment setup and compilation:
- JDK version: 1.7 is required
- Source tree permissions
I built as the hadoop user, but the unpacked source was owned by root:
drwxrwxr-x. 17 root root 4096 Apr 5 00:19 hadoop-2.6.0-cdh5.7.0
so the build could not create the target directories; fix the ownership:
chown -R hadoop:hadoop hadoop-2.6.0-cdh5.7.0/
The next post will install Hadoop from this build; article link: hadoop pseudo-distributed installation