Compiling and installing Hadoop 2.7.1, and the problems encountered

Build command:
mvn package -Pdist,native,docs -DskipTests -Dtar

It is best to leave docs out of the -P profiles, since building the documentation takes far too long. Just use:

mvn package -Pdist,native -DskipTests -Dtar
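
Before kicking off the build, it can save a failed run to check that the toolchain is already in place. A rough sanity check (the versions are the usual requirements for building Hadoop 2.x from source: JDK 7+, Maven 3, protoc 2.5.0, plus cmake and a C compiler for the native profile):

java -version
mvn -version
protoc --version    # should report libprotoc 2.5.0
cmake --version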

After the build finishes, the binary distribution is under the hadoop-dist directory, similar to the non-src package downloaded from the official site.

To speed up downloads, change the Maven mirror in settings.xml (typically ~/.m2/settings.xml). With mirrorOf set to *, everything was fetched only from the oschina mirror, which is missing some artifacts; switching mirrorOf to central mirrors only Maven Central, so the Apache repositories are no longer routed through it.

<mirror>
    <id>nexus-osc</id>
    <!-- <mirrorOf>*</mirrorOf> -->
    <mirrorOf>central</mirrorOf>
    <name>Nexus osc</name>
    <url>http://maven.oschina.net/content/groups/public/</url>
</mirror>

Problems encountered during the build:

[ERROR] Failed to execute goal on project hadoop-common: Could not resolve dependencies for project org.apache.hadoop:hadoop-common:jar:2.7.1: Could not transfer artifact org.apache.commons:commons-math3:jar:3.1.1 from/to nexus-osc (http://maven.oschina.net/content/groups/public/): GET request of: org/apache/commons/commons-math3/3.1.1/commons-math3-3.1.1.jar from nexus-osc failed: Premature end of Content-Length delimited message body (expected: 1599627; received: 866169 -> [Help 1]

[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.7.1:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]

All sorts of things were missing, so, as the Thrift build instructions suggest, just install the whole development tool group in one go:

yum -y groupinstall "Development Tools"

Ant also needs to be installed: yum install ant

Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/root/hadoop-2.7.1-src/hadoop-common-project/hadoop-common/target/native"): error=2, No such file or directory

findbugs needs to be installed; otherwise:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (site) on project hadoop-common: An Ant BuildException has occured: stylesheet /home/hadoop/hadoop-2.7.1-src/hadoop-common-project/hadoop-common/${env.FINDBUGS_HOME}/src/xsl/default.xsl doesn't exist.

[ERROR] around Ant part …… @ 43:251 in /home/hadoop/hadoop-2.7.1-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml

Then set the environment variable: export FINDBUGS_HOME=/usr/local/findbugs-3.0.0
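
Putting the findbugs step together, a rough sketch, assuming the findbugs-3.0.0 tarball has already been downloaded into the current directory (the /usr/local location is just this article's choice):

tar -xzf findbugs-3.0.0.tar.gz -C /usr/local
export FINDBUGS_HOME=/usr/local/findbugs-3.0.0
export PATH=$FINDBUGS_HOME/bin:$PATH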

cmake also needs to be installed, otherwise hadoop-pipes fails with:

Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec dir="/home/pory/workplace/hadoop-2.4.1-src/hadoop-tools/hadoop-pipes/target/native" executable="cmake" failonerror="true">... @ 5:131 in /home/pory/workplace/hadoop-2.4.1-src/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml
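
On CentOS/RHEL, where this build was done, cmake comes straight from yum:

yum -y install cmake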

Also install zlib-dev and libssl-dev; they may already have been pulled in by the groupinstall above.
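
Those are Debian-style package names; the yum equivalents are roughly:

yum -y install zlib-devel openssl-devel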

For the hadoop-kms failure below, rebuild in a different directory or simply try again a few times; it is caused by the Tomcat download timing out.

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (dist) on project hadoop-kms: An Ant BuildException has occured: exec returned: 2
[ERROR] around Ant part …… @ 10:120 in /home/hadoop/hadoop-2.7.1-src/hadoop-common-project/hadoop-kms/target/antrun/build-main.xml
[ERROR] -> [Help 1]
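
Because this is just a flaky Tomcat download rather than a real build problem, one way to avoid rebuilding everything is to resume the reactor from the KMS module once the network behaves (same flags as before, plus Maven's resume-from switch):

mvn package -Pdist,native -DskipTests -Dtar -rf :hadoop-kms

Once everything is in place, the build runs through to the end: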

main:
[exec] $ tar cf hadoop-2.7.1.tar hadoop-2.7.1
[exec] $ gzip -f hadoop-2.7.1.tar
[exec]
[exec] Hadoop dist tar available at: /root/hadoop-2.7.1-src/hadoop-dist/target/hadoop-2.7.1.tar.gz
[exec]
[INFO] Executed tasks
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
[INFO] Building jar: /root/hadoop-2.7.1-src/hadoop-dist/target/hadoop-dist-2.7.1-javadoc.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main …………………………… SUCCESS [ 2.751 s]
[INFO] Apache Hadoop Project POM ……………………… SUCCESS [ 1.519 s]
[INFO] Apache Hadoop Annotations ……………………… SUCCESS [ 3.809 s]
[INFO] Apache Hadoop Assemblies ……………………… SUCCESS [ 0.166 s]
[INFO] Apache Hadoop Project Dist POM ………………… SUCCESS [ 2.397 s]
[INFO] Apache Hadoop Maven Plugins …………………… SUCCESS [ 3.735 s]
[INFO] Apache Hadoop MiniKDC ………………………… SUCCESS [ 2.503 s]
[INFO] Apache Hadoop Auth …………………………… SUCCESS [ 3.707 s]
[INFO] Apache Hadoop Auth Examples …………………… SUCCESS [ 4.847 s]
[INFO] Apache Hadoop Common …………………………. SUCCESS [02:04 min]
[INFO] Apache Hadoop NFS ……………………………. SUCCESS [ 6.261 s]
[INFO] Apache Hadoop KMS ……………………………. SUCCESS [ 14.043 s]
[INFO] Apache Hadoop Common Project …………………… SUCCESS [ 0.048 s]
[INFO] Apache Hadoop HDFS …………………………… SUCCESS [02:29 min]
[INFO] Apache Hadoop HttpFS …………………………. SUCCESS [ 29.947 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal …………… SUCCESS [ 8.841 s]
[INFO] Apache Hadoop HDFS-NFS ………………………… SUCCESS [ 4.443 s]
[INFO] Apache Hadoop HDFS Project ……………………. SUCCESS [ 0.042 s]
[INFO] hadoop-yarn …………………………………. SUCCESS [ 0.055 s]
[INFO] hadoop-yarn-api ……………………………… SUCCESS [ 39.073 s]
[INFO] hadoop-yarn-common …………………………… SUCCESS [ 32.676 s]
[INFO] hadoop-yarn-server …………………………… SUCCESS [ 0.059 s]
[INFO] hadoop-yarn-server-common ……………………… SUCCESS [ 11.506 s]
[INFO] hadoop-yarn-server-nodemanager ………………… SUCCESS [ 17.067 s]
[INFO] hadoop-yarn-server-web-proxy …………………… SUCCESS [ 3.369 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ……. SUCCESS [ 7.629 s]
[INFO] hadoop-yarn-server-resourcemanager ……………… SUCCESS [ 19.577 s]
[INFO] hadoop-yarn-server-tests ……………………… SUCCESS [ 4.658 s]
[INFO] hadoop-yarn-client …………………………… SUCCESS [ 6.131 s]
[INFO] hadoop-yarn-server-sharedcachemanager …………… SUCCESS [ 3.213 s]
[INFO] hadoop-yarn-applications ……………………… SUCCESS [ 0.035 s]
[INFO] hadoop-yarn-applications-distributedshell ………. SUCCESS [ 2.549 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher …… SUCCESS [ 1.913 s]
[INFO] hadoop-yarn-site ……………………………… SUCCESS [ 0.035 s]
[INFO] hadoop-yarn-registry …………………………. SUCCESS [ 5.228 s]
[INFO] hadoop-yarn-project …………………………… SUCCESS [ 5.745 s]
[INFO] hadoop-mapreduce-client ………………………. SUCCESS [ 0.046 s]
[INFO] hadoop-mapreduce-client-core …………………… SUCCESS [ 23.276 s]
[INFO] hadoop-mapreduce-client-common ………………… SUCCESS [ 18.340 s]
[INFO] hadoop-mapreduce-client-shuffle ………………… SUCCESS [ 3.877 s]
[INFO] hadoop-mapreduce-client-app …………………… SUCCESS [ 7.821 s]
[INFO] hadoop-mapreduce-client-hs ……………………. SUCCESS [ 5.485 s]
[INFO] hadoop-mapreduce-client-jobclient ……………… SUCCESS [ 4.061 s]
[INFO] hadoop-mapreduce-client-hs-plugins ……………… SUCCESS [ 1.787 s]
[INFO] Apache Hadoop MapReduce Examples ………………. SUCCESS [ 5.106 s]
[INFO] hadoop-mapreduce ……………………………… SUCCESS [ 3.268 s]
[INFO] Apache Hadoop MapReduce Streaming ……………… SUCCESS [ 4.292 s]
[INFO] Apache Hadoop Distributed Copy ………………… SUCCESS [ 8.772 s]
[INFO] Apache Hadoop Archives ………………………… SUCCESS [ 2.479 s]
[INFO] Apache Hadoop Rumen …………………………… SUCCESS [ 6.148 s]
[INFO] Apache Hadoop Gridmix ………………………… SUCCESS [ 3.935 s]
[INFO] Apache Hadoop Data Join ………………………. SUCCESS [ 2.535 s]
[INFO] Apache Hadoop Ant Tasks ………………………. SUCCESS [ 2.101 s]
[INFO] Apache Hadoop Extras …………………………. SUCCESS [ 2.971 s]
[INFO] Apache Hadoop Pipes …………………………… SUCCESS [ 9.334 s]
[INFO] Apache Hadoop OpenStack support ………………… SUCCESS [ 4.874 s]
[INFO] Apache Hadoop Amazon Web Services support ………. SUCCESS [01:30 min]
[INFO] Apache Hadoop Azure support …………………… SUCCESS [ 10.201 s]
[INFO] Apache Hadoop Client …………………………. SUCCESS [ 8.911 s]
[INFO] Apache Hadoop Mini-Cluster ……………………. SUCCESS [ 0.078 s]
[INFO] Apache Hadoop Scheduler Load Simulator …………. SUCCESS [ 4.249 s]
[INFO] Apache Hadoop Tools Dist ……………………… SUCCESS [ 11.546 s]
[INFO] Apache Hadoop Tools …………………………… SUCCESS [ 0.020 s]
[INFO] Apache Hadoop Distribution ……………………. SUCCESS [ 47.111 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:40 min
[INFO] Finished at: 2015-09-13T02:09:30+08:00
[INFO] Final Memory: 125M/429M

[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.2.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-common

Alternatively, a version-mismatch error may appear: building hadoop-2.2.0 requires protoc 2.5.0 (the same version Hadoop 2.7.1 needs).

Solution: install protobuf.

The protobuf installation process:

Download: https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz

After extracting, run the following commands in order:

./configure

make

make check

make install
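
If protoc later fails with a missing libprotoc shared library, refreshing the linker cache after make install usually fixes it (assuming the default /usr/local prefix, and that /usr/local/lib is listed in /etc/ld.so.conf or a file under /etc/ld.so.conf.d/):

sudo ldconfig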

Configure the environment variables:

export PROTOC_HOME=/home/hadoop/software/protobuf-2.5.0
export PATH=$PROTOC_HOME/src:$PATH

[root@hadoop01 protobuf-2.5.0]# protoc --version

libprotoc 2.5.0

[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/usr/local/hadoop-2.2.0-src/hadoop-common-project/hadoop-common/target/native"): java.io.IOException: error=2, No such file or directory -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-common

Solution: install cmake:

sudo apt-get install cmake

[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-common: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-common

Solution: install zlib-devel (the Ubuntu package is zlib1g-dev):

sudo apt-get install zlib1g-dev

[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-pipes

Solution: install openssl-devel. If apt-get cannot install openssl-devel, run:

sudo apt-get install openssl
sudo apt-get install libssl-dev

The distribution package you want ends up under hadoop-dist/target/.
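
A quick way to sanity-check the result; the install location here is arbitrary, and hadoop checknative reports whether the native libraries built by the -Pnative profile were actually picked up:

tar -xzf hadoop-dist/target/hadoop-2.7.1.tar.gz -C /opt
/opt/hadoop-2.7.1/bin/hadoop version
/opt/hadoop-2.7.1/bin/hadoop checknative -a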

Reposted from blog.csdn.net/weixin_43654136/article/details/84964631