Installing and configuring Hadoop on Ubuntu: config files and problems encountered

My setup:

Paths:

hadoop /home/flyuz/hadoop

jdk1.8.0_172 /opt/java/jdk1.8.0_172

eclipse /opt/eclipse

 

Versions:

Ubuntu 16.04

Hadoop 2.7.6

Eclipse Oxygen.3a

JDK 1.8.0_172

Eclipse Hadoop plugin 2.7.1

 

Environment files:

/etc/environment

# Do not delete the existing PATH entries.

PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:$JAVA_HOME/bin"

export CLASSPATH=.:$JAVA_HOME/lib:$JAVA_HOME/jre/lib

export JAVA_HOME=/opt/java/jdk1.8.0_172

(Note: /etc/environment is read by pam_env rather than by a shell, so `export` and `$JAVA_HOME` expansion may not take effect there; the shell-level exports in /etc/profile and ~/.bashrc below are what reliably apply.)

 

/etc/profile

# set java -- do not delete the existing PATH

export JAVA_HOME=/opt/java/jdk1.8.0_172

export JRE_HOME=/opt/java/jdk1.8.0_172/jre

export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib:$JRE_HOME/lib

export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin

#set hadoop

export HADOOP_HOME=/home/flyuz/hadoop
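One caveat with `export PATH=$PATH:...` lines: every time the profile is re-sourced, the same directories are appended again and PATH accumulates duplicates. A small guard helper (hypothetical, not part of the original setup) keeps the profile idempotent:

```shell
# path_append DIR -- append DIR to PATH only if it is not already there.
path_append() {
  case ":$PATH:" in
    *":$1:"*) ;;                  # already present: do nothing
    *) PATH="$PATH:$1" ;;
  esac
}

# Example with the JDK path used in this post:
path_append /opt/java/jdk1.8.0_172/bin
```

Calling `path_append` a second time with the same directory is a no-op, so re-sourcing the profile no longer grows PATH.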

 

~/.bashrc

#set java

export JAVA_HOME=/opt/java/jdk1.8.0_172

# set hadoop

export HADOOP_INSTALL=/home/flyuz/hadoop

export PATH=$PATH:$HADOOP_INSTALL/bin

export PATH=$PATH:$HADOOP_INSTALL/sbin

export HADOOP_MAPRED_HOME=$HADOOP_INSTALL

export HADOOP_COMMON_HOME=$HADOOP_INSTALL

export HADOOP_HDFS_HOME=$HADOOP_INSTALL

export YARN_HOME=$HADOOP_INSTALL
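After editing ~/.bashrc, reload it in the current shell and confirm the binaries resolve. The `on_path` helper below is a hypothetical convenience, not something from this setup:

```shell
# on_path CMD -- succeed if CMD resolves on the current PATH.
on_path() { command -v "$1" >/dev/null 2>&1; }

# Reload the updated profile in the current shell.
[ -f ~/.bashrc ] && . ~/.bashrc

# With this setup, both java and hadoop should now be found:
if on_path java && on_path hadoop; then
  echo "java and hadoop are on PATH"
else
  echo "something is missing from PATH" >&2
fi
```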

 

Problems encountered:

The DataNode would not start. Cause: the NameNode was formatted too many times, so the NameNode's clusterID and the DataNode's clusterID no longer match.

Solution:

Under /home/flyuz/hadoop/tmp/dfs, change the clusterID in the VERSION file under data so that it matches the one in the VERSION file under name.
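The fix can be scripted. `fix_cluster_id` below is a hypothetical helper: it reads the clusterID from the NameNode's VERSION file and writes it into the DataNode's (the VERSION files normally sit in a `current/` subdirectory on each side):

```shell
# fix_cluster_id NAME_VERSION DATA_VERSION
# Copy the clusterID from the NameNode's VERSION file into the DataNode's.
fix_cluster_id() {
  cid=$(grep '^clusterID=' "$1" | cut -d= -f2)
  [ -n "$cid" ] || { echo "no clusterID found in $1" >&2; return 1; }
  sed -i "s/^clusterID=.*/clusterID=$cid/" "$2"
}

# With the paths used in this post:
# fix_cluster_id /home/flyuz/hadoop/tmp/dfs/name/current/VERSION \
#                /home/flyuz/hadoop/tmp/dfs/data/current/VERSION
```

Restart the daemons afterwards so the DataNode re-registers with the matching clusterID.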

 

Error when running in Eclipse: log4j (typically the "no appenders could be found" warning).

Solution: create a text file named log4j.properties in the project's src directory with the following contents:

log4j.rootLogger=INFO, stdout

log4j.appender.stdout=org.apache.log4j.ConsoleAppender

log4j.appender.stdout.layout=org.apache.log4j.PatternLayout

log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - %m%n

log4j.appender.logfile=org.apache.log4j.FileAppender

log4j.appender.logfile.File=target/spring.log

log4j.appender.logfile.layout=org.apache.log4j.PatternLayout

log4j.appender.logfile.layout.ConversionPattern=%d %p [%c] - %m%n

 

After configuration:

cd ~/hadoop/sbin/

./start-all.sh to start everything (in Hadoop 2.x the start/stop scripts live under sbin, not bin)

jps to check what came up; there should be six processes:

SecondaryNameNode
Jps
NameNode
DataNode
ResourceManager
NodeManager
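A quick way to see which of the expected daemons failed to come up is to filter the `jps` output. `missing_daemons` is a hypothetical helper that prints the name of every expected daemon it does not find:

```shell
# missing_daemons -- read `jps` output on stdin and print each expected
# Hadoop daemon that does not appear in it.
missing_daemons() {
  out=$(cat)
  for d in NameNode DataNode SecondaryNameNode ResourceManager NodeManager; do
    # -w matches whole words, so "NameNode" does not match "SecondaryNameNode".
    echo "$out" | grep -qw "$d" || echo "$d"
  done
}

# Usage: jps | missing_daemons   (prints nothing when everything is running)
```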


Reposted from www.cnblogs.com/flyuz/p/9105342.html