Parallel Computing Assignment 2: Installing and Configuring Hadoop 2.7.3 on Ubuntu 16.04

The system is Ubuntu 16.04 running in a VM.

I. Java environment setup (covered in my other posts)

II. Install openssh-server and set up passwordless login

1. Install openssh-server

sudo apt-get install openssh-server

2. Start the ssh service

sudo /etc/init.d/ssh start

3. Set up passwordless login

cd ~/.ssh   (if this directory does not exist yet, run ssh localhost once first)
ssh-keygen -t rsa   (press Enter at every prompt to accept the defaults until the RSA key pair is generated)

4. Append the public key to authorized_keys

cat ./id_rsa.pub >> ./authorized_keys

5. Test passwordless login

ssh localhost
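Steps 3–5 can also be scripted without any prompts. A minimal sketch, run here against a scratch directory (/tmp/ssh-demo is a hypothetical path so nothing in the real ~/.ssh is touched; substitute $HOME/.ssh to apply it for real):

```shell
# Non-interactive version of steps 3-5, demonstrated in a scratch directory.
# Replace /tmp/ssh-demo with "$HOME/.ssh" to set up real passwordless login.
KEYDIR=/tmp/ssh-demo
rm -rf "$KEYDIR" && mkdir -p "$KEYDIR"
ssh-keygen -t rsa -N "" -f "$KEYDIR/id_rsa" -q      # -N "" = empty passphrase, -q = no prompts
cat "$KEYDIR/id_rsa.pub" >> "$KEYDIR/authorized_keys"
chmod 700 "$KEYDIR" && chmod 600 "$KEYDIR/authorized_keys"
ssh-keygen -lf "$KEYDIR/id_rsa.pub"                 # print the new key's fingerprint
```

sshd requires the 700/600 permissions shown above; looser permissions on ~/.ssh or authorized_keys are a common reason passwordless login silently fails.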

6. Disable the firewall

sudo ufw disable

III. Install and Run Hadoop

1. Download the Hadoop tarball from the official site; this guide uses the following version:

hadoop-2.7.3.tar.gz
Drag it onto the virtual machine's desktop.

2. Extract it under the /usr/local/hadoop directory

cd /usr/local
sudo mkdir hadoop
sudo chown -R $USER:$USER /usr/local/hadoop   (take ownership so the following commands do not need sudo)
cd ~/桌面   (the Desktop directory; ~/Desktop on an English-locale system)
mv hadoop-2.7.3.tar.gz /usr/local/hadoop
cd /usr/local/hadoop
tar -zxvf hadoop-2.7.3.tar.gz
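The tar flags used above can be tried out safely on a throwaway archive first (the /tmp/tar-demo path below is purely illustrative):

```shell
# Demonstrates the same tar flags on a scratch archive instead of the real Hadoop tarball.
rm -rf /tmp/tar-demo && mkdir -p /tmp/tar-demo/src
echo "hello" > /tmp/tar-demo/src/file.txt
tar -czf /tmp/tar-demo/demo.tar.gz -C /tmp/tar-demo src   # -c create, -z gzip, -f archive file
tar -tzf /tmp/tar-demo/demo.tar.gz                        # -t list contents without extracting
tar -zxvf /tmp/tar-demo/demo.tar.gz -C /tmp/tar-demo      # -z gunzip, -x extract, -v verbose
cat /tmp/tar-demo/src/file.txt
```

Listing an archive with -t before extracting is also a quick way to confirm the download is not corrupt.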

3. Edit the .bashrc file

gedit ~/.bashrc

// append the following at the end
#HADOOP VARIABLES START  
export JAVA_HOME=/usr/local/java/jvm/jdk1.8.0_162  
export HADOOP_INSTALL=/usr/local/hadoop/hadoop-2.7.3
export PATH=$PATH:$HADOOP_INSTALL/bin  
export PATH=$PATH:$HADOOP_INSTALL/sbin  
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL  
export HADOOP_COMMON_HOME=$HADOOP_INSTALL  
export HADOOP_HDFS_HOME=$HADOOP_INSTALL  
export YARN_HOME=$HADOOP_INSTALL  
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native  
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"  
#HADOOP VARIABLES END  

// save and close, then run the following so the environment variables take effect immediately
source ~/.bashrc

4. Configure Hadoop

(1) Edit the core-site.xml configuration file
cd /usr/local/hadoop/hadoop-2.7.3/etc/hadoop
sudo gedit core-site.xml

// change the contents to the following:
<configuration>
        <property>
             <name>hadoop.tmp.dir</name>
             <value>file:/usr/local/hadoop/hadoop-2.7.3/tmp</value>
             <description>Abase for other temporary directories.</description>
        </property>
        <property>
             <name>fs.default.name</name>
             <value>hdfs://localhost:9000</value>
        </property>
</configuration>
(2) Edit the hdfs-site.xml configuration file
sudo gedit hdfs-site.xml
// change the contents to the following:
<configuration>
        <property>
             <name>dfs.replication</name>
             <value>1</value>
        </property>
        <property>
             <name>dfs.namenode.name.dir</name>
             <value>file:/usr/local/hadoop/hadoop-2.7.3/tmp/dfs/name</value>
        </property>
        <property>
             <name>dfs.datanode.data.dir</name>
             <value>file:/usr/local/hadoop/hadoop-2.7.3/tmp/dfs/data</value>
        </property>
</configuration>
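Typos in these XML files are a common cause of startup failures. A quick well-formedness check, sketched below against an inline sample written to /tmp (a hypothetical path); in practice, point the last command at the real files under /usr/local/hadoop/hadoop-2.7.3/etc/hadoop:

```shell
# Check a Hadoop config file for XML well-formedness using Python's stdlib parser
# (xmllint is not installed by default on Ubuntu). Demonstrated on a sample file.
cat > /tmp/core-site-sample.xml <<'EOF'
<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
EOF
python3 -c 'import sys, xml.etree.ElementTree as ET; ET.parse(sys.argv[1]); print("well-formed:", sys.argv[1])' /tmp/core-site-sample.xml
```

If the file is malformed, ET.parse raises a ParseError naming the line and column, which is much easier to act on than a NameNode stack trace.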
(3) Set JAVA_HOME
sudo gedit hadoop-env.sh

// change the contents as follows:
# The java implementation to use.  
export JAVA_HOME=/usr/local/java/jvm/jdk1.8.0_162 
export HADOOP=/usr/local/hadoop/hadoop-2.7.3  
export PATH=$PATH:/usr/local/hadoop/hadoop-2.7.3/bin

5. Run and test

(1) Format the NameNode before the first start
hdfs namenode -format
(2) Start Hadoop
start-all.sh
(3) Test
jps
// six processes listed here (NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager, and Jps itself) means the installation succeeded
Note: if jps does not show NameNode, stop Hadoop, delete the tmp directory, recreate it, re-format, and restart:
stop-all.sh
rm -rf /usr/local/hadoop/hadoop-2.7.3/tmp
mkdir /usr/local/hadoop/hadoop-2.7.3/tmp
hdfs namenode -format
start-all.sh

Reprinted from blog.csdn.net/daihanglai7622/article/details/84757962