hadoop 2.6 + hbase 1.0 install guide
environment
System: Ubuntu 14.04
hadoop version: 2.6.0
hbase version: 1.0
jdk version: 1.8
Downloads: available from the Apache mirrors.
JDK installation is not covered here; let's start with the hadoop configuration.
hadoop installation
1. Installation location: /opt
2. Create hadoop user group sudo addgroup hadoop
3. Create hadoop user sudo adduser -ingroup hadoop hadoop
4. Give the hadoop user sudo rights: sudo vim /etc/sudoers and, under the line
root ALL=(ALL:ALL) ALL
add:
hadoop ALL=(ALL:ALL) ALL
5. Install ssh sudo apt-get install ssh openssh-server
6. Set up passwordless ssh login
su - hadoop
ssh-keygen -t rsa -P ""
cd ~/.ssh
cat id_rsa.pub >> authorized_keys
Test it with ssh localhost
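The plain `cat >>` in step 6 appends a duplicate entry every time it is run. A guarded version is safe to re-run; here it is sketched against scratch files so it can be tried anywhere (on the real machine the files are ~/.ssh/id_rsa.pub and ~/.ssh/authorized_keys, and the key line is your actual public key, not the fake one used here):

```shell
# Append the public key only if it is not already present.
# A temp dir and a fake key stand in for ~/.ssh in this sketch.
dir=$(mktemp -d)
echo "ssh-rsa AAAAB3NzaFAKEKEY hadoop@localhost" > "$dir/id_rsa.pub"
touch "$dir/authorized_keys"
for i in 1 2; do   # run the guarded append twice
  grep -qxF "$(cat "$dir/id_rsa.pub")" "$dir/authorized_keys" \
    || cat "$dir/id_rsa.pub" >> "$dir/authorized_keys"
done
wc -l < "$dir/authorized_keys"   # prints 1: the second run added nothing
rm -rf "$dir"
```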
7. Unzip hadoop
tar -zxvf hadoop-2.6.0.tar.gz
sudo mv hadoop-2.6.0 /opt/hadoop
sudo chmod -R 775 /opt/hadoop
sudo chown -R hadoop:hadoop /opt/hadoop
8. Configure environment variables: sudo vim ~/.bashrc and append the following at the end
#HADOOP VARIABLES START
export JAVA_HOME=/opt/jdk1.8.0
export HADOOP_HOME=/opt/hadoop
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
#HADOOP VARIABLES END
export HBASE_HOME=/opt/hbase
export PATH=$PATH:$HBASE_HOME/bin
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib:${HADOOP_HOME}/share/hadoop/common/lib:${HBASE_HOME}/lib
source ~/.bashrc
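A quick sanity check after sourcing ~/.bashrc is to confirm each variable points at a real directory. The exports below mirror this guide's paths so the snippet is self-contained; once .bashrc has been sourced they are already set, and you can drop the two export lines:

```shell
# Verify that the key variables resolve to existing directories.
# Paths match this guide; adjust if your JDK or hadoop lives elsewhere.
export JAVA_HOME=/opt/jdk1.8.0
export HADOOP_HOME=/opt/hadoop
for d in "$JAVA_HOME" "$HADOOP_HOME"; do
  if [ -d "$d" ]; then echo "ok: $d"; else echo "missing: $d"; fi
done
```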
9. Modify hadoop-env.sh
Change JAVA_HOME to the JDK installation directory, here /opt/jdk1.8.0.
10. Modify core-site.xml
<configuration>
<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>
<property>
<name>hadoop.tmp.dir</name>
<value>/opt/hadoop/tmp</value>
</property>
</configuration>
11. Copy mapred-site.xml.template to mapred-site.xml (cp mapred-site.xml.template mapred-site.xml) and modify it:
<configuration>
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
</configuration>
12. Modify yarn-site.xml
<configuration>
<!-- Site specific YARN configuration properties -->
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
</configuration>
13. Modify hdfs-site.xml
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>/opt/hadoop/dfs/name</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>/opt/hadoop/dfs/data</value>
</property>
<property>
<name>dfs.permissions</name>
<value>false</value>
</property>
</configuration>
14. Modify masters and slaves
The masters file does not exist by default, so create it yourself; put localhost in both files.
15. Create the temporary and data directories
cd /opt/hadoop
mkdir tmp dfs dfs/name dfs/data
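The same layout can be rehearsed under a scratch root first to see exactly what is created (using mkdir -p, which also makes the command safe to re-run); on the real machine the root is /opt/hadoop:

```shell
# Build the tmp/ and dfs/{name,data} layout under a throwaway root.
root=$(mktemp -d)
mkdir -p "$root/tmp" "$root/dfs/name" "$root/dfs/data"
find "$root" -mindepth 1 -type d | sort | sed "s|$root/||"
# prints: dfs, dfs/data, dfs/name, tmp (one per line)
rm -rf "$root"
```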
16. Initialize HDFS: hdfs namenode -format
17. Start hadoop
start-dfs.sh
start-yarn.sh
hbase installation
Installing hbase is relatively simple; it mostly amounts to integrating it with hadoop.
1. Unpack
tar -zxvf hbase-1.0.0-bin.tar.gz
sudo mv hbase-1.0.0 /opt/hbase
cd /opt
sudo chmod -R 775 hbase
sudo chown -R hadoop:hadoop hbase
2. Edit the environment settings: sudo vim /opt/hbase/conf/hbase-env.sh
Set JAVA_HOME to the JDK installation directory, here /opt/jdk1.8.0.
3. Modify hbase-site.xml
Add:
<configuration>
<property>
<name>hbase.rootdir</name>
<value>hdfs://localhost:9000/hbase</value>
</property>
<property>
<name>hbase.cluster.distributed</name>
<value>true</value>
</property>
</configuration>
4. Start hbase: start-hbase.sh
5. Enter the hbase shell: hbase shell
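A few commands to try once the shell is up, as a smoke test that hbase can write to HDFS (the table name 'test' and column family 'cf' are arbitrary examples, not anything the guide requires):

```
create 'test', 'cf'
put 'test', 'row1', 'cf:a', 'value1'
scan 'test'
disable 'test'
drop 'test'
```

If scan shows the row you just put, the hbase.rootdir in step 3 is working.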
6. Check the processes
Run jps; you should see 9 processes in total, as follows:
3616 NodeManager
3008 NameNode
6945 HQuorumPeer
7010 HMaster
3302 SecondaryNameNode
3128 DataNode
7128 HRegionServer
3496 ResourceManager
7209 Jps
The process IDs will not necessarily match these.
At this point both hadoop and hbase are installed. Note that this is a single-machine setup, or rather a pseudo-distributed configuration; I hope it is helpful.
Reference blog: http://blog.csdn.net/xanxus46/article/details/45133977
hbase learning blog: http://www.cnblogs.com/panfeng412/category/316094.html