HBase HA Installation

-1. Goal:

10.156.50.35  hmaster
10.156.50.36  hmaster
10.156.50.37  hregionserver

 

0. Preparation: Hadoop servers

10.156.50.35 yanfabu2-35.base.app.dev.yf zk1  hadoop1 master1 master
10.156.50.36 yanfabu2-36.base.app.dev.yf zk2  hadoop2 master2
10.156.50.37 yanfabu2-37.base.app.dev.yf zk3  hadoop3 slaver1

 

1. Preparation: NTP server

 

yum install ntp -y
chkconfig ntpd on

vi /etc/ntp.conf
Server configuration (on 172.23.27.120):
	# Allow hosts on this subnet to query this NTP server
	restrict 172.23.27.0 mask 255.255.255.0 nomodify notrap
	# Use the local clock as the time source
	# (127.127.1.0 is ntpd's local-clock pseudo address)
	server 127.127.1.0
	fudge 127.127.1.0 stratum 10
Client configuration (e.g. on 172.23.27.115):
	# Upstream time source: the NTP server above
	server 172.23.27.120
	restrict 172.23.27.120 nomodify notrap noquery
	# Fall back to the local clock when the upstream is unreachable
	server 127.127.1.0
	fudge 127.127.1.0 stratum 10

Run
Server side:
	service ntpd start
	service ntpd stop
	ntpstat
Client side:
	ntpdate -u 172.23.27.120
	service ntpd start
	ntpstat
Check:
	watch ntpq -p
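A quick way to confirm the nodes are actually syncing is to parse the offset column of `ntpq -p` and flag peers that have drifted past a threshold. A minimal sketch; the sample `ntpq -p` output below is illustrative, not captured from this cluster:

```shell
#!/bin/sh
# Flag peers whose clock offset (milliseconds) exceeds a threshold.
check_offsets() {
    threshold_ms=$1
    # Skip the two header lines; column 9 of `ntpq -p` is the offset in ms.
    awk -v limit="$threshold_ms" 'NR > 2 {
        off = $9
        if (off < 0) off = -off
        status = (off + 0 > limit + 0) ? "DRIFTING" : "OK"
        print $1, off, status
    }'
}

# Illustrative sample of `ntpq -p` output (not from the cluster above).
sample='     remote           refid      st t when poll reach   delay   offset  jitter
==============================================================================
*172.23.27.120   .LOCL.          10 u   34   64  377    0.215    1.250   0.051
 127.127.1.0     .LOCL.          10 l   30   64  377    0.000  120.500   0.002'

printf '%s\n' "$sample" | check_offsets 100
```

On a live node, pipe the real output instead: `ntpq -p | check_offsets 100`.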

 

2. Install HBase

2.0 Edit ~/.bash_profile

 

vim ~/.bash_profile

export HBASE_HOME=/home/zkkafka/hbase
export PATH=$HBASE_HOME/bin:$PATH

source ~/.bash_profile
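Since this edit is repeated on every node, a guard keeps re-runs from appending duplicate lines. A sketch; the temp file stands in for `~/.bash_profile` so it is side-effect free (on a real node, pass the actual profile path):

```shell
#!/bin/sh
# Append the HBase environment to a profile file, idempotently.
# On a real node call: add_hbase_env "$HOME/.bash_profile"
add_hbase_env() {
    # Already configured? Then do nothing.
    grep -q 'HBASE_HOME=' "$1" 2>/dev/null && return 0
    cat >> "$1" <<'EOF'
export HBASE_HOME=/home/zkkafka/hbase
export PATH=$HBASE_HOME/bin:$PATH
EOF
}

PROFILE="$(mktemp)"        # stand-in for ~/.bash_profile in this sketch
add_hbase_env "$PROFILE"
add_hbase_env "$PROFILE"   # second call is a no-op thanks to the guard
```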

 

2.1 Edit hbase-env.sh

# Set JAVA_HOME explicitly
export JAVA_HOME=/home/zkkafka/jdk1.8.0_151/
# Disable HBase's bundled ZooKeeper; use the external ZooKeeper cluster
export HBASE_MANAGES_ZK=false

 

2.2 Edit hbase-site.xml

<configuration>
    <property>
        <name>hbase.rootdir</name>
        <value>hdfs://master/hbase</value>
    </property>
    <property>
        <name>hbase.cluster.distributed</name>
        <value>true</value>
    </property>
    <property>
	<!-- The ZooKeeper ensemble hosts; no port needed -->
        <name>hbase.zookeeper.quorum</name>
        <value>master1,master2,slaver1</value>
    </property>
</configuration> 
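Since every node needs an identical copy, the file can be rendered from a couple of variables in one place. A sketch using the values from this walkthrough; the temp file stands in for `$HBASE_HOME/conf/hbase-site.xml`:

```shell
#!/bin/sh
# Render hbase-site.xml from variables so all nodes stay consistent.
HBASE_ROOTDIR="hdfs://master/hbase"       # HDFS path for HBase data
ZK_QUORUM="master1,master2,slaver1"       # ZooKeeper hosts, no port

OUT="$(mktemp)"   # on a real node: $HBASE_HOME/conf/hbase-site.xml
cat > "$OUT" <<EOF
<configuration>
    <property>
        <name>hbase.rootdir</name>
        <value>${HBASE_ROOTDIR}</value>
    </property>
    <property>
        <name>hbase.cluster.distributed</name>
        <value>true</value>
    </property>
    <property>
        <!-- The ZooKeeper ensemble hosts; no port needed -->
        <name>hbase.zookeeper.quorum</name>
        <value>${ZK_QUORUM}</value>
    </property>
</configuration>
EOF
```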

 

2.3 Edit the regionservers file

slaver1

 

2.4 Edit the backup-masters file

master2
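Both files are plain host lists that start-hbase.sh walks over SSH: regionservers names the RegionServer hosts, backup-masters the standby HMasters. A sketch creating them (the temp directory stands in for `$HBASE_HOME/conf`):

```shell
#!/bin/sh
# Create the two topology files HBase reads at startup.
CONF_DIR="$(mktemp -d)"   # on a real node: $HBASE_HOME/conf

# RegionServer hosts, one per line.
printf '%s\n' slaver1 > "$CONF_DIR/regionservers"

# Standby HMaster hosts; start-hbase.sh starts a backup master on each.
printf '%s\n' master2 > "$CONF_DIR/backup-masters"
```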

 

2.5 Copy Hadoop's hdfs-site.xml into HBase's conf directory

cd /home/zkkafka/hbase/conf
cp /home/zkkafka/hadoop/etc/hadoop/hdfs-site.xml ./

 

2.6 Copy the configuration files to the other nodes

scp /home/zkkafka/hbase/conf/*  zkkafka@10.156.50.36:/home/zkkafka/hbase/conf/
scp /home/zkkafka/hbase/conf/*  zkkafka@10.156.50.37:/home/zkkafka/hbase/conf/
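With more nodes the copies are easier to drive from a loop. A sketch with a dry-run switch so the target list can be reviewed before anything is transferred (the host list matches this walkthrough):

```shell
#!/bin/sh
# Sync the HBase conf directory to the other nodes.
# DRY_RUN=1 prints each scp command instead of running it.
DRY_RUN=1
CONF_SRC=/home/zkkafka/hbase/conf

for host in 10.156.50.36 10.156.50.37; do
    if [ "${DRY_RUN:-0}" = "1" ]; then
        echo "scp $CONF_SRC/* zkkafka@$host:$CONF_SRC/"
    else
        scp "$CONF_SRC"/* "zkkafka@$host:$CONF_SRC/"
    fi
done
```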

 

 

2.7 Start HBase

sh /home/zkkafka/hbase/bin/start-hbase.sh


[zkkafka@yanfabu2-35 bin]$ ./start-hbase.sh
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/zkkafka/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/zkkafka/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
running master, logging to /home/zkkafka/hbase/bin/../logs/hbase-zkkafka-master-yanfabu2-35.base.app.dev.yf.out
slaver1: running regionserver, logging to /home/zkkafka/hbase/bin/../logs/hbase-zkkafka-regionserver-yanfabu2-37.base.app.dev.yf.out
master2: running master, logging to /home/zkkafka/hbase/bin/../logs/hbase-zkkafka-master-yanfabu2-36.base.app.dev.yf.out

 

2.8 Check the HBase processes

[zkkafka@yanfabu2-35 bin]$ jps
59330 QuorumPeerMain
79763 Jps
56377 Kafka
86680 ResourceManager
86570 DFSZKFailoverController
79514 HMaster                           √
86044 JournalNode
87356 NameNode

 

 

[zkkafka@yanfabu2-36 ~]$ jps
37365 QuorumPeerMain
99335 Jps
56489 DFSZKFailoverController
99224 HMaster                           √
34571 Kafka
56606 NameNode
56319 JournalNode

 

[zkkafka@yanfabu2-37 ~]$ jps
61619 JournalNode
61829 NodeManager
42955 QuorumPeerMain
73002 HRegionServer                     √
40189 Kafka
61693 DataNode
73182 Jps
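The per-node checks above can be scripted: capture `jps` once and verify every daemon the role requires is present. A sketch; the sample `jps` output is illustrative:

```shell
#!/bin/sh
# Verify that every required daemon appears in `jps` output.
# $1 = space-separated process names; stdin = jps output.
check_daemons() {
    jps_out=$(cat)                 # read stdin once, reuse for each grep
    for proc in $1; do             # word splitting on $1 is intentional
        printf '%s\n' "$jps_out" | grep -qw "$proc" \
            || { echo "MISSING: $proc"; return 1; }
    done
    echo "all daemons present"
}

# Illustrative jps output for an HMaster node (abridged).
sample='59330 QuorumPeerMain
79514 HMaster
87356 NameNode'

printf '%s\n' "$sample" | check_daemons 'HMaster QuorumPeerMain NameNode'
```

On a live node: `jps | check_daemons 'HMaster QuorumPeerMain'` for a master, `jps | check_daemons 'HRegionServer DataNode'` for a RegionServer.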

 

2.9 Check the web UI

http://10.156.50.35:16010/master-status
http://10.156.50.36:16010/master-status

 

3. Shell commands

Before using the HBase shell, swap out the old jline jar in Hadoop's YARN lib directory (jline 0.9.94 conflicts with the version the shell needs):

cd /home/zkkafka/hadoop/share/hadoop/yarn/lib/
mv jline-0.9.94.jar jline-0.9.94.jar.bak
rz jline-2.12.jar    # upload jline-2.12.jar (rz is from the lrzsz package)
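After the swap it is worth confirming that only one jline version remains visible. A sketch; the temp directories stand in for the real hadoop/hbase lib directories:

```shell
#!/bin/sh
# List distinct jline jar names under the classpath roots, ignoring
# the .bak copy. A healthy result is a single version.
LIBROOT="$(mktemp -d)"   # stand-in for the real lib directories
mkdir -p "$LIBROOT/yarn/lib" "$LIBROOT/hbase/lib"
: > "$LIBROOT/yarn/lib/jline-2.12.jar"
: > "$LIBROOT/yarn/lib/jline-0.9.94.jar.bak"
: > "$LIBROOT/hbase/lib/jline-2.12.jar"

find "$LIBROOT" -name 'jline-*.jar' ! -name '*.bak' | sed 's|.*/||' | sort -u
```

On the cluster, point the find at the actual roots, e.g. `find /home/zkkafka/hadoop/share /home/zkkafka/hbase/lib -name 'jline-*.jar'`.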

 

[zkkafka@yanfabu2-35 ~]$ hbase version
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/zkkafka/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/zkkafka/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
HBase 2.0.5
Source code repository git://dd7c519a402b/opt/hbase-rm/output/hbase revision=76458dd074df17520ad451ded198cd832138e929
Compiled by hbase-rm on Mon Mar 18 00:41:49 UTC 2019
From source with checksum fd9cba949d65fd3bca4df155254ac28c

 

 

 

[zkkafka@yanfabu2-35 lib]$ hbase shell
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/zkkafka/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/zkkafka/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
HBase Shell
Use "help" to get list of supported commands.
Use "exit" to quit this interactive shell.
For Reference, please visit: http://hbase.apache.org/2.0/book.html#shell
Version 2.0.5, r76458dd074df17520ad451ded198cd832138e929, Mon Mar 18 00:41:49 UTC 2019
Took 0.0048 seconds  

 

 

4. Table operations

create 'data_analysis', {NAME => 'inaccount', VERSIONS => 1}, {NAME => 'inamount', VERSIONS => 1}, {NAME => 'outaccount', VERSIONS => 1}, {NAME => 'outamount', VERSIONS => 1}

put 'data_analysis', '2019-05-19 00:00:00', 'inaccount', '10000'
put 'data_analysis', '2019-05-19 00:00:00', 'inamount', '100'
put 'data_analysis', '2019-05-19 00:00:00', 'outaccount', '10100'
put 'data_analysis', '2019-05-19 00:00:00', 'outamount', '101'
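Batches of shell commands like these can be kept in a file and replayed non-interactively with `hbase shell <file>`, which runs each line in order. A sketch that writes the script (the temp file is a stand-in; running it of course needs the live cluster):

```shell
#!/bin/sh
# Collect the puts into a script for non-interactive replay.
SCRIPT="$(mktemp)"   # stand-in path for e.g. load_data.hbase
cat > "$SCRIPT" <<'EOF'
put 'data_analysis', '2019-05-19 00:00:00', 'inaccount', '10000'
put 'data_analysis', '2019-05-19 00:00:00', 'inamount', '100'
put 'data_analysis', '2019-05-19 00:00:00', 'outaccount', '10100'
put 'data_analysis', '2019-05-19 00:00:00', 'outamount', '101'
exit
EOF

# On the cluster: hbase shell "$SCRIPT"
```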




put 'data_analysis', '2019-05-19 00:00:00', 'inaccount:xianxishoudanaccount', '5000'
put 'data_analysis', '2019-05-19 00:00:00', 'inaccount:xianshangshoudanaccount', '5000'
put 'data_analysis', '2019-05-19 00:00:00', 'inamount:xianxishoudanamount', '50'
put 'data_analysis', '2019-05-19 00:00:00', 'inamount:xianshangshoudanamount', '50'

get 'data_analysis', '2019-05-19 00:00:00', 'inaccount'
get 'data_analysis', '2019-05-19 00:00:00', 'inaccount:xianxishoudanaccount'
get 'data_analysis', '2019-05-19 00:00:00', 'inaccount:xianshangshoudanaccount'


scan 'data_analysis'
ROW                                              COLUMN+CELL                                                                                                                                   
 2019-05-19 00:00:00                             column=inaccount:, timestamp=1558080234354, value=10000                                                                                       
 2019-05-19 00:00:00                             column=inaccount:xianshangshoudanaccount, timestamp=1558080601831, value=5000                                                                 
 2019-05-19 00:00:00                             column=inaccount:xianxishoudanaccount, timestamp=1558080601812, value=5000                                                                    
 2019-05-19 00:00:00                             column=inamount:, timestamp=1558080234393, value=100                                                                                          
 2019-05-19 00:00:00                             column=inamount:xianshangshoudanamount, timestamp=1558080601856, value=50                                                                     
 2019-05-19 00:00:00                             column=inamount:xianxishoudanamount, timestamp=1558080601844, value=50                                                                        
 2019-05-19 00:00:00                             column=outaccount:, timestamp=1558080234406, value=10100                                                                                      
 2019-05-19 00:00:00                             column=outamount:, timestamp=1558080234417, value=101                                                                                         
                                                                                                                                                                         




flush 'data_analysis'

 

 

[zkkafka@yanfabu2-35 bin]$ hdfs dfs -lsr /hbase/data/default/data_analysis
lsr: DEPRECATED: Please use 'ls -R' instead.
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:03 /hbase/data/default/data_analysis/.tabledesc
-rw-r--r--   2 zkkafka supergroup       1808 2019-05-17 16:03 /hbase/data/default/data_analysis/.tabledesc/.tableinfo.0000000001
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:03 /hbase/data/default/data_analysis/.tmp
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a
-rw-r--r--   2 zkkafka supergroup         48 2019-05-17 16:03 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/.regioninfo
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/.tmp
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/.tmp/inaccount
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/.tmp/inamount
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/.tmp/outaccount
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/.tmp/outamount
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/inaccount
-rw-r--r--   2 zkkafka supergroup       5097 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/inaccount/5243c1f49c7b4b0fa91d8df3a936e7a2
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/inamount
-rw-r--r--   2 zkkafka supergroup       5083 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/inamount/9e7bc1d2a1e64987b90c3254e53c57cb
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/outaccount
-rw-r--r--   2 zkkafka supergroup       4931 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/outaccount/c3217f1ea5a24f3daf1d984f55c78a6b
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/outamount
-rw-r--r--   2 zkkafka supergroup       4926 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/outamount/4061fca2d54e471a86da5290d9a67020
[zkkafka@yanfabu2-35 bin]$
Reposted from knight-black-bob.iteye.com/blog/2441803