hbase-ha Installation

-1. Goal:

10.156.50.35  hmaster
10.156.50.36  hmaster
10.156.50.37  hregionserver

 

0. Preparations: Hadoop servers

10.156.50.35 yanfabu2-35.base.app.dev.yf zk1  hadoop1 master1 master
10.156.50.36 yanfabu2-36.base.app.dev.yf zk2  hadoop2 master2
10.156.50.37 yanfabu2-37.base.app.dev.yf zk3  hadoop3 slaver1

 

1. Preparations: NTP server

 

yum install ntp -y
chkconfig ntpd on

vi /etc/ntp.conf
server configuration:
	# allow hosts in this subnet to synchronize their clocks with this ntp server
	restrict 172.23.27.120 mask 255.255.255.0 nomodify notrap
	# local clock source
	server 172.23.27.120
	# when the external clock is unavailable, fall back to the local clock
	fudge 172.23.27.120 stratum 10
client configuration:
	# upstream clock source: point at the ntp server
	server 172.23.27.120
	# allow synchronizing time from the upstream server
	restrict 172.23.27.120 nomodify notrap noquery
	# local clock
	server 172.23.27.115
	# when the upstream clock source is unavailable, fall back to the local clock
	fudge 172.23.27.115 stratum 10

run
server:
	service ntpd start
	service ntpd stop
	ntpstat
client:
	ntpdate -u 172.23.27.120
	service ntpd start
	ntpstat
view:
	watch ntpq -p
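HBase is sensitive to clock skew between nodes, so it is worth confirming the offsets before going further. The following is a minimal sketch, not part of the original setup, assuming passwordless ssh as zkkafka and ntpdate installed on every node:

#!/usr/bin/env bash
# Query each cluster node's clock offset against the NTP server without stepping the clock.
NTP_SERVER=172.23.27.120
for host in 10.156.50.35 10.156.50.36 10.156.50.37; do
    echo "== ${host} =="
    # -q only queries; it does not adjust the clock
    ssh "zkkafka@${host}" "ntpdate -q ${NTP_SERVER} | tail -n 1"
done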

 

2. Install HBase

2.0 Modify ~/.bash_profile

 

vim ~/.bash_profile

export HBASE_HOME=/home/zkkafka/hbase
export PATH=$HBASE_HOME/bin:$PATH

source ~/.bash_profile

 

2.1 Modify hbase-env.sh

# set JAVA_HOME explicitly
export JAVA_HOME=/home/zkkafka/jdk1.8.0_151/
# disable the ZooKeeper bundled with HBase and use the external ZooKeeper cluster
export HBASE_MANAGES_ZK=false

 

2.2 Modify hbase-site.xml

<configuration>
    <property>
        <name>hbase.rootdir</name>
        <value>hdfs://master/hbase</value>
    </property>
    <property>
        <name>hbase.cluster.distributed</name>
        <value>true</value>
    </property>
    <property>
        <!-- the ZooKeeper quorum for this cluster; no port needed -->
        <name>hbase.zookeeper.quorum</name>
        <value>master1,master2,slaver1</value>
    </property>
</configuration> 
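hbase.rootdir points at the HDFS HA nameservice and hbase.zookeeper.quorum lists the three ZooKeeper hosts on the default client port 2181. A quick pre-flight sketch (assuming nc is installed and the ZooKeeper four-letter command ruok is enabled) to confirm the quorum is reachable before starting HBase:

#!/usr/bin/env bash
# Ask each ZooKeeper node in hbase.zookeeper.quorum whether it is healthy.
for zk in master1 master2 slaver1; do
    # a healthy ZooKeeper server answers "imok" to the four-letter command "ruok"
    reply=$(echo ruok | nc -w 2 "$zk" 2181)
    echo "${zk}: ${reply:-no response}"
done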

 

2.3 Modify regionservers

slaver1

 

2.4 Modify backup-masters

master2

 

2.5 Copy Hadoop's hdfs-site.xml into the HBase conf directory

cp /home/zkkafka/hadoop/etc/hadoop/hdfs-site.xml ./
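Because hbase.rootdir uses the HA nameservice hdfs://master/hbase, HBase can only resolve it if the Hadoop HA settings are on its classpath, which is what this copy provides. A small sketch to confirm the copy landed and the nameservice resolves (assuming the hdfs client is on PATH):

# the nameservice definition should now be visible to HBase
grep -A1 dfs.nameservices /home/zkkafka/hbase/conf/hdfs-site.xml
# and the nameservice itself should resolve without naming a specific NameNode
hdfs dfs -ls hdfs://master/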

 

2.6 Copy the configuration files to the other nodes

scp /home/zkkafka/hbase/conf/*  [email protected]:/home/zkkafka/hbase/conf/
scp /home/zkkafka/hbase/conf/*  [email protected]:/home/zkkafka/hbase/conf/

 

 

2.7 Start HBase

sh /home/zkkafka/hbase/bin/start-hbase.sh


[zkkafka@yanfabu2-35 bin]$ ./start-hbase.sh 
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/zkkafka/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/zkkafka/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
running master, logging to /home/zkkafka/hbase/bin/../logs/hbase-zkkafka-master-yanfabu2-35.base.app.dev.yf.out
slaver1: running regionserver, logging to /home/zkkafka/hbase/bin/../logs/hbase-zkkafka-regionserver-yanfabu2-37.base.app.dev.yf.out
master2: running master, logging to /home/zkkafka/hbase/bin/../logs/hbase-zkkafka-master-yanfabu2-36.base.app.dev.yf.out
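Once both masters are up, one becomes active and the other waits as a backup. A sketch of two ways to confirm which is which (the zkCli.sh path below is an assumption about where ZooKeeper is installed):

# "status" reports the active master, the backup masters and the region servers
echo "status" | hbase shell

# the active master is also registered in ZooKeeper under /hbase/master
/home/zkkafka/zookeeper/bin/zkCli.sh -server master1:2181 get /hbase/master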

 

2.8 Check the HBase processes

[zkkafka@yanfabu2-35 bin]$ jps
59330 QuorumPeerMain
79763 Jps
56377 Kafka
86680 ResourceManager
86570 DFSZKFailoverController
79514 HMaster                           √
86044 JournalNode
87356 NameNode

 

 

[zkkafka@yanfabu2-36 ~]$ jps
37365 QuorumPeerMain
99335 Jps
56489 DFSZKFailoverController
99224 HMaster                           √
34571 Kafka
56606 NameNode
56319 JournalNode

 

[zkkafka@yanfabu2-37 ~]$ jps
61619 JournalNode
61829 NodeManager
42955 QuorumPeerMain
73002 HRegionServer                     √
40189 Kafka
61693 DataNode
73182 Jps
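Rather than logging in to each machine, the same check can be scripted; a minimal sketch assuming passwordless ssh as zkkafka:

#!/usr/bin/env bash
# Run jps on every node and show only the HBase daemons.
for host in yanfabu2-35.base.app.dev.yf yanfabu2-36.base.app.dev.yf yanfabu2-37.base.app.dev.yf; do
    echo "== ${host} =="
    ssh "zkkafka@${host}" "jps | grep -E 'HMaster|HRegionServer' || echo 'no HBase process running'"
done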

 

2.9 View the web UI

http://10.156.50.35:16010/master-status
http://10.156.50.36:16010/master-status
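Both master UIs listen on port 16010; the page served by the backup master identifies itself as a backup. A quick reachability sketch (assuming curl is installed):

# expect HTTP 200 from both masters once they are up
for ip in 10.156.50.35 10.156.50.36; do
    code=$(curl -s -o /dev/null -w '%{http_code}' "http://${ip}:16010/master-status")
    echo "${ip}: HTTP ${code}"
done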

 

3. Shell commands

The jline-0.9.94.jar shipped with Hadoop's YARN libraries conflicts with the newer jline that the HBase shell needs, so back it up and replace it with jline-2.12.jar (uploaded here with rz):

cd /home/zkkafka/hadoop/share/hadoop/yarn/lib/
mv jline-0.9.94.jar jline-0.9.94.jar.bak
rz jline-2.12.jar
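A quick check that the replacement took effect (only jline-2.12.jar should remain in active use, with the old jar present only as the .bak copy):

ls /home/zkkafka/hadoop/share/hadoop/yarn/lib/ | grep -i jline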

 

[zkkafka@yanfabu2-35 ~]$ hbase version
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/zkkafka/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/zkkafka/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
HBase 2.0.5
Source code repository git://dd7c519a402b/opt/hbase-rm/output/hbase revision=76458dd074df17520ad451ded198cd832138e929
Compiled by hbase-rm on Mon Mar 18 00:41:49 UTC 2019
From source with checksum fd9cba949d65fd3bca4df155254ac28c

 

 

 

[zkkafka@yanfabu2-35 lib]$ hbase shell
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/zkkafka/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/zkkafka/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
HBase Shell
Use "help" to get list of supported commands.
Use "exit" to quit this interactive shell.
For Reference, please visit: http://hbase.apache.org/2.0/book.html#shell
Version 2.0.5, r76458dd074df17520ad451ded198cd832138e929, Mon Mar 18 00:41:49 UTC 2019
Took 0.0048 seconds  

 

 

4. Database operations

create 'data_analysis',{NAME => 'data_time', VERSIONS => 1},{NAME => 'inamount', VERSIONS => 1}

create 'data_analysis', {NAME => 'inaccount', VERSIONS => 1}, {NAME => 'inamount', VERSIONS => 1}, {NAME => 'outaccount', VERSIONS => 1},{NAME => 'outamount', VERSIONS => 1};

put 'data_analysis', '2019-05-19 00:00:00', 'inaccount', '10000';
put 'data_analysis', '2019-05-19 00:00:00', 'inamount', '100';
put 'data_analysis', '2019-05-19 00:00:00', 'outaccount', '10100';
put 'data_analysis', '2019-05-19 00:00:00', 'outamount', '101';




put 'data_analysis', '2019-05-19 00:00:00', 'inaccount:xianxishoudanaccount', '5000';
put 'data_analysis', '2019-05-19 00:00:00', 'inaccount:xianshangshoudanaccount', '5000';
put 'data_analysis', '2019-05-19 00:00:00', 'inamount:xianxishoudanamount', '50';
put 'data_analysis', '2019-05-19 00:00:00', 'inamount:xianshangshoudanamount', '50';

get 'data_analysis' ,'2019-05-19 00:00:00' , 'inaccount'
get 'data_analysis' ,'2019-05-19 00:00:00' , 'inaccount:xianxishoudanaccount'
get 'data_analysis' ,'2019-05-19 00:00:00' , 'inaccount:xianshangshoudanaccount'
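The statements above are typed interactively; they can also be run in one batch by piping them into the HBase shell. A minimal sketch reusing the same table and row key:

# feed a batch of shell statements to hbase shell via a heredoc
hbase shell <<'EOF'
put 'data_analysis', '2019-05-19 00:00:00', 'inamount:xianxishoudanamount', '50'
put 'data_analysis', '2019-05-19 00:00:00', 'inamount:xianshangshoudanamount', '50'
scan 'data_analysis'
exit
EOF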


scan 'data_analysis'
ROW                                              COLUMN+CELL                                                                                                                                   
 2019-05-19 00:00:00                             column=inaccount:, timestamp=1558080234354, value=10000                                                                                       
 2019-05-19 00:00:00                             column=inaccount:xianshangshoudanaccount, timestamp=1558080601831, value=5000                                                                 
 2019-05-19 00:00:00                             column=inaccount:xianxishoudanaccount, timestamp=1558080601812, value=5000                                                                    
 2019-05-19 00:00:00                             column=inamount:, timestamp=1558080234393, value=100                                                                                          
 2019-05-19 00:00:00                             column=inamount:xianshangshoudanamount, timestamp=1558080601856, value=50                                                                     
 2019-05-19 00:00:00                             column=inamount:xianxishoudanamount, timestamp=1558080601844, value=50                                                                        
 2019-05-19 00:00:00                             column=outaccount:, timestamp=1558080234406, value=10100                                                                                      
 2019-05-19 00:00:00                             column=outamount:, timestamp=1558080234417, value=101                                                                                         
                                                                                                                                                                         




flush 'data_analysis'

 

 

[zkkafka@yanfabu2-35 bin]$ hdfs dfs -lsr /hbase/data/default/data_analysis
lsr: DEPRECATED: Please use 'ls -R' instead.
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:03 /hbase/data/default/data_analysis/.tabledesc
-rw-r--r--   2 zkkafka supergroup       1808 2019-05-17 16:03 /hbase/data/default/data_analysis/.tabledesc/.tableinfo.0000000001
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:03 /hbase/data/default/data_analysis/.tmp
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a
-rw-r--r--   2 zkkafka supergroup         48 2019-05-17 16:03 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/.regioninfo
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/.tmp
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/.tmp/inaccount
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/.tmp/inamount
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/.tmp/outaccount
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/.tmp/outamount
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/inaccount
-rw-r--r--   2 zkkafka supergroup       5097 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/inaccount/5243c1f49c7b4b0fa91d8df3a936e7a2
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/inamount
-rw-r--r--   2 zkkafka supergroup       5083 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/inamount/9e7bc1d2a1e64987b90c3254e53c57cb
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/outaccount
-rw-r--r--   2 zkkafka supergroup       4931 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/outaccount/c3217f1ea5a24f3daf1d984f55c78a6b
drwxr-xr-x   - zkkafka supergroup          0 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/outamount
-rw-r--r--   2 zkkafka supergroup       4926 2019-05-17 16:11 /hbase/data/default/data_analysis/ed3abfb268f14d203f95dd0a45f80b8a/outamount/4061fca2d54e471a86da5290d9a67020
[zkkafka@yanfabu2-35 bin]$ 

Donate to the developer

This was written out of interest, as something free to share; there is joy in it, and sweat too. I hope you like my work and can show it a little support. If you have money, chip in a little (Alipay, WeChat, and the QQ group are supported); if not, showing up is support enough. Thank you.

 

Profile : http://knight-black-bob.iteye.com/



 
 
 Thank you for your sponsorship, I will do better!
