Teacher Yu Takes You to Learn Big Data: Spark Fast Big Data Processing, Chapter 9 HBase, Section 2: HBase Environment Setup

HBase environment setup

Modify the configuration file

Modify hbase-env.sh

Change the line # export HBASE_MANAGES_ZK=true to export HBASE_MANAGES_ZK=false. This tells HBase not to start its own ZooKeeper; instead, the user starts ZooKeeper separately.
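The same edit can be applied non-interactively with sed instead of a text editor. A minimal sketch, demonstrated on a throwaway file (point conf at your real hbase-env.sh instead):

```shell
# Sketch: flip HBASE_MANAGES_ZK with sed rather than editing by hand.
conf=$(mktemp)                                        # stand-in for conf/hbase-env.sh
printf '# export HBASE_MANAGES_ZK=true\n' > "$conf"   # reproduce the stock line
sed -i 's/^# *export HBASE_MANAGES_ZK=true/export HBASE_MANAGES_ZK=false/' "$conf"
grep HBASE_MANAGES_ZK "$conf"   # -> export HBASE_MANAGES_ZK=false
```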

Modify hbase-site.xml


Add the following properties to hbase-site.xml:

    <property>
        <name>hbase.rootdir</name>
        <value>hdfs://dmcluster/hbase</value>
    </property>
    <property>
        <name>hbase.cluster.distributed</name>
        <value>true</value>
    </property>
    <property>
        <name>hbase.zookeeper.quorum</name>
        <value>app-11,app-12,app-13</value>
    </property>
    <property>
        <name>hbase.zookeeper.property.dataDir</name>
        <value>/hadoop/HBase/hbase-2.2.0/zookeeper</value>
    </property>
    <property>
        <name>hbase.zookeeper.property.clientPort</name>
        <value>2181</value>
    </property>
    <property>
        <name>hbase.unsafe.stream.capability.enforce</name>
        <value>false</value>
    </property>
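A note on hbase.rootdir: dmcluster is an HDFS nameservice (an HA logical name, not a hostname), so HBase must be able to resolve it — typically by having Hadoop's core-site.xml and hdfs-site.xml visible on HBase's classpath. As a quick sanity check after editing, the configured property names can be listed with grep; a sketch, run here against a throwaway fragment of the file:

```shell
# Sketch: list the <name> entries of an hbase-site.xml fragment.
f=$(mktemp)   # stand-in for conf/hbase-site.xml
cat > "$f" <<'EOF'
<property><name>hbase.rootdir</name><value>hdfs://dmcluster/hbase</value></property>
<property><name>hbase.cluster.distributed</name><value>true</value></property>
EOF
grep -o '<name>[^<]*</name>' "$f" | sed 's/<[^>]*>//g'
```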


Modify regionservers

Replace the contents of the regionservers file with the three hostnames, one per line:

app-11
app-12
app-13

Add backup-masters file

Create the file with a single line containing only app-13.
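Both node-list files are plain text with one hostname per line, so they can also be generated from the shell. A sketch, using this tutorial's hostnames and a throwaway directory standing in for HBase's conf directory:

```shell
# Sketch: generate the regionservers and backup-masters files.
d=$(mktemp -d)   # stand-in for /hadoop/HBase/hbase-2.2.0/conf
printf '%s\n' app-11 app-12 app-13 > "$d/regionservers"
printf 'app-13\n' > "$d/backup-masters"
cat "$d/regionservers"
```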

Install HBase

Download the HBase installation package

1. Log in as the root user
Command: sudo /bin/bash

2. Create the HBase directory
Command: mkdir /hadoop/HBase

3. Change the owner of the /hadoop/HBase directory to the hadoop user
Command: chown hadoop:hadoop /hadoop/HBase

4. Log in as the hadoop user
Command: su - hadoop

5. Enter the HBase installation directory
Command: cd /hadoop/HBase/

6. Download the HBase installation package
Command: wget http://archive.apache.org/dist/hbase/2.2.0/hbase-2.2.0-bin.tar.gz

7. Unpack the installation package
Command: tar -xzf hbase-2.2.0-bin.tar.gz
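Before extracting, it can be worth listing the archive to confirm the download is intact. A sketch of the tar -t (list) versus tar -x (extract) flags, demonstrated on a tiny throwaway tarball rather than the real download:

```shell
# Sketch: list a .tar.gz before extracting it.
cd "$(mktemp -d)"
mkdir -p hbase-2.2.0/bin && touch hbase-2.2.0/bin/start-hbase.sh   # throwaway fixture
tar -czf hbase-2.2.0-bin.tar.gz hbase-2.2.0
tar -tzf hbase-2.2.0-bin.tar.gz                         # list contents, no extraction
rm -r hbase-2.2.0 && tar -xzf hbase-2.2.0-bin.tar.gz    # same -xzf flags as step 7
```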

Replace the configuration files

8. Enter the configuration directory
Command: cd hbase-2.2.0/conf/

9. Delete the configuration files that will be replaced
Command: rm -rf hbase-env.sh hbase-site.xml regionservers

10. Copy the pre-modified configuration files from /tmp/Spark-stack/HBase/conf into the current directory
Command: cp /tmp/Spark-stack/HBase/conf/* ./

Modify environment variables

11. Open the environment-variable file
Command: vi ~/.bashrc

12. Add the following two lines:
export HBASE_HOME=/hadoop/HBase/hbase-2.2.0
export PATH=${HBASE_HOME}/bin:$PATH

13. Make the environment variables take effect
Command: source ~/.bashrc

14. Check whether the environment variables are in effect
Command: echo $PATH
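Steps 11–12 can be made idempotent, so that re-running the setup does not append duplicate lines to the profile. A sketch, with a throwaway file standing in for ~/.bashrc:

```shell
# Sketch: append the HBase variables only if they are not already present.
rc=$(mktemp)   # stand-in for ~/.bashrc
grep -q 'HBASE_HOME' "$rc" || cat >> "$rc" <<'EOF'
export HBASE_HOME=/hadoop/HBase/hbase-2.2.0
export PATH=${HBASE_HOME}/bin:$PATH
EOF
grep -c 'HBASE_HOME' "$rc"   # -> 2
```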

Install HBase on the other two machines

15. First create the installation directory on app-12 and app-13
Command: ssh hadoop@app-12 "mkdir /hadoop/HBase"
Command: ssh hadoop@app-13 "mkdir /hadoop/HBase"

16. Copy HBase to app-12 and app-13
Command: scp -r -q /hadoop/HBase/hbase-2.2.0 hadoop@app-12:/hadoop/HBase/
Command: scp -r -q /hadoop/HBase/hbase-2.2.0 hadoop@app-13:/hadoop/HBase/

17. Copy the environment variables to app-12 and app-13
Command: scp ~/.bashrc hadoop@app-12:~/
Command: scp ~/.bashrc hadoop@app-13:~/
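Steps 15–17 repeat each command once per host; with more workers a loop is less error-prone. A dry-run sketch — echo is kept in front of each command so the loop only prints what it would do; remove echo to actually execute:

```shell
# Dry-run sketch: fan the same setup out to both worker hosts.
for host in app-12 app-13; do
  echo "ssh hadoop@$host 'mkdir -p /hadoop/HBase'"
  echo "scp -r -q /hadoop/HBase/hbase-2.2.0 hadoop@$host:/hadoop/HBase/"
  echo "scp ~/.bashrc hadoop@$host:~/"
done
```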

Cleanup (remove leftover files from a previous failed installation)

18. Clear the hbase directory in HDFS (skip this if it does not exist)
Command: hdfs dfs -rm -r -f /hbase

19. Clear the hbase node in ZooKeeper; otherwise errors such as "Master is initializing" may occur
Command: echo 'rmr /hbase' | zkCli.sh
(Note: on ZooKeeper 3.5 and later, the rmr command has been replaced by deleteall.)

20. Start HBase on app-12, since app-12 is the HBase master
Command: ssh hadoop@app-12 "cd /hadoop/HBase/hbase-2.2.0/bin && ./start-hbase.sh"

21. Check whether startup succeeded (jps should list an HMaster process on app-12, plus an HRegionServer on each region server node)
Command: ssh hadoop@app-12 "jps"

22. Open the HBase web monitoring page (port 16030 is the region server UI; the master UI defaults to 16010)
URL: http://app-12:16030

Set up automation script

23. Add HBase to the automatic-start configuration
Command: vi /hadoop/config.conf

24. Add export HBASE_IS_INSTALL=True

25. Make the setting take effect
Command: source ~/.bashrc

26. Confirm that the startAll.sh script includes HBase
Command: vi /hadoop/startAll.sh

For more detailed learning content, see the course "Spark Fast Big Data Processing", or search for Spark Yu Haifeng.

Origin blog.csdn.net/weixin_45810046/article/details/112950883