Apache Hadoop Installation

0. Preparation Before Installation

0.1 Turn off the firewall

service iptables status
service iptables stop
chkconfig iptables off   # optional: keep the firewall off across reboots

0.2 Disable SELinux

SELinux causes a lot of strange problems, so disable it.
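A minimal sketch of disabling it (run as root; the config path is the CentOS default this guide assumes):

```shell
getenforce     # show the current mode
setenforce 0   # switch to permissive for the current boot
# persist across reboots: set SELINUX=disabled in the config file
sed -i 's/^SELINUX=.*/SELINUX=disabled/' /etc/selinux/config
```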

 

1. Create a user

useradd hadoop -d /home/hadoop
echo hadoop | passwd hadoop --stdin

 

2. Passwordless SSH Configuration

2.1 Generate a key pair

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa

2.2 Copy the public key to each node

scp id_rsa.pub xxx@ip:~/.ssh/file
cat id_rsa.pub >> authorized_keys
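If ssh-copy-id is available, it performs the copy, append, and permission steps in one go; the user and hosts below are examples for this guide's cluster:

```shell
# appends the key to the remote ~/.ssh/authorized_keys and fixes permissions
ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@hmaster
ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@192.168.43.199
```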

2.3 In the sshd configuration (/etc/ssh/sshd_config), enable the following two options:

RSAAuthentication yes
PubkeyAuthentication yes

If you are still prompted for a password, or ssh <hostname> fails, check /var/log/secure for the detailed error. The cause is usually wrong directory or file permissions: the key files should be set to 600, e.g. chmod 600 ~/.ssh/xxx (and ~/.ssh itself to 700).

 

3. Modify the Configuration Files

3.1 core-site.xml

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://hmaster:9000</value>
    <final>true</final>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>file:/home/hadoop/tmp</value>
  </property>
  <property>
    <name>io.file.buffer.size</name>
    <value>131072</value>
  </property>
</configuration>

3.2 hdfs-site.xml

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/home/hadoop/hdfs/data</value>
  </property>
  <property>
    <name>dfs.name.dir</name>
    <value>/home/hadoop/hdfs/name</value>
  </property>
  <property>
    <name>dfs.webhdfs.enabled</name>
    <value>true</value>
  </property>
</configuration>

3.3 mapred-site.xml

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>hmaster:8021</value>
  </property>
  <property>
    <name>mapred.local.dir</name>
    <value>/tmp/hadoop/mapred/local</value>
  </property>
  <property>
    <name>mapred.system.dir</name>
    <value>/tmp/hadoop/mapred/system</value>
  </property>
  <property>
    <name>mapred.tasktracker.map.tasks.maximum</name>
    <value>2</value>
  </property>
  <property>
    <name>mapred.tasktracker.reduce.tasks.maximum</name>
    <value>2</value>
  </property>
  <property>
    <name>mapred.child.java.opts</name>
    <value>-Xmx200m</value>
  </property>
  <property>
    <name>mapreduce.jobhistory.address</name>
    <value>hmaster:10020</value>
  </property>
  <property>
    <name>mapreduce.jobhistory.webapp.address</name>
    <value>hmaster:19888</value>
  </property>
</configuration>

3.4 yarn-site.xml

<configuration>
  <!-- Site specific YARN configuration properties -->
  <property>
    <name>yarn.resourcemanager.address</name>
    <value>hmaster:8032</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.resourcemanager.webapp.address</name>
    <value>hmaster:8088</value>
  </property>
</configuration>
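The same configuration files must be present on every node. One way to push them from the master, assuming the HADOOP_HOME used in this guide and the example slave IP from the slaves file:

```shell
# copy the whole Hadoop config directory to each slave
scp -r /home/hadoop/hadoop/etc/hadoop/ hadoop@192.168.43.199:/home/hadoop/hadoop/etc/
```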

Note

On the master node, the first two lines of /etc/hosts must be commented out:
# 127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
# ::1       localhost localhost.localdomain localhost6 localhost6.localdomain6
Otherwise the namenode service binds to 127.0.0.1, and remote nodes are refused when they try to connect to it.
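A sketch of the remaining /etc/hosts entries on every node (the master IP is hypothetical; the slave IP and hostname match the slaves file in this guide):

```
192.168.43.100  hmaster   # master (example IP)
192.168.43.199  hadoop1   # slave
```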

3.5 Set Environment Variables

export JAVA_HOME=/usr/local/src/jdk1.8
export HADOOP_HOME=/home/hadoop/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native
export PATH=$PATH:$HOME/bin:$HOME/sbin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
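After reloading the profile, a quick check that the variables took effect (assumes bash and that the JDK and Hadoop are unpacked at the paths above):

```shell
source ~/.bash_profile
echo $HADOOP_HOME $JAVA_HOME
hadoop version    # should print the Hadoop release banner
java -version
```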

3.6 The slaves configuration file

Write one slave IP per line, for example:
[hadoop@hadoop1 hadoop]$ cat slaves
192.168.43.199

 

4. Initialize

4.1 Format the HDFS filesystem

hadoop namenode -format

If the output ends with "Exiting with status 0", the format succeeded.
Then check the processes with jps on the master and slave nodes: NameNode, SecondaryNameNode, and DataNode should all be present.
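Note that jps only shows these processes once the daemons are running; with $HADOOP_HOME/sbin on the PATH as set in section 3.5, the standard Hadoop 2.x start scripts are:

```shell
start-dfs.sh    # starts NameNode, SecondaryNameNode, DataNodes
start-yarn.sh   # starts ResourceManager, NodeManagers
# optional: history server backing the 19888 web UI
mr-jobhistory-daemon.sh start historyserver
```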

 

5. Web UIs

master:50070 is the NameNode web UI.
master:19888 is the JobHistory web UI.
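Since dfs.webhdfs.enabled is set to true above, the NameNode web port also serves the WebHDFS REST API, which makes for a quick smoke test from any node:

```shell
# list the HDFS root over WebHDFS; should return a JSON FileStatuses object
curl -s "http://master:50070/webhdfs/v1/?op=LISTSTATUS"
```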

Origin www.cnblogs.com/peeyee/p/11965214.html