Installing Hadoop 2.6.0-cdh5.7.0


Download Hadoop and the JDK

Download hadoop-2.6.0-cdh5.7.0.tar.gz and jdk-7u80-linux-x64.tar.gz and place them under /home/hadoop/software/ (the paths used in the commands below).

Install the JDK

  • Extract the JDK archive
  tar -zxvf /home/hadoop/software/jdk-7u80-linux-x64.tar.gz -C /usr/java

  • Configure the JDK environment variables (in /etc/profile):

  hadoop:root:/usr/java:>vi /etc/profile
  # /etc/profile
  # System wide environment and startup programs, for login setup
  # Functions and aliases go in /etc/bashrc
  # It's NOT a good idea to change this file unless you know what you
  # are doing. It's much better to create a custom.sh shell script in
  # /etc/profile.d/ to make custom changes to your environment, as this
  # will prevent the need for merging in future updates.
  # add path
  export JAVA_HOME=/usr/java/jdk1.7.0_80
  export PATH=$JAVA_HOME/bin:$PATH
  # show path
  hadoop:root:/usr/java:>source /etc/profile
  hadoop:root:/usr/java:>java -version
  java version "1.7.0_80"
  Java(TM) SE Runtime Environment (build 1.7.0_80-b15)
  Java HotSpot(TM) 64-Bit Server VM (build 24.80-b11, mixed mode)

Configure SSH


   
   
  hadoop:hadoop:/home/hadoop:>ssh-keygen -t rsa
  Generating public/private rsa key pair.
  Enter file in which to save the key (/home/hadoop/.ssh/id_rsa):
  Enter passphrase (empty for no passphrase):
  Enter same passphrase again:
  Your identification has been saved in /home/hadoop/.ssh/id_rsa.
  Your public key has been saved in /home/hadoop/.ssh/id_rsa.pub.
  The key fingerprint is:
  ca:20:e2:68:64:46:e0:f2:62:63:b9:60:71:a5:75:4a hadoop@hadoop
  The key's randomart image is:
  +--[ RSA 2048]----+
  |. E .            |
  |o = o            |
  |.+ o .           |
  |o.+              |
  |+Xo . S          |
  |@oo. o .         |
  |.+ o             |
  |.                |
  |                 |
  +-----------------+
  hadoop:hadoop:/home/hadoop:>
  hadoop:hadoop:/home/hadoop:>cp .ssh/id_rsa.pub ~/.ssh/authorized_keys
  hadoop:hadoop:/home/hadoop:>cd .ssh/
  hadoop:hadoop:/home/hadoop/.ssh:>ll
  total 12
  -rw-r--r-- 1 hadoop hadoop  395 Jan  2 02:16 authorized_keys
  -rw------- 1 hadoop hadoop 1675 Jan  2 02:16 id_rsa
  -rw-r--r-- 1 hadoop hadoop  395 Jan  2 02:16 id_rsa.pub
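
If the passwordless login later fails, the usual culprit is file permissions, which sshd checks strictly. A minimal sketch of tightening them and testing the login (assuming hadoop-01 resolves to this machine):

  # sshd ignores authorized_keys when the key file or ~/.ssh is too permissive
  chmod 700 ~/.ssh
  chmod 600 ~/.ssh/authorized_keys
  # both commands should print the date without prompting for a password
  ssh localhost date
  ssh hadoop-01 date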

Install Hadoop

  • Extract Hadoop
    hadoop:hadoop:/home/hadoop/app:>tar -zxvf /home/hadoop/software/hadoop-2.6.0-cdh5.7.0.tar.gz -C /home/hadoop/app/

  • Configure the environment (in ~/.bash_profile):


   
   
  hadoop:hadoop:/home/hadoop:>vi .bash_profile
  # .bash_profile
  # Get the aliases and functions
  if [ -f ~/.bashrc ]; then
          . ~/.bashrc
  fi
  # User specific environment and startup programs
  export HADOOP_HOME=/home/hadoop/app/hadoop-2.6.0-cdh5.7.0
  export PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH

  hadoop:hadoop:/home/hadoop:>source .bash_profile
  hadoop:hadoop:/home/hadoop:>echo $HADOOP_HOME
  /home/hadoop/app/hadoop-2.6.0-cdh5.7.0
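
A quick sanity check that the new PATH entries took effect (a minimal sketch; the exact output depends on the environment):

  # the hadoop command should resolve to the freshly extracted tree
  which hadoop     # expect /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/bin/hadoop
  hadoop version   # the first line should report Hadoop 2.6.0-cdh5.7.0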

Edit the configuration files

  • hadoop-env.sh
export JAVA_HOME=/usr/java/jdk1.7.0_80
   
   
  • core-site.xml (the hadoop.tmp.dir directory referenced here is created in the sketch after this list)

      <property>
          <name>fs.default.name</name>
          <value>hdfs://hadoop-01:9000</value>
      </property>
      <property>
          <name>hadoop.tmp.dir</name>
          <value>/home/hadoop/app/tmp</value>
      </property>
  • hdfs-site.xml

  <configuration>
      <property>
          <name>dfs.namenode.name.dir</name>
          <value>/home/hadoop/app/tmp/dfs/name</value>
      </property>
      <property>
          <name>dfs.datanode.data.dir</name>
          <value>/home/hadoop/app/tmp/dfs/data</value>
      </property>
      <property>
          <name>dfs.namenode.secondary.http-address</name>
          <value>hadoop-01:50090</value>
      </property>
      <property>
          <name>dfs.namenode.secondary.https-address</name>
          <value>hadoop-01:50091</value>
      </property>
      <property>
          <name>dfs.replication</name>
          <value>1</value>
      </property>
  </configuration>
  • slaves
echo "hadoop-01" > $HADOOP_HOME/etc/hadoop/slaves
  • mapred-site.xml (this file does not exist by default; copy it from the
    template first): cp mapred-site.xml.template mapred-site.xml

   
   
  <configuration>
      <property>
          <name>mapreduce.framework.name</name>
          <value>yarn</value>
      </property>
  </configuration>
  • yarn-site.xml

   
   
  <configuration>
      <property>
          <name>yarn.nodemanager.aux-services</name>
          <value>mapreduce_shuffle</value>
      </property>
  </configuration>
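
All of the files above live under $HADOOP_HOME/etc/hadoop. As noted in the core-site.xml item, the hadoop.tmp.dir directory is best created up front; a minimal sketch, assuming the values shown above:

  # pre-create the directory named by hadoop.tmp.dir
  mkdir -p /home/hadoop/app/tmp
  # spot-check that the four *-site.xml files carry the intended values
  cd $HADOOP_HOME/etc/hadoop
  grep -A1 "<name>" core-site.xml hdfs-site.xml mapred-site.xml yarn-site.xml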

Format HDFS

   hdfs namenode -format
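
On success the command ends with a message like "Storage directory ... has been successfully formatted". Format only once on a fresh install, since re-formatting wipes the NameNode metadata. A minimal check, assuming the dfs.namenode.name.dir value from hdfs-site.xml:

  # the freshly formatted name directory should contain a VERSION file and an fsimage
  ls /home/hadoop/app/tmp/dfs/name/current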

Start Hadoop


   
   
  hadoop:hadoop:/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/etc/hadoop:>start-all.sh
  This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
  18/01/02 02:49:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
  Starting namenodes on [hadoop]
  hadoop: starting namenode, logging to /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/logs/hadoop-hadoop-namenode-hadoop.out
  hadoop: starting datanode, logging to /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/logs/hadoop-hadoop-datanode-hadoop.out
  Starting secondary namenodes [0.0.0.0]
  hadoop-01: starting secondarynamenode, logging to /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/logs/hadoop-hadoop-secondarynamenode-hadoop.out
  18/01/02 02:50:15 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
  starting yarn daemons
  starting resourcemanager, logging to /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/logs/yarn-hadoop-resourcemanager-hadoop.out
  hadoop-01: starting nodemanager, logging to /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/logs/yarn-hadoop-nodemanager-hadoop.out
  hadoop:hadoop:/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/etc/hadoop:>jps
  8345 NodeManager
  8066 SecondaryNameNode
  7820 NameNode
  7914 DataNode
  8249 ResourceManager
  8613 Jps
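
With all five daemons plus Jps showing, a small HDFS smoke test confirms the cluster is usable (a minimal sketch; the paths are arbitrary). The default web UIs for this release are the NameNode at http://hadoop-01:50070 and the ResourceManager at http://hadoop-01:8088.

  # write a file into HDFS and read it back
  hdfs dfs -mkdir -p /user/hadoop/test
  hdfs dfs -put $HADOOP_HOME/etc/hadoop/core-site.xml /user/hadoop/test/
  hdfs dfs -ls /user/hadoop/test
  hdfs dfs -cat /user/hadoop/test/core-site.xml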

Reposted from blog.csdn.net/eieiei438/article/details/81738661