Environment setup
- Virtual machine cloning and static IP configuration (you can refer to the blog post listed below as a reference configuration)
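For the static IP step, a typical CentOS 7 configuration looks like the fragment below. This is only an illustrative sketch: the interface name `ens33` and all addresses depend on your own VM and network, so take the actual values from the referenced blog post.

```
# /etc/sysconfig/network-scripts/ifcfg-ens33  (illustrative values)
BOOTPROTO=static
ONBOOT=yes
IPADDR=192.168.1.101
NETMASK=255.255.255.0
GATEWAY=192.168.1.2
DNS1=192.168.1.2
```

After editing, restart the network service (e.g. `systemctl restart network`) for the change to take effect.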
- In /opt, create two new directories, module and software, to store the decompressed files and the compressed packages separately (unpacked software under module, downloaded archives under software).
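The directory layout above can be created as follows (a sketch; creating directories under /opt assumes root or sudo privileges, and the PREFIX variable is only an illustrative override, not part of the original setup):

```shell
# Create the two working directories described above.
# PREFIX defaults to /opt as in the text; override it to try this elsewhere.
PREFIX="${PREFIX:-/opt}"
mkdir -p "$PREFIX/module" "$PREFIX/software"

# Confirm both directories exist
ls -ld "$PREFIX/module" "$PREFIX/software"
```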
- JDK installation and configuration

```shell
## Extract the JDK archive
tar -zxvf jdk-8u144-linux-x64.tar.gz -C /opt/module/
## Open /etc/profile to configure environment variables
vim /etc/profile
## Append the configuration at the end of the file
export JAVA_HOME=/opt/module/jdk1.8.0_144
export PATH=$PATH:$JAVA_HOME/bin
## Reload the profile file
source /etc/profile
## Verify the JDK installation
$ java -version
java version "1.8.0_144"
Java(TM) SE Runtime Environment (build 1.8.0_144-b01)
Java HotSpot(TM) 64-Bit Server VM (build 25.144-b01, mixed mode)
```
- Hadoop installation and configuration

```shell
## Extract the Hadoop archive
tar -zxvf hadoop-2.7.2.tar.gz -C /opt/module/
## Open /etc/profile to configure environment variables
vim /etc/profile
## Append the configuration at the end of the file
export HADOOP_HOME=/opt/module/hadoop-2.7.2
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
## Reload the profile file
source /etc/profile
## Verify the Hadoop installation
$ hadoop
Usage: hadoop [--config confdir] [COMMAND | CLASSNAME]
  CLASSNAME            run the class named CLASSNAME
 or
  where COMMAND is one of:
  fs                   run a generic filesystem user client
  version              print the version
  jar <jar>            run a jar file
                       note: please use "yarn jar" to launch
                             YARN applications, not this command.
  checknative [-a|-h]  check native hadoop and compression libraries availability
  distcp <srcurl> <desturl> copy file or directories recursively
  archive -archiveName NAME -p <parent path> <src>* <dest> create a hadoop archive
  classpath            prints the class path needed to get the
                       Hadoop jar and the required libraries
  credential           interact with credential providers
  daemonlog            get/set the log level for each daemon
  trace                view and modify Hadoop tracing settings

Most commands print help when invoked w/o parameters.
```
After appending environment variables to /etc/profile, they take effect in the current terminal only after running source /etc/profile. A newly opened terminal may not pick them up; restarting the system resolves this.
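The mechanism behind this can be demonstrated with a scratch profile file (the path /tmp/demo_profile and the variable DEMO_HOME below are illustrative stand-ins for /etc/profile and JAVA_HOME):

```shell
# Write a stand-in profile file with an export, like the ones added above.
cat > /tmp/demo_profile <<'EOF'
export DEMO_HOME=/opt/module/jdk1.8.0_144
export PATH=$PATH:$DEMO_HOME/bin
EOF

# Before sourcing, the variable is unset in this shell:
echo "before: ${DEMO_HOME:-<unset>}"

# `source` executes the file in the *current* shell, so the export sticks.
# A plain `sh /tmp/demo_profile` would run it in a child shell and change nothing.
source /tmp/demo_profile
echo "after:  $DEMO_HOME"
```

This is why each terminal must source the file (or inherit it from a login shell) before the variables become visible.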
A screenshot of the final /etc/profile configuration file:
Important directories:

```shell
[root@localhost hadoop-2.7.2]# ll
total 28
drwxr-xr-x 2 root root   194 May 22 2017 bin
drwxr-xr-x 3 root root    20 May 22 2017 etc
drwxr-xr-x 2 root root   106 May 22 2017 include
drwxr-xr-x 3 root root    20 May 22 2017 lib
drwxr-xr-x 2 root root   239 May 22 2017 libexec
-rw-r--r-- 1 root root 15429 May 22 2017 LICENSE.txt
-rw-r--r-- 1 root root   101 May 22 2017 NOTICE.txt
-rw-r--r-- 1 root root  1366 May 22 2017 README.txt
drwxr-xr-x 2 root root  4096 May 22 2017 sbin
drwxr-xr-x 4 root root    31 May 22 2017 share
```
- bin directory: scripts for operating Hadoop-related services (HDFS, YARN)
- etc directory: Hadoop configuration file directory, storing the Hadoop configuration files
- lib directory: Hadoop native libraries (used to compress and decompress data)
- sbin directory: scripts to start or stop Hadoop-related services
- share directory: Hadoop dependency jars, documentation, and official examples
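The official examples in the share directory make a convenient smoke test of the installation. The sketch below runs the bundled pi-estimation job; the jar name matches Hadoop 2.7.2 and would need adjusting for other versions, and the small argument values (2 maps, 10 samples each) are just for a quick check:

```shell
# Run one of the bundled example jobs as an end-to-end smoke test.
HADOOP_DIR=/opt/module/hadoop-2.7.2
EXAMPLES_JAR="$HADOOP_DIR/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar"

if [ -f "$EXAMPLES_JAR" ]; then
    # pi <num_maps> <samples_per_map>: estimates pi with a tiny MapReduce job
    "$HADOOP_DIR/bin/hadoop" jar "$EXAMPLES_JAR" pi 2 10
else
    echo "examples jar not found at $EXAMPLES_JAR"
fi
```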