Install and configure Hadoop on Windows (without Cygwin)

Due to project requirements, I set up several Windows Server 2012 R2 virtual machines on VMware and wanted to build a Hadoop cluster on them. I was just getting started with Hadoop and was at a loss, so I started searching for tutorials. I first tried installing with Cygwin, but ran into so many problems that I gave up. In the end, I chose to install and configure Hadoop directly on Windows Server 2012 R2. I am very grateful to the blogger whose tutorial I followed: https://blog.csdn.net/antgan/article/details/52067441 . I basically followed that tutorial; below are the pitfalls I stepped on along the way.

The overall process is as follows:

Install the JDK and configure its environment variables (tutorials for this are easy to find).

Download hadoop-2.5.2.tar.gz and extract it, configure the Hadoop environment variables, and modify the Hadoop configuration files. I placed mine directly under C:\, i.e. C:\hadoop-2.5.2.
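For the environment-variable step, a minimal sketch in a cmd window (assuming the C:\hadoop-2.5.2 install path used above) is:

```shell
:: Set HADOOP_HOME persistently; newly opened cmd windows will pick it up
setx HADOOP_HOME "C:\hadoop-2.5.2"

:: Append the Hadoop bin and sbin directories to the user PATH
:: (note: setx truncates values longer than 1024 characters)
setx PATH "%PATH%;C:\hadoop-2.5.2\bin;C:\hadoop-2.5.2\sbin"
```

Alternatively, set HADOOP_HOME and extend Path through Control Panel > System > Advanced system settings > Environment Variables, which avoids the setx truncation caveat.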

(Screenshot of the extracted hadoop-2.5.2 directory structure.)

Download hadooponwindows-master.zip, unzip it, and use its bin directory to replace the bin directory under the original Hadoop directory (this package provides the Windows native binaries, such as winutils.exe, that Hadoop needs to run without Cygwin).
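For reference, the configuration-file edits mentioned above are of this shape. This is a minimal single-node sketch; the fs.defaultFS port and the dfs directory paths are assumptions and should match whatever paths your own tutorial uses:

```xml
<!-- core-site.xml: where clients find the HDFS namenode -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

```xml
<!-- hdfs-site.xml: single-node settings (paths below are illustrative) -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/hadoop-2.5.2/data/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/hadoop-2.5.2/data/datanode</value>
  </property>
</configuration>
```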

Running it: open cmd, change directory to C:\hadoop-2.5.2\bin, and execute the "hdfs namenode -format" command. Here an error was reported, related to the hdfs-site.xml file: Invalid byte 2 of 2-byte UTF-8 sequence. This usually means the file contains bytes that are not valid UTF-8 (for example, it was saved in a local encoding such as GBK) while its declaration claims UTF-8:

<?xml version="1.0" encoding="UTF-8"?>  

Simply change UTF-8 to UTF8 in the declaration (alternatively, re-save the file with genuine UTF-8 encoding), i.e.

<?xml version="1.0" encoding="UTF8"?>  

Before re-running, first delete the files left over from the previous formatting attempt: the files in the logs directory and the files generated in the workspace (data) directory. Note that only the files inside the directories are deleted, not the directories themselves. Then change to C:\hadoop-2.5.2\bin and execute the "hdfs namenode -format" command again; this time no error is reported.

Change the cmd window to the sbin directory and execute the "start-all.cmd" command; four windows pop up, one each for the namenode, datanode, resourcemanager, and nodemanager:


Still under sbin, execute the "jps" command, which lists the running Java processes:

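The original screenshot of the jps output is missing. On a healthy single-node setup started with start-all.cmd, the output typically looks like the following (process IDs will differ):

```shell
# jps ships with the JDK and lists running Java processes by name.
# Expect to see the four Hadoop daemons plus Jps itself, e.g.:
#   NameNode, DataNode, ResourceManager, NodeManager, Jps
jps
```

If one of the daemons is missing from the list, check the corresponding window (or the logs directory) for startup errors before proceeding.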

At this point, the Hadoop service is up and running.

Following the blogger's tutorial above, perform an upload test and operate on HDFS. Note that the Hadoop service must be running; open another cmd window and perform the following operations.

First create the input directory:

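The original screenshot of this step is missing. A command matching it, assuming the directory is named /input (the exact path in the original tutorial may differ), would be:

```shell
# Create an input directory in HDFS (the path /input is an assumption;
# the original screenshot may have used a different name).
hdfs dfs -mkdir /input

# Verify that the directory was created
hdfs dfs -ls /
```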

Upload data to the directory:

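Again the screenshot is missing; the matching command, with an illustrative local file name (data.txt is an assumption), would be:

```shell
# Upload a local file into the HDFS /input directory.
# The local path C:\data\data.txt is a placeholder for illustration.
hdfs dfs -put C:\data\data.txt /input
```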

Then view the file:

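The final screenshot is also missing; assuming the same /input directory and data.txt file as above, the commands to inspect the upload would be:

```shell
# List the contents of the HDFS /input directory
hdfs dfs -ls /input

# Print the contents of the uploaded file to the console
hdfs dfs -cat /input/data.txt
```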
