Build an Eclipse View to Operate HDFS, and Use the HDFS API (Java)



Table of Contents

One, Connect to HDFS

Note: Start the Hadoop cluster before connecting

0. Add host mappings with a host management tool

1. Install the Hadoop Eclipse plugin

2. Start Eclipse and configure the remaining items

Two, HDFS API (Java)

1. Unzip Hadoop and configure the environment variables

2. Put two more files into Hadoop's bin directory: hadoop.dll and winutils.exe

3. Add Hadoop's installation directory to Eclipse: Window --> Preferences --> search for Hadoop, then import the Hadoop installation directory

4. Create a Java project and use the Java API to operate HDFS


One, Connect to HDFS

Note: Start the Hadoop cluster before connecting

0. Add host mappings with a host management tool

Run the tool, add the mapping between the cluster's IP address and its hostname, then save and apply.
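
For reference, a minimal sketch of such a mapping is shown below; the hostname hadoop01 is only a placeholder, so substitute your own cluster's hostname. On Windows the mapping ends up in C:\Windows\System32\drivers\etc\hosts.

    192.168.80.5    hadoop01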

1. Install the Hadoop Eclipse plugin

Find the dropins folder under the Eclipse installation path and put hadoop-eclipse-plugin-2.7.3.jar into dropins.

 

2. Start Eclipse and configure the remaining items

(1) Click the Open Perspective button in the upper right corner of Eclipse, select Map/Reduce, and click Open

If the following two areas appear, the plugin was applied successfully

(2) Click the little elephant icon below to create a new Hadoop location

Click it and the dialog shown below opens: modify the fields marked with the red circle

You can find the host and port by visiting the NameNode web UI on port 50070; the Location name can be anything;

View the NameNode/DataNode web UI at:
http://192.168.80.5:50070
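
The DFS Master port entered in the location dialog has to match the fs.defaultFS setting in the cluster's core-site.xml; port 9000 below is an assumption, so check your own configuration. A typical property looks like this:

    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://192.168.80.5:9000</value>
    </property>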

 

If you see the following, the connection is successful!

You can now browse the data in HDFS.

After the connection succeeds, let's move on to the HDFS API

Two, HDFS API (Java)

1. Unzip Hadoop and configure the environment variables

Unzip hadoop-2.10.0.tar.gz on the Windows system; the unzipped directory is D:\hadoop-2.10.0. Set the HADOOP_HOME environment variable: under "system environment variables" add variable="HADOOP_HOME", value="D:\hadoop-2.10.0", then append "D:\hadoop-2.10.0\bin" to the "Path" variable.
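
To check that the variables took effect, you can open a new Command Prompt and run, for example:

    echo %HADOOP_HOME%
    hadoop version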

 

2. Put two more files into Hadoop's bin directory: hadoop.dll and winutils.exe

3. Add Hadoop's installation directory to Eclipse: Window --> Preferences --> search for Hadoop, then import the Hadoop installation directory

4. Create a Java project and use the Java API to operate HDFS

(1) The jar packages to import are as follows:

[1] The jar packages under the common module:

[2] The jar packages under hdfs:

Create two new folders, copy out the jar packages, and put them into the folders separately.

As shown:
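
In the unpacked Hadoop 2.10.0 distribution these jars typically sit under the share directory, for example:

    D:\hadoop-2.10.0\share\hadoop\common\hadoop-common-2.10.0.jar
    D:\hadoop-2.10.0\share\hadoop\common\lib\*.jar
    D:\hadoop-2.10.0\share\hadoop\hdfs\hadoop-hdfs-2.10.0.jar
    D:\hadoop-2.10.0\share\hadoop\hdfs\lib\*.jar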

(2) Create a Java project, name it Hbase_project1, and import the jar packages

Right-click the newly created project, Build Path --> Configure Build Path --> Libraries --> Add Library --> double-click User Library --> New --> create common2.10.0_lib and hdfs2.10.0_lib, then use Add External JARs to add the respective packages

The imported packages then look like this:

Now you can start your coding journey!
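
As a starting point, here is a minimal sketch of operating HDFS with the Java FileSystem API. The NameNode URI hdfs://192.168.80.5:9000, the user name "root", and the local file path are assumptions for illustration, so adjust them to your own cluster.

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsDemo {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Connect to the NameNode as a named user (URI and user name are assumptions)
            FileSystem fs = FileSystem.get(new URI("hdfs://192.168.80.5:9000"), conf, "root");

            // Create a directory on HDFS
            fs.mkdirs(new Path("/demo"));

            // Upload a local Windows file to HDFS (the local path is hypothetical)
            fs.copyFromLocalFile(new Path("D:/test.txt"), new Path("/demo/test.txt"));

            // List what is now in the directory
            for (FileStatus status : fs.listStatus(new Path("/demo"))) {
                System.out.println(status.getPath() + "  " + status.getLen() + " bytes");
            }

            fs.close();
        }
    }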

 


Origin blog.csdn.net/qq_46009608/article/details/110825832