Configuring Eclipse for MapReduce on Ubuntu

Download the configuration file:

Link: https://pan.baidu.com/s/13vatPHpDP5HaW0mKuHydUA
Extraction code: pjxi

1) Start Hadoop

cd /usr/local/hadoop
./sbin/start-dfs.sh
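Before moving on, it is worth confirming that the HDFS daemons actually came up. A minimal check, assuming a pseudo-distributed setup with a JDK on the PATH:

```shell
# List running Java processes; a healthy pseudo-distributed HDFS
# shows NameNode, DataNode and SecondaryNameNode entries.
daemons=$(jps 2>/dev/null || echo "jps unavailable")
echo "$daemons"
```

If NameNode or DataNode is missing, check the logs under /usr/local/hadoop/logs before continuing.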

2) Copy hadoop-eclipse-plugin-2.6.0.jar to /usr/lib/eclipse/plugins/

sudo cp /media/sf_gx/hadoop2x-eclipse-plugin-master/release/hadoop-eclipse-plugin-2.6.0.jar /usr/lib/eclipse/plugins/

3) Give user msq full permissions on the plugin jar

cd /usr/lib/eclipse/plugins
sudo chmod 777 hadoop-eclipse-plugin-2.6.0.jar

4) Start Eclipse

sudo /usr/lib/eclipse/eclipse -clean

5) Configuration

     (1) After starting Eclipse, you can see DFS Locations on the left side of the Project Explorer. (If a welcome screen appears, close it via the x in the upper left corner. On CentOS you need to switch the Perspective before it shows up; see step two of the configuration below.)

       (2) Step one: select Preferences under the Window menu.

     A window will pop up, with a new Hadoop Map/Reduce option on the left side of the form. Click that option and select the Hadoop installation directory (e.g. /usr/local/hadoop; on Ubuntu the directory picker is awkward, so typing the path directly works fine).

     Step two: switch to the Map/Reduce development view. Choose Open Perspective -> Other under the Window menu (on CentOS it is Window -> Perspective -> Open Perspective -> Other). In the pop-up window, choose the Map/Reduce option to switch.

      Step three: establish the connection to the Hadoop cluster. Click the Map/Reduce Locations panel in the bottom right corner of the Eclipse window, right-click inside the panel, and select New Hadoop Location.

     In the pop-up General options panel, the settings must be consistent with your Hadoop configuration. The two Host values under General are the same; for a pseudo-distributed setup you can fill in localhost. I use a pseudo-distributed Hadoop configuration with fs.defaultFS set to hdfs://localhost:9000, so change the DFS Master Port to 9000. The Map/Reduce (V2) Master Port can keep its default, and the Location Name can be anything you like.
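The Host and Port values mirror fs.defaultFS in the cluster's own configuration. For reference, a typical pseudo-distributed core-site.xml (under /usr/local/hadoop/etc/hadoop) contains:

```xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

If your core-site.xml uses a different host or port, fill those values into the DFS Master fields instead.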

       The final settings are as shown below:

The Advanced parameters panel is for tuning Hadoop parameters; it effectively fills in the Hadoop configuration items (the configuration files under /usr/local/hadoop/etc/hadoop). For example, since I configured hadoop.tmp.dir, the corresponding entry would need to be changed here. Editing these by hand is cumbersome; we can avoid it by copying the configuration files into the project instead (discussed below).
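The copy-the-config-files workaround can be sketched as follows. Both paths are assumptions; adjust HADOOP_CONF and PROJECT_SRC to your own installation and Eclipse workspace:

```shell
# Copy the site configuration into the Eclipse project's src directory
# so the MapReduce program reads the real cluster settings at run time.
# Both paths below are assumptions; override them via the environment.
HADOOP_CONF="${HADOOP_CONF:-/usr/local/hadoop/etc/hadoop}"
PROJECT_SRC="${PROJECT_SRC:-$HOME/workspace/WordCount/src}"
if [ -d "$HADOOP_CONF" ] && [ -d "$PROJECT_SRC" ]; then
  cp "$HADOOP_CONF/core-site.xml" "$HADOOP_CONF/hdfs-site.xml" "$PROJECT_SRC/"
  echo "copied"
else
  echo "adjust HADOOP_CONF / PROJECT_SRC first"
fi
```

You may also want a log4j.properties in src/ so job logs appear in the Eclipse console.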

In short, just configure the General tab and click Finish; the Map/Reduce Location is now created.

Operating on HDFS files in Eclipse

Once configured, click the MapReduce Location on the left in the Project Explorer (click the triangle to expand it) and you can browse the file list in HDFS directly (the figure below shows the WordCount output in HDFS). Double-click a file to view its contents; right-click to upload, download, or delete files in HDFS, with no more need for tedious hdfs dfs -ls commands.
The file output/part-r-00000 below holds the job output.
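For comparison, the same operations from the command line (assuming the WordCount output directory used above; the sketch degrades gracefully when hdfs is not on the PATH):

```shell
# Command-line equivalents of the Eclipse file-browser actions.
if command -v hdfs >/dev/null 2>&1; then
  hdfs dfs -ls output                   # list the output directory
  hdfs dfs -cat output/part-r-00000     # view the result file
  hdfs dfs -get output/part-r-00000 .   # download it
  # hdfs dfs -rm -r output              # delete it (uncomment with care)
  result="ran"
else
  result="hdfs not on PATH"
fi
echo "$result"
```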

Using Eclipse to view the contents of a file in HDFS

If you cannot see the files, right-click the Location and select Reconnect, or try restarting Eclipse.

Tips

After the contents of HDFS change, Eclipse does not refresh automatically. Right-click the MapReduce Location in the Project Explorer and select Refresh to see the changed files.

Create a MapReduce project in Eclipse

Click the File menu and select New -> Project...:

Create a Project

Select Map/Reduce Project and click Next.

Create a MapReduce project

Fill in the Project name as WordCount and click Finish; the project is created.
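Once the WordCount class is written inside the project, the job can also be packaged and run outside Eclipse. A hedged sketch: the jar name, main class name, and input/output paths below are all assumptions, so substitute your own.

```shell
# Export the project as WordCount.jar from Eclipse first, then:
if command -v hadoop >/dev/null 2>&1; then
  hdfs dfs -mkdir -p input
  hdfs dfs -put /usr/local/hadoop/etc/hadoop/*.xml input  # sample input files
  hadoop jar WordCount.jar WordCount input output
  hdfs dfs -cat output/part-r-00000
  status="submitted"
else
  status="hadoop not on PATH"
fi
echo "$status"
```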

Origin www.cnblogs.com/msq2000/p/11781518.html