[2019/7/28] Summer self-study - Week 3 progress report

  This week I mainly got familiar with HDFS operations.

  After installing Hadoop on the virtual machine, I need to get familiar with a series of operations on its own file system, HDFS, and on MapReduce, to pave the way for the upcoming HBase learning. Because HBase stores its files on HDFS and processes data with MapReduce, many HBase operations involve heavy use of HDFS, so I need to configure the physical machine properly and set up the Eclipse plugin on it.

  The first step is to start Hadoop (on the VM, via sbin/start-dfs.sh).

  Before this, while configuring Hadoop I changed the localhost I had used for testing in core-site.xml to the virtual machine's IP, and turned off Ubuntu's firewall beforehand, to make sure Eclipse on the physical machine could connect; otherwise an exception like this appears:

java.lang.IllegalArgumentException: Wrong FS: hdfs://192.168.0.107:9000/user/hadoop/test.txt, expected: hdfs://192.168.0.106:9000

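The root cause is that the authority in the Path has to match the fs.defaultFS the client sees. Below is a minimal sketch of two ways to keep them consistent; the class name WrongFsFix is mine, and the IP is the one from this setup (adjust it to your own VM):

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class WrongFsFix {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Option 1: set the default file system explicitly so that it
        // matches the authority used in the Path.
        conf.set("fs.defaultFS", "hdfs://192.168.0.106:9000");
        FileSystem fs1 = FileSystem.get(conf);

        // Option 2: derive the file system from the URI itself, which
        // avoids the "Wrong FS" mismatch altogether.
        FileSystem fs2 = FileSystem.get(
                URI.create("hdfs://192.168.0.106:9000/user/hadoop/test.txt"), conf);

        System.out.println(fs1.getUri() + " / " + fs2.getUri());
    }
}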

  Then in Eclipse you can create a Java project and import hadoop-common-2.7.7.jar and hadoop-hdfs-2.7.7.jar (the dependency jars under each one's lib directory must be imported as well, but fortunately the two sets of dependencies largely overlap), and then write a test case.

package Chapter3;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class test {
    public static void main(String[] args) {
        try {
            String filename = "hdfs://192.168.0.106:9000/user/hadoop/test.txt";
            // Configuration reads core-site.xml / hdfs-site.xml from the
            // classpath, so fs.defaultFS must match the authority above.
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);
            if (fs.exists(new Path(filename))) {
                System.out.println("File exists");
            } else {
                System.out.println("File does not exist");
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

  You also need to copy Hadoop's two core configuration files, core-site.xml and hdfs-site.xml, into the project's bin folder so they end up on the classpath.
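A quick way to verify that those files were actually picked up is to print fs.defaultFS (the class name ConfCheck is made up for illustration); if it prints file:/// instead of hdfs://192.168.0.106:9000, the files are not on the classpath:

import org.apache.hadoop.conf.Configuration;

public class ConfCheck {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // Prints the value loaded from core-site.xml if it was found,
        // otherwise the built-in default file:///.
        System.out.println(conf.get("fs.defaultFS"));
    }
}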

  With that in place, the test case runs smoothly.
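For reference, the other basic HDFS operations follow the same FileSystem pattern. Here is a minimal sketch, with a hypothetical path and contents, of writing a file into HDFS and reading it back:

import java.io.BufferedReader;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReadWriteDemo {
    public static void main(String[] args) throws Exception {
        // Picks up core-site.xml / hdfs-site.xml from the classpath,
        // as in the test case above.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path file = new Path("/user/hadoop/demo.txt"); // hypothetical path

        // Write a line into HDFS (overwrites the file if it exists).
        FSDataOutputStream out = fs.create(file);
        out.writeBytes("hello hdfs\n");
        out.close();

        // Read the line back.
        FSDataInputStream in = fs.open(file);
        BufferedReader reader = new BufferedReader(new InputStreamReader(in));
        System.out.println(reader.readLine());
        reader.close();
        fs.close();
    }
}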

 

  This week I mainly tested accessing Hadoop on the virtual machine from the physical machine, the Eclipse plugin, and a few basic modes of operating MapReduce and HDFS. HBase will rely on these HDFS operations later, as my learning deepens.

 
