Hadoop HDFS

1. Practice

Enter /app/hadoop-1.1.2 and create a new folder input; inside it, run vi daysn.txt and enter:

daysn wu is handsome
very handsome
wow so so handsome
yes he is

Then save. Next, upload this daysn.txt to HDFS. First create the folder /upload in HDFS:

hadoop fs -mkdir /upload
hadoop fs -ls /

Upload the file using copyFromLocal:

hadoop fs -copyFromLocal daysn.txt /upload/daysn.txt
hadoop fs -ls /

Next we try to read the HDFS file from code.

Create a folder classes under /app/hadoop-1.1.2. Enter the classes directory, run vi Test.java, and enter the following code.

import java.io.InputStream;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.io.IOUtils;

public class Test {
    public static void main(String[] args) throws Exception {
        String uri = args[0];
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        InputStream in = null;
        try {
            in = fs.open(new Path(uri));
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}

Then compile it:

javac -classpath ../hadoop-core-1.1.2.jar Test.java

Then run the compiled Test class with the same classpath, passing the HDFS URI of daysn.txt as the argument (to be fair, the class really ought to have a name like CatFile; just keep the name consistent when typing the commands).
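The open/copy/close pattern in Test.java can be tried without a running cluster. Here is a minimal sketch using only the JDK; the helper name copyBytes and the sample input string are illustrative, not part of the Hadoop API:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.IOException;

public class CatStream {
    // Same structure as Test.java: copy the stream in 4096-byte chunks,
    // and close the input in a finally block.
    static void copyBytes(InputStream in, OutputStream out) throws IOException {
        try {
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) > 0) {
                out.write(buf, 0, n);
            }
        } finally {
            in.close();
        }
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream("daysn wu is handsome\n".getBytes("UTF-8"));
        copyBytes(in, System.out); // prints the sample line
    }
}
```

The only Hadoop-specific parts of Test.java are obtaining the FileSystem and opening the Path; the copy loop itself is plain Java I/O.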

Next, try a variation: generate a text file of a bit over 100 bytes in the local file system as the experimental example, then write a program that reads bytes 101-120 of that file and writes them to HDFS as a new file.

import java.io.File;
import java.io.FileInputStream;
import java.io.OutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.util.Progressable;

public class LocalFile2Hdfs {
    public static void main(String[] args) throws Exception {

        // Read the source and destination file locations from the arguments
        String local = args[0];
        String uri = args[1];

        FileInputStream in = null;
        OutputStream out = null;
        Configuration conf = new Configuration();
        try {
            // Open the local source file
            in = new FileInputStream(new File(local));

            // Get a handle to the destination file system
            FileSystem fs = FileSystem.get(URI.create(uri), conf);
            out = fs.create(new Path(uri), new Progressable() {
                @Override
                public void progress() {
                    System.out.println("*");
                }
            });

            // Skip the first 100 bytes
            in.skip(100);
            byte[] buffer = new byte[20];

            // Read 20 bytes starting at position 101 into the buffer
            int bytesRead = in.read(buffer);
            if (bytesRead >= 0) {
                out.write(buffer, 0, bytesRead);
            }
        } finally {
            IOUtils.closeStream(in);
            IOUtils.closeStream(out);
        }
    }
}

Save the code above as a new file LocalFile2Hdfs.java in the classes directory, then compile it:

javac -classpath ../hadoop-core-1.1.2.jar LocalFile2Hdfs.java
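The skip-and-read logic at the heart of LocalFile2Hdfs can be checked without HDFS at all. This sketch builds a 150-byte buffer in memory and extracts bytes 101-120 from it; the class and method names (SkipReadDemo, copyRange) are made up for the example:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.IOException;

public class SkipReadDemo {
    // Copies up to `length` bytes starting at byte `offset` from in to out;
    // returns the number of bytes actually copied.
    static int copyRange(InputStream in, OutputStream out, long offset, int length) throws IOException {
        in.skip(offset);                 // skip the first `offset` bytes
        byte[] buffer = new byte[length];
        int bytesRead = in.read(buffer); // read up to `length` bytes
        if (bytesRead > 0) {
            out.write(buffer, 0, bytesRead);
        }
        return Math.max(bytesRead, 0);
    }

    public static void main(String[] args) throws IOException {
        // 150 bytes of test data: byte i holds the value i % 128
        byte[] data = new byte[150];
        for (int i = 0; i < data.length; i++) data[i] = (byte) (i % 128);

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        int n = copyRange(new ByteArrayInputStream(data), out, 100, 20);
        System.out.println("copied " + n + " bytes, first byte = " + out.toByteArray()[0]);
        // prints: copied 20 bytes, first byte = 100
    }
}
```

In LocalFile2Hdfs the output stream just happens to be one returned by fs.create(); the skip/read/write sequence is identical.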

In the input folder, add a new.txt with any content you like.

Origin www.cnblogs.com/daysn/p/12286022.html