When testing the PutMerge program, the following exception is thrown: java.lang.IllegalArgumentException: Wrong FS: hdfs://192.168.31.225:9000/user/root. The source code of the test program is as follows:
package com.hadoop.demo;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class PutMerge {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        //conf.set("fs.defaultFS", "hdfs://192.168.31.225:9000");
        FileSystem hdfs = FileSystem.get(conf);
        FileSystem local = FileSystem.getLocal(conf);
        Path inputDir = new Path("D:/hadooptest");
        Path hdfsFile = new Path("hdfs://192.168.31.225:9000/user/root/example.txt");
        try {
            FileStatus[] inputFiles = local.listStatus(inputDir);
            FSDataOutputStream out = hdfs.create(hdfsFile);
            for (int i = 0; i < inputFiles.length; i++) {
                System.out.println(inputFiles[i].getPath().getName());
                FSDataInputStream in = local.open(inputFiles[i].getPath());
                byte[] buffer = new byte[256];
                int bytesRead = 0;
                while ((bytesRead = in.read(buffer)) > 0) {
                    out.write(buffer, 0, bytesRead);
                }
                in.close();
            }
            out.close();
        } catch (Exception ex) {
            ex.printStackTrace();
        }
        System.out.println("end---------->");
    }
}
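The root cause can be reproduced without a cluster: when a Hadoop FileSystem is handed a Path, it checks that the path's URI scheme matches its own, and with fs.defaultFS unset, FileSystem.get(conf) returns the local filesystem (scheme file), while the path above uses scheme hdfs. The following is an illustrative sketch of that scheme comparison using only java.net.URI; the class and method names here are hypothetical, not Hadoop's actual internals:

```java
import java.net.URI;

public class WrongFsDemo {
    // Hypothetical re-creation of the scheme check that produces
    // "Wrong FS": reject a path whose scheme differs from the
    // filesystem's own scheme.
    static void checkPath(URI fsUri, URI pathUri) {
        String fsScheme = fsUri.getScheme();      // e.g. "file"
        String pathScheme = pathUri.getScheme();  // e.g. "hdfs"
        if (pathScheme != null && !pathScheme.equalsIgnoreCase(fsScheme)) {
            throw new IllegalArgumentException(
                    "Wrong FS: " + pathUri + ", expected: " + fsUri);
        }
    }

    public static void main(String[] args) {
        // Default filesystem when fs.defaultFS is not configured.
        URI localFs = URI.create("file:///");
        URI target = URI.create("hdfs://192.168.31.225:9000/user/root/example.txt");
        try {
            checkPath(localFs, target);
        } catch (IllegalArgumentException ex) {
            // Prints the "Wrong FS: hdfs://..." message.
            System.out.println(ex.getMessage());
        }
    }
}
```

This is why the stack trace points at hdfs.create(hdfsFile): the "hdfs" object is actually a LocalFileSystem, and the hdfs:// path fails its scheme check.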
Solution:
The exception is thrown because, with fs.defaultFS unconfigured, FileSystem.get(conf) returns the local filesystem (file:///), and hdfs.create() then rejects the hdfs:// path as belonging to the wrong filesystem. Either of the following fixes it:
1. Copy the cluster's core-site.xml and hdfs-site.xml into the project's classpath (for an Eclipse project, the bin folder of the workspace directory), so the client picks up the cluster's fs.defaultFS.
2. Set the default filesystem in code, i.e. uncomment conf.set("fs.defaultFS", "hdfs://192.168.31.225:9000"); in the program above.
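For solution 1, the property the client needs from the cluster's core-site.xml is fs.defaultFS. A minimal fragment is sketched below; the address matches the cluster in the question, so adjust it to your own NameNode:

```xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://192.168.31.225:9000</value>
  </property>
</configuration>
```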