Connecting to the HDFS file system from Java and reading a file

package hadoop;

import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.URL;

import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;

public class Hellohdfs {
    
    private static final String PATH="hdfs://192.168.199.132:9000/test.txt";

    public static void main(String[] args) throws Exception {
        // Register the HDFS protocol handler so java.net.URL can open hdfs:// URLs.
        // Note: setURLStreamHandlerFactory may only be called once per JVM.
        URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());

        // Open a stream to the file through its hdfs:// URL.
        InputStream in = new URL(PATH).openStream();

        // Read the file line by line and print it; try-with-resources closes the reader and stream.
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(in))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }

        // Alternative: copy the raw bytes straight to stdout with Hadoop's IOUtils:
        // org.apache.hadoop.io.IOUtils.copyBytes(in, System.out, 1024, true);
    }
}
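The commented-out line at the end of main points at an alternative approach: Hadoop's own org.apache.hadoop.io.IOUtils can copy the raw bytes straight to standard output instead of reading line by line. A minimal sketch of that variant follows; the class name HellohdfsCopyBytes is only an illustrative placeholder, and the path is the same as above.

package hadoop;

import java.io.InputStream;
import java.net.URL;

import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;
import org.apache.hadoop.io.IOUtils;

public class HellohdfsCopyBytes {

    private static final String PATH = "hdfs://192.168.199.132:9000/test.txt";

    public static void main(String[] args) throws Exception {
        // Register the hdfs:// protocol handler (once per JVM).
        URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());

        InputStream in = new URL(PATH).openStream();

        // Copy the stream to stdout in 1024-byte chunks; the trailing 'true'
        // asks IOUtils to close the stream once the copy finishes.
        IOUtils.copyBytes(in, System.out, 1024, true);
    }
}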

At this point you may run into a "Connection refused" error. If so, fix it with the following steps.

In core-site.xml, change localhost to the machine's IP address.
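For reference, that address usually lives in the fs.defaultFS property (fs.default.name on older Hadoop versions), so the relevant part of core-site.xml would look roughly like this, with the IP and port replaced by your own:

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://192.168.199.132:9000</value>
    </property>
</configuration>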

Then edit the /etc/hosts file:

replace localhost with your own IP address,

and add the line:

0.0.0.0  192.168.199.132  (your own IP address)
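Taken literally, those two /etc/hosts edits leave entries roughly like the following (a sketch only; the rest of the file varies from system to system):

127.0.0.1    192.168.199.132
0.0.0.0      192.168.199.132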

Reposted from blog.csdn.net/qq_42039996/article/details/83350951