Finding the Location of a File in an HDFS Cluster


     通过"FileSystem.getFileBlockLocation(FileStatus file,long start,long len)"可查找指定文件在HDFS集群上的位置,其中file为文件的完整路径,start和len来标识查找文件的路径。

The Java implementation is as follows:

package com.njupt.hadoop;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

public class FindFileOnHDFS {

    public static void main(String[] args) throws Exception {
        getHDFSNodes();
        getFileLocal();
    }

    // Print the datanodes that hold each block of /user/root/20120722/word.txt.
    public static void getFileLocal() throws Exception {
        Configuration conf = new Configuration();
        FileSystem hdfs = FileSystem.get(conf);
        Path fpath = new Path("/user/root/20120722/word.txt");

        // Ask for the block locations of the whole file: offset 0, length = file size.
        FileStatus fileStatus = hdfs.getFileStatus(fpath);
        BlockLocation[] blkLocations = hdfs.getFileBlockLocations(fileStatus, 0, fileStatus.getLen());

        for (int i = 0; i < blkLocations.length; i++) {
            // Each block may be replicated on several datanodes; print every host,
            // rather than indexing the host array by the block index.
            for (String host : blkLocations[i].getHosts()) {
                System.out.println("block_" + i + "_location:" + host);
            }
        }
    }

    // List the hostnames of all datanodes in the cluster.
    public static void getHDFSNodes() throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        DistributedFileSystem hdfs = (DistributedFileSystem) fs;
        DatanodeInfo[] dataNodeStats = hdfs.getDataNodeStats();

        for (int i = 0; i < dataNodeStats.length; i++) {
            System.out.println("DataNode_" + i + "_Node:" + dataNodeStats[i].getHostName());
        }
    }
}
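As a minimal alternative sketch (not from the original post, and assuming Hadoop 2.x or later): FileSystem.listFiles() returns LocatedFileStatus objects that already carry block locations, so a directory can be scanned and its blocks located in one pass, without a separate getFileBlockLocations() call per file. The class name ListBlockLocations and the reuse of the /user/root/20120722 directory below are illustrative choices only.

package com.njupt.hadoop;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocatedFileStatus;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.RemoteIterator;

public class ListBlockLocations {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem hdfs = FileSystem.get(conf);

        // listFiles() recursively yields LocatedFileStatus objects, which already
        // include block locations for each file.
        RemoteIterator<LocatedFileStatus> it =
                hdfs.listFiles(new Path("/user/root/20120722"), true);

        while (it.hasNext()) {
            LocatedFileStatus status = it.next();
            System.out.println(status.getPath());
            BlockLocation[] blocks = status.getBlockLocations();
            for (int i = 0; i < blocks.length; i++) {
                // Print every datanode that holds a replica of this block.
                for (String host : blocks[i].getHosts()) {
                    System.out.println("  block_" + i + "_location:" + host);
                }
            }
        }
    }
}

Packaged into a jar, either class can be run in the usual way, for example hadoop jar <your-jar> com.njupt.hadoop.FindFileOnHDFS (the jar name is a placeholder). The command-line tool hdfs fsck <path> -files -blocks -locations reports similar block placement information without writing any code.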


Reposted from blog.csdn.net/sfhinsc/article/details/84023534