BloomFilter persistence problem

Problem description: using org.apache.hadoop.util.bloom.BloomFilter, the contents of every file under an HDFS folder are added to a BloomFilter, which is then persisted back to HDFS.
Run directly from a main() method, this works fine.
Called from a web application, the persisted file always ends up at 0.0 B.

Cause of the error:
The FSDataOutputStream was neither flushed nor closed, so the buffered data was never written out and the HDFS file stayed empty.

Corrected code:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.util.bloom.BloomFilter;

public static boolean bloomFilterInit(Configuration config, int numbers,
        String fromuri, String touri) throws IOException {
    boolean isInit = false;
    // HashUtil is a helper class (not part of Hadoop) that derives the optimal
    // bit-vector size and hash-function count from the expected number of
    // elements and the target false-positive rate.
    int vectorSize = HashUtil.getOptimalBloomFilterSize(numbers,
            HashUtil.FALSEPOSRATE);
    int hashCount = HashUtil.getOptimalK(numbers, vectorSize);
    FileSystem fileSystem = FileSystem.get(config);
    // readHdfsToBloom2 builds the filter from the files under fromuri
    // (a sketch of it follows below).
    BloomFilter bf = readHdfsToBloom2(vectorSize, hashCount, fileSystem,
            fromuri);
    if (bf != null) {
        FSDataOutputStream strm = fileSystem.create(new Path(touri));
        bf.write(strm);
        // The fix: flush and close the stream, otherwise the HDFS file
        // stays at 0.0 B.
        strm.flush();
        strm.close();
        isInit = true;
    }
    bf = null;
    return isInit;
}
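
The helper readHdfsToBloom2 is not shown in the original post. A minimal sketch of what such a population step could look like, assuming the source folder contains plain-text files with one record per line and a Hadoop 2.x FileSystem API (the body below is an illustration, not the author's implementation):

// Sketch only: the signature matches the call above, the body is an assumption
// (plain-text input, one record per line, MURMUR_HASH).
// Additional imports: java.io.BufferedReader, java.io.InputStreamReader,
// java.nio.charset.StandardCharsets, org.apache.hadoop.fs.FileStatus,
// org.apache.hadoop.util.bloom.Key, org.apache.hadoop.util.hash.Hash
private static BloomFilter readHdfsToBloom2(int vectorSize, int hashCount,
        FileSystem fileSystem, String fromuri) throws IOException {
    BloomFilter bf = new BloomFilter(vectorSize, hashCount, Hash.MURMUR_HASH);
    FileStatus[] statuses = fileSystem.listStatus(new Path(fromuri));
    if (statuses == null || statuses.length == 0) {
        return null; // nothing to add, caller skips persistence
    }
    for (FileStatus status : statuses) {
        if (status.isDirectory()) {
            continue; // skip sub-directories
        }
        BufferedReader reader = new BufferedReader(new InputStreamReader(
                fileSystem.open(status.getPath()), StandardCharsets.UTF_8));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                // each line becomes one member of the Bloom filter
                bf.add(new Key(line.getBytes(StandardCharsets.UTF_8)));
            }
        } finally {
            reader.close();
        }
    }
    return bf;
}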
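
To verify the persisted filter, it can be read back through the same Writable contract: BloomFilter has a no-argument constructor for deserialization, and readFields() restores the vector size, hash count and hash type that write() serialized. A usage sketch (not from the original post; needs import org.apache.hadoop.fs.FSDataInputStream in addition to the imports above):

public static BloomFilter readBloomFromHdfs(Configuration config, String touri)
        throws IOException {
    FileSystem fileSystem = FileSystem.get(config);
    BloomFilter bf = new BloomFilter(); // no-arg constructor, filled by readFields()
    FSDataInputStream in = fileSystem.open(new Path(touri));
    try {
        bf.readFields(in); // symmetric to bf.write(strm) in bloomFilterInit
    } finally {
        in.close();
    }
    return bf;
}

A membership check then goes through bf.membershipTest(new Key(bytes)). If the output stream had not been flushed and closed when writing, this read fails immediately because the persisted file is empty.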

Reposted from belinda407.iteye.com/blog/2212461