MapReduce job fails with "java.lang.OutOfMemoryError: Java heap space"

After submitting the job, the map phase failed before even reaching 0% progress, with:
org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.OutOfMemoryError: Java heap space

This hint pointed me in the right direction:
"For us to see why your job is running out of memory we would probably need to see your code. Perhaps you are creating memory-intensive objects every map() that could instead be created once in setup() and re-used every map()?"

I went back and checked my code, and sure enough, a HashSet was being initialized inside map(). Moving the initialization into setup() and calling clear() on the set before each use in map() solved the problem.
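The fix above can be sketched roughly as follows. The class and method names here are hypothetical stand-ins (not the actual job code): the field initialized once plays the role of state created in Mapper.setup(), and each call clears the reused set the way map() would before processing a record.

```java
import java.util.HashSet;
import java.util.Set;

// Illustrative sketch of the fix: allocate the HashSet once (as setup()
// would), then clear() and reuse it on every call instead of constructing
// a fresh HashSet per map() invocation, which floods the heap with garbage.
public class LineDeduper {
    // Allocated once, analogous to a field initialized in Mapper.setup()
    private final Set<String> seen = new HashSet<>();

    // Analogue of one map() call: reset the reused set, then fill it
    public int countUnique(String[] tokens) {
        seen.clear();
        for (String t : tokens) {
            seen.add(t);
        }
        return seen.size();
    }
}
```

Because clear() only resets the set's entries without discarding the backing array, repeated calls generate far less garbage than constructing a new HashSet for every input record.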

Reposted from belinda407.iteye.com/blog/2213003