Spark on YARN fails with: Required executor memory (1024 MB), offHeap memory (0) MB, overhead (384 MB), and Py

The error

Solution

Edit the yarn-site.xml file

<!-- ResourceManager memory settings: two parameters -->
<property>
    <description>The minimum allocation for every container request at the RM,
        in MBs. Memory requests lower than this won't take effect,
        and the specified value will get allocated at minimum.</description>
    <name>yarn.scheduler.minimum-allocation-mb</name>
    <value>1024</value>
</property>
<property>
    <description>The maximum allocation for every container request at the RM,
        in MBs. Memory requests higher than this won't take effect,
        and will get capped to this value.</description>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>4096</value>
</property>

<!-- The following setting is the maximum memory available on a single node;
     neither of the two RM values above should exceed it, otherwise an error occurs. -->
<property>
    <description>Amount of physical memory, in MB, that can be allocated
        for containers.</description>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>8192</value>
</property>

Make sure the value of yarn.nodemanager.resource.memory-mb is greater than both yarn.scheduler.minimum-allocation-mb and yarn.scheduler.maximum-allocation-mb. The error above occurs because the per-executor request (1024 MB of executor memory plus 384 MB of overhead, i.e. 1408 MB) exceeds the cluster's maximum container allocation; raising yarn.scheduler.maximum-allocation-mb as shown resolves it.
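The arithmetic behind the error can be sketched in Python. The helper names and the 10% overhead factor below are illustrative (Spark's actual check lives in its YARN client code), but the 384 MB minimum overhead matches the message above:

```python
# Sketch of the check Spark performs before requesting YARN containers.
# Assumed defaults: 384 MB minimum overhead, 10% overhead factor.

MIN_OVERHEAD_MB = 384    # Spark's minimum executor memory overhead
OVERHEAD_FACTOR = 0.10   # default overhead fraction of executor memory

def required_container_mb(executor_memory_mb: int, offheap_mb: int = 0) -> int:
    """Total memory Spark requests from YARN per executor container."""
    overhead = max(MIN_OVERHEAD_MB, int(executor_memory_mb * OVERHEAD_FACTOR))
    return executor_memory_mb + overhead + offheap_mb

def fits_in_yarn(executor_memory_mb: int, max_allocation_mb: int) -> bool:
    """True if the request stays within yarn.scheduler.maximum-allocation-mb."""
    return required_container_mb(executor_memory_mb) <= max_allocation_mb

print(required_container_mb(1024))   # 1024 + 384 = 1408 MB
print(fits_in_yarn(1024, 1024))      # False -> triggers the error above
print(fits_in_yarn(1024, 4096))      # True after raising the max to 4096
```

With the maximum allocation left at 1024 MB, the 1408 MB request is rejected; with 4096 MB it fits.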


Reposted from blog.csdn.net/m0_55868614/article/details/121345605