Start a PySpark shell on YARN:
$ pyspark --master yarn
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /__ / .__/\_,_/_/ /_/\_\   version 3.2.0
      /_/
Using Python version 3.8.12 (default, Nov 12 2021 08:41:47)
Spark context Web UI available at http://master:4041
Spark context available as 'sc' (master = yarn, app id = application_1652774608535_0001).
SparkSession available as 'spark'.
>>>
Submit a simple job:
>>> sc.parallelize([1,2]).map(lambda x:x*10).collect()
The job fails with an error:
java.io.IOException: Cannot run program "python3": error=2, No such file or directory
......
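Under YARN, each executor launches its own Python worker process on its host, so the error means the executor machines cannot find a `python3` command, even if the driver machine can. A simplified sketch of how the worker executable is resolved (illustrative only; the real lookup in Spark also consults the `spark.pyspark.python` configuration property before falling back to the environment variable):

```python
import os

def resolve_python_executable():
    # Simplified sketch: Spark prefers the PYSPARK_PYTHON environment
    # variable; if it is unset, it falls back to the bare command
    # "python3", which must then exist on PATH on every worker node.
    return os.environ.get("PYSPARK_PYTHON", "python3")

os.environ.pop("PYSPARK_PYTHON", None)
print(resolve_python_executable())   # falls back to "python3"

os.environ["PYSPARK_PYTHON"] = "/usr/bin/python3"
print(resolve_python_executable())   # explicit interpreter path wins
```

Because the fallback is a bare command name, any worker node without `python3` on its PATH triggers exactly the `Cannot run program "python3"` IOException above.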
Solution: set the PYSPARK_PYTHON environment variable so the executors know which Python interpreter to launch:
export PYSPARK_PYTHON=$PYTHON_HOME/bin/python3
Restart the PySpark shell, and the job executes successfully.
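To make the setting survive new sessions, it can also be placed in Spark's environment file instead of the shell profile. A minimal sketch, assuming the stock $SPARK_HOME/conf/spark-env.sh location and that $PYTHON_HOME points at your Python install; PYSPARK_DRIVER_PYTHON additionally pins the driver-side interpreter:

```shell
# In $SPARK_HOME/conf/spark-env.sh (paths are assumptions; adjust to your cluster)
export PYSPARK_PYTHON=$PYTHON_HOME/bin/python3
export PYSPARK_DRIVER_PYTHON=$PYTHON_HOME/bin/python3
```

Note that the same interpreter path must be valid on every YARN node, since executors resolve it locally.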