When running a Hadoop or Spark program locally from IDEA on Windows, the following error appears:
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
19/04/02 14:50:28 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:382)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:397)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:390)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:610)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:277)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:265)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:810)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:780)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:653)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2427)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2427)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2427)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:295)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2516)
This error is caused by a missing Windows environment variable: Hadoop's Shell class looks for winutils.exe under %HADOOP_HOME%\bin, and since HADOOP_HOME is not set, the path resolves to the "null\bin\winutils.exe" seen in the message.
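If you would rather not touch the system environment variables, Hadoop also reads the `hadoop.home.dir` JVM system property. A minimal sketch of that workaround; the path `C:\hadoop-2.7` is a placeholder for wherever you unpacked winutils:

```java
public class WinutilsWorkaround {
    public static void main(String[] args) {
        // This must run BEFORE any Hadoop/Spark class is touched:
        // org.apache.hadoop.util.Shell resolves winutils.exe in a static
        // initializer, so setting the property later has no effect.
        // "C:\\hadoop-2.7" is a placeholder -- use your actual directory,
        // which must contain bin\winutils.exe.
        System.setProperty("hadoop.home.dir", "C:\\hadoop-2.7");
        System.out.println(System.getProperty("hadoop.home.dir"));
        // ...now create the SparkContext / run the Hadoop job as usual.
    }
}
```

Setting the property in code is handy for local debugging, but the environment-variable fix below works for every run configuration without code changes.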
Solution:
Download the winutils build that matches your Hadoop version.
I am using Hadoop 2.7 here:
Download link: https://pan.baidu.com/s/1sArPAkynnt2R30UDpmse8g
Extraction code: odcf
Download and unpack it, then configure the environment variables:
1. Add a HADOOP_HOME system environment variable pointing at the unpacked directory. Note: it must be a system variable, not a user variable.
2. Append %HADOOP_HOME%\bin to the system Path variable.
Then restart IDEA so it picks up the new environment variables, and the error should be gone.
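You can verify the two steps above from a freshly opened Command Prompt (environment variables are loaded at process start, so an old window will not see them); `C:\hadoop-2.7` below is a placeholder for your unpack directory:

```bat
:: Should print your unpack directory, e.g. C:\hadoop-2.7
echo %HADOOP_HOME%
:: Should print the full path to winutils.exe if Path is set correctly
where winutils.exe
```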