Hadoop start-dfs fails with "Permission denied"

The error log looks like this:

yun12-01: starting namenode, logging to hadoop-rand-namenode-yun12-01.out
yun12-01: /logs/hadoop-rand-namenode-yun12-01.out: Permission denied
yun12-01: /logs/hadoop-rand-namenode-yun12-01.out: Permission denied
yun12-01: /logs/hadoop-rand-namenode-yun12-01.out: Permission denied
yun12-01: starting datanode, logging to hadoop-2.4.1/logs/hadoop-rand-datanode-yun12-01.out
yun12-01: /hadoop-2.4.1/sbin/hadoop-daemon.sh: line 151: /hadoop-2.4.1/logs/hadoop-rand-datanode-yun12-01.out: Permission denied
yun12-01: head: cannot open '/logs/hadoop-rand-datanode-yun12-01.out' for reading: No such file or directory
yun12-01: /sbin/hadoop-daemon.sh: line 166: /home/rand/app/hadoop-2.4.1/logs/hadoop-rand-datanode-yun12-01.out: Permission denied
yun12-01: /sbin/hadoop-daemon.sh: line 167: /logs/hadoop-rand-datanode-yun12-01.out: Permission denied
Starting secondary namenodes [0.0.0.0]

When "Permission denied" errors like these appear, the first thing to check is whether the files under the Hadoop directory are owned by root or by the current user (check both the owner and the group).
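A quick way to check is to print the owner and group of the install directory and its logs/ subdirectory. The `HADOOP_HOME` path below is an assumption taken from the log output above; point it at your own install:

```shell
# Show who owns the Hadoop directory and its logs/ subdirectory.
# HADOOP_HOME is an assumption based on the log above; adjust as needed.
HADOOP_HOME=${HADOOP_HOME:-/home/rand/app/hadoop-2.4.1}
if [ -d "$HADOOP_HOME" ]; then
    # %U = owner, %G = group, %n = file name
    stat -c '%U:%G %n' "$HADOOP_HOME" "$HADOOP_HOME/logs"
else
    echo "not found: $HADOOP_HOME (set HADOOP_HOME to your install path)"
fi
```

If either line prints `root:root`, the daemons started as your user cannot write their log files, which matches the errors above.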

Run the following command to change the owner and group of the files to the current user.

sudo chown -R username:username hadoop-2.4.1

Note: replace username with your own current user, and run the command from the parent directory of the Hadoop install. Recursing on the directory itself (rather than on hadoop-2.4.1/*) also fixes the ownership of the directory entry and of any dot files that a * glob would skip.
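After the chown, it is worth verifying that the change took effect before retrying. The directory name below is assumed from the log output; run this from the parent of your Hadoop install:

```shell
# Verify the ownership change (directory name assumed from the log output;
# run from the parent directory of your Hadoop install).
dir=hadoop-2.4.1
if [ -d "$dir" ]; then
    stat -c '%U:%G' "$dir"   # should now print your user:group, not root:root
else
    echo "directory $dir not found here; cd to the Hadoop parent directory first"
fi
```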

If the error persists:

1. Identify the file or directory named in the error; here it is the logs directory.

2. Back up the old logs directory first, then recreate it:

mv logs logsback
mkdir logs
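The backup-and-recreate steps can be sketched end to end. The sketch below uses a throwaway directory created with mktemp so it is safe to run anywhere; on a real node you would run the mv/mkdir inside your Hadoop install directory instead:

```shell
# Demonstrate the backup-and-recreate fix in a throwaway directory;
# on a real cluster node, run the mv/mkdir inside your Hadoop install instead.
work=$(mktemp -d)
cd "$work"
mkdir logs                           # stands in for the unwritable logs dir
touch logs/old.out                   # stands in for an existing log file
mv logs logsback                     # 1. back up the old directory
mkdir logs                           # 2. recreate it, owned by the current user
chown "$(id -un):$(id -gn)" logs     # ensure ownership (usually already correct)
ls -ld logs logsback                 # both should be owned by the current user
```

Because mkdir creates the new logs directory as the current user, the daemon started by start-dfs.sh can write its .out files there.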

3. Finally, restart dfs:

start-dfs.sh


Reposted from blog.csdn.net/Rand_C/article/details/82973471