What should I do if the Hadoop 50070 port cannot be opened?

Port 50070 serves the HDFS web management page of Hadoop. When setting up a Hadoop cluster, big data engineers sometimes find that the page on port 50070 cannot be opened. This problem has several possible causes; to track it down, work through the following checks in order.

1. Check whether the Namenode is successfully deployed

To check whether the NameNode is running, use the command /etc/init.d/hadoop-0.20-namenode status (for packaged installations of that version) or the jps command. If the NameNode is not running, redeploy or restart it; if it is running, continue to the next check.
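A quick way to apply this check is to look for a NameNode line in the jps output. The sample output below is illustrative only (an assumption; your process IDs and list will differ) -- on a real node you would pipe the actual jps output instead:

```shell
# jps lists the running Java processes; a healthy NameNode host shows a "NameNode" line.
# sample_jps stands in for real `jps` output (an assumption for illustration).
sample_jps="3122 NameNode
3290 SecondaryNameNode
3501 Jps"
if echo "$sample_jps" | grep -Eq '^[0-9]+ NameNode$'; then
  nn_status="NameNode is running"
else
  nn_status="NameNode is missing - redeploy it"
fi
echo "$nn_status"
```

Note that matching the whole line avoids a false positive from the SecondaryNameNode entry.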

2. Check whether the datanode is successfully deployed

To check whether each DataNode is running, use the jps command on the worker nodes. If a DataNode is down, locate the problem node and fix it; if all DataNodes are up, go to step 3.
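The same jps pattern applies on each worker node, looking for a DataNode line instead. Again the sample output is an assumption for illustration:

```shell
# On a worker node, jps should show a "DataNode" line if the daemon is up.
# sample_jps stands in for real `jps` output (an assumption for illustration).
sample_jps="2817 DataNode
2933 Jps"
if echo "$sample_jps" | grep -Eq '^[0-9]+ DataNode$'; then
  dn_status="DataNode is running"
else
  dn_status="DataNode is missing on this node"
fi
echo "$dn_status"
```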

3. Check the firewall and the port binding

First confirm that the firewall is not blocking port 50070: if the firewall is enabled, open the port (or disable the firewall temporarily to test). Then check which local address the port is listening on:

netstat -ant   # view locally listening ports

127.0.0.1:50070
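A loopback binding like this can be detected with a small check. The listen_line below mimics a typical netstat -ant LISTEN line (an assumption about your host's output format):

```shell
# A LISTEN line bound to 127.0.0.1:50070 means only local clients can reach the page.
# listen_line stands in for one line of real `netstat -ant` output (an assumption).
listen_line="tcp        0      0 127.0.0.1:50070         0.0.0.0:*               LISTEN"
if echo "$listen_line" | grep -q '127.0.0.1:50070'; then
  bind_status="bound to loopback only - external access will fail"
else
  bind_status="bound to an externally reachable address"
fi
echo "$bind_status"
```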

If the port is bound to 127.0.0.1 as above, change the bind address of the web UI port in hdfs-site.xml:

<property>
  <name>dfs.http.address</name>
  <value>0.0.0.0:50070</value>
</property>

Binding to 0.0.0.0 instead of the local loopback IP allows external hosts to reach port 50070 on this machine. Restart HDFS after changing the configuration so the new binding takes effect.
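If the binding is already correct but the page is still unreachable from other machines, the firewall is the usual suspect. The sketch below only detects which common firewall tool is present and prints the matching command to allow the port; firewalld and iptables are assumptions about the host, and neither command is actually executed:

```shell
# Detect a common firewall tool and suggest how to allow port 50070 through it.
# (Which tool applies depends on the distribution; these are assumptions, not guarantees.)
if command -v firewall-cmd >/dev/null 2>&1; then
  fw_hint="firewalld: firewall-cmd --permanent --add-port=50070/tcp && firewall-cmd --reload"
elif command -v iptables >/dev/null 2>&1; then
  fw_hint="iptables: iptables -I INPUT -p tcp --dport 50070 -j ACCEPT"
else
  fw_hint="no common firewall tool detected"
fi
echo "$fw_hint"
```

Allowing the single port is preferable to disabling the firewall outright on a production node.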

The above is the normal troubleshooting sequence for a Hadoop port 50070 that cannot be opened. If all three checks pass but the problem persists, also try clearing the browser cache.

 
