Troubleshooting a java.net.ConnectException: Connection refused error when reading a file from HDFS

 

The company's Hadoop cluster was set up by my colleagues before I joined. As a newcomer, I tried to read a file on HDFS from the PySpark shell with the following commands:

>>> word=sc.textFile("hdfs://localhost:9000/user/hadoop/test.txt")
>>> word.first()

Error: java.net.ConnectException: Call From hadoop/133.0.123.130 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused. At first this looked like a problem connecting to the local HDFS server. However, when I listed the files on HDFS, they all displayed normally, which showed that communication with the local HDFS server itself was fine. After some thought, I tried reading the file a different way:

>>> word=sc.textFile("/user/hadoop/test.txt")
>>> word.first()

Because Spark reads from HDFS by default, this form of path also works, and it ran normally. Now the problem was narrowed down: "localhost:9000" was the culprit. I checked the port configured in core-site.xml under hadoop/etc/hadoop.
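For reference, the setting in question is the fs.defaultFS property (fs.default.name on older Hadoop versions). A minimal core-site.xml sketch, using the host and port from this article, which may differ on your cluster:

```xml
<configuration>
  <!-- The NameNode address that clients connect to. The host name and
       port here are the ones from this article, not universal values. -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```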

The port was configured correctly, so the IP address that localhost resolved to had to be wrong. I then looked at the hosts file and found:

The IP address mapped to localhost did not match the server's actual IP address. That was the root cause. I changed the command to read from HDFS using the host name instead:
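To confirm this kind of mismatch, you can compare what the two names resolve to. A small Python sketch; the host name "hadoop" comes from the error message above and is an assumption about your setup:

```python
import socket

# Resolve both names and compare the results. If "localhost" maps to
# 127.0.0.1 while the NameNode only listens on the server's real
# address, connections to localhost:9000 will be refused.
for name in ("localhost", "hadoop"):
    try:
        print(name, "->", socket.gethostbyname(name))
    except socket.gaierror:
        print(name, "-> cannot be resolved")
```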

>>> word=sc.textFile("hdfs://hadoop:9000/user/hadoop/test.txt")
>>> word.first()

This time it ran normally.
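An alternative to changing the URL is to fix the mapping in /etc/hosts so the host name resolves to the server's real address. A sketch using the addresses from this article:

```
127.0.0.1       localhost
133.0.123.130   hadoop
```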

 


Origin www.cnblogs.com/hgz-dm/p/11356357.html