Problems encountered configuring HDFS in IDEA

Following an online tutorial, I wrote a Java program that checks whether a file exists in HDFS:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.log4j.BasicConfigurator;


public class test {
    public static void main(String[] args){
        try {
            String filename = "hdfs://localhost:9000/123.txt"; // check whether 123.txt exists in the HDFS root directory
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);
            if (fs.exists(new Path(filename))){
                System.out.println("exist");
            }else {
                System.out.println("not exist");
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

I wanted to try this in IDEA, but IDEA and Eclipse are configured somewhat differently, so I stepped into quite a few pits.
1. Importing the jar packages
In IDEA, jars are imported via File - Project Structure - Modules (in Eclipse it is Libraries),

then select the corresponding module (mine is test) and click the plus sign on the right:
Navigate to where Hadoop is installed; the jar packages live under hadoop/share/hadoop/. Select common/lib and load the jars from it.
2. Copy core-site.xml and hdfs-site.xml from hadoop/etc/hadoop into test/src, i.e. the src directory of the project (in Eclipse they are placed in the bin directory):
otherwise the program will fail at run time because it cannot find the HDFS configuration.

3. Fixing the log4j warning when running the program: log4j:WARN No appenders could be found for logger
There are two ways:
First: add a log4j configuration file under src.
Second: add one line at the start of the main function: BasicConfigurator.configure(); (the reference linked in the original post explains the specific cause of the problem).
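For the first option, the file to add under src is a log4j.properties. A minimal sketch (the exact appender and pattern settings here are my assumption; any valid log4j configuration silences the warning):

```properties
# Minimal console logging setup for log4j 1.x; adjust level and pattern as needed.
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
```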

4. If you see: org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "hdfs"
this is because the imported jar packages are incomplete; it is recommended to import everything under hadoop/share/hadoop/common/lib.
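If importing all the jars still does not help (for example after repackaging into a single jar), a workaround I have seen elsewhere (not from the original post, so treat it as an assumption) is to name the implementation class for the hdfs scheme explicitly in core-site.xml:

```xml
<property>
  <name>fs.hdfs.impl</name>
  <value>org.apache.hadoop.hdfs.DistributedFileSystem</value>
</property>
```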

5. The problem of having to format the NameNode every time Hadoop starts
First create a hadoop_tmp directory in the home directory:
sudo mkdir ~/hadoop_tmp
Then edit core-site.xml in the hadoop/etc/hadoop directory and add the following property:

<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/zcsc/hadoop_tmp</value>
  <description>A base for other temporary directories.</description>
</property>
Note: my user is zcsc, so the directory is /home/zcsc/hadoop_tmp.
Format the NameNode:
hadoop namenode -format
Give hadoop_tmp permissions:
sudo chmod -R 777 /home/zcsc/hadoop_tmp
Then start Hadoop:
start-all.sh
Running the jps command should now show the NameNode.


Origin blog.csdn.net/pursuingparadise/article/details/103887866