1. Installing and configuring Maven
Download the Maven zip archive
Unzip Maven to a directory of your choice
Configure the environment variables:
Create a MAVEN_HOME variable pointing to the Maven directory, then add %MAVEN_HOME%\bin to Path
Open CMD and run mvn -v to verify the configuration
Configure Maven's settings.xml file
Open the settings.xml file in the conf directory under the Maven installation root
Add the Aliyun mirrors inside the <mirrors> element:
<mirror>
    <id>alimaven</id>
    <name>aliyun maven</name>
    <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
    <mirrorOf>central</mirrorOf>
</mirror>
<mirror>
    <id>alimaven-central</id>
    <mirrorOf>central</mirrorOf>
    <name>aliyun maven</name>
    <url>http://maven.aliyun.com/nexus/content/repositories/central/</url>
</mirror>
<mirror>
    <id>ibiblio</id>
    <mirrorOf>central</mirrorOf>
    <name>Human Readable Name for this Mirror.</name>
    <url>http://mirrors.ibiblio.org/pub/mirrors/maven2/</url>
</mirror>
<mirror>
    <id>jboss-public-repository-group</id>
    <mirrorOf>central</mirrorOf>
    <name>JBoss Public Repository Group</name>
    <url>http://repository.jboss.org/nexus/content/groups/public</url>
</mirror>
<mirror>
    <id>central</id>
    <name>Maven Repository Switchboard</name>
    <url>http://repo1.maven.org/maven2/</url>
    <mirrorOf>central</mirrorOf>
</mirror>
<mirror>
    <id>repo2</id>
    <mirrorOf>central</mirrorOf>
    <name>Human Readable Name for this Mirror.</name>
    <url>http://repo2.maven.org/maven2/</url>
</mirror>
Modify the local repository location
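The local repository location is set with the localRepository element in settings.xml. A minimal sketch, assuming a Windows path; replace the directory with any writable location you prefer:

```xml
<!-- settings.xml: the path below is only an example -->
<localRepository>D:\maven\repository</localRepository>
```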
2. Installing and configuring Hadoop
Extract Hadoop to a directory of your choice
Configure the environment variables:
Create a HADOOP_HOME variable pointing to the Hadoop directory, then add %HADOOP_HOME%\bin to Path
Open CMD and run hadoop version to verify the configuration
IDEA environment configuration
Open File -> Settings -> Build, Execution, Deployment -> Build Tools -> Maven and point IDEA at the Maven installation and settings.xml configured above
HDFS client configuration
Create a new Maven project: File -> New -> Project, then select Maven
Configure pom.xml: double-click to open pom.xml
Add the dependency coordinates:
<dependencies>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>RELEASE</version>
    </dependency>
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-core</artifactId>
        <version>2.8.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.7.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>2.7.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>2.7.2</version>
    </dependency>
    <dependency>
        <groupId>jdk.tools</groupId>
        <artifactId>jdk.tools</artifactId>
        <version>1.8</version>
        <scope>system</scope>
        <systemPath>${JAVA_HOME}/lib/tools.jar</systemPath>
    </dependency>
</dependencies>
Download the required jar packages:
click install in IDEA's Maven tool window (or run mvn install)
Configure logging: in the project's src/main/resources directory, create a file named "log4j.properties" with the following content:
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - %m%n
log4j.appender.logfile=org.apache.log4j.FileAppender
log4j.appender.logfile.File=target/spring.log
log4j.appender.logfile.layout=org.apache.log4j.PatternLayout
log4j.appender.logfile.layout.ConversionPattern=%d %p [%c] - %m%n
Create a package
Create the HdfsClient class
Test the code:
package hdfs;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

public class HdfsClient {
    public static void main(String[] args) throws IOException, URISyntaxException, InterruptedException {
        Configuration conf = new Configuration();
        //conf.set("fs.defaultFS", "hdfs://192.168.186.102:9000");

        // 1. Get the HDFS client object, connecting as user "hadoop"
        //FileSystem fs = FileSystem.get(conf);
        FileSystem fs = FileSystem.get(new URI("hdfs://192.168.186.102:9000"), conf, "hadoop");

        // 2. Create a directory on HDFS
        fs.mkdirs(new Path("/test/dashen/shazi"));

        // 3. Release the resources
        fs.close();
        System.out.println("over");
    }
}
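Beyond mkdirs, the same FileSystem handle supports the other common HDFS operations. The sketch below reuses the NameNode address and user from the test above; the local and HDFS paths are only example assumptions, and it needs the running cluster to execute:

```java
package hdfs;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.net.URI;

public class HdfsOpsSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Same NameNode address and user as in HdfsClient above
        FileSystem fs = FileSystem.get(new URI("hdfs://192.168.186.102:9000"), conf, "hadoop");

        // Upload a local file to HDFS (both paths are examples)
        fs.copyFromLocalFile(new Path("d:/hello.txt"), new Path("/test/hello.txt"));

        // List the entries under an HDFS directory
        for (FileStatus status : fs.listStatus(new Path("/test"))) {
            System.out.println(status.getPath() + " " + status.getLen());
        }

        // Delete a path; the second argument enables recursive deletion
        fs.delete(new Path("/test/dashen"), true);

        fs.close();
    }
}
```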