Sqoop: Java Development with Sqoop 1.4.7

Environment: Java 1.8 + Sqoop 1.4.7

This post is mainly a note to self: I have been working on this recently and found very little documentation online, so I am writing it down.

Maven dependencies

  • the JDBC driver jar for your database
  • commons-lang3
  • avro and avro-mapred
  • hadoop-hdfs and hadoop-common
  • the MapReduce client jars

pom.xml

 <dependency>
      <groupId>mysql</groupId>
      <artifactId>mysql-connector-java</artifactId>
      <!-- add a <version> matching your MySQL server unless a parent pom manages it -->
      <scope>runtime</scope>
  </dependency>
  <dependency>
      <groupId>org.apache.sqoop</groupId>
      <artifactId>sqoop</artifactId>
      <version>1.4.7</version>
  </dependency>

  <dependency>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-lang3</artifactId>
      <version>3.0</version>
  </dependency>
  <!--hadoop-->
  <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.8.4</version>
  </dependency>
  <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>2.8.4</version>
  </dependency>
  <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-core</artifactId>
      <version>2.8.4</version>
  </dependency>
  <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-common</artifactId>
      <version>2.8.4</version>
  </dependency>
  <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
      <version>2.8.4</version>
      <scope>test</scope>
  </dependency>
  <dependency>
      <groupId>org.apache.avro</groupId>
      <artifactId>avro-mapred</artifactId>
      <version>1.8.1</version>
  </dependency>
  <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-common</artifactId>
      <version>2.3.2</version>
  </dependency>
  <dependency>
      <groupId>org.apache.avro</groupId>
      <artifactId>avro</artifactId>
      <version>1.8.1</version>
  </dependency>

Example code

The example code below matches the test code found online and in the official documentation.

package com.example.demo;


import org.apache.hadoop.conf.Configuration;
import org.apache.sqoop.Sqoop;
import org.apache.sqoop.hive.HiveConfig;
import org.apache.sqoop.tool.ImportTool;
import org.apache.sqoop.tool.SqoopTool;

import java.io.IOException;

public class SqoopTest {
    public static void main(String[] args) throws IOException {
        System.out.println("begin test sqoop");
        // Arguments are identical to what you would pass to the sqoop CLI.
        String[] arguments = new String[] {
                "--connect", "jdbc:mysql://localhost:3306/testsqoop?useSSL=false",
                "--username", "root",
                "--password", "root",
                "--table", "data_table",
                "--hive-import", "--hive-database", "testsqoop", "--hive-overwrite", "--create-hive-table",
                "--hive-table", "data_table",
                "--delete-target-dir",
        };
        // Sqoop 1.4.x still exposes its tools under the legacy com.cloudera.sqoop package,
        // so the cast is required.
        com.cloudera.sqoop.tool.SqoopTool sqoopTool = (com.cloudera.sqoop.tool.SqoopTool) SqoopTool.getTool("import");
        Configuration conf = new Configuration();
        // fs.defaultFS is the current name of the deprecated fs.default.name property.
        conf.set("fs.defaultFS", "hdfs://localhost:9000");
        // Loads hive-site.xml settings; the returned Configuration is not used directly here.
        Configuration hiveConf = HiveConfig.getHiveConf(conf);
        Sqoop sqoop = new Sqoop(sqoopTool, SqoopTool.loadPlugins(conf));
        int res = Sqoop.runSqoop(sqoop, arguments);
        System.out.println(res);
        System.out.println("sqoop run finished");
    }
}
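For reference, the argument array above is equivalent to running the following sqoop CLI command (assuming the same local MySQL, HDFS, and Hive setup as in the example):

```shell
sqoop import \
  --connect "jdbc:mysql://localhost:3306/testsqoop?useSSL=false" \
  --username root \
  --password root \
  --table data_table \
  --hive-import \
  --hive-database testsqoop \
  --hive-overwrite \
  --create-hive-table \
  --hive-table data_table \
  --delete-target-dir
```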

Tested and working both in a local environment and from a Spring Boot web service.
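When embedding this in a service, you will usually want to build the argument array from parameters rather than hard-code it. A minimal sketch (the class and method names here are my own, not part of Sqoop):

```java
import java.util.ArrayList;
import java.util.List;

public class SqoopArgs {
    // Assembles the same Hive-import arguments used in the example above.
    public static String[] importArgs(String jdbcUrl, String user, String password,
                                      String table, String hiveDb) {
        List<String> a = new ArrayList<>();
        a.add("--connect");  a.add(jdbcUrl);
        a.add("--username"); a.add(user);
        a.add("--password"); a.add(password);
        a.add("--table");    a.add(table);
        a.add("--hive-import");
        a.add("--hive-database"); a.add(hiveDb);
        a.add("--hive-overwrite");
        a.add("--create-hive-table");
        // Mirror the source table name on the Hive side.
        a.add("--hive-table"); a.add(table);
        a.add("--delete-target-dir");
        return a.toArray(new String[0]);
    }

    public static void main(String[] args) {
        String[] built = importArgs("jdbc:mysql://localhost:3306/testsqoop?useSSL=false",
                "root", "root", "data_table", "testsqoop");
        System.out.println(built.length); // 16
    }
}
```

The resulting array can be passed to `Sqoop.runSqoop` exactly as in the example above.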

Reposted from blog.csdn.net/l1028386804/article/details/82670300