Big Data Learning: Sqoop

Download Sqoop from: http://mirror.bit.edu.cn/apache/sqoop/

Extract: tar -zxvf sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz

Rename: mv sqoop-1.4.7.bin__hadoop-2.6.0 sqoop

Configure environment variables (append to /etc/profile):

  export SQOOP_HOME=/data/bigdata/sqoop

  export PATH=$SQOOP_HOME/bin:$HIVE_HOME/bin:$SCALA_HOME/bin:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH

  source /etc/profile  # apply the changes
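After sourcing the profile, it is worth confirming that the variable is set and that Sqoop's bin directory is actually on PATH. A minimal sketch, assuming the install prefix used in this guide:

```shell
# Sanity-check the environment after sourcing /etc/profile.
# /data/bigdata/sqoop is the install prefix used in this guide; adjust to your layout.
export SQOOP_HOME=/data/bigdata/sqoop
export PATH="$SQOOP_HOME/bin:$PATH"

# The variable should be set, and bin/ should appear on PATH:
echo "$SQOOP_HOME"
case ":$PATH:" in
  *":$SQOOP_HOME/bin:"*) echo "PATH ok" ;;
  *)                     echo "PATH missing $SQOOP_HOME/bin" ;;
esac
```

If everything is wired up on a real node, `sqoop version` should then run from any directory.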

Modify the configuration file

  cd sqoop  # enter the Sqoop directory

       cp conf/sqoop-env-template.sh conf/sqoop-env.sh

  Open and edit sqoop-env.sh (since I only have Hive, I only configured Hive; HBase is configured the same way):

    #Set path to where bin/hadoop is available

    export HADOOP_COMMON_HOME=/data/bigdata/hadoop-2.7.6

    #Set path to where hadoop-*-core.jar is available
    export HADOOP_MAPRED_HOME=/data/bigdata/hadoop-2.7.6

    #set the path to where bin/hbase is available
    #export HBASE_HOME=

    #Set the path to where bin/hive is available
    export HIVE_HOME=/data/bigdata/hive
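A typo in one of these paths usually only surfaces later as a confusing import failure, so it can help to verify up front that the directories referenced in sqoop-env.sh actually exist. A small sketch (the paths are this guide's; `check_home` is just a hypothetical helper name):

```shell
# Report whether each configured home directory exists on this machine.
check_home() {
  for d in "$@"; do
    if [ -d "$d" ]; then
      echo "ok: $d"
    else
      echo "missing: $d"
    fi
  done
}

# The homes this guide configures in sqoop-env.sh:
check_home /data/bigdata/hadoop-2.7.6 /data/bigdata/hive
```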

Put the MySQL driver jar into Sqoop's lib directory (it was already installed with Hive earlier, so just copy it over; see: https://www.cnblogs.com/lihuanghao/p/9338194.html)

  cp ../hive/lib/mysql-connector-java-5.1.46.jar lib/

  Note: a later import may fail with an error like "could not find ... HIVE_CONF_DIR". Fix:

  cp ../hive/lib/hive-exec-2.3.3.jar lib/
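If copying the jar alone does not resolve the error, another commonly reported workaround (an assumption on my part, not from the original steps) is to put Hive's jars on Hadoop's classpath so Sqoop's MapReduce job can see them:

```shell
# Workaround sketch: expose Hive's jars via HADOOP_CLASSPATH.
# /data/bigdata/hive is this guide's Hive home; the wildcard stays quoted
# because Hadoop (not the shell) expands classpath wildcards.
export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:/data/bigdata/hive/lib/*"
echo "$HADOOP_CLASSPATH"
```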

This completes the Sqoop installation.

Appendix:

  An import-to-Hive command (see the official docs for more):

  sqoop import \
    --connect 'jdbc:mysql://mysql:3306/health?autoReconnect=true' \
    --username sqoop --password sqoop \
    --table he_count \
    --hive-import --hive-database default --hive-table he_count --hive-overwrite \
    -m 1 \
    -z
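Before running a full import, a cheaper way to verify the JDBC connection is `sqoop list-tables` with the same connection options. A sketch reusing the example's host and credentials (the `preflight` wrapper is my own name; the guard lets it degrade gracefully on a machine without Sqoop):

```shell
# Pre-flight check: list tables over the same JDBC connection as the import.
preflight() {
  # Degrade gracefully when sqoop is not on PATH (e.g. on a dev laptop).
  command -v sqoop >/dev/null 2>&1 || { echo "sqoop not on PATH"; return 0; }
  sqoop list-tables \
    --connect 'jdbc:mysql://mysql:3306/health?autoReconnect=true' \
    --username sqoop --password sqoop
}
preflight
```

If `he_count` shows up in the output, the driver jar, credentials, and network path are all fine, and any remaining import failure is on the Hive side.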


Reposted from www.cnblogs.com/lihuanghao/p/9341948.html