Download: sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz
1. In the conf directory, rename the template files
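If the tarball is not already on the server, it can be fetched from the Apache archive and unpacked. The install path below matches the layout used in the rest of this guide; adjust it for your own environment.

```shell
# Download Sqoop 1.4.7 from the Apache release archive
cd /data/bigData/sqoop
wget https://archive.apache.org/dist/sqoop/1.4.7/sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz

# Unpack and rename the directory to the shorter name used below
tar -zxvf sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz
mv sqoop-1.4.7.bin__hadoop-2.6.0 sqoop-1.4.7
```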
[root@hadoop01 conf]# mv sqoop-env-template.sh sqoop-env.sh
[root@hadoop01 conf]# mv sqoop-site-template.xml sqoop-site.xml
2. Edit the configuration file
[root@hadoop01 conf]# vi sqoop-env.sh
#Set path to where bin/hadoop is available
export HADOOP_COMMON_HOME=/data/bigData/hadoop-2.9.2
#Set path to where hadoop-*-core.jar is available
export HADOOP_MAPRED_HOME=/data/bigData/hadoop-2.9.2
#set the path to where bin/hbase is available
#export HBASE_HOME=
#Set the path to where bin/hive is available
export HIVE_HOME=/data/bigData/hive
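Note that Sqoop does not ship with JDBC drivers. To import from or export to MySQL, the connector jar must be placed in Sqoop's lib directory; the jar filename and source path below are illustrative, use whatever connector version matches your MySQL server.

```shell
# Copy the MySQL JDBC driver (version shown is illustrative)
# into Sqoop's lib directory so JDBC connections can be made
cp /data/softs/mysql-connector-java-5.1.47.jar \
   /data/bigData/sqoop/sqoop-1.4.7/lib/
```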
3. Add Sqoop to the global PATH
[root@hadoop01 /]# vi /etc/profile
#sqoop
export SQOOP_HOME=/data/bigData/sqoop/sqoop-1.4.7
export PATH=$SQOOP_HOME/bin:$PATH
#spark
export SPARK_HOME=/opt/spark/standSpark/spark-2.0.1-bin-hadoop2.7
export PYTHONSPARK=$SPARK_HOME/python
export PATH=$SPARK_HOME/bin:$PATH
4. Reload the profile so the new PATH takes effect
[root@hadoop01 /]# source /etc/profile
5. Verify that Sqoop works
[root@hadoop01 /]# sqoop help
Warning: /data/bigData/sqoop/sqoop-1.4.7/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /data/bigData/sqoop/sqoop-1.4.7/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /data/bigData/sqoop/sqoop-1.4.7/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
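These warnings are harmless if HCatalog and Accumulo are not used. They come from existence checks in `bin/configure-sqoop`; a common workaround (optional, and only advisable if you do not use those components) is to comment out the corresponding check blocks in that script:

```shell
# Open the startup check script
vi $SQOOP_HOME/bin/configure-sqoop
# Prefix the lines of the check blocks such as
#   if [ ! -d "${HCAT_HOME}" ]; then ...
#   if [ ! -d "${ACCUMULO_HOME}" ]; then ...
# with '#' to suppress the warnings on every invocation
```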
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/bigData/hadoop-2.9.2/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hBase/pseduo_hbase-0.98.17-hadoop2/hbase-0.98.17-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
20/01/04 12:47:35 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
usage: sqoop COMMAND [ARGS]
Available commands:
codegen Generate code to interact with database records
create-hive-table Import a table definition into Hive
eval Evaluate a SQL statement and display the results
export Export an HDFS directory to a database table
help List available commands
import Import a table from a database to HDFS
import-all-tables Import tables from a database to HDFS
import-mainframe Import datasets from a mainframe server to HDFS
job Work with saved jobs
list-databases List available databases on a server
list-tables List available tables in a database
merge Merge results of incremental imports
metastore Run a standalone Sqoop metastore
version Display version information
See 'sqoop help COMMAND' for information on a specific command.
As the output above shows, Sqoop has been successfully deployed on this single machine.
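As a first real command, `list-databases` is a convenient way to confirm that Sqoop can actually reach a database through JDBC. The host, username, and password below are placeholders; substitute the credentials of your own MySQL server (this also requires the MySQL connector jar in `$SQOOP_HOME/lib`).

```shell
# Hypothetical connectivity check: list databases on a MySQL server
# (host, port, user, and password are placeholders)
sqoop list-databases \
  --connect jdbc:mysql://hadoop01:3306/ \
  --username root \
  --password '******'
```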