Spark SQL and Hive Integration

Built-in Hive

  1. Copy the required core-site.xml and hdfs-site.xml to Spark's conf directory. If a metastore_db directory already exists under the Spark path, delete it (this is only needed before the first start).
  2. When starting for the first time, so that the metastore is created in the right place, you need to specify the spark.sql.warehouse.dir parameter, for example:
spark-shell --master spark://hadoop01:7077 --conf spark.sql.warehouse.dir=hdfs://hadoop01:9000/spark_warehouse
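
Once the shell is up, you can check that the setting took effect; a minimal sketch, assuming the session was started with the command above:

scala> // should print the HDFS path passed via --conf above
scala> spark.conf.get("spark.sql.warehouse.dir")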

Now create a table:

scala> spark.sql("create table test(id bigint,name string)")

After it runs, you can see the directory for this table on the Hadoop cluster, under the warehouse path on HDFS.
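To confirm the table is usable from the shell, here is a small sketch that writes one sample row into the test table created above and reads it back; the row values are just example data:

scala> // insert a sample row into the managed table
scala> spark.sql("insert into test values(1, 'alice')")
scala> // read it back to verify the table works end to end
scala> spark.sql("select * from test").show()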

External Hive

  1. Copy hive-site.xml to Spark's conf directory.
  2. If the Hive metastore uses a MySQL database, you also need to put the MySQL JDBC driver jar into Spark's jars directory.
  3. SQL queries can then be run through spark-sql or spark-shell, which completes the connection to Hive; see the sketch below.
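
As a quick check that the connection to the external Hive metastore works, a minimal sketch from spark-shell; mydb.orders is a placeholder for a database and table that already exist in your Hive metastore:

scala> // list the databases registered in the external Hive metastore
scala> spark.sql("show databases").show()
scala> // query an existing Hive table; mydb.orders is a placeholder name
scala> spark.sql("select * from mydb.orders limit 10").show()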


Origin blog.csdn.net/drl_blogs/article/details/93109394