Creating Hive tables and operations on data tables

First, Hive has two kinds of tables

  1. Internal tables (managed tables):

      When the table is deleted, the data on HDFS is deleted as well.

  2. External tables:

      When the table is deleted, the data on HDFS is not deleted.

      Data is generally not put into an external table with insert; all of the data is provided by external sources, so Hive does not consider itself the exclusive owner of that data. Dropping the Hive table therefore does not delete the data inside it (see the sketch below).
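
      A minimal sketch of what dropping each kind of table does (using the stu2 internal table and the student external table that are created further below):

      drop table stu2;        -- internal table: the directory /user/hive/warehouse/myhive/stu2 and its data files are removed from HDFS

      drop table student;     -- external table: only the metadata is removed; /user/hive/warehouse/myhive/student and its files stay on HDFS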

Second, operations on Hive data tables

  1. Loading data with insert into is strongly discouraged in general, because it produces small files on HDFS and puts pressure on HDFS metadata management, as shown below.
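
      For example, every statement like the following ends up writing another small file into the table's directory on HDFS (stu2 with columns id and name is the table created further below; the values are just illustrations):

      insert into table stu2 values (1,'zhangsan');
      insert into table stu2 values (2,'lisi');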

  2. If no delimiter is specified when a table is created, Hive defaults to '\001' (ASCII code 1, a non-printing character); see the example below.
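
      For example (a minimal sketch; the table name stu1 is just an illustration), a table created without a row format clause separates its fields with '\001':

      create table if not exists stu1(id int,name string);       -- fields are separated by the default '\001' delimiter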

  3. Specify the delimiter when creating tables

    Create an internal table

    create table if not exists stu2(id int,name string) row format delimited fields terminated by '\t' stored as textfile location '/user/hive/warehouse/myhive/stu2';

    Create an external table

    create external table if not exists student(s_id string,s_name string) row format delimited fields terminated by '\t' stored as textfile location '/user/hive/warehouse/myhive/student';

  4. Create a table from query results, putting the results of the query into the new table

      create table stu3 as select * from stu2;      -- this approach is used a lot

      Create a table from the structure of an existing table (this only copies the table structure, not the data):

      create table stu4 like stu2;

  5. Check the table type:

    desc formatted stu2;
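
    In the output of desc formatted, the "Table Type:" line is where the internal/external distinction shows up, roughly like this:

    desc formatted stu2;        -- Table Type: MANAGED_TABLE
    desc formatted student;     -- Table Type: EXTERNAL_TABLE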

  6. How do you load data into an external table?

   1. Load data from the local file system into the table

     load data local inpath '/export/servers/hivedatas/student.csv' into table student;

     Load data and overwrite existing data

     load data local inpath '/export/servers/hivedatas/student.csv' overwrite into table student;

   2. Load data from the HDFS file system into the table (the data has to be uploaded to HDFS in advance; the load is actually a file move operation)

    load data inpath '/hivedatas/techer.csv' into table techer;
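
     In other words, load data local inpath copies the file from the local disk, while load data inpath without local moves the file within HDFS, which is why the data has to be uploaded beforehand (for example with hdfs dfs -put techer.csv /hivedatas/ from a shell). A rough sketch of checking the result afterwards:

     select count(*) from techer;       -- the rows are queryable right after the load
     -- and the source file has been moved away: /hivedatas/techer.csv no longer exists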

 

  
