Big Data Development: An Introduction to Hive DDL Operations

2021-06-16 18:13:29

For data management, Hive provides HQL, a SQL-like language. Within the Hadoop ecosystem, Hive is positioned as a data warehouse tool, and all data operations are performed through HQL. HQL can be learned in two parts: DDL and DML. In today's big data development tutorial, we start with an introduction to Hive DDL operations.

For anyone with a SQL background, HQL is very easy to pick up. DDL operations mainly cover creating objects in a database: tables, views, indexes, and so on.

Examples of common Hive DDL operations:

※"[ ]"括起来的代表可以写也可以不写的语句。

1. Create a database:

CREATE DATABASE name;

2. Show commands:

show tables;

show databases;

show partitions table_name;

show functions;

describe extended table_name.col_name;
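As a quick sketch of these commands in use (assuming the page_view table created later in this article already exists):

```sql
-- List databases, and tables in the current database
SHOW DATABASES;
SHOW TABLES;

-- List the partitions of a specific partitioned table
SHOW PARTITIONS page_view;

-- Show detailed metadata (columns, location, SerDe, ...) for a table
DESCRIBE EXTENDED page_view;
```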

3. Create a table:

CREATE [EXTERNAL] TABLE [IF NOT EXISTS] table_name

[(col_name data_type [COMMENT col_comment], ...)]

[COMMENT table_comment]

[PARTITIONED BY (col_name data_type [COMMENT col_comment], ...)]

[CLUSTERED BY (col_name, col_name, ...)

[SORTED BY (col_name [ASC|DESC], ...)] INTO num_buckets BUCKETS]

[ROW FORMAT row_format]

[STORED AS file_format]

[LOCATION hdfs_path]

CREATE TABLE creates a table with the given name. If a table of the same name already exists, an exception is thrown; the IF NOT EXISTS option can be used to ignore the exception;

The EXTERNAL keyword creates an external table, specifying a path (LOCATION) to the actual data at creation time;

LIKE copies an existing table's schema without copying its data;

COMMENT adds a description to the table or its columns;

ROW FORMAT

DELIMITED [FIELDS TERMINATED BY char] [COLLECTION ITEMS TERMINATED BY char]

        [MAP KEYS TERMINATED BY char] [LINES TERMINATED BY char]

   | SERDE serde_name [WITH SERDEPROPERTIES (property_name=property_value, property_name=property_value, ...)]

STORED AS

SEQUENCEFILE

            | TEXTFILE

            | RCFILE   

            | INPUTFORMAT input_format_classname OUTPUTFORMAT

output_format_classname

If the file data is plain text, use STORED AS TEXTFILE.

If the data needs to be compressed, use STORED AS SEQUENCEFILE.
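Putting the ROW FORMAT and STORED AS clauses together, a delimited plain-text table might look like this (the table and column names are illustrative, not from the original article):

```sql
CREATE TABLE IF NOT EXISTS access_log (
  host    STRING,
  request STRING,
  status  INT
)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY '\t'   -- columns separated by tabs
  LINES TERMINATED BY '\n'    -- one record per line
STORED AS TEXTFILE;
```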

4. Create a simple table:

CREATE TABLE person(name STRING,age INT);

5. Create an external table:

CREATE EXTERNAL TABLE page_view(viewTime INT, userid BIGINT,

     page_url STRING, referrer_url STRING,

     ip STRING COMMENT 'IP Address of the User',

     country STRING COMMENT 'country of origination')

COMMENT 'Description of the table goes here'

ROW FORMAT DELIMITED FIELDS TERMINATED BY '\054'

STORED AS TEXTFILE

LOCATION '<hdfs_location>';

6. Create a partitioned table:

CREATE TABLE par_table(viewTime INT, userid BIGINT,

     page_url STRING, referrer_url STRING,

     ip STRING COMMENT 'IP Address of the User')

COMMENT 'This is the page view table'

PARTITIONED BY(date STRING, pos STRING)

ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'

   LINES TERMINATED BY '\n'

STORED AS SEQUENCEFILE;

7. Create a bucketed table:

CREATE TABLE par_table(viewTime INT, userid BIGINT,

     page_url STRING, referrer_url STRING,

     ip STRING COMMENT 'IP Address of the User')

COMMENT 'This is the page view table'

PARTITIONED BY(date STRING, pos STRING)

CLUSTERED BY(userid) SORTED BY(viewTime) INTO 32 BUCKETS

ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'

   LINES TERMINATED BY '\n'

STORED AS SEQUENCEFILE;
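Note: on older Hive versions (before 2.0), inserts only honor the bucket definition if bucketing enforcement is switched on first; from Hive 2.0 onward it is always enforced. A minimal sketch:

```sql
-- Required on Hive < 2.0 so that INSERT writes one file per bucket
SET hive.enforce.bucketing = true;
```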

8. Create a table with an "index" field (implemented here as a partition column):

CREATE TABLE invites (foo INT, bar STRING) PARTITIONED BY (dindex STRING);

9. Copy an empty table (schema only, no data):

CREATE TABLE empty_key_value_store

LIKE key_value_store;

10. Show all tables:

SHOW TABLES;

11. Show tables matching a regular expression:

hive> SHOW TABLES '.*s';

12. Add a column to a table:

ALTER TABLE pokes ADD COLUMNS (new_col INT);

13. Add a column with a comment:

hive> ALTER TABLE invites ADD COLUMNS (new_col2 INT COMMENT 'a comment');

14. Remove columns (REPLACE COLUMNS rewrites the entire column list; any column not listed is dropped):

hive> ALTER TABLE test REPLACE COLUMNS(id BIGINT, name STRING);

15. Rename a table:

hive> ALTER TABLE events RENAME TO 3koobecaf;

16. Add and drop partitions:

#Add:

ALTER TABLE table_name ADD [IF NOT EXISTS] partition_spec [ LOCATION 'location1' ] partition_spec [ LOCATION 'location2' ] ...

      partition_spec:

  : PARTITION (partition_col = partition_col_value, partition_col = partition_col_value, ...)

#Drop:

ALTER TABLE table_name DROP partition_spec, partition_spec,...
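A concrete sketch against the par_table defined earlier (the partition values are illustrative):

```sql
-- Add a partition; LOCATION may optionally point at an explicit HDFS path
ALTER TABLE par_table ADD IF NOT EXISTS
  PARTITION (date = '2021-06-16', pos = '1');

-- Drop the same partition
ALTER TABLE par_table DROP
  PARTITION (date = '2021-06-16', pos = '1');
```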

17. Change a table's file format and organization:

ALTER TABLE table_name SET FILEFORMAT file_format

ALTER TABLE table_name CLUSTERED BY(userid) SORTED BY(viewTime) INTO num_buckets BUCKETS

#These commands modify the table's physical storage properties
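For example, switching the earlier par_table to SequenceFile storage and re-declaring its bucketing (the bucket count is illustrative):

```sql
-- Change the declared file format
ALTER TABLE par_table SET FILEFORMAT SEQUENCEFILE;

-- Change the declared bucketing
ALTER TABLE par_table CLUSTERED BY (userid)
  SORTED BY (viewTime) INTO 64 BUCKETS;
```

Note that these statements change table metadata only; existing data files are not rewritten or converted.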

18. Create and drop views:

#Create a view:

CREATE VIEW [IF NOT EXISTS] view_name [ (column_name [COMMENT column_comment], ...) ][COMMENT view_comment][TBLPROPERTIES (property_name = property_value, ...)] AS select_statement;

#Drop a view:

DROP VIEW view_name;
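A minimal sketch using the person table created earlier (the view name and filter are illustrative):

```sql
-- Create a view over person, keeping only adults
CREATE VIEW IF NOT EXISTS adult_person
COMMENT 'People aged 18 or older'
AS SELECT name, age FROM person WHERE age >= 18;

-- Drop it again
DROP VIEW IF EXISTS adult_person;
```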
