1. Partition Table Overview
A partition table is still an internal (managed) table. When creating the table you can define one or more partition columns; you then name a concrete partition when loading data, and queries that restrict to a specific partition read only that partition's data, which improves efficiency. A partition can be thought of as a special column of the table; the keyword is partitioned.
Physically, a partition table splits the table's data into several labelled smaller files, one set per partition, which makes lookups convenient.
2. Creating a Partition Table
Here we take emp.csv, exported from the emp table of the Oracle user scott, and store it in a Hive partition table partitioned by department number. The emp table's columns and a sample row are as follows:
empno, ename, job, mgr, hiredate, salary, comm, deptno
7499, ALLEN, SALESMAN, 7698, 1981/2/20, 1600, 300, 30
hive> create table part_emp(
> empno int,
> ename string,
> job string,
> mgr int,
> hiredate string,
> salary float,
> comm float
> )
> partitioned by (deptno int)
> row format delimited fields terminated by ',';
OK
Time taken: 0.061 seconds
Inspect the table: the # Partition Information block lists the partition columns, here the single column deptno.
hive> desc extended part_emp;
OK
empno int None
ename string None
job string None
mgr int None
hiredate string None
salary float None
comm float None
deptno int None
# Partition Information
# col_name data_type comment
deptno int None
3. Loading Data into a Partition Table
1. Loading data with the load command
The first load targets partition deptno=10:
hive> load data local inpath '/root/emp.csv_10' into table part_emp partition(deptno=10);
Copying data from file:/root/emp.csv_10
Copying file: file:/root/emp.csv_10
Loading data to table default.part_emp partition (deptno=10)
[Warning] could not update stats.
OK
Time taken: 2.267 seconds
The second load targets partition deptno=20:
hive> load data local inpath '/root/emp.csv_20' into table part_emp partition(deptno=20);
Copying data from file:/root/emp.csv_20
Copying file: file:/root/emp.csv_20
Loading data to table default.part_emp partition (deptno=20)
[Warning] could not update stats.
OK
Time taken: 8.151 seconds
The third load targets partition deptno=30 (alternatively, a partition can be populated with an insert statement, as sketched after this transcript):
hive> load data local inpath '/root/emp.csv_30' into table part_emp partition(deptno=30);
Copying data from file:/root/emp.csv_30
Copying file: file:/root/emp.csv_30
Loading data to table default.part_emp partition (deptno=30)
[Warning] could not update stats.
OK
Time taken: 7.344 seconds
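Besides load, rows can also be written into a partition with an insert ... select statement, which runs a MapReduce job. A minimal sketch, assuming a hypothetical staging table emp_stage with the same eight columns as the original emp table:
hive> insert into table part_emp partition(deptno=30)  -- target partition; deptno comes from here
    > select empno, ename, job, mgr, hiredate, salary, comm  -- only the 7 non-partition columns
    > from emp_stage  -- emp_stage is a hypothetical staging table
    > where deptno = 30;
Note that deptno does not appear in the select list; its value is supplied by the partition clause.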
4. Querying by Partition
In the where clause the partition column behaves much like an ordinary column:
hive> select * from part_emp where deptno=10;
7782 CLARK MANAGER 7839 1981/6/9 2450.0 100.0 10
7839 KING PRESIDENT NULL 1981/11/17 5000.0 120.0 10
7934 MILLER CLERK 7782 1982/1/23 1300.0 133.0 10
8129 Abama MANAGER 7839 1981/6/9 2450.0 122.0 10
8131 Jimy PRESIDENT NULL 1981/11/17 5000.0 333.0 10
8136 Goodle CLERK 7782 1982/1/23 1300.0 421.0 10
View the partitions of the table:
hive> show partitions part_emp;
deptno=10
deptno=20
deptno=30
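Since each partition is just a metastore entry plus a directory, partitions can also be created or removed explicitly with alter table. A small sketch (the value deptno=40 is hypothetical, chosen because it does not exist yet):
hive> alter table part_emp add partition(deptno=40);
hive> alter table part_emp drop partition(deptno=40);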
5. How a Partition Table Is Stored on HDFS
Each partition corresponds to one directory.
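This can be checked from the Hive CLI with a dfs command. The listing below is only indicative; the actual path depends on hive.metastore.warehouse.dir (assumed here to be the default /user/hive/warehouse):
hive> dfs -ls /user/hive/warehouse/part_emp;
Found 3 items
drwxr-xr-x ... /user/hive/warehouse/part_emp/deptno=10
drwxr-xr-x ... /user/hive/warehouse/part_emp/deptno=20
drwxr-xr-x ... /user/hive/warehouse/part_emp/deptno=30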
6. Comparing Execution Plans: Partitioned vs. Ordinary Table
Ordinary table:
hive> explain select * from emp where deptno=10;
ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME emp))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR TOK_ALLCOLREF)) (TOK_WHERE (= (TOK_TABLE_OR_COL deptno) 10))))

STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        emp
          TableScan
            alias: emp
            Filter Operator
              predicate:
                  expr: (deptno = 10)
                  type: boolean
              Select Operator
                expressions:
                      expr: empno
                      type: int
                      expr: ename
                      type: string
                      expr: job
                      type: string
                      expr: mgr
                      type: int
                      expr: hiredate
                      type: string
                      expr: salary
                      type: float
                      expr: comm
                      type: float
                      expr: deptno
                      type: int
                outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7
                File Output Operator
                  compressed: false
                  GlobalTableId: 0
                  table:
                      input format: org.apache.hadoop.mapred.TextInputFormat
                      output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-0
    Fetch Operator
      limit: -1
Partitioned table:
hive> explain select * from part_emp where deptno=10;
ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME part_emp))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR TOK_ALLCOLREF)) (TOK_WHERE (= (TOK_TABLE_OR_COL deptno) 10))))

STAGE DEPENDENCIES:
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-0
    Fetch Operator
      limit: -1
      Processor Tree:
        TableScan
          alias: part_emp
          Select Operator
            expressions:
                  expr: empno
                  type: int
                  expr: ename
                  type: string
                  expr: job
                  type: string
                  expr: mgr
                  type: int
                  expr: hiredate
                  type: string
                  expr: salary
                  type: float
                  expr: comm
                  type: float
                  expr: deptno
                  type: string
            outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7
            ListSink
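Comparing the two plans: the ordinary table needs a MapReduce stage (Stage-1) with a Filter Operator to evaluate deptno = 10 against every row, whereas for the partition table the predicate is answered entirely by partition pruning, so the plan is a single Fetch Operator that reads only the files under the deptno=10 directory. Pruning applies only to partition columns; a filter on an ordinary column such as salary should still require scanning every partition, which can be verified with:
hive> explain select * from part_emp where salary > 2000;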