Detailed Hive Installation Steps

Published: 2021-08-12 10:57:55 · Author: chen

This article walks through the detailed steps for installing Hive. Plenty of people run into questions along the way, so the process below has been organized into a simple, workable procedure; hopefully it clears up any confusion about installing Hive. Let's get started.

1. Extract the archive:
[root@hadoop0 opt]# tar -zxvf hive-0.9.0.tar.gz
2. Rename the extracted directory:
[root@hadoop0 opt]# mv hive-0.9.0 hive
3. Configure the environment variables by editing the global /etc/profile file (adding /opt/hive/bin to the PATH):
JAVA_HOME=/opt/jdk1.6.0_24
HADOOP_HOME=/opt/hadoop
HBASE_HOME=/opt/hbase
HIVE_HOME=/opt/hive
PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$HBASE_HOME/bin:$HIVE_HOME/bin:$PATH
export JAVA_HOME  HADOOP_HOME HBASE_HOME  HIVE_HOME PATH
[root@hadoop0 bin]# su -
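Before testing, it is worth confirming that the new variables are visible in the current shell; a minimal check, assuming the paths configured above (sourcing /etc/profile is an alternative to re-logging in with su -):
[root@hadoop0 ~]# source /etc/profile     # alternative to the su - re-login above
[root@hadoop0 ~]# echo $HIVE_HOME         # should print /opt/hive
[root@hadoop0 ~]# which hive              # should resolve to /opt/hive/bin/hive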
4. Test the installation by launching the Hive CLI:
[root@hadoop0 ~]# hive
WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.
Logging initialized using configuration in jar:file:/opt/hive/lib/hive-common-0.9.0.jar!/hive-log4j.properties
Hive history file=/tmp/root/hive_job_log_root_201509250619_148272494.txt
hive> show tables;
FAILED: Error in metadata: MetaException(message:Got exception: java.net.ConnectException Call to hadoop0/192.168.46.129:9000 failed on connection exception: java.net.ConnectException: Connection refused)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
-- Solution: Hive relies on HDFS to store its data, so make sure Hadoop is started first (a daemon check follows the start-up log below):
[root@hadoop0 ~]# start-all.sh
Warning: $HADOOP_HOME is deprecated.
starting namenode, logging to /opt/hadoop/libexec/../logs/hadoop-root-namenode-hadoop0.out
localhost: starting datanode, logging to /opt/hadoop/libexec/../logs/hadoop-root-datanode-hadoop0.out
localhost: starting secondarynamenode, logging to /opt/hadoop/libexec/../logs/hadoop-root-secondarynamenode-hadoop0.out
starting jobtracker, logging to /opt/hadoop/libexec/../logs/hadoop-root-jobtracker-hadoop0.out
localhost: starting tasktracker, logging to /opt/hadoop/libexec/../logs/hadoop-root-tasktracker-hadoop0.out
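As an optional sanity check, the JDK's jps tool can confirm that the Hadoop daemons actually came up; a quick sketch, assuming the pseudo-distributed Hadoop 1.x setup used here:
[root@hadoop0 ~]# jps     # should list NameNode, DataNode, SecondaryNameNode, JobTracker and TaskTracker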
-- At this point the most basic Hive environment is in place.

5. Create a data table:
hive> show tables;
OK
Time taken: 5.619 seconds
hive> create table stu(name String,age int);
FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException
org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /user/hive/warehouse/stu.
Name node is in safe mode.
The reported blocks 18 has reached the threshold 0.9990 of total blocks 17. Safe mode will be turned off automatically in 15 seconds.
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2204)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2178)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.mkdirs(NameNode.java:857)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:578)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1393)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1389)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1387)
)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
-- Solution: as the message above shows, the NameNode was still in safe mode right after start-up; it leaves safe mode automatically within seconds, after which the CREATE TABLE succeeds (see the safe-mode commands after the retry below). The directory was also created by hand here, although the Hive warehouse actually lives on HDFS:
[root@hadoop0 ~]# mkdir -p /user/hive/warehouse/stu
hive> create table stu(name String,age int);
OK
Time taken: 0.229 seconds
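For reference, since the real culprit above was HDFS safe mode, the following is a hedged sketch of how to check and, if necessary, leave safe mode and pre-create the warehouse directory on HDFS; the commands are standard Hadoop 1.x CLI, and the warehouse path is taken from the default location that appears later in the log:
[root@hadoop0 ~]# hadoop dfsadmin -safemode get      # reports whether safe mode is ON or OFF
[root@hadoop0 ~]# hadoop dfsadmin -safemode leave    # force the NameNode out of safe mode (or simply wait)
[root@hadoop0 ~]# hadoop fs -mkdir /user/hive/warehouse      # the Hive warehouse lives on HDFS, not the local filesystem
[root@hadoop0 ~]# hadoop fs -chmod g+w /user/hive/warehouse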
6. Try to insert data. Hive 0.9 does not support INSERT ... VALUES statements:
hive> insert into stu values('MengMeng',24);
FAILED: Parse Error: line 1:12 mismatched input 'stu' expecting TABLE near 'into' in insert clause
hive> show tables;
OK
stu
Time taken: 0.078 seconds
hive> desc stu;
OK
name    string
age     int
Time taken: 0.255 seconds


-- Solution: Hive does not support that kind of statement; load the data from a file with LOAD DATA instead:
hive> LOAD DATA LOCAL INPATH '/opt/stu.txt' OVERWRITE INTO TABLE stu;
Copying data from file:/opt/stu.txt
Copying file: file:/opt/stu.txt
Loading data to table default.stu
Deleted hdfs://hadoop0:9000/user/hive/warehouse/stu
OK
Time taken: 0.643 seconds
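To double-check the load, the table's warehouse directory on HDFS can be inspected directly; a small sketch, with the path assumed from the warehouse location shown in the log above:
[root@hadoop0 ~]# hadoop fs -ls /user/hive/warehouse/stu             # the loaded copy of stu.txt should appear here
[root@hadoop0 ~]# hadoop fs -cat /user/hive/warehouse/stu/stu.txt    # prints the raw file contents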
7. Query the data that was just loaded:
hive> select name ,age from stu;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_201509250620_0001, Tracking URL = http://hadoop0:50030/jobdetails.jsp?jobid=job_201509250620_0001
Kill Command = /opt/hadoop/libexec/../bin/hadoop job  -Dmapred.job.tracker=hadoop0:9001 -kill job_201509250620_0001
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2015-09-25 06:37:55,535 Stage-1 map = 0%,  reduce = 0%
2015-09-25 06:37:58,565 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 0.59 sec
2015-09-25 06:37:59,595 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 0.59 sec
2015-09-25 06:38:00,647 Stage-1 map = 100%,  reduce = 100%, Cumulative CPU 0.59 sec
MapReduce Total cumulative CPU time: 590 msec
Ended Job = job_201509250620_0001
MapReduce Jobs Launched:
Job 0: Map: 1   Cumulative CPU: 0.59 sec   HDFS Read: 221 HDFS Write: 22 SUCCESS
Total MapReduce CPU Time Spent: 590 msec
OK
-- The query results are displayed:
JieJie 26       NULL
MM 24   NULL
Time taken: 12.812 seconds
Open question: why is there a NULL value in each row? To be investigated next time.
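A likely explanation, to be verified: the table was created without specifying a field delimiter, so Hive expects its default Ctrl-A (\001) separator; if /opt/stu.txt separates name and age with a space or tab instead, the whole line is parsed into name and age comes back NULL. A hedged sketch of the same table declared with an explicit delimiter (stu2 is a hypothetical name; change ' ' to '\t' if the file is tab-separated):
hive> create table stu2(name String, age int) row format delimited fields terminated by ' ';
hive> LOAD DATA LOCAL INPATH '/opt/stu.txt' OVERWRITE INTO TABLE stu2;
hive> select name, age from stu2;    -- age should now come back as a real int rather than NULL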

That wraps up this walkthrough of the detailed Hive installation steps; hopefully it has resolved any remaining doubts. Pairing the theory with hands-on practice is the best way to learn, so go give it a try!
