This article walks through how to compile hive-1.1.0-cdh5.7.0 from source. I found the process practical enough to be worth sharing as a reference.
Environment:
1. Virtual machine: VM10
2. OS: CentOS 6.5
3. Hadoop: hadoop-2.6.0-cdh5.7.0
4. JDK: jdk1.8.0_45
5. Maven: apache-maven-3.3.9
6. MySQL: mysql-5.6.39
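Before downloading anything, it is worth confirming that the JDK and Maven from the list above are on the PATH. The install paths below are assumptions typical of this kind of setup, not taken from the original article; adjust them to your machine:

```shell
# Hypothetical environment setup (e.g. in ~/.bash_profile); the install
# locations below are assumed, not from the original walkthrough.
export JAVA_HOME=/usr/java/jdk1.8.0_45
export MAVEN_HOME=/home/hadoop/app/apache-maven-3.3.9
export PATH="$JAVA_HOME/bin:$MAVEN_HOME/bin:$PATH"
```

After sourcing the profile, `java -version` and `mvn -version` should report 1.8.0_45 and 3.3.9 respectively.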
1. Download the hive-1.1.0-cdh5.7.0-src.tar.gz source package:
wget http://archive.cloudera.com/cdh5/cdh/5/hive-1.1.0-cdh5.7.0-src.tar.gz
Save hive-1.1.0-cdh5.7.0-src.tar.gz under /home/hadoop/software/:
[hadoop@hadoop002 software]$ pwd
/home/hadoop/software
[hadoop@hadoop002 software]$ ll
-rw-rw-r--. 1 hadoop hadoop 311585484 Feb 20 07:16 hadoop-2.6.0-cdh5.7.0.tar.gz
-rw-rw-r--. 1 hadoop hadoop 14652104 Feb 21 05:28 hive-1.1.0-cdh5.7.0-src.tar.gz
2. Extract the archive:
[hadoop@hadoop002 software]$ tar -zxvf hive-1.1.0-cdh5.7.0-src.tar.gz
-rw-rw-r--. 1 hadoop hadoop 311585484 Feb 20 07:16 hadoop-2.6.0-cdh5.7.0.tar.gz
drwxrwxr-x. 32 hadoop hadoop 4096 Jun 1 16:06 hive-1.1.0-cdh5.7.0
-rw-rw-r--. 1 hadoop hadoop 14652104 Feb 21 05:28 hive-1.1.0-cdh5.7.0-src.tar.gz
3. Change into the hive-1.1.0-cdh5.7.0 directory and run the build command: mvn clean package -DskipTests -Phadoop-2 -Pdist
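A quick note on the flags: -DskipTests skips the unit tests, -Phadoop-2 activates the Hadoop 2 shim profile, and -Pdist produces the distributable tarball. On a small VM, raising Maven's heap beforehand is a common precaution against OutOfMemoryError during this long multi-module build; the value below is an assumption, not part of the original build:

```shell
# (Optional; assumed value, not from the original article) give Maven a
# larger heap before running the long multi-module Hive build.
export MAVEN_OPTS="-Xmx2g"
```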
[hadoop@hadoop002 software]$ cd hive-1.1.0-cdh5.7.0
[hadoop@hadoop002 hive-1.1.0-cdh5.7.0]$ ll
total 592
drwxrwxr-x. 4 hadoop hadoop 4096 Jun 1 16:26 accumulo-handler
drwxrwxr-x. 4 hadoop hadoop 4096 Jun 1 16:06 ant
drwxrwxr-x. 4 hadoop hadoop 4096 Jun 1 16:27 beeline
drwxrwxr-x. 3 hadoop hadoop 4096 Mar 24 2016 bin
drwxrwxr-x. 2 hadoop hadoop 4096 Mar 24 2016 checkstyle
drwxrwxr-x. 4 hadoop hadoop 4096 Jun 1 16:28 cli
drwxrwxr-x. 3 hadoop hadoop 4096 Mar 24 2016 cloudera
drwxrwxr-x. 4 hadoop hadoop 4096 Jun 1 16:00 common
drwxrwxr-x. 2 hadoop hadoop 4096 Jun 1 16:01 conf
drwxrwxr-x. 5 hadoop hadoop 4096 Jun 1 16:28 contrib
drwxrwxr-x. 6 hadoop hadoop 4096 Mar 24 2016 data
-rw-rw-r--. 1 hadoop hadoop 10753 Jun 1 16:06 datanucleus.log
drwxrwxr-x. 2 hadoop hadoop 4096 Mar 24 2016 dev-support
drwxrwxr-x. 6 hadoop hadoop 4096 Mar 24 2016 docs
drwxrwxr-x. 2 hadoop hadoop 4096 Mar 24 2016 findbugs
drwxrwxr-x. 5 hadoop hadoop 4096 Jun 1 16:29 hbase-handler
drwxrwxr-x. 14 hadoop hadoop 4096 Jun 1 16:29 hcatalog
drwxrwxr-x. 4 hadoop hadoop 4096 Mar 24 2016 hwi
drwxrwxr-x. 13 hadoop hadoop 4096 Mar 24 2016 itests
drwxrwxr-x. 4 hadoop hadoop 4096 Jun 1 16:26 jdbc
drwxrwxr-x. 2 hadoop hadoop 4096 Mar 24 2016 lib
-rw-rw-r--. 1 hadoop hadoop 23169 Mar 24 2016 LICENSE
drwxrwxr-x. 7 hadoop hadoop 4096 Jun 1 16:04 metastore
-rw-rw-r--. 1 hadoop hadoop 397 Mar 24 2016 NOTICE
drwxrwxr-x. 4 hadoop hadoop 4096 Mar 24 2016 odbc
drwxrwxr-x. 3 hadoop hadoop 4096 Mar 24 2016 packaging
-rw-rw-r--. 1 hadoop hadoop 51736 Mar 24 2016 pom.xml
drwxrwxr-x. 5 hadoop hadoop 4096 Jun 1 16:24 ql
-rw-rw-r--. 1 hadoop hadoop 4048 Mar 24 2016 README.txt
-rw-rw-r--. 1 hadoop hadoop 376416 Mar 24 2016 RELEASE_NOTES.txt
drwxrwxr-x. 5 hadoop hadoop 4096 Jun 1 16:02 serde
drwxrwxr-x. 6 hadoop hadoop 4096 Jun 1 16:25 service
drwxrwxr-x. 7 hadoop hadoop 4096 Mar 24 2016 shims
drwxrwxr-x. 4 hadoop hadoop 4096 Jun 1 16:13 spark-client
drwxrwxr-x. 6 hadoop hadoop 4096 Jun 1 15:53 target
drwxrwxr-x. 5 hadoop hadoop 4096 Mar 24 2016 testutils
[hadoop@hadoop002 hive-1.1.0-cdh5.7.0]$
# Run the build command
[hadoop@hadoop002 hive-1.1.0-cdh5.7.0]$ mvn clean package -DskipTests -Phadoop-2 -Pdist
[INFO] --- build-helper-maven-plugin:1.8:attach-artifact (attach-jdbc-driver) @ hive-packaging ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Hive ............................................... SUCCESS [03:07 min]
[INFO] Hive Shims Common .................................. SUCCESS [02:12 min]
[INFO] Hive Shims 0.23 .................................... SUCCESS [02:57 min]
[INFO] Hive Shims Scheduler ............................... SUCCESS [ 5.277 s]
[INFO] Hive Shims ......................................... SUCCESS [ 3.539 s]
[INFO] Hive Common ........................................ SUCCESS [02:14 min]
[INFO] Hive Serde ......................................... SUCCESS [01:57 min]
[INFO] Hive Metastore ..................................... SUCCESS [03:01 min]
[INFO] Hive Ant Utilities ................................. SUCCESS [ 17.141 s]
[INFO] Spark Remote Client ................................ SUCCESS [12:54 min]
[INFO] Hive Query Language ................................ SUCCESS [04:58 min]
[INFO] Hive Service ....................................... SUCCESS [01:20 min]
[INFO] Hive Accumulo Handler .............................. SUCCESS [ 37.886 s]
[INFO] Hive JDBC .......................................... SUCCESS [01:32 min]
[INFO] Hive Beeline ....................................... SUCCESS [ 8.866 s]
[INFO] Hive CLI ........................................... SUCCESS [ 7.600 s]
[INFO] Hive Contrib ....................................... SUCCESS [ 7.161 s]
[INFO] Hive HBase Handler ................................. SUCCESS [01:14 min]
[INFO] Hive HCatalog ...................................... SUCCESS [ 20.745 s]
[INFO] Hive HCatalog Core ................................. SUCCESS [ 14.203 s]
[INFO] Hive HCatalog Pig Adapter .......................... SUCCESS [ 11.418 s]
[INFO] Hive HCatalog Server Extensions .................... SUCCESS [ 38.225 s]
[INFO] Hive HCatalog Webhcat Java Client .................. SUCCESS [ 9.254 s]
[INFO] Hive HCatalog Webhcat .............................. SUCCESS [ 43.961 s]
[INFO] Hive HCatalog Streaming ............................ SUCCESS [ 9.030 s]
[INFO] Hive HWI ........................................... SUCCESS [ 5.013 s]
[INFO] Hive ODBC .......................................... SUCCESS [ 4.396 s]
[INFO] Hive Shims Aggregator .............................. SUCCESS [ 0.889 s]
[INFO] Hive TestUtils ..................................... SUCCESS [ 1.786 s]
[INFO] Hive Packaging ..................................... SUCCESS [01:45 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 43:35 min
[INFO] Finished at: 2018-06-01T16:33:55+08:00
[INFO] Final Memory: 139M/494M
[INFO] ------------------------------------------------------------------------
[hadoop@hadoop002 hive-1.1.0-cdh5.7.0]$
# Inspect the built tarball apache-hive-1.1.0-cdh5.7.0-bin.tar.gz:
[hadoop@hadoop002 hive-1.1.0-cdh5.7.0]$ cd ./packaging/target/
[hadoop@hadoop002 target]$ ll
total 129248
drwxrwxr-x. 2 hadoop hadoop 4096 Jun 1 16:32 antrun
drwxrwxr-x. 3 hadoop hadoop 4096 Jun 1 16:32 apache-hive-1.1.0-cdh5.7.0-bin
-rw-rw-r--. 1 hadoop hadoop 105763881 Jun 1 16:33 apache-hive-1.1.0-cdh5.7.0-bin.tar.gz
-rw-rw-r--. 1 hadoop hadoop 12610959 Jun 1 16:33 apache-hive-1.1.0-cdh5.7.0-jdbc.jar
-rw-rw-r--. 1 hadoop hadoop 13797857 Jun 1 16:33 apache-hive-1.1.0-cdh5.7.0-src.tar.gz
drwxrwxr-x. 2 hadoop hadoop 4096 Jun 1 16:32 archive-tmp
drwxrwxr-x. 3 hadoop hadoop 4096 Jun 1 16:32 maven-shared-archive-resources
drwxrwxr-x. 3 hadoop hadoop 4096 Jun 1 16:32 tmp
drwxrwxr-x. 2 hadoop hadoop 4096 Jun 1 16:32 warehouse
[hadoop@hadoop002 target]$
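The binary tarball under packaging/target is the artifact you actually deploy. A sketch of the typical next step, with the install root (~/app) and paths assumed rather than taken from the original article:

```shell
# Paths assumed from this walkthrough; ~/app is a hypothetical install root.
SRC_DIR=/home/hadoop/software/hive-1.1.0-cdh5.7.0
BIN_TAR="$SRC_DIR/packaging/target/apache-hive-1.1.0-cdh5.7.0-bin.tar.gz"
echo "$BIN_TAR"
# To install, unpack it somewhere outside the source tree, e.g.:
#   mkdir -p ~/app && tar -zxvf "$BIN_TAR" -C ~/app/
```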