Hue is an open-source Apache Hadoop UI system. It originally evolved from Cloudera Desktop, was contributed by Cloudera to the open-source community, and is built on the Python web framework Django. With Hue you can interact with a Hadoop cluster from a web console in the browser to analyze and process data, for example working with data on HDFS or running MapReduce jobs. HUE's features are shown in the figure below:
HUE's architecture is shown in the following diagram:
1. Install dependency packages
The platform used to set up HUE in this article is Oracle Linux 7.4.
[root@hdp01 ~]# yum -y install gcc-c++ asciidoc cyrus-sasl-devel cyrus-sasl-gssapi krb5-devel libxml2-devel libxslt-devel mysql-devel openldap-devel python-devel sqlite-devel openssl-devel gmp-devel libffi libffi-devel MySQL-python mysql-community-devel cyrus-sasl-plain
If these packages are not installed, the build will fail, in particular because of the MySQL-related packages and cyrus-sasl.
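If you want to confirm that the key packages are in place before building, a quick check with rpm (package names taken from the yum command above) looks like this:
[root@hdp01 ~]# rpm -q gcc-c++ cyrus-sasl-devel cyrus-sasl-plain libxml2-devel libxslt-devel mysql-devel openldap-devel python-devel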
2. Install HUE
HUE can be downloaded from the official site http://gethue.com/. Installing it is straightforward:
[hadoop@hdp01 u02]$ tar -xzf hue-4.1.0.tgz
[hadoop@hdp01 u02]$ cd hue-4.1.0
[hadoop@hdp01 hue-4.1.0]$ make apps
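The commands later in this article assume HUE lives under /u01/hue. One way to get it there, assuming that layout, is Hue's install target, which copies the build to PREFIX/hue (the PREFIX value below is an assumption chosen to match the later paths):
[hadoop@hdp01 hue-4.1.0]$ PREFIX=/u01 make install
Alternatively, simply move or symlink the hue-4.1.0 directory to /u01/hue after make apps finishes.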
3. Create the MySQL database and user
By default HUE uses the bundled sqlite3 database, which is fine for a test environment, but a production environment should use a database such as Oracle, MySQL, or PostgreSQL.
mysql> create database hue;
mysql> create user hue identified by 'abcABC@12';
mysql> grant all privileges on *.* to hue@'%' identified by 'abcABC@12';
mysql> flush privileges;
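Before moving on, it is worth verifying that the hue user can actually reach the database from the HUE host; a quick test with the mysql client (host, port, and credentials as created above):
[hadoop@hdp01 ~]$ mysql -h 192.168.120.92 -P 3306 -u hue -p'abcABC@12' -e 'show databases;'
The output should include the hue database.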
4. Edit the HUE configuration file
HUE's configuration file, hue.ini, is located in the $HUE_HOME/desktop/conf directory. The configuration I used is shown below:
[desktop]
secret_key=
http_host=0.0.0.0
http_port=8888
is_hue_4=true
time_zone=Asia/Shanghai
django_debug_mode=false
http_500_debug_mode=false
server_user=hadoop
server_group=hadoop
default_user=hadoop
default_hdfs_superuser=hadoop
enable_server=yes
[[database]]
engine=mysql
host=192.168.120.92  # database host
port=3306            # MySQL port
user=hue             # database user
password=abcABC@12   # database password
name=hue             # database name
[hadoop]
[[hdfs_clusters]]
[[[default]]]
fs_defaultfs=hdfs://hdp01:9000
webhdfs_url=http://hdp01:50070/webhdfs/v1
security_enabled=false
[[yarn_clusters]]
[[[default]]]
resourcemanager_host=hdp01
resourcemanager_port=8032
submit_to=True
security_enabled=false
resourcemanager_api_url=http://hdp01:8088
proxy_api_url=http://hdp01:8088
history_server_api_url=http://hdp01:19888
[[mapred_clusters]]
[[[default]]]
submit_to=False
[beeswax]
hive_server_host=hdp01.thinkjoy.tt
hive_server_port=10000
hive_conf_dir=/u01/hive/conf
[metastore]
enable_new_create_table=true
[filebrowser]
archive_upload_tempdir=/tmp
show_download_button=false
show_upload_button=false
[sqoop]
server_url=http://192.168.120.96:12000/sqoop
sqoop_conf_dir=/u01/sqoop/conf
[hbase]
# The cluster name here can be anything. If HBase has not started its Thrift server, accessing HBase through Hue fails with an error that port 9090 on the corresponding host cannot be reached.
hbase_clusters=(Cluster|hdp02:9090),(Cluster|hdp03:9090),(Cluster|hdp04:9090)
hbase_conf_dir=/u01/hbase/conf
truncate_limit = 500
thrift_transport=buffered
[search]
solr_url=http://hdp01:8983/solr/
[zookeeper]
[[clusters]]
[[[default]]]
host_ports=hdp01:2181,hdp02:2181,hdp03:2181,hdp04:2181
[liboozie]
oozie_url=http://192.168.120.101:11000/oozie
[libzookeeper]
ensemble=hdp01:2181,hdp02:2181,hdp03:2181,hdp04:2181
###########################################################################
# Settings for the RDBMS application
###########################################################################
[librdbms]
[[databases]]
[[[mysql]]]
nice_name="MySQL DB"
name=hue
engine=mysql
host=192.168.120.92
port=3306
user=hue
password=abcABC@12
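Before starting HUE, you can also check that the WebHDFS endpoint configured above is reachable; a simple probe with curl (host, port, and user taken from the configuration above):
[hadoop@hdp01 ~]$ curl -s 'http://hdp01:50070/webhdfs/v1/?op=LISTSTATUS&user.name=hadoop'
A JSON listing of the HDFS root directory indicates that webhdfs_url is correct.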
Start the HBase Thrift service as follows:
[hadoop@hdp01 ~]$ hbase-daemons.sh start thrift
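To confirm that the Thrift server is really listening on the hosts referenced in hbase_clusters (hdp02/hdp03/hdp04 above), you can check for the process and the port on each of them, for example:
[hadoop@hdp02 ~]$ jps | grep ThriftServer
[hadoop@hdp02 ~]$ netstat -lnt | grep 9090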
5. Initialize the MySQL database
If an RDBMS is used, the database must be initialized before starting the service; otherwise the HUE admin pages will report errors. The initialization also creates the HUE administrator account, here hadoop.
[hadoop@hdp01 ~]$ /u01/hue/build/env/bin/hue syncdb
Syncing...
Creating tables ...
Creating table auth_permission
Creating table auth_group_permissions
Creating table auth_group
Creating table auth_user_groups
Creating table auth_user_user_permissions
Creating table auth_user
Creating table django_openid_auth_nonce
Creating table django_openid_auth_association
Creating table django_openid_auth_useropenid
Creating table django_content_type
Creating table django_session
Creating table django_site
Creating table django_admin_log
Creating table south_migrationhistory
Creating table axes_accessattempt
Creating table axes_accesslog
You just installed Django's auth system, which means you don't have any superusers defined.
Would you like to create one now? (yes/no): yes
Username (leave blank to use 'hadoop'):
Email address:
Password:
Password (again):
Superuser created successfully.
Installing custom SQL ...
Installing indexes ...
Installed 0 object(s) from 0 fixture(s)
Synced:
> django.contrib.auth
> django_openid_auth
> django.contrib.contenttypes
> django.contrib.sessions
> django.contrib.sites
> django.contrib.staticfiles
> django.contrib.admin
> south
> axes
> about
> filebrowser
> help
> impala
> jobbrowser
> metastore
> proxy
> rdbms
> zookeeper
> indexer
> dashboard
Not synced (use migrations):
- django_extensions
- desktop
- beeswax
- hbase
- jobsub
- oozie
- pig
- search
- security
- spark
- sqoop
- useradmin
- notebook
(use ./manage.py migrate to migrate these)
[hadoop@hdp01 ~]$ /u01/hue/build/env/bin/hue migrate
Running migrations for django_extensions:
- Migrating forwards to 0001_empty.
> django_extensions:0001_empty
- Loading initial data for django_extensions.
Installed 0 object(s) from 0 fixture(s)
Running migrations for desktop:
- Migrating forwards to 0026_change_is_trashed_default_to_false.
> pig:0001_initial
> oozie:0001_initial
> oozie:0002_auto__add_hive
> oozie:0003_auto__add_sqoop
> oozie:0004_auto__add_ssh
> oozie:0005_auto__add_shell
> oozie:0006_auto__chg_field_java_files__chg_field_java_archives__chg_field_sqoop_f
> oozie:0007_auto__chg_field_sqoop_script_path
> oozie:0008_auto__add_distcp
> oozie:0009_auto__add_decision
> oozie:0010_auto__add_fs
> oozie:0011_auto__add_email
> oozie:0012_auto__add_subworkflow__chg_field_email_subject__chg_field_email_body
> oozie:0013_auto__add_generic
> oozie:0014_auto__add_decisionend
> oozie:0015_auto__add_field_dataset_advanced_start_instance__add_field_dataset_ins
> oozie:0016_auto__add_field_coordinator_job_properties
> oozie:0017_auto__add_bundledcoordinator__add_bundle
> oozie:0018_auto__add_field_workflow_managed
> oozie:0019_auto__add_field_java_capture_output
> oozie:0020_chg_large_varchars_to_textfields
> oozie:0021_auto__chg_field_java_args__add_field_job_is_trashed
> oozie:0022_auto__chg_field_mapreduce_node_ptr__chg_field_start_node_ptr
> oozie:0022_change_examples_path_format
- Migration 'oozie:0022_change_examples_path_format' is marked for no-dry-run.
> oozie:0023_auto__add_field_node_data__add_field_job_data
> oozie:0024_auto__chg_field_subworkflow_sub_workflow
> oozie:0025_change_examples_path_format
- Migration 'oozie:0025_change_examples_path_format' is marked for no-dry-run.
> desktop:0001_initial
> desktop:0002_add_groups_and_homedirs
> desktop:0003_group_permissions
> desktop:0004_grouprelations
> desktop:0005_settings
> desktop:0006_settings_add_tour
> beeswax:0001_initial
> beeswax:0002_auto__add_field_queryhistory_notify
> beeswax:0003_auto__add_field_queryhistory_server_name__add_field_queryhistory_serve
> beeswax:0004_auto__add_session__add_field_queryhistory_server_type__add_field_query
> beeswax:0005_auto__add_field_queryhistory_statement_number
> beeswax:0006_auto__add_field_session_application
> beeswax:0007_auto__add_field_savedquery_is_trashed
> beeswax:0008_auto__add_field_queryhistory_query_type
> beeswax:0009_auto__add_field_savedquery_is_redacted__add_field_queryhistory_is_reda
> desktop:0007_auto__add_documentpermission__add_documenttag__add_document
> desktop:0008_documentpermission_m2m_tables
> desktop:0009_auto__chg_field_document_name
> desktop:0010_auto__add_document2__chg_field_userpreferences_key__chg_field_userpref
> desktop:0011_auto__chg_field_document2_uuid
> desktop:0012_auto__chg_field_documentpermission_perms
> desktop:0013_auto__add_unique_documenttag_owner_tag
> desktop:0014_auto__add_unique_document_content_type_object_id
> desktop:0015_auto__add_unique_documentpermission_doc_perms
> desktop:0016_auto__add_unique_document2_uuid_version_is_history
> desktop:0017_auto__add_document2permission__add_unique_document2permission_doc_perm
> desktop:0018_auto__add_field_document2_parent_directory
> desktop:0019_auto
> desktop:0020_auto__del_field_document2permission_all
> desktop:0021_auto__add_defaultconfiguration__add_unique_defaultconfiguration_app_is
> desktop:0022_auto__del_field_defaultconfiguration_group__del_unique_defaultconfigur
> desktop:0023_auto__del_unique_defaultconfiguration_app_is_default_user__add_field_d
> desktop:0024_auto__add_field_document2_is_managed
> desktop:0025_auto__add_field_document2_is_trashed
> desktop:0026_change_is_trashed_default_to_false
- Migration 'desktop:0026_change_is_trashed_default_to_false' is marked for no-dry-run.
- Loading initial data for desktop.
Installed 0 object(s) from 0 fixture(s)
Running migrations for beeswax:
- Migrating forwards to 0014_auto__add_field_queryhistory_is_cleared.
> beeswax:0009_auto__chg_field_queryhistory_server_port
> beeswax:0010_merge_database_state
> beeswax:0011_auto__chg_field_savedquery_name
> beeswax:0012_auto__add_field_queryhistory_extra
> beeswax:0013_auto__add_field_session_properties
> beeswax:0014_auto__add_field_queryhistory_is_cleared
- Loading initial data for beeswax.
Installed 0 object(s) from 0 fixture(s)
Running migrations for hbase:
- Migrating forwards to 0001_initial.
> hbase:0001_initial
- Loading initial data for hbase.
Installed 0 object(s) from 0 fixture(s)
Running migrations for jobsub:
- Migrating forwards to 0006_chg_varchars_to_textfields.
> jobsub:0001_initial
> jobsub:0002_auto__add_ooziestreamingaction__add_oozieaction__add_oozieworkflow__ad
> jobsub:0003_convertCharFieldtoTextField
> jobsub:0004_hue1_to_hue2
- Migration 'jobsub:0004_hue1_to_hue2' is marked for no-dry-run.
> jobsub:0005_unify_with_oozie
- Migration 'jobsub:0005_unify_with_oozie' is marked for no-dry-run.
> jobsub:0006_chg_varchars_to_textfields
- Loading initial data for jobsub.
Installed 0 object(s) from 0 fixture(s)
Running migrations for oozie:
- Migrating forwards to 0027_auto__chg_field_node_name__chg_field_job_name.
> oozie:0026_set_default_data_values
- Migration 'oozie:0026_set_default_data_values' is marked for no-dry-run.
> oozie:0027_auto__chg_field_node_name__chg_field_job_name
- Loading initial data for oozie.
Installed 0 object(s) from 0 fixture(s)
Running migrations for pig:
- Nothing to migrate.
- Loading initial data for pig.
Installed 0 object(s) from 0 fixture(s)
Running migrations for search:
- Migrating forwards to 0003_auto__add_field_collection_owner.
> search:0001_initial
> search:0002_auto__del_core__add_collection
> search:0003_auto__add_field_collection_owner
- Loading initial data for search.
Installed 0 object(s) from 0 fixture(s)
? You have no migrations for the 'security' app. You might want some.
Running migrations for spark:
- Migrating forwards to 0001_initial.
> spark:0001_initial
- Loading initial data for spark.
Installed 0 object(s) from 0 fixture(s)
Running migrations for sqoop:
- Migrating forwards to 0001_initial.
> sqoop:0001_initial
- Loading initial data for sqoop.
Installed 0 object(s) from 0 fixture(s)
Running migrations for useradmin:
- Migrating forwards to 0008_convert_documents.
> useradmin:0001_permissions_and_profiles
- Migration 'useradmin:0001_permissions_and_profiles' is marked for no-dry-run.
> useradmin:0002_add_ldap_support
- Migration 'useradmin:0002_add_ldap_support' is marked for no-dry-run.
> useradmin:0003_remove_metastore_readonly_huepermission
- Migration 'useradmin:0003_remove_metastore_readonly_huepermission' is marked for no-dry-run.
> useradmin:0004_add_field_UserProfile_first_login
> useradmin:0005_auto__add_field_userprofile_last_activity
> useradmin:0006_auto__add_index_userprofile_last_activity
> useradmin:0007_remove_s3_access
> useradmin:0008_convert_documents
- Migration 'useradmin:0008_convert_documents' is marked for no-dry-run.
Starting document conversions...
Finished running document conversions.
- Loading initial data for useradmin.
Installed 0 object(s) from 0 fixture(s)
Running migrations for notebook:
- Migrating forwards to 0001_initial.
> notebook:0001_initial
- Loading initial data for notebook.
Installed 0 object(s) from 0 fixture(s)
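If you want to double-check that the schema really landed in MySQL rather than the bundled sqlite3, you can list the tables in the hue database created in step 3 (connection details as configured above):
[hadoop@hdp01 ~]$ mysql -h 192.168.120.92 -u hue -p'abcABC@12' hue -e 'show tables;'
You should see the auth_*, django_* and desktop_* tables created by syncdb and migrate.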
6. Start the HUE service
[hadoop@hdp01 ~]$ /u01/hue/build/env/bin/supervisor &
Once it is up, access port 8888 of the HUE server in a browser, as shown below:
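If you only have shell access to the server, a quick way to confirm that HUE is listening on port 8888 (hostname assumed to be the hdp01 node used throughout this article):
[hadoop@hdp01 ~]$ curl -I http://hdp01:8888/
An HTTP response (typically a redirect to the login page) means the service is up.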
PS: To enable the Chinese interface, edit the $HUE_HOME/desktop/core/src/desktop/settings.py file, change LANGUAGE_CODE = 'en-us' to 'zh-CN', and then run make apps again.
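The edit can be done by hand or with a one-line sed; a sketch, assuming HUE is installed under /u01/hue as elsewhere in this article:
[hadoop@hdp01 ~]$ sed -i "s/LANGUAGE_CODE = 'en-us'/LANGUAGE_CODE = 'zh-CN'/" /u01/hue/desktop/core/src/desktop/settings.py
[hadoop@hdp01 ~]$ cd /u01/hue && make apps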