The steps for setting up HDFS on CentOS are as follows:
Install the Java environment
sudo yum install java-1.8.0-openjdk-devel
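To confirm the installation and locate the JDK directory used for JAVA_HOME below, you can run the following (the exact versioned path reported may differ on your system):
java -version
readlink -f $(which java)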
Download and extract Hadoop
wget https://archive.apache.org/dist/hadoop/common/hadoop-3.3.4/hadoop-3.3.4.tar.gz
sudo tar -xzvf hadoop-3.3.4.tar.gz -C /usr/local/
sudo mv /usr/local/hadoop-3.3.4 /usr/local/hadoop
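Because the extracted tree is owned by root, it is usually convenient to hand it to the account that will run HDFS; a minimal sketch, assuming you run HDFS as the current non-root user:
sudo chown -R $(whoami):$(whoami) /usr/local/hadoop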
Configure environment variables
Edit /etc/profile.d/hadoop.sh and add:
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
Apply the configuration:
source /etc/profile.d/hadoop.sh
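You can verify that the variables took effect, and it is common practice to also set JAVA_HOME in hadoop-env.sh so the start scripts always find it (the path below is assumed to match the export above):
hadoop version
echo 'export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk' >> /usr/local/hadoop/etc/hadoop/hadoop-env.sh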
Configure the HDFS core files
The following property goes in /usr/local/hadoop/etc/hadoop/core-site.xml, inside the <configuration> element:
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:9000</value>
</property>
The remaining properties go in /usr/local/hadoop/etc/hadoop/hdfs-site.xml, also inside the <configuration> element:
<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>
<property>
  <name>dfs.namenode.name.dir</name>
  <value>/usr/local/hadoop/data/namenode</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>/usr/local/hadoop/data/datanode</value>
</property>
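Pre-creating the NameNode and DataNode directories declared above avoids permission surprises at startup (assuming HDFS runs as the current user):
mkdir -p /usr/local/hadoop/data/namenode /usr/local/hadoop/data/datanode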
Format the NameNode
hdfs namenode -format
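If the format succeeds, the NameNode metadata directory should now contain a current/ subdirectory with a VERSION file; a quick check:
ls /usr/local/hadoop/data/namenode/current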
Start the HDFS services
start-dfs.sh
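start-dfs.sh connects to localhost over SSH even in single-node mode, so if it reports a permission-denied error, set up passwordless SSH to localhost first; a minimal sketch:
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys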
Verify the setup
jps
Use this command to check that the NameNode and DataNode processes are running. Then open http://localhost:9870 in a browser to view the NameNode web UI (in Hadoop 3.x the NameNode UI listens on port 9870 rather than the older 50070).
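A quick smoke test that writes to and reads from HDFS (the paths here are only examples):
hdfs dfs -mkdir -p /user/$(whoami)
hdfs dfs -put /etc/hosts /user/$(whoami)/
hdfs dfs -ls /user/$(whoami)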
Note: the steps above describe a single-node HDFS setup. To deploy a cluster, you additionally need to configure passwordless SSH between nodes, ZooKeeper (for high availability), and so on.