
Integrating Filebeat with Kafka on CentOS: A Practical Guide

小樊 · 2025-03-31 17:53:36

Integrating Filebeat with Kafka on CentOS is a common way to centralize log collection and forwarding: Filebeat tails log files and publishes each line as a message to a Kafka topic. The basic steps are as follows:

1. Deploy Filebeat

Download and unpack Filebeat (the tarball extracts to a directory named `filebeat-6.6.0-linux-x86_64`, not `filebeat-6.6.0`):

```shell
wget https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-6.6.0-linux-x86_64.tar.gz
tar -xzf filebeat-6.6.0-linux-x86_64.tar.gz
cd filebeat-6.6.0-linux-x86_64
```

Edit `filebeat.yml` so that Filebeat reads the Nginx access log and writes to Kafka:

```yaml
filebeat.inputs:
- type: log
  paths:
    - /var/log/nginx/access.log

output.kafka:
  hosts: ["kafka:9092"]    # replace with your broker address, e.g. localhost:9092
  topic: 'nginx_access_logs'
  compression: gzip
  max_message_bytes: 1000000
```

Start Filebeat in the foreground so you can watch its log output:

```shell
./filebeat -e -c filebeat.yml
```
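To confirm that Filebeat picks up new lines, you can append a synthetic Nginx-style entry to the monitored file and watch it get shipped. This sketch uses `/tmp/access.log` as a stand-in path (an assumption, so it works without Nginx installed); on a real host append to `/var/log/nginx/access.log` instead:

```shell
# Append one synthetic line in nginx's combined log format.
# /tmp/access.log is a hypothetical test path; point the Filebeat
# input at this file, or use /var/log/nginx/access.log on a real host.
LOG=/tmp/access.log
printf '127.0.0.1 - - [%s] "GET / HTTP/1.1" 200 612 "-" "curl/7.61.1"\n' \
  "$(date '+%d/%b/%Y:%H:%M:%S %z')" >> "$LOG"
tail -n 1 "$LOG"
```

A few seconds after the line is written, it should appear on the Kafka topic.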

2. Deploy Kafka

Download and unpack Kafka:

```shell
wget https://downloads.apache.org/kafka/2.8.1/kafka_2.12-2.8.1.tgz
tar -xzf kafka_2.12-2.8.1.tgz
cd kafka_2.12-2.8.1
```

Check `config/zookeeper.properties`:

```properties
dataDir=/tmp/zookeeper
clientPort=2181
```

Check `config/server.properties`:

```properties
log.dirs=/tmp/kafka
zookeeper.connect=localhost:2181
```

Start ZooKeeper first, then Kafka. Both scripts run in the foreground, so use separate terminals (or add the `-daemon` flag to run them in the background):

```shell
./bin/zookeeper-server-start.sh config/zookeeper.properties
./bin/kafka-server-start.sh config/server.properties
```
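Before moving on, it is worth checking that both services are actually listening. A small sketch using bash's built-in `/dev/tcp` (ports 2181 and 9092 are the defaults from the configs above):

```shell
# Succeeds if HOST:PORT accepts a TCP connection (uses bash's /dev/tcp).
port_open() {
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null
}

# Check the default ZooKeeper and Kafka ports.
for p in 2181 9092; do
  if port_open localhost "$p"; then
    echo "port $p: listening"
  else
    echo "port $p: not reachable yet"
  fi
done
```

If a port is not reachable, check the server logs under `logs/` before proceeding.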

3. Verify the Integration

Create the topic (Kafka 2.8 still accepts the `--zookeeper` flag, though `--bootstrap-server` is now preferred):

```shell
./bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic nginx_access_logs
```

Then consume from the topic to confirm that Filebeat is delivering log lines (`kafkacat` is a separate tool and must be installed first):

```shell
kafkacat -C -b localhost:9092 -t nginx_access_logs
```

4. Security and Tuning

For a quick test environment you can disable SELinux and the firewall. This is not recommended in production; there, open port 9092 explicitly and keep SELinux enforcing:

```shell
setenforce 0              # temporary; SELinux is re-enabled on reboot
systemctl stop firewalld
systemctl disable firewalld
```

If the Kafka cluster requires SASL/PLAIN authentication, add the credentials to `filebeat.yml`:

```yaml
output.kafka:
  username: your_username
  password: your_password
  sasl.mechanism: PLAIN   # accepted by newer Filebeat releases; PLAIN is used by default with username/password
```
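Beyond authentication, a few `output.kafka` settings are worth reviewing for throughput and durability. A sketch of a tuned output section (the values shown are illustrative assumptions, not measured recommendations):

```yaml
output.kafka:
  hosts: ["kafka:9092"]
  topic: 'nginx_access_logs'
  compression: gzip
  max_message_bytes: 1000000
  required_acks: 1          # 0 = no ack (fastest), 1 = leader ack, -1 = all in-sync replicas
  bulk_max_size: 2048       # max events batched into a single Kafka request
  partition.round_robin:
    reachable_only: false   # false = distribute events across all partitions
```

Raising `required_acks` trades throughput for stronger delivery guarantees; tune `bulk_max_size` against your broker's message-size limits.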
