Overview of the components:
Kibana: open-source web front end that presents the search results
Elasticsearch: open-source search engine that indexes and stores the logs; it can be clustered across multiple nodes for higher throughput, is fed by the logstash indexer, and is queried by Kibana for display
Logstash: log collection and forwarding tool that ships with a wide range of log plugins, making log search and analysis markedly more efficient
Logstash shipper: collects logs and forwards them to redis for buffering
Logstash indexer: reads data from redis and forwards it to elasticsearch
Redis: the broker database; the logstash shipper pushes logs into redis for temporary storage
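To make the hand-off concrete, the sketch below simulates the shipper → redis → indexer flow with a plain file standing in for the redis list (illustration only; in the real pipeline the shipper does an LPUSH onto the "logstash" key and the indexer pops events from it):

```shell
# A temp file stands in for the redis "logstash" list; illustration only.
queue=$(mktemp)
echo '{"message":"test log line","type":"nginx"}' >> "$queue"  # shipper side: enqueue
head -n 1 "$queue"                                             # indexer side: read
rm -f "$queue"
```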
Deployment:
1、JDK
Install the JDK under /usr/local/jdk7, then add the following to /etc/profile:
export JAVA_HOME=/usr/local/jdk7
export PATH=$JAVA_HOME/bin:$PATH
export REDIS_HOME=/usr/local/redis
export ES_HOME=/usr/local/elasticsearch
export ES_CLASSPATH=$ES_HOME/config
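As a quick sanity check that the profile edits took effect, the sketch below (paths taken from the exports above) confirms the JDK's bin directory landed on PATH:

```shell
# Reproduce the exports from /etc/profile, then verify PATH contains the JDK.
export JAVA_HOME=/usr/local/jdk7
export PATH=$JAVA_HOME/bin:$PATH
case ":$PATH:" in
  *":$JAVA_HOME/bin:"*) echo "JAVA_HOME on PATH" ;;
  *)                    echo "PATH misconfigured" ;;
esac
```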
2、ElasticSearch
wget https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-0.20.2.tar.gz
tar xvf elasticsearch-0.20.2.tar.gz
mv elasticsearch-0.20.2 /usr/local/elasticsearch
cd /usr/local/elasticsearch/config
vim elasticsearch.yml
cluster.name: elasticsearch
node.name: "litong"
path.conf: /usr/local/elasticsearch
path.data: /usr/local/elasticsearch/data
path.work: /usr/local/elasticsearch/tmp
path.logs: /usr/local/elasticsearch/logs
bootstrap.mlockall: true
mkdir -p /usr/local/elasticsearch/data /usr/local/elasticsearch/tmp /usr/local/elasticsearch/logs
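By default 0.20.x binds to the first non-loopback interface; if the node must listen on the fixed address 192.168.0.235 (the IP used throughout this guide), settings along these lines could also be appended to elasticsearch.yml (the values here are assumptions, not part of the original setup):

```yaml
network.host: 192.168.0.235   # bind/publish address (assumed)
http.port: 9200               # REST port that Kibana queries
transport.tcp.port: 9300      # node-to-node port the logstash indexer connects to
```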
Get the service wrapper
wget http://github.com/elasticsearch/elasticsearch-servicewrapper/archive/master.zip
unzip master.zip
mv elasticsearch-servicewrapper-master/service/ /usr/local/elasticsearch/bin/
rm -rf elasticsearch-servicewrapper-master/
vim service/elasticsearch.conf
set.default.ES_HOME=/usr/local/elasticsearch
set.default.ES_HEAP_SIZE=1024   # heap size in MB
wrapper.java.additional.10=-Des.max-open-files=true
wrapper.logfile.maxsize=5m
wrapper.logfile.maxfiles=5
Installing the wrapper registers an init script at /etc/init.d/elasticsearch:
bin/service/elasticsearch install
service elasticsearch start
bin/plugin -install mobz/elasticsearch-head
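Once the service is up, it is worth confirming the node answers over HTTP before moving on; a quick check using the IP from this guide, with a fallback message in case the node is not reachable from where this runs:

```shell
# Query the REST root; prints cluster info as JSON, or a fallback on failure.
curl -s --max-time 3 http://192.168.0.235:9200/ \
  || echo "elasticsearch not reachable"
```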
3、Redis
Install the Redis server
wget http://redis.googlecode.com/files/redis-2.6.12.tar.gz
tar xzf redis-2.6.12.tar.gz
mv redis-2.6.12 /usr/local/redis
cd /usr/local/redis
make
make install
Configure Redis: cp redis.conf 6379.conf
vim 6379.conf
daemonize yes
pidfile /var/run/redis/redis_6379.pid
port 6379
timeout 300
tcp-keepalive 60
logfile /var/log/redis/redis_6379.log
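Since Redis only buffers events between shipper and indexer here, persistence can stay light; if snapshots are wanted anyway, directives along these lines (standard redis.conf options; the values are assumptions, not from the original setup) could be added to 6379.conf:

```conf
save 900 1          # snapshot if at least 1 key changed in 15 minutes
dir /var/lib/redis  # directory where dump.rdb is written
```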
Add REDIS_HOME to the root user's .bash_profile
# Redis
export REDIS_HOME=/usr/local/redis
Copy Redis init script
cp utils/redis_init_script /etc/init.d/redis_6379
Configure Redis init script
# chkconfig: - 85 15
# description: Redis is a persistent key-value database
# processname: redis
REDISPORT=6379
EXEC=/usr/local/redis/src/redis-server
CLIEXEC=/usr/local/redis/src/redis-cli
PIDFILE=/var/run/redis/redis_6379.pid
CONF="/usr/local/redis/6379.conf"
Activate Redis service
mkdir /var/run/redis /var/log/redis
cd /etc/init.d
chkconfig --add redis_6379
Start
service redis_6379 start
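A quick way to verify the server responds is a PING from the CLI (this falls back to a message if redis-cli or the server is unavailable):

```shell
# PING returns PONG from a healthy server on port 6379.
redis-cli -p 6379 ping 2>/dev/null || echo "redis not reachable"
```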
4、Logstash
mkdir /usr/local/logstash
cd /usr/local/logstash
wget https://logstash.objects.dreamhost.com/release/logstash-1.1.9-monolithic.jar
Indexer configuration – indexer.conf:
input {
redis {
host => "192.168.0.235"
port => "6379"
type => "redis-input"
data_type => "list"
key => "logstash"
format => "json_event"
}
}
output {
stdout { debug => true debug_format => "json"}
elasticsearch {
host => "192.168.0.235"
port => "9300"
cluster => "elasticsearch"
}
}
Shipper configuration – shipper.conf:
input {
file {
type => "nginx"
path => ["/usr/local/nginx/logs/*.log"]
exclude => ["*.gz"]
tags => ["nginx"]
}
}
output {
stdout { debug => true debug_format => "json"}
redis {
host => "192.168.0.235"
data_type => "list"
key => "logstash"
}
}
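Optionally, the raw nginx lines can be parsed into structured fields before they reach Elasticsearch. A sketch of a grok filter in logstash 1.1.x syntax (the COMBINEDAPACHELOG pattern matches nginx's default combined log format; this filter is an addition, not part of the original setup) that could be placed in either config:

```conf
filter {
  grok {
    type => "nginx"                    # only apply to the shipper's nginx events
    pattern => "%{COMBINEDAPACHELOG}"  # split into client IP, verb, status, bytes, etc.
  }
}
```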
java -jar logstash-1.1.9-monolithic.jar agent -f indexer.conf &
java -jar logstash-1.1.9-monolithic.jar agent -f shipper.conf &
5、Kibana
Set up Ruby
yum install ruby ruby-devel ruby-ri ruby-rdoc rubygems
wget http://production.cf.rubygems.org/rubygems/rubygems-2.0.3.zip
unzip rubygems-2.0.3.zip
ruby rubygems-2.0.3/setup.rb
Get Kibana
wget https://github.com/rashidkpc/Kibana/archive/v0.2.0.zip
unzip v0.2.0.zip
cd Kibana-0.2.0
gem install bundler
bundle install
Configure KibanaConfig.rb:
Elasticsearch = "192.168.0.235:9200"
KibanaPort = 80
KibanaHost = '192.168.0.235'
Run Kibana
bundle exec ruby kibana.rb