Logging for OpenStack - Elasticsearch, Fluentd, Logstash, Kibana
Posted on 21-Jan-2018
TRANSCRIPT
OPENSTACK & LOGGING
ABOUT ME
Md Safiyat Reza
Fresh out of college!
Open-source enthusiast
An EMACS and KDE user.
Software Engineer at Snapdeal.com
safiyat
@reza_safiyat
reza.safiyat@acm.org
Logs are boring!
nova-compute nova-network nova-manage nova-conductor nova-scheduler nova-api nova-cert nova-console nova-consoleauth nova-dhcpbridge
apache logs
cinder-api cinder-scheduler cinder-volume
syslog
keystone
glance-api glance-registry
dhcp-agent l3-agent metadata-agent openvswitch-agent server.log openvswitch-server
Lots of logs!
What logs contain...
TIMESTAMP
PID
NAME
REQUESTID
USERID
TENANTID
[INSTANCE INSTANCEID]
MESSAGE
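The fields above can be pulled out of a raw log line with a simple regular expression. A minimal Python sketch, assuming a nova-style line; the sample line, the log level position, and the pattern are illustrative, not an exact oslo.log format:

```python
import re

# Hypothetical nova-api log line following the field layout on the slide
# (TIMESTAMP PID LEVEL NAME [REQUESTID USERID TENANTID] MESSAGE):
line = ("2018-01-21 10:15:32.123 4532 INFO nova.api.openstack.wsgi "
        "[req-a1b2c3d4 user1 tenant1] GET /v2/servers returned 200")

# One named capture group per field; the optional [INSTANCE INSTANCEID]
# part is omitted here for brevity.
pattern = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+) "
    r"(?P<pid>\d+) "
    r"(?P<level>\w+) "
    r"(?P<name>\S+) "
    r"\[(?P<request_id>req-\S+) (?P<user_id>\S+) (?P<tenant_id>\S+)\] "
    r"(?P<message>.*)"
)

m = pattern.match(line)
fields = m.groupdict()
print(fields["level"], fields["request_id"])  # INFO req-a1b2c3d4
```

This is essentially what the grok patterns in the Fluentd and Logstash configs later in the deck do, just hand-rolled.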
All that is fine,
How to use them?
When shit happens...
DEBUG INFO AUDIT WARNING ERROR CRITICAL TRACE
Use logs for a good cause.
Log Collection, Aggregation & Visualization
Log Collectors and Aggregators
The flow of logs
• Open source
• Centralized logging
• Collection & Transport
• High availability
Fluentd
• Written in Ruby
• Requires Ruby (alt. td-agent)
• XML-styled configuration
• Plugins available
• Owned by Treasure Data
Logstash
• Written in JRuby
• Requires JVM
• JSON-styled configuration
• Lots and lots of plugins
• Owned by Elastic
<source>
  path /var/log/syslog
  type syslog
  format grok
  grok_pattern %{SYSLOGLINE}
  tag system
</source>
<filter>
  type record_transformer
  enable_ruby
  <record>
    timestamp ${timestamp.gsub!(" ", "T"); timestamp.gsub(/\.\d*/, "+05:30")}
  </record>
</filter>
<match *>
  type forest
  subtype elasticsearch
  <template>
    host elasticsearchhost
    index_name someindexname
    type_name sometypename
  </template>
</match>
Fluentd: syslog to Elasticsearch
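The record_transformer snippet above rewrites the parsed syslog timestamp into ISO-8601 form with an explicit +05:30 offset. The same two substitutions, sketched in Python for clarity (the input value is illustrative):

```python
import re

# What the embedded Ruby in the <record> block does, step by step:
ts = "2018-01-21 10:15:32.123"        # syslog-parsed timestamp (illustrative)
ts = ts.replace(" ", "T")             # gsub(" ", "T")  -> date/time separator
ts = re.sub(r"\.\d*", "+05:30", ts)   # gsub(/\.\d*/, ...) -> drop the fraction, add offset
print(ts)  # 2018-01-21T10:15:32+05:30
```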
input {
  file {
    path => "/var/log/syslog"
    start_position => "beginning"
    type => syslog
    sincedb_path => "/dev/null"
  }
}
filter {
  grok {
    match => [ "message", "%{SYSLOGLINE:log}" ]
  }
}
output {
  file {
    path => "/home/safiyat/kafkaop"
  }
  elasticsearch {
    protocol => "http"
  }
}
Logstash: syslog to Elasticsearch
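Once either pipeline is shipping into Elasticsearch, the indexed documents can be searched directly before ever opening Kibana. A minimal sketch of a search body for ERROR-level lines; the index name and field names are assumptions, not taken from the configs above:

```python
import json

# Hypothetical body for POST /someindexname/_search:
# match ERROR-level records, newest first, 20 at a time.
query = {
    "query": {"match": {"message": "ERROR"}},
    "sort": [{"timestamp": {"order": "desc"}}],
    "size": 20,
}
print(json.dumps(query, indent=2))
```

Kibana builds queries like this under the hood; the dashboards in the next slides are just saved searches and visualizations over the same index.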
Kibana
Kibana, again.
Thank you for bearing this!