Using Heka
TRANSCRIPT
All rights reserved. Exotel Techcom Pvt. Ltd. © 2016
Handling Logs, Events and Metrics Using Heka
Siddharth [email protected] / @sidramesh
Context

Distributed systems produce different types of data:
LOGS | EVENTS | METRICS
Each type of data has unique (aggregation / processing / freshness) characteristics
Why? (The Problem)
Collecting, aggregating and processing these distinct types of data
requires a variety of different tools
Why? (The Problem)
LOGS:
• Logstash
• rsyslogd
• fluentd
• Splunk

METRICS:
• StatsD pushing into (Graphite | Influx | OpenTSDB)
• Perf collectors
• Datadog / New Relic
Why? (The Problem)

EVENTS:
• There is no standard!
• Brokered queues (SQS, RabbitMQ, Beanstalkd)
• WebSockets / XMPP

Application Developer:
• Use many different libraries

DevOps Team:
• Different technologies to maintain
What? (is Heka)

Heka is a high-performance, extremely extensible tool for gathering, analyzing, monitoring and reporting data

Simple semantics:
• Acquire data
• Transform data
• Output data

“Swiss Army Knife”

Internally modeled as a generic message router
What? (is Heka)

Various inputs:
- Logs
- TCP / UDP / HTTP
- StatsD
- …

Complex processing:
- Aggregation
- Anomaly detection (including alerting!)

Inbuilt realtime graphing
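As a sketch of how these pieces wire together, a minimal hekad TOML configuration might tail a log file, accept StatsD traffic, and feed the built-in dashboard. The plugin names (LogstreamerInput, StatsdInput, StatAccumInput, DashboardOutput) follow Heka's documentation; the paths, ports and intervals here are illustrative:

```toml
# Tail application log files (directory and match pattern are illustrative)
[LogstreamerInput]
log_directory = "/var/log/myapp"
file_match = 'app\.log'

# Accept StatsD-format counters and timers on the conventional UDP port
[StatsdInput]
address = "127.0.0.1:8125"

# Roll the StatsD samples up into periodic stat messages
[StatAccumInput]
ticker_interval = 10

# Inbuilt realtime graphing / health dashboard
[DashboardOutput]
address = "127.0.0.1:4352"
```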
How? (do we use it)

• Library (currently only in Go) to emit logs, events or metrics
• LogstreamerInput for logs, StatsdInput for metrics, TcpInput for events
• Only hekad runs on every machine
• Local hekad captures data and pushes it to a brokered queue (Kafka)
• Global hekad reads off Kafka and pushes logs and metrics to their respective stores
• Soon – push to Nagios (eliminate nsca)

Basically, the single data pipeline at Exotel
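The local/global split above can be sketched as two hekad configurations. KafkaOutput, KafkaInput, ElasticSearchOutput and CarbonOutput are Heka plugins; the topic name, broker addresses, store endpoints and matcher expressions are illustrative assumptions:

```toml
# --- local hekad (runs on every machine): forward everything to Kafka ---
[KafkaOutput]
message_matcher = "TRUE"
topic = "heka"                        # illustrative topic name
addrs = ["kafka1:9092", "kafka2:9092"]

# --- global hekad (aggregator): read off Kafka, fan out to the stores ---
[KafkaInput]
topic = "heka"
addrs = ["kafka1:9092", "kafka2:9092"]

[ElasticSearchOutput]                 # logs → Elasticsearch
message_matcher = "Type == 'log'"     # illustrative matcher
server = "http://es:9200"

[CarbonOutput]                        # metrics → Graphite
message_matcher = "Type == 'heka.statmetric'"
address = "graphite:2003"
```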
Our Experience

Dev win:
• Single library to integrate with
• Consistent interface
• No separate libraries for sending events, collecting perf counters

DevOps win:
• Single data pipeline for all data
• Ability to change the underlying pieces without dev support
• Super light-weight: 5 Logstash medium instances replaced with 1 hekad medium instance, always < 5% CPU utilization
Our Experience

A bit tough to configure at first:
• Do ping us for any help

Lua plugins FTW:
• Very light-weight (compared with, say, Ruby plugins for Logstash)
• Dynamically loaded by Heka w/o restart
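To illustrate why these plugins stay light-weight: a Heka SandboxFilter is just a couple of Lua callbacks. This sketch follows Heka's documented sandbox interface (`process_message`, `timer_event`, `inject_payload`); the message-counting logic itself is a made-up example:

```lua
-- Counts every message routed to this filter and periodically
-- injects the running total back into the pipeline as a payload.
local count = 0

function process_message()
    count = count + 1
    return 0  -- 0 signals success to hekad
end

-- Called by hekad on each ticker_interval tick
function timer_event(ns)
    inject_payload("txt", "message_count", tostring(count))
    count = 0
end
```

The script would be loaded via a `[SandboxFilter]` config section whose `filename` points at it; because scripts run in a sandbox and are loaded dynamically, they can be swapped without restarting hekad.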
Takeaways

Tremendous value in having a single data pipeline
“One ring to rule (90% of) them all”
Heka helps you (possibly) achieve it