It's an often-asked question, especially when you are tasked with deploying a logging solution: which is better, Graylog or ELK? Both are amazing products that do things that, just a handful of years ago, used to cost a fortune. So, without spending several months evaluating each one, you want to know which is best and skip the need for testing? As always, the answer is “it depends”.

While both ELK and Graylog are commonly compared as logging solutions, there is an important distinction to make. Graylog was built from the very beginning to be a logging solution, whereas ELK is a Big Data solution that also happens to be pretty good at logging. It sounds like a minor distinction, but in practice the differences are quite large. Is a log “just a log”, or is it Big Data that you are eager to run Business Intelligence analysis on?

Graylog is an application built for the specific purpose of being a logging solution; in comparison to ELK, it performs the (L)ogstash and (K)ibana functions. Out of the box, Graylog does much of what you want from a logging solution, nice and easy and in the GUI: parsing, alerting, configuration, custom pipelines, some basic graphing. The learning curve is relatively gentle and you can have something fully functional in a minimal amount of time. Everything of importance is in the GUI and easy to find.

Where Graylog starts to fall short is when you want to use it for anything beyond typical enterprise-style logging requirements. Graphing is somewhat basic, and you end up needing Grafana or Kibana for anything requiring more intricate graphs. Once you get into very heavy and custom parsing requirements, the extractors Graylog uses for parsing are powerful, but in a different way than something like Logstash. Most Graylog extractors are “everything” extractors for a certain type of log, whereas Logstash assembles a combination of generic plugins to parse a certain type of log. You can do a lot with Graylog, but once you get outside the scope of what it does really well, it becomes much more difficult to accomplish whatever you are trying to do. I’ve seen a few environments that started off using Graylog, then added Grafana to display Graylog data, then added an InfluxDB or Graphite datastore to store metrics to graph in Grafana, then custom scripts and programs to tweak Graylog, then exported metrics and logs to yet another program for Business Intelligence-style trending.
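To make that plugin-assembly style concrete, here is a minimal sketch of a Logstash filter section that parses a syslog-style line by chaining generic plugins (the field names are illustrative, not specific to any real install):

```
filter {
  # grok pulls structured fields out of the raw message
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} %{HOSTNAME:host} %{GREEDYDATA:msg}" }
  }
  # date turns the extracted timestamp into the event's canonical time
  date {
    match => [ "timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
  }
  # mutate cleans up fields we no longer need
  mutate {
    remove_field => [ "timestamp" ]
  }
}
```

Each plugin does one generic job; the combination is what handles a particular log type. A Graylog extractor, by contrast, would typically be one purpose-built unit for that whole log format.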


Graylog pros:

Easy to use out of the box
The streams feature is pretty amazing
Mostly GUI based so learning curve is a lot easier
Handles most of the details of the Elasticsearch indexing for you
Authentication in the free version
Alerting in the free version


Graylog cons:

Graphing is basic and you need to use Grafana and/or Kibana in addition for more advanced needs
Log processing is not nearly as flexible as Logstash.  Some larger installs of Graylog will process logs with Logstash and then feed it into Graylog
Custom log extractors can be used and written, but they are written in Java, which has a steeper learning curve than Ruby (which is what Logstash uses)
There is less flexibility in the way indexes are created and written
There are far fewer plugins available than for Logstash and Kibana

ELK is three independent products: Elasticsearch, which stores the data and lets you search it; Logstash, which processes the data; and Kibana, which visualizes it. The learning curve is steep, but the payoff is worth it. ELK lets you do almost everything you will need in a single application stack and removes the need to set up, learn, and maintain several different applications. Logs, metric collection, searching, graphing, trending, business intelligence: it is possible to do it all.

Inside the initial complexity lies the benefit of ELK. It is a blank slate on top of which you build exactly what you need, and the possibilities are nearly endless. Creating a visualization to surface a problem that would be very difficult to search for or alert on is where ELK starts to shine. ELK has a fairly open plugin ecosystem, and if what you need isn’t already built in, you can probably find it. Because it handles both logs and metrics as well as the visualization of them, you are only learning and maintaining one application stack versus a handful of products stitched together.

Logstash, though complex, is a very powerful processing pipeline for any data. It is a resource hog compared to Graylog due to running in JRuby, but there is a very fortunate side effect of that: a non-programmer such as myself can usually throw together something in Ruby that will work. There has been some recent effort to rewrite the resource-hungry plugins in Java, so performance is gradually improving. The learning curve of the product itself is where ELK falls short; it is a hurdle to overcome. Once you do, though, ELK is a very powerful product suite.
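As a taste of that, the kind of Ruby you might end up writing for Logstash is often little more than everyday string handling. A standalone sketch of the sort of one-off parsing logic involved (the key=value format and field names here are just an example, not any Logstash API):

```ruby
# Parse a key=value style log line into a hash, the kind of small
# one-off logic a non-programmer can throw together in Ruby.
def parse_kv(line)
  # find every key=value pair; values may be quoted to allow spaces
  line.scan(/(\w+)=("[^"]*"|\S+)/).each_with_object({}) do |(key, value), fields|
    fields[key] = value.delete('"') # strip surrounding quotes if present
  end
end

event = parse_kv('user=alice action="log in" status=200')
puts event.inspect # parsed fields as a hash
```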


ELK pros:

Logstash is a very powerful and customizable log processing pipeline
It’s relatively easy to write custom Ruby plugins for Logstash, even for non-programmers
Kibana visualizations come close to competing with what Grafana can do
Can use the same product set and knowledge for both metrics and logs and not need to use/support multiple different products
You have complete control over how you index data into Elasticsearch. Hot/cold clusters, different retention periods for different indexes, etc.
Logstash config is text based. Copy/paste those chunks of configs because you are usually doing the same thing in 10 different ways
The different beats plugins come with Kibana dashboards that are easily importable


ELK cons:

Logstash config is text based (learning curve)
Logstash can be a resource hog
Kibana ships with no default ‘logging’ dashboards
You are more exposed to managing the Elasticsearch cluster and indexes (be careful)
Authentication is a paid feature, or you need to find a 3rd party plugin
Alerting is a paid feature, or you need to find a 3rd party plugin
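That index-management trade-off is concrete: with ELK you typically define your own index templates, something Graylog largely handles for you. A minimal sketch of a legacy-style template (names and values are illustrative, and the exact API varies by Elasticsearch version):

```
PUT _template/logs
{
  "index_patterns": ["logs-*"],
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 1
  }
}
```

This is where the extra control (hot/cold tiers, per-index retention) comes from, and also where the “be careful” above applies: a bad shard count or mapping set here follows every future index.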

In summary
If all you are looking for is a place to store logs and search through them as needed, or you want a logging solution but do not plan on spending much time on it, then Graylog is probably a wise choice to look into. Out of the box it works well and supplies sensible defaults for a lot of the settings that can and will cause you grief later on if you set them up wrong. The time to a usable solution is shorter and you don’t need to learn as much in order to be successful.

If instead you foresee the need for graphing and metrics collection, or you are just interested in big data, then give ELK serious consideration. If you use Graylog instead, chances are you will end up using ELK (or similar products such as fluentd or Grafana) in addition to Graylog, so you might as well start with a product that can meet more of the end requirements from the start.
