
Log data using Serilog, Elasticsearch and Kibana

Developers often use text files, system events, the output console or a database as a log store. In some cases they use distributed systems such as Splunk or Elasticsearch. In this article I would like to show how to store logs as objects rather than plain text, which also enables all sorts of graphs and charts grouped into dashboards. To achieve this, I'll use Serilog, Elasticsearch and Kibana.

To test this solution, you'll need Visual Studio (obviously), an Elasticsearch instance and the Kibana extension on your PC. You may also use Amazon Web Services, which offers these services (even for free). I prefer the second option because it is much easier and quicker to set up.

Elasticsearch and Kibana – AWS configuration

What is Elasticsearch?

Elasticsearch is a search engine based on Lucene. It provides a distributed, multitenant-capable full-text search engine with an HTTP web interface and schema-free JSON documents. In this case I’m going to use it as a data store.

What is Kibana?

Kibana is a plugin which gives you the ability to analyse real-time data stored in and passed to Elasticsearch. It's very simple to use and lets you create different graphs and charts in minutes.

How to set it up?

As an example I'm going to use Elasticsearch and Kibana offered as a service in AWS. From the application's point of view, I'll only need Elasticsearch's endpoint URL, to which Serilog will send the logs.

The first step is to create an AWS account; I'll skip that here. The next step is to log into the AWS console, select a region (top right corner of the page) and create a new Elasticsearch instance. You may find it here (as of December 2016):

Elasticsearch - 1

On the first screen it asks me to provide the instance domain name I would like to use. In my case it's mpustelak-test:

Elasticsearch - 2

Domain provided? Good. The next step is to select the instance size, the number of partitions you would like to split it across, the disc type, etc. I'll pick the basic and free options.


The third step is, as with most AWS services, to provide the access policy we would like to attach to this instance and to Elasticsearch. For this scenario only, I'm setting it to be open to everyone, accessible from anywhere in the world (both reading and writing data). Please don't try to invade it once you get here! I'll delete it once the article is finished 🙂

On development and production environments I would suggest setting up some restrictions, e.g. making the instance accessible to certain IPs only.

Elasticsearch - 3

The last step is to confirm that everything we just picked is correct. Done? Great. Now we need to wait several minutes and our instance will be ready to go.


10 minutes gone, page refreshed and voilà! Our brand-new, free instance is running. Now we may obtain the two URLs: the endpoint URL required by Serilog, and another one to access Kibana.


Easy enough even for someone who has never touched the AWS console, and it doesn't take longer than a few minutes. I've got Elasticsearch's endpoint URL and Kibana's dashboard link.

There's an even easier (with some practice) and faster way to start a new instance: CloudFormation or another scripting approach, e.g. PowerShell with AWS Tools. You may even store the scripts in a version control system for later re-use. However, I wanted to present the easiest possible way here, with just a few clicks.
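As a rough sketch of the scripted alternative (the domain name, version and sizing below are placeholder values; check the AWS CLI documentation for what is current in your region), creating a comparable domain from the command line could look like this:

```shell
# Sketch only: provision a small Elasticsearch domain via the AWS CLI.
# Requires configured AWS credentials; all names and sizes are example values.
aws es create-elasticsearch-domain \
    --domain-name mpustelak-test \
    --elasticsearch-version 2.3 \
    --elasticsearch-cluster-config InstanceType=t2.micro.elasticsearch,InstanceCount=1 \
    --ebs-options EBSEnabled=true,VolumeType=gp2,VolumeSize=10

# Once the domain is active, fetch its endpoint URL for Serilog:
aws es describe-elasticsearch-domain --domain-name mpustelak-test
```

The access policy and the Kibana link can be managed the same way, which makes the whole setup repeatable and versionable.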

The next part is to create an application (actually an integration test) which will send data to AWS.

.NET and Serilog

Serilog is a NuGet package which helps you log data into any data store as structured objects, e.g. JSON. This is helpful when you later search through the logs.

What I would like to present here is creating an integration test using NUnit instead of a console application. Getting into the habit of creating tests first will help you in later stages; such a test can even be executed in a Continuous Integration pipeline, e.g. in TeamCity. That's why I'm going to use a Class Library project. As usual, you'll be able to get the code from GitHub.

Once the project is ready, I need to install all the required packages:

install-package Serilog
install-package Serilog.Sinks.Elasticsearch
install-package Serilog.Settings.AppSettings

Thanks to these packages I'll be able to log data and send it to Elasticsearch. The last package lets me load the configuration directly from the settings file instead of hardcoding it.

As I mentioned before, I'm going to write the test first, in which I'll send logs to the data store through Serilog.

public class LoggingExample
{
    private readonly IFixture _fixture = new Fixture();

    [Test]
    public void Given_LogManager_When_SendingLogMessage_Then_ItShouldNotThrowAnException()
    {
        var logObject = new
        {
            CurrentDate = DateTime.UtcNow,
            ObjectName = _fixture.Create<string>()
        };

        Assert.That(() =>
        {
            // Build the logger from the <appSettings> configuration;
            // disposing it flushes any batched events to Elasticsearch.
            using (var log = new LoggerConfiguration()
                .ReadFrom.AppSettings()
                .CreateLogger())
            {
                log.Information("This is my test message with an {@LogObject}", logObject);
            }
        }, Throws.Nothing);
    }
}

This simple code shows how to create an instance of Serilog's ILogger implementation using configuration stored in the App.config file. The last part is responsible for sending an anonymous object to Elasticsearch. I'm not expecting any exceptions here, which is why I'm checking that the log.Information(...) call doesn't throw anything.
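For illustration only (field names are simplified and the exact document shape depends on the sink version and the template used), the @ destructuring operator means the event lands in Elasticsearch as a JSON document roughly like this, with LogObject stored as a nested object rather than a pre-rendered string:

```json
{
  "@timestamp": "2016-12-18T10:15:30.000Z",
  "level": "Information",
  "messageTemplate": "This is my test message with an {@LogObject}",
  "fields": {
    "LogObject": {
      "CurrentDate": "2016-12-18T10:15:30.000Z",
      "ObjectName": "..."
    }
  }
}
```

This nested structure is what makes the logs queryable and chartable in Kibana field by field.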

My App.config file:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <appSettings>
    <add key="serilog:using" value="Serilog.Sinks.Elasticsearch" />
    <add key="serilog:write-to:Elasticsearch.nodeUris" value="https://search-mpustelak-test-adep5jkmtwoqjr4sbmmd2rzzbm.eu-west-1.es.amazonaws.com" />
    <add key="serilog:write-to:Elasticsearch.indexFormat" value="test-{0:yyyy.MM}" />
    <add key="serilog:write-to:Elasticsearch.templateName" value="serilog-events-template" />
  </appSettings>
</configuration>

This configuration means that I'm going to use Elasticsearch as my data store. In addition, I'm providing the required settings: the endpoint URL, the index pattern and the template name I'll use in my example.

Coding work done, test passed. That doesn't mean everything works as expected; it only means that the logger doesn't throw any exceptions.

The next step is to configure Kibana to display my index (the pattern will be test-*). For this I'll use the second URL from the AWS instance configuration.


Once I hit save, I'll see all the fields passed by my integration test discovered by Kibana.


The last screenshot proves that my integration test actually sent something to Elasticsearch.



In my opinion, the combination of Elasticsearch + Kibana + Serilog is a really good way to store logs. Thanks to it, building a reliable logging system is fast and easy. I spent maybe 10 minutes setting up the AWS instance, and meanwhile I was coding my integration test. Moreover, out of the box you get an amazing tool in Kibana. I didn't show it here, but Kibana lets you create impressive charts and graphs based on the data you pass in. It has an easy-to-use UI where, using Lucene query syntax, you can find the data you're looking for.

Published by

Mateusz Pustelak

Software Developer with several years of commercial experience, TDD practitioner, DDD/CQRS fan. Currently working for Universal Music Group in London.

8 thoughts on “Log data using Serilog, Elasticsearch and Kibana”

  1. Hi Mateusz! Nice post 🙂

    Just a minor point regarding Serilog configuration in the unit test; because the Elasticsearch sink buffers and sends messages asynchronously, the logger will need to be disposed in order to ensure the batch is flushed before the test is torn down:

    using (var log = new LoggerConfiguration()
        .ReadFrom.AppSettings()
        .CreateLogger())
    {
        // write log events here; Dispose() flushes any buffered batch
    }


    Application code can place the logger in a `using` block like this, or if the static `Log` class is configured and used, the `Log.CloseAndFlush()` method performs this role.


  2. Thanks a lot for your big efforts.
    Are Elasticsearch + Kibana on AWS fully free?

    In your real projects, you DON'T USE log4net, NLog or Microsoft's Logging Application Block; you ONLY USE Serilog to generate structured data? And optionally you can store all your logs in the ELK stack (free on AWS), isn't it?

    IMHO, the best samples for minimizing the learning curve are real applications with full source code and good patterns and practices. Does anyone know a full source code sample of a REAL application? Not a demo, only real applications?
    I think the best would be full real samples on GitHub/CodePlex/CodeProject with complete source code and good patterns and practices ("advanced search across all repositories"); better still if anyone knows a good project with such patterns.

    Googling I have a BIG BANG FAT repository of links and infinitum code. Really.

    Mindly notes:

    main influences, are full of innovative ideas that can free our minds to explore new techniques, patterns and paradigms.

    You want to use technologies that allow for rapid development, constant iteration, maximal efficiency, speed, robustness and more. You want to be lean and you want to be agile. You want to use technologies that will help you succeed in the short and long term. And those technologies are not always easy to pick out.

    ELK stack = Elasticsearch + LogStash + Kibana

    1. sounds like you need your own blog, mate. maybe make the “full source code and using good patterns and practices” like you’d like to see
