
Setup Fluent Bit with Elasticsearch and Kibana (EFK) for Log Management on Linux Machine (Non Kubernetes)

Posted in Programming   LAST UPDATED: SEPTEMBER 6, 2021

    In this tutorial we will cover a basic EFK setup, which is Fluent Bit, Elasticsearch and Kibana, for log management: collecting and parsing logs, storing them in Elasticsearch, and finally visualizing them using the Kibana UI.

    These days, with the growing focus on distributed and microservice-based architectures, where different services may run on different virtual machines or servers, it becomes crucial to have a log collection setup in place so that all logs are collected and stored in one place. And if you want to analyze the collected logs, having a UI like Kibana or Grafana is of great benefit.

    EFK setup for log aggregation

    Fluent Bit is a lightweight log collection and forwarding service, Elasticsearch is a service that stores data as JSON documents, and Kibana is a UI service that can be configured to read data from Elasticsearch. We will see how to do a basic installation of all three services on a Linux machine in a non-Kubernetes environment.

    Install and Configure Fluent Bit

    We have a separate tutorial covering the installation steps for Fluent Bit. To configure Fluent Bit, we will set up the Input and Output configuration so that it reads logs from our application's log file; in the case of multiple applications, we can also configure Fluent Bit to tail multiple log files.

    Once you have installed Fluent Bit on your Linux machine, you will find its configuration files in the /etc/td-agent-bit/ directory. The main file is td-agent-bit.conf, in which we can provide the Input plugin and Output plugin.
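    For reference, a typical td-agent-bit.conf starts with a [SERVICE] section for global settings, followed by one or more [INPUT] and [OUTPUT] sections. The overall layout looks like this (the [SERVICE] values below are common defaults, not required settings):

    ```ini
    [SERVICE]
            # How often (in seconds) buffered records are flushed to outputs
            Flush        5
            # Keep the process in the foreground; the init script manages it
            Daemon       Off
            Log_Level    info

    [INPUT]
            # ... input plugin settings go here (see below) ...

    [OUTPUT]
            # ... output plugin settings go here (see below) ...
    ```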

    In our setup, we will use the tail Input plugin to read data from log files, and the Elasticsearch Output plugin to push the data into Elasticsearch, creating an index in which the data will be stored.

    Following should be the Input configuration:

    [INPUT]
            Name              tail
            Tag               mylogs
            Path              /path/to/files/*.log
            Mem_Buf_Limit     5MB
            Skip_Long_Lines   Off
            Refresh_Interval  10
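    If you have multiple applications, you can simply repeat the [INPUT] section once per application, giving each a different Tag and Path. The paths and tags below are only placeholders for illustration:

    ```ini
    [INPUT]
            Name              tail
            Tag               app1_logs
            Path              /var/log/app1/*.log
            Mem_Buf_Limit     5MB

    [INPUT]
            Name              tail
            Tag               app2_logs
            Path              /var/log/app2/*.log
            Mem_Buf_Limit     5MB
    ```

    The Tag is what an [OUTPUT] section's Match field filters on, so tagging each input separately lets you later route different applications to different Elasticsearch indexes if you want.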

    and the output configuration:

    [OUTPUT]
            Name  es
            Match *
            Index mylogs
            Host localhost
            Port 9200
            Logstash_Format Off

    In the above configuration we have specified the hostname of the server on which Elasticsearch is running. If you set up all three services on the same machine, you can use localhost; otherwise change it to your server's hostname. The port on which Elasticsearch runs is 9200 by default.

    Restart the Fluent Bit service using the following command:

    sudo service td-agent-bit restart
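    To confirm the service came back up without configuration errors, you can check its status; on systemd-based distributions you can also tail the service logs, which is where Fluent Bit reports bad configuration on startup:

    ```shell
    # Check whether the service is running
    sudo service td-agent-bit status

    # On systemd-based distributions, show the last 20 log lines of the service
    sudo journalctl -u td-agent-bit -n 20 --no-pager
    ```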

    Install Elasticsearch

    Installing Elasticsearch is very easy. Download the archive for Elasticsearch from here: https://www.elastic.co/downloads/elasticsearch

    Extract the downloaded archive. Then all you have to do is go to the config folder and update the Elasticsearch configuration in the elasticsearch.yml file: provide the hostname for the server if your host has a domain name.
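    For example, the settings you would typically touch in elasticsearch.yml look like this (the values shown are illustrative, not required):

    ```yaml
    # elasticsearch.yml (illustrative values)
    cluster.name: my-logs-cluster
    node.name: node-1
    # Bind address; use the server's hostname or IP so other machines can reach it
    network.host: localhost
    http.port: 9200
    # On Elasticsearch 7.x and later, a single-machine setup that binds to a
    # non-loopback address may also need this to pass the bootstrap checks:
    # discovery.type: single-node
    ```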

    To start the service, go to the bin/ directory and run the elasticsearch shell script with the following command:

    nohup ./elasticsearch & 

    This will run the service in background.
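    Once Elasticsearch is up (it can take a few seconds to start), you can verify that it is responding:

    ```shell
    # Should return a small JSON document with the cluster name and version
    curl http://localhost:9200/
    ```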

    Install Kibana

    To install Kibana, download it from here: https://www.elastic.co/downloads/kibana

    Make sure you download the same version of Kibana as of Elasticsearch. Extract the downloaded archive, go to the config directory inside the kibana folder, and update the kibana.yml file, changing the hostname from localhost to your server's hostname if required.
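    The relevant kibana.yml settings typically look like this (illustrative values; on Kibana versions before 6.6 the last setting was named elasticsearch.url instead):

    ```yaml
    # kibana.yml (illustrative values)
    server.port: 5601
    server.host: "localhost"
    # Point Kibana at your Elasticsearch instance
    elasticsearch.hosts: ["http://localhost:9200"]
    ```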

    Then go to bin/ directory and run the kibana shell script, using the following command:

    nohup ./kibana &

    This will run the service in the background. Once up, the Kibana service will be available at http://localhost:5601/app/kibana URL.

    The EFK Setup:

    If all is configured well, your Fluent Bit service should start reading logs from the log files and push them to Elasticsearch, and you should be able to see the logs in Kibana.

    To see if Fluent Bit is working fine and is creating the index in Elasticsearch, you can run the following command to see the indices created in Elasticsearch:

    curl -XGET 'localhost:9200/_cat/indices?v&pretty'
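    If the mylogs index shows up in the list, you can also pull a few documents directly from Elasticsearch to confirm that log lines are arriving:

    ```shell
    # Fetch up to 3 documents from the mylogs index
    curl -XGET 'localhost:9200/mylogs/_search?pretty&size=3'
    ```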

    To see the logs in the Kibana UI, you should create a new index pattern for your index, which is mylogs, and then use the Kibana UI to see the logs for that index.

    If you face any confusion with the above setup, feel free to comment down below.
