DevOps Classroomnotes 07/Mar/2022

Collecting Logs from different servers into Elastic Search via Logstash

  • Let's write a configuration in Logstash that listens for input from a Beats component, parses the content into multiple fields, and sends the output to Elasticsearch
  • Beats:
    • Logstash has a beats input plugin that supports the different types of Beats
    • Refer Here for the full list of Beats
    • For collecting logs from files we can use Filebeat. Refer Here
    • Filebeat docs: Refer Here
    • Filebeat installation: Refer Here
    • Apt-based Filebeat installation: Refer Here
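    • As a sketch, the apt-based install follows Elastic's standard repository setup (shown here for the 7.x series, which was current at the time; substitute the version you are using):

      ```
      # Add Elastic's signing key and apt repository (7.x assumed)
      wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
      echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | \
          sudo tee /etc/apt/sources.list.d/elastic-7.x.list
      sudo apt-get update && sudo apt-get install filebeat
      ```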
    • To configure Filebeat, edit the config file /etc/filebeat/filebeat.yml
    • Refer Here for a configuration file that reads the logs
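    • A minimal filebeat.yml for shipping Apache logs to Logstash might look like the sketch below (the log path and the Logstash address are assumptions; adjust them to your environment):

      ```
      # Sketch of /etc/filebeat/filebeat.yml - paths and host are assumptions
      filebeat.inputs:
        - type: log
          enabled: true
          paths:
            - /var/log/apache2/*.log    # assumed Apache log location

      # Send events to Logstash rather than directly to Elasticsearch
      output.logstash:
        hosts: ["<logstash-server>:5044"]  # must match the port of the Logstash beats input
      ```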
    • Until this point we have been running Logstash manually. To run Logstash as a service that reads multiple configurations, create .conf files and place them in the /etc/logstash/conf.d folder
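    • As a sketch, deploying a pipeline file and managing the Logstash service might look like this (the file name apache.conf is an assumption):

      ```
      # Place the pipeline configuration where the service picks it up
      sudo cp apache.conf /etc/logstash/conf.d/apache.conf

      # Start Logstash as a service and enable it on boot
      sudo systemctl enable --now logstash

      # Verify it is running
      sudo systemctl status logstash
      ```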
    • Now Logstash is running; Elasticsearch and Kibana are also running
    • We need to start Filebeat
    • Now Filebeat is also running
    • To see the logs, the application should generate logs. To simulate users accessing the application, let's create a simple shell script which calls the Apache server every 10 s:

      #!/bin/bash
      # URL of the application; adjust to your Apache server's address
      url="http://localhost"
      while :
      do
          curl ${url}
          echo "press <Ctrl+C> to exit"
          sleep 10s
      done
    • Now execute this script from any machine. Windows users can execute it from Git Bash. Refer Here

    • Now let's move to Kibana to view the logs and do some analysis
    • Now we need to create a data view (earlier referred to as an index pattern)
    • Since Kibana is unable to connect to the secured Elasticsearch, let's use Elastic Cloud. This is hosted Elasticsearch and Kibana with a free 14-day trial. Refer Here
    • The changed Logstash configuration is shown below
    input {
        beats {
            port => 5044
        }
    }
    filter {
        grok {
            match => { "message" => "%{COMBINEDAPACHELOG}" }
        }
    }
    output {
        elasticsearch {
            cloud_id => "qtelastictrail:dXMtY2VudHJhbDEuZ2NwLmNsb3VkLmVzLmlvJDMzZmNlYzY0NWI5ZTRhY2ViMDA3MzMyNWI0YmEzOWY2JDczN2VkY2JjMDFhMTRjOTI4ZjcxNzIxNWU1ZDIzYmYx"
            index => "apache-%{+yyyy.MM.dd}"
            cloud_auth => "elastic:8jnJITEZ0bbZxASxDs8ecUFd"
        }
        stdout { }
    }
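    • Incidentally, the cloud_id used above is just a deployment label followed by a base64-encoded string of the form <elasticsearch-host>$<elasticsearch-uuid>$<kibana-uuid>, so it can be inspected locally:

      ```shell
      # The cloud_id is "<label>:<base64 data>"; strip the label and decode the rest
      cloud_id="qtelastictrail:dXMtY2VudHJhbDEuZ2NwLmNsb3VkLmVzLmlvJDMzZmNlYzY0NWI5ZTRhY2ViMDA3MzMyNWI0YmEzOWY2JDczN2VkY2JjMDFhMTRjOTI4ZjcxNzIxNWU1ZDIzYmYx"
      echo "${cloud_id#*:}" | base64 -d
      # → us-central1.gcp.cloud.es.io$33fcec645b9e4aceb0073325b4ba39f6$737edcbc01a14c928f717215e5d23bf1
      ```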
    • On any of your existing Linux machines, install Metricbeat; do not configure it
    • Please follow the classroom video for steps.

