DevOps Classroom Series – 25/Jul/2020

Filter Plugins of Logstash

  • Filter plugins are used to perform transformations on the data flowing through the pipeline.

CSV Filter

  • This filter is useful for parsing .csv files.
  • Let's create a csv file (employees.csv):
Fname,Lname,Age,Salary,EmailId,Gender
Tony,Stark,38,900000,tony.stark@qt.com,m
wonder,women,28,400000,wonder.women@qt.com,f
captain,america,40,800000,captain.america@qt.com,m

input {
    file {
        path => "/home/ubuntu/employees.csv"
        start_position => "beginning"    # read the file from the top, not only new lines
    }
}
filter {
    csv {
        autodetect_column_names => true  # treat the first row as the header / field names
    }
}
output {
    stdout {
        codec => rubydebug               # pretty-print each parsed event
    }
}
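  • To try this, save the config to a file (say csv-pipeline.conf; the name is arbitrary) and run bin/logstash -f csv-pipeline.conf. Each row of employees.csv should appear on stdout as a structured event with Fname, Lname, Age, Salary, EmailId & Gender fields.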

Mutate filter

  • This filter is used to perform mutations on fields, such as renaming them, converting their types, and changing their case.
input {
    file {
        path => "/home/ubuntu/employees.csv"
        start_position => "beginning"
    }
}
filter {
    csv {
        autodetect_column_names => true
    }
    mutate {
        rename => {                   # rename fields
            "Fname" => "Firstname"
            "Lname" => "Lastname"
        }
        convert => {                  # change the data type of fields
            "Age" => "integer"
            "Salary" => "float"
        }
        uppercase => [ "Gender" ]     # uppercase the value of Gender
    }
}
output {
    stdout {
        codec => rubydebug
    }
}

Grok filter

  • This is a powerful & often used plugin for parsing unstructured data into structured data.
  • Grok works by matching a line against a pattern (based on regular expressions).
  • The general syntax of a grok pattern is
%{PATTERN:FIELDNAME}
%{PATTERN:FIELDNAME:type}
  • Logstash ships with about 120 patterns by default; these are reusable & extensible. For reference, they are maintained in the logstash-patterns-core repository. You can also create a custom pattern, e.g. USERNAME [a-zA-Z0-9._-]+
  • Experiment with the Grok Debugger before using a pattern.
  • A sample grok configuration
input {
    file {
        path => "/var/log/http.log"
    }
}
filter {
    grok {
        match => { "message" => "%{IP:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes} %{NUMBER:duration}" }
    }
}
output {
    stdout {
        codec => rubydebug
    }
}
  • From X-Pack 5.5 onwards, Kibana also ships with a Grok Debugger (under Dev Tools).

Date filter

  • This plugin is used to parse dates from fields, typically to use them as the event's timestamp or to store them in a separate field:
filter {
    date {
        match => ["timestamp", "dd/MMM/YYYY:HH:mm:ss Z"]
        target => "event_timestamp"
    }
}
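  • A minimal end-to-end sketch (the log path and pattern below are illustrative assumptions, not from the class): grok first pulls the bracketed timestamp out of an Apache-style access log line, and date then parses that string.
input {
    file {
        path => "/var/log/http.log"
        start_position => "beginning"
    }
}
filter {
    # HTTPDATE is a built-in grok pattern matching e.g. 25/Jul/2020:14:30:00 +0530
    grok {
        match => { "message" => "\[%{HTTPDATE:timestamp}\]" }
    }
    # parse the extracted string into the event_timestamp field
    date {
        match => ["timestamp", "dd/MMM/YYYY:HH:mm:ss Z"]
        target => "event_timestamp"
    }
}
output {
    stdout {
        codec => rubydebug
    }
}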

Geoip filter

  • Given an IP address, this filter adds the geographical location of that IP address to the event.

  • For example, an event with clientip => "84.134.13.13" passed through geoip { source => "clientip" } is enriched with a geoip object (see the sketch after this list):

    • geoip
      • timezone
      • ip
      • latitude
      • longitude
      • continent_code
      • country_name
      • region_name
      • city_name
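  • A minimal sketch of such a pipeline (the input file and the clientip field name are assumptions for illustration):
input {
    file {
        path => "/var/log/http.log"
        start_position => "beginning"
    }
}
filter {
    # extract the client IP from the log line
    grok {
        match => { "message" => "%{IP:clientip}" }
    }
    # look up the IP in the bundled GeoLite2 database and add a geoip field
    geoip {
        source => "clientip"
    }
}
output {
    stdout {
        codec => rubydebug
    }
}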

Beats

  • Beats are lightweight data shippers that are installed on servers to ship operational data to Elasticsearch.
  • The Beats framework is built on a common library called libbeat.
  • Elastic.co has built and maintained several beats
    • Filebeat
    • Packetbeat
    • Metricbeat
    • Heartbeat
    • Winlogbeat
    • Auditbeat
  • There are several other community beats
    • amazonbeat
    • mongobeat
    • httpbeat
    • nginxbeat
  • From any beat, the output can be sent either to Logstash (for further processing) or directly to Elasticsearch, as sketched below.
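  • A minimal sketch of a Logstash pipeline receiving events from a beat (port 5044 is the conventional beats port; the Elasticsearch host and index name below are assumptions):
input {
    beats {
        port => 5044                  # beats clients such as Filebeat connect here
    }
}
output {
    elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "beats-sample-%{+YYYY.MM.dd}"
    }
}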

Our Sample Log Pipeline

  • In addition to what is shown in the pipeline diagram, we would be using Metricbeat & Heartbeat.

Explore Kibana

  • To explore Kibana, we need a server with Elasticsearch & Kibana installed.
  • The issue we faced in class today was caused by a master_not_discovered_exception.
  • I have made changes in /etc/elasticsearch/elasticsearch.yml to fix this; a typical fix is sketched below.
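  • A typical single-node fix for master_not_discovered_exception looks like this (an assumption; this may not be the exact change made in class):
# /etc/elasticsearch/elasticsearch.yml
network.host: 0.0.0.0
node.name: node-1
# with a single node, tell Elasticsearch which node is allowed to become master
cluster.initial_master_nodes: ["node-1"]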
  • After restarting Elasticsearch, the setup is working.
