Lab Setup

Basic Architecture

Elasticsearch and Kibana Setup
- Create a VM with at least 4 GB of RAM
- Install Elasticsearch by following the instructions Refer Here
- To configure Elasticsearch Refer Here
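The Elasticsearch configuration step typically means editing /etc/elasticsearch/elasticsearch.yml. A minimal single-node sketch for a lab VM is below; all values are examples, not required settings, and binding to 0.0.0.0 is only reasonable in a throwaway lab:

```yaml
# /etc/elasticsearch/elasticsearch.yml -- minimal single-node lab example
cluster.name: elk-lab          # example name
node.name: node-1              # example name
network.host: 0.0.0.0          # listen on all interfaces (lab only)
http.port: 9200                # default HTTP port
discovery.type: single-node    # skip cluster formation for a one-node lab
```

Restart the service after editing (sudo systemctl restart elasticsearch) for the changes to take effect.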

- Now let's install Kibana Refer Here
- Now let's work with the Kibana configuration
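The Kibana configuration lives in /etc/kibana/kibana.yml. A minimal sketch, assuming Elasticsearch runs on the same VM (the hosts value is an example):

```yaml
# /etc/kibana/kibana.yml -- minimal lab example
server.port: 5601                                # default Kibana port
server.host: "0.0.0.0"                           # allow access from outside the VM (lab only)
elasticsearch.hosts: ["http://localhost:9200"]   # example: Elasticsearch on the same host
```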

- Now access Kibana

Let's create an Apache server and install Filebeat to ship the Apache logs to Logstash
- Create a free VM
- Install Apache2

sudo apt update
sudo apt install apache2 -y

- Now create a configuration file to test whether the logs can be processed by Logstash
input {
  stdin {}
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
output {
  stdout {}
}
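The grok filter above applies the COMBINEDAPACHELOG pattern to each input line and extracts named fields. As a rough illustration of what it pulls out, here is a simplified Python analogue; the regex is an approximation for demonstration, not the actual grok pattern:

```python
import re

# Simplified analogue of grok's COMBINEDAPACHELOG pattern.
# Capture-group names mirror the fields Logstash would add to the event.
COMBINED = re.compile(
    r'(?P<clientip>\S+) \S+ (?P<auth>\S+) \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<verb>\S+) (?P<request>\S+) (?P<httpversion>[^"]+)" '
    r'(?P<response>\d{3}) (?P<bytes>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

# A sample Apache combined-format access log line
line = ('127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] '
        '"GET /apache_pb.gif HTTP/1.0" 200 2326 '
        '"http://www.example.com/start.html" "Mozilla/4.08 [en] (Win98; I ;Nav)"')

m = COMBINED.match(line)
print(m.group('clientip'), m.group('verb'), m.group('response'))
# -> 127.0.0.1 GET 200
```

Stdin plus stdout makes a convenient smoke test: paste a line like the sample above into the running Logstash process and confirm the parsed fields appear in the console output.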
- Now let's change the output in the configuration Refer Here
input {
  stdin {}
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
output {
  elasticsearch {
    index => "apache-%{+yyyy.MM.dd}"
    hosts => "172.31.28.161"
  }
}
- Now let's install Beats, which can read logs from the local server and send them to Logstash.
- For this we will be installing Filebeat Refer Here
- Configure Filebeat to read the Apache access logs
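Configuring Filebeat means editing /etc/filebeat/filebeat.yml. A minimal sketch for reading the Apache access log and forwarding to Logstash; `<logstash-host>` is a placeholder for your Logstash server's address, and the default Elasticsearch output section must be commented out since only one output can be active:

```yaml
# /etc/filebeat/filebeat.yml -- read Apache access logs, forward to Logstash
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/apache2/access.log   # default Apache access log path on Ubuntu

# comment out the default output.elasticsearch section, then:
output.logstash:
  hosts: ["<logstash-host>:5044"]     # placeholder: your Logstash server
```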

- Now we need to change the input of the configuration to read from Beats. Filebeat will forward the logs to a port on Logstash (5044 by default). Refer Here
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
output {
  elasticsearch {
    index => "apache-%{+yyyy.MM.dd}"
    hosts => "172.31.28.161"
  }
}
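The index setting above uses a Joda-style date pattern, so each event is routed to a daily index based on its @timestamp. A small Python sketch of how that name resolves (yyyy.MM.dd corresponds to the strftime pattern below):

```python
from datetime import datetime, timezone

def index_for(ts: datetime) -> str:
    # Python equivalent of Logstash's index => "apache-%{+yyyy.MM.dd}":
    # the date pattern is resolved per event, producing one index per day.
    return ts.strftime("apache-%Y.%m.%d")

print(index_for(datetime(2023, 5, 4, tzinfo=timezone.utc)))  # -> apache-2023.05.04
```

Daily indices keep each index small and make retention easy: old days can be deleted as whole indices instead of deleting individual documents.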
- Next Steps:
- We need to configure Logstash to start automatically whenever the Linux machine starts
- We need to place the above configuration in the pipeline configuration folder (/etc/logstash/conf.d for package installs)
- Now we need to start Filebeat, which exports the logs to Logstash; Logstash adds the parsed fields and stores the events in Elasticsearch
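The next steps above can be sketched as the commands below, assuming Logstash and Filebeat were installed from the Elastic apt packages (which ship systemd units); the pipeline filename is an example:

```shell
# Place the pipeline config where Logstash looks for it by default
sudo cp apache.conf /etc/logstash/conf.d/apache.conf   # example filename

# Enable both services at boot and start them now
sudo systemctl enable logstash
sudo systemctl start logstash
sudo systemctl enable filebeat
sudo systemctl start filebeat

# Verify both services are running
sudo systemctl status logstash filebeat
```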
