How to Read Logs through Filebeat to Logstash and Store into Elasticsearch Index - Ubuntu OS

Prerequisites

  • Before starting, ensure you have installed the following tools, which we have discussed previously:

  1. Filebeat

  2. Logstash

  3. Elasticsearch

  4. Kibana

  • A sample log file (e.g., /var/log/sample.log)

  • Root (or sudo) access

  • Open ports:

  1. 5044 (for Filebeat → Logstash)

  2. 9200 (for the Elasticsearch API)

  3. 5601 (for Kibana)

Let's Create the /var/log/sample.log File

Step 1 : Create a sample log file and add some sample log lines to it for testing. For example:

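A minimal sketch of creating such a file from the shell; the log lines below are illustrative placeholders (the article's actual sample content is in the snapshot), and the file is written to the current directory here, whereas the article places it at /var/log/sample.log:

```shell
# Create a sample log file with a few illustrative entries for testing.
# (Written to ./sample.log here; the article uses /var/log/sample.log.)
cat <<'EOF' > sample.log
2024-05-01 10:15:32 INFO Application started successfully
2024-05-01 10:16:01 WARN Disk usage above 80 percent
2024-05-01 10:16:45 ERROR Failed to connect to database
2024-05-01 10:17:10 INFO Retrying database connection
EOF

# Show the file contents to confirm it was written
cat sample.log
```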

Step 2 : Allow the required port through the firewall with the following command:

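Assuming the port in question is 5044 (the Filebeat → Logstash port from the prerequisites), the ufw commands would look like this:

```shell
# Allow the Beats -> Logstash port through UFW (run as root or with sudo)
ufw allow 5044/tcp

# Verify the rule was added
ufw status
```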

Step 3 : Open the /etc/filebeat/filebeat.yml file for configurations.

Update the file to match the changes shown in my snapshots.

Open filebeat.yml using the command below.
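For example, with vim (any editor works):

```shell
vim /etc/filebeat/filebeat.yml
```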

Changes needed for Filebeat inputs

Don't use type: log; it is deprecated in newer Filebeat versions, so use type: filestream instead.

Changes for Elasticsearch Output


Changes for Logstash Output

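Taken together, the three changes to /etc/filebeat/filebeat.yml look roughly like this (hosts and paths are illustrative defaults; adjust them to your setup):

```yaml
filebeat.inputs:
  - type: filestream          # use filestream, not the deprecated "log" type
    id: sample-log-input      # filestream inputs need a unique id
    enabled: true
    paths:
      - /var/log/sample.log

# The Elasticsearch output must be commented out,
# because we ship to Logstash instead:
#output.elasticsearch:
#  hosts: ["localhost:9200"]

output.logstash:
  hosts: ["localhost:5044"]
```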

Save the /etc/filebeat/filebeat.yml configuration file.

Step 4 : Start the Filebeat service using systemctl and check its status

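The systemctl commands would look like this:

```shell
# Start Filebeat and check that it is running (run as root or with sudo)
systemctl start filebeat
systemctl status filebeat

# Optional: start Filebeat automatically on boot
systemctl enable filebeat
```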

Step 5 : Check whether the Filebeat configuration is correct using the command below

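Filebeat ships with a built-in configuration check:

```shell
# Validate /etc/filebeat/filebeat.yml; prints "Config OK" on success
filebeat test config
```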

The response must be Config OK. If not, please check the filebeat.yml configuration file.

Step 6 : Check the Filebeat output using the command below

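This command tests connectivity to the configured output (here, Logstash on port 5044):

```shell
# Check that Filebeat can reach the configured Logstash output
filebeat test output
```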

Let's Configure Logstash to Read Logs from Filebeat on Port 5044

Step 1 : Create a pipeline in the conf.d directory; use the command below to create one

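A minimal way to create the (still empty) pipeline file:

```shell
# Create an empty pipeline file in Logstash's conf.d directory
touch /etc/logstash/conf.d/demo.conf
```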

Step 2 : Add a demo.conf pipeline entry to the pipelines.yml file

Make the changes below.

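The entry in /etc/logstash/pipelines.yml would look roughly like this (the pipeline id "demo" is an illustrative choice):

```yaml
# /etc/logstash/pipelines.yml
- pipeline.id: demo
  path.config: "/etc/logstash/conf.d/demo.conf"
```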

Step 3 : Add the pipeline configuration code, which reads logs from Beats, parses them, and stores them in an Elasticsearch index

Open the demo.conf file using vim or another editor

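A sketch of such a pipeline: it listens for Beats on port 5044, parses each line with a Grok pattern, and writes to the sample-logs index. The Grok pattern here is an assumption matching a simple "timestamp level message" layout; adapt it to your actual log format:

```conf
input {
  beats {
    port => 5044              # Filebeat ships logs here
  }
}

filter {
  grok {
    # Assumed layout: "<ISO8601 timestamp> <LEVEL> <message>"
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:log_message}" }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "sample-logs"    # the index is created automatically on first write
  }
  stdout { codec => rubydebug }  # also print each event to the terminal
}
```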

Save your pipeline configuration code. If the file will not save, change the demo.conf file permissions for your user.

NOTE : I gave the owner, group, and others full read, write, and execute permissions (chmod 777). Granting all permissions to everyone is not good practice; please follow the principle of least privilege (IAM).

Now I have saved the demo.conf pipeline.

Step 4 : Check whether all of the Logstash configuration is OK
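For a package install, the configuration test typically looks like this (paths are the Debian/Ubuntu package defaults; prefix with sudo -u logstash if you hit permission errors):

```shell
# Validate the pipeline without actually starting Logstash
/usr/share/logstash/bin/logstash --path.settings /etc/logstash \
  -f /etc/logstash/conf.d/demo.conf --config.test_and_exit
```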

If you get a permission error like the one in the snapshot, re-run the test with the appropriate permissions using the command shown in the snapshot.

The configuration test result must be: Config Validation Result: OK. Exiting Logstash

Step 5 : Start Logstash using the command below

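The systemctl commands would look like this:

```shell
# Start Logstash and check that it is running (run as root or with sudo)
systemctl start logstash
systemctl status logstash
```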

Once Logstash is started, please restart Filebeat.

NOTE : The Logstash service does not run as root, so you need to use the command below.

One more thing is needed: the logstash user must have permission to write to the paths it uses. Use the command below to grant it.
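As an assumption about what the snapshot shows: when Logstash runs as the logstash user, it needs write access to its data directory, for example:

```shell
# Give the logstash user ownership of its data directory (package default path)
chown -R logstash:logstash /usr/share/logstash/data
```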

Run your pipeline using the command below, without making Logstash run as root.
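One way to do this is to run the pipeline as the logstash user via sudo (paths assume a package install):

```shell
# Run the pipeline as the logstash user instead of root
sudo -u logstash /usr/share/logstash/bin/logstash \
  --path.settings /etc/logstash -f /etc/logstash/conf.d/demo.conf
```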

Pipeline execution started.........


After running the pipeline, you will see this output in the terminal.


Now we have sent logs to the sample-logs index. One main point here:

We provided the index name in the Logstash pipeline output section, so the index will be created automatically.

Check whether our index was created
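You can also check from the terminal, assuming Elasticsearch is listening on localhost:9200:

```shell
# List the index; the "docs.count" column shows how many log events arrived
curl -XGET "http://localhost:9200/_cat/indices/sample-logs?v"
```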

Go to Kibana -> Management -> Stack Management -> Index Management -> Indices


Click on the sample-logs index.

The sample.log file data is loaded (79.17 KB).


Open the index in Discover.


Check whether the logs are parsed by the Grok pattern (our logs are parsed).


Conclusion

We demonstrated how to configure Filebeat to read logs and send them to Logstash, which processes and parses the logs before storing them in an Elasticsearch index on Ubuntu OS. By following these steps, you can efficiently collect, parse, and analyze logs in real time using the Elastic Stack.


SOC | SOC Platform | SIEM Engineer | Threat Intelligence | Threat hunting | IR | Detection Engineer | Cloud Security
