How to Read Logs through Filebeat to Logstash and Store Them in an Elasticsearch Index - Ubuntu OS
Prerequisites
Before starting, ensure you have installed the following tools, which we have discussed previously:
Filebeat
Logstash
Elasticsearch
Kibana
A sample log file (e.g., /var/log/sample.log)
You must be the root user (or have sudo privileges)
Open ports:
5044 (for Filebeat → Logstash)
9200 (for Elasticsearch API)
5601 (for Kibana)
Let's create the /var/log/sample.log file
Step 1 : Create a sample log file and add some sample log lines to it for testing. For example:
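The commands below are only an illustration; the exact log lines are not shown in this guide, so use whatever format you plan to parse later with Grok:

sudo touch /var/log/sample.log
echo '2024-01-15 10:12:01 INFO Application started successfully' | sudo tee -a /var/log/sample.log
echo '2024-01-15 10:12:05 ERROR Failed to connect to the database' | sudo tee -a /var/log/sample.log
cat /var/log/sample.log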
Step 2 : Allow port 5044 through the firewall with the following command:
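Assuming you are using UFW (the default firewall tool on Ubuntu), it would look something like this:

sudo ufw allow 5044/tcp
sudo ufw reload
sudo ufw status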
Step 3 : Open the /etc/filebeat/filebeat.yml file for configuration.
Make the changes shown below, as in my snapshot.
Open filebeat.yml using the command below:
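For example:

sudo vim /etc/filebeat/filebeat.yml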
Changes needed for Filebeat inputs
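A minimal sketch of the filebeat.inputs section, assuming the /var/log/sample.log path from Step 1 (on newer Filebeat versions the input type may be filestream instead of log):

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/sample.log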
Changes for Elasticsearch Output
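Because we are shipping to Logstash rather than directly to Elasticsearch, the Elasticsearch output should be disabled (commented out), roughly like this:

# output.elasticsearch:
#   hosts: ["localhost:9200"]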
Changes for Logstash Output
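The Logstash output is uncommented so Filebeat ships events to port 5044 (assuming Logstash runs on the same host):

output.logstash:
  hosts: ["localhost:5044"]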
Save the /etc/filebeat/filebeat.yml configuration file
Step 4 : Start the Filebeat service using systemctl and check its status
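For example:

sudo systemctl start filebeat
sudo systemctl enable filebeat
sudo systemctl status filebeat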
Step 5 : Check whether the Filebeat configuration is correct using the command below
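For example:

sudo filebeat test config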
The response must be Config OK. If not, please check the filebeat.yml configuration file.
Step 6 : Check the Filebeat output connection using the command below
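This verifies that Filebeat can reach the configured Logstash output on port 5044:

sudo filebeat test output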
Let's configure Logstash to read logs from Filebeat on port 5044
Step 1 : Create a pipeline in the conf.d directory; use the command below to create one
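For example:

sudo touch /etc/logstash/conf.d/demo.conf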
Step 2 : Add the demo.conf pipeline entry to the pipelines.yml file
Make the changes below:
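A minimal sketch of the entry in /etc/logstash/pipelines.yml; the pipeline id demo is an assumption here, only the demo.conf path comes from this guide:

- pipeline.id: demo
  path.config: "/etc/logstash/conf.d/demo.conf"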
Step 3 : Add the pipeline configuration that reads logs from Beats, parses them, and stores them in an Elasticsearch index
Open the demo.conf file using vim or another editor
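For example:

sudo vim /etc/logstash/conf.d/demo.conf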
demo.conf pipeline configuration
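A minimal sketch of the pipeline: it listens for Beats on port 5044, parses each line with a Grok pattern, and writes to the sample-logs index. The Grok pattern and the Elasticsearch host are assumptions; adjust them to your actual log format and cluster:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    # assumed pattern for lines like: 2024-01-15 10:12:01 INFO Application started successfully
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:log_message}" }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "sample-logs"
  }
}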
Save your pipeline configuration code. If the file cannot be saved, change the demo.conf file permissions according to your user.
NOTE : Here I am giving the owner, group, and others full read, write, and execute permissions (giving all permissions to everyone is not the best choice; please follow least-privilege / IAM practices)
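That corresponds to a command like:

sudo chmod 777 /etc/logstash/conf.d/demo.conf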
Now I have saved the demo.conf pipeline.
Step 4 : Check whether the whole Logstash configuration is OK
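One common way to run the configuration test, assuming the default package install paths:

/usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit -f /etc/logstash/conf.d/demo.conf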
If you get a permission error, run the configuration test with elevated privileges.
Use this one command:
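For example, re-running the same test with sudo:

sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit -f /etc/logstash/conf.d/demo.conf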
The configuration test result must be -> Config Validation Result: OK. Exiting Logstash
Step 5 : Start Logstash using the command below
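For example:

sudo systemctl start logstash
sudo systemctl enable logstash
sudo systemctl status logstash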
Once Logstash is started, please restart Filebeat.
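For example:

sudo systemctl restart filebeat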
NOTE : Logstash has started, but it does not run as root, so you need to use the commands below
One more thing is needed: the logstash user must have write permission on the directories Logstash writes to
Use the command below to grant that permission
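For example, making the logstash user the owner of its data and log directories (these are the default locations for the Ubuntu/Debian package; adjust if yours differ):

sudo chown -R logstash:logstash /var/lib/logstash /var/log/logstash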
Run your pipeline using the command below, without making Logstash run as the root user
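A sketch of running the pipeline as the logstash user instead of root (paths assume the default package locations):

sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash -f /etc/logstash/conf.d/demo.conf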
Pipeline execution started.........
After running the pipeline, you will see output like this in the terminal
Now we have sent the logs to the sample-logs index. One main point here:
We have provided the index name in the Logstash pipeline output section, so the index will automatically get created.
Check whether our index was created
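Optionally, you can also confirm this from the terminal with the Elasticsearch cat API (assuming Elasticsearch is listening on localhost:9200):

curl -X GET "http://localhost:9200/_cat/indices/sample-logs?v"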
Go to Kibana -> Management -> Stack Management -> Index Management -> Indices
Click on the sample-logs index
sample.log file data is loaded (79.17 KB).
Click on Discover to view the index
Check whether the logs are parsed by the Grok pattern (our logs are parsed)
Conclusion
We demonstrated how to configure Filebeat to read logs and send them to Logstash, which processes and parses the logs before storing them in an Elasticsearch index on Ubuntu OS. By following these steps, you can efficiently collect, parse, and analyze logs in real time using the Elastic Stack.