Atul Vishwakarma
Welcome to this comprehensive guide on setting up the ELK Stack (Elasticsearch, Logstash, Kibana) on AWS EC2 instances. If you're managing applications in the cloud, especially Java-based ones, efficient log monitoring is crucial for debugging, performance analysis, and security. The ELK Stack, combined with Filebeat, provides a powerful, open-source solution for collecting, processing, visualizing, and analyzing logs in real-time.
In this blog post, we'll walk through a step-by-step setup using Ubuntu-based EC2 instances. This tutorial is based on a practical example involving a Java application, but the principles apply broadly. We'll cover everything from infrastructure provisioning to creating dashboards in Kibana. By the end, you'll have a fully functional log monitoring system.
Prerequisites:
- An AWS account with access to EC2.
- Basic knowledge of SSH, Linux commands, and AWS networking (e.g., security groups).
- Three EC2 instances (t3.micro or similar): One for the ELK server (Elasticsearch, Logstash, Kibana), one for the client machine (hosting the app and Filebeat), and an optional web server for testing.
- Ensure ports like 9200 (Elasticsearch), 5044 (Logstash), and 5601 (Kibana) are open in your security groups.
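If you manage security groups from the command line, rules like the following can open those ports (a sketch using the AWS CLI; the group ID and CIDR ranges are placeholders to replace with your own values):
Code:
# Elasticsearch and Logstash should only be reachable from inside your VPC
aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 --protocol tcp --port 9200 --cidr 10.0.0.0/16
aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 --protocol tcp --port 5044 --cidr 10.0.0.0/16
# Kibana needs to be reachable from your browser
aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 --protocol tcp --port 5601 --cidr 0.0.0.0/0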
1. Overview of the ELK Stack
The ELK Stack is a collection of tools from Elastic:
- Elasticsearch: A search and analytics engine that stores and indexes your logs.
- Logstash: A data processing pipeline that ingests, transforms, and sends logs to Elasticsearch.
- Kibana: A web interface for visualizing and querying your logs.
- Filebeat: A lightweight shipper that forwards logs from your application servers to Logstash.
This setup allows you to centralize logs from distributed systems like EC2, parse them into structured data, and gain insights through dashboards.
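To make that concrete, here is an illustrative example (not taken from a real deployment) of how a raw application log line becomes a structured document once the Grok filter shown later in this guide has parsed it:
Code:
# Raw line as written by the application and shipped by Filebeat:
2024-05-01T10:15:30.123 INFO Application started successfully

# Simplified view of the structured document stored in Elasticsearch:
{
  "log_timestamp": "2024-05-01T10:15:30.123",
  "log_level": "INFO",
  "log_message": "Application started successfully"
}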
2. Infrastructure Setup
Launch three Ubuntu EC2 instances:
- ELK Server: Hosts the core ELK components. Assign a public IP for Kibana access.
- Client Machine: Runs your Java app and Filebeat. Use the ELK server's private IP for communication.
- Web Server (Optional): For simulating additional log sources.
Connect via SSH to each instance as the ubuntu user. Update packages on all machines:
Code:
sudo apt update
3. Step-by-Step Installation
Step 1: Install and Configure Elasticsearch (on ELK Server)
Elasticsearch is the backbone for storing logs.
Install Java (required for Elasticsearch and Logstash):
Code:
sudo apt update && sudo apt install openjdk-17-jre-headless -y
Install Elasticsearch:
Code:
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-7.x.list
sudo apt update
sudo apt install elasticsearch -y
Configure Elasticsearch: Edit /etc/elasticsearch/elasticsearch.yml:
Code:
sudo vi /etc/elasticsearch/elasticsearch.yml
Add or modify:
Code:
network.host: 0.0.0.0
cluster.name: my-cluster
node.name: node-1
discovery.type: single-node
Start and enable the service:
Code:
sudo systemctl start elasticsearch
sudo systemctl enable elasticsearch
sudo systemctl status elasticsearch
Verify it's running:
Code:
curl -X GET "http://localhost:9200"
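If Elasticsearch is up, the response should be a small JSON document roughly like the one below (abridged; the name, cluster name, and version number will reflect your own configuration and installed release):
Code:
{
  "name" : "node-1",
  "cluster_name" : "my-cluster",
  "version" : {
    "number" : "7.17.0",
    ...
  },
  "tagline" : "You Know, for Search"
}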
Step 2: Install and Configure Logstash (on ELK Server)
Logstash processes incoming logs.
Install Logstash:
Code:
sudo apt install logstash -y
Configure Logstash: Edit /etc/logstash/conf.d/logstash.conf:
Code:
sudo vi /etc/logstash/conf.d/logstash.conf
Add:
Code:
input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{LOGLEVEL:log_level} %{GREEDYDATA:log_message}" }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "logs-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}
This config accepts logs from Beats (like Filebeat), parses them using Grok, and stores them in Elasticsearch.
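Before starting the service, you can optionally validate the pipeline syntax with Logstash's built-in config test (the path shown assumes the default apt install location; the JVM startup means this takes a minute):
Code:
sudo /usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/logstash.conf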
Start and enable the service:
Code:
sudo systemctl start logstash
sudo systemctl enable logstash
sudo systemctl status logstash
Allow traffic on port 5044:
Code:
sudo ufw allow 5044/tcp
Step 3: Install and Configure Kibana (on ELK Server)
Kibana provides the UI for log analysis.
Install Kibana:
Code:
sudo apt install kibana -y
Configure Kibana: Edit /etc/kibana/kibana.yml:
Code:
sudo vi /etc/kibana/kibana.yml
Modify:
Code:
server.host: "0.0.0.0"
elasticsearch.hosts: ["http://localhost:9200"]
Start and enable the service:
Code:
sudo systemctl start kibana
sudo systemctl enable kibana
sudo systemctl status kibana
Allow traffic on port 5601:
Code:
sudo ufw allow 5601/tcp
Access the Kibana dashboard: Open a browser and navigate to http://<ELK_Server_Public_IP>:5601.
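If the page does not load, it's worth confirming locally on the ELK server that Kibana is responding before suspecting the security group (the /api/status endpoint used here is assumed to be available in this 7.x setup):
Code:
curl -s http://localhost:5601/api/status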
Step 4: Install and Configure Filebeat (on Client Machine)
Filebeat ships logs from your app to Logstash.
Install Filebeat:
Code:
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-7.x.list
sudo apt update
sudo apt install filebeat -y
Configure Filebeat: Edit /etc/filebeat/filebeat.yml:
Code:
sudo vi /etc/filebeat/filebeat.yml
Modify:
Code:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /home/ubuntu/JavaApp/target/app.log

output.logstash:
  hosts: ["<ELK_Server_Private_IP>:5044"]
Start and enable the service:
Code:
sudo systemctl start filebeat
sudo systemctl enable filebeat
sudo systemctl status filebeat
Verify Filebeat:
Code:
sudo filebeat test output
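You can also ask Filebeat to validate its configuration file itself; both checks should pass before you go looking for data in Kibana:
Code:
sudo filebeat test config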
Step 5: Deploy a Java Application and Generate Logs (on Client Machine)
To test, we'll use a sample Java app.
Install Java if needed:
Code:
sudo apt install openjdk-17-jre-headless -y
Download and run a sample app:
Code:
mkdir -p /home/ubuntu/JavaApp/target
wget https://repo1.maven.org/maven2/org/springframework/boot/spring-boot-sample-simple/1.4.2.RELEASE/spring-boot-sample-simple-1.4.2.RELEASE.jar -O app.jar
nohup java -jar app.jar > /home/ubuntu/JavaApp/target/app.log 2>&1 &
Generate test logs:
Code:echo "Test log entry $(date)" >> /home/ubuntu/JavaApp/target/app.log
Step 6: View and Analyze Logs in Kibana
- In Kibana, go to Discover.
- Select the logs-* index pattern.
- Search for logs from your app, e.g., log.file.path: "/home/ubuntu/JavaApp/target/app.log".
- View parsed fields like log_timestamp, log_level, and log_message.
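A few example searches for the Kibana query bar (KQL syntax, using the field names produced by the Grok filter; adjust to your own data):
Code:
log_level : "ERROR"
log_level : ("WARN" or "ERROR") and log_message : *exception*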

Create visualizations:
- Pie Chart: For log level distribution.
- Line Chart: For logs over time.
- Data Table: For a structured log view.
Build a dashboard:
- Go to Dashboard > Create Dashboard.
- Add your visualizations.
- Save as "Java Application Log Monitoring".
Conclusion
Congratulations! You've set up the ELK Stack on AWS EC2, integrated Filebeat for log shipping, and created a dashboard for real-time monitoring. This setup scales well: add more Filebeat instances for multi-app environments. For production, consider security enhancements like SSL, authentication, and backups.
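As a rough starting point for that hardening on the 7.x packages used here (a sketch only, not a complete production checklist), you can enable X-Pack security in Elasticsearch and then set passwords for the built-in users:
Code:
# In /etc/elasticsearch/elasticsearch.yml:
xpack.security.enabled: true

# Generate passwords for the built-in users (elastic, kibana_system, ...):
sudo /usr/share/elasticsearch/bin/elasticsearch-setup-passwords interactive

# Then point Kibana at the kibana_system credentials in /etc/kibana/kibana.yml:
# elasticsearch.username: "kibana_system"
# elasticsearch.password: "<the password you set above>"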
If you encounter issues, check service logs with journalctl or consult Elastic's documentation. Happy logging!
Support My Work
If you found this guide helpful, consider buying me a coffee to support my work!