    How to Integrate Kibana for Seamless ECS Log Monitoring


    In today’s digital landscape, monitoring and analyzing application logs is crucial for understanding system performance, debugging issues, and improving overall reliability. When working with Amazon ECS (Elastic Container Service), you can centralize your log management using the ELK stack—Elasticsearch, Logstash, and Kibana. In this comprehensive guide, we will walk you through setting up ECS logs for direct integration with Kibana using both the AWS Management Console and Terraform.


    Table of Contents

    • Understanding the ELK Stack and ECS Logs
      • What is the ELK Stack?
      • Benefits of Centralizing Logs with Kibana
      • Logging Options in ECS
    • Adding the ELK Stack to ECS
      • Setting Up the ELK Stack
        • Using AWS Console
        • Using Terraform
      • Deploy Fluent Bit to ECS
    • Setting Up ECS Logs with Kibana (AWS Console and Terraform)
      • Configure the Elasticsearch Domain
        • Using AWS Console
        • Using Terraform
      • Deploy Fluent Bit as a Sidecar in ECS
        • Using AWS Console
        • Using Terraform
      • Verify Logs in Kibana
    • Conclusion
    • References

    Understanding the ELK Stack and ECS Logs


    What is the ELK Stack?

    The ELK stack is a powerful open-source platform designed for searching, analyzing, and visualizing log data. It consists of:

    1. Elasticsearch: A distributed search and analytics engine for storing and querying logs.
    2. Logstash: A data processing pipeline that ingests, transforms, and forwards logs to Elasticsearch.
    3. Kibana: A visualization tool for exploring and analyzing data stored in Elasticsearch.

    By integrating ECS logs with Kibana, you can monitor containerized applications effectively, troubleshoot issues in real-time, and gain actionable insights from your data.

    Benefits of Centralizing Logs with Kibana

    • Improved Visibility: Centralize logs from all ECS tasks for seamless monitoring.
    • Powerful Analytics: Use Kibana’s dashboards to track performance trends and anomalies.
    • Simplified Debugging: Quickly identify issues by querying logs with Elasticsearch.
    • Scalability: Handle large volumes of logs without compromising performance.

    Logging Options in ECS

    Amazon ECS supports multiple logging drivers, including AWS CloudWatch Logs, Fluentd, and Splunk. To set up Kibana integration, we’ll use Fluent Bit, a lightweight log processor compatible with Elasticsearch.


    Adding the ELK Stack to ECS

    Setting Up the ELK Stack

    Using AWS Console

    1. Elasticsearch Domain:
      • Open the Amazon OpenSearch Service console and select Create domain.
      • Choose a deployment type and configure the instance size, storage, and access policies.
      • Enable a public access endpoint or restrict it to your VPC.
    2. Logstash:
      • Deploy a Logstash instance in your VPC. Use an EC2 instance and install Logstash with the necessary plugins for Elasticsearch output.
      • Configure a pipeline to receive logs from Fluent Bit and forward them to Elasticsearch (a sample pipeline is sketched after this list).
    3. Kibana:
      • Access the Kibana endpoint provided by OpenSearch Service.
      • Set up your index patterns and start building dashboards.
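
    If you route logs through Logstash before they reach Elasticsearch, the snippet below is a minimal sketch of that hand-off. It assumes Fluent Bit posts records to Logstash over HTTP on port 8080 (Fluent Bit's http output can do this) and that the placeholder is replaced with your domain's endpoint; the file path and index name are illustrative.

    # /etc/logstash/conf.d/ecs-logs.conf (illustrative path)
    input {
      http {
        port => 8080               # Fluent Bit's http output posts log records here
      }
    }

    output {
      elasticsearch {
        hosts => ["https://<Elasticsearch endpoint>:443"]
        index => "ecs-logs-%{+YYYY.MM.dd}"
      }
    }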

    Using Terraform

    You can provision the ELK stack using Terraform. Here is an example script:

    resource "aws_opensearch_domain" "elk" {
      domain_name           = "elk-stack"
      elasticsearch_version = "7.10"
    
      cluster_config {
        instance_type  = "m5.large.search"
        instance_count = 3
      }
    
      ebs_options {
        ebs_enabled = true
        volume_size = 20
      }
    
      access_policies = <<POLICY
    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Principal": "*",
          "Action": "es:*",
          "Resource": "arn:aws:es:region:account-id:domain/elk-stack/*"
        }
      ]
    }
    POLICY
    }
    
    resource "aws_instance" "logstash" {
      ami           = "ami-12345678"
      instance_type = "t2.medium"
      user_data     = <<EOF
    #!/bin/bash
    sudo apt update && sudo apt install -y logstash
    EOF
    }
    

    Run terraform apply to deploy the ELK stack resources.
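
    If you are new to Terraform, the full workflow is a short sequence; this assumes the resources above are saved in the current working directory:

    terraform init    # download the AWS provider
    terraform plan    # review the changes before creating anything
    terraform apply   # create the OpenSearch domain and the Logstash instance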


    Deploy Fluent Bit to ECS

    Fluent Bit runs as a sidecar container alongside your application containers. Detailed configuration and deployment steps are covered in the "Deploy Fluent Bit as a Sidecar in ECS" section below.



    Setting Up ECS Logs with Kibana (AWS Console and Terraform)

    Configure the Elasticsearch Domain

    Using AWS Console

    1. Open the Amazon OpenSearch Service console.
    2. Create a new domain by selecting Create domain.
      • Choose a deployment type (e.g., Development and testing).
      • Configure instance types and storage as needed.
      • Enable a public access endpoint or restrict access to your VPC.
    3. Note down the Elasticsearch endpoint and Kibana URL.
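
    You can also look the endpoint up from the CLI once the domain is active. A sketch, assuming a public (non-VPC) domain named ecs-logs-domain and AWS CLI v2:

    aws opensearch describe-domain \
      --domain-name ecs-logs-domain \
      --query 'DomainStatus.Endpoint' \
      --output text

    For a domain running the Elasticsearch 7.10 engine, Kibana is served at https://<Elasticsearch endpoint>/_plugin/kibana/.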

    Using Terraform

    Here’s an example Terraform script to create an Elasticsearch domain:

    resource "aws_opensearch_domain" "ecs_logging" {
      domain_name           = "ecs-logs-domain"
      elasticsearch_version = "7.10"
    
      cluster_config {
        instance_type = "t3.small.search"
        instance_count = 2
      }
    
      ebs_options {
        ebs_enabled = true
        volume_size = 10
      }
    
      access_policies = <<POLICY
    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Principal": "*",
          "Action": "es:*",
          "Resource": "arn:aws:es:region:account-id:domain/ecs-logs-domain/*"
        }
      ]
    }
    POLICY
    }
    

    Run terraform apply to provision the domain.


    Deploy Fluent Bit as a Sidecar in ECS

    Using AWS Console

    1. Open the ECS console and select your cluster.
    2. Edit your task definition to add a Fluent Bit container.
      • Use the official Fluent Bit image: fluent/fluent-bit:latest.
      • Configure environment variables for the Elasticsearch connection (a sample output configuration that consumes them follows this list):
        FLUENT_ELASTICSEARCH_HOST=<Elasticsearch endpoint>
        FLUENT_ELASTICSEARCH_PORT=443
        FLUENT_ELASTICSEARCH_TLS=true
        FLUENT_ELASTICSEARCH_USER=<username>
        FLUENT_ELASTICSEARCH_PASSWORD=<password>
      • Mount the task’s log files to Fluent Bit using volumes.
    3. Update the logging configuration of your main container to send logs to the Fluent Bit sidecar.
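
    Note that the stock fluent/fluent-bit image does not act on these variables by itself; they are read by whatever configuration file you mount into the container. A minimal output section consuming them, with an illustrative index name:

    [OUTPUT]
        Name                es
        Match               *
        Host                ${FLUENT_ELASTICSEARCH_HOST}
        Port                ${FLUENT_ELASTICSEARCH_PORT}
        HTTP_User           ${FLUENT_ELASTICSEARCH_USER}
        HTTP_Passwd         ${FLUENT_ELASTICSEARCH_PASSWORD}
        tls                 On
        Index               ecs-logs
        Suppress_Type_Name  On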

    Using Terraform

    Here’s how to configure Fluent Bit as a FireLens log router in your ECS task definition, with the application container’s logs routed to it through the awsfirelens log driver:

    resource "aws_ecs_task_definition" "ecs_task" {
      family                   = "ecs-logging-task"
      container_definitions = jsonencode([
        {
          "name": "app-container",
          "image": "<your-app-image>",
          "logConfiguration": {
            "logDriver": "awslogs",
            "options": {
              "awslogs-group": "/ecs/ecs-app-logs",
              "awslogs-region": "<region>",
              "awslogs-stream-prefix": "ecs"
            }
          }
        },
        {
          "name": "fluent-bit",
          "image": "fluent/fluent-bit:latest",
          "environment": [
            { "name": "FLUENT_ELASTICSEARCH_HOST", "value": "<Elasticsearch endpoint>" },
            { "name": "FLUENT_ELASTICSEARCH_PORT", "value": "443" },
            { "name": "FLUENT_ELASTICSEARCH_TLS", "value": "true" }
          ]
        }
      ])
    }
    

    Run terraform apply to update the ECS task definition.
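
    The task definition still has to be launched by a service (or a standalone task) before any logs flow. A minimal sketch, assuming an existing cluster declared elsewhere as aws_ecs_cluster.main and the EC2 launch type; the names are illustrative:

    resource "aws_ecs_service" "app" {
      name            = "ecs-logging-service"
      cluster         = aws_ecs_cluster.main.id
      task_definition = aws_ecs_task_definition.ecs_task.arn
      desired_count   = 1
      launch_type     = "EC2"
    }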


    Verify Logs in Kibana

    1. Open Kibana using the URL provided by your Elasticsearch domain.
    2. Create an index pattern matching your log index (e.g., ecs-logs-*).
    3. Use Kibana’s Discover tab to explore the ingested logs.
    4. Build custom dashboards for real-time monitoring and analytics.
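
    If Discover shows no documents, it can help to confirm that the indices exist before building dashboards. A quick check, assuming fine-grained access control with a master user (replace the credentials and endpoint):

    curl -u '<username>:<password>' \
      'https://<Elasticsearch endpoint>/_cat/indices/ecs-logs-*?v'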

    Conclusion

    By following this guide, you’ve set up a robust logging pipeline to centralize ECS logs in Kibana, leveraging the ELK stack’s capabilities. Whether you used the AWS Management Console or Terraform, this setup ensures scalable log management and visualization for your containerized workloads. With Kibana dashboards, you can gain valuable insights, enhance system reliability, and streamline debugging efforts.


    References

    1. Amazon ECS Documentation
    2. Amazon OpenSearch Service Documentation
    3. Fluent Bit Documentation
    4. Terraform AWS Provider
    5. Kibana Documentation