Automation in Action: Using GitHub Actions and Ansible to Set Up an Elastic Stack

This week, I made some solid progress with my homelab. While it may not have been the flashiest update, I deployed Elasticsearch on a Proxmox VM and set up Kibana and Logstash as containers using GitHub Actions. It brings my monitoring stack closer to fully up and running, and me one step closer to wrapping up my homelab setup.

In the grand scheme of my homelab, setting up the Elastic Stack (Elasticsearch, Logstash, and Kibana) is crucial for centralized logging and monitoring. I wanted to integrate these services into my existing setup without complicating things too much, so I decided to deploy Elasticsearch on a Proxmox VM. For Kibana and Logstash, I opted to run them as containers using GitHub Actions for automation. This fits into my larger homelab plan, where each component gets deployed in a logical order to avoid unnecessary bottlenecks.

The deployment process was fairly straightforward. I started with deploying the Elasticsearch VM in Proxmox using an Ansible playbook, ensuring I had the right amount of resources allocated to it. Then, I moved on to Kibana and Logstash, choosing to deploy these as containers using GitHub Actions. This automation saves me time in the long run, as I can easily manage updates and deployments with minimal manual intervention.
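To give a sense of how these two pieces fit together, here's a minimal sketch of what a workflow tying them together might look like. The workflow name, file paths, runner label, and compose file are placeholder assumptions, not my actual setup:

```yaml
# Hypothetical GitHub Actions workflow sketch: runs on a self-hosted runner
# inside the homelab, provisions the VM with Ansible, then starts the containers.
# Paths and labels below are illustrative placeholders.
name: deploy-elastic-stack
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v4
      - name: Provision the Elasticsearch VM
        run: ansible-playbook playbooks/elasticsearch-vm.yml
      - name: Start Kibana and Logstash containers
        run: docker compose -f elastic/docker-compose.yml up -d
```

With something like this in place, pushing a config change to the repo is enough to roll out an update, which is where the time savings come from.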

In addition to deploying the Elastic Stack, I’ve also been refactoring some of my other services that were deployed using Docker Compose. Specifically, I updated their configurations to use Docker’s GELF logging driver so each container ships its logs to Logstash. Here’s a quick snippet of the configuration I’ve added to my other containers:

```yaml
logging:
  driver: "gelf"
  options:
    gelf-address: "udp://${LOGSTASH_HOST}:${LOGSTASH_PORT}"
    tag: "pgadmin"
```

This ensures that logs from my containers (such as pgAdmin, Jenkins agents, etc.) are sent directly to Logstash for processing, which then feeds into Elasticsearch. It's a seamless integration that will help me centralize logging across multiple services, making it easier to monitor and troubleshoot in the future.
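For the receiving end of that pipeline, Logstash needs to expose a GELF input that the other containers can reach. Here's a hedged sketch of what that might look like as a compose fragment; the image tag, port, and volume path are assumptions, not my actual configuration:

```yaml
# Hypothetical compose fragment for the Logstash side of the pipeline.
# The mounted pipeline directory would contain a config with a gelf { } input
# (UDP 12201 is the conventional GELF port) and an elasticsearch { } output.
services:
  logstash:
    image: docker.elastic.co/logstash/logstash:8.14.0  # version is a placeholder
    ports:
      - "12201:12201/udp"  # matches ${LOGSTASH_PORT} in the logging driver config
    volumes:
      - ./pipeline:/usr/share/logstash/pipeline
```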

While it may not have been the most exciting week of deployments, every step I take gets me closer to my ultimate goal: a fully automated and efficient homelab. The Elasticsearch, Logstash, and Kibana stack is a key part of my logging infrastructure, and the automation with GitHub Actions is helping to streamline my workflows. Next week, I’ll be diving into setting up Prometheus, Mimir, and Grafana, so stay tuned!