This project collects system metrics (CPU, memory, disk usage) from multiple remote servers via SSH and logs the data to Logstash for centralized analysis.
- `remote_monitor.py` – Python script for collecting remote system metrics and logging them to Logstash.
- `.env` – Environment variables configuration file for setting up the script.
- `requirements.txt` – Required Python packages for running the script.
- `Dockerfile` – Docker configuration to containerize the script.
- `README.md` – Documentation for setup, usage, and deployment.
- Reads a list of remote servers from the `.env` file.
- Uses SSH with an RSA private key to access each server.
- Executes system monitoring commands remotely.
- Collects CPU, memory, and disk usage metrics.
- Logs metrics to Logstash for centralized monitoring.
- Runs only once per execution.
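The collection steps above can be sketched as follows. This is a hedged illustration, not the actual `remote_monitor.py`: the README does not show the remote commands, so `top`/`free`/`df` are assumptions, and the SSH transport (e.g. paramiko with the RSA key) is abstracted behind the `run_remote` callable.

```python
# Hypothetical sketch of the per-host collection step; command choices and
# helper names are assumptions, not taken from remote_monitor.py.

def parse_cpu(top_line: str) -> float:
    # e.g. "%Cpu(s):  3.2 us,  1.1 sy, ..." -> user + system usage as a float
    fields = top_line.split(":", 1)[1].split(",")
    us = float(fields[0].split()[0])
    sy = float(fields[1].split()[0])
    return us + sy

def parse_memory(free_line: str) -> float:
    # e.g. "Mem:  16000  4000  12000" (total, used, ...) -> percent used
    parts = free_line.split()
    total, used = float(parts[1]), float(parts[2])
    return round(100.0 * used / total, 1)

def parse_disk(df_line: str) -> str:
    # e.g. "/dev/sda1  50G  20G  28G  42% /" -> "42%" (kept as a string,
    # matching the Logstash pipeline described later in this README)
    return df_line.split()[4]

def collect(host: str, run_remote) -> dict:
    # run_remote(command) -> stdout string; in the real script this would
    # execute over an SSH channel authenticated with the RSA private key.
    return {
        "host": host,
        "cpu_usage": parse_cpu(run_remote("top -bn1 | grep '%Cpu'")),
        "memory_usage": parse_memory(run_remote("free -m | grep Mem")),
        "disk_usage": parse_disk(run_remote("df -h / | tail -1")),
    }
```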
```
docker build -t remote-monitor .
```

Ensure `.env` is configured, then execute:

```
pip install -r requirements.txt
python remote_monitor.py
```

Use the `.env` file and run the container:

```
docker run --rm --env-file .env remote-monitor
```

To run the container every hour, schedule a cron job on the host system by appending the following line to the crontab:

```
0 * * * * docker run --rm --env-file /path/to/.env remote-monitor
```

- `0 * * * *` → Runs at the start of every hour.
- `docker run --rm` → Runs the container and cleans up after execution.
- `--env-file /path/to/.env` → Uses environment variables for execution.
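The `--env-file` flag makes the variables documented just below available inside the container; the script can then read them from the process environment. A minimal loading sketch (the helper name and dictionary layout are illustrative, not taken from `remote_monitor.py`):

```python
import os

def load_config() -> dict:
    # Variable names match the .env reference in this README;
    # REMOTE_HOSTS is a comma-separated list of hostnames.
    return {
        "logstash_host": os.environ["LOGSTASH_HOST"],
        "logstash_port": int(os.environ.get("LOGSTASH_PORT", "5044")),  # default per README
        "private_key_path": os.environ["PRIVATE_KEY_PATH"],
        "username": os.environ["USERNAME"],
        "remote_hosts": [h.strip() for h in os.environ["REMOTE_HOSTS"].split(",") if h.strip()],
    }
```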
Define these values in the `.env` file:

```
LOGSTASH_HOST=<LOGSTASH_SERVER_HOSTNAME>   # Logstash server hostname or IP
LOGSTASH_PORT=<LOGSTASH_SERVER_PORT>       # Logstash server port. Default: 5044
PRIVATE_KEY_PATH=/path/to/key.pem          # Path to RSA private key for SSH authentication
USERNAME=<USERNAME>                        # SSH username for remote servers
REMOTE_HOSTS=host1.example.com,host2.example.com,host3.example.com   # Comma-separated list of remote servers
```

- Receives Logs: Listens on TCP port 5044 (matching the Python script's Logstash handler).
- Parses JSON: Uses the JSON codec to process structured logs.
- Formats Data:
  - Converts CPU and memory usage to float.
  - Stores disk usage as a string (since it may contain `%`).
- Stores in Elasticsearch:
  - Sends parsed logs to an Elasticsearch cluster.
  - Uses daily index naming (`remote-system-metrics-YYYY.MM.dd`).
  - Supports authentication if required.
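A pipeline matching the behavior described above might look like the following sketch. It is not the project's actual configuration: the Elasticsearch host and the commented-out credentials are placeholders.

```
input {
  tcp {
    port  => 5044
    codec => json          # parse each incoming line as a JSON event
  }
}

filter {
  mutate {
    convert => {
      "cpu_usage"    => "float"
      "memory_usage" => "float"
    }
    # disk_usage is left as a string because it may contain "%"
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]            # placeholder host
    index => "remote-system-metrics-%{+YYYY.MM.dd}"   # daily index naming
    # user     => "elastic"     # enable if the cluster requires auth
    # password => "changeme"
  }
}
```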