# Log Monitoring System

Kafka + Redis + Spring Boot — stream, filter, and display critical logs in real time.
```
┌──────────────┐    Kafka "logs"     ┌──────────────┐        ┌───────┐
│ log-producer │ ───────────────────▶│ log-consumer │───────▶│ Redis │
│ (simulator)  │  topic (3 part.)    │   (filter)   │        └───┬───┘
└──────────────┘                     └──────┬───────┘            │
                                            │                    │
                     REST API + Dashboard ◀─┴────────────────────┘
                     http://localhost:8081
```
## Prerequisites

- Java 17+
- Maven 3.9+
- Docker & Docker Compose
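The broker and cache come from the repository's `docker-compose.yml` (Kafka in KRaft mode plus Redis). As a rough sketch of what such a file can look like — image tags and port mappings here are assumptions, not the repo's actual contents:

```yaml
# Minimal sketch only — the repository ships its own docker-compose.yml.
# The apache/kafka image runs a single-node KRaft broker by default.
services:
  kafka:
    image: apache/kafka:3.7.0   # assumed tag
    ports:
      - "9092:9092"
  redis:
    image: redis:7-alpine       # assumed tag
    ports:
      - "6379:6379"
```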
## Quick Start

Start the infrastructure, build all modules, then run the producer:

```bash
docker compose up -d
mvn clean install
mvn -pl log-producer spring-boot:run
```

In a second terminal, start the consumer:

```bash
mvn -pl log-consumer spring-boot:run
```

Then open http://localhost:8081 in your browser.
API endpoints:

- `GET /api/v1/logs/critical?limit=50` — latest critical logs
- `GET /api/v1/logs/stats` — error counts per service
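Stripped of the Spring and Redis plumbing, the per-service aggregation behind the stats endpoint is a plain group-and-count. A sketch under assumed names (`StatsSketch` and `errorCountsByService` are illustrative, not types from the repo):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class StatsSketch {

    // Hypothetical: tally occurrences per service name, the shape of data
    // /api/v1/logs/stats could return as JSON.
    static Map<String, Long> errorCountsByService(List<String> serviceNames) {
        return serviceNames.stream()
                .collect(Collectors.groupingBy(s -> s, Collectors.counting()));
    }

    public static void main(String[] args) {
        var counts = errorCountsByService(List.of("auth", "auth", "payment"));
        System.out.println(counts.get("auth")); // prints "2"
    }
}
```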
## Project Structure

```
log-monitoring-system/
├── docker-compose.yml           # Kafka (KRaft) + Redis
├── pom.xml                      # Parent POM
├── log-common/                  # Shared domain model
│   └── domain/
│       ├── LogEvent.java        # Immutable value object (record)
│       └── LogLevel.java        # DEBUG → CRITICAL enum
├── log-producer/                # Kafka publisher + simulator
│   ├── application/             # Port: LogEventPublisher
│   ├── infrastructure/kafka/    # Adapter: KafkaLogEventPublisher
│   └── simulation/              # Driving adapter: LogSimulationRunner
└── log-consumer/                # Kafka listener + Redis store + REST
    ├── application/             # Ports: CriticalLogRepository, LogFilterService
    ├── infrastructure/kafka/    # Adapter: KafkaLogEventConsumer
    ├── infrastructure/redis/    # Adapter: RedisCriticalLogRepository
    └── web/                     # REST controller + static dashboard
```
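The shared domain model in `log-common` can be pictured as below. This is a hedged sketch: only the facts that `LogEvent` is a record and `LogLevel` runs DEBUG → CRITICAL come from the tree above; the field names (`serviceName`, `message`, `timestamp`) and the `atLeast` helper are assumptions.

```java
import java.time.Instant;

public class DomainSketch {

    // Ordinal order doubles as severity order, so levels can be compared.
    enum LogLevel { DEBUG, INFO, WARN, ERROR, CRITICAL }

    // Immutable value object: records give equals/hashCode/toString for free.
    record LogEvent(String serviceName, LogLevel level, String message, Instant timestamp) {

        // A consumer-side filter could apply a severity threshold like this.
        boolean atLeast(LogLevel threshold) {
            return level.compareTo(threshold) >= 0;
        }
    }

    public static void main(String[] args) {
        LogEvent event = new LogEvent("payment-service", LogLevel.CRITICAL,
                "DB connection pool exhausted", Instant.now());
        System.out.println(event.atLeast(LogLevel.ERROR)); // prints "true"
    }
}
```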
## Kafka Concepts

| Concept   | Where |
|-----------|-------|
| Topic     | `logs` — auto-created with 3 partitions |
| Producer  | `KafkaLogEventPublisher` sends keyed JSON messages |
| Consumer  | `KafkaLogEventConsumer` in group `log-monitor-group` |
| Partition | Keyed by `serviceName` → same service, same partition |
| Offset    | `auto-offset-reset: earliest` in consumer config |
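The "same key, same partition" guarantee can be illustrated with a toy partitioner. Kafka's real default partitioner hashes the serialized key with murmur2; the `hashCode`-based stand-in below is not Kafka's algorithm, only a demonstration of the invariant that equal keys always map to the same partition.

```java
public class PartitionSketch {

    // Deterministic hash of the message key, modulo the partition count.
    static int partitionFor(String key, int numPartitions) {
        // Mask off the sign bit so the modulo result is never negative.
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        // Every event keyed by "payment-service" lands on the same one of
        // 3 partitions, which preserves per-service ordering.
        System.out.println(partitionFor("payment-service", 3)
                == partitionFor("payment-service", 3)); // prints "true"
    }
}
```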