SmartPosts is an AI-assisted publishing platform that combines a React front end, a Go REST API, and a Python gRPC microservice to deliver intelligent Q&A experiences on user-generated posts. The system persists content in MySQL, uses Redis for caching and hot-post leaderboards, and streams answers generated by the AI service back to clients via Server-Sent Events.
| Component | Stack | Responsibilities |
|---|---|---|
| `frontend/` | React 19, Vite, Tailwind CSS | SPA for browsing posts, authentication, voting, and consuming AI answers. Served behind Nginx, which proxies `/api` to the backend. |
| `backend/` | Go 1.25, Gin, GORM, gRPC client | REST API, JWT auth, post CRUD, voting, SSE streaming, Redis caching, RabbitMQ publisher, gRPC client for the AI service. |
| `backend/cmd/worker` | Go worker | Consumes the RabbitMQ `rank.update` queue and updates the Redis sorted set of hot posts. |
| `ai-service/` | Python 3.13, LangChain, gRPC | Streams RAG responses using OpenRouter-hosted LLMs; exposes the gRPC service defined in `protos/rag_agent.proto`. |
| Infrastructure | MySQL 8, Redis 8, RabbitMQ 4 | Managed via Docker Compose with persistent volumes and health checks. |
Additional configuration lives in `backend/configs/config.yml` and `frontend/nginx/nginx.conf`.
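The ranking worker's job can be sketched without the RabbitMQ and Redis clients: it consumes vote events from the `rank.update` queue and maintains a score-ordered leaderboard, which Redis would implement with a sorted set (`ZINCRBY` to score, `ZREVRANGE` to read). The type and method names below are illustrative, not the project's actual code:

```go
package main

import (
	"fmt"
	"sort"
)

// VoteEvent mirrors the kind of message the worker might read from the
// rank.update queue (hypothetical shape; the real schema lives in the backend).
type VoteEvent struct {
	PostID string
	Delta  float64 // +1 for an upvote, -1 for a downvote
}

// Leaderboard stands in for the Redis sorted set of hot posts.
type Leaderboard struct {
	scores map[string]float64
}

func NewLeaderboard() *Leaderboard {
	return &Leaderboard{scores: map[string]float64{}}
}

// Apply is the in-memory analogue of: ZINCRBY hot:posts <delta> <postID>
func (l *Leaderboard) Apply(e VoteEvent) {
	l.scores[e.PostID] += e.Delta
}

// TopN is the analogue of: ZREVRANGE hot:posts 0 n-1
func (l *Leaderboard) TopN(n int) []string {
	ids := make([]string, 0, len(l.scores))
	for id := range l.scores {
		ids = append(ids, id)
	}
	sort.Slice(ids, func(i, j int) bool {
		if l.scores[ids[i]] != l.scores[ids[j]] {
			return l.scores[ids[i]] > l.scores[ids[j]]
		}
		return ids[i] < ids[j] // deterministic order for equal scores
	})
	if n < len(ids) {
		ids = ids[:n]
	}
	return ids
}

func main() {
	lb := NewLeaderboard()
	for _, e := range []VoteEvent{
		{"post-1", 1}, {"post-2", 1}, {"post-1", 1}, {"post-3", -1},
	} {
		lb.Apply(e)
	}
	fmt.Println(lb.TopN(2)) // post-1 (score 2) ranks above post-2 (score 1)
}
```

Keeping the queue consumer and the score update separate like this is what lets the REST API acknowledge a vote immediately while the leaderboard converges in the background.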
- 🔐 User registration and login backed by JWT authentication.
- 📝 Post creation, editing, deletion, and retrieval with Redis caching and hot-post ranking.
- 📊 Voting system with background workers updating Redis leaderboards.
- 🤖 AI “Ask Post” endpoint that streams answers from the gRPC AI service via SSE.
- 📡 gRPC contract shared through `protos/rag_agent.proto` for type-safe communication between services.
- Docker and Docker Compose (v2+) for the easiest end-to-end setup.
- For direct development without containers: Go 1.25, Node.js 20+, npm, Python 3.13, and `uv` or `pip` for dependency management.
- Access credentials for the AI integrations (OpenRouter and LangSmith).
- Configure AI secrets: create `ai-service/.env` with the following keys and your own values:
  - `OPENROUTER_API_KEY`
  - `LANGSMITH_API_KEY`
  - `LANGSMITH_PROJECT`
  - `LANGSMITH_TRACING` / `LANGSMITH_TRACING_V2` (optional flags)
  - `TOKENIZERS_PARALLELISM`
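A resulting `.env` file might look like the following (the values shown are placeholders, and the tracing flags can be omitted if you do not use LangSmith tracing):

```
OPENROUTER_API_KEY=your-openrouter-key
LANGSMITH_API_KEY=your-langsmith-key
LANGSMITH_PROJECT=smartposts
# optional tracing flags
LANGSMITH_TRACING=true
TOKENIZERS_PARALLELISM=false
```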
- Launch the stack:

  ```shell
  docker compose up -d --build
  ```

  This starts the frontend (port 3000), backend API (exposed via `/api` through Nginx), AI gRPC service, MySQL, Redis, RabbitMQ, and the ranking worker.
- Open http://localhost:3000 to use the application.
Services rely on default credentials from docker-compose.yml. Override them via environment variables or compose overrides for production deployments.
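For example, a `docker-compose.override.yml` (picked up automatically by `docker compose`) can replace the default database credentials. The service and variable names below are assumptions based on a typical MySQL compose setup; check `docker-compose.yml` for the real ones:

```yaml
services:
  db:
    environment:
      MYSQL_ROOT_PASSWORD: ${MYSQL_ROOT_PASSWORD:?set a real password}
      MYSQL_PASSWORD: ${MYSQL_PASSWORD:?set a real password}
```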
- Start dependencies with hot-reload mounts:

  ```shell
  docker compose -f docker-compose.dev.yml up backend-server backend-worker db redis rabbitmq ai-service
  ```
- Or run locally with external services running:

  ```shell
  JWT_SECRET=<your-secret> go run ./cmd/server
  JWT_SECRET=<your-secret> go run ./cmd/worker
  ```
- Execute Go tests:

  ```shell
  go test ./...
  ```
```shell
cd frontend
npm install
npm run dev
```

Vite serves the SPA on http://localhost:5173. When running outside Docker, configure a dev-server proxy or set `VITE_API_BASE_URL` (if introduced) so `/api` requests resolve to the backend.
```shell
cd ai-service
uv sync  # or: python -m pip install .
python rag_agent_server.py
```

Ensure the environment variables listed above are present before starting the server.
The gRPC contract lives in `protos/rag_agent.proto`. Regenerate artifacts after changes:

```shell
python -m grpc_tools.protoc -I protos --python_out=ai-service --pyi_out=ai-service --grpc_python_out=ai-service protos/rag_agent.proto
protoc -I protos --go_out=backend --go-grpc_out=backend protos/rag_agent.proto
```

- The backend reads configuration from `backend/configs/config.yml`; adjust hostnames and credentials if running outside Docker.
- Set `JWT_SECRET` in the backend environment for secure tokens (it defaults to a placeholder).
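When running the backend outside Docker, the compose hostnames (`db`, `redis`, `rabbitmq`) will not resolve, so the config must point at `localhost` instead. The key names below are hypothetical and only illustrate the kind of adjustment needed; mirror the actual structure of `backend/configs/config.yml`:

```yaml
# hypothetical shape -- match the real keys in backend/configs/config.yml
mysql:
  host: localhost        # "db" when running inside Docker Compose
  port: 3306
redis:
  addr: localhost:6379   # "redis" inside Docker Compose
rabbitmq:
  url: amqp://guest:guest@localhost:5672/
```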