This guide provides detailed instructions for deploying the LLM Proxy Java application using Docker.

Prerequisites:

- Docker installed on your system
- Docker Compose (optional, for easier deployment)

To deploy with Docker Compose:

1. Clone the repository:

   ```bash
   git clone https://github.com/amorin24/llmproxy-java.git
   cd llmproxy-java
   ```

2. Create a `.env` file with your API keys:

   ```
   OPENAI_API_KEY=your_openai_api_key
   GEMINI_API_KEY=your_gemini_api_key
   MISTRAL_API_KEY=your_mistral_api_key
   CLAUDE_API_KEY=your_claude_api_key
   ```

3. Build and run with Docker Compose:

   ```bash
   docker-compose up -d
   ```

4. Access the web UI at `http://localhost:8080`.
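The `.env` file from step 2 can also be generated from the shell; since it stores API keys, tightening its permissions so it is not world-readable is a sensible precaution. A minimal sketch with placeholder values:

```shell
# Write the .env file expected by Docker Compose.
# The key values below are placeholders -- substitute your real keys.
cat > .env <<'EOF'
OPENAI_API_KEY=your_openai_api_key
GEMINI_API_KEY=your_gemini_api_key
MISTRAL_API_KEY=your_mistral_api_key
CLAUDE_API_KEY=your_claude_api_key
EOF

# Restrict access to the owner only, since the file holds secrets.
chmod 600 .env
```

Remember to keep `.env` out of version control (e.g. list it in `.gitignore`).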
To build and run the container manually instead:

```bash
docker build -t llmproxy-java .
docker run -p 8080:8080 \
  -e OPENAI_API_KEY=your_openai_api_key \
  -e GEMINI_API_KEY=your_gemini_api_key \
  -e MISTRAL_API_KEY=your_mistral_api_key \
  -e CLAUDE_API_KEY=your_claude_api_key \
  llmproxy-java
```

You can configure the container using environment variables:
| Environment Variable | Description | Default |
|---|---|---|
| `OPENAI_API_KEY` | OpenAI API key | - |
| `GEMINI_API_KEY` | Google Gemini API key | - |
| `MISTRAL_API_KEY` | Mistral API key | - |
| `CLAUDE_API_KEY` | Anthropic Claude API key | - |
| `JAVA_OPTS` | JVM options | `-Xms512m -Xmx1024m` |
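One way to change a value from the table, such as `JAVA_OPTS`, without editing the main compose file is a `docker-compose.override.yml`, which Docker Compose merges automatically. A sketch, assuming the service is named `llmproxy` as in the production example below:

```yaml
# docker-compose.override.yml -- merged automatically by `docker-compose up`.
# Overrides the default JVM heap settings for a machine with more memory.
services:
  llmproxy:
    environment:
      - JAVA_OPTS=-Xms1g -Xmx2g
```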
The Docker Compose configuration includes a health check that verifies the application is running correctly by checking the `/api/health` endpoint.
For production deployments, consider:
- Using a reverse proxy like Nginx for SSL termination
- Setting up proper logging with volume mounts
- Implementing monitoring and alerting
- Using Docker Swarm or Kubernetes for orchestration
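For the reverse-proxy point, a minimal Nginx server block for SSL termination might look like the following sketch; the domain name and certificate paths are placeholders, and the upstream port matches the `8080` mapping above:

```nginx
server {
    listen 443 ssl;
    server_name llmproxy.example.com;                  # placeholder domain

    ssl_certificate     /etc/ssl/certs/llmproxy.crt;   # placeholder paths
    ssl_certificate_key /etc/ssl/private/llmproxy.key;

    location / {
        # Forward requests to the container published on port 8080.
        proxy_pass http://localhost:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```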
Example production `docker-compose.yml`:

```yaml
version: '3.8'
services:
  llmproxy:
    build: .
    ports:
      - "8080:8080"
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - GEMINI_API_KEY=${GEMINI_API_KEY}
      - MISTRAL_API_KEY=${MISTRAL_API_KEY}
      - CLAUDE_API_KEY=${CLAUDE_API_KEY}
      - JAVA_OPTS=-Xms1g -Xmx2g
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8080/api/health"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s
    volumes:
      - ./logs:/app/logs
    deploy:
      resources:
        limits:
          cpus: '2'
          memory: 2G
```

Troubleshooting:

Check the logs for errors:
```bash
docker logs llmproxy-java
```

Ensure environment variables are correctly passed to the container. You can verify with:

```bash
docker exec llmproxy-java env | grep API_KEY
```

Adjust the JVM memory settings using the `JAVA_OPTS` environment variable:

```bash
docker run -e JAVA_OPTS="-Xms1g -Xmx2g" llmproxy-java
```
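When scanning logs, piping through `grep` helps surface problems quickly. A sketch; the log lines below are illustrative stand-ins, since the real format depends on the application's logging configuration, and in practice the input would come from `docker logs llmproxy-java`:

```shell
# Illustrative sample log; real output comes from `docker logs llmproxy-java`.
printf '%s\n' \
  '2024-01-01T12:00:00 INFO  Started application' \
  '2024-01-01T12:00:05 ERROR Failed to reach OpenAI API' \
  '2024-01-01T12:00:06 WARN  Falling back to Gemini' > sample.log

# Surface only errors and warnings.
grep -E 'ERROR|WARN' sample.log
```

Against a live container the equivalent would be `docker logs llmproxy-java 2>&1 | grep -E 'ERROR|WARN'`.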