Everything you need to run OpenClaw AI agents on Nebius Cloud with inference powered by Nebius Token Factory.
Fork: github.com/opencolin/openclaw — Nebius Token Factory built in as a first-class provider alongside Anthropic, OpenAI, DeepSeek, and the rest.
| Install | Command |
|---|---|
| Token Factory Provider Plugin | `openclaw plugins install clawhub:tokenfactory` |
| Nebius Cloud Skill | `openclaw skills install nebius` |
| Package | What it does |
|---|---|
| `tokenfactory-plugin` | Token Factory Provider Plugin -- adds 40+ open-source models (Qwen, DeepSeek, Llama, GLM, FLUX, etc.) via Nebius Token Factory |
| `nebius-skill` | Nebius Cloud Skill -- deploy and manage Nebius infrastructure from your terminal |
| `deploy-ui` | Web UI for deploying OpenClaw to Nebius |
| `deploy-scripts` | Shell scripts, Dockerfile, and configs for Nebius infrastructure automation |
Pick the path that matches what you want to do:
Install the provider plugin to get 40+ models:

```shell
openclaw plugins install clawhub:tokenfactory
```

Then follow the plugin setup guide to configure your API key.
Run the Deploy UI locally:
```shell
git clone https://github.com/opencolin/openclaw-nebius.git
cd openclaw-nebius
npm install
npm run dev:deploy
```

Then open http://localhost:3000 and follow the wizard. It handles regions, platforms, images, and credentials.
Install the Nebius skill for Claude Code:
```shell
git clone https://github.com/opencolin/openclaw-nebius.git /tmp/openclaw-nebius
cp -r /tmp/openclaw-nebius/nebius-skill ~/.claude/skills/nebius
```

Then ask Claude to deploy:

```
/nebius deploy OpenClaw as a serverless endpoint
```
The skill handles the full workflow: region selection, endpoint creation, networking, SSH tunnels, dashboard access, and device pairing.
```shell
export TOKEN_FACTORY_API_KEY={your-key-from-studio.nebius.ai}
cd deploy-scripts
./install-openclaw-serverless.sh
```

See deployment paths for all four options (local, Docker, GPU serverless, CPU serverless).
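The install script expects `TOKEN_FACTORY_API_KEY` to already be exported. A minimal pre-flight sketch (not part of the repo's scripts) fails fast with a clear message if it is missing; here it is demonstrated with a dummy value so the check passes:

```shell
# Fail fast if the key is missing; the ":?" parameter expansion aborts
# the shell with the given message when the variable is unset or empty.
# The value below is a dummy placeholder, not a real key.
TOKEN_FACTORY_API_KEY="v1.dummy-for-demo"
: "${TOKEN_FACTORY_API_KEY:?export TOKEN_FACTORY_API_KEY first (see studio.nebius.ai)}"
echo "TOKEN_FACTORY_API_KEY is set"
```

Dropping the dummy assignment turns this into a real guard you can put at the top of your own wrapper script.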
Get your key at studio.nebius.ai. Keys look like `v1.` followed by a long string.
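As a quick sanity check before exporting a key, you can verify the `v1.` prefix (the prefix is the only format detail documented here; the rest of the example value is a made-up placeholder):

```shell
# Check that a Token Factory key starts with the documented "v1." prefix.
# The value below is a fake placeholder, not a real key.
key="v1.abc123example"
case "$key" in
  v1.*) echo "key format OK" ;;
  *)    echo "unexpected key format" ;;
esac
```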
| Model | Type | Context | Input $/1M | Output $/1M |
|---|---|---|---|---|
| `nebius/deepseek-ai/DeepSeek-V3.2` | Chat | 163K | $0.30 | $0.45 |
| `nebius/Qwen/Qwen3.5-397B-A17B` | Chat | 131K | $0.60 | $3.60 |
| `nebius/zai-org/GLM-5` | Chat | 131K | $1.00 | $3.20 |
| `nebius/openai/gpt-oss-120b` | Reasoning | 131K | $0.15 | $0.60 |
| `nebius/deepseek-ai/DeepSeek-R1-0528` | Reasoning | 163K | $0.80 | $2.40 |
| `nebius/Qwen/Qwen3-Coder-480B-A35B-Instruct` | Chat | 131K | $0.40 | $1.80 |
40 models total: 38 chat/reasoning models and 2 image generation models. See the full catalog.
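Token Factory exposes these models through an OpenAI-compatible API, so you can also call them directly. Below is a minimal sketch that builds a chat-completions request body using a model name from the table; the endpoint URL in the comment is an assumption, so confirm it against the Token Factory docs before use:

```shell
# Build a minimal OpenAI-style chat request for a model from the table above.
cat > request.json <<'EOF'
{
  "model": "nebius/deepseek-ai/DeepSeek-V3.2",
  "messages": [{"role": "user", "content": "Say hello in one sentence."}]
}
EOF

# Send it with your key (URL is an assumption -- check the Token Factory docs):
# curl -s https://api.studio.nebius.ai/v1/chat/completions \
#   -H "Authorization: Bearer $TOKEN_FACTORY_API_KEY" \
#   -H "Content-Type: application/json" \
#   -d @request.json
echo "wrote request.json"
```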
| Region | Location | CPU Platform |
|---|---|---|
| `eu-north1` | Finland | `cpu-e2` |
| `eu-west1` | Paris | `cpu-d3` |
| `us-central1` | US | `cpu-e2` |
```shell
docker pull ghcr.io/opencolin/openclaw-serverless:latest   # ~400 MB, CPU
docker pull ghcr.io/opencolin/nemoclaw-serverless:latest   # ~1.1 GB, GPU-ready
```

This is an npm workspaces monorepo. The plugin and deploy UI are the two buildable packages.
```shell
# Install all dependencies
npm install

# Build the provider plugin
npm run build

# Run plugin tests
npm test

# Type-check the plugin
npm run check

# Start the deploy UI locally
npm run dev:deploy
```

The nebius-skill package is pure markdown and requires no build step.
```
                         +-------------------+
                         |   Nebius Cloud    |
                         |   Token Factory   |
                         |   (40+ models)    |
                         +---------+---------+
                                   |
                         OpenAI-compatible API
                                   |
           +-----------------------+-----------------------+
           |                       |                       |
+----------v----------+ +----------v----------+ +----------v----------+
| tokenfactory-plugin | |      deploy-ui      | |    nebius-skill     |
|                     | |                     | |                     |
| Registers models    | | Browser wizard      | | Claude Code /       |
| in OpenClaw as      | | + install scripts   | | OpenClaw skill      |
| a provider          | | for deploying       | | for deploying       |
|                     | | agents to Nebius    | | via CLI             |
+---------------------+ +---------------------+ +---------------------+
```
- `tokenfactory-plugin` connects OpenClaw to Token Factory for inference (use models)
- `deploy-ui` + `deploy-scripts` provide the UI and scripts to get OpenClaw running on Nebius (deploy agents)
- `nebius-skill` teaches AI coding assistants how to manage Nebius infrastructure (automate deployments)
- OpenClaw Docs -- Official OpenClaw documentation
- Docker Install -- Install OpenClaw with Docker
- Token Factory -- Nebius managed GPU inference API
- Discord -- Join the OpenClaw community
- Nebius Cloud -- Cloud platform with GPU infrastructure
- Nebius CLI docs -- Official CLI documentation
MIT