One file. Any AI.
The open source Inference as Code engine. Write AI workflows in YAML. Run them anywhere.
Quick Start · 5 Verbs · Examples · Benchmarks · Course · Install
```yaml
# news.nika.yaml -- Scrape Hacker News and summarize the top stories
schema: "nika/workflow@0.12"
provider: claude # or: openai, mistral, groq, gemini, deepseek, xai, local
tasks:
  - id: scrape
    fetch: { url: "https://news.ycombinator.com", extract: article }
  - id: summarize
    with: { page: $scrape }
    infer: "3-bullet summary of today's top stories: {{with.page}}"
```

```shell
nika run news.nika.yaml
```

Nika is a workflow engine where each step is a YAML task with exactly one verb: `infer`, `exec`, `fetch`, `invoke`, or `agent`. Write your steps in a `.nika.yaml` file, run `nika run`, and Nika handles the rest: parallel execution, data flow between tasks, retries, structured output, and multi-provider LLM routing.
Inference as Code. The same shift that Terraform brought to infrastructure. Describe your intent in a file, let a runtime handle execution. Your workflow is a YAML file that you commit, review in a PR, diff, and version. Five verbs describe any automation, from a 3-step summary to a 50-task parallel pipeline across multiple AI providers.
|  | Without Nika | With Nika |
|---|---|---|
| Workflow | Copy-paste between ChatGPT tabs | Write steps once, run forever |
| Scale | One thing at a time | 50 items in parallel with for_each |
| Providers | Locked into one vendor at $20/mo | 9 LLM providers, switch in one line |
| Output | Pray the LLM returns valid JSON | 4-layer schema validation with auto-repair |
| Reproducibility | "It worked last time" | Deterministic DAG, NDJSON traces, event replay |
| Deployment | Docker + Python + venv + pip | Single binary, zero dependencies |
```shell
# Install (pick one)
brew install supernovae-st/tap/nika   # macOS / Linux
cargo install nika                    # from crates.io
npx @supernovae-st/nika               # run without installing

# Set up your API key
nika setup

# Run your first workflow
nika run hello.nika.yaml
```

hello.nika.yaml:

```yaml
schema: "nika/workflow@0.12"
provider: claude
inputs:
  topic: "butterflies"
tasks:
  - id: haiku
    infer: "Write a haiku about {{inputs.topic}}"
```

Want more? Scaffold a full project:
```shell
nika init    # 5 starter workflows (one per verb)
nika doctor  # verify your setup
```

Every task uses exactly one verb. That is the entire API surface.
| Verb | What it does | Example |
|---|---|---|
| `infer:` | Call any LLM | `infer: "Summarize this: {{with.text}}"` |
| `exec:` | Run a shell command | `exec: "git log --oneline -5"` |
| `fetch:` | HTTP request + extraction | `fetch: { url: "https://...", extract: markdown }` |
| `invoke:` | Call MCP or builtin tools | `invoke: { tool: nika:thumbnail, params: { width: 800 } }` |
| `agent:` | Multi-turn autonomous loop | `agent: { prompt: "Research...", max_turns: 15 }` |
```mermaid
flowchart LR
    classDef verb fill:#6366f1,stroke:#4f46e5,stroke-width:2px,color:#fff
    classDef target fill:#06b6d4,stroke:#0891b2,stroke-width:2px,color:#fff
    INFER[infer]:::verb --> LLM["9 Providers"]:::target
    EXEC[exec]:::verb --> SHELL[Shell]:::target
    FETCH[fetch]:::verb --> HTTP["HTTP + 9 Extract Modes"]:::target
    INVOKE[invoke]:::verb --> TOOLS["63 Tools + MCP"]:::target
    AGENT[agent]:::verb --> LOOP["Agentic Loop + Guardrails"]:::target
```
Five words. Not fifty abstractions. If you've used Terraform, GitHub Actions, or Docker Compose, this will feel familiar. The pattern is the same. Declare what you want, let the engine figure out how.
```yaml
schema: "nika/workflow@0.12"
provider: claude
tasks:
  - id: scrape
    fetch: { url: "https://example.com/blog", extract: markdown }
  - id: summarize
    with: { content: $scrape }
    infer: "Summarize in 3 bullets: {{with.content}}"
  - id: translate
    for_each: ["French", "Spanish", "Japanese", "German", "Portuguese"]
    as: lang
    concurrency: 5
    with: { summary: $summarize }
    infer: "Translate to {{with.lang}}: {{with.summary}}"
```

```yaml
schema: "nika/workflow@0.12"
tasks:
  - id: claude_take
    provider: anthropic
    infer: "Analyze this trend: {{inputs.topic}}"
  - id: gpt_take
    provider: openai
    model: gpt-4o
    infer: "Analyze this trend: {{inputs.topic}}"
  - id: gemini_take
    provider: gemini
    model: gemini-2.5-flash
    infer: "Analyze this trend: {{inputs.topic}}"
  - id: synthesize
    depends_on: [claude_take, gpt_take, gemini_take]
    with:
      claude: $claude_take
      gpt: $gpt_take
      gemini: $gemini_take
    infer: "Synthesize these 3 perspectives: {{with.claude}} / {{with.gpt}} / {{with.gemini}}"
```

```yaml
schema: "nika/workflow@0.12"
provider: claude
tasks:
  - id: extract
    infer: "Tell me about Alice, 30, Rust and Python developer"
    structured:
      schema:
        type: object
        required: [name, age, skills]
        properties:
          name: { type: string }
          age: { type: number, minimum: 0 }
          skills: { type: array, items: { type: string }, minItems: 1 }
      enable_repair: true
      max_retries: 3
```

The prompt is natural language. Never mention JSON. The 4-layer defense handles extraction, validation, retry, and LLM repair automatically. Same result across every supported provider.
```yaml
schema: "nika/workflow@0.12"
provider: claude
tasks:
  - id: research
    agent:
      prompt: "Research the top 5 competitors for our product"
      tools: [nika:read, nika:write, nika:glob]
      max_turns: 15
      guardrails:
        - type: length
          max_words: 2000
      limits:
        max_cost_usd: 1.00
      completion:
        mode: explicit
```

```yaml
schema: "nika/workflow@0.12"
provider: claude
tasks:
  - id: import
    invoke: { tool: nika:import, params: { path: "./photo.jpg" } }
  - id: thumbnail
    with: { img: $import }
    invoke:
      tool: nika:pipeline
      params:
        hash: "{{with.img.hash}}"
        ops:
          - { op: thumbnail, width: 800 }
          - { op: optimize }
          - { op: convert, format: webp }
  - id: describe
    with: { img: $import }
    infer:
      content:
        - type: image
          source: "{{with.img.hash}}"
        - type: text
          text: "Write an alt-text description for this image"
```

115 more examples available via `nika showcase list` and `nika showcase extract <name>`.
Switch providers in one line. Same workflow, any AI.
| Provider | Models | Env Var |
|---|---|---|
| Anthropic | claude-opus-4, claude-sonnet-4, claude-haiku-4.5 | ANTHROPIC_API_KEY |
| OpenAI | gpt-4o, gpt-4.1, o3, o4-mini | OPENAI_API_KEY |
| Gemini | gemini-2.5-pro, gemini-2.5-flash | GEMINI_API_KEY |
| Mistral | mistral-large-latest, mistral-small-latest | MISTRAL_API_KEY |
| Groq | llama-3.3-70b-versatile, mixtral-8x7b | GROQ_API_KEY |
| DeepSeek | deepseek-chat, deepseek-reasoner | DEEPSEEK_API_KEY |
| xAI | grok-3 | XAI_API_KEY |
| Native | Any GGUF model locally via mistral.rs | -- |
| Mock | Deterministic test responses, no API calls, no keys | -- |
Connect to any OpenAI-compatible endpoint (vLLM, Ollama, LiteLLM, SGLang) via named endpoints in nika.toml or slash syntax: model: myserver/llama-3.3-70b.
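For example, once an endpoint is registered, the slash syntax selects it per task. A minimal sketch (the endpoint name `myserver` and the input field are illustrative):

```yaml
schema: "nika/workflow@0.12"
tasks:
  - id: local_summary
    model: myserver/llama-3.3-70b   # <endpoint>/<model> slash syntax
    infer: "Summarize: {{inputs.text}}"
```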
Get guaranteed schema-valid JSON from any provider. No prompt hacking required.
| Layer | Strategy |
|---|---|
| L0 | Provider-native tool/schema enforcement |
| L2 | Extract + validate JSON from response |
| L3 | Retry with error feedback |
| L4 | LLM repair call (last resort) |
Same result across every supported provider. No exceptions.
```yaml
tasks:
  - id: fetch_data
    fetch: { url: "https://api.example.com/users" }
  - id: process
    with:
      users: $fetch_data                   # bind upstream output
      name: $fetch_data.data[0].name       # JSONPath access
      safe: $fetch_data.name ?? "Unknown"  # default fallback
    infer: "First user: {{with.name | upper | trim}}"
```

65 pipe transforms: `upper`, `lower`, `trim`, `join(",")`, `split(",")`, `sort`, `unique`, `flatten`, `first`, `last`, `length`, `to_json`, `parse_json`, `parse_yaml`, `default("x")`, `pluck(field)`, `where(field, val)`, `sort_by(field)`, `pick(f1,f2)`, `omit(f1,f2)`, `jq(expr)`, `regex(pattern)`, `html_escape`, `sanitize`, and 40+ more.
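Transforms chain left to right inside any template. An illustrative sketch (the `skills` field is hypothetical, assuming an upstream `fetch_data` task that returns user objects):

```yaml
- id: skills
  with: { users: $fetch_data }
  infer: 'All skills, deduplicated: {{with.users | pluck(skills) | flatten | unique | sort | join(",")}}'
```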
Parallel loops with `for_each` + `concurrency`:

```yaml
- id: translate
  for_each: ["en", "fr", "ja", "de", "ko"]
  as: locale
  concurrency: 5
  infer: "Translate to {{with.locale}}: {{with.text}}"
```

All accessible via `invoke: nika:*`, no external dependencies.
Media tools: import, resize, convert, optimize, metadata, charts, QR, C2PA
| Tool | Purpose |
|---|---|
| `nika:import` | Import any file into CAS |
| `nika:decode` | Base64 string to CAS store |
| `nika:thumbnail` | SIMD-accelerated resize (Lanczos3) |
| `nika:convert` | Format conversion (PNG/JPEG/WebP) |
| `nika:optimize` | Lossless PNG optimization (oxipng) |
| `nika:pipeline` | Chain operations in-memory |
| `nika:metadata` | Universal EXIF/audio/video metadata |
| `nika:dimensions` | Image dimensions (~0.1ms) |
| `nika:thumbhash` | 25-byte compact placeholder |
| `nika:dominant_color` | Color palette extraction |
| `nika:strip` | Remove EXIF metadata |
| `nika:svg_render` | SVG to PNG (resvg) |
| `nika:phash` | Perceptual image hashing |
| `nika:compare` | Visual similarity comparison |
| `nika:pdf_extract` | PDF text extraction |
| `nika:chart` | Bar/line/pie charts from JSON |
| `nika:provenance` | C2PA content credentials |
| `nika:verify` | C2PA verification + EU AI Act |
| `nika:qr_validate` | QR decode + quality score |
| `nika:quality` | Image quality (DSSIM/SSIM) |
Data tools: jq, merge, filter, map, chunk, aggregate, flatten
| Tool | Purpose |
|---|---|
| `nika:jq` | Full jq stdlib (100+ functions via jaq-core) |
| `nika:json_merge` | Deep merge JSON objects |
| `nika:map` | Transform array elements |
| `nika:filter` | Filter array by condition |
| `nika:group_by` | Group array into object by field |
| `nika:chunk` | Split array into N-sized chunks |
| `nika:aggregate` | Sum, avg, min, max over arrays |
| `nika:json_flatten` | Flatten nested JSON |
| `nika:json_unflatten` | Unflatten dotted keys |
| `nika:set_diff` | Set difference between arrays |
| `nika:zip` | Zip two arrays together |
| `nika:token_count` | Count tokens for a model |
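A sketch of chaining two data tools in a workflow. The parameter names (`input`, `size`, `op`, `field`) are assumptions for illustration, not confirmed tool signatures; see `nika showcase list` for canonical usage:

```yaml
- id: batches
  with: { users: $fetch_data }
  invoke:
    tool: nika:chunk
    params: { input: "{{with.users}}", size: 10 }   # param names assumed
- id: avg_age
  with: { users: $fetch_data }
  invoke:
    tool: nika:aggregate
    params: { input: "{{with.users}}", op: avg, field: age }   # param names assumed
```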
Web extraction tools: HTML to Markdown, CSS selectors, metadata, links, readability
| Tool | Purpose |
|---|---|
| `nika:html_to_md` | HTML to clean Markdown |
| `nika:css_select` | CSS selector extraction |
| `nika:extract_metadata` | OG, Twitter Cards, JSON-LD |
| `nika:extract_links` | Rich link classification |
| `nika:readability` | Article content extraction |
File & core tools: read, write, edit, glob, grep, sleep, log, assert
| Tool | Purpose |
|---|---|
| `nika:read` | Read file contents |
| `nika:write` | Write file (with overwrite mode) |
| `nika:edit` | Edit file in place |
| `nika:glob` | Pattern-match files |
| `nika:grep` | Search file contents |
| `nika:sleep` | Delay execution |
| `nika:log` | Emit log messages |
| `nika:emit` | Emit custom events |
| `nika:assert` | Runtime assertions |
| `nika:run` | Run sub-workflows |
| `nika:complete` | Signal agent completion |
| `nika:inject` | Template marker replacement |
Nika is an MCP-native client. Connect to any Model Context Protocol server.
```yaml
mcp:
  web_search:
    command: npx
    args: ["-y", "@anthropic/mcp-web-search"]
tasks:
  - id: search
    invoke: { mcp: web_search, tool: search, params: { query: "..." } }
  - id: agent_task
    agent:
      prompt: "Research this topic thoroughly"
      mcp: [web_search]
      max_turns: 10
```

Expose any workflow as a REST API. SDKs for Rust, Node.js, and Python.

```shell
nika serve --port 3000
```

```shell
curl -X POST http://localhost:3000/v1/jobs \
  -H "Content-Type: application/json" \
  -d '{"workflow": "news.nika.yaml", "inputs": {"topic": "AI"}}'
```

SSE streaming, job queues, concurrent execution, and per-job isolation built in.
Three views: Studio (editor + DAG), Command (chat + execution), Control (settings).
```text
+-----------------------------------------------------------------------+
| Nika Studio v0.75.0                                                   |
|-----------------------------------------------------------------------|
| +- Files --------+  +- Editor -------------------------------------+  |
| |  > workflows/  |  | 1 | schema: "nika/workflow@0.12"             |  |
| |    deploy.nika |  | 2 | provider: claude                         |  |
| |    review.nika |  | 3 | tasks:                                   |  |
| +- DAG ----------+  | 4 |   - id: research                         |  |
| | [research]--+  |  | 5 |     agent:                               |  |
| |     |       |  |  | 6 |       prompt: "Find AI papers"           |  |
| | [analyze]  [e] |  +----------------------------------------------+  |
| |     |          |                                                    |
| |  [ report ]    |  Tree-sitter highlighting | LSP | Git gutter       |
| +----------------+  Vi/Emacs modes | Fuzzy search | Undo/redo         |
+-----------------------------------------------------------------------+
| [1/s] Studio   [2/c] Command   [3/x] Control                          |
+-----------------------------------------------------------------------+
```
Full LSP with 16 capabilities: completion, hover, go-to-definition, diagnostics, semantic tokens, code actions, inlay hints, CodeLens, rename, formatting, and more.
```shell
cargo install nika-lsp                          # standalone
code --install-extension supernovae.nika-lang   # VS Code
```

| Level | Name | What You Learn |
|---|---|---|
| 01 | Jailbreak | exec, fetch, infer: the 3 core verbs |
| 02 | Hot Wire | Data bindings, transforms, templates |
| 03 | Fork Bomb | DAG patterns, parallel execution |
| 04 | Root Access | Context files, imports, inputs |
| 05 | Shapeshifter | Structured output, JSON Schema |
| 06 | Pay-Per-Dream | Multi-provider, native models, cost control |
| 07 | Swiss Knife | Builtin tools, file operations |
| 08 | Gone Rogue | Autonomous agents, skills, guardrails |
| 09 | Data Heist | Web scraping, 9 extraction modes |
| 10 | Open Protocol | MCP integration |
| 11 | Pixel Pirate | Media pipeline, vision |
| 12 | SuperNovae | Boss battle, everything combined |
Real benchmarks. Real tasks. No cherry-picking.
| Tool | Peak RAM | Cold start | Lines of config |
|---|---|---|---|
| Nika | ~45 MB | 4 ms | 12 |
| LangChain (Python) | ~230 MB | 1.2 s | 48 |
| LangGraph (Python) | ~210 MB | 1.1 s | 62 |
| CrewAI (Python) | ~280 MB | 1.4 s | 55 |
Nika uses 5x less RAM than LangChain for the same task.
| Metric | Nika | Python equivalent |
|---|---|---|
| Cold start | 4 ms | 800+ ms |
| RAM (idle) | 12 MB | 60+ MB |
| Binary size | ~25 MB | 200+ MB (with venv) |
| Dependencies | 0 (single binary) | pip install, venv, Docker... |
| Install | Download and run | pip install, venv, requirements.txt |
A Raspberry Pi can run Nika. A GitHub Action can run Nika. A $5/month VPS can run Nika.
| Tool | Execution model | Guardrails | Retry |
|---|---|---|---|
| Nika | Deterministic DAG | Yes (4 types) | Yes (exponential backoff) |
| CrewAI | Agent negotiation | No | Manual |
| LangGraph | State machine | Partial | Manual |
| AutoGPT | Open-ended loop | No | No |
```mermaid
flowchart TD
    classDef phase fill:#6366f1,stroke:#4f46e5,stroke-width:2px,color:#fff
    classDef verb fill:#06b6d4,stroke:#0891b2,stroke-width:2px,color:#fff
    classDef backend fill:#10b981,stroke:#059669,stroke-width:2px,color:#fff
    YAML[".nika.yaml"]:::phase
    RAW["Parse (source spans)"]:::phase
    ANA["Analyze (validate + resolve)"]:::phase
    LOW["Lower (runtime types)"]:::phase
    DAG["DAG Engine"]:::phase
    YAML --> RAW --> ANA --> LOW --> DAG
    subgraph Verbs
        INF[infer]:::verb
        EXC[exec]:::verb
        FET[fetch]:::verb
        INV[invoke]:::verb
        AGT[agent]:::verb
    end
    DAG --> INF & EXC & FET & INV & AGT
    subgraph Backends
        PROV["9 Providers"]:::backend
        MCPS["MCP Servers"]:::backend
        BUILT["63 Builtin Tools"]:::backend
        CAS["CAS Media Store"]:::backend
    end
    INF & AGT --> PROV
    INV & AGT --> MCPS
    INV --> BUILT
    BUILT --> CAS
```
Three-phase AST (inspired by rustc): Raw (parse with source spans) --> Analyzed (validate, resolve bindings) --> Lowered (concrete runtime types). The immutable DAG is built with petgraph for safe concurrent execution.
32 workspace crates:
```text
tools/
  nika/           CLI entry point              cargo install nika
  nika-engine/    Embeddable runtime           cargo add nika-engine
  nika-core/      AST, types, catalogs         zero I/O
  nika-event/     EventLog, TraceWriter
  nika-mcp/       MCP client (rmcp)
  nika-media/     CAS store, media processor
  nika-storage/   Storage abstraction
  nika-daemon/    Background daemon + secrets
  nika-init/      Project scaffolding
  nika-cli/       CLI subcommands
  nika-display/   Render engine
  nika-lsp-core/  Protocol-agnostic LSP
  nika-lsp/       Standalone LSP binary
  nika-serve/     HTTP server
  nika-sdk/       Rust SDK
  nika-vault/     Encrypted credential store
```
Nika ships native compliance infrastructure for the EU AI Act (Regulation 2024/1689). No plugins. No add-ons. Built in.
| Nika Feature | EU AI Act Article | What it does |
|---|---|---|
| `nika:provenance` | Art. 50(2) | C2PA content credentials: sign AI-generated images with cryptographic provenance |
| `nika:verify` | Art. 50(2) | Verify C2PA manifests, returns `eu_ai_act_compliant: true/false` |
| NDJSON execution traces | Art. 12 | 58+ event types logged per workflow run. Full audit trail. |
| Nika Shield (5-layer) | Art. 9 | Prompt injection defense: taint analysis, spotlighting, canary tokens, capabilities |
| Trust levels (4-tier) | Art. 50(2) | Every data binding classified: Trusted, ModelGenerated, ModelTainted, Untrusted |
| Agent guardrails | Art. 14 | max_turns, cost limits, LLM judge, schema validation. Human oversight by design. |
| AGPL open source | Art. 13 | Fully auditable. Every line of code. Every decision reviewable. |
Article 50 enters enforcement on August 2, 2026. Penalties up to 7.5M EUR. No other AI workflow engine ships these features natively.
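A minimal sketch of the Art. 50(2) flow, reusing the CAS `hash` binding pattern from the media pipeline example. The exact parameter shape is an assumption; consult the showcase catalog for the canonical form (and note `media-provenance` is off by default):

```yaml
- id: sign
  with: { img: $import }
  invoke:
    tool: nika:provenance
    params: { hash: "{{with.img.hash}}" }   # parameter name assumed
- id: check
  with: { img: $import }
  invoke:
    tool: nika:verify
    params: { hash: "{{with.img.hash}}" }   # returns eu_ai_act_compliant: true/false
```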
| Method | Command |
|---|---|
| Homebrew | brew install supernovae-st/tap/nika |
| Cargo | cargo install nika |
| npm | npm install -g @supernovae-st/nika |
| npx | npx @supernovae-st/nika |
| Docker | docker run --rm -v "$(pwd)":/work supernovae/nika run /work/flow.nika.yaml |
| Source | git clone https://github.com/supernovae-st/nika && cargo install --path nika/tools/nika |
```shell
nika --version   # nika 0.79.3
nika doctor      # full system health check
```

Feature flags
| Feature | Default | Description |
|---|---|---|
| `native-inference` | yes | Local GGUF models via mistral.rs |
| `media-core` | yes | Tier 2 media tools |
| `media-phash` | yes | Perceptual hashing |
| `media-pdf` | yes | PDF text extraction |
| `media-chart` | yes | Chart generation |
| `media-qr` | yes | QR code validation |
| `media-iqa` | yes | Image quality assessment |
| `media-provenance` | no | C2PA signing + verification |
| `fetch-extract` | yes | HTML extraction |
| `fetch-markdown` | yes | HTML to Markdown |
| `fetch-article` | yes | Article extraction |
| `fetch-feed` | yes | RSS/Atom/JSON Feed |
| `lsp` | no | Standalone LSP binary |
```shell
# Minimal build
cargo install --path tools/nika --no-default-features

# Custom features
cargo install --path tools/nika --features "native-inference,media-core"
```

Platform: macOS and Linux. The daemon, scheduling (`nika every`, `nika schedule`), and background jobs require Unix. Core features (run, check, test, infer, fetch, invoke, agent, LSP) work on all platforms, including Windows.
| Resource | Description |
|---|---|
| User Guide | Getting started, verbs, data flow, providers |
| Showcase | 8 guided examples + browseable workflow catalog |
| Manifesto | Why Inference as Code matters |
| Contributing | Build, test, conventions |
| Citation | Academic citation (Zenodo DOI) |
```shell
nika run flow.nika.yaml            # execute workflow
nika run flow.nika.yaml --resume   # re-run, skip completed tasks
nika check flow.nika.yaml          # validate without executing
nika test flow.nika.yaml           # test with mock provider
nika lint flow.nika.yaml           # best-practice linting
nika explain flow.nika.yaml        # human-readable summary
nika graph flow.nika.yaml          # visualize DAG
nika serve --port 3000             # HTTP API
nika init                          # scaffold project
nika provider list                 # API key status
nika model list                    # available models
nika mcp list                      # MCP servers
nika doctor                        # system health
nika showcase list                 # browse the showcase workflow catalog
```

```shell
git clone https://github.com/supernovae-st/nika.git
cd nika
cargo build                    # build all 32 crates
cargo test --workspace --lib   # 10,790+ tests (safe, no keychain popups)
cargo clippy -- -D warnings    # zero warnings policy
```

Note: `cargo test` without `--lib` runs contract tests that trigger macOS Keychain popups. Always use `--lib`.
Nika is built with AI assistance. Every commit says so (Co-Authored-By: Nika 🦋). AI accelerates the keystrokes. Humans own the architecture, the design decisions, and the 10,790+ tests that prove it works. See the Manifesto for our stance on AI-assisted development.
See CONTRIBUTING.md for full guidelines.
AGPL-3.0-or-later. Nika is free software. Use it, study it, share it, improve it.
The AGPL protects the commons: if you modify Nika and offer it as a hosted service, you share your changes back. For CLI usage, there are zero restrictions. Commercial use is welcome.
If your organization needs a commercial license without copyleft obligations, contact contact@supernovae.studio.
Read the Manifesto to understand why.
Nika v0.79.3 · Schema nika/workflow@0.12 · Rust 1.86+ · 32 crates · 10,900+ tests
SuperNovae Studio · QR Code AI · GitHub
Built in Paris. Open source. Forever.
Liberate your AI. 🦋