
Add AI-powered connector generation via CLI#7

Open
JasonBenichou wants to merge 2 commits into stacksyncdata:prod from JasonBenichou:feature/ai-powered-connector-generation

Conversation


@JasonBenichou JasonBenichou commented Apr 3, 2026

Summary

This pull request extends the Workflows CDK so it can build Stacksync custom connectors and modules from natural language, and makes the terminal experience match the official developer flow (local run → ngrok → Developer Studio → same-region workflows).

CLI and UX

  • Interactive entrypoint: Running workflows without a subcommand opens a branded menu (create, update, validate, run locally, expose with ngrok, documentation, setup, exit). Actions loop back to the menu after completion, with a short pause so output is readable.
  • Connector selection: Paths are chosen from discovered connector roots (current directory and immediate subfolders with app_config.yaml), with an option for a custom path. Prompts avoid confusing ranges like "1–1" when only one project exists.
  • Generation flow: Progress steps for parsing, LLM call, and validation; compact preview and post-generation summary aligned with module_config.yaml, schema.json, and route.py under src/modules/.
  • Next steps after create: Menu to run the connector (run_dev.sh), expose via ngrok (with binary check, optional download link, reuse of existing tunnels when they match the port, and starting the app in the background when needed), or open the Stacksync developer documentation.
  • Managed processes: If the CLI started ngrok or a background connector for the expose flow, exiting the menus (including via Ctrl+C where applicable) or finishing a foreground run_dev run prompts to stop those processes, with separate confirmations for the tunnel and the background connector. Docker-based background starts are cleaned up with docker rm -f workflows-app-<folder> when appropriate.
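The tunnel-reuse check mentioned above can be sketched as a small helper over the "tunnels" list returned by ngrok's local inspection API (http://127.0.0.1:4040/api/tunnels). The function name and matching rule here are illustrative, not the CLI's actual code:

```python
def find_tunnel_for_port(tunnels, port):
    """Return the public_url of an existing ngrok tunnel forwarding to *port*, if any.

    *tunnels* is the "tunnels" list from ngrok's local API; each entry has a
    "public_url" and a "config" dict whose "addr" looks like "http://localhost:5000".
    """
    suffix = f":{port}"
    for tunnel in tunnels:
        addr = tunnel.get("config", {}).get("addr", "")
        if addr.endswith(suffix):
            return tunnel.get("public_url")
    return None  # no matching tunnel; the CLI would start a new one
```

If this returns a URL, the CLI can reuse the tunnel instead of spawning a second ngrok process on the same port.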

AI layer

  • Default Anthropic model set to Claude Haiku for faster iteration, with a retry on transient connection errors.
  • System prompt adjusted so clarification questions do not reference an internal "registry" in user-facing wording; custom apps remain first-class.
  • Clarification flow asks one question at a time for easier replies.
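The connection-error retry can be sketched as a generic helper; the real planner wraps the Anthropic SDK call and would catch the SDK's connection-error type rather than the builtin ConnectionError (names and retry count here are assumptions for illustration):

```python
import time


def call_with_retry(fn, retries=1, delay=1.0, transient=(ConnectionError,)):
    """Call fn(), retrying on transient connection errors.

    The planner would pass a closure over the SDK call, e.g.
    lambda: client.messages.create(...), and list the SDK's
    connection-error class in *transient*.
    """
    for attempt in range(retries + 1):
        try:
            return fn()
        except transient:
            if attempt == retries:
                raise  # out of retries; surface the error to the caller
            time.sleep(delay)
```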

Runtime and generated projects

  • Project root resolution in dynamic_routing.py: prefers WORKFLOWS_PROJECT_ROOT / STACKSYNC_CONNECTOR_ROOT, then Flask app.root_path, then a safe getcwd(), then /usr/src/app, reducing FileNotFoundError from os.getcwd() in Docker and worker contexts.
  • Compiler / template alignment: Generated layout closer to the official app connector template (.gitignore, .env, README.md, root and config/entrypoint.sh, removal of redundant capability.yaml); module_only output path fixes to avoid nested project directories.
  • Docker dev scripts: ENV WORKFLOWS_PROJECT_ROOT in dev Dockerfile; chdir in gunicorn config and cd /usr/src/app in entrypoint; docker rm -f before docker run in generated run_dev.sh / run_dev.bat to avoid name conflicts.

Documentation

  • README rewritten around how to use the package: prerequisites, install, quick start, menu overview, command reference, AI configuration, generated layout, and practical tips for Developer Studio and ngrok.

Notes for reviewers

  • Upstream default branch is prod (there is no main branch on this repository); this PR targets prod.
  • Please verify that AI keys and ngrok remain optional for non-AI commands; run workflows validate and a quick workflows create smoke test if you have API access.

Replace the manual connector creation workflow with a single-command
AI-powered approach: `workflows create "<description>"` generates a
complete, deployable connector project in ~15 seconds.

New modules:
- ai/: LLM planner (Anthropic/OpenAI), intent parser, deterministic
  validator, clarification engine, and structured prompts
- cli/: `workflows` CLI with create, setup, list, inspect commands
  and interactive first-time API key configuration
- registry/: capability manifests (Slack, HubSpot, Stripe, Salesforce,
  OpenAI, PostgreSQL) used for LLM context and validation
- spec/: ConnectorSpec Pydantic model and compiler that generates
  Stacksync-compatible projects (src/modules/ structure, Module Schema
  format, /execute + /content + /schema endpoints, deployment files)
- templates/: pre-built specs for --no-ai fallback mode
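As an illustration of the spec-then-compile split, a minimal ConnectorSpec might look like the following. Stdlib dataclasses stand in for the real Pydantic model, and the field names are assumptions, not the actual schema:

```python
from dataclasses import dataclass, field


@dataclass
class ActionSpec:
    slug: str                 # e.g. "send_message" (hypothetical)
    description: str = ""


@dataclass
class ConnectorSpec:
    app_type: str             # e.g. "slack" (hypothetical)
    name: str
    actions: list[ActionSpec] = field(default_factory=list)

# The compiler would walk a validated spec like this and emit the
# src/modules/ layout (module_config.yaml, schema.json, route.py).
```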

Generated connectors include real API call implementations, proper
error handling, Dockerfiles, gunicorn config, and all deployment files
matching the Stacksync connector template structure.

Also syncs requirements.txt with setup.py.

Made-with: Cursor

greptile-apps bot commented Apr 3, 2026

Greptile Summary

This PR adds an AI-powered workflows CLI that generates Stacksync connector scaffolding from natural-language descriptions, backed by Claude or OpenAI structured output, plus an interactive ngrok/process-management flow and a more robust dynamic_routing.py that eliminates all bare os.getcwd() calls.

  • P1 – forced AI SDK install: openai and anthropic are in install_requires, adding ~100 MB to every install even for users who only need workflows validate; they should be moved to extras_require["ai"].
  • P1 – hardcoded python path: _run_connector calls ["python", ...], which fails on Python 3-only systems; use sys.executable as _start_main_py_background already does.

Confidence Score: 4/5

Two P1 defects (forced AI SDK deps, hardcoded python path) should be fixed before merging, but neither causes data loss or security risk.

Prior concerns from the previous review round (bat-file escaping, broad exception swallowing, ENV_FILE timing) have been addressed or acknowledged. Two new P1 issues remain: forcing openai/anthropic as hard install dependencies affects every consumer of the package, and the hardcoded 'python' binary breaks the run-locally flow on Python 3-only systems. Both are straightforward one-line fixes.

setup.py (install_requires must become extras_require) and src/workflows_cdk/cli/main.py (_run_connector must use sys.executable)

Important Files Changed

| Filename | Overview |
| --- | --- |
| src/workflows_cdk/cli/main.py | New 1494-line interactive CLI with ngrok integration and AI generation flow — hardcoded python binary will fail on Python 3-only systems |
| setup.py | Version bump to 0.2.0 with CLI entry point; openai and anthropic forced as hard dependencies instead of optional extras |
| src/workflows_cdk/ai/planner.py | Anthropic/OpenAI LLM orchestrator with structured output; PR says Haiku default but code sets claude-sonnet-4-6 |
| src/workflows_cdk/core/dynamic_routing.py | Replaces all bare os.getcwd() calls with a robust _project_root() helper that prefers env vars, Flask root_path, then safe cwd fallback |
| src/workflows_cdk/spec/compiler.py | Project scaffolder generating Flask routes, Docker scripts, and module configs correctly |
| src/workflows_cdk/ai/validator.py | Deterministic post-LLM spec validator; unused manifest variables are dead code but logic is otherwise correct |
| src/workflows_cdk/ai/prompts.py | LLM system prompts with clear XML structure and single-turn clarification rules |
| src/workflows_cdk/registry/registry.py | In-memory capability registry with keyword search and YAML manifest loading |

Sequence Diagram

```mermaid
sequenceDiagram
    participant User
    participant CLI as workflows CLI (main.py)
    participant Planner as AI Planner (planner.py)
    participant LLM as Claude / OpenAI
    participant Compiler as Compiler (compiler.py)
    participant Disk as File System

    User->>CLI: workflows create "description"
    CLI->>CLI: _ensure_dotenv() / _has_api_key()
    CLI->>Planner: build_prompt(description)
    Planner->>Planner: parse_intent() → detected slugs
    Planner-->>CLI: (system_prompt, user_msg)
    CLI->>Planner: call_llm(system, user_msg)
    Planner->>LLM: messages.create() / responses.create()
    LLM-->>Planner: JSON ConnectorSpec
    Planner-->>CLI: ConnectorSpec
    CLI->>CLI: validate_spec(spec, registry)
    opt confidence < 0.85
        CLI->>User: render_clarification(ambiguities)
        User-->>CLI: answers
        CLI->>Planner: refine(draft, answers)
        Planner->>LLM: refinement call
        LLM-->>Planner: updated ConnectorSpec
    end
    CLI->>Compiler: compile_connector(spec, output_dir)
    Compiler->>Disk: Write routes, schema, configs, Docker files
    Compiler-->>CLI: (project_dir, rationale)
    CLI->>User: _interactive_menu(project_dir, port)
```
Prompt To Fix All With AI
This is a comment left during a code review.
Path: setup.py
Line: 29-30

Comment:
**`openai` and `anthropic` forced as hard install dependencies**

Both AI SDK packages are in `install_requires`, so every `pip install workflows-cdk` pulls them in — even for users who only run `workflows validate` or `workflows run` and never touch AI features. Together they add ~100 MB of transitive dependencies and may conflict with existing SDK versions.

These should be optional extras, consistent with the PR description's claim that "AI keys and ngrok are optional for non-AI commands":

```suggestion
        "click>=8.0.0",
        "rich>=13.0.0",
    ],
    extras_require={
        "ai": [
            "openai>=1.0.0",
            "anthropic>=0.30.0",
        ],
    },
```

How can I resolve this? If you propose a fix, please make it concise.

---

This is a comment left during a code review.
Path: src/workflows_cdk/cli/main.py
Line: 1456-1462

Comment:
**Hardcoded `python` binary fails on Python 3-only systems**

`subprocess.run(["python", ...])` raises `FileNotFoundError` on systems (common on modern Linux/macOS) where only `python3` is on PATH. The background launcher `_start_main_py_background` already uses `sys.executable` correctly — this call should match it.

```suggestion
        try:
            subprocess.run([sys.executable, str(main_py)], cwd=str(project_dir))
        except KeyboardInterrupt:
```

How can I resolve this? If you propose a fix, please make it concise.

---

This is a comment left during a code review.
Path: src/workflows_cdk/ai/planner.py
Line: 40-41

Comment:
**PR description says Claude Haiku; code defaults to Claude Sonnet**

`DEFAULT_ANTHROPIC_MODEL = "claude-sonnet-4-6"` contradicts the PR description which explicitly states "Default Anthropic model set to **Claude Haiku** for faster iteration." Haiku is significantly cheaper and faster, so users relying on the stated default will be surprised by higher latency and cost.

How can I resolve this? If you propose a fix, please make it concise.

---

This is a comment left during a code review.
Path: src/workflows_cdk/ai/validator.py
Line: 61-62

Comment:
**Unused `manifest` variable in `_validate_actions` and `_validate_triggers`**

`manifest = registry.get(spec.app_type)` is assigned but never read in both functions (line 61 in `_validate_actions` and the equivalent in `_validate_triggers`). This is dead code that will generate linter warnings. Either remove it or add the registry-based validation that appears to be intended.

How can I resolve this? If you propose a fix, please make it concise.


Reviews (2). Last reviewed commit: "feat(cli): AI-assisted connectors, guide..."

Comment thread src/workflows_cdk/spec/compiler.py Outdated
Comment thread src/workflows_cdk/ai/planner.py Outdated
Comment thread src/workflows_cdk/cli/main.py Outdated
Comment thread src/workflows_cdk/cli/main.py
- Interactive workflows menu (create, update, validate, run, expose, docs, setup)
- Connector picker from nearby projects; clearer prompts; menu loops with pauses
- Post-generation next steps: run run_dev, auto ngrok with optional install, open docs
- Track CLI-started ngrok/background connector; prompt to stop on exit and after run
- Planner: Haiku default, connection-error retry, prompts without registry jargon
- Clarifier: one question at a time; validator/spec/template tweaks for modules
- dynamic_routing: stable project root (env, Flask root_path, cwd fallback)
- Compiler: template alignment (.gitignore, .env, README, entrypoint, no capability.yaml)
- Docker dev: WORKFLOWS_PROJECT_ROOT, gunicorn chdir, entrypoint cd, run_dev docker rm
- README: usage-focused quick start, menus, commands, ngrok/Studio tips

Made-with: Cursor
@AlexisFavre
Contributor

@greptile

Comment on lines +1456 to +1462:

```python
    _pause_return_to_menu(label="next steps menu")


def _run_connector(project_dir: Path) -> None:
    """Try to start the connector from *project_dir*."""
    run_dev = project_dir / "run_dev.sh"
    main_py = project_dir / "main.py"
```

P1 Hardcoded python binary fails on Python 3-only systems

subprocess.run(["python", ...]) raises FileNotFoundError on systems (common on modern Linux/macOS) where only python3 is on PATH. The background launcher _start_main_py_background already uses sys.executable correctly — this call should match it.

Suggested change:

```python
        try:
            subprocess.run([sys.executable, str(main_py)], cwd=str(project_dir))
        except KeyboardInterrupt:
```
Prompt To Fix With AI
This is a comment left during a code review.
Path: src/workflows_cdk/cli/main.py
Line: 1456-1462

Comment:
**Hardcoded `python` binary fails on Python 3-only systems**

`subprocess.run(["python", ...])` raises `FileNotFoundError` on systems (common on modern Linux/macOS) where only `python3` is on PATH. The background launcher `_start_main_py_background` already uses `sys.executable` correctly — this call should match it.

```suggestion
        try:
            subprocess.run([sys.executable, str(main_py)], cwd=str(project_dir))
        except KeyboardInterrupt:
```

How can I resolve this? If you propose a fix, please make it concise.
