English | 简体中文

AIDONE

Don't trust "done". Ask for evidence.

AIDONE is a product-facing acceptance protocol for AI-generated code. It helps founders, product people, designers, creators, and solo builders verify that AI-coded software is safe, maintainable, testable, and ready to hand to users.

The core file is AIDONE.md: a short checklist you can put in a project root and ask your AI coding agent to follow before it claims work is done.

Before and After

Without AIDONE, an AI agent may end a task like this:

Done. The settings page is implemented and ready to use.

With AIDONE, the same handoff should look more like this:

What the user sees: from the Settings page, users can now export their data as a JSON file.
Files touched: SettingsView.tsx, db/export.ts, copy.ts.
AIDONE check:
- Safety (P0): pass. No passwords or keys added to the code; bad input won't crash the app.
- Product quality (P1): one gap — "clear data" is a one-tap delete and still needs a second confirmation.
- Risk (P2): not applicable (this is a local export, not a production change).
Checks run: project builds. Style check fails on 2 older files, unrelated to this change.
How to verify yourself: open Settings -> Export Data, confirm the JSON file downloads.
Remaining risk: "clear data" is a one-way delete — should be tightened before real users see it.

The goal is not more words. The AI should brief you the way an engineer briefs a non-technical founder: what changed, what was checked, and what risk remains.

Why This Exists

AI can write code that runs.

That does not mean the work is acceptable.

A feature can appear to work while still missing basic product engineering defaults:

  • user-facing text is hardcoded instead of localizable
  • loading, empty, error, and success states are missing
  • secrets or URLs are hardcoded
  • permissions are only enforced in the UI
  • invalid inputs crash the app
  • error screens expose stack traces
  • tests are missing
  • "ship-ready" is claimed without build, lint, or test evidence

AIDONE turns those hidden engineering expectations into explicit acceptance checks.

Acceptance, Not Code Review

AIDONE is not another engineering best-practices list.

Code review asks: "Is this code well written?"

AIDONE asks: "Can a product owner accept this AI-generated work without reading every line of code?"

That is a different job. AIDONE turns engineering concerns into product-facing questions:

  • If we change language later, where does copy live?
  • If the API fails, what does the user see?
  • If a normal user tries the admin path, what blocks them?
  • If the agent says tests pass, which command proves it?
  • If a destructive action exists, what prevents accidental damage?

Who It Is For

Use AIDONE if you:

  • vibe code with AI agents
  • do not review every line of code yourself
  • are a founder, product person, designer, creator, or non-specialist builder
  • still need professional software defaults
  • want evidence instead of confident completion claims

It is not a replacement for senior engineering review. It is a practical acceptance bar that makes AI agents report risk, evidence, and skipped checks.

Quick Start

The easiest way: hand this repo URL to your AI tool and let it install itself.

Install https://github.com/nanami-he/aidone into my current project.

Claude Code, Cursor, Codex, Aider, and similar tools can read a GitHub URL, fetch the AIDONE files, drop them where they belong, and add the usage prompt to your AGENTS.md / CLAUDE.md.

If you prefer to do it manually:

Copy AIDONE.md into your project root.

Then add this to your AGENTS.md, CLAUDE.md, Cursor rules, or project instructions:

Before claiming any coding task is done, follow `AIDONE.md`.

Use P0 and P1 by default.
Apply P2 when the task touches production, migrations, external services, background jobs, payments, auth, or user data.
If a check cannot be run, state why and provide the closest available verification.
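
If you prefer to script that wiring, the append step might look like the sketch below. It assumes you use an `AGENTS.md` file and have already copied `AIDONE.md` into the project root; adapt the target filename for `CLAUDE.md` or Cursor rules.

```shell
# Append the AIDONE usage prompt to AGENTS.md (creates the file if missing).
# Assumes AIDONE.md has already been copied into the project root.
cat >> AGENTS.md <<'EOF'
Before claiming any coding task is done, follow `AIDONE.md`.

Use P0 and P1 by default.
Apply P2 when the task touches production, migrations, external services,
background jobs, payments, auth, or user data.
If a check cannot be run, state why and provide the closest available verification.
EOF
```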

When asking an AI agent to review code:

Review this project with AIDONE. Do not only check whether the feature runs. Report P0, P1, and relevant P2 gaps with file references and verification evidence.

When asking an AI agent to implement code:

Implement this using AIDONE. At the end, report files changed, checks run, P0/P1 acceptance status, manual verification steps, and remaining risk.

For short replies, ask for the compact handoff:

Use AIDONE, but keep the final handoff compact: evidence, gaps, manual check.

Expected shape:

Evidence: pnpm build passed; pnpm lint failed on 2 existing files.
Gaps: no tests; destructive action still needs second confirmation.
Manual check: Settings -> Export Data downloads JSON.

The File Set

```
AIDONE.md                     Core acceptance protocol
llms.txt                      LLM-readable project index
examples/AGENTS.md            Example agent instruction
examples/CLAUDE.md            Example Claude Code instruction
examples/review-prompt.md     Review prompt
examples/delivery-template.md Final delivery template
checklists/                   Optional deep-dive checklists
references/sources.md         Source map and related standards
```

Priority Levels

| Level | Meaning | Use when |
| --- | --- | --- |
| P0 | Always required | Every non-trivial code change, including prototypes |
| P1 | Default product bar | Normal product work, UI, API, auth, data, settings, workflows |
| P2 | Relevant for risk | Production, migrations, background jobs, external services, scale |

Related Standards

AIDONE fits into the emerging set of AI-era project files:

| File | Audience | Purpose |
| --- | --- | --- |
| README.md | Humans | What this project is |
| AGENTS.md | Coding agents | How to work in this repository |
| DESIGN.md | Design agents | How the product should look and feel |
| AIDONE.md | AI coding agents and reviewers | What "done" must prove |

License

MIT
