
Open Memory Protocol (OMP)

The open wire-level contract for AI memory. Switch editors, keep your memory.

  • 📖 Landing page (best entry point): https://quefly.com/oss/open-memory-protocol
  • 📜 Latest spec: spec/omp-v0.1.md (Editor's Draft v0.1)
  • 🛠 Reference implementation: memex (Apache 2.0)
  • 📄 Spec licence: CC BY 4.0
  • 💬 Discussion / RFC issues: github.com/queflyhq/open-memory-protocol/issues


What is OMP?

The Open Memory Protocol is a vendor-neutral wire-level standard for AI memory. It defines a five-verb surface — recall, remember, link, observe, validate — that any AI tool can implement on either side: as a client (an AI editor that reads/writes memory) or as a server (a memory store that other tools talk to).

Every AI coding tool today ships its own memory. CLAUDE.md for Claude Code. .cursorrules for Cursor. Project scratchpads in Windsurf. Switch tools and your context resets. Run two side by side and you maintain two divergent memories of the same project.

That's a wire-protocol problem. HTTP, SMTP, and IMAP solved analogous problems decades ago by separating the protocol from the client. OMP applies the same discipline to AI memory: one open spec, many implementations, one shared memory across the user's tools.


The contract in a paragraph

A conforming OMP server exposes five operations over a stable transport:

| Verb | What it does | p99 target |
|---|---|---|
| recall | Return a budget-bounded subgraph relevant to a query | < 50 ms |
| remember | Idempotently promote a fact / decision / constraint into durable memory | < 30 ms |
| link | Add a typed relationship between two concepts | < 30 ms |
| observe | Append an episodic event (fire-and-forget) | < 5 ms |
| validate | Check whether an intended action conflicts with stored constraints | < 20 ms |

Every concept carries provenance (source, confidence, created_at, last_confirmed_at, verification). Every error response carries the canonical envelope {error, message, retry_after, details}. Implementations choose their storage, retrieval algorithm, embedding model, and transport — the protocol fixes only the verb surface, the provenance schema, the latency targets, the error envelope, and the federation contract.

The spec is small enough to implement in a few hundred lines of any language.
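As an illustration of how small that surface is, the provenance fields and error envelope from the paragraph above can be modelled in a few lines of Python. The field names come from the spec; the dataclass and helper function themselves are a sketch, not normative shapes:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Provenance:
    # Provenance fields every concept carries (per the spec).
    source: str                # where the fact came from
    confidence: float          # e.g. 0.0 - 1.0 (illustrative range)
    created_at: str            # ISO 8601 timestamp
    last_confirmed_at: str     # ISO 8601 timestamp
    verification: str          # e.g. "unverified" (illustrative value)

def error_envelope(error: str, message: str,
                   retry_after: Optional[float] = None,
                   details: Optional[dict] = None) -> dict:
    # Canonical envelope for every non-success response.
    return {"error": error, "message": message,
            "retry_after": retry_after, "details": details or {}}
```

A rate-limited `recall`, for instance, might come back as `error_envelope("rate_limited", "Too many recall calls", retry_after=1.5)`.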




Why an open protocol (not a library, not a SaaS)

| Adjacent system | What it solves | What OMP adds |
|---|---|---|
| CLAUDE.md / .cursorrules | Per-tool persistent context | Cross-tool persistence + provenance + retrieval + safety gate |
| MCP | Tool / resource invocation standard | The memory layer beneath tool invocation (composes with MCP, doesn't replace it) |
| RAG libraries (LangChain, LlamaIndex) | Retrieval pattern inside one app | Wire-level contract between apps + typed concepts + lifecycle + safety |
| Vector DBs (Pinecone, Qdrant, Weaviate, Chroma) | Vector storage tier | Typed concepts + edges + episodic stream + verbs above the storage layer |
| Knowledge graphs (RDF, Neo4j) | Rich graph + formal query | First-class confidence + decay + episodic stream + latency contract |

See spec/omp-v0.1.md §4 Related Work for the full positioning.


Spec status

OMP v0.1 is an Editor's Draft. It is published under CC BY 4.0 by Quefly Enterprises LLP (LLPIN ACX-1059) for review and implementation. Two independent interoperable implementations are required before transition to Candidate Recommendation; the reference (memex) counts as one.

Maturity path (per §Status of This Document):

  1. Editor's Draft (we are here)
  2. Working Draft
  3. Candidate Recommendation
  4. Proposed Recommendation
  5. Recommendation

We may, in the future, submit OMP to a standards organisation (W3C, IETF, OASIS) for formal standardisation. Nothing in this document presumes that outcome — the spec is open under CC BY 4.0 regardless.


How to implement OMP

  1. Read the spec. spec/omp-v0.1.md. ~5000 words. Plain English with normative MUST / SHOULD / MAY per [RFC 2119 / BCP 14].
  2. Pick a transport. HTTP, gRPC, MCP, stdio, IPC — all valid. The verb shapes are transport-agnostic.
  3. Pick storage. DuckDB, Postgres, SQLite, Kuzu, RDF, embedded — all valid. The protocol does not specify a backend.
  4. Implement the five verbs per §6. The Python signatures in the spec are normative on shape; you may translate to your language idiomatically.
  5. Round-trip the provenance schema per §7. Every concept needs id, name, description, kind, source, confidence, created_at, last_confirmed_at, verification, metadata.
  6. Honour the error envelope per §11. {error, message, retry_after, details} for every non-success response.
  7. Run the conformance suite. conformance/ (under development). When all sections pass, you may publicly claim OMP v0.1 conformance.
  8. Add your implementation to implementations.md via PR.

A minimal Python server is ~300 lines. See examples/.
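The steps above can be sketched as a skeleton server. Everything here is illustrative: the class name, the in-memory storage, and the naive retrieval are assumptions, and the normative verb shapes live in §6 of the spec. It does show the five verbs, the provenance fields, and `remember`'s idempotency in one self-contained file:

```python
import time
import uuid

class InMemoryOMPServer:
    """Illustrative skeleton of the five OMP verbs over an in-memory store."""

    def __init__(self):
        self.concepts = {}      # id -> concept dict (with provenance fields)
        self.edges = []         # (src_id, relation, dst_id)
        self.events = []        # append-only episodic log
        self.constraints = []   # concepts of kind "constraint", checked by validate

    def remember(self, name, description, kind, source, confidence):
        # Idempotent on name: re-remembering refreshes last_confirmed_at.
        existing = next((c for c in self.concepts.values() if c["name"] == name), None)
        now = time.time()
        if existing:
            existing["last_confirmed_at"] = now
            return existing["id"]
        cid = str(uuid.uuid4())
        concept = {
            "id": cid, "name": name, "description": description, "kind": kind,
            "source": source, "confidence": confidence,
            "created_at": now, "last_confirmed_at": now,
            "verification": "unverified", "metadata": {},
        }
        self.concepts[cid] = concept
        if kind == "constraint":
            self.constraints.append(concept)
        return cid

    def link(self, src_id, relation, dst_id):
        # Typed relationship between two concepts.
        self.edges.append((src_id, relation, dst_id))

    def observe(self, event):
        # Fire-and-forget episodic append.
        self.events.append((time.time(), event))

    def recall(self, query, budget=5):
        # Naive substring retrieval, bounded by budget; real servers
        # would return a relevance-ranked subgraph.
        q = query.lower()
        hits = [c for c in self.concepts.values()
                if q in (c["name"] + " " + c["description"]).lower()]
        return hits[:budget]

    def validate(self, intended_action):
        # Return stored constraints that look related to the intended action.
        action = intended_action.lower()
        return [c for c in self.constraints
                if any(word in action for word in c["name"].lower().split())]
```

A real implementation would swap the dict for DuckDB/Postgres/etc. and the substring match for proper retrieval, but the verb surface stays exactly this small.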


Contributing

OMP welcomes:

  • Bug reports on the spec text (ambiguity, contradiction, edge cases not addressed)
  • Editorial corrections (typos, broken anchors, formatting)
  • Change proposals with rationale (open an Issue tagged change-request first; PRs against the spec only after the proposal is discussed)
  • Conformance tests that cover MUST-level requirements not yet tested
  • Implementations to list under implementations.md

See CONTRIBUTING.md for the full process. The protocol matures via consensus + working code; implementer feedback weighs heavily.


License

  • Spec text (the contents of spec/) — CC BY 4.0. Reproduce, translate, embed, and republish, provided you attribute Quefly Enterprises LLP (LLPIN ACX-1059) and preserve the licence notice.
  • Code in this repository (conformance tests, examples) — Apache License 2.0. See per-file headers.
  • Reference implementation (memex) — Apache License 2.0 with explicit patent grant.

Edited by Quefly Enterprises LLP · LLPIN ACX-1059 · Cuttack, Odisha, India
Landing: https://quefly.com/oss/open-memory-protocol
