Public beta

Give Your AI Agents
Context + Memory

Give your agents persistent context that grows, connects, and evolves with every conversation.

Multi-Provider · pip install · Self-hosted
quickstart.py
from contextmemory import create_table, Memory, SessionLocal

# Setup
create_table()
memory = Memory(SessionLocal())

# Add memories from a conversation
memory.add(messages=[
    {"role": "user", "content": "I love Python programming"}
], conversation_id=1)

# Search memories by meaning
results = memory.search("What does the user like?", conversation_id=1)
# e.g. surfaces the stored "I love Python programming" memory

Context, not just storage

Your AI agents deserve memory that understands relationships between ideas, not a flat key-value store.

Bubbles, not rows

Memories live as semantic facts and episodic bubbles. They connect automatically — no graph database needed.
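As a toy illustration of the idea (not contextmemory's actual internals), connected memory "bubbles" can be modeled with plain dictionaries, so traversing related facts needs no graph database. Every name below is hypothetical.

```python
# Hypothetical sketch: memory bubbles linked in plain Python dicts,
# showing why no graph database is required. Not contextmemory's API.
memories = {
    1: {"text": "User loves Python programming", "links": [2]},
    2: {"text": "User asked about FastAPI tutorials", "links": [1, 3]},
    3: {"text": "User deployed an app last week", "links": [2]},
}

def related(memory_id, depth=1):
    """Collect memory ids reachable within `depth` hops of a starting bubble."""
    seen, frontier = {memory_id}, [memory_id]
    for _ in range(depth):
        frontier = [n for m in frontier
                    for n in memories[m]["links"] if n not in seen]
        seen.update(frontier)
    return sorted(seen)

print(related(1, depth=2))  # [1, 2, 3]
```

A breadth-first walk over in-memory links like this is all the "graph" that small agent memories need.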

Search by meaning

FAISS-powered vector search finds related memories even when the words don't match. Fast, accurate, scalable.
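The principle behind meaning-based search can be sketched without FAISS: embeddings are compared by cosine similarity, so a query can match a memory that shares no words with it. The tiny hand-made vectors below stand in for real model-generated embeddings and are purely illustrative.

```python
import math

# Toy sketch of search-by-meaning: cosine similarity over embedding
# vectors. Production systems use FAISS and real embeddings; these
# three-dimensional hand-made vectors only illustrate the principle.
memories = {
    "I love Python programming": [0.9, 0.1, 0.0],
    "Dinner was pasta last night": [0.0, 0.2, 0.9],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# "What does the user like?" shares no words with the stored memory,
# but its (hypothetical) embedding lies nearby, so it still matches.
query = [0.8, 0.3, 0.1]
best = max(memories, key=lambda text: cosine(query, memories[text]))
print(best)  # I love Python programming
```

FAISS does the same ranking over millions of vectors with indexed nearest-neighbor search instead of a linear scan.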

Works anywhere

pip install and you're done. SQLite out of the box, PostgreSQL when you scale. OpenAI or Claude via OpenRouter.
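SQLite's zero-setup path can be seen with the Python standard library alone; the table name and schema below are hypothetical stand-ins, not contextmemory's own.

```python
import sqlite3

# SQLite needs no server: a single file (or ":memory:") is the database.
# The schema here is a hypothetical stand-in, not contextmemory's schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE memories (id INTEGER PRIMARY KEY, content TEXT)")
conn.execute("INSERT INTO memories (content) VALUES (?)",
             ("I love Python programming",))

rows = conn.execute("SELECT content FROM memories").fetchall()
print(rows)  # [('I love Python programming',)]
conn.close()
```

Scaling up later typically means swapping the connection string for a PostgreSQL one while the calling code stays the same.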