Documentation

Long-term memory for AI conversations

ContextMemory extracts, stores, and retrieves important facts from conversations, enabling AI agents to remember user preferences, context, and history across sessions.

Features

Dual Memory Types

Semantic facts + Episodic bubbles

Fast Search

FAISS-powered O(log n) lookup

Smart Updates

Auto contradiction detection

Auto Connections

Bubbles link to related facts

Multi-Provider

OpenAI or Claude

Flexible Storage

SQLite or PostgreSQL

Installation

terminal
pip install contextmemory

Quick Start

1. OpenAI (Direct)

quickstart_openai.py
from contextmemory import configure, create_table, Memory, SessionLocal

# Configure with OpenAI
configure(
    openai_api_key="sk-...",
    database_url="postgresql://...",  # Optional, defaults to SQLite
)

# Create tables
create_table()

# Use memory
db = SessionLocal()
memory = Memory(db)

2. OpenRouter (Claude, etc.)

quickstart_openrouter.py
from contextmemory import configure, create_table, Memory, SessionLocal

# Configure with OpenRouter
configure(
    openrouter_api_key="sk-or-v1-...",
    llm_provider="openrouter",
    llm_model="anthropic/claude-sonnet-4.5",  # Or any OpenRouter model
    embedding_model="openai/text-embedding-3-small",
    database_url="postgresql://...",
)

create_table()
db = SessionLocal()
memory = Memory(db)

Environment Variables (Alternative)

.env
# For OpenAI
export OPENAI_API_KEY="sk-..."

# For OpenRouter
export OPENROUTER_API_KEY="sk-or-v1-..."

# Optional
export DATABASE_URL="postgresql://..."

Basic Usage

Add Memories

add_memories.py
# Add memories from a conversation
messages = [
    {"role": "user", "content": "Hi, I'm Samiksha and I love Python programming"},
    {"role": "assistant", "content": "Nice to meet you! Python is great."},
]

result = memory.add(messages=messages, conversation_id=1)
# Returns: {'semantic': ['User is named Samiksha', 'User loves Python'], 'bubbles': []}

Search Memories

search_memories.py
results = memory.search(
    query="What programming language does the user like?",
    conversation_id=1,
    limit=5
)

print(results)
# {
#   'query': '...',
#   'results': [
#     {'memory_id': 1, 'memory': 'User loves Python programming', 'type': 'semantic', 'score': 0.89}
#   ]
# }
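The result dict can be folded straight into a prompt for the next model call. A minimal sketch (the result shape follows the documented example above; the hard-coded dict and prompt wording are illustrative):

```python
# Build a context preamble from search results. The dict mirrors the
# documented memory.search(...) return shape, hard-coded here so the
# sketch runs standalone.
results = {
    "query": "What programming language does the user like?",
    "results": [
        {"memory_id": 1, "memory": "User loves Python programming",
         "type": "semantic", "score": 0.89},
    ],
}

context = "\n".join(
    f"- {r['memory']} ({r['type']}, score {r['score']:.2f})"
    for r in results["results"]
)
prompt = f"Known about the user:\n{context}\n\nAnswer their question."
print(prompt)
```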

Update & Delete

update_delete.py
# Update a memory
memory.update(memory_id=1, text="User is an expert Python developer")

# Delete a memory
memory.delete(memory_id=1)

Memory Types

Semantic Facts

Stable, long-term truths about the user:

  • Name, preferences, skills
  • Professional background
  • Dietary preferences, relationships

Episodic Bubbles

Time-bound moments with automatic connections:

  • Current tasks, deadlines
  • Active problems being solved
  • Significant events

bubbles_example.py
# Bubbles auto-connect to related semantic facts
memory.add(
    messages=[
        {"role": "user", "content": "I'm debugging a FastAPI auth issue"},
        {"role": "assistant", "content": "Let me help with that."}
    ],
    conversation_id=1
)
# Creates bubble: "User is debugging FastAPI auth issue"
# Auto-connects to: "User works on backend development"

Full Example: Chat with Memory

chat_with_memory.py
from openai import OpenAI
from contextmemory import configure, create_table, Memory, SessionLocal

# Configure
configure(
    openrouter_api_key="sk-or-v1-...",
    llm_provider="openrouter",
    llm_model="anthropic/claude-sonnet-4.5",
    embedding_model="openai/text-embedding-3-small",
    database_url="postgresql://...",
)

create_table()

# Initialize
chat_client = OpenAI(
    api_key="sk-or-v1-...",
    base_url="https://openrouter.ai/api/v1"
)
db = SessionLocal()
...
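The elided part of the example is a retrieve-then-respond loop. Here is a runnable sketch of that control flow; the stub functions `search_memories`, `chat`, and `respond` are illustrative stand-ins (not part of the library) for `memory.search` and the OpenAI chat call, so the flow runs without API keys:

```python
# Sketch of the chat-with-memory loop. The two stubs stand in for
# Memory.search and chat_client.chat.completions.create so the control
# flow is runnable without credentials; swap in the real objects.

def search_memories(query):
    # stand-in for memory.search(query=..., conversation_id=..., limit=...)
    return [{"memory": "User loves Python programming", "score": 0.89}]

def chat(system_prompt, user_message):
    # stand-in for the actual chat completion request
    return f"(model reply, grounded in: {system_prompt.splitlines()[-1]})"

def respond(user_message):
    memories = search_memories(user_message)
    context = "\n".join(m["memory"] for m in memories)
    system_prompt = f"You know this about the user:\n{context}"
    reply = chat(system_prompt, user_message)
    # a real loop would now call memory.add(...) on the new exchange
    return reply

print(respond("Which language should I learn next?"))
```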

Configuration Reference

Parameter          | Required | Default                | Description
-------------------|----------|------------------------|--------------------------
openai_api_key     | Yes*     | -                      | OpenAI API key
openrouter_api_key | Yes*     | -                      | OpenRouter API key
llm_provider       | No       | openai                 | openai or openrouter
llm_model          | No       | gpt-4o-mini            | LLM model for extraction
embedding_model    | No       | text-embedding-3-small | Embedding model
database_url       | No       | SQLite                 | PostgreSQL URL
debug              | No       | False                  | Enable debug logging

*One of openai_api_key or openrouter_api_key is required, depending on llm_provider.

API Reference

configure(**kwargs)

Set global configuration. Call before any other operations.

create_table()

Create all required database tables. Idempotent.

Memory(db: Session)

Main memory interface. Methods:

  • add(messages, conversation_id) → Extract & store memories
  • search(query, conversation_id, limit) → Search memories
  • update(memory_id, text) → Update a memory
  • delete(memory_id) → Delete a memory

SessionLocal()

Create a new database session.

How It Works

User Message → Extraction (LLM) → Tool Classification (LLM) → FAISS Index + DB

  • Semantic Facts → classified as ADD/UPDATE/REPLACE/NOOP
  • Episodic Bubbles → Connection Finder links bubbles to related facts

Contradiction Detection

"I'm vegetarian" → "I eat meat" triggers REPLACE
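The library makes this classification with an LLM. As a toy illustration of the decision flow only (word-overlap similarity stands in for embeddings, and the threshold value is invented, not ContextMemory's internals):

```python
# Toy sketch of contradiction handling. A real system embeds facts and
# asks an LLM to classify; here Jaccard word overlap is a stand-in for
# embedding similarity, just to show the ADD-vs-REPLACE decision.

def similarity(a, b):
    """Jaccard overlap of lowercased word sets (toy stand-in for embeddings)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def classify(new_fact, existing, threshold=0.3):
    """Return (action, target): REPLACE the closest fact above threshold, else ADD."""
    best = max(existing, key=lambda f: similarity(new_fact, f), default=None)
    if best is not None and similarity(new_fact, best) >= threshold:
        return "REPLACE", best
    return "ADD", None

facts = ["User is vegetarian", "User loves Python"]
print(classify("User is not vegetarian anymore", facts))
```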

FAISS Search

O(log n) vector search instead of O(n) loops
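For intuition, the naive alternative is a brute-force scan that scores every stored embedding. A toy sketch of that O(n) scoring with made-up 3-dimensional vectors (real embeddings have hundreds of dimensions; FAISS replaces the scan with an index whose query cost grows sublinearly):

```python
import numpy as np

# Brute-force cosine scoring: an O(n) scan over every stored embedding.
# The three vectors are invented stand-ins for fact embeddings.
embeddings = np.array([
    [1.0, 0.0, 0.0],   # "User loves Python"
    [0.0, 1.0, 0.0],   # "User is vegetarian"
    [0.7, 0.7, 0.0],   # "User writes backend services"
], dtype="float32")
query = np.array([0.9, 0.1, 0.0], dtype="float32")

norms = np.linalg.norm(embeddings, axis=1) * np.linalg.norm(query)
scores = embeddings @ query / norms        # cosine similarity, one dot per row
best = int(np.argmax(scores))
print(best)  # → 0, the "loves Python" fact
```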

Smart Extraction

Extracts only from the latest interaction; earlier turns serve as context, not as new material
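In practice that means only the final user/assistant exchange is mined for new facts. A sketch of the idea (the two-message window is an assumption about the implementation, shown for illustration):

```python
# Earlier turns give the extractor context; only the latest exchange is
# mined for new memories. The [-2:] window is illustrative, not the
# library's actual internals.
messages = [
    {"role": "user", "content": "By the way, I work on backend services."},
    {"role": "assistant", "content": "Good to know!"},
    {"role": "user", "content": "I'm debugging a FastAPI auth issue."},
    {"role": "assistant", "content": "Let me help with that."},
]

latest_exchange = messages[-2:]   # the turn actually mined for facts
prior_context = messages[:-2]     # informs extraction, not re-extracted
```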