# ContextMemory

Long-term memory for AI conversations. ContextMemory extracts, stores, and retrieves important facts from conversations, enabling AI agents to remember user preferences, context, and history across sessions.
## Features

- **Dual Memory Types**: semantic facts + episodic bubbles
- **Fast Search**: FAISS-powered O(log n) lookup
- **Smart Updates**: automatic contradiction detection
- **Auto Connections**: bubbles link to related facts
- **Multi-Provider**: OpenAI or Claude
- **Flexible Storage**: SQLite or PostgreSQL
## Installation

```bash
pip install contextmemory
```

## Quick Start
### 1. OpenAI (Direct)

```python
from contextmemory import configure, create_table, Memory, SessionLocal

# Configure with OpenAI
configure(
    openai_api_key="sk-...",
    database_url="postgresql://...",  # Optional, defaults to SQLite
)

# Create tables
create_table()

# Use memory
db = SessionLocal()
memory = Memory(db)
```

### 2. OpenRouter (Claude, etc.)
```python
from contextmemory import configure, create_table, Memory, SessionLocal

# Configure with OpenRouter
configure(
    openrouter_api_key="sk-or-v1-...",
    llm_provider="openrouter",
    llm_model="anthropic/claude-sonnet-4.5",  # Or any OpenRouter model
    embedding_model="openai/text-embedding-3-small",
    database_url="postgresql://...",
)

create_table()
db = SessionLocal()
memory = Memory(db)
```

### ⚡ Environment Variables (Alternative)
```bash
# For OpenAI
export OPENAI_API_KEY="sk-..."

# For OpenRouter
export OPENROUTER_API_KEY="sk-or-v1-..."

# Optional
export DATABASE_URL="postgresql://..."
```

## Basic Usage
### Add Memories

```python
# Add memories from a conversation
messages = [
    {"role": "user", "content": "Hi, I'm Samiksha and I love Python programming"},
    {"role": "assistant", "content": "Nice to meet you! Python is great."},
]
result = memory.add(messages=messages, conversation_id=1)
# Returns: {'semantic': ['User is named Samiksha', 'User loves Python'], 'bubbles': []}
```

### Search Memories
```python
results = memory.search(
    query="What programming language does the user like?",
    conversation_id=1,
    limit=5,
)
print(results)
# {
#   'query': '...',
#   'results': [
#     {'memory_id': 1, 'memory': 'User loves Python programming', 'type': 'semantic', 'score': 0.89}
#   ]
# }
```

### Update & Delete
```python
# Update a memory
memory.update(memory_id=1, text="User is an expert Python developer")

# Delete a memory
memory.delete(memory_id=1)
```

## Memory Types
### Semantic Facts

Stable, long-term truths about the user:

- Name, preferences, skills
- Professional background
- Dietary preferences, relationships
### Episodic Bubbles

Time-bound moments with automatic connections:

- Current tasks, deadlines
- Active problems being solved
- Significant events
```python
# Bubbles auto-connect to related semantic facts
memory.add(
    messages=[
        {"role": "user", "content": "I'm debugging a FastAPI auth issue"},
        {"role": "assistant", "content": "Let me help with that."},
    ],
    conversation_id=1,
)
# Creates bubble: "User is debugging FastAPI auth issue"
# Auto-connects to: "User works on backend development"
```

## Full Example: Chat with Memory
```python
from openai import OpenAI
from contextmemory import configure, create_table, Memory, SessionLocal

# Configure
configure(
    openrouter_api_key="sk-or-v1-...",
    llm_provider="openrouter",
    llm_model="anthropic/claude-sonnet-4.5",
    embedding_model="openai/text-embedding-3-small",
    database_url="postgresql://...",
)
create_table()

# Initialize
chat_client = OpenAI(
    api_key="sk-or-v1-...",
    base_url="https://openrouter.ai/api/v1",
)
db = SessionLocal()
...
```

## Configuration Reference
| Parameter | Required | Default | Description |
|---|---|---|---|
| `openai_api_key` | Yes\* | - | OpenAI API key |
| `openrouter_api_key` | Yes\* | - | OpenRouter API key |
| `llm_provider` | No | `openai` | `openai` or `openrouter` |
| `llm_model` | No | `gpt-4o-mini` | LLM model for extraction |
| `embedding_model` | No | `text-embedding-3-small` | Embedding model |
| `database_url` | No | SQLite | PostgreSQL URL |
| `debug` | No | `False` | Enable debug logging |

\*One of `openai_api_key` or `openrouter_api_key` is required, depending on `llm_provider`.
## API Reference

- `configure(**kwargs)`: set global configuration. Call before any other operations.
- `create_table()`: create all required database tables. Idempotent.
- `Memory(db: Session)`: main memory interface. Methods:
  - `add(messages, conversation_id)` → extract & store memories
  - `search(query, conversation_id, limit)` → search memories
  - `update(memory_id, text)` → update a memory
  - `delete(memory_id)` → delete a memory
- `SessionLocal()`: create a new database session.
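These methods compose into a retrieve-then-generate loop: search for relevant memories, fold them into the system prompt, answer, then store the new exchange. A minimal sketch of the prompt-assembly step; the `format_memories` helper is hypothetical (not part of the library), and the result dict mirrors the `search()` output shape shown earlier:

```python
def format_memories(search_results: dict, max_items: int = 5) -> str:
    """Fold retrieved memories into a system-prompt fragment.

    Expects the shape returned by memory.search():
    {'query': ..., 'results': [{'memory': ..., 'score': ...}, ...]}
    """
    lines = ["Known facts about the user:"]
    for item in search_results.get("results", [])[:max_items]:
        lines.append(f"- {item['memory']} (relevance {item['score']:.2f})")
    return "\n".join(lines)

# In a real loop this sits between search and generation:
#   results = memory.search(query=user_msg, conversation_id=1, limit=5)
#   system_prompt = format_memories(results)
#   ...call chat_client, then memory.add() with the new exchange...

results = {
    "query": "What does the user like?",
    "results": [
        {"memory_id": 1, "memory": "User loves Python programming",
         "type": "semantic", "score": 0.89},
    ],
}
print(format_memories(results))
```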
## How It Works

### Contradiction Detection

A new fact that conflicts with a stored one ("I'm vegetarian" followed by "I eat meat") triggers a REPLACE of the old memory instead of storing both.
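The actual contradiction judgment is LLM-driven, but the control flow can be sketched with a stand-in predicate. Everything below is illustrative: `contradicts` is a toy keyword rule substituting for the LLM check, and `upsert` mimics the REPLACE-vs-ADD decision:

```python
def contradicts(old: str, new: str) -> bool:
    # Toy stand-in for the LLM contradiction check:
    # flags the vegetarian/meat example from the docs.
    return "vegetarian" in old.lower() and "meat" in new.lower()

def upsert(store: list[str], new_fact: str) -> str:
    """REPLACE a contradicted memory in place, otherwise ADD."""
    for i, old in enumerate(store):
        if contradicts(old, new_fact):
            store[i] = new_fact
            return "REPLACE"
    store.append(new_fact)
    return "ADD"

store = ["User is vegetarian", "User loves Python"]
op = upsert(store, "User eats meat")
# The contradicted fact is replaced in place; the other fact survives.
print(op, store)
```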
### FAISS Search

FAISS-backed O(log n) vector lookup replaces an O(n) loop over every stored memory.
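For intuition, here is the O(n) baseline that an index avoids: a brute-force cosine-similarity scan over stored embeddings. The 3-dimensional vectors are toy placeholders; a real deployment would store model embeddings and query them through FAISS:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def brute_force_search(query: list[float], index, top_k: int = 2) -> list[int]:
    """O(n): score every stored vector, then sort by similarity."""
    scored = [(cosine(query, vec), mem_id) for mem_id, vec in index]
    scored.sort(reverse=True)
    return [mem_id for _, mem_id in scored[:top_k]]

index = [
    (1, [0.9, 0.1, 0.0]),  # e.g. "User loves Python"
    (2, [0.0, 1.0, 0.2]),  # e.g. "User is vegetarian"
    (3, [0.8, 0.2, 0.1]),  # e.g. "User debugs FastAPI"
]
print(brute_force_search([1.0, 0.0, 0.0], index))  # → [1, 3]
```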
### Smart Extraction

Memories are extracted only from the latest interaction, not from the full conversation context.
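The slicing itself is straightforward; a sketch of how a caller might isolate the latest exchange before extraction (the `latest_exchange` helper is hypothetical, not a library function):

```python
def latest_exchange(messages: list[dict]) -> list[dict]:
    """Return the final user turn plus any assistant reply after it."""
    for i in range(len(messages) - 1, -1, -1):
        if messages[i]["role"] == "user":
            return messages[i:]
    return []

history = [
    {"role": "user", "content": "Hi, I'm Samiksha"},
    {"role": "assistant", "content": "Hello!"},
    {"role": "user", "content": "I'm debugging a FastAPI auth issue"},
    {"role": "assistant", "content": "Let me help with that."},
]
# Only the last user/assistant pair reaches extraction.
print(latest_exchange(history))
```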