# Memory

Give agents memory that persists.

MUXI's three-tier memory system lets agents remember context within conversations and across sessions.

New to memory? Read Memory Concepts → first to understand how the three-tier architecture works.

API Reference: `GET /v1/memory` | `DELETE /v1/memory/buffer`
## Memory Architecture

```
┌─────────────────────────────────────┐
│           Buffer Memory             │ ← Recent messages (fast)
│           ~50 messages              │
└─────────────────────────────────────┘
                  ↓
┌─────────────────────────────────────┐
│           Vector Search             │ ← Semantic similarity
│        Find related context         │
└─────────────────────────────────────┘
                  ↓
┌─────────────────────────────────────┐
│         Persistent Memory           │ ← Long-term storage
│        SQLite / PostgreSQL          │
└─────────────────────────────────────┘
```
## Quick Setup

### Conversation Memory (Default)

```yaml
memory:
  buffer:
    size: 50  # Keep 50 recent messages
```

### With Semantic Search

```yaml
memory:
  buffer:
    size: 50
    vector_search: true  # Find related past messages
```

### With Persistence

```yaml
memory:
  buffer:
    size: 50
    vector_search: true
  persistent:
    enabled: true
    provider: sqlite  # Survives restarts
```
## Buffer Memory

Stores recent conversation messages in memory:

```yaml
memory:
  buffer:
    size: 50        # Messages before summarization
    multiplier: 10  # Effective capacity: 500 messages
```

| Field | Default | Description |
|---|---|---|
| `size` | 50 | Messages to keep in full |
| `multiplier` | 10 | Summarized message capacity |
| `vector_search` | false | Enable semantic search |

When the buffer fills, older messages are automatically summarized to preserve context while saving space.
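The buffer-plus-summarization behavior can be sketched roughly as follows. This is an illustrative model, not MUXI's actual implementation: the `summarize` function here is a stand-in for a real LLM-based summarizer, and the exact eviction policy MUXI uses may differ.

```python
from collections import deque


def summarize(messages):
    # Stand-in for an LLM-based summarizer: condense a batch of
    # old messages into one short summary entry.
    return f"[summary of {len(messages)} messages]"


class BufferMemory:
    """Keeps the `size` most recent messages in full; older messages
    are folded into summaries (effective capacity ~ size * multiplier)."""

    def __init__(self, size=50, multiplier=10):
        self.size = size
        self.capacity = size * multiplier  # e.g. 500 messages total
        self.recent = deque()
        self.summaries = []

    def add(self, message):
        self.recent.append(message)
        if len(self.recent) > self.size:
            # Buffer is full: summarize the older half to save space
            # while preserving its context.
            old = [self.recent.popleft() for _ in range(self.size // 2)]
            self.summaries.append(summarize(old))

    def context(self):
        # Prompt context = summaries of old messages + recent ones in full.
        return self.summaries + list(self.recent)


buf = BufferMemory(size=4, multiplier=10)
for i in range(10):
    buf.add(f"msg {i}")
print(buf.context())
```

A fuller version would also evict the oldest summaries once the overall capacity is exceeded; the point here is only the two-level structure (full recent messages, summarized older ones).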
## Vector Search

Find semantically related past conversations:

```yaml
memory:
  buffer:
    vector_search: true
    embedding_model: openai/text-embedding-3-small
```

When enabled, MUXI:

- Embeds each message as a vector
- Searches for similar past interactions
- Includes relevant context in prompts

This helps agents recall related information even from distant conversations.
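The retrieval step above boils down to nearest-neighbor search over embeddings. A minimal sketch, using toy 3-dimensional vectors in place of real model output (an actual embedding model such as `text-embedding-3-small` returns high-dimensional vectors, and production systems use a vector index rather than a linear scan):

```python
import math


def cosine(a, b):
    # Cosine similarity: higher means more semantically similar.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)


# Toy "embeddings" of past messages, standing in for model output.
history = {
    "I prefer Python for scripting": [0.9, 0.1, 0.0],
    "The deploy runs on Fridays":    [0.0, 0.2, 0.9],
    "Use type hints in Python code": [0.8, 0.2, 0.1],
}


def search(query_vec, top_k=2):
    # Rank all stored messages by similarity to the query embedding
    # and return the top_k most relevant ones for the prompt.
    ranked = sorted(history.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [text for text, _ in ranked[:top_k]]


# A query vector close to the Python-related messages:
print(search([0.85, 0.15, 0.05]))
```

The two Python-related messages rank highest, so they would be included as context even if they occurred many conversations ago.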
## Persistent Memory

Save conversations across sessions:

```yaml
memory:
  persistent:
    enabled: true
    provider: sqlite
    # Stores in ~/.muxi/{formation_id}/memory.db
```

Best for: Single-user, local development

```yaml
memory:
  persistent:
    enabled: true
    provider: postgresql
    connection_string: ${{ secrets.POSTGRES_URI }}
```

Best for: Multi-user, production deployments
## Multi-User Memory

Isolate memory per user:

```yaml
memory:
  persistent:
    enabled: true
    provider: postgresql
    connection_string: ${{ secrets.POSTGRES_URI }}
    user_isolation: true
```

Pass a user ID in requests:

```bash
curl -X POST http://localhost:8001/v1/chat \
  -H "X-Muxi-User-Id: user_123" \
  -d '{"message": "Remember I prefer Python"}'
```

```python
response = formation.chat(
    "Remember I prefer Python",
    user_id="user_123"
)
```

```javascript
const response = await formation.chat('Remember I prefer Python', {
  userId: 'user_123'
});
```

```go
response, _ := formation.ChatWithOptions("Remember I prefer Python", muxi.ChatOptions{
    UserID: "user_123",
})
```

Each user's memory is completely isolated.
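Conceptually, isolation means every read and write is scoped to the caller's user ID. A minimal sketch of that idea, using an in-memory dict as a stand-in for MUXI's real SQLite/PostgreSQL store:

```python
from collections import defaultdict


class PersistentMemory:
    """Illustrative only: partitions stored messages by user ID so one
    user's history can never leak into another user's context."""

    def __init__(self):
        self._store = defaultdict(list)  # user_id -> list of messages

    def save(self, user_id, message):
        self._store[user_id].append(message)

    def load(self, user_id):
        # Only this user's partition is ever read when building context.
        return list(self._store[user_id])


mem = PersistentMemory()
mem.save("user_123", "Remember I prefer Python")
mem.save("user_456", "Remember I prefer Go")
print(mem.load("user_123"))  # only user_123's messages come back
```

In the real system the partition key comes from the `X-Muxi-User-Id` header (or the `user_id` option in the SDKs), and the store is the configured database rather than a dict.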
## Complete Configuration

```yaml
memory:
  # Buffer memory
  buffer:
    size: 50
    multiplier: 10
    vector_search: true
    embedding_model: openai/text-embedding-3-small

  # Working memory (tool outputs, intermediate state)
  working:
    max_memory_mb: 10
    fifo_interval_min: 5

  # Persistent storage
  persistent:
    enabled: true
    provider: postgresql
    connection_string: ${{ secrets.POSTGRES_URI }}
    user_isolation: true
```
## Disable Memory

For stateless interactions (no context between messages):

```yaml
memory:
  buffer:
    size: 0
  persistent:
    enabled: false
```
## How It Works

```mermaid
sequenceDiagram
    participant U as User
    participant M as MUXI
    participant B as Buffer
    participant V as Vector DB
    participant P as Persistent
    U->>M: New message
    M->>B: Load recent messages
    M->>V: Search similar context
    M->>P: Load user history
    M->>M: Build prompt with context
    M->>U: Response
    M->>B: Save to buffer
    M->>V: Index new message
    M->>P: Persist to database
```
## Next Steps

- Add Memory Guide - Step-by-step tutorial
- Multi-User Support - User isolation details
- Knowledge - Add document-based RAG