Open Source · Local First · Go CLI

Your LLM's
persistent memory

botmem gives your AI agents structured, queryable memory that survives between sessions. Facts, relationships, context — stored locally in SQLite.

$ go install github.com/stukennedy/botmem@latest

Four types of memory,
one unified context

Each memory type serves a different purpose — from always-on context blocks to a full knowledge graph. Query them individually or export everything as a single JSON payload.

Memory Blocks

Labeled, always-available context. Store user preferences, persona definitions, and active project state. Think of it as your LLM's working memory.

botmem block set persona "..."

Archival Memory

Long-term factual storage with full-text search (FTS5). Tag entries, and search by meaning with optional semantic embeddings via Ollama.

botmem archive search "preferences"

Knowledge Graph

Entity-relationship triplets that map how things connect. People, projects, concepts — linked through typed predicates you define.

botmem graph query "Stuart"
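The triplet model is simple enough to sketch in a few lines of Go. The struct and field names below are illustrative only, not botmem's actual schema; the in-memory `query` helper just mimics what a graph lookup by entity name does conceptually:

```go
package main

import "fmt"

// Triplet is an illustrative subject--predicate--object record.
// Field names are hypothetical, not botmem's internal schema.
type Triplet struct {
	Subject, Predicate, Object string
}

// query returns every triplet that mentions the given entity,
// sketching what a graph lookup like `botmem graph query "Stuart"`
// does conceptually.
func query(graph []Triplet, entity string) []Triplet {
	var out []Triplet
	for _, t := range graph {
		if t.Subject == entity || t.Object == entity {
			out = append(out, t)
		}
	}
	return out
}

func main() {
	graph := []Triplet{
		{"Stuart", "works_on", "Moltbot"},
		{"Stuart", "prefers", "Go"},
		{"Moltbot", "written_in", "Go"},
	}
	for _, t := range query(graph, "Stuart") {
		fmt.Printf("%s %s %s\n", t.Subject, t.Predicate, t.Object)
	}
}
```

Because predicates are plain strings you define, the same structure covers people, projects, and concepts without a fixed ontology.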

Conversation Summaries

Hierarchical summaries at multiple abstraction levels. Compress long conversations into progressively higher-level overviews.

botmem summary list --level 0

Three commands.
Persistent memory.

1

Initialize

botmem init

Interactive setup wizard. Pick your LLM provider — Claude Code, Anthropic API, or Ollama for fully local operation.

2

Ingest

botmem ingest

Pipe in conversation text. The LLM automatically extracts facts, relationships, block updates, and summaries.

3

Recall

botmem context

Export your full memory as a structured JSON payload, ready to inject into any LLM system prompt.

Built for developers
who build agents

Local-first

Everything stored in a single SQLite file at ~/.botmem/. No cloud, no accounts, no tracking.

Multi-provider

Works with Claude Code, Anthropic API, or Ollama. Use local models for complete privacy.

Smart extraction

LLM-powered ingest automatically structures conversations into facts, relations, and summaries.

Full-text search

FTS5 index with optional semantic embeddings via Ollama's nomic-embed-text model.

JSON context export

One command to export your entire memory as a JSON payload ready for system prompt injection.

No lock-in

Standard SQLite database. Query it directly, back it up, migrate it. Your data, your rules.

Up and running
in sixty seconds

~/.botmem
# Install botmem
$ go install github.com/stukennedy/botmem@latest
# Run the setup wizard
$ botmem init
✓ Provider: claude-code
✓ Database: ~/.botmem/botmem.db
# Set up core memory
$ botmem block set human "Developer working on Discord bots"
# Ingest a conversation
$ botmem ingest "User prefers Go, works on Moltbot"
✓ 2 facts extracted
✓ 3 relations added
✓ 1 summary created
# Get full context for your LLM
$ botmem context
{"core_blocks": [...], "summaries": [...], "relations": [...]}