SuperMemory gives your AI semantic memory that persists across sessions. Store knowledge, recall context, and build on past conversations.
Real examples of what SuperMemory does for your AI assistant.
Your AI learns your coding style, tool preferences, and workflow habits once — and remembers them forever.
Pick up exactly where you left off. Your AI recalls decisions, discussions, and context from previous sessions.
Teach your AI a procedure once — deploy scripts, debugging workflows, build steps — and it remembers how.
Architecture decisions, file conventions, API patterns — your AI keeps a living knowledge base of your project.
Remember past bugs, solutions, and workarounds. Your AI never solves the same problem twice from scratch.
Store research, meeting notes, reference material — anything you want your AI to know without re-explaining.
Built on modern retrieval technology for fast, accurate memory recall.
Memories are embedded as vectors using OpenAI or Gemini models. Search finds memories by meaning, not exact words — "deployment process" finds your CI/CD procedure even if you never used that phrase.
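The idea behind search-by-meaning can be sketched in a few lines: texts become vectors, and the closest vector wins by cosine similarity. This is an illustrative toy (the vectors below are made up; in SuperMemory the embeddings come from OpenAI or Gemini models), not SuperMemory's actual code:

```python
import math

def cosine(a, b):
    # Cosine similarity: dot product normalized by both vector lengths.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy "embeddings" -- real ones have hundreds of dimensions.
memories = {
    "CI/CD procedure: run tests, build image, push to registry": [0.9, 0.1, 0.2],
    "Team lunch is on Fridays": [0.1, 0.8, 0.3],
}
query_vec = [0.85, 0.15, 0.25]  # stands in for the embedding of "deployment process"

# The query never contains the words "CI/CD", yet the CI/CD memory is closest.
best = max(memories, key=lambda text: cosine(query_vec, memories[text]))
```

The phrase "deployment process" shares no words with the stored memory, but their vectors point the same way, so the lookup still finds it.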
A two-stage retrieval pipeline: fast vector similarity first, then cross-encoder reranking for precision. Returns the most relevant memories, not just the closest vectors.
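The two stages can be sketched as follows. Stage 1 is a cheap dot-product scan that narrows the index to a handful of candidates; stage 2 rescores query and memory text together, the way a cross-encoder does. The word-overlap scorer below is a stand-in for a real cross-encoder model, and the data is invented:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

index = [
    {"text": "deploy with docker compose up", "vec": [0.9, 0.1]},
    {"text": "weekly standup notes", "vec": [0.2, 0.9]},
    {"text": "deploy script uses docker buildx", "vec": [0.8, 0.3]},
]

def retrieve(query_text, query_vec, k=2, top_n=1):
    # Stage 1: fast vector similarity keeps only the k nearest memories.
    candidates = sorted(index, key=lambda m: dot(query_vec, m["vec"]), reverse=True)[:k]
    # Stage 2: rescore each survivor against the query text jointly.
    # A real pipeline would run a cross-encoder here; word overlap is a toy proxy.
    overlap = lambda m: len(set(query_text.split()) & set(m["text"].split()))
    return sorted(candidates, key=overlap, reverse=True)[:top_n]

result = retrieve("how do we deploy with docker", [0.9, 0.2])
```

Splitting the work this way keeps latency low: the expensive pairwise scoring only runs on the few candidates that survive the vector pass.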
Run locally with SQLite for full privacy, or deploy to the cloud with Firestore for access across devices. Same MCP interface either way — your AI doesn't need to know the difference.
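For the local mode, a minimal sketch of what SQLite-backed memory storage looks like is below. The table layout is a hypothetical illustration, not SuperMemory's actual on-disk schema:

```python
import json
import sqlite3

# Hypothetical schema: one row per memory, with the embedding stored as JSON.
db = sqlite3.connect(":memory:")  # a real deployment would use a file path
db.execute("CREATE TABLE memories (id INTEGER PRIMARY KEY, text TEXT, vec TEXT)")
db.execute(
    "INSERT INTO memories (text, vec) VALUES (?, ?)",
    ("deploy via GitHub Actions", json.dumps([0.1, 0.9])),
)

row = db.execute("SELECT text, vec FROM memories").fetchone()
text, vec = row[0], json.loads(row[1])
```

Because the MCP interface sits above the storage layer, swapping this for Firestore changes where the rows live, not what the AI client sees.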
Add SuperMemory to your AI client in seconds.
SuperMemory is an MCP server that gives any AI assistant persistent semantic memory. It runs alongside your AI client and exposes three tools for your assistant to call.
Your AI learns to use these tools naturally. Say "remember this for next time" and it stores a memory. Ask about something from weeks ago and it retrieves the relevant context.
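Under the Model Context Protocol, those tool invocations travel as JSON-RPC `tools/call` requests. A store request might look like the following; the tool name `store_memory` and its argument shape are illustrative assumptions here, not SuperMemory's documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "store_memory",
    "arguments": {
      "content": "We deploy with GitHub Actions on merge to main"
    }
  }
}
```

You never write these requests yourself; the AI client constructs them whenever the model decides a memory should be saved or looked up.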
Add SuperMemory to your Claude Desktop configuration file:
{
  "mcpServers": {
    "supermemory": {
      "command": "npx",
      "args": ["@nicepkg/supermemory"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
Connect SuperMemory to Claude on the web using the hosted server:
https://supermemory.13afoundry.com/mcp
The hosted server uses Firestore for cloud-persistent storage. Your memories are available across all sessions.
ChatGPT supports MCP servers through Developer Mode:
https://supermemory.13afoundry.com/mcp
Once connected, ChatGPT can store and retrieve memories across conversations.
Add SuperMemory to your Cursor MCP configuration:
{
  "mcpServers": {
    "supermemory": {
      "command": "npx",
      "args": ["@nicepkg/supermemory"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
SuperMemory works with any application that supports the Model Context Protocol.
For stdio-based clients (local): run the server with npx:
npx @nicepkg/supermemory
For HTTP-based clients (cloud): point to the hosted endpoint:
https://supermemory.13afoundry.com/mcp
Environment variables: set OPENAI_API_KEY for embeddings (or GEMINI_API_KEY with EMBEDDING_PROVIDER=gemini to use Google's models instead).
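For example, a local stdio configuration switched to Gemini embeddings might look like this (the key value is a placeholder for your own):

```json
{
  "mcpServers": {
    "supermemory": {
      "command": "npx",
      "args": ["@nicepkg/supermemory"],
      "env": {
        "GEMINI_API_KEY": "...",
        "EMBEDDING_PROVIDER": "gemini"
      }
    }
  }
}
```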