A persistent memory service for AI agents that stores and manages conversation history, enabling agents to maintain context across sessions, replay conversations, fork conversations at any point, and perform semantic search across all conversations.
- Persistent conversation storage - All messages are stored with full context and metadata
- Resumable response streams - Long-lived response streams can survive client reconnects or the user switching devices
- Conversation replay/audit - Reconstruct conversation history and agent memory state as it was at any point in time
- Conversation forking - Fork a conversation at any message to explore alternative paths
- Access control - User-based ownership and sharing with fine-grained permissions
- Multi-database support - Works with PostgreSQL and MongoDB; local, Redis, and Infinispan caching; PGVector, Qdrant, and Infinispan for vector search
- Semantic & Full-Text Search - Search across all conversations using vector similarity, full-text search, or both
- File Attachments - Durably store files attached to messages sent to your agent, or generated by your agent
- Encrypted - All database and file storage data is encrypted at rest
This is a proof of concept (POC) currently under development.
The Memory Service includes an MCP server that lets AI coding assistants (Claude Code, Cursor, etc.) persist and search session notes across conversations.
Install:
```shell
go install github.com/chirino/memory-service/memory-service-mcp@latest
```

Or use the `mcp` subcommand of the main binary:

```shell
go install github.com/chirino/memory-service@latest
memory-service mcp remote
```

For an embedded local-memory setup:

```shell
memory-service mcp embedded --db-url ./memory.db
```

See memory-service-mcp/README.md for full configuration and usage details.
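Most MCP clients are configured by pointing them at a server command. As a rough sketch (the exact file name and schema depend on your client; Claude Desktop and Claude Code read an `mcpServers` map like the one below, and the `memory-service` entry name here is arbitrary), the embedded setup above could be registered as:

```json
{
  "mcpServers": {
    "memory-service": {
      "command": "memory-service",
      "args": ["mcp", "embedded", "--db-url", "./memory.db"]
    }
  }
}
```

Check your client's documentation for where this configuration file lives; see memory-service-mcp/README.md for the options each transport supports.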
Visit the Memory Service Documentation for complete guides:
- Getting Started - Deploy Memory Service using Docker Compose
- Core Concepts - Understanding conversations, messages, and forking
- Quarkus LangChain4j Integration - Integrate with Quarkus LangChain4j agents
- Spring AI Integration - Integrate with Spring AI agents
- Configuration - Service configuration reference
Apache 2.0
