Server data from the Official MCP Registry
Graph-based memory system for LLMs with knowledge graphs and semantic search
Valid MCP server (2 strong, 1 medium validity signals). 5 known CVEs in dependencies (0 critical, 3 high severity). Package registry verified. Imported from the Official MCP Registry.
7 files analyzed · 6 issues found
Set these up before or after installing:
Environment variable: MEMOGRAPH_VAULT
Add this to your MCP configuration file:
{
"mcpServers": {
"io-github-indhar01-memograph": {
"env": {
"MEMOGRAPH_VAULT": "your-memograph-vault-here"
},
"args": [
"-y",
"memograph-web-ui"
],
"command": "npx"
}
}
}
From the project's GitHub README.
A graph-based memory system for LLMs with intelligent retrieval. MemoGraph provides a powerful solution to the LLM memory problem by combining knowledge graphs, hybrid retrieval, and semantic search.
pip install memograph
Install with optional dependencies:
# For OpenAI support
pip install memograph[openai]
# For Anthropic Claude support
pip install memograph[anthropic]
# For Ollama support
pip install memograph[ollama]
# For embedding support
pip install memograph[embeddings]
# Install everything
pip install memograph[all]
from memograph import MemoryKernel, MemoryType
# Initialize the kernel attached to your vault path
kernel = MemoryKernel("~/my-vault")
# Ingest all notes in the vault
stats = kernel.ingest()
print(f"Indexed {stats['indexed']} memories.")
# Programmatically add a new memory
kernel.remember(
title="Meeting Note",
content="Decided to use BFS graph traversal for retrieval.",
memory_type=MemoryType.EPISODIC,
tags=["design", "retrieval"]
)
# Retrieve context for an LLM query
context = kernel.context_window(
query="how does retrieval work?",
tags=["retrieval"],
depth=2,
top_k=8
)
print(context)
MemoGraph includes a full-featured MCP server for seamless integration with AI assistants like Cline and Claude Desktop.
| Category | Tools | Description |
|---|---|---|
| Search | search_vault, query_with_context | Semantic search and context retrieval |
| Create | create_memory, import_document | Add memories and import documents |
| Read | list_memories, get_memory, get_vault_info | Browse and retrieve memories |
| Update | update_memory | Modify existing memories |
| Delete | delete_memory | Remove memories by ID |
| Analytics | get_vault_stats | Vault statistics and insights |
| Discovery | list_available_tools | List all available tools |
| Autonomous | auto_hook_query, auto_hook_response, configure_autonomous_mode, get_autonomous_config | Autonomous memory management |
| Graph | relate_memories, search_by_graph, find_path | Graph-native linking and traversal |
| Bulk | bulk_create | Create multiple memories in one call |
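Under the hood, an MCP client invokes these tools through the standard MCP `tools/call` JSON-RPC request. As a sketch, a `search_vault` call might look like this on the wire (the argument names `query` and `top_k` are illustrative assumptions, not confirmed against the server's tool schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_vault",
    "arguments": {
      "query": "graph retrieval",
      "top_k": 5
    }
  }
}
```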
Add to your ~/.cline/mcp_settings.json:
{
"mcp": {
"servers": {
"memograph": {
"command": "python",
"args": ["-m", "memograph.mcp.run_server"],
"env": {
"MEMOGRAPH_VAULT": "/path/to/your/vault"
}
}
}
}
}
Add to your claude_desktop_config.json:
{
"mcpServers": {
"memograph": {
"command": "python",
"args": ["-m", "memograph.mcp.run_server", "--vault", "/path/to/your/vault"]
}
}
}
NEW: MemoGraph is now available in the official MCP Registry!
# Install via MCP CLI (if available)
mcp install io.github.indhar01/memograph
# Or manually configure in your MCP client:
{
"mcpServers": {
"memograph": {
"command": "python",
"args": ["-m", "memograph.mcp.run_server"],
"env": {
"MEMOGRAPH_VAULT": "~/my-vault"
}
}
}
}
See MCP_REGISTRY_GUIDE.md for the benefits of the MCP Registry and a complete submission and configuration guide.
Once configured, use natural language with your AI assistant:
"Search my vault for memories about Python"
"Create a memory titled 'Project Ideas' with content '...'"
"Update memory abc-123 to have salience 0.9"
"Delete memory xyz-456"
"What tools are available?"
"Get vault statistics"
See CONFIG_REFERENCE.md for complete MCP configuration guide.
MemoGraph comes with a powerful CLI for managing your vault and chatting with it.
Index your markdown files into the graph database:
memograph --vault ~/my-vault ingest
Force re-indexing all files:
memograph --vault ~/my-vault ingest --force
Quickly add a memory from the command line:
memograph --vault ~/my-vault remember \
--title "Team Sync" \
--content "Discussed Q3 goals." \
--tags planning q3
Generate context for a query:
memograph --vault ~/my-vault context \
--query "What did we decide about the database?" \
--tags architecture \
--depth 2 \
--top-k 5
Start an interactive chat session with your vault context:
memograph --vault ~/my-vault ask --chat --provider ollama --model llama3
Or ask a single question:
memograph --vault ~/my-vault ask \
--query "Summarize our design decisions" \
--provider claude \
--model claude-3-5-sonnet-20240620
Check your environment and connection to LLM providers:
memograph --vault ~/my-vault doctor
MemoGraph supports multiple memory types inspired by cognitive science, such as episodic (event-based) and semantic (fact-based) memories, as used in the examples above.
The library uses BFS (Breadth-First Search) to traverse your knowledge graph:
# Retrieve nodes with depth=2 (2 hops from seed nodes)
nodes = kernel.retrieve_nodes(
query="graph algorithms",
depth=2, # Traverse up to 2 levels deep
top_k=10 # Return top 10 relevant memories
)
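The library's internal traversal isn't reproduced here, but a depth-limited BFS of the kind described above can be sketched as follows (the `bfs_neighborhood` helper and the toy adjacency dict are illustrative, not part of the memograph API):

```python
from collections import deque

def bfs_neighborhood(graph, seeds, depth):
    """Collect all nodes within `depth` hops of the seed nodes.

    `graph` is an adjacency dict: node -> iterable of linked nodes.
    """
    visited = set(seeds)
    frontier = deque((s, 0) for s in seeds)
    while frontier:
        node, dist = frontier.popleft()
        if dist == depth:
            continue  # stop expanding at the depth limit
        for neighbor in graph.get(node, ()):
            if neighbor not in visited:
                visited.add(neighbor)
                frontier.append((neighbor, dist + 1))
    return visited

# A tiny memory graph: links between note IDs
graph = {
    "bfs": ["graph", "retrieval"],
    "graph": ["storage"],
    "retrieval": ["ranking"],
    "storage": ["disk"],
}
print(sorted(bfs_neighborhood(graph, ["bfs"], depth=2)))
# depth=2 reaches 'storage' and 'ranking' but not 'disk' (3 hops away)
```

In a retrieval setting, the seeds would be the top semantic-search hits, and the traversal pulls in linked memories that pure similarity search would miss.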
Each memory has a salience score (0.0-1.0) that represents its importance:
---
title: "Critical Architecture Decision"
salience: 0.9
memory_type: semantic
---
We decided to use PostgreSQL for better ACID guarantees...
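To illustrate how a frontmatter salience score could factor into ranking, here is a minimal sketch; the `parse_salience` helper and the multiplicative weighting are illustrative assumptions, not MemoGraph's actual implementation:

```python
import re

def parse_salience(markdown_text, default=0.5):
    """Extract `salience:` from a YAML-style frontmatter block, if present."""
    match = re.match(r"^---\n(.*?)\n---", markdown_text, re.DOTALL)
    if not match:
        return default
    for line in match.group(1).splitlines():
        key, _, value = line.partition(":")
        if key.strip() == "salience":
            return float(value.strip())
    return default

note = """---
title: "Critical Architecture Decision"
salience: 0.9
memory_type: semantic
---
We decided to use PostgreSQL for better ACID guarantees..."""

similarity = 0.8                           # e.g. cosine similarity from semantic search
score = similarity * parse_salience(note)  # weight relevance by importance
print(round(score, 2))  # 0.72
```

With a scheme like this, a highly salient memory can outrank a slightly more similar but low-importance one.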
MemoGraph/
├── memograph/              # Main package
│   ├── core/               # Core functionality
│   │   ├── kernel.py       # Memory kernel
│   │   ├── graph.py        # Graph implementation
│   │   ├── retriever.py    # Hybrid retrieval
│   │   ├── indexer.py      # File indexing
│   │   └── parser.py       # Markdown parsing
│   ├── adapters/           # LLM and embedding adapters
│   │   ├── embeddings/     # Embedding providers
│   │   ├── frameworks/     # Framework integrations
│   │   └── llm/            # LLM providers
│   ├── storage/            # Storage and caching
│   ├── mcp/                # MCP server implementation
│   └── cli.py              # CLI implementation
├── tests/                  # Test suite
├── examples/               # Example usage
└── scripts/                # Utility scripts
We welcome contributions! Please see our Contributing Guide for details.
Clone the repository:
git clone https://github.com/Indhar01/MemoGraph.git
cd MemoGraph
Install in development mode:
pip install -e ".[all,dev]"
Install pre-commit hooks:
pre-commit install
Run tests:
pytest
We maintain high code quality standards.
See our Security Policy for reporting vulnerabilities.
This project is licensed under the MIT License - see the LICENSE file for details.
Inspired by the need for better memory management in LLM applications.
Current Version: 0.1.0 (Alpha - Marketplace Ready)
This project is in active development with a focus on code quality and stability.
Made with ❤️ for better LLM memory management