SmartMemory MCP Server

by MauriceIsrael
Developer Tools · Low Risk · 10.0 · MCP Registry · Local
Free

Server data from the Official MCP Registry

Neuro-symbolic memory for LLMs (POC)

About

Neuro-symbolic memory for LLMs (POC)

Security Report

10.0 · Low Risk

Valid MCP server (1 strong and 1 medium validity signal). No known CVEs in dependencies. Imported from the Official MCP Registry.

5 files analyzed · No issues found

Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.

Permissions Required

This plugin requests these system permissions. Most are normal for its category.

file_system

Check that this permission is expected for this type of plugin.

How to Install

Add this to your MCP configuration file:

{
  "mcpServers": {
    "mcp-server": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-smart-memory"
      ]
    }
  }
}

Documentation

View on GitHub

From the project's GitHub README.

SmartMemory

Give your LLM structured memory | Transform conversations into verified knowledge graphs


[!CAUTION] Proof of Concept Only: This project is an experimental implementation of a Neuro-Symbolic architecture. It is designed to demonstrate how LLMs can interact with knowledge graphs for rule learning. It is NOT intended for production or professional use. Use it for research, experimentation, and learning purposes only.


🚀 Quick Start

New user? → 5-Minute Quick Start Guide

Having issues? → Troubleshooting Guide

Need to configure? → Configuration Reference

Want to understand how it works? → Neuro-Symbolic Architecture | Technical Architecture

Looking for specific docs? → 📚 Documentation Index


🎯 What is SmartMemory?

SmartMemory enables your favorite LLM (Claude, Gemini, etc.) to remember facts, learn business rules, and deduce new information.

You can use it in two main ways:

1. 💬 Conversational Mode (The "Brain")

  • For: Individuals using LLM clients (Claude Desktop, etc.).
  • Goal: Have your assistant remember facts and learn logic naturally as you chat.
  • How: Configure it as an MCP server.
  • 👉 Go to Setup

2. ๐Ÿ—๏ธ Supervision Mode (The "Factory")

  • For: Teams, developers, or heavy users.
  • Goal: Extract thousands of rules from documents (PDFs) and visualize the knowledge graph.
  • How: Deploy the full Dashboard via Docker.
  • 👉 Go to Setup

💬 Mode 1: Conversational Setup (MCP)

This mode gives your LLM "long-term memory" and logical deduction capabilities.

Option A: Install via Docker (Recommended) 🐳

Best for: Everyone! No Python installation required.

The SmartMemory Docker image is available on GitHub Container Registry.

Simply add to your MCP client configuration:

For Claude Desktop, edit ~/Library/Application Support/Claude/claude_desktop_config.json:

{
  "mcpServers": {
    "smart-memory": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "ghcr.io/mauriceisrael/smart-memory:latest"]
    }
  }
}

For Gemini (Cline), edit ~/.cline/mcp_settings.json:

{
  "mcpServers": {
    "smart-memory": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "ghcr.io/mauriceisrael/smart-memory:latest"]
    }
  }
}

Restart your client and you're done! ✅


Option B: Local Server (Private) 🔒

Best for: Developers and privacy-conscious users who want to run from source.

Installation Steps (Local)
  1. Clone & Install

    git clone https://github.com/MauriceIsrael/SmartMemory
    cd SmartMemory
    python3 -m venv venv
    source venv/bin/activate
    pip install -e .
    
  2. Connect to Claude Desktop. Edit your configuration file (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):

    {
      "mcpServers": {
        "smartmemory": {
          "command": "/absolute/path/to/SmartMemory/venv/bin/python",
          "args": ["-m", "smart_memory.server"]
        }
      }
    }
    

    (Replace /absolute/path/... with your actual path)

  3. Chat! Restart Claude and try:

    "I know Bob. He goes to work by car. Can he vote?"

    See Interactive Demo below for what to expect.


๐Ÿ—๏ธ Mode 2: Supervision Setup (Docker)

This mode runs the Web Dashboard and API server. It is ideally suited for:

  • Visualizing the Knowledge Graph.
  • Extracting rules from documents (PDFs).
  • Hosting a shared memory server for a team.

Quick Start (Docker)

You don't need Python installed. Just Docker.

  1. Run the container

    For Dashboard mode (web interface), choose a provider:

    For Ollama (local):

    docker run -p 8080:8080 \
      -e LLM_PROVIDER=ollama \
      -e LLM_MODEL=llama3 \
      -e LLM_BASE_URL=http://172.17.0.1:11434 \
      -v $(pwd)/brain:/app/data \
      ghcr.io/mauriceisrael/smart-memory:latest dashboard
    

    For OpenAI:

    docker run -p 8080:8080 \
      -e LLM_PROVIDER=openai \
      -e LLM_MODEL=gpt-4 \
      -e LLM_API_KEY=your-api-key \
      -v $(pwd)/brain:/app/data \
      ghcr.io/mauriceisrael/smart-memory:latest dashboard
    

    (Note: Append dashboard at the end of the command to start the web server; without it, the container starts in MCP mode.)

    (The -v volume persists your knowledge graph and rules)

  2. Open the Dashboard. Go to http://localhost:8080
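
    If the page doesn't load, a quick sanity check from the terminal (assuming only that the dashboard listens on port 8080, as documented above):

    curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8080
    # An HTTP status (typically 200) means the dashboard is responding.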

LLM Configuration

SmartMemory uses an LLM to extract business rules from documents. You can configure it in either of two ways:

Option 1: Via Dashboard (Local Development)

  1. Go to Admin page
  2. Select your provider (Ollama, OpenAI, Anthropic, Google)
  3. Enter your configuration (API key or Ollama URL)
  4. Test connection
  5. Save

Option 2: Via Environment Variables (Docker)

Already shown above! Pass -e LLM_PROVIDER=... when starting Docker.
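
To keep API keys out of your shell history, the same variables can also be supplied through Docker's standard --env-file flag. The variable names below are the ones used in the examples above; the values are placeholders.

# .env (values are placeholders)
LLM_PROVIDER=openai
LLM_MODEL=gpt-4
LLM_API_KEY=your-api-key

docker run -p 8080:8080 \
  --env-file .env \
  -v $(pwd)/brain:/app/data \
  ghcr.io/mauriceisrael/smart-memory:latest dashboard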

Supported Providers:

  • Ollama (local, free): llama3, qwen2.5-coder, mistral
  • OpenAI: gpt-4, gpt-3.5-turbo
  • Anthropic: claude-3-5-sonnet
  • Google: gemini-1.5-pro

→ Full Configuration Guide

Advanced Deployment

We support deploying to Google Cloud Run, GitHub, etc. 👉 Read the Deployment Guide (DEPLOY.md) for full instructions on hosting it online.
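
As a rough sketch only (DEPLOY.md is the authoritative reference): Cloud Run deploys container images from Artifact Registry, so one generic path is to mirror the published image there and deploy it in dashboard mode. REGION, PROJECT_ID, and the containers repository name are placeholders, and Cloud Run's filesystem is ephemeral, so the -v persistence from the local examples does not apply.

# Mirror the published image into Artifact Registry
docker pull ghcr.io/mauriceisrael/smart-memory:latest
docker tag ghcr.io/mauriceisrael/smart-memory:latest \
  REGION-docker.pkg.dev/PROJECT_ID/containers/smart-memory:latest
docker push REGION-docker.pkg.dev/PROJECT_ID/containers/smart-memory:latest

# Deploy in dashboard mode; env vars mirror the docker run examples above
gcloud run deploy smart-memory \
  --image REGION-docker.pkg.dev/PROJECT_ID/containers/smart-memory:latest \
  --args dashboard \
  --port 8080 \
  --set-env-vars LLM_PROVIDER=openai,LLM_MODEL=gpt-4,LLM_API_KEY=your-api-key \
  --allow-unauthenticated \
  --region REGION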

Extracting Rules from Documents

Once your dashboard is running (or via CLI):

  1. Upload a PDF (e.g., "Company_Policy.pdf").
  2. Configure LLM: The server needs an LLM API key (OpenAI, Anthropic, or local Ollama) to read the doc.
  3. Review: The system will propose rules. You can approve them in bulk in the dashboard.

💬 Interactive Demo: From Facts to Rules

Here is what happens when you use Conversational Mode:

> I know Bob
LLM: ✦ I've processed your statement "I know Bob".

> He goes at work by car
LLM: ✦ I've noted that Bob goes to work by car.

> Can Bob vote?
LLM: ✦ I can't answer yet. But since he drives, he is likely an adult.
     May I add a rule: "Drivers are adults"?

> yes
LLM: ✨ Rule 'drivers_are_adults' added.
     Now, may I add a rule: "Adults can vote"?

> yes
LLM: ✨ Rule 'adults_can_vote' added.
     ✦ Therefore, yes, Bob can vote.
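
The stack below lists RDFLib and SPARQL/OWL for the symbolic side of this flow. As a toy illustration only (not SmartMemory's actual code or schema), the two rules from the demo could be expressed over an RDF graph like this; every name in the ex: namespace is invented for the example.

from rdflib import Graph, Namespace

EX = Namespace("http://example.org/")   # hypothetical vocabulary, for illustration only
g = Graph()
g.bind("ex", EX)

# Fact captured from the chat: "Bob goes to work by car"
g.add((EX.Bob, EX.commutesBy, EX.Car))

# Approved rule 'drivers_are_adults' as a SPARQL INSERT: car commuters are adults
g.update("""
    PREFIX ex: <http://example.org/>
    INSERT { ?p a ex:Adult }
    WHERE  { ?p ex:commutesBy ex:Car }
""")

# Approved rule 'adults_can_vote': adults can vote
g.update("""
    PREFIX ex: <http://example.org/>
    INSERT { ?p ex:canVote true }
    WHERE  { ?p a ex:Adult }
""")

# "Can Bob vote?" - the answer now follows from the two rules
result = g.query("""
    PREFIX ex: <http://example.org/>
    ASK { ex:Bob ex:canVote true }
""")
print(result.askAnswer)  # True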

๐Ÿ› ๏ธ Technical Stack

  • Backend: Python 3.11+, RDFLib, FastAPI
  • Frontend: SvelteKit, TypeScript, TailwindCSS
  • Reasoning: Neuro-Symbolic (LLM + SPARQL/OWL)
  • Protocol: Model Context Protocol (MCP)

📜 License

MIT License - see LICENSE

Reviews

No reviews yet

Be the first to review this server!

Links

Source Code

Details

Published February 24, 2026
Version 1.0.1
0 installs
Local Plugin

More Developer Tools MCP Servers

Git
Free
by Modelcontextprotocol · Developer Tools
Read, search, and manipulate Git repositories programmatically
80.0K Stars · 3 Installs · 6.5 Security · No ratings yet · Local

Toleno
Free
by Toleno · Developer Tools
Toleno Network MCP Server - Manage your Toleno mining account with Claude AI using natural language.
114 Stars · 407 Installs · 8.0 Security · 4.8 Rating · Local

mcp-creator-python
Free
by mcp-marketplace · Developer Tools
Create, build, and publish Python MCP servers to PyPI - conversationally.
- Stars · 55 Installs · 10.0 Security · 5.0 Rating · Local

MarkItDown
Free
by Microsoft · Content & Media
Convert files (PDF, Word, Excel, images, audio) to Markdown for LLM consumption
89.9K Stars · 15 Installs · 6.0 Security · 5.0 Rating · Local

mcp-creator-typescript
Free
by mcp-marketplace · Developer Tools
Scaffold, build, and publish TypeScript MCP servers to npm - conversationally
- Stars · 14 Installs · 10.0 Security · 5.0 Rating · Local

FinAgent
Free
by mcp-marketplace · Finance
Free stock data and market news for any MCP-compatible AI assistant.
- Stars · 13 Installs · 10.0 Security · No ratings yet · Local