Server data from the Official MCP Registry
Find relevant Smart‑Thinking memories fast. Fetch full entries by ID to get complete context. Spee…
Remote endpoints: streamable-http: https://server.smithery.ai/@Leghis/smart-thinking/mcp
Valid MCP server (1 strong and 1 medium validity signal). No known CVEs in dependencies. Imported from the Official MCP Registry.
3 files analyzed · No issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
This plugin requests these system permissions. Most are normal for its category.
Available as Local & Remote
This plugin can run on your machine or connect to a hosted endpoint; you choose which during install.
From the project's GitHub README.
Smart-Thinking is a Model Context Protocol (MCP) server that delivers graph-based, multi-step reasoning without relying on external AI APIs. Everything happens locally: similarity search, heuristic-based scoring, verification tracking, memory, and visualization all run in a deterministic pipeline designed for transparency and reproducibility.
The reasoning pipeline runs in five steps:

1. The ReasoningOrchestrator initializes a session, restores any saved graph state, and prepares feature flags.
2. Each new thought is added as a node in the ThoughtGraph, linking to context, prior thoughts, and relevant memories.
3. The QualityEvaluator and MetricsCalculator compute weighted scores and traces that explain the decision path.
4. Results from the VerificationService and heuristic traces are attached to the node and propagated across connections.
5. Session state is persisted through the MemoryManager/VerificationMemory, and a structured MCP response is returned with a timeline of reasoning steps.

Each step is logged with structured metadata so you can visualize the reasoning fabric, audit decisions, and replay sessions deterministically.
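To make the flow concrete, here is a minimal TypeScript sketch of the pipeline described above. The component names mirror the README, but the node shape and scoring heuristic are illustrative stand-ins, not the project's actual implementation:

```typescript
// Illustrative sketch only: ThoughtGraph and the scoring heuristic
// are simplified stand-ins for the real Smart-Thinking components.

interface ThoughtNode {
  id: number;
  text: string;
  links: number[];   // edges to prior thoughts
  quality: number;   // heuristic score in [0, 1]
  verified: boolean;
}

class ThoughtGraph {
  nodes: ThoughtNode[] = [];

  // Step 2: add a thought as a node, linking it to every prior node (simplified).
  add(text: string): ThoughtNode {
    const node: ThoughtNode = {
      id: this.nodes.length,
      text,
      links: this.nodes.map((n) => n.id),
      quality: 0,
      verified: false,
    };
    this.nodes.push(node);
    return node;
  }
}

// Step 3 stand-in for QualityEvaluator/MetricsCalculator:
// a deterministic weighted heuristic (longer, better-connected thoughts score higher).
function evaluate(node: ThoughtNode): number {
  const lengthScore = Math.min(node.text.length / 100, 1);
  const linkScore = Math.min(node.links.length / 5, 1);
  return 0.7 * lengthScore + 0.3 * linkScore;
}

const graph = new ThoughtGraph();
const timeline: string[] = [];
for (const text of ["Define the problem", "Enumerate constraints", "Propose a solution"]) {
  const node = graph.add(text);
  node.quality = evaluate(node);        // step 3: score the node
  node.verified = node.quality > 0.1;   // step 4: VerificationService stand-in
  timeline.push(`#${node.id} "${text}" q=${node.quality.toFixed(2)}`);
}
// Step 5: return a timeline of reasoning steps (here, just print it).
console.log(timeline.join("\n"));
```

Because the scoring is pure and deterministic, replaying the same thoughts always yields the same graph and timeline, which is the property the real pipeline relies on for reproducibility.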
Smart-Thinking ships as an npm package compatible with Windows, macOS, and Linux.
npm install -g smart-thinking-mcp
npx -y smart-thinking-mcp
git clone https://github.com/Leghis/Smart-Thinking.git
cd Smart-Thinking
npm install
npm run build
npm link
Need platform-specific configuration details? See GUIDE_INSTALLATION.md for step-by-step instructions covering Windows, macOS, Linux, and Claude Desktop integration.
- smart-thinking-mcp — start the MCP server (globally installed package).
- npx -y smart-thinking-mcp — launch without a global install.
- npm run start — execute the built server from source.
- npm run demo:session — run the built-in CLI walkthrough that feeds sample thoughts through the reasoning pipeline and prints the resulting timeline.

The demo script showcases how the orchestrator adds nodes, evaluates heuristics, and records verification feedback step by step.
Smart-Thinking is validated across the most popular MCP clients and operating systems. Use the new connector mode (--mode=connector or SMART_THINKING_MODE=connector) when a client only accepts the search and fetch tools required by ChatGPT connectors.1
| Client | Transport | Notes |
|---|---|---|
| ChatGPT Connectors & Deep Research | HTTP + SSE | Deploy with SMART_THINKING_MODE=connector node build/index.js --transport=http --host 0.0.0.0 --port 8000. Point ChatGPT to https://<host>/sse and keep only search/fetch enabled, aligning with OpenAI’s remote MCP guidance.1 |
| OpenAI Codex CLI & Agents SDK | Streamable HTTP / SSE | Configure the Codex agent with http://localhost:3000/mcp or http://localhost:3000/sse and set SMART_THINKING_MODE=connector when only knowledge retrieval is needed.2 |
| Claude Desktop / Claude Code | stdio | Add "command": "smart-thinking-mcp" (or an npx command) to claude_desktop_config.json. Full toolset is available.3 |
| Cursor IDE | stdio / SSE / Streamable HTTP | Add the server to ~/.cursor/mcp.json or the project .cursor/mcp.json. Cursor supports prompts, roots, elicitation, and streaming.4 |
| Cline (VS Code) | stdio | Place the command in ~/Documents/Cline/MCP/smart-thinking.json or use the in-app marketplace to register the toolset.3 |
| Kilo Code | stdio | Register via the MCP marketplace and run the server locally; Smart-Thinking exposes deterministic tooling for autonomous edits.3 |
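For the Claude Desktop row above, the corresponding claude_desktop_config.json entry might look like the following. The `mcpServers` key follows Claude Desktop's documented schema; the server name `smart-thinking` is arbitrary:

```json
{
  "mcpServers": {
    "smart-thinking": {
      "command": "npx",
      "args": ["-y", "smart-thinking-mcp"]
    }
  }
}
```

Using npx here avoids a global install; if you installed the package globally, `"command": "smart-thinking-mcp"` with no args works as well.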
Need a minimal deployment footprint? Combine --transport=http --mode=connector with a reverse proxy (ngrok, fly.io, render, etc.) so remote clients can consume the server without exposing the full toolset.
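As a deployment sketch, the connector-mode command from the table above can be paired with ngrok to get a public endpoint (the ngrok subdomain shown is a placeholder; adjust host and port to your environment):

```shell
# Start Smart-Thinking in connector mode over HTTP (flags from the ChatGPT row above).
SMART_THINKING_MODE=connector node build/index.js --transport=http --host 0.0.0.0 --port 8000 &

# Expose it through a reverse proxy; point ChatGPT at the resulting https://<subdomain>/sse URL.
ngrok http 8000
```

This keeps only the search/fetch connector tools reachable remotely while the full toolset stays local.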
For registry scanners and fallback metadata extraction, Smart-Thinking also exposes:
GET /.well-known/mcp/server-card.json

Configuration is split across a few focused modules:

- feature-flags.ts toggles advanced behaviours such as external integrations (disabled by default) and verbose tracing.
- config.ts aligns platform-specific paths and verification thresholds.
- memory-manager.ts and verification-memory.ts store session graphs, metrics, and calculation results using deterministic JSON snapshots.

External integrations are gated behind the ToolIntegrator. To enable them:

export SMART_THINKING_ENABLE_EXTERNAL_TOOLS=true

While disabled, code-execution tools (executePython, executeJavaScript) and external tool calls return a local fallback result. FeatureFlags.externalLlmEnabled and FeatureFlags.externalEmbeddingEnabled remain disabled by default, so no remote LLM/embedding provider is required.

npm run build            # Compile TypeScript sources
npm run lint # ESLint across src/
npm run test # Jest test suite
npm run test:coverage # Jest coverage report
npm run watch # Incremental TypeScript compilation
See docs/modernisation-smart-thinking-v12-plan.md for the modernization checklist and rollout tracking.
Current test coverage: 80.47% statements, 81.59% lines, 84.34% functions, 63.48% branches. Run npm run lint and npm run test:coverage before each release candidate.

Contributions are welcome. Please open an issue or pull request describing the change, and run the quality checks above before submitting.
1. OpenAI, “Building MCP servers for ChatGPT and API integrations,” highlights that connectors require search and fetch tools for remote use. (https://platform.openai.com/docs/mcp)
2. OpenAI Agents SDK documentation on MCP transports (stdio, SSE, streamable HTTP). (https://openai.github.io/openai-agents-python/mcp/)
3. Model Context Protocol client catalogue listing Claude, Cline, Kilo Code, and other MCP-compatible applications. (https://modelcontextprotocol.io/clients)
4. Cursor documentation for configuring MCP servers via stdio/SSE/HTTP transports. (https://cursor.com/docs/context/mcp)