Server data from the Official MCP Registry
Contemplative reasoning with Lotus Sutra wisdom framework and ext-apps visualization.
Valid MCP server (2 strong, 1 medium validity signals). 4 known CVEs in dependencies (0 critical, 3 high severity). Package registry verified. Imported from the Official MCP Registry. Trust signals: trusted author (5/6 approved).
7 files analyzed · 5 issues found
Add this to your MCP configuration file:
{
"mcpServers": {
"io-github-linxule-lotus-wisdom": {
"args": [
"-y",
"app"
],
"command": "npx"
}
}
}

From the project's GitHub README.
An MCP server implementation that provides a tool for problem-solving using the Lotus Sutra's wisdom framework, combining analytical thinking with intuitive wisdom.
Available at: https://lotus-wisdom-mcp.linxule.workers.dev/mcp
This MCP server was developed from the Lotus OS prompt, which was designed to implement a cognitive framework based on the Lotus Sutra. The MCP server format makes this framework more accessible and easier to use with Claude and other AI assistants.
Note: The original prompt framework may work less effectively with newer Claude models, but this MCP server implementation provides consistent functionality across model versions.
The server implements a structured thinking process using wisdom domains inspired by the Lotus Sutra:
The server organizes thoughts using wisdom domains (all valid values for the tag input parameter):
Entry (🚪): begin
Skillful Means (🔆): upaya, expedient, direct, gradual, sudden
Non-Dual Recognition (☯️): recognize, transform, integrate, transcend, embody
Meta-Cognitive (🧠): examine, reflect, verify, refine, complete
Process Flow (🌊): open, engage, express
Meditation (🧘): meditate
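The tag-to-domain mapping above can be sketched as a simple lookup table. The tag and domain names come from this documentation; the helper function is illustrative and not the server's actual code:

```javascript
// Illustrative lookup table mirroring the documented wisdom domains.
const WISDOM_DOMAINS = {
  begin: "Entry",
  upaya: "Skillful Means", expedient: "Skillful Means", direct: "Skillful Means",
  gradual: "Skillful Means", sudden: "Skillful Means",
  recognize: "Non-Dual Recognition", transform: "Non-Dual Recognition",
  integrate: "Non-Dual Recognition", transcend: "Non-Dual Recognition",
  embody: "Non-Dual Recognition",
  examine: "Meta-Cognitive", reflect: "Meta-Cognitive", verify: "Meta-Cognitive",
  refine: "Meta-Cognitive", complete: "Meta-Cognitive",
  open: "Process Flow", engage: "Process Flow", express: "Process Flow",
  meditate: "Meditation",
};

// Returns the domain for a tag, or throws for a tag outside the set above.
function domainForTag(tag) {
  const domain = WISDOM_DOMAINS[tag];
  if (!domain) throw new Error(`Unknown tag: ${tag}`);
  return domain;
}
```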
Each thought is beautifully formatted with:
Note: The visualization appears in the server console output, helping developers track the thinking process.
A tool for problem-solving using the Lotus Sutra's wisdom framework, with various approaches to understanding.
Begin your journey with tag='begin' - this returns the full framework (philosophy, domains, guidance) to ground your contemplation. Then continue with the other tags.
Inputs:
tag (string, required): The current processing technique (must be one of the tags listed above)
content (string, required): The content of the current processing step
stepNumber (integer, required): Current number in sequence
totalSteps (integer, required): Estimated total steps needed
nextStepNeeded (boolean, required): Whether another step is needed
isMeditation (boolean, optional): Whether this step is a meditative pause
meditationDuration (integer, optional): Duration for meditation in seconds (1-10)

Returns:
MEDITATION_COMPLETE status for meditation steps
WISDOM_READY status when the contemplative process is complete

Get a summary of the current contemplative journey.
Inputs: None
Returns:
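The input schema for the lotuswisdom tool listed above can be checked client-side before calling it. The sketch below is illustrative only (the server performs its own validation, and since v0.4.0 a bare begin call auto-fills the other parameters):

```javascript
// Illustrative validation of a lotuswisdom step, based on the documented schema.
const VALID_TAGS = [
  "begin", "upaya", "expedient", "direct", "gradual", "sudden",
  "recognize", "transform", "integrate", "transcend", "embody",
  "examine", "reflect", "verify", "refine", "complete",
  "open", "engage", "express", "meditate",
];

// Returns a list of schema violations; an empty list means the step looks valid.
function validateStep(step) {
  const errors = [];
  if (!VALID_TAGS.includes(step.tag)) errors.push("tag must be one of the documented tags");
  if (typeof step.content !== "string") errors.push("content must be a string");
  if (!Number.isInteger(step.stepNumber) || step.stepNumber < 1) errors.push("stepNumber must be a positive integer");
  if (!Number.isInteger(step.totalSteps)) errors.push("totalSteps must be an integer");
  if (typeof step.nextStepNeeded !== "boolean") errors.push("nextStepNeeded must be a boolean");
  if (step.meditationDuration !== undefined &&
      (!Number.isInteger(step.meditationDuration) ||
       step.meditationDuration < 1 || step.meditationDuration > 10)) {
    errors.push("meditationDuration must be an integer from 1 to 10");
  }
  return errors;
}
```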
The Lotus Wisdom tool is designed for:
Here's how a conversation with Claude might flow when using the Lotus Wisdom MCP server:
User: "Help me understand the relationship between freedom and responsibility."
Claude would begin the journey with tag='begin' to receive the framework, then continue:
{
"tag": "begin",
"content": "Entering contemplation on freedom and responsibility.",
"stepNumber": 1,
"totalSteps": 6,
"nextStepNeeded": true
}
→ Returns FRAMEWORK_RECEIVED with full framework
{
"tag": "open",
"content": "The question explores the relationship between freedom and responsibility, which contain an apparent tension but also deep connection.",
"stepNumber": 2,
"totalSteps": 6,
"nextStepNeeded": true
}
{
"tag": "direct",
"content": "Freedom and responsibility are two sides of the same coin. True freedom isn't absence of constraints but the capacity to choose our response within constraints.",
"stepNumber": 3,
"totalSteps": 6,
"nextStepNeeded": true
}
{
"tag": "meditate",
"content": "Contemplating how freedom without responsibility becomes chaos, and responsibility without freedom becomes oppression.",
"stepNumber": 4,
"totalSteps": 6,
"nextStepNeeded": true,
"isMeditation": true
}
{
"tag": "integrate",
"content": "Freedom and responsibility mutually enable each other. Our freedom to choose gives rise to our responsibility for what we choose, and our willingness to take responsibility expands our freedom.",
"stepNumber": 5,
"totalSteps": 6,
"nextStepNeeded": true
}
{
"tag": "express",
"content": "The paradox resolves when we see that authentic freedom includes responsibility as its natural expression.",
"stepNumber": 6,
"totalSteps": 6,
"nextStepNeeded": false
}
When the tool returns status: 'WISDOM_READY', Claude then speaks the final wisdom naturally, integrating all the insights from the contemplative journey.
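The journey above can be replayed as a simple client loop: keep sending steps until nextStepNeeded is false. In this sketch, sendStep is a hypothetical stand-in for the real MCP tool call that just simulates the documented statuses:

```javascript
// Sketch of the client-side loop from the example above.
// sendStep stands in for the real MCP tool call; STEP_RECORDED is a
// hypothetical intermediate status, not taken from the server docs.
function sendStep(step) {
  if (step.tag === "begin") return { status: "FRAMEWORK_RECEIVED" };
  if (step.isMeditation) return { status: "MEDITATION_COMPLETE" };
  if (!step.nextStepNeeded) return { status: "WISDOM_READY" };
  return { status: "STEP_RECORDED" };
}

// Runs steps in order, stopping once a step declares the journey complete.
function runJourney(steps) {
  const statuses = [];
  for (const step of steps) {
    statuses.push(sendStep(step).status);
    if (!step.nextStepNeeded) break;
  }
  return statuses;
}
```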
Install via Smithery for one-click setup, or follow the manual instructions below.
Requires Node.js 18+. The server runs locally via npx.
# Claude Code
claude mcp add lotus-wisdom -- npx -y lotus-wisdom-mcp
# Codex CLI (OpenAI)
codex mcp add lotus-wisdom -- npx -y lotus-wisdom-mcp
# Gemini CLI (Google)
gemini mcp add lotus-wisdom npx -y lotus-wisdom-mcp
Add to your claude_desktop_config.json:
| OS | Config path |
|---|---|
| macOS | ~/Library/Application Support/Claude/claude_desktop_config.json |
| Windows | %APPDATA%\Claude\claude_desktop_config.json |
| Linux | ~/.config/Claude/claude_desktop_config.json |
{
"mcpServers": {
"lotus-wisdom": {
"command": "npx",
"args": ["-y", "lotus-wisdom-mcp"]
}
}
}
Add to .vscode/mcp.json (workspace) or open Command Palette > MCP: Open User Configuration (global):
{
"servers": {
"lotus-wisdom": {
"command": "npx",
"args": ["-y", "lotus-wisdom-mcp"]
}
}
}
Note: VS Code uses
"servers"as the top-level key, not"mcpServers". Other VS Code forks (Trae, Void, PearAI, etc.) typically use this same format.
Add to ~/.cursor/mcp.json (global) or .cursor/mcp.json (project):
{
"mcpServers": {
"lotus-wisdom": {
"command": "npx",
"args": ["-y", "lotus-wisdom-mcp"]
}
}
}
Add to ~/.codeium/windsurf/mcp_config.json (Windows: %USERPROFILE%\.codeium\windsurf\mcp_config.json):
{
"mcpServers": {
"lotus-wisdom": {
"command": "npx",
"args": ["-y", "lotus-wisdom-mcp"]
}
}
}
Click the MCP Servers icon in the Cline panel > Configure > Advanced MCP Settings, then add:
{
"mcpServers": {
"lotus-wisdom": {
"command": "npx",
"args": ["-y", "lotus-wisdom-mcp"]
}
}
}
In Settings > MCP Servers > Add Server, set Type to STDIO, Command to npx, Args to -y lotus-wisdom-mcp. Or paste in JSON/Code mode:
{
"lotus-wisdom": {
"name": "Lotus Wisdom",
"command": "npx",
"args": ["-y", "lotus-wisdom-mcp"],
"isActive": true
}
}
In Settings > MCP Servers, add a new server with Type: stdio, Command: npx, Args: -y lotus-wisdom-mcp.
Alternatively, edit ~/.codex/config.toml directly:
[mcp_servers.lotus-wisdom]
command = "npx"
args = ["-y", "lotus-wisdom-mcp"]
Alternatively, edit ~/.gemini/settings.json directly:
{
"mcpServers": {
"lotus-wisdom": {
"command": "npx",
"args": ["-y", "lotus-wisdom-mcp"]
}
}
}
On Windows, npx requires a shell wrapper. Replace "command": "npx" with:
{
"command": "cmd",
"args": ["/c", "npx", "-y", "lotus-wisdom-mcp"]
}
For CLI tools on Windows:
claude mcp add lotus-wisdom -- cmd /c npx -y lotus-wisdom-mcp
codex mcp add lotus-wisdom -- cmd /c npx -y lotus-wisdom-mcp
ChatGPT only supports remote MCP servers over HTTPS. Use Smithery or connect directly to the hosted instance below via ChatGPT Settings > Connectors.
A public instance is available at https://lotus-wisdom-mcp.linxule.workers.dev/mcp. No API key needed.
For clients supporting Streamable HTTP, connect directly to the URL. For stdio-only clients, use mcp-remote:
{
"mcpServers": {
"lotus-wisdom": {
"command": "npx",
"args": ["-y", "mcp-remote", "https://lotus-wisdom-mcp.linxule.workers.dev/mcp"]
}
}
}
To self-host your own instance, see worker/README.md.
bun install
bun run build
bun run start
Enable debug mode:
LOTUS_DEBUG=true bun run start
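LOTUS_DEBUG follows the common environment-gated logging pattern. A minimal sketch of that pattern (illustrative, not the server's actual logging code):

```javascript
// Minimal env-gated debug logger sketching the LOTUS_DEBUG pattern.
// Logs to stderr so stdout stays free for MCP protocol traffic.
function debugLog(...args) {
  const enabled = process.env.LOTUS_DEBUG === "true";
  if (enabled) console.error("[lotus-debug]", ...args);
  return enabled; // whether anything was logged
}
```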
In MCP clients that support ext-apps (Claude Desktop, Cursor, ChatGPT), the tool renders an interactive "Living Trace" visualization inline in the chat.
Clients without ext-apps support are unaffected — they receive the same JSON tool responses as before.
The Lotus Wisdom framework recognizes that wisdom often emerges not through linear thinking but through a dance between different modes of understanding. The tool facilitates this by:
Tracking Wisdom Domains: As you move through different tags, the tool tracks which wisdom domains you're engaging, helping you see the shape of your inquiry.
Journey Consciousness: The tool maintains awareness of your complete journey, showing both the sequence of tags used and the movement between wisdom domains.
Non-Linear Progress: While steps are numbered, the process isn't strictly linear. You can revisit, revise, and branch as understanding deepens.
Integration Points: Tags like integrate, transcend, and embody help weave insights together rather than keeping them separate.
Natural Expression: The tool handles the contemplative process, but the final wisdom is always expressed naturally by the AI, not as formatted output.
MCP tool descriptions stay in the AI's context window constantly when the server is connected. To minimize this overhead while preserving the full teaching content:
The lotuswisdom tool description is kept minimal—just enough for the AI to know when and how to use it.
The full framework (philosophy, domains, guidance) is delivered on tag='begin'.
The begin tag ensures models receive complete understanding before contemplating.

This approach reduces constant context overhead by ~85% when the tool is idle. When actually used, the full framework is delivered on the first step—nothing is lost.
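The lazy-delivery idea can be sketched as: keep only a short description resident in context, and return the long framework text on the first begin call. The texts and lengths below are placeholders for illustration:

```javascript
// Illustrative lazy delivery: short description always in context,
// long framework only returned on tag='begin'. Texts are placeholders.
const SHORT_DESCRIPTION = "Contemplative problem-solving tool. Start with tag='begin'.";
const FULL_FRAMEWORK = "PHILOSOPHY...\nDOMAINS...\nGUIDANCE...\n".repeat(20);

// This is all that sits in the context window while the tool is idle.
function describeTool() {
  return SHORT_DESCRIPTION;
}

function handleStep(step) {
  if (step.tag === "begin") {
    // Full teaching content delivered only when the journey starts.
    return { status: "FRAMEWORK_RECEIVED", framework: FULL_FRAMEWORK };
  }
  return { status: "OK" };
}

// Rough idle-context saving vs. embedding the framework in the description.
function idleSavings() {
  const full = SHORT_DESCRIPTION.length + FULL_FRAMEWORK.length;
  return 1 - SHORT_DESCRIPTION.length / full;
}
```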
This MCP server is licensed under the MIT License. For more details, please see the LICENSE file in the project repository.
Contributions are welcome! Please feel free to submit issues or pull requests on the GitHub repository.
Current version: 0.4.0
nextStepNeeded=false now correctly returns WISDOM_READY (previously only express and complete could complete)
tag='begin' can now be called with just {"tag":"begin"} - all other params auto-filled
begin tag now returns full parameter explanations, response format details, and meditation handling
tag='begin' opens the journey—returns full framework before contemplation starts
begin tag ensures models receive complete understanding before contemplating
title field added for better discoverability