Server data from the Official MCP Registry
MCP server that reduces LLM context by removing code comments and converting data formats to TOON
Valid MCP server (2 strong, 3 medium validity signals). No known CVEs in dependencies. Package registry verified. Imported from the Official MCP Registry.
7 files analyzed · 1 issue found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
This plugin requests these system permissions. Most are normal for its category.
Add this to your MCP configuration file:
{
"mcpServers": {
"io-github-ankitpal181-toon-parse-mcp": {
"args": [
"toon-parse-mcp"
],
"command": "uvx"
}
}
}
From the project's GitHub README.
mcp-name: io.github.ankitpal181/toon-parse-mcp
A specialized Model Context Protocol (MCP) server that optimizes token usage by converting data to TOON (Token-Oriented Object Notation) and stripping non-essential context from code files.
The toon-parse-mcp MCP server helps AI agents (like Cursor, Claude Desktop, etc.) operate more efficiently by:
- optimize_input_context(raw_input: str): Processes raw text data (JSON/XML/CSV/YAML) and returns optimized TOON format.
- read_and_optimize_file(file_path: str): Reads a local code file and returns a token-optimized version (no inline comments, minimized whitespace).
- protocol://mandatory-efficiency: Provides a strict system instruction prompt for LLMs to ensure they use the optimization tools correctly.

Install with:

pip install toon-parse-mcp
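To give a feel for why TOON saves tokens, here is a minimal, illustrative sketch of converting a uniform JSON array into TOON's tabular form. This is not the server's implementation (it delegates to the toon-parse library); the helper name `to_toon_table` is hypothetical and handles only flat, uniform records.

```python
import json

def to_toon_table(name: str, rows: list[dict]) -> str:
    """Render a uniform list of flat dicts in TOON's tabular form.
    Illustrative only -- the real server uses the toon-parse library,
    which handles nesting, quoting, and mixed types."""
    fields = list(rows[0].keys())
    # Header declares the array name, row count, and field names once,
    # so they are not repeated per record as in JSON.
    lines = [f"{name}[{len(rows)}]{{{','.join(fields)}}}:"]
    for row in rows:
        lines.append("  " + ",".join(str(row[f]) for f in fields))
    return "\n".join(lines)

data = json.loads('[{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]')
print(to_toon_table("users", data))
# users[2]{id,name}:
#   1,Alice
#   2,Bob
```

The repeated keys and punctuation of the JSON form collapse into a single header line, which is where the token reduction comes from.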
For Windsurf, edit ~/.codeium/windsurf/mcp_config.json directly and add the server to the mcpServers object. The command is python3 -m toon_parse_mcp.server (ensure your environment is active, or use an absolute path to python):
{
"mcpServers": {
"toon-parse-mcp": {
"command": "python3",
"args": ["-m", "toon_parse_mcp.server"]
}
}
}
For Antigravity, edit ~/.gemini/antigravity/mcp_config.json directly and add the server to the mcpServers object:
{
"mcpServers": {
"toon-parse-mcp": {
"command": "python3",
"args": ["-m", "toon_parse_mcp.server"]
}
}
}
Add this to your claude_desktop_config.json:
{
"mcpServers": {
"toon-parse-mcp": {
"command": "python3",
"args": ["-m", "toon_parse_mcp.server"]
}
}
}
When the server is active, the AI will have access to the optimize_input_context and read_and_optimize_file tools. You can also refer to the efficiency protocol by asking the AI to "check the mandatory efficiency protocol".
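As a rough illustration of the kind of transformation read_and_optimize_file performs, the sketch below strips full-line comments and blank lines from Python source. It is a simplified, hypothetical version: the actual tool also removes inline comments and minimizes whitespace, and its exact behavior may differ.

```python
def strip_full_line_comments(source: str) -> str:
    """Naive context reducer: drop full-line # comments and blank lines.
    A simplified sketch of the idea behind read_and_optimize_file;
    the real tool is more thorough (e.g. inline comments too)."""
    kept = []
    for line in source.splitlines():
        stripped = line.rstrip()
        if stripped.lstrip().startswith("#"):
            continue  # full-line comment carries no executable content
        if stripped:
            kept.append(stripped)
    return "\n".join(kept)

code = "x = 1\n# explain x\n\ny = 2\n"
print(strip_full_line_comments(code))
# x = 1
# y = 2
```

Even this naive pass shrinks comment-heavy files noticeably before they reach the model's context window.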
To run the test suite:
pip install -e ".[test]"
pytest tests/
Dependencies: mcp >= 1.25.0, toon-parse >= 2.4.3

MIT License - see LICENSE for details.