Server data from the Official MCP Registry
An MCP server to extract keyphrases from a text with the BERT model
Valid MCP server (1 strong, 4 medium validity signals). 5 known CVEs in dependencies (1 critical, 3 high severity). Imported from the Official MCP Registry. 1 finding downgraded by scanner intelligence.
12 files analyzed · 6 issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
This plugin requests these system permissions. Most are normal for its category.
Add this to your MCP configuration file:
{
  "mcpServers": {
    "io-github-ivanrublev-keyphrases-mcp": {
      "args": [
        "keyphrases-mcp"
      ],
      "command": "uvx"
    }
  }
}

From the project's GitHub README.
Empowering LLMs with authentic keyphrase extraction
Built with the following tools and technologies:
This Keyphrases MCP Server is a natural language interface designed for agentic applications to extract keyphrases from provided text. It integrates seamlessly with MCP (Model Context Protocol) clients, enabling AI-driven workflows to extract keyphrases more accurately and with higher relevance using the BERT machine learning model. It works directly with your local files in the allowed directories, saving context tokens for the LLM. The application ensures secure document processing by exposing only the extracted keyphrases to the MCP client, never the original file content.
Using this MCP Server, you can ask the following questions:
Keyphrases help users quickly grasp the main topics and themes of a document without reading it in full and enable the following applications:
Autoregressive LLMs, such as those powering Claude or ChatGPT, process text sequentially. This not only limits their ability to fully contextualize keyphrases across the entire document, but also makes them suffer from context degradation as the input length increases, causing earlier keyphrases to receive diluted attention.
Bidirectional models like BERT, by considering both left and right context and maintaining more consistent attention across the sequence, generally extract existing keyphrases from texts more accurately and with higher relevance, especially when no domain-specific fine-tuning is applied.
However, as autoregressive models adopt longer context windows and techniques such as input chunking, their keyphrase-extraction performance is improving, narrowing the gap with BERT. Domain-specific fine-tuning can even enable an autoregressive LLM to outperform the BERT solution.
This MCP server combines BERT for keyphrase extraction with an autoregressive LLM for text generation or refinement, enabling seamless text processing.
The server uses the KeyBERT framework for its multi-step extraction pipeline, combining spaCy NLP preprocessing with BERT embeddings:
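The core idea behind a KeyBERT-style pipeline is to embed the whole document and each candidate phrase, then rank candidates by cosine similarity to the document embedding. The sketch below illustrates that ranking step only; it uses toy bag-of-words count vectors in place of real BERT embeddings, and the `embed` helper, sample text, and candidate list are illustrative assumptions, not part of this server's code.

```python
import math
from collections import Counter

def embed(text):
    # Toy stand-in for a BERT embedding: a bag-of-words count vector.
    # The real pipeline uses dense transformer embeddings instead.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def extract_keyphrases(document, candidates, top_n=2):
    # Rank candidate phrases by their similarity to the whole document.
    doc_vec = embed(document)
    scored = [(cosine(embed(c), doc_vec), c) for c in candidates]
    return [c for _, c in sorted(scored, reverse=True)[:top_n]]

doc = "keyphrase extraction with bert embeddings helps summarize documents"
candidates = ["bert embeddings", "keyphrase extraction", "the weather"]
print(extract_keyphrases(doc, candidates))
```

In the actual server, candidate phrases would come from spaCy preprocessing (noun chunks, n-grams with stop words removed) rather than a hand-written list, and the embeddings would come from a BERT model via KeyBERT.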
See configuration document for details.
This project is licensed under the MIT License.
For questions or support, reach out via GitHub Issues.