Server data from the Official MCP Registry
Map text into knowledge graphs to create a structured representation of conceptual relations and t…
Remote endpoints: streamable-http: https://server.smithery.ai/@infranodus/mcp-server-infranodus/mcp
Valid MCP server (2 strong, 1 medium validity signals). No known CVEs in dependencies. Imported from the Official MCP Registry.
3 files analyzed · No issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
Available as Local & Remote
This plugin can run on your machine or connect to a hosted endpoint; you choose which during install.
From the project's GitHub README.
A Model Context Protocol (MCP) server that integrates InfraNodus knowledge graph and text network analysis capabilities into LLM workflows and AI assistants like Claude Desktop.
InfraNodus MCP Server enables LLM workflows and AI assistants to analyze text using advanced network science algorithms, generate knowledge graphs, detect content gaps, and identify key topics and concepts. It transforms unstructured text into structured insights using graph theory and network analysis.

generate_knowledge_graph
analyze_existing_graph_by_name
analyze_text
generate_content_gaps
generate_topical_clusters
generate_contextual_hint
generate_research_questions
generate_research_ideas
optimize_text_structure
generate_responses_from_graph
develop_conceptual_bridges
develop_latent_topics
develop_text_tool
create_knowledge_graph
overlap_between_texts
merged_graph_from_texts
difference_between_texts
analyze_google_search_results
analyze_related_search_queries
search_queries_vs_search_results
generate_seo_report
memory_add_relations
memory_get_relations
retrieve_from_knowledge_base
search
fetch
More capabilities coming soon!
InfraNodus represents any text as a network graph in order to identify the main clusters of ideas and gaps between them. This helps generate advanced insights based on the text's structure. The network is effectively a knowledge graph that can also be used to retrieve complex ontological relations between different entities and concepts. This process is automated in InfraNodus using the search and fetch tools along with the other tools that analyze the underlying network.
However, you can also easily use InfraNodus as a more traditional memory server to save and retrieve relations. We use [[wikilinks]] to highlight entities in your text to make your content and graphs compatible with markup syntax and PKM tools such as Obsidian. By default, InfraNodus will generate the name of the memory graph for you based on the context of the conversation. However, you can modify this default behavior by adding a system prompt or project instruction into your LLM client.
Specifically, you can instruct it to always use a specific knowledge graph for memories, so everything is stored in one place:
Save all memories in the `my-memories` graph in InfraNodus.
Or you can ask InfraNodus to only save certain entities, e.g. for building social networks:
When generating entities, only extract people, companies, and organizations. Ignore everything else.
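As a quick illustration of the [[wikilinks]] convention described above (this snippet is our own sketch, not part of the server), entity names can be pulled back out of a saved memory with a simple regular expression, which is what makes the format portable to Obsidian and other PKM tools:

```typescript
// A memory statement in InfraNodus / Obsidian wikilink style (example values)
const memory =
  "[[Alice Chen]] joined [[Acme Corp]] to work with [[Bob]] on [[data science]].";

// [[...]] marks an entity; capture the name between the double brackets
const entities = [...memory.matchAll(/\[\[([^\]]+)\]\]/g)].map((m) => m[1]);

console.log(entities); // ["Alice Chen", "Acme Corp", "Bob", "data science"]
```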
The easiest and fastest way to launch the InfraNodus MCP server is to use our server URL https://mcp.infranodus.com for remote / web applications, or to add a manual configuration to your LLM apps if you're running them locally.
You can also install the server locally for more control. In this case, you can also edit the source files and even create your own tools based on the InfraNodus API.
Below we describe the two different ways to set up your InfraNodus MCP server.
https://mcp.infranodus.com
To use InfraNodus, see the tools available and simply call them through the chat interface (e.g. "show me the graphs where I talk about this topic" or "get the content gaps from the document I uploaded").
If your client is not using InfraNodus for some actions, add the instruction to use InfraNodus explicitly.
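For example, a short system prompt or project instruction (the wording here is illustrative) is usually enough:

```text
When asked about topics, content gaps, or knowledge graphs, use the InfraNodus MCP tools instead of answering from memory.
```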
You can deploy the InfraNodus server manually via npx, a tool that lets you execute local and remote Node.js packages on your computer.
The InfraNodus MCP server is available as an npm package at https://www.npmjs.com/package/infranodus-mcp-server, from where you can run it on your local computer with npx. It will expose its tools to the MCP client that uses this command to launch the server.
Just add this in your Claude's configuration file (Settings > Developer > Edit Config), inside the "mcpServers" object where the different servers are listed:
{
"mcpServers": {
"infranodus": {
"command": "npx",
"args": ["-y", "infranodus-mcp-server"],
"env": {
"INFRANODUS_API_KEY": "YOUR_INFRANODUS_API_KEY"
}
}
}
}
To connect the InfraNodus MCP server to your Claude code, you can use this command. Make sure to provide the correct InfraNodus API key for your account:
claude mcp add infranodus -s user \
-- env INFRANODUS_API_KEY=YOUR_INFRANODUS_KEY \
npx -y infranodus-mcp-server
Clone and build the server:
git clone https://github.com/yourusername/mcp-server-infranodus.git
cd mcp-server-infranodus
npm install
npm run build:inspect
Note that build:inspect will generate the dist/index.js file which you will then use in your server setup. The standard npm run build command will only build a Smithery file.
Set up your API key:
Create a .env file in the project root:
INFRANODUS_API_KEY=your-api-key-here
Inspect the MCP:
npm run inspect
Open your Claude Desktop configuration file:
open ~/Library/Application\ Support/Claude/claude_desktop_config.json
Add the InfraNodus server configuration:
a. remote launch via npx:
{
"mcpServers": {
"infranodus": {
"command": "npx",
"args": ["-y", "infranodus-mcp-server"],
"env": {
"INFRANODUS_API_KEY": "YOUR_INFRANODUS_API_KEY"
}
}
}
}
b. launch this repo with node, specify the absolute path to the repo + /dist/index.js:
{
"mcpServers": {
"infranodus": {
"command": "node",
"args": ["/absolute/path/to/mcp-server-infranodus/dist/index.js"],
"env": {
"INFRANODUS_API_KEY": "your-api-key-here"
}
}
}
}
Note: you can leave INFRANODUS_API_KEY empty, in which case you can make 70 free requests; after that you will hit the quota and will need to add your API key.
Open your Claude Desktop configuration file:
%APPDATA%\Claude\claude_desktop_config.json
Add the InfraNodus server configuration:
a. remote launch via npx:
{
"mcpServers": {
"infranodus": {
"command": "npx",
"args": ["-y", "infranodus-mcp-server"],
"env": {
"INFRANODUS_API_KEY": "YOUR_INFRANODUS_API_KEY"
}
}
}
}
b. launch this repo with node:
{
"mcpServers": {
"infranodus": {
"command": "node",
"args": ["C:\\path\\to\\mcp-server-infranodus\\dist\\index.js"],
"env": {
"INFRANODUS_API_KEY": "your-api-key-here"
}
}
}
}
For other applications supporting MCP, use the following command to start the server via npx:
INFRANODUS_API_KEY=your-api-key npx -y infranodus-mcp-server
or locally
INFRANODUS_API_KEY=your-api-key node /path/to/mcp-server-infranodus/dist/index.js
The server communicates via stdio, so configure your application to run this command and communicate through standard input/output.
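Concretely, the client and server exchange JSON-RPC 2.0 messages over stdin/stdout. The first message a client sends is an initialize request shaped roughly like this (field values are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "my-client", "version": "1.0.0" }
  }
}
```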
InfraNodus server is also available through Smithery: a repository of MCP servers with an easy-to-follow installation process for most LLM clients. You will need a separate account at Smithery, though.
Create an account on Smithery.ai (it's free and you can use your Google or GitHub login).
Then go to the Smithery InfraNodus Server, click "Configure" at the top right, and add your InfraNodus API key there.
Go to the Smithery InfraNodus Server page and copy the server URL https://server.smithery.ai/@infranodus/mcp-server-infranodus/mcp, or use one of their automatic setup tools for Claude or Cursor.
You may need to get your separate Smithery API key and Smithery profile link to make this work.
// e.g. Cursor will access the server directly via Smithery
"mcpServers": {
"mcp-server-infranodus": {
"type": "http",
"url": "https://server.smithery.ai/@infranodus/mcp-server-infranodus/mcp?api_key=YOUR_SMITHERY_KEY&profile=YOUR_SMITHERY_PROFILE",
"headers": {}
}
}
// Claude uses a slightly different implementation
// For this, it launches the MCP server on your local machine
"mcpServers": {
"mcp-server-infranodus": {
"command": "npx",
"args": [
"-y",
"@smithery/cli@latest",
"run",
"@infranodus/mcp-server-infranodus",
"--key",
"YOUR_SMITHERY_KEY",
"--profile",
"YOUR_SMITHERY_PROFILE"
]
}
}
Note: in both cases, you'll automatically get the YOUR_SMITHERY_KEY and YOUR_SMITHERY_PROFILE values from Smithery when you copy the URL with credentials. These are not your InfraNodus API keys. You can use the InfraNodus MCP server without an API key for the first 70 calls. After that, add your InfraNodus key to your Smithery profile and it will automatically connect to your account using the link above.
Once installed, you can ask Claude to use any of the InfraNodus tools listed above.
For development, run the server in watch mode:
npm run dev
Test the server with the MCP Inspector:
npm run build:inspect
npm run inspect
Build for production:
npm run build
Rebuild automatically on file changes:
npm run watch
generate_knowledge_graph
Analyzes text and generates a knowledge graph.
Parameters:
- text (string, required): The text to analyze
- includeStatements (boolean): Include original statements in response
- modifyAnalyzedText (string): Text modification options ("none", "entities", "lemmatize")
analyze_existing_graph_by_name
Retrieves and analyzes an existing graph from your InfraNodus account.
Parameters:
- graphName (string, required): Name of the existing graph
- includeStatements (boolean): Include statements in response
- includeGraphSummary (boolean): Include graph summary
analyze_text
Analyzes a text, URL, or YouTube transcript. Extracts and analyzes a graph from text or a URL; provide either text or url.
Parameters:
- text (string, optional): Text to analyze. Provide either this or url.
- url (string, optional): URL to fetch content from (e.g. a webpage or YouTube transcript). Provide either this or text.
- includeStatements (boolean): Include processed statements in response
- includeGraph (boolean): Include full graph structure in response
- addNodesAndEdges (boolean): Include nodes and edges in response
- includeGraphSummary (boolean): Include AI-generated graph summary for RAG prompt augmentation
- modifyAnalyzedText (string): Entity detection: "none", "detectEntities", or "extractEntitiesOnly"
generate_content_gaps
Identifies content gaps and missing connections in text.
Parameters:
- text (string, required): The text to analyze for gaps
For long-running operations (like SEO analysis), the MCP server supports real-time progress notifications that provide intermediary feedback to AI agents, so an agent can track the status of a multi-step task instead of waiting on a silent call.
The server implements MCP progress notifications using:
import { ProgressReporter } from "../utils/progress.js";
import { ToolHandlerContext } from "../types/index.js";
handler: async (params: ParamType, context: ToolHandlerContext = {}) => {
const progress = new ProgressReporter(context);
await progress.report(25, "Fetching data from API...");
// Do work
await progress.report(75, "Analyzing results...");
// More work
await progress.report(100, "Complete!");
return results;
};
The generate_seo_report tool demonstrates this pattern with 6 major progress checkpoints that provide detailed status updates throughout the multi-step analysis process.
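Under the MCP specification, each progress.report call above corresponds to a notifications/progress message sent back to the client, roughly of this shape (the token value is illustrative; the client supplies it in the original request):

```json
{
  "jsonrpc": "2.0",
  "method": "notifications/progress",
  "params": {
    "progressToken": "abc-123",
    "progress": 25,
    "total": 100,
    "message": "Fetching data from API..."
  }
}
```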
# Clean install
rm -rf node_modules package-lock.json
npm install
npm run build
MIT