Server data from the Official MCP Registry
Add OpenTelemetry tracing to Python AI agents. Supports LangGraph, LlamaIndex, CrewAI, OpenAI SDK.
Valid MCP server (0 strong, 4 medium validity signals). No known CVEs in dependencies. Package registry verified. Imported from the Official MCP Registry.
11 files analyzed · 1 issue found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
Set these up before or after installing:
Environment variable: ANTHROPIC_API_KEY
Environment variable: LLM_MODEL
Environment variable: OPENAI_API_KEY
Environment variable: GEMINI_API_KEY
Environment variable: GROQ_API_KEY
Add this to your MCP configuration file:
{
"mcpServers": {
"io-github-alanzha2-observe-instrument-mcp": {
"env": {
"LLM_MODEL": "your-llm-model-here",
"GROQ_API_KEY": "your-groq-api-key-here",
"GEMINI_API_KEY": "your-gemini-api-key-here",
"OPENAI_API_KEY": "your-openai-api-key-here",
"ANTHROPIC_API_KEY": "your-anthropic-api-key-here"
},
"args": [
"observe-instrument-mcp"
],
"command": "uvx"
}
}
}
From the project's GitHub README.
An MCP server that automatically instruments Python AI agents with the ioa-observe-sdk — adding OpenTelemetry-based tracing, metrics, and logs with zero manual effort.
Works with any MCP-compatible AI coding assistant: Claude Desktop, Cursor, Windsurf, and others.
Two tools:
instrument_agent — reads a Python agent file, applies full observe SDK instrumentation, writes it back, and returns a summary of changes. Creates a .bak backup before modifying.
check_instrumentation — audits a file for missing instrumentation without modifying it.
Supported frameworks: LlamaIndex, LangGraph, CrewAI, raw OpenAI SDK.
pip install observe-instrument-mcp
# or
uv add observe-instrument-mcp
Requires an API key for your chosen LLM provider. Defaults to Claude (ANTHROPIC_API_KEY). See supported providers below.
Add to ~/Library/Application Support/Claude/claude_desktop_config.json:
{
"mcpServers": {
"observe-instrument": {
"command": "uvx",
"args": ["observe-instrument-mcp"],
"env": {
"ANTHROPIC_API_KEY": "sk-ant-..."
}
}
}
}
Add to .cursor/mcp.json in your project:
{
"mcpServers": {
"observe-instrument": {
"command": "uvx",
"args": ["observe-instrument-mcp"],
"env": {
"ANTHROPIC_API_KEY": "sk-ant-..."
}
}
}
}
Add to ~/.codeium/windsurf/mcp_config.json:
{
"mcpServers": {
"observe-instrument": {
"command": "uvx",
"args": ["observe-instrument-mcp"],
"env": {
"ANTHROPIC_API_KEY": "sk-ant-..."
}
}
}
}
Ready-to-use uninstrumented agent files are included in the examples/ folder:
examples/
single-agent/
openai-sdk-example.py # OpenAI SDK customer support agent
langgraph-example.py # LangGraph currency converter
llama-index-example.py # LlamaIndex math agent
crewai-example.py # CrewAI research crew
multi-agent/
openai-sdk-multi-agent-example.py # OpenAI SDK orchestrator pipeline
langgraph-multi-agent-example.py # LangGraph supervisor pattern
llama-index-multi-agent-example.py # LlamaIndex research + writing pipeline
crewai-multi-agent-example.py # CrewAI research + publishing crews
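For orientation, an uninstrumented single-agent file along the lines of openai-sdk-example.py looks roughly like the sketch below (a generic illustration, not the actual file from the repo):

# my_agent.py - a plain OpenAI SDK agent with no observe SDK instrumentation yet
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def answer(question: str) -> str:
    # Single LLM call; no tracing, metrics, or logging wired in.
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": "You are a customer support agent."},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(answer("How do I reset my password?"))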
Once configured, ask your AI assistant:
Instrument my agent with the observe SDK: path/to/my_agent.py
Check what observe SDK instrumentation is missing from path/to/my_agent.py
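These prompts map onto the server's two tools. The tools can also be called directly from a script with the official MCP Python SDK; the sketch below reuses the uvx launch command from the configs above, and the tool argument name (file_path) is an assumption rather than something taken from the server's published schema:

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the server the same way the MCP client configs above do.
    params = StdioServerParameters(
        command="uvx",
        args=["observe-instrument-mcp"],
        env={"ANTHROPIC_API_KEY": "sk-ant-..."},
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the exposed tools (instrument_agent, check_instrumentation).
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # Hypothetical call; the "file_path" argument name is assumed.
            result = await session.call_tool(
                "check_instrumentation", {"file_path": "path/to/my_agent.py"}
            )
            print(result.content)

asyncio.run(main())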
| Variable | Description |
|---|---|
| LLM_MODEL | Model to use (default: claude-sonnet-4-6). See provider table below. |
| ANTHROPIC_API_KEY | Required for Anthropic models |
| OPENAI_API_KEY | Required for OpenAI models |
| GEMINI_API_KEY | Required for Google Gemini models |
| GROQ_API_KEY | Required for Groq models |
| Provider | Key variable | LLM_MODEL example |
|---|---|---|
| Anthropic | ANTHROPIC_API_KEY | claude-sonnet-4-6 |
| OpenAI | OPENAI_API_KEY | gpt-4o |
| Google Gemini | GEMINI_API_KEY | gemini/gemini-2.0-flash |
| Groq | GROQ_API_KEY | groq/llama-3.3-70b |
| Ollama (local, free) | none | ollama/llama3.2 |
Install the SDK in your project:
pip install ioa-observe-sdk
# or
uv add ioa-observe-sdk
Start the observability stack (OTel Collector + ClickHouse):
cd path/to/observe/deploy
docker compose up -d
Run your agent:
OPENAI_API_KEY=sk-... OTLP_HTTP_ENDPOINT=http://localhost:4318 python my_agent.py
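The ioa-observe-sdk wires up the exporter for you. For reference, pointing plain OpenTelemetry at that same endpoint looks roughly like this (a sketch using the vanilla OTel Python SDK and the opentelemetry-exporter-otlp-proto-http package, not the ioa-observe-sdk API):

# Rough equivalent of what exporting to OTLP_HTTP_ENDPOINT=http://localhost:4318 involves.
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

provider = TracerProvider(resource=Resource.create({"service.name": "my_agent"}))
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces"))
)
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("my_agent")
with tracer.start_as_current_span("agent-run"):
    pass  # agent logic goes here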
Query traces:
docker exec -it clickhouse-server clickhouse-client --user admin --password admin
SELECT SpanName, ServiceName, Duration / 1000000. AS ms, Timestamp
FROM otel_traces
ORDER BY Timestamp DESC
LIMIT 20;
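If you would rather query from Python than the clickhouse-client CLI, a sketch with the clickhouse-connect package is shown below; it assumes the compose stack exposes ClickHouse's HTTP interface on the default port 8123 with the same admin/admin credentials:

# pip install clickhouse-connect
import clickhouse_connect

# Port 8123 (ClickHouse's default HTTP port) is an assumption about the compose stack.
client = clickhouse_connect.get_client(
    host="localhost", port=8123, username="admin", password="admin"
)
rows = client.query(
    """
    SELECT SpanName, ServiceName, Duration / 1000000. AS ms, Timestamp
    FROM otel_traces
    ORDER BY Timestamp DESC
    LIMIT 20
    """
).result_rows
for span_name, service_name, ms, ts in rows:
    print(f"{ts}  {service_name}  {span_name}  {ms:.2f} ms")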
git clone https://github.com/alanzha2/observe-instrument-mcp
cd observe-instrument-mcp
pip install -e .
# Test the server locally
mcp dev observe_instrument_mcp/server.py
Apache-2.0