Server data from the Official MCP Registry
AI Agent-Native Data Platform — ingest, validate, transform, and query data.
Valid MCP server (3 strong, 1 medium validity signals). 7 known CVEs in dependencies (0 critical, 7 high severity). Package registry verified. Imported from the Official MCP Registry.
13 files analyzed · 8 issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
This plugin requests these system permissions. Most are normal for its category.
Unverified package source
We couldn't verify that the installable package matches the reviewed source code. Proceed with caution.
Set these up before or after installing:
Environment variable: PIPELINE_URL
Environment variable: PIPELINE_API_KEY
Add this to your MCP configuration file:
{
  "mcpServers": {
    "io-github-datris-datris": {
      "env": {
        "PIPELINE_URL": "your-pipeline-url-here",
        "PIPELINE_API_KEY": "your-pipeline-api-key-here"
      },
      "args": [
        "datris-mcp-server"
      ],
      "command": "uvx"
    }
  }
}

From the project's GitHub README.
datris.ai · Try Hosted Free · Documentation · MCP Registry · PyPI
Ingest, validate, transform, store, and retrieve your data — whether you're an AI agent talking through MCP or a developer writing config. One platform for both.
git clone https://github.com/datris/datris-platform-oss.git
cd datris-platform-oss
cp .env.example .env # Add your ANTHROPIC_API_KEY and/or OPENAI_API_KEY
docker compose up -d
UI: http://localhost:4200 · API: http://localhost:8080
Add to your MCP client config (Claude Desktop, Cursor, etc.):
{
  "mcpServers": {
    "datris": {
      "command": "uvx",
      "args": ["datris-mcp-server"],
      "env": {
        "PIPELINE_URL": "http://localhost:8080"
      }
    }
  }
}
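If you maintain the client config file from a script rather than by hand, the same entry can be merged programmatically. A minimal sketch, assuming the standard `mcpServers` layout shown above; the helper name and the default URL are illustrative, not part of datris:

```python
import json

def add_datris_server(config: dict, pipeline_url: str = "http://localhost:8080") -> dict:
    """Hypothetical helper: merge the README's 'datris' entry into an
    existing MCP client config dict, creating 'mcpServers' if absent."""
    servers = config.setdefault("mcpServers", {})
    servers["datris"] = {
        "command": "uvx",
        "args": ["datris-mcp-server"],
        "env": {"PIPELINE_URL": pipeline_url},
    }
    return config

cfg = add_datris_server({})
print(json.dumps(cfg, indent=2))
```

This produces exactly the JSON shown above, so it can be round-tripped through `json.load`/`json.dump` on an existing config file without disturbing other servers.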
brew tap datris/tap
brew install datris
datris ingest data.csv --dest postgres
datris ingest sales.csv --ai-validate "prices > 0" --ai-transform "convert dates to YYYY/MM/DD"
datris query "SELECT * FROM sales"
datris search "quarterly revenue" --store pgvector
datris tap create "Fetch S&P 500 daily prices from yfinance" --pipeline stocks
datris taps
Source (File Upload / MinIO Event / Database Pull / Kafka)
→ Preprocessor (optional REST endpoint)
→ Data Quality (AI rules, header validation, schema validation)
→ Transformation (AI transformation, destination schema)
→ Destinations (in parallel):
PostgreSQL, MongoDB, MinIO (Parquet/ORC), Kafka, ActiveMQ,
REST Endpoint, Qdrant, Weaviate, Milvus, Chroma, pgvector
→ Notifications (ActiveMQ topic)
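Conceptually, each stage in the flow above consumes the previous stage's output. A toy model of that composition (the stage functions and rules here are invented for illustration; datris's actual engine generates and runs these scripts via AI):

```python
from typing import Callable, Iterable

Row = dict
Stage = Callable[[Iterable[Row]], Iterable[Row]]

def run_pipeline(rows: Iterable[Row], stages: list[Stage]) -> list[Row]:
    # Chain stages: each one receives the rows the previous stage emitted.
    for stage in stages:
        rows = stage(rows)
    return list(rows)

# Hypothetical stand-ins for the Data Quality and Transformation stages:
def drop_nonpositive_prices(rows: Iterable[Row]) -> Iterable[Row]:
    return (r for r in rows if r.get("price", 0) > 0)

def slash_dates(rows: Iterable[Row]) -> Iterable[Row]:
    for r in rows:
        yield {**r, "date": r["date"].replace("-", "/")}

rows = [
    {"price": 10.0, "date": "2024-01-02"},
    {"price": -1.0, "date": "2024-01-03"},  # fails the quality rule
]
result = run_pipeline(rows, [drop_nonpositive_prices, slash_dates])
# result → [{"price": 10.0, "date": "2024/01/02"}]
```

Because stages are lazy generators, rows stream through quality and transformation checks without materializing intermediate lists, mirroring how a source feeds validation before destinations are written.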
| Feature | Description |
|---|---|
| MCP Server | 30+ tools for AI agents — pipeline CRUD, upload, query, search, profiling |
| AI Data Quality | Plain English validation rules — AI generates and runs a validation script |
| AI Transformation | Plain English transformations — AI generates and runs a transformation script |
| AI Schema Generation | Upload a file, get a complete pipeline config |
| AI Data Profiling | Upload a file, get statistics + suggested validation rules |
| AI Error Explanation | Job failures explained in plain English |
| Natural Language Query | Ask questions in English, get SQL results |
| RAG Pipeline | Chunk, embed, and search across 5 vector databases |
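The RAG pipeline's first step, chunking, can be sketched as a fixed-size sliding window with overlap. This is a generic illustration of the technique, not datris's implementation; the function name and defaults are assumptions:

```python
def chunk_text(text: str, size: int = 200, overlap: int = 50) -> list[str]:
    """Illustrative chunker: split text into windows of `size` characters,
    each sharing `overlap` characters with its predecessor, before the
    chunks would be embedded and uploaded to a vector store."""
    if size <= overlap:
        raise ValueError("size must exceed overlap")
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

chunks = chunk_text("quarterly revenue grew in every region this year", size=20, overlap=5)
# Adjacent chunks share their last/first 5 characters.
```

Overlap preserves context that straddles a chunk boundary, at the cost of some redundant embedding work; production chunkers usually also split on sentence or token boundaries rather than raw characters.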
CSV, JSON, XML, Excel, PDF, Word, PowerPoint, HTML, email, EPUB, plain text, .zip/.tar/.gz archives
Anthropic Claude (Opus 4.6, Sonnet 4.6, Haiku) · OpenAI (GPT-5, GPT-4.1, o3) · Ollama (local models)
| Service | Purpose |
|---|---|
| MinIO | S3-compatible object store for file staging and data output |
| MongoDB | Configuration store, job status tracking, metadata |
| ActiveMQ | File notification queue, pipeline event notifications |
| HashiCorp Vault | Secrets management (database credentials, API keys) |
| Apache Kafka | Optional streaming source and destination |
| Apache Spark | Local Spark for writing Parquet/ORC to MinIO |
Full documentation at docs.datris.ai or locally at docs/.