Server data from the Official MCP Registry
MCP server for Dataiku DSS project, flow, and operations APIs.
Valid MCP server (2 strong, 2 medium validity signals). No known CVEs in dependencies. Package registry verified. Imported from the Official MCP Registry.
6 files analyzed · 1 issue found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
This plugin requests these system permissions. Most are normal for its category.
Set these up before or after installing:
Environment variable: DATAIKU_URL
Environment variable: DATAIKU_API_KEY
Environment variable: DATAIKU_PROJECT_KEY
Add this to your MCP configuration file:
{
  "mcpServers": {
    "io-github-clssck-dataiku-mcp": {
      "env": {
        "DATAIKU_URL": "your-dataiku-url-here",
        "DATAIKU_API_KEY": "your-dataiku-api-key-here",
        "DATAIKU_PROJECT_KEY": "your-dataiku-project-key-here"
      },
      "args": ["-y", "dataiku-mcp"],
      "command": "npx"
    }
  }
}
From the project's GitHub README:
MCP server for Dataiku DSS REST APIs, focused on flow analysis and reliable day-to-day operations (projects, datasets, recipes, jobs, scenarios, folders, variables, connections, and code environments).
Cursor one-click install includes placeholder environment values. Update DATAIKU_URL, DATAIKU_API_KEY, and optionally DATAIKU_PROJECT_KEY after adding the server.
Features:
- Flow mapping (project.map) with recipe subtypes and connectivity.
- Normalized error categories (not_found, forbidden, validation, transient, unknown) with retry hints.

Tools and actions:
- project: list, get, metadata, flow, map
- dataset: list, get, schema, preview, metadata, download, create, update, delete
- recipe: list, get, create, update, delete, download
- job: list, get, log, build, buildAndWait, wait, abort
- scenario: list, run, status, get, create, update, delete
- managed_folder: list, get, contents, download, upload, delete_file
- variable: get, set
- connection: infer
- code_env: list, get

Build from source:
npm ci
npm run build
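The error categories listed above can be pictured as a mapping from HTTP status codes, where only transient failures merit a retry. The sketch below is illustrative only; classifyStatus and isRetryable are hypothetical names, not the server's actual implementation:

```typescript
// Error categories as documented: not_found, forbidden, validation,
// transient, unknown. Mapping below is an illustrative assumption.
type ErrorCategory =
  | "not_found"
  | "forbidden"
  | "validation"
  | "transient"
  | "unknown";

function classifyStatus(status: number): ErrorCategory {
  if (status === 404) return "not_found";
  if (status === 401 || status === 403) return "forbidden";
  if (status === 400 || status === 422) return "validation";
  // Timeouts, rate limits, and server errors are worth retrying.
  if (status === 408 || status === 429 || status >= 500) return "transient";
  return "unknown";
}

// Retry hint: only transient errors should be retried.
function isRetryable(category: ErrorCategory): boolean {
  return category === "transient";
}
```

A client seeing a transient error could back off and retry, while not_found or validation errors should surface to the user immediately.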
Run as a local CLI after build:
node dist/index.js
Use directly from npm (after publish):
npx -y dataiku-mcp
Recommended local workflow from repo root:
# install deps
npm ci
# static checks
npm run check
# unit tests
npm test
# build distribution
npm run build
# run MCP server locally (dev)
npm start
Optional live DSS integration tests:
# requires DATAIKU_URL, DATAIKU_API_KEY, DATAIKU_PROJECT_KEY in .env
npm run test:integration
# includes destructive actions (create/update/delete)
DATAIKU_MCP_DESTRUCTIVE_TESTS=1 npm run test:integration
Repository layout:
- src/: MCP server and tool implementations.
- tests/: unit + integration test suites.
- examples/: demos, fixtures, artifacts, and ad-hoc local scripts.
- bin/: package executable entrypoint.
- dist/: compiled output (generated).

Create a local env file:
cp .env.example .env
# then edit .env
Run directly in dev:
npm start
Example scripts and sample outputs are kept under examples/ to avoid root-level clutter.
Environment variables:
- DATAIKU_URL: DSS base URL
- DATAIKU_API_KEY: DSS API key
- DATAIKU_PROJECT_KEY (optional): default project key
- DATAIKU_REQUEST_TIMEOUT_MS (optional): per-attempt request timeout in milliseconds (default: 30000)
- DATAIKU_RETRY_MAX_ATTEMPTS (optional): max attempts for retry-enabled requests (GET only; default: 4, cap: 10)
- DATAIKU_DEBUG_LATENCY (optional): set to 1/true to include per-tool timing diagnostics in structuredContent.debug.latency (off by default)

Use this server command in clients (npm package):
{
  "command": "npx",
  "args": ["-y", "dataiku-mcp"],
  "env": {
    "DATAIKU_URL": "https://your-dss-instance.app.dataiku.io",
    "DATAIKU_API_KEY": "your_api_key",
    "DATAIKU_PROJECT_KEY": "YOUR_PROJECT_KEY"
  }
}
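The optional tuning variables (DATAIKU_REQUEST_TIMEOUT_MS, DATAIKU_RETRY_MAX_ATTEMPTS, DATAIKU_DEBUG_LATENCY) have documented defaults and caps; the sketch below shows one way those rules could be applied when reading them. readConfig is a hypothetical helper under those assumptions, not the server's actual code:

```typescript
// Documented defaults: timeout 30000 ms; retry attempts default 4,
// capped at 10; latency debugging off unless set to "1" or "true".
interface DataikuConfig {
  timeoutMs: number;
  maxAttempts: number;
  debugLatency: boolean;
}

function readConfig(env: Record<string, string | undefined>): DataikuConfig {
  const timeoutMs = Number(env.DATAIKU_REQUEST_TIMEOUT_MS ?? 30000);
  const attempts = Number(env.DATAIKU_RETRY_MAX_ATTEMPTS ?? 4);
  return {
    // Fall back to the default on non-numeric or non-positive values.
    timeoutMs: Number.isFinite(timeoutMs) && timeoutMs > 0 ? timeoutMs : 30000,
    // Cap retry attempts at 10 as documented.
    maxAttempts: Math.min(
      Number.isFinite(attempts) && attempts > 0 ? attempts : 4,
      10,
    ),
    debugLatency:
      env.DATAIKU_DEBUG_LATENCY === "1" || env.DATAIKU_DEBUG_LATENCY === "true",
  };
}
```

For example, setting DATAIKU_RETRY_MAX_ATTEMPTS=25 would be clamped to 10 under this reading.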
Windows note: if your MCP client launches commands without a shell, use npx.cmd:
{
  "command": "npx.cmd",
  "args": ["-y", "dataiku-mcp"],
  "env": {
    "DATAIKU_URL": "https://your-dss-instance.app.dataiku.io",
    "DATAIKU_API_KEY": "your_api_key",
    "DATAIKU_PROJECT_KEY": "YOUR_PROJECT_KEY"
  }
}
You can also run TypeScript directly during development:
{
  "command": "npx",
  "args": ["tsx", "/absolute/path/to/Dataiku_MCP/src/index.ts"],
  "env": {
    "DATAIKU_URL": "https://your-dss-instance.app.dataiku.io",
    "DATAIKU_API_KEY": "your_api_key",
    "DATAIKU_PROJECT_KEY": "YOUR_PROJECT_KEY"
  }
}
Claude Desktop: open Settings -> Developer -> Edit Config, then add this under mcpServers in claude_desktop_config.json:
{
  "mcpServers": {
    "dataiku": {
      "command": "npx",
      "args": ["-y", "dataiku-mcp"],
      "env": {
        "DATAIKU_URL": "https://your-dss-instance.app.dataiku.io",
        "DATAIKU_API_KEY": "your_api_key",
        "DATAIKU_PROJECT_KEY": "YOUR_PROJECT_KEY"
      }
    }
  }
}
Cursor supports both project-scoped and global MCP config:
- Project-scoped: .cursor/mcp.json
- Global: ~/.cursor/mcp.json

Example:
{
  "mcpServers": {
    "dataiku": {
      "command": "npx",
      "args": ["-y", "dataiku-mcp"],
      "env": {
        "DATAIKU_URL": "https://your-dss-instance.app.dataiku.io",
        "DATAIKU_API_KEY": "your_api_key",
        "DATAIKU_PROJECT_KEY": "YOUR_PROJECT_KEY"
      }
    }
  }
}
Cline: add this to cline_mcp_settings.json:
{
  "mcpServers": {
    "dataiku": {
      "command": "npx",
      "args": ["-y", "dataiku-mcp"],
      "env": {
        "DATAIKU_URL": "https://your-dss-instance.app.dataiku.io",
        "DATAIKU_API_KEY": "your_api_key",
        "DATAIKU_PROJECT_KEY": "YOUR_PROJECT_KEY"
      }
    }
  }
}
This repo already includes a project-scoped MCP file at .mcp.json.
The checked-in .mcp.json uses node node_modules/tsx/dist/cli.mjs src/index.ts for cross-platform startup (including Windows); run npm ci first.
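Such a file typically looks like the sketch below. This is illustrative only; check the repo's actual .mcp.json, since the server name and env handling there may differ:

```json
{
  "mcpServers": {
    "dataiku": {
      "command": "node",
      "args": ["node_modules/tsx/dist/cli.mjs", "src/index.ts"],
      "env": {
        "DATAIKU_URL": "https://your-dss-instance.app.dataiku.io",
        "DATAIKU_API_KEY": "your_api_key",
        "DATAIKU_PROJECT_KEY": "YOUR_PROJECT_KEY"
      }
    }
  }
}
```

Invoking tsx through node (rather than a shell script) is what makes the startup command work on Windows clients that launch commands without a shell.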
This repo includes a manual GitHub Actions release workflow:
- Workflow file: .github/workflows/release.yml
- Trigger: Actions -> Release NPM Package -> Run workflow

Inputs:
- bump: patch | minor | major
- version: optional exact version (overrides bump)
- publish: whether to publish to npm

Required repository configuration:
- NPM_RELEASE_ENABLED=true
- NPM_PUBLISH_ACCESS=public

The workflow will:
- run on main.
- publish to npm (when publish=true).

Trusted publishing setup (npm):
On npm, open https://www.npmjs.com/package/dataiku-mcp -> Settings -> Trusted Publisher and configure:
- Publisher: GitHub Actions
- Organization or user: clssck
- Repository: Dataiku_MCP
- Workflow filename: release.yml

This repo is configured for MCP Registry publishing:
- server.json
- .github/workflows/publish-mcp-registry.yml
- mcpName in package.json

Server namespace:
io.github.clssck/dataiku-mcp

Publish paths:
- Manual: run Publish to MCP Registry in GitHub Actions.
- Via release: run the release workflow with publish=true (it triggers MCP Registry publish).

Validation notes:
- server.json.name must match package.json.mcpName.
- server.json.packages[].identifier + version must reference a real npm publish.

After adding the server in a client, run:
- project with { "action": "map", "projectKey": "YOUR_PROJECT_KEY" } (defaults to maxNodes=300, maxEdges=600; override as needed)

You should receive a flow summary in text and normalized nodes, edges, stats, roots, and leaves under structuredContent.map.
When truncation limits are applied (default maxNodes=300, maxEdges=600), structuredContent.truncation reports before/after node+edge counts and whether truncation occurred.
Behavior notes:
- project.map returns a compact text summary; the full normalized graph is in structuredContent.map.
- job.wait and job.buildAndWait include structuredContent.normalizedState with one of terminalSuccess | terminalFailure | timeout | nonTerminal while preserving the raw DSS state.
- With DATAIKU_DEBUG_LATENCY=1, responses include per-tool and per-API-call latency metrics under structuredContent.debug.latency.
- Many actions accept limit/offset (and action-specific caps like maxNodes, maxEdges, maxKeys, maxPackages) to page or expand results when needed.
- dataset.get and job.get are summary-first by default; pass includeDefinition=true to include the full DSS JSON in structuredContent.definition.
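Truncation reporting (default maxNodes=300, maxEdges=600, with before/after counts under structuredContent.truncation) can be illustrated by the sketch below; truncateGraph and the Truncation shape are hypothetical, assumed from the documented behavior rather than taken from the server's code:

```typescript
// Assumed shape of a truncation report: before/after counts plus a flag.
interface Truncation {
  truncated: boolean;
  nodesBefore: number;
  nodesAfter: number;
  edgesBefore: number;
  edgesAfter: number;
}

// Keep at most maxNodes nodes and maxEdges edges, and report what was cut.
function truncateGraph<N, E>(
  nodes: N[],
  edges: E[],
  maxNodes = 300,
  maxEdges = 600,
): { nodes: N[]; edges: E[]; truncation: Truncation } {
  const keptNodes = nodes.slice(0, maxNodes);
  const keptEdges = edges.slice(0, maxEdges);
  return {
    nodes: keptNodes,
    edges: keptEdges,
    truncation: {
      truncated:
        keptNodes.length < nodes.length || keptEdges.length < edges.length,
      nodesBefore: nodes.length,
      nodesAfter: keptNodes.length,
      edgesBefore: edges.length,
      edgesAfter: keptEdges.length,
    },
  };
}
```

A client can check the truncated flag and, if set, re-issue project.map with larger maxNodes/maxEdges values to retrieve the full flow.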