MCP server to enforce development workflow discipline
Add this to your MCP configuration file:
{
"mcpServers": {
"mcp-server": {
"args": [
"-y",
"@programinglive/dev-workflow-mcp-server"
],
"command": "npx"
}
}
}
An MCP (Model Context Protocol) server that helps enforce development discipline and workflow best practices. This server acts as your coding conscience, reminding you to follow proper development workflows.
This MCP server guides you through a disciplined development workflow:
- tools/handlers.js now re-exports focused handler modules, improving maintainability and bundler behavior.
- tests/handlers.test.js has been split into handlers-*.test.js files with shared helpers, reducing duplication and clarifying intent.
- tests/test-helpers.js centralizes workflow state setup, request builders, and git mocks.
- docs/PRD.md root, status bumped to v1.8.0, and release notes updated to reflect the current workflow.
- npm run release:<type> followed by git push --follow-tags origin main to keep stages clean.
Each project gets its own isolated workflow state file.
npm install @programinglive/dev-workflow-mcp-server
This will automatically create a .state/workflow-state.json file in the project where you ran npm install (using npm's INIT_CWD), keeping workflow history separate per project. If you're installing the package itself (inside node_modules), the script skips creation so it never pollutes the shared package directory.
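A sketch of that postinstall behavior, assuming the logic described above (this is not the package's actual install script; the function name is illustrative):

```shell
# Sketch of the postinstall guard (assumed logic): npm sets INIT_CWD to the
# directory where `npm install` was run. Skip state creation when installing
# inside node_modules so the shared package directory stays clean.
ensure_state_file() {
  init_cwd="$1"
  case "$init_cwd" in
    ""|*/node_modules/*)
      echo "skipping state creation" ;;
    *)
      mkdir -p "$init_cwd/.state"
      [ -f "$init_cwd/.state/workflow-state.json" ] || printf '{}\n' > "$init_cwd/.state/workflow-state.json"
      echo "state file ready" ;;
  esac
}

ensure_state_file "${INIT_CWD:-}"
```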
git clone https://github.com/programinglive/dev-workflow-mcp-server.git
cd dev-workflow-mcp-server
npm install
Windows prerequisites: Installing dependencies from source compiles native modules such as better-sqlite3. Make sure Python 3 (added to PATH) and the Visual Studio Build Tools "Desktop development with C++" workload are installed before running npm install. Without them, npm will fail with a "need python" or build error.
Plesk supports Node.js applications through its Node.js extension. To deploy the MCP server on a Plesk subscription:
Clone the repository into your subscription's document root (e.g., httpdocs/dev-workflow-mcp-server). From SSH you can run:
cd httpdocs
git clone https://github.com/programinglive/dev-workflow-mcp-server.git
cd dev-workflow-mcp-server
Install dependencies (npm install --production over SSH). Linux hosts already ship the Python/build toolchain required for better-sqlite3; if your plan uses a Windows host, install Python 3 and the Visual Studio Build Tools beforehand or ask your provider to enable them.
Optionally set environment variables such as DEV_WORKFLOW_USER_ID or DEV_WORKFLOW_STATE_FILE. This keeps state files outside the web root if desired.
In the Plesk Node.js panel, set the application startup file to index.js and Application mode to production. Plesk will run the server with node index.js.
Tip: The MCP server communicates over stdio. If you only need it as a CLI tool, you can also run npx @programinglive/dev-workflow-mcp-server directly in an SSH session without keeping it running under the Node.js panel.
Important: MCP clients (Windsurf, Claude Desktop, etc.) must launch the server process locally via stdio. Hosting the dashboard on a public domain does not expose the MCP interface. Without SSH or another way to execute
node index.js on the server, users cannot connect their MCP clients to the hosted instance.
Deploy the MCP server to Google Cloud Compute Engine using Docker and PostgreSQL for a production-ready, cloud-hosted setup.
Quick Start:
1. Run bash scripts/setup-gcp-instance.sh
2. Configure .env
3. Start the stack with docker-compose up -d
See the GCP Deployment Guide for complete step-by-step instructions.
Run from source with node index.js. This runs directly from source and requires no build step. Recommended for MCP usage.
Optionally run npm run build once to generate dist/. This creates an optimized bundle but isn't needed for MCP usage.
Point your MCP client to the server entry point. Replace <PROJECT_ROOT> with the absolute path to this repository on your machine.
Windsurf on macOS (~/Library/Application Support/Windsurf/config.json):
{
"mcpServers": {
"dev-workflow": {
"command": "node",
"cwd": "<PROJECT_ROOT>",
"args": ["index.js"],
"env": {
"DEV_WORKFLOW_DB_TYPE": "postgres",
"DEV_WORKFLOW_DB_URL": "postgres://USER:PASS@HOST:5432/devworkflow"
}
}
}
}
Claude Desktop on macOS (~/Library/Application Support/Claude/claude_desktop_config.json):
{
"mcpServers": {
"dev-workflow": {
"command": "node",
"args": ["<PROJECT_ROOT>/index.js"]
}
}
}
Windsurf on Windows (%APPDATA%\Windsurf\config.json):
{
"mcpServers": {
"dev-workflow": {
"command": "node",
"args": ["<PROJECT_ROOT>\\index.js"]
}
}
}
Claude Desktop on Windows (%APPDATA%\Claude\claude_desktop_config.json):
{
"mcpServers": {
"dev-workflow": {
"command": "node",
"args": ["<PROJECT_ROOT>\\index.js"]
}
}
}
Windsurf on Linux (~/.config/windsurf/config.json):
{
"mcpServers": {
"dev-workflow": {
"command": "node",
"args": ["<PROJECT_ROOT>/index.js"]
}
}
}
Claude Desktop on Linux (~/.config/claude/claude_desktop_config.json):
{
"mcpServers": {
"dev-workflow": {
"command": "node",
"args": ["<PROJECT_ROOT>/index.js"]
}
}
}
Note: On Windows, paths in JSON require escaped backslashes (e.g., "C:\\path\\to\\project").
If you keep this repository checked out at /Users/alex/code/dev-workflow-mcp-server and want to point Windsurf at a hosted PostgreSQL instance, drop the following into ~/Library/Application Support/Windsurf/mcp_config.json:
{
"mcpServers": {
"dev-workflow": {
"command": "node",
"cwd": "/Users/alex/code/dev-workflow-mcp-server",
"args": ["index.js"],
"env": {
"DEV_WORKFLOW_DB_TYPE": "postgres",
"DEV_WORKFLOW_DB_URL": "postgres://devworkflow:devworkflow_secure_password@34.50.121.142:5432/devworkflow"
}
}
}
}
This mirrors the Windows configuration shared in previous releases, but avoids npx lookup issues on macOS by launching the local index.js directly.
After adding the configuration, restart the application to load the MCP server.
Antigravity users should configure the MCP server in their mcp_config.json.
Windows: %APPDATA%\Antigravity\mcp_config.json or C:\Users\<USERNAME>\.gemini\antigravity\mcp_config.json
{
"mcpServers": {
"dev-workflow": {
"command": "node",
"args": ["<PROJECT_ROOT>\\index.js"]
}
}
}
See Antigravity Getting Started for detailed instructions and troubleshooting.
To ensure the fastest possible startup (critical for IDE-integrated clients like Claude Desktop), we recommend pointing directly to index.js using node rather than npx. This avoids the overhead of checking for package updates.
Optimized Configuration:
{
"mcpServers": {
"dev-workflow": {
"command": "node",
"args": ["<PROJECT_ROOT>\\index.js"],
"env": {
"DEV_WORKFLOW_USER_ID": "your_user_id"
}
}
}
}
The server automatically times its own startup and logs it to stderr:
Dev Workflow MCP Server running on stdio (startup took 15ms)
The server supports SQLite (default), MySQL, and PostgreSQL.
To use a .env file, copy .env.example to .env:
cp .env.example .env
Then edit .env with your settings:
DEV_WORKFLOW_DB_TYPE=mysql
DEV_WORKFLOW_DB_URL=mysql://user:pass@localhost:3306/db
Alternatively, export the variables directly.
| Variable | Description | Default |
|---|---|---|
| DEV_WORKFLOW_DB_TYPE | Database driver (sqlite, mysql, postgres) | sqlite |
| DEV_WORKFLOW_DB_URL | Connection string for MySQL/Postgres | null |
| DEV_WORKFLOW_DB_PATH | Override path for the SQLite database file | <project>/.state/dev-workflow.db |
MySQL:
export DEV_WORKFLOW_DB_TYPE=mysql
export DEV_WORKFLOW_DB_URL="mysql://user:password@localhost:3306/dev_workflow"
PostgreSQL:
export DEV_WORKFLOW_DB_TYPE=postgres
export DEV_WORKFLOW_DB_URL="postgresql://user:password@localhost:5432/dev_workflow"
To ensure compatibility with existing reporting dashboards, the PostgresAdapter and MysqlAdapter automatically normalize column names:
- task_description → DB column: description
- timestamp → DB column: completed_at
The adapters use aliases in queries so the MCP tools still receive the expected task_description and timestamp fields.
These databases use an INTEGER column for user_id.
- Numeric IDs (e.g., "1") are parsed directly into integers.
- String IDs (e.g., "programinglive") are automatically hashed into a consistent, positive integer to ensure compatibility with the schema while maintaining unique user isolation.

| Variable | Description |
|---|---|
| DEV_WORKFLOW_USER_ID | Override the auto-generated user ID (e.g., set to your name/email) |
| DEV_WORKFLOW_STATE_FILE | Override the location of the workflow-state.json file |
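The numeric/string split for user IDs can be sketched as follows. The package's actual hash algorithm is not documented here, so this sketch uses POSIX cksum purely as a stand-in to show the idea of mapping a string to a stable positive integer:

```shell
# Illustration only: NOT the package's real hashing. Numeric IDs pass
# through; string IDs map to a stable positive integer (here via cksum).
hash_user_id() {
  case "$1" in
    *[!0-9]*) printf '%s' "$1" | cksum | awk '{print $1}' ;; # string -> hashed integer
    *)        printf '%s\n' "$1" ;;                          # numeric -> used as-is
  esac
}

hash_user_id 1
hash_user_id programinglive
```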
If you use multiple AI coding tools simultaneously (e.g., Antigravity and Windsurf) on the same project, they will share the same workflow state by default.
To maintain separate, distinct sessions for each tool, configure a unique DEV_WORKFLOW_USER_ID for each.
Antigravity Config (mcp_config.json):
{
"mcpServers": {
"dev-workflow": {
"command": "node",
"args": ["path/to/server/index.js"],
"env": {
"DEV_WORKFLOW_USER_ID": "antigravity_user"
}
}
}
}
Windsurf Config: Add the environment variable in your Windsurf MCP settings:
{
"mcpServers": {
"dev-workflow": {
"command": "node",
"args": ["path/to/server/index.js"],
"env": {
"DEV_WORKFLOW_USER_ID": "windsurf_user"
}
}
}
}
The project uses the Node.js native test runner (node --test).
npm test
This runs the full test suite, including .env configuration checks.
To verify the MySQL or PostgreSQL adapters, run the tests with the corresponding environment variable set:
# Test MySQL
export DEV_WORKFLOW_DB_URL="mysql://root:pass@localhost:3306/test_db"
node --test tests/db-adapters.test.js
# Test PostgreSQL
export DEV_WORKFLOW_DB_URL="postgres://postgres:pass@localhost:5432/test_db"
node --test tests/db-adapters.test.js
- npm run build - Bundle the source into dist/index.mjs for distribution
- npm run dev - Run in development mode with file watching
- npm run local - Alias for running from source (same as npm start)
- npm run web - Launch the lightweight workflow dashboard for browsing task history (see Web Dashboard docs)
npm run web starts the dashboard defined in web/server.js, giving you a quick view of workflow history and summary statistics.
npm run web
# 🌐 Dev Workflow Dashboard running at http://localhost:3111
- The dashboard honors PORT (common on hosts like Plesk/Render) or DEV_WORKFLOW_WEB_PORT before falling back to auto-selection.
- ?user=<id> lets you inspect another user's history (defaults to default).
- GET /api/version → current package version from package.json (used by the dashboard to display the version dynamically).
- GET /api/summary?user=<id> → overall stats for the user.
- GET /api/history?user=<id>&page=1&pageSize=20&startDate=YYYY-MM-DD&endDate=YYYY-MM-DD → paginated task history.
- GET /api/history-summary?user=<id>&frequency=daily|monthly|yearly → aggregated counts over time.
Open http://localhost:3111 in a browser to view the dashboard UI (web/index.html).
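The port-selection order described above might look like this as a sketch. The real server auto-selects a free port as its final fallback; this one-liner simplifies that to the default 3111:

```shell
# Sketch of the dashboard port resolution (assumed logic):
# PORT wins, then DEV_WORKFLOW_WEB_PORT, then a default.
pick_port() {
  printf '%s\n' "${PORT:-${DEV_WORKFLOW_WEB_PORT:-3111}}"
}

pick_port
```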
Running npm run build generates:
- dist/index.mjs - Optimized ES module bundle
- dist/docs/ - Pre-rendered HTML documentation generated from Markdown via scripts/build-docs.js
The build bundles all source files while externalizing Node.js built-in modules and dependencies, resulting in a single-file distribution.
For MCP server usage, point your client at index.js (source) to avoid stdio transport compatibility issues. The built dist/index.mjs is primarily for:
This package is configured for the official MCP Tools Registry using npm package deployment:
- package.json declares mcpName: "io.github.programinglive/dev-workflow-mcp-server".
- server.json describes the server and links it to the npm package @programinglive/dev-workflow-mcp-server.
To publish a new server version to the registry:
1. Run npm test
2. Run npm run release:patch (runs your existing release pipeline and publishes to npm)
3. Confirm the published version with npm view @programinglive/dev-workflow-mcp-server version
4. Install the publisher CLI: brew install mcp-publisher (or follow the docs at https://modelcontextprotocol.info/tools/registry/publishing/)
5. mcp-publisher login github
6. mcp-publisher publish
7. Verify: curl "https://registry.modelcontextprotocol.io/v0/servers?search=io.github.programinglive/dev-workflow-mcp-server"
When invoking the lightweight CLI from PowerShell, use --% to prevent PowerShell from rewriting JSON arguments, for example:
node --% index.js call start_task --args "{\"description\":\"Convert docs to HTML during build\",\"type\":\"feature\"}"
The --% prefix and escaped double quotes ensure the JSON reaches the MCP server unchanged.
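On macOS/Linux no --% is needed; single quotes already deliver the JSON unchanged. A quick demonstration (the node invocation in the trailing comment is the same CLI call shown above):

```shell
# POSIX shells: single quotes keep the JSON payload intact, so no --% or
# backslash-escaped quotes are needed.
ARGS='{"description":"Convert docs to HTML during build","type":"feature"}'
printf '%s\n' "$ARGS"
# Pass it along as: node index.js call start_task --args "$ARGS"
```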
When you install this package in a project, a .state/workflow-state.json file is automatically created in your project root. This file:
- Stays local to each machine (ignored via .gitignore by default)
- Is never duplicated under build outputs such as dist/. The MCP server walks back to the project root (looking for .git or package.json) before reading or writing workflow state, so you never need duplicate copies under build directories.
Each project maintains its own isolated workflow history, so you can work on multiple projects without mixing their histories. Within that .state directory, the MCP server automatically creates a unique per-user subdirectory (e.g., .state/users/user-abc123/). The generated identifier persists locally so multiple developers sharing the same repository never clobber each other's workflow files. If you prefer a specific name, set DEV_WORKFLOW_USER_ID before launching the server and that value will be used instead of the auto-generated ID.
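The root-walking behavior can be sketched as a small shell function (assumed logic; the server implements this in JavaScript, and the function name here is illustrative):

```shell
# Sketch of the project-root walk (assumed logic): ascend from a starting
# directory until a .git entry or package.json is found.
find_project_root() {
  dir="$1"
  while [ -n "$dir" ] && [ "$dir" != "/" ]; do
    if [ -e "$dir/.git" ] || [ -e "$dir/package.json" ]; then
      printf '%s\n' "$dir"
      return 0
    fi
    dir=$(dirname "$dir")
  done
  return 1
}
```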
Use cases:
Let the server choose – Do nothing and the first time you run any MCP tool the server creates .state/users/<random-id>/. The dashboard User ID filter accepts that value (visible in the folder name or in workflow responses).
Set an explicit ID – Before starting the server, export DEV_WORKFLOW_USER_ID:
# macOS/Linux
export DEV_WORKFLOW_USER_ID=alice
node index.js
# Windows PowerShell
$env:DEV_WORKFLOW_USER_ID = "alice"
node index.js
Now all history for that session lands in .state/users/alice/ and the dashboard can be filtered with alice.
Multiple users on one host – Run separate processes (or MCP clients) with different DEV_WORKFLOW_USER_ID values. Each user’s workflow state remains isolated.
Tip: The web dashboard simply reads existing records. Typing a new value into the
User IDfilter will only return results after a workflow session has written history into.state/users/<that-id>/.
If you're using this package, add this to your project's .gitignore:
.state/
This keeps workflow state local to each developer's machine.
Need to override the location? Set
DEV_WORKFLOW_STATE_FILE=/absolute/path/to/your/project/.state/workflow-state.jsonbefore launching the server (or inside your MCP client config). The server will honor that path, letting you keep the package installed centrally while maintaining per-project workflow history.
- start_task - Begin a new coding task
- mark_bug_fixed - Mark the feature/bug as fixed (requires tests next)
- create_tests - Mark that tests have been created
- skip_tests - Skip tests with justification
- run_tests - Record test results (must pass to proceed)
- create_documentation - Mark documentation as created
- check_ready_to_commit - Verify all steps are complete
- commit_and_push - Commit and push changes
- perform_release - Record release details (or use skip_release when the project has no release automation)
- complete_task - Mark task as complete and reset
- force_complete_task - Force completion with reason
- drop_task - Abandon current task
- get_workflow_status - Show current status
- view_history - View completed tasks
- continue_workflow - Get next-step guidance
- rerun_workflow - Reset and restart the current task from the beginning
- run_full_workflow - Execute every workflow step in sequence with a single command (requires supplying the details for each phase)
run_full_workflow
Use this when you already have all the information needed for each workflow phase and want to execute them in one go.
{
"summary": "Add payment webhooks",
"testCommand": "npm test",
"documentationType": "README",
"documentationSummary": "Document webhook configuration",
"commitMessage": "feat: add payment webhooks",
"releaseCommand": "npm run release:minor",
"releaseNotes": "Release webhook support",
"branch": "feature/payments",
"testsPassed": true,
"testDetails": "node --test; 42 tests",
"releaseType": "minor",
"preset": "minor"
}
The tool will:
1. Call mark_bug_fixed using summary
2. Call create_tests
3. Call run_tests with testsPassed, testCommand, and optional testDetails
4. Call create_documentation with documentationType and documentationSummary (docs/product/PRD.md must exist before documentation can be marked complete)
5. Call check_ready_to_commit
6. Call commit_and_push with commitMessage and optional branch
7. Call perform_release with releaseCommand, plus optional releaseNotes, releaseType, and preset (or skip_release with a justification when the repository has no Node-based release step, e.g., Python-only or docs-only tasks)
8. Call complete_task reusing commitMessage
All arguments except the optional flags are required and must be non-empty strings.
The create_documentation step enforces that a PRD (Product Requirements Document) exists at docs/product/PRD.md before documentation can be marked as complete. This ensures all projects maintain a current PRD that describes the product's goals, features, and requirements.
The package ships with a release guard (release-wrapper.js) that backs the npm run release:* scripts. The guard refuses to run unless:
- check_ready_to_commit and commit_and_push have been completed
If any requirements are missing, the guard exits with guidance to return to the MCP tools. This prevents accidentally bumping versions or tagging releases outside the managed workflow. To release correctly:
1. Run perform_release {"command":"patch"} (or minor/major) via the MCP client, or skip_release {"reason":"<explanation>"} if no release applies.
2. Finish with complete_task.
This repository ships with .github/workflows/npm-publish.yml, which publishes the package to npm whenever a git tag matching v* is pushed (for example, v1.1.14). To enable the workflow:
- Create an npm automation token (npm token create --read-only false).
- Add a repository secret named NPM_TOKEN containing that token.
- Cut releases with npm run release:<type> so the workflow triggers.
- Make sure npm run build succeeds locally; the workflow runs the build before publishing so broken bundles block the release.
- The workflow publishes with npm publish --provenance. Leave GitHub Actions' default OIDC permissions enabled so the job can request an ID token.
- Keep the repository.url field in package.json pointing at this GitHub repo. Provenance validation fails if it does not match the repository that built the package.
The workflow verifies that the tag version matches package.json before publishing and fails fast if they diverge.
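The tag/version guard can be sketched like this (assumed logic; the actual GitHub Actions step may differ, and the function name is illustrative):

```shell
# Sketch of the workflow's tag/version guard (assumed logic): compare a git
# tag like v1.1.14 against the version field in package.json and fail fast
# on mismatch.
check_tag_matches_package() {
  tag="$1"  # e.g. $GITHUB_REF_NAME in CI
  pkg_version=$(sed -n 's/.*"version"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/\1/p' package.json | head -n 1)
  if [ "${tag#v}" != "$pkg_version" ]; then
    echo "tag $tag does not match package.json version $pkg_version" >&2
    return 1
  fi
  echo "version check passed: $pkg_version"
}
```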
All tool invocations validate their arguments payload before running:
Example (stringified JSON object):
{
"name": "start_task",
"arguments": "{\"description\":\"Add reporting endpoint\",\"type\":\"feature\"}"
}
start_task
Start a new coding task. This is your first step - be conscious about what you're coding.
Parameters:
- description (string, required): Clear description of what you're going to code
- type (enum, required): Type of task - "feature", "bugfix", "refactor", or "other"
Example:
Use the start_task tool with:
- description: "Add user authentication to the login page"
- type: "feature"
mark_bug_fixed
Mark that the bug/feature is fixed. Reminder: Now you MUST create tests!
Parameters:
- summary (string, required): Brief summary of what was fixed/implemented
create_tests
Confirm that you've created the necessary tests covering your change. Required before recording test results.
Parameters: none
skip_tests
Record an explicit justification when automated tests aren't feasible. Marks testing as satisfied so you can proceed with documentation and verification, while flagging the task for manual QA.
Parameters:
- reason (string, required): Why automated tests were skipped
run_tests
Record test results. NEVER commit if tests fail! Only proceed if all tests are green.
Parameters:
- passed (boolean, required): Did all tests pass?
- testCommand (string, required): The test command that was run
- details (string, optional): Test results details
Example:
Use run_tests with:
- passed: true
- testCommand: "npm test"
- details: "All 15 tests passed"
create_documentation
Mark that documentation has been created/updated. This is required before committing.
Parameters:
- documentationType (enum, required): "PRD", "README", "RELEASE_NOTES", "inline-comments", "API-docs", "changelog", or "other"
- summary (string, required): What was documented
check_ready_to_commit
Check if all workflow steps are completed and you're ready to commit & push.
commit_and_push
Automatically run git add, git commit, and git push after the ready check passes.
Auto-detection of primary branch: If no branch is specified, the tool automatically detects your project's primary branch by checking for origin/main first, then falling back to origin/master. This eliminates the need to specify the branch parameter for most projects.
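The detection order might be sketched as follows (assumed logic; the tool implements this internally in JavaScript, and the function name is illustrative):

```shell
# Sketch of primary-branch auto-detection (assumed logic): prefer
# origin/main, fall back to origin/master.
detect_primary_branch() {
  if git show-ref --verify --quiet refs/remotes/origin/main; then
    echo main
  elif git show-ref --verify --quiet refs/remotes/origin/master; then
    echo master
  else
    echo "no origin/main or origin/master found" >&2
    return 1
  fi
}
```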
Parameters:
- commitMessage (string, required): Conventional commit message to use
- branch (string, optional): Target branch to push. If omitted, auto-detects primary branch (main or master)
perform_release
Record the release after you've committed and pushed. Required before you can complete the task.
Parameters:
- command (string, required): Release command that was executed (e.g., npm run release)
- notes (string, optional): Additional release notes
complete_task
Mark the task as complete after successful commit & push. Resets the workflow for the next task.
Parameters:
- commitMessage (string, required): The commit message used
drop_task
Abandon the current task without completing the workflow. Preserves an audit entry with context, then resets the state so you can start fresh.
Parameters:
- reason (string, optional): Additional detail about why the task was dropped
get_workflow_status
Get current workflow status and what needs to be done next.
view_history
View workflow history of completed tasks.
Parameters:
- limit (number, optional): Number of recent tasks to show (default: 10)
workflow_reminder
Get a complete reminder of the development workflow discipline.
pre_commit_checklist
Get a pre-commit checklist to ensure nothing is missed before committing.
Here's how you'd use this MCP server in a typical coding session:
Start your task:
Ask Cascade to use start_task:
"Start a new task: implementing user profile page, type: feature"
Code your feature/fix
Mark as fixed:
"Mark the feature as fixed: User profile page with avatar and bio completed"
Create tests:
Run tests:
"Record test results: passed=true, command='npm test'"
Document:
"Create documentation: type=README, summary='Added user profile section to docs'"
Check readiness:
"Check if I'm ready to commit"
Commit & Push:
"Commit and push: commitMessage='feat: add user profile page with tests and docs'"
Record release:
"Record release: command='npm run release', notes='v1.2.3'"
Complete the task:
"Complete the task with commit message: 'feat: add user profile page'"
Or drop the task:
"Drop task: reason='Switching to a different feature'"
- Always begin with start_task - this sets your intention
- Use skip_tests only when absolutely necessary, and document the reason for manual QA
- Use get_workflow_status to check where you are anytime
You can modify the workflow in index.js.
The server maintains state in .state/workflow-state.json:
This file is automatically created and managed by the server. It contains local, machine-specific progress and is ignored by git so each environment can manage its own workflow history without cross-contamination.
This MCP server aligns with your existing development rules:
MIT
Feel free to customize this server to match your specific workflow needs!