Server data from the Official MCP Registry
MCP server for AI-enhanced prompt engineering and request conversion.
Valid MCP server (2 strong, 4 medium validity signals). 4 known CVEs in dependencies (0 critical, 3 high severity). Package registry verified. Imported from the Official MCP Registry.
6 files analyzed · 5 issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
Add this to your MCP configuration file:
{
"mcpServers": {
"io-github-aungmyokyaw-betterprompt-mcp": {
"args": [
"-y",
"betterprompt-mcp"
],
"command": "npx"
}
}
}
From the project's GitHub README:
BetterPrompt MCP is a Model Context Protocol (MCP) server that enhances user requests using advanced prompt engineering techniques. It exposes a single, powerful tool that transforms simple requests into structured, context-rich instructions tailored for optimal AI model performance.
Instead of manually crafting detailed prompts, BetterPrompt MCP converts your requests into expertly engineered prompts that get better results from AI models.
Before & After Example
Without BetterPrompt:
"Write a function to calculate fibonacci numbers"
With BetterPrompt Enhancement:
"You are a world-class AI assistant with expertise in advanced prompt engineering techniques from top AI research labs like Anthropic, OpenAI, and Google DeepMind.
Your task is to provide an exceptional response to the following user request:
"Write a function to calculate fibonacci numbers"
Please enhance your response by:
Structure your response with clear headings, detailed explanations, and examples where appropriate. Ensure your answer is comprehensive, actionable, and directly addresses all aspects of the request."
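The transformation above is essentially a template wrap around the raw request. A minimal TypeScript sketch of the idea (the function name enhanceRequest is illustrative and the template text mirrors the example above; the server's actual source may differ):

```typescript
// Hypothetical sketch: wrap a raw user request in the enhancement template
// shown in the before/after example. Not the server's real implementation.
function enhanceRequest(request: string): string {
  return [
    "You are a world-class AI assistant with expertise in advanced prompt",
    "engineering techniques from top AI research labs like Anthropic, OpenAI,",
    "and Google DeepMind.",
    "",
    "Your task is to provide an exceptional response to the following user request:",
    "",
    `"${request}"`,
    "",
    "Please enhance your response by:",
    "Structure your response with clear headings, detailed explanations, and",
    "examples where appropriate. Ensure your answer is comprehensive, actionable,",
    "and directly addresses all aspects of the request.",
  ].join("\n");
}

console.log(enhanceRequest("Write a function to calculate fibonacci numbers"));
```

In the real server the enhancement is produced via MCP sampling rather than a fixed string template, so the output adapts to the request.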
Install and run via npx:
npx -y betterprompt-mcp
Or add to your MCP client configuration:
{
"mcpServers": {
"betterprompt": {
"command": "npx",
"args": ["-y", "betterprompt-mcp"]
}
}
}
Most MCP clients work with this standard config:
{
"mcpServers": {
"betterprompt": {
"command": "npx",
"args": ["-y", "betterprompt-mcp"]
}
}
}
Pick your client below. Where available, click the install button; otherwise follow the manual steps.
Fallback (CLI):
code --add-mcp '{"name":"betterprompt","command":"npx","args":["-y","betterprompt-mcp"]}'
Or add manually: Settings → MCP → Add new MCP Server → Type: command, Command: npx -y betterprompt-mcp.
Or manually: Program → Install → Edit mcp.json, add the standard config above.
Install button: TODO – no public deeplink available yet.
Manual setup:
mcpServers entry:
{
"mcpServers": {
"betterprompt": {
"command": "npx",
"args": ["-y", "betterprompt-mcp"]
}
}
}
Restart Continue if needed.
Or manually: Advanced settings → Extensions → Add custom extension → Type: STDIO → Command: npx -y betterprompt-mcp.
Install via CLI:
claude mcp add betterprompt npx -y betterprompt-mcp
Add to claude_desktop_config.json using the standard config above, then restart Claude Desktop. See the MCP quickstart:
Model Context Protocol – Quickstart
Follow the Windsurf MCP documentation and use the standard config above.
Follow the Gemini CLI MCP server guide; use the standard config above.
Docs: Configure MCP server in Gemini CLI
Open Qodo Gen chat panel → Connect more tools → + Add new MCP → Paste the standard config above → Save.
Create or edit ~/.config/opencode/opencode.json:
{
"$schema": "https://opencode.ai/config.json",
"mcp": {
"betterprompt": {
"type": "local",
"command": ["npx", "-y", "betterprompt-mcp"],
"enabled": true
}
}
}
enhance-request: Transforms user requests into world-class AI-enhanced prompts using advanced prompt engineering techniques.
Input:
request (string, required): The user request to transform into an enhanced AI prompt
Output: AI-enhanced prompt with structure, context, and clear instructions.
Example Usage:
{
"name": "enhance-request",
"arguments": {
"request": "Write a function to calculate fibonacci numbers"
}
}
Request:
{
"name": "enhance-request",
"arguments": {
"request": "Explain quantum computing"
}
}
Enhanced Result:
"You are a world-class AI assistant with expertise in advanced prompt engineering techniques from top AI research labs like Anthropic, OpenAI, and Google DeepMind.
Your task is to provide an exceptional response to the following user request:
"Explain quantum computing"
Please enhance your response by:
Structure your response with clear headings, detailed explanations, and examples where appropriate. Ensure your answer is comprehensive, actionable, and directly addresses all aspects of the request."
BetterPrompt MCP leverages the MCP Sampling API to enhance user requests: when you call the enhance-request tool, the server sends a sampling request to your MCP client, which performs the model call on the server's behalf.
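Under MCP sampling, the server never calls a model itself; it sends a sampling/createMessage JSON-RPC request back to the client over the same connection. A rough, self-contained sketch of what such a message looks like (field values such as maxTokens are illustrative; see the MCP specification for the full schema):

```typescript
// Hypothetical sketch of the JSON-RPC message an MCP server emits when it
// asks the client to run a model completion (MCP "sampling/createMessage").
interface SamplingRequest {
  jsonrpc: "2.0";
  id: number;
  method: "sampling/createMessage";
  params: {
    messages: { role: "user" | "assistant"; content: { type: "text"; text: string } }[];
    maxTokens: number;
  };
}

function buildSamplingRequest(id: number, prompt: string): SamplingRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "sampling/createMessage",
    params: {
      // The prompt here would be the user's raw request plus any
      // enhancement instructions the server adds.
      messages: [{ role: "user", content: { type: "text", text: prompt } }],
      maxTokens: 1024,
    },
  };
}

console.log(JSON.stringify(buildSamplingRequest(1, "Explain quantum computing"), null, 2));
```

In the actual server this message would be issued through the MCP SDK rather than constructed by hand; the sketch only shows the shape of the round trip.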
betterprompt-mcp/
├── src/
│ └── index.ts # Main server implementation
├── tests/ # Test files and verification scripts
├── dist/ # Compiled output (generated)
├── package.json # Dependencies and scripts
├── tsconfig.json # TypeScript configuration
└── README.md # Documentation
Build:
npm run build
Watch (dev):
npm run watch
Format:
npm run format
npm run format:check
Test:
npm run test:comprehensive
We use ESLint + Prettier to keep the codebase consistent.
Lint: npm run lint
Auto-fix: npm run lint -- --fix or npm run lint:fix
CI report: npm run lint:ci (produces artifacts/lint-report.json)
Auto-fix and commit: scripts/lint-autofix-and-commit.sh. The script uses a conservative heuristic (small change threshold) and will abort auto-commit when changes appear large or potentially behavior-affecting; in such cases open a PR for human review.
MIT License
For questions or issues, open an issue on GitHub or contact the author via GitHub profile.
Aung Myo Kyaw (GitHub)