Server data from the Official MCP Registry
Database management — query, schema inspection, migrations for PostgreSQL, MySQL, SQLite.
Valid MCP server (1 strong, 3 medium validity signals). No known CVEs in dependencies. Package registry verified. Imported from the Official MCP Registry. Trust signals: trusted author (5/5 approved).
Add this to your MCP configuration file:
{
"mcpServers": {
"io-github-seayniclabs-berth": {
"args": [
"berth-mcp"
],
"command": "uvx"
}
}
}

From the project's GitHub README:
A secure berth for your data -- database access for AI tools.
Berth is a Model Context Protocol server that gives AI assistants safe, structured access to PostgreSQL, SQLite, and MySQL databases. It exposes 13 tools for inspecting schemas, running queries, managing data, generating migrations, and performing backups -- all governed by a 3-tier safety model that prevents accidental damage.
Berth enforces three operating modes that control what SQL is permitted:
| Mode | Default | Allows | Blocks |
|---|---|---|---|
| read-only | Yes | SELECT, EXPLAIN | All writes |
| write | No | INSERT, UPDATE, DELETE, CREATE | DROP, TRUNCATE, ALTER DROP, DELETE without WHERE |
| admin | No | Everything | Destructive ops require a confirmation token (60s expiry) |
The server starts in read-only mode. Write and admin modes must be explicitly enabled. Destructive operations in admin mode generate a one-time confirmation token that expires after 60 seconds -- the AI must echo the token back to confirm intent.
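The one-time confirmation-token handshake described above can be sketched in a few lines of Python. This is a hypothetical simplification (helper names `issue_confirmation_token` and `confirm` are invented here), not Berth's actual implementation:

```python
import secrets
import time

TOKEN_TTL = 60  # seconds, matching the documented 60s expiry

_pending: dict[str, float] = {}  # token -> issue timestamp

def issue_confirmation_token() -> str:
    """Create a one-time token the AI must echo back to confirm a destructive op."""
    token = secrets.token_hex(8)
    _pending[token] = time.monotonic()
    return token

def confirm(token: str) -> bool:
    """Consume the token; valid only once and only within TOKEN_TTL seconds."""
    issued = _pending.pop(token, None)  # pop makes the token single-use
    if issued is None:
        return False
    return (time.monotonic() - issued) <= TOKEN_TTL
```

A destructive call without a valid, unexpired token is refused; echoing the token back once succeeds, and any replay fails because the token was consumed.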
| Tool | Description | Key Parameters |
|---|---|---|
| health | Server health check | -- |
| db_connect | Connect to a database | dsn (connection string) |
| db_query | Execute a SELECT query (auto-adds LIMIT 1000) | connection_id, sql |
| db_execute | Execute INSERT/UPDATE/DELETE (respects safety mode) | connection_id, sql, confirmation_token |
| db_schema | List tables, views, and indexes | connection_id |
| db_describe | Column details for a table | connection_id, table |
| db_relationships | Foreign key relationships | connection_id, table (optional) |
| db_size | Database and table sizes | connection_id |
| db_active_queries | Currently running queries (PostgreSQL only) | connection_id |
| db_explain | Run EXPLAIN ANALYZE on a query | connection_id, sql |
| generate_migration | Generate migration SQL by comparing schemas | connection_id + target_sql, or from_connection + to_connection |
| db_backup | Create a database backup | connection_id, output_path |
| db_restore | Restore from backup (admin mode + confirmation token) | connection_id, input_path, confirmation_token |
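The auto-LIMIT behavior of db_query can be illustrated with a rough sketch. This is an assumption about how such a guard might work (string-based for brevity; a real implementation would presumably use proper SQL parsing):

```python
import re

DEFAULT_LIMIT = 1000

def apply_default_limit(sql: str, limit: int = DEFAULT_LIMIT) -> str:
    """Append a LIMIT clause if a SELECT statement does not already have one."""
    stripped = sql.strip().rstrip(";")
    if not stripped.lower().startswith("select"):
        return sql  # only SELECTs get a cap
    if re.search(r"\blimit\s+\d+", stripped, re.IGNORECASE):
        return sql  # caller already bounded the result set
    return f"{stripped} LIMIT {limit}"
```

For example, `apply_default_limit("SELECT * FROM users")` yields `SELECT * FROM users LIMIT 1000`, while a query that already carries a LIMIT passes through unchanged.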
The generate_migration tool compares two schemas and produces dialect-aware SQL to migrate from one to the other. Two modes of operation:
Mode 1 — Live database vs. target DDL:
Provide connection_id (an active connection) and target_sql (CREATE TABLE statements describing the desired schema). Berth introspects the live database and diffs it against the parsed target.
Mode 2 — Two live databases:
Provide from_connection and to_connection (two active connection IDs). Berth introspects both and generates the migration to transform the source into the target.
What it generates:
- CREATE TABLE for new tables
- ALTER TABLE ADD COLUMN for new columns
- ALTER TABLE ALTER COLUMN / MODIFY COLUMN for type, nullability, and default changes
- CREATE INDEX / DROP INDEX for index changes
- ADD CONSTRAINT / DROP CONSTRAINT for foreign key changes
- DROP TABLE and DROP COLUMN are commented out with warnings (safety first)

Dialect handling:
- PostgreSQL: ALTER COLUMN ... TYPE, SET/DROP NOT NULL, SET/DROP DEFAULT
- MySQL: MODIFY COLUMN for all column changes, DROP INDEX ... ON table

Per-database support:

- PostgreSQL: pg_stat_activity, EXPLAIN ANALYZE, pg_dump/psql backup/restore
- SQLite: .backup/.restore via the sqlite3 CLI
- MySQL: information_schema introspection, mysqldump/mysql backup/restore

Install from PyPI:
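The core of schema diffing can be illustrated with a toy column-level comparison. This is a hypothetical simplification (the function `diff_columns` is invented for illustration; Berth's real introspection covers types, indexes, and constraints), but it mirrors the safety-first rule that drops are emitted commented out:

```python
def diff_columns(current: dict[str, str], target: dict[str, str], table: str) -> list[str]:
    """Emit DDL to move `current` columns toward `target` columns.

    Additions are emitted as runnable SQL; removals are commented out
    with a warning, mirroring the documented safety-first behavior.
    """
    stmts = []
    for col, coltype in target.items():
        if col not in current:
            stmts.append(f"ALTER TABLE {table} ADD COLUMN {col} {coltype};")
    for col in current:
        if col not in target:
            stmts.append(f"-- WARNING: destructive, review before running:\n"
                         f"-- ALTER TABLE {table} DROP COLUMN {col};")
    return stmts
```

Given a live schema `{"id": "INTEGER", "name": "TEXT"}` and a target that adds an `email TEXT` column, this produces a single `ALTER TABLE ... ADD COLUMN email TEXT;` statement.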
pip install berth-mcp
Or in an isolated environment:
pipx install berth-mcp
MySQL support requires an optional dependency:
pip install berth-mcp[mysql]
PostgreSQL (asyncpg) and SQLite (aiosqlite) drivers are included by default.
Run the server:
berth
Berth communicates over stdio using the MCP protocol. It is designed to be launched by an MCP client, not run standalone.
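Concretely, "over stdio" means the client writes JSON-RPC 2.0 messages to the server's stdin and reads responses from its stdout. A sketch of the kind of tools/call request a client might send (message shape per the MCP specification; the tool name and arguments come from the table above):

```python
import json

def tools_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 tools/call request for an MCP server on stdio."""
    msg = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }
    return json.dumps(msg)

# e.g. ask Berth to run a query on an existing connection:
request = tools_call(1, "db_query", {"connection_id": "conn-1", "sql": "SELECT 1"})
```

The MCP client (Claude Desktop, Claude Code, etc.) handles this framing for you; the sketch only shows what travels over the pipe.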
With Claude Code:

claude mcp add berth -- berth
Add to your claude_desktop_config.json:
{
"mcpServers": {
"berth": {
"command": "berth",
"args": []
}
}
}
If installed in a virtual environment, use the full path:
{
"mcpServers": {
"berth": {
"command": "/path/to/venv/bin/berth",
"args": []
}
}
}
| Variable | Default | Description |
|---|---|---|
BERTH_BACKUP_DIR | Current working directory | Sandbox directory for backup and restore paths. All paths are validated to stay within this directory. |
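Sandbox validation of this kind is typically done by resolving the candidate path and checking containment. A sketch under that assumption (the helper `validate_backup_path` is hypothetical, not Berth's actual code; requires Python 3.9+ for `is_relative_to`):

```python
from pathlib import Path

def validate_backup_path(candidate: str, backup_dir: str) -> Path:
    """Reject null bytes and any path that escapes the sandbox directory."""
    if "\x00" in candidate:
        raise ValueError("null byte in path")
    base = Path(backup_dir).resolve()
    # Resolve symlinks and ".." components before the containment check.
    resolved = (base / candidate).resolve()
    if not resolved.is_relative_to(base):
        raise ValueError(f"path escapes sandbox: {resolved}")
    return resolved
```

`validate_backup_path("db.dump", "/tmp")` is accepted, while `validate_backup_path("../etc/passwd", "/tmp")` raises, because the resolved path falls outside the sandbox.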
Security notes:

- Table names are validated against sqlite_master before use in PRAGMA statements; parameterized queries are used throughout
- Backup and restore paths are confined to BERTH_BACKUP_DIR; null bytes are rejected

Development setup:

git clone https://github.com/seayniclabs/berth.git
cd berth
python -m venv .venv && source .venv/bin/activate
pip install -e ".[test]"
python -m pytest tests/ -q
Integration tests for PostgreSQL and MySQL require Docker:
docker compose -f tests/docker-compose.test.yml up -d
python -m pytest tests/ -q
docker compose -f tests/docker-compose.test.yml down