Server data from the Official MCP Registry
Extract database metadata from PostgreSQL, Snowflake, SQL Server, BigQuery, and Oracle.
Valid MCP server (2 strong and 1 medium validity signal). 17 known CVEs in dependencies (0 critical, 10 high severity). Package registry verified. Imported from the Official MCP Registry.
4 files analyzed · 18 issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
This plugin requests these system permissions. Most are normal for its category.
Unverified package source
We couldn't verify that the installable package matches the reviewed source code. Proceed with caution.
Add this to your MCP configuration file:
{
  "mcpServers": {
    "io-github-optisol-business-db-metadata-extractor-mcp": {
      "args": [
        "db-metadata-extractor-mcp"
      ],
      "command": "uvx"
    }
  }
}
From the project's GitHub README.
mcp-name: io.github.Optisol-Business/db-metadata-extractor-mcp
A Model Context Protocol (MCP) server that extracts and queries database schema metadata from PostgreSQL, Snowflake, SQL Server, BigQuery, and Oracle databases.
# Install from PyPI
pip install db-metadata-extractor-mcp

# Or install from source
git clone https://github.com/Optisol-Business/db-metadata-extractor-mcp.git
cd db-metadata-extractor-mcp
pip install -e .

# Run the server
db-metadata-extractor-mcp
The server starts in stdio mode by default and listens for MCP client connections.
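You can smoke-test stdio mode from a shell. This is a sketch assuming the server follows standard MCP stdio framing (newline-delimited JSON-RPC); the clientInfo values are placeholders:
# Send an MCP initialize request over stdin
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}' | db-metadata-extractor-mcp
A JSON initialize result on stdout confirms the server is speaking MCP.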
Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS), ~/.config/Claude/claude_desktop_config.json (Linux), or %APPDATA%\Claude\claude_desktop_config.json (Windows):
{
  "mcpServers": {
    "db-metadata-extractor": {
      "command": "db-metadata-extractor-mcp",
      "args": [],
      "env": {}
    }
  }
}
Restart Claude Desktop.
Tell Claude:
Extract metadata from my PostgreSQL database and save it to /tmp/output
Claude will use the server's tools to extract and query your database schema.
extract_metadata
Extracts complete schema metadata from a database.
Parameters:
db_type (required): postgresql, snowflake, sqlserver, bigquery, oracle
output_path (required): Local directory for JSON output
database_name: Database/schema name
host: Database host (not needed for BigQuery/Snowflake)
port: Database port
username: Database user
password: Database password
schema_name: Specific schema (optional)
tables: Array of table names to extract (optional)
account: Snowflake account ID
warehouse: Snowflake warehouse
role_name: Snowflake role
project_id: BigQuery project ID
service_account_key: BigQuery service account JSON (base64 encoded)
Returns:
query_metadata
Query previously extracted metadata.
Parameters:
filepath (required): Path to metadata JSON file
table_name: Filter by table name (substring match)
field_name: Filter by column name (substring match)
page: Page number (default: 1)
page_size: Results per page (default: 20)
Returns:
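For example, to page through columns matching "email" in a previously extracted file, a call might look like this (the filepath is illustrative; use the file produced by extract_metadata):
{
  "filepath": "/tmp/db_metadata/metadata.json",
  "field_name": "email",
  "page": 1,
  "page_size": 20
}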
# Via Claude
"Extract all tables from my dev PostgreSQL database at localhost:5432"
Parameters Claude will use:
{
  "db_type": "postgresql",
  "host": "localhost",
  "port": 5432,
  "database_name": "dev_db",
  "username": "postgres",
  "password": "your_password",
  "output_path": "/tmp/db_metadata"
}
"Extract schema from Snowflake account XYZ123"
Parameters:
{
  "db_type": "snowflake",
  "account": "XYZ123",
  "username": "your_user",
  "password": "your_password",
  "warehouse": "COMPUTE_WH",
  "role_name": "ANALYST",
  "database_name": "PRODUCTION",
  "output_path": "C:/metadata"
}
"Extract metadata from BigQuery project my-project-123"
Parameters:
{
  "db_type": "bigquery",
  "project_id": "my-project-123",
  "service_account_key": "base64_encoded_json_key",
  "output_path": "/tmp/bq_metadata"
}
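The service_account_key value is the contents of your downloaded service account JSON key, base64-encoded. One way to produce it (the filename service-account.json is a placeholder):
# GNU coreutils (Linux)
base64 -w0 service-account.json
# BSD (macOS)
base64 -i service-account.json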
Start with HTTP transport:
db-metadata-extractor-mcp --transport streamable-http --port 3000
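To check that the HTTP transport is listening, you can probe it with curl. The /mcp path is the Python MCP SDK's default for streamable HTTP and is an assumption here; this server may mount the endpoint elsewhere:
curl -i http://localhost:3000/mcp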
# Set database credentials via env
export DB_HOST=localhost
export DB_USER=postgres
export DB_PASSWORD=secret
db-metadata-extractor-mcp
The extracted metadata is saved as a JSON file with structure:
{
  "source": {
    "db_type": "postgresql",
    "extracted_at": "2026-04-09T14:30:00",
    "host": "localhost"
  },
  "schemas": [
    {
      "schema_name": "public",
      "tables": [
        {
          "table_name": "users",
          "columns": [
            {
              "column_name": "id",
              "data_type": "int",
              "is_nullable": false,
              "is_primary_key": true
            },
            {
              "column_name": "email",
              "data_type": "varchar",
              "is_nullable": false
            }
          ],
          "indexes": [
            {
              "index_name": "users_email_idx",
              "columns": ["email"]
            }
          ]
        }
      ]
    }
  ]
}
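Since the output is plain JSON, you can also inspect it directly, for example with jq (the filename metadata.json is illustrative; check your output_path for the actual name):
# List every table, qualified by schema name
jq -r '.schemas[] | .schema_name as $s | .tables[] | "\($s).\(.table_name)"' /tmp/db_metadata/metadata.json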
Per-database driver dependencies:
PostgreSQL: psycopg2-binary
Snowflake: snowflake-connector-python
SQL Server: pyodbc, pymssql
BigQuery: google-cloud-bigquery
Oracle: oracledb
Problem: "Unable to connect to database"
Solution: Verify credentials and network access:
# Test PostgreSQL connection
psql -h localhost -U postgres -c "SELECT 1"
# Test Snowflake
snowsql -a XYZ123 -u your_user
Problem: "Access denied" or "insufficient permissions"
Solution: Ensure database user has:
SELECT on tables
USAGE on schemas
CONNECT on databases
Problem: Extraction times out on large databases
Solution: Extract specific schema/tables:
{
  "schema_name": "public",
  "tables": ["users", "orders"] // Specify subset
}
MIT License - See LICENSE file
Contributions welcome!