Server data from the Official MCP Registry
MCP server for FastBCP — high-performance parallel database export to files and cloud
Set these up before or after installing:
Environment variable: FASTBCP_PATH
Environment variable: FASTBCP_TIMEOUT
Environment variable: FASTBCP_LOG_DIR
Environment variable: LOG_LEVEL
Add this to your MCP configuration file:
{
"mcpServers": {
"io-github-arpe-io-fastbcp-mcp": {
"env": {
"LOG_LEVEL": "your-log-level-here",
"FASTBCP_PATH": "your-fastbcp-path-here",
"FASTBCP_LOG_DIR": "your-fastbcp-log-dir-here",
"FASTBCP_TIMEOUT": "your-fastbcp-timeout-here"
},
"args": [
"fastbcp-mcp"
],
"command": "uvx"
}
}
}

From the project's GitHub README.
A Model Context Protocol (MCP) server that exposes FastBCP functionality for exporting data from databases to files (CSV, TSV, JSON, BSON, Parquet, XLSX, Binary) with optional cloud storage targets.
FastBCP is a high-performance CLI tool for exporting data from databases to files. This MCP server wraps FastBCP functionality and provides:
preview_export_command: Build and preview a FastBCP export command WITHOUT executing it. Shows the exact command with passwords masked. Always use this first.
execute_export: Execute a previously previewed command. Requires confirmation: true as a safety mechanism.
validate_connection: Validate source database connection parameters (parameter check only; does not test actual connectivity).
list_supported_formats: List all supported source databases, output formats, and storage targets.
suggest_parallelism_method: Recommend the optimal parallelism method based on source database type and table characteristics.
get_version: Report the detected FastBCP binary version, supported types, and feature flags.
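As a concrete illustration, the arguments for a preview_export_command call might be shaped like the sketch below, combining parameter names documented in the tables later in this README. The "database", "schema", and "table" keys are assumptions; the README's tables do not list them explicitly.

```python
# Illustrative preview_export_command arguments. Connection, output, and
# parallelism parameter names come from the option tables in this README;
# "database", "schema", and "table" are assumed names.
preview_args = {
    "server": "localhost:5432",
    "user": "exporter",
    "password": "secret",        # shown masked in the previewed command
    "database": "sales_db",      # assumed parameter name
    "schema": "public",          # assumed parameter name
    "table": "orders",           # assumed parameter name
    "format": "csv",
    "file_output": "/tmp/orders.csv",
    "method": "Ctid",            # parallelism method suited to PostgreSQL
    "degree": 4,
}
```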
Clone or download this repository:
cd /path/to/fastbcp-mcp
Install Python dependencies:
pip install -r requirements.txt
Configure environment:
cp .env.example .env
# Edit .env with your FastBCP path
Add to Claude Code configuration (~/.claude.json):
{
"mcpServers": {
"fastbcp": {
"type": "stdio",
"command": "python",
"args": ["/absolute/path/to/fastbcp-mcp/src/server.py"],
"env": {
"FASTBCP_PATH": "/absolute/path/to/FastBCP"
}
}
}
}
Restart Claude Code to load the MCP server.
Verify installation:
# In Claude Code, run:
/mcp
# You should see "fastbcp: connected"
Edit .env to configure:
# Path to FastBCP binary (required)
FASTBCP_PATH=./fastbcp/FastBCP
# Execution timeout in seconds (default: 1800 = 30 minutes)
FASTBCP_TIMEOUT=1800
# Log directory (default: ./logs)
FASTBCP_LOG_DIR=./logs
# Log level (default: INFO)
LOG_LEVEL=INFO
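The defaults above can be summarized in one place. Here is a minimal sketch of loading them (illustrative only, not the server's actual code; FASTBCP_PATH is the one setting with no default):

```python
def load_config(env):
    # Apply the defaults documented above; FASTBCP_PATH is required
    # and deliberately has no fallback.
    return {
        "path": env.get("FASTBCP_PATH"),
        "timeout_seconds": int(env.get("FASTBCP_TIMEOUT", "1800")),
        "log_dir": env.get("FASTBCP_LOG_DIR", "./logs"),
        "log_level": env.get("LOG_LEVEL", "INFO"),
    }

config = load_config({"FASTBCP_PATH": "/opt/FastBCP"})
```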
The server supports multiple ways to authenticate and connect:
| Parameter | Description |
|---|---|
| server | Host:port or host\instance (optional with connect_string or dsn) |
| user / password | Standard credentials |
| trusted_auth | Windows trusted authentication |
| connect_string | Full connection string (excludes server/user/password/dsn) |
| dsn | ODBC DSN name (excludes server/provider) |
| provider | OleDB provider name |
| application_intent | SQL Server application intent (ReadOnly/ReadWrite) |
| Option | CLI Flag | Description |
|---|---|---|
| format | --format | Output format: csv, tsv, json, bson, parquet, xlsx, binary |
| file_output | --fileoutput | Output file path |
| directory | --directory | Output directory path |
| storage_target | --storagetarget | Storage: local, s3, s3compatible, azure_blob, azure_datalake, fabric_onelake |
| delimiter | --delimiter | Field delimiter (CSV/TSV) |
| quotes | --quotes | Quote character |
| encoding | --encoding | Output encoding |
| no_header | --noheader | Omit header row (CSV/TSV) |
| decimal_separator | --decimalseparator | Decimal separator (. or ,) |
| date_format | --dateformat | Date format string |
| bool_format | --boolformat | Boolean format: TrueFalse, OneZero, YesNo |
| parquet_compression | --parquetcompression | Parquet compression: None, Snappy, Gzip, Lz4, Lzo, Zstd |
| timestamped | --timestamped | Add timestamp to output filename |
| merge | --merge | Merge parallel output files |
| Option | CLI Flag | Description |
|---|---|---|
| method | --method | Parallelism method |
| distribute_key_column | --distributeKeyColumn | Column for data distribution |
| degree | --degree | Parallelism degree (default: 1) |
| load_mode | --loadmode | Append or Truncate |
| batch_size | --batchsize | Batch size for export operations |
| map_method | --mapmethod | Column mapping: Position or Name |
| run_id | --runid | Run ID for logging |
| data_driven_query | --datadrivenquery | Custom SQL for DataDriven method |
| settings_file | --settingsfile | Custom settings JSON file |
| log_level | --loglevel | Override log level (Information/Debug) |
| no_banner | --nobanner | Suppress banner output |
| license_path | --license | License file path or URL |
| cloud_profile | --cloudprofile | Cloud storage profile name |
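The tables above map snake_case option names to FastBCP CLI flags. A hedged sketch of that mapping follows (the project's real command builder lives in src/fastbcp.py and is not reproduced here):

```python
# Subset of the option-to-flag mapping from the tables above.
FLAG_MAP = {
    "format": "--format",
    "file_output": "--fileoutput",
    "parquet_compression": "--parquetcompression",
    "method": "--method",
    "degree": "--degree",
    "no_header": "--noheader",
}

def build_flags(options):
    # Boolean options become bare flags; everything else takes a value.
    args = []
    for name, value in options.items():
        flag = FLAG_MAP[name]
        if value is True:
            args.append(flag)
        else:
            args.extend([flag, str(value)])
    return args
```

For example, build_flags({"format": "csv", "degree": 4, "no_header": True}) produces ["--format", "csv", "--degree", "4", "--noheader"].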
User: "Export the 'orders' table from PostgreSQL (localhost:5432, database: sales_db,
schema: public) to CSV file at /tmp/orders.csv. Use parallel export."
Claude Code will:
1. Call suggest_parallelism_method to recommend Ctid for PostgreSQL
2. Call preview_export_command with your parameters
3. Show the command with masked passwords
4. Explain what will happen
5. Ask for confirmation
6. Execute with execute_export when you approve
User: "Export the 'transactions' table from SQL Server to Parquet format
with Snappy compression, saved to /data/exports/."
Claude Code will use parquet format with parquet_compression set to Snappy.
User: "Export the 'users' table from PostgreSQL to CSV on S3 bucket
s3://my-bucket/exports/ using my AWS profile."
Claude Code will use storage_target=s3 with cloud_profile.
User: "What version of FastBCP is installed?"
Claude Code will call get_version and display the detected version,
supported source types, output formats, and available features.
This server implements a mandatory two-step process:
1. Call preview_export_command first.
2. Call execute_export with confirmation: true.
You cannot execute without previewing first and confirming.
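The confirmation gate can be pictured as a small guard; this is an illustrative sketch, not the server's actual implementation:

```python
def execute_export(command, confirmation=False):
    # Mirror the server's safety rule: refuse to run any command that
    # has not been explicitly confirmed after a preview.
    if not confirmation:
        raise ValueError(
            "confirmation: true is required; call preview_export_command first"
        )
    return {"status": "accepted", "command": command}
```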
--sourcepassword, --sourceconnectstring, -x, -g

Run the test suite:
# Run all tests
python -m pytest tests/ -v
# Run with coverage
python -m pytest tests/ --cov=src --cov-report=html
fastbcp-mcp/
src/
__init__.py
server.py # MCP server (tool definitions, handlers)
fastbcp.py # Command builder, executor, suggestions
validators.py # Pydantic models, enums, validation
version.py # Version detection and capabilities registry
tests/
__init__.py
test_command_builder.py
test_validators.py
test_version.py
.env.example
requirements.txt
CHANGELOG.md
README.md
This MCP server wrapper is provided as-is. FastBCP itself is a separate product from Arpe.io.