Server data from the Official MCP Registry
Vehicle data for AI: VIN decoder, automotive specs, stolen checks, valuation and way more.
Remote endpoints: streamable-http: https://mcp.vincario.com/mcp
Valid MCP server (1 strong, 4 medium validity signals). 3 known CVEs in dependencies (1 critical, 1 high severity). Imported from the Official MCP Registry.
4 tools verified · Open access · 3 issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
Remote Plugin
No local installation needed. Your AI client connects to the remote endpoint directly.
Add this to your MCP configuration to connect:
{
"mcpServers": {
"com-vincario-vehicle-data": {
"url": "https://mcp.vincario.com/mcp"
}
}
}

From the project's GitHub README:
An MCP (Model Context Protocol) server that exposes the Vincario API to AI agents and LLM clients. Enables AI assistants to decode VINs, check stolen vehicle databases, and retrieve market valuations through natural language.
| Tool | Description |
|---|---|
| vin_decode | Decode a VIN and return detailed vehicle information |
| vin_decode_info | List available fields for a given VIN (free endpoint) |
| stolen_check | Check if a VIN appears in stolen vehicle databases |
| vehicle_market_value | Get market valuation for a vehicle (supports odometer input) |
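Under the hood, each of these tools is invoked with a JSON-RPC 2.0 `tools/call` message, as defined by the MCP specification. A minimal sketch of building such a request body (the `vin` argument name and the sample VIN are illustrative assumptions, not taken from the server's published schema):

```python
import json

def build_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> dict:
    """Build a JSON-RPC 2.0 'tools/call' request body as used by MCP."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Example: decode a sample VIN (argument name assumed for illustration).
payload = build_tool_call("vin_decode", {"vin": "WVWZZZ1JZ3W386752"})
print(json.dumps(payload, indent=2))
```

In practice your MCP client constructs and sends these messages for you; this is only what travels over the wire.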
Authentication: X-API-Key HTTP header

docker build -t vincario-mcp .
docker run -p 8080:8080 vincario-mcp
The server starts on http://localhost:8080.
pip install uv
uv sync
uv run main.py
Pass your Vincario API key as an HTTP header with each request:
X-API-Key: your_api_key_here
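To illustrate what "with each request" means, here is a sketch of preparing such a request by hand from Python (the endpoint, key, and body are placeholders; normally your MCP client attaches the header for you based on the config below):

```python
import json
import urllib.request

def make_request(url: str, api_key: str, body: dict) -> urllib.request.Request:
    """Prepare a POST to the MCP endpoint with the Vincario key attached."""
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "X-API-Key": api_key,  # must accompany every request
        },
        method="POST",
    )

req = make_request(
    "http://localhost:8080",
    "your_api_key_here",
    {"jsonrpc": "2.0", "id": 1, "method": "tools/list"},
)
```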
Add to your MCP config (.mcp.json or claude_desktop_config.json):
{
"mcpServers": {
"vincario": {
"type": "http",
"url": "http://localhost:8080",
"headers": {
"X-API-Key": "your_api_key_here"
}
}
}
}
If connecting to the hosted server at https://mcp.vincario.com/mcp, replace the URL accordingly.
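For example, the hosted variant of the config above might look like this (assuming the hosted endpoint accepts the same X-API-Key header):

```json
{
  "mcpServers": {
    "vincario": {
      "type": "http",
      "url": "https://mcp.vincario.com/mcp",
      "headers": {
        "X-API-Key": "your_api_key_here"
      }
    }
  }
}
```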
Once connected, you can ask your AI assistant to decode a VIN, check whether a vehicle appears in stolen-vehicle databases, or look up its market value.
The server uses streamable HTTP transport (stateless_http=True), which means no persistent session is required. Each request is independent, making it straightforward to deploy behind a reverse proxy or load balancer.
For HTTPS deployment, place a reverse proxy (nginx, Caddy, Cloudflare) in front of the server — the application itself runs on plain HTTP port 8080.
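A reverse-proxy setup of that kind could be sketched as the following nginx fragment (server name and certificate paths are placeholders; buffering is disabled because streamable HTTP responses may be delivered incrementally):

```nginx
server {
    listen 443 ssl;
    server_name mcp.example.com;                      # placeholder hostname
    ssl_certificate     /etc/ssl/certs/mcp.pem;      # placeholder cert paths
    ssl_certificate_key /etc/ssl/private/mcp.key;

    location /mcp {
        proxy_pass http://127.0.0.1:8080;            # the plain-HTTP MCP server
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
        proxy_buffering off;                         # pass streamed responses through
    }
}
```

Because the server is stateless, no session affinity is needed when running multiple instances behind such a proxy.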
See Vincario API Terms of Service for usage terms.