Server data from the Official MCP Registry
Web search, crawl, scrape & extract with agent-browser, SearXNG, Tavily, DuckDuckGo, Bing & more
Valid MCP server (2 strong, 3 medium validity signals). 1 known CVE in dependencies (0 critical, 1 high severity). Package registry verified. Imported from the Official MCP Registry. Trust signals: trusted author (3/3 approved).
7 files analyzed · 2 issues found
Set these environment variables before or after installing:
- SEARCH_PROVIDER
- SEARCH_API_URL
- SEARCH_API_KEY
Add this to your MCP configuration file:
{
"mcpServers": {
"io-github-yokingma-one-search-mcp": {
"env": {
"SEARCH_API_KEY": "your-search-api-key-here",
"SEARCH_API_URL": "your-search-api-url-here",
"SEARCH_PROVIDER": "your-search-provider-here"
},
"args": [
"-y",
"one-search-mcp"
],
"command": "npx"
}
}
}
From the project's GitHub README.
A Model Context Protocol (MCP) server implementation that integrates with multiple search providers for web search, local browser search, and scraping capabilities with agent-browser.
The server uses agent-browser for browser automation and exposes four tools: one_search, one_scrape, one_map, one_extract.

Breaking Changes in v1.1.0:
Firecrawl has been replaced by agent-browser, which provides similar functionality without requiring external API services. FIRECRAWL_API_URL and FIRECRAWL_API_KEY are no longer used.

What Changed:
- one_scrape and one_map now use agent-browser instead of Firecrawl
- one_extract now preprocesses multi-URL page content for downstream analysis instead of performing built-in LLM extraction

Migration Steps:
- Remove FIRECRAWL_API_URL and FIRECRAWL_API_KEY from your environment variables
- Update to the latest version: npm install -g one-search-mcp@latest

Browser Requirement: This server uses agent-browser for web scraping and local search, which requires a Chromium-based browser.
Good News: the server will automatically detect and use Chromium-based browsers already installed on your system (Chrome, Edge, or Chromium).
If you don't have any of these browsers installed, you can:
# Option 1: Install Google Chrome (Recommended)
# Download from: https://www.google.com/chrome/
# Option 2: Install Microsoft Edge
# Download from: https://www.microsoft.com/edge
# Option 3: Install Chromium via agent-browser
npx agent-browser install
# Option 4: Install Chromium directly
# Download from: https://www.chromium.org/getting-involved/download-chromium/
# Add to Claude Code with default settings (local search)
claude mcp add one-search-mcp -- npx -y one-search-mcp
# Add with custom search provider (e.g., SearXNG)
claude mcp add one-search-mcp -e SEARCH_PROVIDER=searxng -e SEARCH_API_URL=http://127.0.0.1:8080 -- npx -y one-search-mcp
# Add with Tavily API
claude mcp add one-search-mcp -e SEARCH_PROVIDER=tavily -e SEARCH_API_KEY=your_api_key -- npx -y one-search-mcp
# Install globally (Optional)
npm install -g one-search-mcp
# Or run directly with npx
npx -y one-search-mcp
The Docker image includes all dependencies (Chromium browser) pre-installed; no additional setup is required.
Pull the image:
# From GitHub Container Registry
docker pull ghcr.io/yokingma/one-search-mcp:latest
# Or from Docker Hub
docker pull zacma/one-search-mcp:latest
Configure with Claude Desktop:
{
"mcpServers": {
"one-search-mcp": {
"command": "docker",
"args": ["run", "-i", "--rm", "ghcr.io/yokingma/one-search-mcp:latest"],
"env": {
"SEARCH_PROVIDER": "local"
}
}
}
}
With custom search provider:
{
"mcpServers": {
"one-search-mcp": {
"command": "docker",
"args": [
"run", "-i", "--rm",
"-e", "SEARCH_PROVIDER=tavily",
"-e", "SEARCH_API_KEY=your_api_key",
"ghcr.io/yokingma/one-search-mcp:latest"
]
}
}
}
Search Engine:
SEARCH_PROVIDER can be one of searxng, duckduckgo, bing, tavily, google, zhipu, exa, bocha, local; the default is local. An API key is required for tavily, bing, google, zhipu, exa, and bocha.
// supported search providers
export type SearchProvider = 'searxng' | 'duckduckgo' | 'bing' | 'tavily' | 'google' | 'zhipu' | 'exa' | 'bocha' | 'local';
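As a sketch of how a config loader might use this union, the helper below validates a raw string (e.g. the SEARCH_PROVIDER environment variable) and falls back to the documented default of local. Note that isSearchProvider and resolveProvider are hypothetical helpers for illustration, not part of one-search-mcp's public API.

```typescript
// Mirror of the SearchProvider union above, as a runtime-checkable list.
const SEARCH_PROVIDERS = [
  'searxng', 'duckduckgo', 'bing', 'tavily', 'google',
  'zhipu', 'exa', 'bocha', 'local',
] as const;

type SearchProvider = (typeof SEARCH_PROVIDERS)[number];

// Type guard: narrows an arbitrary string to SearchProvider.
function isSearchProvider(value: string): value is SearchProvider {
  return (SEARCH_PROVIDERS as readonly string[]).includes(value);
}

// Resolve a raw env value, falling back to the documented default 'local'
// when the variable is unset or holds an unsupported value.
function resolveProvider(value: string | undefined): SearchProvider {
  return value !== undefined && isSearchProvider(value) ? value : 'local';
}
```

A launcher could then call resolveProvider(process.env.SEARCH_PROVIDER) once at startup instead of trusting the raw string throughout.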
| Provider | API Key Required | API URL Required | Notes |
|---|---|---|---|
| local | No | No | Free, uses browser automation |
| duckduckgo | No | No | Free, no API key needed |
| searxng | Optional | Yes | Self-hosted meta search engine |
| bing | Yes | No | Bing Search API |
| tavily | Yes | No | Tavily API |
| google | Yes | Yes (Search Engine ID) | Google Custom Search |
| zhipu | Yes | No | Zhipu AI |
| exa | Yes | No | Exa AI |
| bocha | Yes | No | Bocha AI |
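The table above can be turned into a pre-flight check that reports which variables are missing before the server is launched. This is a minimal sketch under the assumption that each provider's requirements map onto SEARCH_API_KEY and SEARCH_API_URL as the table describes; requiredEnv and missingEnv are hypothetical names, not part of one-search-mcp.

```typescript
// Required environment variables per provider, per the table above.
const requiredEnv: Record<string, string[]> = {
  local: [],
  duckduckgo: [],
  searxng: ['SEARCH_API_URL'],                   // API key is optional
  bing: ['SEARCH_API_KEY'],
  tavily: ['SEARCH_API_KEY'],
  google: ['SEARCH_API_KEY', 'SEARCH_API_URL'],  // URL slot carries the Search Engine ID
  zhipu: ['SEARCH_API_KEY'],
  exa: ['SEARCH_API_KEY'],
  bocha: ['SEARCH_API_KEY'],
};

// Return the names of required variables that are unset or empty.
function missingEnv(
  provider: string,
  env: Record<string, string | undefined>,
): string[] {
  return (requiredEnv[provider] ?? []).filter((name) => !env[name]);
}
```

Calling missingEnv('tavily', process.env) before launch lets a wrapper fail fast with a clear message instead of a runtime API error.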
Add to your Claude Desktop configuration file:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
{
"mcpServers": {
"one-search-mcp": {
"command": "npx",
"args": ["-y", "one-search-mcp"],
"env": {
"SEARCH_PROVIDER": "local"
}
}
}
}
Add to your mcp.json file:
{
"mcpServers": {
"one-search-mcp": {
"command": "npx",
"args": ["-y", "one-search-mcp"],
"env": {
"SEARCH_PROVIDER": "local"
}
}
}
}
Add to your ./codeium/windsurf/model_config.json file:
{
"mcpServers": {
"one-search-mcp": {
"command": "npx",
"args": ["-y", "one-search-mcp"],
"env": {
"SEARCH_PROVIDER": "local"
}
}
}
}
If you want to use SearXNG as your search provider, you can deploy it locally using Docker:
Prerequisites:
Quick Start:
# Clone SearXNG Docker repository
git clone https://github.com/searxng/searxng-docker.git
cd searxng-docker
# Start SearXNG
docker compose up -d
After deployment, SearXNG will be available at http://127.0.0.1:8080 by default.
Configure OneSearch to use SearXNG:
# Set environment variables
export SEARCH_PROVIDER=searxng
export SEARCH_API_URL=http://127.0.0.1:8080
For more details, see the official SearXNG Docker documentation.
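To sanity-check a SearXNG deployment independently of one-search-mcp, you can query its /search endpoint directly. This sketch assumes the JSON output format is enabled in the instance's settings.yml (formats must include json); searxngSearchUrl is a hypothetical helper for illustration.

```typescript
// Build a SearXNG JSON search URL for a self-hosted instance.
function searxngSearchUrl(baseUrl: string, query: string): string {
  const url = new URL('/search', baseUrl);
  url.searchParams.set('q', query);
  url.searchParams.set('format', 'json');
  return url.toString();
}

// Usage (requires the instance to allow the json format):
//   fetch(searxngSearchUrl('http://127.0.0.1:8080', 'model context protocol'))
```

If the instance returns 403 for format=json, the json format is not enabled in its settings.yml.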
If you see an error like "Browser not found", the server couldn't detect any installed Chromium-based browser. Please install one of the following:
Or install via agent-browser:
npx agent-browser install
MIT License - see LICENSE file for details.