Server data from the Official MCP Registry
CacheTank MCP server gives every AI tool your personal context so you never start from zero.
Valid MCP server (2 strong, 4 medium validity signals). 2 known CVEs in dependencies (0 critical, 2 high severity). Package registry verified. Imported from the Official MCP Registry.
3 files analyzed · 3 issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
Set these up before or after installing:
Environment variable: CACHETANK_READ_TOKEN
Environment variable: CACHETANK_WRITE_TOKEN
Add this to your MCP configuration file:
{
"mcpServers": {
"io-github-jacobsilver7-art-cachetank-mcp": {
"env": {
"CACHETANK_READ_TOKEN": "your-cachetank-read-token-here",
"CACHETANK_WRITE_TOKEN": "your-cachetank-write-token-here"
},
"args": [
"-y",
"cachetank-mcp"
],
"command": "npx"
}
}
}
From the project's GitHub README.
Stop re-explaining yourself to AI.
CacheTank is your AI memory layer. Save your identity, projects, decisions, and knowledge once — every AI tool gets it automatically.
Every time you open ChatGPT, Claude, Cursor, Copilot, or Gemini, you start from zero. You re-explain who you are, what you are working on, and what you have already decided. CacheTank fixes this: save context once, and every AI-powered tool that supports MCP loads it before your first message.
CacheTank solves the number one frustration of working with AI: repeating yourself. Whether you use one AI tool or ten, CacheTank gives each one your full context without you typing a word.
No copy-pasting. No system prompts. No re-explaining. Your context follows you everywhere.
Add to your Claude Desktop config (claude_desktop_config.json):
{
"mcpServers": {
"cachetank": {
"command": "npx",
"args": ["-y", "cachetank-mcp"],
"env": {
"CACHETANK_READ_TOKEN": "your-read-token",
"CACHETANK_WRITE_TOKEN": "your-write-token"
}
}
}
}
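If claude_desktop_config.json already lists other servers, the new entry should be merged in rather than pasted over the whole file. A minimal sketch of that merge (the helper function and the temp-file demo are illustrative, not part of CacheTank):

```typescript
import * as fs from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

// Entry for CacheTank, matching the config snippet above.
const cachetankEntry = {
  command: "npx",
  args: ["-y", "cachetank-mcp"],
  env: {
    CACHETANK_READ_TOKEN: "your-read-token",
    CACHETANK_WRITE_TOKEN: "your-write-token",
  },
};

// Merge one server entry into an existing config without
// clobbering other servers already registered there.
function addServer(configPath: string, name: string, entry: object): void {
  const config = fs.existsSync(configPath)
    ? JSON.parse(fs.readFileSync(configPath, "utf8"))
    : {};
  config.mcpServers = { ...(config.mcpServers ?? {}), [name]: entry };
  fs.writeFileSync(configPath, JSON.stringify(config, null, 2));
}

// Demo against a temp file standing in for claude_desktop_config.json.
const demoPath = path.join(os.tmpdir(), "claude_desktop_config.json");
fs.writeFileSync(
  demoPath,
  JSON.stringify({ mcpServers: { other: { command: "other-server" } } })
);
addServer(demoPath, "cachetank", cachetankEntry);
const merged = JSON.parse(fs.readFileSync(demoPath, "utf8"));
console.log(Object.keys(merged.mcpServers).sort().join(","));
```

After the merge, both the pre-existing server and cachetank remain in mcpServers, so a restart of Claude Desktop picks up both.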
claude mcp add cachetank -- npx -y cachetank-mcp
Then set your tokens:
export CACHETANK_READ_TOKEN=your-read-token
export CACHETANK_WRITE_TOKEN=your-write-token
Add to Cursor settings (Settings > MCP Servers):
{
"cachetank": {
"command": "npx",
"args": ["-y", "cachetank-mcp"],
"env": {
"CACHETANK_READ_TOKEN": "your-read-token",
"CACHETANK_WRITE_TOKEN": "your-write-token"
}
}
}
Use the same npx command:
npx -y cachetank-mcp
Required environment variables:
CACHETANK_READ_TOKEN — Your CacheTank read token (get it from the extension or cachetank.com)
CACHETANK_WRITE_TOKEN — Optional. Enables saving back to your tank.

CacheTank also runs as a remote MCP server for browser-based AI tools:
https://cachetank-mcp-77926794635.us-central1.run.app/mcp
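For the remote endpoint, an MCP client opens the session by POSTing a JSON-RPC initialize message. A hedged sketch of that request body, assuming the streamable-HTTP transport from the MCP spec (the protocolVersion string shown is one published spec revision and may differ for this server):

```typescript
// JSON-RPC "initialize" message an MCP client would POST to the
// remote endpoint. Field values for clientInfo are placeholders.
const initializeRequest = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2024-11-05", // assumed spec revision
    capabilities: {},
    clientInfo: { name: "example-client", version: "0.1.0" },
  },
};

// The client would then send it roughly like this (not executed here):
// await fetch("https://cachetank-mcp-77926794635.us-central1.run.app/mcp", {
//   method: "POST",
//   headers: {
//     "Content-Type": "application/json",
//     Accept: "application/json, text/event-stream",
//   },
//   body: JSON.stringify(initializeRequest),
// });

console.log(initializeRequest.method);
```

In practice an MCP SDK handles this handshake for you; the sketch only shows what travels over the wire.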
Fetch your personal context for a specific project. Returns your identity, project knowledge, and recent outputs formatted as markdown.
fill_tank({ project: "My Startup" })
Save a piece of knowledge, decision, or output to your tank. Saved items become part of your context automatically in future conversations.
cache_it({
title: "Q1 pricing decision",
  markdown: "Decided on $29/mo for pro tier based on competitor analysis...",
project: "My Startup",
layer: "PROJECTS"
})
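Before sending a cache_it call, a client can sanity-check the payload. A hypothetical client-side guard (not part of CacheTank itself; field names follow the example above, and since the full set of valid layers is not documented here, only PROJECTS — the one shown — is assumed):

```typescript
// Shape of a cache_it tool call, inferred from the example above.
interface CacheItArgs {
  title: string;
  markdown: string;
  project?: string;
  layer?: string;
}

// Returns a list of problems; an empty list means the payload looks sendable.
function validateCacheIt(args: CacheItArgs): string[] {
  const problems: string[] = [];
  if (!args.title.trim()) problems.push("title must be non-empty");
  if (!args.markdown.trim()) problems.push("markdown must be non-empty");
  if (args.layer !== undefined && args.layer !== args.layer.toUpperCase()) {
    problems.push("layer names appear to be uppercase (e.g. PROJECTS)");
  }
  return problems;
}

const problems = validateCacheIt({
  title: "Q1 pricing decision",
  markdown: "Decided on $29/mo for pro tier...",
  project: "My Startup",
  layer: "PROJECTS",
});
console.log(problems.length === 0 ? "valid" : problems.join("; "));
```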
| Resource URI | Description |
|---|---|
| cachetank://context | Your full personal context. Auto-loaded at conversation start. |
Every knowledge worker using AI tools faces the same problem: context loss. You explain your role, your project, your constraints, your preferences — and then the conversation ends. Next conversation, you start over.
This is not just annoying. It is expensive. Studies show knowledge workers spend 23 minutes re-establishing context every time they switch tools or start a new conversation.
CacheTank is the fix. One knowledge base. Every AI tool. Zero re-explaining.
CacheTank is a personal knowledge layer for AI. It stores your identity, projects, decisions, and knowledge in one place, then serves it to any AI tool via the Model Context Protocol (MCP). Think of it as persistent memory that works across ChatGPT, Claude, Gemini, Cursor, Copilot, and every other MCP-compatible AI tool.
Install CacheTank. Save your context once — your role, projects, preferences, and decisions. Every AI conversation automatically loads your context before the first message. No copy-pasting system prompts. No re-explaining.
Does CacheTank work with ChatGPT?
Yes. CacheTank provides a context URL that any AI tool can read, including ChatGPT. For MCP-compatible tools like Claude and Cursor, context loads automatically. For others, paste your context URL.
MCP is an open standard for connecting AI tools to external data sources. CacheTank uses MCP to give Claude, Cursor, and other compatible tools direct access to your personal context without manual copy-pasting.
Is my data private?
Yes. Your context is stored securely and only accessible via your personal tokens. CacheTank never trains on your data. Read tokens are safe to share with AI tools. Write tokens should be kept private.
Custom instructions are platform-specific and limited in length. CacheTank is cross-platform, unlimited, and automatically organizes your knowledge by project and priority using a wisdom cycle that promotes important concepts over time.
MIT — July Blue Sky LLC