About the Goldsky MCP server
The Model Context Protocol (MCP) is an open protocol that creates standardized connections between AI applications and external services. Goldsky provides an MCP server that allows AI tools like Claude, Cursor, and other MCP clients to search and access our documentation directly. Your MCP-enabled AI tools can search across all Goldsky products and features, making it easier to:
- Find relevant documentation while coding
- Get accurate answers about Goldsky’s capabilities
- Access examples and best practices in context
- Navigate between related features across products
How the Goldsky MCP server works
When an AI tool has the Goldsky MCP server connected, it can search our documentation during response generation:
- The AI proactively searches Goldsky docs when relevant to your question
- Searches span all four core Goldsky products (Subgraphs, Mirror, Turbo, and Compose)
- Real-time documentation access ensures up-to-date information
- Context-aware results help you find the right product and feature
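Under the hood, MCP clients and servers exchange JSON-RPC 2.0 messages: the client first lists the tools a server exposes, then invokes one during response generation. A rough sketch of that exchange, using the standard MCP `tools/list` and `tools/call` methods (the tool name and arguments shown are hypothetical, purely to illustrate the flow):

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "search_docs",
    "arguments": { "query": "Mirror pipeline sinks" }
  }
}
```

Your AI tool handles all of this automatically; you only need to connect the server as described below.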
Access the Goldsky MCP server
The Goldsky MCP server is hosted at:
Connect the Goldsky MCP server
Choose your preferred AI tool to get started with the Goldsky MCP server:
- Cursor
- VS Code
- Claude
- Claude Code
To connect the Goldsky MCP server to Cursor, follow the steps below. See the Cursor documentation for more details.
1
Open MCP settings
- Use Command + Shift + P (Ctrl + Shift + P on Windows) to open the command palette.
- Search for “Open MCP settings”.
- Select Add custom MCP. This opens the mcp.json file.
2
Configure the Goldsky MCP server
In mcp.json, add the Goldsky server:
3
Test the connection
In Cursor’s chat, ask “What tools do you have available?” Cursor should show the Goldsky MCP server as an available tool.
Try asking: “How do I deploy a subgraph on Goldsky?” or “What data sinks does Mirror support?”
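Step 2 above asks you to add the Goldsky server entry to mcp.json. A minimal sketch of that entry, assuming the remote-URL style of MCP server configuration that Cursor supports (replace the placeholder with the hosted URL shown earlier on this page):

```json
{
  "mcpServers": {
    "goldsky": {
      "url": "<goldsky-mcp-server-url>"
    }
  }
}
```

If you already have other servers under `mcpServers`, add the `goldsky` entry alongside them rather than replacing the file.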
Using the Goldsky MCP server effectively
Best practices for AI-assisted development
Once connected, your AI assistant can help you with:
Product selection and architecture
When to use: Starting a new project or evaluating options
Ask questions like:
- “Should I use Subgraphs or Mirror for my NFT marketplace?”
- “When should I choose Turbo over Mirror?”
Configuration and setup
When to use: Setting up pipelines, subgraphs, or Compose apps
Ask questions like:
- “Show me a Mirror pipeline config for decoding Uniswap events”
- “How do I configure a Turbo pipeline with TypeScript transforms?”
- “What’s the syntax for Compose task triggers on ERC-20 transfers?”
Troubleshooting and debugging
When to use: Fixing issues or understanding errors
Ask questions like:
- “My subgraph deployment failed with error X, what does this mean?”
- “How do I debug a failing Mirror transform?”
- “What are common issues with Compose task execution?”
Product-specific search tips
- Subgraphs
- Mirror
- Turbo
- Compose
Key topics to search (Subgraphs):
- Deploying and managing subgraphs
- GraphQL schema and queries
- Instant subgraphs (no-code)
- Cross-chain/multi-chain indexing
- Webhooks and event notifications
- Migration from The Graph or Alchemy
Ask questions like:
- “How do I create an instant subgraph?”
- “Show me how to deploy a cross-chain subgraph”
- “What’s the webhook payload format?”