About the Goldsky MCP server

The Model Context Protocol (MCP) is an open protocol that creates standardized connections between AI applications and external services. Goldsky provides an MCP server that allows AI tools like Claude, Cursor, and other MCP clients to search and access our documentation directly. Your MCP-enabled AI tools can search across all Goldsky products and features, making it easier to:
  • Find relevant documentation while coding
  • Get accurate answers about Goldsky’s capabilities
  • Access examples and best practices in context
  • Navigate between related features across products
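Under the hood, MCP clients and servers exchange JSON-RPC 2.0 messages. As a rough sketch of what a documentation search looks like on the wire (the tool name "search_docs" and its argument shape are illustrative assumptions, not the Goldsky server's actual schema), a client might issue a tools/call request like this:

```python
import json

# MCP messages are JSON-RPC 2.0. This sketches the shape of a
# hypothetical documentation-search call; the tool name
# "search_docs" and its arguments are assumptions for
# illustration, not the Goldsky server's published schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_docs",
        "arguments": {"query": "deploy a subgraph on Goldsky"},
    },
}

print(json.dumps(request, indent=2))
```

In practice your AI tool constructs and sends these messages for you; you never write them by hand.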

How the Goldsky MCP server works

When the Goldsky MCP server is connected, an AI tool can search our documentation while generating responses:
  • The AI proactively searches Goldsky docs when relevant to your question
  • Searches span all four core Goldsky products (Subgraphs, Mirror, Turbo, and Compose)
  • Real-time documentation access ensures up-to-date information
  • Context-aware results help you find the right product and feature

Access the Goldsky MCP server

The Goldsky MCP server is hosted at:
https://docs.goldsky.com/mcp

Connect the Goldsky MCP server

Choose your preferred AI tool to get started with the Goldsky MCP server:
To connect the Goldsky MCP server to Cursor:
Step 1: Open MCP settings

  1. Use Command + Shift + P (Ctrl + Shift + P on Windows) to open the command palette.
  2. Search for “Open MCP settings”.
  3. Select Add custom MCP. This opens the mcp.json file.
Step 2: Configure the Goldsky MCP server

In mcp.json, add the Goldsky server:
{
  "mcpServers": {
    "Goldsky": {
      "url": "https://docs.goldsky.com/mcp"
    }
  }
}
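If you already have other MCP servers configured, the Goldsky entry just needs to be merged into the existing mcpServers object rather than replacing it. As a sketch (the existing "OtherServer" entry below is a made-up example), the merge looks like this:

```python
import json

# Merge the Goldsky server into an existing mcp.json config
# without clobbering servers that are already set up. The
# "OtherServer" entry is a made-up placeholder.
existing = {
    "mcpServers": {
        "OtherServer": {"url": "https://example.com/mcp"}
    }
}

existing.setdefault("mcpServers", {})["Goldsky"] = {
    "url": "https://docs.goldsky.com/mcp"
}

print(json.dumps(existing, indent=2))
```

Both servers remain available to Cursor after the merge.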
Step 3: Test the connection

In Cursor’s chat, ask “What tools do you have available?” Cursor should show the Goldsky MCP server as an available tool.
Try asking: “How do I deploy a subgraph on Goldsky?” or “What data sinks does Mirror support?”
See the Cursor documentation for more details.

Using the Goldsky MCP server effectively

Best practices for AI-assisted development

Once connected, your AI assistant can help you in the following scenarios:
When to use: Starting a new project or evaluating options. Ask questions like:
  • “Should I use Subgraphs or Mirror for my NFT marketplace?”
  • “When should I choose Turbo over Mirror?”
The AI will search across product documentation to provide comparative guidance.
When to use: Setting up pipelines, subgraphs, or Compose apps. Ask questions like:
  • “Show me a Mirror pipeline config for decoding Uniswap events”
  • “How do I configure a Turbo pipeline with TypeScript transforms?”
  • “What’s the syntax for Compose task triggers on ERC-20 transfers?”
The AI will find relevant configuration examples and syntax.
When to use: Fixing issues or understanding errors. Ask questions like:
  • “My subgraph deployment failed with error X, what does this mean?”
  • “How do I debug a failing Mirror transform?”
  • “What are common issues with Compose task execution?”
The AI will search documentation for error explanations and solutions. If it can’t resolve your issue, don’t hesitate to reach out to our support; mention in your email that you are using the MCP server to receive priority support. We continue to add debugging guidance to our documentation to improve the MCP server’s answers.

Product-specific search tips

For Subgraphs, key topics to search:
  • Deploying and managing subgraphs
  • GraphQL schema and queries
  • Instant subgraphs (no-code)
  • Cross-chain/multi-chain indexing
  • Webhooks and event notifications
  • Migration from The Graph or Alchemy
Example queries:
  • “How do I create an instant subgraph?”
  • “Show me how to deploy a cross-chain subgraph”
  • “What’s the webhook payload format?”
Tip: Be specific about products. When asking questions, mention the specific Goldsky product (Subgraphs, Mirror, Turbo, or Compose) to get more targeted results. For example, “How do I configure a Mirror pipeline?” works better than “How do I configure a pipeline?”, which could refer to either Mirror or Turbo.