You can use subgraphs as a pipeline source, allowing you to combine the flexibility of subgraph indexing with the expressiveness of the database of your choice. You can also push data from multiple subgraphs with the same schema into the same sink, allowing you to merge subgraphs across chains.

What you’ll need

  1. One or more subgraphs in your project - these can be community subgraphs, subgraphs you've deployed yourself, or no-code subgraphs.

    If you want to combine more than one subgraph, they must share the same GraphQL schema. You can use this tool to compare schemas.
  2. A working database supported by Mirror. For more information on setting up a sink, see the sink documentation.

Walkthrough

1

Prepare a database

goldsky secret list will show you the database secrets available in your active project.

If you need to set up a secret, run goldsky secret create -h to see the available options.

2

Select the subgraphs to combine

Open the Subgraphs Dashboard and find the deployment ID of each subgraph you would like to use as a source.

Run the following query against the subgraph to get the deployment ID.

query {
  _meta {
    deployment
  }
}
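If you query the endpoint programmatically, the deployment ID comes back nested under `data._meta.deployment` in the standard GraphQL response envelope. A minimal sketch of pulling it out of a response body (the payload below is illustrative, not real output):

```python
import json

# Illustrative response body from a subgraph's GraphQL endpoint.
# GraphQL responses wrap results in a top-level "data" object.
sample_response = """
{
  "data": {
    "_meta": {
      "deployment": "QmPuXT3poo1T4rS6agZfT51ZZkiN3zQr6n5F2o1v9dRnnr"
    }
  }
}
"""

def extract_deployment_id(body: str) -> str:
    """Extract the deployment ID from a _meta query response."""
    return json.loads(body)["data"]["_meta"]["deployment"]

print(extract_deployment_id(sample_response))
# QmPuXT3poo1T4rS6agZfT51ZZkiN3zQr6n5F2o1v9dRnnr
```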
3

Create the pipeline definition

Open a text editor and create your definition using the subgraphEntity source. In this example, we will use subgraphs deployed on Optimism and BSC:

  • qidao-optimism (QmPuXT3poo1T4rS6agZfT51ZZkiN3zQr6n5F2o1v9dRnnr)
  • qidao-bsc (QmWgW69CaTwJYwcSdu36mkXgwWY11RjvX1oMGykrxT3wDS)

They have the same schema, and we will be syncing the account and market daily snapshot entities from each.

Entity names are camelCased in the GraphQL API, but they must be snake_cased in the pipeline definition. For example, dailySnapshot becomes daily_snapshot here.
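As a quick illustration of the naming convention (a hypothetical helper, not part of the Goldsky tooling), the conversion from a camelCased entity name to its snake_cased form looks like:

```python
import re

def entity_to_snake(name: str) -> str:
    """Convert a camelCased GraphQL entity name to the snake_cased
    form used in pipeline definitions."""
    # Insert an underscore before each uppercase letter, then lowercase.
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

print(entity_to_snake("dailySnapshot"))        # daily_snapshot
print(entity_to_snake("marketDailySnapshot"))  # market_daily_snapshot
```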

sources:
  - type: subgraphEntity
    # The deployment IDs you gathered above. If you put multiple,
    # they must have the same schema
    deployments:
      - id: QmPuXT3poo1T4rS6agZfT51ZZkiN3zQr6n5F2o1v9dRnnr
      - id: QmWgW69CaTwJYwcSdu36mkXgwWY11RjvX1oMGykrxT3wDS
    # A reference name, referred to later in the `sourceStreamName` of either a transformation or a sink.
    referenceName: account
    entity:
      # The name of the entity
      name: account
  - type: subgraphEntity
    deployments:
      - id: QmPuXT3poo1T4rS6agZfT51ZZkiN3zQr6n5F2o1v9dRnnr
      - id: QmWgW69CaTwJYwcSdu36mkXgwWY11RjvX1oMGykrxT3wDS
    referenceName: market_daily_snapshot
    entity:
      name: market_daily_snapshot
# We are just replicating data, so we don't need any SQL transforms.
transforms: []
sinks:
  # In this example, we're using a postgres secret called SUPER_SECRET_SECRET.
  # Feel free to change this out with any other type of sink.
  - type: postgres
    # The sourceStreamName matches the above `referenceNames`
    sourceStreamName: account
    table: qidao_accounts
    schema: public
    secretName: SUPER_SECRET_SECRET
  - type: postgres
    sourceStreamName: market_daily_snapshot
    table: qidao_market_daily_snapshot
    schema: public
    secretName: SUPER_SECRET_SECRET
4

Create the pipeline

goldsky pipeline create qidao-crosschain --definition-path qidao-crosschain.yaml --status ACTIVE

You should see a response from the server like:

◇  Successfully validated --definition-path file
✔ Created pipeline with name: qidao-crosschain
name: qidao-crosschain
version: 1
project_id: project_cl8ylkiw00krx0hvza0qw17vn
status: INACTIVE
resource_size: s
is_deleted: false
created_at: 1697696162607
updated_at: 1697696162607
definition:
  sources:
    - type: subgraphEntity
      entity:
        name: account
      referenceName: account
      deployments:
        - id: QmPuXT3poo1T4rS6agZfT51ZZkiN3zQr6n5F2o1v9dRnnr
        - id: QmWgW69CaTwJYwcSdu36mkXgwWY11RjvX1oMGykrxT3wDS
    - type: subgraphEntity
      entity:
        name: market_daily_snapshot
      referenceName: market_daily_snapshot
      deployments:
        - id: QmPuXT3poo1T4rS6agZfT51ZZkiN3zQr6n5F2o1v9dRnnr
        - id: QmWgW69CaTwJYwcSdu36mkXgwWY11RjvX1oMGykrxT3wDS
...
5

Monitor the pipeline

Monitor the pipeline with goldsky pipeline monitor qidao-crosschain. The status should change from STARTING to RUNNING within a minute or so, and data will start appearing in your PostgreSQL database.

6

Create an API server

Once you have multiple subgraphs being written into one destination database, you can set up a GraphQL API server with this database as a source. There are many ways to do this, for example with tools like Hasura or PostGraphile.

Can't find what you're looking for? Reach out to us at support@goldsky.com for help.