Our prices are quoted on a monthly basis for simpler presentation, but metered and billed on an hourly basis. This has a few key implications.

  1. Because the number of hours in a month isn’t constant, we conservatively estimate that each month has 730 hours; thus, the estimated monthly pricing you see is more expensive than what you would actually pay for the quoted usage in most months.

  2. All estimations of the number of subgraphs or Mirror pipelines assume “always-on” capacity. In practice, you can run twice the number of subgraph workers or pipeline workers for half the time and pay the same price. This similarly holds for the “entities stored” metric in subgraphs.

Subgraphs

We track usage based on two metrics: (1) the number of active subgraphs, and (2) the amount of data stored across all subgraphs in your project.

Metering

Active Subgraphs

The number of active subgraph workers, tracked hourly. If you pause or delete a subgraph, it is no longer billed.

Examples:

  1. If you have 10 subgraphs, you will be using 10 subgraph worker hours every hour. At 730 hours in the average month, with 10 subgraphs you would incur 7,300 subgraph worker hours.
  2. If you start with 10 subgraphs in a billing period and delete all of them halfway through the billing period, you are charged for the equivalent of 5 subgraphs that billing period.
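The examples above are straightforward multiplication. A minimal sketch of the arithmetic (the `HOURS_PER_MONTH` constant and function name are illustrative, not part of any official SDK):

```python
HOURS_PER_MONTH = 730  # conservative monthly estimate used throughout this page

def subgraph_worker_hours(active_subgraphs: int, hours_active: float) -> float:
    """Each active subgraph accrues one worker-hour per hour it runs."""
    return active_subgraphs * hours_active

# Example 1: 10 subgraphs running all month
print(subgraph_worker_hours(10, HOURS_PER_MONTH))  # 7300

# Example 2: 10 subgraphs deleted halfway through the billing period
# accrue the same usage as 5 always-on subgraphs
print(subgraph_worker_hours(10, HOURS_PER_MONTH / 2))  # 3650.0
```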

Subgraph Entities Stored

The number of entities stored across all the subgraphs in your project, also tracked hourly. If you delete a subgraph, its entities are no longer tracked. All subgraph entities in a project count toward the project's usage cumulatively.

Examples:

  1. If you have 3 subgraphs that have 30,000 entities collectively, you will be using 30,000 subgraph entity storage hours every hour.

    At 730 hours in the average month, you will incur 30,000 * 730 = 21,900,000 subgraph entity storage hours in that month.

  2. If you start with 3 subgraphs, each with 10,000 entities, and you delete two of them after 10 days, you will be using 30,000 entity subgraph hours for the first 10 days, then 10,000 entity subgraph hours after that.
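Example 2 can be totaled by summing entity count times duration over each period of constant storage. A small sketch (the function name is illustrative, assuming a 30-day month for the remainder after day 10):

```python
def entity_storage_hours(periods):
    """Sum entity-count x duration (hours) over each period of constant storage."""
    return sum(entities * hours for entities, hours in periods)

# 30,000 entities for the first 10 days, then 10,000 for the remaining 20 days
usage = entity_storage_hours([(30_000, 10 * 24), (10_000, 20 * 24)])
print(usage)  # 12000000 entity storage-hours for a 30-day month
```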

Starter Plan

Active Subgraphs

Up to 3 concurrent subgraphs a month, or more precisely, 2,250 worker hours.

Subgraph Storage

Up to 100,000 entities stored for a whole month, equivalent to 75,000,000 entity storage hours.

You will be incurring usage for each hour that each subgraph in your project is deployed and active. If you have 2 subgraphs deployed and active for 2 hours each, you will accumulate 4 hours of usage.

When you exceed free tier limits on the Starter plan, subgraph indexing will be paused, but your subgraphs will remain queryable.

Scale Plan Rates

Active Subgraphs (subgraph worker-hours)

  - First 2,250 worker-hours: Free (i.e. 3 always-on subgraphs)
  - 2,251+ worker-hours: $0.05/hour (i.e. ~$36.50/month per additional subgraph)

Subgraph Storage (subgraph entity storage-hours)

  - First 100K entities stored (i.e. up to 75,000,000 storage-hours): Free
  - Up to 10M entities stored (i.e. up to 7.5B storage-hours): ~$4.00/month per 100K entities (i.e. $0.0053 per 100K entities stored per hour)
  - Above 10M entities stored (i.e. >7.5B storage-hours): ~$1.05/month per 100K entities (i.e. $0.0014 per 100K entities stored per hour)
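The tiered storage rates above can be combined into a quick monthly estimator. This is a sketch using the quoted monthly approximations (actual billing is hourly; the function name is illustrative):

```python
def monthly_storage_cost(entities: int) -> float:
    """Estimate monthly subgraph storage cost from the tiered Scale Plan rates:
    first 100K entities free, ~$4.00/month per 100K up to 10M, ~$1.05/month above."""
    free_cap, mid_cap = 100_000, 10_000_000
    cost = 0.0
    if entities > free_cap:
        # Entities between 100K and 10M bill at ~$4.00/month per 100K
        cost += (min(entities, mid_cap) - free_cap) / 100_000 * 4.00
    if entities > mid_cap:
        # Entities above 10M bill at the discounted ~$1.05/month per 100K
        cost += (entities - mid_cap) / 100_000 * 1.05
    return cost

print(monthly_storage_cost(100_000))    # 0.0 (entirely within the free tier)
print(monthly_storage_cost(1_000_000))  # 36.0 (900K billable entities)
```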

Mirror

Metering

Active Pipeline Workers

The number of active pipeline workers, tracked hourly. A small pipeline (the default) has 1 worker. Pipelines can have multiple parallel workers, and each worker incurs usage separately.

Resource size and worker count:

  - small: 1
  - medium: 4
  - large: 10
  - x-large: 20
  - xx-large: 40

If you have one small pipeline and one large pipeline each deployed for 2 hours, you will accumulate 1*2*1 + 1*2*10 = 2 + 20 = 22 hours of usage.

Note that pipelines with a single subgraph as a source and webhooks or GraphQL APIs as sink(s) are not metered as pipelines. However, you will still accumulate hourly subgraph usage.

Examples:

  1. If you have 1 small pipeline, you will be using 1 pipeline worker-hour every hour. At 730 hours in the average month, with that pipeline you would incur 730 pipeline worker-hours over the course of the month.
  2. If you start with 10 small pipelines in a billing period and delete all of them halfway through the billing period, you are charged the equivalent of 5 pipeline workers for the full billing period.
  3. If you have 2 large pipelines, you will be using 20 pipeline worker-hours every hour, equating to 14,600 pipeline worker-hours if you run them the entire month.
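The worker-count multiplier makes pipeline metering a sum of worker count times hours across deployments. A sketch (the `WORKERS` mapping mirrors the resource-size table above; function name is illustrative):

```python
# Worker counts per resource size, from the table above
WORKERS = {"small": 1, "medium": 4, "large": 10, "x-large": 20, "xx-large": 40}

def pipeline_worker_hours(deployments):
    """Sum worker-count x hours across (size, hours_deployed) deployments."""
    return sum(WORKERS[size] * hours for size, hours in deployments)

# One small and one large pipeline, each deployed for 2 hours:
print(pipeline_worker_hours([("small", 2), ("large", 2)]))  # 2 + 20 = 22

# Two large pipelines running a full 730-hour month:
print(pipeline_worker_hours([("large", 730), ("large", 730)]))  # 14600
```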

Pipeline Event Writes

The number of records written by pipelines in your project. For example, for a PostgreSQL sink, every row created, updated, or deleted counts as a ‘write’. For a Kafka sink, every message counts as a write.

Example:

  1. If you have a pipeline that writes 20,000 records per day for 10 days, and then 20 records per day for 10 days, you will be using 200,200 pipeline event writes.
  2. If you have two pipelines that each write 1 million events during one month (2 million total), you will not be charged for the first million events, then pay $10 for the next million ($1.00 per 100,000 events), as the free tier is exhausted.
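Event-write billing follows the same tiered shape as storage. A minimal sketch of the Scale Plan rates (first 1M free, $1.00 per 100K up to 10M, $0.10 per 100K beyond; the function name is illustrative):

```python
def event_write_cost(events: int) -> float:
    """Tiered cost of pipeline event writes in one billing cycle."""
    free_cap, mid_cap = 1_000_000, 10_000_000
    cost = 0.0
    if events > free_cap:
        # Events between 1M and 10M bill at $1.00 per 100,000
        cost += (min(events, mid_cap) - free_cap) / 100_000 * 1.00
    if events > mid_cap:
        # Events above 10M bill at $0.10 per 100,000
        cost += (events - mid_cap) / 100_000 * 0.10
    return cost

# Two pipelines writing 1M events each: first 1M free, second 1M billed
print(event_write_cost(2_000_000))  # 10.0
```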

Starter Plan

Active Pipeline Workers

You are able to run 1 small pipeline for free (equating to 750 pipeline worker-hours) each billing cycle.

Pipeline Event Writes

You are able to write up to 1 million events to sinks, cumulatively across all pipelines, each billing cycle.

When you exceed free tier limits on the Starter Plan, your pipelines will be paused, but they will remain queryable.

Scale Plan Rates

You will incur usage for each hour that each pipeline in your project is deployed and active. Note that pipelines have a resource size (which maps to the underlying VM size), and this acts as a multiplier on hourly usage.

Active Pipelines (pipeline worker-hours)

  - First 750 worker-hours: Free (i.e. 1 always-on pipeline worker for a full month)
  - 751+ worker-hours: $0.10/hour (i.e. $73.00/month per worker)

Pipeline Throughput (pipeline events written)

  - First 1M events written: Free
  - Up to 10M events written: $1.00 per 100,000 events
  - Above 10M events written: $0.10 per 100,000 events
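Putting the worker-hour tier into code: the first 750 worker-hours in a cycle are free, and each additional worker-hour bills at $0.10. A sketch (the function name is illustrative; a medium pipeline's 4 workers are taken from the resource-size table earlier on this page):

```python
def pipeline_worker_cost(worker_hours: float) -> float:
    """Scale Plan worker cost: first 750 worker-hours free, then $0.10 each."""
    return max(worker_hours - 750, 0) * 0.10

# One always-on medium pipeline: 4 workers x 730 hours = 2,920 worker-hours
print(pipeline_worker_cost(4 * 730))  # (2920 - 750) * 0.10 = 217.0
```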