ERC-721 tokens, also known as NFTs, provide a standardized format for non-fungible digital assets within EVM ecosystems. Streaming ERC-721 transfer data into a database is a fundamental step that unlocks data analysis, tracking, and the development of new applications.

This guide is part of a series of tutorials on streaming transfer data into your data warehouse using Mirror pipelines. Here we focus on ERC-721 Transfers; the other guides in the series cover the other types of Transfers.

What you’ll need

  1. A Goldsky account and the CLI installed
  2. A basic understanding of the Mirror product
  3. A destination sink to write your data to. In this example, we will use the PostgreSQL Sink.

Introduction

To stream all the ERC-721 Transfers of a chain, there are two methods available:

  1. Use the raw_logs Direct Indexing dataset for that chain in combination with Decoding Transform Functions using the ABI of a specific ERC-721 Contract.
  2. Use the decoded_logs Direct Indexing dataset for that chain, in which the decoding has already been done by Goldsky. This is only available for certain chains; you can check availability in this list.

We’ll primarily focus on the first method, using raw_logs with decoding transform functions, as it’s the default and most commonly used approach; we’ll also present an example using decoded_logs and highlight the differences between the two.

ERC-721 Transfers using Decoding Transform Functions

In this example, we will stream all the Transfer events of all the ERC-721 tokens on the Scroll chain. To that end, we will dynamically fetch the ABI of the Cosmic Surprise token from the Scrollscan API (available here) and use it to identify the same Transfer events across all ERC-721 tokens on the chain. We chose the ABI of this NFT contract for this example, but any other ERC-721 compliant token would also work.

We need to differentiate ERC-20 token transfers from ERC-721 (NFT) transfers since they share the same event signature in decoded data: Transfer(address,address,uint256). However, if we look closely at their event definitions, we can see that the number of topics differs:

  • ERC-20: event Transfer(address indexed _from, address indexed _to, uint256 _value)
  • ERC-721: event Transfer(address indexed _from, address indexed _to, uint256 indexed _tokenId)

ERC-20 Transfer events have 3 topics (one for the event signature + 2 for the indexed params). NFTs, on the other hand, have 4 topics, as they have one more indexed param in their event definition. We will use this as a filter in our pipeline transform to keep only ERC-721 Transfer events, as illustrated below.
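
To make this concrete, here’s what that looks like in the raw_logs data, where topics is a comma-separated string of hashes (the <from>, <to>, and <tokenId> values below are placeholders, not real hashes):

-- ERC-20:  topics = '0xddf252ad...,<from>,<to>'            -> 3 topics
-- ERC-721: topics = '0xddf252ad...,<from>,<to>,<tokenId>'  -> 4 topics
-- SPLIT_INDEX(topics, ',', 3) returns the fourth comma-separated value
-- (the index is zero-based), so it is NULL for ERC-20 Transfer events
-- and non-NULL for ERC-721 Transfer events:
SELECT SPLIT_INDEX(topics, ',', 3) IS NOT NULL AS is_erc721
FROM scroll_mainnet.raw_logs
WHERE topics LIKE '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef%'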

Let’s now see all these concepts applied in an example pipeline definition:

Pipeline Definition

scroll-erc721-transfers.yaml
sources:
  - type: dataset
    referenceName: scroll_mainnet.raw_logs
    version: 1.0.0
transforms:
  - referenceName: scroll_decoded
    type: sql
    primaryKey: id
    # Fetch the ABI from scrollscan for Cosmic Surprise token
    sql: >
      SELECT
        *,
        _gs_evm_decode(
            _gs_fetch_abi('https://api.scrollscan.com/api?module=contract&action=getabi&address=0xcf7f37b4916ac5c530c863f8c8bb26ec1e8d2ccb', 'etherscan'),
            `topics`,
            `data`
        ) AS `decoded`
        FROM scroll_mainnet.raw_logs

  - referenceName: scroll_clean
    primaryKey: id
    type: sql
    # Clean up the previous transform, unnest the values from the `decoded` object
    sql: >
      SELECT
        *,
        decoded.event_params AS `event_params`,
        decoded.event_signature AS `event_name`
        FROM scroll_decoded
        WHERE decoded IS NOT NULL
        AND decoded.event_signature = 'Transfer'
        AND topics LIKE '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef%'
        AND SPLIT_INDEX(topics, ',', 3) IS NOT NULL

  - referenceName: scroll_721_transfers
    primaryKey: id
    type: sql
    sql: >
      SELECT
        id,
        lower(address) AS contract_address,
        lower(event_params[1]) AS sender,
        lower(event_params[2]) AS recipient,
        COALESCE(TRY_CAST(event_params[3] AS NUMERIC), -999) AS token_id,
        event_name,
        block_number,
        block_hash,
        log_index,
        transaction_hash,
        transaction_index
        FROM scroll_clean
sinks:
  - type: postgres
    table: erc721_transfers
    schema: mirror
    secretName: <YOUR SECRET>
    description: Postgres sink for ERC721 transfers
    referenceName: scroll_721_sink
    sourceStreamName: scroll_721_transfers

If you copy and use this configuration file, make sure to update:

  1. Your secretName. If you already created a secret, you can find it via the CLI command goldsky secret list.
  2. The schema and table you want the data written to; by default it writes to mirror.erc721_transfers.

There are 3 transforms in this pipeline definition; let’s look at how each of them works:

Transform: scroll_decoded
SELECT
  *,
  _gs_evm_decode(
      _gs_fetch_abi('https://api.scrollscan.com/api?module=contract&action=getabi&address=0xcf7f37b4916ac5c530c863f8c8bb26ec1e8d2ccb', 'etherscan'),
      `topics`,
      `data`
  ) AS `decoded`
  FROM scroll_mainnet.raw_logs

As explained in the Decoding Contract Events guide, we first use the _gs_fetch_abi function to get the ABI from Scrollscan and pass it as the first argument to _gs_evm_decode, which decodes the log’s topics and data. We store the result in a decoded ROW, which we unnest in the next transform.
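
For a Transfer log, the decoded row that comes out has roughly this shape (illustrative values, following the field descriptions used in the next transform):

-- decoded.event_signature -> 'Transfer'                (the event name, without arguments)
-- decoded.event_params    -> [<from>, <to>, <tokenId>] (1-based array of strings)
-- decoded                 -> NULL if the log could not be decoded with this ABI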

Transform: scroll_clean
SELECT
  *,
  decoded.event_params AS `event_params`,
  decoded.event_signature AS `event_name`
  FROM scroll_decoded
  WHERE decoded IS NOT NULL
  AND decoded.event_signature = 'Transfer'
  AND topics LIKE '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef%'
  AND SPLIT_INDEX(topics, ',', 3) IS NOT NULL

In this second transform, we take the event_params and event_signature from the result of the decoding. We then filter the query on:

  • decoded IS NOT NULL: to leave out potential null results from the decoder
  • decoded.event_signature = 'Transfer': the decoder will output the event name as event_signature, excluding its arguments. We use it to filter only for Transfer events.
  • topics LIKE '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef%': topics is a comma-separated string. Each value in the string is a hash. The first is the hash of the full event signature (including arguments), in our case Transfer(address,address,uint256) for ERC-721, which hashes to 0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef. We use LIKE with a trailing %, which acts as a wildcard, so that only this first hash has to match.
  • SPLIT_INDEX(topics, ',', 3) IS NOT NULL: as mentioned in the introduction, ERC-20 transfers share the same event_signature as ERC-721 transfers. The difference between them is the number of topics associated with the event: ERC-721 transfers have four topics, and ERC-20 transfers have three (see the ERC-20 variant sketched below).
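
As an aside, flipping that last condition turns this into an ERC-20 filter; here is a minimal sketch of the same transform adapted to keep ERC-20 transfers instead:

SELECT
  *,
  decoded.event_params AS `event_params`,
  decoded.event_signature AS `event_name`
  FROM scroll_decoded
  WHERE decoded IS NOT NULL
  AND decoded.event_signature = 'Transfer'
  AND topics LIKE '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef%'
  -- no fourth topic means only 3 topics, so this is an ERC-20 Transfer
  AND SPLIT_INDEX(topics, ',', 3) IS NULL
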
Transform: scroll_721_transfers
SELECT
  id,
  lower(address) AS contract_address,
  lower(event_params[1]) AS sender,
  lower(event_params[2]) AS recipient,
  COALESCE(TRY_CAST(event_params[3] AS NUMERIC), -999) AS token_id,
  event_name,
  block_number,
  block_hash,
  log_index,
  transaction_hash,
  transaction_index
  FROM scroll_clean

In this last transform, we select all the Transfer information we want in our database. We’ve included a number of columns that you may or may not need; the main columns needed for most purposes are id, contract_address (if you are syncing multiple contract addresses), sender, recipient, and token_id.

  • id: This is the Goldsky-provided id, a string composed of the dataset name, block hash, and log index; it is unique per event. Here’s an example: log_0x60eaf5a2ab37c73cf1f3bbd32fc17f2709953192b530d75aadc521111f476d6c_18
  • lower(address) AS contract_address: We use the lower function to lower-case the address, which makes the data simpler to use downstream. We also rename the column to contract_address to make it more explicit.
  • lower(event_params[1]) AS sender: Here we continue to lower-case values for consistency. We take the first element of the event_params array (which uses 1-based indexing) and rename it to sender. Each event parameter maps to an argument of the event signature.
  • lower(event_params[2]) AS recipient: Like the previous column, we’re pulling the second element in the event_params array and renaming it to recipient.

For the token_id we introduce a few SQL functions: COALESCE(TRY_CAST(event_params[3] AS NUMERIC), -999) AS token_id. We’ll start from the inside and work our way out.

  1. event_params[3] is the third element of the event_params array, and for ERC-721 this is the token ID. Although not covered in this example, since ERC-20 shares the same signature, this element represents a token balance rather than token ID if you’re decoding ERC-20 transfers.
  2. TRY_CAST(event_params[3] AS NUMERIC) casts the string element event_params[3] to NUMERIC. Token IDs can be as large as an unsigned 256-bit integer, so make sure your database can handle that; if not, you can cast to a different data type that your sink can handle. We use TRY_CAST because it returns NULL instead of failing the pipeline when a cast fails.
  3. COALESCE(TRY_CAST(event_params[3] AS NUMERIC), -999): COALESCE can take an arbitrary number of arguments and returns the first non-NULL value. Since TRY_CAST can return NULL, we fall back to -999 when it does. This isn’t strictly necessary, but it is useful if you want to find offending values that could not be cast, as in the example query below.
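
For example, once data has landed in the sink, a query like the following against the Postgres table configured in this pipeline would surface any rows whose token ID failed to cast:

SELECT id, contract_address, transaction_hash
FROM mirror.erc721_transfers
WHERE token_id = -999;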

Lastly, we also include some block metadata in the query to give context to each transfer:

event_name,
block_number,
block_hash,
log_index,
transaction_hash,
transaction_index

It’s worth mentioning that in this example we are interested in all ERC-721 Transfer events, but if you would like to filter for specific contract addresses, you can simply add a WHERE filter to this query with the addresses you are interested in (note that the literals are lower-cased so they match lower(address)), like:

WHERE lower(address) IN ('0xbc4ca0eda7647a8ab7c2061c2e118a18a936f13d', '0xdac17f958d2ee523a2206206994597c13d831ec7')

Deploying the pipeline

Our last step is to deploy this pipeline and start sinking ERC-721 transfer data into our database. Assuming we are using the same file name for the pipeline configuration as in this example, we can use the CLI pipeline create command like this:

goldsky pipeline create scroll-erc721-transfers --definition-path scroll-erc721-transfers.yaml

After some time, you should see the pipeline start streaming Transfer data into your sink.

Remember that you can always speed up the streaming process by updating the resourceSize of the pipeline.

Here’s an example transfer record from our sink:

id: log_0x5e3225c40254dd5b1b709152feafaa8437e505ae54c028b6d433362150f99986_34
contract_address: 0x6e55472109e6abe4054a8e8b8d9edffcb31032c5
sender: 0xd2cda3fa01d34878bbe6496c7327b3781d4422bc
recipient: 0x6e55472109e6abe4054a8e8b8d9edffcb31032c5
token_id: 38087399
event_name: Transfer
block_number: 4057598
block_hash: 0x5e3225c40254dd5b1b709152feafaa8437e505ae54c028b6d433362150f99986
log_index: 34
transaction_hash: 0xf06c42ffd407bb9abba8f00d4a42cb7f1acc1725c604b8895cdb5f785f827967
transaction_index: 11

We can find this transaction on Scrollscan, where we can see that it corresponds to the transfer of a MERK token.
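
You can also look the record up directly in the sink; for example, with the Postgres table used in this guide:

SELECT *
FROM mirror.erc721_transfers
WHERE transaction_hash = '0xf06c42ffd407bb9abba8f00d4a42cb7f1acc1725c604b8895cdb5f785f827967';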

This concludes our successful deployment of a Mirror pipeline streaming ERC-721 Transfers from the Scroll chain into our database using inline decoders. Congrats! 🎉

ERC-721 Transfers using decoded datasets

As explained in the Introduction, Goldsky provides decoded datasets for Raw Logs and Raw Traces for a number of different chains. You can check this list to see if the chain you are interested in has these decoded datasets. In these cases, there is no need for us to run Decoding Transform Functions as the dataset itself will already contain the event signature and event params decoded.

Click on the button below to see an example pipeline definition for streaming ERC-721 tokens on the Ethereum chain using the decoded_logs dataset.

You’ll notice it’s quite similar to the inline decoding pipeline, but here we simply create a transform that filters on raw_log.topics, just as we did in the previous method.
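
For reference, the core transform in that pipeline might look roughly like this (a sketch only: the ethereum.decoded_logs reference name and its column layout are assumptions modeled on the raw_logs example above):

SELECT
  id,
  lower(address) AS contract_address,
  lower(event_params[1]) AS sender,
  lower(event_params[2]) AS recipient,
  COALESCE(TRY_CAST(event_params[3] AS NUMERIC), -999) AS token_id
  FROM ethereum.decoded_logs
  -- no _gs_evm_decode needed: event_signature and event_params come pre-decoded
  WHERE event_signature = 'Transfer'
  AND raw_log.topics LIKE '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef%'
  AND SPLIT_INDEX(raw_log.topics, ',', 3) IS NOT NULL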

Assuming we are using the same filename for the pipeline configuration as in this example, we can deploy this pipeline with the CLI pipeline create command:

goldsky pipeline create ethereum-erc721-transfers --definition-path ethereum-decoded-logs-erc721-transfers.yaml

Conclusion

In this guide, we have learned how Mirror simplifies streaming NFT Transfer events into your database.

We took a deep dive into the standard decoding method using Decoding Transform Functions, implementing an example on the Scroll chain, and we also looked at an example implementation using the decoded_logs dataset for Ethereum. Both are solid decoding methods; depending on your use case and dataset availability, you might prefer one over the other.

With Mirror, developers gain flexibility and efficiency in integrating blockchain data, opening up new possibilities for applications and insights. Experience the transformative power of Mirror today and redefine your approach to blockchain data integration.

Can't find what you're looking for? Reach out to us at support@goldsky.com for help.