What you’ll need
- A Goldsky account and the CLI installed
Install Goldsky's CLI and log in
- Install the Goldsky CLI. The install command differs between macOS/Linux and Windows; Windows users need to have Node.js and npm installed first (download from nodejs.org if not already installed).
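The install commands below are a sketch based on Goldsky's setup instructions; the macOS/Linux install script URL is an assumption and the Windows npm package name is left as a placeholder, so verify both against the current Goldsky documentation before running.

```shell
# macOS / Linux: install via Goldsky's install script (URL assumed)
curl https://goldsky.com | sh

# Windows: install globally via npm (package name is a placeholder;
# check the Goldsky docs for the exact package)
npm install --global <goldsky-cli-package>
```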
- Go to your Project Settings page and create an API key.
- Back in your Goldsky CLI, log into your Project by running the command `goldsky login` and pasting your API key.
- Now that you are logged in, run `goldsky` to get started.
- A basic understanding of the Mirror product
- A destination sink to write your data to. In this example, we will use the PostgreSQL Sink.
Introduction
In order to stream all the ERC-721 Transfers of a chain, there are two potential methods available:
- Use the readily available ERC-721 dataset for the chain you are interested in: this is the easiest and quickest method to get you streaming token transfers into your sink of choice with minimal code.
- Build the ERC-721 Transfers pipeline from scratch using raw or decoded logs: this method takes more code and time to implement, but it's a great way to learn how to use decoding functions in case you want to build more customized pipelines.
Using the ERC-721 Transfers Source Dataset
Every EVM chain has its own ERC-721 dataset available for you to use as a source in your pipelines. You can check this by running the `goldsky dataset list` command and finding the EVM chain of your choice.
For this example, let’s use the Apex chain and create a simple pipeline definition using its ERC-721 dataset that writes the data into a PostgreSQL instance:
apex-erc721-tokens.yaml
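The configuration file itself is not reproduced here; a minimal sketch might look like the following. The dataset name, version, and sink field names are assumptions based on Goldsky's pipeline definition format, so confirm the dataset name with `goldsky dataset list` and adjust the sink to your own project.

```yaml
name: apex-erc721-tokens
apiVersion: 3
sources:
  apex_erc721_transfers:
    type: dataset
    # Dataset name assumed; confirm with `goldsky dataset list`
    dataset_name: apex.erc721_transfers
    version: 1.0.0
transforms: {}
sinks:
  postgres_sink:
    type: postgres
    from: apex_erc721_transfers
    schema: public
    table: apex_erc721_transfers
    # Replace with the name of your own secret (`goldsky secret list`)
    secret_name: A_POSTGRESQL_SECRET
```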
If you copy and use this configuration file, make sure to update:
- Your `secretName`. If you already created a secret, you can find it via the CLI command `goldsky secret list`.
- The schema and table you want the data written to; by default it writes to `public.apex_erc721_transfers`.
Building ERC-721 Transfers from scratch using logs
In the previous method we just explored, the ERC-721 dataset that we used as the source to the pipeline encapsulates all the decoding logic explained in this section. Read on if you are interested in learning how it’s implemented, in case you want to extend or modify this logic yourself. To build the token transfers pipeline from scratch, use the `raw_logs` Direct Indexing dataset for that chain in combination with Decoding Transform Functions using the ABI of a specific ERC-721 contract.
Building ERC-721 Transfers using Decoding Transform Functions
In this example, we will stream all the `Transfer` events of all the ERC-721 tokens on the Scroll chain. To that end, we will dynamically fetch the ABI of the Cosmic Surprise token from the Scrollscan API (available here) and use it to identify all the same events for the tokens in the chain. We have decided to use the ABI of this NFT contract for this example, but any other ERC-721 compliant token would also work.
We need to differentiate ERC-20 token transfers from ERC-721 (NFT) transfers, since they have the same event signature in decoded data: `Transfer(address,address,uint256)`.
However, if we look closely at their event definitions, we can see that the number of topics differs:
- ERC-20: `event Transfer(address indexed _from, address indexed _to, uint256 _value)`
- ERC-721: `event Transfer(address indexed _from, address indexed _to, uint256 indexed _tokenId)`
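As an illustration (outside the pipeline itself), this distinction can be checked by counting the topics in a log, mirroring the `SPLIT_INDEX(topics, ',', 3) IS NOT NULL` filter used in the transforms below. The sample topic values are made up for the example.

```python
# Illustrative sketch: distinguish ERC-721 from ERC-20 Transfer logs by
# counting topics in the comma-separated `topics` string.

# keccak-256 hash of the shared signature Transfer(address,address,uint256)
TRANSFER_SIG = "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"

def is_erc721_transfer(topics: str) -> bool:
    """ERC-721 Transfer logs carry 4 topics (signature hash + 3 indexed
    arguments); ERC-20 Transfer logs carry only 3 (value is not indexed)."""
    parts = topics.split(",")
    return parts[0] == TRANSFER_SIG and len(parts) == 4

# Made-up topic values for illustration:
erc721_topics = ",".join([TRANSFER_SIG, "0x" + "0" * 64, "0x" + "1" * 64, "0x" + "2" * 64])
erc20_topics = ",".join([TRANSFER_SIG, "0x" + "0" * 64, "0x" + "1" * 64])

print(is_erc721_transfer(erc721_topics))  # True
print(is_erc721_transfer(erc20_topics))   # False
```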
Pipeline Definition
scroll-erc721-transfers.yaml
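As with the previous definition, the original file is not reproduced here. A hypothetical skeleton, with field names assumed from Goldsky's pipeline definition format and the transform SQL elided (it is discussed transform by transform in the sections below), could look like:

```yaml
name: scroll-erc721-transfers
apiVersion: 3
sources:
  scroll_raw_logs:
    type: dataset
    dataset_name: scroll.raw_logs
    version: 1.0.0
transforms:
  scroll_decoded:
    primary_key: id
    sql: |
      -- decode raw logs with _gs_fetch_abi / _gs_log_decode
  scroll_clean:
    primary_key: id
    sql: |
      -- unnest the decoded ROW and filter for Transfer events
  scroll_721_transfers:
    primary_key: id
    sql: |
      -- final column selection and casting
sinks:
  postgres_sink:
    type: postgres
    from: scroll_721_transfers
    schema: mirror
    table: erc721_transfers
    secret_name: A_POSTGRESQL_SECRET
```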
If you copy and use this configuration file, make sure to update:
- Your `secretName`. If you already created a secret, you can find it via the CLI command `goldsky secret list`.
- The schema and table you want the data written to; by default it writes to `mirror.erc721_transfers`.
Transform: scroll_decoded
This transform uses the `_gs_fetch_abi` function to get the ABI from Scrollscan and passes it as the first argument to the function `_gs_log_decode` to decode its topics and data. We store the result in a `decoded` ROW, which we unnest in the next transform.
We include the topic and `SPLIT_INDEX` filters here to limit decoding only to the relevant events:
- `topics LIKE '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef%'`: `topics` is a comma-separated string. Each value in the string is a hash. The first is the hash of the full event signature (including arguments), in our case `Transfer(address,address,uint256)`, which hashes to `0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef`. We use `LIKE` with a `%` at the end, which acts as a wildcard, so that only this first hash is considered.
- `SPLIT_INDEX(topics, ',', 3) IS NOT NULL`: as mentioned in the introduction, ERC-20 transfers share the same event signature as ERC-721 transfers. The difference between them is the number of topics associated with the event: ERC-721 transfers have four topics, and ERC-20 transfers have three.
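Putting these pieces together, the transform's SQL might look like the sketch below. The Scrollscan API URL, the placeholder contract address, and the `_gs_fetch_abi` provider argument are assumptions; only the filters and the `_gs_fetch_abi`/`_gs_log_decode` pairing come from the description above.

```sql
SELECT
    *,
    _gs_log_decode(
        -- Fetch the ABI of an ERC-721 contract (address is a placeholder)
        _gs_fetch_abi('https://api.scrollscan.com/api?module=contract&action=getabi&address=<ERC721_CONTRACT_ADDRESS>', 'etherscan'),
        topics,
        `data`
    ) AS decoded
FROM scroll.raw_logs
WHERE
    -- keccak hash of Transfer(address,address,uint256)
    topics LIKE '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef%'
    -- four topics means ERC-721 (three indexed arguments), not ERC-20
    AND SPLIT_INDEX(topics, ',', 3) IS NOT NULL
```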
Transform: scroll_clean
This transform selects `event_params` and `event_signature` from the result of the decoding. We then filter the query on:
- `decoded IS NOT NULL`: to leave out potential null results from the decoder.
- `decoded.event_signature = 'Transfer'`: the decoder outputs the event name as `event_signature`, excluding its arguments. We use it to filter only for `Transfer` events.
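A sketch of this transform under the same assumptions as above:

```sql
SELECT
    *,
    -- unnest the decoded ROW produced by the previous transform
    decoded.event_params AS event_params,
    decoded.event_signature AS event_signature
FROM scroll_decoded
WHERE
    decoded IS NOT NULL
    AND decoded.event_signature = 'Transfer'
```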
Transform: scroll_721_transfers
This transform selects the columns we care about: `id`, `contract_address` (if you are syncing multiple contract addresses), `sender`, `recipient`, and `token_id`.
- `id`: this is the Goldsky-provided `id`. It is a string composed of the dataset name, block hash, and log index, which is unique per event. Here’s an example: `log_0x60eaf5a2ab37c73cf1f3bbd32fc17f2709953192b530d75aadc521111f476d6c_18`.
- `lower(address) AS contract_address`: we use the `lower` function here to lower-case the address, which makes using this data simpler downstream, and we rename the column to `contract_address` to make it more explicit.
- `lower(event_params[1]) AS sender`: here we continue lower-casing values for consistency. In this case we’re using the first element of the `event_params` array (using a 1-based index) and renaming it to `sender`. Each event parameter maps to an argument of the event signature.
- `lower(event_params[2]) AS recipient`: like the previous column, we’re pulling the second element of the `event_params` array and renaming it to `recipient`.
- `COALESCE(TRY_CAST(event_params[3] AS NUMERIC), -999) AS token_id`: we’ll start from the inside and work our way out.
  - `event_params[3]` is the third element of the `event_params` array, and for ERC-721 this is the token ID. Although not covered in this example, since ERC-20 shares the same signature, this element represents a token balance rather than a token ID if you’re decoding ERC-20 transfers.
  - `TRY_CAST(event_params[3] AS NUMERIC)` casts the string element `event_params[3]` to `NUMERIC`. Token IDs can be as large as an unsigned 256-bit integer, so make sure your database can handle that; if not, you can cast to a different data type that your sink can handle. We use `TRY_CAST` because it prevents the pipeline from failing if the cast fails, returning a `NULL` value instead.
  - `COALESCE(TRY_CAST(event_params[3] AS NUMERIC), -999)`: `COALESCE` can take an arbitrary number of arguments and returns the first non-`NULL` value. Since `TRY_CAST` can return a `NULL`, we return `-999` in that case. This isn’t strictly necessary, but it is useful if you want to find offending values that could not be cast.
You can also add a `WHERE` filter to this query with the addresses you are interested in, like: `WHERE address IN ('0xBC4CA0EdA7647A8aB7C2061c2E118A18a936f13D', '0xdac17f958d2ee523a2206206994597c13d831ec7')`
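Combining the columns described above, a sketch of the final transform:

```sql
SELECT
    id,
    lower(address) AS contract_address,
    lower(event_params[1]) AS sender,
    lower(event_params[2]) AS recipient,
    COALESCE(TRY_CAST(event_params[3] AS NUMERIC), -999) AS token_id
    -- plus any passthrough columns you want to keep, e.g.
    -- event_name, block_number, block_hash, log_index,
    -- transaction_hash, transaction_index
FROM scroll_clean
-- Optionally restrict to specific contracts:
-- WHERE address IN ('0x...', '0x...')
```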
Deploying the pipeline
Our last step is to deploy this pipeline and start sinking ERC-721 transfer data into our database. Assuming we are using the same file name for the pipeline configuration as in this example, we can use the CLI `pipeline create` command like this: `goldsky pipeline create scroll-erc721-transfers --definition-path scroll-erc721-transfers.yaml`
After some time, you should see the pipeline start streaming Transfer data into your sink.
Remember that you can always speed up the streaming process by updating the `resourceSize` of the pipeline.
| id | contract_address | sender | recipient | token_id | event_name | block_number | block_hash | log_index | transaction_hash | transaction_index |
|---|---|---|---|---|---|---|---|---|---|---|
| log_0x5e3225c40254dd5b1b709152feafaa8437e505ae54c028b6d433362150f99986_34 | 0x6e55472109e6abe4054a8e8b8d9edffcb31032c5 | 0xd2cda3fa01d34878bbe6496c7327b3781d4422bc | 0x6e55472109e6abe4054a8e8b8d9edffcb31032c5 | 38087399 | Transfer | 4057598 | 0x5e3225c40254dd5b1b709152feafaa8437e505ae54c028b6d433362150f99986 | 34 | 0xf06c42ffd407bb9abba8f00d4a42cb7f1acc1725c604b8895cdb5f785f827967 | 11 |
