Create a table containing ERC-721 Transfers for several or all token contracts
Install Goldsky's CLI and log in: run `goldsky login` and paste your API key. Once logged in, run `goldsky` to get started. You can see the available datasets by running the `goldsky dataset list` command and finding the EVM chain of your choice.
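For reference, these are the two CLI commands used so far:

```shell
# Authenticate the CLI with your Goldsky API key
goldsky login

# List the available datasets and find the EVM chain of your choice
goldsky dataset list
```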
For this example, let's use the apex chain and create a simple pipeline definition using its ERC-721 transfers dataset that writes the data into a PostgreSQL instance:
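A minimal definition along those lines could look like the sketch below. Treat it as a starting point rather than a verbatim config: the configuration keys and the dataset name (`apex.erc721_transfers` here) are assumptions, so confirm them with `goldsky dataset list` and the Mirror pipeline reference for your CLI version.

```yaml
# apex-erc721-transfers.yaml — illustrative sketch only
sources:
  - referenceName: apex.erc721_transfers  # assumed dataset name; confirm with `goldsky dataset list`
    type: dataset
    version: 1.0.0
    startAt: earliest

transforms: []                            # no transforms needed; the dataset is already shaped

sinks:
  - referenceName: postgres_sink
    type: postgres
    sourceStreamName: apex.erc721_transfers
    schema: public
    table: apex_erc721_transfers          # data lands in public.apex_erc721_transfers
    secretName: A_POSTGRESQL_SECRET       # replace with your own secret name
```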
Make sure to change `secretName` to your own secret. If you already created a secret, you can find it via the CLI command `goldsky secret list`. Once the pipeline is running, the data is written to the `public.apex_erc721_transfers` table in your PostgreSQL instance.
For chains where you need to build the table yourself, there are two datasets you can use as a source:

- `raw_logs`: the Direct Indexing dataset for that chain, used in combination with Decoding Transform Functions and the ABI of a specific ERC-721 contract.
- `decoded_logs`: the Direct Indexing dataset for that chain in which the decoding process has already been done by Goldsky. This is only available for certain chains, as you can check in this list.

In this guide we will primarily use `raw_logs` and decoding functions, as it's the default and most used way of decoding; we'll also present an example using `decoded_logs` and highlight the differences between the two.
In this example, we're interested in the Transfer events of all the ERC-721 tokens on the Scroll chain. To that end, we will dynamically fetch the ABI of the Cosmic Surprise token from the Scrollscan API (available here) and use it to identify the same events for all tokens on the chain. We decided to use the ABI of this NFT contract for this example, but any other ERC-721 compliant token would also work.
We need to differentiate ERC-20 token transfers from ERC-721 (NFT) transfers, since they have the same event signature in decoded data: `Transfer(address,address,uint256)`. However, if we look closely at their event definitions, we can see that the number of topics differs:

ERC-20: `event Transfer(address indexed _from, address indexed _to, uint256 _value)`

ERC-721: `event Transfer(address indexed _from, address indexed _to, uint256 indexed _tokenId)`
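Putting this together, the `raw_logs` pipeline definition (saved as `scroll-erc721-transfers.yaml`, the filename used in the deploy command below) could look roughly like the following sketch. It is a reconstruction based on the transforms described next, not a verbatim config: the configuration keys, the `_gs_fetch_abi` arguments (the Scrollscan API URL and the contract address placeholder), and the exact source field names are assumptions to verify against the Mirror reference before deploying.

```yaml
sources:
  - referenceName: scroll.raw_logs          # Direct Indexing raw logs for Scroll (assumed name)
    type: dataset
    version: 1.0.0
    startAt: earliest

transforms:
  # 1) Fetch the Cosmic Surprise ABI from Scrollscan and decode the matching logs.
  - referenceName: decode_transfers
    type: sql
    sql: >
      SELECT
        *,
        _gs_log_decode(
          _gs_fetch_abi('https://api.scrollscan.com/api?module=contract&action=getabi&address=<COSMIC_SURPRISE_ADDRESS>', 'etherscan'),
          topics,
          `data`
        ) AS decoded
      FROM scroll.raw_logs
      WHERE topics LIKE '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef%'
        AND SPLIT_INDEX(topics, ',', 3) IS NOT NULL

  # 2) Keep only successfully decoded Transfer events and shape the final columns.
  #    The guide text below refers to event_params/event_signature after unnesting
  #    the decoded row; here they are accessed directly through `decoded`.
  - referenceName: erc721_transfers
    type: sql
    sql: >
      SELECT
        id,
        lower(address) AS contract_address,
        lower(decoded.event_params[1]) AS sender,
        lower(decoded.event_params[2]) AS recipient,
        COALESCE(TRY_CAST(decoded.event_params[3] AS NUMERIC), -999) AS token_id,
        decoded.event_signature AS event_name,
        block_number,
        block_hash,
        log_index,
        transaction_hash,
        transaction_index
      FROM decode_transfers
      WHERE decoded IS NOT NULL
        AND decoded.event_signature = 'Transfer'

sinks:
  - referenceName: postgres_sink
    type: postgres
    sourceStreamName: erc721_transfers
    schema: mirror
    table: erc721_transfers                 # results land in mirror.erc721_transfers
    secretName: A_POSTGRESQL_SECRET         # replace with your own secret
```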
As with the previous definition, remember to change `secretName` to your own secret. If you already created a secret, you can find it via the CLI command `goldsky secret list`. The results will be written to the `mirror.erc721_transfers` table.
In the first transform, we use the `_gs_fetch_abi` function to get the ABI from Scrollscan and pass it as the first argument to the `_gs_log_decode` function, which decodes the log's topics and data. We store the result in a `decoded` ROW, which we unnest in the next transform. We include the topic and `SPLIT_INDEX` filters here to limit decoding to the relevant events only:
- `topics LIKE '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef%'`: `topics` is a comma-separated string. Each value in the string is a hash; the first is the hash of the full event signature (including arguments), in our case `Transfer(address,address,uint256)` for ERC-721, which is hashed to `0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef`. We use `LIKE` to only consider that first hash, with a `%` at the end, which acts as a wildcard.
- `SPLIT_INDEX(topics, ',', 3) IS NOT NULL`: as mentioned in the introduction, ERC-20 transfers share the same event signature as ERC-721 transfers. The difference between them is the number of topics associated with the event: ERC-721 transfers have four topics, while ERC-20 transfers have three, so the fourth topic (`SPLIT_INDEX` uses zero-based indexing, hence index 3) only exists for ERC-721 transfers.

In the second transform, we extract `event_params` and `event_signature` from the result of the decoding. We then filter the query on:
- `decoded IS NOT NULL`: to leave out potential null results from the decoder.
- `decoded.event_signature = 'Transfer'`: the decoder outputs the event name as `event_signature`, excluding its arguments. We use it to filter only for Transfer events.

Finally, we select the columns we're interested in: `id`, `contract_address` (if you are syncing multiple contract addresses), `sender`, `recipient` and `token_id`:
- `id`: this is the Goldsky-provided `id`, a string composed of the dataset name, block hash, and log index, which is unique per event. Here's an example: `log_0x60eaf5a2ab37c73cf1f3bbd32fc17f2709953192b530d75aadc521111f476d6c_18`.
- `lower(address) AS contract_address`: we use the `lower` function to lower-case the address, which makes the data simpler to use downstream; we also rename the column to `contract_address` to make it more explicit.
- `lower(event_params[1]) AS sender`: here we continue to lower-case values for consistency. In this case we're using the first element of the `event_params` array (using a 1-based index) and renaming it to `sender`. Each event parameter maps to an argument of the event signature.
- `lower(event_params[2]) AS recipient`: like the previous column, we're pulling the second element of the `event_params` array and renaming it to `recipient`.
- `COALESCE(TRY_CAST(event_params[3] AS NUMERIC), -999) AS token_id`: we'll start from the inside and work our way out.
  - `event_params[3]` is the third element of the `event_params` array, and for ERC-721 this is the token ID. Although not covered in this example, since ERC-20 shares the same signature, this element represents a token amount rather than a token ID if you're decoding ERC-20 transfers.
  - `TRY_CAST(event_params[3] AS NUMERIC)` casts the string element `event_params[3]` to `NUMERIC`. Token IDs can be as large as an unsigned 256-bit integer, so make sure your database can handle that; if not, you can cast to a different data type that your sink can handle. We use `TRY_CAST` because it prevents the pipeline from failing if the cast fails, returning a `NULL` value instead.
  - `COALESCE(TRY_CAST(event_params[3] AS NUMERIC), -999)`: `COALESCE` can take an arbitrary number of arguments and returns the first non-NULL value. Since `TRY_CAST` can return `NULL`, we return `-999` in that case. This isn't strictly necessary, but it is useful if you want to find offending values that could not be cast.

If you only want transfers for specific contracts, you can add a `WHERE` filter to this query with the addresses you are interested in, like: `WHERE address IN ('0xBC4CA0EdA7647A8aB7C2061c2E118A18a936f13D', '0xdac17f958d2ee523a2206206994597c13d831ec7')`
We can now deploy this pipeline with the CLI pipeline create command:

goldsky pipeline create scroll-erc721-transfers --definition-path scroll-erc721-transfers.yaml
After some time, you should see the pipeline start streaming Transfer data into your sink.
id | contract_address | sender | recipient | token_id | event_name | block_number | block_hash | log_index | transaction_hash | transaction_index |
---|---|---|---|---|---|---|---|---|---|---|
log_0x5e3225c40254dd5b1b709152feafaa8437e505ae54c028b6d433362150f99986_34 | 0x6e55472109e6abe4054a8e8b8d9edffcb31032c5 | 0xd2cda3fa01d34878bbe6496c7327b3781d4422bc | 0x6e55472109e6abe4054a8e8b8d9edffcb31032c5 | 38087399 | Transfer | 4057598 | 0x5e3225c40254dd5b1b709152feafaa8437e505ae54c028b6d433362150f99986 | 34 | 0xf06c42ffd407bb9abba8f00d4a42cb7f1acc1725c604b8895cdb5f785f827967 | 11 |
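Because we used `-999` as the `COALESCE` fallback, it is easy to check the sink for token IDs that failed the `NUMERIC` cast. For the PostgreSQL table used in this guide, that check could look like:

```sql
-- Find transfers whose token_id could not be cast to NUMERIC
SELECT id, contract_address, transaction_hash
FROM mirror.erc721_transfers
WHERE token_id = -999;
```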
As mentioned at the beginning of this guide, an alternative implementation of this pipeline uses the `decoded_logs` dataset.
Decoded Logs Pipeline Definition
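Since this dataset is only available for certain chains, the example below targets Ethereum (matching the `ethereum-decoded-logs-erc721-transfers.yaml` filename used in the deploy command at the end). As before, this is a hedged sketch rather than a verbatim config: the configuration keys and the `decoded_logs` field names (`event_params`, `event_signature`, `raw_log.topics`) are assumptions to verify against the dataset schema.

```yaml
sources:
  - referenceName: ethereum.decoded_logs    # decoding already done by Goldsky
    type: dataset
    version: 1.0.0
    startAt: earliest

transforms:
  # Single transform: no decoding functions needed, only filtering and column shaping.
  - referenceName: erc721_transfers
    type: sql
    sql: >
      SELECT
        id,
        lower(address) AS contract_address,
        lower(event_params[1]) AS sender,
        lower(event_params[2]) AS recipient,
        COALESCE(TRY_CAST(event_params[3] AS NUMERIC), -999) AS token_id,
        event_signature AS event_name,
        block_number,
        block_hash,
        log_index,
        transaction_hash,
        transaction_index
      FROM ethereum.decoded_logs
      WHERE raw_log.topics LIKE '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef%'
        AND SPLIT_INDEX(raw_log.topics, ',', 3) IS NOT NULL
        AND event_signature = 'Transfer'

sinks:
  - referenceName: postgres_sink
    type: postgres
    sourceStreamName: erc721_transfers
    schema: mirror
    table: erc721_transfers                 # results land in mirror.erc721_transfers
    secretName: A_POSTGRESQL_SECRET         # replace with your own secret
```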
Again, remember to change `secretName` to your own secret. If you already created a secret, you can find it via the CLI command `goldsky secret list`. The results will be written to the `mirror.erc721_transfers` table. Since the decoding has already been done by Goldsky, there is no need for the decoding transform functions here; we simply filter on `raw_log.topics` just as we did in the previous method.
Assuming we are using the same filename for the pipeline configuration as in this example, we can deploy this pipeline with the CLI pipeline create command:
goldsky pipeline create ethereum-erc721-transfers --definition-path ethereum-decoded-logs-erc721-transfers.yaml