Create a table containing ERC-1155 Transfer events for several, or all, token contracts.
Install Goldsky's CLI and log in: run `goldsky login` and paste your API key, then run `goldsky` to get started. You can find the dataset you need by running the `goldsky dataset list` command and finding the EVM chain of your choice.
For this example, let's use the apex chain and create a simple pipeline definition that uses its ERC-1155 dataset and writes the data into a PostgreSQL instance:
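A minimal sketch of such a definition is shown below. The dataset identifier, version, and field names are illustrative assumptions based on Goldsky's pipeline format; check `goldsky dataset list` and the Mirror reference for the exact values.

```yaml
# Illustrative sketch only: dataset name, version, and field names are
# assumptions and may not match the current pipeline schema exactly.
sources:
  - referenceName: apex.erc1155_transfers
    type: dataset
    version: 1.0.0
sinks:
  - type: postgres
    sourceStreamName: apex.erc1155_transfers
    table: apex_erc1155_transfers
    schema: public
    secretName: A_POSTGRESQL_SECRET
```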
Remember to set the `secretName` attribute to your own secret. If you already created a secret, you can find it via the CLI command `goldsky secret list`. This pipeline writes the data into the `public.apex_erc1155_transfers` table.
There are two ways to get decoded ERC-1155 transfer events:

- `raw_logs`: the Direct Indexing dataset for that chain, in combination with Decoding Transform Functions using the ABI of a specific ERC-1155 contract.
- `decoded_logs`: the Direct Indexing dataset for that chain in which the decoding process has already been done by Goldsky. This is only available for certain chains, as you can check in this list.

We'll focus on `raw_logs` and decoding functions, as it's the default and most used way of decoding; we'll also present an example using `decoded_logs` and highlight the differences between the two.
In this example, we will index the `Transfer` events of all the ERC-1155 tokens on the Scroll chain. To that end, we will dynamically fetch the ABI of the Rubyscore_Scroll token from the Scrollscan API (available here) and use it to identify the same events for all the tokens in the chain. We have decided to use the ABI of this token for this example, but any other ERC-1155-compliant token would also work.
ERC-1155 combines the features of ERC-20 and ERC-721 contracts and adds a few of its own. Each transfer has both a token ID and a value: for fungible tokens the value represents the quantity being transferred, while tokens intended to represent NFTs typically use the value 1, though how these work depends on how the contract is implemented.
ERC-1155 also introduces new event signatures for transfers: `TransferSingle(address,address,address,uint256,uint256)` and `TransferBatch(address,address,address,uint256[],uint256[])`, the latter of which lets the contract transfer multiple token IDs at once to a single recipient.
This causes us some trouble, since we want one row per transfer in our database, so we'll need some extra SQL logic in our pipeline to deal with it. To mitigate this complexity we have created two different transforms, one dealing with single transfers and one with batch transfers. We then aggregate both tables into a single view using a third transform.
Let’s now see all these concepts applied in an example pipeline definition:
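Structurally, the definition takes roughly the following shape. The `referenceName` values are placeholders and the SQL bodies are abbreviated; field names follow Goldsky's pipeline format as used elsewhere in this guide and may differ from the current schema.

```yaml
# Structural sketch only; names are placeholders and SQL bodies abbreviated.
sources:
  - referenceName: scroll.raw_logs
    type: dataset
    version: 1.0.0
transforms:
  - referenceName: erc1155_single_transfers
    type: sql
    sql: |
      -- decode raw logs with the fetched ABI, keep TransferSingle rows
      ...
  - referenceName: erc1155_batch_transfers
    type: sql
    sql: |
      -- decode raw logs, split and unnest TransferBatch rows
      ...
  - referenceName: erc1155_transfers
    type: sql
    sql: |
      -- UNION ALL of the two transforms above
      ...
sinks:
  - type: postgres
    sourceStreamName: erc1155_transfers
    table: erc1155_transfers
    schema: mirror
    secretName: A_POSTGRESQL_SECRET
```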
Remember to set the `secretName` attribute to your own secret. If you already created a secret, you can find it via the CLI command `goldsky secret list`. This pipeline writes the data into the `mirror.erc1155_transfers` table.
We use the `_gs_fetch_abi` function to get the ABI from Scrollscan and pass it as the first argument to the `_gs_log_decode` function, which decodes the log's topics and data. We store the result in a `decoded` ROW, which we unnest in the next transform.
We extract `event_params` and `event_signature` from the result of the decoding. We then filter the query on `decoded IS NOT NULL` to leave out potential null results from the decoder.
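Put together, these two steps look roughly like the following. This is a sketch, not the exact transform: the Scrollscan URL, the `0x...` contract address, and the `raw_logs` column names are placeholders or assumptions, and the exact `_gs_fetch_abi` signature may differ.

```sql
-- Sketch of the decoding transform (names and signatures are assumptions).
SELECT
  `id`,
  address AS contract_address,
  _gs_log_decode(
    _gs_fetch_abi('https://api.scrollscan.com/api?module=contract&action=getabi&address=0x...', 'etherscan'),
    `topics`,
    `data`
  ) AS `decoded`
FROM scroll.raw_logs;

-- Unnest the decoded ROW, dropping rows the decoder could not handle.
SELECT
  `id`,
  contract_address,
  decoded.event_params AS `event_params`,
  decoded.event_signature AS `event_signature`
FROM erc1155_decoding  -- placeholder name for the transform above
WHERE decoded IS NOT NULL;
```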
From `event_params` we pull out the sender, recipient, and token ID; note that the indexes we use are different, since ERC-1155 tokens have a different event signature.
- `event_params[4]` is the fourth element of the `event_params` array; for ERC-1155 this is the token ID.
- `TRY_CAST(event_params[4] AS NUMERIC)` casts the string element `event_params[4]` to `NUMERIC`. Token IDs can be as large as an unsigned 256-bit integer, so make sure your database can handle that; if not, you can cast to a different data type that your sink can handle. We use `TRY_CAST` because it prevents the pipeline from failing if the cast fails, returning a `NULL` value instead.
- `COALESCE(TRY_CAST(event_params[4] AS DECIMAL(78)), 0)`: `COALESCE` can take an arbitrary number of arguments and returns the first non-`NULL` value. Since `TRY_CAST` can return `NULL`, we return `0` in case it does. This isn't strictly necessary, but is useful if you want to find offending values that could not be cast.
- `event_params[5]` represents the amount of a token.
- The `event_signature` is `TransferSingle(address,address,address,uint256,uint256)`. As with ERC-721, we could use the event signature as a filter instead.
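These pieces combine into a single-transfer transform along these lines. The source name is a placeholder, and the parameter positions follow the explanation above (operator, from, to, id, value for `TransferSingle`); treat this as a sketch rather than the exact definition.

```sql
-- Sketch of the TransferSingle transform; names are placeholders.
SELECT
  `id`,
  contract_address,
  event_params[2] AS sender,
  event_params[3] AS recipient,
  COALESCE(TRY_CAST(event_params[4] AS DECIMAL(78)), 0) AS token_id,
  COALESCE(TRY_CAST(event_params[5] AS DECIMAL(78)), 0) AS token_value
FROM erc1155_decoded  -- placeholder name for the previous transform
WHERE event_signature = 'TransferSingle(address,address,address,uint256,uint256)';
```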
- `TRIM(LEADING '[' FROM TRIM(TRAILING ']' FROM event_params[4]))`: similar to the ERC-721 example, we use `event_params` to access the token ID information. For ERC-1155, the decoded string for batch transfers in element 4 looks like this: `[1 2 3 4 5 6]`. We need to trim the leading `[` and trailing `]` characters before splitting it out into individual token IDs.
- `_gs_split_string_by(...)`: this is a Goldsky UDF which splits strings by the space character only. If you need to split by another character, for now you can use `REGEXP_REPLACE(column, ',', ' ')` to replace commas with spaces.
- `CROSS JOIN UNNEST ... AS token_ids (token_id)`: this works like `UNNEST` in most other SQL dialects, but is a special case in Flink. It may be confusing that we have two separate `CROSS JOIN`s, but they don't work like `CROSS JOIN` in other SQL dialects: we get a single row with a `token_id` and `token_value` that map correctly to each other.
- `_gs_generate_series` generates an array of indexes for as many tokens as there are in the batch. We combine these indexes with our existing table and use them to access each token/amount pair.
- The `id` coming from the source represents an entire batch transfer event, which can contain multiple tokens, so we concatenate the `token_id` to the `id` to make the unnested rows unique.
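A batch-transfer transform built from these pieces might look like the sketch below. The source name is a placeholder, and the exact signature of `_gs_generate_series` is an assumption; only the UDF names and the trimming/splitting pattern come from this guide.

```sql
-- Sketch of the TransferBatch transform; names and the _gs_generate_series
-- signature are assumptions.
SELECT
  `id` || '_' || token_ids[idx] AS `id`,  -- make each unnested row unique
  contract_address,
  event_params[2] AS sender,
  event_params[3] AS recipient,
  COALESCE(TRY_CAST(token_ids[idx] AS DECIMAL(78)), 0) AS token_id,
  COALESCE(TRY_CAST(token_values[idx] AS DECIMAL(78)), 0) AS token_value
FROM (
  SELECT
    *,
    _gs_split_string_by(TRIM(LEADING '[' FROM TRIM(TRAILING ']' FROM event_params[4]))) AS token_ids,
    _gs_split_string_by(TRIM(LEADING '[' FROM TRIM(TRAILING ']' FROM event_params[5]))) AS token_values
  FROM erc1155_decoded  -- placeholder name for the previous transform
  WHERE event_signature = 'TransferBatch(address,address,address,uint256[],uint256[])'
)
CROSS JOIN UNNEST(_gs_generate_series(1, CARDINALITY(token_ids))) AS idx_table (idx);
```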
goldsky pipeline create scroll-erc1155-transfers --definition-path scroll-erc1155-transfers.yaml
After some time, you should see the pipeline start streaming Transfer data into your sink.
id | contract_address | sender | recipient | token_id | event_name | block_number | block_hash | log_index | transaction_hash | transaction_index |
---|---|---|---|---|---|---|---|---|---|---|
log_0x360fcd6ca8c684039c45642d748735645fac639099d8a89ec57ad2b274407c25_7 | 0x7de37842bcf314c83afe83a8dab87f85ca3a2cee | 0x0000000000000000000000000000000000000000 | 0x16f6aff7a2d84b802b2ddf0f0aed49033b69f4f9 | 6 | 1 | 105651 | 0x360fcd6ca8c684039c45642d748735645fac639099d8a89ec57ad2b274407c25 | 7 | 0x5907ba72e32434938f45539b2792e4eacf0d141db7c4c101e207c1fb26c99274 | 5 |
Alternatively, we can implement the same pipeline using the `decoded_logs` dataset.
Decoded Logs Pipeline Definition
Remember to set the `secretName` attribute to your own secret. If you already created a secret, you can find it via the CLI command `goldsky secret list`. This pipeline writes the data into the `mirror.erc1155_transfers` table. Note that we use `raw_log.topics` just as we did in the previous method.
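The key difference from the `raw_logs` method is that `event_params` and `event_signature` are already populated by Goldsky, so no ABI fetching or decoding transform is needed. A sketch of a single-transfer select against this dataset, with column names assumed from the earlier examples:

```sql
-- Sketch using the decoded_logs dataset; column names are assumptions
-- about the decoded_logs schema.
SELECT
  `id`,
  address AS contract_address,
  event_params[2] AS sender,
  event_params[3] AS recipient,
  COALESCE(TRY_CAST(event_params[4] AS DECIMAL(78)), 0) AS token_id,
  COALESCE(TRY_CAST(event_params[5] AS DECIMAL(78)), 0) AS token_value
FROM ethereum.decoded_logs
WHERE event_signature = 'TransferSingle(address,address,address,uint256,uint256)';
```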
Assuming we are using the same filename for the pipeline configuration as in this example, we can deploy this pipeline with the CLI `pipeline create` command:
goldsky pipeline create ethereum-erc1155-transfers --definition-path ethereum-decoded-logs-erc1155-transfers.yaml