Dynamic tables let you maintain updatable lookup tables within your pipeline. Unlike static SQL transforms, dynamic tables can be modified in real time without redeploying your pipeline.
Once a dynamic table is defined, use the `dynamic_table_check()` function in SQL transforms:
```yaml
transforms:
  # Define the dynamic table
  tracked_wallets:
    type: dynamic_table
    backend_type: Postgres
    backend_entity_name: user_wallets
    secret_name: MY_POSTGRES

  # Use it in SQL
  filtered_transfers:
    type: sql
    primary_key: id
    sql: |
      SELECT * FROM erc20_transfers
      WHERE dynamic_table_check('tracked_wallets', to_address)
```
For Postgres backends, you can update the backing table directly with standard SQL. For example, given a dynamic table backed by `turbo.tracked_contracts`:
```sql
-- Add values
INSERT INTO turbo.tracked_contracts VALUES
  (lower('0x2791Bca1f2de4661ED88A30C99A7a9449Aa84174')),
  (lower('0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48'));

-- Remove values
DELETE FROM turbo.tracked_contracts
WHERE value = lower('0x...');

-- Check contents
SELECT * FROM turbo.tracked_contracts;
```
Changes take effect within seconds: your pipeline will start filtering based on the updated values without a redeploy.
Monitor all ERC-20 transfers for specific wallets:
```yaml
name: wallet-tracker
resource_size: s

sources:
  polygon_transfers:
    type: dataset
    dataset_name: matic.erc20_transfers
    version: 1.2.0
    start_at: latest

transforms:
  # Dynamic table for wallets we're tracking
  tracked_wallets:
    type: dynamic_table
    backend_type: Postgres
    backend_entity_name: user_wallets
    secret_name: MY_POSTGRES

  # Filter to only transfers involving tracked wallets
  wallet_transfers:
    type: sql
    primary_key: id
    sql: |
      SELECT
        *,
        CASE
          WHEN dynamic_table_check('tracked_wallets', from_address) THEN 'outgoing'
          WHEN dynamic_table_check('tracked_wallets', to_address) THEN 'incoming'
          ELSE 'unknown'
        END as direction
      FROM polygon_transfers
      WHERE dynamic_table_check('tracked_wallets', from_address)
         OR dynamic_table_check('tracked_wallets', to_address)

sinks:
  postgres_sink:
    type: postgres
    from: wallet_transfers
    schema: public
    table: wallet_activity
    secret_name: MY_POSTGRES
```
To add a wallet to track:
```sql
INSERT INTO turbo.user_wallets VALUES (lower('0x...your-wallet...'));
```
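To stop tracking a wallet, delete it the same way. This sketch assumes the backing table stores addresses in a `value` column, matching the `tracked_contracts` example above:

```sql
-- Remove a wallet from tracking (column name "value" assumed,
-- as in the tracked_contracts example)
DELETE FROM turbo.user_wallets
WHERE value = lower('0x...your-wallet...');
```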
Track processed records to avoid duplicates using an in-memory backend:
```yaml
transforms:
  # In-memory deduplication - fast but lost on restart
  seen_transactions:
    type: dynamic_table
    backend_type: InMemory
    backend_entity_name: processed_txs
    sql: |
      SELECT transaction_hash FROM ethereum_transactions

  new_transactions_only:
    type: sql
    primary_key: transaction_hash
    sql: |
      SELECT * FROM ethereum_transactions
      WHERE NOT dynamic_table_check('seen_transactions', transaction_hash)
```
InMemory vs Postgres for Deduplication:
Use InMemory for development, testing, or when processing a bounded dataset where persistence isn’t needed
Use Postgres for production when you need deduplication state to survive restarts
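For the production case, a Postgres-backed variant of the deduplication table might look like the following. This is a sketch that reuses the `MY_POSTGRES` secret from the earlier examples and assumes the SQL auto-population shown above works the same way with a Postgres backend:

```yaml
transforms:
  # Postgres-backed deduplication - state survives pipeline restarts
  seen_transactions:
    type: dynamic_table
    backend_type: Postgres
    backend_entity_name: processed_txs
    secret_name: MY_POSTGRES
    sql: |
      SELECT transaction_hash FROM ethereum_transactions

  new_transactions_only:
    type: sql
    primary_key: transaction_hash
    sql: |
      SELECT * FROM ethereum_transactions
      WHERE NOT dynamic_table_check('seen_transactions', transaction_hash)
```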
When a dynamic table uses SQL to auto-populate, both the dynamic table and any SQL transform using it must reference the same source or upstream transform. This ensures data consistency and proper synchronization.
Good example:
```yaml
transforms:
  my_table:
    sql: SELECT value FROM source_1  # Uses source_1

  my_transform:
    sql: |
      -- Also uses source_1
      SELECT * FROM source_1
      WHERE dynamic_table_check('my_table', value)
```
Bad example:
```yaml
transforms:
  my_table:
    sql: SELECT value FROM source_1  # Uses source_1

  my_transform:
    sql: |
      -- Uses source_2 - ERROR!
      SELECT * FROM source_2
      WHERE dynamic_table_check('my_table', value)
```