Build an analytics pipeline that processes NFT transfer events and wallet data using the provided dbt and Dagster environment. Expected time: 3-4 hours.
Run these scripts in the order listed below:
- logs_nft1.sql
- logs_nft2.sql
- txs_nft1.sql
- txs_nft2.sql
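A minimal loading sketch, assuming the scripts are plain CREATE/INSERT batches and that a local DuckDB file can stand in for the provided environment (DuckDB and the warehouse.duckdb filename are assumptions; adapt the connection to whatever database the environment actually provides):

```python
# Hypothetical loader: executes the four raw-event scripts in the order above.
# DuckDB and the warehouse.duckdb filename are assumptions, not part of the task.
import duckdb

SCRIPTS = ["logs_nft1.sql", "logs_nft2.sql", "txs_nft1.sql", "txs_nft2.sql"]

con = duckdb.connect("warehouse.duckdb")
for path in SCRIPTS:
    with open(path) as f:
        con.execute(f.read())  # each file is run as one multi-statement batch
    print(f"loaded {path}")
```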
wallet_networth.csv is a CSV file containing each holder's native wallet balance on Ethereum. Import this data into your database as a separate table so the analysis pipeline can run against it.
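One way to do the import, sketched under the same DuckDB assumption (the wallet_networth table name simply mirrors the file name and is only a suggestion); `dbt seed` is another option if the CSV is placed in the project's seeds/ directory:

```python
# Hypothetical CSV import; DuckDB and the resulting table layout are assumptions.
import duckdb

con = duckdb.connect("warehouse.duckdb")
con.execute(
    """
    CREATE OR REPLACE TABLE wallet_networth AS
    SELECT * FROM read_csv_auto('wallet_networth.csv')
    """
)
rows = con.execute("SELECT count(*) FROM wallet_networth").fetchone()[0]
print(f"loaded {rows} wallet balances")
```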
Create models showing:
- Current snapshot of all NFT holders
- Incremental updates for faster processing
- Total tokens held per wallet
- First/last transfer timestamps per wallet
- Whales: Wallets holding >1% of supply
- High-Value Holders: Wallets holding ≥2 NFTs AND net worth >$5k
- The NFT-count and net-worth thresholds should be easy to parameterize so the analysis can be rerun with new values (see the sketch below)
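The model SQL would read these thresholds through dbt `var()` calls; here is a sketch of the rerun side, assuming the holders model is named current_holders and the variable names shown here (all placeholders that must match whatever the models actually use):

```python
# Hypothetical rerun helper: passes new thresholds to dbt as --vars and rebuilds
# the holder/classification models. Model and variable names are assumptions.
import json
import subprocess

def rebuild_classifications(whale_supply_pct=1.0, min_nft_count=2, min_networth_usd=5000):
    dbt_vars = json.dumps({
        "whale_supply_pct": whale_supply_pct,
        "min_nft_count": min_nft_count,
        "min_networth_usd": min_networth_usd,
    })
    # "current_holders+" selects the holders model and everything downstream of it
    subprocess.run(
        ["dbt", "build", "--select", "current_holders+", "--vars", dbt_vars],
        check=True,
    )

if __name__ == "__main__":
    # e.g. rerun with a higher net-worth cutoff
    rebuild_classifications(min_networth_usd=10_000)
```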
Implement basic dbt tests to verify:
- Transfer logic correctness
- Holdings calculations
- Edge cases (mints/burns)
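Alongside the dbt tests, the same invariants can be spot-checked with an ad-hoc script; this sketch assumes a DuckDB warehouse and a current_holders model with wallet and tokens_held columns (all of these names are placeholders):

```python
# Hypothetical sanity checks mirroring the dbt tests; the DuckDB warehouse and
# table/column names are assumptions.
import duckdb

ZERO_ADDRESS = "0x0000000000000000000000000000000000000000"
con = duckdb.connect("warehouse.duckdb")

# Holdings calculation: no wallet should end up with a negative token count.
negative = con.execute(
    "SELECT count(*) FROM current_holders WHERE tokens_held < 0"
).fetchone()[0]
assert negative == 0, f"{negative} wallets have negative holdings"

# Mint/burn edge case: the zero address should never appear as a holder.
zero_rows = con.execute(
    "SELECT count(*) FROM current_holders WHERE wallet = ?", [ZERO_ADDRESS]
).fetchone()[0]
assert zero_rows == 0, "zero address should be excluded from holders"

print("sanity checks passed")
```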
Create a Dagster job that:
- Refreshes current holders and trader classifications
- Runs every 5 minutes
- Includes basic logging
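A minimal Dagster sketch covering those three points, plus retries as a basic error-handling hook; it shells out to the dbt CLI for brevity, and the model selector and job names are placeholders (a fuller setup would likely use the dagster-dbt integration instead):

```python
# Hypothetical Dagster job: refreshes holder/classification models every
# 5 minutes with basic logging and retries. Names are placeholders.
import subprocess
from dagster import (
    Definitions,
    OpExecutionContext,
    RetryPolicy,
    ScheduleDefinition,
    job,
    op,
)

@op(retry_policy=RetryPolicy(max_retries=2, delay=30))
def refresh_holders(context: OpExecutionContext) -> None:
    context.log.info("Rebuilding current holders and trader classifications")
    result = subprocess.run(
        ["dbt", "build", "--select", "current_holders+"],
        capture_output=True,
        text=True,
    )
    context.log.info(result.stdout)
    if result.returncode != 0:
        context.log.error(result.stderr)
        raise RuntimeError("dbt build failed")  # surfaces the failure to Dagster

@job
def refresh_holders_job():
    refresh_holders()

refresh_schedule = ScheduleDefinition(
    job=refresh_holders_job,
    cron_schedule="*/5 * * * *",  # every 5 minutes
)

defs = Definitions(jobs=[refresh_holders_job], schedules=[refresh_schedule])
```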
DBT Models:
- Analytics models
- Basic tests
Dagster Pipeline:
- Basic scheduled job
- Error handling
README explaining:
- Approach
- How to run
- How to validate
Compare against Etherscan:
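One possible spot-check (an assumption about what to compare, since the task doesn't pin it down): pull a small sample of wallets from the imported net-worth table and compare their native balances against Etherscan's account balance endpoint. The wallet and balance_eth column names are placeholders, and an ETHERSCAN_API_KEY is required:

```python
# Hypothetical validation script; the DuckDB warehouse, table/column names, and
# sample size are assumptions.
import os
import duckdb
import requests

API_KEY = os.environ["ETHERSCAN_API_KEY"]
WEI_PER_ETH = 10**18

con = duckdb.connect("warehouse.duckdb")
sample = con.execute(
    "SELECT wallet, balance_eth FROM wallet_networth LIMIT 5"
).fetchall()

for wallet, balance_eth in sample:
    resp = requests.get(
        "https://api.etherscan.io/api",
        params={
            "module": "account",
            "action": "balance",
            "address": wallet,
            "tag": "latest",
            "apikey": API_KEY,
        },
        timeout=10,
    ).json()
    onchain_eth = int(resp["result"]) / WEI_PER_ETH
    print(f"{wallet}: csv={balance_eth} etherscan={onchain_eth}")
```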