The goal of this crawler is to retrieve all the domains used by the attacker during a ClickFix campaign. It crawls the smart contracts the malware uses on the blockchain to fetch its updates, then deobfuscates them using webcrack.
The deobfuscation step is highly dependent on the specific campaign we analyzed. If you want to reuse this code, be sure to test this part and adjust the deobfuscation outputs.
The crawler was initially tested against all the transactions of our campaign between September and November 2025.
You need to create an account on https://etherscan.io/ to get an API token. The free plan is enough to crawl the data of this malware.
Note: At the time of publishing this tool, the etherscan.io free API plan has been unavailable for a week due to unusually high network activity. Hence, I was not able to perform final tests on the crawler, which may still contain errors. Don't hesitate to open an issue if you find any.
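As a reference, fetching a transaction by hash goes through Etherscan's public "proxy" API module. The sketch below only builds the request URL; the token and transaction hash are placeholders, and the actual HTTP call is left commented out.

```python
# Sketch of querying the Etherscan API (free plan) for a transaction.
# YOUR_TOKEN and the transaction hash below are placeholders.
import json
import urllib.parse
import urllib.request

API_URL = "https://api.etherscan.io/api"

def build_tx_request(token: str, tx_hash: str) -> str:
    """Build the URL that asks Etherscan for a transaction by hash."""
    params = {
        "module": "proxy",
        "action": "eth_getTransactionByHash",
        "txhash": tx_hash,
        "apikey": token,
    }
    return API_URL + "?" + urllib.parse.urlencode(params)

# Example (performs a real HTTP request, so it is commented out):
# with urllib.request.urlopen(build_tx_request("YOUR_TOKEN", "0x...")) as resp:
#     tx = json.load(resp)["result"]
#     print(tx["input"])  # hex-encoded payload stored in the transaction
```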
First, we need to get all the transactions of the Windows and macOS contracts. To do so, we can use bscscan's web export feature, which produces CSV files.
Note: This step is still manual. We didn't find any simple programmatic way to do this without using external indexers.
In my case, I want to monitor the two following contract addresses:
- 0x46790e2ac7f3ca5a7d1bfce312d11e91d23383ff
- 0x68DcE15C1002a2689E19D33A3aE509DD1fEb11A5
Go to the following page and download the CSV data of all the transactions of the contract you want to monitor during a specified time range: https://testnet.bscscan.com/exportData
Once the transaction hashes are exported, the script will fetch their data. It will then deobfuscate the JS script with webcrack to get the obfuscated URL. Finally, it will use one of the two known methods from this malware to deobfuscate the URL and get the corresponding domains.
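The first decoding step is generic: a transaction's "input" field is a 0x-prefixed hex string, which decodes to the payload the attacker stored on-chain. The sketch below shows only that step; the campaign-specific URL deobfuscation methods are not reproduced here.

```python
# Illustration of decoding a transaction's hex-encoded "input" field.
# The campaign-specific URL deobfuscation that follows is not shown.
def decode_tx_input(hex_input: str) -> str:
    """Decode a hex-encoded transaction input field to text."""
    data = bytes.fromhex(hex_input.removeprefix("0x"))
    # Ignore undecodable bytes: real inputs may mix binary and text.
    return data.decode("utf-8", errors="ignore")

# Round trip with a dummy payload:
payload = "https://example.com".encode().hex()
assert decode_tx_input("0x" + payload) == "https://example.com"
```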
To use the crawler, run:
python clickfix-crawler.py -t <YOUR_ETHERSCAN_TOKEN> -f your/transaction/hashes/file.csv -f another/transaction/hashes/files.csv

The crawler will save the status of the crawled transactions in a marshaled file, clickfix-crawler-data.tmp, to avoid fetching them again. Transactions that ended in an error will be processed again. If you want to ignore this file and fetch all the data from the API again, use the -n flag.
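The caching behavior described above can be sketched as follows. The function and status names here are illustrative, not the crawler's actual internals: statuses are marshaled to disk, and entries marked as errors are dropped on load so they get processed again.

```python
# Sketch of the status cache: marshal the per-transaction statuses to
# clickfix-crawler-data.tmp, and drop errored entries on load so they
# are retried. Names here are illustrative, not the crawler's internals.
import marshal
import os

STATE_FILE = "clickfix-crawler-data.tmp"

def save_state(state: dict, path: str = STATE_FILE) -> None:
    """Persist the crawl statuses, keyed by transaction hash."""
    with open(path, "wb") as f:
        marshal.dump(state, f)

def load_state(path: str = STATE_FILE) -> dict:
    """Load previous statuses; errored transactions are retried."""
    if not os.path.exists(path):
        return {}
    with open(path, "rb") as f:
        state = marshal.load(f)
    return {tx: s for tx, s in state.items() if s != "error"}
```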