Open
Description
With illustrations supported already, and behaviors possibly coming soon, our worker can be ordered to download arbitrary URLs without stopping at any specific size. That's a potential hazard.
We should also check how the crawler enforces the limit we rely on; if it is a simple post-write check (i.e. the size is only verified after the full body has been written), it would be vulnerable in the same way.
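A minimal sketch of the safer approach: enforce the cap while streaming, so the download aborts as soon as it exceeds the limit instead of only checking the size after the body is written. The function name, the 10 MiB cap, and the use of `urllib` are illustrative assumptions, not the crawler's actual API.

```python
import io
import urllib.request

MAX_BYTES = 10 * 1024 * 1024  # hypothetical 10 MiB cap, not the real setting


def download_capped(url: str, limit: int = MAX_BYTES) -> bytes:
    """Stream the response and abort as soon as the cap is exceeded,
    rather than checking the size once after the full body is written."""
    buf = io.BytesIO()
    with urllib.request.urlopen(url) as resp:
        # Reject early when the server declares an oversized body.
        length = resp.headers.get("Content-Length")
        if length is not None and int(length) > limit:
            raise ValueError(f"declared size {length} exceeds limit {limit}")
        # Servers can lie or omit Content-Length, so also count bytes as
        # they arrive and stop at the first chunk that crosses the limit.
        while chunk := resp.read(64 * 1024):
            if buf.tell() + len(chunk) > limit:
                raise ValueError(f"download exceeded limit of {limit} bytes")
            buf.write(chunk)
    return buf.getvalue()
```

The key point is that both checks (declared length and counted bytes) happen before anything is committed, which is what a post-write check misses.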