Impact
On professional setups it's not uncommon for the corpus to reach hundreds of megabytes, if not gigabytes.
As it stands, Medusa will "halt" and replay the corpus before proceeding to break properties.
This reduces iteration speed, as most devs whose corpus already achieves the intended coverage would rather:
- Have Medusa quickly replay the corpus to break the properties
- Have Medusa then continue growing the corpus / trying to break properties more deeply
Echidna vs Medusa
After compilation + Slither:
Echidna will:
- Set up the workers
- Replay each file from the corpus on each worker
- Flag each broken property as it replays the corpus
- Highly likely these will already be shrunken
Medusa will:
- Stall and re-run the corpus (without flagging any broken properties)
- Set up the workers
- Break the properties
- It does seem to show shrunken sequences currently
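To make the requested difference concrete, here is a minimal Go sketch of the two flows. The types and function names (callSequence, property, replayThenTest, testWhileReplaying) are illustrative stand-ins, not Medusa's actual internals; the only point is where the property checks sit relative to corpus replay.

```go
package main

import "fmt"

// callSequence stands in for a stored corpus entry and property for a
// user-defined invariant; both are simplified stand-ins, not Medusa types.
type callSequence string
type property func() bool

// replayThenTest mirrors the current behaviour described above: the whole
// corpus is replayed first, and properties are only evaluated afterwards.
func replayThenTest(corpus []callSequence, props []property) {
	for _, seq := range corpus {
		_ = seq // replay to rebuild coverage; no property checks here
	}
	for _, p := range props {
		if !p() {
			fmt.Println("property broken (reported only after the full replay)")
		}
	}
}

// testWhileReplaying mirrors the Echidna-style behaviour requested here:
// properties are checked after every replayed sequence, so a failure
// surfaces as soon as the sequence that triggers it has been replayed.
func testWhileReplaying(corpus []callSequence, props []property) {
	for _, seq := range corpus {
		_ = seq // replay this sequence
		for _, p := range props {
			if !p() {
				fmt.Println("property broken during replay")
				return
			}
		}
	}
}

func main() {
	corpus := []callSequence{"seq-1", "seq-2", "seq-3"}
	failing := func() bool { return false } // pretend this invariant is broken
	replayThenTest(corpus, []property{failing})
	testWhileReplaying(corpus, []property{failing})
}
```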
Real World Impact
Waiting 30 minutes of corpus replay to see a property that would break within 5 seconds of fuzzing is not ideal.
Current Mitigation
I rename the corpus directory and do a quick run without it (see the sketch below).
Only once I believe all properties are properly written will I use the bigger corpus.
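For reference, here is a minimal sketch of that workaround as a small Go helper. It assumes the corpus lives in ./corpus and that `medusa fuzz` is the command used to start the campaign; both the paths and the exact invocation are assumptions about a typical setup, so adjust them to your medusa.json.

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
)

func main() {
	// Assumed paths; adjust to match the corpus directory in your medusa.json.
	const corpusDir = "corpus"
	const backupDir = "corpus.bak"

	// Move the big corpus aside so the quick run starts from scratch.
	if err := os.Rename(corpusDir, backupDir); err != nil {
		fmt.Println("could not move corpus aside:", err)
		return
	}
	defer func() {
		// Discard whatever small corpus the quick run produced, then restore
		// the original one for the real campaign.
		os.RemoveAll(corpusDir)
		if err := os.Rename(backupDir, corpusDir); err != nil {
			fmt.Println("could not restore corpus:", err)
		}
	}()

	// Quick run without the large corpus to shake out badly written
	// properties; run length is whatever your config/timeout dictates.
	cmd := exec.Command("medusa", "fuzz")
	cmd.Stdout = os.Stdout
	cmd.Stderr = os.Stderr
	if err := cmd.Run(); err != nil {
		fmt.Println("quick run exited with error:", err)
	}
}
```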