Release notes - ArcLight - v1.0.25
I learned recently that Siteimprove respects a crawl delay set in robots.txt (unlike many crawlers, including Googlebot). We have some doubt about that claim, given the traffic we were seeing against quod.lib, whose robots.txt already contains:
User-agent: *
Crawl-delay: 5
However, it can only help to try. At your next opportunity, can you please add the above stanza to the robots.txt for deepblue.lib (currently the default from DSpace, I think) and findingaids.lib?
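Once the stanza is deployed, one quick sanity check is to confirm it parses the way a compliant crawler would see it. This is a minimal sketch using Python's standard-library robots.txt parser; the "Siteimprove" user-agent string here is an assumption for illustration, not necessarily the exact token their crawler sends:

```python
from urllib.robotparser import RobotFileParser

# The stanza we plan to add to deepblue.lib and findingaids.lib
robots_txt = """User-agent: *
Crawl-delay: 5
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A compliant crawler matching the wildcard group should see a 5-second delay.
# "Siteimprove" is a placeholder user-agent; any agent falls through to "*" here.
delay = rp.crawl_delay("Siteimprove")
print(delay)  # 5
```

In production you would point the parser at the live file with `rp.set_url("https://deepblue.lib.umich.edu/robots.txt")` followed by `rp.read()` instead of parsing an inline string.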