If you're seeing HTTP requests to your site from Wrblebot, it's us. We run a distributed change-detection system that uses "Wrblebot" as its user agent to build the Wrble index. We understand that crawlers consume your resources, and we take several steps to reduce our load.
We humbly ask that you allow Wrblebot to crawl, as it's the best chance we have at a more open and accessible index outside of Microsoft and Google. That democratization allows more competition and variety among the services sending search traffic to your site.
Things we do to lower our impact:
- Respect robots.txt rate limits, applied per crawler node since our system is distributed
- Respect robots.txt Disallow rules
- Estimate how likely each page is to have changed, and skip fetches that are unlikely to find anything new
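If you'd prefer to slow Wrblebot down or keep it out of parts of your site, the standard robots.txt directives work. A minimal illustrative example (note that Crawl-delay is a widely honored de facto extension rather than part of the original robots.txt standard, and the path and delay shown here are placeholders, not recommendations):

```
User-agent: Wrblebot
Crawl-delay: 10
Disallow: /private/
```

A `Disallow: /` line under the same `User-agent` would block Wrblebot entirely, though we hope you won't need to.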
If there's anything we can do to be a better citizen, reach out to our crawling team at firstname.lastname@example.org.