If you feel that NerdyBot is crawling your website too quickly, please let us know what an appropriate crawl rate is. If you'd like us to stop crawling your website, the best thing to do is to block our web crawler using the robots.txt specification. To do this, add the following to your robots.txt:
User-agent: NerdyBot
Disallow: /

If you block NerdyBot using robots.txt, you will see crawl requests die down gradually, rather than immediately. This happens because of our distributed architecture: our computers only periodically receive robots.txt information for the domains they are crawling.
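If you want to confirm the rule does what you intend before deploying it, you can check it with Python's standard `urllib.robotparser` module. This is an illustrative sketch: the example URL is hypothetical, and only the `NerdyBot` user-agent string comes from the snippet above.

```python
from urllib import robotparser

# Parse the robots.txt rules directly, without fetching anything.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: NerdyBot",
    "Disallow: /",
])

# NerdyBot is blocked from every path on the site.
print(rp.can_fetch("NerdyBot", "https://example.com/any/page"))

# A crawler not named in the file is unaffected.
print(rp.can_fetch("SomeOtherBot", "https://example.com/any/page"))
```

Because the `User-agent` line names NerdyBot specifically, other well-behaved crawlers will continue to visit your site normally.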
Blocking our web crawler by IP address will not work. Due to the distributed nature of our infrastructure, we have thousands of constantly changing IP addresses. We strongly recommend against trying to block our web crawler this way: you will most likely spend several hours of futile effort and be in a very bad mood at the end of it. Instead, block us in your robots.txt or contact us directly.
To read more about NerdyData, please visit our documentation.