My understanding is that restrictions like this exist so that any scraper, spider, etc. can be banned or blocked at will if it becomes bothersome. If the restriction instead read “Don’t use a spider, scraper, or other automated technology that interferes with the operation of the website,” then you might have a fight on your hands over whether something actually “interfered” with the website.