Akamai has a scraper filter (I think it just rate limits scrapers out of the box but can be configured to block if you want).
I'm not sure how good it is at telling what is a scraper and what isn't, though.
Yeah, AWS has one of these: a set of managed firewall rules called "Bot Control". It seems to work well enough for blocking the well-behaved bots that request pages at a reasonable rate and self-identify with user-agent strings (which I'm not really concerned about blocking, but it does give me some nice graphs about their traffic). It doesn't seem to do a whole lot against an unknown scraper hitting pages as fast as it can.
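For anyone curious, turning it on is basically just attaching the AWS-managed rule group to a WAF web ACL. Here's a rough boto3 sketch; the ACL name, metric names, and the CloudFront scope are placeholders, not anything from my actual setup:

```python
# Rough sketch: create a WAF web ACL with AWS's managed Bot Control
# rule group attached. Names/scope/metric names below are placeholders.
import boto3

# Scope must be "CLOUDFRONT" (API calls go to us-east-1) or "REGIONAL".
wafv2 = boto3.client("wafv2", region_name="us-east-1")

wafv2.create_web_acl(
    Name="my-site-acl",              # placeholder
    Scope="CLOUDFRONT",
    DefaultAction={"Allow": {}},     # allow whatever the rules don't catch
    Rules=[
        {
            "Name": "bot-control",
            "Priority": 0,
            "Statement": {
                "ManagedRuleGroupStatement": {
                    "VendorName": "AWS",
                    "Name": "AWSManagedRulesBotControlRuleSet",
                }
            },
            # Managed rule groups take OverrideAction rather than Action;
            # {"None": {}} keeps each rule's own action (block/count/etc.).
            "OverrideAction": {"None": {}},
            "VisibilityConfig": {
                "SampledRequestsEnabled": True,
                "CloudWatchMetricsEnabled": True,  # this is where the traffic graphs come from
                "MetricName": "bot-control",
            },
        }
    ],
    VisibilityConfig={
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "my-site-acl",
    },
)
```

The per-rule CloudWatch metrics are what give you the bot-traffic graphs even if you never set anything to block.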