In April 2015, Quocirca wrote about the problem of bad bots (web robots), which are causing more and more trouble for IT security teams.
Bad bots are used for all sorts of activities, including brute-force login attempts, online ad fraud (creating false clicks), co-ordinating man-in-the-middle attacks, scanning for IT vulnerabilities that attackers can exploit, and being clustered as botnets to perpetrate denial-of-service attacks.
Blocking such activity is desirable, but not if it also blocks good bots, such as web crawlers (which keep search engines up to date), web scrapers (which populate price comparison and content aggregation websites) and the bots of legitimate vulnerability scanning and pen testing services from vendors such as Qualys, Rapid7 and WhiteHat.
Distil Networks has emerged as a thought leader in the space, with appliances and web services to identify bot-like activity and add bots to black lists (blocked) or white lists (allowed). The service also recognizes that whether a bot is good or bad may depend on the target organization: some news sites may welcome aggregator bots while others may not, and policy can be set accordingly.
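To illustrate, here is a minimal sketch of how such per-site, list-based classification might work. The SITE_POLICY table, the classify function and the bot names are invented for illustration; they are not Distil's actual product or API.

```python
# Illustrative sketch of per-site allow/deny bot policy (not Distil's real API).
# Whether a bot is "good" depends on the site: one news site may welcome an
# aggregator bot that another chooses to block.

SITE_POLICY = {
    "news-site-a.example": {"allow": {"aggregatorbot"}, "deny": {"scraperbot"}},
    "news-site-b.example": {"allow": set(), "deny": {"aggregatorbot", "scraperbot"}},
}

def classify(site: str, user_agent: str) -> str:
    """Return 'allow', 'deny' or 'inspect' for a request to a given site."""
    policy = SITE_POLICY.get(site, {"allow": set(), "deny": set()})
    ua = user_agent.lower()
    if any(bot in ua for bot in policy["deny"]):
        return "deny"     # black-listed for this site
    if any(bot in ua for bot in policy["allow"]):
        return "allow"    # white-listed for this site
    return "inspect"      # unknown: hand off to behavioural analysis

# The same bot gets different treatment depending on the target organization:
print(classify("news-site-a.example", "Mozilla/5.0 (compatible; AggregatorBot/1.0)"))  # allow
print(classify("news-site-b.example", "Mozilla/5.0 (compatible; AggregatorBot/1.0)"))  # deny
```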
As of February 2016, Distil has a formidable new competitor. The web content distribution and security vendor Akamai has released a new service called Bot Manager, which is linked to Akamai’s Client Reputation Service (released in 2015), which helps to detect bots and assess their behaviour in real time.
Akamai accepts that it aims to capitalise on the market opened up by Distil and others. It will of course hope to make fast progress in the bot protection market through the loyalty of its large customer base, many of whom will see the benefit of adding closely linked bot protection to other Akamai services, including its Prolexic DDoS mitigation and Kona website protection.
Akamai says it has already identified 1,300 good bots. It was also keen to point out that it believes it has taken responding to bots to a new level. These responses, sketched in code after the list, include:
• Silent denial, where a bot (and its owner) does not know it has been blocked
• Serving alternate content (such as sending competitors false pricing information)
• Restricting the activity of good bots to certain times, to limit the impact on performance for real users
• Prioritising good bots differently for different partners
• Slowing down aggressive bots (be they good or bad)
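To make these options concrete, the sketch below shows how a request handler might apply each one. The action names and the respond function are hypothetical; this is not Akamai’s actual Bot Manager API, just an illustration of the ideas.

```python
import random
import time

# Hypothetical response actions mirroring the options listed above
# (not Akamai's actual API).

def respond(action: str, real_content: str, alternate_content: str):
    """Return an (HTTP status, body) pair for a classified bot request."""
    if action == "silent_deny":
        # Return a plausible-looking 200 so the bot cannot tell it was blocked.
        return 200, "<html><body>Nothing to see here</body></html>"
    if action == "serve_alternate":
        # e.g. feed a competitor's price-scraping bot false pricing data.
        return 200, alternate_content
    if action == "slow_down":
        # Tarpit an aggressive bot (good or bad) before answering.
        time.sleep(random.uniform(5, 15))
        return 200, real_content
    if action == "defer":
        # Ask a good bot to come back outside peak hours, protecting
        # performance for real users (Retry-After is in seconds).
        return 503, "Retry-After: 14400"
    return 200, real_content
```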
Control of Bot Manager, and how it responds, is down to individual customers, which can take action on different groups of bots based on either Akamai’s or their own classification. They can take this to extremes; for example, an organization that wanted to stop its content being searchable by Google could block its web crawler.
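As a sketch of that extreme case: User-Agent strings are easily spoofed, so a rule targeting Googlebot would typically also verify the crawler’s identity using Google’s documented reverse-then-forward DNS check. The function names and rule logic below are illustrative only, not any vendor’s implementation.

```python
import socket

# Sketch of a customer-defined rule that blocks Google's crawler (an extreme
# choice: the site's content would drop out of Google's search results).

def is_googlebot(user_agent: str, ip: str) -> bool:
    """Confirm a claimed Googlebot via Google's documented DNS check."""
    if "googlebot" not in user_agent.lower():
        return False
    try:
        host = socket.gethostbyaddr(ip)[0]               # reverse lookup
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        return ip in socket.gethostbyname_ex(host)[2]    # forward confirmation
    except OSError:
        return False

def apply_customer_policy(user_agent: str, ip: str) -> str:
    # A site that does not want to be indexed by Google could deny here.
    return "deny" if is_googlebot(user_agent, ip) else "allow"
```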
Distil and Akamai do not have the market to themselves; other bot protection products and services include Shape Security’s Botwall and the anti-web-scraping service ShieldSquare. Bot blocking capabilities are also built into Imperva’s Incapsula DDoS services and F5’s Application Security Manager. Bad bots and good bots alike are going to have to work harder and harder to get the access they need to carry out their work.