Access and Feeds

Bad Bots: Increasingly Sophisticated in their Attack on Web Sites

By Dick Weisinger

An increasing share of Internet site traffic comes from visits by bot programs. Bots now account for 40 percent of all traffic on the Internet, and half of those visits are made by bad bots. Bots are software programs that automate tasks on the Internet or pretend to be human. Bad bots are those that do malicious things, like consuming large amounts of bandwidth, scraping and stealing content, and deliberately overloading a server to slow it down or crash it.

A report by Distil Networks, the 4th Annual Bad Bot Report, released today, finds that:

  • Bad bots scrape proprietary content from 97 percent of web sites.
  • On 90 percent of web sites, bots are able to get past the login page to access normally restricted data.
  • The cloud is home to many bad bots: 60 percent of bad bots originate from servers running in data centers, and Amazon AWS is a favorite for hosting them.
  • 75 percent of bad bots are sophisticated enough to access external resources or reside on a site for long periods of time.

The Distil report found that “bad bot events are increasingly efficient, persistent, and elusive… One of the side effects of increasing bad bot sophistication is their ability to carry out significant attacks using fewer requests, delay requests, and stay under request rate limits. This method, known as ‘low and slow,’ reduces the noise of many bad bot attack campaigns.”
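To see why the “low and slow” approach is so hard to catch, consider how a typical per-client rate limit works. The sketch below is not from the Distil report; it is a minimal, hypothetical sliding-window rate limiter (the `SlidingWindowRateLimiter` class and its thresholds are illustrative assumptions) showing that a bot which simply paces its requests below the threshold is never flagged, no matter how many requests it makes in total.

```python
from collections import defaultdict, deque
import time

class SlidingWindowRateLimiter:
    """Hypothetical defense: flag a client only when it exceeds
    max_requests within the trailing window_seconds."""

    def __init__(self, max_requests=10, window_seconds=60):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self.history = defaultdict(deque)  # client_id -> request timestamps

    def allow(self, client_id, now=None):
        now = time.monotonic() if now is None else now
        q = self.history[client_id]
        # Drop timestamps that have aged out of the window.
        while q and now - q[0] > self.window_seconds:
            q.popleft()
        q.append(now)
        return len(q) <= self.max_requests

# A "low and slow" bot paces itself just under the threshold:
# with 10 requests per minute allowed, one request every 7 seconds
# keeps at most 9 requests in any 60-second window.
limiter = SlidingWindowRateLimiter(max_requests=10, window_seconds=60)
blocked = sum(
    not limiter.allow("bot", now=i * 7.0)  # simulated timestamps
    for i in range(100)
)
print(f"blocked {blocked} of 100 paced requests")  # prints: blocked 0 of 100
```

Because the limiter only sees request volume, all 100 requests sail through; catching this traffic requires behavioral signals beyond simple rate thresholds.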

