By Mario Toneguzzi
A new report says there is growing concern about the impact of bad bots on website traffic, something ecommerce sites should pay particular attention to.
The fifth annual Bad Bot Report 2018: The Year Bad Bots Went Mainstream, by Distil Networks, also found that ecommerce has the highest proportion of “sophisticated” bots (22.9 per cent), which mimic human behaviour to evade detection.
Edward Roberts, director of product marketing for Distil, a global leader in bot mitigation, says the study showed growth in all bot behaviour, both good and bad.
“Bad is up approximately 10 per cent and good is also up. The proportion of human traffic is down because both of those are up,” he says. “And that sort of tells you that bots are growing in popularity and more people are launching them.”
Here are the key findings from the report:
● In 2017, bad bots accounted for 21.8 per cent of all website traffic, a 9.5 per cent increase over the previous year. Good bots increased by 8.7 per cent to make up 20.4 per cent of all website traffic;
● For the first time, Russia became the most blocked country, with one in five companies (20.7 per cent) implementing country-specific IP block requests;
● Gambling companies and airlines suffer from higher proportions of bad bot traffic than other industries, with 53.1 per cent and 43.9 per cent of traffic coming from bad bots, respectively. Ecommerce, healthcare and ticketing websites suffer from highly sophisticated bots, which are difficult to detect;
● 83.2 per cent of bad bots report their user agent as a web browser such as Chrome, Firefox, Safari or Internet Explorer, while 10.4 per cent claim to come from mobile browsers such as Safari Mobile, Android or Opera;
● 82.7 per cent of bad bot traffic emanated from data centres in 2017, compared to 60.1 per cent in 2016. The availability and low cost of cloud computing explains the dominance of data centre use;
● 74 per cent of bad bot traffic is made up of moderate or sophisticated bots, which evade detection by distributing their attacks over multiple IP addresses, or simulating human behaviour such as mouse movements and mobile swipes; and
● Account takeover attacks occur two to three times per month on the average website, but immediately following a breach, they are three times more frequent, as bot operators know that people reuse the same credentials across multiple websites.
“A bot is any script or anything running that’s hitting your website or whatever web property you have,” says Roberts. “It is doing something for some reason that you might not be aware of. The classic example of a bot would be Google. Google in itself is a bot. It goes around the Internet and it scrapes everybody’s website in order to index the content so that you can find it on their search engine . . . It’s just an automated tool that goes around the web and hits every website. Of course, you want to allow that one. So that would be classified as a good bot.
“On the nefarious side, which is a bad bot, they go around doing all manner of things and ignoring the peril is the message we want to get across. They’re not just benign things hitting your web infrastructure. They are trying to do things. Like they could be competitors scraping your prices to beat you in the marketplace . . . They could be testing gift card balances . . . These things are just going around the web just looking to find things that they can take advantage of. So they do have a purpose.”
Distil, which is based in San Francisco, says bad bots are used by competitors, hackers and fraudsters and are the key culprits behind web scraping, brute force attacks, competitive data mining, online fraud, account hijacking, data theft, spam, digital ad fraud and downtime.
The findings of the report come from Distil’s newly launched Distil Research Lab, a team of analysts who examine the most sophisticated automated threats for some of the world’s most attacked websites.
“One of the most robust groups of companies that we do protect are ecommerce companies . . . Bots do different things in different industries. Ecommerce or retail sites are definitely suffering from some of the most dangerous and most sophisticated types of bots that we see,” explained Roberts.
Mario Toneguzzi, based in Calgary, has 37 years of experience as a daily newspaper writer, columnist and editor. He worked for 35 years at the Calgary Herald covering sports, crime, politics, health, city and breaking news, and business. For 12 years as a business writer, his main beats were commercial and residential real estate, retail, small business and general economic news. He now works on his own as a freelance writer and consultant in communications and media relations/training. Email: firstname.lastname@example.org.