Whilst the figures reported in Google Analytics are not completely reliable, they do give you a good idea of what’s going on with your website. When it comes to traffic, they’re a great way of understanding how much you are getting, where it’s coming from and the quality of that traffic.
Bots are an increasing problem, leading to noticeable skews in data. These bots send automated traffic to your website, which is at best of no value and at worst malicious. Bot visits bump up your traffic stats, making your site look more popular than it really is, while at the same time negatively affecting many other metrics.
Bots Skew Your Website Bounce & Conversion Rates
Your website bounce rate is an important performance indicator and can tell you several things. A low bounce rate can mean that people are engaged with your site and that you are attracting good quality, relevant traffic.
Conversely, a high bounce rate could mean that there is a problem with the quality of the traffic you are attracting, or that your website has serious aesthetic or usability issues that cause people to distrust it and leave immediately.
It is estimated that 56% of all website traffic comes from bots
Consider that for a moment. Potentially more than half of the visits to your website could be bots rather than potential customers; no wonder your conversion rates appear to be so low!
Bot traffic produces a 100% bounce rate, a visit duration of zero and, of course, no goal conversions – this in turn will skew your overall bounce and conversion rates, making your website’s performance look worse than it really is.
How To Filter Bot Traffic Out Of Google Analytics Reports
Prolific new bots are popping up all the time, and it is incredibly difficult to reliably guard against them all. Whilst you can block some or all bots from visiting your site at all via your .htaccess file, a good way of dealing with this problem is to filter bot visits out of Google Analytics using manual filters. Alternatively, as of July 2014, Google Analytics lets you exclude all known bots from your stats with a single click.
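For the .htaccess route, a minimal sketch might look like the following. It blocks requests whose referrer matches a known offender’s domain; the domains shown are purely illustrative, and mod_rewrite must be enabled on your Apache server.

```apache
# Illustrative sketch: return 403 Forbidden to requests whose
# Referer header matches known bot domains.
<IfModule mod_rewrite.c>
  RewriteEngine On
  # [NC] = case-insensitive, [OR] = match either condition
  RewriteCond %{HTTP_REFERER} buttons-for-website\.com [NC,OR]
  RewriteCond %{HTTP_REFERER} semalt\.com [NC]
  RewriteRule .* - [F,L]
</IfModule>
```

Note that this only stops bots that actually send a matching referrer header; it does nothing for bots that spoof or omit it, which is why the Google Analytics filters below are still worth setting up.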
It is worth noting that the ‘exclude all’ method alone has not consistently caught all bots in my experience. On all of my sites, I use both filtering methods: excluding all known bots at report view level and setting up individual filters for prolific offenders on a site-by-site basis.
Implementing both methods makes it less likely that you will see skewed figures in future, although be aware that your reported traffic may appear to go down once these exclusions are in place.
How To Exclude All Bots
- Log in to your Google Analytics account
- Go to ‘All web site data’
- Choose ‘View settings’
- Scroll to the bottom and you will see the ‘Bot Filtering’ heading and a tick box
- Tick the box to ‘exclude all known bots and spiders’
- Click save
Filtering Out Individual Bots
- Go to ‘Admin’
- Click on ‘Filters’, under ‘All Web Site Data’
- Click ‘+New Filter’
- Check ‘Create New Filter’
- Give the filter a name e.g. “ButtonsForWebsites BOT”
- Under Filter Type, choose ‘Custom’
- Select ‘Exclude’ and choose ‘Referral’ from the ‘Filter Field’ dropdown
- In ‘Filter Pattern’, enter the domain of the bot you want to block e.g. buttons-for-website.com
- Click save
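One thing to bear in mind with the steps above: Google Analytics filter patterns are regular expressions, so an unescaped dot in a domain will match any character. As a rough sketch (the referrer strings here are illustrative, not real report data), you can sanity-check a pattern in Python before saving the filter:

```python
import re

# GA filter patterns are regular expressions, so escape the dots in
# the domain. The referrers below are made-up examples.
pattern = re.compile(r"buttons-for-website\.com")

referrers = [
    "buttons-for-website.com",       # bot referrer: should match
    "www.buttons-for-website.com",   # subdomain variant: should match
    "example.com",                   # legitimate referrer: should not match
]

for ref in referrers:
    matched = bool(pattern.search(ref))
    print(ref, "-> excluded" if matched else "-> kept")
```

Testing the pattern this way is cheap insurance: an over-broad pattern in a Google Analytics filter silently throws away legitimate traffic, and filters apply from the moment they are saved.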
Common Bot Offenders To Look Out For
Spotting bot traffic is easy: look for referrers with a consistent 100% bounce rate. Chances are they are bots and should be filtered out.
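That check can be sketched in code. Assuming you have exported a referrer report as rows of (source, sessions, bounce rate) – the column layout, threshold and sample data here are illustrative, not a Google Analytics API – flagging suspects looks like this:

```python
# Sketch: flag referrers with a 100% bounce rate in an exported report.
# Rows are (source, sessions, bounce_rate); all values are illustrative.
def flag_suspect_referrers(rows, min_sessions=10):
    """Return sources with a 100% bounce rate over enough sessions
    that coincidence is unlikely."""
    return [
        source
        for source, sessions, bounce_rate in rows
        if sessions >= min_sessions and bounce_rate == 100.0
    ]

report = [
    ("google", 1200, 42.5),
    ("buttons-for-website.com", 300, 100.0),  # likely bot
    ("twitter.com", 8, 100.0),                # too few sessions to judge
]

print(flag_suspect_referrers(report))  # ['buttons-for-website.com']
```

The minimum-sessions guard matters: a handful of genuine visitors can legitimately all bounce, so only flag a referrer once the pattern holds across enough sessions.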
This article was syndicated from Business 2 Community: Why Bot Traffic Is Bad & How To Filter It From Analytics