Heat Level: Medium: These tips require some experience.
Bottom Line: Bots and internal traffic (aka you) skew your Analytics data, which can make your stats useless.
Do This: Set up a filtered view to ensure you’re only looking at real traffic from actual people, and excluding:
Internal traffic
Spammy referral traffic that didn’t actually go to your website
Bot and spider traffic
As a Google Analytics user, you know how incredibly valuable all of your data can be. However, sometimes the data can look a little funky. At some point you’ll probably see a big spike of 1-second pageviews. You might notice a lot of sessions from Siberia. You might see a bunch of referrals from strange-looking URLs. You might realize that by spending so much time working on your own website, you’re skewing your own data.
Now that you know how to review your Analytics data, you want to make sure you’re getting the real picture. For all of these reasons, you need to set up a filtered view in Google Analytics.
First, a “view” is the level in an Analytics account where you can see reports and analysis tools. You automatically get an unfiltered view when you set up a property.
A “filter” is a way of removing certain data from your reports. These are some things that can skew your data: bots, spiders and yourself!
Bots and spiders are computer programs that crawl the web and send traffic to sites. Some are totally harmless (like Google’s own crawlers). Others are referral spam (like semalt.com and buttons-for-website.com) that try to trick you into thinking they’re sending real traffic to your site. Don’t worry, your site isn’t infected or hacked by the Russians. Still, bots aren’t real people, so you don’t want to count them in your data.
You and your office can also skew your own data. You and your team are constantly on your own website checking out listings or writing blogs. Do you really want your own traffic to be tracked? Probably not.
So to make sure you’re only seeing stats from real humans who aren’t you or your agents, you’ll want to filter your data. You’ll need to create a Filtered View to make this happen.
We always recommend that our clients keep a Raw View of all data so you don’t lose anything.
Your default view already has all of your data. We like to rename it Raw View because the default name (www.yourwebsite.com) isn’t very descriptive.
In the left-hand column, click the Admin gear icon.
In the right-hand column, click “View Settings.”
In the View Name field, type Raw View.
Check the box to exclude known bots and spiders.
Click Save.
Now, you can create the filtered view. This will show only the data from real people who aren’t on your staff. We usually name this our Reporting View.
In the left-hand column, click the Admin gear icon.
In the right-hand column, click the blue +Create View button.
Name this view “Reporting View” and set it to your local timezone.
Check the box to exclude known bots and spiders.
Click Create View.
Your Reporting View will need to have filters added to strip out bots and internal traffic. Make sure you are in the Reporting View when doing this:
In the left-hand column, click the Admin gear icon.
In the right-hand column, click Filters.
Click the orange +Add Filter button.
Here are the three filters we use most often:
Filter Name: Something specific like “Pittsburgh Office IP Filter” or “Smith Home IP Filter”
Filter Type: Exclude
Select Source or Destination: traffic from the IP addresses
Select Expression: that are equal to
IP Address: put your IP address in the textbox (Google “what is my IP address” to find yours)
You can set up as many IP filters as you need to strip out your office network, your home network, your agents’ home networks, your assistants, your web manager, etc.
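Conceptually, the IP exclude filter above just drops any hit whose source IP matches your list before it reaches your reports. Here’s a minimal Python sketch of that behavior — the IP addresses are made-up example values, not anyone’s real addresses:

```python
# Hypothetical list of your own IPs (office, home, agents), example values only.
EXCLUDED_IPS = {"203.0.113.10", "203.0.113.11"}

# A few sample hits as they might arrive at Analytics.
hits = [
    {"ip": "203.0.113.10", "page": "/listings"},  # you, at the office
    {"ip": "198.51.100.7", "page": "/listings"},  # a real visitor
    {"ip": "203.0.113.11", "page": "/blog"},      # your home network
]

# The exclude filter keeps only hits from IPs NOT on your list.
reported = [h for h in hits if h["ip"] not in EXCLUDED_IPS]
print(len(reported))  # 1
```

Each extra IP filter you create in Analytics is effectively another address added to that exclusion set.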
Filter Name: Hostname Filter
Filter Type: Include only
Select Source or Destination: traffic to the hostname
Select Expression: that contain
Hostname: yourwebsite.com (don’t include www.)
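The hostname filter works the other way around: it’s an include filter, so only hits whose hostname contains your domain survive. Ghost referral spam often reports a fake or missing hostname, which is why this filter catches it. A rough Python sketch, with yourwebsite.com standing in for your actual domain:

```python
import re

# "Include only traffic to the hostname that contains yourwebsite.com"
# amounts to a substring/regex match on each hit's reported hostname.
HOSTNAME_PATTERN = re.compile(r"yourwebsite\.com")

hostnames = [
    "www.yourwebsite.com",    # your real site: kept
    "blog.yourwebsite.com",   # a subdomain: still kept, since it contains the domain
    "semalt.com",             # ghost spam reporting a fake hostname: dropped
    "random-spam-site.com",   # another bogus hostname: dropped
]

kept = [h for h in hostnames if HOSTNAME_PATTERN.search(h)]
print(kept)  # ['www.yourwebsite.com', 'blog.yourwebsite.com']
```

Leaving off the www. is what makes subdomains match too, since the filter checks for “contains” rather than an exact match.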
First, make sure you checked “Exclude all hits from known bots and spiders” from both your Reporting and Raw views. This is found on the settings page for each view rather than the filters page.
Then, on the Reporting View, set up:
Filter Name: Bot Exclude
Filter Type: Custom
Exclude: Campaign Source
Filter Pattern (paste this exactly): semalt|anticrawler|best-seo-offer|best-seo-solution|buttons-for-website|buttons-for-your-website|7makemoneyonline|-musicas*-gratis|kambasoft|savetubevideo|ranksonic|medispainstitute|offers.bycontext|100dollars-seo|sitevaluation|dailyrank
Note: this pattern includes many of the top offenders. This article contains a few more bot filtering expressions. Here’s how to identify bots that are targeting you and create your own bot filtering expression.
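Google Analytics treats that filter pattern as a regular expression, so each `|` means “or” and any campaign source matching one of the alternatives gets excluded. You can sanity-check the pattern yourself in Python before pasting it in:

```python
import re

# The exact pattern from the filter above, split across lines for readability.
PATTERN = (
    "semalt|anticrawler|best-seo-offer|best-seo-solution|buttons-for-website"
    "|buttons-for-your-website|7makemoneyonline|-musicas*-gratis|kambasoft"
    "|savetubevideo|ranksonic|medispainstitute|offers.bycontext|100dollars-seo"
    "|sitevaluation|dailyrank"
)
bot_regex = re.compile(PATTERN)

# Sample campaign sources: the spam sources match, real ones don't.
sources = ["semalt.com", "google", "buttons-for-website.com", "facebook.com"]
excluded = [s for s in sources if bot_regex.search(s)]
print(excluded)  # ['semalt.com', 'buttons-for-website.com']
```

If you build your own pattern, test it the same way — one stray `|` at the end would match every source and wipe out all your data.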
We recently added a lowercase filter to our standard set. This makes sure Google knows that www.yourwebsite.com and www.YourWebsite.com are one and the same. Here's how to set up the lowercase filter.
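Under the hood, the lowercase filter simply rewrites the chosen field (hostname, in this example) to lowercase before reporting, so case variants collapse into a single row. A quick illustration:

```python
# Three case variants of the same hostname, as hits might report them.
raw_hostnames = ["www.yourwebsite.com", "www.YourWebsite.com", "WWW.YOURWEBSITE.COM"]

# The lowercase filter normalizes the field, so Analytics sees one hostname.
lowercased = [h.lower() for h in raw_hostnames]
print(set(lowercased))  # {'www.yourwebsite.com'}
```

Without it, your reports would split the same page’s traffic across multiple rows that differ only in capitalization.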
With these three filters in place, you’ll be looking at only real data from real people (your team excluded). When you log into Analytics, make sure you’re looking at your Reporting View. It’ll give you a truer sense of what’s happening.
Keep in mind that filters are only applied going forward, and you can’t filter past data. So the sooner you apply these filters, the better.
If you’re not seeing any data in this view after 24-48 hours, you may have set up one of the filters incorrectly. Make sure you followed the instructions above to a T. Still having trouble? Shoot us an email and we can talk you through it.