With the launch of Google’s new Helpful Content Update, rankings and traffic are likely to shift over the next few weeks. So we thought it would be useful to open our SEO newsletter with a few tricks we’ve learned for quickly diagnosing why traffic is going down.
We’ll cover seven ways to figure out why your traffic dropped, and how to monitor for and fix future drops.
Most of the time, one of these seven things is to blame for a drop in organic traffic:
- Redesign and rebranding
- Updating a website without SEO oversight
- Content updates
- Changing the architecture of the site
- Domain migrations
- Google algorithm update
- Technical issues
Start by finding out what has changed on your site. Here are a few tips that may help you figure out why your traffic changed.
7 Hacks for Diagnosing Traffic Drops
- Use your GSC coverage report to spot trends
- Use your GSC coverage report to check for URL bloat
- Use GSC page experience, Core Web Vitals, and crawl stats Reports
- Compare Bing and Google traffic
- Use Archive.org to find changes
- Crawl the website
- Use automated tagging
Annotations or release notes in Google Analytics (GA) are a big help in figuring out what changed, but often there aren’t any, so we have to find other ways.
1. Use Your GSC Coverage Report to Spot Trends
Checking the coverage reports on Google Search Console (GSC) is a quick way to figure out what’s going on.
Look at the graphs on the right and note any patterns you see. Which graphs are increasing or decreasing?
For example, in this report we can see that the number of “noindex” pages has grown sharply. The next question is: “Does this have anything to do with the drop in traffic?” Maybe the site recently noindexed a batch of pages by mistake.
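If you export the coverage counts over time (e.g. weekly snapshots of the “Excluded by noindex” total), the same eyeball check can be scripted. Here’s a minimal sketch; the `weekly_noindex` numbers and the 1.5x threshold are made-up illustration values, not GSC defaults:

```python
def flag_spikes(counts, threshold=1.5):
    """Return the indices where a count grew more than `threshold`x week over week."""
    spikes = []
    for i in range(1, len(counts)):
        if counts[i - 1] > 0 and counts[i] / counts[i - 1] > threshold:
            spikes.append(i)
    return spikes

# Hypothetical weekly "Excluded by noindex" counts exported from GSC
weekly_noindex = [1200, 1250, 1300, 4800, 5100]
print(flag_spikes(weekly_noindex))  # week 3 jumped ~3.7x
```

A spike like the one at week 3 is exactly the kind of pattern worth lining up against your traffic graph.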
2. Use Your GSC Coverage Report to Check for URL Bloat
A Google Search Console coverage report can also reveal problems like URL bloat: adding a large number of pages with duplicate or low-quality content. URL bloat makes it harder for your most important pages to rank.
The graph above shows a site that published more than 100,000 URLs in the past few months, followed by a big drop in impressions.
We don’t have a definitive answer here, but it points you at what needs more research: the number of noindex URLs is going up while impressions are going down.
It’s possible Google wasn’t indexing the newly added pages because they were too similar to existing pages or too thin. It’s also possible the site noindexed some pages on purpose, causing the drop.
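The bloat pattern boils down to two trends moving in opposite directions. A rough heuristic, sketched below with hypothetical monthly exports and made-up thresholds (50% URL growth, 10% impression drop), could look like this:

```python
def pct_change(series):
    """Percent change from the first to the last value in a series."""
    return (series[-1] - series[0]) / series[0] * 100

def looks_like_url_bloat(indexed_urls, impressions, url_growth=50, impression_drop=-10):
    """Heuristic flag: URL count grew a lot while impressions fell."""
    return pct_change(indexed_urls) > url_growth and pct_change(impressions) < impression_drop

# Hypothetical monthly exports
urls = [40_000, 80_000, 150_000]
imps = [900_000, 700_000, 500_000]
print(looks_like_url_bloat(urls, imps))  # True: URLs up ~275%, impressions down ~44%
```

A True here doesn’t prove bloat caused the drop, but it tells you which sites in a portfolio deserve a closer look first.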
3. GSC Page Experience, Core Web Vitals, and Crawl Stats Reports
Rankings can be affected by big changes in performance, so check out these reports:
Core Web Vitals in Google Search Console
Based on how people actually use your pages, the Core Web Vitals report shows how well they work.
Page Experience in Google Search Console
The Page Experience report gives an overview of how people who visit your site use it.
Crawl Stats in Google Search Console
The Crawl Stats report tells you how many times Google has crawled your website and how often.
The orange line in this Crawl Stats report shows the average response time: how long it takes Googlebot, on average, to download a full page.
As the average response time goes up, the number of URLs crawled goes down. That’s not always a traffic killer, but treat it as a possible cause.
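You can quantify that inverse relationship by correlating the two Crawl Stats series. This sketch uses made-up daily numbers and a plain Pearson correlation from the standard library:

```python
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

resp_ms = [220, 310, 450, 600, 720]       # hypothetical avg response time (ms)
crawled = [9000, 7800, 6100, 4500, 3200]  # hypothetical URLs crawled per day
print(round(pearson(resp_ms, crawled), 2))  # strongly negative, near -1
```

A strongly negative correlation like this suggests slow responses are throttling Googlebot, which is worth raising with your hosting or dev team.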
Crawl stats can also surface hosting problems, which helps explain why certain subdomains of your site have struggled recently. For instance, they could be serving 500 errors or another problem that Google is flagging in the report.
The good thing about the GSC Page Experience, Core Web Vitals, and Crawl Stats reports is that they only take a minute or two to look over. So, they are a great way to quickly get a feel for the site and figure out what might be causing the drop in traffic.
4. Compare Bing and Google Traffic
Here’s a quick way to figure out if the drop is your fault or Google’s: look at the organic traffic data from Bing.
If traffic drops on Google but not on Bing, it’s probably Google’s fault.
If both Google and Bing show a drop in organic traffic, the cause is probably something you changed.
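The two-case rule above can be written down as a simple decision function. The thresholds (a 15% drop counts, Bing within 5% counts as flat) are made-up illustration values you’d tune to your own traffic volatility:

```python
def classify_drop(google_pct_change, bing_pct_change, drop=-15, tolerance=-5):
    """Rough rule of thumb: did the drop come from Google or from your site?
    Inputs are period-over-period percent changes in organic traffic."""
    if google_pct_change <= drop and bing_pct_change > tolerance:
        return "likely a Google-side change (algorithm update?)"
    if google_pct_change <= drop and bing_pct_change <= drop:
        return "likely a site-side change (affects both engines)"
    return "no clear signal"

print(classify_drop(-30, -2))   # Google dropped, Bing flat
print(classify_drop(-30, -28))  # both dropped
```

This is a triage heuristic, not proof: it just tells you which investigation to start first.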
Good news: problems you caused yourself are much easier to fix. You can trace what you changed and win your rankings back.
Bad news: if Google is behind the drop, you’ll need more research to find out what they changed and why it’s affecting you. That may call for big data analysis, which we’ll talk about in the automated tagging section below.
5. Look for Changes on Archive.org
Archive.org can be very helpful if you don’t keep records of past site changes (most people don’t). You can use it to view snapshots of every page and template on the site from before and after the traffic drop.
One of the best things about Archive is that it can go back years, while GSC can only show data from the last 16 months.
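A quick way to do the before/after comparison is to build Wayback Machine links directly, since snapshot URLs follow the public `https://web.archive.org/web/<timestamp>/<url>` format (the Wayback Machine redirects to the snapshot closest to the timestamp you give). The page and dates below are hypothetical:

```python
from datetime import date

def wayback_url(page_url, when):
    """Build a Wayback Machine URL targeting the snapshot nearest to `when`."""
    ts = when.strftime("%Y%m%d")
    return f"https://web.archive.org/web/{ts}/{page_url}"

# Compare a template before and after a suspected change date (hypothetical)
before = wayback_url("https://example.com/pricing", date(2022, 6, 1))
after = wayback_url("https://example.com/pricing", date(2022, 9, 1))
print(before)
print(after)
```

Open the two links side by side and diff the templates by eye: layout, internal links, headings, and meta tags are the usual suspects.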
6. Crawl the Website
You’ll need to crawl the website to find technical problems. Tools like Screaming Frog or Sitebulb can help with this.
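At their core, these crawlers fetch each page and follow its internal links. Here’s a stripped-down sketch of that link-discovery step using only the standard library, run against an inline HTML snippet so it works offline:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect href targets the way a crawler does when walking a site."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL
                    self.links.append(urljoin(self.base_url, value))

html = '<a href="/about">About</a> <a href="https://other.com/">Out</a>'
parser = LinkExtractor("https://example.com/")
parser.feed(html)
print(parser.links)  # ['https://example.com/about', 'https://other.com/']
```

A real crawl adds a fetch loop, a visited set, and checks for status codes, redirects, canonicals, and noindex tags, which is exactly what the dedicated tools report for you.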
7. Use Automated Tagging
You should use automated tagging if you aren’t already. This is the best choice if you have a large site or need to use big data to figure out which keywords and pages are causing the traffic drop.
With automated tagging for categories, goals, and page type, you can:
- Easily find patterns in traffic drops
- Better understand ongoing traffic
- Retain knowledge from past analyses
- Make it easier to predict the impact of future SEO projects
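In its simplest form, automated tagging is a set of URL-pattern rules applied to every page in your analytics export. The rules below are hypothetical examples; yours would mirror your own site architecture:

```python
import re

# Hypothetical tagging rules: URL path pattern -> page type
RULES = [
    (re.compile(r"^/blog/"), "blog"),
    (re.compile(r"^/products?/"), "product"),
    (re.compile(r"^/category/"), "category"),
]

def tag_page_type(path):
    """Return the first matching page-type tag for a URL path."""
    for pattern, tag in RULES:
        if pattern.search(path):
            return tag
    return "other"

paths = ["/blog/seo-tips", "/product/blue-widget", "/about"]
print([tag_page_type(p) for p in paths])  # ['blog', 'product', 'other']
```

Once every URL carries a tag, you can aggregate traffic by page type and see in one query whether a drop is concentrated in, say, blog posts versus product pages.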