In this article, we'll take a look at a framework for diagnosing and addressing drops in organic search traffic. The framework covers validating the traffic drop, identifying what caused it, pinpointing which areas of the site were impacted, and investigating the potential reasons.
Websites often experience sudden drops in organic search traffic, and pinpointing the exact reasons behind them is not always straightforward.
The diagnosis may require significant time and effort, but this battle-tested framework covers all the bases and will guide you to the right areas to check.
1. Validate The Drop #
Keep calm and confirm it’s a real drop
Before you panic and set off any alarm bells, it’s crucial to confirm that there is indeed a significant drop, and it’s not just a reporting error or normal fluctuation.
Here are the steps to validate the drop:
1.1 Do you see the same organic traffic drop in both GSC and GA? #
Check if the drop is consistent in both Google Search Console (GSC) and Google Analytics (GA).
If GSC shows no traffic drop but GA does, it’s likely an issue with your GA/GTM tags or account settings rather than a real loss of traffic.
1.2 Is the drop only for organic, or also other channels? #
This is another check to do in Google Analytics (or another analytics platform): compare the drop in organic traffic with other channels.
A drop across multiple channels likely indicates a site-wide issue like a tracking issue or seasonal fluctuation, rather than an organic search-specific problem.
1.3 Is the drop caused by seasonality? #
Some industries experience natural fluctuations in traffic due to holidays, events, or seasonal trends.
Review historical data over the past 2-3 years (if such data is available) to identify any seasonal patterns.
Keep in mind, especially if you do international SEO, that some events shift dates annually (Ramadan, for example), so look for larger trends rather than comparing exact dates.
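If you keep historical exports from your analytics platform, a quick year-over-year pivot makes seasonal patterns easy to spot. Here is a minimal sketch, assuming a hypothetical `traffic.csv` export with `date` and `sessions` columns:

```python
# A minimal seasonality sketch; "traffic.csv" is a hypothetical analytics export
# with one row per day and columns `date` and `sessions`.
import pandas as pd

df = pd.read_csv("traffic.csv", parse_dates=["date"])
df["year"] = df["date"].dt.year
df["week"] = df["date"].dt.isocalendar().week

# Weeks as rows, years as columns: a dip recurring in the same weeks
# across years suggests seasonality rather than a genuine problem.
pivot = df.pivot_table(index="week", columns="year", values="sessions", aggfunc="sum")
print(pivot)
```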
2. What Caused the Drop? #
Once you’ve established a legitimate organic traffic drop using this validation process, it’s time to figure out the cause.
Organic traffic declines typically result from three main issues: drops in keyword ranking positions, a loss of keywords that were driving traffic, and fluctuations in click-through rates (CTRs).
By analysing Google Search Console data, we can pinpoint the main factors behind the decline. However, there’s a limitation: the GSC interface only allows you to export up to 1,000 rows. Fortunately, you can overcome this by using the Google Search Console API to extract more data.
The API allows you to extract all the data for keywords, pages, clicks, impressions, and CTR going back 16 months.
The easiest way to do this is via a Google Sheets add-on, such as Search Analytics for Sheets.
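If you prefer to script the extraction, here is a minimal Python sketch using the official Google API client. The property URL, date windows, and `credentials.json` service-account key are placeholder assumptions; the `before`/`after` lists it produces are reused in the later sketches in this article.

```python
# A minimal sketch for pulling more than 1,000 rows via the Search Console API.
# Assumes a service account key ("credentials.json") that has been granted
# access to the property; the site URL and date ranges are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE = "https://www.example.com/"  # hypothetical property
creds = service_account.Credentials.from_service_account_file(
    "credentials.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

def fetch_all(start_date, end_date):
    """Page through the API 25,000 rows at a time until it runs dry."""
    rows, start_row = [], 0
    while True:
        resp = service.searchanalytics().query(
            siteUrl=SITE,
            body={
                "startDate": start_date,
                "endDate": end_date,
                "dimensions": ["query", "page"],
                "rowLimit": 25000,
                "startRow": start_row,
            },
        ).execute()
        batch = resp.get("rows", [])
        rows.extend(batch)
        if len(batch) < 25000:
            return rows
        start_row += 25000

before = fetch_all("2024-01-01", "2024-01-31")  # hypothetical pre-drop window
after = fetch_all("2024-02-01", "2024-02-29")   # hypothetical post-drop window
```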
2.1. Was there a drop in keyword ranking positions? #
If you’ve dropped in keyword rankings, that’s going to result in a drop in organic traffic.
To check this, use the Search Console data: extract all the keywords sending clicks, along with their average position before and after the drop. Then compare the average positions of the same keywords to see if and how they changed.
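To make the comparison concrete, here is a pandas sketch assuming the `before`/`after` row lists from the API sketch above; the `to_frame` helper defined here is reused in later sketches.

```python
# A sketch comparing average position per keyword before vs after the drop.
import pandas as pd

def to_frame(rows):
    """Flatten API rows (keys = [query, page]) into a DataFrame."""
    return pd.DataFrame(
        [
            {
                "query": r["keys"][0],
                "page": r["keys"][1],
                "clicks": r["clicks"],
                "impressions": r["impressions"],
                "ctr": r["ctr"],
                "position": r["position"],
            }
            for r in rows
        ]
    )

pos = pd.merge(
    to_frame(before).groupby("query", as_index=False)["position"].mean(),
    to_frame(after).groupby("query", as_index=False)["position"].mean(),
    on="query",
    suffixes=("_before", "_after"),
)
pos["delta"] = pos["position_after"] - pos["position_before"]
# A positive delta means the keyword slipped down the results.
print(pos.sort_values("delta", ascending=False).head(20))
```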
2.2. Was there a drop in the total number of keywords ranking? #
Even if average positions seem stable, a decrease in the number of ranking keywords can impact traffic.
Sometimes, the main keywords might stay the same, but many long-tail keywords may have been lost.
To check this, use the Google Search Console data collected above: total up the number of keywords sending traffic before vs after the drop, then compare the totals to see if there has been a decrease.
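Continuing with the same frames, a short sketch to total and diff the keyword counts:

```python
# Counts of keywords that actually sent clicks in each window.
before_kw = set(to_frame(before).query("clicks > 0")["query"])
after_kw = set(to_frame(after).query("clicks > 0")["query"])
print(f"Keywords with clicks before: {len(before_kw)}, after: {len(after_kw)}")

# Long-tail terms that stopped sending traffic entirely.
lost = before_kw - after_kw
print(f"Lost keywords: {len(lost)}")
```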
2.3. Was there a drop in CTR? #
If rankings and keyword counts appear normal, a shift in CTRs for your ranking terms may be responsible.
Here, collect the keywords and the pages where a significant CTR change has been noted; we will use this data for further investigation in the next steps.
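A sketch for flagging notable CTR shifts; the 25% relative-change threshold and the 100-impression floor are arbitrary assumptions you should tune to your data.

```python
# Flag query/page pairs whose CTR moved significantly between the two windows.
b = to_frame(before).groupby(["query", "page"], as_index=False).agg(
    ctr_before=("ctr", "mean"), imp_before=("impressions", "sum")
)
a = to_frame(after).groupby(["query", "page"], as_index=False).agg(
    ctr_after=("ctr", "mean")
)
ctr = b.merge(a, on=["query", "page"])

# Ignore thin data and avoid dividing by zero.
ctr = ctr[(ctr["imp_before"] >= 100) & (ctr["ctr_before"] > 0)]
ctr["rel_change"] = (ctr["ctr_after"] - ctr["ctr_before"]) / ctr["ctr_before"]

flagged = ctr[ctr["rel_change"].abs() > 0.25]
print(flagged.sort_values("rel_change").head(20))
```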
3. Where Was The Drop? #
Not all traffic drops affect an entire site equally. Identifying the specific sections, categories, or content types hit hardest enables a targeted diagnosis and remediation plan.
3.1. Was it category/page type specific, or across the entire site? #
Using the Search Console data collected in the previous step, segment the data by the different areas or page types of the site.
This should determine whether the entire site was affected equally or if the drop focused on certain areas of the site.
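One quick way to segment is by the first URL path segment, which works if your site sections are visible in the URL; adjust the hypothetical `section` helper to your own structure.

```python
# Sum clicks by site section (first URL path segment, e.g. /blog/, /shop/).
from urllib.parse import urlparse

def section(url):
    parts = urlparse(url).path.strip("/").split("/")
    return parts[0] if parts and parts[0] else "(root)"

for label, rows in [("before", before), ("after", after)]:
    df = to_frame(rows)
    df["section"] = df["page"].map(section)
    print(label)
    print(df.groupby("section")["clicks"].sum().sort_values(ascending=False))
```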
3.2. Did the pages drop rankings for all search terms or only some? #
Look at the keywords that caused the drop in traffic and the pages they led to, then check the other keywords ranking for those same pages. Examine whether the majority of keywords for those pages also dropped in rankings, or whether only a small selection of the keywords sending traffic to each page was affected.
This should determine if the drop in rankings is widespread across all keywords for a page or limited to just a few.
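A sketch of that check, reusing `to_frame`; the three-position drop threshold is an arbitrary assumption.

```python
# For each page, what share of its keywords dropped meaningfully?
merged = pd.merge(
    to_frame(before), to_frame(after),
    on=["query", "page"], suffixes=("_before", "_after"),
)
merged["dropped"] = merged["position_after"] > merged["position_before"] + 3

share = merged.groupby("page")["dropped"].mean().sort_values(ascending=False)
# Near 1.0 = a page-wide drop; near 0 = only a few keywords affected.
print(share.head(20))
```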
4. Why Was There a Drop? #
OK: we’ve validated that there is an issue with organic traffic, we’ve gathered data on what caused the drop, and we know where on the site it happened. Now it’s time to understand why, so we can find actions to remedy the situation.
Your investigation into the underlying reasons will differ based on whether a recent Google algorithm update is suspected or if the issue appears site-side.
👉 Here are some resources to check for confirmed or suspected Google algorithm updates:
- @searchliaison is the official 𝕏 account Google uses to share news and insights. When a core update or another specific update launches, it is announced here.
- Seroundtable.com is a website that reports on the goings-on in the world of search. The site covers all Google updates, announced or not.
- SERP fluctuation trackers. Many tools attempt to track changes in Google’s daily search results by monitoring a database of keywords; significant fluctuations in rankings can indicate a potential update.
No evidence of a Google update #
The first five checks apply when you don’t believe there has been a Google update.
In that case, the traffic drop is most likely caused by a change to page content or some technical issue on the site.
So, the following steps start with some quick sanity checks, then move into deeper technical checks.
4.1 Are the affected pages still indexable and indexed? #
Check indexability
This is a sanity check that can be automated relatively easily.
Once we have a list of pages affected from steps 2 and 3, we want to quickly check if these pages can still be indexed.
For a small number of pages you can check manually, but the easiest way is to use Screaming Frog in list mode: paste in all the URLs and gather all the elements we want to check (a scripted alternative follows the list below).
The checks are:
- URL is 200 status. It is not a 404, soft 404, 5xx, or some other status code.
- URL does not redirect to a different URL
- URL is not set to noindex
- URL is not blocked in robots.txt
- URL is not canonicalised to a different URL (the canonical tag is self-referencing, or the page has no canonical tag)
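If you’d rather spot-check these conditions with a script, here is a rough Python sketch. It is not a replacement for a proper crawler: the regexes are deliberately simple, and the URL list and domain are placeholders.

```python
# A rough scripted spot-check of the five indexability conditions above.
import re
import requests
from urllib import robotparser

urls = ["https://www.example.com/some-page/"]  # hypothetical affected URLs

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

for url in urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    issues = []
    if resp.history:
        issues.append(f"redirects to {resp.url}")
    if resp.status_code != 200:
        issues.append(f"status {resp.status_code}")
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        issues.append("noindex in X-Robots-Tag header")
    if re.search(r'<meta[^>]+name=["\']robots["\'][^>]+noindex', resp.text, re.I):
        issues.append("noindex meta tag")
    m = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', resp.text, re.I
    )
    if m and m.group(1).rstrip("/") != url.rstrip("/"):
        issues.append(f"canonicalised to {m.group(1)}")
    if not rp.can_fetch("Googlebot", url):
        issues.append("blocked by robots.txt")
    print(url, issues or "OK")
```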
Additionally, even if they are still indexable, they may no longer be indexed in Google.
Again using Screaming Frog in list mode, plug in the GSC account for the site and enable the URL Inspection setting.
This will pull the status of the URLs and tell you if they are currently indexed in Google.
The API is limited to 2,000 URLs per day, so you may have to work in chunks if checking a large number of URLs.
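If you’re scripting instead, the same check is available via the URL Inspection API, subject to the same quota; this sketch reuses the `service` object and `urls` list from the earlier examples.

```python
# Query the URL Inspection API for each affected URL's index status.
for url in urls:
    result = service.urlInspection().index().inspect(
        body={"inspectionUrl": url, "siteUrl": SITE}
    ).execute()
    status = result["inspectionResult"]["indexStatusResult"]
    # e.g. verdict "PASS" with coverageState "Submitted and indexed"
    print(url, status.get("verdict"), status.get("coverageState"))
```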
4.2 Are the affected pages’ titles and content still the same as pre-drop? #
Check for page changes
Another sanity check is to see if the pages’ titles and content changed around the time of the drop.
A title change can significantly impact rankings and click-through rates (CTR). Check whether titles have changed, and bear in mind Google’s tendency to rewrite titles in SERPs.
A content change is one of the most obvious reasons for a drop in traffic. These could be pages that have been refreshed or re-optimised for SEO, or a bug may have changed the content.
If it’s not obvious to you whether the pages have changed, try the following:
- If using WordPress, you may be able to check via the revision history.
- If you have old crawls saved in a site crawler, use those.
- You may be able to find an older version of the page at https://web.archive.org/ (see the sketch after this list).
- Try the Google cache if it hasn’t been updated.
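For the Wayback Machine option, its availability API can be queried directly; the URL and timestamp below are placeholders.

```python
# Ask the Wayback Machine for the snapshot closest to a pre-drop date.
import requests

resp = requests.get(
    "https://archive.org/wayback/available",
    params={
        "url": "https://www.example.com/some-page/",  # hypothetical page
        "timestamp": "20240201",                      # hypothetical pre-drop date
    },
    timeout=10,
)
snapshot = resp.json().get("archived_snapshots", {}).get("closest")
if snapshot:
    print("Closest snapshot:", snapshot["url"], snapshot["timestamp"])
else:
    print("No snapshot found")
```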
4.3 Did anything else change on the site? #
Audit recent site changes
Many other changes to a website can affect SEO performance, such as URL structure, internal linking, design, layout, CMS updates, frontend changes, and more.
Get a full list of all changes that happened just before the drop, then check their potential impact on SEO performance.
4.4 Any new or increase in issues in GSC? #
Look for errors in GSC reports
In Google Search Console, check the different reports (such as the Coverage, Sitemaps, Page Experience, Core Web Vitals, and Mobile Usability reports) to identify any issues with the affected URLs.
4.5 Audit the site for technical issues #
Conduct a technical SEO audit
I’d leave this for last, as it’s the task that will take the longest to do properly.
If everything else seems fine and there is no evidence of a Google update, a technical issue has possibly caused the drop.
Here are some great resources to help with technical SEO audits:
- Free checklists and resources: https://seosly.com/blog/seo-audits/
- Free Technical SEO Course: https://www.bluearrayacademy.com/courses/technical-seo-certification
- Paid Technical SEO Course: https://marketingsyrup.com/
Evidence of a Google update #
The next checks are generally used when it appears a Google update happened during the drop in traffic, either announced or suspected from fluctuations reported in the industry or by tracking tools.
Changes caused by Google’s core updates are often based on the perceived quality of the page’s content.
A drop in rankings due to a core update is less likely to be caused by pre-existing technical issues on your site. Instead, the focus should be on analysing your content, keywords, and competitors.
So, the following checks are designed to help you understand how Google’s algorithm perceives the quality of your content in relation to the search intent and compared to your competitors.
4.6 Does the intent of the dropped keyword/s match the content of the page? #
Assessing Keyword Intent Match
Behind every search there is an intent.
During core updates, Google will sometimes do ‘intent shifting’, where it changes what it thinks people want to find in the SERP result when they use certain keywords or search queries.
So, look at the keywords and pages that have dropped, analyse the current SERPs, and see whether the intent Google now perceives behind those searches still matches what your page offers.
4.7 Does the page content of affected pages answer the question the dropped keyword/s asks? #
Evaluating Content Quality
Here you need to spot-check keywords and pages that have dropped and assess whether the content really answers the question the keyword asks.
Try to be impartial and think like a user landing on the page: if you had searched for it, would you be happy with this result? Does it satisfy you?
Ensure that the content is high-quality, relevant, and follows Google’s guidelines.
4.8 Who is ranking above us now that we have dropped? #
Identifying Better-Ranking Competitors
SEO is generally a zero-sum game: if one site rises in the rankings, other sites must drop at the same time.
Using the dropped keywords collected earlier, look at the SERPs and identify the pages now ranking above yours. Collect a sample of these pages for the analysis in the next step.
4.9 Do the pages ranking above answer the question better, have higher quality or a better experience than your page? #
Comparing Content Quality and User Experience
With the pages collected in the previous step, it’s time to analyse our content against the competitors’ to understand whether theirs better answers the question, is higher quality, or offers a better user experience.
4.10 Do some keywords or pages improve or stay stable during Google updates? #
Winners vs losers
During Google updates, you will sometimes find that although some keywords/pages have dropped, other ones may be stable or have even increased.
This is because rankings are largely assessed on a per-page basis, rather than by penalising an entire site (although site-wide penalties can happen in some circumstances, such as a backlink penalty).
From the analysis we did in part 2, gather any keywords/pages that were stable or increased, and analyse them to try and uncover why (see the sketch below).
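A sketch splitting queries into winners and losers, reusing the merged before/after frame from the sketch in section 3.2; the one-position thresholds are arbitrary assumptions.

```python
# Split queries into winners and losers across the update window.
merged["delta"] = merged["position_after"] - merged["position_before"]
winners = merged[merged["delta"] <= -1]  # improved by a position or more
losers = merged[merged["delta"] >= 1]    # slipped by a position or more

print(f"{len(winners)} winners vs {len(losers)} losers")
print(winners.sort_values("delta").head(10)[["query", "page", "delta"]])
```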
Let me know if you have any comments or suggestions to improve the process.