Enough time has now passed since the September 2019 update to gauge roughly what it targeted and what changes you should make to regain your rankings before or during the next big core update.
What I present here are not individual fixes, as those depend on each website, but a concise approach that I have used in the past to recover multiple websites from ranking trouble.
Why ‘Leave No Stone Unturned’?
I will also talk about a ‘leave no stone unturned’ approach to SEO auditing to maximise your chances of recovering from a penalty. It is inspired heavily by Glenn Gabe and his American equivalent, the ‘kitchen sink’ approach (read more about it here). For some it’s a no-brainer that with multiple Google algorithms at play, you need to cover everything on a site and not just cherry-pick your fixes. However, I now use Glenn’s articles (linked above) and research to explain to fellow SEOs and/or clients that this is a valid approach: a Google update often penalises a site for a number of reasons, and if we carry out just one test, the chances of that one element being the only thing at play are minuscule.
I’ve been an SEO consultant in London for many years, and I have never seen as much turmoil in Google’s updates as during 2019.
In a nutshell: no one but Google knows definitively what exactly triggered the penalty. Google has stated that it looks for a big improvement in overall website quality. So by covering all the main aspects of SEO, rather than implementing only the one or two tests that seem most likely, we have the best chance of seeing a reversal of any penalty/filter that has been applied.
Why is it necessary to look at these updates ‘holistically’?
These big core updates should not be looked at as individual factors. We don’t get ‘Penguin’ or ‘Panda’ anymore – they are no longer single big changes to one factor.
These new core updates, much like the multiple smaller updates throughout the year, are collections of ranking adjustments, altogether designed to change the landscape.
Sure, you may say the ‘Medic’ update targeted one particular niche more than others; for me, it was targeting trust across the SERP landscape: the accuracy of the data presented, the depth of topic coverage, and (perhaps) the authority or trustworthiness of each individual author.
What does Google want from us?
Here are a few directions Google moves towards with their individual events/focus areas I’ll be discussing:
* Security: HTTPS as a ranking factor, and the recent announcement on blocking mixed (partially insecure) content
* Mobile experience: speed on mobile and smartphone crawler being used, prioritising mobile experience over desktop for rankings
* Trust: schema and author markup in multiple places as part of the algorithm, the knowledge graph taking over more and more, Q&A schema gaining prominence in the SERPs
The new Google core algorithm update from September 2019 should be looked at through the lens of these changes, as it is part of a continued process to bring us mobile-first, secure, trustworthy search results.
Do you have any case studies of a site making a recovery during the September 2019 Google Core update?
Yes, I do. As a freelance SEO consultant it is my job to record results and, after asking clients for permission, present them from my SEO campaigns. Below is one of the clients I worked with prior to September 2019:
Google Medic recovery in September 2019 Core Update.
As you can see, they experienced a drop back in September 2018, then a small, very short-lived recovery in March 2019 and May 2019, in line with other Google updates, followed by a slow decline.
What were the adjustments carried out to address the September 2019 Google Core update?
Here are the main themes of the updates carried out to improve the situation:
1 Improving navigation across the site, interlinking and parts of the architecture (there were some potential duplicates, and pages belonging to one section but sitting in a different folder due to the CMS).
2 Addressing intent vs search query – on multiple pages visitors were looking for one thing, say ‘apples’, and because our page was about ‘fruit’ in general, we believed we met the intent, but we never really talked about ‘apples’ specifically. Updates were made to address keywords and queries more appropriately and in more depth.
Here’s another example of this:
3 Using the ‘leave no stone unturned’ approach and improving overall website quality, tackling anything significant from the common ‘Google Quality Guidelines’ issues – site speed, content duplication, bad on-page experience, high bounce rate, etc.
What are the most important (in my opinion) elements to address on any site that is struggling post Medic from 2018 OR September 2019 Core update?
Here is a step by step process I use to review query-to-page relevancy and if the intent is met:
- Search Console: go to Performance, open the page affected by the drop and view the keywords it shows up for and which keywords lost performance (do this by using the date range comparison)
- Grab an example keyword or two (I always target the highest impression or/and highest clicks keywords) and Google it in the relevant locale. Then see what ranks, here are some questions I use to evaluate a query vs results:
- Is the query informational or purchase-driven?
- Has that intent changed since the update?
- Are the ranking pages targeting the keyword specifically on a page, or do they have a large hub page/huge guide that shows up purely due to authority, relevance, depth and quantity of content?
- Has our ranking page changed since the update? Why might that have occurred?
- Does your ranking page have as much copy, and does it cover the topic as well as the competitors’ ranking pages?
- What other elements may be leading the current top 3 rankings for that query, that your page may lack?
- Then review the page against that and make recommendations
- Repeat the process for top 10-20 keywords
- Set up tests to see which pages improve over time since these changes have been applied
- Then, if this works – great, you’ve got a valid formula; if it doesn’t – back to the drawing board: you may have missed something underlying OR, as we’ll cover in the next step, the ‘leave no stone unturned’ SEO approach hasn’t been used.
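To make the comparison step in the process above less manual, the triage can be sketched in a few lines of code. This is a minimal illustration only: the row format mimics a Search Console CSV export for two date ranges, and the function name and sample data are my own inventions, not part of any official Google API.

```python
# Minimal sketch: given Search Console performance rows exported for two
# date ranges (e.g. via the UI's CSV export), rank the queries that lost
# the most clicks for a page. The row shape here is an assumption.

def biggest_losing_queries(before, after, top_n=10):
    """Return (query, click_loss) pairs sorted by biggest loss.

    `before` / `after`: iterables of dicts like
        {"query": "apples", "clicks": 120, "impressions": 4000}
    """
    after_clicks = {row["query"]: row["clicks"] for row in after}
    deltas = []
    for row in before:
        query = row["query"]
        # A query missing from `after` lost all of its clicks.
        loss = row["clicks"] - after_clicks.get(query, 0)
        deltas.append((query, loss))
    # Largest losses first: these are the queries to Google manually
    # and compare against the pages that now rank.
    deltas.sort(key=lambda pair: pair[1], reverse=True)
    return deltas[:top_n]

# Invented sample data for the two periods:
before = [
    {"query": "apples", "clicks": 120, "impressions": 4000},
    {"query": "fruit guide", "clicks": 80, "impressions": 2500},
    {"query": "pears", "clicks": 10, "impressions": 300},
]
after = [
    {"query": "apples", "clicks": 30, "impressions": 3500},
    {"query": "fruit guide", "clicks": 75, "impressions": 2400},
]

print(biggest_losing_queries(before, after, top_n=2))
# → [('apples', 90), ('pears', 10)]
```

The output tells you which queries to re-check by hand in the relevant locale, using the intent questions listed above.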
What are some of the most common technical onsite issues that combined may cause problems with Google core updates?
- Content duplication (whether technical or content based)
- Having a site that is far too big, which ‘tires’ the indexing engines and delivers too many thin, content-poor or duplicate pages
- Bad mobile speed compared to top ranking competitors
- Worse mobile layout than desktop (since Google uses the mobile bot since roughly the time of Medic)
- Big percentage of pages with errors, compared to competitors
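As a small illustration of catching the first issue on that list (exact duplicates served from different URLs), here is a hedged sketch: it normalises each page’s text and hashes it, so URLs with identical bodies – such as CMS pages sitting in two folders – get grouped together. The function name and page data are invented for the example; a real audit would feed in a crawler’s output and also look for near-duplicates.

```python
# Sketch: flag URLs that serve byte-for-byte identical body text after
# simple normalisation. Page data below is invented for the example.
import hashlib

def find_duplicate_groups(pages):
    """Group URLs whose normalised body text hashes identically.

    `pages`: dict of url -> raw text content.
    Returns only the groups containing more than one URL.
    """
    groups = {}
    for url, text in pages.items():
        # Collapse whitespace and lowercase so trivial formatting
        # differences don't hide a duplicate.
        normalised = " ".join(text.lower().split())
        digest = hashlib.sha256(normalised.encode("utf-8")).hexdigest()
        groups.setdefault(digest, []).append(url)
    return [urls for urls in groups.values() if len(urls) > 1]

pages = {
    "/fruit/apples": "All about apples.",
    "/guides/apples": "All  about APPLES.",  # same body, different folder
    "/fruit/pears": "All about pears.",
}
print(find_duplicate_groups(pages))
# → [['/fruit/apples', '/guides/apples']]
```

Each group it returns is a candidate for consolidation: pick one canonical URL and redirect or canonicalise the rest.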
End notes
I hope this post and these ideas help you if you’ve been hit by one of the past Google updates. Feel free to get in touch if you’re a bit stuck on what to tackle next on your site – always happy to help.