If you stay on top of SEO news, you’re probably familiar with the dreaded algorithm update. But what does an algorithm update mean for your website? As Google learns more about how users interact with its search engine and discovers what searchers want, the development team behind the scenes constantly refines the platform to serve up results that are accurate and address the search intent. At its core, an algorithm update refers to changes in Google’s search ranking algorithm code, which in turn affects how sites are ranked in the search engine results page (SERP).
Depending on its scope, an algorithm update can have a significant effect on how pages are ranked, which is why every time a new update is rolled out, SEOs flock to measure site performance and determine how their sites were affected. This guide covers some of the major algorithm updates that have changed the way search works and explains how to adjust to these changes.
The way Google calculates search results boils down to relevance (how a site’s content addresses search intent) and authority (using backlinks to determine how a site relates to other sites). In the early days, SEO emerged as a way to “trick” search engines by exploiting these two ranking factors using link schemes, keyword stuffing, hidden text and links, and a variety of other tactics.
Google’s continual updates have refined the way the search engine determines relevance and authority without falling for these tricks. In fact, circumventing proper SEO practices results in ranking penalties from Google. The search engine regularly refreshes the way in which it displays search results and adds new features to make searches as easy and useful as possible for users. Today, SEO is about understanding how the search engine works and using best practices to rank in a way that is based on quality and relevance.
As soon as a new Google algorithm update is rolled out, SEO experts scramble to investigate what the change was and how it might affect site rankings and search results. This is because the majority of the work of SEO is to optimize websites for the current search engine guidelines and best practices, so as soon as Google changes them, it can dramatically alter a website’s position in the SERP. To make matters more complicated, most of the time Google does not explicitly outline what changes to expect from an algorithm update.
The effect of an algorithm update isn’t necessarily a bad one. The purpose of algorithm updates is to make the search engine better for users, and sometimes that makes it better for businesses as well, because they have a better chance to rank, depending on the type of update. What gets affected, from rankings to how results are displayed, depends on the update itself.
What can often frustrate SEO experts about algorithm updates is that sometimes they’re rolled out with little or no advance notice. Even when the updates are announced, SEOs have to dissect the announcement to parse out exactly how it will affect rankings. Either way, algorithm updates almost always mean that businesses have to do some work to understand how they affect them and how they can adjust their SEO tactics.
Google rolls out new algorithm updates regularly, and while some are nearly imperceptible to webmasters, others can have a big impact on rankings. The January 2020 Core Update impacted a variety of different types of websites, with a mix of organic traffic improvements for some and deindexing of spammy pages for others. Google doesn’t disclose the exact details of its updates, but SEO professionals have gathered insights into the types of sites that were affected.
Google announced the BERT Update at the end of October 2019, calling it the most important update in the past five years. This update is focused on using context to understand search queries made using natural, conversational language. With the rise of voice search, these types of queries are becoming more common, and a literal approach doesn’t always provide the best search results. A single word can have a variety of meanings in the English language, and BERT uses machine learning to help determine meaning using context provided by other words in the query.
For example, the word “bass” could refer to the type of fish, to the instrument, or to the lowest pitch in music. To understand what type of “bass” you’re referring to, the search engine uses context clues from the rest of your query, such as “best places to fish for bass” or “learn to play bass.” BERT allows Google to understand more complicated search queries where context is crucial to discerning which meaning of a word is being used.
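If you’re curious what “using context to determine meaning” looks like in practice, here is a rough sketch using the open-source bert-base-uncased model from the Hugging Face transformers library. This is only an illustration of the general technique, not Google’s ranking system, and it assumes the word “bass” appears as a single token in the model’s vocabulary.

```python
# Illustration only: contextual embeddings give the same word different vectors
# depending on the surrounding words (requires the transformers and torch packages).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embedding_of(word, query):
    """Return the contextual vector for `word` inside `query` (assumes `word` is one token)."""
    inputs = tokenizer(query, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

fishing = embedding_of("bass", "best places to fish for bass")
fishing_too = embedding_of("bass", "bass fishing spots near me")
music = embedding_of("bass", "learn to play bass")

cos = torch.nn.functional.cosine_similarity
print(cos(fishing, fishing_too, dim=0))  # closer together: both mean the fish
print(cos(fishing, music, dim=0))        # further apart: one means the instrument
```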
BERT is directed at understanding search queries rather than website content, but web pages that are poorly written could be affected if they don’t have a clear focus and structure. As the search engine gets better at understanding human language, it’s crucial for on-page SEO to use words carefully and correctly and to organize the content to ensure that meaning is clear.
Google has said that this update will affect up to 10% of search queries, which means it could have a huge impact. It’s also a major leap for the search engine in terms of understanding how searchers use language and providing more specific and targeted results.
The Diversity Update was rolled out in 2019 to offer a greater variety of websites in the search results. Often the search results are dominated by multiple pages from the same domain, and the aim of this update was to limit the spots in the top results to no more than two listings from the same site. This helps both the searchers by providing more options and businesses by giving them a chance to rank on the first page. There are, however, instances when the results will have more than two listings for one site if they’re relevant to the query, and the available data shows a limited impact so far.
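Google hasn’t published how the diversity limit is implemented, but the basic idea, keeping at most two listings per domain in the top results, is easy to picture with a toy sketch like this one (the URLs below are made up):

```python
from collections import defaultdict
from urllib.parse import urlparse

def diversify(results, max_per_domain=2):
    """Keep at most max_per_domain results from any one site, preserving order."""
    seen = defaultdict(int)
    kept = []
    for url in results:
        domain = urlparse(url).netloc
        if seen[domain] < max_per_domain:
            seen[domain] += 1
            kept.append(url)
    return kept

results = [
    "https://example.com/tours",
    "https://example.com/tours/kayaking",
    "https://example.com/blog/kayaking-tips",    # third listing from the same domain
    "https://another-operator.com/kayak-tours",
]
print(diversify(results))  # the third example.com page is dropped, making room for other sites
```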
There were a variety of core updates in 2019, announced and unannounced, and Google hasn’t always been clear on what exactly changed. Many of the changes have been aimed at the medical/health industry to ensure that the search engine serves up results from trustworthy sites that demonstrate expertise and authority on the topic. What businesses can learn from these updates is that Google is focusing on relevant, valuable, trustworthy content.
This update, launched in 2017, targets pages with thin content that seem to serve only to generate ad revenue. Google aims to give users relevant, valuable content to keep them coming back to the platform, and this update helps it avoid serving up blogs with thin content centered on ads or affiliate links.
To stay on top of what Google considers thin content, it’s important to review the Search Quality Rater Guidelines. As a general rule, make sure your pages actually provide valuable content written with the user in mind rather than just for the sake of ad revenue or to promote affiliate links.
Google’s Mobile Update, nicknamed Mobilegeddon, launched in 2015 and targeted any sites that were not optimized for mobile. With mobile emerging as the main device people use to search, Google caught up by ensuring that mobile-friendly pages rank at the top of mobile searches and that pages that are not optimized drop in the rankings.
Especially in the travel and tourism industry, the majority of searches happen on mobile, so it’s crucial to ensure that your site is mobile friendly. This not only affects your ranking but also the user’s experience.
What does mobile-friendly mean? In short, your pages should load quickly, display properly on small screens, and be easy to navigate by touch, without pinching or zooming.
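As a quick first pass, you can check whether a page declares a responsive viewport, one of the most basic mobile-friendly signals. The sketch below is only a rough heuristic with a placeholder URL; it’s no substitute for Google’s own mobile testing tools.

```python
import requests  # third-party package: pip install requests

def quick_mobile_check(url):
    """Very rough heuristic: does the page declare a responsive viewport?"""
    html = requests.get(url, timeout=10).text.lower()
    has_viewport = 'name="viewport"' in html         # responsive pages declare a viewport
    scales_to_device = "width=device-width" in html  # and usually scale to the device width
    return has_viewport and scales_to_device

# Placeholder URL; swap in one of your own pages.
print(quick_mobile_check("https://www.example.com"))
```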
Like Pigeon, Possum targets local searches. This update, launched in 2016, helps Google serve up relevant results based on the searcher’s location. The closer the searcher is to a business’s address, the more likely that business is to appear in the search results. To adjust for Possum, make sure you use specific local keywords and keep an eye on your local rank. It’s important to check your rank from your specific target location, or from a variety of locations, using a tool like Rank Tracker.
Pigeon was launched in 2014 and is related to local SEO. It targets poor on-page (elements on your site like title tags and meta descriptions) and off-page (listings in business directories and backlinks) SEO for local searches. While the local and core algorithms are separate, Pigeon brought them closer together so that traditional SEO tactics are now used in local searches.
To adjust for this, improve your on- and off-page local SEO by listing your business in local and major business directories and by using specific local keywords. Learn more about how to improve your local SEO in our guide on this topic.
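One widely used on-page technique that complements directory listings is adding schema.org LocalBusiness structured data, so search engines can read your name, address, and phone number unambiguously. The sketch below uses made-up business details; treat it as one common option rather than something this guide requires.

```python
import json

# Minimal schema.org LocalBusiness markup; every detail below is a made-up example.
business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Harbor Kayak Tours",
    "url": "https://www.example.com",
    "telephone": "+1-555-555-0123",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Marina Way",
        "addressLocality": "San Diego",
        "addressRegion": "CA",
        "postalCode": "92101",
    },
}

# Paste the printed JSON into your page inside a <script type="application/ld+json"> tag.
print(json.dumps(business, indent=2))
```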
Launched in 2013, Hummingbird uses natural language processing to interpret search queries and serve up results that not only match the keywords but also satisfy the search intent. This means pages that don’t have the exact keywords but address the query directly have an opportunity to rank. Of course, keywords are still important, but Hummingbird also takes into account synonyms and co-occurring terms to base searches on broader concepts. To adjust for Hummingbird, it’s important to expand keyword research to incorporate the general concepts associated with certain searches.
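To get a concrete feel for what “co-occurring terms” means, one simple approach is to count which words tend to appear near a seed keyword in content that already covers the topic. The snippet below is only a toy sketch with made-up example text, not a replacement for a keyword research tool.

```python
import re
from collections import Counter

# Made-up example text standing in for pages that already rank for the seed keyword.
pages = [
    "Book a guided kayak tour and paddle through the sea caves at sunset.",
    "Our kayak rentals include paddles, life jackets, and a safety briefing.",
    "Sunset kayak tours are the best way to see the coastline and sea caves.",
]

seed = "kayak"
window = 4  # how many words on each side of the seed to count

counts = Counter()
for page in pages:
    words = re.findall(r"[a-z']+", page.lower())
    for i, word in enumerate(words):
        if word == seed:
            # count the words that appear near the seed keyword
            for neighbor in words[max(0, i - window): i + window + 1]:
                if neighbor != seed:
                    counts[neighbor] += 1

# The most common neighbors suggest related terms to work into your content naturally.
print(counts.most_common(8))
```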
RankBrain is part of the Hummingbird algorithm and was launched in 2015. RankBrain is a machine learning system that helps Google discern the meaning behind a query, allowing the search engine to offer the best results in response. Google has said that RankBrain is the third-most important ranking factor, so don’t skip out on optimizing your content for relevance based on the keywords you’re targeting.
Launched in 2012 and incorporated into the core algorithm in 2016, Penguin targets spammy or irrelevant links that seem to be attempting to manipulate rankings. Penguin works in real time, meaning it’s constantly checking links and penalizing sites for manipulative backlinks. Sites can adjust by monitoring link growth through backlink checks using a tool like SEO SpyGlass.
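As a rough illustration of what monitoring your backlinks can look like, the sketch below assumes you’ve exported them to a hypothetical backlinks.csv file with source_domain and anchor_text columns (the exact export format depends on the tool you use) and flags a couple of patterns worth a closer look.

```python
import csv
from collections import Counter

# Hypothetical exact-match keyword anchors that often signal manipulative links.
suspect_anchors = {"cheap tours", "best tours", "book now"}

domains = Counter()
flagged = []

# Hypothetical export with "source_domain" and "anchor_text" columns.
with open("backlinks.csv", newline="") as f:
    for row in csv.DictReader(f):
        domains[row["source_domain"]] += 1
        if row["anchor_text"].strip().lower() in suspect_anchors:
            flagged.append(row)

# Many links from a single domain or lots of exact-match anchors are worth reviewing.
print("Domains sending the most links:", domains.most_common(5))
print("Links with exact-match keyword anchors:", len(flagged))
```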
Launched as a filter in 2011, Panda became part of the core algorithm in 2016. This update targets duplicate, plagiarized, or thin content, user-generated spam, and keyword stuffing. Panda assigns a quality score to pages, and if sites fall into any of these practices, that score will be low, in turn affecting the site’s ranking.
Panda rollouts are frequent, meaning sites often get penalized for these practices, but they can recover quickly by checking for content duplication, keyword stuffing, and thin content and fixing these issues. This can be determined by using an SEO crawler, and it can be avoided in the first place by writing quality content.
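To make the crawler idea concrete, here is a minimal sketch that spot-checks a couple of pages for thin content (a low visible word count) and for heavy vocabulary overlap between two pages. The URLs and the 300-word threshold are placeholders; a real SEO crawler is far more thorough.

```python
import re
import requests  # third-party package: pip install requests

# Placeholder URLs; swap in pages from your own site.
urls = [
    "https://example.com/tours/kayaking",
    "https://example.com/tours/kayak-rentals",
]

def visible_words(url):
    """Fetch a page and return its visible words, with scripts, styles, and tags stripped."""
    html = requests.get(url, timeout=10).text
    text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html, flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", " ", text)
    return re.findall(r"[A-Za-z']+", text.lower())

pages = {url: visible_words(url) for url in urls}

for url, words in pages.items():
    if len(words) < 300:  # rough thin-content threshold
        print(f"Possibly thin ({len(words)} words): {url}")

# Very rough duplicate check: how much vocabulary the two pages share.
a, b = (set(pages[url]) for url in urls)
overlap = len(a & b) / max(len(a | b), 1)
print(f"Vocabulary overlap between the two pages: {overlap:.0%}")
```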
Google is constantly evolving as it learns from users, resulting in these regular updates to its algorithm. To stay on top of these updates, it’s a good idea to subscribe to SEO blogs such as Moz and Search News You Can Use. For more tips on how to improve your SEO, head over to our SEO guides!