Having a website that requires maintenance, optimisation and constant changes can be hard work – but not knowing why your website isn’t performing as you anticipated can be even harder. Despite optimising to your heart’s content, there may be other reasons why your website isn’t doing as well as you’d like – and it could have something to do with Google’s infamous algorithms and their penalties.
If you’re not sure what Google’s algorithms consist of, what their purpose is or what they look for, we’ve got the answers! This article will assist you in understanding each algorithm, what it looks for on your site and how you can avoid being penalised!
The Panda Algorithm
Panda was the first major algorithm, launched on February 23rd 2011. With Panda, Google aimed to show higher-quality sites in more prominent places in search results, and to demote those of lower quality. The algorithm was officially named “Panda” after one of its creators, Navneet Panda.
Panda was originally known as the “Farmer” update, as it seemed to affect websites known as content farms. Content farms are websites that accumulate information from a range of sources, in some cases stealing that information from other websites, creating a huge number of pages. The main purpose of these content farms is to rank for a wide variety of keywords in Google. Therefore, Panda’s main purpose was to penalise sites based on the quality of their on-site content.
Panda has been shown to impact rankings site-wide for most of the websites affected by the algorithm. This means it does not just reduce the rankings of certain pages within Google, but rather considers your entire site to be of low quality. Despite this, there have been some cases where Panda affects just a section of your site, such as a news blog or a sub-domain.
What does Google deem as Quality Content?
When considering content on your site, Google have released a checklist that you can use to determine whether or not the content on your site is of high quality. Check out the list below:
- Would you trust the information from within this article?
- Is this piece of content written by an expert, or qualified professional, who knows the topic well?
- Does the site have duplicate, similar or redundant articles on the same topics with slightly different keyword variations?
- Would you feel comfortable providing your credit card information to this site?
- Does the article have spelling, stylistic or factual errors?
- Are the topics of this article driven by genuine interests of readers who come to the site, or does the site generate content based on the need to rank for specific keywords?
- Does this article provide original content, reporting, research and analysis?
- Does this page provide value when compared to the other articles shown within search results?
- How much quality control is done on your content? How many people have proofread this and checked your facts?
- Does your article show two sides of the argument or story?
- Is the site a recognised authority on its topic?
- Is the content mass-produced or outsourced to a wide range of creators, spread across a large network of websites, so that individual pages don’t get as much attention and care?
- Was the article edited well? Does it seem hastily produced?
- For a health-related query, would you trust information from this site?
- Would you recognise this site as an authority when its name is mentioned?
- Does this article provide a complete or comprehensive description of the topic?
- Does this article contain insightful and interesting information that is beyond obvious?
- Is this the sort of page you’d want to bookmark or share with a friend?
- Does this article have an excessive amount of advertising that distracts from or interferes with the main content?
- Would you expect to see this article in a printed publication?
- Are the articles short, unsubstantial or lacking in helpful specifics?
- Are the pages produced with care and attention to detail, or with less attention to detail?
- Would users complain when they see pages from this site?
Whilst this list is significant in size, these questions are important to ask when you’re looking at the content featured on your site. It’s not that Google necessarily tries to figure out the answers to each question; rather, these are the questions a real-life user would use to rate the quality of your site. Let’s not forget that Google aims to bring the best experience to its users – so it wants to give the highest ranking to the site that provides the most relevant and insightful content.
Whilst we will never know the ins and outs of Google’s algorithms, we do know that there are a few factors that widely influence Panda.
Panda Factors: Thin Content
“Thin content” refers to a page that adds little or no value to the person reading it. It doesn’t mean a page must contain a certain number of words, but more often than not, pages with very few words are not super helpful! If you have a lot of pages on your site that contain only a few sentences, and those pages are all indexed by Google, then the Panda algorithm may decide that most of your indexed pages are of low quality.
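To make auditing for this concrete, here is a minimal Python sketch that flags pages whose visible text falls below a word-count threshold. Both the 300-word cutoff and the `pages` data are our own assumptions for illustration – Google has never published an official word count that defines thin content.

```python
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collects the visible text chunks from an HTML document."""

    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)


def word_count(html):
    """Count the words in the visible text of an HTML page."""
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks).split())


def find_thin_pages(pages, min_words=300):
    """Return URLs of pages whose visible text is shorter than min_words.

    `pages` maps URL -> HTML source. The 300-word default is an
    arbitrary rule of thumb, not a figure Google has confirmed.
    """
    return [url for url, html in pages.items() if word_count(html) < min_words]


# Hypothetical example: one thin page, one substantial page.
pages = {
    "/about": "<html><body><p>We sell shoes.</p></body></html>",
    "/guide": "<html><body><p>" + "word " * 400 + "</p></body></html>",
}
print(find_thin_pages(pages))  # → ['/about']
```

Pages flagged by a check like this are candidates for expansion, consolidation or a noindex tag – word count alone won’t tell you whether a page is genuinely useful.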
Tip: ensure all your pages serve a specific value to your user, and provide insightful information in more than just one or two sentences.
Panda Factors: Duplicate Content
Duplicate content can cause multiple problems for your site, resulting in it being seen as low quality by Panda. The first is when your site has a large amount of content copied across from other sources on the internet. If you source a lot of your informative content from another site, Google can easily figure out that you are not the creator of this content. It will therefore see that a large portion of your site is made up of other sites’ information – causing Panda to look at your site unfavourably.
Tip: create unique and original content for blog articles and category based content.
Problems can also occur with duplicated content across your own site. One example comes from product pages – if you have multiple pages for the same product type, but with slight variations such as colour or size, then you will essentially have a large number of pages for the same product which are identical in Google’s eyes. The search engine therefore sees a large proportion of your site as duplicated, and will penalise you accordingly.
Tip: using canonical tags, you can set a “preferred” URL to point to, so Google doesn’t index all of these variant pages. We’ve got a great guide on canonical tags here.
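As a concrete sketch (the URLs here are made up for illustration), a canonical tag is a single line in the `<head>` of each variant page pointing to the version you want indexed:

```html
<!-- On each variant page, e.g. /t-shirt?colour=red or /t-shirt?size=m -->
<head>
  <link rel="canonical" href="https://www.example.com/t-shirt" />
</head>
```

With this in place, Google consolidates the variants under the one preferred URL rather than treating them as a cluster of duplicate pages.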
Panda Factors: Low-Quality Content
When you’re writing information for your website, you should be thinking about providing users with the best knowledge on your topic possible. Quality over quantity – it’s better to write one amazing blog post a week than one post a day of average quality. Posting short posts around topics that are of use, but not answering them in-depth with significant detail, is going to impact your site.
Tip: provide users with insightful and in-depth content that assists them in their search, rather than writing regular content without much use or helpful information.
Recovering from a Panda Penalty
Google tends to refresh the Panda algorithm approximately once a month – they used to announce when they were going to refresh Panda, but now only do so for really large changes to the algorithm. When the algorithm is refreshed, Google takes a new look at each site on the internet (yes, every site!) and decides whether or not it looks like a quality site based on the Panda criteria.
If you make changes to your site after being hit with the algorithm, each time Panda is refreshed you should see things improve if you’ve made positive changes based on what Panda looks at. It can take a while sometimes, but stick with it and keep making those changes, and the algorithm will update your rankings in time.
Occasionally Google updates the algorithm itself, rather than simply refreshing it. This means that Google has changed the criteria the algorithm uses to determine what is considered high quality. Keep an eye out for updates, and make sure you always check your rankings and traffic to ensure you’ve not been hit. If you have, you know what to do!
The Penguin Algorithm
The Penguin Algorithm was rolled out on April 24th 2012. The aim of Penguin is to reduce the trust that Google has in sites that have “cheated” in gaining link equity, by creating unnatural backlinks in order to advance in the search engine results page.
Penguin’s main focus is unnatural linking; however, there are other factors that can affect your site through the Penguin algorithm. What we do know, though, is that links are the most important thing to look at when considering this algorithm.
Why Does Google Deem Links Important?
A link is seen as a vote of confidence for your site. If a site links to yours, then this is a recommendation for your website. The larger and more reputable the website, the more this link is going to count to your site. For example, if a website like the BBC links to your website, that can mean massive things for your site’s equity. However, a link from a small blog is not going to count for as much.
In the past, SEO-ers used techniques to acquire lots of links from these smaller sites, which would amount to a larger profile of “votes” for your site. This is exactly what Google uses Penguin to combat – lots of low-quality backlinks to your site, in a seemingly unnatural pattern, could see you getting hit with a penalty.
Another factor that is important in this algorithm is anchor text. Anchor text is the clickable part of the text that forms a link. If we link to our latest Social Media Round-Up, the anchor text will be “social media round-up”. If multiple people link to our blog with the anchor text “social media round-up”, that is a hint to Google that people searching for “social media round-up” probably want to see sites like our Rice Blog in their search results.
With this, it’s easy to see how websites could manipulate this algorithm to their advantage. In the past, if we wanted our site to rank well for this page, we could cheat the algorithm by creating self-made links using the same anchor text, containing phrases like “social media updates”, “social media 2018” and “social media news”. Whilst a highly authoritative link from a well-respected site is good, many SEO-ers discovered that creating a large number of links from low-quality websites was just as effective, and would follow this technique in order to increase their domain authority.
Whilst we will never know the specific factors Penguin looks at, we do know that this type of low-quality, self-made link is exactly what the algorithm is trying to discover. Google have revealed that this algorithm can affect your website site-wide – so if Penguin detects that a large number of links to your site are untrustworthy and of poor quality, this will reduce Google’s trust in your entire site, even if those links all point to just one page. Your whole website will therefore see a reduction in rankings, resulting in less traffic – and fewer sales.
The penalty you get from Penguin will depend on the amount of link acquisition done in the past. Either way, it’s good practice to review your backlink profile every six months, and to keep on top of your disavow file too, so Google can see you’re actively trying to resolve any unwanted links on a regular basis.
Recovering from a Penguin Penalty
Penguin used to work in a similar way to Panda, in that it was refreshed periodically and sites were re-evaluated with each refresh. When it was first released, this algorithm was not updated with much regularity – however, following the final update known as Penguin 4.0 in September 2016, Penguin now runs in real time as part of Google’s core search algorithm. This means that when Google re-crawls and re-indexes pages – which happens constantly – those pages are assessed by the Penguin filter. Pages can therefore be “caught” by Penguin, or “released” after being updated by the webmaster.
Tip: In order to recover from the Penguin penalty if it does hit your site, you will need to identify the unnatural links pointing to your site and remove them, or place them within the disavow tool in Google Search Console.
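For reference, a disavow file is a plain text file uploaded via the disavow tool. Lines starting with `#` are comments, a bare URL disavows a single page, and the `domain:` prefix disavows an entire domain (the addresses below are placeholders for illustration):

```
# Links we requested removal of with no response
http://spammy-blog.example.com/low-quality-links.html
# Disavow everything from this domain
domain:link-farm.example.net
```

Disavowing a whole domain is usually safer than listing individual URLs when the entire site is low quality, as spammy sites often link from many pages at once.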
As with any other update and crawl, it can take time for Google to re-index and review your site after you make these changes following a Penguin hit, so ensure you keep on top of your backlink profile and rankings.
Tip: review your backlinks every six months (or more regularly depending on the size of your site), and re-submit your disavow file at the same time.
The Hummingbird Algorithm
Google announced its introduction of Hummingbird on September 26th 2013, whilst also noting the algorithm had already been live for a month *surprise*. Many people had noticed a considerable drop in their rankings before this announcement – which we can assume was down to Hummingbird, surely?
Think again – if Hummingbird was truly responsible for ranking changes, we should have seen an outburst from SEO-ers all over the world about something drastic occurring in and around August 2013; however, nothing was noted.
If you think that Hummingbird may have affected you, have a look at your traffic around the beginning of October 2013, when a refresh of Penguin actually took place. If you believe you were affected by Hummingbird, you may actually have been hit by Penguin, whose refresh happened just a week after Google made their announcement about Hummingbird.
Hummingbird was a complete overhaul of the entire Google algorithm. Think of Google as an engine: Panda and Penguin worked like new parts added to that engine, such as an exhaust or a brake pad. Hummingbird, however, was a new engine altogether. This new engine still makes use of Panda and Penguin, but a large amount of it is a completely new design.
The main purpose of Hummingbird is for Google to better understand the user – in other words, to understand what a user wants when they search. If they are looking for “the best place to get coffee in Birmingham”, what are they after? Searching for this post-Hummingbird, the algorithm understands that by “place” the user means a physical address, and is therefore able to show coffee shops in that location.
There has been speculation that this update was necessary to improve Google’s Voice Search function and return the best results for searches like these. When we’re typing a search query, we may type “best coffee birmingham”, but when we’re speaking, we’re more likely to say “what is the best coffee shop in Birmingham?” Hummingbird is therefore in place to better understand what a user means when they search for these types of queries through voice search.
We have a great guide on how to optimise your website for voice search.
Recovering from a Hummingbird Penalty
From what we’ve already mentioned, the best way to avoid penalisation is to create content that answers a user’s queries, rather than just aiming to rank for particular keywords.
Pro Tip: look at question-based search queries to base content around, such as “What is SEO?” (as an example). This helps you to rank for featured snippets, which have higher CTRs than traditional organic listings.
It seems that Google’s goal with all three of these algorithms is to encourage website owners to publish content that is truly the best of the best. Google’s goal is to deliver answers to people who are searching for information. If you can produce content that answers these questions, then you are on the right track to improving your organic search rankings.
Whilst there are fewer ways to “fix” your website after a Hummingbird penalty, this algorithm is a lot different from the others. When a site has been affected by Panda or Penguin, it is because Google has lost trust in your site, whether through on-site content or the sources of your backlinks. If those issues are fixed, you can regain the algorithm’s trust and see improvements.
However, if your site appears to have done poorly since the introduction of Hummingbird, there isn’t really a path for you to regain the rankings you may once have held. You can, however, discover new ways of gaining traffic through keywords, outsourcing methods and other channels of traffic.
Do you think your site has received a penalty, but you’re not sure how or why? Contact us here at Ricemedia for advice as to how we can help you recover your rankings and improve them for the foreseeable future!