Google has been working around the clock to improve its search algorithm. Many of these changes have shipped under an update known as Panda, which has been tweaked repeatedly to improve the results presented in the SERPs. On May 20th, 2014, Google introduced the most recent update, Panda 4.0. Google has always stated that Panda is in charge of taking action against content farms, especially those that were ranking high within search engines while providing "shallow", low-quality content. Google has long prided itself on giving users the best search experience by providing relevant, high-quality results. So, what's changed?
Web developers got much smarter at finding loopholes within older versions of Google's algorithm. They were able to outrank even the best content providers by building links with software programs, spinning content and aggregating content. In short, high-quality original content was getting harder to find within the search engine results.
With so much information circulating about the recent Panda update, it's a good idea to give you a breakdown of what's changed over the years and how you can tweak your website so you're NOT affected by future Panda updates. Since Panda 1.0 rolled out on Feb. 24, 2011, I've noticed a trend take form within the Google algorithm and how it displays search results to the end user. If you pay close attention, there are 5-6 factors which impact your rankings, and steering clear of them will improve your chances of not being affected now or in the future.
So, what will you learn…
- Panda 1.0-4.0, what’s changed?
- What is Panda 4.0 all about?
- How Has Panda 4.0 Impacted Search Results?
- Examples of websites impacted and why (eBay, Suite101, Ask.com, Examiner.com, etc.)
- Effective strategies to beat the future Panda updates
What’s Changed Since Panda 1.0?
Google Panda has had a total of 26 updates since February 24, 2011, and it's probably the only update in history to have had such an impact on web rankings in so short a period of time. If you do the math, there has been an update every 1-2 months, each of which was tweaked to clean up or update the SERPs.
In order for you to understand how each update has affected the SERPs, it's important to look back at what changed after each rollout. Once we understand the changes, we can find a trend which provides a better understanding of how search queries and results are shifting. The objective should not be to find creative ways to beat Panda using blackhat techniques, but to be more proactive in the way we organize and create content for our audience.
Let’s travel back in time starting with…
Panda 1.0 – (Feb. 24, 2011): Affected 12% of queries within US search results ONLY, aimed at eliminating content farms and sites providing low-quality content. Matt Cutts, the head of Google's web spam team, announced that websites with duplicate scraped content would be affected once Version 2 rolled out on April 11, 2011.
Panda 2.0 – (April 11, 2011): Affected 2% of queries and went live for all English-speaking countries and anyone typing English search queries. Next, Google introduced a Chrome extension allowing users to block websites from appearing within their SERPs. Google uses these statistics to create a pool of websites that have been blocked several times and excludes them from the SERPs.
Panda 2.1 – (May 10, 2011): So few websites were affected that NO percentage was given. Google simply cleaned up a few more websites missed by the previous two updates.
Panda 2.2 – (June 16, 2011): Not much changed; however, Matt Cutts confirmed that this update was pushed out manually to improve scraper detection. The update was a test of how well the algorithm would perform against websites that scrape content using software programs. The actual implementation of this technology came during Panda 3.0.
Panda 2.3 – (July 23, 2011): An update to the Panda filters which incorporated improved signals aimed at eliminating lower-quality sites. This update added features that can differentiate between lower- and higher-quality websites. The EXACT factors used were not released, but web developers saw a significant change in the SERPs within the first 1-2 weeks.
Panda 2.4 – (Aug. 12, 2011): Affected 6%-9% of queries on all non-English websites and included minor changes to the English update. The focus was returning high-quality results for users located in non-English-speaking countries. Google noted on its website that people should evaluate their sites, finding ways to improve quality and add unique value.
Panda 2.5 – (Sept. 28, 2011): It's not clear how many search results were affected, but this update was aimed at cleaning up changes made during the previous updates. Again, from the trend we can conclude that high-quality, unique value has been the focus.
Panda 3.0 – (Oct. 19, 2011): This was known as the "flux" update, aimed at addressing the fluctuation people had seen since the beginning of Panda 1.0 (Feb. 24, 2011). People saw an increase, then a sudden decrease, in the organic search rankings of their websites. Google made it clear that Panda is NOT a single-page assessment; the update affects the whole website. Google provided the following breakdown; you would notice a change if…
- An increase if your site provides the best value to the reader.
- A decrease if lots of pages on your site are about the same EXACT topic.
- A decrease if there is duplicate content on your website (syndicated, aggregated or reproduced)
- An increase if you improved usability, such as navigation, site speed and a low bounce rate
- An increase if your site has higher than normal user engagement which can be measured by longer interaction with your page, social shares, etc.
Panda 3.1 – (Nov. 18, 2011): A minor update to Google Panda that affected less than 1% of queries. This was a trial run for a bigger update later in the new year.
Panda 3.2 – (January 18, 2012): Google confirmed that no changes were made to Panda; this was just a refresh of data already within the SERPs.
Panda 3.3 – (February 27, 2012): Google announced another Panda update would be pushed into circulation very soon that would directly impact site owners and the SERPs. The minor Panda 3.3 update was meant to refresh search results, eliminating low-quality websites that had not yet been hit.
Panda 3.4 – (March 23, 2012): 1.6% of search queries were affected by this Panda update. Google announced that it targeted low-quality websites. It also revealed plans to add a new "freshness" factor that rewards websites that routinely add new content.
Panda 3.5 – (April 19, 2012): A minor refresh of the Panda updates.
Panda 3.6 – (April 27, 2012): Another minor refresh of the Panda updates. A Google spokesman said the following regarding the buzz surrounding Panda…
“We’re continuing to iterate on our Panda algorithm as part of our commitment to returning high-quality sites to Google users. This most recent update is one of the over 500 changes we make to our ranking algorithms each year.”
Panda 3.7 – (June 9, 2012): Another Panda refresh, which impacted 1% of search queries within the United States and worldwide. Sites that were previously impacted were "hit" once more.
Panda 3.8 – (June 25, 2012): Impacted less than 1% of queries. Google stated no changes were made to the algorithm or signals; it simply wanted to do a data refresh, so it ran the algorithm once more.
Panda 3.9.1-3.9.9 – These updates consisted of Panda refreshes aimed at cleaning up the data within search results. Once the dust cleared, Google was set to roll out Panda 4.0 on May 20th, 2014.
Panda 4.0 – (May 20th, 2014): This was the most recent update, which included tweaks to the original Panda 1.0 update. When it rolled out, we were able to see a trend which had developed over time: Google Panda was designed to boost high-quality websites while pushing down low-quality, thin and shallow content websites.
Within the most recent update, Panda 4.0, Google targeted specific types of websites once the dust cleared from the previous updates. Let's take a closer look…
What Is The Impact of Panda 4.0?
In order for you to understand the complete effect of Panda 4.0, you need to know what types of websites suffered the most after the update rolled out. Once the dust settled, it was clear that "3" types of websites felt the wrath of Panda 4.0. If you pay close attention to all "3" factors, you'll be able to shift your focus when creating content in the future and avoid being penalized by Google.
Google Penalized Aggregated Content
The recent Panda 4.0 update directly impacted sites which gathered content from other websites and posted it on their own blog or website. This technique is known as "aggregating": automated software skims through the web, gathering content from niche-related websites, then posts it to a blog. Many websites used this technique to compile "top news stories" content for their blogs, creating a one-stop content hub for people who just want to read the top stories within a particular niche.
What’s the problem?
None of the content is unique; it's often scattered across multiple websites around the web, which depletes its value. The technique would also generate numerous complaints when the proper permission wasn't granted by the author before publishing. Even though many aggregating websites lost rankings, some still achieved a boost. Aggregators like BuzzFeed.com were not affected by Panda 4.0 for several reasons:
- They’ve partnered with only the best content providers
- They are highly moderated and have their own editors
- They've been around since 2006 and maintained a solid reputation throughout the years
- BuzzFeed.com produces its own high-quality content written by staff reporters, experienced contributors and editors through its partners.
- Content on BuzzFeed.com is shared by thousands on social networks each day.
Google Penalized Sites with Thin Content
Some of the hardest "hit" were sites with "thin" content, like eBay.com, Ask.com and Suite101. Google is working hard at lowering the rankings of sites that have 1-2 high-quality pages interlinking to low-quality content pages. To Google, a quality website should always produce high-quality content no matter the topic. Websites often write lengthy, high-quality posts when promoting products but fail to provide value on other topics. A website that continuously produces high-quality content shows consistency and is rewarded with higher rankings within the SERPs.
Why was Google so harsh on “thin” content websites?
The answer is simple: marketers would create a link building campaign centered on selling products and would redirect visitors right away to a store or mobile app website. Having one page providing value is not good enough, especially when that content has a sales pitch embedded into it. A visitor would arrive on a page, be promised a solution, and, when they clicked the link, be redirected to a third-party website. How is it possible that a website which provides no value can rank higher than those structured around value? Panda 4.0 is aimed at eliminating these websites from high positions in the SERPs.
Google Rewarded High Quality Content Sites
This has been Google's objective since its beginnings: for Google to be considered the best search engine, it needs to provide ONLY the best value to people. You'll notice that since the introduction of Panda 1.0 on February 24, 2011, Google has been tweaking the algorithm to favour solid, structured, reputable, lengthy and user-friendly content.
This change is evident when we take a closer look at the websites that were "Winners" after Panda 4.0 rolled out and what they did differently compared to the "Losers".
How Has Panda 4.0 Impacted Search Results?
You'll notice a change in the way search results are presented within Google after the recent Panda 4.0 rollout. This update on May 20th, 2014 was the conclusion of Google's "cleanup" stage and impacted the search results enormously. You'll notice that search results are more "keyword targeted" and include "high" quality websites. Next, you'll notice the search results are centered on questions like "how-to", "what is" and "why", tutorials, etc. Finally, a majority of these websites have been around for years and built credibility with not only the search engines but readers from all over the world. They have a thick page structure linking to other pages within their site that contain valuable information. In other words, these are not websites with "thin" content interlinking to other pages of low-quality content.
Let's look at two important metrics that will narrow everything down: Domain Authority and Page Authority. I've decided to focus on these two because they compress all the data I need for the examples below.
Here’s a quick definition of each…
Domain Authority – Measured using these 3 factors: age, popularity and size. These factors all matter because the recent Panda update pays close attention to them when ranking websites.
The older the domain, the longer it's been around, building credibility over time by providing a valuable resource to visitors.
Popularity can be attributed to the website producing high-quality content which resonates with its readers and earns enormous social shares. A popular website has produced an enormous amount of links through blogging, interlinking, commenting, social media and article marketing.
The larger a website is in "size", the more content it has interlinking between pages. A larger website with quality content tends to have more links than a smaller website, which will have very few.
Page Authority – Relates to the pages within the website and the likelihood of a specific page ranking high within the SERPs. The metrics involved are relevance and inbound links pointing to that page from external resources.
A few things to remember…
When measuring "page authority", you're also accounting for the internal pages that a specific page links to; interlinked pages influence each other's authority. When Panda 4.0 rolled out, Google wanted to eliminate "thin" content websites, meaning sites with 1-2 high-quality pages and the rest low-quality content. Google wanted to eliminate "1" page review-style websites with content only on the home page. These websites would write a lengthy product review, then build links, ranking the page high enough to generate income through affiliate commissions, while those who put time and effort into creating genuine content suffered losses.
Page Authority creates a fairer playing field, and the only weapon you can use to win is creating "high" quality content.
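To make the interlinking effect concrete, here's a rough sketch of how a link-based score flows between pages. Moz's actual Page Authority formula is proprietary, so this simplified PageRank over a made-up three-page site is only an illustration of the principle that interlinked pages influence each other's authority.

```python
# Illustrative only: a simplified PageRank over a hypothetical internal-link
# graph, showing how interlinked pages influence each other's scores.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    scores = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        # Every page keeps a small base score, then receives a share of
        # the score of every page linking to it.
        new = {page: (1 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * scores[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        scores = new
    return scores

# Hypothetical site: the home page links to two articles that link back.
site = {
    "/": ["/guide", "/review"],
    "/guide": ["/", "/review"],
    "/review": ["/"],
}
for page, score in sorted(pagerank(site).items(), key=lambda x: -x[1]):
    print(f"{page}: {score:.3f}")
```

The home page ends up with the highest score because every other page links to it, which is the same intuition behind pointing your internal links at the pages you most want to rank.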
The Winners After Panda 4.0
Let’s look at the top 3 Results…
1) Glassdoor.com – (Domain Authority: 87 | Page Authority: 89)
Glassdoor.com saw an increase of 100% in search engine visibility; however, keep in mind, this has nothing to do with a traffic increase. Panda 4.0 is all about providing value and boosting, within the SERPs, those websites that translate into a better user experience.
There are a few reasons why Glassdoor.com saw an increase of 100% in rankings after the update. First, the interlinked pages are very detailed. Glassdoor.com is a job search website with over 199 high-profile companies providing career opportunities to people. Each listing averages 390-600 words and is highly "targeted" towards a specific skilled audience. Next, Google loves them for the data, providing people with the EXACT information they are looking for. Google and Glassdoor.com have the same objective in mind: getting people the most accurate information they need.
2) eMedicineHealth.com – (Domain Authority: 76 | Page Authority: 78)
eMedicineHealth.com increased by 500% after Panda 4.0, which is enormous, and many believe it's because of the sensitivity of the niche. Any health-related website is highly governed, and its information is backed by medical experts. eMedicineHealth.com was rewarded because it provides in-depth information on illnesses, targeting those looking for advice. Google needs these websites because they collect the data that Google wants to display within the search results: highly regulated sites providing valuable information which Google can boost and show to people searching online. Next, health-related websites have very "thick" content on every single page since they discuss health issues, covering the description, symptoms, causes, cures and related information for every medical condition. For example, the average length of content on every page I visited was 3000-5000 words, interlinking to other related content. The domain is 11 years old, which is huge!
eMedicineHealth.com (Inside Page)…will link to WebMD.com, another top 3 medical website.
3) MedTerms.com – (Domain Authority: 78 | Page Authority: 49)
For those of you not aware, MedTerms.com is a website within the health niche providing visitors with everything from A-Z in medicine. You can find detailed information on medicine, diseases, videos, discussions and healthy living. This website increased 500% after the recent Panda 4.0 update, and I'm not surprised. First, it's a highly specialized website updated with high-quality content almost every day. Next, due to the sensitivity of the niche, it only links out to the most reliable sources online. MedTerms.com is highly supervised by both doctors and government facilities, again due to the sensitive nature of the topic. Google rewards these types of websites for the data they provide, which is both detailed and highly targeted. For example, if you type in "definition of blood pressure", you'll see the following…
- MedTerms.com shows up as the first result within Google
- A complete definition of the term interlinking to other resources within the content
- High blood pressure slideshow
- A quiz to test your blood pressure
- Exercise tips to lower blood pressure
- Highly relevant external links to other resources
The Losers After Panda 4.0
With winners, you're always going to have losers. You're going to be surprised at some of the losers because they are very popular websites. However, if you dissect the reasons why they dropped within the rankings after Panda 4.0, you'll have a better understanding of what happened. Google said from the very start that Panda 4.0 was meant to create a fair playing field for everyone, even if that means dropping sites that had previously benefited enormously from high rankings.
Let’s look at the top 2 Results…
1) Ask.com – (Domain Authority: 93 | Page Authority: 94)
After looking at the domain authority and page authority, you know immediately that Ask.com is a high-powered website, so why did it drop 50% within the SERPs? The answer can be found by looking at who Panda 4.0 targeted during the rollout. For example, it targeted "thin" content websites with low-quality pages scattered throughout.
First, when you type a search phrase into Ask.com, you'll get answers from various websites, some of which have value and others none at all. Next, when you head over to the Q&A community where people ask questions, you run into another problem: people have control over posting questions, and others can answer them to the best of their ability. An answer may be inaccurate and provide no value at all to the user. Also, there is no minimum length requirement, which means someone can post an answer containing just 20 words.
All of this combined is a problem for Google, whose mission is to provide only the most valuable content to its users. You can't have websites like Ask.com appear high within the SERPs providing low-quality, inaccurate answers to searchers.
2) eBay.com – (Domain Authority: 96 | Page Authority: 96)
You'll learn a lot from the Ask.com example above about how thin content pages scattered throughout your website can harm your rankings. eBay.com is an auction website, so it has given a lot of control to the people who are buying and selling; it needs to do just that in order to keep its business model moving forward. It would be impossible for eBay.com to manually scan and list the products in its marketplace itself. eBay.com has given so much control to the end user, allowing them to create content, upload images and choose the title, that it has backfired now that Panda 4.0 has rolled out.
Let’s look at the impact a little closer…
First, you have people listing products, some with no prior knowledge of marketing or sales. They often use broad keywords within their titles, and their descriptions are no more than 100 characters long. Next, that same page will interlink to other pages with EXACTLY the same format (a broad title and a very short description).
Another thing I've noticed is the advertising on every single page. Having adverts on a page is NOT a bad thing, because they provide relevant products to users; however, when you have 5-6 long banners on a page with very little content, it becomes a spam issue. That's exactly what happened with these thin content pages, which had no more than 100 words of description but 5-6 advertisements forcing you to scroll all the way down to reach the bottom of the page.
If you paste a majority of the URLs from eBay's product listings into OpenSiteExplorer.org, you'll notice the "Page Authority" comes back as a "1", which doesn't surprise me because the page itself has no value. It's a thin content page interlinking to other thin content pages.
eBay.com spends millions of dollars each year on advertising and optimization. But if you put websites like eBay in the top positions when someone types in a question or needs information, these thin pages show up in the SERPs. They provide no value at all, so it's important to keep even popular websites like eBay in line with algorithm updates.
Strategies to Help You Beat Panda 4.0
1) Research and Analyze
You need to have a tool in place that provides reports on your progress. It doesn't matter how big or small your website is, you need a tool to measure metrics. There are many tools available; however, Google Analytics is probably the best FREE one. If you don't have it installed, do it right away. Next, if you have been using it, then it's time to analyze your results. Google allows you to go back to February 24, 2011, when Panda 1.0 rolled out, so it's time to analyze your data.
Start by looking over your traffic statistics for decreases after the series of Panda rollouts. If you don't have those dates, they're listed above, including what changes occurred and who was affected. If you see declines in your traffic after certain dates, then you need to dig deeper and find out what changed. Ask yourself these questions…
- What content has decreased in traffic after the update?
- What keywords are not bringing as much traffic as before?
- What landing pages are not bringing as much traffic as before?
- Which countries were affected and why? You can rank differently geographically in search engines.
- Has behavior or acquisition changed? Specifically, if organic traffic has declined, that would mean a drop in rankings somewhere.
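If you export your daily organic traffic from Google Analytics, the before/after check described here can be scripted. This is only a sketch; the CSV column names ("date", "sessions") are assumptions you would adjust to match your own export, and the 30-day window is an arbitrary choice.

```python
# Compare average daily organic sessions in the 30 days before and after
# the Panda 4.0 rollout, from a CSV export (assumed columns: date, sessions).
import csv
from datetime import date, timedelta

PANDA_40 = date(2014, 5, 20)
WINDOW = timedelta(days=30)

def average_sessions(rows, start, end):
    """Average the 'sessions' column over [start, end)."""
    values = [int(r["sessions"]) for r in rows
              if start <= date.fromisoformat(r["date"]) < end]
    return sum(values) / len(values) if values else 0.0

def panda_impact(csv_path):
    """Return (avg before, avg after, percent change) around the rollout."""
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))
    before = average_sessions(rows, PANDA_40 - WINDOW, PANDA_40)
    after = average_sessions(rows, PANDA_40, PANDA_40 + WINDOW)
    change = (after - before) / before * 100 if before else 0.0
    return before, after, change
```

Swap `PANDA_40` for any of the rollout dates listed earlier to check each update in turn; a sharp negative change right after a rollout is the signal to dig deeper.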
Once you have narrowed it down, you can proceed to the next step…
2) Revive Existing Content
If you've found that certain content is not driving as much traffic to your website, it means that Google considers it either low quality or interlinked with poor, irrelevant content. Ask the questions Google asks when determining the quality of a website and its content, then edit the content: fix the length, structure and format, and extend the information.
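Before rewriting, it helps to know which pages are thin in the first place. Here's a minimal sketch that strips the tags from a page's HTML and flags pages below a word-count threshold; the 600-word cutoff is an arbitrary illustration, not an official Google number, and fetching each page's HTML is left to you.

```python
# Flag "thin" pages by counting the visible words in their HTML.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def word_count(html):
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks).split())

def flag_thin_pages(pages, threshold=600):
    """pages maps URL -> HTML; returns URLs whose visible text is short."""
    return [url for url, html in pages.items()
            if word_count(html) < threshold]
```

Run the flagged URLs through the questions above and prioritize them for rewriting or consolidation.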
3) Create High Quality Content
I can't emphasize enough that "content is king". You're only safe now and in the future if you tweak your content creation strategy. You should publish "only" the best content on your website. Don't write for search engines but for people, keeping their questions and objectives in mind. People search online for solutions, so keep that in mind when creating content. Here are a few strategies that will help…
- Research your competitors to see what they are doing, then do it better.
- Add images, videos and infographics to attract all types of readers within your niche
- Scatter keywords throughout your content to improve relevancy and catch the eye within the search results
- Don’t write with an end in sight; only stop writing when you know you’ve covered everything.
I’ve elaborated on each of these within my post: 13 Content Writing Tips That Will Crush Your Competition
4) Create a New Interlinking Strategy
Panda 4.0 scans pages through the interlinking within your content. If you're linking to poor, low-quality content or to an external site which provides no value, it's time to fix that. If you have several "thin" content pages on your website, I would temporarily remove the internal links, replacing them with external links to "high" powered websites. Go back and fix the content on your website; then you can add internal links, building a brand-new structure. If you have external links on your website, it's a good idea to visit those websites, making sure they're still actively producing new content, or even still online.
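Auditing your interlinking by hand gets tedious on a large site. As a starting point, here's a sketch that pulls every link out of a page's HTML and splits them into internal and external, so you can review where each page points; fetching the HTML (e.g. with urllib) is left out to keep it short.

```python
# Split a page's links into internal and external for an interlinking audit.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Gathers the href of every anchor tag on the page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def audit_links(page_url, html):
    """Returns (internal, external) link lists for one page."""
    collector = LinkCollector()
    collector.feed(html)
    site = urlparse(page_url).netloc
    internal, external = [], []
    for href in collector.hrefs:
        absolute = urljoin(page_url, href)  # resolve relative links
        (internal if urlparse(absolute).netloc == site
         else external).append(absolute)
    return internal, external
```

From there you can check each internal target's content quality and each external target's availability, exactly as described above.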
Google is definitely shaking things up, and it will continue to throughout the next several months. It's important to take proactive measures to protect yourself from future algorithm updates. I've seen several shifts in the way data is stored and presented to people searching online for content; however, one thing that has not changed is the importance Google places on value. If you can provide your readers with value, then you've succeeded in creating something special. To recap what I've discussed and what you should do to protect yourself and your brand, here are a few quick pointers…
Review my section on the changes made to Panda since its beginnings so you'll get a better idea of the trend for yourself. Make some notes on a piece of paper and keep them on your desk as a reference each time you update your website or create new content. Next, analyze whether you've been affected by the recent Panda 4.0 rollout by looking over your statistics. Go back a few months to see what's changed and why or, if nothing has changed, what you've been doing that has continued to work for you.
Finally, you have the lists of "Winners" and "Losers" after Panda 4.0. Go through each of them looking for similarities and the reasons they gained or lost. You can learn a lot looking at websites in different niches because, while the content may be different, successful websites all have a few things in common: high-quality, lengthy, unique content containing video and images, relevant interlinking, a well-structured and well-formatted layout, and few advertisements.
If you have any questions or comments, please post them below.