Farmers, pandas and penguins (all of which, strangely enough, have something to do with Venice) work together to achieve a common goal; not unlike a phantom.
If you disagree with the above statement, you don’t know enough about Google. The search results produced by their algorithm are arguably the front line for digital marketers, so understanding that algorithm is crucial if you want to succeed online. The problem, as you may know, is that they make hundreds of updates every year.
Here’s a summary of the 9 most important Google updates since 2011 and what they mean for you and your business:
1. Panda (and Farmer) – February 23, 2011
Panda likes a good user experience. Farmer doesn’t like spammy link farms for the same reason. These updates (which were rolled out together) marked the birth of ‘Domain Authority’ and the death of ‘Page Rank’. In other words, Google stopped ranking web pages individually and started factoring in the quality and relevance of every page in your website to get a better picture of your website’s value as a whole.
So no more shoddy ‘splash’ pages that only exist to catch traffic! Every page has to be deemed high value if you want to get found. Why? To eliminate any risk of a searcher being sent to a website by Google only to then receive a poor user-experience. Other usability factors that Panda and Farmer look at include:
- Ratio of advertising to valuable content
- Relative number of outbound links to low quality websites
- How regularly fresh content is released (e.g. blogs, news, social media feeds)
- Ratio of original content to referenced content
- Keyword Stuffing: The overuse of keywords in content and meta-data to the detriment of readability and therefore user experience.
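To make the keyword-stuffing point concrete, here is a minimal sketch (the sample copy is made up, and Google’s actual thresholds are not public) of how keyword density can be measured:

```python
import re

def keyword_density(text, keyword):
    """Rough keyword density: the share of words (as a percentage)
    that belong to occurrences of the exact keyword phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    total = len(words)
    if total == 0:
        return 0.0
    phrase = keyword.lower().split()
    n = len(phrase)
    # Count exact, in-order occurrences of the phrase
    hits = sum(1 for i in range(total - n + 1) if words[i:i + n] == phrase)
    return 100.0 * hits * n / total

copy = "Buy insurance online today. The best place to buy insurance online."
print(round(keyword_density(copy, "buy insurance online"), 1))
```

In that sample, over half the words belong to the target phrase, which is the kind of ratio that reads as stuffing rather than natural copy.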
For the last 2 years, Google have tweaked the Panda algorithm and run manual refreshes which often resulted in nasty surprises and sudden ranking changes. After 25 updates, Google are finally happy with it and Panda has become part of their core ‘real time’ algorithm.
2. Freshness – November 3, 2011
The part of the Panda algorithm that rewarded the regular release of fresh content was split out, refined and then subsumed into Google’s core algorithm around 18 months before the rest of Panda.
It deserves a special mention because it is such an effective method of optimisation. Of course you still need to make sure the content is valuable according to all the other factors Google looks for; but get it right and you will be stunned by the power of high quality blogging. If you regularly release original and valuable content, Google will relish the opportunity to deliver that content to their information-hungry searchers!
3. Ads Above the Fold (aka ‘Top-Heavy’) & Page Layout – January 19, 2012
This update was all about the volume and the quality of content ‘above-the-fold’. Traditionally that term refers to the area you can still see when printed material is folded in half. It’s what people see first and is your opportunity to entice them to read more. In the digital world, that’s the part of a web page at the top which is visible before a user scrolls.
Perhaps it shouldn’t be too surprising that Google skewed their thinking this way given that an estimated 80% of web users stayed above-the-fold in 2010, according to Nielsen data. If users spend most of their time at the top of the page, that’s where your most valuable content should be.
This led to another, more refined update called Page Layout Update #2 in October, 2012. Placing outbound links further down the page, structuring your Headings (H1–H6) in a logical way and making sure your internal navigation links are in good condition are all efficient and simple ways to improve your page layout.
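As a simple illustration of that heading advice (the page topics are made up for the example), a logical H1–H6 structure looks like this:

```html
<!-- One H1 describing the page, with sub-topics nested in order -->
<h1>Insurance Guides</h1>
  <h2>Car Insurance</h2>
    <h3>Comparing quotes</h3>
    <h3>Choosing an excess</h3>
  <h2>Home Insurance</h2>
```

The indentation here is just for readability; what matters is that heading levels descend in order without skipping (no jumping from an H1 straight to an H4).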
NB This part of the algorithm is constantly being upgraded in line with ever-evolving UX and coding standards.
4. Venice – February 27, 2012
Remember when all of a sudden you started seeing more local businesses in search results? Unfortunately, it probably wasn’t due to a boom in your local economy (Feb, 2012 was more doom-and-gloom than boom!). Instead it was an intentional move by Google to help small businesses compete with the dominant online corporations; if only for their respective local markets.
The interesting thing here is that customers may now find you more or less easily according to how/where they set their location in Google. This is called ‘localisation’ of search results. Personalisation began in 2010, when each user’s individual search history became the first factor used to tailor results. By combining this with locality, Google ensured that smaller, local businesses still get an opportunity to rank.
Since Venice, Google have integrated more personalisation tools including user-defined interests.
5. Penguin – April 24, 2012
What is there to say about this little tuxedo’d fella? He’s a harsh critic, he can be vicious if crossed and he causes the kind of damage that’s hard to recover from. Whereas Panda passed judgement on the value of your user experience to help index and rank your website, Penguin is all about law and order in search; pro-actively punishing you if he thinks you’re trying to deceive him.
Penguin impacts websites that have previously used surreptitious techniques to manipulate search results (often called ‘black-hat’ tactics); unfortunately that includes those who did so because of bad advice or dodgy SEO partners. The main things Penguin looks for are:
- ‘Cloaking’: Serving different content to search engines than to human visitors, usually via scripts
- Hiding text by making it the same colour as the background, making it small or setting it to appear off to the side of the visible screen
- Unnatural proportion of inbound links from low quality websites
- Keyword spamming in anchor texts or on the source pages of inbound links
- Low quality blogging, comments and wiki spam to create inbound links
Penguin has had a few of his own updates since the first release, each more aggressive than the last:
- Penguin 2 – May 24, 2012
- Penguin 3 – October 5, 2012
Penguin also incorporated strict penalties for repeat or blatant plagiarists. Since August 2012, if somebody steals your content you can report it to Google under the DMCA (Digital Millennium Copyright Act) and they will have the infringing site removed. The Penguin algorithm, or at least its core function, incorporates these signals to help penalise plagiarists.
How do you beat a penguin? Be honest and transparent with the content on your website: don’t use CSS to disguise or hide anything; only get inbound links from high-quality websites; use original content wherever possible; and make sure any content taken from other websites is referenced correctly.
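To make the hidden-text point concrete, these are the kinds of CSS tricks Penguin is designed to catch (illustrative markup only — don’t do this):

```html
<!-- Text the same colour as the background -->
<p style="color:#ffffff; background:#ffffff">cheap insurance quotes</p>
<!-- Text shrunk to nothing -->
<p style="font-size:0">insurance insurance insurance</p>
<!-- Text positioned off the visible screen -->
<p style="position:absolute; left:-9999px">buy insurance online</p>
```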
6. Exact Match Domain (EMD) – September 27, 2012
This update simply tied together two existing parts of the algorithm. EMD looks at the use of the brand name on a website and makes assessments on the relative extent of keyword overuse between the domain name and body copy.
In a nutshell, a website called BuyInsuranceOnline.com would have tighter restrictions on the number of times the exact phrase ‘buy insurance online’ could be used within the website. Essentially, if you go for a shortcut by buying an exact match domain, you will be penalised faster for over-optimisation (keyword stuffing) to balance out your unfair advantage.
In practice, it’s usually the internal linking structure of EMD websites that triggers the over-optimisation penalties. If you have an EMD, make sure you only have internal links which are absolutely necessary!
7. Phantom – May 8, 2013
This one’s as mysterious as it sounds. Some say it was nothing more than the laying of foundations for the two updates that followed; a precursor at best. However, some sites were affected disproportionately, so most people in the know think that something bigger was going on. Google said nothing, predictably.
In particular, the sites that were hit hardest fell into one or more of the following categories:
- Lots of outbound links but none classed as ‘nofollow’:
A ‘nofollow’ attribute tells Google not to follow the link or treat it as an endorsement of the destination website when indexing and ranking. It makes sense that they would want you to manage this.
- Lots of outbound links but all classed as ‘nofollow’:
Google might want to see that you are managing your outbound links instead of sending their bots to far-flung corners of the web where there is no relevance to be found with your site… but at the same time they do want you to reference your sources and send them to relevant sites. Some of your outbound links should be followed.
- Heavy cross-linking between multiple domains:
For a long time, Google has disregarded inbound links if there is an outbound link to the same website: They know that these ‘reciprocal links’ are not true indications of value, but rather a favour between two website owners. Phantom started going after even larger groups of website owners who traded links in circles in order to avoid the reciprocation being noticed.
- Anchor texts for back-links on the cusp of previously acceptable keyword densities:
Phantom obviously tightened restrictions.
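The follow/nofollow balance described above can be checked on your own pages with a quick audit. Here is a minimal sketch using only Python’s standard library (the sample page markup is invented; a real audit would crawl your rendered pages):

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Counts followed vs. nofollow outbound links in an HTML document."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollow = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        if not href.startswith("http"):
            return  # skip internal/relative links in this sketch
        rel = attrs.get("rel", "") or ""
        (self.nofollow if "nofollow" in rel else self.followed).append(href)

page = """
<a href="http://example.com/source">A cited source</a>
<a href="http://ads.example.net" rel="nofollow sponsored">An advert</a>
<a href="/about">About us</a>
"""
audit = LinkAudit()
audit.feed(page)
print(len(audit.followed), len(audit.nofollow))
```

A healthy page usually shows a mix: followed links to sources you genuinely vouch for, nofollow on the rest.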
8. Penguin 2.0 (Penguin 4) – May 22, 2013
Penguin 2.0 is a totally new version of Penguin that is much more far-reaching. It clamps down very heavily on breaches of Google’s Webmaster Guidelines. Be afraid, be very afraid:
- More link farms are being chased down (see Panda) and all websites linked to those farms are being punished
- The ‘Disavow’ tool has been introduced: website owners can now submit a list of inbound links that Google should ignore
- While social media signals have been a major ranking factor since December 2010, they have now been promoted; potentially overtaking back-links in importance. The Plus 1 button has been a major factor since March, 2011
- Unnatural inbound back-link patterns from low quality sources are even more dangerous than ever
- There is a new emphasis on the resolution of optimisation focus. If your optimisation seems too wide OR too narrow in proportion to other factors, including website size and back-link profile, you will be penalised
- Significant spikes in link-building activity (a signal of outsourced link-building) will result in a penalty.
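On the ‘Disavow’ point above: if you find low-quality inbound links you can’t get removed, Google accepts a plain text file, one entry per line, with `#` for comments (the domains below are placeholders):

```
# Links we asked to have removed with no response
domain:spammy-link-farm.example
http://low-quality-directory.example/our-listing.html
```

A `domain:` line disavows every link from that domain; a bare URL disavows only that page.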
All in all, Penguin 2.0 creates an environment where, for the first time, there is an airtight set of rules to isolate truly valuable websites and reward them accordingly. The only way to compete with best-in-class websites is to become a best-in-class website. How? Release good content, share it and get social buzz pointing to your website. Good SEO can no longer be an after-thought… it is a constant process.
9. The Knowledge Graph – December, 2012 to May, 2013
The introduction of The Knowledge Graph plants the seed of a revolution in search. Essentially Google has become less keyword-focused and more about ‘semantic search’, meaning it can now understand the context of each query according to the person searching and even the time it is searched.
First of all, it can now understand sentences. Secondly, it takes structured markup and schema data into account to understand the context of the information it finds on websites. Thirdly, it can find logical connections between things it reads on websites and the things you search for. It even looks at your personalisations and other signals to get a better idea of the context behind your search query before delivering a mixture of relevant link results, Wikipedia snippets, definitions, facts, statistics and the relationships between them all. You’ve probably seen it in action, at the top of the right-hand column in search results.
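Structured markup, mentioned above, is the most direct way to hand Google that context. One widely documented format is a schema.org snippet embedded as JSON-LD (the business name and URLs here are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Insurance Ltd",
  "url": "https://www.example.com",
  "sameAs": ["https://plus.google.com/+ExampleInsurance"]
}
</script>
```

This tells Google unambiguously that the page belongs to an organisation with that name, website and social profile, rather than leaving it to infer those facts from prose.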
As the Knowledge Graph grows in reach, you will need to make sure your wider marketing is targeted and effective so that Google can place you in the right context for the right search queries. Surely this is the future of Google’s algorithm… expect to see Google become much more semantic over the following year or two!