Penguin is a Google algorithm that identifies web spam, especially unnatural backlinks. First rolled out in 2012, the update combats manipulations that push pages up the rankings even though nothing about their content or technology justifies a high position. Before Penguin, buying links was indeed a viable strategy: on link farms, positive ranking signals could be bred exclusively for one's own website, so to speak, and then released into the digital world. At that time, Google rated websites largely by how often other sites linked to them.
When certain keywords appear again and again in the components of a link, such as the anchor text (the term from which the link is set), the link title (visible when you hover over the link), and the terms in the immediate vicinity of the link, Google takes this as an indication that the linked URL is relevant for that keyword. The authority of the linking site also plays a role, but link farms compensated for their lack of authority with sheer quantity.
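The link components described above map onto ordinary HTML. The following snippet is a generic illustration (the URL and wording are invented for this example, not taken from any real site):

```html
<p>Looking for new gear? Check out these
  <a href="https://shop.example.com/running-shoes"
     title="Buy running shoes online">cheap running shoes</a>
  with free shipping on every order.</p>
<!-- anchor text:      "cheap running shoes"
     link title:       the title attribute, shown on mouse-over
     surrounding terms: "gear", "free shipping" in the nearby text -->
```

It is exactly these pieces, anchor text, title attribute, and surrounding terms, that link farms stuffed with target keywords.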
However, Penguin put a stop to these unfair practices. Whoever had too many unnatural links pointing to his domain had to expect a penalty from Google. In contrast to manual actions, there was no notification about this. The only way back to the top was to do the homework Google had assigned and clean up the link profile.
But even that took a long time, since you had to wait until Penguin rolled out once again. Quite often you even had to wait for a manual review. This was especially unfair to victims of negative SEO, that is, attacks in which competitors tried to harm a rival site with cheaply bought backlinks. Those operators were punished for something they were not responsible for. Penguin in its 4.0 plumage can ease this injustice better than its predecessors.
What’s changing with Penguin 4.0?
Penguin is now part of the core algorithm and is therefore no longer deployed periodically, but works in real time, around the clock, every day of the year. When Penguin 4.0 discovers unnatural backlinks, a page can therefore be punished sooner. However, the consequences for website operators are not as dramatic as before: penalties no longer necessarily affect entire domains, but often only the pages the unnatural links point to.
What previously took endless effort and time is now finally done by the algorithm itself: Penguin 4.0 not only identifies unnatural links largely on its own, it also devalues them automatically. The full-fledged Penguin thus directly discounts the value of links that give it the creeps. This should severely limit link abuse.
In addition, the ability to devalue individual links shows how far Google has already moved away from its former dependence on links. While Penguin 4.0 reportedly ships without deep-learning capabilities (independent learning), it is only a matter of time until Google relies on machine learning to assess backlinks and emancipates itself even further from that dependence. It is already clear, however, that with the new update backlinks can no longer generally be regarded as valuable ranking signals. Their value is now estimated flexibly by Google and depends on other on-page SEO criteria and user signals.
If whole domains are no longer punished, is it worthwhile to boost individual pages with bought links to attract visitors?
This strategy sounds tempting at first. With Penguin 4.0, link building appears less dangerous: since the entire domain is often no longer punished, but only the individual pages spoiled with link spam, website operators might be tempted to push individual landing pages for discount campaigns or other short-term events with bought links. Whoever thinks so forgets that Penguin 4.0 works in real time and can devalue those links.
Once the algorithm detects web spam, the page is downgraded in the ranking. Whoever does this repeatedly also risks a penalty for the entire domain. While domain penalties are no longer the rule under Penguin 4.0, they are not completely off the table, especially since Google still issues manual penalties to keep link farms and notorious fraudsters out of the rankings.
What to do if a page is punished by Penguin 4.0? Is the Disavow tool becoming less important?
Anyone hit by earlier Penguin updates could either ask the linking site to stop spamming his page with these bad links or, since spammers usually do not react to such inquiries, had to put the unnatural links in quarantine with the Disavow tool. The disavow files help Google's bots recognize spam links. Rand Fishkin of Moz suspects that Google has meanwhile used its disavow database, which webmasters have fed and which has grown strongly in recent years, to learn to recognize unnatural links largely on its own.
Google’s contact for SEO concerns, John Mueller, has already announced that submitting disavow files is no longer absolutely necessary. Ideally, Penguin 4.0 detects spam links and filters them out so that they play no role in a page’s ranking.
However, if you want to make sure of this, you should still use the Disavow tool to send unnatural links into quarantine. Conversely, it is still not a good idea to disavow all incoming links as a matter of principle. That would remove all risk from links, but with it every chance of climbing the rankings through them. Last but not least, Mueller’s advice underlines Fishkin’s thesis that Google does not want to forgo the help of webmasters in identifying unnatural links. Seen this way, SEOs are welcome to keep feeding Penguin 4.0.
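For reference, a disavow file is a plain UTF-8 text file uploaded through Search Console, with one entry per line: either a single URL or a whole domain prefixed with `domain:`, and optional comment lines starting with `#`. The domains and URLs below are placeholders, not real spam sources:

```text
# Links bought by a former agency; the operator
# did not respond to removal requests.
domain:spammy-linkfarm.example

# Disavow a single spam URL instead of the whole domain
http://blog.example.net/cheap-links-post.html
```

Quarantining a whole domain with `domain:` is usually the safer choice for link farms, since they tend to spread links across many URLs.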