Google has officially confirmed on its Webmasters Blog that Penguin 4.0 has started rolling out. Most significantly, Penguin is now part of the core algorithm, updates in real time, and works on a more granular basis.
Google Penguin was first launched in April 2012 with the aim of targeting “black hat webspam.” Google described this in 2012 as those who “use techniques that don’t benefit users, where the intent is to look for shortcuts or loopholes that would rank pages higher than they deserve to be ranked.” In particular, Google mentioned keyword stuffing (using a keyword too many times on a page) and link schemes (artificially creating links to your website). Ultimately, the update sought to penalize websites in breach of the Google Quality Guidelines, with a focus on the points mentioned above.
While there have been half a dozen updates since the initial launch in 2012, the most recent update was Penguin 3.0, back on October 18, 2014. There has been much discussion in the SEO community about the length of time Google has taken to release a subsequent update, with many webmasters having to wait years for their websites to recover from a penalty.
There has been significant volatility in the SERPs (Search Engine Results Pages) since the beginning of September, but the chatter on various SEO forums relating specifically to Penguin, or webmasters noticing recoveries on Penguin-hit websites, only really started over the last 24–48 hours. We won’t go into too much detail on the community chatter now that Google has confirmed the rollout of Penguin 4.0, but you can read it for yourself on the Black Hat Forums thread (page 14) and at WebmasterWorld.
Returning to the announcement by Google, there is some significant news about the changing nature of Penguin. The full extract of the statement is below:
- Penguin is now real-time. Historically, the list of websites affected by Penguin was periodically refreshed at the same time. Once a webmaster considerably improved their website and its presence on the internet, many of Google’s algorithms would take that into consideration very fast, but others, like Penguin, needed to be refreshed. With this change, Penguin’s data is refreshed in real time, so changes will be visible much faster, typically taking effect shortly after we recrawl and reindex a page. It also means we’re not going to comment on future refreshes.
- Penguin is now more granular. Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole website.
It is interesting that, much as when the Panda algorithm was incorporated into the core algorithm, Google will no longer comment on future Penguin changes (or at least on the parts of the core algorithm that now handle Penguin). This will make it tough to isolate any webspam changes, but while frustrating, it should also make it more difficult for black hat practitioners to “game the system.” In our opinion, this is a good thing.
One of the most significant impacts of the new changes, though, is that it should no longer take long to recover from Penguin if you are affected. Google indicates that changes will take effect shortly after it recrawls and reindexes a page. One thing Google is silent on, though, is exactly how long it takes for URLs in a disavow file to take effect. Certainly, it would not make sense for this to happen too quickly, as it could be gamed by black hat SEOs.
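For readers unfamiliar with it, the disavow file referred to above is a plain-text file uploaded through Google Search Console. Each line lists either a single URL or, using the `domain:` prefix, an entire domain whose links you want Google to ignore, with lines starting with `#` treated as comments. The domains below are purely illustrative:

```text
# Disavow all links from this spammy domain
domain:spammy-link-network.example

# Disavow a single page that links to us
http://another-site.example/paid-links-page.html
```

Uploading a new file replaces the previous one, so the file should always contain the complete list of links you wish to disavow, not just new additions.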
That being said, with the algorithm now more granular, it seems that any penalty will be applied on a page-by-page basis. This has the potential to blunt the effect of any penalty significantly, and we would not be surprised if this benefits black hat SEOs in the long run.
It is unclear how long the update will take to roll out, but we suspect it might take several weeks or even a month or so for Google to fully recrawl the web. We suspect, though, that most webmasters will notice changes in the coming days or weeks.
Jonathan Griffin Editor, SEO Consultant, & Developer.
Jonathan Griffin is The Webmaster's Editor & CEO, managing day-to-day editorial operations across all our publications. Jonathan writes about Development, Hosting, and SEO topics for The Webmaster and The Search Review, with more than nine years of experience. Jonathan also manages his own SEO consultancy, offering SEO developer services. He is an expert on site structure, strategy, Schema, AMP, and technical SEO. You can find Jonathan on Twitter as @thewebmastercom.