What Impact Will Google Penguin 4.0 Have on Your Site?
By Sean Bucher, Account Manager at Bayshore Solutions.
Another day, another algorithm update.
Google has released another Penguin update, 4.0 to be exact. This one differs from previous releases in a few core ways that will affect sites differently. The update arrives almost two years after the launch of Penguin 3.0 in 2014, and almost four years after the first Penguin algorithm launched in 2012.
Penguin is one of many quality-related algorithms, referred to by some SEO experts as “penalties,” put in place to provide better results to users by devaluing domains that used blackhat SEO tactics. These tactics have included keyword stuffing and manipulative backlink building: link farms, unnatural linking, and poor variety in anchor text and backlink targets. Penguin was implemented specifically to target backlink-related blackhat SEO.
Over the course of a year, Google can make upwards of 400 updates to its search algorithms; some have a negligible impact, while others can shift rankings entirely. In the past, the Penguin algorithm operated outside of the core algorithm, running as a one-time update with every launch. But as part of Google’s effort to move its search algorithms toward machine-learning-backed systems, the Penguin 4.0 release will operate as part of the core algorithm.
So How Will Running Penguin in Real-Time Impact Websites and Businesses?
By adding Penguin to the core algorithm, updates and changes will run in real time, which means fixes and corrections will also take effect in real time. Previously, if your site was hit by a Penguin penalty, the whole domain was often affected. That means if a select few pages on your website had been hit with spammy links, or a directory level of your site used over-optimized anchor text that created an inflated number of links, the whole domain could suffer.
Because the Penguin Algorithm is now incorporated into the core Google algorithm and running in real-time, “penalties” can now be assessed at the page and directory level. This will help some sites that had previously seen the impact across their entire domain.
One of the most common penalties we see, particularly with WordPress websites, is the spam hack, where hackers come in through a vulnerable theme or plug-in and create what we call “ghost” pages. These pages don’t actually exist; the hacker creates backlinks to the hacked domain with some sort of product extension appended to the URL, which then resolves to a 404. So what engines see is backlink data pointing to a page that doesn’t exist. Not only is the domain now being penalized for the low-quality link coming from a questionable source as part of the hack, but it’s also registering a 404 page. For smaller sites where you may only have 10 or so pages, hacks like this often create hundreds, if not thousands, of links, which in turn create thousands of 404 errors. As Google looks at the overall maintenance and health of a website, it registers a low proportion of 200-level response codes. If the ratio of good (200-level) to bad (400-level) responses tips too far toward the bad side, Google and other engines may devalue your entire domain.
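To make the ratio idea concrete, here is a minimal sketch of counting 2xx versus 4xx responses from a server access log. The log lines and the simple regex are assumptions for illustration (real logs vary by server configuration), not a description of how Google actually measures site health.

```python
import re
from collections import Counter

# Hypothetical sample lines in Apache combined-log style (illustrative only).
LOG_LINES = [
    '1.2.3.4 - - [10/Oct/2016:13:55:36 -0700] "GET /about HTTP/1.1" 200 2326',
    '1.2.3.4 - - [10/Oct/2016:13:55:37 -0700] "GET /product-xyz HTTP/1.1" 404 512',
    '1.2.3.4 - - [10/Oct/2016:13:55:38 -0700] "GET /contact HTTP/1.1" 200 1024',
    '1.2.3.4 - - [10/Oct/2016:13:55:39 -0700] "GET /ghost-page-1 HTTP/1.1" 404 512',
    '1.2.3.4 - - [10/Oct/2016:13:55:40 -0700] "GET /ghost-page-2 HTTP/1.1" 404 512',
]

# The status code sits between the closing quote of the request and the size.
STATUS_RE = re.compile(r'" (\d{3}) ')

def status_ratio(lines):
    """Count 2xx vs 4xx responses and return (good, bad, good:bad ratio)."""
    counts = Counter()
    for line in lines:
        match = STATUS_RE.search(line)
        if match:
            counts[match.group(1)[0]] += 1  # bucket by first digit: '2', '4', ...
    good, bad = counts["2"], counts["4"]
    return good, bad, (good / bad if bad else float("inf"))

good, bad, ratio = status_ratio(LOG_LINES)
print(good, bad, round(ratio, 2))
```

On a site with only a handful of real pages, a hack that spawns thousands of ghost URLs would push this ratio heavily toward the 4xx side.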
With the updated algorithm, expect a lighter negative impact from such attacks. This doesn’t mean they shouldn’t be addressed through a link disavow and webmaster outreach effort, but it should give sites more cushion should they be hit by such an attack. Keep in mind, only the pages or directories hit by poor backlinks should see the impact.
How Do I Know if My Site Has Been Affected by Penguin?
Previously, if a site had been penalized by Penguin, you would often see a sharp spike in organic traffic, followed almost immediately by a steep decrease with no recovery.
In the case of Penguin 4.0, the peaks and valleys may not be as easy to identify because they will not impact the entire domain. Given that only pages and directories will be impacted, it may take more digging and may not be as apparent.
Other indicators exist but may require some digging through free, easy-to-use options like your Google Webmaster Tools/Search Console account. Google (and Bing) provide a plethora of data to help site owners protect their domains.
Within Google Search Console, site owners can navigate to the “Search Traffic” drop down menu and select “Links to Your Site.” Here, you can easily see which domains link to you and what pages their links are targeting. This will start to give you an idea of how Google views your backlink profile. You can dig further into “Who Links the Most” by clicking “More>>” and expanding the table. The example below lists a total recognized backlink profile of 1,286 links.
Upon further examination, we are able to identify a few questionable domains that have a high count of links to this site, while targeting one page.
While most of these are local directories, we see 15 links from one site pointing to one page that appears to have little relevance to the website we are examining. Upon further study through a download of the links, we see the backlink is a no-follow link coming from a comment thread back in 2012. This is a questionable link; at the time of posting, four years ago, it was probably part of the company’s technique of building backlinks through comments, a tactic rarely used by industry experts today.
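A review like this (spotting one domain sending an outsized number of links to a single page) can be roughly automated against a downloaded links file. Below is a minimal sketch; the two-column CSV layout, domain names, and threshold are assumptions for illustration, not the actual Search Console export format.

```python
import csv
import io
from collections import Counter

# Hypothetical backlink export (real export columns vary by tool).
EXPORT = """source_domain,target_page
local-directory-a.com,/
local-directory-b.com,/
forum-comments.example,/old-blog-post
forum-comments.example,/old-blog-post
forum-comments.example,/old-blog-post
"""

def flag_concentrated_links(csv_text, threshold=3):
    """Flag (domain, page) pairs where one domain sends many links to one page."""
    pairs = Counter(
        (row["source_domain"], row["target_page"])
        for row in csv.DictReader(io.StringIO(csv_text))
    )
    return [(dom, page, n) for (dom, page), n in pairs.items() if n >= threshold]

print(flag_concentrated_links(EXPORT))
```

Anything the script flags still needs the human sanity check described above: does it seem logical that this page would link to your site?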
By taking the time to look and ask yourself, “Does it seem logical that this page would be linking to my site?” you should be able to identify whether your site is the victim of bad backlinks.
Other tools that accomplish this, but require a paid subscription, are Moz, Majestic SEO, and Ahrefs. All provide a more extensive snapshot of backlink profiles, with Majestic offering perhaps the most comprehensive look at a site’s profile. If you work with an agency, they should have access to these tools. This is particularly important should you have a backlink penalty issue, as your agency can use these tools to examine more closely where the low-quality links are coming from.
An expert can then work with you on a clean-up and disavow plan that should positively impact your domain and help you recover from any quality algorithms that may be negatively impacting your organic traffic.
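The disavow step produces a plain-text file for Google’s Disavow Links tool: one URL or `domain:example.com` entry per line, with `#` marking comments. Here is a small sketch that assembles such a file from a reviewed list; the domain and URL values are hypothetical placeholders.

```python
# Build a disavow file in the plain-text format Google's Disavow Links
# tool accepts: one URL or "domain:example.com" entry per line, "#" comments.

FLAGGED_DOMAINS = ["spam-farm.example", "link-network.example"]  # hypothetical
FLAGGED_URLS = ["http://forum-comments.example/old-thread"]      # hypothetical

def build_disavow(domains, urls):
    """Return the contents of a disavow.txt file for the given lists."""
    lines = ["# Disavow file generated after backlink review"]
    lines += [f"domain:{d}" for d in sorted(domains)]  # disavow whole domains
    lines += sorted(urls)                              # disavow individual URLs
    return "\n".join(lines) + "\n"

print(build_disavow(FLAGGED_DOMAINS, FLAGGED_URLS))
```

A `domain:` entry disavows every link from that domain, while a bare URL disavows only that page, so review carefully before choosing the broader option.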
Is There Any Reason to be Skeptical of the Update?
One gripe many have had with Google over the years is its inability to identify link farms in a more timely manner. If you look deep enough, some sites are still the beneficiaries of link farms created on seemingly legitimate domains. For most White Hat SEOs, it seems absurd that Google hasn’t been able to identify some of these link farms. With the real-time update now in effect, many industry experts hope that penalties or devaluations can be assessed for those still practicing some of these tactics.
Time will tell if that is in fact the case. In the end, a good SEO strategy will push the limits of your campaigns and sites by finding new ways to engage the right audiences, not by promising instant rankings for terms via paid linking efforts. Remember, the best SEO campaigns take time, but they provide relevance to the user and add substance to a topic.