Webmaster News - Edition 02 - January 28, 2019.

In Edition 02, Bob Ross does the Floss (in Pure CSS of course), Google Chrome Changes could destroy ad-blockers, WordPress.com is building a new news publishing platform, GoDaddy backs down over JavaScript injection, Disavowing Links is not dead… Yet!, and much more.

SEO & Search News

Google Algorithm Updates

The past week has been relatively quiet on the Google algorithm update front, although volatility remains higher than was typical during the second half of December.

There was a spike in volatility in the SERPs around January 20, but I did not see much chatter about it. Personally, I saw a decent uptick in traffic around this time, but it was short-lived.

You can see the latest SERP volatility chart below:

The SERP Tracker shows a spike in volatility around January 21, 2019. © SEMrush

You can find details of all past Google updates here, including a discussion of the January 9 update that I covered in the previous edition of the Webmaster Newsletter two weeks ago.


Disavowing Links is not dead… Yet!

In a recent Google Webmaster Hangout, John Mueller was asked whether you should disavow medium-quality links that are not ultra-spammy. In particular, he was asked: “Let’s say they didn’t get a manual action. Can those links hurt them algorithmically?”

John Mueller replied:

That can definitely be the case. It’s something where, our algorithms, when they look at it, if they say, Oh, there are a bunch of really bad links here, then, maybe they’ll be a little bit more cautious in regards to links in general for the website. So if you clean that up, then the algorithms look at it and say, Oh, it’s ok. It’s not bad.

You really only want to disavow the very worst of the links. If your site is performing well, I would not use the disavow tool at all. If you have previously built spammy links, suffered from negative SEO, or your site is performing poorly, you may want to consider a backlink audit and disavowing the worst of the links.

I personally recommend the SEMrush backlink audit. You can connect your Search Console backlink data and Majestic to get good coverage of your backlinks. The SEMrush backlink audit tool is very good, and it lets you filter your backlinks by various criteria, such as deindexed domains or link networks.

You can see a screenshot of my own backlink audit below:

SEMrush Disavow. © The Webmaster

If you want to give it a try, you can sign up to SEMrush with a 7-day free trial here.

Just remember to use the disavow tool with care. In many cases, Google simply ignores spammy links, and disavowing risks devaluing good links along with the bad. If your rankings drop after disavowing, you can remove the “less shady links” from the disavow file and resubmit it.
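
For reference, the disavow file itself is just a plain text file that you upload via Google’s Disavow Tool. A minimal sketch (the domains are placeholders):

    # Lines starting with # are comments.
    # Disavow every link from an entire domain:
    domain:spammy-link-network.example

    # Or disavow an individual page:
    https://shady-directory.example/links/page1.html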

Most webmasters will not need to disavow. I only use a disavow file due to prior negative SEO, and even then I was very selective about the links I disavowed.


Featured Snippets can be taken from PDF content

On January 16, 2019, Kevin Indig posted a screenshot of a search result where content from a PDF had been used for the featured snippet.

You can see a screenshot below:

Featured Snippet from a PDF showing in Google. © The Webmaster

Google has been indexing PDF files since at least 2011. Google can crawl PDFs, follow the links in them, and generally index the content (apart from images). PDFs can even rank as highly as web pages.
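
The reverse is also worth knowing: if you would rather keep a PDF out of Google’s index, you can serve it with an X-Robots-Tag response header. A minimal sketch for Apache (assuming mod_headers is enabled):

    # Tell crawlers not to index, or follow links in, any PDF on this site.
    <FilesMatch "\.pdf$">
      Header set X-Robots-Tag "noindex, nofollow"
    </FilesMatch>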

I wrote in detail about the indexation of PDFs (including about the featured snippets) here.


Most features in the old Search Console to be removed by the end of March 2019

In a recent blog post, Google has set out what features and changes are likely to be seen in the Search Console by the end of March 2019.

Here is a summary of the main changes:

What's being removed in the old Search Console, and what to use instead in the new Search Console:

  • Crawl errors report and API → Index Coverage report [1]
  • Old Sitemaps report → New Sitemaps report [2]
  • Fetch as Google → URL inspection tool [3]
  • User management → Settings section of the new Search Console
  • Structured data dashboard → Replaced with multiple new reports [4]
  • Property Sets → No replacement
  • HTML suggestions → No replacement
  • Android Apps features → No replacement; most features are now in Firebase
  • Blocked resources report → No replacement; use the URL inspection tool

Notes:

[1] - Google will remove the list of crawl errors, changing the focus to highlighting only the issues that site owners need to fix. Once issues are fixed, site owners can request that the errors be revalidated, which decreases the time it takes to reprocess the changes.

[2] - Google is aiming to bring the remainder of the information from the old report into the new report, specifically for images and video.

[3] - The new URL inspection tool lets you review URLs on your site with far more information provided: you can now view HTTP headers, page resources, the JavaScript console log, and screenshots. In particular, you can compare the user-declared canonical with the canonical Google has chosen (which may not be the same). You can also submit pages for indexing or reprocessing from here.

[4] - New reports have been added, including Jobs, Recipes, Events, and Q&A. Other Structured Data types that are not supported with Rich Results features will be removed.
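
To give an idea of what these reports pick up, here is a minimal, illustrative JSON-LD sketch of the Q&A structured data type (all values are placeholders):

    {
      "@context": "https://schema.org",
      "@type": "QAPage",
      "mainEntity": {
        "@type": "Question",
        "name": "How do I update my PHP version?",
        "text": "My host is still on PHP 5.6. How do I upgrade?",
        "answerCount": 1,
        "acceptedAnswer": {
          "@type": "Answer",
          "text": "Ask your host to switch the account to PHP 7.2, then test your plugins.",
          "upvoteCount": 3
        }
      }
    }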


Focus on content, not backlinks

John Mueller has been hinting that people should focus more on content than on backlinks.

This makes sense. Create great content, and the links will come, right? (Hint: They do.)


Google is aware when domains change ownership

In a question on the Webmaster Central Help Forum, a user asked whether redirecting a domain to a legitimate domain using 301 redirects would have an effect on the reputation of the new domain.

Aaseesh, a Webmaster Trends Analyst at Google, replied:

Google understands when domains change ownership so it won’t necessarily rank for the queries it used to rank pre change of ownership. So if the sole purpose of buying a domain is to get search traffic from the old domain, I would suggest against doing so since there’s no benefit.

Gary Illyes, another Webmaster Trends Analyst at Google, confirmed in July 2016 that 301 redirects no longer lose PageRank.

However, if you redirect your penalty ridden site to a new domain, then the penalty may flow to the new domain because of the redirects.

From a negative SEO perspective, John Mueller has already gone on record saying that Google usually catches those instances where a competitor redirects a penalty-hit site to yours.
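
For reference, the kind of site-wide 301 redirect discussed above usually amounts to a couple of lines of server configuration. A minimal Apache sketch (the domains are placeholders, and mod_rewrite is assumed to be enabled):

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.example$ [NC]
    RewriteRule ^(.*)$ https://new-domain.example/$1 [R=301,L]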


There is no time limit for algorithm changes to take effect

John Mueller, Webmaster Trends Analyst at Google, responded to a user on Twitter asking how long it takes to get out of an algorithmic penalty:

There is no fixed timeframe for algorithmic changes to take effect, past the need to recrawl & reprocess pages from the site. In general, these are not penalties, so instead of just waiting, I’d recommend getting objective input from peers early on, eg by posting in a forum.

In other words, recovery takes as long as Google needs to recrawl and reprocess the affected pages. Given that a site move can take around three months to process, I would imagine you are looking at a minimum of a few months.

Naturally, this assumes you have carried out all the necessary tasks to recover from the algorithmic penalty. John Mueller suggests many webmasters may be under a mistaken impression here, and that a peer review is sometimes helpful.

I recommend posting in the Google Webmaster Central Help Forum. John Mueller is even known to provide feedback there on occasion.


Google tells some Ad customers that it will start optimizing their campaigns

There is an interesting article in Search Engine Land which reports that Google has sent out notices saying it will start optimizing customers’ campaigns unless they opt out within seven days of receiving the email.

Aaron Levy, director of PPC at Elite SEM, tweeted a copy of the email.

In a response to Search Engine Land, Google said:

Our sales teams are always looking for ways to help customers get the best results from Google Ads. We are rolling out a pilot program that we believe will help businesses optimize their accounts. As always, we build customer feedback into the final product. Customers are in full control of the account and can accept or reject recommendations as they desire.

One of the key points made is that customers are still responsible for their accounts, and can either accept or reject the recommendations. Aaron Levy also posted a copy of the disclaimer, which says:

You can review or edit this change anytime by visiting your Google Ads account. Google doesn’t guarantee or promise any particular results from implementing these changes, including impact on your campaign performance or spend. Make sure you monitor your account regularly so you understand what’s happening and can make campaign adjustments.

Is it just me, or does this sound like a terrible idea?

Web Development News

WordPress.com to build a publishing platform for news organizations

WordPress has announced plans to develop a next-generation publishing platform, Newspack by WordPress.com. The new platform is expected to launch near the end of July 2019.

Here is what WordPress had to say about why they are launching the new project:

With many local news organizations struggling to find sustainable models for journalism, we’re seeing a need for an inexpensive platform that provides the technology and support that lets news organizations build their businesses and focus on what they do best — providing critical reporting for their communities.

Our hope with Newspack is to give them a platform where they can continue to focus on what they do best, while we focus on providing world-class technology and support across their editorial and business operations.

Some of the features expected include:

  • Gutenberg support (Blocks)
  • Email integration for both marketing and editorial
  • Programmatic ad integration
  • Analytics
  • Real-time backups
  • Revenue-generating tools for subscriptions and e-commerce

Newspack is currently inviting small and medium-sized digital news organizations (local, single-topic, and general-interest) to apply to be charter participants in the development of the platform.

While there are no charges during the development of the platform, there will be a fee of between $1,000 and $2,000 per month after launch. If you are interested, you can apply here.


Google Chrome changes could ‘destroy’ ad-blockers

In a document posted to a key discussion list for Chrome, Google said it wanted to restrict Origin Access so that host permissions are granted at runtime rather than at install time. Origin Access permissions determine which sites an extension can interact with, and the change will affect many APIs, script injection, cookies, and more.

This will affect a large number of extensions, including ad blockers (for example, uBlock and Ghostery), accessibility extensions, and other website-enhancement extensions.
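
For context, extensions today typically request host permissions up front in their manifest, and the user grants them once at install time. Here is a minimal, illustrative Manifest V2 sketch (the extension name is hypothetical) of the kind of blanket host access a content blocker relies on:

    {
      "manifest_version": 2,
      "name": "Example content blocker",
      "version": "1.0",
      "permissions": [
        "webRequest",
        "webRequestBlocking",
        "<all_urls>"
      ]
    }

Under the proposal, a blanket grant like "<all_urls>" would instead have to be approved by the user at runtime, site by site.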

A statement by ad-blocking developer Ghostery, given to Gizmodo, said:

This would basically mean that Google is destroying ad blocking and privacy protection as we know it.

They pretend to do this for the sake of privacy and browser performance, however in reality, users would be left with only very limited ways to prevent third parties from intercepting their surfing behavior or to get rid of unwanted content.


Gutenberg Phase 2 to Update Core Widgets to Blocks, Classic Widget in Development

There are more changes afoot in the WordPress ecosystem.

WordPress plans to convert widgets to blocks sometime in 2019. There will also be a Classic Widget block to handle third-party widgets that have not yet been converted to blocks.
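
If you have not yet written a block, here is a minimal sketch of how one is registered using the wp.blocks global that WordPress exposes to editor scripts (the block name and markup are hypothetical; the script should be enqueued with the wp-blocks and wp-element dependencies):

    // Register a trivial static block.
    const { registerBlockType } = wp.blocks;
    const { createElement: el } = wp.element;

    registerBlockType('mytheme/notice', {
      title: 'Notice',
      icon: 'megaphone',
      category: 'widgets',
      // What authors see inside the editor:
      edit: () => el('p', { className: 'notice' }, 'Notice (editor view)'),
      // What is saved into the post content:
      save: () => el('p', { className: 'notice' }, 'Notice (front end)'),
    });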

I recommend watching Matt Mullenweg’s State of the Word 2018 talk, where he discusses blocks in detail.

You can find a more detailed discussion on the proposed changes at WPTavern.


Former employee hacks WPML WordPress plugin. Sends email to all users warning of vulnerabilities.

On January 19, 2019, the popular WordPress Multilingual Plugin (WPML) was hacked by an ex-employee who had retained access to WPML via a backdoor.

The ex-employee then sent an email to every user, warning of several vulnerabilities and advising them not to store any personal or otherwise sensitive data in the database.

WPML Hack. © Ben Word

WPML responded on Twitter on January 20th:

We’re very sorry to report that our WEBSITE got hacked. Looks like an ex-employee backdoor. There is NO exploit in the WPML plugin we doublechecked. Payment information was NOT compromised as we don’t store this information. We strongly advise changing your WPML account password.

On January 21, 2019, WPML relaunched their rebuilt WPML.org site (after it was defaced with the same message contained in the email) and again advised everyone using the plugin to change their password.


WordPress to show warnings on servers running outdated PHP versions

WordPress will now show a warning in the admin panel if your web hosting server is running an outdated version of PHP.

WordPress currently supports PHP versions going back to 5.2.4, but will now show a warning to users running PHP 5.5 and below. The warning contains a link to a help article with further information on how site owners can update their PHP version, and what to do before updating (i.e., make backups and check themes and plugins for compatibility using the PHP Compatibility Checker plugin).
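
Plugin and theme developers can guard against old PHP themselves. A minimal sketch of the kind of check a plugin might run on load (the version threshold and message are illustrative):

    <?php
    // Bail out early if the server's PHP is too old for this plugin.
    if ( version_compare( PHP_VERSION, '5.6', '<' ) ) {
        add_action( 'admin_notices', function () {
            echo '<div class="notice notice-error"><p>';
            echo 'This plugin requires PHP 5.6 or newer; you are running ' . esc_html( PHP_VERSION ) . '.';
            echo '</p></div>';
        } );
        return; // Skip loading anything that needs newer PHP.
    }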

You can see a screenshot of the warning below (via ZDNet):

WordPress now shows warnings for PHP 5.5 and below. © ZDNet

It is interesting that WordPress is only warning for PHP 5.5 and below, rather than encouraging users to upgrade to a version that is still supported.

Currently, only PHP 7.1 and above still receive security fixes, so anyone using PHP 7.0 or below may be exposed to unpatched security vulnerabilities. Even PHP 7.1 reaches end of life in December 2019, as the following chart shows:

Currently supported versions of PHP. © php.net

Bob Ross Doing the Floss in Pure CSS

Now for something a little more fun.

I came across an awesome pure-CSS Bob Ross doing the Floss, created by Steve Gardner. The Floss has been around for a while, but it came to prominence through the popular game Fortnite. Pretty cool, right?

Feel free to click the HTML and SCSS tabs to see the code.

See the Pen Bob Ross Doing the Floss, Like a Boss (Pure CSS) by Steve Gardner (@ste-vg) on CodePen.
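
If you are wondering how this kind of movement is built, the core trick in a floss-style animation is two groups of elements swinging in anti-phase on the same timing loop. A minimal, illustrative CSS sketch (the class names are hypothetical, and nothing like the full artwork):

    /* Swing one way, while a second element plays the same loop in reverse. */
    @keyframes swing {
      0%   { transform: translateX(-20px) rotate(6deg); }
      50%  { transform: translateX(20px)  rotate(-6deg); }
      100% { transform: translateX(-20px) rotate(6deg); }
    }

    .arms { animation: swing 0.5s ease-in-out infinite; }
    .hips { animation: swing 0.5s ease-in-out infinite reverse; }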

Web Hosting News

World’s largest web hosting companies hit by security fears

DreamHost, HostGator, OVH, iPage, and Bluehost are among the many web hosting companies affected by security flaws that may put millions of users at risk, according to Paulos Yibelo, a security researcher who recently disclosed dozens of bugs:

All five had at least one serious vulnerability allowing a user account hijack

Paulos Yibelo set out the various vulnerabilities in a detailed post. I have summarized a few of the attack vectors below:

  • BlueHost - Yibelo embedded a malicious piece of JavaScript code on a page full of kittens. As soon as a logged-in BlueHost user landed on that page, after clicking a link on social media or in an email, the JavaScript activated and compromised the user’s BlueHost account via a cross-site request forgery (CSRF) flaw. The attacker could then take over the account, including requesting a new password after entering the attacker’s own email address into the user’s profile. (A sketch of the standard defense against this class of attack follows this list.)
  • DreamHost - In a similar way, DreamHost was vulnerable to a cross-site scripting (XSS) attack, again allowing the attacker to replace the email address and reset the account password.
  • HostGator - Again, a similar CSRF flaw allowed an attacker to bypass the cross-site request countermeasures and then modify the email address and reset the password.
  • iPage - This was a unique attack vector, in that the attacker could craft a web address allowing them to reset the account password to one chosen by the attacker. No existing password was required, making it a very simple, yet effective, attack.
  • OVH - OVH had a couple of vulnerabilities. It too was vulnerable to a CSRF attack, and its API was also vulnerable, allowing important data to be read.
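
As promised above, here is a minimal sketch of the classic defense against CSRF: a per-session token that every state-changing request must echo back, combined with a SameSite cookie. This is an illustrative Node/Express example (all names and routes are hypothetical), not how any of these hosts actually implement it:

    // npm install express express-session
    const express = require('express');
    const crypto = require('crypto');
    const session = require('express-session');

    const app = express();
    app.use(express.urlencoded({ extended: false }));
    app.use(session({
      secret: 'change-me',
      resave: false,
      saveUninitialized: true,
      cookie: { sameSite: 'strict' }, // blocks most cross-site sends of the cookie
    }));

    // Issue one token per session and embed it in every form.
    app.get('/settings', (req, res) => {
      if (!req.session.csrfToken) {
        req.session.csrfToken = crypto.randomBytes(32).toString('hex');
      }
      res.send(`<form method="POST" action="/email">
        <input type="hidden" name="_csrf" value="${req.session.csrfToken}">
        <input name="email"><button>Update email</button></form>`);
    });

    // Reject any POST whose token does not match the session's token.
    app.post('/email', (req, res) => {
      if (req.body._csrf !== req.session.csrfToken) return res.sendStatus(403);
      res.send('Email updated'); // a cross-site form post cannot supply the token
    });

    app.listen(3000);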

If I have learned anything from these attacks, it is to log out of important online accounts as soon as possible, and definitely before browsing other sites or clicking links on social media or in emails. I am pretty sure these are not the only companies vulnerable to attacks of this nature.

All of the companies (except OVH) have now fixed the vulnerabilities.

DreamHost responded:

We currently have a fix in production that should prevent leveraging CSRF from our old panel.dreamhost.com/id/ submit forms and are making efforts to increase security and sanitize inputs across the rest of our endpoints.

EIG (owner of Bluehost, iPage, and HostGator) commented:

I wanted to… let you know that after an internal analysis of the vulnerabilities you shared, we’ve taken steps to address and patch the potential vulnerabilities you identified.


GoDaddy removes JavaScript injection which tracks website performance

In Edition 01 of this newsletter, I reported how Igor Kromin discovered that GoDaddy was injecting a script to provide Real User Metrics (RUM). GoDaddy acknowledged in a help article that some customers might experience issues or slower site performance as a result.

Following an outcry, GoDaddy confirmed that it is no longer automatically opting users into RUM. Here is the full response (via ZDNet):

We created a Real User Metrics (RUM) JavaScript to improve our hosting environment for our customers. The script is a non-invasive performance monitor that enables us to measure and track the performance of customer websites, and collects information, such as connection time and page load time.

We only collect performance data, nothing more. We don’t collect personal information. The data we collect is used to monitor our internal systems, optimize DNS resolution, improve network routing and server configurations, and help us improve the performance of our customers’ websites.

After careful review of the concerns being raised around this program, we have decided to turn off the Javascript insertion on our hosting platform immediately. We will reintroduce this program in the future, so that it is on an opt-in only basis. We apologize for any confusion and inconvenience to our customers.


Domino’s Pizza: A landmark case for Web Accessibility

An important legal case in the US has forced Domino’s Pizza to make its app and website fully accessible to blind users.

The court case followed a complaint by a blind customer who had struggled to change toppings, use vouchers, and complete a purchase via Domino’s iPhone app. He argued that this was in breach of the Americans with Disabilities Act of 1990, which makes it unlawful for businesses to deny people with disabilities access to their goods and services unless doing so would cause an “undue burden” to the business.

Although the complainant lost the case in federal court in 2017, a three-judge appeals panel reversed that decision, finding in his favor.

Compliance

One of the easiest ways to move toward compliance is to use an accessibility-conscious framework such as Bootstrap or Foundation. There are many other accessible frameworks, but these are two of the most popular examples.

The easiest way to test for accessibility of your web pages is to use Google’s own tool at web.dev. You can see the relevant part of the report for The Webmaster below:

Use web.dev to check for accessibility. © The Webmaster
Jonathan Griffin. Editor @ The Webmaster

About the author

Editor, Hosting Expert, SEO Developer, & SEO Consultant.

Jonathan is currently the Editor & CEO at The Webmaster. He is also an SEO Developer offering consultancy services, primarily to other web development companies. He specializes in the technical side of SEO, including site audits, development of SEO related features, and site structure & strategy.

In his spare time, Jonathan has a passion for learning. He regularly undertakes professional courses on subjects ranging from Python and web development to digital marketing and Advanced Google Analytics.

Read more about Jonathan Griffin on our About Page.