Webmaster News - Edition 06 - September 16, 2019.

In this edition, we cover 3 Google Algorithm Updates that took place in the last couple of weeks, look at the new link attributes, the updated Quality Raters Guidelines, free coding boot camp, the new Lazy Loading attribute, WordPress Security updates, and much more.

In This Issue:

SEO & Search News

Yet another Google Algorithm Update on the 13th of September.

The SERP Trackers are starting to spike again, more so than the recent volatility seen on the 5th of September.

It is difficult to say what the update is targeting, but Law & Government, Autos & Vehicles, Health, and Finance are some of the worst-hit sectors. Google may be getting better at identifying YMYL content, and I comment more about this here.

SEMrush Sensor 13th of September 2019
SEMrush Sensor 13th of September 2019 © SEMrush.

It looks like another relatively small Google Search Update is rolling out, with a spike on the 10th and the 13th of September. These may be two distinct updates as the sectors targeted appear to differ.

You can read more about this Google Update here.


Google Update on the 5th of September 2019 - Possible rollback?

A further Google Update was released on the 5th of September.

Bill Lambert (the self-proclaimed Google Search Team member), who predicted the 29th of August update, indicates that this is a flash rollback:

Mozcast 5th of September 2019.
Bill Lambert. © Seroundtable.

I am not 100% convinced that this is a rollback as the SEMrush sensor shows a much smaller update, and the categories targeted appear to differ. However, Bill Lambert is becoming a credible source, so despite my doubts, it is worth a mention. I could be wrong.

You can read more about this Google Update here.

You can view the scale of the update in the SERP tracker below:

SEMrush Sensor 5th of September 2019
SEMrush Sensor 5th of September 2019 © SEMrush.

Large Google Update on the 29th of August 2019

Google released a relatively significant update on the 29th of August, with an inside source claiming that dwell time is now relevant for ranking in the search results pages.

I am not 100% convinced, but the source, Bill Lambert, is credible. He regularly posts comments on Seroundtable informing users of upcoming updates.

Google Update was predicted several weeks prior with dwell time targeted.
Google Update was predicted several weeks prior with dwell time targeted. © GPWA.

You can read more about this Google Update here.

You can view the scale of the update in the SERP tracker below:

SEMrush Sensor 30th of August 2019
SEMrush Sensor 30th of August 2019 © SEMrush.

Google announces two new link attributes: sponsored and UGC

Google has announced two new link attributes that provide webmasters with two additional ways of identifying links.

The two new link attributes are as follows:

  • rel="sponsored" - The sponsored link attribute should be used to identify links on your site created as part of advertisements, sponsorships, or other compensation agreements.

  • rel="ugc" - The UGC attribute, which stands for User Generated Content, is recommended for links within user-generated content. Examples include the author links on blog comments and forum posts.

The existing rel="nofollow" link attribute can still be used where you don’t want to imply any endorsement or pass ranking credit to another page.

You can also use rel="nofollow" in place of the sponsored or UGC link attributes, and multiple link attributes can be combined, as shown in the example below.
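
For illustration, here is a minimal sketch of how the attributes might look in practice (the URLs and link text are placeholders, not real examples from any site):

<!-- Paid or sponsored placement -->
<a href="https://example.com/product" rel="sponsored">Sponsor's product</a>

<!-- Link left in user-generated content, such as a blog comment -->
<a href="https://example.com/" rel="ugc">Commenter's website</a>

<!-- Attribute values can be combined in a single rel attribute -->
<a href="https://example.com/" rel="ugc nofollow">Commenter's website</a>

<!-- The classic nofollow still works on its own -->
<a href="https://example.com/page" rel="nofollow">No endorsement implied</a>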

One important thing to note is that from the 1st of March, 2020, rel="nofollow" will become “a hint.” To prevent a page from being crawled or indexed, you should use other methods instead, such as the robots.txt file, a robots meta tag, or password protection.


Despite rel="nofollow" becoming a hint in March 2020, Google’s John Mueller says that there is still no need to disavow nofollowed links.

If you have already disavowed nofollow links, that’s not a problem either.


Google Updates the Quality Raters Guidelines

Google has recently updated the Quality Raters Guidelines. These are used by Google’s Search Quality Raters to rate individual websites manually.

The rating does not affect the website being rated directly. Google does, however, use the ratings to help improve its algorithm generally.

The guidelines are, therefore, one of the most important documents you can read to understand what Google wants from your website in order to rank well.

I’ve highlighted some of the changes below:

1. There is a huge focus on original content

Google is now stressing the need for original reporting in news-based content, favoring work that is both in-depth and investigative.

I suspect that news-based sites that rewrite other news articles may start to see their rankings fall.

If you are guilty of this, now is the perfect time to carry out a content audit and check what additional value your article has over other articles.

Quality Raters Guidelines - Focus on Original Content.
Quality Raters Guidelines - Focus on Original Content. © Google.

In the second bullet point above, Google also updates its requirements for Artistic Content. It would appear that Google wishes to see original, higher-quality images and video in articles.

2. Google updates their definition of Your Money or Your Life (YMYL)

YMYL and EAT (Expertise, Authority, and Trust) are the big topics in Search right now. YMYL pages typically require more EAT.

Google has updated its definition of YMYL to include both pages and topics, and to provide further clarification and examples:

Quality Raters Guidelines - Updated definition.
Quality Raters Guidelines - Updated definition. © Google.

No-index Directive no longer supported in robots.txt file

On the 2nd of July, 2019, Google announced that it would no longer support the noindex directive in a robots.txt file.

The new rules were implemented on the 1st of September, 2019.

Instead, you should use one of the following techniques (a robots meta tag example follows the list):

  • Noindex in robots meta tags
  • 404 and 410 HTTP status codes
  • Password protection
  • Disallow in robots.txt (where the page is not linked to from another page)
  • Search Console removal tool can be used for temporary removals
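
For reference, here is a minimal sketch of the robots meta tag approach; the tag goes in the page's <head> (the second variant is optional and also asks crawlers not to follow links on the page):

<!-- Keep this page out of the index -->
<meta name="robots" content="noindex">

<!-- Keep the page out of the index and ask crawlers not to follow its links -->
<meta name="robots" content="noindex, nofollow">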

The Old Search Console is no more

On the 9th of September 2019, Google said goodbye to the old Search Console.

The new Search Console has been active for some time, but the old console remained available while they moved features over. Now, if you try to visit the old homepage or dashboard, you will be redirected to the relevant page on the new Search Console.

There are still some legacy features that remain to be moved over to the new console, and I discuss this further in the next newsletter item.


Google is still moving legacy features over to the new Search Console

There are still several features that have not yet made it to the new Search Console, including:

  • International targeting
  • Removals
  • Crawl stats
  • Messages
  • URL Parameters
  • Web Tools

You can find links to these in the new Search Console under the “Legacy tools and reports” section in the left-hand menu.

Search Console Legacy Tools and Reports.
Search Console Legacy Tools and Reports. © The Webmaster

As part of Google’s series of new #AskGoogleWebmasters videos on YouTube, John Mueller responded to a question on how to submit a Robots.txt file:

Question:

We’re not able to submit robots.txt; I have to switch to the old version. When is it planned to move to the new version?

Answer:

As you’ve probably noticed, we’re still moving features and reports over from the old Search Console to the new Search Console.

Depending on when you’re watching this video, we might already have moved more features over.

In general, our plan with the new Search Console is not to just copy features over, but rather to rethink them along the way: what problems are we trying to help websites with? […]

Using the old Search Console is, of course, perfectly fine in the meantime.

Interestingly, I had a look for the robots.txt testing/submission tool and could not find a link to it anywhere, even in the legacy tools section.

I can, however, confirm it is still available at this page.


Google Introduces Auto-DNS verification in the new Search Console

Back in February, Google launched the ability to set up a new domain property that incorporates all the different versions of your website; http, https, www, non-www.

Google received very positive feedback about the new site-wide property, but one common complaint was that verifying your domain ownership by DNS was too complicated.

In response, Google announced on the 3rd of September that it has now simplified the DNS verification process.

There are still a few steps that you need to undertake, but Google guides you through the process. The steps are fewer than before, but you will need to visit your domain registrar.

Google Introduces Auto-DNS verification.
Google Introduces Auto-DNS verification. © Google

If you haven’t set up the site-wide domain property, I highly recommend you do so to gain a complete picture of your domain’s traffic.


Hit by a Google Penalty? Moving your content to a new domain won’t help.

For a long time, it has been well known in the SEO industry that redirecting your site to a new domain will pass any Google penalty to the new domain.

But did you know that this applies even if you don’t redirect the old site to the new? If the content is the same on both sites, Google might treat them the same.

Essentially, you don’t need a 301 redirect for Google to see it as a site move.

If you are trying to escape a penalty, the best thing to do is to fix your site. The only alternative, I would imagine, would be to re-write all the content for any new domain.


Google Drops URLs for Breadcrumbs

On the 21st of August, 2019, there was a considerable increase in the number of breadcrumbs appearing in the SERPs (Search Engine Result Pages).

Sharp increase in number of Breadcrumbs showing in SERPs.
Sharp increase in number of Breadcrumbs showing in SERPs. © RankRanger

This was caused by Google dropping the URL field in the search results and replacing it with breadcrumbs.

You do not have to have Schema markup for breadcrumbs for Google to show them. Google will use your URL folder structure as your breadcrumbs.

Breadcrumbs in SERPs.
Breadcrumbs in SERPs. © The Webmaster

In the above case, the URL structure differs from the breadcrumb structure following a redesign a few years ago. You can view the actual breadcrumbs for that page here.

It is a little disconcerting that Google has chosen to show breadcrumbs based on the URL structure, rather than my schema markup. I checked the schema markup, and Google confirms it is correct.
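
If you want to add or check breadcrumb markup yourself, a minimal BreadcrumbList in JSON-LD looks something like the sketch below (the page names and URLs are hypothetical, not the markup used on this site):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Web Development",
      "item": "https://example.com/web-development/"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Lazy Loading",
      "item": "https://example.com/web-development/lazy-loading/"
    }
  ]
}
</script>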


Website Traffic is not a Google Ranking Factor

There have been many claims, including from data provider CognitiveSEO, that traffic is a ranking factor. Well, Google has officially debunked that myth:


Google clarifies how it picks a Canonical URL

A single web page might be accessible by many different URLs. For example:

  • https://www.thewebmaster.com/
  • http://www.thewebmaster.com/
  • https://thewebmaster.com/
  • https://www.thewebmaster.com/?utm_campaign=autumn-content&utm_medium=social&utm_source=Twitter

Google only wants to add one of those URLs (the canonical) to the Search Results, so which one do they choose?

In an #AskGoogleWebmasters video, John Mueller says that Google tries to choose the URL based on the site’s preference, before falling back to what is helpful to the user (a canonical markup example follows the list):

  • Link rel canonical annotation
  • Redirects
  • Internal linking
  • URL in the sitemap file
  • HTTPS URLs
  • “Nicer” looking URLs
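
As an illustration of the first signal, the canonical annotation is a single link element placed in the page's <head>, pointing at the version of the URL you prefer:

<!-- Tell search engines which version of this page you prefer to be indexed -->
<link rel="canonical" href="https://www.thewebmaster.com/">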

You can view the full video below:


Google Shortens Title Length in the SERPs by 4-5 characters

Mordy Oberstein noticed that Google had dropped the character count of titles in its search results by 4-5 characters around the 7th of September, 2019.

Title Tags length changes - September 2019
Title Tags length changes - September 2019 © Moz

As this is an average, some titles will be longer and some shorter, but you can at least use it as a guide.

I’ve written a detailed guide on SEO title best practices here. I’ve updated it with this information.

Web Development News

Interested in Learning Web Development? Check out this FREE boot camp.

The boot camp is being run by Frontend Masters, one of the most reputable providers of front-end development courses. I am a paid-up subscriber myself.

The boot camp is a crash course on HTML, CSS, and JavaScript. By the end of the course, you will have built your own portfolio site, a calculator, and a Feed-A-Star-Mole Game.

You can track your progress easily over around 21 hours of video, divided into nine sections:

  • Introduction to HTML
  • Introduction to CSS
  • HTML Forms
  • Website Embeds & GitHub Pages
  • Calculator Project: HTML & CSS
  • Introduction to JavaScript
  • Calculator Project: JavaScript
  • Using JavaScript in Websites
  • Build a Game Project: Feed-A-Star-Mole

Once you have signed up, you will get invited to a Discord server where you can ask for help from mentors, and join the thousands of people who have recently signed up.

The best thing is that it is completely free. You can sign up to the Bootcamp here.


A look at the New Image and Iframe Lazy Loading in Chrome

Chrome now features native image and iframe lazy-loading, which shipped with Chrome 76 at the end of July 2019. At present, this feature is only available in Chrome; it will come to Edge once the Chromium-based version is publicly released, and I suspect Firefox will pick it up relatively quickly.

The new feature uses an HTML loading attribute that allows the browser to defer loading offscreen images and iframes until a user scrolls near them. It supports three values:

  • lazy: defer loading the image or iframe until the user scrolls near it.
  • eager: load the image or iframe right away, regardless of its position on the page.
  • auto: let the browser decide whether or not to load lazily.

Some examples of usage are set out below:

<!-- Lazy-load an offscreen image when the user scrolls near it -->
<img src="unicorn.jpg" loading="lazy" alt=".."/>

<!-- Load an image right away instead of lazy-loading -->
<img src="unicorn.jpg" loading="eager" alt=".."/>

<!-- Browser decides whether or not to lazy-load the image -->
<img src="unicorn.jpg" loading="auto" alt=".."/>

<!-- Lazy-load images in <picture>. <img> is the one driving image 
loading so <picture> and srcset fall off of that -->
<picture>
  <source media="(min-width: 40em)" srcset="big.jpg 1x, big-hd.jpg 2x">
  <source srcset="small.jpg 1x, small-hd.jpg 2x">
  <img src="fallback.jpg" loading="lazy">
</picture>

<!-- Lazy-load an image that has srcset specified -->
<img src="small.jpg"
     srcset="large.jpg 1024w, medium.jpg 640w, small.jpg 320w"
     sizes="(min-width: 36em) 33.3vw, 100vw"
     alt="A rad wolf" loading="lazy">

<!-- Lazy-load an offscreen iframe when the user scrolls near it -->
<iframe src="video-player.html" loading="lazy"></iframe>

The most important thing when implementing is to ensure that only images below the fold (not in view when the page first loads) are lazy-loaded.

You can read more about the new feature in this article by Addy Osmani, Engineering Manager at Google working on Chrome.

If you use WordPress you can take advantage of the new lazy-loading feature by installing Google’s new lazy-loading plugin.

Note: If you use a table of contents with in-page links, lazy loading can cause reflow problems: you jump to the correct part of the page, and then the images above load and push the content down, leaving you at the wrong place. CSS-Tricks has an article offering some solutions, but they are less than perfect at this time.
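
One partial mitigation, which is my own suggestion rather than something from the CSS-Tricks article, is to give lazy-loaded images explicit width and height attributes so the browser can reserve their space before they load (this only helps where your CSS does not override those dimensions; the image file and sizes below are hypothetical):

<!-- Reserving the image's box up front reduces layout shifts when it finally loads -->
<img src="diagram.jpg" width="800" height="450" loading="lazy" alt="Example diagram">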


WordPress will adopt the new rel="ugc" attribute in its next release

Joost de Valk, the creator of Yoast and a WordPress contributor, has said that he will ensure that the new rel="ugc" link attribute lands in the next WordPress release.

The new UGC link attribute should be used for links within user-generated content, such as comments.

I discussed the new link attributes earlier in this newsletter.


WordPress updates its minimum PHP version advice to 7.0

As of the 20th of August, 2019, WordPress updated its official advice for the minimum PHP version to PHP 7.0. The latest PHP version is currently 7.3.

I’ve been saying for a long time that PHP 7 is far superior to 5.6 (the previous version) in terms of performance, so it is good to see the change.


New Security Release for WordPress - Upgrade now.

WordPress 5.2.3 was released on the 5th of September 2019, the first security release since March.

The release patches a number of vulnerabilities, which you can read more about here.

There are also updated versions of 5.0 and earlier, should you need them.


Caniuse and MDN collaborate on compatibility data

Caniuse has, for the last ten years, been an invaluable tool for developers to check feature availability across browsers.

MDN has long been an excellent resource for developers. Around two years ago, it started redoing its browser compatibility tables, and it now has 10,500 compatibility tables compared to Caniuse’s 500.

As a result of the collaboration:

  • MDN will adopt the presentation of Caniuse, incorporate browser-use data, and allow similar filtering.

  • Caniuse will have access to MDN data and will include a link back to MDN’s more detailed reference guide. For example:

    Caniuse is now linked with MDN data.
    Caniuse is now linked with MDN data. © The Webmaster.

You can read more about the collaboration here.

Web Hosting News

SiteGround now supports QUIC

SiteGround has now announced support for QUIC on its servers as standard. To enable QUIC on their Dedicated Servers, you will need their Booster Add-on.

QUIC is similar to TCP+TLS+HTTP/2 but is implemented on top of UDP (User Datagram Protocol), which is essentially TCP without all the error checking. HTTP over QUIC will eventually evolve into HTTP/3.

This has many benefits:

  • UDP packets are received by the recipient more quickly.
  • The sender will not have to wait to ensure the packet has been received.

You can read a little more about QUIC and the future with HTTP/3 here.

Support for QUIC is still very patchy. At the moment, only Chromium-based browsers (Chrome and the latest versions of Edge) support QUIC.

It is not yet supported by Cloudflare, although they do have a waitlist for your Cloudflare domain to be enabled for it.

That being said, you may already be taking advantage of QUIC if you use Google Fonts on your site. See the screenshot below:

Google Fonts use QUIC.
Google Fonts use QUIC. © The Webmaster.

You can read more about SiteGround here.


SiteGround launch New Client Area and Site Tools

SiteGround has recently revamped its Client Area and added some new Site Tools.

The new dashboard, which replaces cPanel with a completely custom implementation, looks very modern:

SiteGround's new Client Area
SiteGround's new Client Area. © The Webmaster

To promote the new dashboard and features, SiteGround ran an in-depth webinar, which is worth watching:

I have negotiated a special deal with SiteGround to give you up to 70% off your first invoice.

Use this link to take you to the special landing page.


GoDaddy gets patent for “Portfolio-based domain name recommendations”

The new patent provides a more complex way for GoDaddy to show domain recommendations to its customers.

There are several examples in the patent, including:

  • Limiting the number of times a domain is shown to the customer. After seeing a domain multiple times, GoDaddy may decide the customer is not interested in it.
  • Showing domain suggestions that are related to the ones already in the customer’s portfolio.
  • Adding labels to domain suggestions, such as “Good for an Email Address” or “Improves SEO for website”.

Below is one of the diagrams in the patent that clearly shows the methodology behind GoDaddy’s thinking:

GoDaddy Patent - Portfolio-based domain name recommendations.
GoDaddy Patent - Portfolio-based domain name recommendations. © GoDaddy

If you are having difficulty choosing a domain name, I recommend reading this guide, and also checking my list of top domain name generators.

Jonathan Griffin. Editor @ The Webmaster

About the author

Editor, Hosting Expert, SEO Developer, & SEO Consultant.

Jonathan is currently the Editor & CEO at The Webmaster. He is also an SEO Developer offering consultancy services, primarily to other web development companies. He specializes in the technical side of SEO, including site audits, development of SEO related features, and site structure & strategy.

In his spare time, Jonathan has a passion for learning. He regularly undertakes professional courses on subjects ranging from Python and web development to digital marketing and Advanced Google Analytics.

Read more about Jonathan Griffin on our About Page.