In this edition, we cover three Google Algorithm Updates that took place in the last couple of weeks, look at the new link attributes, the updated Quality Raters Guidelines, a free coding boot camp, the new lazy-loading attribute, WordPress security updates, and much more.
In This Issue:
SEO & Search News
Yet another Google Algorithm Update on the 13th of September.
The SERP Trackers are starting to spike again, more so than the recent volatility seen on the 5th of September.
It is difficult to say what the update is targeting, but Law & Government, Autos & Vehicles, Health, and Finance are some of the worst-hit sectors. Google may be getting better at identifying YMYL content, and I comment more about this here.
It looks like another relatively small Google Search Update is rolling out, with a spike on the 10th and the 13th of September. These may be two distinct updates as the sectors targeted appear to differ.
You can read more about this Google Update here.
Google Update on the 5th of September 2019 - Possible rollback?
A further Google Update was released on the 5th of September.
Bill Lambert (the self-proclaimed Google Search Team member) who predicted the 29th August update, indicates that this is a flash rollback:
I am not 100% convinced that this is a rollback as the SEMrush sensor shows a much smaller update, and the categories targeted appear to differ. However, Bill Lambert is becoming a credible source, so despite my doubts, it is worth a mention. I could be wrong.
You can read more about this Google Update here.
You can see the scale of the update in the SERP tracker below:
Large Google Update on the 29th of August 2019
Google released a relatively significant update on the 29th of August, with an inside source claiming that dwell time is now relevant for ranking in the search results pages.
I am not 100% convinced, but the source, Bill Lambert, is credible. He regularly posts comments on Seroundtable informing users of upcoming updates.
You can read more about this Google Update here.
You can see the scale of the update in the SERP tracker below:
Google evolves rel=“nofollow” with two new link attributes
Google has announced two new link attributes that provide webmasters with two new ways of identifying links.
The two new link attributes are as follows:
rel=“sponsored” - The sponsored link attribute should be used to identify links on your site created as part of advertisements, sponsorships or other compensation agreements.
rel=“ugc” - The UGC attribute, which stands for User Generated Content, is recommended for links within user-generated content. Examples include the author link on blog comments and forum posts.
The existing rel=“nofollow” link attribute can still be used where you don’t want to imply any endorsement or ranking credit to another page.
You can also use rel=“nofollow” in place of the sponsored or UGC link attributes. Multiple link attributes can be used in combination.
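As a minimal sketch (the URLs here are placeholders, not from Google's announcement), the attributes look like this in practice:

```html
<!-- A paid or sponsored link -->
<a href="https://example.com/product" rel="sponsored">Partner product</a>

<!-- A link within user-generated content, such as a blog comment -->
<a href="https://example.com/commenter-site" rel="ugc">Commenter's site</a>

<!-- A plain nofollow link, implying no endorsement -->
<a href="https://example.com/page" rel="nofollow">Some page</a>

<!-- Multiple attributes can be combined, separated by spaces -->
<a href="https://example.com/ad" rel="sponsored nofollow">Advert</a>
```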
One important thing to note is that from the 1st of March, 2020, rel=“nofollow” will become “a hint.” To block a page from being crawled, you should use other methods, such as the robots.txt file, a robots meta tag, or password-protecting the page.
No need to disavow nofollow links
Despite rel="nofollow" becoming a hint in March 2020, Google’s John Mueller says that there is still no need to disavow nofollow links.
If you have already disavowed nofollow links, that’s not a problem either.
Also, it doesn't change anything if you disavow nofollow links, so if you have them disavowed, that's not a problem.— 🍌 John 🍌 (@JohnMu) September 12, 2019
Google Updates the Quality Raters Guidelines
Google has recently updated the Quality Raters Guidelines. These are used by Google’s human quality raters to rate individual websites manually.
The rating does not affect the website being rated directly. Google does, however, use the ratings to help improve its algorithm generally.
The document is, therefore, one of the most important documents you can read to let you know what Google wants from your website to rank well.
I’ve highlighted some of the changes below:
1. There is a huge focus on original content
Google is now stressing the need for original reporting in news-based content, specifically reporting that is both in-depth and investigative.
I suspect that news-based sites that rewrite other news articles may start to see their rankings fall.
If you are guilty of this, now is the perfect time to carry out a content audit and check what additional value your article has over other articles.
Google has also updated its requirements for Artistic Content. It would appear that Google wishes to see original, higher-quality images and video in articles.
2. Google updates their definition of Your Money or Your Life (YMYL)
YMYL and EAT (Expertise, Authority, and Trust) are the big things in Search right now. YMYL pages typically require more EAT.
Google has updated its definition of YMYL to include both pages and topics, providing further clarification and examples:
Noindex directive no longer supported in robots.txt file
On the 2nd of July, 2019, Google announced that they were no longer going to support the noindex directive in a robots.txt file.
The new rules were implemented on the 1st of September, 2019:
Just a reminder -- 🗓️ September 1, 2019 is not far away 🕙🕚🕛. Entries like ⛔️ "noindex" in 🤖 robots.txt won't be supported much longer, make sure to use the other options mentioned in the blog post 👇. https://t.co/SO20OHFxOT— Google Search Central (@googlesearchc) August 28, 2019
Instead, you should use one of the following techniques:
- Noindex in robots meta tags
- 404 and 410 HTTP status codes
- Password protection
- Disallow in robots.txt (where the page is not linked to from another page)
- The Search Console Removal tool (for temporary removals)
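For instance, the robots meta tag option is a one-line addition to the page itself (a sketch; the equivalent HTTP header is noted in the comment):

```html
<!-- Place in the <head> of any page you want kept out of Google's index -->
<meta name="robots" content="noindex">

<!-- For non-HTML resources (e.g. PDFs), the equivalent HTTP response header is:
     X-Robots-Tag: noindex -->
```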
The Old Search Console is no more
On the 9th of September 2019, Google said goodbye to the old Search Console.
The new Search Console has been active for some time, but the old console remained available while they moved features over. Now, if you try to visit the old homepage or dashboard, you will be redirected to the relevant page on the new Search Console.
There are still some legacy features that remain to be moved over to the new console, and I discuss this further in the next newsletter item.
Google is still moving legacy features over to the new Search Console
There are still several features that have not yet made it to the new Search Console, including:
- International targeting
- Crawl stats
- URL Parameters
- Web Tools
You can find links to these in the new Search Console under the “Legacy tools and reports” section in the left-hand menu.
As part of Google’s series of new #AskGoogleWebmasters videos on YouTube, John Mueller responded to a question on how to submit a Robots.txt file:
We’re not able to submit robots.txt [in the new Search Console]. I have to switch to the old version. When is there a plan to move it to the new version?
As you’ve probably noticed we’re still moving features and reports over from the old search console to the new search console.
Depending on when you’re watching this video we might already have moved more features over.
In general, our plan with the new Search Console is not to just copy features over, but rather to rethink them along the way: what problems are we trying to help websites with? […]
Using the old search console is of course perfectly fine in the meantime.
Interestingly, I had a look for the robots.txt testing/submission tool and could not find a link to it anywhere, even in the legacy tools section.
I can, however, confirm it is still available at this page.
Google Introduces Auto-DNS verification in the new Search Console
Back in February, Google launched the ability to set up a new domain property that incorporates all the different versions of your website; http, https, www, non-www.
Google received very positive feedback about the new site-wide property, but one common complaint was that verifying your domain ownership by DNS was too complicated.
In response, Google announced on the 3rd of September that it has now simplified the DNS-verification process.
There are still a few steps that you need to undertake, but Google guides you through the process. The steps are fewer than before, but you will need to visit your domain registrar.
If you haven’t set up the site-wide domain property, I highly recommend you do so to gain a complete picture of your domain’s traffic.
Hit by a Google Penalty? Moving your content to a new domain won’t help.
For a long time, it has been well known in the SEO industry that redirecting your site to a new domain will pass any Google penalty to the new domain.
But did you know that this applies even if you don’t redirect the old site to the new? If the content is the same on both sites, Google might treat them the same.
Essentially, you don’t need a 301 redirect for Google to see it as a site move.
Just to state the obvious, since the sites are the same, Google will treat them the same & pick a canonical for the combined signals. You don't necessarily need to do a 301 to have it be seen as a site move.— 🍌 John 🍌 (@JohnMu) August 26, 2019
If you are trying to escape a penalty, the best thing to do is to fix your site. The only alternative, I would imagine, would be to re-write all the content for any new domain.
Google Drops URLs for Breadcrumbs
On the 21st of August, 2019, there was a considerable increase in the number of breadcrumbs appearing in the SERPs (Search Engine Result Pages).
This was caused by Google dropping the URL field in the search results, and replacing it with breadcrumbs.
You do not have to have Schema markup for breadcrumbs for Google to show them. Google will use your URL folder structure as your breadcrumbs.
In the above case, the URL structure differs from the breadcrumb structure following a redesign a few years ago. You can view the actual breadcrumbs for that page here.
It is a little disconcerting that Google has chosen to show breadcrumbs based on the URL structure, rather than my schema markup. I checked the schema markup, and Google confirms it is correct.
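If you want to declare your own trail explicitly, you can do so with BreadcrumbList markup. A minimal JSON-LD sketch (the names and URLs are placeholders) looks like this, though, as noted above, Google may still prefer the URL structure:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Blog", "item": "https://example.com/blog/" },
    { "@type": "ListItem", "position": 2, "name": "SEO", "item": "https://example.com/blog/seo/" }
  ]
}
</script>
```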
Website Traffic is not a Google Ranking Factor
There have been many claims, including from data provider CognitiveSEO, that traffic is a ranking factor. Well, Google has officially debunked that myth:
Hi Suhas! No, traffic to a website isn't a ranking factor. If you're starting to get relevant traffic & users love your site, that's a good start though!— Google Search Central (@googlesearchc) August 26, 2019
Google clarifies how it picks a Canonical URL
A single web page might be accessible by many different URLs. For example:
- http://example.com/page
- https://example.com/page
- https://www.example.com/page
- https://www.example.com/page?ref=twitter
Google only wants to add one of those URLs (the canonical) to the Search Results, so which one do they choose?
In an #AskGoogleWebmasters video, John Mueller says that Google tries to choose the URL based on the site’s preference, before falling back to what is helpful to the user:
- Link rel canonical annotation
- Internal linking
- Url in the sitemap file
- HTTPS URLs
- “Nicer” looking URLs
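The first three signals in the list can be sketched as follows (example.com is a placeholder):

```html
<!-- 1. Canonical link annotation in the <head> of every variant URL -->
<link rel="canonical" href="https://example.com/page/">

<!-- 2. Internal links that consistently point at the preferred URL -->
<a href="https://example.com/page/">Read the full guide</a>

<!-- 3. Only the preferred URL listed in the XML sitemap -->
<url>
  <loc>https://example.com/page/</loc>
</url>
```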
You can view the full video below:
Google Shortens Title Length in the SERPs by 4-5 characters
Mordy Oberstein noticed that Google had dropped the character count of titles in its search results by 4-5 characters around the 7th of September, 2019.
As this is an average, some titles will be longer and some shorter, but you can at least use it as a guide.
I’ve written a detailed guide on SEO title best practices here. I’ve updated it with this information.
Web Development News
Interested in Learning Web Development? Check out this FREE boot camp.
The boot camp is being run by Frontend Masters, one of the most reputable providers of front-end development courses. I am a paid-up subscriber myself.
You can track your progress easily over around 21 hours of video, divided into nine sections, including:
- Introduction to HTML
- Introduction to CSS
- HTML Forms
- Website Embeds & GitHub Pages
- Calculator Project: HTML & CSS
- Build a Game Project: Feed-A-Star-Mole
Once you have signed up, you will get invited to a Discord server where you can ask for help from mentors, and join the thousands of people who have recently signed up.
The best thing is that it is completely free. You can sign up to the Bootcamp here.
A look at the New Image and Iframe Lazy Loading in Chrome
Chrome 76, released in July 2019, features native image lazy-loading. At present, this feature is only available in Chrome. It will be available in Edge once the Chromium-based version is publicly released, and I suspect Firefox will pick it up relatively quickly.
The new feature uses an HTML loading attribute that allows Chromium to defer loading offscreen images and iframes until a user scrolls near them. It supports three values:
- lazy: defer loading the image or iframe until the user scrolls near it.
- eager: load the image or iframe right away.
- auto: let the browser decide whether or not to lazy-load.
Some examples of usage are set out below:
```html
<!-- Lazy-load an offscreen image when the user scrolls near it -->
<img src="unicorn.jpg" loading="lazy" alt=".."/>

<!-- Load an image right away instead of lazy-loading -->
<img src="unicorn.jpg" loading="eager" alt=".."/>

<!-- Browser decides whether or not to lazy-load the image -->
<img src="unicorn.jpg" loading="auto" alt=".."/>

<!-- Lazy-load images in <picture>. <img> is the one driving image
     loading so <picture> and srcset fall off of that -->
<picture>
  <source media="(min-width: 40em)" srcset="big.jpg 1x, big-hd.jpg 2x">
  <source srcset="small.jpg 1x, small-hd.jpg 2x">
  <img src="fallback.jpg" loading="lazy">
</picture>

<!-- Lazy-load an image that has srcset specified -->
<img src="small.jpg" srcset="large.jpg 1024w, medium.jpg 640w, small.jpg 320w"
     sizes="(min-width: 36em) 33.3vw, 100vw" alt="A rad wolf" loading="lazy">

<!-- Lazy-load an offscreen iframe when the user scrolls near it -->
<iframe src="video-player.html" loading="lazy"></iframe>
```
The most important thing when implementing is to ensure that only images below the fold (not in view when the page first loads) are lazy-loaded.
You can read more about the new feature in this article by Addy Osmani, Engineering Manager at Google working on Chrome.
If you use WordPress you can take advantage of the new lazy-loading feature by installing Google’s new lazy-loading plugin.
Note: If you use a table of contents with in-page links, lazy loading can cause reflow problems. You jump to the correct part of the page, and then the images load, leaving you at the wrong part. CSS-Tricks has an article offering some solutions, but they are less than perfect at this time.
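One common mitigation, independent of the solutions in that article, is to give lazy-loaded images explicit dimensions so the browser can reserve their space up front (the file name and sizes below are illustrative):

```html
<!-- width and height let the browser reserve the image's space before
     it loads, so the page doesn't reflow when it finally arrives -->
<img src="diagram.png" width="800" height="450" loading="lazy" alt="Diagram">
```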
WordPress will adopt the New rel=“UGC” in next release
Joost de Valk, the creator of Yoast and WordPress contributor, has said that he will ensure that the new rel=“UGC” link attribute will land in the next WordPress release.
The new UGC link attribute should be used for user generated content, such as comments.
Yeah it's a one line change that I'll make sure lands in the next WordPress release. It's quite easy.— Joost de Valk (@jdevalk) September 10, 2019
I discussed the new link attributes earlier in this newsletter.
Recommended PHP version for WordPress is now PHP 7
As of 20th August, 2019, WordPress updated its official advice for the minimum PHP version to be PHP 7.0. The latest version is currently 7.3.
I’ve been saying for a long time that PHP 7 is far superior to 5.6 (the previous version) in terms of performance, so it is good to see the change.
New Security Release for WordPress - Upgrade now.
WordPress 5.2.3 was released on the 5th of September 2019, the first security release since March.
The release patches a number of vulnerabilities, which you can read more about here.
There are also updated versions of 5.0 and earlier, should you need them.
Caniuse and MDN collaborate on compatibility data
Caniuse has, for the last ten years, been an invaluable tool for developers to check feature availability across browsers.
MDN has long been an excellent resource for developers. Around two years ago, it started reworking its browser compatibility tables, and it now has around 10,500 compatibility tables compared to Caniuse’s 500.
As a result of the collaboration:
MDN will adopt the presentation of Caniuse, incorporate browser-use data, and allow similar filtering.
Caniuse will have access to MDN data and will include a link back to MDN’s more detailed reference guide. For example:
Web Hosting News
SiteGround now supports QUIC
SiteGround has now announced support for QUIC on its servers as standard. To enable QUIC on their Dedicated Servers you will need their Booster Add-on.
QUIC is similar to TCP+TLS+HTTP/2 but is implemented on top of UDP (User Datagram Protocol), which is essentially TCP without all the error checking. QUIC will eventually form the basis of HTTP/3.
This has many benefits:
- UDP packets are received by the recipient more quickly.
- The sender will not have to wait to ensure the packet has been received.
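For context, a server advertises QUIC support to the browser via the Alt-Svc response header. The values below are illustrative of what Google's servers sent at the time, not SiteGround's exact configuration:

```http
Alt-Svc: quic=":443"; ma=2592000; v="46,43"
```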
You can read a little more about QUIC and the future with HTTP/3 here.
Support for QUIC is still very patchy. At the moment, only Chromium-based browsers (Chrome and the latest versions of Edge) support QUIC.
It is not yet supported by CloudFlare, although they do have a waitlist for your CloudFlare domain to be enabled for it.
That being said, you may already be taking advantage of QUIC if you use Google Fonts on your site. See the screenshot below:
You can read more about SiteGround here.
SiteGround launches New Client Area and Site Tools
SiteGround has recently revamped its Client Area, and added some new Site Tools.
The new dashboard, a completely custom replacement for cPanel, looks very modern:
To promote the new dashboard and features, SiteGround ran an in-depth webinar, which is worth watching:
I have negotiated a special deal with SiteGround to give you up to 70% off your first invoice.
GoDaddy gets patent for “Portfolio-based domain name recommendations”
The new patent provides a more complex way for GoDaddy to show domain recommendations to its customers.
There are several examples in the patent, including:
- Limiting the number of times a domain is shown to the customer. After a domain has been shown multiple times, GoDaddy may decide the customer is not interested in it.
- Showing domain suggestions related to those already in the customer’s portfolio.
- Adding labels to domain suggestions, such as “Good for an Email Address” or “Improves SEO for website”.
Below is one of the diagrams from the patent, which clearly shows the methodology behind GoDaddy’s thinking:
Jonathan Griffin Editor, SEO Consultant, & Developer.
Jonathan Griffin is The Webmaster's Editor & CEO, managing day-to-day editorial operations across all our publications. Jonathan writes about Development, Hosting, and SEO topics for The Webmaster and The Search Review with more than nine years of experience. Jonathan also manages his own SEO consultancy, offering SEO developer services. He is an expert on site-structure, strategy, Schema, AMP, and technical SEO. You can find Jonathan on Twitter as @thewebmastercom.