Webmaster News - Edition 05 - March 20, 2019.

By Jonathan Griffin. Editor, SEO Consultant, & Developer.

· 24 min read

In Edition 05, Google confirms that Site Speed is a ranking factor, a major Core Google Update hits sites globally, I look at two security vulnerabilities affecting Chrome and WordPress, plus an emulator for the very first web browser, and much more.

In This Issue:

SEO & Search News

“March 2019 Core Update” - Major Google Update March 12, 2019

March 12, 2019, saw Google launch their first major “named” update of the year. It has been called the “March 2019 Core Update” by Google.

The impact of the update is significant, and many people who were hit by Medic Update on August 1, 2018, have reported seeing recoveries. However, it does not appear to be a reversal of that earlier update.

Google has said explicitly that you cannot do anything to recover from this update, and that it is a change on their end. It is likely to be related to how Google interprets search queries.

I have written in-depth about this Google Update here.

You can see the scale of the update in the SERP tracker below:

March 2019 Core Update - SEMrush Sensor. CREDIT: THE WEBMASTER.

Google Algorithm Update March 5, 2019

A relatively minor update (compared to the March 12 update) hit on March 5, 2019.

SEMrush Sensor 5th of March 2019 in the US. CREDIT: SEMRUSH.

You can read more about the update in my post here.

You may not rank for Keyword-style Brand Names by default

John Mueller has confirmed that you will not necessarily rank for your Brand Name if you choose a generic, keyword-based name as your brand.

If your brand name is unique and does not rank in Google, then it could be a sign of a penalty being applied to your site.

This was the case with ashleymadison.com back in 2017, which you can read more about here.

Google’s Gary Illyes provides Google Image Search Ranking Advice

A few weeks ago at PubCon, Google’s Gary Illyes presented a keynote speech in which he discussed image search, and how they were putting a lot more resources into improving the results.


Google: How We Select Dates For Search Results Snippets

A recent Google blog post reveals some interesting detail about how Google selects the date that shows in the SERP snippet.

Essentially, Google recommends that webmasters should:

  • Show a clear date, displayed prominently on the page.
  • Use structured data, such as datePublished and dateModified.

The post continues with some best practices, such as:

  • Show when the page is updated. You can use the publish date and updated date, or just the updated date. What I do is have both dates in the structured data, but only show the most recent date on the page.
  • Use the right timezone.
  • Be consistent in usage.
  • Follow Google’s structured data guidelines.
  • Troubleshoot by minimizing other dates on the page.
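Putting those recommendations together, here is a minimal sketch of what such markup could look like (the headline and both dates are placeholders, not values from any real page):

```html
<!-- Hypothetical Article markup; the headline and dates are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Post",
  "datePublished": "2019-03-01T09:00:00+00:00",
  "dateModified": "2019-03-18T14:30:00+00:00"
}
</script>
```

The date you display visibly on the page should then match the dateModified value, so Google sees one consistent, prominent date.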

I have noticed that when I have a date in the title of a post, Google will sometimes initially use that date before adjusting to the correct one a few days later. In other words, Google can get confused if you have other prominently displayed dates on the page.

Google Says All Separate Mobile URL Sites Should Move To Responsive

John Mueller said on Reddit that he believes that all those using a separate mobile site should think about moving to a responsive web design:

With mobile first indexing, I’d recommend integrating the mobile version just as well as you would the desktop version.

That means adding the alternate-amphtml link to the mobile version (pointing at the amp version), and including a reference to the mobile version from the amp version.

That said, at some point all of these sites with separate mobile URLs should just move to a responsive design anyway, which makes all of this moot. (Separate mobile URLs makes everything much harder than it needs to be)

Search Quality Raters Don’t Directly Impact Rankings

For many years Google has been saying that they only use Search Quality Raters to evaluate search results and that they do not affect your site’s rankings in any way.

Matthew Woodward disagrees with this contention and recently published a blog post explaining why, based on Google Analytics data.

This caused a bit of a storm, and several SEO professionals, along with Google’s John Mueller, waded into the argument.

Of course, conspiracy theorist (:sarcasm:) Rand Fishkin (Moz founder) had a particularly snarky comment.

I’ll have to side with John Mueller on this one. Google has made it very clear on multiple occasions, and realistically, I don’t see how Google can regularly manually rate all web pages on the internet and expect to get decent search results.

It just isn’t practical, and I think that should be obvious to everyone.

It is quite interesting that Bing also decided to clarify their position.

Two more Google JavaScript SEO Videos released

Google has released two more videos in their JavaScript SEO series:

When does JavaScript SEO matter?

This video mainly talks about sites that use JavaScript to dynamically display HTML content and those that only use JavaScript for functionality or effects.

Essential JavaScript SEO tips.

This contains a few tips to ensure that Google can crawl your site effectively, such as pre-rendering JavaScript-based content on the Server, among other things.

Sharing the same IP address with other websites is not a concern in most cases

In a recent Google Hangout, John Mueller confirmed that sharing an IP address with other sites should not be a problem, even if one of the sites sharing that IP receives a penalty.

Mueller confirmed that there might be some edge cases whereby almost all of the sites on an IP are spammy, and in those cases, it may be difficult to separate the good sites from the bad ones.

I believe that Google is much more sophisticated at identifying spammy link networks, or spammy sites owned by the same person. Just looking at how SEMrush can distinguish link networks in their backlink audit, I suspect that a lot of different flags would need to trigger before Google takes action against a whole IP.

For instance, SEMrush can identify link networks by Mirror Pages, IP, Google Analytics IP, Whois info, and Adsense ID.

Furthermore, with the use of services such as CloudFlare so prolific, it is very easy to hide a server’s origin IP from Google.

Using actual authors’ names on blog posts is better than a generic author

Google’s John Mueller has confirmed that having a real author’s name on blog posts might have an indirect influence on rankings as it is more user-friendly than generic authors such as “Admin.”

Mueller’s reference to the indirect effect on rankings is probably about gaining more backlinks, as he said that people would be more likely to recommend a post that has a real author name on it.

You can view the full conversation here.

John Mueller didn’t mention it, but if you have been following any of my reports on Google Updates lately, you will know that E-A-T, or Expertise, Authority, and Trust, has been a significant talking point.

I believe that having a named author, with a detailed author bio on the site, is beneficial to demonstrate E-A-T. This is not something you can do with a generic author.

Hiding content using display:none in the stylesheet for responsive design is OK

There was an interesting conversation in a recent Google Hangout over whether it is acceptable to hide part of the content in a responsive, mobile-first design.

John Mueller said that it is OK from Google’s point of view to use display:none to do this.

However, if you deliberately try to mislead Google by having white text on a white background to hide content, that is not OK.
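To illustrate the kind of responsive hiding Mueller was asked about, here is a minimal sketch (the class name and breakpoint are made up for the example):

```html
<!-- Hypothetical example: the sidebar is hidden on narrow viewports via display:none. -->
<style>
  @media (max-width: 640px) {
    .sidebar { display: none; }
  }
</style>
<div class="sidebar">Related links shown only on desktop.</div>
```

The content is still present in the HTML that Google crawls; it is simply not rendered at small viewport widths, which is what makes this acceptable.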

I think that the mobile version should be substantially the same as the desktop version of a website. Google has already shown they are willing to send out notices for those using AMP where the AMP version differs from the desktop version.

Furthermore, from experience, hiding content behind tabs, etc. (which often use display:none) doesn’t perform as well as non-tabbed content. It’s something I am in the process of remedying on this site (i.e., by removing tabbed content).

Google says tabbed content is taken into account, and this may be so. But in my experience, when you search for an exact string from that hidden tabbed content, Google doesn’t recognize it, or at least doesn’t show it in the search snippet.

You can view the relevant part of the discussion below:

Low traffic pages are not automatically low quality

There have been many discussions around the web surrounding the deletion of low-quality pages in an attempt to increase your site’s quality score.

The default answer by Google has been to update low-quality content, not delete it. This, however, is not always possible, especially if you have many hundreds of old (no longer relevant) posts.

John Mueller’s advice in a recent Google Hangout is to ensure that such pages really are low quality.

Some pages may have low traffic, but that does not necessarily make them low quality. It could be that they target unpopular keywords.

Mueller recommends the following:

  • Use a combination of metrics before deleting any content
  • Try not to remove the content. Instead, try other solutions, such as 301 redirecting to related pages, combining multiple pages, or updating the content.

Pages don’t get any ranking bonus for being crawled more often

John Mueller was asked the following question in a Google Hangout:

How to fix the crawl frequency of low priority pages within a website. Will Google crawl more of such pages because the quantity of these pages is more compared to the important pages?

You will find below a summary of the main points made in reply:

  • If a page rarely changes, it is OK for Google to crawl it once every few months.
  • There is no need to improve the crawl rate unless your pages are crawled less frequently than they are updated.
  • If you want pages to be crawled more frequently, you should link to them more within the website, and add the last modification date to your sitemap.
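For the sitemap suggestion, that means populating the lastmod element for each URL, roughly like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Hypothetical URL; lastmod should reflect when the content actually changed. -->
    <loc>https://www.example.com/low-priority-page/</loc>
    <lastmod>2019-03-15</lastmod>
  </url>
</urlset>
```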

Mueller’s reply was quite in-depth, so I recommend you watch the full discussion:

Redirecting a website hit with a manual action will not remove the penalty

John Mueller was asked in a recent Google Hangout (40:11) whether link or content penalties can be transferred through a redirected site.

Mueller’s reply came in two parts. Firstly, whether it is your site hit by the penalty:

If yours is that spammy website and you’re redirecting to another website to try to escape that penalty then probably we will be able to follow that site migration and apply that manual action or algorithmic action to the new website.

As well so my recommendation there would be instead of trying to get away by doing fancy redirects or other types of site moves I would recommend just cleaning up the issues.

Secondly, whether some random site hit with a penalty is redirecting to yours:

If a random spammy website redirects to your website that’s usually something we can recognize and just ignore.

Ultimately, it is clear that penalties transfer through redirects, and you are better off fixing the issues rather than creating a new site and redirecting the old one to the new.

Site speed is important for Google

In the latest Google Hangout, John Mueller reminded everyone that Site Speed is important. I have written previously about how Google now uses Site Speed as a Ranking Factor in the Mobile Search Results Pages.

In the Google Hangout, Mueller made the following points:

  • In the past, Google used to say that as long as a page load was not “ridiculously” long, you were fine. This is no longer the case.
  • John Mueller recommends you now take site speed seriously. It is a ranking factor.
  • There are tools that provide you with objective measures you can work on, such as web.dev.
  • Google can determine how quickly the page is “generally accessible”. It sounds like Google is using metrics like “Time to Interactive”, or “First Meaningful Paint” to determine page load times. You can see these metrics on Google’s own tool at web.dev, linked above.

You can view the full discussion below:

It’s Perfectly Fine for your Title tag and H1 to be the same

John Mueller was asked whether there is an issue with having your meta title identical to your H1 tag. His reply was short and to the point:

That’s perfectly fine, yeah.

It may be perfectly fine, but I am not sure it is the best practice, and many site audit tools, such as SEMrush, agree:

It is a bad idea to duplicate your title tag content in your first-level header. If your page’s <title> and <h1> tags match, the latter may appear over-optimized to search engines.

Also, using the same content in titles and headers means a lost opportunity to incorporate other relevant keywords for your page.

You can read more about Title Tags here.

Use Videos to Support your Main content, Not Replace It

John Mueller was asked in a Hangout (46:25) whether it was OK to use Google Hangout videos in blog posts.

Mueller expanded on the conversation saying that if you use an embedded video, it should be supported by other content on the page to show up in Google:

I’d be cautious about using just a video as the primary piece of content on a web page and you should really work to kind of use the video in a way that supports your primary content but not that it replaces your primary.

So for example I wouldn’t take any of these videos and just put them on a blog post and add a title to them and expect them to show up in search.

But if you have specific content around that video. If you have a transcription or have some comments to the content that is shown in the video or you’re using that video as kind of a point of reference with regards to your content I think that’s that’s a perfectly fine approach.

But just purely using a video on a page is something that makes it really hard for us to determine what is actually useful on this page and why should we show it in the search results.

Don’t use QAPage Structured Data Markup for FAQs

In a recent Hangout (58:56), Google warned against using QAPage markup, which is intended for pages with multiple answers to a single question, for other types of questions.

John Mueller said they don’t currently support simple, fixed FAQ-type questions, such as the example below, but this is something they may do in the future.

The FAQs I currently use around the site are as follows:

What is an example question?

I don’t know, but this is an example answer.

And here is the markup:

<div itemscope="" itemtype="https://schema.org/FAQPage">
  <div class="card mb-4" itemscope="" itemtype="https://schema.org/Question">
    <div class="card-header question" itemprop="name">
      <div id="what-is-an-example-question" class="m-0 p-0">
        <strong>What is an Example Question?</strong>
      </div>
    </div>
    <div class="card-body answer small" itemprop="acceptedAnswer" itemscope="" itemtype="https://schema.org/Answer">
      <div itemprop="text">I don’t know, but this is an example answer.</div>
    </div>
  </div>
</div>

Note that I am using FAQPage, not the QAPage schema that John Mueller is referring to.

Unfortunately, FAQPage schema is still pending wider review, so I doubt it is helping much at the present time.

You can either have a single FAQ page listing all questions or separate pages for each question

In a recent Google Hangout, there was an interesting discussion (1:00:33) as to whether you should put FAQs all on one page, or have them on separate pages.

John Mueller commented:

I think having FAQ content on your pages is perfectly fine that’s something you can structure in multiple ways that you have one page with a lot of questions or one page with just one question and answer.

Or maybe you have more information on the answers where you need a complete page for that but from a structured data point of view from a Google search point of view there’s nothing at least that I’m aware of where you would need to do it one way or the other.

I have seen some good examples where sites appear in the featured snippets because they have a nice set up with common questions that people actually actually ask that I think is pretty cool if you can make that happen but obviously you need to think about whether people are actually asking these questions and whether you really have to answer that that is useful for you as a business if it’s shown in search.

So if I don’t know if someone asks like how do I join a google webmaster hangout and you give an answer for that like is that really something that you you have any coming from for your website if it’s fun like that whereas if you have an answer to make maybe you’re a manufacturer of a phone like how do i reset my phone you give a couple of steps to do that.

Moz updates how they calculate Domain Authority scores

Back in the year 2000, Google launched its Google Toolbar for Internet Explorer. On it, there was a PageRank meter that enabled users to see the PageRank of any page they were visiting.

PageRank was measured on a scale between 0 and 10 and indicated the strength of the backlinks linking to the site. This changed everything about SEO, and very quickly everyone was trying to gain links to increase their PageRank Score.

In 2016, Google killed it off, and SEOs focused mainly on another link score metric called Domain Authority (DA), created by Moz.

The problem with these link scores is that webmasters would become consumed by them, focussing solely on their score rather than more obvious things such as traffic or keyword rankings.

Many link sellers, whether spammers or those offering guest or editorial posts, base their prices on these scores. SEO agencies may have promoted the progress of a client’s domain authority to justify their fees.

A sudden change in Domain Authority, therefore, is going to be noticed.

What has changed in the new Moz Domain Authority?

Moz has changed from a complex linear model to a neural network that is better able to detect link manipulation.

They have integrated a Spam Score for each domain, and other complex distributions of links based on quality and traffic. In essence, it is much more sophisticated, and more accurately reflects modern SEO.

You can read more about the new Domain Authority here.

You can check your Moz Domain authority here. You’ll need to make a free account to get access.

Web Development News

Google Confirms Serious Chrome Vulnerability

Google’s security lead for the Chrome browser has warned users of the browser to update their installs “right this minute.”

The urgency relates to a zero-day vulnerability in Chrome that Google’s Threat Analysis Group confirmed has been actively exploited since February 27, 2019.

Google is not releasing details of the exploit until enough users have patched the vulnerability. From what little I can find, it would seem that it affects Chrome’s FileReader, an API that enables the browser to read the contents of files on a user’s system.

The easiest way to check whether your browser is up to date is to type chrome://settings/help into your address bar. You will then see a notification telling you whether Google Chrome is up to date:

Check whether Google Chrome is up to date. CREDIT: THE WEBMASTER.

Firefox now automatically blocks autoplaying audio and video

I mentioned in Edition 3 of my Newsletter that Firefox was going to block sites from playing audio automatically, including in videos.

Well, the update has now been released.

Any playback before a user has interacted with the page will be counted as autoplay and will be blocked if it is audible. Videos that autoplay will still be allowed provided that they start muted.
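In practice, that means an embedded video needs the muted attribute for autoplay to keep working, along these lines (the file path is a placeholder):

```html
<!-- Autoplays under the new policy because playback starts muted. -->
<video src="/media/intro.mp4" autoplay muted controls></video>
```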

You can read more about the changes here.

WordPress 5.1.1 Security and Maintenance Release

I highly recommend you update to the latest version of WordPress as soon as possible, especially if you use the WordPress commenting system.

Simon Scannell of RIPS Technologies discovered the vulnerability, and he published a post summarizing how an unauthenticated attacker could take over any WordPress site that has comments enabled:

An attacker can take over any WordPress site that has comments enabled by tricking an administrator of a target blog to visit a website set up by the attacker.

As soon as the victim administrator visits the malicious website, a cross-site request forgery (CSRF) exploit is run against the target WordPress blog in the background, without the victim noticing.

The CSRF exploit abuses multiple logic flaws and sanitization errors that when combined lead to Remote Code Execution and a full site takeover.

The upcoming WordPress 5.2 will raise the minimum required PHP version to 5.6, and this latest release contains several changes that will help hosting providers prepare.

For instance, hosting providers can now offer a button for their users to update their PHP version.

Proposal for new WordPress Block Directory to host Single Block Plugins

Alex Shiels, a WordPress core contributor, has proposed a new Block Directory on WordPress.org that would host JavaScript-based, single block plugins.

The new Block Directory will be completely separate from the existing plugin directory and will be searchable by block name and description.

The goal would be to make the new Gutenberg blocks searchable from within the WordPress editor. You can read more about his proposals here.

You can see an impression of what the new installation screen would look like below:

Proposal for new WordPress Block Directory to host Single Block Plugins. CREDIT: ALEX SHIELS.

While the proposal is widely praised, there is some concern over the ability to install new blocks from within the editor itself.

WordPress developer Jamie Schmid said:

I am not convinced that making blocks searchable and installable from within the editor is the best solution.

This, along with page level block controls and style overrides, is encouraging a very short-sighted, page-level solution to an issue that is very likely a global site (or content or even business) issue.

I’d love to instead see a central view for all installed blocks – similar to how plugins are, but more organized by type/function/etc and with a visual alongside.

This will encourage making decisions at the site level, encouraging some bigger-picture reflection.

And same to being able to apply access controls to the installation of new blocks.

I agree with Schmid. I can see that if users can install blocks on demand from within the editor, they may use a variety of different styles on many different pages.

Over time, this would make it more difficult to update the design in the future.

WordPress now powers over 1/3rd of the top 10 million sites on the web

WordPress is celebrating powering over one-third of the web.

According to W3Techs, WordPress has increased its market share to 33.4% of all sites, up from just 13.1% in January 2011, and 29.2% in January 2018.

WordPress now powers over 1/3rd of the Web. CREDIT: W3TECHS.

It looks like the rate of growth is only increasing, with interest in other popular CMSs such as Joomla and Drupal starting to wane.

Interestingly, website builders such as Squarespace, Shopify, and Wix are also surging in popularity, albeit from much smaller market shares.

You can see all the latest data in the table below:

CMS Market Share 2019. CREDIT: W3TECHS.

Mozilla launches its free, encrypted file-sharing service ‘Firefox Send’

Mozilla has launched Firefox Send, a free encrypted file transfer service that allows users to share files safely from any browser.

Firefox Send allows you to share files up to 1GB without an account, or 2.5GB with an account.

The recipient of a file receives a link that immediately starts the download. They do not need an account, nor do they have to visit any third-party site to receive the file.

Firefox Send. CREDIT: FIREFOX.

You can read more about Firefox Send in the announcement post.

Check out the emulator for the very first Web Browser

In celebration of the web’s 30th anniversary, a team at CERN has rebuilt WorldWideWeb (later renamed Nexus), the world’s first browser, created in 1990 by Tim Berners-Lee.

The WorldWideWeb browser emulator is available here. It is not as intuitive as today’s browsers, but fortunately, you can find video demonstrations of the key features here.

In case you were wondering, you can see a screenshot of the original browser below:

WorldWideWeb browser. CREDIT: WIKIPEDIA.ORG

A bit of Programmer Humour - Thanos.js

In anticipation of the Avengers: Endgame movie, which is now just a month away, I present to you Thanos.js: a nifty JavaScript library that will randomly delete half the files in your development project.


Basic usage:

thanos snap-fingers --with-glove

Web Hosting News

GoDaddy is now the Official Sponsor of the Men’s Cricket World Cup 2019

GoDaddy has announced the sponsorship of the Men’s Cricket World Cup for 2019.

Cricket is a global sport that is particularly popular in India. With GoDaddy looking to expand globally, with a particular interest in India, it is an ideal sponsorship opportunity for the company.

Nikhil Arora, Managing Director and Vice President, GoDaddy India said:

Cricket in India is a favourite sport. It is viewed in every nook and corner of our country, giving GoDaddy an opportunity to reach our audiences, including in the Tier II & III cities, helping entrepreneurs and small business owners bring their ideas to life online.

Cloudflare Raises $150 million. Eyes IPO.

Cloudflare last week announced that it had raised $150 million to “support Cloudflare’s growth, extend product ranges, and continue its international expansion into new markets.”

The company’s last funding round in 2014 of $110 million included investors such as CapitalG (Google Capital), Microsoft Corp, Baidu, and Qualcomm.

They were looking to go public with an IPO earlier in the year, with a potential value of more than $3.5 billion. However, with the new funding, I suspect there may now be a little more flexibility in the timings.

If you are not using Cloudflare, I highly recommend you consider it. I use the following Cloudflare products for this site:

  • Cloudflare Pro - This includes image optimization, more page rules (full-page caching), etc.
  • Argo - Tiered caching and smart routing for better performance.
  • Load balancing & geo-steering - This manages my six origin servers around the world.

If you use WordPress, check out my detailed tutorial for integrating W3 Total Cache and Cloudflare.

DigitalOcean Launches New Marketplace

On March 5, 2019, DigitalOcean announced the launch of their new Marketplace.

The DigitalOcean Marketplace is a platform where developers can find pre-configured one-click applications such as WordPress, Discourse, cPanel, and Plesk.

Digital Ocean Marketplace.
Digital Ocean Marketplace. CREDIT: THE WEBMASTER.

One of the most significant entries to the Marketplace is cPanel.

You can now install cPanel on any Droplet (even the $5 ones) with a 15-day cPanel license free of charge. After that, you will need to purchase a license for $20 per month (or $15 via a reseller).

I have always used Vultr for vanilla cPanel installs as they manage all the licensing on your behalf for just $15 per month. The only downside is that cPanel is only available on $10 cloud servers.

DigitalOcean, on the other hand, allows you to install cPanel on $5 servers and gives you 15 days of cPanel free.

Either way, as a developer, I am a big fan of both companies.