Webmaster News - Edition 01 - January 15, 2019.

In this edition, I take a look at the Google update of the 6th to 14th January, WordPress 5, GitHub private repos on free plans, a look at the tech stack of thewebmaster.com, and much more!

Welcome to the first edition of Webmaster News. My goal is to provide invaluable insights that you, as a webmaster, can use. I know many of you signed up to the newsletter some time ago, and I apologise for the delay getting updates to you.

As a newsletter subscriber, you will get an issue like this a couple of times a month packed full of SEO, Development, and Hosting News. I’ll try to keep it relevant and interesting to what you care about the most.

In This Issue:

SEO & Search News

Google Algorithm Update between the 6th and 14th January, 2019 (ongoing).

It appears that another quality update has rolled out between the 6th and 14th January 2019.

Based on the chatter in the community, and information from SERP trackers, the update appears to relate to E-A-T (Expertise, Authoritativeness, and Trustworthiness). You can read my full commentary on the Algorithm Update here.

Google Algorithm Update between the 6th and 14th January 2019. © SEMrush

Google Help Hangout January 11, 2019

Every couple of weeks John Mueller, Webmaster Trends Analyst at Google, holds a Google Hangout and answers any webmaster questions that you may have.

I have summarized some of the main points in the January 11 Hangout below:

  • 1:09 minutes - Old Search Console Features

    Google plans to close down several features when it finalizes the move from the old Search Console to the new one.

  • 3:11 minutes - Javascript Rendering

    If you deliver a server-side rendered page that still has JavaScript to process, and that JavaScript breaks the page or removes content, then Google will process the result, even though it is in error. John Mueller said:

    I would make sure that if you deliver a server-side rendered page and you still have JavaScript on there, make sure that it’s built in a way that when the JavaScript breaks, it doesn’t remove the content.

    Following this, the user stated that he was using an HTML version of the page that was shown to Googlebot. The page rendered for Google is not quite the same as what the typical user gets. John Mueller was asked if that was OK:

    It should just be equivalent so if you’re doing server-side rendering and all of the functionality is in the static HTML version that you serve then that’s fine. Whereas, if you do server-side rendering and just the content is rendered, but all of the links, for example, don’t work then that’s something where we’re missing functionality, and we might not be able to crawl it.

  • 14:35 minutes - Content Delivery Networks

    Sites served with a CDN are not treated any differently than non-CDN sites.

  • 16:58 minutes - Image URL Changes

    If you change CDN, or completely change the structure of your image URLs, your image rankings will change.

    That’s something where we will have to go off and first recrawl those images, reprocess them, reindex them, and get them all ready for image search again. So if you just change the URLs that are referenced within your pages, then that will result in those images being seen as new images first.

    They all have to work their way up again, so setting up redirects, like you mentioned, is a fantastic way to do that, because that way we know the old images are related to the new ones, and we can forward any signals we have from the old ones to the new images.

  • 19:23 minutes - Changing a website theme

    If I change my website theme but keep the same content, URLs, etc., will the rankings change?

    Yes, this will result in changes in your website’s visibility on Google. It’s not necessarily the case that it’ll drop. It can also rise. If you significantly improve your website through things like clearly marking up headings, using structured data where it makes sense, and using a clear HTML structure, that makes it easier for us to pick out which content belongs together, and which content belongs to which images. All of that can have a really strong positive effect on your website in search.

  • 31:34 minutes - Page and Images Sitemaps

    You can use separate sitemaps for pages and images, or you can combine them. Both methods work.

There are quite a few other points made in the video, so if you have time, I would recommend watching the whole thing. Listening to other users’ problems, along with John Mueller’s recommendations, is quite insightful.


DuckDuckGo broke 9 billion searches in 2018

DuckDuckGo, the privacy-focused search engine founded in 2008, has celebrated a major milestone.

In 2018 they served over 9 billion searches, with strong growth predicted throughout 2019.

DuckDuckGo served over 9 billion private searches in 2018. © DuckDuckGo

John Mueller: Multiple links to the same internal page are fine

When asked on Twitter, John Mueller confirmed that linking to the same internal page several times on a page is not detrimental, provided it is natural.

This does not tell the whole story, so it is easy to get the wrong idea. Internal linking does matter.

A page that is only linked to internally once is going to have less PageRank flowing to it than a page that is linked to hundreds of times.

Recommended: Audit your site’s internal linking

Many SEO tools, such as SEMrush, look at internal linking as part of their Site Audit. They will recommend action be taken on any page that has zero internal links pointing to it (an orphaned page) or only one.

You can see an example of one part of the report below:

Internal Link Distribution. © SEMrush

If you want to check your own website for internal link structure, you can sign up to SEMrush with a 7-day free trial here.
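As a rough illustration of what such an audit computes, here is a minimal sketch (plain Python of my own, not how SEMrush actually does it) that takes a crawl result, each page mapped to the internal pages it links out to, and flags orphaned or weakly linked pages:

```python
from collections import Counter

def audit_internal_links(link_graph):
    """Count the internal links pointing at each page and flag weak ones.

    link_graph maps each crawled page URL to the list of internal URLs
    it links out to (duplicate links on one page are counted once).
    """
    inlinks = Counter()
    for page, outlinks in link_graph.items():
        for target in set(outlinks):  # count each linking page once
            if target != page:        # ignore self-links
                inlinks[target] += 1
    report = {}
    for page in link_graph:
        count = inlinks.get(page, 0)
        if count == 0:
            report[page] = (count, "orphaned")
        elif count == 1:
            report[page] = (count, "only one internal link")
        else:
            report[page] = (count, "ok")
    return report

# Toy crawl of a four-page site.
graph = {
    "/": ["/hosting/", "/seo/"],
    "/hosting/": ["/", "/seo/"],
    "/seo/": ["/"],
    "/old-post/": [],
}
print(audit_internal_links(graph)["/old-post/"])  # → (0, 'orphaned')
```

A real audit would first build `link_graph` by crawling the site, but the scoring logic is essentially this.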


Google’s stance on unnatural links in AI-created content

With the rise of Artificial Intelligence, Glenn Gabe, Digital Marketing Consultant at G-Squared, asked John Mueller what Google’s stance will be on unnatural links inserted into articles created by AI.

Basically, you are responsible for your content, even if you use AI or AI-based writing services to create it.


Use 404s & Temporary Sitemaps To Speed Up Page Removal From Google

When I launched this new version of TheWebmaster.com website at the end of November last year, following a revamp of the site structure, I had about 500 URLs to de-index or redirect. While many were paginated pages for reviews or category pages, I did prune, merge, and update many old posts no longer receiving any traffic that were not worth updating.

To get Google to pick up the change to the site structure as quickly as possible, and knowing that I had written about the topic previously, I took a copy of the old sitemap and added it to the Search Console.

Adding an old sitemap enables the old and redirected pages to be crawled faster.

Use Temporary Sitemaps To Speed Up Page Removal From Google. © SEMrush

John Mueller recently repeated the advice to use an old sitemap. If you wish to remove a page rather than redirect it, Mueller advises letting it 404.

Here is what he said:

One way to speed this up could be to submit a temporary sitemap file listing these URLs with the last modification date (eg, when you changed them to 404 or added a no-index) so that we know to recrawl & reprocess them. This is something you’d just want to do for a limited time (maybe a few months), and then remove, so that you don’t end up in the long run with a sitemap file that’s not needed by your site.
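To make the mechanics concrete, here is a short sketch (my own illustration, not a Google tool) that builds such a temporary sitemap, listing the removed URLs with the date they were changed to 404 or noindexed as the last modification date:

```python
from xml.sax.saxutils import escape

def temporary_sitemap(removed_urls):
    """Build sitemap XML listing removed URLs with a lastmod date.

    removed_urls is a list of (url, date) tuples, where the date is
    the day the URL started returning 404 (or gained a noindex).
    """
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url, lastmod in removed_urls:
        lines += ["  <url>",
                  f"    <loc>{escape(url)}</loc>",
                  f"    <lastmod>{lastmod}</lastmod>",
                  "  </url>"]
    lines.append("</urlset>")
    return "\n".join(lines)

xml = temporary_sitemap([("https://example.com/old-page/", "2019-01-10")])
print(xml)
```

You would submit the resulting file in Search Console, then delete it after a few months, as Mueller suggests.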


Google: Don’t redirect all 404s to the homepage.

John Mueller was asked on Twitter how Google treats site redirects when a webmaster redirects all their 404s to the homepage. You can follow the conversation below.

Mueller states that you should not redirect all your 404s to the homepage. He also recommended back in 2017 that redirects should have a one-to-one relationship, otherwise Google will treat them as 404s.
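If you keep your redirect rules in a simple old-to-new map, a quick sanity check along these lines (a minimal sketch, assuming a dict of URL pairs) will flag any rule that dumps visitors on the homepage instead of a genuine one-to-one equivalent:

```python
def check_redirects(redirect_map, homepage="/"):
    """Flag redirects that point at the homepage rather than a
    one-to-one equivalent page, per Mueller's recommendation."""
    problems = []
    for old, new in redirect_map.items():
        if new == homepage:
            problems.append((old, "redirects to homepage; let it 404 "
                                  "or pick an equivalent page"))
    return problems

rules = {
    "/old-review/": "/hosting/new-review/",  # good: one-to-one
    "/dead-page/": "/",                      # bad: homepage catch-all
}
print(check_redirects(rules))
```

For a removed page with no equivalent, the fix is to drop the rule entirely and let the URL 404.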


New feature for rich results from Google Webmasters!

Google has recently updated its Rich Results Testing tool, which you can find here.

Rich Results are special search results that include a carousel, image, or other non-textual UI elements. The new testing tool only supports the following types:

  • Job posting
  • Recipe
  • Course
  • TV and Movie
  • Event
  • Q&A Page

For all other Schemas, such as Reviews, you can use the Structured Data Testing Tool.

Web Development News

What’s under the bonnet at TheWebmaster.com

Having previously used WordPress, then Django, for The Webmaster, I was struck by the simplicity and ease of developing the new version using Hugo.

Hugo Static Site Generator. © Hugo

Hugo is one of the most popular, and the fastest, open-source static site generators currently available.

A static site generator uses raw data (Markdown files, data files, APIs), along with templates, to generate the entire website as static HTML, which is then uploaded to your web hosting server. Because the site is pre-built, it is faster and more secure for visitors.
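As a toy illustration of the idea (not how Hugo is implemented), a static site generator boils down to this: take content plus a template, and emit finished HTML files ahead of time:

```python
import tempfile
from pathlib import Path

# A deliberately tiny page template; real generators use full theme files.
TEMPLATE = "<html><head><title>{title}</title></head><body>{body}</body></html>"

def build_site(posts, out_dir):
    """Render every post to a static HTML file, ready to upload.

    posts maps a URL slug to a (title, body_html) tuple; a real
    generator would parse Markdown and data files at this point.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for slug, (title, body) in posts.items():
        page = TEMPLATE.format(title=title, body=body)
        (out / f"{slug}.html").write_text(page, encoding="utf-8")

out_dir = Path(tempfile.mkdtemp()) / "public"
build_site({"hello": ("Hello", "<p>First post</p>")}, out_dir)
print((out_dir / "hello.html").read_text(encoding="utf-8"))
```

The output directory is what gets uploaded to the web server; no code runs when a visitor requests a page.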

A quick rundown of The Webmaster’s technology stack is as follows:

  • Hugo - The Hugo Static Site Generator is used to build the site. It integrates with various third-party services, including Google Sheets and Typeform, to bring in data. Hugo uses Markdown files for posts.
  • Bootstrap - I use the Bootstrap framework due to its simplicity, and the fact that many components look reasonable out of the box. I am more of a developer than a designer, and Bootstrap makes the design aspects easier.
  • GitHub - I use GitHub to store and version control the code. I can easily create development versions by creating new branches. Pushing to the master branch triggers Continuous Integration, which automatically uploads the new version of the site to the web.
  • Typeform - I use Typeform for contact forms, web hosting review forms, and more. Typeform integrates seamlessly with Google Sheets (it automatically updates the sheet), so I can manipulate data, make calculations, and publish the data in a format that can be imported into Hugo on each build of the website.
  • Google Sheets - Google Sheets is the workhorse of my Web Hosting Reviews and Hosting Deals system. Google Sheets makes it extremely easy to enter, sort, and filter large amounts of data. You can then publish the resulting data as a CSV file that can be easily consumed by the Hugo templates on every build.
  • Wercker - Wercker is a Continuous Integration tool. When I push my code or new posts (Markdown files) to the master branch on GitHub, Wercker runs automatically. First, it retrieves the code. It then runs the Hugo Static Site Generator to build all the HTML files that make up the site, importing any data from Google Sheets or Typeform where necessary. It then rsyncs the site to my six hosting servers.
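The Wercker build-and-deploy run described above can be sketched as follows. This is my own simplification in Python, not the actual Wercker configuration, and the server hostnames and remote paths are made up:

```python
import shlex

# Hypothetical hostnames; the real boxes are on Vultr and Digital Ocean.
SERVERS = ["us-east.example.com", "us-west.example.com", "uk.example.com",
           "au.example.com", "in.example.com", "sg.example.com"]

def deploy_commands(build_dir="public/"):
    """Compose the commands one CI run would execute: build the site
    with Hugo, then rsync the generated files to every hosting server."""
    cmds = [["hugo", "--minify"]]  # Hugo writes the static site to public/
    for host in SERVERS:
        cmds.append(["rsync", "-az", "--delete", build_dir,
                     f"deploy@{host}:/var/www/site/"])
    return cmds

for cmd in deploy_commands():
    print(shlex.join(cmd))
```

In a real pipeline these commands run as shell steps on the CI worker; composing them as argument lists here just makes the sequence explicit.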

There are sophisticated services available, such as Netlify or Forestry, that will generate and deploy a Hugo site for free. As I do not like to be tied down to any one provider (no matter how great the feedback about those companies is), I take a somewhat manual approach:

  • Vultr and Digital Ocean - The site is hosted on six servers around the world (East US, West US, UK, Australia, India, and Singapore). I use both Vultr and Digital Ocean cloud hosting on their $5 per month plans.
  • Moss.sh - To manage my servers I use Moss.sh. Moss takes care of all the NGINX configuration, server updates, provisioning and initial setup of the Vultr & Digital Ocean servers, and more. Their support is extremely helpful, and I would highly recommend them to anyone.
  • CloudFlare Load Balancing - I use CloudFlare Load Balancing. This checks each of the six servers every 15 seconds from 8 regions. It also automatically steers traffic to the fastest server location. The only downside is that they only allow 5 Pools so my Singapore location just acts as a failover for the India location. If any server goes down or fails to respond quickly enough, traffic is automatically re-routed.
    CloudFlare Load Balancing. © The Webmaster
  • CloudFlare Page Caching & Argo - I use CloudFlare full Page Caching (implemented via Page Rules), with Argo Smart Caching and Tiered Caching for even faster page loads. You can see a screenshot of the response times I get:
    CloudFlare Argo. © The Webmaster

With page speed now a Google ranking factor for mobile search, and with at least 60% of all searches now carried out on a mobile device, having a fast website is essential for both SEO and your end users.

A faster site also helps with your crawl budget, enabling Google to pick up changes to your website faster, especially on larger sites.

I hope that gives you some valuable insight as to the technology stack behind The Webmaster.

Recommended: Optimize your website for speed.

If you use WordPress, you can follow this detailed guide to W3 Total Cache and WordPress.


WordPress 5.0

WordPress 5.0 brought in the vision that WordPress Creator, Matt Mullenweg, set out back in 2016.

Mullenweg’s vision was of a block-based approach that simplifies the post creation process by unifying widgets, shortcodes, and more.

Instead of a large blank canvas with a WYSIWYG editor, akin to what you find in Microsoft Word or Google Docs, you now have a series of blocks that are independent of each other.

The default editor comes with more than 16 blocks, and you can add more blocks by installing themes, and other plugins.

Jetpack, for example, has just launched four more blocks: a subscription block, a related posts block, a tiled gallery block, and a shortlinks block.

You can read more about the new editor here.


WordPress 5.0.2 Brings performance improvements to the Block Editor

If you have already tried out the new WordPress Editor, you may have noticed the editor was a little sluggish if you had a large number of blocks in your post.

The good news is that WordPress 5.0.2 increases the performance of a post with 200 blocks by 330%. So, if you found it too sluggish to use, you may want to have another go.


GitHub now has private repos on free plans

On the 7th January 2019, GitHub (now owned by Microsoft) announced that private repositories were now available on their free plan.

Before the change, anyone wanting to keep their repository private needed to pay $7 per month for the privilege. The free plan only permits three collaborators on private repos, and does not include the wiki or insights.

There have been some concerns in the developer community. One of the things that made GitHub great is that many were forced into having public repos. This made it an excellent place to share code, collaborate, and develop open-source projects. With private repos now freely available, some may be encouraged to keep their code private.



Web Hosting News

EIG Hosting Brands Partner with MarketGoo

EIG brands such as BlueHost, JustHost, etc., have had a torrid time with their reputation over the past few years. Quite frankly, after all the problems caused by them botching the takeover of reputable brands such as Hostnine, A Small Orange, Site5, and Arvixe in particular, it is sometimes difficult to talk positively about them, but this is worthy of a mention.

MarketGoo is targeted at inexperienced, non-technical customers, and helps users optimize their site for the search rankings.


Features include:

  • SEO checklist and plan – Step-by-step website optimization plan
  • Recommended Tasks - Marketgoo provides suggestions that might help the user rank better.
  • Analytics – Improvements are tracked and communicated to the user.
  • Google Analytics Integration
  • Keyword tool – helps users find opportunities
  • Mobile – Mobile friendliness test.

I wouldn’t recommend this over a tool such as SEMrush, but at around a third of the price, it will definitely appeal to small business owners and bloggers with a limited budget. I am impressed.


Let’s Encrypt now serves 150 million websites. Expects to rise to 215 million in 2019.

Let’s Encrypt is a free, easy-to-use option for those requiring an SSL certificate. The organizations behind Let’s Encrypt include Mozilla, Cisco, Akamai, the EFF, IdenTrust, and researchers from the University of Michigan.

Many hosts now support Let’s Encrypt with a one-click installer within their control panel, making it easier than ever to use their free certificates.

By the end of 2018, more than 150 million domains were using their certificates, as you can see in the chart below:

Growth in the number of domains using Let’s Encrypt certificates.

In 2019, Let’s Encrypt expects more than 215 million fully qualified domains to be using the service.

You can view the latest Let’s Encrypt statistics here.


GoDaddy caught injecting JavaScript into customer sites to collect speed metrics.

This news item is quite disconcerting, and to be honest, I am very disappointed in GoDaddy as a result.

Igor Kromin, a developer and blogger, recently discovered that GoDaddy was injecting a script to provide Real User Metrics (RUM) into customers’ websites.

GoDaddy themselves acknowledge that this may cause issues for some sites:

Most customers won’t experience issues when opted-in to RUM, but the javascript used may cause issues including slower site performance, or a broken/inoperable website.

You can read the GoDaddy Help article explaining why users are automatically opted in, along with instructions to deactivate it here.

GoDaddy should really make this opt-in, not opt-out.
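If you want to check whether a page you host is being served with an injected script of this kind, one rough approach is to fetch the page’s HTML and scan it for script sources outside the hosts you recognise. A minimal sketch follows; the RUM script URL below is a made-up placeholder, not GoDaddy’s actual filename:

```python
import re

def find_unexpected_scripts(html, allowed_hosts):
    """Return external script src URLs that are not on an allowed host."""
    srcs = re.findall(r'<script[^>]+src=["\']([^"\']+)["\']', html, re.I)
    return [s for s in srcs
            if not any(host in s for host in allowed_hosts)]

# Sample page with one legitimate script and one injected one (placeholder URL).
page = (
    '<html><body>'
    '<script src="https://cdn.example.com/app.js"></script>'
    '<script src="https://rum.hosting-provider.example/collect.js"></script>'
    '</body></html>'
)
print(find_unexpected_scripts(page, ["cdn.example.com"]))
# → ['https://rum.hosting-provider.example/collect.js']
```

A regex scan like this is only a heuristic; comparing the served HTML against the files you actually uploaded is the definitive check.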


NameCheap is holding a Happy New Business Sale between the 15th and 17th January 2019.

Namecheap will be offering substantial discounts on Domains (up to 98% off), Shared Hosting, EasyWP Managed WordPress, SSL certificates and VPN.

If you are in the market for a new domain, this may well be the perfect chance. You can find their offer page here.


Updated articles

Here is a selection of some of the new and updated articles on The Webmaster:

Jonathan Griffin. Editor @ The Webmaster

Editor, SEO Consultant, & Developer.

Jonathan Griffin is The Webmaster's Editor & CEO, managing day-to-day editorial operations across all our publications. Jonathan writes about Development, Hosting, and SEO topics for The Webmaster and The Search Review, with more than nine years of experience. Jonathan also manages his own SEO consultancy, offering SEO developer services. He is an expert on site structure, strategy, Schema, AMP, and technical SEO. You can find Jonathan on Twitter as @thewebmastercom.

Read more about Jonathan Griffin on our About Page.