Webmaster News - Edition 03 - February 14, 2019.

In Edition 03, Google’s Self-Proclaimed “Chief of Sunshine and Happiness & trends analyst” does AMA on Reddit, GitHub Pages get hacked, two Google Updates, and EIG reports declining user numbers.

SEO & Search News

Google Algorithm Updates

There appear to have been two Google Updates over the last couple of weeks, which you can see on the chart below:

The SERP Tracker shows two spikes in volatility around February 7th and February 12th, 2019. © SEMrush

Google Update February 7, 2019

A relatively calm couple of weeks ended on February 7, 2019, when all the SERP trackers showed significant volatility.

This was a relatively short update, lasting just the day, with volatility levels dropping back down on February 8th.

The amount of chatter on various SEO forums was relatively limited, with the focus being more on the UK SERPs. While there was volatility in both the UK and US, the UK sectors hit by this update were more typically associated with E-A-T, meaning Expertise, Authoritativeness, and Trustworthiness.

You can read about this Google Algorithm Update in more detail here.

Google Update February 12, 2019

This update hit on February 12 and lasted into the 13th. Its scale is comparable to that of the February 7 update.

Like previous updates, it targets sectors where E-A-T is a significant factor. A broad SEO strategy addressing E-A-T, user intent, and content quality should help prevent being hit by these types of updates, and may provide some much-wanted improvements in rankings.

You can read more about this update here.


Google Adds Voice Search Input in Chrome for Android

Voice search is everywhere, and growing day by day. From voice search on mobile to Digital Assistants, the way people interact with the web is changing.

I recently updated an article with various Voice-usage statistics, with some predictions estimating that 30%-50% of all searches will be voice-based by 2020.

It is only natural, then, that Google has added voice input to the search bar on Chrome for Android.

To use the new functionality, you will need to grant Google some permissions. You will automatically see the prompts to do so after clicking the microphone icon in the search bar:

Google Voice Search on Chrome for Android. © The Webmaster

Google Search Console now has a Security Section

Google has added a new section to the Search Console that helps you discover if your website has been hacked or if it contains malware. It also provides advice and guidance on how to fix any issues.

There are nine different types of security report messages:

  • Social Engineering (Phishing and Deceptive Sites)
  • Malware infection type: Server configuration
  • Malware infection type: SQL injection
  • Malware infection type: Code injection
  • Malware infection type: Error template
  • Cross-site malware warnings
  • Hacked type: Code injection
  • Hacked type: Content injection
  • Hacked type: URL injection

You can read Google’s guidance about each of these types here.


Gary Illyes Does an AMA on Reddit

Gary Illyes, Google’s self-proclaimed “Chief of Sunshine and Happiness & trends analyst,” recently took part in an AMA (Q&A session) on Reddit.

Here are some of the highlights:

#1. Google only uses SERP interaction metrics in evaluations

When asked whether Google uses the interactions of searchers to alter the positions that certain results hold, Gary Illyes responded:

[The] answer is that we primarily use these things in evaluations.

A short while later he clarified further:

Sure. When we want to launch a new algorithm or an update to the “core” one, we need to test it. Same goes for UX features, like changing the color of the green links. For the former, we have two ways to test: 1. With raters, which is detailed painfully in the raters guidelines 2. With live experiments.

1 was already chewed to bone and it’s not relevant here anyway. 2 is when we take a subset of users and force the experiment, ranking and/or ux, on them. Let’s say 1% of users get the update or launch candidate, the rest gets the currently deployed one (base). We run the experiment for some time, sometimes weeks, and then we compare some metrics between the experiment and the base. One of the metrics is how clicks on results differ between the two.

#2. Is there an internal linking overoptimization penalty?

No, you can abuse your internal links as much as you want AFAIK.

#3. Is there anything that most SEOs tend to overlook/not pay attention to?

Google Images and Video search is often overlooked, but they have massive potential.

#4. Lots of people keep saying that part of the RB [RankBrain] system includes UX signals, including Dwell Time, Bounce Rate, Click Through Rate etc.

RankBrain is a PR-sexy machine learning ranking component that uses historical search data to predict what would a user most likely click on for a previously unseen query. It is a really cool piece of engineering that saved our butts countless times whenever traditional algos were like, e.g. “oh look a “not” in the query string! let’s ignore the hell out of it!“, but it’s generally just relying on (sometimes) months old data about what happened on the results page itself, not on the landing page. Dwell time, CTR, whatever Fishkin’s new theory is, those are generally made up crap. Search is much more simple than people think.

For those who don’t know, Fishkin is the founder of Moz. You may have seen him presenting Moz’s Whiteboard Friday.

#5. Is there anything you can share around how Google measures quality or determines relevance?

We have this massive doc that touches on those things.

Basically we have an idea about determining relevance, we ask raters to evaluate the idea, raters say it looks good, we deploy it. Relevance is a hairy topic though, we could chat about it for days and still not have a final answer. It’s a dynamic thing that’s evolving a lot.

#6. Do you get benefit from duplicate content published elsewhere on the web that canonicals back to your content?

I’ve paraphrased the question a bit here, but the point is whether you get PageRank, or other relevant signals, from internal and external links pointing to the duplicate article when other people repost your blog article.

I can’t give a concrete answer here because spammers would have a field day with it, but generally the page in a dup cluster that shows up in the results will see more benefits.

Gary Illyes appears to indicate there is some benefit. This is very interesting, as it would open up platforms like Medium to abuse.

For example, on Medium, you can use the content import feature so that it declares a rel=canonical tag back to the original article.

While outgoing links in that article are rel=nofollow, Gary Illyes appears to indicate that you may get some benefit from any internal Medium links leading to that Medium article. You can easily increase these internal links by tagging the article under multiple categories.

If you wondered why it took an extra couple of days to post this newsletter, I was busy copying my whole website on to Medium [sarcasm]. I’ll have to test this out, though, I think.

#7. Does E-A-T transfer through the link graph?

The question in full:

To what extent do metrics around E.A.T transfer, through the link graph, from one site to another? If Expert A, on site A, links through to an article by author B. on site B, I assume this increases, algorithmically, author B’s expertise?

Gary Illyes’ reply:

I guess that’s a little oversimplified, but yeah.

#8. Will web accessibility be considered a direct ranking factor for images and video rankings?

Unfortunately, no.

#9. Is CrUX data being used to assess page speed as a ranking factor?

I don’t remember where the data is coming from, but not our publicly available tools, of which we have many for reasons that are beyond me.

#10. On AMP in the search results

Honestly, I don’t know. For users it’s excellent if the landing AMP is properly implemented and offers the same functionality as its parent, however that’s often not the case. For me as a user that’s frustrating, so I’d hope the search-amp team will focus more on ensuring that we only show AMPs in the results if they’re standalone or are on par with the parent.

If you have time, it is worth reading the whole Reddit AMA. There are some further Q&As not covered here.


Yoast To Add Live Indexing With Bing & Google

Joost de Valk announced at YoastCon that the popular Yoast SEO WordPress plugin will provide live indexing with both Google and Bing starting in March.

Bing has confirmed that it will work via their Webmaster Tools API, which will allow WordPress sites to automatically submit URLs to the Bing index.

In fact, in readiness for the new ability, Bing announced on January 31, 2019, that you can now submit up to 10,000 URLs per day to Bing Webmaster Tools.
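
If you want to experiment with URL submission outside of a plugin, below is a minimal sketch of calling Bing's URL Submission API directly from TypeScript. The SubmitUrlBatch endpoint and payload shape follow Bing's documented JSON API as I understand it, so check the current documentation before relying on it; the API key and URLs are placeholders.

```typescript
// A minimal sketch of submitting freshly published URLs to Bing's URL
// Submission API. Verify the endpoint and payload against the current docs;
// BING_API_KEY, the site URL, and the page URL below are placeholders.
const BING_API_KEY = "YOUR_API_KEY"; // generated in Bing Webmaster Tools

async function submitUrlsToBing(siteUrl: string, urlList: string[]): Promise<void> {
  const endpoint = `https://ssl.bing.com/webmaster/api.svc/json/SubmitUrlBatch?apikey=${BING_API_KEY}`;

  const response = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json; charset=utf-8" },
    body: JSON.stringify({ siteUrl, urlList }),
  });

  if (!response.ok) {
    throw new Error(`Bing URL submission failed: HTTP ${response.status}`);
  }
}

// Example usage (keep within the daily quota of up to 10,000 URLs):
submitUrlsToBing("https://www.example.com", [
  "https://www.example.com/blog/new-post",
]).catch(console.error);
```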

For Google, there is the Indexing API, which allows you to update a URL, remove a URL, get the status of a request, and send batch indexing requests.
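
For comparison, here is a minimal sketch of notifying the Indexing API about an updated URL. It assumes you already have an OAuth 2.0 access token for a service account authorized with the indexing scope (obtaining that token is not shown), and the example URL is a placeholder.

```typescript
// A minimal sketch of notifying Google's Indexing API about a changed URL.
// Assumes an OAuth 2.0 access token for a service account authorized with
// the https://www.googleapis.com/auth/indexing scope.
async function notifyGoogle(
  accessToken: string,
  url: string,
  type: "URL_UPDATED" | "URL_DELETED",
): Promise<void> {
  const response = await fetch(
    "https://indexing.googleapis.com/v3/urlNotifications:publish",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ url, type }),
    },
  );

  if (!response.ok) {
    throw new Error(`Indexing API request failed: HTTP ${response.status}`);
  }
}

// Example: tell Google a post has been updated (the URL is a placeholder).
// notifyGoogle(token, "https://www.example.com/blog/new-post", "URL_UPDATED");
```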

As these are publicly available APIs, I expect more tools to start offering similar abilities. John Mueller confirmed this.

Google Search Console Performance Report Now Consolidates URL Data

The old version of the Performance Report (still available until April) credits page metrics to the exact URL that appeared in the search results. Under the old version, a single page may have many URLs, for example mobile, AMP, or desktop versions.

The new version now aggregates all the search metrics for a single piece of content into the canonical URL. You can see an example below:

Google Search Console Performance Report Now Consolidates URL Data. © Google

The change is due to be permanent from April 10, 2019. All data will be unified from January 2018.

You can read more about the change here.

Web Development News

Prototyping Stacks for Lighthouse (Google’s Page Auditing Tool)

Google is currently asking for feedback on new Stack Packs for Lighthouse. If you are new to Lighthouse, head over to web.dev and get your web page appraised for speed, SEO, accessibility, and more.
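
If you would rather run the audits locally than through web.dev, here is a minimal sketch using the lighthouse and chrome-launcher npm packages. Treat it as a starting point rather than a definitive setup, as the exact options can change between versions; the URL and output filename are placeholders.

```typescript
// A minimal sketch of running Lighthouse programmatically with the
// "lighthouse" and "chrome-launcher" npm packages.
import * as chromeLauncher from "chrome-launcher";
import lighthouse from "lighthouse";
import { writeFileSync } from "fs";

async function audit(url: string): Promise<void> {
  // Launch a headless Chrome instance for Lighthouse to drive.
  const chrome = await chromeLauncher.launch({ chromeFlags: ["--headless"] });

  const result = await lighthouse(url, {
    port: chrome.port,
    output: "html",
    onlyCategories: ["performance", "seo", "accessibility", "best-practices"],
  });

  if (result) {
    // result.report holds the HTML report; result.lhr holds the raw scores.
    writeFileSync("lighthouse-report.html", result.report as string);
    console.log("Performance score:", result.lhr.categories.performance.score);
  }

  await chrome.kill();
}

audit("https://www.example.com").catch(console.error);
```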

Google is developing versions of Lighthouse tailored to specific stacks. The only stack they are currently working on is WordPress; once that rolls out, they plan to add packs for other popular CMSs and JavaScript frameworks (React, Angular, etc.).

With Lighthouse optimized for specific stacks, it can offer more specific advice on how to rectify issues, such as recommending particular plugins:

WordPress Lighthouse Stack. © Google

Google is currently asking the developer community what messages should be displayed for WordPress users.

You can view a copy of the working spreadsheet here with the current suggestions.

A GitHub user takes over dozens of domains they don’t own via GitHub Pages

On February 9th, a GitHub user called “haxorlife” took over 65 domains via GitHub Pages by exploiting a flaw in the custom domain configuration.

The flaw occurs when a GitHub user on the Pro plan, with GitHub Pages published from a private repository, downgrades to the Free plan. On the Pro plan, you are allowed GitHub Pages on private repositories; on the Free plan, you are only allowed GitHub Pages on public repositories.

If you downgrade while having GitHub Pages on a private repository, those sites will disappear.

Haxorlife took advantage of the pages for these domains being deleted by creating a new repository for each domain and acting as an imposter. The owner of the domain cannot then create a repo using that domain, because they will get an error stating the CNAME has been taken, and they are effectively locked out of managing that domain at GitHub.

What’s worse is that Haxorlife can push anything to their repository and publish it under your domain.

After the initial discussion on ycombinator, GitHub deleted the user haxorlife. It is unclear whether they have fixed the flaw, but I suspect they will be working on it fairly swiftly.

You can read the initial discussion here, and subsequent Reddit discussion here.

Firefox 66 to block automatically playing audible video and audio

Great news. Firefox 66, scheduled for release on March 19, 2019, will block automatic audio from playing by default.

Any playback before a user has interacted with the page will be counted as autoplay and will be blocked if it is audible. Videos that autoplay will still be allowed provided that they start muted.

The announcement article provides some advice to developers on how to adapt to the changes.
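
If your site relies on autoplay, one common pattern (not necessarily the exact advice in the announcement) is to check the promise returned by play() and fall back to muted playback or visible controls. A minimal sketch, assuming a hypothetical video#hero element:

```typescript
// play() returns a promise, so a rejection tells you audible autoplay was
// blocked and you can fall back gracefully. "video#hero" is a hypothetical
// selector for this example.
const video = document.querySelector<HTMLVideoElement>("video#hero");

if (video) {
  video.play().catch(() => {
    // Audible autoplay was blocked; retry muted, which Firefox 66 still allows.
    video.muted = true;
    video.play().catch(() => {
      // Even muted playback failed; show controls so the user can start it.
      video.controls = true;
    });
  });
}
```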

GitHub Desktop now suggests next steps to keep you moving forward

GitHub is one of my favorite services. I use it to store all my code with version control and to manage tasks; it integrates fully with code validation services and Continuous Integration tools (automatic deployment when the code is updated), and it generally makes my life easier.

One of the big issues with GitHub is that it requires some learning before you can use it effectively. I’ve been using it for several years, and it is only in the last year that I have been using it correctly. For several years, I just committed directly to the master branch (ouch). Now each issue, or closely related group of issues, has its own branch, with various tests run before merging into the master branch.

The GitHub Desktop app and its integrations with various code editors such as Atom, Visual Studio Code, and PhpStorm certainly make the process more straightforward. They avoid the need to use the command line to run basic commit, push, and merge commands, which is a major barrier for new users.

When you are new to GitHub, knowing what to do, or what the next step might be, can be difficult. With Version 1.6 of GitHub Desktop, they make things easier by suggesting actions to take. For example, if you have just committed some code to your repo, you may want to push the branch to the remote.

Here is a brief demo of the changes:

GitHub Desktop 1.6 © GitHub

I used to use only the command line and Atom to manage my GitHub repositories. Since writing this update, I have started using GitHub Desktop again.

Web Hosting News

Endurance International Group [EIG] reports declining numbers as it simplifies

Endurance International Group has recently reported its fourth-quarter earnings for 2018, and it does not make good reading.

EIG is the company behind popular hosting brands such as HostGator, BlueHost, Site5, Hostmonster, and JustHost. In recent years they have been on a large spending spree acquiring brands such as Site5, A Small Orange, Arvixe, ResellerClub, and Constant Contact, just to name a few.

Unfortunately, many of the acquisitions carried out in 2016 and 2017 were poorly managed. When EIG moved their customers over to their own infrastructure, it resulted in very poor service and support as they were hit by a multitude of issues.

It is not surprising their subscriber numbers are falling.

EIG Quarterly Subscribers. © The Webmaster

Jeffrey H. Fox, president and chief executive officer at Endurance International Group, commented:

I am pleased with our financial performance in 2018. The Endurance team made substantial strategic and operational progress while delivering to our 2018 integrated operating plan.

In 2019, we will continue to simplify our operations and maintain focus on increasing the value we deliver to customers on our strategic brands, which we believe provides a foundation for growth.

I do believe that EIG started to make some headway with all their issues the moment they dropped their Indian-based technical support in favor of a GoDaddy-style telephone/chat-only support service. I would still find it hard to recommend them, though.

I do own a Bluehost account which I use for testing, and they have definitely improved. That’s a good sign, at least. I just think EIG have created so much ill will over the past few years that many users will just avoid EIG hosts like the plague.

You can read my review of Bluehost here.


Interesting NamesCon talk from GoDaddy about Domain Investing

There was a very interesting speech by Paul Nicks (GoDaddy) at NamesCon on January 28, 2019, that revolved around domain investing.

Here are some of the highlights:

  • Sales were up 87% in Latin America, 60% in Asia, 56% in EMEA, and 47% in the USA.
  • All regions had average sale prices of $1,822 or more.

Of particular interest was Nicks’ description of the different types of investors at GoDaddy:

  • High-volume investors - those doing closeouts and buying at registration-fee prices. The average sale price was between $300 and $500.
  • High-value investors - those buying premium domains at $2,000+. This group tended to sell at $20k+.
  • “Sweet Spot” investors - those buying at $30-$60 and selling at an average of $2,000 - $5,000.

Nicks recommended that new investors use the “Sweet Spot” method.

If you ever thought about domain investing, you may like to watch the whole speech.

I must say that this did get me to check the value of some of my old unused domains with GoDaddy’s Value & Appraisal tool, and instead of letting them expire, they are now up for sale. Even if I get 20% of the appraisal tool’s valuation, I will be happy, but I suspect it may take a while.


Special deal at Namecheap for new customers

Between February 11th and February 18th, new Namecheap customers can get a .com domain registration for just $5.88.

This is a limited-time offer, available through this link.

Discounted .com domain for new Namecheap customers. © The Webmaster

Jonathan Griffin. Editor @ The Webmaster

About the author

Editor, Hosting Expert, SEO Developer, & SEO Consultant.

Jonathan is currently the Editor & CEO at The Webmaster. He is also an SEO Developer offering consultancy services, primarily to other web development companies. He specializes in the technical side of SEO, including site audits, development of SEO related features, and site structure & strategy.

In his spare time, Jonathan has a passion for learning. He regularly undertakes professional courses on subjects ranging from Python and web development to digital marketing and Advanced Google Analytics.

Read more about Jonathan Griffin on our About Page.