Google Webmaster Guidelines have been updated

Google has recently updated its Webmaster Guidelines. Changes have been made to its recommended search engine optimization (SEO) practices for the Design & Content, Technical, and Quality Guidelines. For reference, you will find the new guidelines here, with an archived version of the old guidelines here.

From a purely cosmetic viewpoint, the new guidelines are easier to read. They are no longer presented on one very long page; instead, several sections have been moved into expandable tabs. This makes it easier to find the information you need at a glance, without having to scroll too far down the page.

It is worth reading the new guidelines in full and, if you have time, checking out the old guidelines too. We summarize some of the changes below:

Introduction section

Previously, this section introduced the guidelines and provided some in-page links to take you to the relevant sections. The new version opens with a short description followed by expandable tabs for each of the General Guidelines.

Old version

[Screenshot: the old Webmaster Guidelines introduction]

New version

[Screenshot: the new Webmaster Guidelines introduction]

The advice on how to submit your site to Google is now contained in a new section titled "Help Google find your pages." This is much more detailed than the three points in the old version, as it merges several points from the old Technical Guidelines (which we will come to shortly).

Help Google find your pages section

To view this section, click on the title of the expandable tab to show the following:

[Screenshot: the "Help Google find your pages" section expanded]

You will see that this section is a merger of the old introduction, some of the Design and Content Guidelines, and some of the Technical Guidelines. This groups all the relevant information into one easy-to-use section.

Some of the differences are as follows:

  1. Old: Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
     New: Ensure that all pages on the site can be reached by a link from another findable page. The referring link should include either text or, for images, an alt attribute, that is relevant to the target page.

  2. Old: Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links, you may want to break the site map into multiple pages.
     New: Provide a sitemap file with links that point to the important pages on your site. Also provide a page with a human-readable list of links to these pages (sometimes called a site index or site map page).

  3. Old: Keep the links on a given page to a reasonable number.
     New: Limit the number of links on a page to a reasonable number (a few thousand at most).

  4. Old: To help Google fully understand your site's contents, allow all of your site's assets, such as CSS and JavaScript files, to be crawled. The Google indexing system renders webpages using the HTML of a page as well as its assets such as images, CSS, and JavaScript files. To see the page assets that Googlebot cannot crawl and to debug directives in your robots.txt file, use the Fetch as Google and the robots.txt Tester tools in Search Console.
     New: Use the robots.txt file on your web server to manage your crawling budget by preventing crawling of infinite spaces such as search result pages. Keep your robots.txt file up to date. Learn how to manage crawling with the robots.txt file. Test the coverage and syntax of your robots.txt file using the robots.txt testing tool.

Commenting on each of the differences in turn:

  1. The new version is easier for nontechnical readers to understand. The advice has also been updated to cover other types of links, including image links, and recommends an alt attribute on image links, which plays the same role as the anchor text of a normal text link.
  2. There is some duplication here. Reference is now made to offering both a sitemap file for search engines and a human-readable site map page for users. The second point is also covered under the heading "Ways to help Google find your site," as you can see in the image above. A minimal sitemap sketch follows this list.
  3. Traditional advice was to have no more than 100 links on a page (although Google itself only ever said "a reasonable number"); the new version quantifies this as "a few thousand at most."
  4. This relates to changes in how Google now crawls your content, and it refers to more detailed resources on managing your crawling budget; see the robots.txt sketch below.
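
To illustrate point 2, here is a minimal sketch of a sitemap file in the standard sitemaps.org XML format. The example.com URLs and dates are placeholders, not real pages:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per important page on the site -->
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2016-02-01</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/about/</loc>
        <lastmod>2016-01-15</lastmod>
      </url>
    </urlset>

The human-readable site map page mentioned in the same point is simply an ordinary HTML page of links; the XML file above is aimed at search engines.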

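For point 4, here is a minimal robots.txt sketch that keeps crawlers out of an infinite space such as internal search result pages. The /search/ path is an assumed example, not something specified by Google:

    # Applies to all crawlers
    User-agent: *
    # Prevent crawling of internal search result pages (an infinite space)
    Disallow: /search/
    # Point crawlers at the sitemap file
    Sitemap: https://www.example.com/sitemap.xml
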
How Google understands your pages section

[Screenshot: the "How Google understands your pages" section]

This section contains several items from the old Design and Content Guidelines as well as some Technical Guidelines.

Some of the differences are as follows:

  1. Old: Make sure that your <title> elements and ALT attributes are descriptive and accurate.
     New: Ensure that your <title> elements and alt attributes are descriptive, specific, and accurate.

  2. Old: Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
     New: Design your site to have a clear conceptual page hierarchy.

  3. Old: Review our recommended best practices for images, video and rich snippets.
     New: Follow our recommended best practices for images, video, and structured data.

  4. Old: If your company buys a content management system, make sure that the system creates pages and links that search engines can crawl.
     New: When using a content management system (for example, Wix or WordPress), make sure that it creates pages and links that search engines can crawl.

  5. Old: To help Google fully understand your site's contents, allow all of your site's assets, such as CSS and JavaScript files, to be crawled. The Google indexing system renders webpages using the HTML of a page as well as its assets such as images, CSS, and JavaScript files. To see the page assets that Googlebot cannot crawl and to debug directives in your robots.txt file, use the Fetch as Google and the robots.txt Tester tools in Search Console.
     New: To help Google fully understand your site's contents, allow all site assets that would significantly affect page rendering to be crawled: for example, CSS and JavaScript files that affect the understanding of the pages. The Google indexing system renders a webpage as the user would see it, including images, CSS, and JavaScript files. To see which page assets Googlebot cannot crawl, or to debug directives in your robots.txt file, use the blocked resources report in Search Console and the Fetch as Google and robots.txt Tester tools.

  6. Old: Not covered.
     New: Make your site's important content visible by default. Google is able to crawl HTML content hidden inside navigational elements such as tabs or expanding sections; however, we consider this content less accessible to users, and believe that you should make your most important information visible in the default page view.

  7. Old: Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.
     New: Make a reasonable effort to ensure that advertisement links on your pages do not affect search engine rankings. For example, use robots.txt or rel="nofollow" to prevent advertisement links from being followed by a crawler.

Commenting on each of the differences in turn:

  1. The addition of the word "specific" means you now need to be more careful when writing your alt attributes. Bad practice can creep in with this kind of optimization: you might type in a few keywords that are not actually specific to that particular image. (See the HTML sketch after this list.)
  2. Part of this was covered before, but the emphasis here is on getting site owners to think deliberately about site structure.
  3. Instead of asking you to "review" the recommended best practices, Google now tells you to "follow" them. A small change, but it makes their importance clearer.
  4. Naming examples of content management systems (covering both a traditional CMS, WordPress, and a website builder, Wix) makes the point easier to read and understand.
  5. This now points you to the Blocked Resources report in Search Console, encouraging you to check what Google actually sees when crawling your site.
  6. This new addition states that Google gives less weight to content hidden behind tabs or expanding sections. You should make sure all your important content is immediately visible on page load, as in the sketch below.
  7. This provides more practical advice on handling advertisements: use robots.txt or rel="nofollow" to stop advertisement links from being followed by crawlers, as shown in the sketch after this list.
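
To pull points 1, 6, and 7 together, here is a minimal HTML sketch. The page content, file names, and ad URL are invented placeholders:

    <!DOCTYPE html>
    <html lang="en">
    <head>
      <!-- Point 1: a descriptive, specific, and accurate title -->
      <title>Hand-Thrown Stoneware Mugs | Example Pottery Studio</title>
    </head>
    <body>
      <!-- Point 1: alt text specific to this image, not a list of generic keywords -->
      <img src="blue-glaze-mug.jpg"
           alt="Stoneware mug with a speckled blue glaze">

      <!-- Point 6: important content visible by default, not hidden in a tab -->
      <p>Each mug is thrown by hand and fired twice in our studio kiln.</p>

      <!-- Point 7: an advertisement link marked so crawlers do not follow it -->
      <a href="https://ads.example.com/offer" rel="nofollow">Sponsored: kiln supplies</a>
    </body>
    </html>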

How visitors use your pages section

[Screenshot: the "How visitors use your pages" section]

This section contains several items from the old Design and Content Guidelines as well as some Technical Guidelines.

Some of the differences are as follows:

  1. Old: Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images. If you must use images for textual content, consider using the ALT attribute to include a few words of descriptive text.
     New: Try to use text instead of images to display important names, content, or links. If you must use images for textual content, use the alt attribute to include a few words of descriptive text.

  2. Old: Check for broken links and correct HTML.
     New: Ensure that all links go to live web pages. Use valid HTML.

  3. Old: Google strongly recommends that all webmasters regularly monitor site performance using Page Speed, YSlow, WebPagetest, or other tools. For more information, tools, and resources, see Let's Make The Web Faster.
     New: Optimize your page loading times. Fast sites make users happy and improve the overall quality of the web (especially for those users with slow internet connections). Google recommends that you use tools like PageSpeed Insights and Webpagetest.org to test the performance of your page.

  4. Old: Not covered.
     New: Design your site for all device types and sizes, including desktops, tablets, and smartphones. Use the mobile friendly testing tool to test how well your pages work on mobile devices, and get feedback on what needs to be fixed.

  5. Old: Test your site to make sure that it appears correctly in different browsers.
     New: Ensure that your site appears correctly in different browsers.

  6. Old: Not covered.
     New: If possible, secure your site's connections with HTTPS. Encrypting interactions between the user and your website is a good practice for communication on the web.

  7. Old: Not covered.
     New: Ensure that your pages are useful for readers with visual impairments, for example, by testing usability with a screen-reader.

Commenting on each of the differences in turn:

  1. There is now a stronger focus on the alt attribute: instead of "consider using" it, the guideline now tells you to "use" it.
  2. This has been rewritten in simpler language and now links to further resources for checking that your HTML is valid.
  3. This item has been simplified and now recommends only two tools for testing your site speed. If you want further recommendations or advice, check out our article on the topic here.
  4. With mobile friendliness now part of the search algorithm, it is important that your website works well on mobile devices. A minimal viewport sketch follows this list.
  5. A small wording change.
  6. With HTTPS now a ranking factor, Google recommends serving your website over HTTPS; one common way to enforce this is sketched below.
  7. Another new addition suggests that you check that your site is usable by people with visual impairments, for example by testing it with a screen reader.
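
For point 4, the usual starting point for a mobile-friendly page is the viewport meta tag combined with responsive CSS. A minimal sketch, in which the stylesheet name is an invented placeholder:

    <head>
      <!-- Use the device width rather than a zoomed-out desktop layout -->
      <meta name="viewport" content="width=device-width, initial-scale=1">
      <!-- The responsive styling itself (e.g. media queries) lives in the CSS -->
      <link rel="stylesheet" href="styles.css">
    </head>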

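For point 6, one common way to move visitors onto HTTPS is a server-side 301 redirect. A minimal sketch assuming an Apache server with mod_rewrite enabled (other web servers have equivalent directives):

    # .htaccess: redirect every HTTP request to the HTTPS version of the same URL
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
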
Quality guidelines section

There are no changes in the quality guidelines. This includes the main introductory paragraphs, "Basic Principles," and "Specific Guidelines." The only minor point is that Google no longer repeats the words "Quality Guidelines" in the subheadings. Could this be a hint to avoid over-optimizing keywords? Possibly, but perhaps that is something we already knew.

[Screenshot: the Quality Guidelines section]

Guidelines that no longer exist

So far we have covered the changes and additions, but a few guidelines have been deleted in their entirety:

  • Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.
  • If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.

The robots.txt point is in fact still covered, with links to further resources about managing the crawling budget, in the new "Help Google find your pages" section. Presumably, the standalone guideline was removed to make the guidelines easier to understand.

As for dynamic pages, we presume that because they have become commonplace, this guideline is no longer needed.

Need help choosing a hosting provider?
Check out our top user-rated host: SiteGround