Google: Your Sitemap Showing for Search Queries Indicates Site Quality Issues

If you Google something specific to your site and one of the results is your XML sitemap, it is likely that you have site quality issues.

Google’s John Mueller said on Twitter last week that if your sitemap ranks in Google Search for a query, it indicates that your site may have quality issues.

The comment came about when Twitter user Bolocan Cristian posted a screenshot of a search result, asking Mueller how he could remove the sitemap file from the search results.

Cristian followed up, asking, “Why is Google indexing URLs like sitemaps.xml?”

Mueller responded with criticism of the site’s content, saying, “there’s not a lot of good content that’s worth showing.”

He continued that “when XML files are seen as being more relevant than your pages, that’s often a sign of your pages being really suboptimal. I’d work on that first, if this query is important.”


Mueller followed up with some advice on how to remove the URL from the Search Engine Results Pages (SERPs).

However, he reiterated his advice to focus on content first if that particular search query was important.

Is it normal for XML Sitemaps to be indexed?

I checked the sites I manage, and none of the sitemaps were indexed. I then checked some other popular sites, and a surprising number were (Seroundtable, for example).

Google indexing XML sitemaps in SERPs. © The Search Review.

I think it is worth clarifying that your sitemap showing up as indexed is not an issue.

The issue is when the sitemap shows in the search results when you type in a keyword relating to your site’s content. This indicates that Google thinks your sitemap is more relevant to that query than the content on your site.
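A quick way to test this yourself is to compare two searches (example.com here stands in for your own domain). First, check whether any XML files on your site are indexed at all:

site:example.com filetype:xml

Then search for a keyword your site should rank for. If the sitemap only shows up in the first search, there is nothing to worry about; if it shows up in the second, Mueller’s advice about content quality applies.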

The only official reference to sitemaps being indexable is a web archive of an old Google Group thread from 2008.

In it, Mueller said that the file might have been linked to from somewhere, indicating that Google won’t index sitemaps unless they are linked to from elsewhere on the web.

Mueller said, “It does look like we have some of your Sitemap files indexed. It’s possible that there is a link to them somewhere (or perhaps to the Sitemaps Index file).”

The URL removal tool in Search Console only removes a file for 90 days. If there is a link to it, the file will most likely find its way back into the index once the 90 days are up.

The best way to stop your XML Sitemap file from being indexed

Mueller recommends setting the X-Robots-Tag HTTP header to “noindex” for non-HTML content.

I’d do it slightly differently, in that I would target the actual sitemap file specifically.

On an Apache server (this also works when NGINX acts as a reverse proxy in front of Apache), the easiest way is to add the following snippet to your .htaccess file:

# Requires the mod_headers module; FilesMatch takes a regex, so anchor it and escape the dot
<FilesMatch "^sitemap\.xml$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
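Note that the pattern above only matches a file named sitemap.xml exactly. If your site serves several sitemap files, such as a sitemap index plus child sitemaps, you would need to widen the regular expression to cover them.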

For NGINX, you could add the following to the configuration file:

# Exact-match location; note the leading slash is required
location = /sitemap.xml {
    add_header X-Robots-Tag "noindex";
}
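Once either rule is in place, it is worth confirming the header is actually being served. Below is a minimal sketch using Python’s standard library; https://example.com/sitemap.xml is a placeholder for your own sitemap URL:

import urllib.request

# Placeholder URL; swap in your own sitemap location.
URL = "https://example.com/sitemap.xml"

# A HEAD request is enough, as we only care about the response headers.
request = urllib.request.Request(URL, method="HEAD")

with urllib.request.urlopen(request) as response:
    # .get() returns None if the header is missing (lookup is case-insensitive).
    tag = response.headers.get("X-Robots-Tag")
    print("X-Robots-Tag:", tag)  # Expect: noindex

Alternatively, the URL Inspection tool in Search Console will report whether a noindex has been detected for a given URL.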