Google Search Console is a crucial tool for monitoring and improving the functionality of a website in order to achieve a better user experience. Since you’re already familiar with the tool, it’s time to improve your skills and start utilizing one of our favorite reporting features – index coverage. Monitoring the errors on your website will allow you to fix any issues as soon as possible. This will not only help keep your website “healthy” but will also help improve your rankings.
What is the Index Coverage Report Feature?
The new index coverage reporting feature gives an overview of each page of your website that the Google bots tried to crawl and index. If there were any issues with crawling or indexing the pages, an error will appear in this section informing you of the issue. When there is an error, Google Search Console will send you an email letting you know what the issue is so that you can get it fixed.
Index Coverage Errors
If you’ve received an email from Google Search Console informing you of an error, there are seven possible errors you might see. Some of these errors are easier to fix than others, but the sooner you get them fixed, the better!
Server Error (5XX)
A 500-level error happens when a page on your website fails to load due to a server-wide issue or a brief server disconnection. If your website’s pages load properly but the error still remains in Google Search Console, don’t fret! The issue will most likely resolve itself after the next crawl. To speed up the process, simply resubmit your sitemap or use the Fetch as Google tool.
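Before resubmitting, it can help to confirm the page really does load again. Here is a minimal sketch of a retry check with backoff; fetch_status is a hypothetical callable that returns an HTTP status code for a URL, so you would swap in your own HTTP client.

```python
import time

def is_healthy(url, fetch_status, retries=3, delay=1.0):
    """Return True once the page answers with a non-5xx status."""
    for attempt in range(retries):
        status = fetch_status(url)
        if status < 500:
            return True
        time.sleep(delay * (2 ** attempt))  # back off between retries
    return False
```

A brief server disconnection usually clears within a retry or two; if the page keeps returning 5xx, the problem is on the server side, not in Google Search Console.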
Redirect Error
A redirect error appears when a URL on your website redirects repeatedly, for example after your site’s URL structure has changed. If you haven’t changed your domain name recently, a redirect error can also appear after switching from an unsecured (HTTP) to a secure (HTTPS) protocol. To fix this issue, update your internal links, for instance with a search-and-replace plugin such as Better Search Replace.
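A quick way to understand “redirects repeatedly” is to walk your own redirect rules and see whether they ever come back to a URL already visited. This sketch assumes a hypothetical redirect_map dict of {source_url: target_url} representing your site’s redirect rules:

```python
def follow_redirects(url, redirect_map, max_hops=10):
    """Follow redirects until a final URL is reached; raise on a loop."""
    seen = {url}
    while url in redirect_map:
        url = redirect_map[url]
        if url in seen or len(seen) > max_hops:
            raise ValueError("redirect loop detected")
        seen.add(url)
    return url
```

For example, an HTTP-to-HTTPS rule chained to a final destination resolves cleanly, while two rules pointing at each other raise the loop error Google’s crawler is complaining about.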
Submitted URL Blocked by Robots.txt
This error will appear when a Google bot attempts to crawl your website and a specific page is blocked from crawling by your robots.txt file. To see which pages are blocked, simply run the robots.txt Tester in Google Search Console.
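You can also sanity-check your rules locally with Python’s standard-library robots.txt parser before running Google’s tester. The rules below are sample data, not your actual file:

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /drafts/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))   # True
print(parser.can_fetch("Googlebot", "https://example.com/drafts/wip"))  # False
```

If a URL you submitted in your sitemap comes back False here, that is exactly the conflict this error is reporting.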
Submitted URL Marked as “No Index”
All of the pages on your website can appear in search results as long as nothing blocks them from being indexed. A submitted URL marked as “no index” error will appear when you submit a page for indexing (for example, in your sitemap) but the page carries a “noindex” directive in a meta tag or HTTP response header.
If you’re receiving the “no index” error on pages you want to be indexed, simply remove the “noindex” directive. If you are receiving this error on pages you don’t want to be indexed, remove them from your sitemap so they are no longer submitted for indexing.
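To track down which pages carry the directive, you can scan a page’s HTML for a “noindex” robots meta tag using only the standard library. This is a sketch; remember the directive can also arrive via an X-Robots-Tag response header, which this parser won’t see:

```python
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    """Flags pages whose <meta name="robots"> content includes noindex."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if (tag == "meta"
                and attrs.get("name", "").lower() == "robots"
                and "noindex" in attrs.get("content", "").lower()):
            self.noindex = True

page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
finder = NoindexFinder()
finder.feed(page)
print(finder.noindex)  # True
```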
Submitted URL Seems to be a Soft 404
Soft 404 errors happen when a page appears to be broken to the Google bots but doesn’t return a proper 404 not found response. Common causes for a soft 404 error include vacant category pages and themes that automatically create pages. To help avoid soft 404 errors, keep the back end of your website clean and organized – don’t hoard themes, plugins, and drafts that aren’t being utilized.
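In code terms, a soft 404 is a page that says 200 OK while the body looks empty or broken. This heuristic sketch shows the idea; the phrase list and length threshold are illustrative assumptions, not Google’s actual rules:

```python
# Phrases that suggest a "not found" page served with a 200 status.
NOT_FOUND_PHRASES = ("page not found", "no products found", "nothing here")

def looks_like_soft_404(status, body, min_length=200):
    """Flag pages that return 200 OK but look like empty or error pages."""
    if status != 200:
        return False  # a real 404/410 is already reported correctly
    text = body.lower()
    if any(phrase in text for phrase in NOT_FOUND_PHRASES):
        return True
    return len(text.strip()) < min_length  # near-empty page, e.g. a vacant category
```

The fix is to make genuinely missing pages return a real 404 (or 410) status, and to fill or remove near-empty pages like vacant categories.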
Submitted URL Not Found (404 Error)
A 404 error will happen when you delete a page from your website without removing it from your sitemap or setting up a 301 redirect correctly. To fix it, simply set up a 301 redirect for the page(s) with 404 errors. To prevent them in the future, keep your sitemap up to date and always set up a 301 redirect for deleted pages!
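The 301 fix boils down to a lookup from the deleted URL to its replacement. Here is a minimal sketch of that logic; the paths are placeholders for your own pages, and in practice the redirect would live in your server config or a plugin rather than application code:

```python
# Hypothetical map of deleted pages to their replacements.
REDIRECTS = {
    "/old-services": "/services",
    "/2019-promo": "/promotions",
}

def handle_request(path):
    """Return (status, location): 301 for moved pages, 404 otherwise."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, None
```

A 301 tells Google the move is permanent, so the old URL’s ranking signals pass to the new page instead of producing a 404 error.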
Submitted URL has a Crawl Issue
A crawl issue error will happen when the Google bots are prevented from crawling a page of your website for a reason that doesn’t fall into the other categories. By improving site speed and unblocking resources such as CSS and JavaScript files, you can avoid many crawl issues. The Fetch as Google tool will help you get any fixes reflected in Google Search Console in a timely manner.