Welcome to this blog series, which will look at the shiny new features as and when they are rolled out to the Search Console beta. Here at Rice, we already love the new version and wanted to explain the new features, how you can use them, and how we’ve used them for our clients. We’ve created a guide discussing two of the new reports: Search Performance and Index Coverage. Enjoy!
Contents
- 1. How to Access & Report Sharing
- 2. Search Performance Report
- 3. Index Coverage Report
- 4. Conclusion
1. How to Access & Report Sharing
1.1 How to Access the New Search Console
Now let’s get started with our run-through of the first new section within Search Console!
Firstly, if you’re not sure how to access the new Search Console, you can go directly to it by navigating to https://search.google.com/search-console in your browser. From there, just select the property you wish to view via the Search Property drop-down.
Alternatively, if you are within the current Search Console, you can click the link ‘Try the new Search Console’.
Once you’ve selected a property, preferably one that you know will have a lot of data to look at, you’ll see the list of options on the left – Status is the section which is shown first by default.
1.2 Report Sharing
Now before we delve into the individual reports, we’ll just mention another new feature in the beta: Sharing. This allows you to share a single page with anyone who has the link, without granting access to other areas. It’s very useful if you want to share a certain report on a temporary basis or to highlight a problem to website developers. If you wish to share a page, you’ll first have to enable it: just go to the blue share icon on a shareable page.
Then click Enable Sharing, which will turn the icon a different colour and show an open padlock. You can then copy the link and send it to whoever you wish.
The icon is also useful for knowing which pages you’ve currently shared. If you wish to un-share a page, just click the Share button again and disable sharing.
Going back to today’s topic, you should already be on the correct Status page, but if you’re not, just click ‘Status’ in the left-hand menu. You should see two graphs for the sections we are going to go through today: Performance and Index Coverage.
The Status page is a brief overview of the reports and any problems found. The Search Performance graph here only shows clicks, so you’ll have to click through to the full report to see impressions, CTR, etc. To see any of them in more detail, just click ‘Open Report’.
2. Search Performance Report
2.1 Overview
This is essentially the new and improved version of the existing Search Analytics report we all know and love. It’s a fantastic resource for seeing how your website is performing within search results. It has many metrics available to help improve your search performance, such as how often a query or page associated with your website appears (impressions), how many times it’s been clicked (clicks) and its average position over time (position), with many filters available such as device, country, and date.
The new version now offers a massive 16 months’ worth of data, compared to the previous 90 days, making it much easier to see improvements and issues over a larger date range, as well as those year-on-year comparisons.
2.2 Design Changes
Here is a visual comparison of the current Search Analytics report in Search Console and the Search Performance report in the new Search console.
Search Analytics Report filter options (current Search Console)
Search Performance Report filter options (new Search Console)
Looking at the Search Performance report, we can see that the design is a lot more streamlined, with fewer options shown by default on the page. The metrics for Clicks, Impressions, CTR and Position have been moved further down so that they now sit above the graph.
In the existing Search Analytics report, you have to tick and untick a checkbox above the type/date filters to select metrics; here you can click on a metric box directly to switch between them, which works so much better! Each box also contains a helpful pop-up explaining what the metric means, handy if you’re new to the tool.
By default, the report filters are set to show Search Type: Web and Date: Last 3 months. To change these, just click on each and update the results. For Search Type you can choose between Web, Image, and Video; you can also choose to Compare these types if you wish.
The Date filter has a lot more options than the current Search Analytics report.
You can pick from some set filters, such as the last 7 or 28 days, or the last 3, 6 or 12 months. Alternatively, to see everything, just click Full Duration, which will show you the full 16 months if available. The custom date selector is also useful if you have your own set of dates that you want to see. The Compare tab is what you would expect and allows you to compare set filters or your own date ranges.
The ‘New +’ filter allows you to choose between filter/compare options for the data groups: Query, Page, Country, and Device. Within Search Analytics, these options are currently contained within dropdowns on the data group links themselves.
Moving the filter/compare options for the groups to the top, alongside the date and search type, means that the data group buttons can be moved further down. Separating these makes sense, as you don’t need to keep scrolling up to switch between them.
You may have noticed that the Search Appearance feature is not shown by default on this report. This is because it will now only show if there are impressions from at least one feature in the list below:
- Search result link
- AMP article rich results
- AMP non-rich results
- Job listing
- Job details
This looks to be a nice way to save space, but while writing this we’ve not been able to get the Search Appearance filter to appear, despite websites fulfilling the above criteria.
2.3 What to look for
The first thing you are likely to do is to go for those large date range comparisons because, well, now you can! It’s also great to better see those seasonality trends, which before were not that easy within the 90-day limit.
As we’ve mentioned earlier this report is very similar to the Search Analytics report you might already be used to. For more information on how you can use this and other features in Search Console to help improve your website’s visibility, have a listen to Laura’s BrightonSEO talk on Search Console Quick Wins.
Safe to say, we are currently loving this upgrade to what was already a fantastic report. If we were to ask for an addition, it would be the ability to add annotations to the reports, as you can in Google Analytics, in order to track any web changes that are implemented.
If you’d like more information on this new report, check out the Google Search Performance Guide.
3. Index Coverage Report
The Index Coverage report is the most exciting new feature of the new Search Console, simply because it provides So. Much. Data on how Google is viewing and treating the URLs on a website (which we love). The current Search Console is very much lacking here, with only limited data such as index counts and error messages available.
3.1 Statuses
Within the new Search Console, the basic graph shown on the Status page gives a very brief insight into the URLs, showing how many URLs are valid and how many have errors.
Clicking ‘Open Report’ takes you fully into the report. As with the metric boxes in the Search Performance report, you can click on the Status boxes to show the individual elements in the graph. The statuses available are Error, Valid with warnings, Valid and Excluded. Here is a quick summary of what it means when a URL is placed within one of the status sections.
Error: The page has not been indexed due to an error. Look at the error description within the status box for more information and how to fix it.
Valid with warnings: The page has been indexed but there are some issues. It may be that you don’t intend for these URLs to be indexed, in which case they should be checked, along with where they originate from.
Valid: The page was fully indexed.
Excluded: The page has been excluded from the index, either because it hasn’t been indexed yet or because it has been deliberately excluded, such as via a noindex meta tag or the robots.txt file.
View of the Index Coverage report
What’s also great about this report is that you can choose to enable ‘Impressions’, which will overlay the impressions graph from the Search Performance report. This helps you to see how errors or changes may be affecting impressions. For example, if you’re seeing errors go up and impressions go down, it’s something that should be prioritised and looked into.
You can also click on an individual day to see the specific stats for that date. The main graph also highlights dates where a new issue was found, which is very useful; if you see this indicator and click on the issue, you’ll be taken straight to it. It would be great if we were able to add our own annotations, as we currently can within Google Analytics, to specify changes that have been implemented.
Below the graph, you’ll see the Status box, which is where messages called Status Reasons appear. Each of the statuses (Valid, Excluded, Error, etc.) can have one or more specific reasons. Clicking any of the status boxes above will update the Status box with the reasons Google has spotted.
For example, if you’ve got errors showing, make sure that the Error box is the only one selected and check the Status box for the issues. If you would rather see all the reasons at once, just make sure every box is selected.
One current frustration is that the URLs in the report are not linked, so you can’t open the URL itself. Clicking on a URL within a reason will bring up the Page Details box but doesn’t give you an option to go to the actual URL. Copying the URL can also be a bit of a nightmare, but these are all issues that we are sure will be addressed in future versions of the beta.
3.2 Error Fixes / Validation
Here is an example of some of the reasons that may show when the Error status is clicked on.
This report has highlighted two different issues, with some pages containing a server error and some a redirect error. Clicking on the individual errors will then take you further into the specific URLs with this error and when they were seen.
Clicking on an individual URL on this page will bring up some quick links of ways to check the page, including Fetch as Google and testing robots.txt.
You may have noticed the ‘Validation’ column within the Status Reasons list. This is a way to signal to Google when issues have been fixed. If an error has a message of Not Started, fixing it should be a priority. Once you believe you have fixed the problems causing the issue, you can click the message and then click Validate Fix.
This will start the validation process, with the number of URLs shown. It may take several days, so patience is key!
When the validation begins, Google will check a few pages; if the error still exists on these pages, the validation will stop. If these pages no longer contain the error, the status will change to Started.
If there are no issues after all the URLs have been checked, the validation will change to Passed. During the validation process, the URL list will update with each URL’s status.
If the validation fails, it’s possible to ask for a revalidation. Try not to request another validation in the middle of an existing one; wait for the current validation to finish first.
3.3 Status Reasons
So while we’ve seen some status reasons for errors, it’s important to note that not all of the reasons you’ll see in the Status box will be things you need to fix. Think of this report as a way to check that the URLs on your website are being crawled and indexed (or not indexed) as expected. The report groups together URLs that it feels you might need to double-check, or simply confirm are behaving correctly.
For example, if you’ve got URLs which are canonicalised to another URL, you’ll see a list for that (Alternative page with proper canonical tag); if you’ve submitted URLs which redirect, you’ll see a list for that (Page with redirect); if you’ve blocked a group of URLs via robots.txt, you’ll see a list for that (Blocked by robots.txt).
So, if you’ve got a large list of messages to go through, don’t panic! The amount of new information Google gives you on how it views the URLs on your website can be overwhelming at first, but all the messages help to ensure that you are aware of which URLs Google is seeing, what issues are being found (if any), and which URLs are excluded and why. You won’t see the Validation Status or Request Fix options for all messages.
Let’s look through some of the many Status reasons which were included for URLs in our example website within the Excluded section.
The Status Reason descriptions below come from Google’s Index Coverage help guide. I’d recommend saving or printing them for future reference until you get used to what each one means (there are a lot!). Even if you know what a term means, it’s always worth double-checking the specific Google description for it. These are the descriptions:
“Crawled – currently not indexed: The page was crawled by Google, but not indexed. It may or may not be indexed in the future; no need to resubmit this URL for crawling.”
This is admittedly quite a vague description, but if you’ve got an important URL in that list, it would be worth double-checking the page to ensure everything is correct.
“Discovered – currently not indexed: The page was found by Google, but not crawled yet.”
This is handy for knowing which URLs have been submitted but not yet crawled, and therefore why they haven’t been indexed yet. If you’ve recently added new content and it hasn’t been indexed, nor does it appear here, then search engines may be struggling to crawl your site.
“Page with redirect: The URL is a redirect, and therefore was not added to the index.”
Exactly what it says on the tin: the URL is a redirect and so hasn’t been indexed. As we’ll cover further on, it’s worth double-checking these.
“Excluded by ‘noindex’ tag: When Google tried to index the page it encountered a ‘noindex’ directive, and therefore did not index it. If you do not want the page indexed, you have done so correctly. If you do want this page to be indexed, you should remove that ‘noindex’ directive.”
This is an important one to check: make sure that the URLs in this list have not been set to noindex by mistake.
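If you want to bulk-check a list of URLs for stray noindex directives (both the robots meta tag and the X-Robots-Tag HTTP header), a short script can help. Below is a minimal sketch, assuming the requests library is installed; the URLs are hypothetical placeholders:

```python
import re
import requests

# Hypothetical URLs, e.g. exported from the "Excluded by 'noindex' tag" list.
urls = [
    "https://www.example.com/important-page/",
    "https://www.example.com/old-offer/",
]

# Simple pattern for a robots meta tag containing "noindex"; it assumes the
# name attribute appears before the content attribute.
meta_noindex = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

for url in urls:
    response = requests.get(url, timeout=10)
    header_hit = "noindex" in response.headers.get("X-Robots-Tag", "").lower()
    meta_hit = bool(meta_noindex.search(response.text))
    print(f"{url}: header noindex={header_hit}, meta noindex={meta_hit}")
```

Any URL that reports a noindex hit but should be indexable is one to fix straight away.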
3.4 URL Discovery Filter
Another welcome addition is the drop-down found below the Index Coverage heading. This allows you to filter by the way in which Google discovered the URL. By default, this is set to All Known Pages, but you are also able to select All Submitted Pages or specific sitemap URLs.
This is perfect for diagnosing sitemap issues you may be having, especially for large sitemaps, as the same status messages and graph will be shown here.
You can also get the Index Coverage report for individual sitemaps via the Sitemaps option in the left-hand menu. When you click into an individual sitemap, you can click See Index Coverage, which will take you to the same screen.
When new issues are found, Search Console will email you about them, similar to the email confirmation that’s sent when submitting a validation. This helps to ensure that you’re up to date with issues and what may be happening within your website.
3.5 How it’s been helpful
We’ve been using the Index Coverage report for a couple of weeks and it has already been useful for highlighting issues that need to be investigated.
One can be seen in the screenshot below:
As you can see, the number of Excluded URLs suddenly jumped at the beginning of January. Although these weren’t indexable, it does show that Google was suddenly seeing a lot more URLs during a crawl. There were many status reasons in the Status box for the Excluded status, but the trend graph showed which one was causing the issue.
Clicking into “Alternative page with proper canonical tag” showed that there was a pagination issue where the site was generating an endless number of links to pages that didn’t exist: URLs such as /page/2573/ and /page/2574/, when there should only be 20 pages of news.
They show up as canonicalised because, unfortunately, the website’s pagination is currently canonicalised to the main news page (something we are also trying to rectify!), but if they were not canonicalised you would expect the sharp rise in URLs to show up elsewhere in the report. These links were soon flagged within a crawl and fixed.
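If you spot a similar pattern, a quick script can confirm what a sample of those URLs is being canonicalised to. Here’s a minimal sketch, assuming the requests library is installed; the URL pattern is a made-up stand-in for the pagination issue described above:

```python
import re
import requests

# Hypothetical paginated URLs to spot-check, mimicking the issue above.
urls = [f"https://www.example.com/news/page/{n}/" for n in (2, 20, 2573)]

# Simple pattern for a rel="canonical" link tag; it assumes rel appears
# before href within the tag.
canonical_re = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

for url in urls:
    response = requests.get(url, timeout=10)
    match = canonical_re.search(response.text)
    canonical = match.group(1) if match else "none found"
    print(f"{url} -> status {response.status_code}, canonical: {canonical}")
```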
Another example of an issue was seen in URLs grouped under the Crawl Anomaly reason for a different website. A URL is grouped within the Crawl Anomaly set when Google has difficulty fetching it. The URLs associated with this reason steadily increased throughout January, highlighting that Google kept finding issues when crawling some of the URLs.
We downloaded the example URLs to investigate, and also ran a crawl, finding that some of the issues were due to incorrect relative linking, which was quickly marked to be fixed.
Although the report doesn’t state where the link came from (something that would be great in a future update!), we were able to check this against the URL itself and a crawl of the website to find and resolve the error.
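As a hypothetical illustration of how incorrect relative linking produces broken URLs like these, here is how a link with a missing leading slash resolves against a directory-style page URL (the paths are made up):

```python
from urllib.parse import urljoin

# The page that the link appears on.
base = "https://www.example.com/news/category/"

# A root-relative link resolves as intended.
print(urljoin(base, "/contact/"))
# -> https://www.example.com/contact/

# Omitting the leading slash resolves the link relative to the current
# directory, producing a URL that probably doesn't exist.
print(urljoin(base, "contact/"))
# -> https://www.example.com/news/category/contact/
```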
These are just two examples of how the report can be used to spot errors that may not necessarily be highlighted as such within the report, or in the classic version of Search Console. This is particularly helpful for larger websites where you may not be able to complete a full crawl.
The new reports are a lot more helpful when you are migrating a website. As mentioned in the excellent blog post by Aleyda Solis on Monitoring Web Migrations, the Index Coverage report allows you to monitor which URLs have been indexed and also which have issues.
3.6 Tips for using the Index Coverage Report
We would recommend regularly checking your website’s Index Coverage report for large increases or decreases in URLs, and not only within the errors. You should get an email about an increase in error-related issues, but as shown above, other problems may only show up with a deeper look into the report.
Check that the URLs provided for each reason are correct. For example, double-check the noindexed and blocked-by-robots.txt lists for any URLs that shouldn’t be in there; a quick script can help with the robots.txt side, as sketched below.
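Python’s standard library can test a downloaded URL list against your live robots.txt file. A minimal sketch, with hypothetical URLs:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical URLs, e.g. exported from the "Blocked by robots.txt" reason.
urls = [
    "https://www.example.com/private/report.html",
    "https://www.example.com/blog/new-post/",
]

# Fetch and parse the live robots.txt file.
parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

for url in urls:
    blocked = not parser.can_fetch("Googlebot", url)
    note = "" if blocked else "  <- not actually blocked, investigate"
    print(f"{url}: blocked={blocked}{note}")
```

Any URL in the blocked list that your current robots.txt no longer disallows (or vice versa) is worth a closer look.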
3.6.1 Don’t forget to check the URLs in the Valid Status
It might be tempting to just look at the Error, Valid with warnings, and Excluded statuses, but the Valid status should not be ignored. Each status reason is limited to 1,000 URLs, so if you have a medium-to-large website it’s likely that you will only see a sample of URLs within these sections, which can be a shame if you want to check all the URLs that are indexed.
The Valid status will include a section called Submitted and indexed. These are URLs that were submitted in the sitemap and have been indexed. Have a quick check of them: if any are incorrect, you should look at removing those URLs from your sitemap files.
The Indexed, not submitted in sitemap section, however, can throw up far more interesting URLs. These are URLs which have not been submitted within a sitemap but have been indexed by Google. This list can include old URLs which are no longer linked anywhere a web crawl would find them but are still indexed, and which you may not even know about.
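One way to review this section at scale is to diff the exported URL list against what your sitemap actually contains. Here’s a minimal sketch using only the standard library; the file names are hypothetical:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# URLs listed in a local copy of your sitemap file.
tree = ET.parse("sitemap.xml")
sitemap_urls = {loc.text.strip() for loc in tree.findall(".//sm:loc", SITEMAP_NS)}

# Hypothetical export of the "Indexed, not submitted in sitemap" list.
with open("indexed_not_submitted.txt") as f:
    indexed_urls = {line.strip() for line in f if line.strip()}

# Anything printed here is indexed but missing from the sitemap: decide
# whether to add, redirect, or noindex each URL.
for url in sorted(indexed_urls - sitemap_urls):
    print(url)
```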
This set of URLs came in really handy for a new client who had made a lot of URL changes in the past. It highlighted old URLs which, although no longer in use or linked anywhere a web crawl would find them, were actually duplicate URLs that were still live and indexed, which meant they needed to be redirected.
3.6.2 Validate your Fixes
If you know that an error has been fixed, make sure you validate the fix. Although this takes time, it’s also a great way to keep track within Search Console of what issues have been worked on.
3.6.3 Use the Page Detail Box to see when a URL was last crawled
When you click on a URL within the reasons list, it brings up the Page Details box on the right. This gives you options such as Fetch as Google and Submit to Index. Another interesting piece of information it gives you is when Google last crawled the URL. Oliver Mason has written a brilliant blog post investigating the accuracy of the Last Crawled time within the new Search Console, finding not only that it is accurate but also that it is now reported in local time.
If you are trying to figure out an issue with a particular page, or have updated elements on it, the Last Crawled date can give you an indication of whether Google has looked at the page since. Alternatively, if the date isn’t recent, it’s a sign that Google doesn’t consider the page worth crawling often. If you think it’s an important page, look at how it’s linked within the website, check for duplication issues and consider any on-page improvements that can be made. If it hasn’t been crawled in a while but you’ve fixed an issue or updated page elements, it’s worth using Submit to Index via the Page Details box to request that Google re-crawl it sooner.
3.6.4 Download and Crawl the URL Lists to check for issues
Although you may already be crawling URLs that sit in obvious issue categories such as Crawl Anomaly, it’s also wise to double-check the URLs under other reasons. For example, the URLs under Page with redirect may not be currently linked anywhere that a crawling tool like Screaming Frog would encounter them; they may be legacy URLs that Google is still checking. It’s always worth downloading these URLs for both old and current redirects, running them through Screaming Frog, and double-checking that there are no redirect chains and that the end URL is correct. When we tested some of the redirects listed, we found some pretty interesting redirect chains and incorrect final destination URLs, which were soon fixed.
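If you’d rather script a first pass before a full crawl, the sketch below follows each redirect and reports the number of hops and the final destination. It assumes the requests library is installed; the URLs are hypothetical:

```python
import requests

# Hypothetical URLs, e.g. exported from the "Page with redirect" reason.
urls = [
    "https://www.example.com/old-page/",
    "https://www.example.com/legacy/offer.html",
]

for url in urls:
    # requests follows redirects by default; response.history holds
    # every intermediate hop in the chain.
    response = requests.get(url, timeout=10)
    hops = len(response.history)
    if hops > 1:
        print(f"{url}: chain of {hops} redirects -> {response.url}")
    else:
        print(f"{url}: {hops} redirect(s) -> final URL {response.url} "
              f"({response.status_code})")
```

Anything with more than one hop, or an unexpected final URL, can then be prioritised for a proper look in Screaming Frog.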
4. Conclusion
Search Console has grown so much since its inception back in 2005 and the new features within the Search Console Beta have made a fantastic impression.
The Performance report has had a design upgrade and now includes more data. While the Search Appearance section seems to be missing and there seems to be a small data mismatch between the old Search Analytics report and the new one, it’s already a much-used feature within our Birmingham office.
The Index Coverage report has given us more information than ever before on the URLs that Google is finding and how it has interpreted them. If we were to change anything, it would be to add an option to increase the 1,000-URL limit on the number of URLs shown per status reason. While it’s great to get so much information, for larger websites it can be frustrating not to be able to download the full list or a bigger sample of the data.
All in all, it’s a massive thumbs-up from the Rice team and we’re excited to see the next new features rolling out within the beta!