Want to perform a Technical SEO Audit but don’t know where to start?

Step this way… we’ll tell you about some great tools and let you know the key areas to focus on, whether you’re performing audits for yourself, investigating issues for clients or simply want to find out a little bit more!

Part one of this blog series focuses on gathering and analysing your data, while part two will look at the key On-site Ranking Factors you need to take into consideration within an audit.

 

What is a Technical SEO Audit?

Simply put, a Technical SEO audit is a process that uncovers any problems search engines might run into when they crawl your website, and highlights any possible improvement opportunities. This involves looking at the title & meta description tags, sitemap.xml & robots.txt files, URL structure, page speed, page status codes and duplicate content, to name but a few key areas.

Data Capturing Tools

Screaming Frog

The first and most important step, before doing anything else, is to gather your data for analysis; for this you need to crawl the entire website.

There are a few tools out there to do this, and you can even write your own, but the easiest and best one by far is the wonderful Screaming Frog SEO Spider. What’s even better is that the basic version of the tool is free to use and will crawl up to 500 URLs, which is great if you just want to get to grips with what the tool can do, or if your site is small and you don’t need a deep analysis. Once you get the hang of the tool, or if you are auditing larger sites, you can pay the yearly subscription and get the full version. The premium version has many brilliant options and features, but for now we’ll just focus on the free version.

Once the tool is downloaded, it’s just a case of putting in the website address you want to look at, and then clicking start and letting the crawler do its thing, which might take a while depending on how large your website is. When the crawl is complete you’ll have most of the important data to hand to help you within your audit process.

Webmaster Tools

Make sure that you have access to your site in Google Webmaster Tools. This is invaluable when conducting an audit because, as well as notifying you about any penalties or errors, it gives you access to a variety of tools and checkers that will greatly speed up the audit process.

Data Analysis

Accessibility

Once you have your data, it’s time to check how accessible your website is, to be sure search engines and users alike can successfully crawl and access all your important website content.

Robots.txt

Did the crawler manage to access all your pages and content successfully? If there is a strangely low number of pages, or even none at all, it could mean that your pages are being blocked from search engines via a robots.txt file. This file is stored on the root of a website and tells search engines which pages or directories they can and can’t crawl. A common problem is a robots.txt file set to disallow the entire website. This is normal during the development process so that the development site isn’t indexed, but sometimes the file doesn’t get updated when the website goes live, and therefore none of the new pages are indexed, so it’s always good to check!

The robots.txt file is designed to tell search engines about any sections or pages that you would rather they didn’t spend time crawling because they are not relevant, are low quality or include parameters, such as search pages or admin and login sections.

Below is an example of a robots.txt file stopping search engines from crawling a website:

User-agent: *
Disallow: /

In the following example, we see how to disallow the admin folder of a WordPress site. If you have an existing robots.txt file on your site, it’s important to make sure that it is not disallowing CSS or JavaScript files. Search engines now crawl websites much like a browser would, so it’s necessary to make sure that these file types are accessible to them.

User-agent: *
Disallow: /wp-admin/
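
If you’d like a quick scripted check of your rules as well, below is a minimal sketch using Python’s built-in robots.txt parser. The example.com addresses are placeholders for your own domain and a page you want to test.

from urllib import robotparser

# Point the parser at your live robots.txt file (placeholder domain)
parser = robotparser.RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

# Check a few URLs against the rules, as Googlebot would interpret them
for url in ["https://www.example.com/", "https://www.example.com/wp-admin/"]:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "is crawlable" if allowed else "is blocked")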

Helpful Tools

Robots.txt Tester (Webmaster Tools)

If you wish to check whether certain URLs are accessible according to your robots.txt file, you can use the robots.txt Tester within Webmaster Tools (Crawl > robots.txt Tester). This tool will also flag any errors in the file itself.

[Screenshot: the robots.txt Tester in Webmaster Tools]

Fetch as Google (Webmaster Tools)

Want to check if your robots file is affecting how your page is displayed? Just go to Crawl > Fetch as Google and enter your URL, making sure you select Fetch and Render. Wait for it to complete and then click on the result, which will show how both Googlebot and the user see your page. If the two views are noticeably different, it’s most likely due to certain CSS files or folders being blocked, which will be detailed in the list below the render. It is important to make sure the Googlebot and user views look as similar as possible.

[Screenshot: Fetch as Google in Webmaster Tools]

 

Sitemap.xml

A sitemap.xml contains all the URLs for Google to follow (a bit like a road map) and should show the structure of your site. It can enable search engines to find pages that might otherwise be quite hard to find or inaccessible. Like the robots.txt file, it is kept on the root of the site and is normally available at /sitemap.xml.

This file needs to be in a valid XML format and should contain all your important pages. It should be as up to date as possible and should always be submitted to Google within Webmaster Tools (Crawl > Sitemap) to start indexing. This is also where you can see how many URLs are indexed and whether there are any errors.

Helpful Tools

Generating an XML sitemap (XML-Sitemaps or Screaming Frog)

Screaming Frog can also create a static sitemap for you from the URLs it has crawled by going to Sitemaps > Create XML Sitemap.

A free tool which can generate a static sitemap for you can be found at XML-Sitemaps.com. Just enter your website address and click start. There is a maximum of 500 URLs, but this should be fine for a small to medium website.
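
If you’d rather script it yourself, the sketch below shows the minimal structure a static sitemap needs, built with Python’s standard library. The two URLs are placeholders; in practice you would feed in your full page list or a crawl export.

import xml.etree.ElementTree as ET

# Placeholder URLs - replace with your own page list
urls = ["https://www.example.com/", "https://www.example.com/about/"]

# Build the <urlset> root with one <url><loc> entry per page
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url

# Write a valid, minimal sitemap.xml to the current directory
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)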

Checking Sitemap URLs (Screaming Frog)

A great feature in Screaming Frog is the ability to check that all the links within a sitemap are valid. Simply go to Mode > List and click on Upload List > From a file and select your .xml file, and it’ll crawl each of the available links and let you know of any errors with the file. This is very useful if you have a large file that isn’t dynamic.
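
If you don’t have Screaming Frog to hand, a rough scripted alternative is sketched below. It assumes the requests library is installed and that the sitemap lives at /sitemap.xml on a placeholder domain, and simply reports any listed URL that doesn’t return a 200.

import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder domain
NAMESPACE = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# Download and parse the sitemap, then pull out every <loc> entry
root = ET.fromstring(requests.get(SITEMAP_URL).content)
urls = [loc.text for loc in root.iter(NAMESPACE + "loc")]

# Report any URL that doesn't come straight back with a 200
for url in urls:
    status = requests.head(url, allow_redirects=False).status_code
    if status != 200:
        print(status, url)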

 

Error pages & Redirects

When you carried out your website crawl with Screaming Frog you would have seen a column called Status Code containing different numbers, such as 200, 404, 301 or 302. A status code is what a web page sends back when it is accessed, and I’ll just run through the most common ones.

200 (OK) – The most common status code you will see; it means the request has succeeded and the page has been found.

301 (Moved Permanently) – You’ll see this next to a URL in Screaming Frog when it’s been moved or deleted, and a redirect has been added pointing to its new destination. This redirect tells Google to pass the link juice from the old URL to the new one, and eventually replace the old URL with the new one in the index.

302 (Found) – This is also known as a temporary redirect, which essentially says the page has been temporarily moved but could come back at a later date. It tells search engines to retain any link juice for the original URL only, rather than passing it to the page being redirected to. It also means that the old URL will remain indexed in the search results. As you almost always want to pass the link juice on and change the page that is indexed, it’s wise to make sure you are using 301 redirects and not 302s throughout the site.

404 (Not Found) – The page has not been found. It’s important to keep track of any 404s you find in your crawl and investigate where they are coming from. Sometimes a 404 can simply be caused by an incorrectly spelled URL in a link or nav item, or by a link somewhere on the site pointing to a deleted page.

It’s worth investigating your 404s and existing redirects to decide on the best action or improvement. It’s not necessarily bad to have some 404s on your site, but if a similar page exists it’s wise to add a redirect. If a new website is created, always make sure 301 redirects are in place for the old URLs so that people can still reach the new site from the search results.
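
To spot-check individual URLs from your crawl, here’s a small sketch (assuming the requests library; the URL is a placeholder) that prints each hop of a redirect chain, making it easy to see whether a redirect is a 301 or a 302 and where it finally lands.

import requests

def redirect_chain(url):
    """Print each hop's status code so 302s that should be 301s stand out."""
    response = requests.get(url, allow_redirects=True)
    for hop in response.history:
        print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
    print(response.status_code, response.url)

redirect_chain("http://www.example.com/old-page")  # placeholder URL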

Helpful Tools

Crawl Errors (Webmaster Tools)

Within the Crawl Errors Section you’ll find all the 404s for your website including any for mobile and other devices. This is a good place to check for any old website URLs that might not have redirects in place, as Screaming Frog just looks at links within your site and not what may be on other sites or indexed in Google. Just be aware that it isn’t a live view of the website!

[Screenshot: the Crawl Errors report in Webmaster Tools]

 

Site Structure

A good site architecture benefits both users and search engines: it gives your website a clear structure, making it easier for search engines to crawl, and ensures that users can access all the website content. Make sure that all the important pages are accessible from the main nav and are not buried many clicks away. Within the site’s hierarchy, the important pages should sit nearer to the root of the site, and pages that are related to each other should be logically grouped.

Helpful Tools

Tree View (Screaming Frog)

There is a ‘View’ dropdown on the right of the crawl where you can switch to a tree view of your website, which helps you better visualise your current site structure. Are your important pages nearer to the root of the website?
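
For a rough scripted view of click depth, the sketch below (assuming the requests library; example.com and the 50-page cap are placeholder values) crawls breadth-first from the homepage and prints how many clicks deep each page sits.

from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

import requests

START_URL = "https://www.example.com/"  # placeholder homepage
MAX_PAGES = 50                          # keep the crawl small for a quick check


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


depths = {START_URL: 0}
queue = deque([START_URL])
domain = urlparse(START_URL).netloc

while queue and len(depths) < MAX_PAGES:
    url = queue.popleft()
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue
    extractor = LinkExtractor()
    extractor.feed(html)
    for href in extractor.links:
        link = urljoin(url, href).split("#")[0]
        # Stay on the same domain and only queue pages we haven't seen yet
        if urlparse(link).netloc == domain and link not in depths:
            depths[link] = depths[url] + 1
            queue.append(link)

# Pages sorted by how many clicks they are from the homepage
for page, depth in sorted(depths.items(), key=lambda item: item[1]):
    print(depth, page)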

 

Page Speed

Page speed is how long it takes for your browser to fully display the content of a page. Google uses page speed as a ranking factor, so it’s important that your website is as fast as possible. It also benefits the user experience, as visitors are more likely to bounce from slower-loading pages.

Helpful Tools

Page Speed Analyser (PageSpeed Insights)

Google’s PageSpeed Insights is a good tool for spotting basic issues your website may be having, along with suggestions for improvement.
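
For a very rough command-line check alongside it, the sketch below (assuming the requests library; the URL is a placeholder) times only the initial HTML response, so treat it as a quick sanity check rather than a substitute for PageSpeed Insights, which measures the full page load.

import time
import requests

url = "https://www.example.com/"  # placeholder URL
start = time.perf_counter()
response = requests.get(url)
elapsed = time.perf_counter() - start

# Only measures the initial HTML response, not images, CSS or render time
print(f"{url} returned {response.status_code} in {elapsed:.2f}s "
      f"({len(response.content) / 1024:.0f} KB of HTML)")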

 

Mobile Site

On April 21st Google started rolling out an algorithm update which now uses mobile-friendliness as a ranking signal within the mobile search results.

This means that any websites which are not optimised for mobile will soon be seeing significant ranking changes. Ensuring your website is mobile friendly is therefore now a very high priority, especially if a lot of your website’s traffic is coming from these devices.

Check out our mobile website optimisation tips for more information on getting your website mobile ready.

Helpful Tools

Mobile Usability (Webmaster Tools)

For a rough check of whether your website or certain pages are not optimised for mobile, go to Search Traffic > Mobile Usability which will list the main pages with issues.

Indexability

Want to see how many of your pages are indexed within Google? Easy!

You can use the “site:” command in the search bar, which will bring back a rough count and a list of the pages indexed.

[Screenshot: a site: search in Google showing the indexed pages]

As mentioned earlier, it’s very useful to compare the number of pages indexed against the numbers crawled by Screaming Frog and listed in your sitemap. Ideally they should be roughly the same, meaning that all the pages are indexed correctly.

If the index count is quite a lot smaller, there might be indexing problems, or some important pages may not be accessible to search engines, for example because they are disallowed within the robots.txt file.

If the index count is quite a lot larger, this can point to duplicate URLs being available for the same page. Google marks this as duplicate content, so it should be avoided. We will go over how issues like this can be avoided in part two.


 

So there we go: a beginner’s guide to starting a Technical SEO audit, beginning with basic data capture and analysis. We hope this has been helpful. This is just one small part of the many things you can look at and the tools you can use to fully audit your site, and what you need will depend entirely on your website and its requirements. Look out for part two, when we will move on to the On-site Ranking Factors and the Duplicate Content Issues that every website audit should be watching out for.

Don’t forget there are also Off-Site Ranking Signals to consider, and Content and Link audits that should be carried out, all of which combine to affect how your website will rank. If you would like to know more, or to find out how we can help you, get in touch with us today!

Photo by Giorgio Montersino, available under a Creative Commons Attribution-ShareAlike 2.0 Generic license.

 
