With more businesses going online every day, lazy SEO just doesn’t cut it anymore. If your website isn’t optimised correctly, you’ll miss out on a lot of free traffic from search engines and risk being overtaken by your competitors. After auditing hundreds of websites, we’ve highlighted the 4 clear-cut signs of lazy SEO:
#1 Unresolved WWW
What it is
WWW resolve is when the non-www version of your website forwards to the www version (or vice versa). For example, http://example.co.uk should automatically 301 redirect (forward) to http://www.example.co.uk.
Importance
If your WWW is unresolved, search engine bots (Google, Bing, Yahoo etc.) treat http://example.co.uk and http://www.example.co.uk as two different domains. This splits your link authority between the two and creates duplicate content and duplicate meta issues, all of which can be detrimental to your website’s SEO.
How-to Fix
Firstly, decide which version of your domain you would like to keep and which you would like to 301 redirect. It’s entirely up to you and what you think works for your brand. If you’re not fussed, check the Page Authority (via Open Site Explorer) and Trust Flow (via Majestic) metrics for each version and choose the one with the higher link authority.
The standard way to implement this on an Apache server is via the .htaccess file; below are the rules for both versions:
WWW Resolve with www.
RewriteEngine on
RewriteCond %{HTTP_HOST} !^www\.example\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.example.co.uk/$1 [R=301,L]
WWW Resolve without www.
RewriteEngine on
RewriteCond %{HTTP_HOST} ^www\.example\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://example.co.uk/$1 [R=301,L]
Watch Out For
If you implement this using a different method, make sure the redirects return a 301 status code and not a 302; 302 redirects do not pass on page authority.
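As a quick sanity check, this is roughly what the response headers should look like when you request the version you’re redirecting away from (assuming here that the www version is the one you’ve kept); you can view them in your browser’s developer tools under the Network tab:
HTTP/1.1 301 Moved Permanently
Location: http://www.example.co.uk/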
Example – Ricemedia.co.uk
To see WWW resolve in action, put this link into your browser: http://ricemedia.co.uk/organic-search/. You’ll notice that it automatically 301 redirects to https://www.ricemedia.co.uk/seo/, and every other link on our website works in exactly the same way.
#2 Missing Robots.txt
What it is
A robots.txt file is a simple text file placed in the root folder of the domain that provides directives to search engine bots.
Importance
The robots.txt file is important because it tells search engine bots which pages on your site they are allowed to visit and which they are not.
For instance, on an e-commerce website, product filters that append parameters to the end of URLs can cause a huge duplication issue, which is likely to impact your SEO negatively. This can be avoided with an optimised robots.txt file.
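As a rough sketch (the ‘colour’ parameter below is just a placeholder for whatever your filters actually use), a wildcard rule along these lines tells the major search engine bots, which support the * wildcard, not to crawl the filtered URLs:
User-agent: *
Disallow: /*?colour=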
Even if you are not running an e-commerce website, there are still basics that your robots.txt file should include, such as a reference to your XML sitemap, so that search engine bots can crawl every URL on your website with ease.
How-to Fix
Creating a basic robots.txt file is very simple: open a plain text editor (such as Notepad) and copy the following into it:
User-agent: *
Allow: /
Sitemap: http://www.example.co.uk/sitemap.xml
Name it ‘robots’ and save it as a .txt file, then upload it to the root folder of your domain. You can check it has been uploaded correctly by visiting yourdomainname.com/robots.txt, and you can test how it’s working using Google’s own robots.txt tester.
Watch Out For
Make sure ‘Allow: /’ does not read ‘Disallow: /’, as that tells bots to stay away from your entire site and your domain will drop out of the search engine indexes (see the example below).
If you are implementing more advanced directives, make sure you test them with the tester mentioned above, as you wouldn’t want to accidentally exclude parts of your website that are supposed to be indexed.
Do not disallow JavaScript and CSS files, as this can harm the way your web pages are rendered and indexed (which ultimately harms rankings).
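On the first point above, this is the rule you never want to see on a live site, as it tells compliant bots not to crawl anything at all:
User-agent: *
Disallow: /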
Example – Ricemedia.co.uk
Ricemedia’s robots.txt welcomes all search engine bots, tells them not to access the /wp-admin/ folder, as we don’t want the login area of our website indexed in search engines, and provides an XML sitemap so all URLs are easily accessible.
#3 No Homepage Meta Description Optimisation
What it is
A Meta Description is a unique piece of text that explains a web page to search engines. Visitors don’t see it on the page itself, only in the SERPs, where it appears as the short description underneath the page title and link.
If you’re an SEO executive, you may be wondering why I have chosen the Meta Description rather than the Meta Title, which, as we all know, is a far bigger SEO sin. That’s because a missing Meta Title is the next step down from SEO laziness: SEO neglect.
Importance
Although it is not a direct ranking factor, the Meta Description is considered an SEO essential: it affects the click-through rate to your website from the search results, so the more enticing, the better.
How-to Fix
All you need to do is write a new Meta Description. Include the keywords you are targeting on the homepage, make it enticing so searchers want to click through to your website, and work in brand values, USPs and marketing messages to make it stand out from the competition, all while keeping it under 155 characters. When you have done this, upload it to your website’s homepage.
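Purely as an illustration (the business and the copy below are invented), the finished tag sits in the <head> of your homepage and looks something like this:
<head>
  <meta name="description" content="Smith and Sons: trusted emergency plumbers in Birmingham. 24/7 call-outs, fixed quotes and a 12-month guarantee. Call today for a free quote.">
</head>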
Watch Out For
Make sure you keep it relevant to your website.
Example – Hubspot.com
HubSpot’s Meta Description is great for a number of reasons: it clearly explains what HubSpot is, includes variations of the keywords they are targeting, concisely explains their USPs, reads well and comes in well under the 155-character mark.
#4 Missing XML Sitemap
What it is
An XML sitemap is a file which lists all of your website’s URLs for search engine bots to crawl.
Importance
XML sitemaps make it very easy for search engine bots to find and crawl all of your website’s URLs, and they can also tell search engines the relative priority of each page. If your website is not bot friendly and its URLs are hard to discover, search engine bots may leave prematurely; presenting all of your URLs in an up-to-date, structured sitemap makes the discovery process that much easier.
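For context, here is what a minimal sitemap looks like (the URL and date are placeholders); the <lastmod> and <priority> tags carry the extra information mentioned above:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.co.uk/services/</loc>
    <lastmod>2016-05-10</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>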
How-to Fix
There are a number of ways to generate your XML sitemap and plenty of tools to choose from. One sure-fire method is to use Screaming Frog, a free downloadable SEO tool: create a sitemap by clicking ‘Sitemaps’ and then ‘Create XML Sitemap’.
Once you have uploaded it to your website, head over to Google Webmaster Tools > Crawl > Sitemaps > Add/Test Sitemaps and submit your sitemap there (and do the same in Bing Webmaster Tools).
Watch Out For
A dynamic sitemap is ideal as it updates automatically whenever changes are made, but if a dynamic sitemap isn’t possible, be sure to update it manually from time to time, depending on how often you create or update URLs.
Having multiple sitemaps for different sections of your website is a good idea but be sure to submit them all to Google and make sure they are structured correctly.
Don’t include URLs that you do not want indexed, for example, product URLs with parameters on the end.
Example – Moz.com
Moz have multiple sitemaps for different sections of their website; this one covers their main blog and is dynamic, which you can see by looking at the <lastmod> line of the code.
To Summarise
The 4 clear signs of a lazy SEO are as follows:
1. Unresolved WWW
2. Missing robots.txt
3. No homepage meta description optimisation
4. Missing XML sitemap
We hope you enjoyed this post. Let us know what you think in the comments section below, and get in touch today if you’d like some help getting your SEO fighting fit!