Whether you’re building a site from scratch, making improvements in preparation for the release of a new iteration of your current site, or just making some small changes, here are some SEO essentials you really need to think about.
Being able to create and control 301s is a must, as it ensures that you keep a clean website. If you’re migrating from an old site, you need to make sure that all your old pages are redirected to the new pages. Control the type of 3xx response code you create and make sure you understand the difference between a 301 redirect (permanent) and 302 redirect (temporary).
Imagine someone’s bookmarked a page to come back to, or posted a link on their website or social media to share with others. What happens if you delete that page without redirecting it? Think of all the potential traffic you’ve lost, let alone the revenue and conversions.
Redirecting old pages to new pages is a simple way of making sure that your users have the best possible experience of your website.
Screaming Frog is a handy tool that crawls your website and collects all the URLs that are currently live. Use it prior to making any changes so you can create a comprehensive redirect list.
Google’s Webmaster Guidelines state that a core part of a good website is providing a good user experience. This principle should also apply when a user reaches a 404 error, which is inevitable at some stage in your website’s life. Creating a custom 404 page allows you to leave a message, provide links to key pages, and get the user back on track.
Have a little fun with your 404 pages: defuse the user's frustration at hitting a broken page and make them smile or laugh. Your bounce rate may well improve as a result.
You can use a server header checker to check a page’s response codes. This can also be used to check that the above 301 redirects have been implemented correctly and to see what the redirect path is.
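If you'd rather script this check than use an online tool, here is a rough sketch of tracing a redirect path. It works on a hypothetical mapping of URLs to (status, location) responses, the kind of data a header checker reports, and surfaces chains that pass through intermediate hops.

```python
# Rough sketch: given a mapping of URL -> (status, location) responses,
# as a server header checker would report them, trace the full redirect
# path so multi-hop chains can be spotted and collapsed into one 301.
def trace_redirects(responses, start, max_hops=10):
    """Return the list of (url, status) hops ending at the final URL."""
    path = []
    url = start
    for _ in range(max_hops):
        status, location = responses.get(url, (200, None))
        path.append((url, status))
        if status in (301, 302, 307, 308) and location:
            url = location
        else:
            return path
    raise RuntimeError("Too many redirects (possible loop)")

# Hypothetical example: /a 301s to /b, which 301s to /c (a 2-hop chain)
responses = {
    "/a": (301, "/b"),
    "/b": (301, "/c"),
    "/c": (200, None),
}
chain = trace_redirects(responses, "/a")
# chain is [("/a", 301), ("/b", 301), ("/c", 200)]: /a should point
# straight at /c instead of hopping through /b.
```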
Following two very public announcements about the pending mobile update from Google in early 2016, and then the subsequent update, most people should now see the value in a mobile-optimised website.
Google has a history of being secretive and of giving little or no warning that an update is coming; sometimes they don’t even confirm that an update has rolled out until a month later.
Predominantly, it is your user base that determines how you build your website: with a mobile UX (user experience) in mind, or a desktop one. Some companies gain a lot of traffic from mobile, with people comparing products while shopping; in some cases, mobile also offers more privacy.
The long and short of it is that, even if you build your website for desktop users, you should at least make it adaptive to mobile to satisfy the Google mobile update.
Googlebot is constantly evolving (especially in the last few years) to better understand and interpret different forms of code and what they mean for a human. The days of optimising for bots rather than humans are now in the past.
An example of this can be seen in Google Search Console. After completing a Fetch & Render of the website, you are given a side-by-side comparison of what a normal user sees against what Googlebot sees. This lets you make changes for both humans and robots without sacrificing the quality of your content.
Google PageSpeed Insights will not only show you if CSS and JS are blocked but has other website optimisation suggestions.
Page speed is becoming more and more important. One way to assist with this is hosting your site on a dedicated IP address. If your website is hosted along with other websites, this can hurt your site’s performance and, if your budget allows, it should be avoided.
An exception to this is if you share an IP with development and/or staging sites. In fact, using a development and/or staging site is strongly recommended, as it provides a platform to try out major changes without affecting the live site and allows you to carry out thorough quality control before going live.
This isn’t a must but, let’s be honest, it will make your website work that much better. It makes your URLs readable to users, allowing them to easily understand what will be on a page by looking at the URL. This also includes Google, as it can easily understand where on your website the page sits.
Google states: “Consider organising your content so that URLs are constructed logically and in a manner that is most intelligible to humans (when possible, readable words rather than long ID numbers).”
With that, I’d consider user-friendly URLs as key to a site’s SEO.
Each page’s meta data should be different and properly represent the page’s content. For the best results, the platform you choose should allow you to customise your meta data on each page.
This Title and Description length tool is a good basic checker, with alerts when you exceed the recommended limits; it also includes a bonus content word counter.
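If you want to script the same check, a basic version is easy to sketch in Python. The character limits below are common rough guidelines, not official Google figures:

```python
# Basic meta-data length checker, along the lines of the tool mentioned
# above. Truncation in the search results is really pixel-based; these
# character limits are rough rules of thumb, not Google-published numbers.
TITLE_LIMIT = 60
DESCRIPTION_LIMIT = 160

def check_meta(title, description):
    """Return a list of warnings for over-long meta data."""
    warnings = []
    if len(title) > TITLE_LIMIT:
        warnings.append(f"Title is {len(title)} chars (limit ~{TITLE_LIMIT})")
    if len(description) > DESCRIPTION_LIMIT:
        warnings.append(
            f"Description is {len(description)} chars (limit ~{DESCRIPTION_LIMIT})"
        )
    return warnings
```

Run it over every page's title and description, and any page that returns warnings is a candidate for a rewrite.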
This is an easy mistake to make, especially where, for example, you've migrated from a development site to the live site and blocked Google from indexing pages at a page level rather than in the robots.txt file.
So, if you discover you’re not getting organic traffic, how do you find out what’s going on?
First, check whether the pages of your website are being indexed by using the 'site:' command (e.g. site:www.example.com). This lists all the pages of your website that Google has indexed. If no results are returned, or pages are missing, open the missing pages and look in the source code for the 'noindex' tag. The easiest way to find it is to press CTRL + F on Windows or COMMAND + F on a Mac and search for 'index'; the tag is generally found in the <head> section.
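The same check can be scripted. Here is a sketch, using only Python's standard library, that scans a page's HTML source for a robots 'noindex' meta tag; the sample HTML is a made-up example:

```python
# Sketch: scan a page's HTML source for a robots 'noindex' meta tag,
# using only the standard library's HTML parser.
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if (tag == "meta"
                and attrs.get("name", "").lower() in ("robots", "googlebot")
                and "noindex" in attrs.get("content", "").lower()):
            self.noindex = True

def has_noindex(html):
    """Return True if the page source asks robots not to index it."""
    finder = NoindexFinder()
    finder.feed(html)
    return finder.noindex

# Hypothetical page source with the tag present
page = '<head><meta name="robots" content="noindex, nofollow"></head>'
```

Feed it the source of each page you expect to rank, and any page where `has_noindex` returns True is one you've accidentally told Google to ignore.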
Being able to control and edit your website's robots.txt file can be extremely important. The robots.txt file tells robots (bots such as Googlebot) which parts of the site they should and shouldn't crawl. Keeping the cart of an e-commerce website and a site's admin folders out of the index is good practice, and the ability to control what is crawled and indexed should be an essential of any website platform you wish to use.
Robots.txt can generally be found at: e.g. www.example.com/robots.txt
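Before uploading a robots.txt file, you can test its rules locally with the robot parser in Python's standard library. The rules below are an example that blocks a cart and an admin folder:

```python
# Sketch: test robots.txt rules locally before uploading the file,
# using the standard library's robot parser. The rules are an example.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /cart/
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "/cart/checkout"))    # False: blocked
print(parser.can_fetch("Googlebot", "/products/widget"))  # True: crawlable
```

A quick check like this catches the classic mistake of a stray `Disallow: /` blocking the whole site before it reaches production.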
An XML sitemap helps search engines discover a site's pages and understand its structure. Sitemaps can be submitted via Google Search Console, immediately telling search engines when new pages have been created.
Having a dynamic sitemap that is tied in to, or added on top of, your CMS, so that the sitemap is automatically updated as pages are created, deleted, redirected or modified, not only saves you lots of time but also keeps Google informed of those changes.
Sitemap.xml can generally be found at: e.g. www.example.com/sitemap.xml
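As a sketch of what a dynamic sitemap generator does under the hood, here is a minimal version using Python's standard library. The URLs are placeholder examples, and a real implementation would usually also emit fields such as `<lastmod>`:

```python
# Minimal sketch of generating an XML sitemap from a list of live URLs,
# as a CMS hook might do whenever pages change. The URLs are examples.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return sitemap XML (as a string) for the given page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/about/",
])
```

Wiring a function like this into your CMS's save/delete hooks is what keeps the sitemap in step with the site without manual edits.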
Dedicated e-commerce website platforms are inherently different to blogging website platforms. If you’re using an e-commerce platform, features you should be looking for are the ability to insert content, create templates and link to pages that aren’t part of the product section of your website.
Does your company have bricks-and-mortar locations? If so, this section applies to you. Some store finder systems don't index the store pages and contact information; instead, they pull them from a database. Clickable, indexable pages create NAP (Name, Address & Phone Number) consistency and are an easy way for users to find your contact details straight from the search results.
Duplicate content can easily arise if you aren’t careful, resulting from reusing content on different pages and sections of your website, using the same content on different location pages or on print-friendly pages, or from www and non-www versions of your site both being live at the same time.
Reusing content dilutes its value and causes pages to compete with one another in the search results. The best thing to do, in SEO terms, is make sure you have unique copy on every page of your website.
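A rough way to spot near-duplicate copy between two pages is to compare their text with a similarity ratio. This sketch uses `difflib` from Python's standard library; the example copy is made up, and the idea of treating anything above roughly 0.8 as a near-duplicate is an arbitrary illustration, not an official figure:

```python
# Quick duplicate-content check between two pages' copy, using the
# standard library's difflib. The sample text and the "near-duplicate"
# threshold are illustrative only.
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Return a 0..1 similarity ratio between two blocks of copy."""
    return SequenceMatcher(None, text_a.lower(), text_b.lower()).ratio()

# Hypothetical location pages that differ only by city name
page_a = "We deliver fresh flowers across London every day."
page_b = "We deliver fresh flowers across Manchester every day."
score = similarity(page_a, page_b)  # well above 0.8: near-duplicate copy
```

Running pairwise checks like this across location or product pages is a cheap first pass before reaching for a tool like Copyscape.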
Copyscape is the go-to tool for plagiarism and duplicate-content checking.