13 SEO Essentials For Your ‘New’ Website

 
13 September, 2017

Whether you’re building a site from scratch, preparing to release a new iteration of your current site, or just making some small changes, here are some SEO essentials you really need to think about.

1. 301 redirects

Being able to create and control 301s is a must: it keeps your URL structure clean and preserves the authority your existing pages have earned. If you’re migrating from an old site, you need to make sure that all your old pages are redirected to their new equivalents. Control the type of 3xx response code you create, and make sure you understand the difference between a 301 redirect (permanent) and a 302 redirect (temporary).

Imagine someone’s bookmarked a page to come back to, or posted a link on their website or social media to share with others. What happens if you delete that page without redirecting it? Think of all the potential traffic you’ve lost, let alone the revenue and conversions.

Redirecting old pages to new pages is a simple way of making sure that your users have the best possible experience of your website.
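If your site runs on Apache, the simplest place to create these redirects is the .htaccess file. A minimal sketch, using hypothetical old and new URLs:

    # Permanently redirect a single retired page to its replacement
    Redirect 301 /old-page/ https://www.example.com/new-page/
    # Permanently redirect a whole retired section, keeping the rest of the path
    RedirectMatch 301 ^/old-section/(.*)$ https://www.example.com/new-section/$1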

Screaming Frog is a handy tool that crawls your website and collects all the URLs that are currently live. Use it prior to making any changes so you can create a comprehensive redirect list.

2. Custom 404 page

Google’s Webmaster Guidelines state that a core part of a good website is a positive user experience. The same principle applies when a user reaches a 404 error, which is inevitable at some stage in your website’s life. Creating a custom 404 page allows you to leave a message, provide links to key pages, and get the user back on track.

Have a little fun with your 404 pages: defuse the user’s frustration at hitting a broken page and make them smile or laugh. Your bounce rate should improve as a result.
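On Apache, wiring up the custom page is a one-line job. A minimal sketch, assuming your custom page lives at /404.html:

    # Serve the custom error page whenever the server responds with a 404
    ErrorDocument 404 /404.html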

You can use a server header checker to check a page’s response codes. This can also be used to check that the above 301 redirects have been implemented correctly and to see what the redirect path is.
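If you’re comfortable with the command line, curl does the same job. A quick sketch, using a hypothetical redirected URL:

    # -I fetches headers only, -L follows redirects, so you see the
    # status code and Location header of every hop in the chain
    curl -sIL https://www.example.com/old-page/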

3. Mobile-friendly

Following two very public announcements from Google in early 2016 about its pending mobile update, and then the update itself, most people should now see the value of a mobile-optimised website.

Google has a history of being secretive and of giving little or no warning that an update is coming; sometimes they don’t even confirm that an update has rolled out until a month later.

Predominantly, your user base determines whether you build your website with a mobile UX (user experience) in mind or a desktop UX. Some companies gain a lot of traffic from mobile, with people comparing products while out shopping; in some cases, mobile also offers more privacy.

The long and short of it is: even if you build your website for desktop users, at least make it adapt to mobile so it satisfies Google’s mobile update.
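At a minimum, an adaptive layout needs the standard viewport meta tag in each page’s <head>; without it, mobile browsers render the page at desktop width:

    <meta name="viewport" content="width=device-width, initial-scale=1">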

Use either Google’s TestMySite or PageSpeed Insights for suggestions about mobile optimisation.

4. Allows CSS and JS to be crawled

Googlebot is constantly evolving (especially in the last few years) to better understand and interpret different forms of code and what that means for a human. The days of optimising for bots rather than humans are now in the past.

According to Google: “Disallowing crawling of Javascript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings.”

An example of this can be seen in Google Search Console. After completing a fetch & render of the website, you will be given the option to view a side-by-side comparison of what a normal user sees versus what Googlebot sees. This lets you spot resources Googlebot can’t reach and fix them, so the page renders properly for both humans and robots.
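If the comparison shows blocked resources, look for Disallow rules in your robots.txt that cover your script and style files. A minimal sketch of the fix, using Google’s wildcard syntax (your own paths may differ):

    User-agent: Googlebot
    # Keep stylesheets and scripts crawlable so pages render correctly
    Allow: /*.css$
    Allow: /*.js$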

Google PageSpeed Insights not only shows you whether CSS and JS are blocked but also makes other website optimisation suggestions.

5. Obtaining a dedicated IP to host your website

Page speed is becoming more and more important. One way to assist with this is hosting your site on a dedicated IP address. If your website is hosted along with other websites, this can hurt your site’s performance and, if your budget allows, it should be avoided.

An exception to this is if you share an IP with development and/or staging sites. In fact, using a development and/or staging site is strongly recommended, as it provides a platform for trying out major changes without affecting the live site and allows you to carry out thorough quality control before going live.

6. Ability to create search engine-friendly custom URLs

This isn’t a must but, let’s be honest, it will make your website work that much better. Readable URLs let users understand what’s on a page just by looking at the address. The same goes for Google, which can easily understand where the page sits within your website.

Google states: “Consider organising your content so that URLs are constructed logically and in a manner that is most intelligible to humans (when possible, readable words rather than long ID numbers).”
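To make that concrete, here’s the same hypothetical product page expressed both ways:

    https://www.example.com/mens/running-shoes/     <- readable: the path describes the page
    https://www.example.com/index.php?cat=12&id=82  <- opaque: IDs tell users (and Google) nothing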

With that, I’d consider user-friendly URLs as key to a site’s SEO.

7. Meta titles, meta descriptions and page structure on each individual page

Each page’s meta data should be unique and accurately represent the page’s content. For the best results, the platform you choose should allow you to customise the meta data on each page.
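In the page source, that meta data is just two tags in the <head>. A minimal sketch with placeholder wording:

    <head>
      <title>Men's Running Shoes | Example Store</title>
      <meta name="description" content="Browse our full range of men's running shoes, with free delivery on orders over $50.">
    </head>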

This Title and Description length tool is a good basic length checker with alerts for exceeding recommended limits. A bonus content word counter is included.

8. Remove no-index tags before going live

This is an easy mistake to make, especially in situations where, for example, you’ve migrated from a development site to the live site and had blocked Google from indexing the development pages at a page level rather than in the robots.txt file.

So, if you discover you’re not getting organic traffic, how do you find out what’s going on?

First, check whether the pages of your website are being indexed by using the ‘site:’ command (e.g. site:www.example.com). This will list all the pages of your website that Google has indexed. If no results are returned or pages are missing, open the missing pages and look in the source code for the ‘noindex’ tag. The easiest way to find it is with CTRL + F on Windows or COMMAND + F on a Mac, searching for ‘noindex’; the tag generally sits in the <head> section.
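The tag you’re hunting for looks like this; removing it (or changing ‘noindex’ to ‘index’) makes the page eligible for indexing again:

    <meta name="robots" content="noindex">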

9. Editable robots.txt

Being able to control and edit your website’s robots.txt file can be extremely important. The robots.txt file tells robots (or bots, such as Googlebot) which parts of the site they may and may not crawl. Keeping the cart of an e-commerce website and a site’s admin folders out of the crawl is good practice, and this level of control over what gets crawled should be an essential of any website platform you wish to use.

Robots.txt can generally be found at the site root, e.g. www.example.com/robots.txt
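A minimal sketch for the e-commerce example above, with hypothetical folder names:

    User-agent: *
    # Keep the cart and admin areas out of the crawl (folder names are hypothetical)
    Disallow: /cart/
    Disallow: /admin/
    Sitemap: https://www.example.com/sitemap.xml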

10. Automatic XML Sitemap or single-click creation

An XML sitemap helps search engines discover a site’s pages and understand its structure. Sitemaps can be submitted via Google Search Console, immediately telling search engines that new pages have been created.

Having a dynamic sitemap that is tied in to, or added on top of, your CMS, automatically updating as pages are created, deleted, redirected or modified, not only saves you lots of time but also keeps Google informed of these changes.

Sitemap.xml can generally be found at the site root, e.g. www.example.com/sitemap.xml
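The file itself is plain XML. A minimal single-page sketch:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2017-09-13</lastmod>
      </url>
    </urlset>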

11. Create content pages ‘outside’ of e-commerce category and product pages

Dedicated e-commerce website platforms are inherently different to blogging website platforms. If you’re using an e-commerce platform, features you should be looking for are the ability to insert content, create templates and link to pages that aren’t part of the product section of your website.

12. Individual store location pages can be indexed

Does your company have bricks-and-mortar locations? If so, this section applies to you. Some store finder systems don’t create indexable store pages; instead, they pull location and contact information from a database on the fly. Clickable, indexable pages create NAP (Name, Address & Phone Number) consistency and give users an easy way to find your contact details straight from the search results.

13. Duplicate Content Issues

Duplicate content can easily arise if you aren’t careful, resulting from reusing content on different pages and sections of your website, using the same content on different location pages or on print-friendly pages, or from www and non-www versions of your site both being live at the same time.

Reusing content dilutes its value and causes pages to compete with one another in the search results. The best thing to do, in SEO terms, is to make sure you have unique copy on every page of your website.
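For the www/non-www case, pick one version and permanently redirect the other to it. A sketch for Apache’s mod_rewrite, assuming you’ve settled on the www version:

    RewriteEngine On
    # Send non-www requests to the www version with a 301
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

For duplicates that have to stay live, such as print-friendly pages, a rel="canonical" tag in the duplicate’s <head> tells Google which version to rank:

    <link rel="canonical" href="https://www.example.com/original-page/">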

Copyscape is the go-to tool for plagiarism and duplicate content checking.

SEO Goals

There are some ‘goals’ you should also work towards that aren’t necessarily deal-breakers but will definitely help with your website build and optimisation. These will help you achieve higher Google PageSpeed scores, boost organic rankings and encourage social sharing.

  • Server response time <500ms (<200ms, ideally)
  • Server supports browser caching and file compression
  • Minification of JS and CSS
  • Supports schema mark-up on appropriate pages (e.g. Organisation, Product and Local Business contact details); see the JSON-LD sketch after this list
  • Product ratings and reviews can be left natively, and/or third-party review systems can be included
  • Inclusion of social sharing buttons
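As a sketch of the schema mark-up goal above, here is hypothetical Organisation mark-up in JSON-LD, the format Google recommends; it sits in the page’s <head> with placeholder details:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Example Store",
      "url": "https://www.example.com",
      "telephone": "+61 2 0000 0000"
    }
    </script>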

Are you building a new website and want to ensure it’s optimised for SEO for even better returns? Contact Found Digital for web development and SEO advice at hello@founddigital.com.au.
