The 3-Point Technical SEO Checklist for Writing Content

Every good page is built on a foundation of great content. The content featured on your website should always be up to date, readable and relevant, and keeping it to that standard is key to reaching your audience. Targeting your readers’ specific interests sharpens it further.

However, it’s not uncommon for a website to underperform despite good content and careful audience targeting. What causes this? Spelling mistakes? Text length? The use of slang? These are all possibilities. But if none of these explain the problems your website is facing, perhaps you should shift your perspective a little.

There’s more to blogs than link building, posts and keywords. Your blog may very well be underachieving because of a shaky technical foundation; simply put, your website is a goner if the tech behind it isn’t up to scratch. Google has become a lean, mean, sophisticated machine over the years, which means even slight technical adjustments behind the scenes can refresh a struggling website and the way it handles visitors.

URL optimisation, navigation tweaks and code fixes – these may sound like excerpts from your worst nightmare, but you’ll soon realise they are your best friends. Let’s look at why these technicalities matter for SEO and why you should put them on your to-do list today:

  1. Fix broken links and use the right redirects to protect your crawl budget

The basic rule of good link building is making sure people can find your content at all times. That requires understanding what the different page status codes mean. The most common ones are:

  •    200 – the page loaded successfully
  •    301 – the page has moved permanently to a new URL
  •    302 – the page has moved temporarily
  •    404 – the page cannot be found (removed or never existed)
  •    500 – the server hit an internal error
  •    503 – the service is temporarily unavailable
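
Knowing the codes is one thing; checking them is another. Here’s a minimal sketch of a status check using Python’s requests library (a third-party package) – the URLs are placeholders for your own pages:

```python
import requests

# Placeholder URLs - swap in pages from your own site.
urls = [
    "https://example.com/",
    "https://example.com/old-post",
    "https://example.com/deleted-page",
]

for url in urls:
    # allow_redirects=False shows the raw 301/302 instead of
    # silently following the redirect to its destination.
    response = requests.get(url, allow_redirects=False, timeout=10)
    print(f"{response.status_code}  {url}")
```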

Knowing these codes is also important for improving the architecture of your website. For example, if you remove a page permanently, redirect its old URL with a 301, not a 302. A 302 tells search engines the move is only temporary, so they may keep the old URL in the index, which can hurt your SEO ranking.
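
To make the difference concrete, here’s how a permanent redirect might look in a small Flask app – a hypothetical sketch with made-up routes, not a prescription for your stack:

```python
from flask import Flask, redirect

app = Flask(__name__)

# The old URL is gone for good, so send visitors (and crawlers)
# to the replacement with an explicit 301. Flask's redirect()
# defaults to 302, which signals only a temporary move.
@app.route("/old-post")
def old_post():
    return redirect("/new-post", code=301)

@app.route("/new-post")
def new_post():
    return "This is the content's permanent new home."
```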

This knowledge also comes in handy during testing. If a page returns a 404 error, inspect the links pointing to it – the target may have been deleted, or the link itself may simply be mistyped.
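
A rough way to hunt down such links is to collect every URL a page points to and test each one. A minimal sketch using requests and BeautifulSoup (both third-party packages; the start URL is a placeholder):

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

start_url = "https://example.com/"  # placeholder: one of your own pages

html = requests.get(start_url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Gather every link on the page, resolving relative URLs.
links = {urljoin(start_url, a["href"]) for a in soup.find_all("a", href=True)}

for link in sorted(links):
    # HEAD is lighter than GET; note that some servers don't support it.
    status = requests.head(link, allow_redirects=True, timeout=10).status_code
    if status >= 400:
        print(f"Broken ({status}): {link}")
```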

The bottom line is that broken and misdirected URLs waste Googlebot’s time on pages that don’t exist, draining your limited crawl budget. Inspect your links regularly and make sure every one of them leads somewhere useful, so the budget is spent on pages that matter.

  2. Remove junk code and get your loading time under 2 seconds

Every second of delay in loading time can cost you conversions – around 8% per second. When loading times drag, up to 80% of your potential customers lose interest in navigating your website, and roughly 40% will choose not to revisit it at all.

That’s something you can’t afford, especially with Google’s algorithms favouring websites that load faster. And with attention spans shorter than ever, your website has to cater to impatient visitors.
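
For a quick spot check, you can time the server’s response yourself. A minimal sketch with requests – bear in mind this measures the raw response, not the full browser render, and the URL is a placeholder:

```python
import requests

url = "https://example.com/"  # placeholder: your own page

response = requests.get(url, timeout=10)

# elapsed covers the time from sending the request until the
# response arrives - a rough lower bound on real page load time.
seconds = response.elapsed.total_seconds()
print(f"{url} answered in {seconds:.2f}s")
if seconds > 2:
    print("Over the 2-second target - worth investigating.")
```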

An excellent tool for diagnosing loading problems is Google’s PageSpeed Insights. It grades your site on a scale of 0 to 100, and only scores above 85 are considered satisfactory for SEO purposes. Apply its recommendations and you may see your SEO performance improve in a matter of hours.
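
PageSpeed Insights also exposes its results through a public API, so you can script the check. A sketch assuming the v5 endpoint and no API key (Google rate-limits keyless requests, so this suits occasional checks only):

```python
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
url = "https://example.com/"  # placeholder: your own page

data = requests.get(PSI_ENDPOINT, params={"url": url}, timeout=60).json()

# Lighthouse scores the performance category from 0 to 1;
# multiply by 100 for the familiar 0-100 grade.
score = data["lighthouseResult"]["categories"]["performance"]["score"] * 100
print(f"PageSpeed performance score for {url}: {score:.0f}")
```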

  3. Make sure you don’t have duplicate or thin content

In 2011, Google introduced the Panda update, which shook the marketing industry to its core. Many websites’ rankings tumbled because the update started penalising sites for duplicate content, whether in their links or in their text.

If you want to check your blog for duplicate and thin content, give the Siteliner tool a try. It scans up to 250 pages and shows the percentage of duplicate content on your site, along with an exact list of the matches.
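
If you’d rather spot-check a suspicious pair of pages yourself, you can compare their visible text directly. A crude sketch using Python’s standard difflib alongside requests and BeautifulSoup – real tools use far smarter matching, and the URLs are placeholders:

```python
import difflib
import requests
from bs4 import BeautifulSoup

def page_text(url):
    # Boil the HTML down to visible text for a fairer comparison.
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    return soup.get_text(separator=" ", strip=True)

a = page_text("https://example.com/post-a")  # placeholder URLs
b = page_text("https://example.com/post-b")

# ratio() returns 0.0-1.0; it can be slow on very long pages.
similarity = difflib.SequenceMatcher(None, a, b).ratio()
print(f"Pages are {similarity:.0%} similar")
```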

Resolving this issue stops Google from indexing the same page under multiple different URLs, helping you avoid a penalty altogether. The best part? A regular check only takes a few minutes each time.
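
One common fix for the multiple-URL problem is a rel="canonical" tag that tells Google which URL is the original. Here’s a minimal sketch that checks whether a page declares one (the URL is a placeholder):

```python
import requests
from bs4 import BeautifulSoup

url = "https://example.com/post?utm_source=newsletter"  # placeholder

soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
canonical = soup.find("link", rel="canonical")

if canonical and canonical.get("href"):
    print(f"Canonical URL: {canonical['href']}")
else:
    print("No canonical tag - duplicate URLs may get indexed separately.")
```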