Technical SEO Done Right – 2020 Update for Beginners


Let’s cut to the chase: SEO is your bread and butter, whether you’re running an online store or a brick-and-mortar business. The next wave of sales depends on it. We already know that over 87% of shoppers begin product and service searches online.

Technical SEO helps you get to the front page of the internet. Don’t get us wrong, SEO isn’t a magic bullet that will instantly catapult you to the top.

It’s a gradual process with lots of moving parts, including on-page and off-page SEO, site structure, and content marketing. Sprinkle in a little social media marketing and you have the perfect recipe for SEO success.

In this guide, we’ll get into the nitty-gritty of technical SEO for beginners. To be honest, all you really need is time and commitment to get search engines to notice you. As always, the tried and tested formula for SEO success is consistency and persistence – and that hasn’t changed at all in 2020.

If you would rather skip to specific parts of the article, refer to this table of contents to make reading easier.

 

1. The Fundamentals of Technical SEO 

Let’s start things off with the basics. This chapter will cover technical SEO and how you can optimize your website to become more amenable to modern search engines.

Simply put, technical SEO is the process of ensuring that your web page meets the technical requirements of search engines with the goal of improving organic rankings. This will ultimately drive a lot of traffic to your website, depending on the keyword you’re ranking for. The most important elements of technical SEO include crawling, rendering, website structure, and indexing.

Why Technical SEO is so Important 

Let’s say you have the most engaging content on your website with superior products and services.

But your website isn’t optimized for technical SEO.

This will hurt your chances of ranking. If your content had the potential to reach page 1 of the search results, poor technical SEO means you’ll be lurking somewhere on page 10.

And customers don’t go very far down the search results page, much less visit page 10. According to Moz, the first page of Google results captures anywhere from 71% to 92% of search traffic.

Bad SEO makes it difficult (or even impossible) for search engines to crawl, render, and index webpages on your website.

But getting Google’s crawlers to index your website is just half the battle. You want your pages to be fully optimized for technical SEO: your site should be free of duplicate content, secure with HTTPS, mobile-friendly, and quick to load – along with a few other things that most beginners tend to ignore.

This doesn’t mean that your website’s technical SEO has to be perfect. It doesn’t. But the closer you get to perfection, the easier you make it for search engines to crawl your content, and the better your chances of ranking.

To optimize your website for technical SEO, you need to take the following issues into account:

  • Website architecture
  • Thin content issues
  • URL structure
  • HTML attributes such as hreflang (see the snippet after this list)
  • Duplicate content
  • Presence of 404 pages
  • 301 redirects
  • Structured data
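
Since hreflang trips up a lot of beginners, here is a minimal sketch of what hreflang annotations look like in a page’s <head>. The domain and language versions are placeholders; each language version of the page should carry the same set of annotations, including one pointing to itself:

    <link rel="alternate" hreflang="en-us" href="https://example.com/en-us/" />
    <link rel="alternate" hreflang="de" href="https://example.com/de/" />
    <!-- x-default tells search engines which version to show when no language matches -->
    <link rel="alternate" hreflang="x-default" href="https://example.com/" />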

We will cover all of these things and more in the rest of this blog post.

2. Optimize Your Site Structure – Like a Filing Cabinet

A standard filing cabinet is organized like this: you have individual drawers in the cabinet, folders within the drawers, files within the folders, and documents within the files. There are no duplicate files, and each file has its own designated spot. In other words, there is a clear navigation path to each file. Now imagine for a moment that the files are arranged at random and there are thousands of them to rummage through; finding anything would take hours of your time.

Site architecture serves the exact same purpose as a filing cabinet: it arranges your pages in a shallow, easily crawled structure.

A properly defined site structure makes it easier for search engine crawlers to index your website, which in turn improves your rankings. Site structure has become more important than ever with the introduction of artificial intelligence in the ranking algorithm, the mobile-first index, and voice search.

It’s not enough to scatter pages randomly across your website; they have to be properly grouped into categories for search engines to understand the topics you want to rank for. This doesn’t mean that keywords are less important; it simply means that topic relevancy is equally important.

Here’s the typical modus operandi of most search engines:

  • They discover that your website is available for indexing
  • They start crawling your homepage and then follow all the links from there
  • They try to make sense of the site structure to understand how your pages and posts are related and which pages are more valuable than others.

Pro tip: You don’t want search engines indexing 404 pages, community profiles, and other non-essential pages. It is better to deindex such pages.

When you optimize site structure, you make it very easy for search engines to crawl your website.

The actual site structure of your website depends on the type of content and services it offers. If we were to divide websites into categories, we would end up with four major types:

  • News websites
  • Blogs
  • Corporate websites
  • Online stores (selling products and services)

A good site structure for a typical website looks like this:

This is a flat structure where the site’s pages are only a few links away from one another. This becomes even more important if you’re running an online store with 300k product pages. A flat architecture could make or break your SEO. Your goal is to be as structured as possible, without any ‘orphan pages’ (web pages that don’t have any internal links pointing to them).

Poor site structure also makes it hard to troubleshoot indexing issues. You can use Ahrefs’ “Site Audit” tool to get a complete breakdown of your site structure. Ahrefs isn’t free, but you can start a 7-day trial for $7.

But you can use the free Visual Site Mapper here to get an in-depth look at your website’s architecture. Here’s what the tool looks like in action:

Structure Your URLs Correctly 

When it comes to SEO success, the devil really is in the details. While a proper URL structure isn’t the be-all and end-all of your SEO efforts, getting it right can tilt the scales in your favor. An SEO-friendly URL structure not only improves page ranking but also enhances the user experience.

The goal is to make your URLs follow a consistent, logical structure. This helps visitors understand where they are on your website. Additionally, categorizing your pages gives more context to search engines about each page in each category.

For example, all the blog posts published by Moz include the “/blog/category” subfolder to help Google know that all of these pages are under the “Blog” category.

This seems to work pretty well for search results. If you Google “Moz blog”, you’ll notice that Google adds site links to the results.

If you check further, you’ll notice that all of these share the sitelink “/blog/category”.

Search engines reward consistency and stability when it comes to URL structure.
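
To make this concrete, here is what a consistent, category-based URL hierarchy might look like for a hypothetical site (the domain and category names are placeholders):

    https://example.com/blog/technical-seo/xml-sitemap-guide/
    https://example.com/blog/technical-seo/site-structure-basics/
    https://example.com/store/shirts/blue-oxford-shirt/

Every URL makes it obvious which category a page belongs to, both for visitors and for search engines.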

Use Breadcrumbs to Make Navigation Easier 

Your secret sauce to SEO success could lie in using breadcrumb trails. Breadcrumbs are a secondary navigation system that automatically adds internal links to subpages and categories on your website. This in turn helps to solidify the website architecture. This also allows Google to turn URLs into breadcrumb-style navigation in their search results pages, as shown below:
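
Most themes and SEO plugins can generate breadcrumbs for you automatically, but if you are curious what the underlying structured data looks like, here is a minimal BreadcrumbList sketch in JSON-LD (the names and URLs are placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "Blog", "item": "https://example.com/blog/" },
        { "@type": "ListItem", "position": 2, "name": "Technical SEO", "item": "https://example.com/blog/technical-seo/" },
        { "@type": "ListItem", "position": 3, "name": "XML Sitemap Guide" }
      ]
    }
    </script>

The last item is the current page, so it does not need its own URL.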

3. Getting Your Pages Crawled, Rendered, and Indexed

This segment of the blog is all about making it easy for search engines to find, crawl, and index your entire website. We’ll show you how to troubleshoot and fix crawl errors, and how to send search engine spiders into every nook and cranny of your website.

Spotting Indexing Problems

Your first stop is to find any pages on your website that are giving search engine spiders trouble. Here are two ways to do this:

Coverage Report

Google’s Search Console provides a lot of information that can help you spot potential errors. In this case, we are interested in the “Coverage” report, which reveals which pages on your site have been indexed (and which haven’t). You’ll get a bird’s-eye view of your website and can quickly see whether Google is unable to fully index certain pages.

Click here to visit the Google Search Console.

On the far left of the screen, under Index, click on “Coverage” to get a report for your website.

Here’s what a typical report looks like:

Your goal is to reduce the number of errors to 0.

Using Screaming Frog 

Screaming Frog is one of the most powerful crawlers you can use to audit your website. It is worth noting that Screaming Frog requires a license to crawl more than 500 URLs. So if you’ve got a big website, you’ll need to pay up.

Internal Linking 

Most websites should face no trouble in getting their homepage indexed.

The main challenge lies in indexing pages that are located several links away from the homepage. As we mentioned earlier, having a properly defined site architecture is a great way to minimize any issues since all your deepest links are only 3 to 4 clicks away from the home page.

However, should your website happen to have a set of webpages that are far removed from the home page, it’s time to start creating internal links to them.

This is doubly true when those internal links come from pages that have high authority and get crawled all the time.

Create an SEO-Boosting XML Sitemap

Think of an XML sitemap as the roadmap of your website that leads search engines to all your important webpages. In fact, a Google official stated that XML sitemaps are the second most important source when it comes to discovering URLs.

Interestingly, they didn’t mention what the first one is, but it is safe to assume it would be backlinks from authoritative websites. Visit the Search Console to see if your sitemap is good to go.

On the far left of the screen, under Index, click on Sitemaps.

This will reveal the sitemap(s) that Google is using to crawl your website.
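
If your CMS or SEO plugin hasn’t generated a sitemap for you yet, a bare-bones sitemap.xml looks roughly like this (the URLs and dates are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2020-03-01</lastmod>
      </url>
      <url>
        <loc>https://example.com/blog/technical-seo-guide/</loc>
        <lastmod>2020-02-15</lastmod>
      </url>
    </urlset>

Once the file is live (usually at /sitemap.xml), you can submit it under Sitemaps in the Search Console.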

Using the URL Inspection Tool under Search Console

If a URL on your website isn’t indexed and you want to know why, simply use the URL inspection tool. This tool helps you identify the reasons why your page isn’t indexed so you can troubleshoot them. Furthermore, the tool also lets you see how Google renders pages that ARE indexed.

4. The Problem with Duplicate Content 

If your website has unique and original content for every webpage, then you won’t have to worry about duplicate content… unless your CMS creates multiple versions of the same webpage on different URLs. Duplicate content is a surefire way of tanking your SEO ranking. The same also applies to thin content, but that shouldn’t be a problem for most websites.

In this section, we will introduce you to a few tools that let you find duplicate content.

Your first stop is Raven Tools. It will scan your entire website for duplicate (or thin) content and let you know which pages need to be fixed.

It is worth noting that Raven Tools will only search for duplicate content within your website, not outside of your website.

This is where Copyscape comes in. Copyscape lets you check if your site’s content is unique. This can be done by using their “Batch Search” tool.

Here you can provide a list of URLs and identify areas where duplicate content appears.

If you find similar text on another website, simply copy and paste it into Google.

If Google ranks your page first, they consider you to be the original author of that content.

Copyscape is not free, but it is one of the best plagiarism detectors out there. You can use free alternatives such as Small SEO Tools, but they aren’t nearly as versatile as Copyscape’s “Batch Search” feature. This means you’ll have to manually copy and paste content from your selected URLs to see if it has been plagiarized. That takes a lot of time, not to mention that Small SEO Tools isn’t as comprehensive as Copyscape in its results.

That’s all you need to know as far as duplicate content goes.

Pro tip: If other websites plagiarize your content and put it on their blog without your permission, that’s not your problem. You only need to worry about the content on your own website.

Make Good Use of the “Noindex” Tag

Most websites will have pages with some duplicate content. For example, the author bio on your About Us page may be too similar to the small bio snippet that appears next to most blog posts. And this is okay for the most part as long as the duplicated content isn’t too voluminous.

That being said, there are a few pages that contain a lot of duplicate content and are indexed. As we said earlier, this can tank your ranking on search result pages. Your best bet is to add the ‘noindex’ tag to those pages.

The ‘noindex’ tag instructs search engines to not index the page.
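
In practice, it is a robots meta tag placed in the page’s <head>. Here is a minimal sketch; most SEO plugins can add it for you with a single checkbox:

    <head>
      <!-- Tell search engines not to include this page in their index -->
      <meta name="robots" content="noindex" />
    </head>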

You can check the ‘noindex’ tag by using the URL Inspection tool in Google Search Console.

Google will let you know that the page is being indexed with this message, “URL is available to Google”. This means the noindex tag hasn’t been set up correctly.

But if you see the message, “Excluded by ‘noindex’ tag”, then the noindex tag has been set up correctly.

It could take anywhere from a few days to a few weeks for Google to update the search results pages.

You can keep track of your ‘noindex’ pages under the “Excluded” tab of the Coverage report to see if Google is removing them as instructed.

Using Canonical URLs (also known as ‘rel canonical’) 

Most webpages that have duplicate content can be tagged with ‘noindex’ – and that’s all you need to do. In cases of plagiarism, you can also replace the duplicate content with original content.

But there’s an alternative to getting around the duplicate content problem: canonical URLs.

Canonical URLs are ideal for pages that have similar content with minor differences between them. Suppose you run an online store that sells shirts.

And you have a product page set up for extra large shirts.

Depending on how your CMS is set up, every color, size, and design variation may generate a new URL. This isn’t good, because you will end up with duplicate content across all of these URLs.

But you can get around this limitation by using the canonical tag to inform Google that the first version of the product page is the main one, and the other pages are variations.
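
As a sketch, assuming a hypothetical main product page at /shirts/premium-shirt/, each variation page would point back to it from its <head> like this:

    <!-- Placed on every variation page (e.g. the red, blue, and XL versions) -->
    <link rel="canonical" href="https://example.com/shirts/premium-shirt/" />

Google then consolidates the duplicate variations under the canonical URL you specified.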

5. Website Loading Speed 

An important metric search engines use to rank your website is page loading speed. This doesn’t mean that a fast-loading website will automatically reach the top of the search results (you really need backlinks to achieve this), but improving your loading speed can lead to significant improvements in your overall organic traffic.

In this section, we will explore a few simple ways to boost your site’s loading speed.

Use a Faster Hosting Provider

No matter how well you optimize your website, your biggest bottleneck will always be the hosting provider. There are various hosting providers out there, but the most prominent ones that deserve mention in this blog are Bluehost and DreamHost. The best part is that they are both recommended by WordPress due to their excellent support teams and dedication to infrastructure.

They are also pretty good in terms of loading speed, with a worldwide average response time of 153 ms. At less than $3 per month, we think Bluehost offers the best possible value for money.

If you don’t already have a Bluehost account, use our promo link here to find out for yourself!

For more in-depth information on the best hosting providers, check out our review here.

Reduce the Size of Your Web Page

As a general rule of thumb, the more assets your server has to deliver, the slower your website will be to respond to requests. This is why it is a good idea to reduce the size of your web pages: page size plays a direct role in loading speed.

You can compress your images to improve page loading speed, but we don’t recommend compressing them so aggressively that they look pixelated; it’s better to have a good-looking website than a fast-loading page full of blurry images. However, if loading speed is a top priority for you, do whatever you need to reduce the page’s total size.
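
One low-effort way to cut page weight without making anything look pixelated is to let the browser pick an appropriately sized image. A minimal sketch, assuming you export each image at a few widths (the file names are placeholders):

    <img src="/images/hero-800.jpg"
         srcset="/images/hero-400.jpg 400w,
                 /images/hero-800.jpg 800w,
                 /images/hero-1600.jpg 1600w"
         sizes="(max-width: 600px) 100vw, 800px"
         alt="Striped cotton shirt"
         loading="lazy" />

Phones download the small version, desktops the large one, and loading="lazy" defers off-screen images entirely.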

Minify CSS, HTML, and JavaScript

You can optimize the code on your website by removing unnecessary characters, comments, whitespace, and redundant code. This can noticeably improve page speed. You can use tools such as UglifyJS and cssnano to get this done.
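
To illustrate what minification actually does, here is the same made-up CSS rule before and after. Tools like the ones above do this automatically for entire files:

    /* Before: readable, but every comment, space, and line break costs bytes */
    .site-header {
      background-color: #ffffff;
      margin-top: 0px;
    }

    /* After: identical behavior, fewer bytes */
    .site-header{background-color:#fff;margin-top:0}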

Browser Caching

Browsers cache a lot of information (such as images, JavaScript files, stylesheets, and more) so that the next time someone visits your website, they won’t have to reload the entire page from scratch. Use a tool like YSlow to see if you already have an expiration date set for your cache. You can then set the ‘Expires’ header to configure how long that information stays cached.

Unless you change your website too frequently, set this to one year.
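
How you set the header depends on your server. As a sketch, assuming an Apache server with mod_expires enabled (nginx and most WordPress caching plugins offer equivalents), an .htaccess rule might look like this:

    <IfModule mod_expires.c>
      ExpiresActive On
      # Cache static assets for one year
      ExpiresByType image/jpeg "access plus 1 year"
      ExpiresByType text/css "access plus 1 year"
      ExpiresByType application/javascript "access plus 1 year"
    </IfModule>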

Eliminate Third-Party Scripts

Your website probably has a ton of third-party scripts from tools such as Google Analytics and LiveChat. Keep in mind that these impact page loading speed. Make sure to analyze all your third-party scripts and see whether you can get rid of the non-essential ones.
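
For the third-party scripts you decide to keep, loading them with async or defer at least stops them from blocking the rest of the page while they download. A minimal sketch (the script URLs are placeholders):

    <!-- async: downloads in parallel and runs as soon as it's ready -->
    <script async src="https://example.com/analytics.js"></script>

    <!-- defer: downloads in parallel and runs only after the HTML is parsed -->
    <script defer src="https://example.com/chat-widget.js"></script>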

Wrapping Up

That’s it for our guide on technical SEO.

If you think the information helped you out or if you want to offer your own suggestions, do let us know in the comments below.
