Technical SEO: The Underrated Part Of Optimization
Technical SEO is the evidence-based side of search engine optimization. It includes activities performed on a website that ensure it meets the technical requirements of search engines like Google in order to be included in search results.
At worst, failure to meet these requirements means complete invisibility in search results. Hello! Anyone out there? At best, it means lower ranking than you might have otherwise “earned” through other SEO and social media efforts like publishing fantastic content. This is why technical SEO is important.
Attending to technical SEO can improve your search engine ranking as well as user (client) experience to meet business goals. Technical SEO helps search engines more effectively crawl and index your pages. It supports Google’s efforts to provide a more meaningful display of your website in search results so that people are more likely to click. Yes, this is why technical SEO is important.
A technically optimized site is:
- Easy for search engines to navigate and index
- Easy for people to navigate
- Built for performance
- Bridging the gap between human content and a search engine’s limited understanding of it
It All Starts with Site Structure and Navigation
Simplicity and consistency are essential in the site structure. When your site achieves this, a search engine can crawl (find) all pages you would like indexed (visible in search results).
A reliable site structure and navigation also prevent orphan pages that are disconnected from your site. And as a bonus, efficient structure and navigation make it easier to maintain your website and avoid taking users down paths to nowhere.
How to Improve Your Technical SEO
To further explore why technical SEO is important, we’ll briefly examine specific areas you can address to improve your site’s performance in search.
Code Bloat
Older sites often have layers. As you update your site, you add new features but may fail to remove old code.
This becomes “code bloat”. It slows down how quickly search engines can read your site and impacts performance. And over time, a site becomes more prone to glitchy behavior because of conflicts in the code.
How to Fix it
There are two types of code bloat: index bloat and on-page bloat. In both cases, you can lose rankings. To find and fix index bloat, go to your Search Console and click Index Status under Google Index. You will likely see a spike in your indexed pages starting on a particular day. To see which specific pages are listed, go to google.com and search site:yoursite.com. There you can see how many results Google pulls for your website. The results should roughly match how many pages your website has.
You have three options for removing a page from the index. This is good practice for category pages and pages with little to no content.
- Use a redirect if the page is of no value
- Place a meta robots tag on the page to tell Google not to index it, like this:
<meta name="robots" content="noindex, follow">
- Disallow crawling in your robots.txt file
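As a sketch, a robots.txt rule that blocks crawlers from thin archive pages might look like this (the paths are hypothetical examples, not rules every site should copy):

```txt
# robots.txt — placed at the root of your domain (yoursite.com/robots.txt)
User-agent: *
Disallow: /category/   # hypothetical thin category archive
Disallow: /tag/        # hypothetical tag archive pages
Sitemap: https://yoursite.com/sitemap.xml
```

Note that Disallow prevents crawling, not necessarily indexing: a page that is already indexed can remain in results, so the noindex meta tag above is often the safer choice for removing pages from the index.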
On-page bloat can be more daunting to remove, as the excess code must be stripped out of your website’s source code. Start by reviewing the plugins on your site and removing any you no longer use. After you have removed unused plugins, we advise having a website developer help you debug your code to ensure you don’t remove anything that is important to your site.
XML Sitemap
An XML sitemap is a detailed list of all of the live pages on your website that you want indexed. A sitemap doesn’t guarantee that Google will index any one page. But it communicates your intention and helps search engines prioritize pages and decide how to crawl and index your site, provided other technical SEO requirements are in order.
How to Fix it
If you are missing a sitemap, you will first need to create one. You can use a free tool like XML-Sitemaps Generator. Once you have downloaded your new sitemap and uploaded it to your site, visit Google Search Console, click Sitemaps, add your new sitemap’s URL, and click Submit.
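For reference, an XML sitemap is just a list of URLs in a standard format. A minimal two-page example (the URLs and dates are placeholders) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
  <url>
    <loc>https://yoursite.com/services/</loc>
    <lastmod>2021-01-10</lastmod>
  </url>
</urlset>
```

Each <url> entry lists one live page you want indexed; the optional <lastmod> date helps search engines decide what to recrawl.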
Site Architecture (or Structure)
Site architecture, the way your pages are organized and connected, is achieved through internal linking.
The fewer layers your site structure has, the easier it is for search engines to understand and for people to navigate. You can liken this to the organizational structure within a company. Three to four layers are ideal. For example:
- Homepage (1)
- Services page (2)
- Individual service pages (3)
- Individual blog post (3)
Additionally, link equity is important. Links from other pages pass some “equity” to the linked page. This equity decreases the further the linked page sits from a well-performing page. A deeper architecture puts pages further away from the “cash cows” that supply them with link equity.
How to Fix it
Changing your site structure can take some time. The best way to understand your site structure is to start with your menu. It’s best not to have too many layers in your URLs; the best practice is yoursite.com/blog/blog-title. If your site needs four or more layers, you may want to rethink its structure unless you have a very large amount of content or serve several industries.
Yoast, a popular SEO plugin for WordPress, has a great article on how to improve your site structure. The best way to ensure your site makes sense is to take the time to look around it yourself. Can you easily find what you are looking for? Does your site have internal links that lead users to more great information on the same topic?
Structured Data
Google can’t actually read a page. It doesn’t know if the page includes a price list, a menu, an event schedule, or a direct answer to a common conundrum. Structured data uses code called Schema markup to help search engines understand what the content is so that they can display it more meaningfully in the search results.
For example, those cards that appear at the top or right of some search results often give you instant answers to your questions.
How to Fix it
To check your site’s structured data, you can use Google’s Structured Data Testing Tool to see what on your site needs to be fixed. You can then use a JSON-LD Schema markup generator to help you write the new code you need and implement it on your site through your site editor.
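As an illustrative sketch, structured data is commonly added as a JSON-LD script in the page’s head. This hypothetical example marks up a local business; every name and value is a placeholder you would swap for your own details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "url": "https://yoursite.com",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
</script>
```

With markup like this in place, Google can understand that the page describes a business with a phone number and address, making it eligible for richer display in results.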
Thin Content
When someone enters a query, Google tries to show them the most relevant and thorough answer from the most authoritative site. So content must be substantial enough to be worthy of answering a query. This is not only about length. A “not thin” piece of content is well thought out: it answers the searcher’s questions in the most understandable way.
How to Fix it
The best way to fix thin content is to write more content. Audit your content using an SEO tool like Screaming Frog or Netpeak Spider. In your report, you will see a word count for each page. If a page is a blog or sales page, it should have a minimum of 300–500 words to avoid being flagged as thin content. If a page is not a content page, such as a category or listing page, you can use robots.txt or a noindex tag (as described in the code bloat section) to keep it out of Google’s index.
Duplicate Content
Google considers duplicate content to be “substantive blocks” of repeated content. Things like adding a quote, repeating a CTA, or having a common footer are generally not considered duplicate content. However, sometimes duplicate content is necessary for marketing, testing, networking, or visitor experience. You have tools to tell Google not to index pages with lots of duplication.
That’s where canonical tags come in.
Google chooses one of the pages as the original using an algorithm that is not known by anyone but Google. Canonical tags give you more control by telling Google which you prefer to appear in results.
How to Add Canonical Tags
If you are looking to add canonical tags to your pages, Ahrefs, an SEO tool company, has a great article on understanding what a canonical tag is and which type your site should use.
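For illustration, a canonical tag is a single line in the head of the duplicate page pointing at the version you want indexed (the URLs here are placeholders):

```html
<!-- On yoursite.com/product?color=blue, pointing to the preferred version -->
<link rel="canonical" href="https://yoursite.com/product" />
```

Every duplicate or parameter variation carries the same tag, so Google consolidates ranking signals onto the one preferred URL.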
Hreflang Tags
If you have a website that you want accessible internationally, you need to add hreflang tags. These tell Google that individual pages are intended for specific languages, countries, or regions.
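As a hedged example, hreflang tags go in the head of each language version and reference one another (the URLs and language codes below are placeholders):

```html
<!-- On the English page; the same set of tags appears on the Spanish page -->
<link rel="alternate" hreflang="en" href="https://yoursite.com/en/" />
<link rel="alternate" hreflang="es" href="https://yoursite.com/es/" />
<link rel="alternate" hreflang="x-default" href="https://yoursite.com/" />
```

The x-default entry tells Google which version to show searchers who match none of the listed languages.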
404 Errors
A 404 page generally represents a broken link, which can occur when you delete, move, or rename pages. Not only is this bad for the user experience, it also tells Google you don’t maintain your site. Tools like Map Broker XML Sitemap Validator can help you find existing broken links. You can then remove any non-existent pages from your XML sitemap and limit the creation of 404s in the future.
You should also consider improving your 404 page to manually redirect lost visitors back into your site. From cute puppies to funny animations to sincere apologies, companies have all kinds of ways to say, “Oops. That page doesn’t exist. But click here and we’ll take you back.”
301 Redirects
301 redirects are permanent redirections you place on your website. If you had no choice but to move or delete a page, you might use one to send visitors to its new location or a similar piece of content. But too many can be a red flag to users and Google, in addition to slowing your site down. Prevent them where possible and use them sparingly. The Map Broker tool also helps you identify these.
How to Add Redirects
If you have WordPress or the ability to add plugins to your site, redirects can be set up simply using the Redirection plugin. Once it is activated, you can find it under the Tools menu and add redirects as required.
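If you are not on WordPress, a 301 can also be set at the server level. On an Apache server, a hypothetical .htaccess rule might look like this (both paths are placeholders):

```apache
# Permanently redirect an old page to its replacement
Redirect 301 /old-page/ https://yoursite.com/new-page/
```

The 301 status tells browsers and search engines the move is permanent, so link equity from the old URL is passed along to the new one.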
Why Technical SEO Is Important
You can create the most useful content and attend to other aspects of SEO, but if you overlook why technical SEO is so important, you may not get the results you seek. The great news is that technical SEO is the clearly defined part of SEO, so there’s no guesswork or wondering whether your “SEO strategy” will work. Most websites could benefit from some attention to these important optimization aspects.