
10 Technical SEO Tips To Boost Traffic To Your Website

Technical SEO is the backbone of any site that wants to rank well long-term. Where links build authority and content handles targeting and user intent, technical SEO makes life easier for search engines by improving the way they gather information about your site and use it to rank you accordingly.

It is all too common for a website to have every visual indication of a beautifully designed, well-developed platform, yet be completely overshadowed by a myriad of technical issues that render the stunning aesthetic, along with the blood, sweat and tears behind it, completely redundant.

With that said, here are 10 tips you can consider when either developing or planning your next web venture:

Mitigate duplicate content issues

Duplicate content issues are the bane of most starter ecommerce platforms. When little emphasis is placed on SEO or organic traffic during the inception of a website, technical problems are allowed to quietly accumulate, unbeknownst to the website owner. These issues start small, but spread quickly if not kept in check with the right techniques.

https://www.example.com/category/

https://www.example.com/category/?cat=1&sort=desc

https://www.example.com/category/?cat=1&sort=asc

https://www.example.com/category/?cat=1&sort=desc&limit=10

The above URLs look harmless, which is what makes duplicate content issues difficult for the untrained eye to spot; but if we consider that all four of those pages serve exactly the same content (albeit in a different order, per the sort/limit parameters), we can start to see how, left unchecked, search engines may come to believe you are deliberately attempting to flood the index with irrelevant content. The result: a penalty.

With that being said, correct self-canonicalisation (whilst truncating or removing parameters) is a fantastic way to guard against the eventuality of duplicate content by explicitly defining the origin of the page.

<link rel="canonical" href="https://www.example.com/category/" />

This will ensure from the start that search engines are always fully aware of the original page.

Mitigating a duplicate content problem after propagation requires some evaluation, but can be achieved with the use of a conditional noindex:

<meta name="robots" content="NOINDEX, NOFOLLOW" />

This will usually be implemented server-side whenever a certain set of parameters are accepted into the URL.

<?php
// Ask search engines not to index parameterised duplicates, e.g. /category/?cat=1&sort=desc
if (isset($_GET['sort']) || isset($_GET['limit'])) {
    echo '<meta name="robots" content="NOINDEX, NOFOLLOW" />';
}
?>

Once duplicate content has been mitigated, a fantastic way to ensure search engines do not crawl these parameters again is to add rules to your robots.txt file:

User-agent: *
Disallow: /*?*sort=
Disallow: /*?*limit=

With all of this in place, you should be able to easily mitigate and future-proof against any more issues of this nature. 

Site speed evaluation 

Site speed is one of the most important factors in any well-established business's organic traffic. It not only directly affects ranking potential, but also the engagement and behaviour of your users.

Studies of how long a mobile user will wait for a page to load suggest that around 3 seconds is the maximum threshold before users abandon the content they landed on from the SERP. With this in mind, ensure the website is both fit for purpose and streamlined to keep FCP (First Contentful Paint), CLS (Cumulative Layout Shift) and TTI (Time to Interactive) within healthy limits.

There are a plethora of ways to improve your website's overall speed, and many of them are easy to implement regardless of platform. Some quick wins include:

  • Optimise images (if bought from stock, don't place the full-resolution originals directly on the site; resize and compress them first).
  • Improve server capacity (shared/economy hosting is not set up for speed, as many websites on the same server draw from one resource pool).
  • Only include widgets on pages that need them (e.g. Google Maps embedded on every page, when an image in place of the map and a widget on the contact page alone would be more efficient).

Google PageSpeed Insights is a fantastic tool for evaluating your Core Web Vitals.
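As an illustration of the image-optimisation point above, here is a minimal sketch (the file names and sizes are hypothetical) using a responsive srcset and native lazy-loading, so mobile users are not forced to download a full stock-size image:

<img
  src="/images/tenon-saw-800.jpg"
  srcset="/images/tenon-saw-400.jpg 400w, /images/tenon-saw-800.jpg 800w, /images/tenon-saw-1600.jpg 1600w"
  sizes="(max-width: 600px) 100vw, 800px"
  width="800" height="600"
  loading="lazy"
  alt="Tenon saw with wooden handle" />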

Proper content structure 

A proper content structure ensures that crawlers understand the hierarchical importance of your content (often referred to as deep linking). A content structure should not be seen as just a URL, but as a drilldown of the content available.

https://www.example.com/shop/tools/handsaws/

Rather than looking at the above as one page, break it down into its individual sections and formulate the idea of optimising each one individually for a specific user intent:

/shop/ – Top level, with internal links to all child categories, e.g. "Hardware tools & materials"
/tools/ – More specific breakdown, "Hardware tools"
/handsaws/ – Specific to user intent, "Buy handsaws online"

Having a well thought-out content structure in place allows for the continual optimisation from store through to product level whilst being weighted hierarchically.
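To make this concrete, a hypothetical top-level category page at /shop/ might link down to its children like so (the category names and URLs below are purely illustrative):

<nav aria-label="Shop categories">
  <ul>
    <li><a href="/shop/tools/">Hardware tools</a></li>
    <li><a href="/shop/materials/">Building materials</a></li>
  </ul>
</nav>

/shop/tools/ would in turn link to /shop/tools/handsaws/, and so on down to product level.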

For more information, you can check our dedicated content structure blog post.

Check index coverage 

It is important to evaluate your index coverage constantly, not only the warnings, but also the valid entries, i.e. the pages that will actually be found via the SERP.

Ordinarily, a page will stay indexed until the end of its life, when it is removed and begins to 404. It takes time for search engines to recognise this, and if left unchecked, users will more than likely stumble across the page and be greeted with "404, not found! (sorry about that)", which will usually be enough to persuade them to find their answer elsewhere.

To counter this, redirects (temporary/permanent depending on context) should be used to ensure users still encounter a usable page, even if it isn’t specifically tailored to their search intent. 

For example:

https://www.example.com/shop/tools/handsaws/tenon-saws

⮡  302 (if category is temporarily unavailable) 301 (if gone permanently) ⮧

https://www.example.com/shop/tools/handsaws/
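A minimal PHP sketch of that redirect, assuming the tenon-saws category has been removed and its parent is the most relevant replacement (the choice between 301 and 302 depends on whether the removal is permanent):

<?php
// true = category removed permanently (301), false = temporarily unavailable (302)
$permanent = true;

header('Location: https://www.example.com/shop/tools/handsaws/', true, $permanent ? 301 : 302);
exit;
?>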

Ensure all pages return 2xx

Internal linking is a huge factor in SEO. Many site owners (who aren't technical-SEO savvy) believe that if a page loads, everything must be fine; it is quite the contrary. When search engines crawl your site they work within a "crawl budget": the number of pages they can realistically crawl and weigh at that specific point in time, or, put another way, the amount of attention your website's pages are getting from search engines.

When you implement internal links e.g. “/shop/handsaws/”, make sure that the internal link resolves straight to the target page, and not through a redirect (or chain of redirects). The best example of this is a site using HTTPS.

Most websites will have server-side rules that detect whether HTTPS is active and redirect from HTTP to HTTPS (via a 301) if it is not. If you then implement an absolute link (one containing the full path, including domain) which uses HTTP, you will force your users, and search engines, through a redirect:

Link: http://www.example.com/shop/tools/handsaws/ (HTTP)

⮡ 301 ⬎

        https://www.example.com/shop/tools/handsaws/ (HTTPS)

To remedy this, always use relative links, e.g. /shop/tools/handsaws/ rather than http://www.example.com/shop/tools/handsaws/; if you must use absolute links, ensure they use the correct protocol for your site.
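For reference, a minimal sketch of the kind of server-side HTTP-to-HTTPS rule described above, written in PHP (many sites implement this at the web-server level instead, and the check below assumes no proxy or load balancer sits in front of the site):

<?php
// Permanently redirect any HTTP request to its HTTPS equivalent.
if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] === 'off') {
    header('Location: https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'], true, 301);
    exit;
}
?>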

Proper semantic markup 

Search engines have become a lot smarter at deciphering both the topic and context of pages, as well as the on-page layout of content; but making search engines' lives easier by removing the need for robots to waste time working out the structure of your content can yield great results.

Semantic markup is contextually similar to content structure, in that it is a hierarchy of information which needs to be both human- and machine-readable. Fortunately, HTML gives us the ability to structure pages in a way that removes the guesswork for robots. This is achieved with the following elements:

  • Heading tags (<h1> – <h6>)
  • Paragraph tags (<p></p>)
  • List tags, both ordered and unordered (<ol></ol>, <ul></ul>)
  • Navigation items (<nav></nav>)
  • Header items (<header></header>)
  • Page sections (<section></section>)
  • Article placements (<article></article>)
  • Side content, e.g. related posts, taxonomies (<aside></aside>)

Utilising semantic markup efficiently gives robots the ability to understand the contextual meaning of your content and rank it accordingly:

<h1>Hacksaws</h1>

<p>Writeup about uses of hacksaws</p>

<h2>Hacksaw blades</h2>

<p>Writeup about hacksaw blades</p>

<h3>Hacksaw blade teeth-types</h3>

Following the title, heading, subheading etc. model allows content to be understood and weighted more effectively than content which has not been treated semantically. 

Cache, cache, cache

As mentioned above, page speed is vital in order to compete properly in the SERP, as well as to retain users, increase return visits and build a trusted relationship with customers. One of the most effective ways to improve page speed is to implement caching for static resources (images, CSS, JavaScript, some videos), so users do not need to re-download these resources on every page load or every subsequent visit.

Frequently, cache settings will be handled by a developer, development company or hosting company. Depending on the host, caching may be activated by default, but it is important to check the "Cache-Control: max-age" headers sent along with the files to gauge the interval at which users must re-download static resources, or to see whether caching has been activated at all.
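If you do need to set these headers yourself, here is a minimal PHP sketch (the stylesheet path is hypothetical; in practice most sites configure this at the web-server or CDN level):

<?php
// Tell browsers they may keep this stylesheet for one year without re-downloading it.
header('Cache-Control: public, max-age=31536000, immutable');
header('Content-Type: text/css');
readfile(__DIR__ . '/assets/styles.css');
?>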


Gzip compression

Gzip is a lossless form of compression that reduces the file size of the text-based resources on your web server, allowing you to serve smaller files to your users at no detriment to quality.

Although implementing it efficiently requires a developer, a development company or some server-side knowledge, the pay-off for both robots and users is truly worth the hassle of initial configuration. Gzip will actively compress files such as:

  • HTML
  • JavaScript
  • CSS
  • SVG images (raster formats such as PNG and JPEG are already compressed, so gain little from gzip)
  • XML
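If your hosting does not expose a compression setting, a minimal PHP-level sketch looks like the following (server modules such as Apache's mod_deflate or nginx's gzip directive are usually the better place to enable it):

<?php
// Compress this page's HTML output with gzip when zlib is available;
// ob_gzhandler only compresses if the browser advertises gzip support.
if (extension_loaded('zlib')) {
    ob_start('ob_gzhandler');
} else {
    ob_start(); // fall back to an uncompressed output buffer
}
?>
<h1>Hacksaws</h1>
<p>Writeup about uses of hacksaws</p>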

It cannot be overstated that this is a must: if you are managing a website without gzip activated, you are running the risk of competitors gaining a huge advantage.

Test on mobile viewports 

The mobile-first methodology is the most important factor in relation to technical SEO. Regardless of site speed, duplicate content or semantics, if your website is not optimised for mobile traffic and usability, it is not optimised for longevity.

It is estimated that in Q1 of 2020, roughly 51% of all web traffic was mobile. Neglecting basic mobile principles therefore pigeonholes your website into catering only for the remaining 49% of internet users, which refines down to an even smaller percentage of users with the search intent to find your content.

In 2020, websites must abide by at least the rudimentary mobile-web principles to stay competitive. Fundamental principles include:

  • Ensuring your website properly adjusts to the user's viewport, whether mobile, desktop or tablet (see the sketch after this list).
  • Ensuring content is accessible regardless of device (not blocking certain viewports from viewing expected content).
  • Ensuring the page loads within the 3-second threshold, to deter users from bouncing back to the SERP.
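The starting point for viewport adjustment is a single meta tag in the <head>, usually paired with CSS media queries; a minimal sketch (the .product-grid class is hypothetical):

<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Stack the product grid into a single column on narrow screens */
  @media (max-width: 600px) {
    .product-grid { grid-template-columns: 1fr; }
  }
</style>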

Structured Data

Finally, structured data: the veritable holy grail of technical SEO. Structured data gives search engines more information about the page, and in return search engines can add extra detail to your SERP listings, e.g. price, stock availability, breadcrumbs and even the much-favoured review stars.

Google has an entire search gallery covering the various types of structured data it supports, from products to recipes and articles, all the way to FAQs and events. Failing to utilise structured data as part of your technical strategy would be to openly neglect the potential to grow your organic traffic further.
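As an example, a product page for a hypothetical tenon saw could describe itself to search engines with a JSON-LD snippet like the one below (the name, price and currency are illustrative; check Google's search gallery for the properties each result type requires):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Tenon Saw",
  "description": "A 12-inch tenon saw for fine joinery work.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  }
}
</script>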

If you believe your website is currently being affected by any of the issues above, or you have a Technical SEO related query, we would be more than happy to evaluate the best course of action and remedy these issues to ensure the technical longevity of your online business.