When looking for ways to get a website to rank better in Google, it’s common to focus heavily on content and try to get ahead by out-publishing the competition. Technical SEO, meanwhile, gets overlooked: it doesn’t sound as interesting, and it doesn’t have the high ceiling of a quality content strategy.
But only a fool builds their house upon sand; all the design flourishes in the world won’t help when the tide comes in and washes it away. Just as the wise person builds upon a strong foundation, the smart business looks after this essential side of SEO.
So what does technical SEO actually consist of? Let’s take a look, and then move on to the titular topic.
What technical SEO involves
Technical SEO covers the factors that relate not to the quality of a website’s content but to its underlying structure and design. It’s quite a broad topic, especially when you take into account that technology and design standards are always changing, but it doesn’t demand the kind of frequent attention that content does.
Instead, it’s something to look at in detail only occasionally. Think of it as a health check for your website: being in good technical condition doesn’t achieve much by itself, but a site that isn’t up to par will squander even the best content, and you want the best possible return on the investment you’ve made in your design and copy.
Here are some of the biggest technical SEO elements that still don’t get the attention they deserve, and explanations of why they’re worth considering.
Building a clear site hierarchy
The hierarchy of a site isn’t just there to help users navigate it. How you structure your website has a huge effect on how easily search engine crawlers can figure it out. This often gets forgotten because people assume that Google must intuitively understand how their sites work, but a site that’s simple for users can look very different to a bot; crawlers have an entirely different perspective.
Every well-structured site will provide the following:
- An up-to-date and comprehensive sitemap
  - If you have pages that are indexable but not included in your sitemap, this muddies the waters. Make sure that every page you want indexed is included, and every page you don’t want indexed is set to ‘noindex’ (there’s a short sketch of both after this list).
- A sensible URL structure that uses natural language and keeps folders to a minimum
  - Every folder or category you create should serve a purpose, and the URLs should be simple enough for a human to read at a glance (avoid long ID strings and parameter clutter).
- Page titles and metadata that use keywords accurately
  - Keywords aren’t what they once were, but they’re still very important when used properly. If Google’s bot thinks a page is about one thing but the title suggests something else entirely, it causes confusion and dilutes the page’s authority.
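Here’s a minimal sketch of all three elements: a single sitemap entry, a ‘noindex’ directive, and a title/description pair that matches the page’s actual topic. Every URL and piece of copy below is hypothetical.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml: list every page you want indexed, and only those pages -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/guides/technical-seo</loc>
    <lastmod>2018-07-01</lastmod>
  </url>
</urlset>
```

```html
<!-- In the <head> of a page you do NOT want in search results -->
<meta name="robots" content="noindex">

<!-- Title and description that accurately reflect what the page covers -->
<title>Technical SEO Basics | Example Co</title>
<meta name="description" content="A practical checklist of technical SEO essentials, from sitemaps to HTTPS.">
```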
Quality content is strongly undermined by a poorly-structured website. If you’ve put in work to create something valuable to your audience, you should make every effort to keep it accessible.
Tagging copy with microdata
Microdata, or structured data, is a method of tagging different parts of your copy to make them easier for search bots to interpret. Rather than trusting Google to infer the meaning of a page, you can break it down into defined chunks, tagging your address, your business type, your opening hours, and so on.
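As an illustration, here’s a minimal structured-data sketch using schema.org’s LocalBusiness type in JSON-LD format; all of the business details below are hypothetical.

```html
<!-- JSON-LD structured data, placed in the page's <head> or <body> -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "12 High Street",
    "addressLocality": "London",
    "postalCode": "N1 9GU"
  },
  "openingHours": "Mo-Sa 08:00-18:00",
  "telephone": "+44 20 7946 0958"
}
</script>
```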
Google has made no secret of the fact that it would rather not rely on microdata forever (its preference is to understand pages through natural language alone, and that technology is steadily improving), but for the moment structured data is still quite valuable in a lot of situations.
When Google features a rich snippet in a search results page, microdata often plays a role in which site the content is taken from. Not as big a role as, say, Google My Business, but a role regardless. And since it doesn’t take all that long to implement some basic microdata, there’s really no reason why any business should ignore it entirely.
Keeping page content distinct
In the relentless push for more content, businesses commonly run the risk of sabotaging themselves through duplicated copy and keyword cannibalisation, and both have to be avoided.
Firstly, every important page should carry a rel=canonical tag identifying its definitive URL, and any copy of that page (whether the content is reused elsewhere or simply reachable through multiple URLs) should point back to the original. The tag tells a search crawler which version of a page is the definitive one, avoiding any confusion about which URL to rank.
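For example, if the same article is reachable both at its proper address and through other URLs, each variant can declare the definitive version like this (the address is hypothetical):

```html
<!-- In the <head> of every variant of the page -->
<link rel="canonical" href="https://www.example.com/blog/technical-seo-guide">
```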
Secondly, pages that are canonically distinct must be semantically distinct as well. You can have more than one page on a particular topic, but if you want both to rank highly, you need to keep them from overlapping too heavily. Two overlapping pages won’t necessarily be blocked from ranking, but each will carry less authority than a single consolidated page combining their contents would.
Large chunks of duplicated copy don’t look good to crawlers, so if you’re in the middle of an article and a relevant paragraph already exists elsewhere on your website, use an internal link instead of repeating it. Crawlers don’t revisit every page on every pass, and connecting your pages ensures that when one of them gets crawled, the linked pages will be discovered and crawled as well.
Providing security through HTTPS
HTTPS is an extension of HTTP that secures data exchanges through encryption. Having long been standard in the ecommerce world, it is now considered a baseline user protection for websites in general.
In fact, come the 18th of July, Google’s Chrome browser will begin warning visitors away from questionable sites, marking all non-HTTPS pages as ‘not secure’. Because of this, HTTPS has gone from a recommended step to an essential one. As it becomes the default (and in the age of GDPR), any site that doesn’t measure up will likely be given a wide berth by users.
Thankfully, moving to HTTPS isn’t all that onerous a task. Google’s Security Blog has relevant resources, and there are plenty of general HTTPS migration guides out there to help people along. Since it’s something that only needs to be done once, it should be done as soon as possible.
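Once a certificate is in place, the main remaining job is redirecting old HTTP traffic to the secure versions of your pages. As a minimal sketch, assuming an nginx server (other web servers have equivalent mechanisms), a permanent redirect looks like this:

```nginx
# Send every plain-HTTP request to its HTTPS equivalent.
# 301 marks the redirect as permanent, so crawlers update their index.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}
```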
Using mobile-first design
Though there are still some stragglers, it seems like most decent websites now work reasonably well on mobile, and that’s something to be celebrated, but it won’t be enough in the long run. In 2016, Google announced that they were experimenting with mobile-first indexing, and stated clearly that their algorithms “will eventually primarily use the mobile version of a site’s content to rank pages from that site”.
While we’re not at that point yet (desktop-first indexing is still the standard), the fact that we’re going to get there sooner or later is reason enough to overhaul old website design procedures. Being ‘friendly’ to mobile users is just the first step in a long process of changing the fundamental template of a well-built site.
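At a minimum, designing mobile-first means declaring a responsive viewport and writing styles for small screens first, treating larger screens as the enhancement. Here’s a minimal sketch (the class name and breakpoint are illustrative):

```html
<!-- In the <head>: scale the layout to the device's width -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Mobile-first: base styles target small screens */
  .content { padding: 1rem; }

  /* Larger screens are the special case (breakpoint is illustrative) */
  @media (min-width: 768px) {
    .content { max-width: 720px; margin: 0 auto; }
  }
</style>
```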
Just imagine what will happen when the day finally comes that mobile-first indexing is rolled out as the default and desktop versions of sites become secondary considerations. If businesses haven’t fully prepared, they’re going to see some very interesting shifts in their rankings. Everything from mediocre page speed to sloppy navigation design could sink them.
Technical SEO isn’t sufficient, but it is necessary
Nailing technical SEO won’t get a site full of thin, weak content to rank well, but you’ll be hard-pressed to move a technically questionable site up the rankings with any speed or reliability. It’s a Sisyphean task: the site will probably end up rolling right back down.
One of the biggest reasons ready-made website platforms are such a popular option these days is that people recognise the value in having the technical elements handled for them, leaving them free to focus on the content. A bespoke site offers great flexibility, but if you forget to keep it properly maintained, it will quickly fall behind.
Given that it takes considerably less time to shore up a site technically than it does to produce enough quality content to cover a few months, there’s no sense in devising a content strategy until you’re sure your site is good enough to host it. Review your setup at least annually to make sure it stays in line with best practice.
So, to recap, technical SEO provides the foundation for your website content to build on. Make it as strong as possible, taking into account these important steps that often get left out, or a weak foundation will only become a bigger threat to your rankings as time goes by and search standards become more stringent.
Kayleigh Alexandra is a content writer for Micro Startups — a site dedicated to spreading the word about startups and small businesses of all shapes and sizes. Visit the blog for the latest micro biz news and inspiring entrepreneurial stories.
Follow us on Twitter @getmicrostarted.