A lot of work goes into optimizing a website to rank well in search engine results. Everything from the way a website is built to the colors of the text can have an impact on your final SERP ranking. We tend to break these factors into categories: on-page, off-page, semantic, user experience, domain authority, etc., etc., ad nauseam.
While these categories may or may not be helpful, enough testing has been done to know that technical SEO is a category worth paying attention to. Technical SEO focuses on organic ranking factors that are tied directly to the way your site is built, and it tends to be more objective than subjective.
So, here are six objective technical SEO factors that you should look at closely. I’ve described them in layman’s terms and offered suggestions on how you might improve them for your own site, but be aware that some of these fixes will require the help of a web developer.
Crawlability is a general, catch-all term referring to the ease with which your site can be scanned and navigated by the various spiders, bots, and assorted digital creepy crawlies that explore and catalog page content on behalf of search engines like Google. You want to be sure you’re doing as much as possible to ensure that they can see what you want them to see and can’t see what you don’t.
Several factors contribute to the overall crawlability of your site. One of the simplest things you can do to improve crawlability is to create a sitemap file and submit it to Google using Google Search Console. If you’re using WordPress, the Yoast SEO plugin will automatically create a sitemap for you, but you will still need to submit the file to Search Console.
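If you need to create one by hand, a sitemap is just an XML file listing the URLs you want indexed. Here’s a minimal sketch; the domain, paths, and date are placeholders for your own:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
  </url>
</urlset>
```

Upload it to your site’s root (conventionally as /sitemap.xml) and submit that URL in Search Console.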
Also, don’t forget your robots.txt file. This file tells crawlers which pages to skip when crawling and indexing. That’s particularly useful for confirmation pages, administrative pages, or hidden destinations in a sales funnel. The Yoast SEO plugin can help you take care of this, too.
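For reference, robots.txt is just a plain text file at your site’s root. A small example, with made-up paths standing in for the pages you’d want skipped:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /thank-you/

Sitemap: https://www.example.com/sitemap.xml
```

Keep in mind that Disallow is a request, not a lock: well-behaved crawlers honor it, but it is not a security measure.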
Having duplicate content on your site can seriously harm your SERP rankings. Duplicate content occurs when two or more distinct pages on your site serve bodies of content that the crawler does not consider sufficiently different.
Depending on your site structure, this can be a tricky one to nail down, especially if you’ve inherited a site that was built using outdated practices. Outside of manually rewriting/restructuring all your duplicate content (which you should probably do), the easiest way to combat a duplicate content issue is with rel="canonical" links in your pages’ metadata.

In the head of a page, you can supply a canonical link that basically says: “Hey, I know I look an awful lot like that other page. Why don’t you just consider me to be that other page, for all intents and purposes?” This behaves like a 301 redirect, but without the redirection. Duplicate content problem: solved.
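In practice, the canonical link is a single tag in the page’s <head>. Assuming the preferred version of the page lives at the placeholder URL below, each near-duplicate page would carry:

```html
<link rel="canonical" href="https://www.example.com/original-page/" />
```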
Many of the things we’re mentioning in this article have a direct impact on how much priority Google gives your site when delivering SERPs. However, many of them have an indirect impact, too. If your user experience is poor, visitors will arrive less often and leave sooner, which in turn causes your site to perform worse in the rankings. Page performance has a major impact on both direct and indirect factors.
Google PageSpeed Insights is a useful tool to identify what areas need to be optimized. A waterfall test can also show you which elements of your site are taking the most time to load or causing the largest bottleneck.
One word of caution as you interpret your results: It’s not uncommon for tools like PageSpeed Insights to spit out a long list of “high priority” performance issues. These tools tend to set an inconveniently high standard for page performance. The difference between having a site that loads reasonably fast for your users and having a site that scores perfectly in PageSpeed Insights often isn’t worth the effort to meet that standard.
As it was in the days of our fathers and their fathers before them, it’s still impossible to determine exactly how much weight Google gives to each SEO factor. However, signs indicate that, while HTTPS has been a ranking factor for years and years, it is now being given more and more weight.
In addition to improving the user experience by inspiring trust (who wants that ugly “Not Secure” next to their URL?) and removing potential obstacles to valuable conversions, adding an SSL certificate can have a measurable impact on your SERP ranking.
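Once the certificate is installed, you’ll also want plain-HTTP traffic forwarded to the secure version of your site. As a sketch, an nginx server block for this might look like the following (the domain is a placeholder, and Apache setups use different directives):

```nginx
server {
    listen 80;
    server_name example.com www.example.com;

    # Permanently redirect every plain-HTTP request to its HTTPS equivalent
    return 301 https://$host$request_uri;
}
```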
The singularity is upon us. Access to the combined sum of the knowledge of mankind fits in the palm of our hands. And Google welcomes our new mobile overlords with open arms. It has been cautiously but steadily deploying its “mobile-first” indexing for over a year.
The short version: “mobile-first” indexing means that index rankings are determined by the content, layout, and accessibility of the mobile version of your site, rather than the desktop version. If you don’t have a mobile version, Google will index your site based on your desktop version, but it will also likely penalize your rankings.
A flawless mobile experience should be your priority when developing layout or content for your site.
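A responsive layout starts with the viewport meta tag in each page’s <head>; without it, mobile browsers render the page at desktop width and shrink it down:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```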
This is another one of those indirect ranking factors. Google has stated over and over that having a few 404s on your site is not going to hurt your rankings. However, 404s can absolutely impact the user experience, and should be avoided if at all possible.
In addition to keeping your site linking structure clean, you can also provide a custom 404 page instead of relying on the browser default. If it fits your brand, you can even have fun with it.
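How you point your server at a custom 404 page depends on your stack. On Apache, for example, a single line in your .htaccess file does it (the path assumes you’ve created a page at /404.html):

```apache
ErrorDocument 404 /404.html
```

On nginx, the equivalent directive is `error_page 404 /404.html;`.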
Tricked you. Meta descriptions and keywords are next to worthless when it comes to SEO. While Google can (but might not) use the meta description for its results listings, it doesn’t factor into the ranking algorithm at all. Nor do meta keywords.
So, still write your descriptions for the results pages, but do NOT waste your time here. However, there is a form of metadata that can be useful for rankings and user experience: structured data.
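Structured data is usually added as a JSON-LD script in the page’s <head>, using vocabulary from schema.org. A minimal sketch for an article, where the headline, author, and date are all placeholder values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Article Title",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2024-01-15" }
</script>
```

Google provides free testing tools to validate this kind of markup before you deploy it.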