A lot of business and website owners balk at the words “technical SEO.” To many people, “technical” suggests a concept that is inherently difficult to understand, while SEO in general can seem confusing if you don’t have a background in web design or marketing.
The term itself is intimidating, but the meaning is straightforward. Technical SEO is basically the behind-the-scenes stuff that helps search engines—primarily Google—find and index your site. In other words, it’s all the SEO you do for your site, minus the actual content.
Optimizing a site’s content with relevant keywords and a user-friendly experience is called “On Site SEO.” There’s “Off Site SEO” as well, which refers to how many other websites link to yours. There is a lot of crossover between the three types of SEO, particularly On Site and Technical. As you will see, many of the fixes for technical SEO issues will improve your user experience, and vice versa.
This article focuses on technical SEO for beginners.
From a technical standpoint, it’s important to stay updated on the changes made to search engine algorithms. When talking about search engines, we’re going to focus on Google. While there are other search engines, Google is still far and away the most popular, and it’s unlikely that will change anytime soon.
Fast Sites Mean High Rankings: Speeding Up Your Site
From a user’s perspective, slow sites are (obviously) less appealing than fast ones. You might have already guessed that your site speed directly correlates to how much time your visitors will want to spend on your page. The numbers back this up: data from the gurus at Aykira shows that 57% of users will abandon a site if it takes more than 3 seconds to load, while 47% of users expect your page to load in less than 2 seconds.
But what you might not know is that a slow site also lowers your ranking in Google’s search engine algorithm. Google considers something called Time to First Byte (abbreviated to TTFB) when ranking sites. As the name suggests, this is the time it takes a web browser to load the first byte of information from your page.
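If you want a quick read on your own TTFB, curl can report it from the command line. Here is a minimal sketch, assuming curl is installed and using a placeholder URL:

# print the seconds until the first byte arrives (URL is a placeholder)
curl -s -o /dev/null -w "TTFB: %{time_starttransfer}s\n" https://www.example.com/

The time_starttransfer value includes DNS lookup and connection setup, so treat it as a reasonable proxy for what Google measures rather than an exact replica.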
If your users perceive your site as slow, not only will they leave (with a feeling of frustration towards your brand), but they may leave by clicking the back button so they can choose a different search result. This is called pogo-sticking, and if it happens enough, Google will lower your ranking, replacing you with a site that is more user-friendly.
Diagnosing Site Speed Problems
Luckily, there are several tools you can use to diagnose problems with your site speed. A great place to start is Google’s PageSpeed Insights Tool. You should shoot for a score above 80, although obviously the higher the better.
Another tool to check for further insight into your site’s speed is GTMetrix. GTMetrix will give you a prioritized list of the elements slowing down your site, with a heavier focus on the technical side than on user experience. Make sure to click on the “Waterfall” tab, which shows exactly how long each request took to complete. This can help you see whether an individual request is eating up a significant portion of your loading time.
The two tools may give you different ratings; this is because they prioritize different elements. Using multiple diagnostic tools helps you get a more complete picture of your site’s issues.
If you feel a bit overwhelmed by your site’s long list of problems, don’t worry. A few small changes can dramatically improve your site performance.
Quick Fixes for Site Speed Problems
Now that you have your diagnosis, it’s time to get to work repairing the issues. Unfortunately, explaining every factor that slows down websites (and how to fix them) would take quite a bit of time. For the sake of brevity, we’ll focus on a couple of easy fixes.
Optimizing Images. Most images contain a lot of metadata, which adds unnecessary loading time and isn’t particularly useful for your purposes. The best course of action is to compress your images. There are a few good tools for this.
If you have a WordPress site, there are plugins that will automatically compress your images for you, like WP Smush. ShortPixel is another great plugin that is a little more manual.
You can also compress images before you upload them. A number of easy-to-use online sites will compress your image files for you, including ImageOptimizer and Optimizilla.
Make sure to pay attention to your file sizes and the type of file you use. Look for images where you can compromise quality in exchange for a smaller size. Compressed JPEGs are usually smaller than the higher-quality PNG files. SVG and other vector images should be used whenever possible, as they can be scaled without the quality suffering.
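If you prefer working from the command line, ImageMagick can strip metadata and recompress an image in one step. A minimal sketch, assuming ImageMagick is installed and with placeholder filenames:

# remove metadata and recompress at roughly 85% JPEG quality
convert photo.jpg -strip -quality 85 photo-optimized.jpg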
Combining Images into Sprites. “Sprites” are image files made up of many smaller images. They let the browser fetch many images with a single request, instead of requiring a separate request for each individual image. You then use CSS to tell the browser which area of the sprite to display.
This makes sprites useful for images that appear multiple times, like your logo, menu icons, or navigation tools.
An easy way to create a sprite is to use an online sprite-generating tool like SpritePad. On SpritePad, you upload your images to the site, then download the result as a single PNG file along with the corresponding CSS code.
For a more hands-on approach to CSS Image Sprites, check out W3School’s guide.
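To give a rough idea of what sprite CSS looks like, here is a minimal sketch. It assumes a hypothetical sprites.png containing two 32x32 icons laid side by side; the class names are illustrative:

/* every icon shares the same sprite sheet */
.icon {
  background-image: url('sprites.png');
  background-repeat: no-repeat;
  display: inline-block;
  width: 32px;
  height: 32px;
}
/* shift the background to show the icon you want */
.icon-home { background-position: 0 0; }
.icon-search { background-position: -32px 0; }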
Content delivery networks. CDNs work on a simple premise: when the information on your site is physically closer to the people trying to access it, performance increases. That means your site loads faster and works better, things every business should aspire to.
How do CDNs get your site physically closer to users? By using an interconnected system of cache servers located in different places, each of which hosts the information on your site. In practice, it means that someone in Florida may be pulling up your site’s data from a server in Georgia, while a California user may get it from a different server in Nevada. Since the data doesn’t have to travel as far for each user, loading times are decreased.
Additionally, CDNs like CloudFlare also provide other benefits, such as giving your site additional security against cyberattacks.
You don’t have to solve every problem that GTMetrix or PageSpeed Insights identifies, but addressing the highest-priority ones should lead to a marked improvement in your site speed. It’s also useful to test a sample of several pages from across your website, since results can vary from page to page.
Remember, higher speeds mean higher rankings!
Mobile Friendliness
Another factor that Google’s search rankings take into consideration is mobile friendliness. Over half of all web browsing now occurs on mobile devices, so it makes sense that Google values this quality highly.
How do you know if your site is mobile friendly? Once again, the best tool for the job is provided by Google. The Mobile Friendly Test Tool will tell you if Google believes that your site will be easy to use on mobile devices or not. It will also provide a short list of improvements you can make to ensure a more user-friendly experience on mobile platforms. The suggestions may be as simple as changing the size of the text or spacing the links farther apart.
Usually, to improve your page’s mobile friendliness, the best course of action is to create a responsive design.
You could create a separate URL for mobile and desktop, which was a common early approach to reach mobile users. This would leave you with two different sites to update, however, and wouldn’t account for the widely varying size of modern tablets, phones, and laptops—which is why this approach is largely dying out.
Creating a Responsive Design
Let’s take a quick look at the backbone of responsive website design.
Elements in responsive designs have their widths set as percentages instead of fixed pixel values. For example, this is non-responsive CSS:
#body {
width: 600px;
}
But when you change it to this…:
#body {
width: 50%;
}
…the element becomes responsive. Instead of having a fixed size, it will automatically resize to fill half the width of its container (here, the screen).
You can also use media queries to apply different CSS values depending on screen size. For example:
@media screen and (min-width: 600px) { CSS code here… }
Now the CSS code you enter will only apply when the screen is at least 600 pixels wide.
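Putting the two ideas together, here is a minimal sketch of a responsive layout. The selectors are illustrative, not taken from any particular site:

/* default: a single full-width column for small screens */
#body { width: 100%; }
#sidebar { display: none; }

/* on screens at least 600px wide, show a sidebar next to the body */
@media screen and (min-width: 600px) {
  #body { width: 75%; float: left; }
  #sidebar { display: block; width: 25%; float: left; }
}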
This is by no means all there is to mobile site design, but it’s a good place to start. Another important thing to remember is that most mobile users browse on slower connections than their desktop counterparts, so the speed techniques mentioned above will prove doubly useful here.
For an in-depth look at creating a responsive web design, check out Treehouse Front End Developer Nick Petit’s guide.
Another great site for learning more is This Is Responsive, which features responsive design patterns and a comprehensive list of hundreds of links to responsive design resources.
Building a Strong Site Architecture
Google sends out its army of spiders to crawl the web on a regular basis. Having a clear, streamlined site architecture will help the spiders index and rank your webpage faster, assisting Google in tracking your latest content updates. In general, it will also make your site more navigable for human visitors.
Siloing Content
One of the ways Google will crawl a site is by following internal links. “Siloing” means structuring your site so that all content is logically grouped into categories. Again, here the user experience intersects with the technical side of things.
Ideally, the spiders can land on the homepage or any other post, and easily visit all other relevant and recent posts within the category. You should organize your site so it only takes 3-4 clicks to get from any given page to any other page.
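A silo usually shows up in the URL structure itself. As a hypothetical example for a coffee blog:

example.com/brewing/ (category page linking to every brewing post)
example.com/brewing/french-press-guide/ (a post inside the category)
example.com/beans/ (a separate silo for a separate topic)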
Here’s a deeper look at siloing your site’s content from Web Marketing Today.
Creating XML and HTML Sitemaps
HTML sitemaps are created for humans, but they are also helpful for spiders navigating your site. Usually the link to your HTML sitemap appears in the footer of your website. XML sitemaps, on the other hand, are machine-readable files that list your site’s URLs, one per entry, specifically for search engines.
Larger sites can require multiple sitemaps. According to Google, a single sitemap shouldn’t contain more than 50,000 URLs or exceed 50MB. It’s a good idea to create different sitemaps for each type of content, e.g. videos, articles, and images. Both types of sitemaps are useful, but to start you should absolutely create an XML sitemap for technical SEO purposes.
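For reference, here is a minimal XML sitemap following the sitemaps.org protocol; the domain and date are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2017-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>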
There are a lot of tools online to generate sitemaps if you don’t want to build one manually. WordPress users can use specialized plugins like Google XML Sitemaps. Another tool that’s very easy to use is located at web-site-map.com. (The site will also check for broken links.)
Now that you have your sitemaps, you can submit them to Google Search Console one at a time.
It’s also important to add your sitemap’s location to your robots.txt file, so other spiders know where to look for it. Google will also check here if there is a problem with your submission.
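Pointing spiders to your sitemap takes a single line in robots.txt (placeholder domain again):

Sitemap: https://www.example.com/sitemap.xml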
Remove Crawling Errors
Lastly, to optimize your site structure for crawlers, you need to remove “crawl errors.” Among other reasons, Google limits how much time it will spend crawling your site (its “crawl budget”), so you don’t want to blow that on a bunch of dead pages.
A list of these errors can be found in Search Console under “Crawl” and then “Crawl Errors.” Don’t be intimidated by the potentially thousands of errors that pop up; many of these problems can be addressed in large groups. Here’s a guide to fixing crawl errors from Moz.com.
Fix Robots.txt Restrictions
In July 2015, Google sent out this message to a huge group of webmasters:
“Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in suboptimal rankings.”
Luckily, shortly afterward a quick way of removing restrictions was posted on StackOverflow and spread around the web.
Simply add this to your robots.txt file:
User-Agent: Googlebot
Allow: .js
Allow: .css
This should open up your files for Googlebot.
Remove Duplicate Content
Duplicate content is any content that exists in identical form somewhere else on the internet, either on your own site or on someone else’s.
Google has been slowly learning to recognize duplicate content, which it considers a negative element of the user experience on a site. Duplicate content can also prove confusing to search engines because they’re not sure which of the pages is most relevant. To avoid losing traffic, there are a few methods you can use to get rid of duplicate content.
First, you have to identify the duplicates. There are a few good tools that make this easy.
Using the trusty Google Search Console, users can find issues under “Search Appearance,” then “HTML Improvements.” Click on “Duplicate meta descriptions” to see a list of pages with duplicate content.
The online tool Siteliner can also provide a free, quick, and easy analysis of duplicate pages. It arranges them into readable graphs and percentages. This is great for smaller sites, but Siteliner caps out at 250 pages.
If you can, simply delete the duplicate pages. If you can’t, Woorank provides an in-depth article with 7 different ways to get rid of duplicate content.
Conclusion
This guide only scratches the surface of technical SEO, a skill set that takes years of experience and practice to master. Don’t despair, however: even applying one or two of these methods can drastically improve your site ranking. Hopefully, this guide provided you with quick fixes for some of the more common technical SEO issues, along with resources to explore the subject more deeply.