Technical SEO is what gets high-quality content the ranking it deserves. Learn from these common issues (and quick fixes) to push your website to the top of the SERP.
Here, you’ll find:
- Why technical SEO matters
- How to determine your current technical SEO health
- The six most common technical SEO issues
- Tips to improve your technical SEO
Search engine optimization (SEO) is what puts brand content in front of target audiences. So it’s no surprise that seasoned marketers consider it a pillar of a well-rounded digital marketing plan.
But it takes more than on-page SEO to appease the great and powerful Google — ruler of search in the land of engines.
If you truly want to rank, you also need technical SEO.
Technical SEO is equally as important as on-site optimization, if not more so. And like peanut butter and jelly, the two just go better together.
Unfortunately, companies and marketing teams often ignore it to their site’s demise. So let’s take a closer look at how technical SEO can help (or hurt) your website performance.
On-page SEO vs. technical SEO: What’s the difference?
Both on-page SEO and technical SEO help your website rank in search engines like Google, Yahoo!, and Bing. But how you use them to improve your site is where the two differ.
Search engine optimization is all about content. Starting with keyword research and carrying through content creation, on-page SEO requires:
- Incorporating keywords into page titles, headings (like H2s), and throughout on-page content
- Writing metadata, such as meta descriptions and meta titles, using keywords
- Inserting relevant internal links into blog posts and web pages with keywords
Technical SEO involves:
- Improving page load speed
- Ensuring the navigation and site structure are correct and intuitive
- Making the site user-friendly for mobile devices using responsive designs
- Optimizing images and videos
- Using schema markup
- Adding “author” HTML tags to author bio sections
This is a wildly abbreviated overview of the work that goes into both strategies, but in short: one makes site changes you can usually see (on-page SEO), while the other makes background site changes you experience (technical SEO).
Why is technical SEO so important?
You built a monster of a content marketing strategy that includes publishing high-quality, in-depth content on a weekly basis. Plus, you optimize the on-page elements of each post. But your content barely makes it to the second page of Google, even though it’s 10x better than what’s on the first page.
What’s happening?
Diving head-first into your technical SEO will shed some light. For instance, maybe your site’s performance is hurting the user experience (UX) — a critical Google ranking factor. If your site takes too long to load or has broken links throughout the site, then people will bounce away.
When people bounce, so does your site ranking — right down the drain.
So by creating a great UX (and equally excellent content) on both the front end and the back end, you increase the odds of people sticking around.
Unfortunately, only 35% of sites see high page views per visit, according to the HubSpot Inbound Marketing Trends Report.
Ignore your technical SEO, and your on-page optimization efforts won’t be enough to increase your search engine results page (SERP) position, traffic, and retention. Then comes the potential demise of your site’s ranking.
In fact, it’s why 33% of marketers surveyed by HubSpot cite mobile-friendliness as the trend they plan to leverage.
The top marketing channels echo this sentiment, with 36% of marketers focusing on their website and blog. Not far behind are content marketing and SEO marketing, all of which rely on excellent on-page and technical SEO.
Smart move, given that the largest share (27%) of internet traffic still comes from search.
The 6 most common technical SEO issues
There are plenty of mistakes site owners can make when it comes to SEO. But here are the top six high-priority technical SEO issues Veronica Baas, Lead Strategist at HawkSEM, sees most often:
- Sitemap.xml crawl errors
- Duplicate content issues
- Unintentional temporary redirects
- Unoptimized images
- Poorly coded JavaScript & CSS
- Broken internal and external links
Let’s break it down.
1. Sitemap.xml crawl errors
Sitemaps often have old and outdated pages that are broken or redirect traffic. Removing these is critical if you want Google’s spiders to crawl and index your pages.
Note that larger sites have bigger sitemaps, and a muddled mess of a sitemap can stall indexing.
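For reference, a sitemap is just an XML file listing the canonical URLs you want crawled and indexed. Here’s a minimal sketch (the example.com URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only live, indexable pages that return a 200 status belong here -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/technical-seo-checklist/</loc>
    <lastmod>2023-01-10</lastmod>
  </url>
  <!-- Prune deleted or redirected URLs so crawlers don't waste budget on them -->
</urlset>
```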
2. Duplicate content issues
Content marketers know duplicate content is a big fat NO. But sometimes duplicate pages are necessary for site functionality — for example, page two or three of a blog or an ecommerce product category.
These pages can co-exist, but they don’t need to be indexed for search. If you do want them in Google, differentiate the content so it passes the non-duplicate test.
Another duplicate content issue is having both www and non-www (or both http and https) versions of your domain. You only need https pages, served at either the www or the non-www address.
For instance: https://mysite[dot]com or https://www.mysite[dot]com (one or the other, not both). When both resolve, Google can treat them as two separate domains, though it normally detects the duplicate and indexes just one.
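The usual fix is to pick one preferred version, 301-redirect the others to it, and declare it with a canonical tag. A minimal sketch (placeholder domain):

```html
<!-- In the <head> of every variant, point to the one preferred URL -->
<link rel="canonical" href="https://www.example.com/">
```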
3. Unintentional temporary redirects
A 302 or 307 redirect is temporary, while a 301 redirect is permanent. The difference: a temporary redirect is for a page you’re taking down but plan to republish one day (a Black Friday sale page, for example).
A permanent redirect, such as a 301, tells Google the page is going down forever and won’t be coming back…ever (hit the road, Jack).
“When using 301 redirects, Google will pass the web page authority of the old page to its new redirect destination, giving the new page a boost,” says Baas. “Temporary redirects don’t have this same effect so can be a big waste of page authority.”
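On an Apache server, for example, the difference comes down to the status code in a one-line directive. Here’s a sketch with placeholder paths (directive syntax varies by server):

```apache
# 301: permanent. Passes the old page's authority to the new URL
Redirect 301 /old-sale-page/ https://www.example.com/new-sale-page/

# 302: temporary. Keeps authority parked on the original URL while it's offline
Redirect 302 /black-friday/ https://www.example.com/coming-soon/
```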
4. Unoptimized images
Unoptimized images can negatively impact site speed and visual stability (a quick markup fix is sketched after this list). This can happen when site owners:
- Add ginormous images to web pages
- Skip proper caching and/or a lazy loading setup
- Use the wrong image file format (next-generation formats like WebP are optimal for speed)
- Don’t set height and width HTML attributes
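In practice, much of this is attribute hygiene in the image markup itself. Here’s a sketch (file names and dimensions are placeholders):

```html
<!-- Next-gen format with a fallback, explicit dimensions, and lazy loading -->
<picture>
  <source srcset="/images/team-photo.webp" type="image/webp">
  <img src="/images/team-photo.jpg" alt="The team at our annual retreat"
       width="1200" height="800" loading="lazy">
</picture>
```

The width and height attributes let the browser reserve space before the image downloads, which keeps the layout from shifting while the page loads.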
5. Poorly coded JavaScript & CSS
Now for some “geek” talk. If you’re a coder or familiar with programming languages, this is for you. When JavaScript and CSS (common languages used to build websites) are coded poorly, they bog down site speed. (Two quick mitigations are sketched after the list below.)
This can happen when there are:
- Unused JavaScript/CSS files
- Unminified JavaScript/CSS files
- Render-blocking JavaScript
- CSS resources blocking the first paint
- Excessive DOM tree size/node count
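Two of the simplest mitigations: defer non-critical JavaScript so it doesn’t block parsing, and keep non-critical stylesheets from blocking the first paint. A minimal sketch (file paths are placeholders):

```html
<!-- defer: download in parallel, execute only after the document is parsed -->
<script src="/js/app.min.js" defer></script>

<!-- A stylesheet scoped to print media won't block rendering the page -->
<link rel="stylesheet" href="/css/print.min.css" media="print">
```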
6. Broken internal & external links
It happens to the best of us, even those with Trello boards organized to a T.
You move a page, forget to update all the internal links pointing to it, and now you have a broken link problem. It’s an easy fix if you catch it early and use an SEO tool like Screaming Frog to spot broken links.
Then there are broken links outside of your realm. External pages change or disappear, and dead links on your site signal to Google that the site is outdated, hurting crawling efficiency and the user experience.
In other words, bad news for your core web vitals and page ranking.
How can you improve your technical SEO?
Dealing with poor site performance ends here. It’s time to take the steps necessary to make your site worthy of visitors (and the SERPs).
Here’s a technical SEO checklist you can use now to improve your site today.
1. Audit your current SEO efforts
The first step to improving your technical SEO strategy: conduct an SEO audit. A proper SEO audit is a mix of a manual walk-through of your site coupled with the use of trusted tools, such as Semrush and Screaming Frog, to find common technical issues.
Some issues auditing tools can detect include:
- Duplicate content
- Broken internal links
- Invalid robots.txt format
- Non-secure pages
- Slow page load speed
- Multiple canonical URLs
After the audit, identify what’s broken and then address those errors first. Some audits reveal a ton of red flags, which can be overwhelming. That’s why it’s a good idea to have an SEO expert and a web developer review and help you address the more technical issues.
Pro tip: Register your site with Google Search Console and Bing Webmaster Tools. These free tools can find technical issues on your website.
Here’s a short clip from our recent webinar that delves into how to perform a technical SEO audit.
2. Understand how search bots crawl and index
Google has billions of pages indexed. It and other search engines use bots that crawl your website’s source code. But these bots don’t “see” web pages the same way humans do.
It takes more than just producing great content for Google to find it, rank it, and reel in traffic. (Although great content is a must.) If a bot can’t find or understand your pages — even if your content is the best on the internet — it can’t rank in the search results.
Despite how powerful search engines are, their bots crawl a finite number of pages across the internet. Because of this, you want to make sure they’re crawling the most important, high-quality pages on your site — not wasting time on low-quality pages. This is referred to as “crawl budget.”
Crawl budget is key for larger websites. If you have thousands of pages but your crawl stats show that Google’s only crawling a portion of them each day, it’s missing big parts of your site. You can improve how your crawl budget is spent by blocking crawlers from irrelevant pages (see the robots.txt sketch after this list). These could be:
- Admin or login pages
- “Thank you” or confirmation pages
- Paginated pages
- Testing and development pages
- PPC landing pages
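A few Disallow lines in your robots.txt cover most of these. Here’s a sketch with placeholder paths; just be careful not to block pages you actually want indexed:

```txt
User-agent: *
Disallow: /wp-admin/     # admin and login pages
Disallow: /thank-you/    # confirmation pages
Disallow: /dev/          # testing and development pages
Disallow: /lp/           # PPC landing pages

Sitemap: https://www.example.com/sitemap.xml
```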
Pro tip: Check which pages are indexed in search engines by doing a simple site: [website URL] search in Google. You can click through all the indexed results to see if a chunk of pages might be missing or if there are pages that shouldn’t be indexed.
3. Implement structured data
One way to improve how bots understand your website content is through structured data or schema markup. This is important for SEO and to prepare for the future of search, as Google and other engines continue to personalize the user experience and answer questions directly on their search engine results pages.
There are hundreds of different schema types, and the best fit for your website depends on your product or service, industry, and the type of content you offer. Google’s Structured Data Markup Helper is a highly useful tool if you’re not familiar with structured data. It walks you through the steps to add structured data to your site, notes which items need to be marked up, and creates the HTML for you.
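For example, a blog post might describe itself with Article markup in a JSON-LD script tag. Here’s a minimal sketch with placeholder values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "The 6 Most Common Technical SEO Issues",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2023-01-15"
}
</script>
```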
Structured data helps you stand out in the search results and increases the likelihood of your site appearing in SERP features like rich results (aka rich snippets) or People Also Ask. This can be hugely beneficial for your site. If you already have structured data, you can check whether it’s working properly with Google’s Rich Results Test tool.
4. Secure your site
The “http” in URLs stands for Hypertext Transfer Protocol, which allows information to pass between web servers and clients. The “S” stands for secure. If a website isn’t secure, any information a user inputs on the page (like their name, address, or credit card details) isn’t protected and can be stolen.
On a secure website, data passed between the browser and the server is encrypted, so sensitive information can’t be intercepted in transit. Having a secure site can give you a small ranking boost. Plus, web browsers like Chrome are getting more aggressive about warning searchers when they’re on a non-secure site and could be at risk.
Check if your website is secure by looking at your browser. If your site shows a lock icon next to the URL, it’s secure. Secure domains also show “https” in the address bar, vs. just “http.”
Having a secure vs. non-secure site can be the difference between a user converting or not converting. If your website is secure, your audience can feel confident that their personal data is safe and that your brand is trustworthy. On the other hand, arriving at a site with warnings that the page they’re on isn’t secure may make users uneasy and cause them to bounce.
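Once your SSL certificate is installed, make sure every http request lands on the https version of the page. Here’s a sketch for an Apache server with mod_rewrite enabled (the domain is a placeholder):

```apache
RewriteEngine On
# Permanently redirect all http traffic to the https version of the same URL
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]
```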
5. Ensure your site is mobile-friendly
Fun fact: 41% of web traffic comes from mobile users (are you surprised?).
Because of this, Google has a mobile-first indexing approach — aka mobile-friendly sites get priority ranking.
Not having a mobile-friendly website is no longer an option if you want to rank in search engines. A good rule of thumb is to have a responsive website instead of a separate mobile site. You can test your mobile-friendliness by using Google’s Mobile-Friendly Test tool.
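Responsive design starts with the viewport meta tag plus CSS media queries. A minimal sketch (the class name is a placeholder):

```html
<!-- Tell mobile browsers to render at device width, not a zoomed-out desktop view -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Hide the desktop sidebar on narrow screens */
  @media (max-width: 600px) {
    .sidebar { display: none; }
  }
</style>
```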
But it’s not enough for a site to be simply mobile responsive. Your site should also have a positive overall mobile user experience. Mobile users are very fickle and will bounce quickly if they can’t find what they’re looking for fast.
Nearly 45% of websites see bounce rates between 21% and 40%.
By optimizing for mobile users, you can push your site closer to the 1%-20% bounce rate to help your ranking (and user experience). Unfortunately, it’s a step many marketers forget to take since we’re so often working on desktops. Google Search Console can also alert you to any mobile usability issues, like clickable elements being too small or content being too close to the screen’s edge.
Examples of sites that are mobile-friendly (left) and… not so much (right).
But don’t stop here — accessibility extends beyond mobile devices.
“Accessibility is a ranking factor/technical SEO topic that I really anticipate will take off next year,” says Baas. “Google just updated their PageSpeed Insights test last week to include a TON of new information including a new section on accessibility.”
Often, when Google improves a tool to include additional insights on a particular ranking factor — in this case accessibility — it means they’re about to update the algorithm to put more weight on this ranking factor (or already have).
“My recommendation to SEOs would be to focus more on the optimization opportunities for accessibility now lined out in this test,” Baas advises.
6. Review your website architecture
We’ve highlighted the importance of website architecture for SEO before. Basically, its goal is to make navigating your website easy, clear, and intuitive while making it easier for search engines to crawl your pages. The main components of website architecture are:
- Navigation
- Internal links
- URL structures
- Metadata
Navigation
Navigation is important for user experience as well as search engines. Google bots crawl links and your XML sitemap, but they also use navigation to determine how important certain pages are on your site.
Because of this, you want your most important pages linked as “tier 1.” Ideally, don’t have more than seven tier 1 items unless you have a really large website, and I usually don’t recommend linking tier 4 pages and beyond in the navigation to avoid clutter.
It’s also important to have footer navigation that lives on every page of your site. That way, when bots are crawling, they’re crawling your footer links. It’s common to link your privacy policy, support page, local info, and social media profiles in the footer.
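Structurally, that just means plain, crawlable HTML links: a main nav holding your tier 1 pages, and a footer that repeats on every page. A sketch with placeholder links:

```html
<nav>
  <!-- Tier 1: the handful of pages you most want crawled and ranked -->
  <a href="/services/">Services</a>
  <a href="/pricing/">Pricing</a>
  <a href="/blog/">Blog</a>
</nav>

<footer>
  <a href="/privacy-policy/">Privacy Policy</a>
  <a href="/support/">Support</a>
</footer>
```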
Internal links
When bots are crawling your content, they’re following both internal and external links. Because of that, you want to use internal links to guide them to the important pages on your site.
You usually don’t need to link to your homepage internally since it’s going to be your highest authority page anyway. You should, however, link to internal content, such as product pages and blogs.
Also, be sure to use keywords in your anchor text instead of generic phrases like “learn more” or “click here.” Bots use anchor text to determine the topic of the content you link to.
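For example (the link below is hypothetical):

```html
<!-- Weak: the anchor text tells bots nothing about the target page -->
<a href="/blog/technical-seo-checklist/">Click here</a>

<!-- Better: keyword-rich anchor text describes the destination -->
<a href="/blog/technical-seo-checklist/">technical SEO checklist</a>
```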
URL structures
If your website host automatically creates URLs for you when you add new pages to your site, you may not think about URL structures much. But these structures are yet another signal that explains what your page is about to search engine bots. Check out these two examples:
- https://www.imdb.com/title/tt0120338/
- https://hawksem.com/blog/b2b-paid-social-media-marketing-strategies/
Not to toot our own horn here, but one of those URLs makes it much clearer what the page is going to be about than the other. Use keywords in your URLs when possible, and have your URL structures follow your navigation’s structure (note how the blog post above sits under the “/blog/” root category).
Pro tip: Avoid underscores in your URLs. Bots ignore underscores and will think anything separated by an underscore is one long word, so use hyphens instead.
Metadata
Metadata refers to things like your site’s page title and meta description, which summarize the page’s content. These elements improve your click-through rate when you follow best practices like:
- Including keywords
- Adding meta tags
- Using pipes or hyphens to separate words
- Keeping titles under 60 characters
- Keeping meta descriptions under 155 characters
The page title may get cut off in search results if it runs too long, especially on mobile. It’s also worth noting that Google now displays the URL above the page title on the SERP, which is another reason your URL structure should be easy to read.
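Put together, page metadata lives in the page’s head. Here’s a sketch with placeholder copy that stays under the character limits above:

```html
<head>
  <!-- Under 60 characters, keyword first, pipe as a separator -->
  <title>Technical SEO Checklist | 8 Fixes That Boost Rankings</title>
  <!-- Under 155 characters, summarizes the page and earns the click -->
  <meta name="description" content="Slow pages? Broken links? Use this technical SEO checklist to find and fix the most common issues hurting your rankings.">
</head>
```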
7. Optimize for page speed
Page speed is another ranking factor that can help or hurt your site’s position on the search engine results page. If your site is too slow, it’ll fall in the ranks. Bad news for site owners who want to create an interactive experience with moving graphics.
Plenty of elements can bog down your page load time, including:
- Large videos
- Unoptimized images
- Too many graphics
- Excessive ads
- Clunky or outdated plugins
- Too much Flash content
“Page speed updates in the last year have really made this more important,” says Baas. “Google has improved their tools for webmasters to use to improve page speed, such as the Google Search Console page experience report and the PageSpeed Insights test.”
To improve your page load speed, you must first identify what’s slowing it down. Then you can remove, optimize, or reduce the culprits. For example, resizing images to reduce file size is one way to improve website performance, as is compressing PNG files with a tool like TinyPNG.
So what’s the target page load speed? Roughly 25% of websites load within five seconds, and if you want to beat 75% of the web, your site should load within 2.9 seconds. Only 6% of the web loads in 0.8 seconds or less.
Now you have an idea of how fast your site needs to be to compete with the majority of the web.
8. Check backlink quality
Lightning strikes. Tornadoes. Earthquakes. Some things are outside of our control. And the same holds true for parts of your link-building strategy: you choose which internal and external links to include on your site.
But what about the websites that decide to link to yours?
Google and other search engines use backlinks to gauge the quality of your site. It used to be that the more backlinks you had, the higher your site ranked. But after link farms sprang up and polluted the internet with spammy backlinks, Google had to change its tune.
Today, backlinks only count for your site if the linking page is relevant to the content it links to. Remember, Google is all about helping users find information. So every link should lead the user to more information they’ll find helpful.
To check your site’s backlink profile, you can use site audit tools like Ahrefs, Moz, and Google Search Console.
Although you can’t control who links to your site, you can improve your backlink profile by doing things like:
- Writing guest blog posts that link back to your site
- Creating informative posts other authority sites will want to link to
- Finding mentions of your brand or product and asking for a backlink (if the site is relevant and high-quality)
Once you make improvements to your backlink profile, ask Google to re-crawl your site (you can request this in Google Search Console) so it can re-rank it.
Is technical SEO difficult?
Technical SEO is doable when you have the right tools (and the time) to run an audit and make improvements. It’s a tedious task that requires understanding why something is broken and how to fix it.
Unfortunately, most website owners are too busy to audit, analyze, and improve their technical SEO. Even with tools like Screaming Frog to run the analysis, you still have to make the decisions and implement the fixes. In some cases, that means a long to-do list of tasks, like resizing thousands of images, redesigning your site for mobile-friendliness, and updating broken links.
Don’t have the time (or mental capacity) to do it all yourself? Most don’t. And it’s a good reason to enlist the help of a full-service digital marketing agency like HawkSEM. Not only do we offer SEO (and technical SEO) services, we also manage content marketing, PPC, and conversion rate optimization.
But there’s more (wink).
At Hawk, we also have ConversionIQ, our proprietary reporting software that connects all your marketing platforms and reporting data into one easy-to-navigate dashboard. This lets you pull data from your top-performing keywords to inform on-site title tags and H1 content that drive more conversions.
The takeaway
The more technical side of SEO can be intimidating. After all, it’s filled with code, jargon, and robots.
But by getting a handle on your technical SEO, you can be confident that your efforts are more thorough, well-rounded, and poised for maximum search engine visibility.
Ready to increase your tech SEO know-how?
Check out this webinar recording, “The Importance of Technical SEO” for even more insights. Need help with your technical SEO? Get in touch.