How to Perform a Technical SEO Audit for a Client

Lindsay Halsey

Lindsay Halsey is a co-founder of Pathfinder SEO. She has over 10 years of experience working in SEO with small to large businesses. Lindsay focuses on teaching site owners, freelancers, and agencies how to get found on Google via a guided approach to SEO. Stay in touch on Twitter - @linds_halsey.

The idea of having to perform a technical SEO audit for a client can be daunting. But we’ve done hundreds of them, and we can reassure you that it doesn’t need to be.

Performing a basic technical SEO audit is relatively easy, and it will provide actionable insights that can lead to better search engine rankings for a client's site. Let’s dive into what a basic technical SEO audit entails.

The Basics of Technical SEO

Search engines need to be able to crawl and index web pages before they can be ranked and clicked on in search results. Optimizing technical SEO means making it easy and efficient for search engines to crawl and index the pages on a website.

Technical SEO can be extensive and highly specific. In the vast majority of cases, though, you won’t need to worry about going down any rabbit holes. Instead, you can simply focus on these five key technical SEO elements:

  • Robots.txt
  • XML Sitemap
  • Page Experience (including mobile usability, security, and speed)
  • Page Not Found Errors
  • Broken Links

In order to know where to start with your technical SEO work, you’ll want to audit each of the elements above.

Beginning a Technical SEO Audit

You can easily uncover technical SEO issues on a site by going through each of the five key elements above. Here's how to audit each one.

Auditing Robots.txt

A site’s robots.txt file tells the search engines which pages or content to ignore when crawling. Google’s own robots.txt documentation is a helpful resource if you need a refresher. When auditing this file, you should be looking for two things:

  • Does the site already have a robots.txt file?
  • If so, does the file contain a good set of instructions?

Each content management system (WordPress, Squarespace, Wix, etc.) has its own default robots.txt file. You’ll need to figure out which CMS the site you’re auditing uses, then familiarize yourself with the default robots.txt format for that CMS. Google recommends leaving a site’s robots.txt set to the CMS default, and in the majority of cases that default works well.

Pull up the site’s existing robots.txt file by going to yourdomain.com/robots.txt, and compare it to the CMS default.
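
For example, WordPress generates a default robots.txt that typically looks something like this (the exact contents can vary by version and by the plugins installed):

# approximate WordPress default: block the admin area, but allow the AJAX endpoint
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php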

One common red flag is a robots.txt file that looks like this:

User-agent: *
Disallow: /

This tells the search engines not to crawl any pages on the site, and typically appears in the robots.txt file for a website still in development.

If the site is live, we want to replace Disallow: / with an empty Disallow: directive so that the search engines are free to crawl the site.
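
On a live site, the corrected file can be as simple as this (an empty Disallow rule allows everything to be crawled):

User-agent: *
Disallow: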

Common Robots.txt audit recommendations include:

  • No action is needed. Your site has a great robots.txt file (CMS default).
  • Your site does not have a robots.txt file. We suggest creating one.
  • Your robots.txt file is telling the search engines to avoid crawling your website. Immediate action is needed because your website is live but invisible to the search engines.

Auditing an XML Sitemap

An XML sitemap is, in a sense, the opposite of a robots.txt file: instead of listing content the search engines should skip, it lists all of the pages/content that you want the search engines to find and crawl.

Go to yourdomain.com/sitemap.xml. This is the most common place for an XML sitemap to live. If it’s there, evaluate the XML sitemap. If it’s not there, try to hunt it down: check the robots.txt file for a Sitemap: line, or look at common alternatives like yourdomain.com/sitemap_index.xml.
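
If you’re not sure what a healthy sitemap should look like, here is a minimal example in the standard sitemaps.org format (the URLs are placeholders). A sitemap generated by a CMS or SEO plugin will list every page you want indexed:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page you want indexed -->
  <url>
    <loc>https://yourdomain.com/</loc>
  </url>
  <url>
    <loc>https://yourdomain.com/services/</loc>
  </url>
</urlset>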

When it comes to XML sitemaps, we look for the following:

  • Does the site already have an XML sitemap?
  • If so, is it properly configured to include the content we want to be indexed?
  • Has the XML sitemap been submitted to Google via Google Search Console? Check the Sitemaps report in Search Console to verify that it has been both submitted and accepted by Google.
  • Are all of the content types in the sitemap high-value (i.e., neither thin nor duplicate pages)?

Common XML sitemap audit recommendations include:

  • Your site does not have an XML sitemap. Create one and submit it to Google via the Search Console.
  • Your site has an XML sitemap, but it needs to be configured. Currently, it includes thin or duplicate content. Once your sitemap has been edited, we recommend re-submitting it to Google via the Google Search Console.
  • Your site has an XML sitemap and it’s great. Now, we need to submit it to Google via the Google Search Console.
  • Your site has an XML sitemap. It’s great and has been submitted successfully to Google via the Google Search Console. No action is required.

Auditing Page Experience

To audit page experience, you need to evaluate its component parts: speed, mobile usability, and security.

Page Speed

Page speed is an increasingly important factor in search engine algorithms. It can be challenging to address because Google continually changes the rules of the game and creates new benchmarks. When it comes to speed, “good” is good enough.

Mainly, you want to ensure that speed doesn’t impact a user’s experience of a website. If it does, then speed optimization becomes a high priority.

Google provides feedback on page speed in various ways. Two of the most useful are:

  • Google Search Console: The Core Web Vitals report in Google Search Console measures performance against Google’s most important user-experience metrics. Because it groups similar URLs together, it gives you a sense of site-wide performance rather than single-page performance. Not every website has data in the Core Web Vitals report, though; you’ll have to look and see.
  • Google PageSpeed Insights: This tool is a great way to get the current speed and performance metrics for any single web page.
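
If you audit a lot of pages, it’s worth knowing that PageSpeed Insights also exposes the same data through a public API. A request is just a URL (the page address below is a placeholder), and the response is JSON containing the lab and field metrics you see in the web interface:

https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://yourdomain.com/&strategy=mobile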

Common audit recommendations include:

  • Improve page caching (see the example header below).
  • Use a CDN.
  • Switch to premium hosting.
  • Implement next-gen image formats like WebP.
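
To make the first recommendation more concrete: “improve page caching” usually means serving static assets (images, CSS, JavaScript) with long-lived cache headers. On many hosts, a well-cached asset returns a response header along these lines, which tells browsers they can reuse the file for up to a year (the exact value depends on the host and CMS):

Cache-Control: public, max-age=31536000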

It’s worth noting that many CMSs (like Squarespace & Wix) have very limited capabilities when it comes to improving speed. Sometimes, there isn’t much you can do.

Mobile Usability

As you know, the web is accessed by many different devices, from desktop computers to laptops to tablets and smartphones. Good mobile usability means that your site provides a positive user experience, regardless of the screen size.

The good news is that most websites already meet Google’s mobile usability standards, thanks to modern web design and the inherent mobile-friendliness (responsiveness) of most CMS platforms. If your site still isn’t mobile-friendly, you may have a problem.

Go to the Google Search Console and review the Mobile Usability report. Common things to address include:

  • Any text that is too small to read on mobile.
  • Images with text that can no longer be read on smaller screens.
  • Clickable elements that are too close together to tap accurately.
  • Content that is wider than the screen.
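
The last issue, content wider than the screen, is often a sign that a page or theme is missing the standard responsive viewport tag in its <head>. Most modern themes include it automatically:

<!-- tells browsers to scale the page to the width of the device -->
<meta name="viewport" content="width=device-width, initial-scale=1">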

Website Security

A secure website is a must! If an insecure website gets hacked, it will likely drop out of the search results. Start by confirming that the site has a properly installed SSL certificate and that it is served over HTTPS.

Use the Liquid Web SSL Checker to confirm the site meets this standard. If it doesn’t, upgrading the site to HTTPS should become your top priority.
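
What the upgrade involves depends on the host: many hosts and CMSs can force HTTPS with a single setting, while on an Apache server the same thing is typically done with a rewrite rule in the .htaccess file, roughly like this:

# force all HTTP requests to the HTTPS version of the same URL
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]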

Another place to look for compromised security is in Google Search Console. Look at the Security Issues report, where Google alerts you to any issues it’s uncovered while crawling the site. Google also provides documentation about addressing any issues it finds. Remember that Google does not like to send traffic to sites with security issues, so make sure to address these quickly.

Common audit recommendations include:

  • Switch from HTTP to HTTPS.
  • Address security issues such as malware or code injections.

Auditing Page Not Found Errors & Broken Links

Page not found errors create a poor user experience, as do broken links. They prevent people from finding the content they’re looking for, and they can keep Google from crawling and indexing the affected pages. As part of any technical SEO audit, these errors should be examined and recommendations for addressing them should be made.

Page Not Found Errors

Page Not Found errors (aka 404 errors) typically occur when a URL path breaks or gets changed without implementing a redirect.

Google reports Page Not Found errors in Google Search Console’s Coverage report. To find them, open the report, navigate to “Excluded,” and look for “Not Found (404).” Review the URLs in this list and document them so that they’re ready to be fixed with 301 redirects.
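
How you implement those 301 redirects depends on the site: most CMSs offer a redirect plugin or built-in tool, and on an Apache server a one-off redirect can be added to the .htaccess file. The paths below are placeholders:

# send a removed page to its closest live replacement
Redirect 301 /old-page/ https://yourdomain.com/new-page/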

Broken Links

Broken links are internal and external links located throughout the website that no longer work. As noted, these create a poor user experience and can prevent Google (and users) from reaching the pages they’re looking for.

Scan the site using a broken link checker (there are a number of free tools available) and note how many errors are found. Each broken link will need to be fixed by going through the site and repairing or redirecting the hyperlink.
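
If you want to spot-check an individual URL from the report, you can request just its status code from the command line: 200 means the link resolves, while 404 confirms it’s broken (the URL below is a placeholder):

# print only the HTTP status code returned for the linked URL
curl -s -o /dev/null -w "%{http_code}\n" https://yourdomain.com/linked-page/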

Common audit recommendations for both of the above include:

  • Fix 404 errors with redirects.
  • Repair or redirect broken links.

The Takeaway

By auditing the five key areas of technical SEO explained above, you make it easy for Google to crawl, index, and rank a website. While you can always dive deeper into technical SEO, doing so is often unnecessary. A few simple tweaks can go a long way.

We're Here to Help

We built Pathfinder SEO for agencies like yours who are looking to start and scale their SEO services offering. We’ll accelerate you through the process and help you avoid costly mistakes.

A subscription to the Pathfinder SEO platform includes access to all the resources you need to create your SEO services offering and effectively deliver the service. That includes templates, process documentation, and built-in SEO tools.

New Pathfinder SEO subscribers receive two free onboarding sessions with our SEO coaches.

Sign up for a subscription today.
