Technical SEO Audit Services

What is a Technical SEO Audit?

✅ The overall objective of a technical audit is to identify, diagnose, and fix SEO issues that cause common problems such as low search engine rankings, poor site speed, duplicate content, and website indexation problems.

A Technical SEO Audit helps you determine what specific technical factors prevent your site from ranking as well as it could in the search engines.

You can think of it as an overall SEO health check for your website.

The more technical issues found on your site, the lower the odds that your pages will rank well in Google and other search engines. Conversely, the fewer issues, the higher your site can rank.

 

Why Is a Technical SEO Audit Important?

To improve website rankings, you need a clear understanding of your site’s SEO health and a prioritized plan to improve it.

You can’t treat your SEO health issues until you know what specific items need to be fixed.

This is where a technical audit comes into play by identifying and prioritizing factors that need to be corrected to improve your site’s overall SEO health.

✅ A website can have the best content in the world, but if technical SEO issues prevent your site from being crawled and indexed in Google, your site will continue to underperform, resulting in lost traffic, leads, and sales.

The SEO Audit is your best tactic to find items you need to fix to increase Organic traffic, leads, and sales.

What Are the Key Elements Of a Technical SEO Audit?

✔️ Crawl & Index

For your website to show up and rank in the search engines, your site must be easily crawled by search engine bots (spiders). These bots scour the web, crawling web pages and the content they find. This information becomes part of the search engine’s index (a database of organized content).

Common Crawl and Index Issues:

• High-value website pages are not indexed in Google.
• Search engine spiders can’t find site pages because they can’t be crawled (e.g., orphan pages with no internal links pointing to them).
• Incorrect use of robots.txt or nofollow/noindex tags.
• 404-page/broken link errors and 500 server errors.
• Low-value, thin web pages are consuming crawl budget.
• An inaccurate XML Sitemap that is missing important pages that should be indexed.
• URL redirect issues.
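As a starting point, the minimal sketch below spot-checks two of these signals for a single URL: the HTTP status code and whether a noindex directive appears in the X-Robots-Tag header or the meta robots tag. It uses only Python’s standard library; the URL and user-agent string are placeholders, and the meta-tag check is a rough string match rather than a full HTML parse.

```python
from urllib.error import HTTPError
from urllib.request import Request, urlopen

def check_indexability(url):
    """Spot-check basic crawl/index signals for a single URL."""
    req = Request(url, headers={"User-Agent": "seo-audit-check/1.0"})
    try:
        resp = urlopen(req, timeout=10)
    except HTTPError as err:                  # 4xx/5xx responses land here
        return {"url": url, "status": err.code, "indexable": False}
    x_robots = resp.headers.get("X-Robots-Tag", "") or ""
    html = resp.read(200_000).decode("utf-8", errors="ignore").lower()
    meta_noindex = '<meta name="robots"' in html and "noindex" in html
    return {
        "url": url,
        "status": resp.status,                        # expect 200 for indexable pages
        "x_robots_noindex": "noindex" in x_robots.lower(),
        "meta_robots_noindex": meta_noindex,          # crude string check, not a full parse
    }

print(check_indexability("https://www.example.com/"))
```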

✔️ Website XML Sitemap

Every website should use an XML Sitemap to communicate to Google and other search engines which important pages should be crawled and indexed.

Having an accurate and up-to-date XML Sitemap is an integral part of technical SEO, as it helps search engine spiders crawl and find new site content.

Common XML Sitemap Issues:

• XML Sitemap is missing (doesn’t have one).
• Important URLs are not in the Sitemap.
• Sitemap contains non-indexable URLs (e.g., pages excluded by a noindex tag).
• Sitemap contains broken pages (404 error pages).
• XML sitemap is not formatted correctly.
• Sitemap contains more than 50,000 URLs (the limit for a single sitemap file).
• Sitemap contains non-canonical URLs.
• Sitemap contains low-value, thin content pages.
• XML Sitemap is not declared in the robots.txt file.
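A quick way to sanity-check a sitemap is to pull its &lt;loc&gt; entries and look at the URL count and protocols. The minimal Python sketch below assumes a single, uncompressed sitemap at a placeholder address; sitemap index files and gzipped sitemaps would need extra handling.

```python
import xml.etree.ElementTree as ET
from urllib.request import urlopen

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_url):
    """Return the list of <loc> URLs declared in an XML sitemap."""
    with urlopen(sitemap_url, timeout=10) as resp:
        tree = ET.parse(resp)
    return [loc.text.strip() for loc in tree.iter(f"{SITEMAP_NS}loc") if loc.text]

urls = sitemap_urls("https://www.example.com/sitemap.xml")
print(f"{len(urls)} URLs in the sitemap")             # flag files over the 50,000-URL limit
print([u for u in urls if u.startswith("http://")])   # non-secure URLs that likely don't belong
```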

✔️ Robots.txt File

The robots.txt file contains directives that tell Googlebot and other search engine crawlers which URLs can be accessed on your site.

The robots.txt file acts as the initial traffic controller for your website.

It tells the crawlers which pages and parts of your site can be crawled and which ones can’t.

Common Robots.txt Issues:

• Missing robots.txt file.
• Robots.txt file is blocking important pages (that it shouldn’t be).
• Wildcards are improperly used.
• Robots.txt file is not stored in the root directory.
• Use of incorrect syntax to disallow pages or directories.
• Forgetting to use “/” at the end of a directory path.
• Case-sensitivity issues (URL paths in robots.txt rules are case-sensitive).
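Python’s standard library includes a robots.txt parser, which makes it easy to confirm that your high-value URLs are not accidentally blocked. The sketch below uses placeholder URLs and checks the rules as they apply to Googlebot.

```python
from urllib.robotparser import RobotFileParser

# URLs you expect to be crawlable; replace with your own high-value pages.
important_urls = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/blog/technical-seo/",
]

rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt file

for url in important_urls:
    if not rp.can_fetch("Googlebot", url):
        print(f"Blocked for Googlebot: {url}")
```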

✔️ Response Codes

When a spider requests a page on your site, the web server returns a specific HTTP response code.

A 200 response code is returned in normal circumstances, which means the page is working correctly. In other instances, you may see 404 page errors or 301 redirects.

Response codes are a good indicator of the overall health of your site.

For example, a site with a significant number of 404 and 500 errors can run into crawling and indexing issues.

Common Response Code Issues:

• The site has many broken links/pages (404-errors – page not found).
• The site has 500 server errors.
• The site is not using 302 redirects (temporary redirects) correctly.
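At a small scale, you can find these issues by requesting each URL and recording the status code it returns. The sketch below uses HEAD requests from Python’s standard library against placeholder URLs; note that urlopen follows redirects, so a redirected URL reports the final destination’s code.

```python
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

def response_code(url):
    """Return the HTTP status code for a URL (0 if the request fails entirely)."""
    # Some servers mishandle HEAD requests; switch method to "GET" if results look odd.
    req = Request(url, method="HEAD", headers={"User-Agent": "seo-audit-check/1.0"})
    try:
        with urlopen(req, timeout=10) as resp:   # urlopen follows redirects automatically
            return resp.status
    except HTTPError as err:
        return err.code                          # 404, 500, etc.
    except URLError:
        return 0                                 # DNS failure, timeout, connection refused

for url in ["https://www.example.com/", "https://www.example.com/old-page/"]:
    code = response_code(url)
    if code != 200:
        print(f"{code}  {url}")
```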

✔️ Website HTTPS Security

HTTPS is a secure protocol for sending encrypted data to and from your internet browser.

Google has made a big push for websites to implement HTTPS protocols through SSL certificates. An SSL certificate is a digital certificate that enables an encrypted connection between the web server and the browser.

Common Website Security Issues:

• Mixed content (the site is serving both HTTP (non-secure) and HTTPS (secure) pages and content).
• Non-secure URLs don’t 301 redirect to secure URLs.
• Sitemaps weren’t resubmitted with secure URLs.
• The SSL certificate has expired.
• Secure pages lack self-referencing canonical tags.
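Two of these checks are easy to script: whether the non-secure version of the homepage ends up on HTTPS, and whether the secure page references any http:// resources (mixed content). The sketch below uses Python’s standard library and a placeholder domain; the regex is a rough heuristic, not a full mixed-content scanner.

```python
import re
from urllib.request import Request, urlopen

def check_https_setup(domain):
    """Check that HTTP redirects to HTTPS and scan the homepage for mixed content."""
    req = Request(f"http://{domain}/", headers={"User-Agent": "seo-audit-check/1.0"})
    with urlopen(req, timeout=10) as resp:               # urlopen follows redirects
        final_url = resp.geturl()
        html = resp.read().decode("utf-8", errors="ignore")
    print("Redirects to HTTPS:", final_url.startswith("https://"))

    # Mixed content: a secure page referencing non-secure (http://) resources.
    insecure_refs = re.findall(r'(?:src|href)=["\']http://[^"\']+', html)
    print(f"{len(insecure_refs)} http:// resource references found on {final_url}")

check_https_setup("www.example.com")
```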

✔️ Website URLs

The URL is the web address of your specific web page.

URLs play an important role in the Search Engine Optimization process.

URLs with keywords in them can help boost rankings. Additionally, descriptive URLs can help improve Organic Click-Through Rates (CTRs).

Common Website URL Issues:

• URL does not contain targeted keywords.
• URL contains too many keywords (keyword URL stuffing).
• Non-HTTPS (secure) URLs.
• URLs use session IDs and dynamic URLs.
• Non-descriptive or vague URL naming conventions.
• Both page versions resolve (with and without a trailing “/”).
• Missing Rel=”Canonical” tag.
• Non-standard usage of www and non-www URL addresses.
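A simple way to surface trailing-slash and www/non-www duplicates is to request the common variants of a URL and see whether they all return 200 instead of redirecting to one preferred version. The sketch below uses a placeholder page URL and Python’s standard library.

```python
from urllib.error import HTTPError
from urllib.request import Request, urlopen

def final_status(url):
    """Return (status code, final URL after redirects) for a single request."""
    req = Request(url, headers={"User-Agent": "seo-audit-check/1.0"})
    try:
        with urlopen(req, timeout=10) as resp:
            return resp.status, resp.geturl()
    except HTTPError as err:
        return err.code, url

page = "https://www.example.com/services"
variants = [page, page + "/", page.replace("://www.", "://")]

for variant in variants:
    code, final = final_status(variant)
    # If several variants all return 200 without redirecting to one preferred URL,
    # they are candidates for duplicate-content and canonicalization fixes.
    print(code, variant, "->", final)
```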

✔️ Website Architecture

Site architecture is a hierarchical structure of your site’s content. It provides a clear organization of topics and content for your site.

Website architecture is an important SEO component because it helps spiders crawl and index your entire site.

It is best practice to use a relatively “flat site architecture.” This means that search engine spiders can reach any site page in 3-4 clicks or fewer.

Common Website Architecture Issues:

• Pages are more than four clicks from the home page.
• Important web pages are buried deep in the site.
• Crawl budget is wasted, so high-value and important web pages don’t get indexed.
• Too many category-level pages are used.
• Limited or no usage of Topic Clusters.
• Too many page levels (directories) dilute page authorities.
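Click depth can be estimated with a small breadth-first crawl from the homepage that records how many clicks each internal page is from the start. The sketch below is a simplified, standard-library-only crawler with a placeholder start URL and a small page cap; it ignores robots.txt, JavaScript-rendered links, and nofollow, so treat the output as a rough indication only.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen

class LinkParser(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl_depths(start_url, max_pages=200):
    """Breadth-first crawl that records how many clicks each page is from the homepage."""
    site = urlparse(start_url).netloc
    depths, queue = {start_url: 0}, deque([start_url])
    while queue and len(depths) < max_pages:
        url = queue.popleft()
        try:
            req = Request(url, headers={"User-Agent": "seo-audit-check/1.0"})
            html = urlopen(req, timeout=10).read().decode("utf-8", errors="ignore")
        except Exception:
            continue                       # skip pages that error or time out
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            link = urljoin(url, href).split("#")[0]
            if urlparse(link).netloc == site and link not in depths:
                depths[link] = depths[url] + 1
                queue.append(link)
    return depths

deep = {u: d for u, d in crawl_depths("https://www.example.com/").items() if d > 3}
print(f"{len(deep)} pages are more than three clicks from the home page")
```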

✔️ JavaScript and SEO

JavaScript is a programming language that adds interactivity to web pages, such as menus, forms, dynamic content, graphics, and other page elements.

In recent years, Google and other search engines have gotten better at crawling sites that use JavaScript for navigation and other page features.

Common SEO and JavaScript Issues:

• Blocked JavaScript resources within your robots.txt file.
• JavaScript syntax errors.
• Several URLs for the same content result in duplicate content issues.
• Heavy use of JavaScript navigation over HTML links.
• Excessive JavaScript use causes slower page load times.
• Not minifying JavaScript.

✔️ Schema Markup

Google uses structured data to help understand page content. Schema markup also makes pages eligible for rich snippet SERP results, which can lead to higher Organic click-through rates.

A simple way to test whether your page supports rich results is this tool – the Google Rich Results Test.

Common Schema Markup Issues:

• Missing Schema.
• Schema parse errors.
• Schema validation errors.
• Schema validation warnings.
• Use of multiple schema formats (JSON-LD, RDFa, Microdata).
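For a quick look at what structured data a page actually exposes, you can extract its JSON-LD blocks and confirm they parse as valid JSON. The sketch below uses a placeholder URL and a simple regex; it surfaces parse errors but does not replace the Rich Results Test for full validation.

```python
import json
import re
from urllib.request import Request, urlopen

def jsonld_blocks(url):
    """Extract and parse the JSON-LD structured data blocks from a page."""
    req = Request(url, headers={"User-Agent": "seo-audit-check/1.0"})
    html = urlopen(req, timeout=10).read().decode("utf-8", errors="ignore")
    pattern = r'<script[^>]*type=["\']application/ld\+json["\'][^>]*>(.*?)</script>'
    blocks = []
    for raw in re.findall(pattern, html, re.DOTALL | re.IGNORECASE):
        try:
            blocks.append(json.loads(raw))
        except json.JSONDecodeError as err:
            print("Parse error in JSON-LD block:", err)   # surfaces broken markup
    return blocks

for block in jsonld_blocks("https://www.example.com/"):
    schema_type = block.get("@type") if isinstance(block, dict) else "(list of items)"
    print("Found schema:", schema_type)
```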

✔️ Canonicalization and Duplicate Content

Duplicate content can be a significant problem for websites, especially enterprise and eCommerce websites.

When your site has identical, near duplicate, or similar pages, this can have negative SEO impacts.

Duplicate content can cause indexation problems, crawl budget waste, and generate lower search engine rankings (lower organic traffic).

Common Canonicalization and Duplicate Content Issues:

• Homepage not canonicalized properly.
• Identical, copied content (not using the rel=canonical tag).
• Non-www and /index.html versions of the URLs not 301-redirected.
• HTTP and HTTPS versions.
• URLs don’t resolve to a single case.
• Trailing slashes create duplicate content.
• Parameter URLs creating duplicate content.
• Canonical tag not included on the site pages.
• Site doesn’t canonicalize print pages.
• Pagination issues.
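A basic canonical check is to fetch a page, read its rel="canonical" link tags, and compare them to the URL you requested. The sketch below uses Python’s standard-library HTML parser and a placeholder URL; a missing tag, multiple tags, or a canonical pointing elsewhere are all worth a closer look.

```python
from html.parser import HTMLParser
from urllib.request import Request, urlopen

class CanonicalParser(HTMLParser):
    """Collect the href of every <link rel="canonical"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonicals.append(a.get("href"))

def canonicals_of(url):
    req = Request(url, headers={"User-Agent": "seo-audit-check/1.0"})
    html = urlopen(req, timeout=10).read().decode("utf-8", errors="ignore")
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonicals

url = "https://www.example.com/products/widget/"
canonicals = canonicals_of(url)
if not canonicals:
    print("Missing canonical tag:", url)
elif canonicals != [url]:
    print("Canonical points elsewhere (or multiple tags found):", canonicals)
```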

✔️ Mobile SEO-Friendly Website

With nearly 60% of web searches done on mobile devices, it is critical that your site is mobile-friendly and provides a positive user experience.

A simple way to determine whether your site is mobile-friendly is to use Google’s Mobile-Friendly Test.

According to Google, it is best to use responsive web design, avoid interstitials, and have fast mobile page speeds.

To test mobile page speeds, use Google’s PageSpeed Insights Tool.

Common SEO Mobile Issues:

• Site design is not mobile-friendly (not using responsive web design).
• Blocked JavaScript, CSS, and image files.
• Mobile pages have full-page app interstitials.
• Images are not sized correctly for mobile screens.
• Images are too large (not compressed and slowing down page speed performance).
• Viewport is not set.
• Text is too small to read.
• Clickable elements are too close together.
• Content is wider than the screen.
• Missing content (if you are using desktop and mobile site versions).
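The viewport issue in particular is easy to detect automatically: pages built with responsive design declare a viewport meta tag in the page head. The sketch below does a simple string check against placeholder URLs using Python’s standard library.

```python
from urllib.request import Request, urlopen

def has_viewport(url):
    """Check for the responsive-design viewport meta tag (a common mobile issue)."""
    req = Request(url, headers={"User-Agent": "seo-audit-check/1.0"})
    html = urlopen(req, timeout=10).read().decode("utf-8", errors="ignore").lower()
    return 'name="viewport"' in html or "name='viewport'" in html

for page in ["https://www.example.com/", "https://www.example.com/contact/"]:
    if not has_viewport(page):
        print("Viewport meta tag not found:", page)
```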

✔️ Web Page Speed and Performance

Page speed (page load time) measures how quickly page content loads.

An excellent tool to measure page load performance is the PageSpeed Insights tool.

Common Web Page Speed and Performance Issues:

• Page load times of top landing pages are high.
• No usage of page caching.
• Excessive HTTP requests.
• Site uses inline CSS.
• Site uses inline JavaScript.
• Site doesn’t minify JavaScript or CSS.
• Large (non-compressed) images.
• No use of Content Delivery Network (CDN).
• Using too many WordPress Plugins.
• Use of a slow web hosting provider.
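Page speed checks can also be automated through the PageSpeed Insights API rather than the web UI. The sketch below queries the v5 endpoint for a placeholder URL and reads the Lighthouse performance score; the response field paths reflect my understanding of the current v5 format, and an API key is recommended for anything beyond occasional use.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def pagespeed_score(page_url, strategy="mobile"):
    """Fetch the Lighthouse performance score for a page from PageSpeed Insights."""
    query = urlencode({"url": page_url, "strategy": strategy})
    with urlopen(f"{PSI_ENDPOINT}?{query}", timeout=60) as resp:
        data = json.load(resp)
    # Score is 0.0-1.0; multiply by 100 to match the number shown in the PSI web UI.
    return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

print(pagespeed_score("https://www.example.com/"))
```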

 

Should You Work With An SEO Consultant Or Use An SEO Tool?

Many technical SEO audit tools are available to help you identify issues.

Some of the best technical audit tools include SEMRush, Ahrefs, Screaming Frog, Moz, Sitebulb, Deep Crawl, OnCrawl, Google Search Console, GTMetrix, WebPageTest.org, and others.

Botify, SEOClarity, BrightEdge, Conductor, Siteimprove, SEMRush and Searchmetrics provide excellent technical audits at an enterprise level.

However, using a tool by itself does not replace the need to work with an experienced and knowledgeable SEO Consultant.

There are many benefits to working with a professional Consultant to help you find and correct technical SEO issues.

Speed To SEO Improvements and Business Impacts.

Working with an SEO Consultant is the quickest way to improve your rankings, traffic, leads, and sales. By leveraging proven processes and workflows, a good Consultant can help you scale and improve your rankings much quicker than going at it alone.

Save Time and Reduce Frustrations.

Many businesses will try to use audit tools and self-diagnose SEO issues.

I often see companies trying to use old-school SEO tactics and get frustrated when they don’t see any measurable results.

Conducting in-house SEO will typically only get you so far.

SEO failures and lack of measurable progress can lead to limited buy-in and support from Executive Management and funding. These roadblocks further inhibit the speed at which your business can reap the full benefits of SEO.

✅ Scale SEO Results Quickly.

Leverage the SEO Consultant’s experience, knowledge, and best practices to diagnose technical issues accurately. A good Consultant knows which factors are the most important to focus on and correct. (This leads to quicker wins and further support of initiatives.)

Technical SEO Audit Tool Use, Adoption, and Training.

An experienced SEO expert will help you identify the correct technical tools to use for your site and your level of knowledge. In addition, technical SEO training can go a long way towards leveraging tool investments and providing business cases to get projects on the IT Roadmap.

Stay Up To Date On Latest SEO Trends and Changes.

Google and other search engines are constantly testing and changing their ranking algorithms.

Therefore, you need an SEO Expert and Consultant to help keep you updated on these changes and how they can positively or negatively impact your site rankings, traffic, leads, and sales.

If you are negatively hit with an algorithm change, the SEO Expert can help you pivot and correct issues to restore rankings and traffic.

How to Perform a Technical SEO Audit?

Performing a comprehensive technical SEO audit usually takes between 40-60 hours and examines over 150 site factors. (This is typical of Corey Wenger’s technical SEO services.)

Below, I will outline a “shortened version” of a Website Technical SEO Audit.

✔️ Step 1: Set up your SEO audit crawler tools to scan your website.

For smaller sites, I recommend using Screaming Frog Spider.

This audit tool does an excellent job of spidering your site and collecting valuable data points, from response codes to structured data.

Before running the Screaming Frog crawl, be sure to connect your Google Search Console and Google Analytics accounts.

To validate findings, I also recommend using a secondary crawler like Sitebulb or Oncrawl.

For large, enterprise-level sites, I recommend using Deep Crawl. This crawler is speedy and provides a wealth of information you can use to diagnose and fix common issues.

✔️ Step 2: Set up Technical SEO Audit scoring tools to scan your website.

My favorite tools in this category are SEMRush and Ahrefs.

Both tools complement one another and should always be used when conducting SEO audits.

Once your site is crawled, you will be presented with a list of technical SEO factors ordered by overall severity (errors, warnings, and notices) and site impact (the number of pages affected).

You will want to focus on the most severe errors that have the highest level of site impact (i.e., those that affect the most pages on your site).

✔️ Step 3: Analyze site crawlability and indexability.

Review your Google Search Console/Coverage Report. (Analyze crawl errors – your valid, valid with warnings, excluded, and error pages.)

How many valid pages are indexed? How does this compare to how many pages are submitted through your XML Sitemaps? Are all important web pages indexed? Which pages are missing from Google’s index?

Validate your findings with Screaming Frog Response Codes (200 OK, 301 redirect, 404 page errors, 500 server errors, etc.).

Identify how many pages are indexed using the following search operator – site:domain.com or site:www.domain.com.

Now, compare the difference between the index and your submitted pages within Google Search Console.

The most common issues here are over-indexation (your site has more pages indexed in Google than the actual site size) or under-indexation (your site has fewer pages in Google than the site’s actual size.)

✔️ Step 4: Review your XML Sitemaps and robots.txt file.

Within Google Search Console, go to Index/Sitemaps to review current sitemaps submitted to Google.

When was the last time the Sitemap was submitted? Are sitemaps missing important pages? Do sitemaps include pages that shouldn’t be indexed? (PPC pages, low-value pages, thank you pages, etc.)

Next, examine your robots.txt file.

Are there any important pages accidentally being blocked? Are there any pages or directories that should be blocked but are not?

Review Screaming Frog Spider Response Codes for pages blocked by the robots.txt file.

✔️ Step 5: Examine site security and the use of HTTPS.

Using Screaming Frog, identify any non-secure HTTP pages and site elements. Then, cross-check URLs with SEMRush’s HTTPS report.

Identify pages and elements that need to be corrected and use HTTPS protocols.

Are there any non-secure (HTTP) pages included in your XML Sitemaps? Do all 301 redirects point to secure (HTTPS) URLs?

✔️ Step 6: Website architecture and URLs.

Ideally, all site content should be within three clicks from the home page.

Generally, the more clicks it takes to reach a page within the site, the less importance the search engines give to that page.

Review the Screaming Frog URL report to identify problem URLs.

Review the Screaming Frog Canonicals report to detect canonicalization issues. For duplicate or near-duplicate pages, use the rel=canonical link attribute to point to the preferred page you want the search engines to index.

✔️ Step 7: Improve website page load speeds and performance.

Page speed is measured on a page-by-page basis and mainly impacts site conversion rates.

Faster page load times help prevent site abandonment (lower bounce rates) and increase visitor engagement (time on site and pages viewed per visitor).

Analyze Google Search Console/Page Experience, Mobile Experience, and Core Web Vitals reports.

Core Web Vitals tests the speed, responsiveness, and stability of the page loading experience for users.

Use Google PageSpeed Insights to identify specific elements causing slower page load times. Common site speed issues include large image files, no caching policies, unminified CSS and JavaScript, and other factors.

Prioritize your fixes by their impact on site speed and overall performance.

✔️ Step 8: Mobile-friendly SEO website design.

As of 2021, nearly 60% of all web searches are done on mobile devices.

Having fast mobile page loads and an excellent user experience is vital to SEO success.

To test your website’s mobile-friendliness, use Google’s Mobile-Friendly testing tool.

Be sure not to block the crawling of any page assets (CSS, JavaScript, and images) for any Googlebot using robots.txt or other methods.

Fully accessing these external files will help search engine algorithms detect your site’s responsive web design configuration and treat it appropriately.

To help identify mobile issues, review your Google Analytics mobile data. For example, high bounce rates can strongly indicate mobile design issues.

Examine your site for common mobile usability issues like slow page load times, viewport not set, text too small to read, content wider than the screen, or clickable elements too close together.

✔️ Step 9: Schema markup validation.

Structured data is code you can add to your web pages that is visible to search engine crawlers.

This helps crawlers understand the context of your site content. (It is a way to describe your data to search engines in a language they can understand.)

Structured Data can help you enhance the presentation of your listings (and increase organic click-through rates) in the search engine result pages (SERPs) either through featured snippets or knowledge graph entries.

Use the Screaming Frog Structured Data report to find errors. Validate the Screaming Frog findings using SEMRush’s Markup report.

✔️ Step 10: Compile all technical SEO issues and prioritize SEO fixes and enhancements.

Once you have run scans using various Technical Website Audit tools, it’s time to consolidate your findings in a master site audit findings document.

This is usually an Excel or Google Sheet that ranks SEO issues by overall impact/business impacts and level of effort needed to correct issues.

Identify resources required to fix site problems and align internal and external teams.

Provide a clear definition of the problem, where it can be found, and the solution.
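However you structure the spreadsheet, the underlying idea is a simple impact-versus-effort ranking. The sketch below shows one hypothetical way to score and sort findings in Python; the issue records, severity scale, and scoring formula are illustrative only.

```python
# Hypothetical issue records pulled from your crawl and audit tools.
issues = [
    {"issue": "404 errors on top landing pages", "severity": 3, "pages_affected": 120,  "effort": 1},
    {"issue": "Missing canonical tags",          "severity": 2, "pages_affected": 800,  "effort": 2},
    {"issue": "Images not compressed",           "severity": 1, "pages_affected": 2500, "effort": 2},
]

def priority(issue):
    # Simple score: impact (severity x pages affected) divided by the effort to fix.
    return issue["severity"] * issue["pages_affected"] / issue["effort"]

for item in sorted(issues, key=priority, reverse=True):
    print(f"{priority(item):>8.0f}  {item['issue']}")
```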

In many instances, Project Management now follows Agile SEO, where large projects are broken down into short Sprints of work.


Each Sprint involves prioritized tasks assigned to each team member and typically lasts 1-2 weeks. At the end of each Sprint, the project manager reviews and approves the tasks, and the completed work is then pushed live on the website.

✔️ Step 11: Follow strict SEO deployment and validation processes.

You’ll push content and site changes live at the end of each Sprint.

This requires careful coordination with your IT, digital marketing, or web development departments.

For an SEO implementation, you go through pre-launch and post-launch validations to ensure everything is working correctly and to maintain site integrity.

✔️ Step 12: Re-run technical SEO audits to validate fixes and enhancements.

SEO optimization is an ongoing process.

Once you have implemented website changes, you must carefully validate that those changes had the intended impacts (improved performance and rankings.)

Re-crawl your site using Screaming Frog, Sitebulb, Deep Crawl, SEMRush, Ahrefs, and other tools to quantify site improvements.

Compare the results against previous site crawls.

You should be seeing the total number of issues trending downward (as more items are fixed each month).

You should also see your SEMRush and Ahrefs site health scores increase.

Be sure to annotate fixes in Google Analytics and review organic traffic metrics (users, sessions, bounce rates, conversions, etc.) before and after site fixes and enhancements are deployed.

What SEO Tools Are Used For a Technical SEO Audit?

To perform a Technical site audit, I recommend using the following SEO audit tools.

Crawlers: Screaming Frog Spider, Sitebulb, Deep Crawl, OnCrawl.

Audit Tools: SEMRush, Ahrefs, SEO Site Checkup, SERanking, and others.

Google Tools: Google Search Console (GSC), Google Analytics, Mobile-Friendly Tool, Robots.txt Tester, Lighthouse, Rich Results Test, and other tools.

Page Speed Tools: GSC Core Web Vitals Report, Google PageSpeed Insights, GTMetrix, WebPageTest.org, Pingdom, etc.

Other Tools: BuiltWith, Wappalyzer, and XML Sitemap Validator.

 

How Often Should You Perform Technical Website Crawls For SEO?

It is vital to run regularly scheduled website audits to determine whether technical issues have been corrected (or not).

It is recommended to run SEO audits every 2-3 weeks to validate that site fixes and enhancements have been implemented correctly and to determine how site rankings have been impacted.

In some instances (when the technical SEO issue corrections don’t need to wait for the pages to be re-crawled and indexed), you can run the site audit post-deployment.

 

Technical SEO Checklist

✅ Here are the top 12 technical SEO audit elements to focus on for your website audits.

1. Crawl & Index
2. XML Sitemaps
3. Robots.txt
4. Response Codes
5. Website Security (HTTPS)
6. Website URLs
7. Website Architecture
8. JavaScript
9. Schema Markup
10. Canonicalization & Duplicate Content
11. Mobile Friendly
12. Site Speed and Performance

If you are looking for help to:

→ Increase Sales
→ Boost online sales leads
→ Increase website traffic
→ Develop effective SEO strategies
→ Avoid negatively affecting your SEO from launching a new website
→ Improve your local online presence
→ Increase online brand reach

…You’ve come to the right place.

Let’s discuss your project.

SEO Trusted Advisor & Consultant