SEO Audit: How To Perform an SEO Audit
Ahh, the SEO audit: that interesting document that outlines everything wrong with a website with respect to SEO. It's a very important step in the process of optimizing a website, as it outlines many of the technical and content limitations and issues that need to be addressed.
Combine the SEO audit with the client's goals in terms of visibility, rankings, and overall performance, and you will be able to put together a well-thought-out plan and proposal on how to get there.
I know a lot of people struggle with both executing an SEO audit and delivering an SEO audit. On top of that, figuring out where it fits into your service offering can also be a challenge – is it something you do for free to convert clients, or is it part of the on-page SEO process once a client has already signed on?
Regardless of the business decisions you make surrounding the SEO audit, you'll want to make sure that the analysis you perform and the audit that you actually prepare are of top quality. This post will give you a high-level overview of what should be included in your audit and which areas you should focus on.
The performance section of the SEO audit should provide a snapshot of where a website currently stands in the organic search landscape. It's possible to take this a step further and benchmark it against industry or competitor averages, but I don't feel this is necessary.
For some of this section you will need access to the client's Google Analytics, and you may also need access to some of the popular SEO tools.
Current Organic Traffic
Ideally you will look at total organic traffic and compare it month over month and year over year, as well as a percentage of total traffic. I understand that this depends on access to the client's Google Analytics, and if you are doing it as a free audit you may not want to include this at all. If you can't get GA access, you could always look at SEMRush data and comment on any trends in the traffic.
As you can see from the image above, SEMRush provides decent organic search traffic data. While it might not be exact (it's definitely not), it's usually good at picking up trends and issues caused by penalties or technical problems.
Landing Page & Conversion Data
If you can, look at and comment on the conversion data and landing page data for organic traffic. Does all the organic traffic go to the homepage (like most small business websites), or is it well diversified throughout the content? Are they tracking performance at all in terms of conversions? Looking at this can spark great client conversations down the road.
Based on some quick keyword research (just preliminary stuff – don't spend a ton of time here), check to see where the website appears in their search results page of choice. I would mix in some higher-volume, higher-competition keywords as well as some long-tail keywords. I like to include the keyword difficulty percentage from Moz to give the client an understanding that some keywords are much more difficult than others, and that their plan and budget need to be considered accordingly.
Current SEO Metrics
It’s good to report on some of the more popular metrics, which typically include:
- Moz’s Domain Authority
- Moz’s Page Authority
- Majestic’s Trust Flow
- Majestic’s Citation Flow
- Do-follow and No-follow root linking domains
Be sure to explain what each of these means and give some context around what it means specifically for their situation and their plan. It's also worthwhile to compare these metrics to their top competitors' (if you know them), as that gives a bit more value than the pure metrics alone.
Link Profile Analysis
A good link profile analysis will answer some important questions:
- How many total links does the website have, and what percent of them are no-follow vs. do-follow?
- What anchor text distribution and ratios does the website have?
- What type of links are pointing to the website?
- Are there any high-risk links that should be disavowed?
- What types of links would benefit this overall link profile?
I like to use Majestic's anchor text tool to take a deeper look at ratios and the overall link profile. Ahrefs also provides great link data.
The technical audit looks at any technical limitations that might be hindering the website's organic search visibility. While these are still overlooked from time to time, the technical aspects of SEO are getting more attention alongside good on-page SEO practices. Let's review some of the things you should look at:
Head on over to yoursite.com/robots.txt to view the robots file. You'll want to look for anything unusual that may be disallowed that shouldn't be. A standard WordPress robots.txt file, for example, is quite minimal.
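For reference, a typical default WordPress robots.txt looks like this (exact rules can vary by version and plugin setup):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

Anything beyond a couple of simple rules like these – especially a `Disallow: /` or blocked content directories – is worth flagging in the audit.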
Does the website have a sitemap and is that sitemap submitted to Google Search Console?
You can view an example XML sitemap here: https://seobrothers.co/sitemap_index.xml.
Usually, if a sitemap is auto-generated and in place that’s all we’re looking for. However, it’s possible that it’s incorrectly formatted (especially if manually generated) and there is also a chance that the priorities are specified incorrectly (which is a bit more advanced but something to consider for larger websites).
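When checking formatting, it helps to know what a well-formed entry looks like. A minimal XML sitemap follows this shape (the URLs and values below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2017-01-15</lastmod>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/services/seo-audits/</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```

Manually generated sitemaps often break on missing namespaces, unescaped characters in URLs, or every page set to the same priority.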
Indexed Pages & Issues
This overlaps somewhat with the duplicate content section, but, for example, sometimes a website may have 100 pages in the sitemap submitted to Google while only 30 are indexed. While not necessarily a huge issue, this is certainly a potential problem.
Some causes of this may be particular post types or sections of the website set to noindex but included in the sitemap, or duplicate content across large sections of the website that keeps it out of Google's primary index. I've worked with a client that had multiple brands selling the same products, and all of their websites had copy-and-paste content. This caused big issues with pages getting indexed.
Page speed is important. Not only is it an increasingly important ranking factor, it's a hugely important conversion factor: a commonly cited stat is roughly a 7% loss in conversion rate for every second of delay in page load time.
Google wants to return results that provide the best user experience for the searcher and while much of that experience is the relevance to the search term, how quickly a page loads is also a big part of the user experience.
A great way to benchmark a website’s speed and performance and to get recommendations on how to improve it is via Google’s Page Speed Insights Tool. SEO consultants should be intimately familiar with this tool and how to act on its recommendations.
Pingdom has a great tool for measuring website load times as well. While different from Google's tool, it is also very useful for determining overall page load speed.
404 error pages can lead to a poor user experience. Depending on the reason for the broken links or pages, they may also be costing the site domain and page authority through lost link value. While access to Google Search Console would be the best way to identify 404 errors that Google has found, it's not always possible to get this sort of access if you are completing an audit for free.
In this case I would recommend using a tool like Screaming Frog to crawl the entire website and identify any broken links on the site itself. While this doesn't catch broken inbound links, it does pick up broken internal links and pages.
You can sort by status code in Screaming Frog, in both the in-link and out-link views, to see if there are any broken links and trouble URLs. This will help identify missing content or broken links that you can manage with proper redirects.
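If you export the crawl to CSV, this step is also easy to script. A minimal sketch in Python, assuming the export has "Address" and "Status Code" columns (typical of a Screaming Frog export, but check your own file's headers):

```python
import csv
import io

def broken_urls(crawl_csv, url_col="Address", status_col="Status Code"):
    """Return (url, status) pairs from a crawl export with a 4xx/5xx status.

    The column names are assumptions based on a typical Screaming Frog
    export; adjust them to match your tool's output.
    """
    reader = csv.DictReader(io.StringIO(crawl_csv))
    bad = []
    for row in reader:
        try:
            status = int(row[status_col])
        except (KeyError, ValueError):
            continue  # skip rows with a missing or non-numeric status
        if status >= 400:
            bad.append((row[url_col], status))
    return bad

# Tiny inline example standing in for a real export:
sample = """Address,Status Code
https://example.com/,200
https://example.com/old-page,404
https://example.com/contact,200
https://example.com/broken,500
"""
print(broken_urls(sample))
# [('https://example.com/old-page', 404), ('https://example.com/broken', 500)]
```

The resulting list is exactly what you'd hand off as a redirect to-do list.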
URL structure and internal linking are among the most important on-page factors, and most run-of-the-mill SEOs completely miss them. Setting up a website so that it is properly themed and siloed is a great way to add a level of relevance for both users and search engines.
We discussed this in the on-page SEO post; in short, a good structure groups related content under topical directories, while a bad structure publishes everything flat with no organization.
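To make that concrete, here is a hypothetical example (the domain and paths are made up):

```
Good (siloed by topic):
  example.com/services/seo/
  example.com/services/seo/technical-audits/
  example.com/services/seo/link-building/

Bad (flat and unorganized):
  example.com/page-id-218/
  example.com/seo-services-2/
  example.com/our-link-building-page/
```

In the good version, the directory itself tells both users and search engines what a page is about and how it relates to its siblings.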
I’ll create a post explaining the reasoning behind siloing and how you can silo both physically using the above directory structure and virtually, using similar URLs and internal links.
So what are we looking for in terms of URL structure in an SEO audit? Primarily, organization of content. Are pages and posts just published any old way, or is there a system and organization to the content? Usually, the more thought that went into the structure ahead of time, the better the website will perform.
It's always easier to set up a website well in the first place than to restructure it later, but restructuring is possible. You'll just need to properly manage all the page redirects, similar to how you would in a website redesign or migration.
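As a sketch of what that redirect management looks like on an Apache server, the old URLs get permanent (301) redirects to their new homes in an .htaccess file (the paths here are hypothetical):

```
# 301 redirects mapping old flat URLs to the new siloed structure
Redirect 301 /seo-services-2/ https://example.com/services/seo/
Redirect 301 /our-link-building-page/ https://example.com/services/seo/link-building/
```

Nginx and most CMSs have equivalent mechanisms; the key point is that every restructured URL gets a one-to-one permanent redirect so link value is preserved.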
Does the website have internal links within content? Does it use some sort of breadcrumb system, at least? Are the silos or sections of the website linking within themselves properly?
These are questions you should be able to answer when you look at a website's internal links. I like to do this part manually, but Screaming Frog will also help sort things out.
Mobile Experience & Issues
Run the site through Google's mobile-friendly testing tool. If it passes, the site probably isn't at risk of any mobilegeddon-style algorithm update penalties. If you want to take it a step further you can look at the user experience within the mobile environment, but that is getting a little outside the scope of an SEO audit.
The content audit is what many people think about when they reference on-page SEO. It's the page titles, the meta data, and the actual copy and content that is on the website. While the technical aspects are very important – and in some cases can cause the whole website to disappear from the search engines – the content aspects of the website are usually where you'll see the biggest wins and returns on your optimization investment.
Moz has a great crawl tool that will pull a lot of the data necessary to examine in this section. I love starting with both a Moz crawl and a Screaming Frog crawl and sitting down with the data to analyze the following sections.
Are the page titles not optimized, under-optimized, or over-optimized? Is the website taking advantage of the full length that a page title can be but not going over? Are they wasting space with additional branding or templates at the end of their titles?
For example, the page title of this post is:
The SEO Audit: How To Create Amazing SEO Audits
Assuming our head keyword is SEO Audit and/or SEO Audits, you’ll see how we incorporated the keyword within the page title without overdoing it and while making it sound as natural as possible. A default and/or template-based page title may look like this:
SEO Audit | Brand Name
This follows the “Page Name | Brand Name” template that is the default for WordPress (and many CMSs out there).
The purpose of this section of the SEO audit isn't to fix all the page title issues; it's merely to identify potential problems (e.g. duplicate page titles) and opportunities (can we optimize them better?).
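Flagging these problems at scale is simple to script. A rough sketch, assuming character-count limits as a proxy (Google actually truncates titles by pixel width, so the exact numbers are judgment calls, not official thresholds):

```python
# Rough character limits; Google truncates by pixel width,
# so these are only approximations of what fits in a SERP.
MIN_TITLE_CHARS = 30
MAX_TITLE_CHARS = 60

def audit_title(title):
    """Classify a page title as 'missing', 'too short', 'too long', or 'ok'."""
    title = (title or "").strip()
    if not title:
        return "missing"
    if len(title) < MIN_TITLE_CHARS:
        return "too short"  # likely wasting available SERP space
    if len(title) > MAX_TITLE_CHARS:
        return "too long"  # likely to be truncated in results
    return "ok"

print(audit_title("The SEO Audit: How To Create Amazing SEO Audits"))  # ok
print(audit_title("SEO Audit | Brand Name"))  # too short
```

Run every title from your crawl export through a check like this and you have an instant list of pages to revisit.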
Meta keywords are dead. Stop using them. Now that that's out of the way, let's turn our attention to the meta description.
Your crawl data will be able to tell you whether they exist on the site, if they are duplicated and if they have opportunity to be well optimized.
While the meta description isn't a direct ranking signal, it does impact rankings in other ways. It directly affects click-through rates from the SERPs (which are a ranking signal), and matching the meta description to the body copy is a great way to reduce bounce rate as well.
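Duplicated descriptions are easy to surface from crawl data by grouping pages on the description text. A minimal sketch, assuming the crawl has been reduced to a simple URL-to-description mapping (a made-up data shape; adapt it to your export):

```python
from collections import defaultdict

def duplicate_descriptions(pages):
    """Given {url: meta_description}, return descriptions shared by 2+ URLs."""
    groups = defaultdict(list)
    for url, desc in pages.items():
        # Normalize so trivial whitespace/case differences still count as dupes
        groups[(desc or "").strip().lower()].append(url)
    return {desc: urls for desc, urls in groups.items() if len(urls) > 1}

pages = {
    "/": "We audit websites for SEO issues.",
    "/services": "We audit websites for SEO issues.",
    "/about": "Meet the team behind our SEO audits.",
}
print(duplicate_descriptions(pages))
# {'we audit websites for seo issues.': ['/', '/services']}
```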
Some general rules for header tags:
- Only one H1 tag per page.
- Follow the H1, H2, H3 hierarchy when sectioning off content.
- Include keyword variations in header tags when possible.
- H1 tags should be unique across the website.
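As an illustration of those rules, a page's heading outline might look like this (the headings themselves are made up):

```html
<h1>SEO Audit Services</h1>          <!-- one H1, carrying the head keyword -->
  <h2>Technical SEO Audit</h2>       <!-- major section -->
    <h3>Crawl &amp; Indexation</h3>
    <h3>Page Speed</h3>
  <h2>Content Audit</h2>             <!-- next major section -->
    <h3>Page Titles &amp; Meta Descriptions</h3>
```

Note how the levels nest logically: no jumping from H1 straight to H3, and keyword variations appear naturally in the subheadings.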
You'll also want to identify any issues and opportunities within the SEO audit. It's about more than presenting the data – you'll need to comment on what should be done and why.
Some duplicate content isn't really a problem, and if it's handled well (follow, noindex) it shouldn't be a problem at all, but it's still important to identify the threats of duplicate content. Moz's crawl will specifically identify duplicated page titles, meta descriptions, or body copy, while with Screaming Frog you'll need to do a bit of digging and comparison yourself.
Running the site through copyscape.com is a way to identify duplicates of the website's content across the web.
Depending on the scope of the audit (and whether it's free vs. paid) you may also want to make specific recommendations on how to solve the identified issues.
Thin content, like duplicate content, is another potential threat. Some “tag” pages on smaller blogs are good examples of this. They may have a snippet or excerpt of one post. On a small scale, this isn’t really an issue. But on a large scale it certainly could be.
It’s also good to look at images, videos, podcasts, etc. to make sure they are properly marked-up and optimized. Are the images using proper title and alt tags? Is there enough rich media across the website in general?
I like to score the site on a scale for this, as it's time-consuming to identify every image or piece of rich media that is not optimized – unless, of course, it's a paid audit and you have the time to really dig deep.
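Even for a scored pass, spot-checking a few templates for missing alt text is quick to automate. A small sketch using Python's standard-library HTML parser (the sample markup is invented):

```python
from html.parser import HTMLParser

class ImageAltAuditor(HTMLParser):
    """Collect <img> tags whose alt attribute is missing or empty."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        # Treat absent alt and alt="" the same for audit purposes
        if not (attrs.get("alt") or "").strip():
            self.missing_alt.append(attrs.get("src", "(no src)"))

page = """
<img src="/logo.png" alt="Acme Widgets logo">
<img src="/hero.jpg">
<img src="/spacer.gif" alt="">
"""
auditor = ImageAltAuditor()
auditor.feed(page)
print(auditor.missing_alt)  # ['/hero.jpg', '/spacer.gif']
```

Point it at a handful of representative templates rather than every page and you get a fair sample for scoring.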
Other Body Copy Notes
Any other comments about the copy of the website should be put into here, if applicable. Word count notes, formatting, relevancy, etc. are all things you could comment on.
So What Next?
Auditing a website is one thing but what you do with the audit is more important. An SEO Audit is used for many different reasons. If you’re a freelancer or web professional an SEO Audit is a great way to prospect and warm up potential clients. It’s much easier to charge for SEO when you show them exactly what it is you’re going to fix.
That said, auditing can take some time – anywhere from an hour to a day depending on the scope of the audit and the experience of the auditor. Some agencies and firms will want to charge for an audit – and rightfully so – as there is a lot of work that goes into them, and they are extremely valuable in educating the client on what needs to be fixed in order to improve search visibility.
Regardless of whether the audit is "free" in the sense that it's used for prospecting, or it's paid for, you also need to decide how you're going to deliver it – which can also be time-consuming.
Sending over your in-depth audit document alone may sound like a good idea, but regardless of how non-technical you tried to make it, it may still be a little confusing for a business owner or marketing professional. Understanding this, it's best to either sit down with them and walk them through it, present it, or a combination of both.
I'll write more about delivering audits in another post and explain what has worked for me in the past and what processes I've seen work for others.