Start 2021 off right with this quick local SEO website audit that Nick Pierno – Whitespark’s Director of Custom Projects – presented at our 2020 Local Search Summit. Nick runs through the importance of how assessing the SEO viability of a website can help surface potential problem areas and shares his audit checklist to get you set up for success.
A Note on SEO Scan Tools
There are some good SEO scan tools out there (Woorank or SEOptimer, for example) that I usually run sites through during my audit process to catch anything I missed, or just to get a programmatic look at things. I’d say these scan tools work better in tandem with a manual audit, rather than as a replacement for one.
Gauging a Website’s Overall Local SEO Health
You can obviously spend a dozen hours or longer auditing every little detail of a website. And I’m sure there are even businesses out there doing that and doing fantastic work. On the other hand, someone with some SEO experience can get a pretty good read on things just putzing around a site for 10 minutes. I think this quick audit sort of sits between both extremes.
This provides some formality and structure to the process without becoming a huge, expensive project unto itself. The audit scope is really flexible, too: you can painstakingly dig into every item on the list and roll it up into a formal document, which may take about five hours. Or you can rip through it with a quick glance at each area, marking the checkboxes, and end up with a pretty good summary (maybe for internal use or something) in about 30 minutes.
Who is this audit for?
- Agencies & marketers
- Website and business owners
- Anyone whose friends and family ask them for SEO/website recommendations
This audit is for everyone. SEOs might use this when they’re kicking off a project and making an action plan, while developers might use it before launching a site to make sure that it’s SEO ready. Agencies or marketers might use it to evaluate prospects during their sales process. Website or business owners who’ve had some SEO or development work done may want to use it to check things over or just get a general sense of where they’re at. I personally use it pretty much any time I need to get a read on a local business’s SEO setup. I’ll spend a few hours on it when I have a client who wants it packaged up as a deliverable. Or I’ll do a 30-minute version when a friend or family member asks me to take a look at their site, which happens kind of often. So I think you can use it in pretty much any scenario.
The Audit Checklist
This audit process uses a checklist. The full version includes things like GMB, citations, and reviews, in addition to about 30 website-specific areas that it covers.
👉 Grab the full checklist and an example resource including a list of tools that will help you through the process.
To use it, just run through the list of areas to check. Each area has a priority so that once you’re done auditing, you know where to start. Feel free to tweak those priorities if you don’t agree with mine. Then there’s a status drop-down with three choices: all good, heads up, or danger. Those are pretty self-explanatory. Finally, there’s a notes column to jot down what you noticed for future reference.
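If you’d rather track the checklist programmatically than in a spreadsheet, the priority and three-status structure described above maps naturally onto a small data structure. This is a minimal sketch, not part of the official template; the example rows and helper names are illustrative.

```python
# Minimal sketch of the checklist structure: each row has an area,
# a priority, one of three statuses, and free-form notes.
# Example data and function names are illustrative, not from the template.
STATUSES = ("all good", "heads up", "danger")

def make_item(area, priority, status="all good", notes=""):
    """Build one checklist row; status must be one of STATUSES."""
    if status not in STATUSES:
        raise ValueError(f"unknown status: {status!r}")
    return {"area": area, "priority": priority, "status": status, "notes": notes}

def action_items(checklist):
    """Return the 'heads up' and 'danger' rows, highest priority first."""
    flagged = [row for row in checklist if row["status"] != "all good"]
    return sorted(flagged, key=lambda row: row["priority"])

checklist = [
    make_item("Current rankings", 1, "heads up", "Page 2 for flagship terms"),
    make_item("HTTPS / URL variants", 1, "danger", "No HTTPS; www variant breaks"),
    make_item("Mobile friendliness", 2),
]
```

At the end of an audit, `action_items(checklist)` gives you the prioritized to-do list to hand over.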
Now, we don’t have time to cover every item on the checklist today; we don’t even have time to cover all the website items. So we’ll just go over a few of the most important areas.
Top Audit Priority Items
The areas we’re going to focus on today:
- Current rankings
- Google indexing
- HTTPS and URL variants
- Mobile friendliness
- Title tags
- Content
- Backlinks
You could theoretically just use this handful of items to give a website a decent quick once-over. I’ve randomly selected the website of a plumber in Austin, Texas, to use for some examples. For whatever reason, I always use plumbers for my SEO examples. I completed a copy of the checklist for this site as a point of reference for when you go back to use the template.
#1. Check the site’s current search presence
This will give you a general sense of where the site’s at. If they’re ranking number one, you might not find a lot to comment on. If they’re nowhere to be found, maybe you’ll find some simple things that will make a huge difference. How do we do this? Pretty straightforward. We’ll search their flagship or most important phrases on Google.
You’ll want to do this from an incognito or private browsing session just to prevent your search history from affecting the results. I’ll start by looking at the homepage title tag. If anyone’s even remotely attempted to optimize the site, there will be some target terms in there. If the target terms seem dumb, that’s okay. We’re more interested right now in seeing if they rank for what they’ve targeted.
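If you’re checking more than a couple of sites, pulling the homepage title out of the raw HTML can be scripted. Here’s a minimal sketch using only the Python standard library; `extract_title` is a hypothetical helper, and in practice “view source” in your browser works just as well.

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collect the text inside the first <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        # Only capture the first <title>; ignore any later ones.
        if tag == "title" and not self.title:
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def extract_title(html):
    """Return the page's <title> text, stripped of surrounding whitespace."""
    parser = TitleParser()
    parser.feed(html)
    return parser.title.strip()
```

Feed it the homepage HTML and you get back the exact string to mine for target phrases.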
Here I pulled out plumbing company from their homepage title, and I added the geo modifier Austin, TX to make sure I’m searching their region, not mine. If they’re not ranking for those flagship terms, you can also try searching for a longer string from the title tag. Like here, I’ve got Plumbing Company | Plumbing Services: Austin, TX, which is their exact homepage title without the company name. This will help determine if they can rank for a longer, more specific search even though they weren’t ranking for the shorter, more competitive terms (possibly because they’re outmatched by their competition). You might also search for phrases you think they should be targeting or ranking for. I searched plumbing, Austin, TX, (probably the most competitive term in their space) and they were on page five.
I should also mention that there are all sorts of rank tracking tools out there, shoutout to Whitespark’s Local Rank Tracker. But I think for this quick and dirty check that the manual search does the trick.
Whenever you’re searching, you’re really just looking to get a sense of whether they’re killing it or not killing it at all, or whether something is holding them back. When I search plumbing company Austin, TX, ATX is ranking at the top of the local pack and they’re on page two of the organic results. To me, this indicates that the site is viable and doesn’t have any big deal-breakers or impediments. In this case, maybe some content improvements or link building are all it really needs.
Search the company name
It’s also worth searching for the company name. Sometimes you’ll need to throw in a geo modifier like the city if it’s not a super unique name. You just want to see if they have a knowledge panel, i.e. they own the top ranking spot. Maybe you want to see if their citations or their other profiles occupy the rest of the page. Ideally, they’ll own as much of this page as possible.
As I mentioned before, ATX had a top local pack ranking, they were on page two for a decent phrase, and on the branded SERP they have the knowledge panel, so branded search looks pretty good. There are no major problems with the site showing up, but there’s also a lot of room to improve in the rankings department. So I marked this as a “heads up”, and we’ll press on to see how they could work towards better rankings.
#2. Is the site in Google’s index?
Why check this? Well, because being in Google’s index is kind of the whole point of this.
Do a Google search for site:yourdomain.com. In this case, we’re searching for site:atxplumbingcompany.com. This will show you all the pages Google has indexed for that domain. You want to make sure that all the important pages are there. If the pages are there, then nothing is overtly blocking them. If they aren’t, check the site’s robots.txt (yourdomain.com/robots.txt). Note: if there isn’t a robots.txt, then it’s not the culprit, though it’s a good idea to create one (most CMSs generate one).
Looking at the file, you just want to see if any important pages are disallowed, or if the whole site is disallowed. The latter happens fairly often with WordPress, which has a feature to easily hide a site while it’s being developed. Sometimes it gets forgotten when the site launches, and the entire site gets ignored by Google until someone figures it out.
You can prevent this by unchecking “Discourage search engines from indexing this site” under the Reading settings in the WordPress dashboard. You can also check the HTML <head> section of your source code for a noindex tag, which acts in much the same way, just on a per-page basis.
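Both of these blockers can be checked programmatically. Here’s a rough sketch using Python’s standard library: `is_blocked_by_robots` parses a robots.txt body, and `has_noindex` does a crude regex check for a robots meta tag (it assumes the common name-before-content attribute order, so it’s a heuristic, not a full parser).

```python
import re
from urllib.robotparser import RobotFileParser

def is_blocked_by_robots(robots_txt, url, user_agent="Googlebot"):
    """Parse a robots.txt body and report whether `url` is disallowed."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch(user_agent, url)

# Heuristic: assumes the usual name-before-content attribute order.
NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE)

def has_noindex(html):
    """Rough check for a noindex robots meta tag in the page source."""
    return bool(NOINDEX_RE.search(html))
```

Fetch the two files yourself (yourdomain.com/robots.txt and the page HTML) and pass the bodies in; if either function fires for an important page, you’ve likely found your indexing culprit.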
These two indexing blockers are the most common offenders. But if nothing is blocking search engines from a page and it’s still not showing up, you can try requesting indexing through Search Console (you can just Google how to do that). Beyond that, you may need a deeper technical dive, which might mean hiring a professional or doing some more research. ATX’s pages are all present in Google’s index, so there’s nothing to report here. I give them an “all good”.
#3. Check for HTTPS secure connection and URL variants
Why worry about HTTPS? Google formally recommends it and it’s generally considered to be a ranking signal (albeit probably a minor one). More importantly, browsers make non-HTTPS sites look suspicious to site visitors with scary icons in the address bar and other warnings, which might make people nervous. It’s super easy to check, because all the popular browsers let you know right in the address bar. So you probably already know after the first time you visit the site whether or not it’s using HTTPS.
In order to use HTTPS, you need an SSL certificate. In the past, these were expensive and annoying to set up but you can get one for free from an organization called Let’s Encrypt, and a lot of popular hosts are offering them now with very easy setup that only takes a few clicks. That being said, you might want to consult a professional or do some really thorough research before migrating an established HTTP site. It can be a bit of a process and it does come with some risks. So just be careful.
Some sites might use HTTPS while some of their content is not secure. You can also see this in the address bar (see below). It’s usually an image or resource on the page that’s still using HTTP, left over from migration.
You can easily troubleshoot that kind of insecure content with your browser’s developer tools, or with online tools like Why No Padlock, and then get those resources fixed up.
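If you’d rather script the mixed-content check, a crude pass over the page HTML for http:// resources catches the common cases. This is an illustrative heuristic only, not a substitute for the browser’s security panel.

```python
import re

# Heuristic: find http:// URLs referenced via src/href attributes.
INSECURE_RE = re.compile(r'(?:src|href)=["\'](http://[^"\']+)["\']', re.IGNORECASE)

def find_insecure_resources(html):
    """List resources still referenced over plain HTTP on a page."""
    return INSECURE_RE.findall(html)
```

Run it against an HTTPS page’s source; anything it returns is a candidate for the insecure-content warning.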
Check the common URL variants (HTTP, HTTPS, www, non-www). I check these manually by entering them into my address bar. You want them all to end up at the same place. That’s not the same as them all resolving to a page: each variant could serve its own copy of the page, creating duplicates. You want each one to redirect to a single canonical variant.
If you enter “www.atxplumbingcompany.com”, it should spit you back out at “atxplumbingcompany.com”, or vice versa, depending on which version you want to use. You’ll see in the image below that ATX’s site breaks when I try to use www as a subdomain, which is probably the worst-case scenario. They need a redirect for this.
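The four-variant check is easy to automate, too. In this sketch, `resolve` is any callable that follows redirects and returns the final URL (for example, a wrapper around requests.get), injected so the logic itself needs no network; the domain and function names are illustrative.

```python
VARIANTS = ("http://{d}/", "http://www.{d}/", "https://{d}/", "https://www.{d}/")

def variant_report(domain, resolve):
    """Map each common URL variant to its final destination.

    `resolve` is any callable that follows redirects and returns the
    final URL, or None if the request fails entirely.
    """
    return {v.format(d=domain): resolve(v.format(d=domain)) for v in VARIANTS}

def is_canonicalized(report):
    """True when every variant lands on one and the same final URL."""
    finals = set(report.values())
    return len(finals) == 1 and None not in finals
```

A failing variant (like ATX’s broken www subdomain) shows up as a None or a divergent final URL in the report.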
So ATX’s site doesn’t use HTTPS, and the www version doesn’t redirect anywhere. So that’s fairly bad. I’d consider this a “danger” item and something they should address ASAP.
#4. How does the site behave on mobile devices?
This is important because most sites today are indexed mobile-first by Google, which means your site is evaluated primarily on what the search engine sees in a mobile viewport. Mobile usage is also massive: any given local business site is probably getting 50 to 70% of its traffic from mobile users. Ultimately, if the mobile experience of a site is really bad, or it’s just a shrunken desktop site that people have to struggle to use, that can hurt a website’s performance.
There are a few easy ways to check this:
- You can plug the site into Google’s mobile friendly test. It’ll spit out a simple “yes” or “no” answer and give you a few extra details.
- You can also just resize your browser window to its narrowest width, or use your browser’s developer tools to emulate a device (this gets a little narrower, closer to what a phone looks like).
- You can view the site on your mobile phone.
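One crude programmatic proxy: responsive pages almost always declare a viewport meta tag, so its absence is a red flag. This quick check is a heuristic only, and no substitute for Google’s mobile-friendly test or an actual visual inspection.

```python
import re

# Heuristic: responsive pages declare <meta name="viewport" ...>.
VIEWPORT_RE = re.compile(r'<meta[^>]+name=["\']viewport["\']', re.IGNORECASE)

def has_viewport_meta(html):
    """True if the page source declares a viewport meta tag."""
    return bool(VIEWPORT_RE.search(html))
```

If this returns False for a site’s homepage, it’s very likely serving a desktop-only layout to phones.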
If Google says the page is mobile-friendly, and your visual inspection shows everything is readable and functional, you’re good to go. I give ATX an “all good” on this.
#5. Title tags
Title tags can be a pretty big ranking factor for any site. But I think the right title tag is basically mission critical for smaller websites trying to get local rankings (must read: how to build relevance in local search). They’re also one of the easiest tasks with a dramatic impact, especially if you back them up with your content. Title tags also generally determine how your pages look in the SERPs, when they’re shared on social media and instant messaging platforms, etc. Optimizing title tags is a task with what I’d call a good easiness-to-impact ratio.
First, I want to see if the title tag has been updated at all, even if it just says “home” or the company name. If it is optimized to whatever degree, the next thing to look for is whether it’s targeting the right phrases. If you have some SEO experience or you’re familiar with the industry you’re looking at, you can probably venture a pretty solid guess using your big brain. For less obvious situations than, say, a plumber, you can use Google Ads Keyword Planner, which is great for brainstorming keyword variations. It lets you filter by region right down to the city level, which is really handy, because you can get region- and city-specific keyword volumes. It’s geared towards Google Ads, though, so the keywords it gives you may have a transactional bias. I think that’s a good thing for local search most of the time, but it’s worth keeping in mind. Also note that you’ll need a Google Ads account to use it. I take the search volume and competition numbers it generates with a grain of salt; they tend to be most meaningful relative to each other.
Other tools you can use include Moz’s Keyword Explorer, Google Search Console queries report, Google Analytics landing pages report and dozens and dozens of other keyword tools.
The highest-volume keywords aren’t always the best targets, even for a homepage or a service page. Sometimes you can get better results by finding lower-volume, lower-competition phrases, because you can actually start getting traffic from them.
To measure the competition, you can quickly audit a few sites in the search results for a given phrase and get an idea of how hard they’ll be to beat.
Another thing to look for is super long or spammy-seeming title tags. Even though long title tags aren’t necessarily harmful, and there are strategic uses for them, they can be a sign that someone is being spammy or inexperienced, or even that the title tags are being auto-generated by the CMS.
Make title tags stand out a little by being more friendly, unique, and descriptive compared to all the other blue links in the SERP. You still want to target your keywords, but you can add a bit of warmth and charm to your snippets. I think this approach can lead to better click-through rates and generally a better brand experience.
I think ATX’s home and service page titles could use a bit more love, but there’s nothing particularly wrong with them. I give them an “all good” and make a note that it’s worth revisiting the title tags.
#6. Content
Content is a huge ranking factor, and a conversion and brand experience factor too. Content is hard to prescribe, though, and is more of an art than an exact science. Trying to define good content for SEO often devolves into word counts, keyword density, and repetition in your headings and copy. More recently, the conversation has turned to things like Latent Semantic Indexing (LSI): using synonyms, different ways of saying things, and so on.
But for me, especially as far as doing a quick audit goes, I think it really just boils down to having a good chunk of decent quality text that directly supports the target concept for a given page. In other words, I think you should make it substantial and relevant.
You want your content to support the target keywords or concepts for the page it’s on. You want it to be detailed and informative, and to try to answer all the questions an average user might have about the service or product the page is talking about. It certainly doesn’t hurt to include the phrases and keywords that you’re targeting, and their synonyms, within the content.
Be careful to avoid cramming anything in just for SEO purposes. Write for the customer. Write about the topic at hand. By doing this, you almost always include the right phrases just by default, and the more you write, the more likely you’ll hit on those LSI terms and phrases too.
It’s also nice to try and link to your other pages as well as relevant external pages whenever it makes sense. Internal links are definitely valuable. I think the external ones can help a page look more useful and natural, among other things.
I think a hefty amount of content (assuming it hits the right notes) helps a page rank. I was eye-rolling at minimum word counts earlier, but I don’t know any other way to quantify it, so I’d say shoot for something like 800 words on an important page. That’s just a shot in the dark. The idea is to give search engines a nice big chunk of meat to chew on. You can distribute that between headings, paragraphs, blurbs, lists, and other formats to avoid huge walls of text, which aren’t user-friendly or pretty.
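If you want to sanity-check that rough 800-word target across a batch of pages, counting the visible words (ignoring scripts and styles) is easy to script. Here’s a sketch with Python’s standard-library HTML parser; `word_count` is a hypothetical helper, and the number it returns is approximate by nature.

```python
from html.parser import HTMLParser

class TextCounter(HTMLParser):
    """Count words in visible text, ignoring script/style contents."""
    SKIP = {"script", "style", "noscript"}

    def __init__(self):
        super().__init__()
        self.skip_depth = 0
        self.words = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        # Only count text that isn't inside a skipped element.
        if not self.skip_depth:
            self.words += len(data.split())

def word_count(html):
    counter = TextCounter()
    counter.feed(html)
    return counter.words
```

Note this counts everything visible, including navigation and footer text, so treat the result as a ceiling rather than an exact measure of body copy.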
For the bonus round, I really like including dynamic content feeds on the homepage whenever it’s possible. This could be your latest blog posts, product inventory, recent jobs you’ve completed, and so on. As long as the content is semi-regularly updated, it means you have a semi-regularly updated homepage too. Often, home pages get neglected and are static for a long time. Dynamic content feeds show Google that the site and its content are actively maintained and cared for which I think is a good thing.
So ATX’s content is decent and well-written (even if it reads a bit SEO-heavy for my taste), but the homepage falls a bit below that 800-word mark, and the way it’s laid out doesn’t really give me an informational vibe; it’s all very salesy. Nothing terribly wrong with it, I’d just suggest they expand on it. The service pages I spot-checked are acceptable too, but there’s definitely not enough material on them (a couple hundred words or so).
#7. Backlinks
Why should we look at links? Assuming relevance is established by the title tag and the page content, and assuming there’s no other major technical impediment, I’d say backlinks are the single most powerful driver of rankings in the universe.
Use a backlink checker tool to evaluate this.
Depending on where the site I’m auditing sits in the rankings, I’ll choose a couple of sites to compare its backlink profile against. If my target site is on page two, which it is, then I might look at the number one ranking site and the bottom-ranking site on the first page of results. That basically tells me the threshold for breaking onto the front page and the threshold for unseating the top spot.
Plug those domains into your backlink checker tool and it will give you some sort of summary metric, such as page authority and domain authority for Moz, or URL rank and domain rank for Ahrefs. (Side note: I think the page level metrics usually matter more than the domain metrics for a page’s ranking ability.) You also might want to compare the number of backlinks and the number of referring domains. A lot of the time, tons of links will come from a single domain because it’s in a sidebar, or footer, etc. 5,000 links from the same site is obviously a lot different than a link from 5,000 different sites.
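The links-versus-referring-domains distinction is easy to compute yourself if your backlink tool exports a raw URL list. Here’s a small sketch (the URLs are made up) that collapses backlinks down to unique linking domains.

```python
from collections import Counter
from urllib.parse import urlsplit

def referring_domains(backlink_urls):
    """Collapse a raw backlink URL list into per-domain link counts."""
    hosts = []
    for url in backlink_urls:
        host = urlsplit(url).hostname or ""
        hosts.append(host.removeprefix("www."))  # treat www/non-www as one
    return Counter(hosts)
```

If one domain dominates the counter (a sitewide sidebar or footer link, say), the headline “total backlinks” number is telling you much less than it seems.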
It’s also worth looking at the individual sites that are linking. Those summary metrics try to factor in quality, trust, relevance, volume, etc. into their algorithms, but none of them are completely reliable. It might be worth taking a look for yourself. You should be asking, “Are a large portion of these links coming from directories?” This happens a lot with local businesses. Do you see a lot of low-quality sites or sites that aren’t relevant to the target site or lots of sites that are in a different language? Sometimes these can be signs of some sort of shady link building that they did in the past.
Just because a link tool scores one site higher than another doesn’t necessarily mean it will rank better. Google has way more sophisticated algorithms and way more data to work with. And of course, there are other factors besides backlinks that determine rankings, too.
In ATX’s case, their competitor Daniels Austin outranks another competitor, Radiant Plumbing, on a search for plumbing company. Even though the latter has more links and a higher score, Daniels Austin actually uses the exact phrase plumbing company in their homepage copy and Radiant Plumbing doesn’t. So, in spite of a slightly better link situation (according to Ahrefs, anyway), a page’s content relevance can still win out. When there’s a huge gap, though, the results are usually a lot more predictable, which explains why ATX is quite a ways behind these other two sites in the results: their backlink metrics are considerably lower. ATX has a very low UR and DR due to only having 28 referring domains, which are all either local directories like Yellow Pages or link-building junk like Austinplumbingcompany.blogspot.com. That’s generally pretty harmless, but they need to outweigh it with some quality links to get their profile working in their favor. I would indicate to them that this is a fairly urgent area to work on; it’s probably the biggest thing holding them back right now.
Communicating the Audit Results
When I’m done running through the checklist, I’ll have a list of “heads up” and “danger” items that I’ll communicate to the business owner or project stakeholders. In a more formal setting, I’ll actually turn the results of this audit into a document with a section for each area and recommendations for how to fix it.
What’s covered in the audit checklist?
The audit checklist covers 90% or more of the things that could be preventing or helping a website to rank. The full checklist includes some off-site stuff like reviews and GMB, which also factor into ranking. Even if you just rip through the whole checklist quickly, you’ll find what’s working and what’s not.