How to Do a Technical SEO Audit
Despite all your hard work on content creation, on-page SEO, and off-page SEO, your website still isn’t ranking. Frustrating, right? Let me tell you about the next step that can help you rank your website: a technical SEO audit. After spending years in the SEO industry, I can confidently say that technical SEO isn’t optional. It is a necessity. In this article, I walk you through a technical SEO audit in simple steps. If you have a business in Manchester, this guide will be particularly helpful for you. So, are you ready to unlock the true potential of your website?
What is a Technical SEO Audit?
A technical SEO audit is a detailed analysis of your website’s backend. It helps you find issues that prevent search engines like Google from effectively crawling, indexing, and ranking your website. As a technical SEO expert, your job is to make things easier for search engines: confirm there are no crawling or indexing issues, check that no broken links exist on your website, and make sure page speed is up to the mark. Learn how to do a full website SEO Audit.
Why is technical SEO important?
If you’ve heard of technical SEO audits and the question “Why is it important?” never crosses your mind, you’re probably already an expert. If not, here is why it matters.
1. Improve Crawlability and Indexation.
When you publish a new website or page, search engines like Google use bots (crawlers or spiders) to crawl and index your web pages. If a page isn’t easily crawlable for any reason, search engines won’t index it, and pages that are not indexed cannot rank. A technical SEO audit helps you diagnose exactly why search engines are not crawling and indexing your web pages.
2. Boost User Experience.
A technically strong website does not just make crawlers happy; it makes users happy, too. Technical aspects like page speed, mobile-friendliness, and security (HTTPS) contribute to a safe, fast browsing experience. A technical SEO audit lets you check whether your page speed is good, your site is mobile-friendly, and the connection is secure.
3. Increase SEO Performance.
If your website is full of technical SEO issues, search engines like Google won’t be able to understand your content easily. And if search engines can’t effectively crawl and understand your web pages, those pages won’t rank, no matter how strong your on-page and off-page SEO is. A technical SEO audit addresses this: make sure there are no broken links, deleted pages, incorrect redirects, or JavaScript rendering problems.
Best Tools for a Technical SEO Audit.
To perform an in-depth SEO audit, you need the right tools in your arsenal. You can use the following.
1. Google Search Console.
It helps you monitor your website’s performance and the top pages that bring traffic. You can also spot pages that are indexed but not getting any traffic, and monitor technical SEO issues across your site. This tool is a must for technical SEO audits.
2. Screaming Frog.
It is one of the best tools that we use as a local SEO agency in Manchester. It provides a detailed analysis of a website, and with that, it is much easier to dig deep and analyze what kind of technical issues are bothering your site.
3. Ahrefs.
If you don’t want to handle the audit manually, you can use Ahrefs. All you need to do is crawl your site with it. It will highlight all the issues on your site, whether technical or on-page.
4. SEMrush.
If you don’t have Ahrefs and still want an automated audit, you can use SEMrush. It offers almost the same features as Ahrefs. To start the audit, enter your website URL and wait for the crawl to complete. It will then show you a list of all the issues your website has.
Apart from these, other tools are also available, such as Sitebulb and Majestic. We also use browser extensions such as Lighthouse and the Detailed SEO Extension.
How to perform a technical SEO audit?
Before you perform a technical SEO audit, know that it can be easy, complex, or genuinely challenging. If you have a small business website, the audit will be easier. If you have a large website, such as an e-commerce store with hundreds of pages, it can be complex. And if you’re auditing a large multilingual website, it can be challenging.
Whatever the site, the process I’m sharing below will help you understand how to conduct the audit.
To start the audit, first analyze the…
1. Crawlability and indexing issues.
When it comes to search engine optimization, crawlability is one of the most fundamental aspects. It is the process that allows search engine bots, from Google or Bing, to discover and access your website’s content. If your pages can’t be crawled, they won’t be indexed in the search engine, and unindexed pages have no chance of ranking.
Crawlability can be blocked by issues like these:
- You have accidentally blocked crawlers in robots.txt.
- You have accidentally set a noindex meta robots tag.
- Your website has crawl budget issues.
- Your website has excessive redirects.
- Your site has 404 Not Found errors.
- There might be a misuse of canonical tags.
- Your website might have an HTTP status code error.
- There might be a misconfigured security setting.
- You might have set incomplete or incorrect tags.
These are the issues that can limit the crawlability of your website. To check for them, run a crawler over your site: put your URL into Screaming Frog and wait for the crawl to complete.
Meanwhile, you can start by checking the robots.txt file. To view it, just add /robots.txt to the end of your website URL (https://example.com/robots.txt), and it will show the robots.txt file.
You can also check it with the Detailed SEO Extension. Click on it, and at the bottom you will see a link named robots.txt. Click that, and it will take you to the robots.txt page, where you can confirm whether you have blocked any search engine crawler. If you have, you will see a Disallow rule for that crawler in the file.
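For example, a robots.txt file that blocks all crawlers from the entire site looks like this (replace `*` with a specific bot name such as `Googlebot` to block only that crawler):

```text
User-agent: *
Disallow: /
```

If you see a rule like this and you did not intend it, removing the `Disallow: /` line restores crawl access.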
Meta Robots tags.
Make sure you have not set a noindex meta robots tag on any page you want indexed. To check the meta robots tag, you can use the Detailed SEO Extension: open any page of your website, click on the extension, and you will see whether the tag is set.
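For reference, a noindex meta robots tag sits in the page’s `<head>` and looks like this; if you find it on a page you want indexed, remove it:

```html
<meta name="robots" content="noindex, follow">
```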
Crawl Budget
A crawl budget is the amount of time and resources a search engine allocates to crawling your site. If your website is full of errors such as 404s, redirect chains, or pages that no longer provide value to users, crawlers waste that allocated budget on them and may never reach your important pages. That is why you have to keep your site clean of such errors, so search engines can crawl it effectively. To find them, crawl your site with Screaming Frog and check for issues like 404 Not Found errors, 301 redirects, or redirect chains.
To increase crawl efficiency, make sure your website doesn’t have excessive redirects. An intentional redirect is fine; if a redirect isn’t intentional, investigate why it is there. To check, crawl your site with Screaming Frog and review any redirects it reports.
404 Not Found errors are problematic for users and search engines alike, and they can destroy the experience for both. When search engine crawlers keep hitting pages that no longer exist, it signals that the site isn’t providing a good user experience. To find these errors in Screaming Frog, click on Response Codes, then on 4xx, and it will show you all of them.
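If you want to spot-check a handful of URLs without a full crawler, you can script the response-code check yourself. A minimal Python sketch (the commented URLs are hypothetical; note that `urlopen` follows redirects, so a 301 chain reports the final destination’s status):

```python
import urllib.request
import urllib.error


def check_status(url: str, timeout: int = 10) -> int:
    """Return the HTTP status code for a URL.

    urlopen follows redirects automatically, so a redirected URL
    reports the status of its final destination.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as exc:
        return exc.code  # 4xx/5xx raise HTTPError; the code is what we want


def classify(status: int) -> str:
    """Bucket a status code the way a crawler report groups response codes."""
    if 200 <= status < 300:
        return "ok"
    if 300 <= status < 400:
        return "redirect"
    if 400 <= status < 500:
        return "client error"  # 404 Not Found lands here
    return "server error"


# Hypothetical usage:
# for url in ["https://example.com/", "https://example.com/old-page"]:
#     print(url, classify(check_status(url)))
```

This is only a sanity check for small lists; a dedicated crawler remains the right tool for a full site.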
Misused canonical tags are something a webmaster cannot spot easily. Here is the problem they create: suppose you have a page with great content and powerful links pointing to it, but its canonical tag points to a different page. Google will then give priority to that second page. To find this issue, check with Screaming Frog; there is a dedicated Canonicals section in the overview bar.
If you can’t find it there, you can also check in Google Search Console. Open Search Console, click on Pages, and look for “Alternate page with proper canonical tag” or “Crawled – currently not indexed.”
You can also check canonical tags with the Detailed SEO Extension. Open a page of your site and click on the extension; it will show you all the details for that page, including whether the canonical is missing or pointing to the wrong URL.
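For reference, the canonical tag lives in the page’s `<head>`. On most pages it should point to the page itself (a self-referencing canonical); the URL below is a hypothetical example:

```html
<link rel="canonical" href="https://example.com/services/">
```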
Whenever you visit a site, check that it has a secure connection (HTTPS); search engines like Google give preference to secure sites. To check the connection, open your site and click the icon near the domain name in the address bar. Google Search Console will also show whether any non-secure pages are present: go to Search Console, click on HTTPS, and you will see the HTTPS URLs and non-HTTPS URLs.
Once you check and fix all of these errors, your website’s crawlability and indexability will improve. If you still see indexing issues after that, look at content quality. Ask yourself: is your page’s content valuable to the user, and can it solve the user’s problem? If not, why should a search engine index it? And if you want to write on a topic that the search results already cover in depth, ask what you can present more effectively, or what you can add in terms of value.
Website Navigation and Architecture issues.
Imagine a physical clothing store where everything is a mess. Do you think a customer is going to buy anything, or will they leave? Of course they will leave, because nothing in the store is organized. The same goes for a website. Your website should have a hierarchical structure, with pages and blog articles organized and interconnected via internal linking, so that search engine crawlers can easily crawl your website.
If your site doesn’t have a logical hierarchical structure and its pages and blog articles are a mess, search engine crawlers won’t be able to crawl it effectively. Pages that aren’t crawled don’t get indexed, and not indexed means no rankings, no traffic, no sales.
To check your website structure, you can use Screaming Frog; if it is not available, you can use Ahrefs or SEMrush. In Screaming Frog, go to the Overview and then Site Structure; at the bottom you will find the crawl depth report. Crawl depth is how many clicks a user needs to reach a particular page from your website menu. Ideally, every page should be no more than three clicks away.
To dig deeper into the site structure, click Visualisations in Screaming Frog’s menu, then Crawl Tree Graph, and it will draw your website’s whole structure along with all its pages. There you can analyze and understand how your pages are interconnected. Since pages are connected through internal linking, also check the internal links and confirm they are all fine, with no broken links on any page.
You also need to check that users can easily navigate your website: the pages and categories in the menu and footer should make navigation easy. The goal is a seamless experience that doesn’t frustrate visitors. To improve navigation further and make the user experience even better, you can add breadcrumbs to your site.
With that, make sure your URL structure is clean. Keep it simple; you can also add directory paths to it. The right structure depends on your site’s size and type. A small local business website can use a simple, flat URL structure, while an e-commerce site with hundreds of pages and categories needs a logical, hierarchical one. When creating a URL for a page, keep it short, include your target keyword, and separate words with hyphens.
XML Sitemap issues.
An XML sitemap is a file that lists your website’s pages. Submit it in Google Search Console so that search engines can discover the pages in it. When it comes to sitemaps, keep a few things in mind to optimize yours:
Include only pages that are important and can bring you traffic or sales; leave unimportant pages out.
Only list legitimate URLs with a 200 OK status code. Don’t include 404 pages, 301-redirected pages, or non-canonical pages. A URL in the sitemap is effectively on your priority list, so every entry should earn its place.
A single sitemap must not contain more than 50,000 URLs or exceed 50 MB. If you have more than fifty thousand pages, split them across multiple sitemaps and submit each one in Search Console.
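When a large site is split across multiple sitemaps, a sitemap index file ties them together. A minimal example with hypothetical URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/sitemap-pages.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-posts.xml</loc></sitemap>
</sitemapindex>
```

You submit the index file once, and search engines discover the individual sitemaps through it.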
How do you diagnose errors in the sitemap?
Open Screaming Frog and change the mode from Spider to List. Click Upload and you will see several options; I normally select Download XML Sitemap and enter the sitemap URL. It will then download all the URLs from the sitemap and crawl them. Once the crawl is complete, check whether the sitemap contains 404s, 301 redirects, non-canonical pages, or unimportant pages. If you find any, clean them out and keep only URLs that return a 200 OK status code.
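If you’d rather script the first step, Python’s standard library can pull every URL out of a sitemap; you can then feed that list into whatever status check you use. A minimal sketch (the sample sitemap is hypothetical):

```python
import xml.etree.ElementTree as ET

# XML namespace defined by the sitemap protocol.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"


def sitemap_urls(xml_text: str) -> list[str]:
    """Extract every <loc> URL from an XML sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(NS + "loc")]


# Hypothetical sample sitemap for demonstration.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/services/</loc></url>
</urlset>"""

urls = sitemap_urls(sample)
```

Each extracted URL can then be fetched to confirm it returns 200 OK before it earns its place in the sitemap.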
Page Speed and Core Web Vitals.
To audit page speed properly, you also need to understand Core Web Vitals, a set of metrics introduced by Google to quantify essential aspects of user experience. These metrics, Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS), measure loading performance, interactivity, and visual stability, respectively. (Note that Google replaced FID with Interaction to Next Paint, INP, in March 2024.)
How to audit page speed and Core Web Vitals.
To check page speed, visit https://pagespeed.web.dev/, input your URL, and it will show you the speed score for both mobile and desktop, along with the Core Web Vitals metrics. Scroll down and it will also explain why your site is slow to load. Google Search Console shows Core Web Vitals too: you can see exactly which URLs are good, which need improvement, and which are poor.
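Google’s published “good” thresholds are LCP ≤ 2.5 s, FID ≤ 100 ms, and CLS ≤ 0.1. Once you have the measured values from a report, a small Python sketch can rate them against those thresholds (the metric values below are hypothetical):

```python
# Google's published "good" thresholds for the Core Web Vitals.
THRESHOLDS = {
    "LCP": 2.5,  # Largest Contentful Paint, seconds
    "FID": 100,  # First Input Delay, milliseconds
    "CLS": 0.1,  # Cumulative Layout Shift, unitless score
}


def rate_vitals(metrics: dict[str, float]) -> dict[str, str]:
    """Label each measured metric 'good' or 'needs work' against the thresholds."""
    return {
        name: ("good" if value <= THRESHOLDS[name] else "needs work")
        for name, value in metrics.items()
        if name in THRESHOLDS
    }


# Hypothetical field data for one page:
report = rate_vitals({"LCP": 1.9, "FID": 180, "CLS": 0.05})
```

This mirrors the good / needs improvement / poor buckets you see in Search Console, simplified to two levels for the sketch.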
If your website isn’t optimized for mobile devices, you’re missing a lot of traffic and revenue. In 2016, Google announced mobile-first indexing, and in 2023, it was fully implemented for all the websites. Mobile-first indexing means Google uses the mobile version of your site for ranking and indexing.
To check whether your site is mobile-friendly, you can use any mobile-friendly testing tool; search Google and you will find several. We normally check on real mobile devices to confirm any issues the website has. You can also check in the browser: open Inspect and use the device toolbar toggle, where you can select different devices and dimensions to see how your website looks on various screens.
If you have SEMrush, run a website SEO audit with it, click on Issues, and select Mobile SEO. If your site has any mobile errors, it will list them.
Check Schema Markup issues.
Schema markup is a type of structured data that helps search engines understand your website content better. It also helps to show your content in rich snippets, such as star ratings for reviews, event details, or product information. There are different types of schema markup available that you can implement on your site.
- Article schema.
- Local Business schema
- Organization schema.
- FAQs schema
- Product schema
- Recipe schema
Schema can be written in three main formats: JSON-LD, Microdata, and RDFa. However, Google recommends using JSON-LD. If you have implemented schema markup on your website and want to check which types are present, open https://validator.schema.org/ and enter your website URL. It will show you all the schema types your website has, along with the code and any errors.
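For reference, a minimal JSON-LD Article schema goes inside a `<script>` tag in the page’s `<head>` or `<body>`; all the values below are hypothetical placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Do a Technical SEO Audit",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2024-01-15"
}
</script>
```

Pasting a snippet like this into the validator above is also a quick way to check your markup before publishing it.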
Perform a Technical SEO Audit.
Now you have learned how to perform a technical SEO audit for your site. By following the steps outlined above, you can identify and fix the key issues that may be holding it back.
You can use different tools for the audit, such as Screaming Frog, Ahrefs, and SEMrush. If you need further help with the audit or any information regarding the website, you can contact our Leads Oriented SEO Agency in Manchester.