An SEO audit is an integral part of your website’s success. It's also often one of the first steps (together with keyword research) that SEO specialists take when working on new web projects.
In short, it’s a review of your website’s technical and content elements. An SEO audit focuses on identifying issues that block organic visibility and suggesting fixes for them.
There are numerous factors and issues to consider when carrying out an SEO audit. Below, we outline the most common elements and challenges so you can learn how to conduct a technical SEO audit yourself.
Initial website analysis
Find out the basic website information, check whether search engines index the site, and identify elements that can increase organic traffic.
Check indexation issues
Site indexing is one of the most crucial SEO elements: search engines can’t show your content in search results if your site isn’t indexed. Indexing issues therefore have a major impact on your traffic, and if your site isn’t indexed at all, you won’t get any organic traffic.
There are several ways to find out whether your site is indexed. The easiest is to check the indexed pages using the site: search operator. This command provides an approximate list of your indexed pages that the search engine has found. To test it, simply type the following operator into Google’s search box:
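site:yourdomain.com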
Ideally, you want to focus on the number of search results first.
Indexation issues can be spotted quite quickly by looking at the number of indexed pages: if it’s lower (or way higher) than you expected (compare it to approximately how many products, categories, or articles you have), or if the number of indexed pages varies between search engines (e.g. Google, Bing, DuckDuckGo,...).
To find out whether specific pages are indexed or not, you can also use the URL inspection tool, a diagnostic tool provided by Google Search Console that provides information about Google's indexed version of a specific page. To access the tool, you need to sign in with your Google account and verify that you have the authority to access the website information.
When inspecting the current index status of a page, you may come across the following situations:
- A page is not currently indexed and it shouldn’t be. Ideally, this is what you want for pages such as login pages, advanced filters, etc. There is no need to change anything.
- A page is indexed and it should be. In this instance, everything is working fine (for any indexable content such as blog posts, products, categories, etc.).
- A page shouldn’t be indexed but it is. In this case, you’ll need to use a noindex directive that tells Google not to index it (see the snippet after this list). This will prevent the page from appearing in search results: when Googlebot next crawls the page, it will drop it from the search results entirely.
- A page should be indexed but it isn’t. There are several possible reasons for this. It can simply take time for search engines to crawl and index a new page, but also take a look at your internal linking structure to identify any issues.
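For the third case above, the noindex directive is a single robots meta tag placed in the <head> of the page you want to keep out of the index:
<meta name="robots" content="noindex">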
Similar to the indexation checks on Google, you can look into indexing in other search engines; they often support the site: search operator as well.
TIP: You can use the URL submission feature in Bing Webmaster Tools to get your content indexed on Bing as soon as you publish it. Bing has also released its Content Submission API, which notifies Bing about site changes and allows instant indexing of your content.
Finding landing pages with 404 errors
One of the SEO audit tasks is to identify the HTTP status codes that your landing pages currently return and make sure visitors don’t encounter a 404 not-found error. Fixing 404 error pages will help you recover traffic, generate leads, and push for more sales.
How do 404 errors happen? It can be as simple as a page that was recently removed without redirecting its URL. As a result, the page may still be listed in search results until crawlers revisit it, discover the 404 status code, and drop it from the index. This issue also often happens with PPC campaigns that point to a broken link returning a 404 error. Remember to always test your links first!
Unifying URL formats
Does your domain name include www? Do you have user-friendly URLs? Are you using query string parameters in the URL for filters? These questions help point you toward what to improve to make it easy for crawlers to discover and index your website content.
Consistent URL formats
Ideally, you should use consistent URL formats for all pages with the same domain name. Try to search for the following versions of your domain:
- http://domain.com
- https://domain.com
- http://www.domain.com
- https://www.domain.com
All the domain variants above should redirect to the main one. It’s also important to check whether the /index.php or /index.html page is redirected. From an SEO standpoint, it doesn’t matter which option you go for, although ideally you want the HTTPS version secured by an SSL certificate. The main point is to redirect all other URL versions to it (using 301 permanent redirects) to keep things consistent.
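For illustration, here’s a minimal sketch of these redirects in an Apache .htaccess file (assuming mod_rewrite is enabled and https://www.domain.com is your preferred version):
RewriteEngine On
# Redirect any non-HTTPS or non-www request to https://www.domain.com
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.domain.com/$1 [L,R=301]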
If you use more than one URL format for the same page, it’s important to make them consistent by implementing redirects. To identify which version is used most often, try Google Analytics, which offers an overview of the most visited pages grouped by hostname.
You should also stay consistent with your preferred version of the trailing slash (“/”) at the end of a URL. If you choose to use trailing slashes, use them everywhere they’re applicable. You don’t have to use them at all, but stay consistent and set up 301 redirects from one version to the other to avoid duplicate content.
Human-readable URLs
Which URL version is more readable, with content you can recognize at a glance?
https://www.domain.com/product.php?productId[]=fjewl5325&active=1
The version above?
Or this one?
https://www.domain.com/laptops/thinkpad-x1
Although Google now often shows a page’s breadcrumb or category name in search results instead of its raw URL, it’s still important to structure URLs well. The easier a URL is for humans to read, the better it is for search engines. Clean URLs are generally preferred as they are legible and easy to understand. Broadly speaking, the cleaner and more accessible a URL is through internal linking, the more authoritative it is.
If your site uses URLs similar to our first example, consider moving away from them. That doesn’t mean removing all old URLs and redirecting them to new versions right away: if some of these URLs have historical value and significant organic traffic, it’s often more efficient to keep them and focus on creating new content under human-readable URLs.
Using URL parameters to filter data
Whether you are using URL parameters (tags, colors, size,...) to filter product categories or blog posts, their order is critical! When the user selects a brand first (Adidas) and then a size (10), the URL will look like the example below:
https://www.domain.com/brand/adidas/size/10/
However, if they pick size 10 first and the Adidas brand after that, the URL shouldn’t look like this:
https://www.domain.com/size/10/brand/adidas/
Instead, the order of URL parameters should always stay the same, regardless of the order of selection:
https://www.domain.com/brand/adidas/size/10/
If URL parameters were re-ordered to reflect the order in which the user selected them, every permutation would create a new address. As search engines treat every parameter-based URL as a new page, you would create a large number of indexable URLs with duplicate content. This can lead to keyword cannibalization or to being filtered out of the search results completely.
Gathering data for an SEO audit
Before you start working on your technical or content audit, it’s important to gather data for your report first. It will help you recognize the impact of your suggested changes and see if your predictions were accurate.
Each website audit includes various data sections and different types of sources. Here are the tools you will probably be using during the audit process.
Getting access to website analytics tools
Google Analytics is the most widely used web analytics tool, but there are also other analytics platforms such as Adobe Analytics (formerly Omniture) or Matomo (formerly Piwik).
These tools will help you gather valuable information about website visitors, traffic sources, keywords, landing pages, and more. Having access to website analytics is essential to know how your website performs and how visitors browse it. Analytics tools will help you set up KPIs and measure the impact of your work.
Search engine analytics tools
Search engines have their own webmaster tools to help you manage your web presence. They also provide insights into how bots see your website. This information helps you identify indexation issues, improve your content, and discover problems such as penalization from Google.
See below what analytics tools to use for different search engines:
Search engine | Analytics Tool |
---|---|
Google | Google Search Console |
Seznam | Seznam Webmaster Tools |
Bing | Bing Webmaster Tools |
Yandex | Yandex Webmaster |
When carrying out an SEO audit, it’s essential to have access to analytics tools of all search engines that are frequently used in your client’s market - e.g., for the Czech market, you will often work with Google Search Console, Seznam Webmaster Tools and sometimes with Bing Webmaster Tools data too.
Advertising tools
Advertising tools such as Google Ads (formerly Google AdWords) or Microsoft Advertising (formerly Bing Ads) offer keyword research features that show how frequently your keywords are searched for, help you discover trends, find potential competitors, and more.
For this reason, it’s crucial to have access to these platforms and the valuable data they provide for your analysis.
Access log
An access log is a list of all requests for individual files that users or crawlers have made to a website. Access logs grow continuously: every time the server processes a request, a new line is appended to the log.
Here's an example of what a single access log entry can look like (Apache-style combined log format; the values below are made up for illustration):
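66.249.66.1 - - [12/Oct/2021:06:25:24 +0000] "GET /laptops/thinkpad-x1 HTTP/1.1" 200 5316 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"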
Thanks to this, we can analyze logs in depth (they are often huge files) and find out how frequently bots crawl your site or how users behave on it.
You can get several different insights from your access log, for example identifying your most commonly crawled pages or checking whether specific pages have been crawled at all.
Technical SEO factors
Content might be king, but to get it noticed by search engines, you need to look at technical SEO elements.
Robots.txt and sitemap.xml
Many SEO specialists agree that robots.txt and sitemap.xml files are a must-have for every website. But that’s not quite true. Not having a robots.txt file simply means all of your content will be crawled and indexed; it’s a problem only when the file exists but crawlers can’t fetch it. And a sitemap isn’t strictly necessary for a website with just a few pages (up to around 100 indexable pages).
However, the larger the project, the higher the priority is to use such files to help search engines better understand your website and ensure faster indexing for high-priority content.
Where should robots.txt and sitemap.xml files be located?
To check if your website is already using a robots.txt file, just navigate to your domain and add /robots.txt.
www.yourdomain.com/robots.txt
If you can’t find anything there, you don’t have a robots.txt file yet, as that’s where crawlers look for it. To block bots from specific content, use a Disallow rule. Also don’t forget to add a Sitemap directive to help search engines locate your sitemap. Its reference should look like the example below:
Sitemap: https://www.domain.com/link-to-your-sitemap.xml
At this URL, users and crawlers can find your valid sitemap or sitemap index. A sitemap index is a file that links to all of your individual sitemaps; it’s used for websites with more than 50,000 URLs (the limit of a single sitemap). Learn more about sitemaps and how to submit them.
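Putting it all together, a minimal robots.txt sketch can look like this (the disallowed paths are just placeholders):
User-agent: *
Disallow: /login/
Disallow: /admin/
Sitemap: https://www.domain.com/link-to-your-sitemap.xml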
Image and video sitemaps
Adding image and video content to your sitemap can help crawlers discover it and include it in image and video search results.
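For illustration, an image entry in a sitemap can look like the sketch below (note the image namespace declared on the urlset element; the URLs are placeholders):
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.domain.com/laptops/thinkpad-x1</loc>
    <image:image>
      <image:loc>https://www.domain.com/images/thinkpad-x1.jpg</image:loc>
    </image:image>
  </url>
</urlset>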
Find out more about sitemap extensions:
Page speed
As an SEO, you should make page speed a top priority. Mobile usage keeps increasing, and if your website doesn’t load fast enough, users will likely leave it and return to the search results.
Slow page speed also means that search engines can crawl fewer pages. We know that page speed is a ranking factor. However, it can also affect rankings indirectly by increasing bounce rate and reducing dwell time. Dwell time refers to the length of time a visitor spends on a page or website before returning to the SERPs.
Optimizing your page speed can lead to higher ranking, lower bounce rate, better user experience, and increased conversion rate.
How to check your page speed
One of the easiest ways to check your site speed is through your web browser.
In Google Chrome, open DevTools by pressing the F12 key or Ctrl+Shift+I (Cmd+Opt+I on Mac). Once DevTools opens, navigate to the Network tab, which is mostly used to find and diagnose network issues and optimize page speed. It lists all requests made by the current page. Read more about how to identify page speed issues with DevTools.
Chrome DevTools
To test a specific page, open DevTools and navigate to the page you want to analyze. I recommend testing your site’s load performance on multiple devices and different internet connections, with website caching disabled. In the Lighthouse tab, you can also find screenshots that show how the page looked while loading.
Here are other tools you can quickly use for testing page loading time:
The first two tools mentioned above provide detailed information about loading time and generate recommendations for website speed optimization. In WebPageTest, don’t forget to select the location of your target audience to get results that are as accurate as possible.
The last tool mentioned above (Google’s PageSpeed Insights) reports on performance and suggests how page speed may be improved. In this tool, you will find real-world usage data from the Chrome UX Report dataset: real users’ First Contentful Paint (FCP), First Input Delay (FID), Largest Contentful Paint (LCP), and Cumulative Layout Shift (CLS) experiences over the last 28 days.
The last three metrics relate to page loading, responsiveness, and visual stability, and how they affect the user experience. This group of standardized metrics from Google is known as Core Web Vitals, a set of ranking factors for measuring user experience on the web.
The data in the Chrome UX Report is a valuable free source of real-world measurements showing how users actually experience your website’s performance. If you want a detailed report of your website’s performance, use this free Data Studio dashboard with lots of useful data: https://g.co/chromeuxdash
Here’s an example of what it looks like: Marketing Miner GDS CrUX Dashboard.
Marketing Miner’s website speed data based on the Chrome UX report
The last tool for measuring the quality and speed of web pages is Lighthouse, which you can use in Chrome DevTools as I mentioned above.
Bulk web page speed test
If you want to check the Google PageSpeed Insights score of multiple web pages, or pages from your sitemap in bulk, use our Page Speed Miner in the URL miners section.
What to watch out for
To optimize your website performance, it’s important to know what information to look for but also what to watch out for. Your page load time shouldn’t be longer than 1.5 seconds. Many people abandon sites that take more than 3 seconds to load.
Page speed for pages with the cache disabled is a valuable metric to focus on. However, don’t forget to also look at loading time when navigating between different web pages. And rather than raw page load time, it’s often more useful to focus on the Speed Index metric.
Server response time is also an interesting metric from an SEO perspective. It represents the time that passes between a user or bot requesting a page and your server responding to it. The lower the response time, the faster and more efficiently the website can be crawled and its content indexed.
How to improve your page speed
- Enable caching (especially for images).
- Enable GZIP compression.
- Use the srcset attribute to automatically serve the right image size on different devices (see the sketch after this list). Don’t forget to also add a link to the best image version in the src attribute.
- Reduce the size of your images with lossless compression.
- If your site has international traffic, set up a CDN to improve page speed for people from different parts of the world.
- For web pages with lots of image or video content, consider using lazy loading to reduce initial page load time as it makes content visible when it’s in the user’s viewport. If you use lazy loading, don’t forget to add this content to <noscript> in the source code to ensure search engine bots index this content.
- If you work with a responsive site, consider whether it’s important to load all desktop version styles and scripts for mobile devices too. If not, it’s a good practice to disable loading some of these resources on specific devices.
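To illustrate the srcset and lazy loading advice above, here’s a sketch of a responsive image tag (the file names and widths are placeholders; loading="lazy" uses the browser’s native lazy loading as an alternative to JavaScript libraries):
<img src="https://www.domain.com/images/product-800.jpg"
     srcset="https://www.domain.com/images/product-400.jpg 400w,
             https://www.domain.com/images/product-800.jpg 800w,
             https://www.domain.com/images/product-1600.jpg 1600w"
     sizes="(max-width: 640px) 400px, 800px"
     alt="Black ThinkPad X1 laptop"
     loading="lazy">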
AMP and Facebook instant articles
Another solution for mobile-optimized content comes from Google and Facebook: AMP (Accelerated Mobile Pages) and Facebook Instant Articles. Google’s AMP is often preferred over Facebook’s format.
AMP is an open-source project designed to help webmasters create mobile-optimized content that loads almost instantly on all devices. On mobile search, Google marks such pages with a lightning bolt icon to help users quickly identify content that loads quickly and smoothly, which can improve your website traffic too.
If you work with a WordPress site, you can simply use their AMP plugin to implement the changes: https://wordpress.org/plugins/amp/
However, implementing AMP or Facebook Instant Articles on other platforms is not as easy.
How does AMP impact SEO? If AMP is implemented correctly, the load time improvements are often noticeable and have an impact on user experience. While AMP itself isn’t a ranking factor, page speed is, which is another advantage of using AMP pages. AMP pages are also eligible for the coveted Top Stories carousel (a section that appears for news-oriented searches) and are marked with the lightning bolt icon.
Separate URLs
Do you have a responsive website that works well on all devices? Then from an SEO perspective, you don’t have to worry about a separate mobile site version. But if your site’s mobile version lives on a different URL than its desktop version, you have to annotate the pair so it isn’t treated as duplicate content.
If your mobile version has a different URL, include the code below in the head of the mobile page’s source code:
<link rel="canonical" href="https://www.example.com/page">
This identifies the desktop page as the canonical version.
Then make sure you add the code below in the head of the source code for your desktop version too:
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page">
To help bots understand your separate mobile URLs, use the rel="alternate" tag pointing to the specific mobile URL, together with a media query describing the width of the user's device.
In general, Google doesn't recommend separate URLs as a website setup because they are difficult to implement and maintain. Read more about separate URLs here.
Multilingual content
Similar to separate URLs, if your site serves different content to users in different languages or regions, it’s important to tell search engines about it. When the language versions are annotated correctly, search engines can serve the right content to users in each location. You achieve this by adding hreflang elements to your page header that tell Google about your different language variants. For example, for the URL https://www.marketingminer.com/en, we add the following values:
# Canonical URL:
<link rel="canonical" href="https://www.marketingminer.com/en" />
# Alternative links to different language versions:
<link rel="alternate" href="https://www.marketingminer.com/cs" hreflang="cs" />
<link rel="alternate" href="https://www.marketingminer.com/en" hreflang="en" />
<link rel="alternate" href="https://www.marketingminer.com/de" hreflang="de" />
<link rel="alternate" href="https://www.marketingminer.com/sk" hreflang="sk" />
The x-default attribute signals to Google that a page doesn’t target any specific language or region:
<link rel="alternate" hreflang="x-default" href="https://www.marketingminer.com/en" />
To sum up, it’s recommended to use a canonical tag pointing to the specific content and to add links to all language versions of that content (including a self-referencing link). You should also use the x-default hreflang value, which tells search engines which version to serve to users outside the languages and regions specified in the code.
An hreflang tag also accepts values that define the country for a given language (a country-only value is not allowed). Use this when you want to specify which regions the content in a given language is localized for. Hreflang accepts country values in ISO 3166-1 Alpha 2 format. An hreflang implementation with a country can look like the following:
<link rel="alternate" href="https://www.marketingminer.com/en" hreflang="en-us" />
You can also use a sitemap to let search engines know about different language variants for each URL. How to do it, you ask?
<url>
<loc>https://www.marketingminer.com/en</loc>
<xhtml:link rel="alternate" hreflang="cs" href="https://www.marketingminer.com/cs"/>
<xhtml:link rel="alternate" hreflang="sk" href="https://www.marketingminer.com/sk"/>
<xhtml:link rel="alternate" hreflang="de" href="https://www.marketingminer.com/de"/>
</url>
The third way to tell Google about all language variants of a URL is to add them directly to the HTTP header. It looks like this:
Link: <https://www.marketingminer.com/en>; rel="alternate"; hreflang="en",
<https://www.marketingminer.com/cs>; rel="alternate"; hreflang="cs",
<https://www.marketingminer.com/sk>; rel="alternate"; hreflang="sk",
<https://www.marketingminer.com/de>; rel="alternate"; hreflang="de"
If you want to find out whether you’ve got all your hreflang annotations right, try our Hreflang Checker now.
Implementing structured data
Structured data is a standardized way to provide additional information about a page. It makes it easier for search engines to contextualize and understand the content of web pages, which helps them match your content accurately to relevant search queries. Thanks to implementing structured data on your site, you can also appear in rich results (reviews, events, images, recipes,…).
Learn which structured data formats you can work with:
Search engine | JSON-LD | Microdata | RDFa |
---|---|---|---|
Google | ✓ | ✓ | ✓ |
Yahoo | ✓ | ✓ | ✓ |
Bing | ✓ | ✓ | ✓ |
Yandex | ✓ | ✓ | ✓ |
As stated in Google's structured data guidelines, the search engine doesn’t guarantee that your structured data will show up in search results, even if your page is marked up correctly. You can use the Rich Results Test to validate your structured data or preview most features in the SERPs.
Want to know what rich results Google offers? Here is a list of rich results you can test in the tool: https://support.google.com/webmasters/answer/7445569#zippy=%2Csupported-types
Most structured data for search uses Schema.org vocabulary, while social platforms rely on the OpenGraph protocol. Schema.org contains many more attributes than Google requires; however, they may be useful for other search engines and platforms.
Schema.org
Schema.org is a collection of descriptive tags that make it easier for the major search engines (Google, Bing, Yahoo, and Yandex) to read your content. The reference website is full of documentation and guidelines to help you with structured data markup on your web pages. Take a look to see what else it covers: https://schema.org/docs/full.html
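For illustration, a minimal JSON-LD snippet using Schema.org vocabulary to describe an article can look like this (all values are placeholders):
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Perform an SEO Audit",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  },
  "datePublished": "2021-06-01"
}
</script>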
OpenGraph
The OpenGraph protocol gives you more control over how social media platforms display your web pages. To turn your pages into rich graph objects, you need to add metadata to their source code. Using OpenGraph on your site can help you improve your website traffic and boost social media engagement too.
Here's an example of what an OpenGraph implementation can look like:
<meta property="og:title" content="Marketing Miner"/>
<meta property="og:type" content="website"/>
<meta property="og:url" content="https://www.marketingminer.com/en"/>
<meta property="og:description" content="Data mining tool for online marketers! Save your time with various tools that drive you with valuable data."/>
<meta property="og:site_name" content="Marketing Miner"/>
Find out more about implementing social media meta tags here: https://moz.com/blog/meta-data-templates-123
Pagination attributes
Site pagination is used to divide content across a series of pages, typically a collection of articles or filtered category pages. It breaks a long list of articles or products into digestible chunks. Optimizing your website with pagination attributes helps users feel more in control while browsing and helps search engines understand the relationships between pages of paginated content.
One way to do this is by inserting the pagination attributes rel="next" and rel="prev" into the head section of your source code. They are often used on eCommerce websites or blog pages and look like the following:
<link rel="prev" href="https://www.marketingminer.com/en/blog">
<link rel="next" href="https://www.marketingminer.com/en/blog/3">
By implementing these attributes, you tell search engines that the next page in this category is found on https://www.marketingminer.com/en/blog/3 and the previous one is: https://www.marketingminer.com/en/blog.
It’s worth noting that Google no longer uses these attributes, although other search engines still use them. You can also let search engines know about paginated content by adding this information in the HTTP header. It can look like the following:
Link: <https://www.marketingminer.com/en/blog/>; rel="prev"
Link: <https://www.marketingminer.com/en/blog/3>; rel="next"
Once again, some search engines don’t support this format of pagination attributes. It’s also important to add a noindex, follow directive to paginated pages so their content can still be crawled while the pages themselves are blocked from being indexed.
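For reference, that directive is a single meta tag in the head of the paginated pages:
<meta name="robots" content="noindex, follow">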
To learn more, read Google’s best practices when implementing pagination on your website: https://developers.google.com/search/docs/advanced/ecommerce/pagination-and-incremental-page-loading
Faceted Navigation Filters
Filters can be a great help, but they have to be applied in the right circumstances so they don’t do more harm than good to your website’s optimization. Filters can cover multiple keywords; however, they can also create another SEO black hole on your site!
What are faceted navigation filters best practices for SEO?
- Filters can change headers and titles based on the filtered content.
- Filters you wish to index should have links in the source code that identify the specific filter and let search engines crawl all the additional URLs.
- Filters selected in different orders should always produce one unique URL (the URL must not depend on the order in which the user clicked the filters).
- Multiple filters (two or more filters applied in one segment) should be blocked in robots.txt (see the sketch below). This ensures crawl budget isn’t wasted, as search engines won’t spend time crawling every filter combination; even one small site can generate billions of them.
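A hedged robots.txt sketch for blocking multi-filter URLs, assuming they follow a path pattern like /brand/adidas/size/10/ or combine multiple query parameters (adjust the patterns to your own URL structure):
User-agent: *
# Block path-based URLs where a size filter is combined with another filter
Disallow: /*/size/
# Block query-string URLs that combine two or more parameters
Disallow: /*?*&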
SEO audit: analyzing the biggest blockers
When identifying the biggest blockers during your website analysis, it’s useful to work with your access logs. If you can’t get your access logs, you can also use advanced crawling tools such as Screaming Frog (there are also other tools such as Xenu, Sitebulb, or ContentKing).
Analyzing your website and optimizing its crawl budget are useful (especially for large sites): they help search engines find your new or important content faster, prevent server overload when crawlers load your site, and can increase website traffic.
When analyzing the biggest blockers, you often work on the following tasks:
- Identifying internal links with parameters you don’t want indexed (category ordering, multiple filters, etc.). These parameters or filters can be blocked in the robots.txt file.
- Adding parameters in the URL Parameters section. Adding URL parameters to this tool will reduce the crawling of duplicate URLs and help Google crawl your site more efficiently. Not sure how to do it? Check out this guide by Google to know when and how to use the URL Parameters tool.
- Looking for so-called spider traps. Spider traps are structural issues that kill crawls and hurt your website’s indexation. You can find them using Screaming Frog by analyzing mainly never-ending URL traps, URL parameters associated with a user ID (?session= and the like), poorly formed relative URLs, and others.
Website content factors
There are several website content factors to review during an SEO audit. They mainly relate to content quality and the visual presentation of your website.
Website structure and internal linking
Not many people realize that poor website structure can also negatively impact user experience and website traffic. Proper site architecture is a must: the more appealing your website is to users, the more attractive it is to search engines. From an SEO perspective, it also affects how link juice spreads throughout your site.
What is link juice? Link juice describes the amount of authority that is passed on from one page to another. Each web page has its authority based on the amount and quality of links (external or internal). This value is distributed to other pages through internal and external referral links. The more links there are, the less link juice is passed on per link.
Remember that internal links don’t just help people navigate websites; used strategically, they also pass authority to important pages and establish a hierarchy that shows search engines which pages matter most.
There are many tools to analyze and visualize your internal linking structure, so you can see where link juice flows between your pages and spot any linking issues.
Visualizing your site structure will help you analyze its optimization, identify internal linking opportunities, and see link juice distribution. When proposing a new site structure, it’s essential to analyze internal linking and do your keyword research to understand what users are searching for and include this data in your SEO audit.
Click-depth is another critical factor for search engines that helps them better understand your site architecture. Click-depth describes how many clicks (through internal links) it takes to reach a specific web page from the homepage. Learn more about click-depth and URL structure here.
Website architecture best practices:
- Keep things simple. All of your URLs must follow the same format to easily access alternative content (similar articles, categories,...) or get to high-priority pages.
- Use breadcrumbs to indicate the page’s position in the site architecture and mark them up with structured data so search engines can use them in search results (see the example after this list).
- Create a navigation menu structure that is logical and straightforward. When building categories, less is more. Fewer categories and more filters (better structure) are key to helping users find the right content quickly.
- When your content mentions certain products (categories, filters, services,...) from different sections, make sure to link to them so the users can easily get to them in one click.
- Use internal linking to refer users to the most relevant content (similar articles, etc.). Want to learn more about content clusters? Read here: Topic Clusters: The Next Evolution of SEO.
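As promised above, a breadcrumb markup sketch in JSON-LD (the names and URLs are placeholders, reusing the example URL from earlier):
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Laptops",
      "item": "https://www.domain.com/laptops"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "ThinkPad X1",
      "item": "https://www.domain.com/laptops/thinkpad-x1"
    }
  ]
}
</script>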
Content readability
When creating content, we often overlook its readability. If you want your website visitors to read to the end of your blog posts, make sure your content is easy to read. How easy the text is to read also influences whether the user makes it to the end of your article or signs up for your newsletter. From an SEO perspective, readability affects user behavior on a site and whether the visitor returns to the search results (a short dwell time).
For this reason, it’s crucial to make your text readable.
Do you want to know how easy your website is to understand? The WebFX readability tool can create a detailed report with various metrics to help you find out.
Best practices to improve your content readability:
- Font size. Depending on your audience, choose a font that’s at least 12-16 points (16-22 pixels). For example, if any of your users are older, you should consider using larger font sizes.
- Contrast and color. Providing good contrast between background and text colors is vital to make it accessible to as many users as possible. Check out this helpful tool to test the contrast ratio of background and text colors to make your text accessible.
- Printable fonts and content. Don’t forget about users who prefer printing their content or saving it to their e-readers too. If you also decide to use a printable version of your content, try to avoid using new print URLs (with a ?print parameter) or make sure search engine bots do not crawl them.
- Line length. The optimal line length is considered to be around 50-80 characters. Line spacing can make or break your content too; keep it around 1.5 to keep your audience engaged with your text.
- Font. Content readability is affected by your style of writing and your choice of font. Fonts affect the overall user experience of your website; setting the same article in a different font style can even double the time it takes to read. Read this interesting article to find out why fonts matter and what types of fonts users prefer.
- Keep it simple. In Europe, having the menu on the left or at the top is common. Filters are often placed on the top of the content or in a vertical panel on the left side. It’s also a good practice to use H1 to describe what the content is all about.
- Get inspired. One of my favorite website examples with great content readability is Medium.com. Take a look around to get some inspiration for your website too.
Hidden content
On the desktop version of your website, content that is hidden and only becomes visible after the user interacts with the page (a click,...) carries less SEO value than visible content. For this reason, it’s important to show your most important content first and hide only low-priority content in tabs.
On mobile, it’s a bit different as there is less room to work with, and sometimes it’s challenging to present content on a small screen while making your content easy to access. For this reason, it’s more common to hide some elements, and Google does not devalue hidden content on a mobile device in any way.
Find out how Google treats hidden content which becomes visible when clicking a button:
E.A.T. (Expertise, Authority, Trust)
Google’s E.A.T. represents the most important characteristics of high-quality content.
E.A.T. stands for expertise, authority, and trustworthiness. Google uses these factors to evaluate the overall quality of a web page.
Expertise
- Do you have expertise in your industry?
- Do people find your content valuable?
- Can users easily find the author of an article and learn more information about them?
Authority
- How authoritative is a website? And how many high-quality and relevant backlinks does it have?
- Do you link to reputable and reliable websites that provide accurate information?
- Who is the author of the page content? What other blog posts did they write, and what’s their quality?
Trustworthiness
- What’s at risk if a user trusts the web page? Does it use security mechanisms, such as HTTPS, to keep users safe?
- Who is responsible for the website, and do they have a good reputation?
- Is the content up to date and accurate? Is the content updated based on the latest industry news?
These three factors are crucial, especially for so-called YMYL (Your Money or Your Life) content. According to Google, a YMYL page is any page with content that can affect a reader’s happiness, health, safety, or financial stability.
Image optimization tips
Image search is popular, and it continues to develop rapidly. Image SEO can be just as important as optimizing your text content.
Image optimization should be a part of all SEO audits. Here’s what you should focus on:
- Use high-quality photos you have taken yourself (avoid using stock images).
- Add descriptive alt-text. An alt attribute is a short description of an image that improves accessibility for people who can’t see it and helps search engines better understand what an image is about.
- Include title-text only when you want to add supporting information that will appear as a tooltip when people hover over the image element.
- Use responsive images. The srcset attribute lets you specify different versions of the same image so the right one is served for each screen width. Note that it’s also good practice to add a src attribute, as some search engines only index images with a src attribute.
- Make sure to place images near relevant text.
- Adding structured data can help search engines show your images as rich results. You need to add structured data every time an image is used, even if it's the same image. You can also indicate the author, date, location, or even add a thumbnail. Google Images currently supports structured data for products, videos, and recipes.
- Compress your images. Many tools will help you compress your images before uploading them. You can use tools such as Optimizilla or Compressor.io.
- Include descriptive titles. Take time to make image titles simple and descriptive. Break words with a hyphen or underscore.
- Use image sitemaps.
- Add OpenGraph meta tags to load a preview image on social media platforms when people share your web content.
- Choose the right format (JPG for photos and GIF for moving pictures). Google Images supports the following formats: BMP, GIF, JPEG, PNG, WebP, and SVG.
Google Images SEO Best Practices with John Mueller:
SEO audit using Marketing Miner
If you are dealing with an SEO audit, Marketing Miner will be an excellent help. For an SEO audit, you will primarily use the Reports section, where you choose what data MM should collect for you.
So, go to the Create Report section, which can be found on the top right, and select the following functions:
- Status Code
- Page Speed
- Validity Checker
- Hreflang Checker
- Structured Data Checker
- Indexability Checker
- Index Checker
- Broken Link Checker
- Content Analysis
Then, in the next step, insert the sitemap of your website:
All you have to do is check the inserted data and click on Process Report.
Marketing Miner will generate a report with the collected data in a clear spreadsheet, which you can download to Excel to check whether everything we’ve covered in this article is correctly implemented on your site.
TIP: Sample MM report for SEO audit: https://www.marketingminer.com/en/report/55173ab334db08b8ce5327d5042269fae9e18c3a6861a151597cc2a10bf135fd/visualize
SEO audit report
Your SEO audit report should not just point out errors; it should also include suggestions and instructions for developers on how to fix them. Reports are often delivered in different formats (Google Docs, Word, or PDF). PDF is the predominant format as it lets everyone view the same static file without any changes.
As part of an SEO audit, you should also develop a prioritized action plan so you know where to start. Not all issues have the same priority and weight, and some take longer to fix than others. A good actionable plan that fits your resources is therefore a must.
Are you ready to learn what issues are holding your website back from achieving top search results? Try our tool now!
And for more information about how to optimize your website, we have prepared comprehensive guides where we explain everything step by step:
- Keyword research - https://www.marketingminer.com/en/blog/keyword-research-guide.html
- Link building - https://www.marketingminer.com/en/blog/what-is-link-building.html
- Local SEO - https://www.marketingminer.com/en/blog/local-seo-guide.html
SEO audit checklist example
Check out our free SEO audit checklist to make your work a bit easier (and much faster). To make a copy for your own use: Click File > Make A Copy.