One of Google’s most helpful tools is Google Search Console, which monitors website traffic and identifies issues that may be hurting your site’s SEO. Despite its incredible capabilities, it can be difficult to understand the most common Search Console errors and the best way to fix them.
Learning the most common Google Search Console error reports will help you identify issues as they happen and fix them swiftly, so you continue to get valuable insights about your website. In the worst-case scenario, unresolved issues can stop a website from appearing in SERPs at all, so understanding how to fix them is pertinent for all marketers.
What is Google Search Console?
Google Search Console is an incredible, free tool offered by Google that works as a troubleshooting aid. It identifies issues, helps you troubleshoot them, and guides you in resolving any problems Google’s crawlers encounter when trying to index your website.
Not only does it work as a troubleshooting tool, but it also helps you see what pages are performing well on your site and those that are being ignored by Google – allowing you to make necessary adjustments to help web performance and build on your content.
Within Google Search Console, you will be able to use a tool known as the Index Coverage report, which gives you a comprehensive list of all the pages on your website that Google has attempted to crawl and index. Not only that, it flags any issues for you to resolve promptly.
1. Pages That Haven’t Been Indexed
Google might not index certain pages on your website, which can be horrific for your SEO efforts. If Google doesn’t index your pages, you will struggle to get value from them, which directly affects your opportunities in the organic search results.
Crawl Issues on Pages
Crawl issues can be caused by a variety of factors, and sometimes Google doesn’t provide specific details as to why they are happening. If this occurs, the URL Inspection tool within Google Search Console can help you debug your pages and get your site functioning properly.
Pages Blocked by robots.txt
Sometimes when you submit a page for indexing, it can be blocked by robots.txt. To stop this from happening, you can test your pages with the robots.txt tester to ensure that you won’t encounter this problem.
robots.txt prevents Google from crawling a page through a simple line in the file that tells Google it is not allowed to crawl it. This applies even if you have requested that Google crawl the page by submitting it for indexing.
To ensure that your page is crawled, find the line blocking it and remove it from the robots.txt file. If you are struggling to do this, check your sitemap.xml file to see if the URL is listed there. If you find it there and it shouldn’t be, remove it.
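For illustration, a blocking rule in robots.txt looks like the sketch below (the /blog/ path is a hypothetical example, not a rule from any real site):

```txt
# robots.txt — a hypothetical rule blocking an entire section of a site
User-agent: *
Disallow: /blog/
# Deleting the Disallow line above lets Google crawl pages under /blog/ again
```

The robots.txt tester in Search Console will show you exactly which rule is matching a blocked URL, which makes the offending line easy to find.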
If you are using WordPress, plugins sometimes add pages to your sitemap file that shouldn’t be there, so check it regularly to prevent this from happening.
Top tip: make sure your WordPress sitemap is optimised by including important pages like your Home page, your ‘About Us’ page, and other pages of high-quality content.
A Page Has Been Marked ‘noindex’
An issue that can occur when submitting a page for indexing is that the page has a ‘noindex’ directive in a meta tag or even in an HTTP response. If you have either of these issues, they must be removed in order to have your page indexed.
To check whether a page can be indexed, open your page’s source code and search for the word ‘noindex’. If you see it there, go into your CMS and find the setting that removes it. Alternatively, you can remove it manually by editing the code for the page directly.
The HTTP response version of this directive is caused by an X-Robots-Tag header. You can remove it from your server’s response configuration or ask a developer to remove it for you.
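As a sketch, the meta-tag version of the directive sits inside the page’s `<head>` and looks like this:

```html
<!-- Remove this tag (or change its content) to allow the page to be indexed -->
<meta name="robots" content="noindex">
```

The header equivalent appears in the server’s response as `X-Robots-Tag: noindex`; where you remove it from depends on your server or CMS configuration.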
2. 404 Errors
We have all seen them: 404 errors mean that Googlebot is struggling to find pages. This can happen for a myriad of reasons; sometimes the page no longer exists at the location the bot has access to, and sometimes the page is now blank.
One of the most common forms of error, 404s happen as websites are modified and changed. Understanding these errors and how to fix them helps keep your website visible in search results.
A Soft 404 From Submitted URL
Sometimes when you submit a page for indexing, the server returns what is known as a soft 404. These are pages that appear broken to Google but don’t return a 404 Not Found response.
These could be caused by a category page with no content in that category, or by your website’s theme creating pages that shouldn’t exist. You should either convert these pages to proper 404 pages or redirect them to the new location. Another alternative is to add content to these pages.
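If your site runs on Apache, the redirect option can be sketched with an .htaccess rule like the one below (both paths are hypothetical examples):

```apache
# .htaccess — permanently redirect an empty category page to a populated one
Redirect 301 /category/old-empty-category/ /category/new-category/
```

A 301 tells Google the move is permanent, so the old URL eventually drops out of the report instead of recurring as a soft 404.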
Submitted URL Not Found (404)
One issue with 404s is that you can submit a non-existent URL for indexing. This causes multiple problems and is easily done by forgetting to remove a page from your sitemap after deleting it from your website. To prevent this from occurring, check your sitemap file regularly.
3. Mobile Usability Errors
Websites should always be tailored for mobile devices, considering we live in a world that is increasingly turning to mobile browsing. Mobile usability errors point you to problems on your pages that make them hard for users to navigate.
Content Wider Than Screen
When content is wider than the screen, it usually means an element or image on your page hasn’t been sized properly for mobile devices. A common issue with WordPress sites, this can happen if an image is given a caption or a plugin generates an element that isn’t native to the theme.
A quick fix is to remove the image or element that is making the content wider than the screen. A long-term fix is to modify the code so that the content or element is responsive on mobile.
Viewport Not Set
The viewport property tells browsers how to adjust a page’s dimensions, scaling the page to the correct size for the screen. If a page does not define a viewport, this can cause problems.
You can fix this by specifying the viewport using a meta viewport tag. Avoiding wide, large elements on the site also prevents layouts that force users to scroll horizontally.
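The standard tag belongs in the page’s `<head>` and looks like this:

```html
<!-- Tells browsers to match the page width to the device and start at 100% zoom -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```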
Text Too Small to Read

This mobile usability error flags pages whose font size is too small to read, forcing users to zoom in to consume the content. To fix it, specify a viewport in your pages and set font sizes to scale properly within that viewport. This can be done using relative units like em or rem rather than pixel values for font size.
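A minimal CSS sketch of the relative-units approach (the 16px base is the common browser default, not a requirement):

```css
/* Relative units scale with the viewport and the user's browser settings */
html {
  font-size: 100%;   /* respects the browser default, usually 16px */
}
body {
  font-size: 1rem;   /* instead of a fixed value such as font-size: 12px */
}
small {
  font-size: 0.875rem; /* scales proportionally if the base size changes */
}
```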
Clickable Elements Too Close Together
When a report says that clickable elements are too close together, it usually means elements like buttons and links sit too close to one another. This makes it hard for mobile users to tap the element they want without hitting other elements they don’t.
To ensure this doesn’t happen, size and space elements appropriately for mobile users. Google recommends tap targets of at least 48 pixels, with at least 8 pixels of spacing between elements.
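Those recommendations can be sketched in CSS like this (the class name is hypothetical):

```css
/* Sketch based on Google's tap-target guidance: ~48px targets, 8px apart */
.nav-button {          /* hypothetical class for a tappable element */
  min-width: 48px;
  min-height: 48px;
  margin: 8px;         /* keeps neighbouring targets at least 8px apart */
}
```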
Key Takeaways for Fixing Common Search Console Errors
By making a checklist of the opportunities you have to fix common Search Console errors, you can prevent bugs in your system and allow Google to crawl and index your pages, helping your SEO:
- Use responsive web design that adheres to best practices so that mobile usability doesn’t suffer.
- Improve your site’s speed to resolve any issues you are experiencing with Core Web Vitals.
- If there are issues with your index coverage, check your sitemap, HTML meta tags, and robots.txt file to rule out problems in those areas.