Why is your website having indexation problems?
Websites run into indexation problems fairly often – and there may be many reasons for this. We have compiled a checklist to help you diagnose and solve the problem.
For starters, however, it is worth making sure that your website is actually not indexed. It is easy – just type the command site:example.com into Google, replacing example.com with the address of the site you want to check.
You should also remember to disable the SafeSearch filter, as it may hide results from certain websites. To do this, click Settings and then Search Settings on the Google results page.
If you still cannot see your website in the search results, try the site: command followed by the full website address, e.g. site:https://www.example.com.
Still no results? This probably means that Google’s robots haven’t come across it yet. This can happen to brand new websites (which Google has not found yet, so it has not had a chance to index them), as well as to new subpages added to existing and previously indexed websites.
To speed up indexation, consider creating a Google Search Console account, verifying the domain and adding a sitemap in .xml format. You can also ask Google to crawl a specific address: paste it into the URL inspection field at the top of Search Console and then click the “Request indexing” button.
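If you do not have a sitemap yet, it is just an XML file listing the addresses you want crawled. A minimal sketch in the sitemaps.org format (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per indexable address -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>
```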
It can also happen that the home page is properly indexed, but for some reason other subpages, or some of them, have not been indexed. There are a number of possible reasons why this happens:
- The website is available both with and without www, without redirection
If a page is available under both versions of the address and both return a 200 server response code (a correct page, with no redirection), Google may limit indexation of at least one of the versions. This is a common error, and it can be resolved by implementing a permanent redirect (one that returns a 301 server response code) from one version to the other. If both versions are new, it does not matter which way you redirect; if one of them already has some results indexed, redirect to that version.
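On an Apache server, for instance, such a redirect can be implemented in .htaccess with mod_rewrite – a sketch assuming the www version is the one you keep:

```apache
RewriteEngine On
# Send every request for the bare domain to the www version with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```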
- The website is blocked from being indexed
Indexation problems can also occur if a website carries directives that block crawling or indexing. These can be set in the robots.txt file (a Disallow rule, which blocks crawling), in the X-Robots-Tag HTTP header (which can be added via .htaccess on Apache), or directly in the robots meta tag (noindex, nofollow).
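For reference, these are the typical forms the blocking directives take – a sketch with placeholder paths (the Header line assumes Apache with mod_headers enabled):

```text
# robots.txt – a Disallow rule blocks crawling of the matched paths
User-agent: *
Disallow: /

<!-- robots meta tag in the page's <head> – blocks indexing of that page -->
<meta name="robots" content="noindex, nofollow">

# .htaccess (Apache, mod_headers) – sends the same directive as an HTTP header
Header set X-Robots-Tag "noindex, nofollow"
```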
Tip: in a popular content management system such as WordPress, users often tick the “Discourage search engines from indexing this site” checkbox during installation and then forget to untick it once the site goes live – it’s worth bearing this in mind when looking for possible sources of indexation problems!
- There are errors in the website structure
If certain subpages are not linked internally and do not appear in the .xml sitemap, Google will have a hard time finding and indexing them. Make sure that internal PageRank flows correctly and that every relevant URL is part of the site’s internal linking structure, i.e. that at least one already-indexed subpage links to it – see the sketch below.
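In practice this simply means that every URL you want indexed should be reachable through an ordinary HTML link from a page Google already knows – a minimal, hypothetical example:

```html
<!-- Placed on an already-indexed page, e.g. a category or hub page -->
<a href="/blog/new-article/">Read our new article</a>
```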
- Subpages and content are duplicated
If a website contains too many duplicate pages, Google may consider that it has too little unique content and will refrain from indexing it.
- The website takes too long to load
Robots that render and interpret websites do not wait indefinitely for a subpage to load. If the page does not load within a certain time frame (some reports suggest 4–5 seconds), it will most likely not be indexed. It is worth running a PageSpeed Insights report and making sure the score reaches at least 80–90 points (which roughly translates into a loading time of under 2 seconds).
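Besides PageSpeed Insights, you can get a rough reading of raw response time from the command line – a quick sketch using curl (placeholder URL; note this measures only the HTML download, not full page rendering):

```sh
# Print the total time in seconds needed to fetch the page's HTML
curl -o /dev/null -s -w "%{time_total}\n" https://www.example.com/
```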
- The website has no content
A lack of content, just like duplicated content, dramatically lowers the value of a subpage in the “eyes” of Google’s algorithm. This can be another reason why subpages are not indexed.
- The website does not comply with Google’s guidelines
A penalty from Google (commonly called a “filter”) can be imposed when a website does not comply with its guidelines, such as when it:
- hides content on the site,
- applies cloaking (i.e. content presented to robots differs from that presented to the user),
- obtains spammy links on external sites,
- is worthless to the user and created specifically to obtain traffic from a search engine,
- is a copy of another website with no added value or unique content, or
- sits on a domain that has previously received a Google penalty or hosted spam or illegal content.
- The website is experiencing technical issues
Indexation can also fail if the page is built in Flash, serves important content via Ajax, or returns an incorrect HTTP status code for the home page (other than 200). Another case is a page flagged by Google as a “Soft 404” – a subpage that returns a 200 server response code, although the search engine has decided it should return 404. This happens, for example, when a subpage has no content or when Googlebot “thinks” the subpage is broken; in such situations Googlebot may also decide not to index it. The actual status code is easy to verify – see the sketch below.
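You can check which status code a given subpage really returns with a HEAD request – a curl sketch with a placeholder URL:

```sh
# -I sends a HEAD request and prints only the response headers;
# the first line contains the status code (e.g. HTTP/2 200 or HTTP/2 404)
curl -I https://www.example.com/some-subpage/
```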
- Temporary redirections have been set
If, for example, a website is changing its URLs and the redirects from the old addresses to the new ones have been set up as temporary (302) redirects, the new addresses may have indexation problems, at least initially. When changing an address for good, use a permanent redirect (returning a 301 server response code). A temporary redirect (302) should be a deliberate, well-considered choice, never a default.
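The type of an existing redirect is just as easy to check – a curl sketch (placeholder URL) that prints only the status code, where 301 means permanent and 302 temporary:

```sh
# Print just the HTTP status code returned by the old address
curl -s -o /dev/null -w "%{http_code}\n" https://www.example.com/old-address/
```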
- The website is spreading viruses
It may happen that a website becomes infected and dangerous to its visitors. Again, Googlebot will probably decide not to index it – or even to de-index it if it was already in the index before the infection.
Such de-indexation will appear under “Manual actions” in Google Search Console, which lists actions taken by the “verifiers”, i.e. Google employees who check whether a website complies with Google’s guidelines.
The possible reasons behind indexation problems are nearly endless. We hope this list will help you diagnose the most common ones and make your website more visible in Google search.
Do you need assistance with the technical optimization of your website? Is your website not visible enough in search engines? Are you looking for experts to help you prepare and implement a complete positioning strategy for your brand? We look forward to hearing from you.