We all know that search engine crawlers visit your whole website and cache its pages for their index. The process is fully automated but may take some time.
However, there are certain pages on a website that you may not want shown in search results, such as archive pages, label pages, and so on. The admin of a website can block these pages using robots.txt or noindex robots header tags.
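For instance, a site-wide robots.txt rule that keeps crawlers away from such pages might look like the sketch below. The `/search` path is where Blogger serves label and archive listings; treat the exact paths as illustrative for your own blog:

```
User-agent: *
Disallow: /search
Allow: /
```

This blocks every crawler from the search/label listings while leaving the rest of the blog crawlable.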
The same can be done very easily on Blogger blogs, and we have already covered a full guide on adding a custom robots.txt file in Blogger and on the custom robots header tags settings in Blogger.
But what if you want to stop search engine crawlers from indexing one specific post or page on your blog? That is the problem we solve here: how to stop search engines from indexing a specific post or page in Blogger.
Steps to Prevent Search Engines from Indexing a Specific Post or Page:
Just follow this easy step-by-step guide to block search engine bots from a specific post or page.
1. First of all, enable the custom robots header tags settings on your Blogger blog.
2. Then go to the Post or Page Editor from your Blogger blog’s dashboard.
3. Under Post Settings, you will see Custom Robots Tags. Click on it.
4. Untick the ‘default’ and ‘all’ tick boxes.
5. Tick the ‘noindex’ and ‘none’ tick boxes and click Done.
6. Now publish the post or page as usual.
7. You’re all set. Search engines will no longer index that particular post or page.
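Once the post is published with these settings, the page is expected to carry a robots meta tag with a `noindex` (or `none`) directive in its HTML head. As a minimal sketch, you could verify this yourself with a small Python helper; the function name `has_noindex` is ours, and the sample HTML is illustrative rather than Blogger’s exact output:

```python
import re

def has_noindex(html: str) -> bool:
    """Return True if the HTML contains a robots meta tag carrying a
    'noindex' or 'none' directive."""
    # Scan every <meta ...> tag in the document.
    for tag in re.findall(r"<meta\b[^>]*>", html, flags=re.IGNORECASE):
        # Only robots meta tags are relevant here.
        if not re.search(r'name\s*=\s*["\']robots["\']', tag, re.IGNORECASE):
            continue
        m = re.search(r'content\s*=\s*["\']([^"\']*)["\']', tag, re.IGNORECASE)
        if m:
            # Directives are comma-separated, e.g. "noindex, nofollow".
            directives = {d.strip().lower() for d in m.group(1).split(",")}
            if "noindex" in directives or "none" in directives:
                return True
    return False

# Illustrative page head, roughly what a blocked post would contain:
page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(has_noindex(page))  # True
```

Fetch your published post’s HTML (for example with your browser’s “view source”) and run it through a check like this to confirm the settings took effect.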
So, I hope you have successfully learnt how to block search engine bots from indexing a particular post or page in Blogger. Don’t forget to share your views on this post in the comments, and if you have any query, drop it in the comment box.