How to Create and Add a Robots.txt File in Blogger?
Blogspot is a fruitful platform for every blogger because of its friendly design in all aspects. But whether a blogger uses Blogger.com, WordPress, or Typepad, some SEO work is needed to promote the blog in search engines.
Robots.txt is a small text file used by webmasters to control how search engine robots crawl a site. Its rules can apply to directories, web pages, or the entire blog or site.
What is Robots.txt?
‘Robots.txt’ is a text file which contains a few lines of simple code. It tells search engines which parts of your content they may crawl and index in the search results. Always remember that search crawlers check the ‘robots.txt’ file before crawling any web page.
By default, every website allows the search engine robots. However, if you would like to restrict the robots from crawling a certain directory, a file, or the complete website, you need a ‘robots.txt’ file in which you write instructions for search engine bots.
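As a minimal sketch, a robots.txt that blocks every crawler from one directory but leaves the rest of the site open looks like this (the `/private/` directory name is only an illustrative placeholder):

```
# Rules for all crawlers
User-agent: *
# Block a hypothetical directory from being crawled
Disallow: /private/
# Everything else stays crawlable
Allow: /
```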
Steps to edit Robots.txt on Blogger:
- Login to your blogger dashboard
- Go to Settings > Search Preferences > Crawlers and Indexing
- In the ‘Crawlers and indexing’ section, there are two options that let you customize your ‘robots.txt’ file. Press the ‘Edit’ button next to ‘Custom robots.txt’.
- A text area appears where you can type the content. Copy and paste the code given below, then click the ‘Save changes’ button.
```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap:
```
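Line by line, here is what that code does; the sitemap URL at the end is an illustrative placeholder you would replace with your own:

```
# Mediapartners-Google is Google's AdSense crawler; an empty
# Disallow means it may crawl everything (needed to show
# relevant ads on your pages).
User-agent: Mediapartners-Google
Disallow:

# All other crawlers: keep search-result pages out of the
# index, but allow the rest of the blog.
User-agent: *
Disallow: /search
Allow: /

# Point crawlers at your sitemap (placeholder URL shown).
Sitemap: https://your-domain.blogspot.com/sitemap.xml
```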
For example, if you want to stop robots from crawling a particular page, such as a Blogger ‘About Us’ page (static pages in Blogger live under the /p/ path), use this structure:

```
User-agent: *
Disallow: /p/about-us
```
Alternative method to create the Robots.txt code:
- Go to the ‘XML Sitemap for Blogger’ site (it is one of the tech projects of the popular Indian blogger Amit Agarwal)
- There you will find the ‘XML Sitemap for Blogger’ field. Paste your blog URL, then click on ‘Generate Sitemap’. You will get the XML sitemap code.
- Copy the code into notepad.
- Next, login to your blogger dashboard and go to Settings > Search Preferences > Crawlers and Indexing
- Then paste the code in the ‘Custom robots.txt’ box and save the changes.
How to check your “Robots.txt” file?
You can check this file on your blog by adding “/robots.txt” at the end of your blog URL in the address bar of your browser. Take a look at the below example for a demo:

https://your-domain.blogspot.com/robots.txt

(Replace your-domain.blogspot.com with your BlogSpot URL)
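If you prefer to test your rules before relying on them, Python's standard-library `urllib.robotparser` can parse the robots.txt code from this article and report whether a given URL is crawlable. The blog URLs below are illustrative placeholders:

```python
from urllib import robotparser

# The robots.txt rules recommended in this article.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Search-result pages are blocked for ordinary crawlers...
print(rp.can_fetch("*", "https://your-domain.blogspot.com/search?q=seo"))  # False
# ...but regular posts remain crawlable.
print(rp.can_fetch("*", "https://your-domain.blogspot.com/2024/01/post.html"))  # True
# The AdSense crawler is allowed everywhere.
print(rp.can_fetch("Mediapartners-Google", "https://your-domain.blogspot.com/search?q=seo"))  # True
```

This is a quick offline sanity check: if a rule accidentally blocks your posts, `can_fetch` will return False for them before any search engine ever sees the file.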
Finally, I conclude by saying that ‘Robots.txt’ is used to control how search engines crawl and index your blog content. But do not add this code without understanding it: a wrong rule can block your entire blog from search results. Do not hesitate to ask your questions in the comment box. I will help you to get better functionality for your blog.