How to Create and Add Robots.txt Code in Blogger?

Blogspot is a fruitful platform for bloggers because of its friendly design in all aspects. Whichever platform you use, such as Blogger.com, WordPress, or Typepad, you must do some SEO to promote your blog in search engines.

Robots.txt is a small text file that webmasters use to control search engine robots. Its rules can apply to directories, web pages, or an entire blog or site.



What is Robots.txt?

‘Robots.txt’ is a text file which contains a few lines of simple code. This file tells search engines how to crawl and index your content for the search results. Always remember that search crawlers scan the ‘Robots.txt’ file before crawling any web page.

By default, every website allows search engine robots. However, if you would like to restrict them from crawling a particular directory, a file, or the complete website, you need a ‘robots.txt’ file in which you write instructions for the search engine bots.

Steps to edit Robots.txt on Blogger:

  1. Log in to your Blogger dashboard
  2. Go to Settings > Search Preferences > Crawlers and indexing
  3. In the ‘Crawlers and indexing’ section, two options let you customize your ‘robots.txt’ file. Press the ‘Edit’ button under ‘Custom robots.txt’.
  4. A text area appears where you can type the content. Copy and paste the code given below, then click the ‘Save changes’ button.
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
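You can verify what these rules do before saving them, using Python's built-in urllib.robotparser module. This is just a local sanity check; the blog URL below is a placeholder.

```python
# Check the Blogger robots.txt rules locally with the standard library.
import urllib.robotparser

rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Search-result pages under /search are blocked for ordinary crawlers...
print(rp.can_fetch("Googlebot", "https://your-domain.blogspot.com/search?q=seo"))  # False
# ...but normal posts remain crawlable.
print(rp.can_fetch("Googlebot", "https://your-domain.blogspot.com/2024/01/post.html"))  # True
# AdSense's crawler (Mediapartners-Google) is allowed everywhere.
print(rp.can_fetch("Mediapartners-Google", "https://your-domain.blogspot.com/search?q=seo"))  # True
```

This shows why the `/search` rule is there: Blogger's search-result and label pages produce duplicate content, so they are kept out of the index while regular posts stay crawlable.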

How to block a link from search engines:

For example, suppose you want to stop robots from crawling a particular page, such as a static ‘About Us’ page.



To stop the crawling of that page, add a rule with this structure (the path here is an example; use your page's actual path, which must begin with a slash):

User-agent: *
Disallow: /p/about-us.html
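Each blocked path gets its own Disallow line, so blocking several pages at once looks like the sketch below (the paths are placeholders; substitute your own):

```
User-agent: *
Disallow: /p/about-us.html
Disallow: /p/contact.html
Disallow: /search
Allow: /
```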

Alternative Method to create Robots.txt code:

Create the ‘Robots.txt’ code as follows:

  1. Go to the ‘XML Sitemap for Blogger’ site (it is one of the tech projects of the popular Indian blogger Amit Agarwal)

  2. In the ‘XML Sitemap for Blogger’ field, paste your blog URL, then click ‘Generate Sitemap’. You will get the XML sitemap code.
  3. Copy the code into a text editor such as Notepad.
  4. Next, log in to your Blogger dashboard and go to Settings › Search preferences › Crawlers and indexing.
  5. Paste the code in the ‘Custom robots.txt’ box and save the changes.
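The generated code typically looks like the default Blogger rules plus a Sitemap line pointing search engines at your blog's sitemap. A hedged example of what you might paste (the URL is a placeholder; Blogger also serves a native sitemap at /sitemap.xml):

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
Sitemap: https://your-domain.blogspot.com/sitemap.xml
```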

How to check your “Robots.txt” file?

You can check this file on your blog by adding “/robots.txt” at the end of your blog URL in your browser's address bar. For example:

https://your-domain.blogspot.com/robots.txt

(Replace your-domain.blogspot.com with your BlogSpot URL.)
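The same check can be scripted with Python's standard library, which is handy if you manage several blogs. This is a minimal sketch; the domain is a placeholder.

```python
# Build the robots.txt address for a blog and (optionally) fetch it.
import urllib.request  # only needed for the live fetch at the bottom

def robots_url(domain: str) -> str:
    """Return the robots.txt address for the given domain."""
    return f"https://{domain}/robots.txt"

url = robots_url("your-domain.blogspot.com")
print(url)  # https://your-domain.blogspot.com/robots.txt

# Uncomment to fetch the live file (requires network access):
# print(urllib.request.urlopen(url).read().decode("utf-8"))
```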


Finally, I conclude by saying that ‘Robots.txt’ controls how search engines crawl and index your blog content. But do not add this code without understanding what it does. Do not hesitate to ask your questions in the comment box; I will help you get better functionality for your blog.


Satish Kumar Ithamsetty

Satish Kumar Ithamsetty has been a full-time blogger since 2009. This blog supports every blogger and marketer; through it he has helped 100+ new bloggers and 10,000+ readers. He writes on blogging, SEO, SMO, and more.

