
Custom Robots.txt File for Your BlogSpot Blog.

Today we are going to discuss how to add a custom robots.txt file to your Blogger blog.

What is a Custom Robots.txt File?

A custom robots.txt file is a simple text file in which the site or blog owner writes instructions telling web crawlers what to crawl and what not to crawl. The directives follow a specific syntax that search engine robots can read, and adding the file is one more step toward making a blog more SEO friendly. With it, we can allow and disallow search crawlers from specific areas of our blog. Blogger's interface lets us add it easily, and I will guide you through the process step by step. So let's start this tutorial.



You can always check your blog's robots.txt file at yourdomain.com/robots.txt (remember to replace yourdomain.com with your own domain name).

Enabling the Robots.txt File:

The process is simple; just follow the steps below.
  • Go To Blogger >> Settings >> Search Preferences
  • Look for the Custom robots.txt section near the bottom and click Edit.
  • Tick “Yes” when the option appears, and a box will open where you have to write the robots.txt file. Enter this:

User-agent: Mediapartners-Google
Allow: /

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://yourdomain.com/atom.xml?redirect=false&start-index=1&max-results=500

Note: The first line, “User-agent: Mediapartners-Google”, is for Google AdSense. If you are using Google AdSense on your blog, keep it as is; otherwise remove it.
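For comparison, if your blog does not use AdSense, the file you enter would simply be:

```
User-agent: *
Disallow: /search
Allow: /
```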

  • Click “Save Changes”.
  • And you are done!
Now let's take a look at what each line means:
  • User-agent: Mediapartners-Google : This first directive applies to blogs that use Google AdSense; if you are not using AdSense, remove it. It tells the AdSense crawler that it may crawl all pages where AdSense ads are placed.
  • User-agent: * : Here User-agent addresses the robots, and * means the rule applies to all search engine robots, such as Google, Bing, etc.
  • Disallow: /search : This line tells the search engine’s crawler not to crawl the search pages.
  • Allow: / : This line allows crawlers to index your entire site or blog.
  • Sitemap : This last line tells the search engine's crawler where to find the sitemap, which lists every new and updated post, from post 1 up to post 500.
  • A single sitemap file should not exceed 50 MB, which is why we limit it to a maximum of 500 results.
If you have more than 500 posts on your blog, simply submit another sitemap starting at post number 501 for the next 500 posts. Read here how to submit another sitemap if you have more than 500 or 1,000 posts.
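For example, assuming the standard Blogger feed URL pattern (yourdomain.com is a placeholder for your own domain), the two Sitemap lines for a blog with up to 1,000 posts would look like this:

```
Sitemap: https://yourdomain.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: https://yourdomain.com/atom.xml?redirect=false&start-index=501&max-results=500
```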

You can also add your own directives to disallow or allow more pages. For example, if you want to disallow or allow a specific page, you can use these commands:

To Allow a Page:

Allow: /p/contact.html

To Disallow:

Disallow: /p/contact.html
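You can sanity-check rules like these before saving them. The sketch below uses Python's standard urllib.robotparser module against the file from this tutorial; the example.blogspot.com domain and the /p/contact.html page are just placeholders:

```python
from urllib import robotparser

# The custom robots.txt from this tutorial, with an extra
# Disallow rule for a hypothetical contact page.
rules = """\
User-agent: Mediapartners-Google
Allow: /

User-agent: *
Disallow: /search
Disallow: /p/contact.html
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

base = "https://example.blogspot.com"
# Search pages and the contact page are blocked for regular crawlers...
print(rp.can_fetch("Googlebot", base + "/search/label/SEO"))     # False
print(rp.can_fetch("Googlebot", base + "/p/contact.html"))       # False
# ...but ordinary posts are allowed, and the AdSense crawler may go anywhere.
print(rp.can_fetch("Googlebot", base + "/2024/01/my-post.html"))          # True
print(rp.can_fetch("Mediapartners-Google", base + "/search/label/SEO"))   # True
```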

I hope you have learned how to add a robots.txt file to your Blogger blog! If you face any difficulty, please let me know in the comments. Now it's your turn to add a robots.txt file to your website; say thanks in the comments and keep sharing this post. Happy Learning, Happy Blogging, Happy Earning. Do not forget to subscribe to our website via RSS feed, or if you want every post delivered to your email ID as it is published, subscribe to our website with your email ID. I also request you to share this article on Facebook & Twitter.
