Easily Add a Custom Robots.txt File in Blogger
Adding a robots.txt file to Blogger has become a common task in modern blogging, and it is worth knowing what the file does and how to add it. A well-configured robots.txt file tells search engines how to crawl your blog, which can improve its visibility and help bring in organic traffic. This is an easy way to add a custom robots.txt file to a Blogger blog.
So you are in the right place: in this tutorial I will show you how to add a custom robots.txt file to your Blogger blog in a few simple steps.
What is a Robots.txt File?
A robots.txt file, in simple terms, gives instructions to all kinds of crawlers and spiders; the most common example is Googlebot. Whenever you publish a new post, the robots.txt file you have added to your blog informs the robots ("spiders" and "crawlers") sent by search engines such as Google about your content. When those robots find new pages on your blog, their job is to index them in the search engines. First, though, they check your robots.txt file to see how you have configured it and for what purpose. If you have disallowed some pages in your robots.txt file, search engine spiders will ignore those pages and will not index them until you change the robots.txt file.
How do Robots use the Robots.txt file?
The first thing search engine spiders and crawlers check when they visit your site is the robots.txt file. The robots sent by the search engines simply follow the instructions we have put in that file, so they only index the pages they have been asked to index.

Where is the Robots.txt File located?
You can easily view your robots.txt file by adding /robots.txt to the end of your blog address. For example: https://www.my-blog.blogspot.com/robots.txt
Or you can check your robots.txt file directly by logging in to your blog, then choosing "Settings" > "Search Preferences" > "Crawlers and indexing" and selecting Edit next to Custom robots.txt.
What does the Robots.txt file look like?
If you have never touched your Blogger robots.txt file, it looks something like this. Each part of the file is explained below so you can easily understand what these lines mean.

User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://my-blog.blogspot.com/feeds/posts/default?orderby=UPDATED
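To see how these rules behave in practice, here is a short Python sketch using the standard library's urllib.robotparser; it parses the same rules locally (no network access needed) and checks a few paths. The user agents and example paths are only illustrative.

```python
from urllib import robotparser

# The same rules shown above, parsed locally instead of fetched from a blog.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# Ordinary crawlers are blocked from anything under /search ...
print(parser.can_fetch("Googlebot", "/search/label/anything"))
# ... but regular post pages are allowed.
print(parser.can_fetch("Googlebot", "/2020/01/my-post.html"))
# The AdSense robot has an empty Disallow, so it may crawl everything.
print(parser.can_fetch("Mediapartners-Google", "/search/label/anything"))
```

The first call prints False and the other two print True, matching the behavior described in the sections below.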
What is User-agent: Mediapartners-Google?
Mediapartners-Google, inside your Blogger robots.txt file, is the Google AdSense robot. Its job is to crawl your blog regularly so AdSense can serve ads that are relevant to your content. If you disallow the AdSense robot on your blog, it may not be able to show ads on your posts and pages. If you have not applied Google AdSense to your blog yet, you can simply remove these lines from your robots.txt file.

What is User-agent: * ?
The asterisk "*" in User-agent: * is a wildcard: this section applies to all robots, crawlers, and spiders that come to your blog.

What is Disallow: /search ?
Another directive inside your robots.txt file is Disallow: /search. The keyword Disallow tells robots not to do something on your blog, and the /search on the same line tells them not to crawl any page whose path starts with /search. In practice this means pages under the search directory, such as label pages, are never crawled or indexed. For example:
http://my-blog.blogspot.com/search/label/anything
Pages like this are never crawled or indexed by search engine robots.
What is Allow: / ?
The keyword Allow tells robots what they may crawl. The slash "/" refers to the site root, so every page whose path starts with "/", including your posts and your other pages, can be crawled and indexed by Google's robots and spiders.

Adding a Custom Robots.txt File to Blogger
After this short description of the keywords inside the robots.txt file, let's move on to adding a custom robots.txt file to Blogger.

Steps for adding a custom Robots.txt file to Blogger
1: Go to Blogger and sign in to your account. After signing in you will land on the dashboard; click on your blog.
2: Go to "Settings" > "Search Preferences" > "Crawlers and indexing".
3: Click "Edit" next to Custom robots.txt and select the "Yes" check box.
4: Paste your robots.txt file there and save it.
Here is the custom robots.txt code; whether you change anything in it is up to you.
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://my-blog.blogspot.com/feeds/posts/default?orderby=UPDATED

5: Once you have successfully pasted the code, press "Save Changes" and you are done!
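If you manage more than one blog, you can also generate this file instead of editing it by hand. Below is a minimal Python sketch; the function name blogger_robots_txt is my own invention, and the blog address is a placeholder you would replace with yours.

```python
def blogger_robots_txt(blog_url: str) -> str:
    """Build the custom robots.txt text for a Blogger blog.

    blog_url is a placeholder such as "http://my-blog.blogspot.com".
    """
    base = blog_url.rstrip("/")  # avoid a double slash in the sitemap URL
    return (
        "User-agent: Mediapartners-Google\n"
        "Disallow:\n"
        "\n"
        "User-agent: *\n"
        "Disallow: /search\n"
        "Allow: /\n"
        "\n"
        f"Sitemap: {base}/feeds/posts/default?orderby=UPDATED\n"
    )

print(blogger_robots_txt("http://my-blog.blogspot.com"))
```

Running this prints the same file shown above, with the sitemap line pointing at the blog address you pass in.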
How to check that your Robots.txt changes were applied
After adding the custom robots.txt file to your blog, the next step is to check whether the file is actually in place. To do this, simply add /robots.txt after your blog address and open it in a new tab. For example:
http://mytrickspedia.blogspot.com/robots.txt

When you open your blog at that address, you will see the code you placed in your robots.txt file.
Here is a screenshot.
From the Author:
Adding a custom robots.txt file to Blogger is easy; I hope you did not find any difficulty after reading this tutorial.
For SEO and site ranking, you can adjust the robots.txt file in your blog as needed, so keep learning about SEO tactics to improve your chances of ranking well on Google.