
What is Custom Robots.txt? Why is it Important? How to Use It?


If you have a blog or website and do not know what robots.txt is, it is very important to learn about it. In this post, we give you clear information about robots.txt.

Without accurate information about robots.txt, everything on the blog gets shown in search results, whether we want it there or not.

In this post, we explain all of this in detail, including how you can control exactly which parts of your blog appear in the search results.

robots.txt is also called custom robots.txt. It is a small text file that tells search engine bots which parts of the blog they may crawl or index.

For example, if you want only the articles of your blog to appear in search results, this setting lets you show only the articles.

Search engine bots (robots) follow the robots.txt file when they crawl. If you have not added this text file to your blog, they will crawl and show whatever they want on their own.

What is robots.txt?

Search engine bots act on the commands you give in the blog's robots.txt file. Whether you want to show only articles or something else along with them, whatever you place in the robots.txt file decides what is shown.

For example, if you do not want the blog's category or tag pages to be indexed, that is, if you want to hide them, you can do this by giving a command. After that, the categories and tags will not be visible on any search engine such as Google, Yahoo, or Bing.
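As a rough sketch, assuming a Blogger blog whose label (tag) pages live under /search/label/ (Blogger's default path; other platforms use different paths), a rule like this would keep those pages from being crawled:

User-agent: *
Disallow: /search/label/
Allow: /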

If you want to see the robots.txt of any blog or website, type /robots.txt after its URL address in any browser. In the same way, you can also view your own. Keep in mind that a blog to which no robots.txt has been added will not show one.
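For example, for a hypothetical blog at www.example.com, you would open this address in the browser:

https://www.example.com/robots.txt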

Advantages of the Robots.txt File

There are many advantages to adding a robots.txt file to a blog.

With the help of robots.txt, you can keep any post or content of your blog private from the search engine.

Any type of data, such as a file, folder, image, or PDF, can be prevented from being indexed or shown in the search engine (see the sketch at the end of this section).

Without it, search engine crawlers do not show only the blog's articles in the search results; everything starts appearing there. Through robots.txt, you can keep the results clean and show only the articles.

Through the robots.txt file, you will also be able to keep sensitive information private.

Robots.txt also reveals the blog's sitemap to the search engine bots, which helps the blog get indexed properly.

According to your needs, you will be able to prevent your blog's categories, labels, tags, images, documents, and so on from being indexed.

You will be able to have duplicate pages of the blog ignored before they are indexed, so that they are not shown in the search results.
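As a rough sketch of blocking files and folders, assuming a hypothetical /private/ folder and some PDF files you want kept out of results, rules like these would do it (Googlebot understands the * and $ wildcards used here, though not every crawler does):

User-agent: *
Disallow: /private/
Disallow: /*.pdf$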

What is the Syntax of robots.txt?

It is very important to know the syntax used in the robots.txt setting, because only with correct information about all the commands will you be able to add good commands to your blog.

Now, one by one, we will look at all the commands in detail and what each one means.

Allow: - Search engine bots will crawl or index the content of the blog that you allow inside the robots.txt file and show it in the search results.

Disallow: - Blocks unwanted paths, such as categories, labels, or tags that come after the blog address, and prevents them from being indexed in the search engine.

User-agent: - This line names the bot a group of rules applies to, such as Googlebot, Googlebot-Image, AdsBot-Google, or Mediapartners-Google. The rules for that bot are written below the user-agent line.

User-agent: *: - To give the same commands to all search engine robots, we use the * sign after User-agent.

User-agent: Mediapartners-Google: - When AdSense ads run on the blog, you use Mediapartners-Google as the user-agent. This lets the AdSense crawler read the pages so that relevant ads can be shown on them, without affecting the rules you give to other bots.

Sitemap: - This line points to the blog's sitemap.xml, an XML file through which Google gets to know about all the posts on the blog, so the search engine can easily index and rank your blog posts.
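Putting these commands together, here is a minimal annotated sketch; the sitemap URL is only a placeholder, and everything after a # is a comment that crawlers ignore:

User-agent: *      # the rules below apply to all bots
Disallow: /search  # do not crawl search and label result pages
Allow: /           # everything else may be crawled
Sitemap: https://www.example.com/sitemap.xml  # location of the XML sitemap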
  

How to Add a Custom Robots.txt File?

To add custom robots.txt to your Blogger blog, we are going to tell you how to generate a robots.txt file and add it to your blog.

For this, follow all the steps given below carefully, because adding the wrong code, that is, a wrong command, to the robots.txt file can have a bad effect on the ranking of your blog.

Several websites are available that can generate a robots.txt file for you for free. You can also copy the robots.txt code given below and add it to your blog. Before adding it, replace http://www.example.com with your blog's URL address.

Robots.txt file code


Generally, everyone uses one of the two types of code given below, in which you can add more commands as per your choice.

User-agent: *
Disallow: /search
Allow: /
Sitemap: http://www.example.com/atom.xml?redirect=false&start-index=1&max-results=500

or

User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.example.com/sitemap.xml

If you use Google ads on the blog, then the commands are added as given below.

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.example.com/sitemap.xml

Now, paste this robots.txt text into your blog. We explain below how and where to add it.

Follow these settings to add the robots.txt file to your blog:



  • To do this setting, first go to your blog's dashboard and click on Settings.
  • After that, scroll down the page to the section headed "Crawlers and indexing".
  • Turn on the Enable custom robots.txt toggle.
  • Then click on Custom robots.txt.
  • A small box will open for adding the custom robots.txt. Paste the text there.
  • Finally, click Save.
  • Custom robots.txt has now been added to your blog. You can verify it as shown below.
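To confirm that it has been applied, open your blog's robots.txt address in a browser, using your own domain in place of the hypothetical www.example.com. The output should match the text you pasted, for example:

https://www.example.com/robots.txt

User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.example.com/sitemap.xml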

We hope you have understood what custom robots.txt is, why it is necessary for a blog, and how it is added, and that you will be able to add it to your own blog. If you liked this information, please share it on social sites so it can help other bloggers.
