The robots.txt file tells search engines which parts of a website they are allowed (or not allowed) to crawl.
If you have a website and want it indexed by search engines, your content must be crawlable so that search engines can index it. At the same time, there are often pages on a website that you do not want search engines to visit at all.
Why is robots.txt necessary?
It is necessary because a large website usually contains some data that is useless to search engines. If you allow everything to be crawled, that low-value content gets indexed along with your real content, and this can create problems for your website.
Suppose you let search engines index all of your data: the useless pages will be crawled too, search engines will find little value in them, and your whole website can be judged by that weak content.
If search engines stop valuing your website, your pages may be indexed poorly or even lose rankings. Once you create a robots.txt file, search engines (Google, Bing, Yahoo, etc.) will not crawl the paths you disallow, so content that has no value for users or for search engines stays out of the index.
How to Create a robots.txt File?
If you are using the Yoast SEO plugin, you can easily create a robots.txt file. Here I will show you how to create this file using Yoast SEO.
First, install and activate the plugin: go to Plugins, then Add New, and search for "Yoast SEO". Click Install, and once the installation is complete, activate the plugin.
After installing Yoast SEO, you will find an SEO tab in the sidebar. Clicking it reveals options such as General, Search Appearance, Search Console, Social, and Tools. Click on Tools, and a new page will open in front of you.
Here you will find several options. Click on File editor, then click the Create robots.txt file option. Once you do, you will see the editor screen.
Some directives will appear by default. A User-agent: * line is needed in every robots.txt file, whether you create it with Yoast SEO or with any other tool.
Now, if you want to block some data, write a rule in this form:
Disallow: /path-to-block/
Between the slashes, put the path you want to hide from search engines, i.e. anything that is useless to search engines or to users, such as plugin files or parts of wp-content.
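Put together, a typical WordPress robots.txt might look like the sketch below. The blocked paths here are only examples; choose the ones that match your own site:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
```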
It is not necessary to explicitly allow pages or data in robots.txt, because once you publish content, search engines will crawl it automatically. But any data you do not want indexed must be disallowed in the robots.txt file. After that, click the Save button. You have now successfully created a robots.txt file for your website.
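If you want to double-check which URLs your rules actually block before saving, Python's standard urllib.robotparser module can evaluate them for you. This is only a quick sketch; the rules and the example.com URLs are made-up stand-ins for your own site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, supplied as a list of lines.
rules = [
    "User-agent: *",
    "Disallow: /wp-admin/",
]

parser = RobotFileParser()
parser.parse(rules)

# A disallowed path is reported as not fetchable...
print(parser.can_fetch("*", "https://example.com/wp-admin/options.php"))  # False
# ...while a normal post is still allowed.
print(parser.can_fetch("*", "https://example.com/my-first-post/"))  # True
```

Running this locally lets you confirm the effect of each Disallow line without waiting for a search engine to recrawl your site.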