A text file stored on a website’s server that includes basic rules for the indexing robots that “crawl” the site. This file lets you explicitly allow or disallow crawler bots from viewing certain files and folders, which keeps your indexed pages limited to only the pages you want to appear.
It is a file that tells a search engine crawler not to search all, or certain parts, of a website that you want to keep private, stopping those pages from appearing in search engine results. Webmasters use it to give search engine robots (like Googlebot) directives on how to crawl a website: it sets out rules for search engines to follow when crawling, letting you designate which pages to skip and which pages are allowed.
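As a minimal sketch, a robots.txt file placed at the root of a site (for example, example.com/robots.txt) might look like the following; the folder and file names here are hypothetical, and are only meant to show the directive format:

    # Apply these rules to all crawlers
    User-agent: *
    # Block everything inside the /private/ folder
    Disallow: /private/
    # Make one page inside that folder an exception
    Allow: /private/annual-report.html
    # Point crawlers to the sitemap
    Sitemap: https://example.com/sitemap.xml

The asterisk after User-agent applies the rules to every crawler; you can instead name a specific bot (such as Googlebot) to give it its own rules. Most major search engines honor the Allow directive as an exception to a broader Disallow, though robots.txt is a voluntary convention and does not technically prevent access to the blocked pages.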