The robots.txt file is a plain text file that tells web robots (most often search engine crawlers) which pages on your site to crawl or not to crawl. In it, you can specify which parts of your website should not be accessed by search engine crawlers.
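As a minimal sketch, a robots.txt file placed at the root of your site (e.g. https://example.com/robots.txt) could look like the following; the /admin/ and /tmp/ paths here are hypothetical placeholders, so substitute the directories you actually want to keep crawlers out of:

```
# Apply the rules below to all crawlers
User-agent: *

# Block crawling of these (hypothetical) directories
Disallow: /admin/
Disallow: /tmp/

# Everything not matched by a Disallow rule remains crawlable
```

Note that robots.txt is a voluntary convention: well-behaved crawlers such as Googlebot respect it, but it is not an access-control mechanism, so sensitive content should be protected by authentication rather than a Disallow rule.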