What is the purpose of the robots.txt file in SEO?

sophies2019-12-16T06:35:53Z

Robots.txt is a text file webmasters create to instruct web robots (typically search engine robots) how to crawl pages on their website. The robots.txt file is part of the robots exclusion protocol (REP), a group of web standards that regulate how robots crawl the web, access and index content, and serve that content up to users.

The main purpose of the robots.txt file is to tell web robots which pages on your site they may crawl and which pages they should not crawl.
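As an illustration, a minimal robots.txt might look like the sketch below (the `/admin/` and `/tmp/` paths are made-up examples, not paths from any specific site). It allows all crawlers everywhere except two directories, and points crawlers at the sitemap:

```
# Apply these rules to all crawlers
User-agent: *
# Block crawling of these example directories
Disallow: /admin/
Disallow: /tmp/
# Everything else may be crawled
Allow: /

# Optional: tell crawlers where the sitemap lives
Sitemap: https://www.example.com/sitemap.xml
```

The file must be placed at the root of the domain (e.g. `https://www.example.com/robots.txt`) for crawlers to find it. Note that robots.txt is advisory: well-behaved crawlers respect it, but it is not an access-control mechanism.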

Anonymous2019-11-21T09:51:43Z

Simply put, it's used to direct search engines to the appropriate content on your site.