Regulate Website Crawling with robots.txt
Website crawling is the process by which search engine bots visit your site and index its pages. While crawling is essential for search engine optimization (SEO), sometimes you need to control which parts of your website bots are allowed to access. This is where the robots.txt file comes in handy. robots.txt is a simple text file placed in the root directory of your site that tells compliant crawlers which paths they may and may not request.
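As a sketch of the format, a minimal robots.txt might look like the following (the `/admin/` and `/tmp/` paths and the sitemap URL are illustrative placeholders, not part of any standard):

```txt
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Disallow: /tmp/
Allow: /

# Optional: point crawlers at your sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` group applies to the named bot (`*` matches all), and `Disallow`/`Allow` rules list URL path prefixes. Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism.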