Robots.txt is a file that lives in the root directory of a website and is used to tell search engines which directories or files of the site they are allowed to crawl and include in their index. According to Rubright, a better approach is
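For illustration, a minimal robots.txt might look like the sketch below; the paths and sitemap URL are hypothetical placeholders, not taken from any particular site:

User-agent: *          # these rules apply to all crawlers
Disallow: /admin/      # ask crawlers to skip the admin area
Disallow: /tmp/        # ask crawlers to skip temporary files
Allow: /               # everything else may be crawled

Sitemap: https://www.example.com/sitemap.xml

Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but a disallowed URL can still end up in an index if other pages link to it.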