Robots.txt Basic Syntax
Allowing all web crawlers access to all content
User-agent: *
Disallow:
Using this syntax in a robots.txt file tells web crawlers that they may crawl every page on www.example.com, including the homepage, because an empty Disallow directive blocks nothing.
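As a quick way to check how a crawler would interpret these two lines, here is a minimal sketch using Python's standard urllib.robotparser; the example.com URLs are placeholders, and the rules are parsed directly rather than fetched from a live site.

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Parse the two-line allow-all robots.txt shown above.
rp.parse([
    "User-agent: *",
    "Disallow:",
])

# An empty Disallow value blocks nothing, so every path is crawlable.
print(rp.can_fetch("*", "https://www.example.com/"))          # True
print(rp.can_fetch("*", "https://www.example.com/any/page"))  # True

Running the sketch prints True for both URLs, confirming that the wildcard user-agent rule with an empty Disallow leaves the entire site open to crawling.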