Robots.txt Basic Syntax

Allowing all web crawlers access to all content

User-agent: *
Disallow:

In this robots.txt file, the empty Disallow directive blocks nothing, so it tells all web crawlers that they may crawl every page on www.example.com, including the homepage.
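You can verify this behavior with Python's standard-library `urllib.robotparser`. The sketch below (the domain `www.example.com` is just a placeholder) parses the rule set above and checks whether a crawler may fetch a couple of paths:

```python
from urllib.robotparser import RobotFileParser

# The permissive rule set from above, as a list of lines.
rules = [
    "User-agent: *",
    "Disallow:",
]

parser = RobotFileParser()
parser.parse(rules)

# An empty Disallow directive blocks nothing, so any path is allowed.
print(parser.can_fetch("*", "https://www.example.com/"))      # True
print(parser.can_fetch("*", "https://www.example.com/page"))  # True
```

If the rule were `Disallow: /` instead, `can_fetch` would return `False` for every path, since a lone slash matches the entire site.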
       

