Robots.txt Generator
Create robots.txt files to control crawler access and improve SEO.
What is Robots.txt?
User-agent
The specific robot to which the record applies. * is a wildcard for all robots.
Disallow
The directive used to tell a user-agent not to crawl a particular URL or directory.
Allow
Used to tell Googlebot (and other crawlers that support it) that it can access a page or subfolder even though its parent directory is disallowed.
Sitemap
The location of your site's Sitemap XML file.
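Putting those directives together, a minimal robots.txt file might look like the following sketch. The paths and sitemap URL are placeholder examples, not values the generator produces:

```
User-agent: *
Allow: /admin/help.html
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```

This tells every crawler to stay out of /admin/, carves out a single exception for /admin/help.html, and points crawlers at the sitemap.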
Robots.txt is a text file webmasters create to instruct web robots (typically search engine robots) how to crawl pages on their website. The robots.txt file is part of the robots exclusion protocol (REP), a group of web standards that regulate how robots crawl the web, access and index content, and serve that content to users.
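To see the exclusion protocol in action, you can check a URL against a set of rules with Python's built-in `urllib.robotparser`. The rules and URLs below are hypothetical examples; note that Python's parser applies the first matching rule rather than the longest match, so the Allow line is listed before the broader Disallow:

```python
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
# parse() accepts the lines of a robots.txt file; normally you would
# call set_url(...) and read() to fetch a live file instead.
parser.parse("""
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
""".splitlines())

# The Allow rule matches first, so this page is crawlable.
print(parser.can_fetch("*", "https://example.com/private/public-page.html"))  # True
# Everything else under /private/ is blocked by the Disallow rule.
print(parser.can_fetch("*", "https://example.com/private/secret.html"))       # False
# Paths with no matching rule are allowed by default.
print(parser.can_fetch("*", "https://example.com/blog/"))                     # True
```

Remember that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism.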