# robots.txt template
Hugo can generate a customized robots.txt in the same way as any other template.
To generate a robots.txt file from a template, enable it in your site configuration:

```yaml
enableRobotsTXT: true
```

```toml
enableRobotsTXT = true
```

```json
{
  "enableRobotsTXT": true
}
```
By default, Hugo generates robots.txt using an embedded template:

```text
User-agent: *
```

Search engines that honor the Robots Exclusion Protocol will interpret this as permission to crawl everything on the site.
## robots.txt template lookup order
You may override the embedded template with a custom template. Hugo selects the template using this lookup order:

1. `/layouts/robots.txt`
2. `/themes/<THEME>/layouts/robots.txt`
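Because a project-level template wins over a theme's template, one common way to customize a theme's robots.txt is to copy it into the project's own `layouts` directory and edit it there. A minimal sketch (the theme name `my-theme` and the template contents are hypothetical):

```shell
# Hypothetical theme template, shown only so the sketch is self-contained:
mkdir -p themes/my-theme/layouts layouts
printf 'User-agent: *\n' > themes/my-theme/layouts/robots.txt

# Copy the theme's template into the project; per the lookup order,
# layouts/robots.txt now takes precedence over the theme's copy.
cp themes/my-theme/layouts/robots.txt layouts/robots.txt
```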
## robots.txt template example

layouts/robots.txt

```text
User-agent: *
{{ range .Pages }}
Disallow: {{ .RelPermalink }}
{{ end }}
```
This template creates a robots.txt file with a `Disallow` directive for each page on the site. Search engines that honor the Robots Exclusion Protocol will not crawl any page on the site.
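A selective variant is also possible. The sketch below (not from the Hugo docs; the `noindex` front matter parameter is hypothetical) disallows only pages that opt out via front matter:

```text
User-agent: *
{{ range .Pages }}
{{ if .Params.noindex }}
Disallow: {{ .RelPermalink }}
{{ end }}
{{ end }}
```

Blank lines produced by the `range` and `if` actions are harmless in robots.txt, but you can trim them with Go template whitespace control (`{{- ... -}}`) if you prefer tidier output.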
To create a robots.txt file without using a template:

1. Set `enableRobotsTXT` to `false` in the site configuration.
2. Create a robots.txt file in the `static` directory.
Remember that Hugo copies everything in the `static` directory to the root of `publishDir` (typically `public`) when you build your site.
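For reference, a hand-written `static/robots.txt` might look like this (the disallowed path is purely illustrative):

```text
User-agent: *
Disallow: /admin/
```

Because the file is copied verbatim, it ends up at the site root (e.g. `public/robots.txt`), which is where crawlers expect to find it.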
Last updated: February 17, 2025