Developer Documentation

Robots.txt

The SEO module will read a custom robots.txt file from /.config/robots.txt in your project's root directory.

The robots.txt file is a standard for providing instructions to the various bots that may visit your site. However, there is no guarantee that bots will obey its directives, so if content must not be indexed, take additional measures such as adding nofollow attributes to links and a robots meta tag with a value of noindex to your website's head.
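
As a sketch of the meta-tag approach, assuming the CMS is WordPress 5.7 or later (wp_robots and is_page() are WordPress APIs; the "private" page slug is a hypothetical example):

add_filter( 'wp_robots', function ( array $robots ): array {
	// Hypothetical condition: mark a page with the slug "private" as noindex.
	if ( is_page( 'private' ) ) {
		$robots['noindex']  = true;
		$robots['nofollow'] = true;
	}

	return $robots;
} );

This emits the robots meta tag in the document head without touching robots.txt, which is the more reliable way to keep a page out of search indexes.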

An example robots.txt file may look like the following:

# Add a custom sitemap (the Sitemap directive requires an absolute URL)
Sitemap: https://example.com/custom-sitemap.xml

# Disallow /private for all user agents
User-agent: *
Disallow: /private

# Allow /private/special for one user agent
User-agent: friendly-bot
Allow: /private/special

The contents of that file will be appended to the robots.txt file generated by the CMS, which is served at <site-url>/robots.txt.
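
For illustration, assuming the CMS emits a WordPress-style default (a User-agent: * group covering /wp-admin/), the served file would combine both sources:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

# Add a custom sitemap (the Sitemap directive requires an absolute URL)
Sitemap: https://example.com/custom-sitemap.xml

# ...the remaining directives from /.config/robots.txt follow here.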

Programmatically generated directives may be added to robots.txt via the robots_txt filter.

// Append a static directive block to the generated robots.txt output.
add_filter( 'robots_txt', function ( string $output ): string {
	$output .= '
User-agent: *
Disallow: /private
';

	return $output;
} );
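
If the filter follows WordPress's robots_txt signature, it also receives the site's search-engine visibility setting as a second argument, which allows directives to be added conditionally. A sketch, with /staging as a hypothetical path:

add_filter( 'robots_txt', function ( string $output, $public ): string {
	// $public reflects the "Search engine visibility" setting; only add
	// the directive when the site is meant to be crawled.
	if ( $public ) {
		$output .= '
User-agent: *
Disallow: /staging
';
	}

	return $output;
}, 10, 2 );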