Developer Documentation


Robots API

The Robots API provides central control over the robots meta tag. The robots meta tag gives you granular, page-specific control over how an individual page should be indexed and served to users in search engine results. The meta tag is automatically placed in the <head> section of a page.

<!DOCTYPE html>
<head>
	<meta name="robots" content="max-image-preview:large, follow" />
</head>

The Robots API allows you to hook into this meta tag to modify its values. By default, the robots meta tag includes a directive that sets the maximum size of image previews for images on the page. To disable this completely, remove the default callback from the wp_robots filter hook:

remove_filter( 'wp_robots', 'wp_robots_max_image_preview_large' );

You can modify the contents of the robots meta tag using the wp_robots filter as well. The values are passed into the filter as an array.

add_filter( 'wp_robots', function ( array $robots ) : array {
	// A value of true outputs the directive name on its own.
	$robots['follow'] = true;

	// A string value outputs as name:value.
	$robots['foo'] = 'bar';

	// Remove the default image preview directive.
	unset( $robots['max-image-preview'] );

	return $robots;
} );

The example above would output the following:

<meta name="robots" content="follow, foo:bar" />
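The same filter can also change a directive's value rather than remove it. As a sketch, to reduce the default image preview size (the max-image-preview directive accepts the values none, standard, and large):

```php
add_filter( 'wp_robots', function ( array $robots ) : array {
	// Downgrade the default preview size from "large" to "standard".
	$robots['max-image-preview'] = 'standard';

	return $robots;
} );
```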

Note that on local environments, and when the "Search engine visibility" option on the admin Reading settings screen is set to "Discourage search engines from indexing this site", the robots meta tag defaults to include noindex and nofollow, in addition to any custom directives, unless overridden via the filter.
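If you need to force indexing despite those defaults, one approach is to remove the directives in a later-priority callback so it runs after the defaults have been added:

```php
add_filter( 'wp_robots', function ( array $robots ) : array {
	// Drop the noindex/nofollow defaults; use with care.
	unset( $robots['noindex'], $robots['nofollow'] );

	return $robots;
}, 20 );
```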

For more information, refer to the wp_robots hook developer documentation or this list of available robots meta values.


The SEO module will read a custom robots.txt file from /.config/robots.txt in your project's root directory.

The robots.txt file is a standard for providing instructions to bots that visit your site. There is no guarantee that bots will obey its directives, however, so if content must not be indexed you should take additional measures, such as adding rel="nofollow" attributes to links and a robots meta tag with a value of noindex to your website's head.
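For example, the two measures mentioned above might look like this in your markup (the link URL here is purely illustrative):

```html
<head>
	<!-- Ask well-behaved crawlers not to index this page -->
	<meta name="robots" content="noindex" />
</head>
...
<!-- Ask crawlers not to follow this particular link -->
<a href="/private/report" rel="nofollow">Private report</a>
```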

An example robots.txt file may look like the following:

# Add a custom sitemap
Sitemap: /custom-sitemap.xml

# Disallow /private for all user agents
User-agent: *
Disallow: /private

# Allow /private/special for one user agent
User-agent: friendly-bot
Allow: /private/special

The contents of that file are appended to the robots.txt file generated by the CMS, which is served at <site-url>/robots.txt.
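For example, assuming the generated file contains the typical WordPress core rules, the combined output served at <site-url>/robots.txt might look like:

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

# Add a custom sitemap
Sitemap: /custom-sitemap.xml

# Disallow /private for all user agents
User-agent: *
Disallow: /private

# Allow /private/special for one user agent
User-agent: friendly-bot
Allow: /private/special
```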

Programmatically generated directives may be added to robots.txt via the robots_txt filter.

add_filter( 'robots_txt', function ( string $output ) : string {
	// Append a directive group to the generated output.
	$output .= "\nUser-agent: *\nDisallow: /private\n";

	return $output;
} );
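The robots_txt filter also receives the site's "Search engine visibility" setting as a second argument, so directives can be added conditionally:

```php
add_filter( 'robots_txt', function ( string $output, $public ) : string {
	// $public is truthy when search engines are not discouraged.
	if ( $public ) {
		$output .= "\nUser-agent: *\nDisallow: /private\n";
	}

	return $output;
}, 10, 2 );
```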