
Johan Gant

Drupal Team Manager

A patch for the robotstxt module


We make heavy use of drush make in our builds, each of which usually includes a fresh copy of Drupal core.
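For illustration, here’s a minimal sketch of how a patch can be applied to core automatically as part of a drush make build. The filename and patch URL below are placeholders, not the actual patch from the issue queue:

```
; build.make - illustrative sketch only; the patch URL is a placeholder
core = 7.x
api = 2

; Fetch Drupal core and apply a patch to it during the build
projects[drupal][type] = core
projects[drupal][patch][] = "https://www.drupal.org/files/example-robotstxt-htaccess.patch"

; Contrib module that serves robots.txt from the database
projects[robotstxt][subdir] = contrib
```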

By default, Drupal installs with a robots.txt file in the root. For those of us who use the robotstxt module, one of the installation steps is to delete this file. With it gone, Drupal handles the HTTP request itself, and site administrators can manage the contents of their robots.txt through the admin interface.

When you’re rolling out automated deployments of entire Drupal sites, the last thing you want is to delete that file each and every time. There are a few potential solutions to this problem - perhaps a post-install process to tidy up the file - but we opted to submit a patch to the robotstxt module's issue queue. The patch adds a rewrite rule to the .htaccess file that ensures Drupal always handles requests for robots.txt. This means we don’t have to delete the robots.txt file and can safely deploy and redeploy sites that use robotstxt.
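To give a sense of the approach (a sketch of the idea, not the exact patch), the rule routes requests for robots.txt to Drupal’s front controller before Apache’s usual “serve the file if it exists” behaviour kicks in, so the robotstxt module can generate the response even though a physical robots.txt is present:

```
# Sketch only: always route robots.txt through index.php so the
# robotstxt module answers the request, even if the file exists on disk
<IfModule mod_rewrite.c>
  RewriteEngine on
  RewriteRule ^robots\.txt$ index.php?q=robots.txt [L]
</IfModule>
```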

The patch touches a file that is part of Drupal core, and while we’re aware that patching core is normally something to avoid, we have a very specific requirement and the patch is applied as part of an automated process. That means we can easily remove it and redeploy the entire build once a more suitable solution is available.