Configurable external robots.txt file

Currently the robots.txt file used by the application is inside the .war
file and thus difficult to modify for users. This patch adds a new
optional configuration parameter that allows the user to specify an
external robots file, so that it is easy to modify and easy to preserve
during upgrades of the application. If no configuration change is made,
the original robots file is used; the external file is used only if the
following is added to the gerrit.config file:

  [httpd]
    robotsFile = etc/myrobots.txt

If the path indicated by this parameter is relative, it is resolved
against the site directory; if it is absolute, it is used as is.

If the file doesn't exist or can't be read, a message is written to the
log and the default file is used.
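The resolution and fallback behavior described above can be sketched as follows. This is a minimal illustration, not the actual patch code: the class and method names are hypothetical, and the real patch would log through the server's logging facility rather than stderr.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class RobotsFileResolver {
  /**
   * Returns the external robots file to serve, or null to fall back to
   * the robots.txt bundled inside the .war. (Hypothetical helper.)
   */
  static Path resolve(Path sitePath, String robotsFile) {
    if (robotsFile == null || robotsFile.isEmpty()) {
      return null; // httpd.robotsFile not set: use the bundled default
    }
    Path p = Paths.get(robotsFile);
    if (!p.isAbsolute()) {
      p = sitePath.resolve(p); // relative paths resolve under the site directory
    }
    if (!Files.isReadable(p)) {
      // The patch logs a message here; a real server would use its logger.
      System.err.println("Cannot read " + p + ", using default robots.txt");
      return null; // missing or unreadable file: use the bundled default
    }
    return p;
  }
}
```

For example, with `robotsFile = etc/myrobots.txt` and a site directory of `/srv/gerrit`, the file `/srv/gerrit/etc/myrobots.txt` would be served if readable.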

Bug: issue 1968
Change-Id: Iad02dbd97633e9c45dbce15d1f227f3931255e0a
Signed-off-by: Juan Hernandez <juanantonio.hernandez@gmail.com>
Juan Hernandez
2013-08-06 16:30:50 +02:00
parent 4df685e3a8
commit ec51256e6c
3 changed files with 114 additions and 0 deletions

@@ -1747,6 +1747,16 @@ a trusted username in the `TRUSTED_USER` HTTP Header:
 filterClass = org.anyorg.MySecureFilter
 ----
 
+[[httpd.robotsFile]]httpd.robotsFile::
++
+Location of an external robots.txt file to be used instead of the one
+bundled with the .war of the application.
++
+If not absolute, the path is resolved relative to `$site_path`.
++
+If the file doesn't exist or can't be read the default robots.txt file
+bundled with the .war will be used instead.
+
 [[ldap]]Section ldap
 ~~~~~~~~~~~~~~~~~~~~