Make robots.txt reachable

If disallow_robots is enabled, make sure we not only install and
alias the robots.txt file but also grant permission to read it and
omit it from our general article URL rewrites.

Change-Id: I9532dd8fd18010aaad388e8fdcbc3051fc653234
Author: Jeremy Stanley
Date: 2016-09-10 14:35:51 +00:00
parent f5a6ac32a7
commit 35f57bcb4f
1 changed file with 6 additions and 0 deletions


@@ -98,10 +98,16 @@
 <% if scope['mediawiki::disallow_robots'] == true %>
 # Request that search engines not index this site
 Alias /robots.txt /srv/mediawiki/robots.txt
+<Directory "/srv/mediawiki/robots.txt">
+  Require all granted
+</Directory>
 <% end %>
 # Redirect old /Article_Name urls
 RewriteEngine on
+<% if scope['mediawiki::disallow_robots'] == true %>
+RewriteCond %{REQUEST_URI} !^/robots.txt$
+<% end %>
 RewriteCond %{REQUEST_URI} !^/w/
 RewriteCond %{REQUEST_URI} !^/wiki/
 RewriteRule ^/(.*)$ https://<%= @vhost_name %>/wiki/$1 [L,R]
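The effect of the RewriteCond chain can be sketched in plain code: each RewriteCond is a negated guard, and the RewriteRule only fires when every guard passes. This is a minimal sketch, not the template itself; the function name and the `wiki.example.org` host are hypothetical stand-ins for `@vhost_name`.

```python
import re

def rewrite(path, vhost_name, disallow_robots=True):
    """Mimic the RewriteCond/RewriteRule chain: redirect old
    /Article_Name URLs to /wiki/Article_Name, but leave /robots.txt
    (only when disallow_robots is set), /w/, and /wiki/ alone."""
    # RewriteCond %{REQUEST_URI} !^/robots.txt$  (guarded by the ERB if)
    if disallow_robots and re.match(r"^/robots\.txt$", path):
        return None  # served directly via the Alias, not rewritten
    # RewriteCond %{REQUEST_URI} !^/w/ and !^/wiki/
    if re.match(r"^/w/", path) or re.match(r"^/wiki/", path):
        return None  # already a canonical MediaWiki path
    # RewriteRule ^/(.*)$ https://<vhost>/wiki/$1 [L,R]
    return "https://%s/wiki/%s" % (vhost_name, path.lstrip("/"))
```

Without the new condition, `/robots.txt` would fall through to the final rule and be redirected to `/wiki/robots.txt`, which is exactly the breakage the commit fixes.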