Warn crawlers not to touch the tripwire
OpenDev's static content sites implement web application firewall rules designed to catch and block malicious crawlers by instructing them not to index a particular non-existent file and then blocking any client that tries to reach it. Add a corresponding robots.txt directive to the docs.openstack.org site, similar to the one already used in https://docs.opendev.org/robots.txt

Change-Id: I3fa133f06998d69245ce34191771604a7dce451a
Signed-off-by: Jeremy Stanley <fungi@yuggoth.org>
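The change itself only adds the robots.txt directive; the firewall side lives elsewhere in OpenDev's infrastructure and is not shown here. As a rough illustration of the tripwire idea described above, a hypothetical blocking check (all names here are illustrative, not OpenDev's actual implementation) could look like:

```python
# Sketch of the tripwire technique: robots.txt disallows a URL that does
# not exist and is never linked, so no well-behaved crawler should ever
# request it. Any client that does fetch it is assumed to be ignoring
# robots.txt and gets blocked from then on.

TRIPWIRE_PATH = "/83572085-b2cf-4be3-b8ed-abff38714d87.html"

blocked_ips: set[str] = set()

def allow_request(client_ip: str, path: str) -> bool:
    """Return True if the request should be served, False if blocked."""
    if client_ip in blocked_ips:
        return False
    if path == TRIPWIRE_PATH:
        # The client followed the disallowed tripwire URL: block it.
        blocked_ips.add(client_ip)
        return False
    return True
```

In practice the same effect is usually achieved with web server or WAF rules rather than application code, but the logic is the same: one request for the tripwire URL is enough evidence of a misbehaving crawler.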
@@ -15,3 +15,6 @@ Disallow: /ironic/pike
 Disallow: /ironic/queens
 Disallow: /ironic/rocky
 Disallow: /ironic/stein
+
+# Tripwire URL designed to catch and block misbehaving crawlers
+Disallow: /83572085-b2cf-4be3-b8ed-abff38714d87.html