c499b57e16
We've noticed that our uwsgi queues are filling up and a large number of requests are being made to robots.txt, which end up returning 500/503 errors. Add a robots.txt file that allows crawling of our lists and archives, with a Crawl-delay value in the hope that bots will cache the results and not fill the queue with repetitive requests.

Change-Id: I660d8d43f6b2d96663212d93ec48e67d86e9e761
8 lines
97 B
Plaintext
User-agent: *
Disallow: /accounts/*
Allow: /archives/*
Allow: /mailman3/lists/*
Crawl-delay: 2
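As a quick sanity check, here is a minimal Python sketch (not part of this change) of how a wildcard-aware crawler would interpret these rules. Note that Python's standard-library urllib.robotparser does plain prefix matching and would not honor the trailing * wildcards, so the sketch uses fnmatch with Google-style longest-match semantics; the sample paths in the assertions are illustrative assumptions.

from fnmatch import fnmatch

# Rules from the robots.txt above, in file order.
RULES = [
    ("Disallow", "/accounts/*"),
    ("Allow", "/archives/*"),
    ("Allow", "/mailman3/lists/*"),
]
CRAWL_DELAY = 2  # seconds a polite bot should wait between requests

def is_allowed(path: str) -> bool:
    # Google-style semantics: the longest matching pattern wins, and a
    # path that matches no rule is allowed by default. (Tie-breaking
    # between equal-length Allow/Disallow rules is ignored here.)
    matches = [(len(pattern), verb) for verb, pattern in RULES
               if fnmatch(path, pattern)]
    if not matches:
        return True
    return max(matches)[1] == "Allow"

# Account pages are blocked for every crawler...
assert not is_allowed("/accounts/login/")
# ...while the archives and list index stay crawlable.
assert is_allowed("/archives/list/mailman-users/")
assert is_allowed("/mailman3/lists/")
# Anything not mentioned is allowed by default.
assert is_allowed("/")

One caveat on the design: Crawl-delay is not part of the original robots.txt standard and is honored by some crawlers (Bing, for example) but ignored by others, notably Googlebot, so how much queue relief it provides depends on which bots are generating the traffic.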