6466fa8419
Crawlers that ignore our robots.txt are triggering archive creation so rapidly that our rootfs fills up between weekly purges, so purging once a day should hopefully mitigate further problems.

Change-Id: Ib4e56fbd666f7bf93c017739697d8443d527b8c7
Files changed:
- app.ini.j2
- docker-compose.yaml.j2
- gitea.vhost.j2
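The diff itself is not shown, but given that `app.ini.j2` is among the changed templates, the change likely adjusts Gitea's built-in archive cleanup cron task. A minimal sketch of what such a daily schedule could look like (the `[cron.archive_cleanup]` section and its keys come from Gitea's cron configuration; the exact values here are assumptions, not the actual diff):

```ini
; Sketch of a possible app.ini.j2 change: run Gitea's repository
; archive cleanup daily instead of weekly, so archives generated by
; crawlers are purged before they can fill the rootfs.
[cron.archive_cleanup]
ENABLED = true
; assumed schedule: once a day at midnight (was presumably a weekly spec)
SCHEDULE = @midnight
; assumed retention: delete archives older than 24 hours
OLDER_THAN = 24h
```

With a daily schedule, the worst-case accumulation window shrinks from seven days of crawler-triggered archives to one, which matches the intent described in the commit message.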