Crawlers expect to find robots.txt in the top-level directory of a
site, not in a subtree. Since we publish multiple Sphinx projects in
subdirectories of docs.opendev.org, we can't effectively include our
robots.txt in a docs build. Instead, manage it with Ansible and tell
Apache to serve it directly.

This uses a generic robots.txt on the expectation that other vhosts
may reuse the same one, but installs it into a dedicated directory
so that we can add domain-specific robots.txt files alongside it as
needed.
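For reference, the approach could be sketched with Ansible tasks like
the following; the paths, file names, and task layout here are
illustrative assumptions, not the exact contents of this change:

```yaml
# Hypothetical sketch: actual role structure and paths in
# system-config may differ.
- name: Create dedicated directory for served robots.txt files
  file:
    path: /var/www/robots
    state: directory
    mode: '0755'

- name: Install generic robots.txt
  copy:
    src: robots.txt
    dest: /var/www/robots/robots.txt
    mode: '0644'
```

In the Apache vhost, a directive along the lines of
`Alias /robots.txt /var/www/robots/robots.txt` would then serve the
managed file at the top level of the site, ahead of anything in the
published docs trees.
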
Change-Id: I4f8179fcda85f8d84fa5f960b3afe2a96fe92e43