From 5b21d5f8e227f9be480c82cd800a54c574a5b920 Mon Sep 17 00:00:00 2001
From: Christian Berendt
Date: Sat, 22 Aug 2015 10:10:07 +0200
Subject: [PATCH] [sitemap] transform README from Markdown to reStructuredText

Change-Id: Icdd8269323075cb3b9a7f81d9118cf57b36a7267
---
 sitemap/{README.md => README.rst} | 31 ++++++++++++++-----------------
 1 file changed, 14 insertions(+), 17 deletions(-)
 rename sitemap/{README.md => README.rst} (61%)

diff --git a/sitemap/README.md b/sitemap/README.rst
similarity index 61%
rename from sitemap/README.md
rename to sitemap/README.rst
index a95f0bad..7c47db75 100644
--- a/sitemap/README.md
+++ b/sitemap/README.rst
@@ -1,39 +1,36 @@
-= Sitemap Generator
+Sitemap Generator
+*****************
 
 This script crawls all available sites on http://docs.openstack.org and
 extracts all URLs. Based on the URLs the script generates a sitemap for
 search engines according to the protocol described at
 http://www.sitemaps.org/protocol.html.
 
-== Usage
+Usage
+=====
 
 To generate a new sitemap file simply run the spider using the following
 command. It will take several minutes to crawl all available sites on
 http://docs.openstack.org. The result will be available in the file
-```sitemap_docs.openstack.org.xml```.
+``sitemap_docs.openstack.org.xml``.
 
-```
-$ scrapy crawl sitemap
-```
+   $ scrapy crawl sitemap
 
-It's also possible to crawl other sites using the attribute ```domain```.
+It's also possible to crawl other sites using the attribute ``domain``.
 For example to crawl http://developer.openstack.org use the following command.
-The result will be available in the file ```sitemap_developer.openstack.org.xml```.
+The result will be available in the file ``sitemap_developer.openstack.org.xml``.
 
-```
-$ scrapy crawl sitemap -a domain=developer.openstack.org
-```
+   $ scrapy crawl sitemap -a domain=developer.openstack.org
 
-To write log messages into a file append the parameter ```-s LOG_FILE=scrapy.log```.
+To write log messages into a file append the parameter ``-s LOG_FILE=scrapy.log``.
 
-== Dependencies
+Dependencies
+============
 
-* Scrapy (https://pypi.python.org/pypi/Scrapy)
+* `Scrapy <https://pypi.python.org/pypi/Scrapy>`_
 
 To install the needed modules you can use pip or the package management
 system included in your distribution. When using the package management
 system maybe the name of the packages differ. When using pip it's maybe
 necessary to install some development packages.
 
-```
-$ pip install scrapy
-```
+   $ pip install scrapy
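
As a usage sketch for the converted README (assuming the ``sitemap`` spider name and the ``domain`` attribute shown in the diff; ``scrapy crawl`` accepts ``-a`` for spider arguments and ``-s`` for settings overrides), the install step, the domain attribute, and the log-file setting can be combined into a single run:

   $ pip install scrapy
   $ scrapy crawl sitemap -a domain=developer.openstack.org -s LOG_FILE=scrapy.log

The crawl would then write log messages to ``scrapy.log`` and leave its result in ``sitemap_developer.openstack.org.xml``, as described in the README.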