Hervé Beraud fbe3f2fe7c Adding pre-commit
Introduced changes:
- Add a pre-commit config and rules.
- Add pre-commit to the pep8 gate; flake8 is covered by the pre-commit hooks.
- Apply fixes for pre-commit compliance across the code base.

Also, commit hashes will be used instead of version tags in pre-commit to
prevent arbitrary code from running on developers' machines.

pre-commit will be used to:
- Trim trailing whitespace (trailing-whitespace);
- Replace or check mixed line endings (mixed-line-ending);
- Forbid files which have a UTF-8 byte-order marker (check-byte-order-marker);
- Check that non-binary executables have a proper
  shebang (check-executables-have-shebangs);
- Check for files that contain merge conflict strings (check-merge-conflict);
- Check for debugger imports and py37+ breakpoint()
  calls in Python source (debug-statements);
- Attempt to load all YAML files to verify syntax (check-yaml);
- Run flake8 checks (flake8) (local).
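A pre-commit configuration covering the hooks above might look roughly like the following sketch. The repository revision is shown as a placeholder; as the message notes, the real config pins an exact commit hash rather than a version tag:

```yaml
# Illustrative sketch only; rev below is a placeholder, and the real
# config pins each repo to an exact commit hash, not a version tag.
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: <commit-hash>
    hooks:
      - id: trailing-whitespace
      - id: mixed-line-ending
      - id: check-byte-order-marker
      - id: check-executables-have-shebangs
      - id: check-merge-conflict
      - id: debug-statements
      - id: check-yaml
  - repo: local
    hooks:
      - id: flake8
        name: flake8
        entry: flake8
        language: system
        types: [python]
```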

For further details about tests please refer to:

Change-Id: I7ac1599e903577e28fb64bb07a6b984e1ff8a023
Signed-off-by: Moisés Guimarães de Medeiros <>
2020-09-15 15:05:13 +02:00


Sitemap Generator

This script crawls all available sites and extracts all URLs. Based on the URLs, the script generates a sitemap for search engines according to the sitemaps protocol.
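The output format described above can be sketched as follows. This is not the repository's actual generator (which uses Scrapy); it only illustrates the sitemaps-protocol XML that such a tool produces, and the function name and URLs are illustrative:

```python
# Minimal sketch of sitemap output per the sitemaps protocol
# (sitemaps.org). Function name and URLs are illustrative only.
from xml.etree import ElementTree as ET


def build_sitemap(urls):
    """Return a sitemap XML string for the given list of URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        loc = ET.SubElement(entry, "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")


print(build_sitemap(["https://example.org/a", "https://example.org/b"]))
```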


To install the needed modules, you can use pip or the package management system included in your distribution. When using the package management system, the package names may differ. Installation in a virtual environment is recommended.

$ virtualenv venv
$ . venv/bin/activate
$ pip install Scrapy

When using pip, you may also need to install some development packages. For example, on Ubuntu 16.04, install the following packages:

$ sudo apt install gcc libssl-dev python-dev python-virtualenv


To generate a new sitemap file, change into your local clone of the openstack/openstack-doc-tools repository and run the following commands:

$ cd sitemap
$ scrapy crawl sitemap

The script takes several minutes to crawl all available sites. The result is available in the generated sitemap file.



The domain attribute sets the domain to crawl.

For example, to crawl a different domain, use the following command:

$ scrapy crawl sitemap -a domain=<domain>

The result is available in the generated sitemap file.


You can define a set of additional start URLs using the urls attribute. Separate multiple URLs with a comma.

For example:

$ scrapy crawl sitemap -a domain=<domain> -a urls="<url1>,<url2>"
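Scrapy passes each -a name=value pair to the spider as an attribute, so the spider can fold the comma-separated urls value into its start URLs. The following sketch shows one way this might be handled; the function name is illustrative and the real spider's implementation may differ:

```python
# Sketch of merging the comma-separated ``urls`` attribute into a
# spider's start URLs. Names are illustrative, not from the repository.
def merge_start_urls(domain, urls=None):
    """Combine the default domain root with optional extra start URLs."""
    start = ["https://%s/" % domain]
    if urls:
        # Split on commas and ignore empty or whitespace-only entries.
        start.extend(u.strip() for u in urls.split(",") if u.strip())
    return start


merge_start_urls("example.org", "https://example.org/x,https://example.org/y")
```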

Use the LOG_FILE setting to write log messages to the specified file.

For example, to write to scrapy.log:

$ scrapy crawl sitemap -s LOG_FILE=scrapy.log