
[sitemap] transform README from Markdown to reStructuredText

Change-Id: Icdd8269323075cb3b9a7f81d9118cf57b36a7267
tags/0.30.1
Christian Berendt 4 years ago
parent
commit
5b21d5f8e2
1 changed file with 14 additions and 17 deletions

1. +14 -17 sitemap/README.rst

sitemap/README.md → sitemap/README.rst

@@ -1,39 +1,36 @@
-= Sitemap Generator
+Sitemap Generator
+*****************
 
 This script crawls all available sites on http://docs.openstack.org and extracts
 all URLs. Based on the URLs the script generates a sitemap for search engines
 according to the protocol described at http://www.sitemaps.org/protocol.html.
 
-== Usage
+Usage
+=====
 
 To generate a new sitemap file simply run the spider using the
 following command. It will take several minutes to crawl all available sites
 on http://docs.openstack.org. The result will be available in the file
-```sitemap_docs.openstack.org.xml```.
+``sitemap_docs.openstack.org.xml``.
 
-```
-$ scrapy crawl sitemap
-```
+$ scrapy crawl sitemap
 
-It's also possible to crawl other sites using the attribute ```domain```.
+It's also possible to crawl other sites using the attribute ``domain``.
 
 For example to crawl http://developer.openstack.org use the following command.
-The result will be available in the file ```sitemap_developer.openstack.org.xml```.
+The result will be available in the file ``sitemap_developer.openstack.org.xml``.
 
-```
-$ scrapy crawl sitemap -a domain=developer.openstack.org
-```
+$ scrapy crawl sitemap -a domain=developer.openstack.org
 
-To write log messages into a file append the parameter ```-s LOG_FILE=scrapy.log```.
+To write log messages into a file append the parameter ``-s LOG_FILE=scrapy.log``.
 
-== Dependencies
+Dependencies
+============
 
-* Scrapy (https://pypi.python.org/pypi/Scrapy)
+* `Scrapy <https://pypi.python.org/pypi/Scrapy>`_
 
 To install the needed modules you can use pip or the package management system included
 in your distribution. When using the package management system maybe the name of the
 packages differ. When using pip it's maybe necessary to install some development packages.
 
-```
-$ pip install scrapy
-```
+$ pip install scrapy
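
As the README notes, the ``-s LOG_FILE=scrapy.log`` setting can be appended to any of the crawl commands above. Combined with the documented ``domain`` attribute, the full invocation would presumably be ``scrapy crawl sitemap -a domain=developer.openstack.org -s LOG_FILE=scrapy.log``; the log file name is only an example.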

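The README describes the tool as a Scrapy spider named ``sitemap`` that crawls every page under a domain and collects the URLs for the sitemap. Below is a minimal sketch of how such a spider could be structured, assuming Scrapy's generic ``CrawlSpider``/``LinkExtractor`` API; it is an illustration only, not the project's actual implementation, and it omits the step that writes the ``sitemap_<domain>.xml`` file.

```python
# Hypothetical sketch only -- not the project's actual spider.
from scrapy.linkextractors import LinkExtractor
from scrapy.spiders import CrawlSpider, Rule


class SitemapSpider(CrawlSpider):
    """Follow every internal link and yield one record per crawled URL."""

    name = 'sitemap'

    # Follow all links within the allowed domain and hand each page to
    # parse_item().
    rules = [Rule(LinkExtractor(), callback='parse_item', follow=True)]

    def __init__(self, domain='docs.openstack.org', *args, **kwargs):
        # ``domain`` corresponds to ``-a domain=...`` on the command line,
        # as described in the README.
        self.allowed_domains = [domain]
        self.start_urls = [f'http://{domain}']
        super().__init__(*args, **kwargs)

    def parse_item(self, response):
        # Each crawled URL would become one <url><loc>...</loc></url> entry
        # in the sitemap (see http://www.sitemaps.org/protocol.html).
        yield {'loc': response.url}
```

Running ``scrapy crawl sitemap -a domain=developer.openstack.org`` would then start such a spider at http://developer.openstack.org, matching the usage shown in the README.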