Zaqar

Message queuing service for OpenStack. To find more information, read our wiki.

Running a local Zaqar server with MongoDB

Note: These instructions are for running a local instance of Zaqar; depending on your environment, not all of the steps are required. It is assumed that you already have MongoDB installed and running.
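
If you want to confirm that mongod is reachable before you start, a quick check with the mongo shell (assuming the shell is installed and mongod is listening on the default port) looks like this:

$ mongo --eval "db.runCommand({ ping: 1 })"

A reply containing "ok" : 1 means the server is up.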

  1. From your home folder create the ~/.zaqar folder and clone the repo:

    $ cd
    $ mkdir .zaqar
    $ git clone https://github.com/openstack/zaqar.git
  2. Copy the Zaqar config files to the directory ~/.zaqar:

    $ cp zaqar/etc/zaqar.conf.sample ~/.zaqar/zaqar.conf
    $ cp zaqar/etc/logging.conf.sample ~/.zaqar/logging.conf
  3. Find the [drivers] section in ~/.zaqar/zaqar.conf and set the storage driver to mongodb:

    storage = mongodb

    Then find the [drivers:storage:mongodb] section and modify the URI to point to your local mongod instance:

    uri = mongodb://$MONGODB_HOST:$MONGODB_PORT

    By default, you will have:

    uri = mongodb://127.0.0.1:27017
  4. For logging, find the [DEFAULT] section in ~/.zaqar/zaqar.conf and modify as desired:

    log_file = server.log
  5. Change directories back to your local copy of the repo:

    $ cd zaqar
  6. Run the following so you can see the results of any changes you make to the code without having to reinstall the package each time:

    $ pip install -e .
  7. Start the Zaqar server with logging level set to INFO so you can see the port on which the server is listening:

    $ zaqar-server -v
  8. Test that Zaqar is working by creating a queue (a follow-up example that posts a message to this queue appears after the sample response below):

    $ curl -i -X PUT http://127.0.0.1:8888/v1/queues/samplequeue \
      -H "Content-type: application/json"

You should get an HTTP 201 along with some headers that will look similar to this:

HTTP/1.0 201 Created
Date: Fri, 25 Oct 2013 15:34:37 GMT
Server: WSGIServer/0.1 Python/2.7.3
Content-Length: 0
Location: /v1/queues/samplequeue
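
As a further smoke test, you can post a message to the queue you just created. The request below assumes the same v1 API as the queue-creation step; in v1, message operations also require a Client-ID header containing an arbitrary UUID (the one shown here is only an example):

$ curl -i -X POST http://127.0.0.1:8888/v1/queues/samplequeue/messages \
  -H "Content-type: application/json" \
  -H "Client-ID: 1b2c3d4e-5f60-4718-829a-bc0de1f23456" \
  -d '[{"ttl": 300, "body": {"event": "hello zaqar"}}]'

A 201 response indicates the message was enqueued.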

Running tests

First install additional requirements:

$ pip install tox

And then run tests:

$ tox -e py27

You can read more about running functional tests in the separate TESTS_README.
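
If tox.ini forwards positional arguments to the test runner (as most OpenStack projects do via {posargs}), you can limit a run to a subset of tests; the test path below is only an illustration. A pep8 style check can be run the same way, assuming tox.ini defines that environment:

$ tox -e py27 -- tests.unit.queues
$ tox -e pep8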

Running the benchmarking tool

First install and run zaqar-server (see above).

Then install additional requirements:

$ pip install -r bench-requirements.txt

Copy the configuration file to ~/.zaqar:

$ cp etc/zaqar-benchmark.conf.sample ~/.zaqar/zaqar-benchmark.conf

In the configuration file specify where zaqar-server can be found:

server_url = http://localhost:8888
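
Before running the benchmark, you can verify that the server is actually reachable at that URL. The check below assumes the v1 health endpoint is enabled; any 2xx response means the server is up:

$ curl -i http://localhost:8888/v1/health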

The benchmarking tool needs a set of messages to work with. Specify the path to the file with messages in the configuration file. Alternatively, put it in the directory with the configuration file and name it zaqar-benchmark-messages.json. As a starting point, you can use the sample file from the etc directory:

$ cp etc/zaqar-benchmark-messages.json ~/.zaqar/

If the file is not found or no file is specified, a single hard-coded message is used for all requests.

Run the benchmarking tool using the following command:

$ zaqar-bench-pc

By default, the command will run a performance test for 3 seconds, using one consumer and one producer for each CPU on the system, with 2 greenlet workers per CPU per process. You can override these defaults in the config file or on the command line using a variety of options. For example, the following command runs a performance test for 10 seconds using 4 producer processes with 20 workers each, plus 1 consumer process with 4 workers:

$ zaqar-bench-pc -pp 4 -pw 20 -cp 1 -cw 4 -t 10

By default, the results are in JSON. For more human-readable output, add the --verbose flag. Verbose output looks similar to the following:

Starting Producer...

Starting Consumer...

Consumer
========
duration_sec: 10.2
ms_per_claim: 37.6
ms_per_delete: 11.8
reqs_per_sec: 82.0
successful_reqs: 833.0
total_reqs: 833.0

Producer
========
duration_sec: 10.2
ms_per_req: 3.8
reqs_per_sec: 1033.6
successful_reqs: 10523.0
total_reqs: 10523.0
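
If you keep the default JSON output, you can pretty-print it for easier reading; this assumes the tool writes a single JSON document to stdout:

$ zaqar-bench-pc | python -m json.tool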