commit bd8cc59dc2

We have observed that tempest-full jobs were timing out many times:
http://status.openstack.org/elastic-recheck/#1783405

Based on the ethercalc below, we are checking for slow tests and marking
them as slow so that they will not run as part of the tempest-full job.
Another job, tempest-slow, will run these tests instead.
https://ethercalc.openstack.org/dorupfz6s9qt

Compute slow tests have been marked slow in
I2a0e154ba38c7407b41b7e986a0bf9b451c65cae.

This commit marks network slow tests based on the above ethercalc.

Change-Id: Ic2b3f5ea5b6885fe727a21810ddd9e17b779a1a0
Partial-Bug: #1783405
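For context, Tempest tags such tests with an attribute decorator so that a job's test-selection regex can exclude them. The sketch below is a simplified, self-contained stand-in for that mechanism (it is not Tempest's actual `tempest.lib.decorators.attr` implementation, and the helper names here are illustrative only):

```python
# Simplified sketch of attribute-based test tagging, loosely modeled on
# Tempest's attr decorator. A "slow" tag is recorded on the test
# function; a tempest-full-style selection then skips anything so tagged.

def attr(**kwargs):
    """Record attributes such as type='slow' on the decorated test."""
    def decorator(fn):
        tags = kwargs.get('type', [])
        if not isinstance(tags, list):
            tags = [tags]
        # Merge with any tags set by stacked decorators.
        fn.test_attrs = getattr(fn, 'test_attrs', set()) | set(tags)
        return fn
    return decorator


@attr(type='slow')
def test_network_advanced_server_ops():
    """A scenario test tagged as slow (illustrative only)."""


def test_quick_check():
    """An untagged, fast test (illustrative only)."""


def selected_for_tempest_full(fn):
    """tempest-full-style selection: run only tests NOT tagged slow."""
    return 'slow' not in getattr(fn, 'test_attrs', set())
```

With this selection rule, `test_quick_check` would still run in tempest-full, while the tagged test would be left to a tempest-slow-style job.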
Files:

- README.rst
- __init__.py
- manager.py
- test_aggregates_basic_ops.py
- test_encrypted_cinder_volumes.py
- test_minimum_basic.py
- test_network_advanced_server_ops.py
- test_network_basic_ops.py
- test_network_v6.py
- test_object_storage_basic_ops.py
- test_security_groups_basic_ops.py
- test_server_advanced_ops.py
- test_server_basic_ops.py
- test_server_multinode.py
- test_shelve_instance.py
- test_snapshot_pattern.py
- test_stamp_pattern.py
- test_volume_backup_restore.py
- test_volume_boot_pattern.py
- test_volume_migrate_attached.py
README.rst
Tempest Field Guide to Scenario tests
=====================================

What are these tests?
---------------------
Scenario tests are "through path" tests of OpenStack function: complicated setups where one part might depend on the completion of a previous part. They ideally involve integration between multiple OpenStack services to exercise the touch points between them.
Any scenario test should have a real-life use case. An example would be:

- "As an operator I want to start with a blank environment":

  - upload a glance image
  - deploy a VM from it
  - ssh to the guest
  - create a snapshot of the VM
Why are these tests in Tempest?
-------------------------------
This is one of Tempest's core purposes, testing the integration between projects.
Scope of these tests
--------------------
Scenario tests should always use the Tempest implementation of the OpenStack API, as we want to ensure that bugs aren't hidden by the official clients.
Tests should be tagged with which services they exercise, as determined by which client libraries are used directly by the test.
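Such tagging can be sketched as a decorator that records the exercised services on the test function; the stand-in below is loosely modeled on this idea (it is not Tempest's actual implementation, and the test name is used only as an example):

```python
# Simplified sketch of service tagging: record which OpenStack services
# a scenario test exercises, so tooling can select tests by service.

def services(*service_names):
    """Record the services a test exercises on the test function."""
    def decorator(fn):
        fn.services = set(service_names)
        return fn
    return decorator


@services('compute', 'network')
def test_security_groups_basic_ops():
    """Illustrative test that uses Nova and Neutron clients directly."""
```

A runner could then, for example, select every test whose `services` set includes `'network'` when only networking changes are under review.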
Example of a good test
----------------------
While we are looking for interaction between two or more services, be specific in your interactions. A giant "this is my data center" smoke test is hard to debug when it goes wrong.
A flow of interactions between Glance and Nova, like in the introduction, is a good example, especially if it involves a repeated interaction where a resource is set up, modified, detached, and then reused again later.