This adds a uwsgi functional test check to .zuul.yaml so that
deploying Deckhand via uwsgi (in a more standalone fashion,
sans containerization) works as intended.
Change-Id: I931ab4d11719daca7665d3a25b00e353c707237e
This patchset converts much of the previous logic in
functional-tests.sh into Ansible playbooks to be executed
by Zuul. This mainly includes all the Docker-related
deployment logic.
The functional-tests.sh script has been slimmed down to
just work with uwsgi so that a standalone functional
test deployment can be performed relatively easily,
mainly by developers.
Finally, py27 support for the gate has been dropped
as the Dockerfile in this project currently assumes
python3 for installing requirements and so forth,
leading to requirements issues blocking the gate.
Change-Id: I903a2845390061641d292fb0c016ba6a53723fc9
This patchset adds functional tests to .zuul.yaml. Additionally
it adds a functional-py35 job as well which will also be kicked
off via Zuul.
Change-Id: Ic2d1db4d3cd65c4d93c3a6f04e6efeeba9755f07
This PS adds noauth middleware that bypasses Keystone authentication
when Deckhand's server is executed in development mode. Development
mode is enabled by setting development_mode to True in
etc/deckhand/deckhand.conf.sample.
The logic is similar to Drydock's here: [0].
[0] 1c78477e95/drydock_provisioner/util.py (L43)
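A minimal sketch of the pattern, assuming falcon-style middleware (class and attribute names here are illustrative, not Deckhand's actual implementation):

```python
class NoAuthFilter(object):
    """Falcon-style middleware that fabricates an admin context,
    bypassing Keystone when development_mode is enabled."""

    def process_request(self, req, resp):
        # Mimic what keystonemiddleware would set after validating a
        # real token; the values below are placeholders.
        req.context = {
            'authenticated': True,
            'user': 'dev',
            'roles': ['admin'],
        }
```

falcon dispatches middleware by duck typing, so the sketch needs no falcon import; the real middleware would only be wired in when the development flag is set.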
Co-Authored-By: Luna Das <luna.das@imaginea.com>
Co-Authored-By: Felipe Monteiro <felipe.monteiro@att.com>
Change-Id: I677d3d92768e0aa1a550772700403e0f028b0c59
This updates the releasenotes/docs tox jobs to remove the need
to define build_sphinx in setup.cfg, to ensure that both jobs
clean up prior to running via appropriate rm -rf commands,
and to ensure all the requirements are installed.
Change-Id: Iadd375dbb596151cb140fae03b82a728a64364a0
This patch set adds integration tests to Deckhand
where "integration" means the interaction between
Deckhand, Barbican and Keystone. OSH is used to
deploy Keystone and Barbican and Docker to deploy
PostgreSQL and Deckhand.
Unlike functional testing in Deckhand, all
integration tests use the in-code policy
defaults and an admin token supplied by Keystone
to validate authN and authZ.
The test scenarios consist of Deckhand secret
lifecycle management as well as document rendering
with secrets retrieved from Barbican.
Change-Id: Ib5ae1b345b2a4bd579671ec4ae9a232c2e3887dc
Updates Deckhand to use alembic to manage database upgrades.
Moves from creating tables at startup of Deckhand to the
db-sync job.
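The idea behind moving table creation out of startup can be sketched with a toy migration runner; real Deckhand delegates this to alembic revisions, and the table and migration names below are invented for the sketch:

```python
import sqlite3

# Ordered, versioned migrations applied by the db-sync job instead of
# at API startup (names illustrative; alembic manages this for real).
MIGRATIONS = [
    ("001_create_documents",
     "CREATE TABLE documents (id INTEGER PRIMARY KEY, name TEXT)"),
    ("002_add_data_hash",
     "ALTER TABLE documents ADD COLUMN data_hash TEXT"),
]

def db_sync(conn):
    """Apply any unapplied migrations, in order; safe to re-run."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_versions (version TEXT)")
    applied = {row[0] for row in
               conn.execute("SELECT version FROM schema_versions")}
    for version, ddl in MIGRATIONS:
        if version not in applied:
            conn.execute(ddl)
            conn.execute("INSERT INTO schema_versions VALUES (?)", (version,))
```

Re-running db_sync is a no-op once all versions are recorded, which is what lets the job run on every deploy.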
Change-Id: I6f4cb237fadc46fbee81d1c33096f48a720f589f
This is a trivial PS that fixes the tox -e cover job in
tox.ini which was recently broken with [0].
[0] https://review.gerrithub.io/#/c/405318/
Change-Id: Id50a6348e6f306c3d8d68fdd79eb331880e7498b
This PS fixes tox -v skipping over SQLite unit test jobs. tox -v
is used in CICD to run all jobs in tox envlist but currently
py{35,27}-{postgresql,} translates to:
py35-
py27-
py35-postgresql
py27-postgresql
Where the first two should instead be:
py35
py27
This PS also adds --regex flags to the unit test jobs so regular
expressions work with them.
Change-Id: Id468259a1b2e020494bdd58103d8750b4fac6000
Recently JSONB replaced a back-end agnostic data type
for the "data" column in the Document model. This
made it necessary to drop support for running Deckhand
unit tests with any other database store.
However, this arrangement is undesirable, as a user
shouldn't need to have PostgreSQL installed just to
kick off unit tests.
So, this PS re-adds support for running unit tests
via an in-memory sqlite database.
To run unit tests with sqlite:
tox -e py35
Unit tests still run against postgresql via:
tox -e py35-postgresql
Both jobs are executed in CICD already.
This PS also updates the remaining DB columns to use JSONB if
postgresql is enabled; else fallback columns are used for testing
with sqlite. This is a necessary change to make the column data
types consistent.
Change-Id: I951f2f04fd013d635bb7653a238ff1eb3725b5e1
For whatever reason, the following command:
pifpaf run postgresql -- <test command>
is not returning a non-zero error code on test failure.
(An example print out is included below.)
This PS updates pretty_tox.sh to forcibly raise a non-zero error
code in the event of test failure. It also renames the script
to run_pifpaf.sh to be more intuitive.
Example:
======
Totals
======
Ran: 7 tests in 5.7673 sec.
- Passed: 6
- Skipped: 0
- Expected Fail: 0
- Unexpected Success: 0
- Failed: 1
Sum of execute time for each test: 2.6962 sec.
==============
Worker Balance
==============
- Worker 0 (7 tests) => 0:00:02.698323
+ exit 0
py35: commands succeeded
congratulations :)
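run_pifpaf.sh is a shell script, but the check it adds can be sketched in Python: parse the summary for a non-zero failure count and use that as the exit code (the regex and function name are illustrative, not the script's actual contents):

```python
import re

def exit_code_from_summary(output):
    """Return 1 if the testr summary reports any failed tests, else 0.
    Forces a non-zero exit even though the wrapped pifpaf command
    exits 0 regardless of test failure."""
    match = re.search(r"- Failed:\s*(\d+)", output)
    return 1 if match and int(match.group(1)) > 0 else 0
```

The script would then pass this value to exit so the gate job fails when tests do.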
Change-Id: I7b1fa9d42295d06752997f251a0ec14082b44d03
This reverts https://review.gerrithub.io/#/c/393980/ which was
a temporary workaround to unblock the Deckhand gate. pifpaf should
be used to run unit tests as having to install Docker just to kick
off unit tests is excessive.
However, the unit-tests.sh script is maintained in tools/ directory
as a fallback.
Change-Id: I24a10d4b3ea00006004f27d0086719fb0bf86dd9
Unusual documents are documents with different data
types for the data field. The data types include:
object, array, string and integer.
This PS makes necessary ORM model and schema
changes needed to support the different data types.
The ORM data type for the data column has been changed
to JSONB for PostgreSQL. Thus, Deckhand now only supports
PostgreSQL, and the tox jobs have been updated
to use PostgreSQL only.
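The four data shapes the column must now accept can be checked with a quick round-trip through json (sample values are illustrative):

```python
import json

# Object, array, string and integer -- each shape round-trips through
# JSON intact, which is what the JSONB column stores natively.
samples = [
    {'replicas': 3},    # object
    ['a', 'b'],         # array
    'a plain string',   # string
    42,                 # integer
]
for sample in samples:
    assert json.loads(json.dumps(sample)) == sample
```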
Change-Id: I53694d56bef71adacb5eb79162678be73acb4ad8
This PS unblocks the gate by replacing pifpaf with Docker to run
PostgreSQL for unit tests, as a workaround. This is because
"pifpaf run postgresql" is failing, with pifpaf unable to
find the command "pifpaf run". Steps to reproduce:
python3 -m virtualenv -p python3 /tmp/venv
source /tmp/venv/bin/activate
pip install -U pip wheel devpi-client setuptools
pip install pifpaf
$pifpaf run postgresql
>> pifpaf: 'run' is not a pifpaf command. See 'pifpaf --help'.
>> Did you mean one of these?
help
The unit test script for spinning up the docker postgresql container
and then running unit tests is very similar to the pre-existing
script for running functional tests located in tools/ directory.
Change-Id: Ib0f414ff58007037ac12161876dcd7a10e91f48c
This PS implements oslo.policy integration in Deckhand.
The policy.py file implements 2 types of functions for
performing policy enforcement in Deckhand: authorize,
a decorator applied directly to falcon on_HTTP_VERB
methods that raises a 403 immediately if policy
enforcement fails; and conditional_authorize, to be
used conditionally inside controller code.
For example, since Deckhand has two types of documents
with respect to security -- encrypted and cleartext
documents -- policy enforcement is conditioned on the
type of the documents' metadata.storagePolicy.
Included in this PS:
- policy framework implementation
- policy in code and policy documentation for all
Deckhand policies
- modification of functional test script to override
default admin-only policies with custom policy file
dynamically created using lax permissions
- bug fix for filtering out deleted documents (and
its predecessors in previous revisions) for
PUT /revisions/{revision_id}/documents
- policy documentation
- basic unit tests for policy enforcement framework
- allow functional tests to be filtered via regex
Due to the size of this PS, functional tests related to
policy enforcement will be done in a follow up.
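The authorize decorator described above can be sketched as follows; oslo.policy's Enforcer is replaced by a plain callable here, and the exception class and policy action name are illustrative:

```python
import functools

class Forbidden(Exception):
    """Stand-in for falcon.HTTPForbidden (status 403)."""
    status = '403 Forbidden'

def authorize(action, enforcer):
    """Decorator for falcon on_HTTP_VERB methods: enforce `action`
    against the request context before the handler body runs."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(self, req, resp, *args, **kwargs):
            if not enforcer(action, req.context):
                raise Forbidden()
            return func(self, req, resp, *args, **kwargs)
        return wrapper
    return decorator
```

conditional_authorize would call the same enforcer from inside the controller, after inspecting each document's metadata.storagePolicy.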
Change-Id: If418129f9b401091e098c0bd6c7336b8a5cd2359
Unskip some pep8 rules that aren't unreasonably annoying:
E121 - continuation line under-indented for hanging indent
E122 - continuation line missing indentation or outdented
E123 - closing bracket does not match indentation of opening bracket’s line
E124 - closing bracket does not match visual indentation
E125 - continuation line with same indent as next logical line
E126 - continuation line over-indented for hanging indent
E251 - unexpected spaces around keyword / parameter equals
Change-Id: Idf2640fc2d10715a687c46c3e853122ce38109ee
This PS revamps document hashing. Instead of relying on Python's
built-in hash function to hash the contents of a document (i.e.
metadata and data values), sha256 from hashlib is used instead,
mostly for security purposes.
Further, new parameters have been added to the document DB model:
data_hash and metadata_hash, and the old value hash has been
dropped. The data type for storing the hashes has been changed
to String from BigInt.
Finally, testing documentation was added.
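A sketch of the new hashing scheme; the canonical-JSON serialization step is an assumption here, not necessarily Deckhand's exact approach:

```python
import hashlib
import json

def make_hash(section):
    """sha256 over a canonical JSON rendering of a document's metadata
    or data section; the result is a hex string (hence the column type
    change from BigInt to String)."""
    canonical = json.dumps(section, sort_keys=True).encode('utf-8')
    return hashlib.sha256(canonical).hexdigest()
```

Sorting keys makes the hash stable across dict orderings, unlike Python's built-in hash, which is also salted per process.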
Change-Id: I428ddcbce1007ea990ca0df1aa630072a050c722
Sphinx can be leveraged to auto-generate docs into feature-rich
HTML pages. Docstrings in modules and classes can be easily
auto-injected into documentation to produce high-level
documentation, yet with accompanying code blocks and
necessary docstrings.
This commit introduces the tox job for auto-generating docs,
as well as the foundational logic and documentation pages
(index, HACKING and glossary). While hacking rules are
introduced in HACKING.rst, they are not added in this
commit; that will be done in a follow-up patchset.
Additional documentation will also be included in a series
of future patchsets.
Change-Id: Iacd0e4542ebf481d66ab19dd43014b8d5bcc9e3f
Update test-requirements.txt to use latest version of:
* hacking
Enable the following off-by-default checks:
* [H203] Use assertIs(Not)None to check for None.
* [H204] Use assert(Not)Equal to check for equality.
* [H205] Use assert(Greater|Less)(Equal) for comparison.
* [H210] Require ‘autospec’, ‘spec’, or ‘spec_set’ in
mock.patch/mock.patch.object calls
* [H904] Delay string interpolations at logging calls.
Made minimal code changes to comply with changes.
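Compliant forms of the newly enabled checks, sketched as a toy test case (the examples are illustrative, not taken from Deckhand's tests):

```python
import unittest
from unittest import mock

class ExampleTest(unittest.TestCase):
    def test_compliant_assertions(self):
        value = None
        self.assertIsNone(value)      # H203, not assertEqual(None, value)
        self.assertEqual(4, 2 + 2)    # H204, not assertTrue(4 == 2 + 2)
        self.assertGreater(3, 2)      # H205, not assertTrue(3 > 2)

    def test_autospec_in_mock_patch(self):
        import json
        # H210: mock.patch requires autospec/spec/spec_set.
        with mock.patch('json.dumps', autospec=True) as fake:
            json.dumps({})
            fake.assert_called_once_with({})
        # H904 (not shown): LOG.info('found %d docs', n),
        # never LOG.info('found %d docs' % n).
```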
Change-Id: I3559ead76b5476650d7193e7023d349175234922
1) Add Dockerfile
2) Add entrypoint.sh
3) Add uwsgi in requirements.txt and remove it from
tox.ini
Change-Id: Ie9086335b5e6403e5b1e46981db110894606b9d1
This commit fixes flake8 errors and fixes a minor bug related to
a schema version being v1 rather than v1.0.
OpenStack hacking rules are used to pin down flake8 to sane
standards using [0].
[0] 06e676c461/test-requirements.txt (L5)
Change-Id: Ib236df6f5ec9505c0e635f0faa9877d3397a2e55
This commit constitutes 1 of 2 monolithic ports from Github.
The following major changes have been made:
- Created schemas for validating different types of documents
(control and document schemas), including:
* certificate key
* certificate
* data schema
* document
* layering policy
* passphrase
* validation policy
- Implemented pre-validation logic which validates that each
type of document conforms to the correct schema specifications
- Implemented views for APIs -- this allows views to change the
DB data to conform with API specifications
- Implemented relevant unit tests
- Implemented functional testing foundation
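The pre-validation step above can be sketched as a dispatch on each document's schema; real Deckhand validates against jsonschema specs, so the minimal required-keys check below stands in for that, and the control-schema names are illustrative:

```python
# Control schemas get stricter handling than ordinary documents.
CONTROL_SCHEMAS = {
    'deckhand/DataSchema/v1',
    'deckhand/LayeringPolicy/v1',
    'deckhand/ValidationPolicy/v1',
}

def pre_validate(document):
    """Reject documents missing the top-level structure; return True
    when the document is a control document, False otherwise."""
    for key in ('schema', 'metadata', 'data'):
        if key not in document:
            raise ValueError('document missing required key: %s' % key)
    return document['schema'] in CONTROL_SCHEMAS
```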
Change-Id: I83582cc26ffef91fbe95d2f5f437f82d6fef6aa9
This commit adds the documents API and adds logic for performing
pre-validation schema checking wherever applicable in the
documents API.
The following endpoints in the documents API have been implemented:
- POST /documents
This commit adds the initial engine framework for Deckhand. Included
is the logic for parsing YAML files as well as validating them and
doing forward substitution as specified by the YAML file.
This commit also includes unit tests for the framework changes.
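The forward-substitution idea can be sketched as injecting a source value into a destination document at a path; the dotted-path syntax and function name below are illustrative, not the engine's actual API:

```python
def substitute(dest_data, dest_path, src_value):
    """Inject src_value into dest_data at the dotted dest_path,
    e.g. '.chart.values.tls.certificate', creating intermediate
    mappings as needed."""
    keys = [k for k in dest_path.split('.') if k]
    node = dest_data
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    node[keys[-1]] = src_value
    return dest_data
```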
This commit implements the core Deckhand API framework.
It does not implement any real API routes. The core
framework is modeled after Drydock's [0].
This commit specifically:
- implements the core API framework which uses falcon
- implements errors.py for preliminary errors
- implements base resource class from which other API
resources will inherit to build out the API itself
- implements base API router
- implements entry-point for kicking off deckhand
- updates base README.rst with instructions on
running and installing -- similar to Drydock's
- implements dummy API resource for secrets, to
be fleshed out further in a follow-up commit
[0] https://github.com/att-comdev/drydock
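The base-resource pattern can be sketched without falcon itself, since falcon dispatches to on_<http_verb> methods by duck typing (class and method names are illustrative):

```python
import json

class BaseResource(object):
    """Base class the API resources inherit from; shared helpers
    live here while subclasses implement only the verbs they support."""

    def to_json(self, body_dict):
        return json.dumps(body_dict)

class SecretsResource(BaseResource):
    """Dummy secrets resource, to be fleshed out in a follow-up."""

    def on_get(self, req, resp):
        resp.body = self.to_json({'secrets': []})
```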
* DECKHAND-11: Add oslo.config integration to Deckhand
This commit adds oslo.config integration to Deckhand. It also
creates a lot of preliminary files/configuration settings
needed to run tox as well as lint and oslo-config-generator
jobs.
* Remove sample config file.
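The oslo.config integration registers typed options on a global CONF object and reads them from deckhand.conf; a stdlib configparser stand-in shows the same idea (only development_mode comes from this repo's sample config; the [database] option is illustrative):

```python
import configparser

SAMPLE_CONF = """
[DEFAULT]
development_mode = true

[database]
connection = sqlite://
"""

# Parse the sample config and read back a typed (boolean) option,
# the way CONF.development_mode would be consumed in code.
conf = configparser.ConfigParser()
conf.read_string(SAMPLE_CONF)
assert conf.getboolean('DEFAULT', 'development_mode') is True
```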