DECKHAND-80: Validations API Implementation
The Validations API has been introduced to Deckhand, allowing users to register new validation results in Deckhand, as well as query the API for validation results for a revision. The validation results include a list of errors that occurred during document validation. All functional tests related to the API are now passing.

The following endpoints have been implemented:

* /api/v1.0/revisions/{revision_id}/validations
* /api/v1.0/revisions/{revision_id}/validations/{validation_name}
* /api/v1.0/revisions/{revision_id}/validations/{validation_name}/entries
* /api/v1.0/revisions/{revision_id}/validations/{validation_name}/entries/{entry_id}

Some back-end refactoring was needed to implement this API. In particular:

- Added a new Validation sqlalchemy DB model
- Introduced DataSchema handling to the engine.document_validation module so that registered schema validations can be used
- Changed the way the result of the 'deckhand-schema-validation' internal validation is generated: it is now the amalgamation of all the internal and registered schema validations executed
- Introduced raw-query generation so that raw SQL queries can be used to get results from the DB

Fixed the following bug:

- UniqueConstraint is now used to correctly generate unique constraints for sqlalchemy models that are supposed to be combinations of columns

Change-Id: I53c79a6544f44ef8beab2600ddc8a3ea91ada903
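A validation result is POSTed to the create endpoint as an `application/x-yaml` body, which the server round-trips through `yaml.safe_load`. A minimal sketch of building such a payload (the validator name and error contents here are purely illustrative; the field names mirror those consumed by `validation_create`: `status`, `validator`, `errors`):

```python
import yaml

# Illustrative payload for POST
# /api/v1.0/revisions/{revision_id}/validations/{validation_name}.
validation_payload = {
    'status': 'failure',
    'validator': {
        'name': 'promenade',  # hypothetical external validator
        'version': '1.1.2',
    },
    'errors': [
        {'documents': [{'name': 'site-1234',
                        'schema': 'promenade/Node/v1'}],
         'message': 'Node has master role, but not included in cluster.'},
    ],
}

# Serialize the way a client would before POSTing.
body = yaml.safe_dump(validation_payload)

# The server side parses the request body back with yaml.safe_load.
parsed = yaml.safe_load(body)
```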
This commit is contained in:
parent
46803b7e60
commit
8aec0390f8
@@ -49,8 +49,15 @@ class BucketsResource(api_base.BaseResource):
        # NOTE: Must validate documents before doing policy enforcement,
        # because we expect certain formatting of the documents while doing
        # policy enforcement.
        validation_policies = self._create_validation_policies(documents)
        # policy enforcement. If any documents fail basic schema validation
        # raise an exception immediately.
        doc_validator = document_validation.DocumentValidation(documents)
        try:
            validations = doc_validator.validate_all()
        except (deckhand_errors.InvalidDocumentFormat,
                deckhand_errors.InvalidDocumentSchema) as e:
            LOG.error(e.format_message())
            raise falcon.HTTPBadRequest(description=e.format_message())

        for document in documents:
            if document['metadata'].get('storagePolicy') == 'encrypted':
@@ -60,29 +67,14 @@ class BucketsResource(api_base.BaseResource):

        self._prepare_secret_documents(documents)

        # Save all the documents, including validation policies.
        documents_to_create = documents + validation_policies
        created_documents = self._create_revision_documents(
            bucket_name, list(documents_to_create))
            bucket_name, documents, validations)

        if created_documents:
            resp.body = self.view_builder.list(created_documents)
        resp.status = falcon.HTTP_200
        resp.append_header('Content-Type', 'application/x-yaml')

    def _create_validation_policies(self, documents):
        # All concrete documents in the payload must successfully pass their
        # JSON schema validations. Otherwise raise an error.
        try:
            validation_policies = document_validation.DocumentValidation(
                documents).validate_all()
        except deckhand_errors.InvalidDocumentFormat as e:
            # FIXME(fmontei): Save the malformed documents and the failed
            # validation policy in the DB for future debugging, and only
            # afterward raise an exception.
            raise falcon.HTTPBadRequest(description=e.format_message())
        return validation_policies

    def _prepare_secret_documents(self, secret_documents):
        # Encrypt data for secret documents, if any.
        for document in secret_documents:
@@ -94,9 +86,11 @@ class BucketsResource(api_base.BaseResource):
                    for t in types.DOCUMENT_SECRET_TYPES]):
                document['data'] = {'secret': document['data']}

    def _create_revision_documents(self, bucket_name, documents):
    def _create_revision_documents(self, bucket_name, documents,
                                   validations):
        try:
            created_documents = db_api.documents_create(bucket_name, documents)
            created_documents = db_api.documents_create(
                bucket_name, documents, validations=validations)
        except deckhand_errors.DocumentExists as e:
            raise falcon.HTTPConflict(description=e.format_message())
        except Exception as e:
111 deckhand/control/validations.py Normal file
@@ -0,0 +1,111 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import yaml

import falcon
from oslo_log import log as logging
import six

from deckhand.control import base as api_base
from deckhand.control.views import validation as validation_view
from deckhand.db.sqlalchemy import api as db_api
from deckhand import errors
from deckhand import policy

LOG = logging.getLogger(__name__)


class ValidationsResource(api_base.BaseResource):
    """API resource for realizing validations endpoints."""

    view_builder = validation_view.ViewBuilder()

    @policy.authorize('deckhand:create_validation')
    def on_post(self, req, resp, revision_id, validation_name):
        validation_data = req.stream.read(req.content_length or 0)
        try:
            validation_data = yaml.safe_load(validation_data)
        except yaml.YAMLError as e:
            error_msg = ("Could not parse the validation into YAML data. "
                         "Details: %s." % e)
            LOG.error(error_msg)
            raise falcon.HTTPBadRequest(description=six.text_type(e))

        try:
            resp_body = db_api.validation_create(
                revision_id, validation_name, validation_data)
        except errors.RevisionNotFound as e:
            raise falcon.HTTPNotFound(description=e.format_message())

        resp.status = falcon.HTTP_201
        resp.append_header('Content-Type', 'application/x-yaml')
        resp.body = self.view_builder.show(resp_body)

    def on_get(self, req, resp, revision_id, validation_name=None,
               entry_id=None):
        if all([validation_name, entry_id]):
            self._show_validation_entry(
                req, resp, revision_id, validation_name, entry_id)
        elif validation_name:
            self._list_validation_entries(req, resp, revision_id,
                                          validation_name)
        else:
            self._list_all_validations(req, resp, revision_id)

    @policy.authorize('deckhand:show_validation')
    def _show_validation_entry(self, req, resp, revision_id, validation_name,
                               entry_id):
        try:
            entry_id = int(entry_id)
        except ValueError:
            raise falcon.HTTPBadRequest(
                description='The {entry_id} parameter must be an integer.')

        try:
            entry = db_api.validation_get_entry(
                revision_id, validation_name, entry_id)
        except errors.RevisionNotFound as e:
            raise falcon.HTTPNotFound(description=e.format_message())

        resp_body = self.view_builder.show_entry(entry)
        resp.status = falcon.HTTP_200
        resp.append_header('Content-Type', 'application/x-yaml')
        resp.body = resp_body

    @policy.authorize('deckhand:list_validations')
    def _list_validation_entries(self, req, resp, revision_id,
                                 validation_name):
        try:
            entries = db_api.validation_get_all_entries(revision_id,
                                                        validation_name)
        except errors.RevisionNotFound as e:
            raise falcon.HTTPNotFound(description=e.format_message())

        resp_body = self.view_builder.list_entries(entries)
        resp.status = falcon.HTTP_200
        resp.append_header('Content-Type', 'application/x-yaml')
        resp.body = resp_body

    @policy.authorize('deckhand:list_validations')
    def _list_all_validations(self, req, resp, revision_id):
        try:
            validations = db_api.validation_get_all(revision_id)
        except errors.RevisionNotFound as e:
            raise falcon.HTTPNotFound(description=e.format_message())
        resp_body = self.view_builder.list(validations)

        resp.status = falcon.HTTP_200
        resp.append_header('Content-Type', 'application/x-yaml')
        resp.body = resp_body
55 deckhand/control/views/validation.py Normal file
@@ -0,0 +1,55 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from deckhand.control import common


class ViewBuilder(common.ViewBuilder):
    """Model validation API responses as a python dictionary."""

    _collection_name = 'validations'

    def list(self, validations):
        return {
            'count': len(validations),
            'results': [
                {'name': v[0], 'status': v[1]} for v in validations
            ]
        }

    def list_entries(self, entries):
        results = []

        for idx, e in enumerate(entries):
            results.append({'status': e['status'], 'id': idx})

        return {
            'count': len(entries),
            'results': results
        }

    def show(self, validation):
        return {
            'status': validation.get('status'),
            'validator': validation.get('validator')
        }

    def show_entry(self, entry):
        return {
            'name': entry.get('name'),
            'status': entry.get('status'),
            'createdAt': entry.get('createdAt'),
            'expiresAfter': entry.get('expiresAfter'),
            'errors': entry.get('errors')
        }
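The `list` and `list_entries` response shapes above can be exercised standalone. The sketch below reimplements just those two methods without the `common.ViewBuilder` base class (the sample rows are illustrative):

```python
def list_validations(validations):
    # Mirror of ViewBuilder.list: rows are (name, status) tuples, as
    # returned by the raw validation_get_all query.
    return {
        'count': len(validations),
        'results': [{'name': v[0], 'status': v[1]} for v in validations],
    }


def list_entries(entries):
    # Mirror of ViewBuilder.list_entries: entry ids are list indices.
    results = [{'status': e['status'], 'id': idx}
               for idx, e in enumerate(entries)]
    return {'count': len(entries), 'results': results}


# Sample data shaped like the DB layer's results (values illustrative).
listed = list_validations([('deckhand-schema-validation', 'failure')])
entries = list_entries([{'status': 'success'}, {'status': 'failure'}])
```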
@@ -29,6 +29,7 @@ from oslo_log import log as logging
from oslo_serialization import jsonutils as json
import six
import sqlalchemy.orm as sa_orm
from sqlalchemy import text

from deckhand.db.sqlalchemy import models
from deckhand import errors
@@ -90,7 +91,15 @@ def setup_db():
    models.register_models(get_engine())


def documents_create(bucket_name, documents, session=None):
def raw_query(query, **kwargs):
    """Execute a raw query against the database."""
    stmt = text(query)
    stmt = stmt.bindparams(**kwargs)
    return get_engine().execute(stmt)


def documents_create(bucket_name, documents, validations=None,
                     session=None):
    """Create a set of documents and associated bucket.

    If no changes are detected, a new revision will not be created. This
@@ -125,6 +134,10 @@ def documents_create(bucket_name, documents, session=None):
    if any([documents_to_create, documents_to_delete]):
        bucket = bucket_get_or_create(bucket_name)
        revision = revision_create()
        if validations:
            for validation in validations:
                validation_create(revision['id'], validation['name'],
                                  validation)

        if documents_to_delete:
            LOG.debug('Deleting documents: %s.', documents_to_delete)
@@ -179,6 +192,7 @@ def _documents_create(bucket_name, values_list, session=None):
        return document

    for values in values_list:
        values.setdefault('data', {})
        values = _fill_in_metadata_defaults(values)
        values['is_secret'] = 'secret' in values['data']
@@ -266,7 +280,7 @@ def document_get(session=None, raw_dict=False, **filters):
    # "regular" filters via sqlalchemy and all nested filters via Python.
    nested_filters = {}
    for f in filters.copy():
        if '.' in f:
        if any([x in f for x in ('.', 'schema')]):
            nested_filters.setdefault(f, filters.pop(f))

    # Documents with the same metadata.name and schema can exist across
@@ -286,6 +300,56 @@ def document_get(session=None, raw_dict=False, **filters):
    raise errors.DocumentNotFound(document=filters)


def document_get_all(session=None, raw_dict=False, revision_id=None,
                     **filters):
    """Retrieve all documents for ``revision_id`` that match ``filters``.

    :param session: Database session object.
    :param raw_dict: Whether to retrieve the exact way the data is stored in
        DB if ``True``, else the way users expect the data.
    :param revision_id: The ID corresponding to the ``Revision`` object. If the
        ID is ``None``, then retrieve the latest revision, if one exists.
    :param filters: Dictionary attributes (including nested) used to filter
        out revision documents.
    :returns: Dictionary representation of each retrieved document.
    """
    session = session or get_session()

    if revision_id is None:
        # If no revision_id is specified, grab the newest one.
        revision = session.query(models.Revision)\
            .order_by(models.Revision.created_at.desc())\
            .first()
        if revision:
            filters['revision_id'] = revision.id
    else:
        filters['revision_id'] = revision_id

    # TODO(fmontei): Currently Deckhand doesn't support filtering by nested
    # JSON fields via sqlalchemy. For now, filter the documents using all
    # "regular" filters via sqlalchemy and all nested filters via Python.
    nested_filters = {}
    for f in filters.copy():
        if any([x in f for x in ('.', 'schema')]):
            nested_filters.setdefault(f, filters.pop(f))

    # Retrieve the most recently created documents for the revision, because
    # documents with the same metadata.name and schema can exist across
    # different revisions.
    documents = session.query(models.Document)\
        .filter_by(**filters)\
        .order_by(models.Document.created_at.desc())\
        .all()

    final_documents = []
    for doc in documents:
        d = doc.to_dict(raw_dict=raw_dict)
        if _apply_filters(d, **nested_filters):
            final_documents.append(d)

    return final_documents


####################
@@ -414,10 +478,9 @@ def _apply_filters(dct, **filters):
        unwanted results.
    :return: True if the dictionary satisfies all the filters, else False.
    """
    def _transform_filter_bool(actual_val, filter_val):
    def _transform_filter_bool(filter_val):
        # Transform boolean values into string literals.
        if (isinstance(actual_val, bool)
                and isinstance(filter_val, six.string_types)):
        if isinstance(filter_val, six.string_types):
            try:
                filter_val = ast.literal_eval(filter_val.title())
            except ValueError:
@@ -426,20 +489,23 @@ def _apply_filters(dct, **filters):
                filter_val = None
        return filter_val

    match = True

    for filter_key, filter_val in filters.items():
        actual_val = utils.jsonpath_parse(dct, filter_key)

        # If the filter is a list of possibilities, e.g. ['site', 'region']
        # for metadata.layeringDefinition.layer, check whether the actual
        # value is present.
        if isinstance(filter_val, (list, tuple)):
            if actual_val not in [_transform_filter_bool(actual_val, x)
                                  for x in filter_val]:
                match = False
                break
            actual_val = utils.jsonpath_parse(dct, filter_key, match_all=True)
            if not actual_val:
                return False

            if isinstance(actual_val[0], bool):
                filter_val = [_transform_filter_bool(x) for x in filter_val]

            if not set(actual_val).intersection(set(filter_val)):
                return False
        else:
            actual_val = utils.jsonpath_parse(dct, filter_key)

            # Else if both the filter value and the actual value in the doc
            # are dictionaries, check whether the filter dict is a subset
            # of the actual dict.
@@ -448,16 +514,20 @@ def _apply_filters(dct, **filters):
            is_subset = set(
                filter_val.items()).issubset(set(actual_val.items()))
            if not is_subset:
                match = False
                break
                return False
        else:
            # Else both filters are string literals.
            if actual_val != _transform_filter_bool(
                    actual_val, filter_val):
                match = False
                break
            if isinstance(actual_val, bool):
                filter_val = _transform_filter_bool(filter_val)

    return match
            # Else both filters are string literals.
            if filter_key in ['metadata.schema', 'schema']:
                if not actual_val.startswith(filter_val):
                    return False
            else:
                if actual_val != filter_val:
                    return False

    return True


def revision_get_all(session=None, **filters):
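The boolean coercion in `_transform_filter_bool` works by title-casing the string filter value and evaluating it as a Python literal, so `'true'`, `'True'`, and `'TRUE'` all become `True`. A standalone sketch of just that transform (minus the `six` compatibility shim used in the diff):

```python
import ast


def transform_filter_bool(filter_val):
    # Transform string literals like 'true'/'FALSE' into booleans; mirrors
    # the nested helper inside _apply_filters.
    if isinstance(filter_val, str):
        try:
            filter_val = ast.literal_eval(filter_val.title())
        except ValueError:
            # Non-literal strings like 'foo' become None, which will never
            # match a boolean document field.
            filter_val = None
    return filter_val
```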
@@ -914,3 +984,71 @@ def revision_rollback(revision_id, latest_revision, session=None):
                        new_revision['documents'])

    return new_revision


####################


@require_revision_exists
def validation_create(revision_id, val_name, val_data, session=None):
    session = session or get_session()

    validation_kwargs = {
        'revision_id': revision_id,
        'name': val_name,
        'status': val_data.get('status', None),
        'validator': val_data.get('validator', None),
        'errors': val_data.get('errors', []),
    }

    validation = models.Validation()

    with session.begin():
        validation.update(validation_kwargs)
        validation.save(session=session)

    return validation.to_dict()


@require_revision_exists
def validation_get_all(revision_id, session=None):
    # Query selects only unique combinations of (name, status) from the
    # `Validations` table and prioritizes 'failure' result over 'success'
    # result via alphabetical ordering of the status column. Each document
    # has its own validation but for this query we want to return the result
    # of the overall validation for the revision. If just 1 document failed
    # validation, we regard the validation for the whole revision as 'failure'.
    query = raw_query("""
        SELECT DISTINCT name, status FROM validations as v1
        WHERE revision_id = :revision_id AND status = (
            SELECT status FROM validations as v2
            WHERE v2.name = v1.name
            ORDER BY status
            LIMIT 1
        )
        GROUP BY name, status
        ORDER BY name, status;
    """, revision_id=revision_id)

    result = query.fetchall()
    return result


@require_revision_exists
def validation_get_all_entries(revision_id, val_name, session=None):
    session = session or get_session()

    entries = session.query(models.Validation)\
        .filter_by(**{'revision_id': revision_id, 'name': val_name})\
        .order_by(models.Validation.created_at.asc())\
        .all()

    return [e.to_dict() for e in entries]


@require_revision_exists
def validation_get_entry(revision_id, val_name, entry_id, session=None):
    session = session or get_session()
    entries = validation_get_all_entries(
        revision_id, val_name, session=session)
    return entries[entry_id]
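The failure-over-success behavior of the `validation_get_all` query comes purely from alphabetical ordering ('failure' sorts before 'success'), so it can be demonstrated against a plain in-memory SQLite table with the same query shape (the table contents below are illustrative):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute(
    "CREATE TABLE validations (revision_id INT, name TEXT, status TEXT)")

# Hypothetical rows: one validation name has both a success and a failure
# entry; another succeeded everywhere.
rows = [
    (1, 'deckhand-schema-validation', 'success'),
    (1, 'deckhand-schema-validation', 'failure'),
    (1, 'promenade-site-validation', 'success'),
]
conn.executemany("INSERT INTO validations VALUES (?, ?, ?)", rows)

# 'failure' < 'success' alphabetically, so the correlated subquery picks
# 'failure' whenever any entry for that name failed.
overall = conn.execute("""
    SELECT DISTINCT name, status FROM validations as v1
    WHERE revision_id = :revision_id AND status = (
        SELECT status FROM validations as v2
        WHERE v2.name = v1.name
        ORDER BY status
        LIMIT 1
    )
    GROUP BY name, status
    ORDER BY name, status;
""", {'revision_id': 1}).fetchall()
```

A single failed entry is enough to mark the whole validation as 'failure' for the revision, which is exactly the semantics the code comment describes.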
@@ -23,8 +23,8 @@ from sqlalchemy.ext.hybrid import hybrid_property
from sqlalchemy import ForeignKey
from sqlalchemy import Integer
from sqlalchemy.orm import relationship
from sqlalchemy import schema
from sqlalchemy import String
from sqlalchemy import UniqueConstraint


# Declarative base class which maintains a catalog of classes and tables
@@ -82,13 +82,6 @@ class DeckhandBase(models.ModelBase, models.TimestampMixin):
        return d


def gen_unique_constraint(table_name, *fields):
    constraint_name = 'ix_' + table_name.lower()
    for field in fields:
        constraint_name = constraint_name + '_%s' % field
    return schema.UniqueConstraint(*fields, name=constraint_name)


class Bucket(BASE, DeckhandBase):
    __tablename__ = 'buckets'

@@ -106,6 +99,7 @@ class Revision(BASE, DeckhandBase):
    documents = relationship("Document",
                             primaryjoin="Revision.id==Document.revision_id")
    tags = relationship("RevisionTag")
    validations = relationship("Validation")

    def to_dict(self):
        d = super(Revision, self).to_dict()
@@ -117,8 +111,6 @@ class Revision(BASE, DeckhandBase):
class RevisionTag(BASE, DeckhandBase):
    UNIQUE_CONSTRAINTS = ('tag', 'revision_id')
    __tablename__ = 'revision_tags'
    __table_args__ = (
        gen_unique_constraint(__tablename__, *UNIQUE_CONSTRAINTS),)

    tag = Column(String(64), primary_key=True, nullable=False)
    data = Column(oslo_types.JsonEncodedDict(), nullable=True, default={})
@@ -126,11 +118,12 @@ class RevisionTag(BASE, DeckhandBase):
        Integer, ForeignKey('revisions.id', ondelete='CASCADE'),
        nullable=False)

    UniqueConstraint(*UNIQUE_CONSTRAINTS)


class Document(BASE, DeckhandBase):
    UNIQUE_CONSTRAINTS = ('schema', 'name', 'revision_id')
    __tablename__ = 'documents'
    __table_args__ = (gen_unique_constraint(*UNIQUE_CONSTRAINTS),)

    id = Column(Integer, primary_key=True)
    name = Column(String(64), nullable=False)
@@ -138,7 +131,7 @@ class Document(BASE, DeckhandBase):
    # NOTE(fmontei): ``metadata`` is reserved by the DB, so ``_metadata``
    # must be used to store document metadata information in the DB.
    _metadata = Column(oslo_types.JsonEncodedDict(), nullable=False)
    data = Column(oslo_types.JsonEncodedDict(), nullable=True)
    data = Column(oslo_types.JsonEncodedDict(), nullable=True, default={})
    data_hash = Column(String, nullable=False)
    metadata_hash = Column(String, nullable=False)
    is_secret = Column(Boolean, nullable=False, default=False)
@@ -160,6 +153,8 @@ class Document(BASE, DeckhandBase):
        Integer, ForeignKey('revisions.id', ondelete='CASCADE'),
        nullable=True)

    UniqueConstraint(*UNIQUE_CONSTRAINTS)

    @hybrid_property
    def bucket_name(self):
        if hasattr(self, 'bucket') and self.bucket:
@@ -177,18 +172,34 @@ class Document(BASE, DeckhandBase):
        if not raw_dict:
            d['metadata'] = d.pop('_metadata')

        if 'bucket' in d:
            d.pop('bucket')

        return d


class Validation(BASE, DeckhandBase):
    __tablename__ = 'validations'

    id = Column(Integer, primary_key=True)
    name = Column(String(64), nullable=False)
    status = Column(String(8), nullable=False)
    validator = Column(oslo_types.JsonEncodedDict(), nullable=False)
    errors = Column(oslo_types.JsonEncodedList(), nullable=False, default=[])
    revision_id = Column(
        Integer, ForeignKey('revisions.id', ondelete='CASCADE'),
        nullable=False)


def register_models(engine):
    """Create database tables for all models with the given engine."""
    models = [Bucket, Document, Revision, RevisionTag]
    models = [Bucket, Document, Revision, RevisionTag, Validation]
    for model in models:
        model.metadata.create_all(engine)


def unregister_models(engine):
    """Drop database tables for all models with the given engine."""
    models = [Bucket, Document, Revision, RevisionTag]
    models = [Bucket, Document, Revision, RevisionTag, Validation]
    for model in models:
        model.metadata.drop_all(engine)
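The UniqueConstraint bug fix in this commit is about combined-column uniqueness, e.g. `(tag, revision_id)` on `revision_tags` and `(schema, name, revision_id)` on `documents`: duplicates are allowed in any single column as long as the combination of columns is unique. A minimal SQLite sketch of that semantic (table layout simplified from the model above):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
# Simplified revision_tags table: uniqueness is enforced on the column
# pair, not on either column alone.
conn.execute("""
    CREATE TABLE revision_tags (
        tag TEXT NOT NULL,
        revision_id INTEGER NOT NULL,
        UNIQUE (tag, revision_id)
    )
""")

conn.execute("INSERT INTO revision_tags VALUES ('foo', 1)")
conn.execute("INSERT INTO revision_tags VALUES ('foo', 2)")  # same tag, OK
conn.execute("INSERT INTO revision_tags VALUES ('bar', 1)")  # same rev, OK

try:
    # Exact (tag, revision_id) pair already exists: rejected.
    conn.execute("INSERT INTO revision_tags VALUES ('foo', 1)")
    duplicate_rejected = False
except sqlite3.IntegrityError:
    duplicate_rejected = True
```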
@ -12,33 +12,35 @@
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
import re
|
||||
|
||||
import jsonschema
|
||||
from oslo_log import log as logging
|
||||
|
||||
from deckhand.db.sqlalchemy import api as db_api
|
||||
from deckhand.engine import document as document_wrapper
|
||||
from deckhand.engine.schema import base_schema
|
||||
from deckhand.engine.schema import v1_0
|
||||
from deckhand import errors
|
||||
from deckhand import factories
|
||||
from deckhand import types
|
||||
|
||||
LOG = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class DocumentValidation(object):
|
||||
"""Class for document validation logic for YAML files.
|
||||
|
||||
This class is responsible for validating YAML files according to their
|
||||
schema.
|
||||
|
||||
:param documents: Documents to be validated.
|
||||
:type documents: List of dictionaries or dictionary.
|
||||
"""
|
||||
|
||||
def __init__(self, documents):
|
||||
"""Class for document validation logic for YAML files.
|
||||
|
||||
This class is responsible for validating YAML files according to their
|
||||
schema.
|
||||
|
||||
:param documents: Documents to be validated.
|
||||
:type documents: list[dict]
|
||||
"""
|
||||
if not isinstance(documents, (list, tuple)):
|
||||
documents = [documents]
|
||||
|
||||
self.documents = documents
|
||||
self.documents = [document_wrapper.Document(d) for d in documents]
|
||||
|
||||
class SchemaType(object):
|
||||
"""Class for retrieving correct schema for pre-validation on YAML.
|
||||
@ -48,54 +50,182 @@ class DocumentValidation(object):
|
||||
YAML data.
|
||||
"""
|
||||
|
||||
# TODO(fmontei): Support dynamically registered schemas.
|
||||
schema_versions_info = [
|
||||
{'id': 'deckhand/CertificateKey',
|
||||
'schema': v1_0.certificate_key_schema},
|
||||
'schema': v1_0.certificate_key_schema,
|
||||
'version': '1.0'},
|
||||
{'id': 'deckhand/Certificate',
|
||||
'schema': v1_0.certificate_schema},
|
||||
'schema': v1_0.certificate_schema,
|
||||
'version': '1.0'},
|
||||
{'id': 'deckhand/DataSchema',
|
||||
'schema': v1_0.data_schema},
|
||||
# NOTE(fmontei): Fall back to the metadata's schema for validating
|
||||
# generic documents.
|
||||
{'id': 'metadata/Document',
|
||||
'schema': v1_0.document_schema},
|
||||
'schema': v1_0.data_schema_schema,
|
||||
'version': '1.0'},
|
||||
{'id': 'deckhand/LayeringPolicy',
|
||||
'schema': v1_0.layering_schema},
|
||||
'schema': v1_0.layering_policy_schema,
|
||||
'version': '1.0'},
|
||||
{'id': 'deckhand/Passphrase',
|
||||
'schema': v1_0.passphrase_schema},
|
||||
'schema': v1_0.passphrase_schema,
|
||||
'version': '1.0'},
|
||||
{'id': 'deckhand/ValidationPolicy',
|
||||
'schema': v1_0.validation_schema}]
|
||||
'schema': v1_0.validation_policy_schema,
|
||||
'version': '1.0'},
|
||||
# FIXME(fmontei): Remove this once all Deckhand tests have been
|
||||
# refactored to account for dynamic schema registeration via
|
||||
# `DataSchema` documents. Otherwise most tests will fail.
|
||||
{'id': 'metadata/Document',
|
||||
'schema': v1_0.document_schema,
|
||||
'version': '1.0'}]
|
||||
|
||||
def __init__(self, data):
|
||||
"""Constructor for ``SchemaType``.
|
||||
schema_re = re.compile(
|
||||
'^([A-Za-z]+\/[A-Za-z]+\/v[1]{1}(\.[0]{1}){0,1})$')
|
||||
|
||||
Retrieve the relevant schema based on the API version and schema
|
||||
name contained in `document.schema` where `document` constitutes a
|
||||
single document in a YAML payload.
|
||||
|
||||
:param api_version: The API version used for schema validation.
|
||||
:param schema: The schema property in `document.schema`.
|
||||
@classmethod
|
||||
def _register_data_schemas(cls):
|
||||
"""Dynamically detect schemas for document validation that have
|
||||
been registered by external services via ``DataSchema`` documents.
|
||||
"""
|
||||
self.schema = self.get_schema(data)
|
||||
data_schemas = db_api.document_get_all(
|
||||
schema=types.DATA_SCHEMA_SCHEMA)
|
||||
|
||||
def get_schema(self, data):
|
||||
# Fall back to `document.metadata.schema` if the schema cannot be
|
||||
# determined from `data.schema`.
|
||||
for doc_property in [data['schema'], data['metadata']['schema']]:
|
||||
schema = self._get_schema_by_property(doc_property)
|
||||
if schema:
|
||||
return schema
|
||||
return None
|
||||
for data_schema in data_schemas:
|
||||
if cls.schema_re.match(data_schema['metadata']['name']):
|
||||
schema_id = '/'.join(
|
||||
data_schema['metadata']['name'].split('/')[:2])
|
||||
else:
|
||||
schema_id = data_schema['metadata']['name']
|
||||
cls.schema_versions_info.append({
|
||||
'id': schema_id,
|
||||
'schema': data_schema['data'],
|
||||
'version': '1.0',
|
||||
'registered': True,
|
||||
})
|
||||
|
||||
def _get_schema_by_property(self, doc_property):
|
||||
schema_parts = doc_property.split('/')
|
||||
doc_schema_identifier = '/'.join(schema_parts[:-1])
|
||||
@classmethod
|
||||
def _get_schema_by_property(cls, schema_re, field):
|
||||
if schema_re.match(field):
|
||||
schema_id = '/'.join(field.split('/')[:2])
|
||||
else:
|
||||
schema_id = field
|
||||
|
||||
for schema in self.schema_versions_info:
|
||||
if doc_schema_identifier == schema['id']:
|
||||
return schema['schema'].schema
|
||||
return None
|
||||
matching_schemas = []
|
||||
|
||||
for schema in cls.schema_versions_info:
|
||||
# Can't use `startswith` below to avoid namespace false
|
||||
# positives like `CertificateKey` and `Certificate`.
|
||||
if schema_id == schema['id']:
|
||||
matching_schemas.append(schema)
|
||||
return matching_schemas
@classmethod
def get_schemas(cls, doc):
    """Retrieve the relevant schema based on the document's ``schema``.

    :param dict doc: The document used for finding the correct schema
        to validate it based on its ``schema``.
    :returns: A schema to be used by ``jsonschema`` for document
        validation.
    :rtype: dict
    """
    cls._register_data_schemas()

    # FIXME(fmontei): Remove this once all Deckhand tests have been
    # refactored to account for dynamic schema registration via
    # ``DataSchema`` documents. Otherwise most tests will fail.
    for doc_field in [doc['schema'], doc['metadata']['schema']]:
        matching_schemas = cls._get_schema_by_property(
            cls.schema_re, doc_field)
        if matching_schemas:
            return matching_schemas

    return []

def _format_validation_results(self, results):
    """Format the validation results to be compatible with database
    formatting.

    :param results: The validation results generated during document
        validation.
    :type results: list[dict]
    :returns: List of formatted validation results.
    :rtype: list[dict]
    """
    internal_validator = {
        'name': 'deckhand',
        'version': '1.0'
    }

    formatted_results = []
    for result in results:
        formatted_result = {
            'name': types.DECKHAND_SCHEMA_VALIDATION,
            'status': result['status'],
            'validator': internal_validator,
            'errors': result['errors']
        }
        formatted_results.append(formatted_result)

    return formatted_results

def _validate_one(self, document):
    raw_dict = document.to_dict()
    try:
        # Subject every document to basic validation to verify that each
        # main section is present (schema, metadata, data).
        jsonschema.validate(raw_dict, base_schema.schema)
    except jsonschema.exceptions.ValidationError as e:
        LOG.debug('Document failed top-level schema validation. Details: '
                  '%s.', e.message)
        # NOTE(fmontei): Raise here because if we fail basic schema
        # validation, then there is no point in continuing.
        raise errors.InvalidDocumentFormat(
            detail=e.message, schema=e.schema)

    schemas_to_use = self.SchemaType.get_schemas(raw_dict)

    if not schemas_to_use:
        LOG.debug('Document schema %s not recognized.',
                  document.get_schema())
        # NOTE(fmontei): Raise here because if Deckhand cannot even
        # determine which schema to use for further validation, then there
        # is no point in trying to continue validation.
        raise errors.InvalidDocumentSchema(
            document_schema=document.get_schema(),
            schema_list=[
                s['id'] for s in self.SchemaType.schema_versions_info])

    result = {'errors': []}

    # Perform more detailed validation on each document depending on
    # its schema. If the document is abstract, validation errors are
    # ignored.
    if document.is_abstract():
        LOG.info('Skipping schema validation for abstract '
                 'document: %s.', raw_dict)
    else:
        for schema_to_use in schemas_to_use:
            try:
                if isinstance(schema_to_use['schema'], dict):
                    schema_validator = schema_to_use['schema']
                    jsonschema.validate(raw_dict.get('data', {}),
                                        schema_validator)
                else:
                    schema_validator = schema_to_use['schema'].schema
                    jsonschema.validate(raw_dict, schema_validator)
            except jsonschema.exceptions.ValidationError as e:
                LOG.error(
                    'Document failed schema validation for schema %s. '
                    'Details: %s.', document.get_schema(), e.message)
                result['errors'].append({
                    'schema': document.get_schema(),
                    'name': document.get_name(),
                    'message': e.message.replace('\\', '')
                })

    if result['errors']:
        result.setdefault('status', 'failure')
    else:
        result.setdefault('status', 'success')

    return result
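A minimal, stdlib-only sketch of the result shape `_validate_one` builds (field names are taken from the code above; the presence check is a hypothetical stand-in for `jsonschema.validate` against the base schema):

```python
def validate_doc(doc):
    """Build a result dict in the same shape _validate_one returns."""
    result = {'errors': []}
    # Stand-in for jsonschema.validate against the base schema: require
    # the basic sections to be present.
    for section in ('schema', 'metadata'):
        if section not in doc:
            result['errors'].append({
                'schema': doc.get('schema', 'unknown'),
                'name': doc.get('metadata', {}).get('name', 'unknown'),
                'message': "'%s' is a required property" % section,
            })
    result['status'] = 'failure' if result['errors'] else 'success'
    return result

good = {'schema': 'example/Doc/v1', 'metadata': {'name': 'd1'}, 'data': {}}
bad = {'metadata': {'name': 'd2'}}
print(validate_doc(good)['status'])  # prints success
print(validate_doc(bad)['status'])   # prints failure
```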
def validate_all(self):
    """Pre-validate that the YAML file is correctly formatted.

@ -108,65 +238,36 @@ class DocumentValidation(object):

    Validation is broken up into 2 stages:

    1) Validate that each document contains the basic building blocks
       needed: "schema", "metadata" and "data" using a "base" schema.
    2) Validate each specific document type (e.g. validation policy)
       using a more detailed schema.
       needed: ``schema`` and ``metadata`` using a "base" schema.
       Failing this validation is deemed a critical failure, resulting
       in an exception.

    :returns: Dictionary mapping with keys being the unique name for each
        document and values being the validations executed for that
        document, including failed and succeeded validations.

       .. note::

           The ``data`` section, while mandatory, will not result in
           critical failure. This is because a document can rely
           on yet another document for ``data`` substitution. But
           the validation for the document will be tagged as
           ``failure``.

    2) Validate each specific document type (e.g. validation policy)
       using a more detailed schema. Failing this validation is deemed
       non-critical, resulting in the error being recorded along with
       any other non-critical exceptions, which are returned together
       later.

    :returns: A list of validations (one for each document validated).
    :rtype: list[dict]
    :raises errors.InvalidDocumentFormat: If the document failed schema
        validation and the failure is deemed critical.
    :raises errors.InvalidDocumentSchema: If no JSON schema could be
        found for executing document validation.
    """
    internal_validation_docs = []
    validation_policy_factory = factories.ValidationPolicyFactory()
    validation_results = []

    for document in self.documents:
        self._validate_one(document)
        result = self._validate_one(document)
        validation_results.append(result)

    deckhand_schema_validation = validation_policy_factory.gen(
        types.DECKHAND_SCHEMA_VALIDATION, status='success')
    internal_validation_docs.append(deckhand_schema_validation)

    return internal_validation_docs

def _validate_one(self, document):
    # Subject every document to basic validation to verify that each
    # main section is present (schema, metadata, data).
    try:
        jsonschema.validate(document, base_schema.schema)
    except jsonschema.exceptions.ValidationError as e:
        raise errors.InvalidDocumentFormat(
            detail=e.message, schema=e.schema)

    doc_schema_type = self.SchemaType(document)
    if doc_schema_type.schema is None:
        raise errors.UknownDocumentFormat(
            document_type=document['schema'])

    # Perform more detailed validation on each document depending on
    # its schema. If the document is abstract, validation errors are
    # ignored.
    try:
        jsonschema.validate(document, doc_schema_type.schema)
    except jsonschema.exceptions.ValidationError as e:
        # TODO(fmontei): Use the `Document` object wrapper instead
        # once other PR is merged.
        if not self._is_abstract(document):
            raise errors.InvalidDocumentFormat(
                detail=e.message, schema=e.schema,
                document_type=document['schema'])
        else:
            LOG.info('Skipping schema validation for abstract '
                     'document: %s.', document)

def _is_abstract(self, document):
    try:
        is_abstract = document['metadata']['layeringDefinition'][
            'abstract'] is True
        return is_abstract
    # NOTE(fmontei): If the document is of ``document_schema`` type and
    # no "layeringDefinition" or "abstract" property is found, then treat
    # this as a validation error.
    except KeyError:
        doc_schema_type = self.SchemaType(document)
        return doc_schema_type is v1_0.document_schema
    return False

    validations = self._format_validation_results(validation_results)
    return validations
@ -32,5 +32,5 @@ schema = {
        'data': {'type': ['string', 'object']}
    },
    'additionalProperties': False,
    'required': ['schema', 'metadata', 'data']
    'required': ['schema', 'metadata']
}
@ -14,12 +14,12 @@

from deckhand.engine.schema.v1_0 import certificate_key_schema
from deckhand.engine.schema.v1_0 import certificate_schema
from deckhand.engine.schema.v1_0 import data_schema
from deckhand.engine.schema.v1_0 import data_schema_schema
from deckhand.engine.schema.v1_0 import document_schema
from deckhand.engine.schema.v1_0 import layering_schema
from deckhand.engine.schema.v1_0 import layering_policy_schema
from deckhand.engine.schema.v1_0 import passphrase_schema
from deckhand.engine.schema.v1_0 import validation_schema
from deckhand.engine.schema.v1_0 import validation_policy_schema

__all__ = ['certificate_key_schema', 'certificate_schema', 'data_schema',
           'document_schema', 'layering_schema', 'passphrase_schema',
           'validation_schema']
__all__ = ['certificate_key_schema', 'certificate_schema',
           'data_schema_schema', 'document_schema', 'layering_policy_schema',
           'passphrase_schema', 'validation_policy_schema']
@ -29,7 +29,11 @@ schema = {
            'type': 'string',
            'pattern': '^(metadata/Control/v[1]{1}(\.[0]{1}){0,1})$'
        },
        'name': {'type': 'string'},
        'name': {
            'type': 'string',
            'pattern': (
                '^([A-Za-z]+\/[A-Za-z]+\/v[1]{1}(\.[0]{1}){0,1})$')
        },
        # Labels are optional.
        'labels': {
            'type': 'object'
@ -49,7 +53,7 @@ schema = {
                'type': 'string'
            }
        },
        'additionalProperties': False,
        'additionalProperties': True,
        'required': ['$schema']
    }
},
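The new `name` pattern above requires three-part identifiers such as `promenade/Node/v1`. A quick check (pattern copied verbatim from the diff; example names are hypothetical):

```python
import re

# `name` pattern for metadata, copied from the schema above.
name_re = re.compile(r'^([A-Za-z]+\/[A-Za-z]+\/v[1]{1}(\.[0]{1}){0,1})$')

print(bool(name_re.match('promenade/Node/v1')))    # prints True
print(bool(name_re.match('promenade/Node/v1.0')))  # prints True
print(bool(name_re.match('just-a-name')))          # prints False
```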
@ -39,16 +39,15 @@ class DeckhandException(Exception):


class InvalidDocumentFormat(DeckhandException):
    msg_fmt = ("The provided YAML failed schema validation. Details: "
    msg_fmt = ("The provided document YAML failed schema validation. Details: "
               "%(detail)s. Schema: %(schema)s.")
    alt_msg_fmt = ("The provided %(document_type)s YAML failed schema "
                   "validation. Details: %(detail)s. Schema: %(schema)s.")
    code = 400

    def __init__(self, document_type=None, **kwargs):
        if document_type:
            self.msg_fmt = self.alt_msg_fmt
            kwargs.update({'document_type': document_type})
        super(InvalidDocumentFormat, self).__init__(**kwargs)


class InvalidDocumentSchema(DeckhandException):
    msg_fmt = ("The provided %(document_schema)s is invalid. Supported "
               "schemas: %(schema_list)s.")
    code = 400


class DocumentExists(DeckhandException):
@ -40,6 +40,51 @@ class DeckhandFactory(object):
        pass


class DataSchemaFactory(DeckhandFactory):
    """Class for auto-generating ``DataSchema`` templates for testing."""

    DATA_SCHEMA_TEMPLATE = {
        "data": {
            "$schema": ""
        },
        "metadata": {
            "schema": "metadata/Control/v1",
            "name": "",
            "labels": {}
        },
        "schema": "deckhand/DataSchema/v1"
    }

    def __init__(self):
        """Constructor for ``DataSchemaFactory``.

        Returns a template whose YAML representation is of the form::

            ---
            schema: deckhand/DataSchema/v1
            metadata:
              schema: metadata/Control/v1
              name: promenade/Node/v1
              labels:
                application: promenade
            data:
              $schema: http://blah
            ...
        """

    def gen(self):
        raise NotImplementedError()

    def gen_test(self, metadata_name, data, **metadata_labels):
        data_schema_template = copy.deepcopy(self.DATA_SCHEMA_TEMPLATE)

        data_schema_template['metadata']['name'] = metadata_name
        data_schema_template['metadata']['labels'] = metadata_labels
        data_schema_template['data'] = data

        return data_schema_template
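The `gen_test` pattern above (deep-copy a class-level template, then fill in the fields) can be exercised standalone; the template literal below is copied from the factory, while the example names are hypothetical:

```python
import copy

# Template copied from DataSchemaFactory above.
DATA_SCHEMA_TEMPLATE = {
    "data": {"$schema": ""},
    "metadata": {"schema": "metadata/Control/v1", "name": "", "labels": {}},
    "schema": "deckhand/DataSchema/v1",
}

def gen_test(metadata_name, data, **metadata_labels):
    # Deep-copy so repeated calls never mutate the shared template.
    template = copy.deepcopy(DATA_SCHEMA_TEMPLATE)
    template['metadata']['name'] = metadata_name
    template['metadata']['labels'] = metadata_labels
    template['data'] = data
    return template

doc = gen_test('example/Doc/v1', {'$schema': 'http://json-schema.org/schema#'},
               application='example')
print(doc['metadata']['name'])                      # prints example/Doc/v1
print(DATA_SCHEMA_TEMPLATE['metadata']['name'] == '')  # prints True
```

The deep copy is what keeps the class-level template reusable across tests: a shallow copy would share the nested `metadata` dict and leak state between calls.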

class DocumentFactory(DeckhandFactory):
    """Class for auto-generating document templates for testing."""

@ -130,8 +175,7 @@ class DocumentFactory(DeckhandFactory):
        self.docs_per_layer = docs_per_layer

    def gen(self):
        # TODO(fmontei): Implement this if needed later.
        pass
        raise NotImplementedError()

    def gen_test(self, mapping, site_abstract=True, region_abstract=True,
                 global_abstract=True, site_parent_selectors=None):
@ -218,7 +262,7 @@ class DocumentFactory(DeckhandFactory):
        if layer_name == 'region':
            layer_template['metadata']['layeringDefinition'][
                'abstract'] = region_abstract
        elif layer_name == 'global':
        if layer_name == 'global':
            layer_template['metadata']['layeringDefinition'][
                'abstract'] = global_abstract

@ -300,7 +344,7 @@ class DocumentSecretFactory(DeckhandFactory):
    """

    def gen(self):
        pass
        raise NotImplementedError()

    def gen_test(self, schema, storage_policy, data=None):
        if data is None:
@ -360,7 +404,7 @@ class ValidationPolicyFactory(DeckhandFactory):
            self.VALIDATION_POLICY_TEMPLATE)

        validation_policy_template['metadata'][
            'name'] = validation_type
            'name'] = test_utils.rand_name('validation-policy')
        validation_policy_template['data']['validations'] = [
            {'name': validation_type, 'status': status}
        ]
@ -18,6 +18,7 @@ from deckhand.policies import base
from deckhand.policies import document
from deckhand.policies import revision
from deckhand.policies import revision_tag
from deckhand.policies import validation


def list_rules():
@ -25,5 +26,6 @@ def list_rules():
        base.list_rules(),
        document.list_rules(),
        revision.list_rules(),
        revision_tag.list_rules()
        revision_tag.list_rules(),
        validation.list_rules()
    )
deckhand/policies/validation.py (new file, 63 lines)
@ -0,0 +1,63 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from oslo_policy import policy

from deckhand.policies import base


validation_policies = [
    policy.DocumentedRuleDefault(
        base.POLICY_ROOT % 'create_validation',
        base.RULE_ADMIN_API,
        "Add the results of a validation for a particular revision.",
        [
            {
                'method': 'POST',
                'path': '/api/v1.0/revisions/{revision_id}/validations'
            }
        ]),
    policy.DocumentedRuleDefault(
        base.POLICY_ROOT % 'list_validations',
        base.RULE_ADMIN_API,
        """List all validations that have been reported for a revision. Also
lists the validation entries for a particular validation.""",
        [
            {
                'method': 'GET',
                'path': '/api/v1.0/revisions/{revision_id}/validations'
            },
            {
                'method': 'GET',
                'path': '/api/v1.0/revisions/{revision_id}/validations/'
                        '{validation_name}'
            }
        ]),
    policy.DocumentedRuleDefault(
        base.POLICY_ROOT % 'show_validation',
        base.RULE_ADMIN_API,
        """Gets the full details of a particular validation entry, including
all posted error details.""",
        [
            {
                'method': 'GET',
                'path': '/api/v1.0/revisions/{revision_id}/validations/'
                        '{validation_name}/entries/{entry_id}'
            }
        ]),
]


def list_rules():
    return validation_policies
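A hypothetical, stdlib-only sketch of what the rules above declare: a mapping from (method, path template) to the policy action guarding it. The paths and action names are copied from the rules; the lookup helper is illustrative, not Deckhand's actual oslo.policy enforcement:

```python
# (method, path template) -> guarding policy action, per the rules above.
POLICY_MAP = {
    ('POST', '/api/v1.0/revisions/{revision_id}/validations'):
        'deckhand:create_validation',
    ('GET', '/api/v1.0/revisions/{revision_id}/validations'):
        'deckhand:list_validations',
    ('GET', '/api/v1.0/revisions/{revision_id}/validations/'
            '{validation_name}'):
        'deckhand:list_validations',
    ('GET', '/api/v1.0/revisions/{revision_id}/validations/'
            '{validation_name}/entries/{entry_id}'):
        'deckhand:show_validation',
}

def guarding_action(method, path_template):
    # Look up which policy action a request must satisfy.
    return POLICY_MAP.get((method, path_template), 'unguarded')

print(guarding_action(
    'POST', '/api/v1.0/revisions/{revision_id}/validations'))
```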
@ -24,6 +24,7 @@ from deckhand.control import revision_documents
from deckhand.control import revision_tags
from deckhand.control import revisions
from deckhand.control import rollback
from deckhand.control import validations
from deckhand.control import versions

CONF = cfg.CONF
@ -45,6 +46,12 @@ def configure_app(app, version=''):
        ('revisions/{revision_id}/tags', revision_tags.RevisionTagsResource()),
        ('revisions/{revision_id}/tags/{tag}',
         revision_tags.RevisionTagsResource()),
        ('revisions/{revision_id}/validations',
         validations.ValidationsResource()),
        ('revisions/{revision_id}/validations/{validation_name}',
         validations.ValidationsResource()),
        ('revisions/{revision_id}/validations/{validation_name}/{entry_id}',
         validations.ValidationsResource()),
        ('rollback/{revision_id}', rollback.RollbackResource())
    ]
@ -25,7 +25,6 @@ tests:
    PUT: /api/v1.0/bucket/mop/documents
    status: 200
    data: <@resources/sample-schema.yaml
    skip: Not Implemented.

  - name: verify_schema_is_valid
    desc: Check schema validation of the added schema
@ -33,8 +32,8 @@ tests:
    status: 200
    response_multidoc_jsonpaths:
      $.[0].count: 1
      $.[0].results[0].id: 0
      $.[0].results[0].status: success
    skip: Not Implemented.

  - name: verify_schema_validation_in_list_view
    desc: Check schema validation success shows in list view
@ -44,7 +43,6 @@ tests:
      $.[0].count: 1
      $.[0].results[0].name: deckhand-schema-validation
      $.[0].results[0].status: success
    skip: Not Implemented.

  - name: add_valid_document
    desc: Add a document that follows the schema
@ -61,7 +59,6 @@ tests:
    data:
      a: this-one-is-required
      b: 77
    skip: Not Implemented.

  - name: verify_document_is_valid
    desc: Check schema validation of the added document
@ -69,8 +66,8 @@ tests:
    status: 200
    response_multidoc_jsonpaths:
      $.[0].count: 1
      $.[0].results[0].id: 0
      $.[0].results[0].status: success
    skip: Not Implemented.

  - name: verify_document_validation_success_in_list_view
    desc: Check document validation success shows in list view
@ -78,9 +75,8 @@ tests:
    status: 200
    response_multidoc_jsonpaths:
      $.[0].count: 1
      $.[0].results[0].name: deckhand-schema-validation
      $.[0].results[0].status: success
    skip: Not Implemented.
      $.[0].results[*].name: deckhand-schema-validation
      $.[0].results[*].status: success

  - name: add_invalid_document
    desc: Add a document that does not follow the schema
@ -97,7 +93,6 @@ tests:
    data:
      a: this-one-is-required-and-can-be-different
      b: 177
    skip: Not Implemented.

  - name: verify_document_is_not_valid
    desc: Check failure of schema validation of the added document
@ -105,8 +100,7 @@ tests:
    status: 200
    response_multidoc_jsonpaths:
      $.[0].count: 1
      $.[0].results[0].status: failure
    skip: Not Implemented.
      $.[0].results[*].status: failure

  - name: verify_document_validation_failure_in_list_view
    desc: Check document validation failure shows in list view
@ -114,6 +108,5 @@ tests:
    status: 200
    response_multidoc_jsonpaths:
      $.[0].count: 1
      $.[0].results[0].name: deckhand-schema-validation
      $.[0].results[0].status: failure
    skip: Not Implemented.
      $.[0].results[*].name: deckhand-schema-validation
      $.[0].results[*].status: failure
@ -24,6 +24,7 @@ from deckhand.control import revision_documents
from deckhand.control import revision_tags
from deckhand.control import revisions
from deckhand.control import rollback
from deckhand.control import validations
from deckhand.control import versions
from deckhand.tests.unit import base as test_base
from deckhand import utils
@ -33,8 +34,10 @@ class TestApi(test_base.DeckhandTestCase):

    def setUp(self):
        super(TestApi, self).setUp()
        # Mock the API resources.
        for resource in (buckets, revision_diffing, revision_documents,
                         revision_tags, revisions, rollback, versions):
                         revision_tags, revisions, rollback, validations,
                         versions):
            class_names = self._get_module_class_names(resource)
            for class_name in class_names:
                resource_obj = self.patchobject(
@ -88,8 +91,16 @@ class TestApi(test_base.DeckhandTestCase):
                      self.revision_tags_resource()),
            mock.call('/api/v1.0/rollback/{revision_id}',
                      self.rollback_resource()),
            mock.call('/api/v1.0/revisions/{revision_id}/validations',
                      self.validations_resource()),
            mock.call('/api/v1.0/revisions/{revision_id}/validations/'
                      '{validation_name}',
                      self.validations_resource()),
            mock.call('/api/v1.0/revisions/{revision_id}/validations/'
                      '{validation_name}/{entry_id}',
                      self.validations_resource()),
            mock.call('/versions', self.versions_resource())
        ])
        ], any_order=True)

        mock_db_api.drop_db.assert_called_once_with()
        mock_db_api.setup_db.assert_called_once_with()
@ -18,10 +18,10 @@ from deckhand.control import base as api_base
from deckhand.tests.unit.control import base as test_base


class TestBaseResource(test_base.BaseControllerTest):
class TestBaseController(test_base.BaseControllerTest):

    def setUp(self):
        super(TestBaseResource, self).setUp()
        super(TestBaseController, self).setUp()
        self.base_resource = api_base.BaseResource()

    @mock.patch.object(api_base, 'dir')  # noqa
@ -119,7 +119,7 @@ schema:
          layer:site
    """
        invalid_payloads = ['garbage', no_colon_spaces]
        error_re = ['.*The provided YAML failed schema validation.*',
        error_re = ['.*The provided document YAML failed schema validation.*',
                    '.*mapping values are not allowed here.*']

        for idx, payload in enumerate(invalid_payloads):
deckhand/tests/unit/control/test_validations_controller.py (new file, 520 lines)
@ -0,0 +1,520 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import copy
import yaml

from oslo_config import cfg

from deckhand import factories
from deckhand.tests import test_utils
from deckhand.tests.unit.control import base as test_base
from deckhand import types

CONF = cfg.CONF


VALIDATION_RESULT = """
---
status: failure
errors:
  - documents:
      - schema: promenade/Node/v1
        name: node-document-name
      - schema: promenade/Masters/v1
        name: kubernetes-masters
    message: Node has master role, but not included in cluster masters list.
validator:
  name: promenade
  version: 1.1.2
"""

VALIDATION_RESULT_ALT = """
---
status: success
errors:
  - documents:
      - schema: promenade/Slaves/v1
        name: kubernetes-slaves
    message: No slave nodes found.
validator:
  name: promenade
  version: 1.1.2
"""


class TestValidationsController(test_base.BaseControllerTest):
    """Test suite for validating positive scenarios for the validations
    controller."""

    def _create_revision(self, payload=None):
        if not payload:
            documents_factory = factories.DocumentFactory(2, [1, 1])
            payload = documents_factory.gen_test({})
        resp = self.app.simulate_put('/api/v1.0/bucket/mop/documents',
                                     body=yaml.safe_dump_all(payload))
        self.assertEqual(200, resp.status_code)
        revision_id = list(yaml.safe_load_all(resp.text))[0]['status'][
            'revision']
        return revision_id

    def _create_validation(self, revision_id, validation_name, policy):
        resp = self.app.simulate_post(
            '/api/v1.0/revisions/%s/validations/%s' % (revision_id,
                                                       validation_name),
            body=policy)
        return resp
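The helper above only formats the validations endpoint path. As a standalone sketch (the revision id and validation name are example values; the routing table in this diff registers entries directly under the validation name, without an `/entries` segment):

```python
def validation_url(revision_id, validation_name, entry_id=None):
    # Builds the Validations API paths the tests above exercise.
    url = '/api/v1.0/revisions/%s/validations/%s' % (revision_id,
                                                     validation_name)
    if entry_id is not None:
        url += '/%s' % entry_id
    return url

print(validation_url(1, 'promenade-site-validation'))
print(validation_url(1, 'promenade-site-validation', entry_id=0))
```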
    def test_create_validation(self):
        rules = {'deckhand:create_cleartext_documents': '@',
                 'deckhand:create_validation': '@'}
        self.policy.set_rules(rules)

        revision_id = self._create_revision()
        validation_name = test_utils.rand_name('validation')
        resp = self._create_validation(revision_id, validation_name,
                                       VALIDATION_RESULT)

        self.assertEqual(201, resp.status_code)
        expected_body = {
            'status': 'failure',
            'validator': {
                'name': 'promenade',
                'version': '1.1.2'
            }
        }
        self.assertEqual(expected_body, yaml.safe_load(resp.text))

    def test_list_validations(self):
        rules = {'deckhand:create_cleartext_documents': '@',
                 'deckhand:list_validations': '@'}
        self.policy.set_rules(rules)

        revision_id = self._create_revision()

        resp = self.app.simulate_get(
            '/api/v1.0/revisions/%s/validations' % revision_id)
        self.assertEqual(200, resp.status_code)

        # Validate that the internal deckhand validation was created already.
        body = list(yaml.safe_load_all(resp.text))
        expected = {
            'count': 1,
            'results': [
                {
                    'status': 'success',
                    'name': types.DECKHAND_SCHEMA_VALIDATION
                }
            ]
        }
        self.assertEqual(1, len(body))
        self.assertEqual(expected, body[0])

        rules = {'deckhand:create_cleartext_documents': '@',
                 'deckhand:create_validation': '@',
                 'deckhand:list_validations': '@'}
        self.policy.set_rules(rules)

        # Validate that, after creating a validation policy by an external
        # service, it is listed as well.
        validation_name = test_utils.rand_name('validation')
        resp = self._create_validation(revision_id, validation_name,
                                       VALIDATION_RESULT)

        resp = self.app.simulate_get(
            '/api/v1.0/revisions/%s/validations' % revision_id)
        self.assertEqual(200, resp.status_code)

        body = yaml.safe_load(resp.text)
        expected_body = {
            'count': 2,
            'results': [
                {
                    'name': types.DECKHAND_SCHEMA_VALIDATION,
                    'status': 'success'
                },
                {
                    'name': validation_name,
                    'status': 'failure'
                }
            ]
        }
        self.assertEqual(expected_body, body)

    def test_list_validation_entries(self):
        rules = {'deckhand:create_cleartext_documents': '@',
                 'deckhand:create_validation': '@',
                 'deckhand:list_validations': '@'}
        self.policy.set_rules(rules)

        revision_id = self._create_revision()

        # Validate that 3 entries (1 for each of the 3 documents created)
        # exist for
        # /api/v1.0/revisions/1/validations/deckhand-schema-validation
        resp = self.app.simulate_get(
            '/api/v1.0/revisions/%s/validations/%s' % (
                revision_id, types.DECKHAND_SCHEMA_VALIDATION))
        self.assertEqual(200, resp.status_code)
        body = yaml.safe_load(resp.text)
        expected_body = {
            'count': 3,
            'results': [{'id': x, 'status': 'success'} for x in range(3)]
        }
        self.assertEqual(expected_body, body)

        # Add the result of a validation to a revision.
        validation_name = test_utils.rand_name('validation')
        resp = self._create_validation(revision_id, validation_name,
                                       VALIDATION_RESULT)

        # Validate that the entry is present.
        resp = self.app.simulate_get(
            '/api/v1.0/revisions/%s/validations/%s' % (revision_id,
                                                       validation_name))
        self.assertEqual(200, resp.status_code)

        body = yaml.safe_load(resp.text)
        expected_body = {
            'count': 1,
            'results': [{'id': 0, 'status': 'failure'}]
        }
        self.assertEqual(expected_body, body)
|
||||
|
||||
def test_list_validation_entries_after_creating_validation(self):
|
||||
rules = {'deckhand:create_cleartext_documents': '@',
|
||||
'deckhand:create_validation': '@',
|
||||
'deckhand:list_validations': '@'}
|
||||
self.policy.set_rules(rules)
|
||||
|
||||
revision_id = self._create_revision()
|
||||
|
||||
# Add the result of a validation to a revision.
|
||||
validation_name = test_utils.rand_name('validation')
|
||||
resp = self._create_validation(revision_id, validation_name,
|
||||
VALIDATION_RESULT)
|
||||
|
||||
# Validate that the entry is present.
|
||||
resp = self.app.simulate_get(
|
||||
'/api/v1.0/revisions/%s/validations/%s' % (revision_id,
|
||||
validation_name))
|
||||
self.assertEqual(200, resp.status_code)
|
||||
|
||||
body = yaml.safe_load(resp.text)
|
||||
expected_body = {
|
||||
'count': 1,
|
||||
'results': [{'id': 0, 'status': 'failure'}]
|
||||
}
|
||||
self.assertEqual(expected_body, body)
|
||||
|
||||
# Add the result of another validation to the same revision.
|
||||
resp = self._create_validation(revision_id, validation_name,
|
||||
VALIDATION_RESULT_ALT)
|
||||
|
||||
# Validate that 2 entries now exist.
|
||||
resp = self.app.simulate_get(
|
||||
'/api/v1.0/revisions/%s/validations/%s' % (revision_id,
|
||||
validation_name))
|
||||
self.assertEqual(200, resp.status_code)
|
||||
body = yaml.safe_load(resp.text)
|
||||
expected_body = {
|
||||
'count': 2,
|
||||
'results': [
|
||||
{'id': 0, 'status': 'failure'}, {'id': 1, 'status': 'success'}
|
||||
]
|
||||
}
|
||||
self.assertEqual(expected_body, body)
|
||||
|
||||
def test_list_validation_entries_with_multiple_entries(self):
|
||||
rules = {'deckhand:create_cleartext_documents': '@',
|
||||
'deckhand:create_validation': '@',
|
||||
'deckhand:list_validations': '@'}
|
||||
self.policy.set_rules(rules)
|
||||
|
||||
revision_id = self._create_revision()
|
||||
validation_name = test_utils.rand_name('validation')
|
||||
resp = self._create_validation(revision_id, validation_name,
|
||||
VALIDATION_RESULT)
|
||||
resp = resp = self._create_validation(revision_id, validation_name,
|
||||
VALIDATION_RESULT_ALT)
|
||||
|
||||
resp = self.app.simulate_get(
|
||||
'/api/v1.0/revisions/%s/validations/%s' % (revision_id,
|
||||
validation_name))
|
||||
self.assertEqual(200, resp.status_code)
|
||||
|
||||
body = yaml.safe_load(resp.text)
|
||||
expected_body = {
|
||||
'count': 2,
|
||||
'results': [
|
||||
{'id': 0, 'status': 'failure'}, {'id': 1, 'status': 'success'}
|
||||
]
|
||||
}
|
||||
self.assertEqual(expected_body, body)
|
||||
|
||||
    def test_show_validation_entry(self):
        rules = {'deckhand:create_cleartext_documents': '@',
                 'deckhand:create_validation': '@',
                 'deckhand:show_validation': '@'}
        self.policy.set_rules(rules)

        revision_id = self._create_revision()
        validation_name = test_utils.rand_name('validation')
        resp = self._create_validation(revision_id, validation_name,
                                       VALIDATION_RESULT)

        resp = self.app.simulate_get(
            '/api/v1.0/revisions/%s/validations/%s/entries/0' % (
                revision_id, validation_name))
        self.assertEqual(200, resp.status_code)

        body = yaml.safe_load(resp.text)
        expected_body = {
            'name': validation_name,
            'status': 'failure',
            'createdAt': None,
            'expiresAfter': None,
            'errors': [
                {
                    'documents': [
                        {
                            'name': 'node-document-name',
                            'schema': 'promenade/Node/v1'
                        }, {
                            'name': 'kubernetes-masters',
                            'schema': 'promenade/Masters/v1'
                        }
                    ],
                    'message': 'Node has master role, but not included in '
                               'cluster masters list.'
                }
            ]
        }
        self.assertEqual(expected_body, body)

    def test_validation_with_registered_data_schema(self):
        rules = {'deckhand:create_cleartext_documents': '@',
                 'deckhand:list_validations': '@'}
        self.policy.set_rules(rules)

        # Register a `DataSchema` against which the test document will be
        # validated.
        data_schema_factory = factories.DataSchemaFactory()
        metadata_name = 'example/Doc/v1'
        schema_to_use = {
            '$schema': 'http://json-schema.org/schema#',
            'type': 'object',
            'properties': {
                'a': {
                    'type': 'string'
                }
            },
            'required': ['a'],
            'additionalProperties': False
        }
        data_schema = data_schema_factory.gen_test(
            metadata_name, data=schema_to_use)

        revision_id = self._create_revision(payload=[data_schema])

        # Validate that the internal deckhand validation was created.
        resp = self.app.simulate_get(
            '/api/v1.0/revisions/%s/validations' % revision_id)
        self.assertEqual(200, resp.status_code)
        body = yaml.safe_load(resp.text)
        expected_body = {
            'count': 1,
            'results': [
                {'name': types.DECKHAND_SCHEMA_VALIDATION, 'status': 'success'}
            ]
        }
        self.assertEqual(expected_body, body)

        # Create the test document whose data section adheres to the
        # `DataSchema` above.
        doc_factory = factories.DocumentFactory(1, [1])
        doc_to_test = doc_factory.gen_test(
            {'_GLOBAL_DATA_1_': {'data': {'a': 'whatever'}}},
            global_abstract=False)[-1]
        doc_to_test['schema'] = 'example/Doc/v1'

        revision_id = self._create_revision(
            payload=[doc_to_test])

        # Validate that the validation was created and passed.
        resp = self.app.simulate_get(
            '/api/v1.0/revisions/%s/validations' % revision_id)
        self.assertEqual(200, resp.status_code)
        body = yaml.safe_load(resp.text)
        expected_body = {
            'count': 1,
            'results': [
                {'name': types.DECKHAND_SCHEMA_VALIDATION, 'status': 'success'}
            ]
        }
        self.assertEqual(expected_body, body)

    def test_validation_with_registered_data_schema_expect_failure(self):
        rules = {'deckhand:create_cleartext_documents': '@',
                 'deckhand:list_validations': '@'}
        self.policy.set_rules(rules)

        # Register a `DataSchema` against which the test document will be
        # validated.
        data_schema_factory = factories.DataSchemaFactory()
        metadata_name = 'example/foo/v1'
        schema_to_use = {
            '$schema': 'http://json-schema.org/schema#',
            'type': 'object',
            'properties': {
                'a': {
                    'type': 'integer'  # Test doc will fail b/c of wrong type.
                }
            },
            'required': ['a']
        }
        data_schema = data_schema_factory.gen_test(
            metadata_name, data=schema_to_use)

        revision_id = self._create_revision(payload=[data_schema])

        # Validate that the internal deckhand validation was created.
        resp = self.app.simulate_get(
            '/api/v1.0/revisions/%s/validations' % revision_id)
        self.assertEqual(200, resp.status_code)
        body = yaml.safe_load(resp.text)
        expected_body = {
            'count': 1,
            'results': [
                {'name': types.DECKHAND_SCHEMA_VALIDATION, 'status': 'success'}
            ]
        }
        self.assertEqual(expected_body, body)

        # Create the test document that fails the validation due to the
        # schema defined by the `DataSchema` document.
        doc_factory = factories.DocumentFactory(1, [1])
        doc_to_test = doc_factory.gen_test(
            {'_GLOBAL_DATA_1_': {'data': {'a': 'fail'}}},
            global_abstract=False)[-1]
        doc_to_test['schema'] = 'example/foo/v1'
        doc_to_test['metadata']['name'] = 'test_doc'

        revision_id = self._create_revision(payload=[doc_to_test])

        # Validate that the validation was created and reports failure.
        resp = self.app.simulate_get(
            '/api/v1.0/revisions/%s/validations' % revision_id)
        self.assertEqual(200, resp.status_code)
        body = yaml.safe_load(resp.text)
        expected_body = {
            'count': 1,
            'results': [
                {'name': types.DECKHAND_SCHEMA_VALIDATION, 'status': 'failure'}
            ]
        }
        self.assertEqual(expected_body, body)

    def test_validation_with_registered_data_schema_expect_mixed(self):
        rules = {'deckhand:create_cleartext_documents': '@',
                 'deckhand:list_validations': '@'}
        self.policy.set_rules(rules)

        # Register a `DataSchema` against which the test document will be
        # validated.
        data_schema_factory = factories.DataSchemaFactory()
        metadata_name = 'example/foo/v1'
        schema_to_use = {
            '$schema': 'http://json-schema.org/schema#',
            'type': 'object',
            'properties': {
                'a': {
                    'type': 'integer'  # Test doc will fail b/c of wrong type.
                }
            },
            'required': ['a']
        }
        data_schema = data_schema_factory.gen_test(
            metadata_name, data=schema_to_use)

        revision_id = self._create_revision(payload=[data_schema])

        # Validate that the internal deckhand validation was created.
        resp = self.app.simulate_get(
            '/api/v1.0/revisions/%s/validations' % revision_id)
        self.assertEqual(200, resp.status_code)
        body = yaml.safe_load(resp.text)
        expected_body = {
            'count': 1,
            'results': [
                {'name': types.DECKHAND_SCHEMA_VALIDATION, 'status': 'success'}
            ]
        }
        self.assertEqual(expected_body, body)

        # Create a document that passes validation and another that fails it.
        doc_factory = factories.DocumentFactory(1, [1])
        fail_doc = doc_factory.gen_test(
            {'_GLOBAL_DATA_1_': {'data': {'a': 'fail'}}},
            global_abstract=False)[-1]
        fail_doc['schema'] = 'example/foo/v1'
        fail_doc['metadata']['name'] = 'test_doc'

        pass_doc = copy.deepcopy(fail_doc)
        pass_doc['data']['a'] = 5

        revision_id = self._create_revision(payload=[fail_doc, pass_doc])

        # Validate that the validation reports failure since `fail_doc`
        # should've failed validation.
        resp = self.app.simulate_get(
            '/api/v1.0/revisions/%s/validations' % revision_id)
        self.assertEqual(200, resp.status_code)
        body = yaml.safe_load(resp.text)
        expected_body = {
            'count': 1,
            'results': [
                {'name': types.DECKHAND_SCHEMA_VALIDATION, 'status': 'failure'}
            ]
        }
        self.assertEqual(expected_body, body)

    def test_document_without_data_section_saves_but_fails_validation(self):
        """Validate that a document without the data section is saved to the
        database, but fails validation. This is a valid use case because a
        document in a bucket can be created without a data section, which
        depends on substitution from another document.
        """
        rules = {'deckhand:create_cleartext_documents': '@',
                 'deckhand:create_validation': '@',
                 'deckhand:list_validations': '@'}
        self.policy.set_rules(rules)

        documents_factory = factories.DocumentFactory(1, [1])
        payload = documents_factory.gen_test({}, global_abstract=False)[-1]
        del payload['data']

        revision_id = self._create_revision(payload=[payload])

        # Validate that the entry is present.
        resp = self.app.simulate_get(
            '/api/v1.0/revisions/%s/validations/%s' % (
                revision_id, types.DECKHAND_SCHEMA_VALIDATION))
        self.assertEqual(200, resp.status_code)

        body = yaml.safe_load(resp.text)
        expected_body = {
            'count': 1,
            'results': [{'id': 0, 'status': 'failure'}]
        }
        self.assertEqual(expected_body, body)
@@ -106,6 +106,9 @@ class TestDbBase(base.DeckhandWithDBTestCase):
        latest_revision = db_api.revision_get_latest()
        return db_api.revision_rollback(revision_id, latest_revision)

    def create_validation(self, revision_id, val_name, val_data):
        return db_api.validation_create(revision_id, val_name, val_data)

    def _validate_object(self, obj):
        for attr in BASE_EXPECTED_FIELDS:
            if attr.endswith('_at'):
86	deckhand/tests/unit/db/test_validations.py	Normal file
@@ -0,0 +1,86 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import yaml

from deckhand import factories
from deckhand.tests import test_utils
from deckhand.tests.unit.db import base
from deckhand import types

ARMADA_VALIDATION_POLICY = """
---
status: success
validator:
  name: armada
  version: 1.1.3
"""

PROMENADE_VALIDATION_POLICY = """
---
status: failure
errors:
  - documents:
      - schema: promenade/Node/v1
        name: node-document-name
      - schema: promenade/Masters/v1
        name: kubernetes-masters
    message: Node has master role, but not included in cluster masters list.
validator:
  name: promenade
  version: 1.1.2
"""


class TestValidations(base.TestDbBase):

    def _create_revision_with_validation_policy(self):
        vp_factory = factories.ValidationPolicyFactory()
        validation_policy = vp_factory.gen(types.DECKHAND_SCHEMA_VALIDATION,
                                           'success')
        bucket_name = test_utils.rand_name('bucket')
        documents = self.create_documents(bucket_name, [validation_policy])
        revision_id = documents[0]['revision_id']
        return revision_id

    def test_create_validation(self):
        revision_id = self._create_revision_with_validation_policy()
        validation_name = test_utils.rand_name('validation')

        payload = yaml.safe_load(PROMENADE_VALIDATION_POLICY)
        created_validation = self.create_validation(
            revision_id, validation_name, payload)

        self.assertIsInstance(created_validation, dict)
        self.assertEqual(validation_name, created_validation['name'])
        self.assertEqual(payload['status'], created_validation['status'])
        self.assertEqual(payload['validator'], created_validation['validator'])

    def test_create_multiple_validations(self):
        revision_id = self._create_revision_with_validation_policy()

        for val_policy in (ARMADA_VALIDATION_POLICY,
                           PROMENADE_VALIDATION_POLICY):
            validation_name = test_utils.rand_name('validation')

            payload = yaml.safe_load(val_policy)
            created_validation = self.create_validation(
                revision_id, validation_name, payload)

            payload.update({'name': validation_name})
            self.assertIsInstance(created_validation, dict)
            self.assertEqual(validation_name, created_validation['name'])
            self.assertEqual(payload['status'], created_validation['status'])
            self.assertEqual(payload['validator'],
                             created_validation['validator'])
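The ARMADA and PROMENADE fixtures above share a minimal result shape: a `status`, a `validator` block, and an optional `errors` list. As an illustration only (this helper is hypothetical and not part of the commit), a result dict in that shape can be summarized like so:

```python
# Hypothetical helper for illustration; it only mirrors the shape of the
# ARMADA_VALIDATION_POLICY / PROMENADE_VALIDATION_POLICY fixtures above
# and is not part of Deckhand itself.
def summarize_validation(result):
    """Return (status, validator name, error count) for a result dict."""
    validator = result.get('validator', {})
    return (result.get('status', 'unknown'),
            validator.get('name', 'unknown'),
            len(result.get('errors', [])))


promenade_result = {
    'status': 'failure',
    'errors': [{'message': 'Node has master role, but not included in '
                           'cluster masters list.'}],
    'validator': {'name': 'promenade', 'version': '1.1.2'},
}
print(summarize_validation(promenade_result))  # ('failure', 'promenade', 1)
```

Note that `errors` is optional in a success result, which is why the helper defaults it to an empty list.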
@@ -155,7 +155,7 @@ class TestDocumentLayering2LayersAbstractConcrete(TestDocumentLayering):
        }
        doc_factory = factories.DocumentFactory(2, [1, 1])
        documents = doc_factory.gen_test(mapping, site_abstract=False,
                                         global_abstract=False)

        site_expected = {'c': 9}
        global_expected = {'a': {'x': 1, 'y': 2}, 'c': 9}
@@ -171,7 +171,7 @@ class TestDocumentLayering2LayersAbstractConcrete(TestDocumentLayering):
        }
        doc_factory = factories.DocumentFactory(2, [1, 1])
        documents = doc_factory.gen_test(mapping, site_abstract=True,
                                         global_abstract=True)

        site_expected = {"a": {"x": 7, "z": 3}, "b": 4}
        global_expected = {'a': {'x': 1, 'y': 2}, 'c': 9}
@@ -20,6 +20,11 @@ from deckhand.tests.unit.engine import base as engine_test_base

class TestDocumentValidation(engine_test_base.TestDocumentValidationBase):

    def setUp(self):
        super(TestDocumentValidation, self).setUp()
        # Mock out DB module (i.e. retrieving DataSchema docs from DB).
        self.patch('deckhand.db.sqlalchemy.api.document_get_all')

    def test_init_document_validation(self):
        self._read_data('sample_document')
        doc_validation = document_validation.DocumentValidation(
@@ -15,56 +15,74 @@
from deckhand.engine import document_validation
from deckhand import errors
from deckhand.tests.unit.engine import base as engine_test_base
from deckhand import types


class TestDocumentValidationNegative(
        engine_test_base.TestDocumentValidationBase):
    """Negative testing suite for document validation."""

    BASIC_ATTRS = (
        'schema', 'metadata', 'data', 'metadata.schema', 'metadata.name')
    SCHEMA_ERR = ("The provided YAML failed schema validation. "
                  "Details: '%s' is a required property.")
    SCHEMA_ERR_ALT = ("The provided %s YAML failed schema validation. "
                      "Details: '%s' is a required property.")
    # The 'data' key is mandatory but not critical if excluded.
    CRITICAL_ATTRS = (
        'schema', 'metadata', 'metadata.schema', 'metadata.name')
    SCHEMA_ERR = "'%s' is a required property"

    def setUp(self):
        super(TestDocumentValidationNegative, self).setUp()
        # Mock out DB module (i.e. retrieving DataSchema docs from DB).
        self.patch('deckhand.db.sqlalchemy.api.document_get_all')

    def _test_missing_required_sections(self, properties_to_remove):
        for idx, property_to_remove in enumerate(properties_to_remove):
            critical = property_to_remove in self.CRITICAL_ATTRS

            missing_prop = property_to_remove.split('.')[-1]
            invalid_data = self._corrupt_data(property_to_remove)
            expected_err = self.SCHEMA_ERR % missing_prop

            if property_to_remove in self.BASIC_ATTRS:
                expected_err = self.SCHEMA_ERR % missing_prop
            doc_validator = document_validation.DocumentValidation(
                invalid_data)
            if critical:
                self.assertRaisesRegexp(
                    errors.InvalidDocumentFormat, expected_err,
                    doc_validator.validate_all)
            else:
                expected_err = self.SCHEMA_ERR_ALT % (
                    self.data['schema'], missing_prop)

                # NOTE(fmontei): '$' must be escaped for regex to pass.
                expected_err = expected_err.replace('$', '\$')

                with self.assertRaisesRegex(errors.InvalidDocumentFormat,
                                            expected_err):
                    document_validation.DocumentValidation(
                        invalid_data).validate_all()
                validations = doc_validator.validate_all()
                self.assertEqual(1, len(validations))
                self.assertEqual('failure', validations[0]['status'])
                self.assertEqual({'version': '1.0', 'name': 'deckhand'},
                                 validations[0]['validator'])
                self.assertEqual(types.DECKHAND_SCHEMA_VALIDATION,
                                 validations[0]['name'])
                self.assertEqual(1, len(validations[0]['errors']))
                self.assertEqual(self.data['metadata']['name'],
                                 validations[0]['errors'][0]['name'])
                self.assertEqual(self.data['schema'],
                                 validations[0]['errors'][0]['schema'])
                self.assertEqual(expected_err,
                                 validations[0]['errors'][0]['message'])

    def test_certificate_key_missing_required_sections(self):
        self._read_data('sample_certificate_key')
        properties_to_remove = self.BASIC_ATTRS + ('metadata.storagePolicy',)
        properties_to_remove = self.CRITICAL_ATTRS + (
            'data', 'metadata.storagePolicy',)
        self._test_missing_required_sections(properties_to_remove)

    def test_certificate_missing_required_sections(self):
        self._read_data('sample_certificate')
        properties_to_remove = self.BASIC_ATTRS + ('metadata.storagePolicy',)
        properties_to_remove = self.CRITICAL_ATTRS + (
            'data', 'metadata.storagePolicy',)
        self._test_missing_required_sections(properties_to_remove)

    def test_data_schema_missing_required_sections(self):
        self._read_data('sample_data_schema')
        properties_to_remove = self.BASIC_ATTRS + ('data.$schema',)
        properties_to_remove = self.CRITICAL_ATTRS + ('data', 'data.$schema',)
        self._test_missing_required_sections(properties_to_remove)

    def test_document_missing_required_sections(self):
        self._read_data('sample_document')
        properties_to_remove = self.BASIC_ATTRS + (
        properties_to_remove = self.CRITICAL_ATTRS + (
            'data',
            'metadata.layeringDefinition',
            'metadata.layeringDefinition.layer',
            'metadata.layeringDefinition.actions.0.method',
@@ -79,23 +97,41 @@ class TestDocumentValidationNegative(

    def test_document_invalid_layering_definition_action(self):
        self._read_data('sample_document')
        updated_data = self._corrupt_data(
            'metadata.layeringDefinition.actions.0.action', 'invalid',
        corrupted_data = self._corrupt_data(
            'metadata.layeringDefinition.actions.0.method', 'invalid',
            op='replace')
        self._test_missing_required_sections(updated_data)
        expected_err = "'invalid' is not one of ['replace', 'delete', 'merge']"

        doc_validator = document_validation.DocumentValidation(corrupted_data)
        validations = doc_validator.validate_all()
        self.assertEqual(1, len(validations))
        self.assertEqual('failure', validations[0]['status'])
        self.assertEqual({'version': '1.0', 'name': 'deckhand'},
                         validations[0]['validator'])
        self.assertEqual(types.DECKHAND_SCHEMA_VALIDATION,
                         validations[0]['name'])
        self.assertEqual(1, len(validations[0]['errors']))
        self.assertEqual(self.data['metadata']['name'],
                         validations[0]['errors'][0]['name'])
        self.assertEqual(self.data['schema'],
                         validations[0]['errors'][0]['schema'])
        self.assertEqual(expected_err,
                         validations[0]['errors'][0]['message'])

    def test_layering_policy_missing_required_sections(self):
        self._read_data('sample_layering_policy')
        properties_to_remove = self.BASIC_ATTRS + ('data.layerOrder',)
        properties_to_remove = self.CRITICAL_ATTRS + (
            'data', 'data.layerOrder',)
        self._test_missing_required_sections(properties_to_remove)

    def test_passphrase_missing_required_sections(self):
        self._read_data('sample_passphrase')
        properties_to_remove = self.BASIC_ATTRS + ('metadata.storagePolicy',)
        properties_to_remove = self.CRITICAL_ATTRS + (
            'data', 'metadata.storagePolicy',)
        self._test_missing_required_sections(properties_to_remove)

    def test_validation_policy_missing_required_sections(self):
        self._read_data('sample_validation_policy')
        properties_to_remove = self.BASIC_ATTRS + (
            'data.validations', 'data.validations.0.name')
        properties_to_remove = self.CRITICAL_ATTRS + (
            'data', 'data.validations', 'data.validations.0.name')
        self._test_missing_required_sections(properties_to_remove)
@@ -105,8 +105,9 @@ class TestRevisionViews(base.TestDbBase):
        self.assertEqual('success', revision_view['status'])
        self.assertIsInstance(revision_view['validationPolicies'], list)
        self.assertEqual(1, len(revision_view['validationPolicies']))
        self.assertEqual(revision_view['validationPolicies'][0]['name'],
                         'deckhand-schema-validation')
        self.assertRegexpMatches(
            revision_view['validationPolicies'][0]['name'],
            'deckhand-validation-policy-.*')
        self.assertEqual(revision_view['validationPolicies'][0]['status'],
                         'success')

@@ -133,7 +134,8 @@ class TestRevisionViews(base.TestDbBase):
        self.assertEqual('failed', revision_view['status'])
        self.assertIsInstance(revision_view['validationPolicies'], list)
        self.assertEqual(1, len(revision_view['validationPolicies']))
        self.assertEqual(revision_view['validationPolicies'][0]['name'],
                         'deckhand-schema-validation')
        self.assertRegexpMatches(
            revision_view['validationPolicies'][0]['name'],
            'deckhand-validation-policy-.*')
        self.assertEqual(revision_view['validationPolicies'][0]['status'],
                         'failed')
@@ -16,15 +16,17 @@
DOCUMENT_SCHEMA_TYPES = (
    CERTIFICATE_SCHEMA,
    CERTIFICATE_KEY_SCHEMA,
    DATA_SCHEMA_SCHEMA,
    LAYERING_POLICY_SCHEMA,
    PASSPHRASE_SCHEMA,
    VALIDATION_POLICY_SCHEMA,
) = (
    'deckhand/Certificate',
    'deckhand/CertificateKey',
    'deckhand/LayeringPolicy/v1',
    'deckhand/DataSchema',
    'deckhand/LayeringPolicy',
    'deckhand/Passphrase',
    'deckhand/ValidationPolicy/v1',
    'deckhand/ValidationPolicy',
)

@@ -32,7 +32,7 @@ def to_snake_case(name):
    return re.sub('([a-z0-9])([A-Z])', r'\1_\2', s1).lower()


def jsonpath_parse(data, jsonpath):
def jsonpath_parse(data, jsonpath, match_all=False):
    """Parse value in the data for the given ``jsonpath``.

    Retrieve the nested entry corresponding to ``data[jsonpath]``. For
@@ -66,7 +66,8 @@ def jsonpath_parse(data, jsonpath):
    p = jsonpath_ng.parse(jsonpath)
    matches = p.find(data)
    if matches:
        return matches[0].value
        result = [m.value for m in matches]
        return result if match_all else result[0]


def jsonpath_replace(data, value, jsonpath, pattern=None):
@@ -113,8 +114,8 @@ def jsonpath_replace(data, value, jsonpath, pattern=None):
    _value = value
    if pattern:
        to_replace = p_to_change[0].value
        # value represents the value to inject into to_replace that
        # matches the pattern.
        # `value` represents the value to inject into `to_replace` that
        # matches the `pattern`.
        try:
            _value = re.sub(pattern, value, to_replace)
        except TypeError:
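The change to `jsonpath_parse` above returns every match when `match_all=True` instead of only the first one. A rough pure-Python illustration of the same semantics for simple dotted paths (hypothetical code for this note only; Deckhand itself uses the `jsonpath_ng` library):

```python
def simple_path_find(data, dotted_path, match_all=False):
    """Resolve a dotted path like 'items.name' against nested dicts/lists.

    Mirrors the match_all semantics added to utils.jsonpath_parse: return a
    list of all matches when match_all is True, else the first match (or
    None when nothing matched). Illustrative only, not Deckhand code.
    """
    nodes = [data]
    for key in dotted_path.split('.'):
        next_nodes = []
        for node in nodes:
            if isinstance(node, list):
                # A list fans out into multiple candidate matches.
                next_nodes.extend(n[key] for n in node
                                  if isinstance(n, dict) and key in n)
            elif isinstance(node, dict) and key in node:
                next_nodes.append(node[key])
        nodes = next_nodes
    return nodes if match_all else (nodes[0] if nodes else None)


data = {'items': [{'name': 'a'}, {'name': 'b'}]}
print(simple_path_find(data, 'items.name', match_all=True))  # ['a', 'b']
print(simple_path_find(data, 'items.name'))                  # 'a'
```

The single-match default keeps the old call sites working, while `match_all=True` lets callers such as the new rawquery-based code collect every match in one pass.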
@@ -89,3 +89,18 @@
# DELETE /api/v1.0/revisions/{revision_id}/tags
#"deckhand:delete_tags": "rule:admin_api"

# Add the results of a validation for a particular revision.
# POST /api/v1.0/revisions/{revision_id}/validations
#"deckhand:create_validation": "rule:admin_api"

# List all validations that have been reported for a revision. Also
# lists the validation entries for a particular validation.
# GET /api/v1.0/revisions/{revision_id}/validations
# GET /api/v1.0/revisions/{revision_id}/validations/{validation_name}
#"deckhand:list_validations": "rule:admin_api"

# Gets the full details of a particular validation entry, including
# all posted error details.
# GET /api/v1.0/revisions/{revision_id}/validations/{validation_name}/entries/{entry_id}
#"deckhand:show_validation": "rule:admin_api"

14	releasenotes/notes/validations-api-cf07c8b6b5040f67.yaml	Normal file
@@ -0,0 +1,14 @@
---
features:
  - |
    The Validations API has been introduced to Deckhand, allowing users
    to register new validation results in Deckhand, as well as query
    the API for validation results for a revision. The validation results
    include a list of errors that occurred during document validation.

    The following endpoints have been implemented:

    * /api/v1.0/revisions/{revision_id}/validations
    * /api/v1.0/revisions/{revision_id}/validations/{validation_name}
    * /api/v1.0/revisions/{revision_id}/validations/{validation_name}/entries
    * /api/v1.0/revisions/{revision_id}/validations/{validation_name}/entries/{entry_id}
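The endpoints in the release note nest entry resources under a named validation for a revision. As a sketch only (this helper is hypothetical client-side code, not part of the commit), the list, show, and entry-detail paths can be built like so:

```python
# Hypothetical client-side helper; the route templates come from the release
# note above, but this function itself is illustrative and not Deckhand code.
def validation_path(revision_id, validation_name=None, entry_id=None):
    """Build a Validations API path for the given identifiers.

    Covers the list (/validations), show (/validations/{name}) and entry
    detail (/validations/{name}/entries/{id}) routes; the bare /entries
    listing route is omitted for brevity.
    """
    path = '/api/v1.0/revisions/%s/validations' % revision_id
    if validation_name is not None:
        path += '/%s' % validation_name
        # Entries only exist beneath a named validation.
        if entry_id is not None:
            path += '/entries/%s' % entry_id
    return path


print(validation_path(1, 'promenade-site-validation', 0))
# /api/v1.0/revisions/1/validations/promenade-site-validation/entries/0
```

A client would POST a validation result to the list path and GET the entry-detail path to inspect individual errors.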