[feat] DECKHAND-28: Document pre-validation logic and API integration

This commit constitutes the first of two monolithic ports from GitHub.
The following major changes have been made:

  - Created schemas for validating different types of documents
    (control and document schemas), including:
    * certificate key
    * certificate
    * data schema
    * document
    * layering policy
    * passphrase
    * validation policy
  - Implemented pre-validation logic which validates that each
    type of document conforms to the correct schema specifications
  - Implemented views for APIs -- this allows views to change the
    DB data to conform with API specifications
  - Implemented relevant unit tests
  - Implemented the functional testing foundation

Change-Id: I83582cc26ffef91fbe95d2f5f437f82d6fef6aa9
Felipe Monteiro 2017-07-14 18:01:24 +01:00
parent 9bbc767b0a
commit e1446bb9e1
44 changed files with 1415 additions and 315 deletions

View File

@@ -2,6 +2,6 @@
 test_command=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \
 OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \
 OS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-60} \
-${PYTHON:-python} -m subunit.run discover -t ./ . $LISTOPT $IDOPTION
+${PYTHON:-python} -m subunit.run discover -t ./ ${OS_TEST_PATH:-./deckhand/tests} $LISTOPT $IDOPTION
 test_id_option=--load-list $IDFILE
 test_list_option=--list
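The new discovery path relies on the shell's `${OS_TEST_PATH:-./deckhand/tests}` expansion: use the variable if set, otherwise fall back to the package's test tree. The equivalent lookup, sketched in Python:

```python
def discovery_path(environ):
    # Mirrors ${OS_TEST_PATH:-./deckhand/tests}: an unset or empty variable
    # falls back to the default test directory.
    value = environ.get('OS_TEST_PATH')
    return value if value else './deckhand/tests'
```

Exporting `OS_TEST_PATH` therefore lets a developer scope test discovery to a subtree without editing the config.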

View File

@@ -1,4 +1,5 @@
 Alan Meadows <alan.meadows@gmail.com>
 Felipe Monteiro <felipe.monteiro@att.com>
 Felipe Monteiro <fmontei@users.noreply.github.com>
+Mark Burnett <mark.m.burnett@gmail.com>
 Scott Hussey <sh8121@att.com>

View File

@@ -1,6 +1,53 @@
 CHANGES
 =======

+* Some integration with views/database
+* Add validations to document db model
+* Fix up _is_abstract in document_validation
+* Updated document schema
+* Resolved merge conflicts
+* Clean up
+* Refactor some code
+* Fixed failing tests
+* WIP: more changes, debugging, tests
+* Fix unit tests
+* Remove deprecated code, update deprecated schemas and add new schemas
+* Add schema validation for validation policy
+* Changed layers to type string in schemas
+* f
+* Add layering policy pre-validation schema
+* Add layering policy pre-validation schema
+* Refactor some code
+* Add endpoint/tests for GET /revisions/{revision_id}
+* Fix naming conflict error
+* Add view abstraction layer for modifying DB data into view data
+* Raise exception instead of return
+* Updated /GET revisions response body
+* Remove old docstring
+* Update control README (with current response bodies, even though they're a WIP)
+* Return YAML response body
+* Add endpoint for GET /revisions
+* Use built-in oslo_db types for Columns serialized as dicts
+* Finish retrieving documents by revision_id, including with filters
+* Clean up
+* Test and DB API changes
+* Add Revision resource
+* More tests for revisions-api. Fix minor bugs
+* Clarify layering actions start from full parent data
+* Add DELETE endpoint
+* Skip validation for abstract documents & add unit tests
+* Update schema validation to be internal validation
+* Update schema/db model/db api to align with design document
+* Add basic RBAC details to design document
+* Update documents/revisions relationship/tables
+* Update revision and document tables and add more unit tests
+* temp
+* Revisions database and API implementation
+* Update API paths for consistency
+* Add clarifications based on review
+* Use safe_load_all instead of safe_load
+* Add unit tests for db documents api
+* Remove oslo_versionedobjects
 * Change application/yaml to application/x-yaml
 * Cleaned up some logic, added exception handling to document creation
 * Add currently necessary oslo namespaces to oslo-config-generator conf file
@@ -10,8 +57,11 @@ CHANGES
 * Added oslo_context-based context for oslo_db compatibility
 * Update database documents schema
 * Helper for generating versioned object automatically from dictionary payload
+* Add description of substitution
 * Update README
 * Temporary change - do not commit
+* Reference Layering section in layeringDefinition description
+* Add overall layering description
 * Initial DB API models implementation
 * Added control (API) readme
 * [WIP] Implement documents API
@@ -25,9 +75,25 @@ CHANGES
 * Add additional documentation
 * Add jsonschema validation to Deckhand
 * Initial engine framework
+* fix typo
+* Provide a separate rendered-documents endpoint
+* Move reporting of validation status
+* Add samples for remaining endpoints
+* Address some initial review comments
+* WIP: Add initial design document
 * Fix incorrect comment
 * Deckhand initial ORM implementation
 * Deckhand initial ORM implementation
+* Add kind param to SchemaVersion class
+* Change apiVersion references to schemaVersion
+* Remove apiVersion attribute from substitutions.src attributes
+* Remove apiVersion attribute from substitutions.src attributes
+* Update default_schema with our updated schema definition
+* Trivial fix to default_schema
+* Use regexes for jsonschema pre-validation
+* Add additional documentation
+* Add jsonschema validation to Deckhand
+* Initial engine framework
 * Add oslo.log integration
 * DECKHAND-10: Add Barbican integration to Deckhand
 * Update ChangeLog

View File

@@ -35,4 +35,4 @@ To run locally in a development environment::
     $ . /var/tmp/deckhand/bin/activate
     $ sudo pip install .
     $ sudo python setup.py install
-    $ uwsgi --http :9000 -w deckhand.deckhand --callable deckhand_callable --enable-threads -L
+    $ uwsgi --http :9000 -w deckhand.cmd --callable deckhand_callable --enable-threads -L

View File

@@ -12,7 +12,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-from .control import api
+from deckhand.control import api


 def start_deckhand():

View File

@@ -161,7 +161,7 @@ Document creation can be tested locally using (from root deckhand directory):
     $ curl -i -X POST localhost:9000/api/v1.0/documents \
       -H "Content-Type: application/x-yaml" \
-      --data-binary "@deckhand/tests/unit/resources/sample.yaml"
+      --data-binary "@deckhand/tests/unit/resources/sample_document.yaml"

     # revision_id copy/pasted from previous response.
     $ curl -i -X GET localhost:9000/api/v1.0/revisions/0e99c8b9-bab4-4fc7-8405-7dbd22c33a30/documents

View File

@@ -50,13 +50,15 @@ class DocumentsResource(api_base.BaseResource):
         # All concrete documents in the payload must successfully pass their
         # JSON schema validations. Otherwise raise an error.
         try:
-            for doc in documents:
-                document_validation.DocumentValidation(doc).pre_validate()
-        except deckhand_errors.InvalidFormat as e:
+            validation_policies = document_validation.DocumentValidation(
+                documents).validate_all()
+        except (deckhand_errors.InvalidDocumentFormat,
+                deckhand_errors.UnknownDocumentFormat) as e:
             return self.return_error(resp, falcon.HTTP_400, message=e)

         try:
-            created_documents = db_api.documents_create(documents)
+            created_documents = db_api.documents_create(
+                documents, validation_policies)
         except db_exc.DBDuplicateEntry as e:
             return self.return_error(resp, falcon.HTTP_409, message=e)
         except Exception as e:
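The handler above follows a validate-first pattern: every document in the payload is schema-validated before anything touches the database, with validation failures mapped to 400 and duplicate entries to 409. A toy sketch of that control flow (the exception and helper names here are illustrative, not Deckhand's actual API):

```python
class InvalidDocumentFormat(Exception):
    """Stand-in for a schema-validation failure."""

class DuplicateEntry(Exception):
    """Stand-in for a DB uniqueness violation."""

def create_documents(documents, validate, save):
    # Validate every document up front; nothing is persisted on failure.
    try:
        validation_policies = validate(documents)
    except InvalidDocumentFormat as e:
        return 400, str(e)
    # Persist the documents together with the validation results.
    try:
        return 201, save(documents, validation_policies)
    except DuplicateEntry as e:
        return 409, str(e)
```

The point of the ordering is that a revision is only created once the whole payload has passed pre-validation.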

View File

@@ -16,32 +16,53 @@ from deckhand.control import common

 class ViewBuilder(common.ViewBuilder):
     """Model revision API responses as a python dictionary."""

     _collection_name = 'revisions'

     def list(self, revisions):
         resp_body = {
             'count': len(revisions),
-            'next': None,
-            'prev': None,
             'results': []
         }

         for revision in revisions:
             result = {}
             for attr in ('id', 'created_at'):
                 result[common.to_camel_case(attr)] = revision[attr]
             result['count'] = len(revision.pop('documents'))
             resp_body['results'].append(result)

         return resp_body

     def show(self, revision):
-        return {
-            'id': revision.get('id'),
-            'createdAt': revision.get('created_at'),
-            'url': self._gen_url(revision),
-            # TODO: Not yet implemented.
-            'validationPolicies': [],
-        }
+        """Generate view for showing revision details.
+
+        Each revision's documents should only be validation policies.
+        """
+        validation_policies = []
+        success_status = 'success'
+
+        for vp in revision['validation_policies']:
+            validation_policy = {}
+            validation_policy['name'] = vp.get('name')
+            validation_policy['url'] = self._gen_url(vp)
+            try:
+                validation_policy['status'] = vp['data']['validations'][0][
+                    'status']
+            except KeyError:
+                validation_policy['status'] = 'unknown'
+
+            validation_policies.append(validation_policy)
+
+            if validation_policy['status'] != 'success':
+                success_status = 'failed'
+
+        return {
+            'id': revision.get('id'),
+            'createdAt': revision.get('created_at'),
+            'url': self._gen_url(revision),
+            'validationPolicies': validation_policies,
+            'status': success_status
+        }
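The `show()` view above aggregates a per-revision status: any validation policy whose first recorded validation is missing or not `'success'` flips the overall status to `'failed'`, with a missing entry reading as `'unknown'`. A standalone sketch of that aggregation (this sketch also catches IndexError for an empty validations list, which the diff's version does not):

```python
def aggregate_status(validation_policies):
    # Overall status stays 'success' unless any policy's first validation
    # entry is absent or reports something other than 'success'.
    overall = 'success'
    statuses = []
    for vp in validation_policies:
        try:
            status = vp['data']['validations'][0]['status']
        except (KeyError, IndexError):
            status = 'unknown'
        statuses.append(status)
        if status != 'success':
            overall = 'failed'
    return overall, statuses
```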

View File

@@ -38,6 +38,7 @@ import sqlalchemy.sql as sa_sql

 from deckhand.db.sqlalchemy import models
 from deckhand import errors
+from deckhand import types
 from deckhand import utils

 sa_logger = None
@@ -111,18 +112,31 @@ def drop_db():
     models.unregister_models(get_engine())


-def documents_create(documents, session=None):
-    """Create a set of documents."""
-    created_docs = [document_create(doc, session) for doc in documents]
-    return created_docs
+def documents_create(documents, validation_policies, session=None):
+    session = session or get_session()
+
+    documents_created = _documents_create(documents, session)
+    val_policies_created = _documents_create(validation_policies, session)
+    all_docs_created = documents_created + val_policies_created
+
+    if all_docs_created:
+        revision = revision_create()
+
+        for doc in all_docs_created:
+            with session.begin():
+                doc['revision_id'] = revision['id']
+                doc.save(session=session)
+
+    return [d.to_dict() for d in documents_created]


-def documents_create(values_list, session=None):
+def _documents_create(values_list, session=None):
     """Create a set of documents and associated schema.

     If no changes are detected, a new revision will not be created. This
     allows services to periodically re-register their schemas without
     creating unnecessary revisions.
+
+    :param values_list: List of documents to be saved.
     """
     values_list = copy.deepcopy(values_list)
     session = session or get_session()
@@ -138,17 +152,24 @@ def documents_create(values_list, session=None):
             return True
         return False

+    def _get_model(schema):
+        if schema == types.LAYERING_POLICY_SCHEMA:
+            return models.LayeringPolicy()
+        elif schema == types.VALIDATION_POLICY_SCHEMA:
+            return models.ValidationPolicy()
+        else:
+            return models.Document()
+
     def _document_create(values):
-        document = models.Document()
+        document = _get_model(values['schema'])
         with session.begin():
             document.update(values)
-            document.save(session=session)
-        return document.to_dict()
+        return document

     for values in values_list:
         values['_metadata'] = values.pop('metadata')
         values['name'] = values['_metadata']['name']

         try:
             existing_document = document_get(
                 raw_dict=True,
@@ -164,10 +185,7 @@ def documents_create(values_list, session=None):
                 do_create = True

     if do_create:
-        revision = revision_create()
-
         for values in values_list:
-            values['revision_id'] = revision['id']
             doc = _document_create(values)
             documents_created.append(doc)
@@ -198,11 +216,13 @@ def revision_get(revision_id, session=None):
     :raises: RevisionNotFound if the revision was not found.
     """
     session = session or get_session()

     try:
         revision = session.query(models.Revision).filter_by(
             id=revision_id).one().to_dict()
     except sa_orm.exc.NoResultFound:
         raise errors.RevisionNotFound(revision=revision_id)

     return revision
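`documents_create()` now splits persistence into two phases: `_documents_create` builds model objects, choosing a model class per schema, and the revision id is stamped onto all of them only after a revision row exists. The per-schema dispatch can be sketched as follows (the schema constants are illustrative; the real values live in `deckhand.types`, which this diff only imports):

```python
LAYERING_POLICY_SCHEMA = 'deckhand/LayeringPolicy/v1.0'
VALIDATION_POLICY_SCHEMA = 'deckhand/ValidationPolicy/v1.0'

def model_for(schema):
    # Mirrors _get_model: special-case the two control-document schemas and
    # fall back to the generic Document model for everything else.
    if schema == LAYERING_POLICY_SCHEMA:
        return 'LayeringPolicy'
    elif schema == VALIDATION_POLICY_SCHEMA:
        return 'ValidationPolicy'
    return 'Document'
```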

View File

@@ -39,7 +39,7 @@ BASE = declarative.declarative_base()

 class DeckhandBase(models.ModelBase, models.TimestampMixin):
     """Base class for Deckhand Models."""

     __table_args__ = {'mysql_engine': 'InnoDB', 'mysql_charset': 'utf8'}
     __table_initialized__ = False
     __protected_attributes__ = set([
         "created_at", "updated_at", "deleted_at", "deleted"])
@@ -70,7 +70,12 @@ class DeckhandBase(models.ModelBase, models.TimestampMixin):
     def items(self):
         return self.__dict__.items()

-    def to_dict(self):
+    def to_dict(self, raw_dict=False):
+        """Convert the object into dictionary format.
+
+        :param raw_dict: if True, returns unmodified data; else returns data
+            expected by users.
+        """
         d = self.__dict__.copy()
         # Remove private state instance, as it is not serializable and causes
         # CircularReference.
@@ -83,11 +88,16 @@ class DeckhandBase(models.ModelBase, models.TimestampMixin):
             if k in d and d[k]:
                 d[k] = d[k].isoformat()

+        # NOTE(fmontei): ``metadata`` is reserved by the DB, so ``_metadata``
+        # must be used to store document metadata information in the DB.
+        if not raw_dict and '_metadata' in self.keys():
+            d['metadata'] = d['_metadata']
+
         return d

     @staticmethod
-    def gen_unqiue_contraint(self, *fields):
-        constraint_name = 'ix_' + self.__class__.__name__.lower() + '_'
+    def gen_unqiue_contraint(*fields):
+        constraint_name = 'ix_' + DeckhandBase.__name__.lower() + '_'
         for field in fields:
             constraint_name = constraint_name + '_%s' % field
         return schema.UniqueConstraint(*fields, name=constraint_name)
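`to_dict()` now papers over a SQLAlchemy naming restriction: `metadata` is reserved on declarative models, so the column is stored as `_metadata` and re-exposed as `metadata` unless `raw_dict=True`. A minimal dict-level sketch of that remapping (the helper name is illustrative):

```python
def to_user_dict(row, raw_dict=False):
    # 'metadata' is reserved at the DB layer, so documents persist the field
    # as '_metadata'; user-facing views re-expose it under its real name.
    d = dict(row)
    if not raw_dict and '_metadata' in d:
        d['metadata'] = d['_metadata']
    return d
```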
@@ -98,56 +108,74 @@ class Revision(BASE, DeckhandBase):

     id = Column(String(36), primary_key=True,
                 default=lambda: str(uuid.uuid4()))
-    parent_id = Column(Integer, ForeignKey('revisions.id'), nullable=True)
-    child_id = Column(Integer, ForeignKey('revisions.id'), nullable=True)
-    results = Column(oslo_types.JsonEncodedList(), nullable=True)
     documents = relationship("Document")
+    validation_policies = relationship("ValidationPolicy")

     def to_dict(self):
         d = super(Revision, self).to_dict()
         d['documents'] = [doc.to_dict() for doc in self.documents]
+        d['validation_policies'] = [
+            vp.to_dict() for vp in self.validation_policies]
         return d


-class Document(BASE, DeckhandBase):
+class DocumentMixin(object):
+    """Mixin class for sharing common columns across all document resources
+    such as documents themselves, layering policies and validation policies."""
+
+    name = Column(String(64), nullable=False)
+    schema = Column(String(64), nullable=False)
+    # NOTE: Do not define a maximum length for these JSON data below. However,
+    # this approach is not compatible with all database types.
+    # "metadata" is reserved, so use "_metadata" instead.
+    _metadata = Column(oslo_types.JsonEncodedDict(), nullable=False)
+    data = Column(oslo_types.JsonEncodedDict(), nullable=False)
+
+    @declarative.declared_attr
+    def revision_id(cls):
+        return Column(Integer, ForeignKey('revisions.id'), nullable=False)
+
+
+class Document(BASE, DeckhandBase, DocumentMixin):

     UNIQUE_CONSTRAINTS = ('schema', 'name', 'revision_id')
     __tablename__ = 'documents'
     __table_args__ = (DeckhandBase.gen_unqiue_contraint(*UNIQUE_CONSTRAINTS),)

     id = Column(String(36), primary_key=True,
                 default=lambda: str(uuid.uuid4()))
-    schema = Column(String(64), nullable=False)
-    name = Column(String(64), nullable=False)
-    # NOTE: Do not define a maximum length for these JSON data below. However,
-    # this approach is not compatible with all database types.
-    # "metadata" is reserved, so use "_metadata" instead.
-    _metadata = Column(oslo_types.JsonEncodedDict(), nullable=False)
-    data = Column(oslo_types.JsonEncodedDict(), nullable=False)
-    revision_id = Column(Integer, ForeignKey('revisions.id'), nullable=False)
-
-    def to_dict(self, raw_dict=False):
-        """Convert the ``Document`` object into a dictionary format.
-
-        :param raw_dict: if True, returns unmodified data; else returns data
-            expected by users.
-        :returns: dictionary format of ``Document`` object.
-        """
-        d = super(Document, self).to_dict()
-        # ``_metadata`` is used in the DB schema as ``metadata`` is reserved.
-        if not raw_dict:
-            d['metadata'] = d.pop('_metadata')
-        return d
+
+
+class LayeringPolicy(BASE, DeckhandBase, DocumentMixin):
+
+    # NOTE(fmontei): Only one layering policy can exist per revision, so
+    # enforce this constraint at the DB level.
+    UNIQUE_CONSTRAINTS = ('revision_id',)
+    __tablename__ = 'layering_policies'
+    __table_args__ = (DeckhandBase.gen_unqiue_contraint(*UNIQUE_CONSTRAINTS),)
+
+    id = Column(String(36), primary_key=True,
+                default=lambda: str(uuid.uuid4()))
+
+
+class ValidationPolicy(BASE, DeckhandBase, DocumentMixin):
+
+    UNIQUE_CONSTRAINTS = ('schema', 'name', 'revision_id')
+    __tablename__ = 'validation_policies'
+    __table_args__ = (DeckhandBase.gen_unqiue_contraint(*UNIQUE_CONSTRAINTS),)
+
+    id = Column(String(36), primary_key=True,
+                default=lambda: str(uuid.uuid4()))


 def register_models(engine):
     """Create database tables for all models with the given engine."""
-    models = [Document]
+    models = [Document, Revision, LayeringPolicy, ValidationPolicy]
     for model in models:
         model.metadata.create_all(engine)


 def unregister_models(engine):
     """Drop database tables for all models with the given engine."""
-    models = [Document]
+    models = [Document, Revision, LayeringPolicy, ValidationPolicy]
     for model in models:
         model.metadata.drop_all(engine)
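The reworked `gen_unqiue_contraint` (the misspelling is in the source) is now a true `@staticmethod`, so the index name is always derived from `DeckhandBase` rather than the calling class. Stripped of SQLAlchemy, the name it builds looks like this:

```python
def constraint_name(*fields):
    # 'ix_' + lowercased class name + '_', then '_<field>' per field,
    # matching the string handed to the model's UniqueConstraint.
    name = 'ix_' + 'DeckhandBase'.lower() + '_'
    for field in fields:
        name = name + '_%s' % field
    return name
```

For Document's `('schema', 'name', 'revision_id')` this yields `ix_deckhandbase__schema_name_revision_id` (note the double underscore left by the trailing-`'_'` seed).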

View File

@@ -14,11 +14,12 @@

 import jsonschema
 from oslo_log import log as logging
-import six

-from deckhand.engine.schema.v1_0 import default_policy_validation
-from deckhand.engine.schema.v1_0 import default_schema_validation
+from deckhand.engine.schema import base_schema
+from deckhand.engine.schema import v1_0
 from deckhand import errors
+from deckhand import factories
+from deckhand import types

 LOG = logging.getLogger(__name__)
@@ -26,74 +27,146 @@ LOG = logging.getLogger(__name__)

 class DocumentValidation(object):
     """Class for document validation logic for YAML files.

-    This class is responsible for performing built-in validations on Documents.
+    This class is responsible for validating YAML files according to their
+    schema.

-    :param data: YAML data that requires secrets to be validated, merged and
-        consolidated.
+    :param documents: Documents to be validated.
+    :type documents: List of dictionaries or dictionary.
     """

-    def __init__(self, data):
-        self.data = data
+    def __init__(self, documents):
+        if not isinstance(documents, (list, tuple)):
+            documents = [documents]
+        self.documents = documents

-    class SchemaVersion(object):
+    class SchemaType(object):
         """Class for retrieving correct schema for pre-validation on YAML.

         Retrieves the schema that corresponds to "apiVersion" in the YAML
         data. This schema is responsible for performing pre-validation on
         YAML data.
-
-        The built-in validation schemas that are always executed include:
-
-            - `deckhand-document-schema-validation`
-            - `deckhand-policy-validation`
         """

-        # TODO: Use the correct validation based on the Document's schema.
-        internal_validations = [
-            {'version': 'v1', 'fqn': 'deckhand-document-schema-validation',
-             'schema': default_schema_validation},
-            {'version': 'v1', 'fqn': 'deckhand-policy-validation',
-             'schema': default_policy_validation}]
+        # TODO(fmontei): Support dynamically registered schemas.
+        schema_versions_info = [
+            {'id': 'deckhand/CertificateKey',
+             'schema': v1_0.certificate_key_schema},
+            {'id': 'deckhand/Certificate',
+             'schema': v1_0.certificate_schema},
+            {'id': 'deckhand/DataSchema',
+             'schema': v1_0.data_schema},
+            # NOTE(fmontei): Fall back to the metadata's schema for validating
+            # generic documents.
+            {'id': 'metadata/Document',
+             'schema': v1_0.document_schema},
+            {'id': 'deckhand/LayeringPolicy',
+             'schema': v1_0.layering_schema},
+            {'id': 'deckhand/Passphrase',
+             'schema': v1_0.passphrase_schema},
+            {'id': 'deckhand/ValidationPolicy',
+             'schema': v1_0.validation_schema}]

-        def __init__(self, schema_version):
-            self.schema_version = schema_version
+        def __init__(self, data):
+            """Constructor for ``SchemaType``.

-        @property
-        def schema(self):
-            # TODO: return schema based on Document's schema.
-            return [v['schema'] for v in self.internal_validations
-                    if v['version'] == self.schema_version][0].schema
+            Retrieve the relevant schema based on the API version and schema
+            name contained in `document.schema` where `document` constitutes a
+            single document in a YAML payload.

-    def pre_validate(self):
-        """Pre-validate that the YAML file is correctly formatted."""
-        self._validate_with_schema()
+            :param api_version: The API version used for schema validation.
+            :param schema: The schema property in `document.schema`.
+            """
+            self.schema = self.get_schema(data)

-    def _validate_with_schema(self):
-        # Validate the document using the document's ``schema``. Only validate
-        # concrete documents.
+        def get_schema(self, data):
+            # Fall back to `document.metadata.schema` if the schema cannot be
+            # determined from `data.schema`.
+            for doc_property in [data['schema'], data['metadata']['schema']]:
+                schema = self._get_schema_by_property(doc_property)
+                if schema:
+                    return schema
+            return None
+
+        def _get_schema_by_property(self, doc_property):
+            schema_parts = doc_property.split('/')
+            doc_schema_identifier = '/'.join(schema_parts[:-1])
+
+            for schema in self.schema_versions_info:
+                if doc_schema_identifier == schema['id']:
+                    return schema['schema'].schema
+            return None
+
+    def validate_all(self):
+        """Pre-validate that the YAML file is correctly formatted.
+
+        All concrete documents in the revision successfully pass their JSON
+        schema validations. The result of the validation is stored under
+        the "deckhand-document-schema-validation" validation namespace for
+        a document revision.
+
+        Validation is broken up into 2 stages:
+
+            1) Validate that each document contains the basic building blocks
+               needed: "schema", "metadata" and "data" using a "base" schema.
+            2) Validate each specific document type (e.g. validation policy)
+               using a more detailed schema.
+
+        :returns: Dictionary mapping with keys being the unique name for each
+            document and values being the validations executed for that
+            document, including failed and succeeded validations.
+        """
+        internal_validation_docs = []
+        validation_policy_factory = factories.ValidationPolicyFactory()
+
+        for document in self.documents:
+            self._validate_one(document)
+
+            deckhand_schema_validation = validation_policy_factory.gen(
+                types.DECKHAND_SCHEMA_VALIDATION, status='success')
+            internal_validation_docs.append(deckhand_schema_validation)
+
+        return internal_validation_docs
+
+    def _validate_one(self, document):
+        # Subject every document to basic validation to verify that each
+        # main section is present (schema, metadata, data).
         try:
-            abstract = self.data['metadata']['layeringDefinition'][
-                'abstract']
-            is_abstract = six.text_type(abstract).lower() == 'true'
-        except KeyError as e:
-            raise errors.InvalidFormat(
-                "Could not find 'abstract' property from document.")
-
-        # TODO: This should be done inside a different module.
-        if is_abstract:
-            LOG.info(
-                "Skipping validation for the document because it is abstract")
-            return
-
-        try:
-            schema_version = self.data['schema'].split('/')[-1]
-            doc_schema_version = self.SchemaVersion(schema_version)
-        except (AttributeError, IndexError, KeyError) as e:
-            raise errors.InvalidFormat(
-                'The provided schema is invalid or missing. Exception: '
-                '%s.' % e)
-        try:
-            jsonschema.validate(self.data, doc_schema_version.schema)
+            jsonschema.validate(document, base_schema.schema)
         except jsonschema.exceptions.ValidationError as e:
-            raise errors.InvalidFormat('The provided YAML file is invalid. '
-                                       'Exception: %s.' % e.message)
+            raise errors.InvalidDocumentFormat(
+                detail=e.message, schema=e.schema)
+
+        doc_schema_type = self.SchemaType(document)
+        if doc_schema_type.schema is None:
+            raise errors.UnknownDocumentFormat(
+                document_type=document['schema'])
+
+        # Perform more detailed validation on each document depending on
+        # its schema. If the document is abstract, validation errors are
+        # ignored.
+        try:
+            jsonschema.validate(document, doc_schema_type.schema)
+        except jsonschema.exceptions.ValidationError as e:
+            # TODO(fmontei): Use the `Document` object wrapper instead
+            # once other PR is merged.
+            if not self._is_abstract(document):
+                raise errors.InvalidDocumentFormat(
+                    detail=e.message, schema=e.schema,
+                    document_type=document['schema'])
+            else:
+                LOG.info('Skipping schema validation for abstract '
+                         'document: %s.' % document)
+
+    def _is_abstract(self, document):
+        try:
+            is_abstract = document['metadata']['layeringDefinition'][
+                'abstract'] == True
+            return is_abstract
+        # NOTE(fmontei): If the document is of ``document_schema`` type and
+        # no "layeringDefinition" or "abstract" property is found, then treat
+        # this as a validation error.
+        except KeyError:
+            doc_schema_type = self.SchemaType(document)
+            return doc_schema_type.schema is v1_0.document_schema
+        return False
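`SchemaType` resolves a document to a pre-validation schema by stripping the version suffix from `document.schema` (falling back to `metadata.schema`) and matching the remainder against the registered ids. The id-matching step, in isolation:

```python
# Registered schema identifiers, as listed in schema_versions_info above.
SCHEMA_IDS = ('deckhand/CertificateKey', 'deckhand/Certificate',
              'deckhand/DataSchema', 'metadata/Document',
              'deckhand/LayeringPolicy', 'deckhand/Passphrase',
              'deckhand/ValidationPolicy')

def schema_id(doc_property):
    # 'deckhand/Certificate/v1.0' -> 'deckhand/Certificate': drop the
    # trailing version segment, then look the identifier up.
    identifier = '/'.join(doc_property.split('/')[:-1])
    return identifier if identifier in SCHEMA_IDS else None
```

An unmatched identifier returns None, which is what triggers the UnknownDocumentFormat path in `_validate_one`.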

View File

@ -0,0 +1,36 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
schema = {
'type': 'object',
'properties': {
'schema': {
'type': 'string',
# Currently supported versions include v1 only.
'pattern': '^([A-Za-z]+\/[A-Za-z]+\/v[1]{1}\.[0]{1})$'
},
'metadata': {
'type': 'object',
'properties': {
'schema': {'type': 'string'},
'name': {'type': 'string'}
},
'additionalProperties': True,
'required': ['schema', 'name']
},
'data': {'type': ['string', 'object']}
},
'additionalProperties': False,
'required': ['schema', 'metadata', 'data']
}
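The base schema's `schema` pattern accepts exactly two alphabetic path segments followed by a literal `v1.0`. The regex can be exercised directly:

```python
import re

# Same pattern as the base schema above: '<ns>/<kind>/v1.0' only.
SCHEMA_PATTERN = re.compile(r'^([A-Za-z]+\/[A-Za-z]+\/v[1]{1}\.[0]{1})$')

def is_valid_schema_ref(value):
    # Full-string match: any other version (v2.0) or non-alphabetic
    # segment (hyphens, digits) is rejected.
    return SCHEMA_PATTERN.match(value) is not None
```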

View File

@ -0,0 +1,25 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from deckhand.engine.schema.v1_0 import certificate_key_schema
from deckhand.engine.schema.v1_0 import certificate_schema
from deckhand.engine.schema.v1_0 import data_schema
from deckhand.engine.schema.v1_0 import document_schema
from deckhand.engine.schema.v1_0 import layering_schema
from deckhand.engine.schema.v1_0 import passphrase_schema
from deckhand.engine.schema.v1_0 import validation_schema
__all__ = ['certificate_key_schema', 'certificate_schema', 'data_schema',
'document_schema', 'layering_schema', 'passphrase_schema',
'validation_schema']

View File

@ -0,0 +1,42 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
schema = {
'type': 'object',
'properties': {
'schema': {
'type': 'string',
'pattern': '^(deckhand/CertificateKey/v[1]{1}\.[0]{1})$'
},
'metadata': {
'type': 'object',
'properties': {
'schema': {
'type': 'string',
'pattern': '^(metadata/Document/v[1]{1}\.[0]{1})$',
},
'name': {'type': 'string'},
'storagePolicy': {
'type': 'string',
'pattern': '^(encrypted)$'
}
},
'additionalProperties': False,
'required': ['schema', 'name', 'storagePolicy']
},
'data': {'type': 'string'}
},
'additionalProperties': False,
'required': ['schema', 'metadata', 'data']
}
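The `pattern` keywords in this schema are ordinary regular expressions. As a quick sanity check, here is an illustrative document (the name and key material are made up, not from Deckhand's test fixtures) exercised against the two anchored patterns above:

```python
import re

# Hypothetical document satisfying the CertificateKey schema above.
sample = {
    'schema': 'deckhand/CertificateKey/v1.0',
    'metadata': {
        'schema': 'metadata/Document/v1.0',
        'name': 'application-private-key',
        'storagePolicy': 'encrypted',
    },
    'data': '-----BEGIN RSA PRIVATE KEY-----\n...',
}

# JSON Schema validators apply 'pattern' unanchored, which is why the
# schema anchors each pattern explicitly with '^' and '$'.
assert re.search(r'^(deckhand/CertificateKey/v[1]{1}\.[0]{1})$',
                 sample['schema'])
assert re.search(r'^(encrypted)$', sample['metadata']['storagePolicy'])
```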

@@ -0,0 +1,42 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
schema = {
'type': 'object',
'properties': {
'schema': {
'type': 'string',
'pattern': '^(deckhand/Certificate/v[1]{1}\.[0]{1})$'
},
'metadata': {
'type': 'object',
'properties': {
'schema': {
'type': 'string',
'pattern': '^(metadata/Document/v[1]{1}\.[0]{1})$',
},
'name': {'type': 'string'},
'storagePolicy': {
'type': 'string',
'pattern': '^(cleartext)$'
}
},
'additionalProperties': False,
'required': ['schema', 'name', 'storagePolicy']
},
'data': {'type': 'string'}
},
'additionalProperties': False,
'required': ['schema', 'metadata', 'data']
}

@@ -0,0 +1,54 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# This specifies the official JSON schema meta-schema. DataSchema documents
# are used by various services to register new schemas that Deckhand can use
# for validation.
schema = {
'type': 'object',
'properties': {
'schema': {
'type': 'string',
'pattern': '^(deckhand/DataSchema/v[1]{1}\.[0]{1})$'
},
'metadata': {
'type': 'object',
'properties': {
'schema': {
'type': 'string',
'pattern': '^(metadata/Control/v[1]{1}\.[0]{1})$'
},
'name': {'type': 'string'},
# Labels are optional.
'labels': {
'type': 'object'
}
},
'additionalProperties': False,
'required': ['schema', 'name']
},
'data': {
'type': 'object',
'properties': {
'$schema': {
'type': 'string'
}
},
'additionalProperties': False,
'required': ['$schema']
}
},
'additionalProperties': False,
'required': ['schema', 'metadata', 'data']
}
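The comment above notes that DataSchema documents let services register new schemas with Deckhand. A hedged sketch of such a document (the service/kind names are invented for illustration): metadata uses the Control schema, and `data` may contain only a `$schema` key.

```python
# Hypothetical DataSchema document conforming to the schema above.
data_schema_doc = {
    'schema': 'deckhand/DataSchema/v1.0',
    'metadata': {
        'schema': 'metadata/Control/v1.0',
        # The name conventionally encodes the schema being registered.
        'name': 'promenade/Node/v1.0',
        'labels': {'application': 'promenade'},  # labels are optional
    },
    'data': {'$schema': 'http://json-schema.org/schema#'},
}

# Quick structural checks mirroring the schema's 'required' and
# 'additionalProperties' constraints:
assert set(data_schema_doc) == {'schema', 'metadata', 'data'}
assert set(data_schema_doc['data']) == {'$schema'}
```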

@@ -18,10 +18,10 @@ substitution_schema = {
         'dest': {
             'type': 'object',
             'properties': {
-                'path': {'type': 'string'}
+                'path': {'type': 'string'},
+                'pattern': {'type': 'string'}
             },
             'additionalProperties': False,
+            # 'replacePattern' is not required.
             'required': ['path']
         },
         'src': {
@@ -44,35 +44,32 @@ schema = {
     'properties': {
         'schema': {
             'type': 'string',
-            'pattern': '^(.*\/v[0-9]{1})$'
+            'pattern': '^([A-Za-z]+/[A-Za-z]+/v[1]{1}\.[0]{1})$'
         },
         'metadata': {
             'type': 'object',
             'properties': {
                 'schema': {
                     'type': 'string',
-                    'pattern': '^(.*/v[0-9]{1})$'
+                    'pattern': '^(metadata/Document/v[1]{1}\.[0]{1})$'
                 },
                 'name': {'type': 'string'},
-                'storagePolicy': {'type': 'string'},
-                'labels': {
-                    'type': 'object'
-                },
+                'labels': {'type': 'object'},
                 'layeringDefinition': {
                     'type': 'object',
                     'properties': {
                         'layer': {'type': 'string'},
                         'abstract': {'type': 'boolean'},
-                        'parentSelector': {
-                            'type': 'object'
-                        },
+                        # "parentSelector" is optional.
+                        'parentSelector': {'type': 'object'},
+                        # "actions" is optional.
                         'actions': {
                             'type': 'array',
                             'items': {
                                 'type': 'object',
                                 'properties': {
-                                    'method': {'enum': ['merge', 'delete',
-                                                        'replace']},
+                                    'method': {'enum': ['replace', 'delete',
+                                                        'merge']},
                                     'path': {'type': 'string'}
                                 },
                                 'additionalProperties': False,
@@ -81,16 +78,16 @@ schema = {
                             }
                         },
                         'additionalProperties': False,
-                        'required': ['layer', 'abstract', 'parentSelector']
+                        'required': ['layer', 'abstract']
                     },
+                # "substitutions" is optional.
                 'substitutions': {
                     'type': 'array',
                     'items': substitution_schema
                 }
             },
             'additionalProperties': False,
-            'required': ['schema', 'name', 'storagePolicy', 'labels',
-                         'layeringDefinition', 'substitutions']
+            'required': ['schema', 'name', 'layeringDefinition']
         },
         'data': {
             'type': 'object'
@@ -0,0 +1,48 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
schema = {
'type': 'object',
'properties': {
'schema': {
'type': 'string',
'pattern': '^(deckhand/LayeringPolicy/v[1]{1}\.[0]{1})$'
},
'metadata': {
'type': 'object',
'properties': {
'schema': {
'type': 'string',
'pattern': '^(metadata/Control/v[1]{1}\.[0]{1})$'
},
'name': {'type': 'string'}
},
'additionalProperties': False,
'required': ['schema', 'name']
},
'data': {
'type': 'object',
'properties': {
'layerOrder': {
'type': 'array',
'items': {'type': 'string'}
}
},
'additionalProperties': True,
'required': ['layerOrder']
}
},
'additionalProperties': False,
'required': ['schema', 'metadata', 'data']
}
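A minimal sketch of how a LayeringPolicy's `layerOrder` is typically interpreted: layers are ordered from most general to most specific, and a document may inherit only from layers listed earlier. The helper name and layer names here are illustrative, not Deckhand API.

```python
layering_policy = {
    'schema': 'deckhand/LayeringPolicy/v1.0',
    'metadata': {'schema': 'metadata/Control/v1.0',
                 'name': 'layering-policy'},
    'data': {'layerOrder': ['global', 'region', 'site']},
}


def layers_above(policy, layer):
    """Return the layers a document in ``layer`` may inherit from."""
    order = policy['data']['layerOrder']
    return order[:order.index(layer)]


assert layers_above(layering_policy, 'site') == ['global', 'region']
assert layers_above(layering_policy, 'global') == []
```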

@@ -0,0 +1,42 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
schema = {
'type': 'object',
'properties': {
'schema': {
'type': 'string',
'pattern': '^(deckhand/Passphrase/v[1]{1}\.[0]{1})$'
},
'metadata': {
'type': 'object',
'properties': {
'schema': {
'type': 'string',
'pattern': '^(metadata/Document/v[1]{1}\.[0]{1})$',
},
'name': {'type': 'string'},
'storagePolicy': {
'type': 'string',
'pattern': '^(encrypted)$'
}
},
'additionalProperties': False,
'required': ['schema', 'name', 'storagePolicy']
},
'data': {'type': 'string'}
},
'additionalProperties': False,
'required': ['schema', 'metadata', 'data']
}

@@ -0,0 +1,60 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
schema = {
'type': 'object',
'properties': {
'schema': {
'type': 'string',
'pattern': '^(deckhand/ValidationPolicy/v[1]{1}\.[0]{1})$'
},
'metadata': {
'type': 'object',
'properties': {
'schema': {
'type': 'string',
'pattern': '^(metadata/Control/v[1]{1}\.[0]{1})$'
},
'name': {'type': 'string'}
},
'additionalProperties': False,
'required': ['schema', 'name']
},
'data': {
'type': 'object',
'properties': {
'validations': {
'type': 'array',
'items': {
'type': 'object',
'properties': {
'name': {
'type': 'string',
'pattern': '^.*-(validation|verification)$'
},
# 'expiresAfter' is optional.
'expiresAfter': {'type': 'string'}
},
'additionalProperties': False,
'required': ['name']
}
}
},
'additionalProperties': True,
'required': ['validations']
}
},
'additionalProperties': False,
'required': ['schema', 'metadata', 'data']
}
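The validation-name pattern above requires a `-validation` or `-verification` suffix; a quick demonstration against names of the shape used elsewhere in this commit:

```python
import re

NAME_PATTERN = r'^.*-(validation|verification)$'

assert re.match(NAME_PATTERN, 'deckhand-schema-validation')
assert re.match(NAME_PATTERN, 'drydock-site-verification')
# A name without either suffix is rejected.
assert re.match(NAME_PATTERN, 'promenade-site') is None
```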

@@ -45,17 +45,23 @@ class DeckhandException(Exception):
         return self.args[0]


-class ApiError(Exception):
-    pass
+class InvalidDocumentFormat(DeckhandException):
+    msg_fmt = ("The provided YAML failed schema validation. Details: "
+               "%(detail)s. Schema: %(schema)s.")
+    alt_msg_fmt = ("The provided %(document_type)s YAML failed schema "
+                   "validation. Details: %(detail)s. Schema: %(schema)s.")
+
+    def __init__(self, document_type=None, **kwargs):
+        if document_type:
+            self.msg_fmt = self.alt_msg_fmt
+            kwargs.update({'document_type': document_type})
+        super(InvalidDocumentFormat, self).__init__(**kwargs)


-class InvalidFormat(ApiError):
-    """The YAML file is incorrectly formatted and cannot be read."""
-    code = 400
+class UnknownDocumentFormat(DeckhandException):
+    msg_fmt = ("Could not determine the validation schema to validate the "
+               "document type: %(document_type)s.")
+
+
+class DocumentExists(DeckhandException):
+    msg_fmt = ("Document with kind %(kind)s and schemaVersion "
+               "%(schema_version)s already exists.")


 class RevisionNotFound(DeckhandException):
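These exception classes rely on an oslo-style base class whose constructor interpolates keyword arguments into `msg_fmt`. A minimal sketch of that convention — the base-class body here is an assumption for illustration, not Deckhand's exact implementation:

```python
class DeckhandException(Exception):
    """Sketch of the assumed base class: subclasses declare ``msg_fmt``
    and keyword arguments are interpolated into it at construction time."""
    msg_fmt = "An unknown exception occurred."

    def __init__(self, **kwargs):
        super(DeckhandException, self).__init__(self.msg_fmt % kwargs)


class UnknownDocumentFormat(DeckhandException):
    msg_fmt = ("Could not determine the validation schema to validate the "
               "document type: %(document_type)s.")


err = UnknownDocumentFormat(document_type='deckhand/Unknown/v1.0')
assert 'deckhand/Unknown/v1.0' in str(err)
```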

deckhand/factories.py (new file)
@@ -0,0 +1,115 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import abc
import copy

import six
from oslo_log import log as logging

from deckhand.tests import test_utils
from deckhand import types

LOG = logging.getLogger(__name__)


@six.add_metaclass(abc.ABCMeta)
class DeckhandFactory(object):
@abc.abstractmethod
def gen(self, *args):
pass
@abc.abstractmethod
def gen_test(self, *args, **kwargs):
pass
class ValidationPolicyFactory(DeckhandFactory):
"""Class for auto-generating validation policy templates for testing."""
VALIDATION_POLICY_TEMPLATE = {
"data": {
"validations": []
},
"metadata": {
"schema": "metadata/Control/v1",
"name": ""
},
"schema": types.VALIDATION_POLICY_SCHEMA
}
def __init__(self):
"""Constructor for ``ValidationPolicyFactory``.
Returns a template whose YAML representation is of the form::
---
schema: deckhand/ValidationPolicy/v1
metadata:
schema: metadata/Control/v1
name: site-deploy-ready
data:
validations:
- name: deckhand-schema-validation
- name: drydock-site-validation
expiresAfter: P1W
- name: promenade-site-validation
expiresAfter: P1W
- name: armada-deployability-validation
...
"""
pass
def gen(self, validation_type, status):
if validation_type not in types.DECKHAND_VALIDATION_TYPES:
raise ValueError("The validation type must be in %s."
% types.DECKHAND_VALIDATION_TYPES)
validation_policy_template = copy.deepcopy(
self.VALIDATION_POLICY_TEMPLATE)
validation_policy_template['metadata'][
'name'] = validation_type
validation_policy_template['data']['validations'] = [
{'name': validation_type, 'status': status}
]
return validation_policy_template
def gen_test(self, name=None, num_validations=None):
"""Generate the test document template.
Generate the document template based on the arguments passed to
the constructor and to this function.
"""
        if num_validations is None:
            num_validations = 3
        if not (isinstance(num_validations, int) and num_validations > 0):
            raise ValueError('The "num_validations" argument must be an '
                             'integer value > 0.')

        if not name:
            name = test_utils.rand_name('validation-policy')
validations = [
test_utils.rand_name('validation-name')
for _ in range(num_validations)]
validation_policy_template = copy.deepcopy(
self.VALIDATION_POLICY_TEMPLATE)
validation_policy_template['metadata']['name'] = name
validation_policy_template['data']['validations'] = validations
return validation_policy_template
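Both `gen` and `gen_test` deep-copy the class-level template before mutating it. A stdlib-only sketch of why that matters (template and field names are shortened for illustration): without `copy.deepcopy`, every generated policy would share, and corrupt, the same nested lists and dicts.

```python
import copy

TEMPLATE = {'data': {'validations': []},
            'metadata': {'schema': 'metadata/Control/v1', 'name': ''}}


def gen(validation_type, status):
    # deepcopy so repeated calls never mutate the shared template.
    tpl = copy.deepcopy(TEMPLATE)
    tpl['metadata']['name'] = validation_type
    tpl['data']['validations'] = [
        {'name': validation_type, 'status': status}]
    return tpl


first = gen('deckhand-schema-validation', 'success')
second = gen('other-validation', 'failure')
assert first['data']['validations'] != second['data']['validations']
# The class-level template is untouched.
assert TEMPLATE['data']['validations'] == []
```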

@@ -0,0 +1,37 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import mock
import falcon
from falcon import testing as falcon_testing
from deckhand.control import api
from deckhand import factories
from deckhand.tests.unit import base as test_base
class TestFunctionalBase(test_base.DeckhandWithDBTestCase,
falcon_testing.TestCase):
"""Base class for functional testing."""
def setUp(self):
super(TestFunctionalBase, self).setUp()
self.app = falcon_testing.TestClient(api.start_api())
self.validation_policy_factory = factories.ValidationPolicyFactory()
@classmethod
def setUpClass(cls):
super(TestFunctionalBase, cls).setUpClass()
mock.patch.object(api, '__setup_logging').start()

@@ -0,0 +1,50 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import yaml
import falcon
from deckhand.control import api
from deckhand.tests.functional import base as test_base
from deckhand import types
class TestDocumentsApi(test_base.TestFunctionalBase):
def _read_test_resource(self, file_name):
dir_path = os.path.dirname(os.path.realpath(__file__))
test_yaml_path = os.path.abspath(os.path.join(
dir_path, os.pardir, 'unit', 'resources', file_name + '.yaml'))
with open(test_yaml_path, 'r') as yaml_file:
yaml_data = yaml_file.read()
return yaml_data
def test_create_document(self):
yaml_data = self._read_test_resource('sample_document')
result = self.app.simulate_post('/api/v1.0/documents', body=yaml_data)
self.assertEqual(falcon.HTTP_201, result.status)
expected_documents = [yaml.safe_load(yaml_data)]
expected_validation_policy = self.validation_policy_factory.gen(
types.DECKHAND_SCHEMA_VALIDATION, status='success')
# Validate that the correct number of documents were created: one
# document corresponding to ``yaml_data``.
resp_documents = [d for d in yaml.safe_load_all(result.text)]
self.assertIsInstance(resp_documents, list)
self.assertEqual(1, len(resp_documents))
self.assertIn('revision_id', resp_documents[0])

@@ -14,21 +14,20 @@
 import mock
-import testtools

 from deckhand.control import api
 from deckhand.control import base as api_base
 from deckhand.control import documents
 from deckhand.control import revision_documents
 from deckhand.control import revisions
 from deckhand.control import secrets
+from deckhand.tests.unit import base as test_base


-class TestApi(testtools.TestCase):
+class TestApi(test_base.DeckhandTestCase):

     def setUp(self):
         super(TestApi, self).setUp()
-        for resource in (documents, revisions, revision_documents, secrets):
+        for resource in (documents, revision_documents, revisions, secrets):
             resource_name = resource.__name__.split('.')[-1]
             resource_obj = mock.patch.object(
                 resource, '%sResource' % resource_name.title().replace(

@@ -14,12 +14,11 @@
 import mock
-import testtools

 from deckhand.control import base as api_base
+from deckhand.tests.unit import base as test_base


-class TestBaseResource(testtools.TestCase):
+class TestBaseResource(test_base.DeckhandTestCase):

     def setUp(self):
         super(TestBaseResource, self).setUp()

@@ -23,7 +23,7 @@ BASE_EXPECTED_FIELDS = ("created_at", "updated_at", "deleted_at", "deleted")
 DOCUMENT_EXPECTED_FIELDS = BASE_EXPECTED_FIELDS + (
     "id", "schema", "name", "metadata", "data", "revision_id")
 REVISION_EXPECTED_FIELDS = BASE_EXPECTED_FIELDS + (
-    "id", "child_id", "parent_id", "documents")
+    "id", "documents", "validation_policies")


 class DocumentFixture(object):
@@ -48,19 +48,24 @@ class DocumentFixture(object):
     @staticmethod
     def get_minimal_multi_fixture(count=2, **kwargs):
         return [DocumentFixture.get_minimal_fixture(**kwargs)
                 for _ in range(count)]


 class TestDbBase(base.DeckhandWithDBTestCase):

-    def _create_documents(self, payload):
-        if not isinstance(payload, list):
-            payload = [payload]
-
-        docs = db_api.documents_create(payload)
+    def _create_documents(self, documents, validation_policies=None):
+        if not validation_policies:
+            validation_policies = []
+
+        if not isinstance(documents, list):
+            documents = [documents]
+        if not isinstance(validation_policies, list):
+            validation_policies = [validation_policies]
+
+        docs = db_api.documents_create(documents, validation_policies)
         for idx, doc in enumerate(docs):
-            self._validate_document(expected=payload[idx], actual=doc)
+            self._validate_document(expected=documents[idx], actual=doc)
         return docs

     def _get_document(self, **fields):

@@ -24,9 +24,8 @@ class TestDocuments(base.TestDbBase):
         self.assertIsInstance(documents, list)
         self.assertEqual(1, len(documents))

-        for document in documents:
-            retrieved_document = self._get_document(id=document['id'])
-            self.assertEqual(document, retrieved_document)
+        retrieved_document = self._get_document(id=documents[0]['id'])
+        self.assertEqual(documents[0], retrieved_document)

     def test_create_document_again_with_no_changes(self):
         payload = base.DocumentFixture.get_minimal_fixture()

@@ -13,16 +13,32 @@
 # limitations under the License.

 from deckhand.tests.unit.db import base
+from deckhand import factories
+from deckhand import types


-class TestRevisionViews(base.TestDbBase):
+class TestRevisions(base.TestDbBase):

     def test_list(self):
-        payload = [base.DocumentFixture.get_minimal_fixture()
-                   for _ in range(4)]
-        self._create_documents(payload)
+        documents = [base.DocumentFixture.get_minimal_fixture()
+                     for _ in range(4)]
+        self._create_documents(documents)

         revisions = self._list_revisions()
         self.assertIsInstance(revisions, list)
         self.assertEqual(1, len(revisions))
         self.assertEqual(4, len(revisions[0]['documents']))
+
+    def test_list_with_validation_policies(self):
+        documents = [base.DocumentFixture.get_minimal_fixture()
+                     for _ in range(4)]
+        vp_factory = factories.ValidationPolicyFactory()
+        validation_policy = vp_factory.gen(types.DECKHAND_SCHEMA_VALIDATION,
+                                           'success')
+        self._create_documents(documents, [validation_policy])
+
+        revisions = self._list_revisions()
+        self.assertIsInstance(revisions, list)
+        self.assertEqual(1, len(revisions))
+        self.assertEqual(4, len(revisions[0]['documents']))
+        self.assertEqual(1, len(revisions[0]['validation_policies']))

@@ -0,0 +1,89 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import copy
import os
import yaml
import mock
import six
from deckhand.engine import document_validation
from deckhand import errors
from deckhand.tests.unit import base as test_base
class TestDocumentValidationBase(test_base.DeckhandTestCase):
def _read_data(self, file_name):
dir_path = os.path.dirname(os.path.realpath(__file__))
test_yaml_path = os.path.abspath(os.path.join(
dir_path, os.pardir, 'resources', file_name + '.yaml'))
with open(test_yaml_path, 'r') as yaml_file:
yaml_data = yaml_file.read()
self.data = yaml.safe_load(yaml_data)
def _corrupt_data(self, key, value=None, data=None, op='delete'):
"""Corrupt test data to check that pre-validation works.
Corrupt data by removing a key from the document (if ``op`` is delete)
or by replacing the value corresponding to the key with ``value`` (if
``op`` is replace).
:param key: The document key to be removed. The key can have the
following formats:
* 'data' => document.pop('data')
* 'metadata.name' => document['metadata'].pop('name')
* 'metadata.substitutions.0.dest' =>
document['metadata']['substitutions'][0].pop('dest')
:type key: string
:param value: The new value that corresponds to the (nested) document
key (only used if ``op`` is 'replace').
:type value: type string
:param data: The data to "corrupt".
:type data: dict
:param op: Controls whether data is deleted (if "delete") or is
replaced with ``value`` (if "replace").
:type op: string
:returns: Corrupted data.
"""
if data is None:
data = self.data
if op not in ('delete', 'replace'):
raise ValueError("The ``op`` argument must either be 'delete' or "
"'replace'.")
corrupted_data = copy.deepcopy(data)
if '.' in key:
_corrupted_data = corrupted_data
nested_keys = key.split('.')
for nested_key in nested_keys:
if nested_key == nested_keys[-1]:
break
if nested_key.isdigit():
_corrupted_data = _corrupted_data[int(nested_key)]
else:
_corrupted_data = _corrupted_data[nested_key]
if op == 'delete':
_corrupted_data.pop(nested_keys[-1])
elif op == 'replace':
_corrupted_data[nested_keys[-1]] = value
else:
if op == 'delete':
corrupted_data.pop(key)
elif op == 'replace':
corrupted_data[key] = value
return corrupted_data
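The dotted-key traversal above can be exercised standalone. A condensed sketch (the helper name is illustrative, not this class's API) showing both the delete and replace paths, including the list-indexing behavior for numeric segments:

```python
import copy


def corrupt(data, key, value=None, op='delete'):
    """Delete or replace the value at a dotted key path; numeric path
    segments index into lists, everything else into dicts."""
    result = copy.deepcopy(data)
    node = result
    parts = key.split('.')
    for part in parts[:-1]:
        node = node[int(part)] if part.isdigit() else node[part]
    if op == 'delete':
        node.pop(parts[-1])
    else:
        node[parts[-1]] = value
    return result


doc = {'metadata': {'substitutions': [{'dest': {'path': '.a'}}]}}
assert corrupt(doc, 'metadata.substitutions.0.dest') == (
    {'metadata': {'substitutions': [{}]}})
assert corrupt(doc, 'metadata.substitutions.0.dest.path', '.b',
               op='replace') == (
    {'metadata': {'substitutions': [{'dest': {'path': '.b'}}]}})
# The original document is never mutated.
assert doc['metadata']['substitutions'][0]['dest']['path'] == '.a'
```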

@@ -12,112 +12,54 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-import copy
-import os
-
-import testtools
-import yaml
-
 import mock
-import six

 from deckhand.engine import document_validation
-from deckhand import errors
+from deckhand.tests.unit.engine import base as engine_test_base


-class TestDocumentValidation(testtools.TestCase):
+class TestDocumentValidation(engine_test_base.TestDocumentValidationBase):

-    def setUp(self):
-        super(TestDocumentValidation, self).setUp()
-        dir_path = os.path.dirname(os.path.realpath(__file__))
-        test_yaml_path = os.path.abspath(os.path.join(
-            dir_path, os.pardir, 'resources', 'sample.yaml'))
-
-        with open(test_yaml_path, 'r') as yaml_file:
-            yaml_data = yaml_file.read()
-        self.data = yaml.safe_load(yaml_data)
-
-    def _corrupt_data(self, key, data=None):
-        """Corrupt test data to check that pre-validation works.
-
-        Corrupt data by removing a key from the document. Each key must
-        correspond to a value that is a dictionary.
-
-        :param key: The document key to be removed. The key can have the
-            following formats:
-                * 'data' => document.pop('data')
-                * 'metadata.name' => document['metadata'].pop('name')
-                * 'metadata.substitutions.0.dest' =>
-                  document['metadata']['substitutions'][0].pop('dest')
-        :returns: Corrupted data.
-        """
-        if data is None:
-            data = self.data
-        corrupted_data = copy.deepcopy(data)
-
-        if '.' in key:
-            _corrupted_data = corrupted_data
-            nested_keys = key.split('.')
-            for nested_key in nested_keys:
-                if nested_key == nested_keys[-1]:
-                    break
-                if nested_key.isdigit():
-                    _corrupted_data = _corrupted_data[int(nested_key)]
-                else:
-                    _corrupted_data = _corrupted_data[nested_key]
-            _corrupted_data.pop(nested_keys[-1])
-        else:
-            corrupted_data.pop(key)
-
-        return corrupted_data
-
-    def test_initialization(self):
-        doc_validation = document_validation.DocumentValidation(self.data)
-        doc_validation.pre_validate()  # Should not raise any errors.
+    def test_init_document_validation(self):
+        self._read_data('sample_document')
+        doc_validation = document_validation.DocumentValidation(
+            self.data)
+        self.assertIsInstance(doc_validation,
+                              document_validation.DocumentValidation)

-    def test_initialization_missing_sections(self):
-        expected_err = ("The provided YAML file is invalid. Exception: '%s' "
-                        "is a required property.")
-        invalid_data = [
-            (self._corrupt_data('data'), 'data'),
-            (self._corrupt_data('metadata.schema'), 'schema'),
-            (self._corrupt_data('metadata.name'), 'name'),
-            (self._corrupt_data('metadata.substitutions'), 'substitutions'),
-            (self._corrupt_data('metadata.substitutions.0.dest'), 'dest'),
-            (self._corrupt_data('metadata.substitutions.0.src'), 'src')
-        ]
-
-        for invalid_entry, missing_key in invalid_data:
-            with six.assertRaisesRegex(self, errors.InvalidFormat,
-                                       expected_err % missing_key):
-                doc_validation = document_validation.DocumentValidation(
-                    invalid_entry)
-                doc_validation.pre_validate()
+    def test_data_schema_missing_optional_sections(self):
+        self._read_data('sample_data_schema')
+        optional_missing_data = [
+            self._corrupt_data('metadata.labels'),
+        ]
+
+        for missing_data in optional_missing_data:
+            document_validation.DocumentValidation(
+                missing_data).validate_all()

-    def test_initialization_missing_abstract_section(self):
-        expected_err = ("Could not find 'abstract' property from document.")
-        invalid_data = [
-            self._corrupt_data('metadata'),
-            self._corrupt_data('metadata.layeringDefinition'),
-            self._corrupt_data('metadata.layeringDefinition.abstract'),
-        ]
-
-        for invalid_entry in invalid_data:
-            with six.assertRaisesRegex(self, errors.InvalidFormat,
-                                       expected_err):
-                doc_validation = document_validation.DocumentValidation(
-                    invalid_entry)
-                doc_validation.pre_validate()
+    def test_document_missing_optional_sections(self):
+        self._read_data('sample_document')
+        properties_to_remove = (
+            'metadata.layeringDefinition.actions',
+            'metadata.layeringDefinition.parentSelector',
+            'metadata.substitutions',
+            'metadata.substitutions.2.dest.pattern')
+
+        for property_to_remove in properties_to_remove:
+            optional_data_removed = self._corrupt_data(property_to_remove)
+            document_validation.DocumentValidation(
+                optional_data_removed).validate_all()

     @mock.patch.object(document_validation, 'LOG', autospec=True)
-    def test_initialization_with_abstract_document(self, mock_log):
-        abstract_data = copy.deepcopy(self.data)
-
-        for true_val in (True, 'true', 'True'):
-            abstract_data['metadata']['layeringDefinition']['abstract'] = True
-
-            doc_validation = document_validation.DocumentValidation(
-                abstract_data)
-            doc_validation.pre_validate()
-            mock_log.info.assert_called_once_with(
-                "Skipping validation for the document because it is abstract")
-            mock_log.info.reset_mock()
+    def test_abstract_document_not_validated(self, mock_log):
+        self._read_data('sample_document')
+        # Set the document to abstract.
+        updated_data = self._corrupt_data(
+            'metadata.layeringDefinition.abstract', True, op='replace')
+        # Guarantee that a validation error is thrown by removing a required
+        # property.
+        del updated_data['metadata']['layeringDefinition']['layer']
+
+        document_validation.DocumentValidation(updated_data).validate_all()
+        self.assertTrue(mock_log.info.called)
+        self.assertIn("Skipping schema validation for abstract document",
+                      mock_log.info.mock_calls[0][1][0])

@@ -0,0 +1,115 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from deckhand.engine import document_validation
from deckhand import errors
from deckhand.tests.unit.engine import base as engine_test_base


class TestDocumentValidationNegative(
        engine_test_base.TestDocumentValidationBase):
    """Negative testing suite for document validation."""

    BASIC_ATTRS = (
        'schema', 'metadata', 'data', 'metadata.schema', 'metadata.name')
    SCHEMA_ERR = ("The provided YAML failed schema validation. "
                  "Details: '%s' is a required property.")
    SCHEMA_ERR_ALT = ("The provided %s YAML failed schema validation. "
                      "Details: '%s' is a required property.")

    def _test_missing_required_sections(self, properties_to_remove):
        for idx, property_to_remove in enumerate(properties_to_remove):
            missing_prop = property_to_remove.split('.')[-1]
            invalid_data = self._corrupt_data(property_to_remove)
            if property_to_remove in self.BASIC_ATTRS:
                expected_err = self.SCHEMA_ERR % missing_prop
            else:
                expected_err = self.SCHEMA_ERR_ALT % (
                    self.data['schema'], missing_prop)
            # NOTE(fmontei): '$' must be escaped for regex to pass.
            expected_err = expected_err.replace('$', '\\$')
            with self.assertRaisesRegex(errors.InvalidDocumentFormat,
                                        expected_err):
                document_validation.DocumentValidation(
                    invalid_data).validate_all()

    def test_certificate_key_missing_required_sections(self):
        self._read_data('sample_certificate_key')
        properties_to_remove = self.BASIC_ATTRS + ('metadata.storagePolicy',)
        self._test_missing_required_sections(properties_to_remove)

    def test_certificate_missing_required_sections(self):
        self._read_data('sample_certificate')
        properties_to_remove = self.BASIC_ATTRS + ('metadata.storagePolicy',)
        self._test_missing_required_sections(properties_to_remove)

    def test_data_schema_missing_required_sections(self):
        self._read_data('sample_data_schema')
        properties_to_remove = self.BASIC_ATTRS + ('data.$schema',)
        self._test_missing_required_sections(properties_to_remove)

    def test_document_missing_required_sections(self):
        self._read_data('sample_document')
        properties_to_remove = self.BASIC_ATTRS + (
            'metadata.layeringDefinition',
            'metadata.layeringDefinition.abstract',
            'metadata.layeringDefinition.layer',
            'metadata.layeringDefinition.actions.0.method',
            'metadata.layeringDefinition.actions.0.path',
            'metadata.substitutions.0.dest',
            'metadata.substitutions.0.dest.path',
            'metadata.substitutions.0.src',
            'metadata.substitutions.0.src.schema',
            'metadata.substitutions.0.src.name',
            'metadata.substitutions.0.src.path')
        self._test_missing_required_sections(properties_to_remove)

    def test_document_invalid_layering_definition_action(self):
        self._read_data('sample_document')
        updated_data = self._corrupt_data(
            'metadata.layeringDefinition.actions.0.action', 'invalid',
            op='replace')
        self._test_missing_required_sections(updated_data)

    def test_layering_policy_missing_required_sections(self):
        self._read_data('sample_layering_policy')
        properties_to_remove = self.BASIC_ATTRS + ('data.layerOrder',)
        self._test_missing_required_sections(properties_to_remove)

    def test_passphrase_missing_required_sections(self):
        self._read_data('sample_passphrase')
        properties_to_remove = self.BASIC_ATTRS + ('metadata.storagePolicy',)
        self._test_missing_required_sections(properties_to_remove)

    def test_passphrase_with_incorrect_storage_policy(self):
        self._read_data('sample_passphrase')
        expected_err = (
            "The provided deckhand/Passphrase/v1.0 YAML failed schema "
            "validation. Details: 'cleartext' does not match '^(encrypted)$'")
        wrong_data = self._corrupt_data('metadata.storagePolicy', 'cleartext',
                                        op='replace')

        doc_validation = document_validation.DocumentValidation(wrong_data)
        e = self.assertRaises(errors.InvalidDocumentFormat,
                              doc_validation.validate_all)
        self.assertIn(expected_err, str(e))

    def test_validation_policy_missing_required_sections(self):
        self._read_data('sample_validation_policy')
        properties_to_remove = self.BASIC_ATTRS + (
            'data.validations', 'data.validations.0.name')
        self._test_missing_required_sections(properties_to_remove)
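The `_corrupt_data` helper used throughout these tests lives in the test base class, which is not part of this diff. A minimal sketch of what such a helper plausibly does, inferred from its call sites (dotted key paths, numeric list indices, an `op` keyword) — the traversal details here are assumptions, not Deckhand's actual implementation:

```python
import copy


def corrupt_data(data, key_path, value=None, op='remove'):
    """Return a copy of ``data`` with ``key_path`` removed or replaced.

    ``key_path`` is a dotted path such as 'metadata.substitutions.0.src';
    purely numeric segments index into lists.
    """
    corrupted = copy.deepcopy(data)  # never mutate the caller's fixture
    node = corrupted
    segments = key_path.split('.')
    for segment in segments[:-1]:
        node = node[int(segment)] if segment.isdigit() else node[segment]
    last = segments[-1]
    key = int(last) if last.isdigit() else last
    if op == 'remove':
        del node[key]
    elif op == 'replace':
        node[key] = value
    return corrupted


doc = {'metadata': {'substitutions': [{'src': {'path': '.'}}]}}
broken = corrupt_data(doc, 'metadata.substitutions.0.src.path')
```

After the call, `broken` lacks the nested `path` key while the original `doc` is untouched, which is what lets the tests corrupt one property per iteration.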


@ -1,38 +0,0 @@
---
schema: some-service/ResourceType/v1
metadata:
  schema: metadata/Document/v1
  name: unique-name-given-schema
  storagePolicy: cleartext
  labels:
    genesis: enabled
    master: enabled
  layeringDefinition:
    abstract: false
    layer: region
    parentSelector:
      required_key_a: required_label_a
      required_key_b: required_label_b
    actions:
      - method: merge
        path: .path.to.merge.into.parent
      - method: delete
        path: .path.to.delete
  substitutions:
    - dest:
        path: .substitution.target
      src:
        schema: another-service/SourceType/v1
        name: name-of-source-document
        path: .source.path
data:
  path:
    to:
      merge:
        into:
          parent:
            foo: bar
  ignored: # Will not be part of the resultant document after layering.
    data: here
  substitution:
    target: null # Paths do not need to exist to be specified as substitution destinations.


@ -0,0 +1,13 @@
---
schema: deckhand/Certificate/v1.0
metadata:
  schema: metadata/Document/v1.0
  name: application-api
  storagePolicy: cleartext
data: |-
  -----BEGIN CERTIFICATE-----
  MIIDYDCCAkigAwIBAgIUKG41PW4VtiphzASAMY4/3hL8OtAwDQYJKoZIhvcNAQEL
  ...snip...
  P3WT9CfFARnsw2nKjnglQcwKkKLYip0WY2wh3FE7nrQZP6xKNaSRlh6p2pCGwwwH
  HkvVwA==
  -----END CERTIFICATE-----


@ -0,0 +1,12 @@
---
schema: deckhand/CertificateKey/v1.0
metadata:
  schema: metadata/Document/v1.0
  name: application-api
  storagePolicy: encrypted
data: |-
  -----BEGIN RSA PRIVATE KEY-----
  MIIEpQIBAAKCAQEAx+m1+ao7uTVEs+I/Sie9YsXL0B9mOXFlzEdHX8P8x4nx78/T
  ...snip...
  Zf3ykIG8l71pIs4TGsPlnyeO6LzCWP5WRSh+BHnyXXjzx/uxMOpQ/6I=
  -----END RSA PRIVATE KEY-----


@ -0,0 +1,9 @@
---
schema: deckhand/DataSchema/v1.0 # This specifies the official JSON schema meta-schema.
metadata:
  schema: metadata/Control/v1.0
  name: promenade/Node/v1.0 # Specifies the documents to be used for validation.
  labels:
    application: promenade
data: # Valid JSON Schema is expected here.
  $schema: http://blah
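A `DataSchema` document registers a JSON schema (its `data` section) that can later be applied to every document whose `schema` matches the `metadata.name` above. Deckhand does this with the `jsonschema` library; the sketch below hand-rolls only the `required` keyword so it stays dependency-free, and the sample schema and node document are hypothetical:

```python
def find_schema_errors(schema, instance):
    """Collect missing required properties as jsonschema-style messages."""
    errors = []
    for prop in schema.get('required', []):
        if prop not in instance:
            errors.append("'%s' is a required property" % prop)
    return errors


data_schema = {  # stands in for the DataSchema document's ``data`` section
    '$schema': 'http://json-schema.org/schema#',
    'required': ['hostname', 'ip'],
}
node_doc = {'hostname': 'node01'}  # data of a hypothetical promenade/Node/v1.0 doc

errors = find_schema_errors(data_schema, node_doc)
```

The error-message shape mirrors the `'%s' is a required property` strings asserted in the negative tests earlier in this commit.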


@ -0,0 +1,46 @@
# Sample YAML file for testing forward replacement.
---
schema: promenade/ResourceType/v1.0
metadata:
  schema: metadata/Document/v1.0
  name: a-unique-config-name-12345
  labels:
    component: apiserver
    hostname: server0
  layeringDefinition:
    layer: global
    abstract: False
    parentSelector:
      required_key_a: required_label_a
      required_key_b: required_label_b
    actions:
      - method: merge
        path: .path.to.merge.into.parent
      - method: delete
        path: .path.to.delete
  substitutions:
    - dest:
        path: .chart.values.tls.certificate
      src:
        schema: deckhand/Certificate/v1.0
        name: example-cert
        path: .
    - dest:
        path: .chart.values.tls.key
      src:
        schema: deckhand/CertificateKey/v1.0
        name: example-key
        path: .
    - dest:
        path: .chart.values.some_url
        pattern: INSERT_[A-Z]+_HERE
      src:
        schema: deckhand/Passphrase/v1.0
        name: example-password
        path: .
data:
  chart:
    details:
      data: here
    values:
      some_url: http://admin:INSERT_PASSWORD_HERE@service-name:8080/v1
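The third substitution above uses `pattern` to replace only the matching substring of the destination value, rather than the whole value. A rough sketch of that behavior — the function name and signature are illustrative, not Deckhand's actual API:

```python
import re


def substitute(dest_value, src_value, pattern=None):
    """Replace dest_value wholesale, or only the substring matching
    ``pattern`` when one is given."""
    if pattern is None:
        return src_value
    return re.sub(pattern, src_value, dest_value)


url = 'http://admin:INSERT_PASSWORD_HERE@service-name:8080/v1'
result = substitute(url, 'some-password', pattern='INSERT_[A-Z]+_HERE')
```

With the sample data above, the password from the `deckhand/Passphrase/v1.0` source document would land inside the URL while the rest of the string is preserved.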


@ -0,0 +1,13 @@
# Sample layering policy.
---
schema: deckhand/LayeringPolicy/v1.0
metadata:
  schema: metadata/Control/v1
  name: a-unique-config-name-12345
data:
  layerOrder:
    - global
    - global-network
    - global-storage
    - region
    - site
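`layerOrder` establishes precedence for document layering: a document may only inherit from a layer listed earlier in the order than its own. A small sketch of that lookup — the helper name is invented for illustration:

```python
def allowed_parent_layers(layer_order, child_layer):
    """Layers a document in ``child_layer`` may draw a parent from."""
    idx = layer_order.index(child_layer)  # raises ValueError for unknown layers
    return layer_order[:idx]


layer_order = ['global', 'global-network', 'global-storage', 'region', 'site']
parents = allowed_parent_layers(layer_order, 'region')
```

A `region` document can therefore layer on top of any of the three `global*` layers, while a `global` document has no parent candidates at all.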


@ -0,0 +1,7 @@
---
schema: deckhand/Passphrase/v1.0
metadata:
  schema: metadata/Document/v1.0
  name: application-admin-password
  storagePolicy: encrypted
data: some-password


@ -0,0 +1,14 @@
# Sample post-validation policy document.
---
schema: deckhand/ValidationPolicy/v1.0
metadata:
  schema: metadata/Control/v1.0
  name: later-validation
data:
  validations:
    - name: deckhand-schema-validation
    - name: drydock-site-validation
      expiresAfter: P1W
    - name: promenade-site-validation
      expiresAfter: P1W
    - name: armada-deployability-validation
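`expiresAfter` takes an ISO 8601 duration (`P1W` is one week) after which a validation result goes stale. The sketch below checks expiry with a deliberately tiny duration parser that handles only day and week units — an illustration of the idea, not Deckhand's implementation:

```python
import datetime
import re


def parse_duration(duration):
    """Parse a small subset of ISO 8601 durations, e.g. P1W or P3D."""
    match = re.fullmatch(r'P(\d+)([DW])', duration)
    if not match:
        raise ValueError('unsupported duration: %s' % duration)
    count, unit = int(match.group(1)), match.group(2)
    return datetime.timedelta(days=count * (7 if unit == 'W' else 1))


def is_expired(created_at, duration, now=None):
    """True once more than ``duration`` has elapsed since ``created_at``."""
    now = now or datetime.datetime.utcnow()
    return now - created_at > parse_duration(duration)


created = datetime.datetime(2017, 7, 1)
check_time = datetime.datetime(2017, 7, 9)
expired = is_expired(created, 'P1W', now=check_time)
```

Here a validation recorded on July 1 is stale by July 9 under a `P1W` policy, so it would need to be re-run before the revision could be considered fully validated.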


@ -13,8 +13,10 @@
# limitations under the License.

from deckhand.control.views import revision
from deckhand import factories
from deckhand.tests.unit.db import base
from deckhand.tests import test_utils
from deckhand import types


class TestRevisionViews(base.TestDbBase):
@ -22,15 +24,16 @@ class TestRevisionViews(base.TestDbBase):
    def setUp(self):
        super(TestRevisionViews, self).setUp()
        self.view_builder = revision.ViewBuilder()
        self.factory = factories.ValidationPolicyFactory()

    def test_list_revisions_with_multiple_documents(self):
        payload = [base.DocumentFixture.get_minimal_fixture()
                   for _ in range(4)]
        self._create_documents(payload)
        revisions = self._list_revisions()
        revisions_view = self.view_builder.list(revisions)
        expected_attrs = ('results', 'count')
        for attr in expected_attrs:
            self.assertIn(attr, revisions_view)
        # Validate that only 1 revision was returned.
@ -40,7 +43,7 @@ class TestRevisionViews(base.TestDbBase):
        self.assertIn('count', revisions_view['results'][0])
        self.assertEqual(4, revisions_view['results'][0]['count'])

    def test_list_multiple_revisions(self):
        docs_count = []
        for _ in range(3):
            doc_count = test_utils.rand_int(3, 9)
@ -52,7 +55,7 @@ class TestRevisionViews(base.TestDbBase):
        revisions = self._list_revisions()
        revisions_view = self.view_builder.list(revisions)
        expected_attrs = ('results', 'count')
        for attr in expected_attrs:
            self.assertIn(attr, revisions_view)
        # Validate that only 1 revision was returned.
@ -69,10 +72,79 @@ class TestRevisionViews(base.TestDbBase):
        payload = [base.DocumentFixture.get_minimal_fixture()
                   for _ in range(4)]
        documents = self._create_documents(payload)

        # Validate that each document points to the same revision.
        revision_ids = set([d['revision_id'] for d in documents])
        self.assertEqual(1, len(revision_ids))

        revision = self._get_revision(documents[0]['revision_id'])
        revision_view = self.view_builder.show(revision)
        expected_attrs = ('id', 'url', 'createdAt', 'validationPolicies',
                          'status')
        for attr in expected_attrs:
            self.assertIn(attr, revision_view)
        self.assertIsInstance(revision_view['validationPolicies'], list)
        self.assertEqual(revision_view['validationPolicies'], [])

    def test_show_revision_successful_validation_policy(self):
        # Simulate 4 document payload with an internally generated validation
        # policy which executes 'deckhand-schema-validation'.
        payload = [base.DocumentFixture.get_minimal_fixture()
                   for _ in range(4)]
        validation_policy = self.factory.gen(types.DECKHAND_SCHEMA_VALIDATION,
                                             status='success')
        payload.append(validation_policy)
        documents = self._create_documents(payload)

        revision = self._get_revision(documents[0]['revision_id'])
        revision_view = self.view_builder.show(revision)
        expected_attrs = ('id', 'url', 'createdAt', 'validationPolicies',
                          'status')
        expected_validation_policies = [
            {'name': 'deckhand-schema-validation'}, 'status'
        ]
        for attr in expected_attrs:
            self.assertIn(attr, revision_view)
        self.assertEqual('success', revision_view['status'])
        self.assertIsInstance(revision_view['validationPolicies'], list)
        self.assertEqual(1, len(revision_view['validationPolicies']))
        self.assertEqual(revision_view['validationPolicies'][0]['name'],
                         'deckhand-schema-validation')
        self.assertEqual(revision_view['validationPolicies'][0]['status'],
                         'success')

    def test_show_revision_failed_validation_policy(self):
        # Simulate 4 document payload with an internally generated validation
        # policy which executes 'deckhand-schema-validation'.
        payload = [base.DocumentFixture.get_minimal_fixture()
                   for _ in range(4)]
        validation_policy = self.factory.gen(types.DECKHAND_SCHEMA_VALIDATION,
                                             status='failed')
        payload.append(validation_policy)
        documents = self._create_documents(payload)

        revision = self._get_revision(documents[0]['revision_id'])
        revision_view = self.view_builder.show(revision)
        expected_attrs = ('id', 'url', 'createdAt', 'validationPolicies',
                          'status')
        expected_validation_policies = [
            {'name': 'deckhand-schema-validation'}, 'status'
        ]
        for attr in expected_attrs:
            self.assertIn(attr, revision_view)
        self.assertEqual('failed', revision_view['status'])
        self.assertIsInstance(revision_view['validationPolicies'], list)
        self.assertEqual(1, len(revision_view['validationPolicies']))
        self.assertEqual(revision_view['validationPolicies'][0]['name'],
                         'deckhand-schema-validation')
        self.assertEqual(revision_view['validationPolicies'][0]['status'],
                         'failed')
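Both tests above expect the revision view's top-level `status` to mirror the outcome of its validation policies. A plausible aggregation rule — a guess at what the view builder computes, not a copy of its code:

```python
def aggregate_status(validation_policies):
    """Revision status is 'success' only if every validation succeeded."""
    if not validation_policies:
        return 'unknown'  # assumed fallback when no policies were provided
    statuses = [vp.get('status') for vp in validation_policies]
    return 'success' if all(s == 'success' for s in statuses) else 'failed'


policies = [{'name': 'deckhand-schema-validation', 'status': 'failed'}]
status = aggregate_status(policies)
```

Under this rule a single failed `deckhand-schema-validation` is enough to mark the whole revision `failed`, matching the assertions in the second test.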


@ -11,3 +11,17 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

DOCUMENT_SCHEMA_TYPES = (
    LAYERING_POLICY_SCHEMA,
    VALIDATION_POLICY_SCHEMA,
) = (
    'deckhand/LayeringPolicy/v1',
    'deckhand/ValidationPolicy/v1',
)

DECKHAND_VALIDATION_TYPES = (
    DECKHAND_SCHEMA_VALIDATION,
) = (
    'deckhand-schema-validation',
)


@ -29,6 +29,15 @@ commands =
    {[testenv]commands}
    ostestr '{posargs}'

[testenv:functional]
usedevelop = True
setenv = VIRTUAL_ENV={envdir}
         OS_TEST_PATH=./deckhand/tests/functional
         LANGUAGE=en_US
commands =
    find . -type f -name "*.pyc" -delete
    ostestr '{posargs}'

[testenv:genconfig]
commands = oslo-config-generator --config-file=etc/deckhand/config-generator.conf
commands = oslo-config-generator --config-file=etc/deckhand/config-generator.conf commands = oslo-config-generator --config-file=etc/deckhand/config-generator.conf