[feat] DECKHAND-13: Document layering (merge) logic

This commit is one of two monolithic ports from GitHub.

This commit implements the foundation for document layering
or merging. Included in this commit:

  - Algorithm for layering documents with the same schema
  - Dozens of positive test cases
  - About a dozen negative test cases
  - Factory for dynamically creating testing documents for
    layering

Change-Id: I580bb69a341910b21be8610a416c691c54f7b946
Felipe Monteiro 2017-07-26 23:25:04 +01:00
parent 702c6b783b
commit 1bc0c9818e
8 changed files with 1612 additions and 110 deletions

ChangeLog
@@ -1,104 +0,0 @@
CHANGES
=======
* Some integration with views/database
* Add validations to document db model
* Fix up _is_abstract in document_validation
* Updated document schema
* Resolved merge conflicts
* Clean up
* Refactor some code
* Fixed failing tests.g
* WIP: more changes, debugging, tests
* Fix unit tests
* Remove deprecated code, update deprecated schemas and add new schemas
* Add schema validation for validation policy
* Changed layers to type string in schemas
* f
* Add layering policy pre-validation schema
* Add layering policy pre-validation schema
* Refactor some code
* Add endpoint/tests for GET /revisions/{revision_id}
* Fix naming conflict error
* Add view abstraction layer for modifying DB data into view data
* Raise exception instead of return
* Updated /GET revisions response body
* Remove old docstring
* Update control README (with current response bodies, even though they're a WIP)
* Return YAML response body
* Add endpoint for GET /revisions
* Use built-in oslo_db types for Columns serialized as dicts
* Finish retrieving documents by revision_id, including with filters
* Clean up
* Test and DB API changes
* Add Revision resource
* More tests for revisions-api. Fix minor bugs
* Clarify layering actions start from full parent data
* Add DELETE endpoint
* Skip validation for abstract documents & add unit tests
* Update schema validation to be internal validation
* Update schema/db model/db api to align with design document
* Add basic RBAC details to design document
* Update documents/revisions relationship/tables
* Update revision and document tables and add more unit tests
* temp
* Revisions database and API implementation
* Update API paths for consistency
* Add clarifications based on review
* Use safe_load_all instead of safe_load
* Add unit tests for db documents api
* Remove oslo_versionedobjects
* Change application/yaml to application/x-yaml
* Cleaned up some logic, added exception handling to document creation
* Add currently necessary oslo namespaces to oslo-config-generator conf file
* Successfully creating document
* Added logic for establishing DB connection
* Refactor database sqlalchemy api/models
* Added oslo_context-based context for oslo_db compatibility
* Update database documents schema
* Helper for generating versioned object automatically from dictionary payload
* Add description of substitution
* Update README
* Temporary change - do not commit
* Reference Layering section in layeringDefinition description
* Add overall layering description
* Initial DB API models implementation
* Added control (API) readme
* [WIP] Implement documents API
* Add kind param to SchemaVersion class
* Change apiVersion references to schemaVersion
* Remove apiVersion attribute from substitutions.src attributes
* Remove apiVersion attribute from substitutions.src attributes
* Update default_schema with our updated schema definition
* Trivial fix to default_schema
* Use regexes for jsonschema pre-validation
* Add additional documentation
* Add jsonschema validation to Deckhand
* Initial engine framework
* fix typo
* Provide a separate rendered-documents endpoint
* Move reporting of validation status
* Add samples for remaining endpoints
* Address some initial review comments
* WIP: Add initial design document
* Fix incorrect comment
* Deckhand initial ORM implementation
* Deckhand initial ORM implementation
* Add kind param to SchemaVersion class
* Change apiVersion references to schemaVersion
* Remove apiVersion attribute from substitutions.src attributes
* Remove apiVersion attribute from substitutions.src attributes
* Update default_schema with our updated schema definition
* Trivial fix to default_schema
* Use regexes for jsonschema pre-validation
* Add additional documentation
* Add jsonschema validation to Deckhand
* Initial engine framework
* Add oslo.log integration
* DECKHAND-10: Add Barbican integration to Deckhand
* Update ChangeLog
* Update AUTHORS
* DECKHAND-2: Design core Deckhand API framework
* Oslo config integration (#1)
* Add ChangeLog
* Initial commit

deckhand/engine/document.py Normal file
@@ -0,0 +1,121 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import six
class Document(object):
"""Object wrapper for documents.
After "raw" documents undergo schema validation, they can be wrapped with
this class to allow nested dictionary entries to be quickly retrieved.
"""
def __init__(self, data):
"""Constructor for ``Document``.
:param data: Dictionary of all document data (includes metadata, data,
schema, etc.).
"""
self._inner = data
def to_dict(self):
return self._inner
def is_abstract(self):
"""Return whether the document is abstract.
Not all documents contain this property; in that case they are
concrete.
"""
try:
abstract = self._inner['metadata']['layeringDefinition'][
'abstract']
return six.text_type(abstract) == 'True'
except KeyError:
return False
def get_schema(self):
return self._inner['schema']
def get_name(self):
return self._inner['metadata']['name']
def get_layer(self):
return self._inner['metadata']['layeringDefinition']['layer']
def get_parent_selector(self):
"""Return the `parentSelector` for the document.
The topmost document defined by the `layerOrder` in the LayeringPolicy
does not have a `parentSelector` as it has no parent.
:returns: `parentSelector` for the document if present, else None.
"""
try:
return self._inner['metadata']['layeringDefinition'][
'parentSelector']
except KeyError:
return None
def get_labels(self):
return self._inner['metadata']['labels']
def get_actions(self):
try:
return self._inner['metadata']['layeringDefinition']['actions']
except KeyError:
return []
def get_children(self, nested=False):
"""Get document children, if any.
:param nested: Recursively retrieve all children for each child
document.
:type nested: boolean
:returns: List of children of type `Document`.
"""
if not nested:
return self._inner.get('children', [])
else:
return self._get_nested_children(self, [])
def _get_nested_children(self, doc, nested_children):
for child in doc.get('children', []):
nested_children.append(child)
if 'children' in child._inner:
self._get_nested_children(child, nested_children)
return nested_children
def get(self, k, default=None):
return self.__getitem__(k, default=default)
def __getitem__(self, k, default=None):
return self._inner.get(k, default)
def __setitem__(self, k, val):
self._inner[k] = val
def __delitem__(self, k):
if self.__contains__(k):
del self._inner[k]
def __contains__(self, k):
return self.get(k, default=None) is not None
def __missing__(self, k):
return not self.__contains__(k)
def __repr__(self):
return repr(self._inner)
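The wrapper above can be exercised in isolation. Below is a minimal standalone sketch (re-declaring a trimmed-down `Document` rather than importing the deckhand package, and substituting `str` for `six.text_type`), showing how abstractness is read out of `metadata.layeringDefinition`:

```python
# Trimmed-down standalone sketch of the Document wrapper above; the real
# class lives in deckhand/engine/document.py and uses six.text_type.
class Document(object):
    def __init__(self, data):
        self._inner = data

    def is_abstract(self):
        # Documents without the property are treated as concrete.
        try:
            abstract = self._inner['metadata']['layeringDefinition']['abstract']
            return str(abstract) == 'True'
        except KeyError:
            return False

    def get_layer(self):
        return self._inner['metadata']['layeringDefinition']['layer']


doc = Document({
    'schema': 'example/Kind/v1.0',
    'metadata': {
        'name': 'global1',
        'labels': {'global': 'global1'},
        'layeringDefinition': {'abstract': True, 'layer': 'global'},
    },
    'data': {'a': 1},
})
assert doc.is_abstract()
assert doc.get_layer() == 'global'
```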

deckhand/engine/layering.py Normal file
@@ -0,0 +1,295 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import collections
import copy
from deckhand.engine import document
from deckhand.engine import utils
from deckhand import errors
class DocumentLayering(object):
"""Class responsible for handling document layering.
Layering is controlled in two places:
1. The `LayeringPolicy` control document, which defines the valid layers
and their order of precedence.
2. In the `metadata.layeringDefinition` section of normal
(`metadata.schema=metadata/Document/v1.0`) documents.
.. note::
Only documents with the same `schema` are allowed to be layered
together into a fully rendered document.
"""
SUPPORTED_METHODS = ('merge', 'replace', 'delete')
LAYERING_POLICY_SCHEMA = 'deckhand/LayeringPolicy/v1.0'
def __init__(self, documents):
"""Contructor for ``DocumentLayering``.
:param documents: List of YAML documents represented as dictionaries.
"""
self.documents = [document.Document(d) for d in documents]
self._find_layering_policy()
self.layered_docs = self._calc_document_children()
def render(self):
"""Perform layering on the set of `documents`.
Each concrete document will undergo layering according to the actions
defined by its `layeringDefinition`.
:returns: the list of rendered documents (does not include layering
policy document).
"""
# ``rendered_data_by_layer`` agglomerates the set of changes across all
# actions across each layer for a specific document.
rendered_data_by_layer = {}
# NOTE(fmontei): ``global_docs`` represents the topmost documents in
# the system. It should probably be impossible for more than 1
# top-level doc to exist, but handle multiple for now.
global_docs = [doc for doc in self.layered_docs
if doc.get_layer() == self.layer_order[0]]
for doc in global_docs:
layer_idx = self.layer_order.index(doc.get_layer())
rendered_data_by_layer[layer_idx] = doc.to_dict()
# Keep iterating as long as a child exists.
for child in doc.get_children(nested=True):
# Retrieve the most up-to-date rendered_data (by
# referencing the child's parent's data).
child_layer_idx = self.layer_order.index(child.get_layer())
rendered_data = rendered_data_by_layer[child_layer_idx - 1]
# Apply each action to the current document.
actions = child.get_actions()
for action in actions:
rendered_data = self._apply_action(
action, child.to_dict(), rendered_data)
# Update the actual document data if concrete.
if not child.is_abstract():
self.layered_docs[self.layered_docs.index(child)][
'data'] = rendered_data['data']
# Update ``rendered_data_by_layer`` for this layer so that
# children in deeper layers can reference the most up-to-date
# changes.
rendered_data_by_layer[child_layer_idx] = rendered_data
if 'children' in doc:
del doc['children']
return [d.to_dict() for d in self.layered_docs]
def _apply_action(self, action, child_data, overall_data):
"""Apply actions to each layer that is rendered.
Supported actions include:
* `merge` - a "deep" merge that layers new and modified data onto
existing data
* `replace` - overwrite data at the specified path and replace it
with the data given in this document
* `delete` - remove the data at the specified path
"""
method = action['method']
if method not in self.SUPPORTED_METHODS:
raise errors.UnsupportedActionMethod(
action=action, document=child_data)
# Use deep copies to prevent this data from being updated by reference.
overall_data = copy.deepcopy(overall_data)
child_data = copy.deepcopy(child_data)
rendered_data = overall_data
# Remove empty string paths and ensure that "data" is always present.
path = action['path'].split('.')
path = [p for p in path if p != '']
path.insert(0, 'data')
last_key = 'data' if not path[-1] else path[-1]
for attr in path:
if attr == path[-1]:
break
rendered_data = rendered_data.get(attr)
child_data = child_data.get(attr)
if method == 'delete':
# If the entire document is passed (i.e. the dict including
# metadata, data, schema, etc.) then reset data to an empty dict.
if last_key == 'data':
rendered_data['data'] = {}
elif last_key in rendered_data:
del rendered_data[last_key]
elif last_key not in rendered_data:
# If the key does not exist in `rendered_data`, this is a
# validation error.
raise errors.MissingDocumentKey(
child=child_data, parent=rendered_data, key=last_key)
elif method == 'merge':
if last_key in rendered_data and last_key in child_data:
# If both entries are dictionaries, do a deep merge. Otherwise
# do a simple merge.
if (isinstance(rendered_data[last_key], dict)
and isinstance(child_data[last_key], dict)):
utils.deep_merge(
rendered_data[last_key], child_data[last_key])
else:
rendered_data.setdefault(last_key, child_data[last_key])
elif last_key in child_data:
rendered_data.setdefault(last_key, child_data[last_key])
else:
# If the key does not exist in the child document, this is a
# validation error.
raise errors.MissingDocumentKey(
child=child_data, parent=rendered_data, key=last_key)
elif method == 'replace':
if last_key in rendered_data and last_key in child_data:
rendered_data[last_key] = child_data[last_key]
elif last_key in child_data:
rendered_data.setdefault(last_key, child_data[last_key])
elif last_key not in child_data:
# If the key does not exist in the child document, this is a
# validation error.
raise errors.MissingDocumentKey(
child=child_data, parent=rendered_data, key=last_key)
return overall_data
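The path handling at the top of `_apply_action` can be isolated as follows. This is a standalone sketch; `normalize_path` is a hypothetical helper name, not part of this commit:

```python
# Sketch of how an action "path" such as "." or ".a.b" is normalized before
# traversal: split on dots, drop empty segments, and prepend "data" so the
# walk always starts at the document's data section.
def normalize_path(path):
    parts = [p for p in path.split('.') if p != '']
    parts.insert(0, 'data')
    return parts


assert normalize_path('.') == ['data']
assert normalize_path('.a') == ['data', 'a']
assert normalize_path('.a.b') == ['data', 'a', 'b']
```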
def _find_layering_policy(self):
"""Retrieve the current layering policy.
:raises LayeringPolicyMalformed: If the `layerOrder` could not be
found in the LayeringPolicy or if it is not a list.
:raises LayeringPolicyNotFound: If system has no layering policy.
"""
# TODO(fmontei): There should be a DB call here to fetch the layering
# policy from the DB.
for doc in self.documents:
if doc.to_dict()['schema'] == self.LAYERING_POLICY_SCHEMA:
self.layering_policy = doc
break
if not hasattr(self, 'layering_policy'):
raise errors.LayeringPolicyNotFound(
schema=self.LAYERING_POLICY_SCHEMA)
# TODO(fmontei): Rely on schema validation or some such for this.
try:
self.layer_order = list(self.layering_policy['data']['layerOrder'])
except KeyError:
raise errors.LayeringPolicyMalformed(
schema=self.LAYERING_POLICY_SCHEMA,
document=self.layering_policy)
if not isinstance(self.layer_order, list):
raise errors.LayeringPolicyMalformed(
schema=self.LAYERING_POLICY_SCHEMA,
document=self.layering_policy)
def _calc_document_children(self):
"""Determine each document's children.
For each document, attempts to find the document's children. Adds a new
key called "children" to the document's dictionary.
.. note::
A document should only have exactly one parent.
If a document does not have a parent, then its layer must be
the topmost layer defined by the `layerOrder`.
:returns: Ordered list of documents that need to be layered. Each
document contains a "children" property in addition to original
data. List of documents returned is ordered from highest to lowest
layer.
:rtype: list of deckhand.engine.document.Document objects.
:raises IndeterminateDocumentParent: If more than one parent document
was found for a document.
:raises MissingDocumentParent: If the parent document could not be
found. Only applies documents with `layeringDefinition` property.
"""
layered_docs = list(
filter(lambda x: 'layeringDefinition' in x['metadata'],
self.documents))
# ``all_children`` is a counter utility for verifying that each
# document has exactly one parent.
all_children = collections.Counter()
def _get_children(doc):
children = []
doc_layer = doc.get_layer()
try:
next_layer_idx = self.layer_order.index(doc_layer) + 1
children_doc_layer = self.layer_order[next_layer_idx]
except IndexError:
# The lowest layer has been reached, so no children. Return
# empty list.
return children
for other_doc in layered_docs:
# Documents with different schemas are never layered together,
# so consider only documents with same schema as candidates.
if (other_doc.get_layer() == children_doc_layer
and other_doc.get_schema() == doc.get_schema()):
# A document can have many labels but should only have one
# explicit label for the parentSelector.
parent_sel = other_doc.get_parent_selector()
parent_sel_key = list(parent_sel.keys())[0]
parent_sel_val = list(parent_sel.values())[0]
doc_labels = doc.get_labels()
if (parent_sel_key in doc_labels and
parent_sel_val == doc_labels[parent_sel_key]):
children.append(other_doc)
return children
for layer in self.layer_order:
docs_by_layer = list(filter(
(lambda x: x.get_layer() == layer), layered_docs))
for doc in docs_by_layer:
children = _get_children(doc)
if children:
all_children.update(children)
doc.to_dict().setdefault('children', children)
all_children_elements = list(all_children.elements())
secondary_docs = list(
filter(lambda d: d.get_layer() != self.layer_order[0],
layered_docs))
for doc in secondary_docs:
# Unless the document is the topmost document in the
# `layerOrder` of the LayeringPolicy, it should be a child document
# of another document.
if doc not in all_children_elements:
raise errors.MissingDocumentParent(document=doc)
# If the document is a child document of more than 1 parent, then
# the document has too many parents, which is a validation error.
elif all_children[doc] != 1:
raise errors.IndeterminateDocumentParent(document=doc)
return layered_docs
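The parent/child matching rule inside `_get_children` can be sketched standalone. `is_parent` is a hypothetical helper written from the child's perspective, whereas the real code iterates from the parent's side over candidate children; the matching condition is the same: identical `schema`, and the child's single `parentSelector` label present in the parent's `labels`:

```python
# Standalone sketch of the parent-selection rule used during layering.
def is_parent(parent_doc, child_doc):
    layering_def = child_doc['metadata']['layeringDefinition']
    key, val = list(layering_def['parentSelector'].items())[0]
    return (parent_doc['schema'] == child_doc['schema']
            and parent_doc['metadata']['labels'].get(key) == val)


parent = {'schema': 'example/Kind/v1.0',
          'metadata': {'labels': {'global': 'global1'}}}
child = {'schema': 'example/Kind/v1.0',
         'metadata': {'layeringDefinition':
                      {'parentSelector': {'global': 'global1'}}}}
assert is_parent(parent, child)
```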

deckhand/engine/utils.py Normal file
@@ -0,0 +1,36 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import collections
def deep_merge(dct, merge_dct):
"""Recursive dict merge. Inspired by :meth:``dict.update()``, instead of
updating only top-level keys, deep_merge recurses down into dicts nested
to an arbitrary depth, updating keys. The ``merge_dct`` is merged into
``dct``, except for merge conflicts, which are resolved by prioritizing
the ``merge_dct`` value.
Borrowed from: https://gist.github.com/angstwad/bf22d1822c38a92ec0a9#file-deep_merge-py # noqa
:param dct: dict onto which the merge is executed
:param merge_dct: dct merged into dct
:return: None
"""
for k, v in merge_dct.items():
if (k in dct and isinstance(dct[k], dict)
and isinstance(merge_dct[k], collections.Mapping)):
deep_merge(dct[k], merge_dct[k])
else:
dct[k] = merge_dct[k]
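A standalone copy of `deep_merge` illustrates the conflict behavior: when both sides hold a dict for the same key the merge recurses, and otherwise the value from `merge_dct` wins, which is what lets a child document override its parent during layering:

```python
# Standalone copy of deep_merge (simplified isinstance check) to show how
# parent data and child data combine under the "merge" action.
def deep_merge(dct, merge_dct):
    for k, v in merge_dct.items():
        if k in dct and isinstance(dct[k], dict) and isinstance(v, dict):
            deep_merge(dct[k], v)  # both are dicts: recurse
        else:
            dct[k] = v  # otherwise merge_dct's value overrides


parent = {'a': {'x': 1, 'y': 2}, 'c': 9}
child = {'a': {'x': 7, 'z': 3}, 'b': 4}
deep_merge(parent, child)
assert parent == {'a': {'x': 7, 'y': 2, 'z': 3}, 'b': 4, 'c': 9}
```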


@@ -58,9 +58,50 @@ class InvalidDocumentFormat(DeckhandException):
super(InvalidDocumentFormat, self).__init__(**kwargs)
class UnknownDocumentFormat(DeckhandException):
msg_fmt = ("Could not determine the validation schema to validate the "
"document type: %(document_type)s.")
# TODO(fmontei): Remove this in a future commit.
class ApiError(Exception):
pass
class InvalidFormat(ApiError):
"""The YAML file is incorrectly formatted and cannot be read."""
class DocumentExists(DeckhandException):
msg_fmt = ("Document with kind %(kind)s and schemaVersion "
"%(schema_version)s already exists.")
code = 409
class LayeringPolicyNotFound(DeckhandException):
msg_fmt = ("LayeringPolicy with schema %(schema)s not found in the "
"system.")
code = 400
class LayeringPolicyMalformed(DeckhandException):
msg_fmt = ("LayeringPolicy with schema %(schema)s is improperly formatted:"
" %(document)s.")
code = 400
class IndeterminateDocumentParent(DeckhandException):
msg_fmt = ("Too many parent documents found for document %(document)s.")
code = 400
class MissingDocumentParent(DeckhandException):
msg_fmt = ("Missing parent document for document %(document)s.")
code = 400
class MissingDocumentKey(DeckhandException):
msg_fmt = ("Missing document key %(key)s from either parent or child. "
"Parent: %(parent)s. Child: %(child)s.")
class UnsupportedActionMethod(DeckhandException):
msg_fmt = ("Method in %(actions)s is invalid for document %(document)s.")
code = 400


@@ -26,16 +26,220 @@ LOG = logging.getLogger(__name__)
@six.add_metaclass(abc.ABCMeta)
class DeckhandFactory(object):
# TODO(fmontei): Allow this to be overridden in ``__init__``.
API_VERSION = '1.0'
@abc.abstractmethod
def gen(self, *args):
"""Generate an object for usage by the Deckhand `engine` module."""
pass
@abc.abstractmethod
def gen_test(self, *args, **kwargs):
"""Generate an object with randomized values for a test."""
pass
class DocumentFactory(DeckhandFactory):
"""Class for auto-generating document templates for testing."""
LAYERING_DEFINITION = {
"data": {
"layerOrder": []
},
"metadata": {
"name": "layering-policy",
"schema": "metadata/Control/v%s" % DeckhandFactory.API_VERSION
},
"schema": "deckhand/LayeringPolicy/v%s" % DeckhandFactory.API_VERSION
}
LAYER_TEMPLATE = {
"data": {},
"metadata": {
"labels": {"": ""},
"layeringDefinition": {
"abstract": False,
"layer": "",
"parentSelector": "",
"actions": []
},
"name": "",
"schema": "metadata/Document/v%s" % DeckhandFactory.API_VERSION
},
"schema": "example/Kind/v1.0"
}
def __init__(self, num_layers, docs_per_layer):
"""Constructor for ``DocumentFactory``.
Returns a template whose JSON representation is of the form::
[{'data': {'layerOrder': ['global', 'region', 'site']},
'metadata': {'name': 'layering-policy',
'schema': 'metadata/Control/v1'},
'schema': 'deckhand/LayeringPolicy/v1'},
{'data': {'a': 1, 'b': 2},
'metadata': {'labels': {'global': 'global1'},
'layeringDefinition': {'abstract': True,
'actions': [],
'layer': 'global',
'parentSelector': ''},
'name': 'global1',
'schema': 'metadata/Document/v1'},
'schema': 'example/Kind/v1'}
...
]
:param num_layers: Total number of layers. Only supported values
include 2 or 3.
:type num_layers: integer
:param docs_per_layer: The number of documents to be included per
layer. For example, if ``num_layers`` is 3, then ``docs_per_layer``
can be (1, 1, 1) for 1 document for each layer or (1, 2, 3) for 1
doc for the 1st layer, 2 docs for the 2nd layer, and 3 docs for the
3rd layer.
:type docs_per_layer: tuple, list
:raises TypeError: If ``docs_per_layer`` is not the right type.
:raises ValueError: If ``num_layers`` is not the right value or isn't
compatible with ``docs_per_layer``.
"""
# Set up the layering definition's layerOrder.
if num_layers == 2:
layer_order = ["global", "site"]
elif num_layers == 3:
layer_order = ["global", "region", "site"]
else:
raise ValueError("'num_layers' must either be 2 or 3.")
self.LAYERING_DEFINITION['data']['layerOrder'] = layer_order
if not isinstance(docs_per_layer, (list, tuple)):
raise TypeError("'docs_per_layer' must be a list or tuple "
"indicating the number of documents per layer.")
elif not len(docs_per_layer) == num_layers:
raise ValueError("The number of entries in 'docs_per_layer' must"
"be equal to the value of 'num_layers'.")
for doc_count in docs_per_layer:
if doc_count < 1:
raise ValueError(
"Each entry in 'docs_per_layer' must be >= 1.")
self.num_layers = num_layers
self.docs_per_layer = docs_per_layer
def gen(self):
# TODO(fmontei): Implement this if needed later.
pass
def gen_test(self, mapping, site_abstract=True, region_abstract=True,
global_abstract=True, site_parent_selectors=None):
"""Generate the document template.
Generate the document template based on the arguments passed to
the constructor and to this function.
:param mapping: A list of dictionaries that specify the "data" and
"actions" parameters for each document. A valid mapping is::
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}}},
"_SITE_DATA_1_": {"data": {"a": {"x": 7, "z": 3}, "b": 4}},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": path}]}
}
Each key must be of the form "_{LAYER_NAME}_{KEY_NAME}_{N}_"
where:
- {LAYER_NAME} is the name of the layer ("global", "region",
"site")
- {KEY_NAME} is either "DATA" or "ACTIONS"
- {N} is the occurrence of the document based on the
values in ``docs_per_layer``. If ``docs_per_layer`` is
(1, 2) then _GLOBAL_DATA_1_, _SITE_DATA_1_, _SITE_DATA_2_,
_SITE_ACTIONS_1_ and _SITE_ACTIONS_2_ must be provided.
_GLOBAL_ACTIONS_{N}_ is ignored.
:type mapping: dict
:param site_abstract: Whether site layers are abstract/concrete.
:type site_abstract: boolean
:param region_abstract: Whether region layers are abstract/concrete.
:type region_abstract: boolean
:param global_abstract: Whether global layers are abstract/concrete.
:type global_abstract: boolean
:param site_parent_selectors: Override the default parent selector
for each site. Assuming that ``docs_per_layer`` is (2, 2), for
example, a valid value is::
[{'global': 'global1'}, {'global': 'global2'}]
If not specified, each site will default to the first parent.
:type site_parent_selectors: list
:returns: Rendered template of the form specified above.
"""
rendered_template = [self.LAYERING_DEFINITION]
layer_order = rendered_template[0]['data']['layerOrder']
for layer_idx in range(self.num_layers):
for count in range(self.docs_per_layer[layer_idx]):
layer_template = copy.deepcopy(self.LAYER_TEMPLATE)
layer_name = layer_order[layer_idx]
# Set name.
layer_template = copy.deepcopy(layer_template)
layer_template['metadata']['name'] = "%s%d" % (
layer_name, count + 1)
# Set layer.
layer_template['metadata']['layeringDefinition'][
'layer'] = layer_name
# Set labels.
layer_template['metadata']['labels'] = {layer_name: "%s%d" % (
layer_name, count + 1)}
# Set parentSelector.
if layer_name == 'site' and site_parent_selectors:
parent_selector = site_parent_selectors[count]
layer_template['metadata']['layeringDefinition'][
'parentSelector'] = parent_selector
elif layer_idx > 0:
parent_selector = rendered_template[layer_idx][
'metadata']['labels']
layer_template['metadata']['layeringDefinition'][
'parentSelector'] = parent_selector
# Set abstract.
if layer_name == 'site':
layer_template['metadata']['layeringDefinition'][
'abstract'] = site_abstract
elif layer_name == 'region':
layer_template['metadata']['layeringDefinition'][
'abstract'] = region_abstract
elif layer_name == 'global':
layer_template['metadata']['layeringDefinition'][
'abstract'] = global_abstract
# Set data and actions.
data_key = "_%s_DATA_%d_" % (layer_name.upper(), count + 1)
actions_key = "_%s_ACTIONS_%d_" % (
layer_name.upper(), count + 1)
try:
layer_template['data'] = mapping[data_key]['data']
layer_template['metadata']['layeringDefinition'][
'actions'] = mapping[actions_key]['actions']
except KeyError as e:
LOG.warning('Could not map %s because it was not found in '
'the `mapping` dict.', e.args[0])
pass
rendered_template.append(layer_template)
return rendered_template
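The mapping-key convention documented in the `gen_test` docstring can be sketched as a small helper (`mapping_keys` is hypothetical, not part of this commit). It enumerates the `_{LAYER}_DATA_{N}_` and `_{LAYER}_ACTIONS_{N}_` keys a caller must supply for a given `layerOrder` and `docs_per_layer`, skipping actions for the topmost layer since it has no parent:

```python
# Hypothetical helper enumerating the mapping keys gen_test expects.
def mapping_keys(layer_order, docs_per_layer):
    keys = []
    for i, (layer, count) in enumerate(zip(layer_order, docs_per_layer)):
        for n in range(1, count + 1):
            keys.append("_%s_DATA_%d_" % (layer.upper(), n))
            if i > 0:  # the topmost layer has no parent, so no actions
                keys.append("_%s_ACTIONS_%d_" % (layer.upper(), n))
    return keys


assert mapping_keys(['global', 'site'], (1, 2)) == [
    '_GLOBAL_DATA_1_',
    '_SITE_DATA_1_', '_SITE_ACTIONS_1_',
    '_SITE_DATA_2_', '_SITE_ACTIONS_2_',
]
```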
class ValidationPolicyFactory(DeckhandFactory):
"""Class for auto-generating validation policy templates for testing."""
@@ -44,7 +248,7 @@ class ValidationPolicyFactory(DeckhandFactory):
"validations": []
},
"metadata": {
"schema": "metadata/Control/v1",
"schema": "metadata/Control/%s" % DeckhandFactory.API_VERSION,
"name": ""
},
"schema": types.VALIDATION_POLICY_SCHEMA
@@ -56,9 +260,9 @@
Returns a template whose YAML representation is of the form::
---
schema: deckhand/ValidationPolicy/v1
schema: deckhand/ValidationPolicy/v1.0
metadata:
schema: metadata/Control/v1
schema: metadata/Control/v1.0
name: site-deploy-ready
data:
validations:


@@ -0,0 +1,740 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from deckhand.engine import layering
from deckhand import errors
from deckhand import factories
from deckhand.tests.unit import base as test_base
class TestDocumentLayering(test_base.DeckhandTestCase):
def _test_layering(self, documents, site_expected=None,
region_expected=None, global_expected=None,
exception_expected=None):
document_layering = layering.DocumentLayering(documents)
if exception_expected and any([site_expected, region_expected,
global_expected]):
raise ValueError(
'(site_expected|region_expected|global_expected) and '
'(exception_expected) are mutually exclusive.')
if exception_expected:
self.assertRaises(exception_expected, document_layering.render)
return
site_docs = []
region_docs = []
global_docs = []
# The layering policy is not returned as it is immutable. So all docs
# should have a metadata.layeringDefinition.layer section.
rendered_documents = document_layering.render()
for doc in rendered_documents:
layer = doc['metadata']['layeringDefinition']['layer']
if layer == 'site':
site_docs.append(doc)
if layer == 'region':
region_docs.append(doc)
if layer == 'global':
global_docs.append(doc)
if site_expected:
if not isinstance(site_expected, list):
site_expected = [site_expected]
for idx, expected in enumerate(site_expected):
self.assertEqual(expected, site_docs[idx].get('data'))
if region_expected:
if not isinstance(region_expected, list):
region_expected = [region_expected]
for idx, expected in enumerate(region_expected):
self.assertEqual(expected, region_docs[idx].get('data'))
if global_expected:
if not isinstance(global_expected, list):
global_expected = [global_expected]
for idx, expected in enumerate(global_expected):
self.assertEqual(expected, global_docs[idx].get('data'))
class TestDocumentLayering2Layers(TestDocumentLayering):
def test_layering_default_scenario(self):
# Default scenario mentioned in design document for 2 layers (region
# data is removed).
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]}
}
doc_factory = factories.DocumentFactory(2, [1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False)
site_expected = {'a': {'x': 1, 'y': 2}, 'b': 4}
self._test_layering(documents, site_expected)
def test_layering_method_delete(self):
site_expected = [{}, {'c': 9}, {"a": {"x": 1, "y": 2}}]
doc_factory = factories.DocumentFactory(2, [1, 1])
for idx, path in enumerate(['.', '.a', '.c']):
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}, "c": 9}},
"_SITE_DATA_1_": {"data": {"a": {"x": 7, "z": 3}, "b": 4}},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "delete", "path": path}]}
}
documents = doc_factory.gen_test(mapping, site_abstract=False)
self._test_layering(documents, site_expected[idx])
def test_layering_method_merge(self):
site_expected = [
{'a': {'x': 7, 'y': 2, 'z': 3}, 'b': 4, 'c': 9},
{'a': {'x': 7, 'y': 2, 'z': 3}, 'c': 9},
{'a': {'x': 1, 'y': 2}, 'b': 4, 'c': 9}
]
doc_factory = factories.DocumentFactory(2, [1, 1])
for idx, path in enumerate(['.', '.a', '.b']):
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}, "c": 9}},
"_SITE_DATA_1_": {"data": {"a": {"x": 7, "z": 3}, "b": 4}},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": path}]}
}
documents = doc_factory.gen_test(mapping, site_abstract=False)
self._test_layering(documents, site_expected[idx])
def test_layering_method_replace(self):
site_expected = [
{'a': {'x': 7, 'z': 3}, 'b': 4},
{'a': {'x': 7, 'z': 3}, 'c': 9},
{'a': {'x': 1, 'y': 2}, 'b': 4, 'c': 9}
]
doc_factory = factories.DocumentFactory(2, [1, 1])
for idx, path in enumerate(['.', '.a', '.b']):
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}, "c": 9}},
"_SITE_DATA_1_": {"data": {"a": {"x": 7, "z": 3}, "b": 4}},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "replace", "path": path}]}
}
documents = doc_factory.gen_test(mapping, site_abstract=False)
self._test_layering(documents, site_expected[idx])


class TestDocumentLayering2LayersAbstractConcrete(TestDocumentLayering):
    """Test the 2-layer payload with concrete site/global layers.

    Both the site and global data should be updated when both are
    concrete docs. (A 2-layer payload has no region layer.)
    """
def test_layering_site_and_global_concrete(self):
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}, "c": 9}},
"_SITE_DATA_1_": {"data": {"a": {"x": 7, "z": 3}, "b": 4}},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "delete", "path": '.a'}]}
}
doc_factory = factories.DocumentFactory(2, [1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False,
global_abstract=False)
site_expected = {'c': 9}
global_expected = {'a': {'x': 1, 'y': 2}, 'c': 9}
self._test_layering(documents, site_expected,
global_expected=global_expected)
def test_layering_site_and_global_abstract(self):
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}, "c": 9}},
"_SITE_DATA_1_": {"data": {"a": {"x": 7, "z": 3}, "b": 4}},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "delete", "path": '.a'}]}
}
doc_factory = factories.DocumentFactory(2, [1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=True,
global_abstract=True)
site_expected = {"a": {"x": 7, "z": 3}, "b": 4}
global_expected = {'a': {'x': 1, 'y': 2}, 'c': 9}
self._test_layering(documents, site_expected,
global_expected=global_expected)


class TestDocumentLayering2Layers2Sites(TestDocumentLayering):
def test_layering_default_scenario(self):
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]},
"_SITE_DATA_2_": {"data": {"b": 3}},
"_SITE_ACTIONS_2_": {
"actions": [{"method": "merge", "path": "."},
{"method": "delete", "path": ".a"}]}
}
doc_factory = factories.DocumentFactory(2, [1, 2])
documents = doc_factory.gen_test(mapping, site_abstract=False)
site_expected = [{'a': {'x': 1, 'y': 2}, 'b': 4},
{'b': 3}]
self._test_layering(documents, site_expected)
def test_layering_alternate_scenario(self):
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]},
"_SITE_DATA_2_": {"data": {"b": 3}},
"_SITE_ACTIONS_2_": {
"actions": [{"method": "merge", "path": "."},
{"method": "delete", "path": ".a"},
{"method": "merge", "path": ".b"}]}
}
doc_factory = factories.DocumentFactory(2, [1, 2])
documents = doc_factory.gen_test(mapping, site_abstract=False)
site_expected = [{'a': {'x': 1, 'y': 2}, 'b': 4}, {'b': 3}]
self._test_layering(documents, site_expected)


class TestDocumentLayering2Layers2Sites2Globals(TestDocumentLayering):
def test_layering_two_parents_only_one_with_child(self):
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}}},
"_GLOBAL_DATA_2_": {"data": {"a": {"x": 1, "y": 2}}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]},
"_SITE_DATA_2_": {"data": {"b": 3}},
"_SITE_ACTIONS_2_": {
"actions": [{"method": "merge", "path": "."}]}
}
doc_factory = factories.DocumentFactory(2, [2, 2])
documents = doc_factory.gen_test(
mapping, site_abstract=False, site_parent_selectors=[
{'global': 'global1'}, {'global': 'global2'}])
site_expected = [{'a': {'x': 1, 'y': 2}, 'b': 4},
{'a': {'x': 1, 'y': 2}, 'b': 3}]
self._test_layering(documents, site_expected)
def test_layering_two_parents_one_child_each_1(self):
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}}},
"_GLOBAL_DATA_2_": {"data": {"a": {"x": 1, "y": 2}}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]},
"_SITE_DATA_2_": {"data": {"b": 3}},
"_SITE_ACTIONS_2_": {
"actions": [{"method": "merge", "path": "."}]}
}
doc_factory = factories.DocumentFactory(2, [2, 2])
documents = doc_factory.gen_test(
mapping, site_abstract=False, site_parent_selectors=[
{'global': 'global1'}, {'global': 'global2'}])
site_expected = [{'a': {'x': 1, 'y': 2}, 'b': 4},
{'a': {'x': 1, 'y': 2}, 'b': 3}]
self._test_layering(documents, site_expected)
def test_layering_two_parents_one_child_each_2(self):
"""Scenario:
Initially: p1: {"a": {"x": 1, "y": 2}}, p2: {"b": {"f": -9, "g": 71}}
Where: c1 references p1 and c2 references p2
        Merge "." (p1 -> c1): {"a": {"x": 1, "y": 2}, "b": 4}
Merge "." (p2 -> c2): {"b": {"f": -9, "g": 71}, "c": 3}
Delete ".c" (p2 -> c2): {"b": {"f": -9, "g": 71}}
"""
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}}},
"_GLOBAL_DATA_2_": {"data": {"b": {"f": -9, "g": 71}}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]},
"_SITE_DATA_2_": {"data": {"c": 3}},
"_SITE_ACTIONS_2_": {
"actions": [{"method": "merge", "path": "."},
{"method": "delete", "path": ".c"}]}
}
doc_factory = factories.DocumentFactory(2, [2, 2])
documents = doc_factory.gen_test(
mapping, site_abstract=False, site_parent_selectors=[
{'global': 'global1'}, {'global': 'global2'}])
site_expected = [{'a': {'x': 1, 'y': 2}, 'b': 4},
{"b": {"f": -9, "g": 71}}]
self._test_layering(documents, site_expected)


class TestDocumentLayering3Layers(TestDocumentLayering):
def test_layering_default_scenario(self):
# Default scenario mentioned in design document for 3 layers.
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}}},
"_REGION_DATA_1_": {"data": {"a": {"z": 3}}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_REGION_ACTIONS_1_": {
"actions": [{"method": "replace", "path": ".a"}]},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]}
}
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False)
site_expected = {'a': {'z': 3}, 'b': 4}
region_expected = {"a": {"z": 3}} # Region is abstract.
self._test_layering(documents, site_expected, region_expected)
def test_layering_delete_everything(self):
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 3, "y": 4}, "b": 99}},
"_REGION_DATA_1_": {"data": {"a": {"x": 1, "y": 2}}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_REGION_ACTIONS_1_": {
"actions": [{"path": ".a", "method": "delete"}]},
"_SITE_ACTIONS_1_": {"actions": [
{"method": "delete", "path": ".b"}]}
}
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False)
site_expected = {}
self._test_layering(documents, site_expected)
def test_layering_delete_everything_missing_path(self):
"""Scenario:
Initially: {"a": {"x": 3, "y": 4}, "b": 99}
Delete ".": {}
Delete ".b": MissingDocumentKey
"""
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 3, "y": 4}, "b": 99}},
"_REGION_DATA_1_": {"data": {"a": {"x": 1, "y": 2}}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_REGION_ACTIONS_1_": {
"actions": [{"path": ".", "method": "delete"}]},
"_SITE_ACTIONS_1_": {"actions": [
{"method": "delete", "path": ".b"}]}
}
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False)
self._test_layering(
documents, exception_expected=errors.MissingDocumentKey)
def test_layering_delete_path_a(self):
mapping = {
"_GLOBAL_DATA_1_": {
"data": {'a': {'x': 1, 'y': 2}, 'b': {'v': 3, 'w': 4}}},
"_REGION_DATA_1_": {"data": {"a": {"z": 3}}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_REGION_ACTIONS_1_": {
"actions": [{'path': '.a', 'method': 'delete'}]},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]}
}
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False)
site_expected = {'b': 4}
self._test_layering(documents, site_expected)
def test_layering_merge_and_replace(self):
mapping = {
"_GLOBAL_DATA_1_": {
"data": {'a': {'x': 1, 'y': 2}, 'b': {'v': 3, 'w': 4}}},
"_REGION_DATA_1_": {"data": {"a": {"z": 3}}},
"_SITE_DATA_1_": {"data": {'a': {'z': 5}}},
"_REGION_ACTIONS_1_": {
"actions": [{'path': '.', 'method': 'replace'}]},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]}
}
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False)
site_expected = {'a': {'z': 5}}
self._test_layering(documents, site_expected)
def test_layering_double_merge(self):
mapping = {
"_GLOBAL_DATA_1_": {"data": {"c": {"e": 55}}},
"_REGION_DATA_1_": {
"data": {'a': {'x': 1, 'y': 2}, 'b': {'v': 3, 'w': 4}}},
"_SITE_DATA_1_": {"data": {"a": {"z": 5}}},
"_REGION_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]},
"_SITE_ACTIONS_1_": {"actions": [{"method": "merge", "path": "."}]}
}
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False)
site_expected = {'a': {'x': 1, 'y': 2, 'z': 5},
'b': {'v': 3, 'w': 4}, 'c': {'e': 55}}
self._test_layering(documents, site_expected)
def test_layering_double_merge_2(self):
mapping = {
"_GLOBAL_DATA_1_": {
"data": {'a': {'x': 1, 'y': 2}, 'b': {'v': 3, 'w': 4}}},
"_REGION_DATA_1_": {"data": {'a': {'e': 55}}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_REGION_ACTIONS_1_": {
"actions": [{'path': '.a', 'method': 'merge'}]},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]}
}
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False)
site_expected = {'a': {'x': 1, 'y': 2, 'e': 55}, 'b': 4}
self._test_layering(documents, site_expected)


class TestDocumentLayering3LayersAbstractConcrete(TestDocumentLayering):
    """Test the 3-layer payload with concrete site/region layers.

    Both the site and region data should be updated when both are
    concrete docs.
    """
def test_layering_site_and_region_concrete(self):
"""Scenario:
Initially: {"a": {"x": 1, "y": 2}}
Merge ".": {"a": {"x": 1, "y": 2, "z": 3}, "b": 5, "c": 11}
(Region updated.)
Delete ".c": {"a": {"x": 1, "y": 2, "z": 3}, "b": 5} (Region updated.)
Replace ".b": {"a": {"x": 1, "y": 2, "z": 3}, "b": 4} (Site updated.)
"""
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}}},
"_REGION_DATA_1_": {"data": {"a": {"z": 3}, "b": 5, "c": 11}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_REGION_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."},
{"method": "delete", "path": ".c"}]},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "replace", "path": ".b"}]}
}
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False,
region_abstract=False)
site_expected = {"a": {"x": 1, "y": 2, "z": 3}, "b": 4}
region_expected = {"a": {"x": 1, "y": 2, "z": 3}, "b": 5}
self._test_layering(documents, site_expected, region_expected)
def test_layering_site_concrete_and_region_abstract(self):
"""Scenario:
Initially: {"a": {"x": 1, "y": 2}}
Merge ".": {"a": {"x": 1, "y": 2, "z": 3}, "b": 5, "c": 11}
Delete ".c": {"a": {"x": 1, "y": 2, "z": 3}, "b": 5}
Replace ".b": {"a": {"x": 1, "y": 2, "z": 3}, "b": 4} (Site updated.)
"""
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}}},
"_REGION_DATA_1_": {"data": {"a": {"z": 3}, "b": 5, "c": 11}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_REGION_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."},
{"method": "delete", "path": ".c"}]},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "replace", "path": ".b"}]}
}
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False,
region_abstract=True)
site_expected = {"a": {"x": 1, "y": 2, "z": 3}, "b": 4}
region_expected = {"a": {"z": 3}, "b": 5, "c": 11}
self._test_layering(documents, site_expected, region_expected)
def test_layering_site_region_and_global_concrete(self):
# Both the site and region data should be updated as they're both
# concrete docs.
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}}},
"_REGION_DATA_1_": {"data": {"a": {"z": 3}, "b": 5}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_REGION_ACTIONS_1_": {
"actions": [{"method": "replace", "path": ".a"}]},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]}
}
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test(
mapping, site_abstract=False, region_abstract=False,
global_abstract=False)
site_expected = {'a': {'z': 3}, 'b': 4}
region_expected = {'a': {'z': 3}}
# Global data remains unchanged as there's no layer higher than it in
# this example.
global_expected = {'a': {'x': 1, 'y': 2}}
self._test_layering(documents, site_expected, region_expected,
global_expected)


class TestDocumentLayering3LayersScenario(TestDocumentLayering):
def test_layering_multiple_delete(self):
"""Scenario:
        Initially: {'a': {'x': 1, 'y': 2}, 'b': {'v': 3, 'w': 4}}
Delete ".": {}
Delete ".": {}
Merge ".": {'b': 4}
"""
mapping = {
"_GLOBAL_DATA_1_": {
"data": {'a': {'x': 1, 'y': 2}, 'b': {'v': 3, 'w': 4}}},
"_REGION_DATA_1_": {"data": {"a": {"z": 3}}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_REGION_ACTIONS_1_": {
"actions": [{'path': '.', 'method': 'delete'},
{'path': '.', 'method': 'delete'}]},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]}
}
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False)
site_expected = {'b': 4}
self._test_layering(documents, site_expected)
def test_layering_multiple_replace_1(self):
"""Scenario:
        Initially: {'a': {'x': 1, 'y': 2}, 'b': {'v': 3, 'w': 4}}
Replace ".a": {'a': {'z': 5}, 'b': {'v': 3, 'w': 4}}
Replace ".a": {'a': {'z': 5}, 'b': {'v': 3, 'w': 4}}
Merge ".": {'a': {'z': 5}, 'b': 4}
"""
mapping = {
"_GLOBAL_DATA_1_": {
"data": {'a': {'x': 1, 'y': 2}, 'b': {'v': 3, 'w': 4}}},
"_REGION_DATA_1_": {"data": {'a': {'z': 5}}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_REGION_ACTIONS_1_": {
"actions": [{'path': '.a', 'method': 'replace'},
{'path': '.a', 'method': 'replace'}]},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]}
}
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False)
site_expected = {'a': {'z': 5}, 'b': 4}
self._test_layering(documents, site_expected)
def test_layering_multiple_replace_2(self):
"""Scenario:
Initially: {'a': {'x': 1, 'y': 2}, 'b': {'v': 3, 'w': 4}}
Replace ".a": {'a': {'z': 5}, 'b': {'v': 3, 'w': 4}}
Replace ".b": {'a': {'z': 5}, 'b': [109]}
Merge ".": {'a': {'z': 5}, 'b': [32]}
"""
mapping = {
"_GLOBAL_DATA_1_": {
"data": {'a': {'x': 1, 'y': 2}, 'b': {'v': 3, 'w': 4}}},
"_REGION_DATA_1_": {"data": {'a': {'z': 5}, 'b': [109]}},
"_SITE_DATA_1_": {"data": {"b": [32]}},
"_REGION_ACTIONS_1_": {
"actions": [{'path': '.a', 'method': 'replace'},
{'path': '.b', 'method': 'replace'}]},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]}
}
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False)
site_expected = {'a': {'z': 5}, 'b': [32]}
self._test_layering(documents, site_expected)
def test_layering_multiple_replace_3(self):
"""Scenario:
Initially: {'a': {'x': 1, 'y': 2}, 'b': {'v': 3, 'w': 4}, 'c': [123]}
Replace ".a": {'a': {'z': 5}, 'b': {'v': 3, 'w': 4}, 'c': [123]}
Replace ".b": {'a': {'z': 5}, 'b': -2, 'c': [123]}
Merge ".": {'a': {'z': 5}, 'b': 4, 'c': [123]}
"""
mapping = {
"_GLOBAL_DATA_1_": {
"data": {'a': {'x': 1, 'y': 2}, 'b': {'v': 3, 'w': 4},
'c': [123]}},
"_REGION_DATA_1_": {"data": {'a': {'z': 5}, 'b': -2, 'c': '_'}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_REGION_ACTIONS_1_": {
"actions": [{'path': '.a', 'method': 'replace'},
{'path': '.b', 'method': 'replace'}]},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]}
}
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False)
site_expected = {'a': {'z': 5}, 'b': 4, 'c': [123]}
self._test_layering(documents, site_expected)
def test_layering_multiple_replace_4(self):
"""Scenario:
Initially: {'a': {'x': 1, 'y': 2}, 'b': {'v': 3, 'w': 4}, 'c': [123]}
Replace ".a": {'a': {'z': 5}, 'b': {'v': 3, 'w': 4}, 'c': [123]}
Replace ".b": {'a': {'z': 5}, 'b': -2, 'c': [123]}
Replace ".c": {'a': {'z': 5}, 'b': -2, 'c': '_'}
Merge ".": {'a': {'z': 5}, 'b': 4, 'c': '_'}
"""
mapping = {
"_GLOBAL_DATA_1_": {
"data": {'a': {'x': 1, 'y': 2}, 'b': {'v': 3, 'w': 4},
'c': [123]}},
"_REGION_DATA_1_": {"data": {'a': {'z': 5}, 'b': -2, 'c': '_'}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_REGION_ACTIONS_1_": {
"actions": [{'path': '.a', 'method': 'replace'},
{'path': '.b', 'method': 'replace'},
{'path': '.c', 'method': 'replace'}]},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]}
}
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False)
site_expected = {'a': {'z': 5}, 'b': 4, 'c': '_'}
self._test_layering(documents, site_expected)
def test_layering_multiple_delete_replace(self):
"""Scenario:
Initially: {'a': {'x': 1, 'y': 2}, 'b': {'v': 3, 'w': 4}}
Delete ".a": {'b': {'v': 3, 'w': 4}}
Replace ".b": {'b': {'z': 3}}
Delete ".b": {}
Merge ".": {'b': 4}
"""
mapping = {
"_GLOBAL_DATA_1_": {
"data": {'a': {'x': 1, 'y': 2}, 'b': {'v': 3, 'w': 4}}},
"_REGION_DATA_1_": {"data": {"b": {"z": 3}}},
"_SITE_DATA_1_": {"data": {"b": 4}},
"_REGION_ACTIONS_1_": {
"actions": [{'path': '.a', 'method': 'delete'},
{'path': '.b', 'method': 'replace'},
{'path': '.b', 'method': 'delete'}]},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]}
}
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False)
site_expected = {'b': 4}
self._test_layering(documents, site_expected)


class TestDocumentLayering3Layers2Regions2Sites(TestDocumentLayering):
def test_layering_two_abstract_regions_one_child_each(self):
"""Scenario:
Initially: r1: {"c": 3, "d": 4}, r2: {"e": 5, "f": 6}
Merge "." (g -> r1): {"a": 1, "b": 2, "c": 3, "d": 4}
Merge "." (r1 -> s1): {"a": 1, "b": 2, "c": 3, "d": 4, "g": 7, "h": 8}
Merge "." (g -> r2): {"a": 1, "b": 2, "e": 5, "f": 6}
Merge "." (r2 -> s2): {"a": 1, "b": 2, "e": 5, "f": 6, "i": 9, "j": 10}
"""
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": 1, "b": 2}},
"_REGION_DATA_1_": {"data": {"c": 3, "d": 4}},
"_REGION_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]},
"_REGION_DATA_2_": {"data": {"e": 5, "f": 6}},
"_REGION_ACTIONS_2_": {
"actions": [{"method": "merge", "path": "."}]},
"_SITE_DATA_1_": {"data": {"g": 7, "h": 8}},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]},
"_SITE_DATA_2_": {"data": {"i": 9, "j": 10}},
"_SITE_ACTIONS_2_": {
"actions": [{"method": "merge", "path": "."}]}
}
doc_factory = factories.DocumentFactory(3, [1, 2, 2])
documents = doc_factory.gen_test(
mapping, region_abstract=True, site_abstract=False,
site_parent_selectors=[
{'region': 'region1'}, {'region': 'region2'}])
site_expected = [{"a": 1, "b": 2, "c": 3, "d": 4, "g": 7, "h": 8},
{"a": 1, "b": 2, "e": 5, "f": 6, "i": 9, "j": 10}]
region_expected = [{"c": 3, "d": 4}, {"e": 5, "f": 6}]
global_expected = {"a": 1, "b": 2}
self._test_layering(documents, site_expected, region_expected,
global_expected)
def test_layering_two_concrete_regions_one_child_each(self):
"""Scenario:
Initially: r1: {"c": 3, "d": 4}, r2: {"e": 5, "f": 6}
Merge "." (g -> r1): {"a": 1, "b": 2, "c": 3, "d": 4}
Merge "." (r1 -> s1): {"a": 1, "b": 2, "c": 3, "d": 4, "g": 7, "h": 8}
Merge "." (g -> r2): {"a": 1, "b": 2, "e": 5, "f": 6}
Merge "." (r2 -> s2): {"a": 1, "b": 2, "e": 5, "f": 6, "i": 9, "j": 10}
"""
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": 1, "b": 2}},
"_REGION_DATA_1_": {"data": {"c": 3, "d": 4}},
"_REGION_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]},
"_REGION_DATA_2_": {"data": {"e": 5, "f": 6}},
"_REGION_ACTIONS_2_": {
"actions": [{"method": "merge", "path": "."}]},
"_SITE_DATA_1_": {"data": {"g": 7, "h": 8}},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": "."}]},
"_SITE_DATA_2_": {"data": {"i": 9, "j": 10}},
"_SITE_ACTIONS_2_": {
"actions": [{"method": "merge", "path": "."}]}
}
doc_factory = factories.DocumentFactory(3, [1, 2, 2])
documents = doc_factory.gen_test(
mapping, region_abstract=False, site_abstract=False,
site_parent_selectors=[
{'region': 'region1'}, {'region': 'region2'}])
site_expected = [{"a": 1, "b": 2, "c": 3, "d": 4, "g": 7, "h": 8},
{"a": 1, "b": 2, "e": 5, "f": 6, "i": 9, "j": 10}]
region_expected = [{"a": 1, "b": 2, "c": 3, "d": 4},
{"a": 1, "b": 2, "e": 5, "f": 6}]
global_expected = {"a": 1, "b": 2}
self._test_layering(documents, site_expected, region_expected,
global_expected)
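The merge/delete/replace semantics that the expected values above encode can be summarized with a small standalone sketch. This is a hypothetical simplification for illustration only, not Deckhand's engine code: `deep_merge` and `apply_action` are invented names, and real layering also resolves parents, layer order, and abstract/concrete status.

```python
import copy


def deep_merge(dst, src):
    """Recursively merge src into dst: nested dicts are merged,
    everything else is overwritten by src's value."""
    for key, value in src.items():
        if isinstance(dst.get(key), dict) and isinstance(value, dict):
            deep_merge(dst[key], value)
        else:
            dst[key] = copy.deepcopy(value)
    return dst


def apply_action(parent_data, child_data, action):
    """Apply one layering action, starting from the full parent data
    (mirroring the observable behavior these tests assert)."""
    result = copy.deepcopy(parent_data)
    keys = [k for k in action['path'].split('.') if k]
    method = action['method']
    if not keys:  # Path "." addresses the entire document body.
        if method == 'merge':
            return deep_merge(result, child_data)
        if method == 'replace':
            return copy.deepcopy(child_data)
        return {}  # delete
    *ancestors, last = keys
    node = result
    for key in ancestors:
        node = node[key]
    if method == 'delete':
        del node[last]  # KeyError here ~ MissingDocumentKey
        return result
    child_node = child_data
    for key in keys:
        child_node = child_node[key]  # KeyError here ~ MissingDocumentKey
    if method == 'replace':
        node[last] = copy.deepcopy(child_node)
    else:  # merge at a sub-path
        if isinstance(node.get(last), dict) and isinstance(child_node, dict):
            deep_merge(node[last], child_node)
        else:
            node[last] = copy.deepcopy(child_node)
    return result
```

For example, the default 2-layer scenario merges a child `{"b": 4}` onto a parent `{"a": {"x": 1, "y": 2}}` at path `"."`, and a `delete` at `".a"` removes the parent's `a` subtree entirely.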


@ -0,0 +1,169 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from deckhand.engine import layering
from deckhand import errors
from deckhand import factories
from deckhand.tests import test_utils
from deckhand.tests.unit.engine import test_document_layering


class TestDocumentLayeringNegative(
        test_document_layering.TestDocumentLayering):
def test_layering_method_merge_key_not_in_child(self):
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}, "c": 9}},
"_SITE_DATA_1_": {"data": {"a": {"x": 7, "z": 3}, "b": 4}},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "merge", "path": ".c"}]}
}
doc_factory = factories.DocumentFactory(2, [1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False)
self._test_layering(
documents, exception_expected=errors.MissingDocumentKey)
def test_layering_method_delete_key_not_in_child(self):
        # The key will not be present in the rendered data, because layering
        # actions start from the parent (global) data, which has no ".b" key.
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}, "c": 9}},
"_SITE_DATA_1_": {"data": {"a": {"x": 7, "z": 3}, "b": 4}},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "delete", "path": ".b"}]}
}
doc_factory = factories.DocumentFactory(2, [1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False)
self._test_layering(
documents, exception_expected=errors.MissingDocumentKey)
def test_layering_method_replace_key_not_in_child(self):
mapping = {
"_GLOBAL_DATA_1_": {"data": {"a": {"x": 1, "y": 2}, "c": 9}},
"_SITE_DATA_1_": {"data": {"a": {"x": 7, "z": 3}, "b": 4}},
"_SITE_ACTIONS_1_": {
"actions": [{"method": "replace", "path": ".c"}]}
}
doc_factory = factories.DocumentFactory(2, [1, 1])
documents = doc_factory.gen_test(mapping, site_abstract=False)
self._test_layering(
documents, exception_expected=errors.MissingDocumentKey)
def test_layering_without_layering_policy(self):
doc_factory = factories.DocumentFactory(2, [1, 1])
documents = doc_factory.gen_test({}, site_abstract=False)
documents.pop(0) # First doc is layering policy.
self.assertRaises(errors.LayeringPolicyNotFound,
layering.DocumentLayering, documents)
def test_layering_with_broken_layer_order(self):
doc_factory = factories.DocumentFactory(2, [1, 1])
documents = doc_factory.gen_test({}, site_abstract=False)
broken_layer_orders = [
['site', 'region', 'global'], ['broken', 'global'], ['broken'],
['site', 'broken']]
for broken_layer_order in broken_layer_orders:
documents[0]['data']['layerOrder'] = broken_layer_order
# The site will not be able to find a correct parent.
self.assertRaises(errors.MissingDocumentParent,
layering.DocumentLayering, documents)
def test_layering_child_with_invalid_parent_selector(self):
doc_factory = factories.DocumentFactory(2, [1, 1])
documents = doc_factory.gen_test({}, site_abstract=False)
for parent_selector in ({'key2': 'value2'}, {'key1': 'value2'}):
documents[-1]['metadata']['layeringDefinition'][
'parentSelector'] = parent_selector
self.assertRaises(errors.MissingDocumentParent,
layering.DocumentLayering, documents)
def test_layering_unreferenced_parent_label(self):
doc_factory = factories.DocumentFactory(2, [1, 1])
documents = doc_factory.gen_test({}, site_abstract=False)
for parent_label in ({'key2': 'value2'}, {'key1': 'value2'}):
# Second doc is the global doc, or parent.
documents[1]['metadata']['labels'] = [parent_label]
self.assertRaises(errors.MissingDocumentParent,
layering.DocumentLayering, documents)
def test_layering_duplicate_parent_selector_2_layer(self):
# Validate that documents belonging to the same layer cannot have the
# same unique parent identifier referenced by `parentSelector`.
doc_factory = factories.DocumentFactory(2, [1, 1])
documents = doc_factory.gen_test({}, site_abstract=False)
documents.append(documents[1]) # Copy global layer.
self.assertRaises(errors.IndeterminateDocumentParent,
layering.DocumentLayering, documents)
def test_layering_duplicate_parent_selector_3_layer(self):
# Validate that documents belonging to the same layer cannot have the
# same unique parent identifier referenced by `parentSelector`.
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test({}, site_abstract=False)
# 1 is global layer, 2 is region layer.
for idx in (1, 2):
documents.append(documents[idx])
self.assertRaises(errors.IndeterminateDocumentParent,
layering.DocumentLayering, documents)
documents.pop(-1) # Remove the just-appended duplicate.
def test_layering_document_references_itself(self):
# Test that a parentSelector cannot reference the document itself
# without an error being raised.
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test({}, site_abstract=False)
self_ref = {"self": "self"}
documents[2]['metadata']['labels'] = self_ref
documents[2]['metadata']['layeringDefinition'][
'parentSelector'] = self_ref
# Escape '[' and ']' for regex to work.
        expected_err = ("Missing parent document for document %s."
                        % documents[2]).replace('[', '\\[').replace(']', '\\]')
self.assertRaisesRegex(errors.MissingDocumentParent, expected_err,
layering.DocumentLayering, documents)
def test_layering_documents_with_different_schemas(self):
"""Validate that attempting to layer documents with different schemas
results in errors.
"""
doc_factory = factories.DocumentFactory(3, [1, 1, 1])
documents = doc_factory.gen_test({})
        # Region and site documents should result in no parent being found,
        # since their schemas will no longer match their parents' schemas.
for idx in range(2, 4): # Only region/site have parent.
prev_schema = documents[idx]['schema']
documents[idx]['schema'] = test_utils.rand_name('schema')
# Escape '[' and ']' for regex to work.
            expected_err = (
                "Missing parent document for document %s."
                % documents[idx]).replace('[', '\\[').replace(']', '\\]')
self.assertRaisesRegex(errors.MissingDocumentParent, expected_err,
layering.DocumentLayering, documents)
# Restore schema for next test run.
documents[idx]['schema'] = prev_schema