Allow rerunning introspection on stored data
As requested in the related bug, this pull request allows running
introspection again on previously stored data. This should make it
simple to correct mistakes in the introspection rules.

For this purpose, a new API entry point was introduced:
/v1/introspection/<UUID>/data/unprocessed that supports an empty POST
method to trigger introspection over stored data. A new function
`reapply` was introduced that takes care of the entry point and
carries out the introspection. The `process` function was modified to
allow reusing common parts in the new `reapply` function. The storage
access methods were updated to allow saving the "raw" ramdisk data
besides the processed introspection data.

The following preconditions are checked once the reapply function has
been triggered:

* no data is being sent along with the request
* Swift store is configured and enabled and the stored data is
  present for the node UUID
* node_info object is cached for the UUID and it is possible to lock
  the object

Should the preconditions fail, an immediate response is given to the
user:

* 400 if the request contained data or in case Swift store is not
  enabled in configuration
* 409 if it wasn't possible to acquire a lock for the node_info object
* 404 in case Ironic didn't keep track of the related BM node

If the preconditions are met, a background task is executed to carry
out the processing and a 202 Accepted response is returned to the
endpoint user. As requested, these steps are performed in the
background task:

* preprocessing hooks
* post processing hooks, storing result in Swift
* introspection rules

These steps are avoided, based on the RFE:

* not_found_hook is skipped
* power operations

Limitations:

* IPMI credentials are not updated --- ramdisk not running
* there's no way to update the raw data at the moment
* the raw data is never cleaned from the store
* check for stored data presence is performed in background; a
  missing data situation still results in a 202 response

Change-Id: Ic027c9d15f7f5475fcc3f599d081d1e8d5e244d4
Closes-Bug: #1525237
This commit is contained in:
parent cba6394fc9
commit f9e9a88d89
@@ -93,6 +93,25 @@ Response body: JSON dictionary with introspection data
 format and contents of the stored data. Notably, it depends on the ramdisk
 used and plugins enabled both in the ramdisk and in inspector itself.
 
+Reapply introspection on stored data
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+``POST /v1/introspection/<UUID>/data/unprocessed`` to trigger
+introspection on stored unprocessed data. No data is allowed to be
+sent along with the request.
+
+Requires X-Auth-Token header with Keystone token for authentication.
+Requires enabling Swift store in processing section of the
+configuration file.
+
+Response:
+
+* 202 - accepted
+* 400 - bad request or store not configured
+* 401, 403 - missing or invalid authentication
+* 404 - node not found for UUID
+* 409 - inspector locked node for processing
+
 Introspection Rules
 ~~~~~~~~~~~~~~~~~~~
 
@@ -323,3 +342,4 @@ Version History
 * **1.1** adds endpoint to retrieve stored introspection data.
 * **1.2** endpoints for manipulating introspection rules.
 * **1.3** endpoint for canceling running introspection
+* **1.4** endpoint for reapplying the introspection over stored data.
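The endpoint above can be exercised from any HTTP client. A minimal stdlib sketch follows; the inspector address, node UUID, and token are hypothetical, and the request is only constructed here, not sent:

```python
import urllib.request

# Hypothetical inspector address; adjust for your deployment.
INSPECTOR_URL = 'http://127.0.0.1:5050'


def build_reapply_request(uuid, token):
    """Build the POST triggering reapply; no request body is allowed."""
    return urllib.request.Request(
        '%s/v1/introspection/%s/data/unprocessed' % (INSPECTOR_URL, uuid),
        data=None,  # sending data along yields a 400 response
        headers={'X-Auth-Token': token},
        method='POST')


req = build_reapply_request('1a9f5d2c-0000-0000-0000-000000000000',
                            'keystone-token')
```

`urllib.request.urlopen(req)` would then send it; a ``202`` response indicates the background task was scheduled.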
@@ -278,3 +278,51 @@ nodes in the introspection rules using the rule condition ``eq``::
     "conditions": [
         {'op': 'eq', 'field': 'data://auto_discovered', 'value': True}
     ]
 
+Reapplying introspection on stored data
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+To allow correcting mistakes in introspection rules the API provides
+an entry point that triggers the introspection over stored data. The
+data to use for processing is kept in Swift separately from the data
+already processed. Reapplying introspection overwrites processed data
+in the store. Updating the introspection data through the endpoint
+isn't supported yet. The following preconditions are checked before
+reapplying introspection:
+
+* no data is being sent along with the request
+* Swift store is configured and enabled
+* introspection data is stored in Swift for the node UUID
+* node record is kept in database for the UUID
+* introspection is not ongoing for the node UUID
+
+Should the preconditions fail an immediate response is given to the
+user:
+
+* ``400`` if the request contained data or in case Swift store is not
+  enabled in configuration
+* ``404`` in case Ironic doesn't keep track of the node UUID
+* ``409`` if an introspection is already ongoing for the node
+
+If the preconditions are met a background task is executed to carry
+out the processing and a ``202 Accepted`` response is returned to the
+endpoint user. As requested, these steps are performed in the
+background task:
+
+* preprocessing hooks
+* post processing hooks, storing result in Swift
+* introspection rules
+
+These steps are avoided, based on the feature requirements:
+
+* ``node_not_found_hook`` is skipped
+* power operations
+* roll-back actions done by hooks
+
+Limitations:
+
+* IPMI credentials are not updated --- ramdisk not running
+* there's no way to update the unprocessed data at the moment
+* the unprocessed data is never cleaned from the store
+* check for stored data presence is performed in background;
+  missing data situation still results in a ``202`` response
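The precondition checks and the response codes they map to can be sketched as a pure function; this is illustrative only (the names are not from the inspector code, and the real checks are split between the API layer and the background machinery):

```python
def reapply_status(request_has_data, swift_enabled, node_known, node_locked):
    """Map reapply preconditions to the HTTP status the API returns."""
    if request_has_data or not swift_enabled:
        return 400  # bad request, or store not configured
    if not node_known:
        return 404  # Ironic doesn't keep track of the node UUID
    if node_locked:
        return 409  # introspection already ongoing for the node
    return 202  # accepted; processing continues in a background task
```

Note the ordering mirrors the document: request-level checks fail fastest, while the stored-data presence check happens only in the background task, so missing data still yields a ``202``.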
@@ -196,27 +196,35 @@ class SwiftAPI(object):
         return obj
 
 
-def store_introspection_data(data, uuid):
+def store_introspection_data(data, uuid, suffix=None):
     """Uploads introspection data to Swift.
 
     :param data: data to store in Swift
     :param uuid: UUID of the Ironic node that the data came from
+    :param suffix: optional suffix to add to the underlying swift
+                   object name
     :returns: name of the Swift object that the data is stored in
     """
     swift_api = SwiftAPI()
     swift_object_name = '%s-%s' % (OBJECT_NAME_PREFIX, uuid)
+    if suffix is not None:
+        swift_object_name = '%s-%s' % (swift_object_name, suffix)
     swift_api.create_object(swift_object_name, json.dumps(data))
     return swift_object_name
 
 
-def get_introspection_data(uuid):
+def get_introspection_data(uuid, suffix=None):
     """Downloads introspection data from Swift.
 
     :param uuid: UUID of the Ironic node that the data came from
+    :param suffix: optional suffix to add to the underlying swift
+                   object name
    :returns: Swift object with the introspection data
     """
     swift_api = SwiftAPI()
     swift_object_name = '%s-%s' % (OBJECT_NAME_PREFIX, uuid)
+    if suffix is not None:
+        swift_object_name = '%s-%s' % (swift_object_name, suffix)
     return swift_api.get_object(swift_object_name)
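The suffix handling above only extends the Swift object name. A standalone sketch of the naming scheme (the `inspector_data` prefix matches the name the tests later in this diff expect for `OBJECT_NAME_PREFIX`):

```python
OBJECT_NAME_PREFIX = 'inspector_data'


def swift_object_name(uuid, suffix=None):
    """Mirror the naming used by store/get_introspection_data."""
    name = '%s-%s' % (OBJECT_NAME_PREFIX, uuid)
    if suffix is not None:
        name = '%s-%s' % (name, suffix)
    return name
```

So processed data lives at `inspector_data-<uuid>`, while the raw ramdisk copy lives at `inspector_data-<uuid>-UNPROCESSED`.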
@@ -47,7 +47,7 @@ app = flask.Flask(__name__)
 LOG = utils.getProcessingLogger(__name__)
 
 MINIMUM_API_VERSION = (1, 0)
-CURRENT_API_VERSION = (1, 3)
+CURRENT_API_VERSION = (1, 4)
 _LOGGING_EXCLUDED_KEYS = ('logs',)
 
 
@@ -234,6 +234,25 @@ def api_introspection_data(uuid):
                           code=404)
 
 
+@app.route('/v1/introspection/<uuid>/data/unprocessed', methods=['POST'])
+@convert_exceptions
+def api_introspection_reapply(uuid):
+    utils.check_auth(flask.request)
+
+    if flask.request.content_length:
+        return error_response(_('User data processing is not '
+                                'supported yet'), code=400)
+
+    if CONF.processing.store_data == 'swift':
+        process.reapply(uuid)
+        return '', 202
+    else:
+        return error_response(_('Inspector is not configured to store'
+                                ' data. Set the [processing] '
+                                'store_data configuration option to '
+                                'change this.'), code=400)
+
+
 def rule_repr(rule, short):
     result = rule.as_dict(short=short)
     result['links'] = [{
@@ -13,7 +13,10 @@
 
 """Handling introspection data from the ramdisk."""
 
+import copy
 import eventlet
+import json
 
 from ironicclient import exceptions
 from oslo_config import cfg
 
@@ -33,6 +36,7 @@ LOG = utils.getProcessingLogger(__name__)
 _CREDENTIALS_WAIT_RETRIES = 10
 _CREDENTIALS_WAIT_PERIOD = 3
 _STORAGE_EXCLUDED_KEYS = {'logs'}
+_UNPROCESSED_DATA_STORE_SUFFIX = 'UNPROCESSED'
 
 
 def _find_node_info(introspection_data, failures):
@@ -60,13 +64,8 @@ def _find_node_info(introspection_data, failures):
         failures.append(_('Look up error: %s') % exc)
 
 
-def process(introspection_data):
-    """Process data from the ramdisk.
-
-    This function heavily relies on the hooks to do the actual data processing.
-    """
+def _run_pre_hooks(introspection_data, failures):
     hooks = plugins_base.processing_hooks_manager()
-    failures = []
     for hook_ext in hooks:
         # NOTE(dtantsur): catch exceptions, so that we have changes to update
         # node introspection status after look up
@@ -90,6 +89,64 @@ def process(introspection_data):
                        'exc_class': exc.__class__.__name__,
                        'error': exc})
 
 
+def _filter_data_excluded_keys(data):
+    return {k: v for k, v in data.items()
+            if k not in _STORAGE_EXCLUDED_KEYS}
+
+
+def _store_data(node_info, data, suffix=None):
+    if CONF.processing.store_data != 'swift':
+        LOG.debug("Swift support is disabled, introspection data "
+                  "won't be stored", node_info=node_info)
+        return
+
+    swift_object_name = swift.store_introspection_data(
+        _filter_data_excluded_keys(data),
+        node_info.uuid,
+        suffix=suffix
+    )
+    LOG.info(_LI('Introspection data was stored in Swift in object '
+                 '%s'), swift_object_name, node_info=node_info)
+    if CONF.processing.store_data_location:
+        node_info.patch([{'op': 'add', 'path': '/extra/%s' %
+                          CONF.processing.store_data_location,
+                          'value': swift_object_name}])
+
+
+def _store_unprocessed_data(node_info, data):
+    # runs in background
+    try:
+        _store_data(node_info, data,
+                    suffix=_UNPROCESSED_DATA_STORE_SUFFIX)
+    except Exception:
+        LOG.exception(_LE('Encountered exception saving unprocessed '
+                          'introspection data'), node_info=node_info,
+                      data=data)
+
+
+def _get_unprocessed_data(uuid):
+    if CONF.processing.store_data == 'swift':
+        LOG.debug('Fetching unprocessed introspection data from '
+                  'Swift for %s', uuid)
+        return json.loads(
+            swift.get_introspection_data(
+                uuid,
+                suffix=_UNPROCESSED_DATA_STORE_SUFFIX
+            )
+        )
+    else:
+        raise utils.Error(_('Swift support is disabled'), code=400)
+
+
+def process(introspection_data):
+    """Process data from the ramdisk.
+
+    This function heavily relies on the hooks to do the actual data processing.
+    """
+    unprocessed_data = copy.deepcopy(introspection_data)
+    failures = []
+    _run_pre_hooks(introspection_data, failures)
     node_info = _find_node_info(introspection_data, failures)
     if node_info:
         # Locking is already done in find_node() but may be not done in a
@@ -112,6 +169,12 @@ def process(introspection_data):
                            'error: %s') % node_info.error,
                           node_info=node_info, code=400)
 
+    # Note(mkovacik): store data now when we're sure that a background
+    # thread won't race with other process() or introspect.abort()
+    # call
+    utils.executor().submit(_store_unprocessed_data, node_info,
+                            unprocessed_data)
+
     try:
         node = node_info.node()
     except exceptions.NotFound:
@@ -148,23 +211,7 @@ def _process_node(node, introspection_data, node_info):
     node_info.create_ports(introspection_data.get('macs') or ())
 
     _run_post_hooks(node_info, introspection_data)
-    if CONF.processing.store_data == 'swift':
-        stored_data = {k: v for k, v in introspection_data.items()
-                       if k not in _STORAGE_EXCLUDED_KEYS}
-        swift_object_name = swift.store_introspection_data(stored_data,
-                                                           node_info.uuid)
-        LOG.info(_LI('Introspection data was stored in Swift in object %s'),
-                 swift_object_name,
-                 node_info=node_info, data=introspection_data)
-        if CONF.processing.store_data_location:
-            node_info.patch([{'op': 'add', 'path': '/extra/%s' %
-                              CONF.processing.store_data_location,
-                              'value': swift_object_name}])
-    else:
-        LOG.debug('Swift support is disabled, introspection data '
-                  'won\'t be stored',
-                  node_info=node_info, data=introspection_data)
+    _store_data(node_info, introspection_data)
 
     ironic = ir_utils.get_client()
     firewall.update_filters(ironic)
@@ -222,23 +269,93 @@ def _finish_set_ipmi_credentials(ironic, node, node_info, introspection_data,
     raise utils.Error(msg, node_info=node_info, data=introspection_data)
 
 
-def _finish(ironic, node_info, introspection_data):
-    LOG.debug('Forcing power off of node %s', node_info.uuid)
-    try:
-        ironic.node.set_power_state(node_info.uuid, 'off')
-    except Exception as exc:
-        if node_info.node().provision_state == 'enroll':
-            LOG.info(_LI("Failed to power off the node in 'enroll' state, "
-                         "ignoring; error was %s") % exc,
-                     node_info=node_info, data=introspection_data)
-        else:
-            msg = (_('Failed to power off node %(node)s, check it\'s '
-                     'power management configuration: %(exc)s') %
-                   {'node': node_info.uuid, 'exc': exc})
-            node_info.finished(error=msg)
-            raise utils.Error(msg, node_info=node_info,
-                              data=introspection_data)
+def _finish(ironic, node_info, introspection_data, power_off=True):
+    if power_off:
+        LOG.debug('Forcing power off of node %s', node_info.uuid)
+        try:
+            ironic.node.set_power_state(node_info.uuid, 'off')
+        except Exception as exc:
+            if node_info.node().provision_state == 'enroll':
+                LOG.info(_LI("Failed to power off the node in "
+                             "'enroll' state, ignoring; error was "
+                             "%s") % exc, node_info=node_info,
+                         data=introspection_data)
+            else:
+                msg = (_('Failed to power off node %(node)s, check '
+                         'its power management configuration: '
+                         '%(exc)s') % {'node': node_info.uuid, 'exc':
+                                       exc})
+                node_info.finished(error=msg)
+                raise utils.Error(msg, node_info=node_info,
+                                  data=introspection_data)
+        LOG.info(_LI('Node powered-off'), node_info=node_info,
+                 data=introspection_data)
 
     node_info.finished()
     LOG.info(_LI('Introspection finished successfully'),
              node_info=node_info, data=introspection_data)
+
+
+def reapply(uuid):
+    """Re-apply introspection steps.
+
+    Re-apply preprocessing, postprocessing and introspection rules on
+    stored data.
+
+    :param uuid: node uuid to use
+    :raises: utils.Error
+
+    """
+
+    LOG.debug('Processing re-apply introspection request for node '
+              'UUID: %s', uuid)
+    node_info = node_cache.get_node(uuid, locked=False)
+    if not node_info.acquire_lock(blocking=False):
+        # Note (mkovacik): it should be sufficient to check data
+        # presence & locking. If either introspection didn't start
+        # yet, was in waiting state or didn't finish yet, either data
+        # won't be available or locking would fail
+        raise utils.Error(_('Node locked, please, try again later'),
+                          node_info=node_info, code=409)
+
+    utils.executor().submit(_reapply, node_info)
+
+
+def _reapply(node_info):
+    # runs in background
+    try:
+        introspection_data = _get_unprocessed_data(node_info.uuid)
+    except Exception:
+        LOG.exception(_LE('Encountered exception while fetching '
+                          'stored introspection data'),
+                      node_info=node_info)
+        node_info.release_lock()
+        return
+
+    failures = []
+    _run_pre_hooks(introspection_data, failures)
+    if failures:
+        LOG.error(_LE('Pre-processing failures detected reapplying '
+                      'introspection on stored data:\n%s'),
+                  '\n'.join(failures), node_info=node_info)
+        node_info.finished(error='\n'.join(failures))
+        return
+
+    try:
+        ironic = ir_utils.get_client()
+        node_info.create_ports(introspection_data.get('macs') or ())
+        _run_post_hooks(node_info, introspection_data)
+        _store_data(node_info, introspection_data)
+        node_info.invalidate_cache()
+        rules.apply(node_info, introspection_data)
+        _finish(ironic, node_info, introspection_data,
+                power_off=False)
+    except Exception as exc:
+        LOG.exception(_LE('Encountered exception reapplying '
+                          'introspection on stored data'),
+                      node_info=node_info,
+                      data=introspection_data)
+        node_info.finished(error=str(exc))
+    else:
+        LOG.info(_LI('Successfully reapplied introspection on stored '
+                     'data'), node_info=node_info, data=introspection_data)
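The reapply flow above (non-blocking lock acquisition, then handing the heavy work to an executor) can be sketched in isolation with stdlib primitives. This is illustrative only and uses none of the inspector's actual node_cache/executor machinery:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

_locks = {}          # per-node locks, keyed by UUID (hypothetical)
_executor = ThreadPoolExecutor(max_workers=2)


def reapply(uuid, work):
    """Fail fast if the node is busy, else run work in the background."""
    lock = _locks.setdefault(uuid, threading.Lock())
    if not lock.acquire(blocking=False):
        # the API layer would translate this into a 409 response
        raise RuntimeError('Node locked, please, try again later')
    # scheduled; the API layer now returns 202 to the caller
    return _executor.submit(_reapply, uuid, work, lock)


def _reapply(uuid, work, lock):
    # runs in background; always release the lock when done
    try:
        return work(uuid)
    finally:
        lock.release()
```

The non-blocking `acquire` is what turns a concurrent introspection into an immediate error instead of a silent queue-up, matching the 409 precondition described in the docs.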
@@ -15,6 +15,7 @@ import eventlet
 eventlet.monkey_patch()
 
 import contextlib
+import copy
 import json
 import os
 import shutil
@@ -27,6 +28,7 @@ from oslo_utils import units
 import requests
 
 from ironic_inspector.common import ironic as ir_utils
+from ironic_inspector.common import swift
 from ironic_inspector import dbsync
 from ironic_inspector import main
 from ironic_inspector import rules
@@ -164,6 +166,10 @@ class Base(base.NodeTest):
     def call_abort_introspect(self, uuid):
         return self.call('post', '/v1/introspection/%s/abort' % uuid)
 
+    def call_reapply(self, uuid):
+        return self.call('post', '/v1/introspection/%s/data/unprocessed' %
+                         uuid)
+
     def call_continue(self, data):
         return self.call('post', '/v1/continue', data=data).json()
 
@@ -432,6 +438,66 @@ class Test(Base):
         # after releasing the node lock
         self.call('post', '/v1/continue', self.data, expect_error=400)
 
+    @mock.patch.object(swift, 'store_introspection_data', autospec=True)
+    @mock.patch.object(swift, 'get_introspection_data', autospec=True)
+    def test_stored_data_processing(self, get_mock, store_mock):
+        cfg.CONF.set_override('store_data', 'swift', 'processing')
+
+        # ramdisk data copy
+        # please mind the data is changed during processing
+        ramdisk_data = json.dumps(copy.deepcopy(self.data))
+        get_mock.return_value = ramdisk_data
+
+        self.call_introspect(self.uuid)
+        eventlet.greenthread.sleep(DEFAULT_SLEEP)
+        self.cli.node.set_power_state.assert_called_once_with(self.uuid,
+                                                              'reboot')
+
+        res = self.call_continue(self.data)
+        self.assertEqual({'uuid': self.uuid}, res)
+        eventlet.greenthread.sleep(DEFAULT_SLEEP)
+
+        status = self.call_get_status(self.uuid)
+        self.assertEqual({'finished': True, 'error': None}, status)
+
+        res = self.call_reapply(self.uuid)
+        self.assertEqual(202, res.status_code)
+        self.assertEqual(b'', res.text)
+        eventlet.greenthread.sleep(DEFAULT_SLEEP)
+
+        # reapply request data
+        get_mock.assert_called_once_with(self.uuid,
+                                         suffix='UNPROCESSED')
+
+        # store ramdisk data, store processing result data, store
+        # reapply processing result data; the ordering isn't
+        # guaranteed as store ramdisk data runs in a background
+        # thread; however, last call has to always be reapply processing
+        # result data
+        store_ramdisk_call = mock.call(mock.ANY, self.uuid,
+                                       suffix='UNPROCESSED')
+        store_processing_call = mock.call(mock.ANY, self.uuid,
+                                          suffix=None)
+        self.assertEqual(3, len(store_mock.call_args_list))
+        self.assertIn(store_ramdisk_call,
+                      store_mock.call_args_list[0:2])
+        self.assertIn(store_processing_call,
+                      store_mock.call_args_list[0:2])
+        self.assertEqual(store_processing_call,
+                         store_mock.call_args_list[2])
+
+        # second reapply call
+        get_mock.return_value = ramdisk_data
+        res = self.call_reapply(self.uuid)
+        self.assertEqual(202, res.status_code)
+        self.assertEqual(b'', res.text)
+        eventlet.greenthread.sleep(DEFAULT_SLEEP)
+
+        # reapply saves the result
+        self.assertEqual(4, len(store_mock.call_args_list))
+        self.assertEqual(store_processing_call,
+                         store_mock.call_args_list[-1])
+
 
 @contextlib.contextmanager
 def mocked_server():
@@ -234,6 +234,78 @@ class TestApiGetData(BaseAPITest):
         self.assertEqual(404, res.status_code)
 
 
+@mock.patch.object(process, 'reapply', autospec=True)
+class TestApiReapply(BaseAPITest):
+
+    def setUp(self):
+        super(TestApiReapply, self).setUp()
+        CONF.set_override('store_data', 'swift', 'processing')
+
+    def test_ok(self, reapply_mock):
+
+        self.app.post('/v1/introspection/%s/data/unprocessed' %
+                      self.uuid)
+        reapply_mock.assert_called_once_with(self.uuid)
+
+    def test_user_data(self, reapply_mock):
+        res = self.app.post('/v1/introspection/%s/data/unprocessed' %
+                            self.uuid, data='some data')
+        self.assertEqual(400, res.status_code)
+        message = json.loads(res.data.decode())['error']['message']
+        self.assertEqual('User data processing is not supported yet',
+                         message)
+        self.assertFalse(reapply_mock.called)
+
+    def test_swift_disabled(self, reapply_mock):
+        CONF.set_override('store_data', 'none', 'processing')
+
+        res = self.app.post('/v1/introspection/%s/data/unprocessed' %
+                            self.uuid)
+        self.assertEqual(400, res.status_code)
+        message = json.loads(res.data.decode())['error']['message']
+        self.assertEqual('Inspector is not configured to store '
+                         'data. Set the [processing] store_data '
+                         'configuration option to change this.',
+                         message)
+        self.assertFalse(reapply_mock.called)
+
+    def test_node_locked(self, reapply_mock):
+        exc = utils.Error('Locked.', code=409)
+        reapply_mock.side_effect = exc
+
+        res = self.app.post('/v1/introspection/%s/data/unprocessed' %
+                            self.uuid)
+
+        self.assertEqual(409, res.status_code)
+        message = json.loads(res.data.decode())['error']['message']
+        self.assertEqual(str(exc), message)
+        reapply_mock.assert_called_once_with(self.uuid)
+
+    def test_node_not_found(self, reapply_mock):
+        exc = utils.Error('Not found.', code=404)
+        reapply_mock.side_effect = exc
+
+        res = self.app.post('/v1/introspection/%s/data/unprocessed' %
+                            self.uuid)
+
+        self.assertEqual(404, res.status_code)
+        message = json.loads(res.data.decode())['error']['message']
+        self.assertEqual(str(exc), message)
+        reapply_mock.assert_called_once_with(self.uuid)
+
+    def test_generic_error(self, reapply_mock):
+        exc = utils.Error('Oops', code=400)
+        reapply_mock.side_effect = exc
+
+        res = self.app.post('/v1/introspection/%s/data/unprocessed' %
+                            self.uuid)
+
+        self.assertEqual(400, res.status_code)
+        message = json.loads(res.data.decode())['error']['message']
+        self.assertEqual(str(exc), message)
+        reapply_mock.assert_called_once_with(self.uuid)
+
+
 class TestApiRules(BaseAPITest):
     @mock.patch.object(rules, 'get_all')
     def test_get_all(self, get_all_mock):
@ -11,6 +11,7 @@
|
|||||||
# See the License for the specific language governing permissions and
|
# See the License for the specific language governing permissions and
|
||||||
# limitations under the License.
|
# limitations under the License.
|
||||||
|
|
||||||
|
import copy
|
||||||
import functools
|
import functools
|
||||||
import json
|
import json
|
||||||
import time
|
import time
|
||||||
@ -98,6 +99,37 @@ class TestProcess(BaseTest):
|
|||||||
process_mock.assert_called_once_with(cli.node.get.return_value,
|
process_mock.assert_called_once_with(cli.node.get.return_value,
|
||||||
self.data, pop_mock.return_value)
|
self.data, pop_mock.return_value)
|
||||||
|
|
||||||
|
@prepare_mocks
|
||||||
|
def test_save_unprocessed_data(self, cli, pop_mock, process_mock):
|
||||||
|
CONF.set_override('store_data', 'swift', 'processing')
|
||||||
|
expected = copy.deepcopy(self.data)
|
||||||
|
|
||||||
|
with mock.patch.object(process, '_store_unprocessed_data',
|
||||||
|
autospec=True) as store_mock:
|
||||||
|
process.process(self.data)
|
||||||
|
|
||||||
|
store_mock.assert_called_once_with(mock.ANY, expected)
|
||||||
|
|
||||||
|
@prepare_mocks
|
||||||
|
def test_save_unprocessed_data_failure(self, cli, pop_mock,
|
||||||
|
process_mock):
|
||||||
|
CONF.set_override('store_data', 'swift', 'processing')
|
||||||
|
name = 'inspector_data-%s-%s' % (
|
||||||
|
self.uuid,
|
||||||
|
process._UNPROCESSED_DATA_STORE_SUFFIX
|
||||||
|
)
|
||||||
|
|
||||||
|
with mock.patch.object(process.swift, 'SwiftAPI',
|
||||||
|
autospec=True) as swift_mock:
|
||||||
|
swift_conn = swift_mock.return_value
|
||||||
|
swift_conn.create_object.side_effect = iter([utils.Error('Oops')])
|
||||||
|
|
||||||
|
res = process.process(self.data)
|
||||||
|
|
||||||
|
# assert store failure doesn't break processing
|
||||||
|
self.assertEqual(self.fake_result_json, res)
|
||||||
|
swift_conn.create_object.assert_called_once_with(name, mock.ANY)
|
||||||
|
|
||||||
@prepare_mocks
|
@prepare_mocks
|
||||||
def test_no_ipmi(self, cli, pop_mock, process_mock):
|
def test_no_ipmi(self, cli, pop_mock, process_mock):
|
||||||
del self.data['ipmi_address']
|
del self.data['ipmi_address']
|
||||||
@@ -396,8 +428,9 @@ class TestProcessNode(BaseTest):
         self.assertCalledWithPatch(self.patch_props, self.cli.node.update)
         finished_mock.assert_called_once_with(
             mock.ANY,
-            error='Failed to power off node %s, check it\'s power management'
-                  ' configuration: boom' % self.uuid)
+            error='Failed to power off node %s, check its power '
+                  'management configuration: boom' % self.uuid
+        )
 
     @mock.patch.object(node_cache.NodeInfo, 'finished', autospec=True)
     def test_power_off_enroll_state(self, finished_mock, filters_mock,
@@ -462,3 +495,201 @@ class TestProcessNode(BaseTest):
         self.assertEqual(expected,
                          json.loads(swift_conn.create_object.call_args[0][1]))
         self.assertCalledWithPatch(self.patch_props, self.cli.node.update)
+
+
+@mock.patch.object(process, '_reapply', autospec=True)
+@mock.patch.object(node_cache, 'get_node', autospec=True)
+class TestReapply(BaseTest):
+    def prepare_mocks(func):
+        @functools.wraps(func)
+        def wrapper(self, pop_mock, *args, **kw):
+            pop_mock.return_value = node_cache.NodeInfo(
+                uuid=self.node.uuid,
+                started_at=self.started_at)
+            pop_mock.return_value.finished = mock.Mock()
+            pop_mock.return_value.acquire_lock = mock.Mock()
+            return func(self, pop_mock, *args, **kw)
+
+        return wrapper
+
+    def setUp(self):
+        super(TestReapply, self).setUp()
+        CONF.set_override('store_data', 'swift', 'processing')
+
+    @prepare_mocks
+    def test_ok(self, pop_mock, reapply_mock):
+        process.reapply(self.uuid)
+        pop_mock.assert_called_once_with(self.uuid, locked=False)
+        pop_mock.return_value.acquire_lock.assert_called_once_with(
+            blocking=False
+        )
+
+        reapply_mock.assert_called_once_with(pop_mock.return_value)
+
+    @prepare_mocks
+    def test_locking_failed(self, pop_mock, reapply_mock):
+        pop_mock.return_value.acquire_lock.return_value = False
+        exc = utils.Error('Node locked, please, try again later')
+
+        with self.assertRaises(type(exc)) as cm:
+            process.reapply(self.uuid)
+
+        self.assertEqual(str(exc), str(cm.exception))
+
+        pop_mock.assert_called_once_with(self.uuid, locked=False)
+        pop_mock.return_value.acquire_lock.assert_called_once_with(
+            blocking=False
+        )
+
+
+@mock.patch.object(example_plugin.ExampleProcessingHook, 'before_update')
+@mock.patch.object(process.rules, 'apply', autospec=True)
+@mock.patch.object(process.swift, 'SwiftAPI', autospec=True)
+@mock.patch.object(node_cache.NodeInfo, 'finished', autospec=True)
+@mock.patch.object(node_cache.NodeInfo, 'release_lock', autospec=True)
+class TestReapplyNode(BaseTest):
+    def setUp(self):
+        super(TestReapplyNode, self).setUp()
+        CONF.set_override('processing_hooks',
+                          '$processing.default_processing_hooks,example',
+                          'processing')
+        CONF.set_override('store_data', 'swift', 'processing')
+        self.data['macs'] = self.macs
+        self.data['all_interfaces'] = self.data['interfaces']
+        self.ports = self.all_ports
+        self.node_info = node_cache.NodeInfo(uuid=self.uuid,
+                                             started_at=self.started_at,
+                                             node=self.node)
+        self.node_info.invalidate_cache = mock.Mock()
+        self.new_creds = ('user', 'password')
+        self.cli = mock.Mock()
+        self.cli.port.create.side_effect = self.ports
+        self.cli.node.update.return_value = self.node
+        self.cli.node.list_ports.return_value = []
+
+    @mock.patch.object(ir_utils, 'get_client', autospec=True)
+    def call(self, cli_mock):
+        cli_mock.return_value = self.cli
+        process._reapply(self.node_info)
+        # make sure the node_info lock is released after a call
+        self.node_info.release_lock.assert_called_once_with(self.node_info)
+
+    def prepare_mocks(fn):
+        @functools.wraps(fn)
+        def wrapper(self, release_mock, finished_mock, swift_mock,
+                    *args, **kw):
+            finished_mock.side_effect = lambda *a, **kw: \
+                release_mock(self.node_info)
+            swift_client_mock = swift_mock.return_value
+            fn(self, finished_mock, swift_client_mock, *args, **kw)
+        return wrapper
+
+    @prepare_mocks
+    def test_ok(self, finished_mock, swift_mock, apply_mock,
+                post_hook_mock):
+        swift_name = 'inspector_data-%s' % self.uuid
+        swift_mock.get_object.return_value = json.dumps(self.data)
+
+        with mock.patch.object(process.LOG, 'error',
+                               autospec=True) as log_mock:
+            self.call()
+
+        # no failures logged
+        self.assertFalse(log_mock.called)
+
+        post_hook_mock.assert_called_once_with(mock.ANY, self.node_info)
+        swift_mock.create_object.assert_called_once_with(swift_name,
+                                                         mock.ANY)
+        swifted_data = json.loads(swift_mock.create_object.call_args[0][1])
+
+        self.node_info.invalidate_cache.assert_called_once_with()
+        apply_mock.assert_called_once_with(self.node_info, swifted_data)
+
+        # assert no power operations were performed
+        self.assertFalse(self.cli.node.set_power_state.called)
+        finished_mock.assert_called_once_with(self.node_info)
+
+        # assert validate_interfaces was called
+        self.assertEqual({'em2': self.data['interfaces']['em2']},
+                         swifted_data['interfaces'])
+        self.assertEqual([self.pxe_mac], swifted_data['macs'])
+
+        # assert ports were created with whatever was left
+        # behind by validate_interfaces
+        self.cli.port.create.assert_called_once_with(
+            node_uuid=self.uuid,
+            address=swifted_data['macs'][0]
+        )
+
+    @prepare_mocks
+    def test_get_incoming_data_exception(self, finished_mock,
+                                         swift_mock, apply_mock,
+                                         post_hook_mock):
+        exc = Exception('Oops')
+        swift_mock.get_object.side_effect = exc
+        with mock.patch.object(process.LOG, 'exception',
+                               autospec=True) as log_mock:
+            self.call()
+
+        log_mock.assert_called_once_with('Encountered exception '
+                                         'while fetching stored '
+                                         'introspection data',
+                                         node_info=self.node_info)
+
+        self.assertFalse(swift_mock.create_object.called)
+        self.assertFalse(apply_mock.called)
+        self.assertFalse(post_hook_mock.called)
+        self.assertFalse(finished_mock.called)
+
+    @prepare_mocks
+    def test_prehook_failure(self, finished_mock, swift_mock,
+                             apply_mock, post_hook_mock):
+        CONF.set_override('processing_hooks', 'example',
+                          'processing')
+        plugins_base._HOOKS_MGR = None
+
+        exc = Exception('Failed.')
+        swift_mock.get_object.return_value = json.dumps(self.data)
+
+        with mock.patch.object(example_plugin.ExampleProcessingHook,
+                               'before_processing') as before_processing_mock:
+            before_processing_mock.side_effect = exc
+            with mock.patch.object(process.LOG, 'error',
+                                   autospec=True) as log_mock:
+                self.call()
+
+        exc_failure = ('Unexpected exception %(exc_class)s during '
+                       'preprocessing in hook example: %(error)s' %
+                       {'exc_class': type(exc).__name__, 'error': exc})
+        log_mock.assert_called_once_with('Pre-processing failures '
+                                         'detected reapplying '
+                                         'introspection on stored '
+                                         'data:\n%s', exc_failure,
+                                         node_info=self.node_info)
+        finished_mock.assert_called_once_with(self.node_info,
+                                              error=exc_failure)
+        # assert _reapply ended having detected the failure
+        self.assertFalse(swift_mock.create_object.called)
+        self.assertFalse(apply_mock.called)
+        self.assertFalse(post_hook_mock.called)
+
+    @prepare_mocks
+    def test_generic_exception_creating_ports(self, finished_mock,
+                                              swift_mock, apply_mock,
+                                              post_hook_mock):
+        swift_mock.get_object.return_value = json.dumps(self.data)
+        exc = Exception('Oops')
+        self.cli.port.create.side_effect = exc
+
+        with mock.patch.object(process.LOG, 'exception') as log_mock:
+            self.call()
+
+        log_mock.assert_called_once_with('Encountered exception reapplying'
+                                         ' introspection on stored data',
+                                         node_info=self.node_info,
+                                         data=mock.ANY)
+        finished_mock.assert_called_once_with(self.node_info, error=str(exc))
+        self.assertFalse(swift_mock.create_object.called)
+        self.assertFalse(apply_mock.called)
+        self.assertFalse(post_hook_mock.called)
@@ -0,0 +1,4 @@
+---
+features:
+  - Introduced API "POST /v1/introspection/UUID/data/unprocessed"
+    for reapplying the introspection over stored data.
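As described in the commit message, the new endpoint takes an empty POST and answers 202 Accepted once the background reapply task is scheduled (400/404/409 signal the precondition failures listed above). A minimal client sketch using only the standard library; the helper names `reapply_url` and `trigger_reapply` are illustrative, not part of the change, and the base URL assumes a local ironic-inspector on its default port 5050:

```python
import urllib.request


def reapply_url(base_url, uuid):
    # Hypothetical helper: build the endpoint URL introduced by this change.
    return '%s/v1/introspection/%s/data/unprocessed' % (
        base_url.rstrip('/'), uuid)


def trigger_reapply(uuid, base_url='http://127.0.0.1:5050'):
    # An empty-bodied POST triggers reapplying introspection on stored data;
    # 202 means the background task was accepted.
    req = urllib.request.Request(reapply_url(base_url, uuid),
                                 data=b'', method='POST')
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Note that a 202 only means the task was scheduled; per the limitations above, a missing-stored-data situation is detected in the background and still yields a 202.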