Deprecate ironic-lib master branch
Functionality of ironic-lib has been moved into Ironic and
Ironic-Python-Agent directly as needed.

Change-Id: I9658af870a9f2e385fcb9d3c25ead6a15886e53d
parent 7e134f060e
commit 2939f5dbc4
.gitignore (vendored) | 36
@@ -1,36 +0,0 @@
# Compiled files
*.py[co]
*.a
*.o
*.so

# Sphinx
_build
doc/source/reference/api/

# Packages/installer info
*.egg
*.egg-info
dist
build
eggs
.eggs
parts
var
sdist
develop-eggs
.installed.cfg

# Other
*.DS_Store
.stestr
.tox
.venv
.*.swp
.coverage
cover
AUTHORS
ChangeLog
*.sqlite
*~
.idea
@@ -1,3 +0,0 @@
[DEFAULT]
test_path=${TESTS_DIR:-./ironic_lib/tests}
top_dir=./
@@ -1,10 +0,0 @@
If you would like to contribute to the development of OpenStack,
you must follow the steps documented at:

   https://docs.openstack.org/infra/manual/developers.html#development-workflow

Pull requests submitted through GitHub will be ignored.

Bugs should be filed in StoryBoard, not GitHub:

   https://storyboard.openstack.org/#!/project/946
LICENSE | 202
@@ -1,202 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.

"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.

"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.

"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.

"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.

"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).

"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.

"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.

2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.

3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.

4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:

(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and

(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and

(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and

(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.

You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.

5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.

6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.

8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.

END OF TERMS AND CONDITIONS

APPENDIX: How to apply the Apache License to your work.

To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "{}"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.

Copyright {yyyy} {name of copyright owner}

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
@@ -1,6 +0,0 @@
include AUTHORS
include ChangeLog
exclude .gitignore
exclude .gitreview

global-exclude *.pyc
README.rst | 32
@@ -2,28 +2,18 @@
 ironic-lib
 ==========

-Team and repository tags
-------------------------
+This project is no longer maintained.

-.. image:: https://governance.openstack.org/tc/badges/ironic-lib.svg
-   :target: https://governance.openstack.org/tc/reference/tags/index.html
+The contents of this repository are still available in the Git
+source code management system. To see the contents of this
+repository before it reached its end of life, please check out the
+previous commit with "git checkout HEAD^1".

-Overview
---------
+The ironic-lib library will no longer receive releases. Any functionality
+previously provided by this library has been moved directly into Ironic
+or Ironic-Python-Agent.

-A common library to be used **exclusively** by projects under the `Ironic
-governance <https://governance.openstack.org/tc/reference/projects/ironic.html>`_.
-
-Running Tests
--------------
-
-To run tests in virtualenvs (preferred)::
-
-  $ sudo pip install tox
-  $ tox
-
-To run tests in the current environment::
-
-  $ sudo pip install -r requirements.txt -r test-requirements.txt
-  $ stestr run
+For any further questions, please email
+openstack-discuss@lists.openstack.org or join #openstack-ironic on
+OFTC.
TESTING.rst | 99
@@ -1,99 +0,0 @@
===========================
Testing Your OpenStack Code
===========================

------------
A Quickstart
------------

This is designed to be enough information for you to run your first tests.
Detailed information on testing can be found here: https://wiki.openstack.org/wiki/Testing

*Install pip*::

  $ [apt-get | yum] install python-pip

More information on pip here: http://www.pip-installer.org/en/latest/

*Use pip to install tox*::

  $ pip install tox

Run The Tests
-------------

*Navigate to the project's root directory and execute*::

  $ tox

Note: completing this command may take a long time (depends on system resources)
also, you might not see any output until tox is complete.

Information about tox can be found here: http://testrun.org/tox/latest/

Run The Tests in One Environment
--------------------------------

Tox will run your entire test suite in the environments specified in the project tox.ini::

  [tox]

  envlist = <list of available environments>

To run the test suite in just one of the environments in envlist execute::

  $ tox -e <env>

so for example, *run the test suite with the default OS version of Python 3*::

  $ tox -e py3

or select a specific version, *run the test suite using Python 3.6*::

  $ tox -e py36

Other useful tox options that can be specified when running the test suite are::

  -v to increase verbosity of the output, can be repeated up to three times
     based on the desired verbosity level

  -r to recreate the virtual environment from scratch

Run One Test
------------

To run individual tests with tox:

if testr is in tox.ini, for example::

  [testenv]

  includes "python setup.py testr --slowest --testr-args='{posargs}'"

run individual tests with the following syntax::

  $ tox -e <env> -- path.to.module:Class.test

so for example, *run the cpu_limited test in Nova*::

  $ tox -e py36 -- nova.tests.test_claims:ClaimTestCase.test_cpu_unlimited

if nose is in tox.ini, for example::

  [testenv]

  includes "nosetests {posargs}"

run individual tests with the following syntax::

  $ tox -e <env> -- --tests path.to.module:Class.test

so for example, *run the list test in Glance*::

  $ tox -e py36 -- --tests glance.tests.unit.test_auth.py:TestImageRepoProxy.test_list

Need More Info?
---------------

More information about testr: https://wiki.openstack.org/wiki/Testr

More information about tox: https://tox.readthedocs.io/en/latest/

More information about nose: https://nose.readthedocs.org/en/latest/

More information about testing OpenStack code can be found here:
https://wiki.openstack.org/wiki/Testing
@@ -1,7 +0,0 @@
# these are needed to compile Python dependencies from sources
python3-all-dev [platform:dpkg test]
python3-devel [(platform:rpm test)]
build-essential [platform:dpkg test]
libffi-dev [platform:dpkg test]
libffi-devel [platform:rpm test]
@@ -1,3 +0,0 @@
sphinx>=2.0.0 # BSD
openstackdocstheme>=2.2.1 # Apache-2.0
sphinxcontrib-apidoc>=0.2.0 # BSD
@@ -1,83 +0,0 @@
# -*- coding: utf-8 -*-
#

# -- General configuration ----------------------------------------------------

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = [
    'sphinx.ext.viewcode',
    'openstackdocstheme',
    'sphinxcontrib.apidoc'
]

wsme_protocols = ['restjson']

# autodoc generation is a bit aggressive and a nuisance when doing heavy
# text edit cycles.
# execute "export SPHINX_DEBUG=1" in your terminal to disable

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

# The suffix of source filenames.
source_suffix = '.rst'

# The master toctree document.
master_doc = 'index'

# General information about the project.
copyright = 'OpenStack Foundation'

# A list of ignored prefixes for module index sorting.
modindex_common_prefix = ['ironic_lib']

# If true, '()' will be appended to :func: etc. cross-reference text.
add_function_parentheses = True

# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
add_module_names = True

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'native'

# -- sphinxcontrib.apidoc configuration --------------------------------------

apidoc_module_dir = '../../ironic_lib'
apidoc_output_dir = 'reference/api'
apidoc_excluded_paths = [
    'tests',
]

# -- Options for HTML output --------------------------------------------------

# The theme to use for HTML and HTML Help pages. Major themes that come with
# Sphinx are currently 'default' and 'sphinxdoc'.
#html_theme_path = ["."]
#html_static_path = ['_static']

html_theme = 'openstackdocs'

# openstackdocstheme options
openstackdocs_repo_name = 'openstack/ironic-lib'
openstackdocs_pdf_link = True
openstackdocs_use_storyboard = True

# Output file base name for HTML help builder.
htmlhelp_basename = 'ironic-libdoc'

latex_use_xindy = False

# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title, author, documentclass
# [howto/manual]).
latex_documents = [
    (
        'index',
        'doc-ironic-lib.tex',
        'Ironic Lib Documentation',
        'OpenStack Foundation',
        'manual'
    ),
]
@@ -1,72 +0,0 @@
======================
Welcome to Ironic-lib!
======================

Overview
========

Ironic-lib is a library for use by projects under Bare Metal governance only.
This documentation is intended for developer use only. If you are looking for
documentation for deployers, please see the
`ironic documentation <https://docs.openstack.org/ironic/latest/>`_.

Metrics
=======

Ironic-lib provides a pluggable metrics library as of the 2.0.0 release.
Current provided backends are the default, 'noop', which discards all data,
and 'statsd', which emits metrics to a statsd daemon over the network. The
metrics backend to be used is configured via ``CONF.metrics.backend``. How
this configuration is set in practice may vary by project.

The typical usage of metrics is to initialize and cache a metrics logger,
using the `get_metrics_logger()` method in `ironic_lib.metrics_utils`, then
use that object to decorate functions or create context managers to gather
metrics. The general convention is to provide the name of the module as the
first argument to set it as the prefix, then set the actual metric name to the
method name. For example:

.. code-block:: python

  from ironic_lib import metrics_utils

  METRICS = metrics_utils.get_metrics_logger(__name__)

  @METRICS.timer('my_simple_method')
  def my_simple_method(arg, matey):
      pass

  def my_complex_method(arg, matey):
      with METRICS.timer('complex_method_pt_1'):
          do_some_work()

      with METRICS.timer('complex_method_pt_2'):
          do_more_work()

There are three different kinds of metrics:

- **Timers** measure how long the code in the decorated method or context
  manager takes to execute, and emits the value as a timer metric. These
  are useful for measuring performance of a given block of code.
- **Counters** increment a counter each time a decorated method or context
  manager is executed. These are useful for counting the number of times a
  method is called, or the number of times an event occurs.
- **Gauges** return the value of a decorated method as a metric. This is
  useful when you want to monitor the value returned by a method over time.

Additionally, metrics can be sent directly, rather than using a context
manager or decorator, when appropriate. When used in this way, ironic-lib will
simply emit the value provided as the requested metric type. For example:

.. code-block:: python

  from ironic_lib import metrics_utils

  METRICS = metrics_utils.get_metrics_logger(__name__)

  def my_node_failure_method(node):
      if node.failed:
          METRICS.send_counter(node.uuid, 1)

The provided statsd backend natively supports all three metric types. For more
information about how statsd changes behavior based on the metric type, see
`statsd metric types <https://github.com/etsy/statsd/blob/master/docs/metric_types.md>`_
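The removed documentation above describes the timer pattern: a cached metrics logger whose decorator measures a method's elapsed time under a prefixed metric name. A rough standalone sketch of that idea (this is not the ironic-lib implementation; `SimpleMetrics` and its attributes are hypothetical stand-ins):

```python
import time
from functools import wraps


class SimpleMetrics:
    """Hypothetical stand-in for a metrics logger: records timer values."""

    def __init__(self, prefix):
        self.prefix = prefix
        self.timings = {}  # metric name -> list of elapsed seconds

    def timer(self, name):
        """Decorator that times the wrapped function and records the result."""
        def decorator(func):
            @wraps(func)
            def wrapper(*args, **kwargs):
                start = time.monotonic()
                try:
                    return func(*args, **kwargs)
                finally:
                    elapsed = time.monotonic() - start
                    full_name = '%s.%s' % (self.prefix, name)
                    self.timings.setdefault(full_name, []).append(elapsed)
            return wrapper
        return decorator


METRICS = SimpleMetrics(__name__)


@METRICS.timer('my_simple_method')
def my_simple_method():
    time.sleep(0.01)


my_simple_method()
```

A real backend (such as statsd) would emit `elapsed` over the network instead of storing it in a dict, but the decorator shape is the same.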
@@ -1,17 +0,0 @@
========================
Ironic-lib Documentation
========================

Ironic-lib is a library for use by projects under Bare Metal governance only.
This documentation is intended for developer use only. If you are looking for
documentation for deployers, please see the
`ironic documentation <https://docs.openstack.org/ironic/latest/admin/>`_.

.. toctree::
   :maxdepth: 1

   Installation and Usage documentation <contributor/index>
   reference/index

* :ref:`genindex`
* :ref:`search`
@@ -1,8 +0,0 @@
===========================
Autogenerated API Reference
===========================

.. toctree::
   :maxdepth: 1

   api/modules.rst
@@ -1,5 +0,0 @@
# An ironic-lib.filters to be used with rootwrap command.
# This file should be owned by (and only-writeable by) the root user.

[Filters]
# Left for backward compatibility only, do not use.
@@ -1,5 +0,0 @@
# This file mirrors all extra requirements from setup.cfg and must be kept
# in sync. It is used both in unit tests and when building docs.
keystoneauth1>=4.2.0 # Apache-2.0
os-service-types>=1.2.0 # Apache-2.0
oslo.service!=1.28.1,>=1.24.0 # Apache-2.0
@@ -1,22 +0,0 @@
# Copyright 2011 OpenStack Foundation.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

# This ensures the ironic_lib namespace is defined
try:
    import pkg_resources
    pkg_resources.declare_namespace(__name__)
except ImportError:
    import pkgutil
    __path__ = pkgutil.extend_path(__path__, __name__)
@@ -1,203 +0,0 @@
# Copyright 2020 Red Hat, Inc.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import base64
import binascii
import logging

import bcrypt
import webob

from ironic_lib.common.i18n import _
from ironic_lib import exception

LOG = logging.getLogger(__name__)


class BasicAuthMiddleware(object):
    """Middleware which performs HTTP basic authentication on requests

    """
    def __init__(self, app, auth_file):
        self.app = app
        self.auth_file = auth_file
        validate_auth_file(auth_file)

    def format_exception(self, e):
        result = {'error': {'message': str(e), 'code': e.code}}
        headers = list(e.headers.items()) + [
            ('Content-Type', 'application/json')
        ]
        return webob.Response(content_type='application/json',
                              status_code=e.code,
                              json_body=result,
                              headerlist=headers)

    def __call__(self, env, start_response):

        try:
            token = parse_header(env)
            username, password = parse_token(token)
            env.update(authenticate(self.auth_file, username, password))

            return self.app(env, start_response)

        except exception.IronicException as e:
            response = self.format_exception(e)
            return response(env, start_response)


def authenticate(auth_file, username, password):
    """Finds username and password match in Apache style user auth file

    The user auth file format is expected to comply with Apache
    documentation[1] however the bcrypt password digest is the *only*
    digest format supported.

    [1] https://httpd.apache.org/docs/current/misc/password_encryptions.html

    :param: auth_file: Path to user auth file
    :param: username: Username to authenticate
    :param: password: Password encoded as bytes
    :returns: A dictionary of WSGI environment values to append to the request
    :raises: Unauthorized, if no file entries match supplied username/password
    """
    line_prefix = username + ':'
    try:
        with open(auth_file, 'r') as f:
            for line in f:
                entry = line.strip()
                if entry and entry.startswith(line_prefix):
                    return auth_entry(entry, password)
    except OSError as exc:
        LOG.error('Problem reading auth user file: %s', exc)
        raise exception.ConfigInvalid(
            error_msg=_('Problem reading auth user file'))

    # reached end of file with no matches
    LOG.info('User %s not found', username)
    unauthorized()


def auth_entry(entry, password):
    """Compare a password with a single user auth file entry

    :param: entry: Line from auth user file to use for authentication
    :param: password: Password encoded as bytes
    :returns: A dictionary of WSGI environment values to append to the request
    :raises: Unauthorized, if the entry doesn't match supplied password or
        if the entry is crypted with a method other than bcrypt
    """
    username, crypted = parse_entry(entry)

    if not bcrypt.checkpw(password, crypted):
        LOG.info('Password for %s does not match', username)
        unauthorized()

    return {
        'HTTP_X_USER': username,
        'HTTP_X_USER_NAME': username
    }


def validate_auth_file(auth_file):
    """Read the auth user file and validate its correctness

    :param: auth_file: Path to user auth file
    :raises: ConfigInvalid on validation error
    """
    try:
        with open(auth_file, 'r') as f:
            for line in f:
                entry = line.strip()
                if entry and ':' in entry:
                    parse_entry(entry)
    except OSError:
        raise exception.ConfigInvalid(
            error_msg=_('Problem reading auth user file: %s') % auth_file)


def parse_entry(entry):
    """Extract the username and crypted password from a user auth file entry

    :param: entry: Line from auth user file to use for authentication
    :returns: a tuple of username and crypted password
    :raises: ConfigInvalid if the password is not in the supported bcrypt
        format
    """
    username, crypted_str = entry.split(':', maxsplit=1)
    crypted = crypted_str.encode('utf-8')

    if crypted[:4] not in (b'$2y$', b'$2a$', b'$2b$'):
        error_msg = _('Only bcrypt digested passwords are supported for '
                      '%(username)s') % {'username': username}
        raise exception.ConfigInvalid(error_msg=error_msg)
    return username, crypted


def parse_token(token):
    """Parse the token portion of the Authentication header value

    :param: token: Token value from basic authorization header
    :returns: tuple of username, password
    :raises: Unauthorized, if username and password could not be parsed for any
        reason
    """
    try:
        if isinstance(token, str):
            token = token.encode('utf-8')
        auth_pair = base64.b64decode(token, validate=True)
        (username, password) = auth_pair.split(b':', maxsplit=1)

        return (username.decode('utf-8'), password)
    except (TypeError, binascii.Error, ValueError) as exc:
        LOG.info('Could not decode authorization token: %s', exc)
        raise exception.BadRequest(_('Could not decode authorization token'))


def parse_header(env):
    """Parse WSGI environment for Authorization header of type Basic

    :param: env: WSGI environment to get header from
    :returns: Token portion of the header value
    :raises: Unauthorized, if header is missing or if the type is not Basic
    """
    try:
        auth_header = env.pop('HTTP_AUTHORIZATION')
    except KeyError:
        LOG.info('No authorization token received')
        unauthorized(_('Authorization required'))
    try:
        auth_type, token = auth_header.strip().split(maxsplit=1)
    except (ValueError, AttributeError) as exc:
        LOG.info('Could not parse Authorization header: %s', exc)
        raise exception.BadRequest(_('Could not parse Authorization header'))

    if auth_type.lower() != 'basic':
        msg = _('Unsupported authorization type "%s"') % auth_type
        LOG.info(msg)
        raise exception.BadRequest(msg)
    return token


def unauthorized(message=None):
    """Raise an Unauthorized exception to prompt for basic authentication
|
||||
|
||||
:param: message: Optional message for esception
|
||||
:raises: Unauthorized with WWW-Authenticate header set
|
||||
"""
|
||||
if not message:
|
||||
message = _('Incorrect username or password')
|
||||
raise exception.Unauthorized(message)
|
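The token-decoding step of the removed module can be exercised without ironic-lib itself. The following is a minimal standard-library sketch of `parse_token`: it keeps the base64 and split logic shown above, but substitutes a plain `ValueError` for the ironic-specific `BadRequest` exception (an assumption of this sketch, not the library's behaviour):

```python
import base64
import binascii


def parse_token(token):
    # Decode the base64-encoded "username:password" pair of a Basic auth
    # token, mirroring the removed ironic_lib.auth_basic.parse_token.
    if isinstance(token, str):
        token = token.encode('utf-8')
    try:
        auth_pair = base64.b64decode(token, validate=True)
        # Only split on the first colon: passwords may contain colons.
        username, password = auth_pair.split(b':', maxsplit=1)
        return username.decode('utf-8'), password
    except (TypeError, binascii.Error, ValueError):
        # Stand-in for exception.BadRequest in the real module.
        raise ValueError('Could not decode authorization token')


token = base64.b64encode(b'myuser:s3cret').decode('ascii')
print(parse_token(token))  # ('myuser', b's3cret')
```

Note that the password is returned as bytes: it is later fed directly to `bcrypt.checkpw`, which expects bytes.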
@ -1,22 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from oslo_config import types


class Octal(types.Integer):

    def __call__(self, value):
        if isinstance(value, int):
            return value
        else:
            return int(str(value), 8)
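Stripped of the oslo.config plumbing, the conversion the `Octal` type performs is plain base-8 integer parsing; a standalone sketch (the function name `octal` is hypothetical, introduced only for illustration):

```python
def octal(value):
    # Interpret a config value as an octal integer, as the removed Octal
    # oslo_config type does: ints pass through, strings parse in base 8.
    if isinstance(value, int):
        return value
    return int(str(value), 8)


print(octal('600'))   # 384, i.e. 0o600
print(octal(0o600))   # 384, already an int, passed through
```

This is what lets an operator write `unix_socket_mode = 600` in the config file and have it interpreted as the file mode `0o600` rather than decimal 600.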
@ -1,21 +0,0 @@
# Copyright (c) 2014 Hewlett-Packard Development Company, L.P.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import oslo_i18n as i18n

_translators = i18n.TranslatorFactory(domain='ironic-lib')

# The primary translation function using the well-known name "_"
_ = _translators.primary
@ -1,201 +0,0 @@
# Copyright 2010 United States Government as represented by the
# Administrator of the National Aeronautics and Space Administration.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""Ironic base exception handling.

Includes decorator for re-raising Ironic-type exceptions.

SHOULD include dedicated exception logging.

"""

import collections
from http import client as http_client
import json
import logging

from oslo_config import cfg
from oslo_utils import excutils

from ironic_lib.common.i18n import _


LOG = logging.getLogger(__name__)

exc_log_opts = [
    cfg.BoolOpt('fatal_exception_format_errors',
                default=False,
                help=_('Used if there is a formatting error when generating '
                       'an exception message (a programming error). If True, '
                       'raise an exception; if False, use the unformatted '
                       'message.'),
                deprecated_group='DEFAULT'),
]

CONF = cfg.CONF
CONF.register_opts(exc_log_opts, group='ironic_lib')


def list_opts():
    """Entry point for oslo-config-generator."""
    return [('ironic_lib', exc_log_opts)]


def _ensure_exception_kwargs_serializable(exc_class_name, kwargs):
    """Ensure that kwargs are serializable

    Ensure that all kwargs passed to the exception constructor can be passed
    over RPC, by trying to convert them to JSON, or, as a last resort, to
    string. If that is not possible, unserializable kwargs will be removed,
    letting the receiver handle the exception string as it is configured to.

    :param exc_class_name: an IronicException class name.
    :param kwargs: a dictionary of keyword arguments passed to the exception
        constructor.
    :returns: a dictionary of serializable keyword arguments.
    """
    serializers = [(json.dumps, _('when converting to JSON')),
                   (str, _('when converting to string'))]
    exceptions = collections.defaultdict(list)
    serializable_kwargs = {}
    for k, v in kwargs.items():
        for serializer, msg in serializers:
            try:
                serializable_kwargs[k] = serializer(v)
                exceptions.pop(k, None)
                break
            except Exception as e:
                exceptions[k].append(
                    '(%(serializer_type)s) %(e_type)s: %(e_contents)s' %
                    {'serializer_type': msg, 'e_contents': e,
                     'e_type': e.__class__.__name__})
    if exceptions:
        LOG.error("One or more arguments passed to the %(exc_class)s "
                  "constructor as kwargs can not be serialized. The "
                  "serialized arguments: %(serialized)s. These "
                  "unserialized kwargs were dropped because of the "
                  "exceptions encountered during their "
                  "serialization:\n%(errors)s",
                  dict(errors=';\n'.join("%s: %s" % (k, '; '.join(v))
                                         for k, v in exceptions.items()),
                       exc_class=exc_class_name,
                       serialized=serializable_kwargs))
        # We might be able to actually put the following keys' values into
        # the format string, but there is no guarantee, drop it just in case.
        for k in exceptions:
            del kwargs[k]
    return serializable_kwargs


class IronicException(Exception):
    """Base Ironic Exception

    To correctly use this class, inherit from it and define
    a '_msg_fmt' property. That _msg_fmt will get printf'd
    with the keyword arguments provided to the constructor.

    If you need to access the message from an exception you should use
    str(exc)

    """

    _msg_fmt = _("An unknown exception occurred.")
    code = 500
    headers = {}
    safe = False

    def __init__(self, message=None, **kwargs):
        self.kwargs = _ensure_exception_kwargs_serializable(
            self.__class__.__name__, kwargs)

        if 'code' not in self.kwargs:
            try:
                self.kwargs['code'] = self.code
            except AttributeError:
                pass
        else:
            self.code = int(kwargs['code'])

        if not message:
            try:
                message = self._msg_fmt % kwargs

            except Exception:
                with excutils.save_and_reraise_exception() as ctxt:
                    # kwargs doesn't match a variable in the message
                    # log the issue and the kwargs
                    prs = ', '.join('%s=%s' % pair for pair in kwargs.items())
                    LOG.exception('Exception in string format operation '
                                  '(arguments %s)', prs)
                    if not CONF.ironic_lib.fatal_exception_format_errors:
                        # at least get the core message out if something
                        # happened
                        message = self._msg_fmt
                        ctxt.reraise = False

        super(IronicException, self).__init__(message)


class InstanceDeployFailure(IronicException):
    _msg_fmt = _("Failed to deploy instance: %(reason)s")


class FileSystemNotSupported(IronicException):
    _msg_fmt = _("Failed to create a file system. "
                 "File system %(fs)s is not supported.")


class InvalidMetricConfig(IronicException):
    _msg_fmt = _("Invalid value for metrics config option: %(reason)s")


class ServiceLookupFailure(IronicException):
    _msg_fmt = _("Cannot find %(service)s service through multicast")


class ServiceRegistrationFailure(IronicException):
    _msg_fmt = _("Cannot register %(service)s service: %(error)s")


class BadRequest(IronicException):
    code = http_client.BAD_REQUEST


class Unauthorized(IronicException):
    code = http_client.UNAUTHORIZED
    headers = {'WWW-Authenticate': 'Basic realm="Baremetal API"'}


class ConfigInvalid(IronicException):
    _msg_fmt = _("Invalid configuration file. %(error_msg)s")


class CatalogNotFound(IronicException):
    _msg_fmt = _("Service type %(service_type)s with endpoint type "
                 "%(endpoint_type)s not found in keystone service catalog.")


class KeystoneUnauthorized(IronicException):
    _msg_fmt = _("Not authorized in Keystone.")


class KeystoneFailure(IronicException):
    pass


class MetricsNotSupported(IronicException):
    _msg_fmt = _("Metrics action is not supported. You may need to "
                 "adjust the [metrics] section in ironic.conf.")
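The `_msg_fmt` pattern above — subclasses declare a printf-style template, and keyword arguments are interpolated at raise time, falling back to the raw template on a formatting error — can be sketched without the oslo dependencies. `BaseError` below is a hypothetical stand-in for `IronicException` (no kwargs serialization, no config-driven reraise):

```python
class BaseError(Exception):
    # Minimal sketch of the IronicException pattern: subclasses set
    # _msg_fmt, and kwargs are %-interpolated into it at raise time.
    _msg_fmt = 'An unknown exception occurred.'
    code = 500

    def __init__(self, message=None, **kwargs):
        kwargs.setdefault('code', self.code)
        if not message:
            try:
                message = self._msg_fmt % kwargs
            except Exception:
                # Mirrors fatal_exception_format_errors=False: fall back
                # to the unformatted template instead of raising.
                message = self._msg_fmt
        super().__init__(message)


class InstanceDeployFailure(BaseError):
    _msg_fmt = 'Failed to deploy instance: %(reason)s'


print(str(InstanceDeployFailure(reason='boom')))
# Failed to deploy instance: boom
```

Because %-formatting with a dict ignores extra keys, passing surplus kwargs is harmless, while a *missing* key raises `KeyError` and triggers the fallback path.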
@ -1,82 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from oslo_config import cfg

from ironic_lib.common import config
from ironic_lib.common.i18n import _
from ironic_lib import keystone


CONF = cfg.CONF

opts = [
    cfg.StrOpt('auth_strategy',
               choices=[('noauth', _('no authentication')),
                        ('keystone', _('use the Identity service for '
                                       'authentication')),
                        ('http_basic', _('HTTP basic authentication'))],
               help=_('Authentication strategy used by JSON RPC. Defaults to '
                      'the global auth_strategy setting.')),
    cfg.StrOpt('http_basic_auth_user_file',
               default='/etc/ironic/htpasswd-json-rpc',
               help=_('Path to Apache format user authentication file used '
                      'when auth_strategy=http_basic')),
    cfg.HostAddressOpt('host_ip',
                       default='::',
                       help=_('The IP address or hostname on which JSON RPC '
                              'will listen.')),
    cfg.PortOpt('port',
                default=8089,
                help=_('The port to use for JSON RPC')),
    cfg.BoolOpt('use_ssl',
                default=False,
                help=_('Whether to use TLS for JSON RPC')),
    cfg.StrOpt('http_basic_username',
               deprecated_for_removal=True,
               deprecated_reason=_("Use username instead"),
               help=_("Name of the user to use for HTTP Basic authentication "
                      "client requests.")),
    cfg.StrOpt('http_basic_password',
               deprecated_for_removal=True,
               deprecated_reason=_("Use password instead"),
               secret=True,
               help=_("Password to use for HTTP Basic authentication "
                      "client requests.")),
    cfg.ListOpt('allowed_roles',
                default=['admin'],
                help=_("List of roles allowed to use JSON RPC")),
    cfg.StrOpt('unix_socket',
               help=_('Unix socket to listen on. Disables host_ip and '
                      'port.')),
    cfg.Opt('unix_socket_mode', type=config.Octal(),
            help=_('File mode (an octal number) of the unix socket to '
                   'listen on. Ignored if unix_socket is not set.')),
]


def register_opts(conf):
    conf.register_opts(opts, group='json_rpc')
    keystone.register_auth_opts(conf, 'json_rpc')
    conf.set_default('timeout', 120, group='json_rpc')


register_opts(CONF)


def list_opts():
    return [('json_rpc', opts + keystone.add_auth_opts([]))]


def auth_strategy():
    # NOTE(dtantsur): this expects [DEFAULT]auth_strategy to be provided by
    # the service configuration.
    return CONF.json_rpc.auth_strategy or CONF.auth_strategy
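The `auth_strategy()` helper is just a two-level fallback: the `[json_rpc]` group wins if set, otherwise the service-wide `[DEFAULT]auth_strategy` applies. Extracted from the oslo.config wiring, the logic is (function signature here is hypothetical, for illustration only):

```python
def auth_strategy(json_rpc_strategy, global_strategy):
    # Prefer the [json_rpc]-specific setting; fall back to the
    # service-wide [DEFAULT]auth_strategy when it is unset (None).
    return json_rpc_strategy or global_strategy


print(auth_strategy(None, 'keystone'))         # keystone
print(auth_strategy('http_basic', 'keystone'))  # http_basic
```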
@ -1,242 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""A simple JSON RPC client.

This client is compatible with any JSON RPC 2.0 implementation, including ours.
"""

import logging

from oslo_config import cfg
from oslo_utils import importutils
from oslo_utils import netutils
from oslo_utils import strutils
from oslo_utils import uuidutils

from ironic_lib.common.i18n import _
from ironic_lib import exception
from ironic_lib import json_rpc
from ironic_lib import keystone


CONF = cfg.CONF
LOG = logging.getLogger(__name__)
_SESSION = None


def _get_session():
    global _SESSION

    if _SESSION is None:
        kwargs = {}
        auth_strategy = json_rpc.auth_strategy()
        if auth_strategy != 'keystone':
            auth_type = 'none' if auth_strategy == 'noauth' else auth_strategy
            CONF.set_default('auth_type', auth_type, group='json_rpc')

            # Deprecated, remove in W
            if auth_strategy == 'http_basic':
                if CONF.json_rpc.http_basic_username:
                    kwargs['username'] = CONF.json_rpc.http_basic_username
                if CONF.json_rpc.http_basic_password:
                    kwargs['password'] = CONF.json_rpc.http_basic_password

        auth = keystone.get_auth('json_rpc', **kwargs)

        session = keystone.get_session('json_rpc', auth=auth)
        headers = {
            'Content-Type': 'application/json'
        }

        # Adds options like connect_retries
        _SESSION = keystone.get_adapter('json_rpc', session=session,
                                        additional_headers=headers)

    return _SESSION


class Client(object):
    """JSON RPC client with ironic exception handling."""

    allowed_exception_namespaces = [
        "ironic_lib.exception.",
        "ironic.common.exception.",
        "ironic_inspector.utils.",
    ]

    def __init__(self, serializer, version_cap=None):
        self.serializer = serializer
        self.version_cap = version_cap

    def can_send_version(self, version):
        return _can_send_version(version, self.version_cap)

    def prepare(self, topic, version=None):
        """Prepare the client to transmit a request.

        :param topic: Topic which is being addressed. Typically this
            is the hostname of the remote json-rpc service.
        :param version: The RPC API version to utilize.
        """
        host = topic.split('.', 1)[1]
        host, port = netutils.parse_host_port(host)
        return _CallContext(
            host, self.serializer, version=version,
            version_cap=self.version_cap,
            allowed_exception_namespaces=self.allowed_exception_namespaces,
            port=port)


class _CallContext(object):
    """Wrapper object for compatibility with oslo.messaging API."""

    def __init__(self, host, serializer, version=None, version_cap=None,
                 allowed_exception_namespaces=(), port=None):
        if not port:
            self.port = CONF.json_rpc.port
        else:
            self.port = int(port)
        self.host = host
        self.serializer = serializer
        self.version = version
        self.version_cap = version_cap
        self.allowed_exception_namespaces = allowed_exception_namespaces

    def _is_known_exception(self, class_name):
        for ns in self.allowed_exception_namespaces:
            if class_name.startswith(ns):
                return True
        return False

    def _handle_error(self, error):
        if not error:
            return

        message = error['message']
        try:
            cls = error['data']['class']
        except KeyError:
            LOG.error("Unexpected error from RPC: %s", error)
            raise exception.IronicException(
                _("Unexpected error raised by RPC"))
        else:
            if not self._is_known_exception(cls):
                # NOTE(dtantsur): protect against arbitrary code execution
                LOG.error("Unexpected error from RPC: %s", error)
                raise exception.IronicException(
                    _("Unexpected error raised by RPC"))
            raise importutils.import_object(cls, message,
                                            code=error.get('code', 500))

    def call(self, context, method, version=None, **kwargs):
        """Call conductor RPC.

        Versioned objects are automatically serialized and deserialized.

        :param context: Security context.
        :param method: Method name.
        :param version: RPC API version to use.
        :param kwargs: Keyword arguments to pass.
        :return: RPC result (if any).
        """
        return self._request(context, method, cast=False, version=version,
                             **kwargs)

    def cast(self, context, method, version=None, **kwargs):
        """Call conductor RPC asynchronously.

        Versioned objects are automatically serialized and deserialized.

        :param context: Security context.
        :param method: Method name.
        :param version: RPC API version to use.
        :param kwargs: Keyword arguments to pass.
        :return: None
        """
        return self._request(context, method, cast=True, version=version,
                             **kwargs)

    def _request(self, context, method, cast=False, version=None, **kwargs):
        """Call conductor RPC.

        Versioned objects are automatically serialized and deserialized.

        :param context: Security context.
        :param method: Method name.
        :param cast: If true, use a JSON RPC notification.
        :param version: RPC API version to use.
        :param kwargs: Keyword arguments to pass.
        :return: RPC result (if any).
        """
        params = {key: self.serializer.serialize_entity(context, value)
                  for key, value in kwargs.items()}
        params['context'] = context.to_dict()

        if version is None:
            version = self.version
        if version is not None:
            _check_version(version, self.version_cap)
            params['rpc.version'] = version

        body = {
            "jsonrpc": "2.0",
            "method": method,
            "params": params,
        }
        if not cast:
            body['id'] = (getattr(context, 'request_id', None)
                          or uuidutils.generate_uuid())

        scheme = 'http'
        if CONF.json_rpc.use_ssl:
            scheme = 'https'
        url = '%s://%s:%d' % (scheme,
                              netutils.escape_ipv6(self.host),
                              self.port)
        LOG.debug("RPC %s to %s with %s", method, url,
                  strutils.mask_dict_password(body))
        try:
            result = _get_session().post(url, json=body)
        except Exception as exc:
            LOG.debug('RPC %s to %s failed with %s', method, url, exc)
            raise
        LOG.debug('RPC %s to %s returned %s', method, url,
                  strutils.mask_password(result.text or '<None>'))
        if not cast:
            result = result.json()
            self._handle_error(result.get('error'))
            result = self.serializer.deserialize_entity(context,
                                                        result['result'])
            return result


def _can_send_version(requested, version_cap):
    if requested is None or version_cap is None:
        return True

    requested_parts = [int(item) for item in requested.split('.', 1)]
    version_cap_parts = [int(item) for item in version_cap.split('.', 1)]

    if requested_parts[0] != version_cap_parts[0]:
        return False  # major version mismatch
    else:
        return requested_parts[1] <= version_cap_parts[1]


def _check_version(requested, version_cap):
    if not _can_send_version(requested, version_cap):
        raise RuntimeError(_("Cannot send RPC request: requested version "
                             "%(requested)s, maximum allowed version is "
                             "%(version_cap)s") % {'requested': requested,
                                                   'version_cap': version_cap})
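The version-negotiation rule in `_can_send_version` above is worth spelling out: major versions must match exactly, and the requested minor version must not exceed the cap. A self-contained restatement of that logic:

```python
def can_send_version(requested, version_cap):
    # Version negotiation as in the removed client: no cap (or no
    # request) means anything goes; otherwise majors must match exactly
    # and the requested minor must not exceed the capped minor.
    if requested is None or version_cap is None:
        return True
    req_major, req_minor = (int(p) for p in requested.split('.', 1))
    cap_major, cap_minor = (int(p) for p in version_cap.split('.', 1))
    if req_major != cap_major:
        return False  # major version mismatch
    return req_minor <= cap_minor


print(can_send_version('1.4', '1.9'))   # True
print(can_send_version('1.10', '1.9'))  # False
print(can_send_version('2.0', '1.9'))   # False
```

Note `split('.', 1)` compares only major and minor: a cap of `'1.9'` rejects `'1.10'` because 10 > 9, while either side being `None` disables the check entirely.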
@ -1,292 +0,0 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
"""Implementation of JSON RPC for communication between API and conductors.
|
||||
|
||||
This module implementa a subset of JSON RPC 2.0 as defined in
|
||||
https://www.jsonrpc.org/specification. Main differences:
|
||||
* No support for batched requests.
|
||||
* No support for positional arguments passing.
|
||||
* No JSON RPC 1.0 fallback.
|
||||
"""
|
||||
|
||||
import json
|
||||
import logging
|
||||
|
||||
try:
|
||||
from keystonemiddleware import auth_token
|
||||
except ImportError:
|
||||
auth_token = None
|
||||
from oslo_config import cfg
|
||||
try:
|
||||
import oslo_messaging
|
||||
except ImportError:
|
||||
oslo_messaging = None
|
||||
from oslo_utils import strutils
|
||||
import webob
|
||||
|
||||
from ironic_lib import auth_basic
|
||||
from ironic_lib.common.i18n import _
|
||||
from ironic_lib import exception
|
||||
from ironic_lib import json_rpc
|
||||
from ironic_lib import wsgi
|
||||
|
||||
|
||||
CONF = cfg.CONF
|
||||
LOG = logging.getLogger(__name__)
|
||||
_DENY_LIST = {'init_host', 'del_host', 'target', 'iter_nodes'}
|
||||
|
||||
|
||||
def _build_method_map(manager):
|
||||
"""Build mapping from method names to their bodies.
|
||||
|
||||
:param manager: A conductor manager.
|
||||
:return: dict with mapping
|
||||
"""
|
||||
result = {}
|
||||
for method in dir(manager):
|
||||
if method.startswith('_') or method in _DENY_LIST:
|
||||
continue
|
||||
func = getattr(manager, method)
|
||||
if not callable(func):
|
||||
continue
|
||||
LOG.debug('Adding RPC method %s', method)
|
||||
result[method] = func
|
||||
return result
|
||||
|
||||
|
||||
class JsonRpcError(exception.IronicException):
|
||||
pass
|
||||
|
||||
|
||||
class ParseError(JsonRpcError):
|
||||
code = -32700
|
||||
_msg_fmt = _("Invalid JSON received by RPC server")
|
||||
|
||||
|
||||
class InvalidRequest(JsonRpcError):
|
||||
code = -32600
|
||||
_msg_fmt = _("Invalid request object received by RPC server")
|
||||
|
||||
|
||||
class MethodNotFound(JsonRpcError):
|
||||
code = -32601
|
||||
_msg_fmt = _("Method %(name)s was not found")
|
||||
|
||||
|
||||
class InvalidParams(JsonRpcError):
|
||||
code = -32602
|
||||
_msg_fmt = _("Params %(params)s are invalid for %(method)s: %(error)s")
|
||||
|
||||
|
||||
class EmptyContext:
|
||||
|
||||
request_id = None
|
||||
|
||||
def __init__(self, src):
|
||||
self.__dict__.update(src)
|
||||
|
||||
def to_dict(self):
|
||||
return self.__dict__.copy()
|
||||
|
||||
|
||||
class WSGIService(wsgi.WSGIService):
|
||||
"""Provides ability to launch JSON RPC as a WSGI application."""
|
||||
|
||||
def __init__(self, manager, serializer, context_class=EmptyContext):
|
||||
"""Create a JSON RPC service.
|
||||
|
||||
:param manager: Object from which to expose methods.
|
||||
:param serializer: A serializer that supports calls serialize_entity
|
||||
and deserialize_entity.
|
||||
:param context_class: A context class - a callable accepting a dict
|
||||
received from network.
|
||||
"""
|
||||
self.manager = manager
|
||||
self.serializer = serializer
|
||||
self.context_class = context_class
|
||||
self._method_map = _build_method_map(manager)
|
||||
auth_strategy = json_rpc.auth_strategy()
|
||||
if auth_strategy == 'keystone':
|
||||
conf = dict(CONF.keystone_authtoken)
|
||||
if auth_token is None:
|
||||
raise exception.ConfigInvalid(
|
||||
_("keystonemiddleware is required for keystone "
|
||||
"authentication"))
|
||||
app = auth_token.AuthProtocol(self._application, conf)
|
||||
elif auth_strategy == 'http_basic':
|
||||
app = auth_basic.BasicAuthMiddleware(
|
||||
self._application,
|
||||
cfg.CONF.json_rpc.http_basic_auth_user_file)
|
||||
else:
|
||||
app = self._application
|
||||
super().__init__('ironic-json-rpc', app, CONF.json_rpc)
|
||||
|
||||
def _application(self, environment, start_response):
|
||||
"""WSGI application for conductor JSON RPC."""
|
||||
request = webob.Request(environment)
|
||||
if request.method != 'POST':
|
||||
body = {'error': {'code': 405,
|
||||
'message': _('Only POST method can be used')}}
|
||||
return webob.Response(status_code=405, json_body=body)(
|
||||
environment, start_response)
|
||||
|
||||
if json_rpc.auth_strategy() == 'keystone':
|
||||
roles = (request.headers.get('X-Roles') or '').split(',')
|
||||
allowed_roles = CONF.json_rpc.allowed_roles
|
||||
if set(roles).isdisjoint(allowed_roles):
|
||||
LOG.debug('Roles %s do not contain any of %s, rejecting '
|
||||
'request', roles, allowed_roles)
|
||||
body = {'error': {'code': 403, 'message': _('Forbidden')}}
|
||||
return webob.Response(status_code=403, json_body=body)(
|
||||
environment, start_response)
|
||||
|
||||
result = self._call(request)
|
||||
if result is not None:
|
||||
response = webob.Response(content_type='application/json',
|
||||
charset='UTF-8',
|
||||
json_body=result)
|
||||
else:
|
||||
response = webob.Response(status_code=204)
|
||||
return response(environment, start_response)
|
||||
|
||||
def _handle_error(self, exc, request_id=None):
|
||||
"""Generate a JSON RPC 2.0 error body.
|
||||
|
||||
:param exc: Exception object.
|
||||
:param request_id: ID of the request (if any).
|
||||
:return: dict with response body
|
||||
"""
|
||||
if (oslo_messaging is not None
|
||||
and isinstance(exc, oslo_messaging.ExpectedException)):
|
||||
exc = exc.exc_info[1]
|
||||
|
||||
expected = isinstance(exc, exception.IronicException)
|
||||
cls = exc.__class__
|
||||
if expected:
|
||||
LOG.debug('RPC error %s: %s', cls.__name__, exc)
|
||||
else:
|
||||
LOG.exception('Unexpected RPC exception %s', cls.__name__)
|
||||
|
||||
response = {
|
||||
"jsonrpc": "2.0",
|
||||
"id": request_id,
|
||||
"error": {
|
||||
"code": getattr(exc, 'code', 500),
|
||||
"message": str(exc),
|
||||
}
|
||||
}
|
||||
if expected and not isinstance(exc, JsonRpcError):
|
||||
# Allow de-serializing the correct class for expected errors.
|
||||
response['error']['data'] = {
|
||||
'class': '%s.%s' % (cls.__module__, cls.__name__)
|
||||
}
|
||||
return response
|
||||
|
||||
def _call(self, request):
|
||||
"""Process a JSON RPC request.
|
||||
|
||||
:param request: ``webob.Request`` object.
|
||||
:return: dict with response body.
|
||||
"""
|
||||
request_id = None
|
||||
try:
|
||||
try:
|
||||
body = json.loads(request.text)
|
||||
except ValueError:
|
||||
LOG.error('Cannot parse JSON RPC request as JSON')
|
||||
                raise ParseError()

            if not isinstance(body, dict):
                LOG.error('JSON RPC request %s is not an object (batched '
                          'requests are not supported)', body)
                raise InvalidRequest()

            request_id = body.get('id')
            params = body.get('params', {})

            if (body.get('jsonrpc') != '2.0'
                    or not body.get('method')
                    or not isinstance(params, dict)):
                LOG.error('JSON RPC request %s is invalid', body)
                raise InvalidRequest()
        except Exception as exc:
            # We do not treat malformed requests as notifications and return
            # a response even when request_id is None. This seems in agreement
            # with the examples in the specification.
            return self._handle_error(exc, request_id)

        try:
            method = body['method']
            try:
                func = self._method_map[method]
            except KeyError:
                raise MethodNotFound(name=method)

            result = self._handle_requests(func, method, params)
            if request_id is not None:
                return {
                    "jsonrpc": "2.0",
                    "result": result,
                    "id": request_id
                }
        except Exception as exc:
            result = self._handle_error(exc, request_id)
            # We treat correctly formed requests without "id" as notifications
            # and do not return any errors.
            if request_id is not None:
                return result

    def _handle_requests(self, func, name, params):
        """Convert arguments and call a method.

        :param func: Callable object.
        :param name: RPC call name for logging.
        :param params: Keyword arguments.
        :return: call result as JSON.
        """
        # TODO(dtantsur): server-side version check?
        params.pop('rpc.version', None)
        logged_params = strutils.mask_dict_password(params)

        try:
            context = params.pop('context')
        except KeyError:
            context = None
        else:
            # A valid context is required for deserialization
            if not isinstance(context, dict):
                raise InvalidParams(
                    _("Context must be a dictionary, if provided"))

            context = self.context_class(context)
            params = {key: self.serializer.deserialize_entity(context, value)
                      for key, value in params.items()}
            params['context'] = context

        LOG.debug('RPC %s with %s', name, logged_params)
        try:
            result = func(**params)
        # FIXME(dtantsur): we could use the inspect module, but
        # oslo_messaging.expected_exceptions messes up signatures.
        except TypeError as exc:
            raise InvalidParams(params=', '.join(params),
                                method=name, error=exc)

        if context is not None:
            # Currently it seems that we can serialize even with invalid
            # context, but I'm not sure it's guaranteed to be the case.
            result = self.serializer.serialize_entity(context, result)
        LOG.debug('RPC %s returned %s', name,
                  strutils.mask_dict_password(result)
                  if isinstance(result, dict) else result)
        return result
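The validation rules enforced above (a top-level object, `jsonrpc` equal to `"2.0"`, a non-empty `method`, `params` as a dict, batches rejected) can be sketched as a standalone predicate. `is_valid_request` is an illustrative helper, not part of the actual server class:

```python
def is_valid_request(body):
    # Mirrors the server's checks: the request must be a single JSON object
    # (batched requests are not supported) with jsonrpc "2.0", a non-empty
    # method, and params as a dictionary.
    if not isinstance(body, dict):
        return False
    params = body.get('params', {})
    return (body.get('jsonrpc') == '2.0'
            and bool(body.get('method'))
            and isinstance(params, dict))


good = {'jsonrpc': '2.0', 'method': 'do_node_deploy',
        'params': {'node_id': '1234'}, 'id': 1}
print(is_valid_request(good))    # → True
print(is_valid_request([good]))  # batch → False
```

Note that the server reports `InvalidRequest` errors even for requests without an `id`, since a malformed body cannot be reliably classified as a notification.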
@ -1,201 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""Central place for handling Keystone authorization and service lookup."""

import copy
import functools
import logging

from keystoneauth1 import exceptions as ks_exception
from keystoneauth1 import loading as ks_loading
from keystoneauth1 import service_token
from keystoneauth1 import token_endpoint
import os_service_types
from oslo_config import cfg

from ironic_lib import exception


CONF = cfg.CONF
LOG = logging.getLogger(__name__)

DEFAULT_VALID_INTERFACES = ['internal', 'public']


def ks_exceptions(f):
    """Wraps keystoneclient functions and centralizes exception handling."""
    @functools.wraps(f)
    def wrapper(group, *args, **kwargs):
        try:
            return f(group, *args, **kwargs)
        except ks_exception.EndpointNotFound:
            service_type = kwargs.get(
                'service_type',
                getattr(getattr(CONF, group), 'service_type', group))
            endpoint_type = kwargs.get('endpoint_type', 'internal')
            raise exception.CatalogNotFound(
                service_type=service_type, endpoint_type=endpoint_type)
        except (ks_exception.Unauthorized, ks_exception.AuthorizationFailure):
            raise exception.KeystoneUnauthorized()
        except (ks_exception.NoMatchingPlugin,
                ks_exception.MissingRequiredOptions) as e:
            raise exception.ConfigInvalid(str(e))
        except Exception as e:
            LOG.exception('Keystone request failed with unexpected exception')
            raise exception.KeystoneFailure(str(e))
    return wrapper


@ks_exceptions
def get_session(group, **session_kwargs):
    """Loads session object from options in a configuration file section.

    The session_kwargs will be passed directly to keystoneauth1 Session
    and will override the values loaded from config.
    Consult keystoneauth1 docs for available options.

    :param group: name of the config section to load session options from

    """
    return ks_loading.load_session_from_conf_options(
        CONF, group, **session_kwargs)


@ks_exceptions
def get_auth(group, **auth_kwargs):
    """Loads auth plugin from options in a configuration file section.

    The auth_kwargs will be passed directly to keystoneauth1 auth plugin
    and will override the values loaded from config.
    Note that the accepted kwargs will depend on auth plugin type as defined
    by [group]auth_type option.
    Consult keystoneauth1 docs for available auth plugins and their options.

    :param group: name of the config section to load auth plugin options from

    """
    try:
        auth = ks_loading.load_auth_from_conf_options(CONF, group,
                                                      **auth_kwargs)
    except ks_exception.MissingRequiredOptions:
        LOG.error('Failed to load auth plugin from group %s', group)
        raise
    return auth


@ks_exceptions
def get_adapter(group, **adapter_kwargs):
    """Loads adapter from options in a configuration file section.

    The adapter_kwargs will be passed directly to keystoneauth1 Adapter
    and will override the values loaded from config.
    Consult keystoneauth1 docs for available adapter options.

    :param group: name of the config section to load adapter options from

    """
    return ks_loading.load_adapter_from_conf_options(CONF, group,
                                                     **adapter_kwargs)


def get_endpoint(group, **adapter_kwargs):
    """Get an endpoint from an adapter.

    The adapter_kwargs will be passed directly to keystoneauth1 Adapter
    and will override the values loaded from config.
    Consult keystoneauth1 docs for available adapter options.

    :param group: name of the config section to load adapter options from
    :raises: CatalogNotFound if the endpoint is not found
    """
    result = get_adapter(group, **adapter_kwargs).get_endpoint()
    if not result:
        service_type = adapter_kwargs.get(
            'service_type',
            getattr(getattr(CONF, group), 'service_type', group))
        endpoint_type = adapter_kwargs.get('endpoint_type', 'internal')
        raise exception.CatalogNotFound(
            service_type=service_type, endpoint_type=endpoint_type)
    return result


def get_service_auth(context, endpoint, service_auth):
    """Create auth plugin wrapping both user and service auth.

    When properly configured and using auth_token middleware,
    requests with valid service auth will not fail
    if the user token is expired.

    Ideally we would use the plugin provided by auth_token middleware
    however this plugin isn't serialized yet.
    """
    # TODO(pas-ha) use auth plugin from context when it is available
    user_auth = token_endpoint.Token(endpoint, context.auth_token)
    return service_token.ServiceTokenAuthWrapper(user_auth=user_auth,
                                                 service_auth=service_auth)


def register_auth_opts(conf, group, service_type=None):
    """Register session- and auth-related options

    Registers only basic auth options shared by all auth plugins.
    The rest are registered at runtime depending on auth plugin used.
    """
    ks_loading.register_session_conf_options(conf, group)
    ks_loading.register_auth_conf_options(conf, group)
    CONF.set_default('auth_type', default='password', group=group)
    ks_loading.register_adapter_conf_options(conf, group)
    conf.set_default('valid_interfaces', DEFAULT_VALID_INTERFACES, group=group)
    if service_type:
        conf.set_default('service_type', service_type, group=group)
    else:
        types = os_service_types.get_service_types()
        key = 'ironic-inspector' if group == 'inspector' else group
        service_types = types.service_types_by_project.get(key)
        if service_types:
            conf.set_default('service_type', service_types[0], group=group)


def add_auth_opts(options, service_type=None):
    """Add auth options to sample config

    As these are dynamically registered at runtime,
    this adds options for most used auth_plugins
    when generating sample config.
    """
    def add_options(opts, opts_to_add):
        for new_opt in opts_to_add:
            for opt in opts:
                if opt.name == new_opt.name:
                    break
            else:
                opts.append(new_opt)

    opts = copy.deepcopy(options)
    opts.insert(0, ks_loading.get_auth_common_conf_options()[0])
    # NOTE(dims): There are a lot of auth plugins, we just generate
    # the config options for a few common ones
    plugins = ['password', 'v2password', 'v3password']
    for name in plugins:
        plugin = ks_loading.get_plugin_loader(name)
        add_options(opts, ks_loading.get_auth_plugin_conf_options(plugin))
    add_options(opts, ks_loading.get_session_conf_options())
    if service_type:
        adapter_opts = ks_loading.get_adapter_conf_options(
            include_deprecated=False)
        # adding defaults for valid interfaces
        cfg.set_defaults(adapter_opts, service_type=service_type,
                         valid_interfaces=DEFAULT_VALID_INTERFACES)
        add_options(opts, adapter_opts)
    opts.sort(key=lambda x: x.name)
    return opts
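The `EndpointNotFound` branch in `ks_exceptions` derives the reported service type by falling back from an explicit kwarg, to the `[group]service_type` config option, to the group name itself. A minimal stand-in for that lookup chain, using `SimpleNamespace` instead of a real oslo.config `CONF` and tolerating a missing group (which the real code does not):

```python
from types import SimpleNamespace

# Stand-in for oslo.config CONF with one configured group.
CONF = SimpleNamespace(
    service_catalog=SimpleNamespace(service_type='baremetal'))


def effective_service_type(group, **kwargs):
    # Fallback order: explicit kwarg -> [group]service_type -> group name.
    return kwargs.get(
        'service_type',
        getattr(getattr(CONF, group, None), 'service_type', group))


print(effective_service_type('service_catalog'))  # → baremetal
print(effective_service_type('inspector'))        # no such group → inspector
```

This is why `CatalogNotFound` errors always carry a meaningful service type, even when the config section never set one.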
@ -1,333 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""Multicast DNS implementation for API discovery.

This implementation follows RFC 6763 as clarified by the API SIG guideline
https://review.opendev.org/651222.
"""

import collections
import ipaddress
import logging
import socket
import time
from urllib import parse as urlparse

from oslo_config import cfg
from oslo_config import types as cfg_types
import zeroconf

from ironic_lib.common.i18n import _
from ironic_lib import exception
from ironic_lib import utils


opts = [
    cfg.IntOpt('registration_attempts',
               min=1, default=5,
               help='Number of attempts to register a service. Currently '
                    'has to be larger than 1 because of race conditions '
                    'in the zeroconf library.'),
    cfg.IntOpt('lookup_attempts',
               min=1, default=3,
               help='Number of attempts to lookup a service.'),
    cfg.Opt('params',
            # This is required for values that contain commas.
            type=cfg_types.Dict(cfg_types.String(quotes=True)),
            default={},
            help='Additional parameters to pass for the registered '
                 'service.'),
    cfg.ListOpt('interfaces',
                help='List of IP addresses of interfaces to use for mDNS. '
                     'Defaults to all interfaces on the system.'),
]

CONF = cfg.CONF
opt_group = cfg.OptGroup(name='mdns', title='Options for multicast DNS')
CONF.register_group(opt_group)
CONF.register_opts(opts, opt_group)

LOG = logging.getLogger(__name__)

_MDNS_DOMAIN = '_openstack._tcp.local.'
_endpoint = collections.namedtuple('Endpoint',
                                   ['addresses', 'hostname', 'port', 'params'])


class Zeroconf(object):
    """Multicast DNS implementation client and server.

    Uses threading internally, so there is no start method. It starts
    automatically on creation.

    .. warning::
        The underlying library does not yet support IPv6.
    """

    def __init__(self):
        """Initialize and start the mDNS server."""
        interfaces = (CONF.mdns.interfaces if CONF.mdns.interfaces
                      else zeroconf.InterfaceChoice.All)
        # If interfaces are set, let zeroconf auto-detect the version
        ip_version = None if CONF.mdns.interfaces else zeroconf.IPVersion.All
        self._zc = zeroconf.Zeroconf(interfaces=interfaces,
                                     ip_version=ip_version)
        self._registered = []

    def register_service(self, service_type, endpoint, params=None):
        """Register a service.

        This call announces the new service via multicast and instructs the
        built-in server to respond to queries about it.

        :param service_type: OpenStack service type, e.g. "baremetal".
        :param endpoint: full endpoint to reach the service.
        :param params: optional properties as a dictionary.
        :raises: :exc:`.ServiceRegistrationFailure` if the service cannot be
            registered, e.g. because of conflicts.
        """
        parsed = _parse_endpoint(endpoint, service_type)

        all_params = CONF.mdns.params.copy()
        if params:
            all_params.update(params)
        all_params.update(parsed.params)

        properties = {
            (key.encode('utf-8') if isinstance(key, str) else key):
            (value.encode('utf-8') if isinstance(value, str) else value)
            for key, value in all_params.items()
        }

        # TODO(dtantsur): allow overriding TTL values via configuration
        info = zeroconf.ServiceInfo(_MDNS_DOMAIN,
                                    '%s.%s' % (service_type, _MDNS_DOMAIN),
                                    addresses=parsed.addresses,
                                    port=parsed.port,
                                    properties=properties,
                                    server=parsed.hostname)

        LOG.debug('Registering %s via mDNS', info)
        # Work around a potential race condition in the registration code:
        # https://github.com/jstasiak/python-zeroconf/issues/163
        delay = 0.1
        try:
            for attempt in range(CONF.mdns.registration_attempts):
                try:
                    self._zc.register_service(info)
                except zeroconf.NonUniqueNameException:
                    LOG.debug('Could not register %s - conflict', info)
                    if attempt == CONF.mdns.registration_attempts - 1:
                        raise
                    # reset the cache to purge learned records and retry
                    self._zc.cache = zeroconf.DNSCache()
                    time.sleep(delay)
                    delay *= 2
                else:
                    break
        except zeroconf.Error as exc:
            raise exception.ServiceRegistrationFailure(
                service=service_type, error=exc)

        self._registered.append(info)

    def get_endpoint(self, service_type, skip_loopback=True,
                     skip_link_local=False):
        """Get an endpoint and its properties from mDNS.

        If the requested endpoint is already in the built-in server cache, and
        its TTL is not exceeded, the cached value is returned.

        :param service_type: OpenStack service type.
        :param skip_loopback: Whether to ignore loopback addresses.
        :param skip_link_local: Whether to ignore link local V6 addresses.
        :returns: tuple (endpoint URL, properties as a dict).
        :raises: :exc:`.ServiceLookupFailure` if the service cannot be found.
        """
        delay = 0.1
        for attempt in range(CONF.mdns.lookup_attempts):
            name = '%s.%s' % (service_type, _MDNS_DOMAIN)
            info = self._zc.get_service_info(name, name)
            if info is not None:
                break
            elif attempt == CONF.mdns.lookup_attempts - 1:
                raise exception.ServiceLookupFailure(service=service_type)
            else:
                time.sleep(delay)
                delay *= 2

        all_addr = info.parsed_addresses()

        # Try to find the first routable address
        fallback = None
        for addr in all_addr:
            try:
                loopback = ipaddress.ip_address(addr).is_loopback
            except ValueError:
                LOG.debug('Skipping invalid IP address %s', addr)
                continue
            else:
                if loopback and skip_loopback:
                    LOG.debug('Skipping loopback IP address %s', addr)
                    continue

            if utils.get_route_source(addr, skip_link_local):
                address = addr
                break
            elif fallback is None:
                fallback = addr
        else:
            if fallback is None:
                raise exception.ServiceLookupFailure(
                    _('None of addresses %(addr)s for service %(service)s '
                      'are valid')
                    % {'addr': all_addr, 'service': service_type})
            else:
                LOG.warning('None of addresses %s seem routable, using %s',
                            all_addr, fallback)
                address = fallback

        properties = {}
        for key, value in info.properties.items():
            try:
                if isinstance(key, bytes):
                    key = key.decode('utf-8')
            except UnicodeError as exc:
                raise exception.ServiceLookupFailure(
                    _('Invalid properties for service %(svc)s. Cannot decode '
                      'key %(key)r: %(exc)r') %
                    {'svc': service_type, 'key': key, 'exc': exc})

            try:
                if isinstance(value, bytes):
                    value = value.decode('utf-8')
            except UnicodeError as exc:
                LOG.debug('Cannot convert value %(value)r for key %(key)s '
                          'to string, assuming binary: %(exc)s',
                          {'key': key, 'value': value, 'exc': exc})

            properties[key] = value

        path = properties.pop('path', '')
        protocol = properties.pop('protocol', None)
        if not protocol:
            if info.port == 80:
                protocol = 'http'
            else:
                protocol = 'https'

        if info.server.endswith('.local.'):
            # Local hostname means that the catalog lists an IP address,
            # so use it
            host = address
            if int(ipaddress.ip_address(host).version) == 6:
                host = '[%s]' % host
        else:
            # Otherwise use the provided hostname.
            host = info.server.rstrip('.')

        return ('{proto}://{host}:{port}{path}'.format(proto=protocol,
                                                       host=host,
                                                       port=info.port,
                                                       path=path),
                properties)

    def close(self):
        """Shut down mDNS and unregister services.

        .. note::
            If another server is running for the same services, it will
            re-register them immediately.
        """
        for info in self._registered:
            try:
                self._zc.unregister_service(info)
            except Exception:
                LOG.exception('Could not unregister mDNS service %s', info)
        self._zc.close()

    def __enter__(self):
        return self

    def __exit__(self, *args):
        self.close()


def get_endpoint(service_type):
    """Get an endpoint and its properties from mDNS.

    If the requested endpoint is already in the built-in server cache, and
    its TTL is not exceeded, the cached value is returned.

    :param service_type: OpenStack service type.
    :returns: tuple (endpoint URL, properties as a dict).
    :raises: :exc:`.ServiceLookupFailure` if the service cannot be found.
    """
    with Zeroconf() as zc:
        return zc.get_endpoint(service_type)


def _parse_endpoint(endpoint, service_type=None):
    params = {}
    url = urlparse.urlparse(endpoint)
    port = url.port

    if port is None:
        if url.scheme == 'https':
            port = 443
        else:
            port = 80

    addresses = []
    hostname = url.hostname
    try:
        infos = socket.getaddrinfo(hostname, port, 0, socket.IPPROTO_TCP)
    except socket.error as exc:
        raise exception.ServiceRegistrationFailure(
            service=service_type,
            error=_('Could not resolve hostname %(host)s: %(exc)s') %
            {'host': hostname, 'exc': exc})

    for info in infos:
        ip = info[4][0]
        if ip == hostname:
            # we need a host name for the service record. if what we have in
            # the catalog is an IP address, use the local hostname instead
            hostname = None
        # zeroconf requires addresses in network format
        ip = socket.inet_pton(info[0], ip)
        if ip not in addresses:
            addresses.append(ip)
    if not addresses:
        raise exception.ServiceRegistrationFailure(
            service=service_type,
            error=_('No suitable addresses found for %s') % url.hostname)

    # avoid storing information that can be derived from existing data
    if url.path not in ('', '/'):
        params['path'] = url.path

    if (not (port == 80 and url.scheme == 'http')
            and not (port == 443 and url.scheme == 'https')):
        params['protocol'] = url.scheme

    # zeroconf is pretty picky about having the trailing dot
    if hostname is not None and not hostname.endswith('.'):
        hostname += '.'

    return _endpoint(addresses, hostname, port, params)


def list_opts():
    """Entry point for oslo-config-generator."""
    return [('mdns', opts)]
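`_parse_endpoint` deliberately stores only what cannot be re-derived from the mDNS record itself: a non-root path, and the scheme only when it does not match the default port for that scheme. A sketch of just that derivation step (`derived_params` is an illustrative name, not part of the module):

```python
from urllib.parse import urlparse


def derived_params(endpoint):
    # Mirrors _parse_endpoint: apply default ports, then keep only the
    # pieces that a consumer could not reconstruct from host and port.
    url = urlparse(endpoint)
    port = url.port or (443 if url.scheme == 'https' else 80)
    params = {}
    if url.path not in ('', '/'):
        params['path'] = url.path
    if not ((port == 80 and url.scheme == 'http')
            or (port == 443 and url.scheme == 'https')):
        params['protocol'] = url.scheme
    return port, params


print(derived_params('https://ironic.example.com/baremetal'))
# → (443, {'path': '/baremetal'})
print(derived_params('http://10.0.0.2:6385'))
# → (6385, {'protocol': 'http'})
```

`get_endpoint` applies the inverse mapping on lookup: a missing `protocol` property defaults to `https` unless the advertised port is 80.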
@ -1,307 +0,0 @@
# Copyright 2016 Rackspace Hosting
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import abc
import functools
import random
import time

from ironic_lib.common.i18n import _
from ironic_lib import exception


class Timer(object):
    """A timer decorator and context manager.

    This metric type times the decorated method or code running inside the
    context manager, and emits the time as the metric value. It is bound to
    this MetricLogger. For example::

        from ironic_lib import metrics_utils

        METRICS = metrics_utils.get_metrics_logger()

        @METRICS.timer('foo')
        def foo(bar, baz):
            print(bar, baz)

        with METRICS.timer('foo'):
            do_something()
    """
    def __init__(self, metrics, name):
        """Init the decorator / context manager.

        :param metrics: The metric logger
        :param name: The metric name
        """
        if not isinstance(name, str):
            raise TypeError(_("The metric name is expected to be a string. "
                              "Value is %s") % name)
        self.metrics = metrics
        self.name = name
        self._start = None

    def __call__(self, f):
        @functools.wraps(f)
        def wrapped(*args, **kwargs):
            start = _time()
            result = f(*args, **kwargs)
            duration = _time() - start

            # Log the timing data (in ms)
            self.metrics.send_timer(self.metrics.get_metric_name(self.name),
                                    duration * 1000)
            return result
        return wrapped

    def __enter__(self):
        self._start = _time()

    def __exit__(self, exc_type, exc_val, exc_tb):
        duration = _time() - self._start
        # Log the timing data (in ms)
        self.metrics.send_timer(self.metrics.get_metric_name(self.name),
                                duration * 1000)


class Counter(object):
    """A counter decorator and context manager.

    This metric type increments a counter every time the decorated method or
    context manager is executed. It is bound to this MetricLogger. For
    example::

        from ironic_lib import metrics_utils

        METRICS = metrics_utils.get_metrics_logger()

        @METRICS.counter('foo')
        def foo(bar, baz):
            print(bar, baz)

        with METRICS.counter('foo'):
            do_something()
    """
    def __init__(self, metrics, name, sample_rate):
        """Init the decorator / context manager.

        :param metrics: The metric logger
        :param name: The metric name
        :param sample_rate: Probabilistic rate at which the values will be sent
        """
        if not isinstance(name, str):
            raise TypeError(_("The metric name is expected to be a string. "
                              "Value is %s") % name)

        if (sample_rate is not None
                and (sample_rate < 0.0 or sample_rate > 1.0)):
            msg = _("sample_rate is set to %s. Value must be None "
                    "or in the interval [0.0, 1.0]") % sample_rate
            raise ValueError(msg)

        self.metrics = metrics
        self.name = name
        self.sample_rate = sample_rate

    def __call__(self, f):
        @functools.wraps(f)
        def wrapped(*args, **kwargs):
            self.metrics.send_counter(
                self.metrics.get_metric_name(self.name),
                1, sample_rate=self.sample_rate)

            result = f(*args, **kwargs)

            return result
        return wrapped

    def __enter__(self):
        self.metrics.send_counter(self.metrics.get_metric_name(self.name),
                                  1, sample_rate=self.sample_rate)

    def __exit__(self, exc_type, exc_val, exc_tb):
        pass


class Gauge(object):
    """A gauge decorator.

    This metric type returns the value of the decorated method as a metric
    every time the method is executed. It is bound to this MetricLogger. For
    example::

        from ironic_lib import metrics_utils

        METRICS = metrics_utils.get_metrics_logger()

        @METRICS.gauge('foo')
        def add_foo(bar, baz):
            return (bar + baz)
    """
    def __init__(self, metrics, name):
        """Init the decorator / context manager.

        :param metrics: The metric logger
        :param name: The metric name
        """
        if not isinstance(name, str):
            raise TypeError(_("The metric name is expected to be a string. "
                              "Value is %s") % name)
        self.metrics = metrics
        self.name = name

    def __call__(self, f):
        @functools.wraps(f)
        def wrapped(*args, **kwargs):
            result = f(*args, **kwargs)
            self.metrics.send_gauge(self.metrics.get_metric_name(self.name),
                                    result)

            return result
        return wrapped


def _time():
    """Wraps time.time() for simpler testing."""
    return time.time()


class MetricLogger(object, metaclass=abc.ABCMeta):
    """Abstract class representing a metrics logger.

    A MetricLogger sends data to a backend (noop or statsd).
    The data can be a gauge, a counter, or a timer.

    The data sent to the backend is composed of:
      - a full metric name
      - a numeric value

    The format of the full metric name is:
    _prefix<delim>name
    where:
      - _prefix: [global_prefix<delim>][uuid<delim>][host_name<delim>]prefix
      - name: the name of this metric
      - <delim>: the delimiter. Default is '.'
    """

    def __init__(self, prefix='', delimiter='.'):
        """Init a MetricLogger.

        :param prefix: Prefix for this metric logger. This string will prefix
            all metric names.
        :param delimiter: Delimiter used to generate the full metric name.
        """
        self._prefix = prefix
        self._delimiter = delimiter

    def get_metric_name(self, name):
        """Get the full metric name.

        The format of the full metric name is:
        _prefix<delim>name
        where:
          - _prefix: [global_prefix<delim>][uuid<delim>][host_name<delim>]
            prefix
          - name: the name of this metric
          - <delim>: the delimiter. Default is '.'

        :param name: The metric name.
        :return: The full metric name, with logger prefix, as a string.
        """
        if not self._prefix:
            return name
        return self._delimiter.join([self._prefix, name])

    def send_gauge(self, name, value):
        """Send gauge metric data.

        Gauges are simple values.
        The backend will set the value of gauge 'name' to 'value'.

        :param name: Metric name
        :param value: Metric numeric value that will be sent to the backend
        """
        self._gauge(name, value)

    def send_counter(self, name, value, sample_rate=None):
        """Send counter metric data.

        Counters are used to count how many times an event occurred.
        The backend will increment the counter 'name' by the value 'value'.

        Optionally, specify sample_rate in the interval [0.0, 1.0] to
        sample data probabilistically where::

            P(send metric data) = sample_rate

        If sample_rate is None, then always send metric data, but do not
        have the backend send sample rate information (if supported).

        :param name: Metric name
        :param value: Metric numeric value that will be sent to the backend
        :param sample_rate: Probabilistic rate at which the values will be
            sent. Value must be None or in the interval [0.0, 1.0].
        """
        if (sample_rate is None or random.random() < sample_rate):
            return self._counter(name, value,
                                 sample_rate=sample_rate)

    def send_timer(self, name, value):
        """Send timer data.

        Timers are used to measure how long it took to do something.

        :param name: Metric name
        :param value: Metric numeric value that will be sent to the backend
        """
        self._timer(name, value)

    def timer(self, name):
        return Timer(self, name)

    def counter(self, name, sample_rate=None):
        return Counter(self, name, sample_rate)

    def gauge(self, name):
        return Gauge(self, name)

    @abc.abstractmethod
    def _gauge(self, name, value):
        """Abstract method for backends to implement gauge behavior."""

    @abc.abstractmethod
    def _counter(self, name, value, sample_rate=None):
        """Abstract method for backends to implement counter behavior."""

    @abc.abstractmethod
    def _timer(self, name, value):
        """Abstract method for backends to implement timer behavior."""

    def get_metrics_data(self):
        """Return the metrics collection, if available."""
        raise exception.MetricsNotSupported()


class NoopMetricLogger(MetricLogger):
    """Noop metric logger that throws away all metric data."""
    def _gauge(self, name, value):
        pass

    def _counter(self, name, value, sample_rate=None):
        pass

    def _timer(self, name, value):
        pass
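The `Timer` decorator pattern above can be exercised without a statsd backend by pairing a cut-down decorator with a list-backed stand-in logger. Both are simplified sketches (no `get_metric_name` prefixing, no context-manager support), not the shipped classes:

```python
import functools
import time


class Timer:
    # Simplified version of the decorator above: time the call and
    # report the duration in milliseconds to the bound logger.
    def __init__(self, logger, name):
        self.logger = logger
        self.name = name

    def __call__(self, f):
        @functools.wraps(f)
        def wrapped(*args, **kwargs):
            start = time.time()
            result = f(*args, **kwargs)
            self.logger.send_timer(self.name, (time.time() - start) * 1000)
            return result
        return wrapped


class ListLogger:
    # Stand-in backend that records what a real logger would emit.
    def __init__(self):
        self.emitted = []

    def send_timer(self, name, value):
        self.emitted.append((name, value))

    def timer(self, name):
        return Timer(self, name)


METRICS = ListLogger()


@METRICS.timer('foo')
def foo(bar, baz):
    return bar + baz


value = foo(1, 2)
print(value, METRICS.emitted)  # one ('foo', <ms>) sample recorded
```

This mirrors how `get_metrics_logger()` callers only ever touch the `timer`/`counter`/`gauge` factories; the backend choice stays behind `_timer` and friends.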
@@ -1,120 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.


from oslo_concurrency import lockutils
from oslo_config import cfg

from ironic_lib import metrics


CONF = cfg.CONF

STATISTIC_DATA = {}


class DictCollectionMetricLogger(metrics.MetricLogger):
    """Metric logger that collects internal counters."""

    # These are internal type labels in ironic-lib.
    GAUGE_TYPE = 'g'
    COUNTER_TYPE = 'c'
    TIMER_TYPE = 'ms'

    def __init__(self, prefix, delimiter='.'):
        """Initialize the Collection Metrics Logger.

        The logger stores metrics data in a dictionary which the consuming
        program can then retrieve whenever needed via a get_metrics_data
        call, which returns the metrics data structure.

        :param prefix: Prefix for this metric logger.
        :param delimiter: Delimiter used to generate the full metric name.
        """
        super(DictCollectionMetricLogger, self).__init__(
            prefix, delimiter=delimiter)

    @lockutils.synchronized('statistics-update')
    def _send(self, name, value, metric_type, sample_rate=None):
        """Store the metric in memory.

        This method updates the internal dictionary so that statistics
        can be collected, and later retrieved by consumers or plugins in
        Ironic via the `get_metrics_data` method.

        :param name: Metric name
        :param value: Metric value
        :param metric_type: Metric type (GAUGE_TYPE, COUNTER_TYPE,
                            or TIMER_TYPE).
        :param sample_rate: Not applicable.
        """
        global STATISTIC_DATA
        if metric_type == self.TIMER_TYPE:
            if name in STATISTIC_DATA:
                STATISTIC_DATA[name] = {
                    'count': STATISTIC_DATA[name]['count'] + 1,
                    'sum': STATISTIC_DATA[name]['sum'] + value,
                    'type': 'timer'
                }
            else:
                # Set initial data value.
                STATISTIC_DATA[name] = {
                    'count': 1,
                    'sum': value,
                    'type': 'timer'
                }
        elif metric_type == self.GAUGE_TYPE:
            STATISTIC_DATA[name] = {
                'value': value,
                'type': 'gauge'
            }
        elif metric_type == self.COUNTER_TYPE:
            if name in STATISTIC_DATA:
                # NOTE(TheJulia): The value for counter data types is
                # hard-coded to 1.
                STATISTIC_DATA[name] = {
                    'count': STATISTIC_DATA[name]['count'] + 1,
                    'type': 'counter'
                }
            else:
                STATISTIC_DATA[name] = {
                    'count': 1,
                    'type': 'counter'
                }

    def _gauge(self, name, value):
        return self._send(name, value, self.GAUGE_TYPE)

    def _counter(self, name, value, sample_rate=None):
        return self._send(name, value, self.COUNTER_TYPE,
                          sample_rate=sample_rate)

    def _timer(self, name, value):
        return self._send(name, value, self.TIMER_TYPE)

    def get_metrics_data(self):
        """Return the metrics collection dictionary.

        :returns: Dictionary containing the keys and values of
                  data stored via the metrics collection hooks.
                  The values themselves are dictionaries which
                  contain a type field, indicating if the statistic
                  is a counter, gauge, or timer. A counter has a
                  `count` field, a gauge has a `value` field, and a
                  timer has `count` and `sum` fields. The multiple
                  fields for a timer type allow additional statistics
                  to be derived from the data once collected and
                  compared over time.
        """
        return STATISTIC_DATA
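For readers skimming this removal, the aggregation performed by `DictCollectionMetricLogger._send` can be sketched as a standalone function. This is a simplified illustration (no `lockutils` synchronization; the `record` helper and sample metric names are invented for the example), not the ironic-lib implementation itself:

```python
# Sketch of the timer/counter/gauge aggregation done by _send above.
STATS = {}


def record(name, value, metric_type):
    if metric_type == 'ms':
        # Timer: accumulate a call count and a running sum.
        entry = STATS.setdefault(name, {'count': 0, 'sum': 0, 'type': 'timer'})
        entry['count'] += 1
        entry['sum'] += value
    elif metric_type == 'g':
        # Gauge: keep only the most recent value.
        STATS[name] = {'value': value, 'type': 'gauge'}
    elif metric_type == 'c':
        # Counter: each call increments by 1, regardless of value.
        entry = STATS.setdefault(name, {'count': 0, 'type': 'counter'})
        entry['count'] += 1


record('deploy_time', 120, 'ms')
record('deploy_time', 80, 'ms')
record('nodes', 5, 'g')
record('errors', 1, 'c')
```

Keeping `count` and `sum` for timers (rather than a single average) lets a consumer derive averages and rates itself by comparing snapshots over time.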
@@ -1,106 +0,0 @@
# Copyright 2016 Rackspace Hosting
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import contextlib
import logging
import socket

from oslo_config import cfg

from ironic_lib import metrics

statsd_opts = [
    cfg.StrOpt('statsd_host',
               default='localhost',
               help='Host for use with the statsd backend.'),
    cfg.PortOpt('statsd_port',
                default=8125,
                help='Port to use with the statsd backend.')
]

CONF = cfg.CONF
CONF.register_opts(statsd_opts, group='metrics_statsd')

LOG = logging.getLogger(__name__)


class StatsdMetricLogger(metrics.MetricLogger):
    """Metric logger that reports data via the statsd protocol."""

    GAUGE_TYPE = 'g'
    COUNTER_TYPE = 'c'
    TIMER_TYPE = 'ms'

    def __init__(self, prefix, delimiter='.', host=None, port=None):
        """Initialize a StatsdMetricLogger.

        The logger uses the given prefix list, delimiter, host, and port.

        :param prefix: Prefix for this metric logger.
        :param delimiter: Delimiter used to generate the full metric name.
        :param host: The statsd host
        :param port: The statsd port
        """
        super(StatsdMetricLogger, self).__init__(prefix,
                                                 delimiter=delimiter)

        self._host = host or CONF.metrics_statsd.statsd_host
        self._port = port or CONF.metrics_statsd.statsd_port

        self._target = (self._host, self._port)

    def _send(self, name, value, metric_type, sample_rate=None):
        """Send metrics to the statsd backend.

        :param name: Metric name
        :param value: Metric value
        :param metric_type: Metric type (GAUGE_TYPE, COUNTER_TYPE,
                            or TIMER_TYPE)
        :param sample_rate: Probabilistic rate at which the values will be sent
        """
        if sample_rate is None:
            metric = '%s:%s|%s' % (name, value, metric_type)
        else:
            metric = '%s:%s|%s@%s' % (name, value, metric_type, sample_rate)

        # Ideally, we'd cache a sending socket in self, but that
        # results in a socket getting shared by multiple green threads.
        with contextlib.closing(self._open_socket()) as sock:
            try:
                sock.settimeout(0.0)
                sock.sendto(metric.encode(), self._target)
            except socket.error as e:
                LOG.warning("Failed to send the metric value to host "
                            "%(host)s, port %(port)s. Error: %(error)s",
                            {'host': self._host, 'port': self._port,
                             'error': e})

    def _open_socket(self):
        return socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def _gauge(self, name, value):
        return self._send(name, value, self.GAUGE_TYPE)

    def _counter(self, name, value, sample_rate=None):
        return self._send(name, value, self.COUNTER_TYPE,
                          sample_rate=sample_rate)

    def _timer(self, name, value):
        return self._send(name, value, self.TIMER_TYPE)


def list_opts():
    """Entry point for oslo-config-generator."""
    return [('metrics_statsd', statsd_opts)]
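The wire format built by `StatsdMetricLogger._send` is a plain `name:value|type` line, with the sample rate appended after `@` when one is given. A minimal sketch that mirrors the string construction above (the `format_metric` helper name is invented for illustration):

```python
# Mirrors the statsd line construction in StatsdMetricLogger._send:
# "name:value|type", optionally suffixed with "@sample_rate".
def format_metric(name, value, metric_type, sample_rate=None):
    if sample_rate is None:
        return '%s:%s|%s' % (name, value, metric_type)
    return '%s:%s|%s@%s' % (name, value, metric_type, sample_rate)


print(format_metric('api.requests', 1, 'c'))        # api.requests:1|c
print(format_metric('api.latency', 42, 'ms', 0.5))  # api.latency:42|ms@0.5
```

Each such line is then fired at the statsd host over a fresh UDP socket, which is why a send failure is only logged as a warning rather than raised.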
@@ -1,108 +0,0 @@
# Copyright 2016 Rackspace Hosting
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from oslo_config import cfg

from ironic_lib.common.i18n import _
from ironic_lib import exception
from ironic_lib import metrics
from ironic_lib import metrics_collector
from ironic_lib import metrics_statsd

metrics_opts = [
    cfg.StrOpt('backend',
               default='noop',
               choices=[
                   ('noop', 'Do nothing in relation to metrics.'),
                   ('statsd', 'Transmits metrics data to a statsd backend.'),
                   ('collector', 'Collects metrics data and saves it in '
                                 'memory for use by the running application.'),
               ],
               help='Backend to use for the metrics system.'),
    cfg.BoolOpt('prepend_host',
                default=False,
                help='Prepend the hostname to all metric names. '
                     'The format of metric names is '
                     '[global_prefix.][host_name.]prefix.metric_name.'),
    cfg.BoolOpt('prepend_host_reverse',
                default=True,
                help='Split the prepended host value by "." and reverse it '
                     '(to better match the reverse hierarchical form of '
                     'domain names).'),
    cfg.StrOpt('global_prefix',
               help='Prefix all metric names with this value. '
                    'By default, there is no global prefix. '
                    'The format of metric names is '
                    '[global_prefix.][host_name.]prefix.metric_name.')
]

CONF = cfg.CONF
CONF.register_opts(metrics_opts, group='metrics')


def get_metrics_logger(prefix='', backend=None, host=None, delimiter='.'):
    """Return a metric logger with the specified prefix.

    The format of the prefix is:
    [global_prefix<delim>][host_name<delim>]prefix
    where <delim> is the delimiter (default is '.')

    :param prefix: Prefix for this metric logger.
        Value should be a string or None.
    :param backend: Backend to use for the metrics system.
        Possible values are 'noop', 'statsd', and 'collector'.
    :param host: Name of this node.
    :param delimiter: Delimiter to use for the metrics name.
    :return: The new MetricLogger.
    """
    if not isinstance(prefix, str):
        msg = (_("This metric prefix (%s) is of unsupported type. "
                 "Value should be a string or None")
               % str(prefix))
        raise exception.InvalidMetricConfig(msg)

    if CONF.metrics.prepend_host and host:
        if CONF.metrics.prepend_host_reverse:
            host = '.'.join(reversed(host.split('.')))

        if prefix:
            prefix = delimiter.join([host, prefix])
        else:
            prefix = host

    if CONF.metrics.global_prefix:
        if prefix:
            prefix = delimiter.join([CONF.metrics.global_prefix, prefix])
        else:
            prefix = CONF.metrics.global_prefix

    backend = backend or CONF.metrics.backend
    if backend == 'statsd':
        return metrics_statsd.StatsdMetricLogger(prefix, delimiter=delimiter)
    elif backend == 'noop':
        return metrics.NoopMetricLogger(prefix, delimiter=delimiter)
    elif backend == 'collector':
        return metrics_collector.DictCollectionMetricLogger(
            prefix, delimiter=delimiter)
    else:
        msg = (_("The backend is set to an unsupported type: %s. "
                 "Value should be 'noop', 'statsd', or 'collector'.")
               % backend)
        raise exception.InvalidMetricConfig(msg)


def list_opts():
    """Entry point for oslo-config-generator."""
    return [('metrics', metrics_opts)]
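The prefix assembly in `get_metrics_logger` (host reversal plus optional global prefix) can be sketched in isolation. The `build_prefix` helper and the sample host names below are invented for illustration; the config-driven flags are passed as plain arguments here:

```python
# Sketch of the prefix assembly in get_metrics_logger:
# [global_prefix<delim>][host_name<delim>]prefix, with the host optionally
# reversed ("node1.example.com" -> "com.example.node1").
def build_prefix(prefix, host=None, global_prefix=None,
                 delimiter='.', reverse_host=True):
    if host:
        if reverse_host:
            # Hosts are always split on '.', matching the original code.
            host = delimiter.join(reversed(host.split('.')))
        prefix = delimiter.join([host, prefix]) if prefix else host
    if global_prefix:
        prefix = (delimiter.join([global_prefix, prefix])
                  if prefix else global_prefix)
    return prefix


print(build_prefix('ipmi', host='node1.example.com', global_prefix='ironic'))
# ironic.com.example.node1.ipmi
```

Reversing the host mirrors reverse-DNS style naming, so metrics from hosts in the same domain group together under a common prefix.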
@@ -1,94 +0,0 @@
# Copyright 2017 Cisco Systems, Inc
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""Common utilities and classes across all unit tests."""

import subprocess

from oslo_concurrency import processutils
from oslo_config import fixture as config_fixture
from oslotest import base as test_base

from ironic_lib import utils


class IronicLibTestCase(test_base.BaseTestCase):
    """Test case base class for all unit tests except callers of utils.execute.

    This test class prevents calls to the utils.execute() /
    processutils.execute() and similar functions.
    """

    # By default block execution of utils.execute() and related functions.
    block_execute = True

    def setUp(self):
        super(IronicLibTestCase, self).setUp()

        # Make sure config overrides do not leak from test to test.
        self.cfg_fixture = self.useFixture(config_fixture.Config())

        # Ban running external processes via 'execute'-like functions. If
        # the patched function is called, an exception is raised to warn
        # the tester.
        if self.block_execute:
            # NOTE(jlvillal): Intentionally not using mock, as mocking a
            # mock causes things to not work correctly, and using
            # autospec=True causes strangeness. By using a simple function
            # we can then mock it without issue.
            self.patch(processutils, 'execute', do_not_call)
            self.patch(subprocess, 'call', do_not_call)
            self.patch(subprocess, 'check_call', do_not_call)
            self.patch(subprocess, 'check_output', do_not_call)
            self.patch(utils, 'execute', do_not_call)

            # subprocess.Popen is a class
            self.patch(subprocess, 'Popen', DoNotCallPopen)

    def config(self, **kw):
        """Override config options for a test."""
        self.cfg_fixture.config(**kw)


def do_not_call(*args, **kwargs):
    """Helper function that raises an exception if it is called."""
    raise Exception(
        "Don't call ironic_lib.utils.execute() / "
        "processutils.execute() or similar functions in tests!")


class DoNotCallPopen(object):
    """Helper class to mimic subprocess.Popen.

    Its job is to raise an exception if it is called. We create stub
    functions so mocks that use autospec=True will work.
    """
    def __init__(self, *args, **kwargs):
        do_not_call(*args, **kwargs)

    def communicate(self, input=None):
        pass

    def kill(self):
        pass

    def poll(self):
        pass

    def terminate(self):
        pass

    def wait(self):
        pass
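The "block execute" trick above relies on patching with a plain function rather than a Mock, precisely so that individual tests can still mock the attribute cleanly. A self-contained demonstration (the `utils` namespace here is a stand-in for the real module, built just for this sketch):

```python
import types
from unittest import mock

# A stand-in "utils" namespace with an execute() we want to ban by default.
utils = types.SimpleNamespace(execute=lambda *a, **kw: ('out', 'err'))


def do_not_call(*args, **kwargs):
    # Raised if any test reaches a real 'execute'-style call.
    raise RuntimeError("don't run external processes in tests")


utils.execute = do_not_call  # ban by default, as the base class does

try:
    utils.execute('ls')
    blocked = False
except RuntimeError:
    blocked = True

# Because do_not_call is a plain function (not a Mock), a test can still
# replace it with a real mock without "mocking a mock" problems.
with mock.patch.object(utils, 'execute', autospec=True) as mock_exec:
    utils.execute('ls')
    call_count = mock_exec.call_count
```

The same reasoning explains `DoNotCallPopen`: it carries real method stubs so `autospec=True` mocks of `subprocess.Popen` still find the expected API.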
@@ -1,81 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import subprocess
from unittest import mock

from oslo_concurrency import processutils

from ironic_lib.tests import base
from ironic_lib import utils


class BlockExecuteTestCase(base.IronicLibTestCase):
    """Test to ensure we block access to the 'execute' type functions"""

    def test_exception_raised_for_execute(self):
        execute_functions = (processutils.execute, subprocess.Popen,
                             subprocess.call, subprocess.check_call,
                             subprocess.check_output, utils.execute)

        for function_name in execute_functions:
            # Have to use 'noqa' as we are raising plain Exception and we
            # would get an H202 error in the 'pep8' check.
            exc = self.assertRaises(Exception, function_name, ["echo", "%s" % function_name])  # noqa

            self.assertEqual(
                "Don't call ironic_lib.utils.execute() / "
                "processutils.execute() or similar functions in tests!",
                "%s" % exc)

    @mock.patch.object(utils, "execute", autospec=True)
    def test_can_mock_execute(self, mock_exec):
        # NOTE(jlvillal): We had discovered an issue where mocking wasn't
        # working because we had used a mock to block access to the execute
        # functions. This caused us to "mock a mock" and didn't work
        # correctly. We want to make sure that we can mock our execute
        # functions even with our "block execute" code.
        utils.execute("ls")
        utils.execute("echo")
        self.assertEqual(2, mock_exec.call_count)

    @mock.patch.object(processutils, "execute", autospec=True)
    def test_exception_raised_for_execute_parent_mocked(self, mock_exec):
        # Make sure that even if we mock the parent execute function, we
        # still get an exception for a child. In this case
        # ironic_lib.utils.execute() calls processutils.execute(). Make
        # sure an exception is raised even though we mocked
        # processutils.execute().
        # Have to use 'noqa' as we are raising plain Exception and we would
        # get an H202 error in the 'pep8' check.
        exc = self.assertRaises(Exception, utils.execute, "ls")  # noqa

        self.assertEqual(
            "Don't call ironic_lib.utils.execute() / "
            "processutils.execute() or similar functions in tests!",
            "%s" % exc)


class DontBlockExecuteTestCase(base.IronicLibTestCase):
    """Ensure we can turn off blocking access to 'execute' type functions"""

    # Don't block the execute function
    block_execute = False

    @mock.patch.object(processutils, "execute", autospec=True)
    def test_no_exception_raised_for_execute(self, mock_exec):
        # Make sure we can call ironic_lib.utils.execute() even though we
        # didn't mock it. We do mock processutils.execute() so we don't
        # actually execute anything.
        utils.execute("ls")
        utils.execute("echo")
        self.assertEqual(2, mock_exec.call_count)
@@ -1,223 +0,0 @@
# Copyright 2020 Red Hat, Inc.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import base64
import json
import os
import tempfile
from unittest import mock

from ironic_lib import auth_basic
from ironic_lib import exception
from ironic_lib.tests import base


class TestAuthBasic(base.IronicLibTestCase):

    def write_auth_file(self, data=None):
        if not data:
            data = '\n'
        with tempfile.NamedTemporaryFile(mode='w', delete=False) as f:
            f.write(data)
            self.addCleanup(os.remove, f.name)
            return f.name

    def test_middleware_authenticate(self):
        auth_file = self.write_auth_file(
            'myName:$2y$05$lE3eGtyj41jZwrzS87KTqe6.'
            'JETVCWBkc32C63UP2aYrGoYOEpbJm\n\n\n')
        app = mock.Mock()
        start_response = mock.Mock()
        middleware = auth_basic.BasicAuthMiddleware(app, auth_file)
        env = {
            'HTTP_AUTHORIZATION': 'Basic bXlOYW1lOm15UGFzc3dvcmQ='
        }

        result = middleware(env, start_response)
        self.assertEqual(app.return_value, result)
        start_response.assert_not_called()

    def test_middleware_unauthenticated(self):
        auth_file = self.write_auth_file(
            'myName:$2y$05$lE3eGtyj41jZwrzS87KTqe6.'
            'JETVCWBkc32C63UP2aYrGoYOEpbJm\n\n\n')
        app = mock.Mock()
        start_response = mock.Mock()
        middleware = auth_basic.BasicAuthMiddleware(app, auth_file)
        env = {'REQUEST_METHOD': 'GET'}

        body = middleware(env, start_response)
        decoded = json.loads(body[0].decode())
        self.assertEqual({'error': {'message': 'Authorization required',
                                    'code': 401}}, decoded)

        start_response.assert_called_once_with(
            '401 Unauthorized',
            [('WWW-Authenticate', 'Basic realm="Baremetal API"'),
             ('Content-Type', 'application/json'),
             ('Content-Length', str(len(body[0])))]
        )
        app.assert_not_called()

    def test_authenticate(self):
        auth_file = self.write_auth_file(
            'foo:bar\nmyName:$2y$05$lE3eGtyj41jZwrzS87KTqe6.'
            'JETVCWBkc32C63UP2aYrGoYOEpbJm\n\n\n')

        # test basic auth
        self.assertEqual(
            {'HTTP_X_USER': 'myName', 'HTTP_X_USER_NAME': 'myName'},
            auth_basic.authenticate(
                auth_file, 'myName', b'myPassword')
        )

        # test failed auth
        e = self.assertRaises(exception.ConfigInvalid,
                              auth_basic.authenticate,
                              auth_file, 'foo', b'bar')
        self.assertEqual('Invalid configuration file. Only bcrypt digested '
                         'passwords are supported for foo', str(e))

        # test problem reading user data file
        auth_file = auth_file + '.missing'
        e = self.assertRaises(exception.ConfigInvalid,
                              auth_basic.authenticate,
                              auth_file, 'myName',
                              b'myPassword')
        self.assertEqual('Invalid configuration file. Problem reading '
                         'auth user file', str(e))

    def test_auth_entry(self):
        entry_pass = ('myName:$2y$05$lE3eGtyj41jZwrzS87KTqe6.'
                      'JETVCWBkc32C63UP2aYrGoYOEpbJm')
        entry_pass_2a = ('myName:$2a$10$I9Fi3DM1sbxQP0560MK9'
                         'tec1dUdytBtIqXfDCyTNfDUabtGvQjW1S')
        entry_pass_2b = ('myName:$2b$12$dWLBxT6aMxpVTfUNAyOu'
                         'IusHXewu8m6Hrsxw4/e95WGBelFn0oOMW')
        entry_fail = 'foo:bar'

        # success
        self.assertEqual(
            {'HTTP_X_USER': 'myName', 'HTTP_X_USER_NAME': 'myName'},
            auth_basic.auth_entry(
                entry_pass, b'myPassword')
        )

        # success with bcrypt implementations other than htpasswd
        self.assertEqual(
            {'HTTP_X_USER': 'myName', 'HTTP_X_USER_NAME': 'myName'},
            auth_basic.auth_entry(
                entry_pass_2a, b'myPassword')
        )
        self.assertEqual(
            {'HTTP_X_USER': 'myName', 'HTTP_X_USER_NAME': 'myName'},
            auth_basic.auth_entry(
                entry_pass_2b, b'myPassword')
        )

        # failed, unknown digest format
        e = self.assertRaises(exception.ConfigInvalid,
                              auth_basic.auth_entry, entry_fail, b'bar')
        self.assertEqual('Invalid configuration file. Only bcrypt digested '
                         'passwords are supported for foo', str(e))

        # failed, incorrect password
        e = self.assertRaises(exception.Unauthorized,
                              auth_basic.auth_entry, entry_pass, b'bar')
        self.assertEqual('Incorrect username or password', str(e))

    def test_validate_auth_file(self):
        auth_file = self.write_auth_file(
            'myName:$2y$05$lE3eGtyj41jZwrzS87KTqe6.'
            'JETVCWBkc32C63UP2aYrGoYOEpbJm\n\n\n')
        # success, valid config
        auth_basic.validate_auth_file(auth_file)

        # failed, missing auth file
        auth_file = auth_file + '.missing'
        self.assertRaises(exception.ConfigInvalid,
                          auth_basic.validate_auth_file, auth_file)

        # failed, invalid entry
        auth_file = self.write_auth_file(
            'foo:bar\nmyName:$2y$05$lE3eGtyj41jZwrzS87KTqe6.'
            'JETVCWBkc32C63UP2aYrGoYOEpbJm\n\n\n')
        self.assertRaises(exception.ConfigInvalid,
                          auth_basic.validate_auth_file, auth_file)

    def test_parse_token(self):

        # success with bytes
        token = base64.b64encode(b'myName:myPassword')
        self.assertEqual(
            ('myName', b'myPassword'),
            auth_basic.parse_token(token)
        )

        # success with string
        token = str(token, encoding='utf-8')
        self.assertEqual(
            ('myName', b'myPassword'),
            auth_basic.parse_token(token)
        )

        # failed, invalid base64
        e = self.assertRaises(exception.BadRequest,
                              auth_basic.parse_token, token[:-1])
        self.assertEqual('Could not decode authorization token', str(e))

        # failed, no colon in token
        token = str(base64.b64encode(b'myNamemyPassword'), encoding='utf-8')
        e = self.assertRaises(exception.BadRequest,
                              auth_basic.parse_token, token[:-1])
        self.assertEqual('Could not decode authorization token', str(e))

    def test_parse_header(self):
        auth_value = 'Basic bXlOYW1lOm15UGFzc3dvcmQ='

        # success
        self.assertEqual(
            'bXlOYW1lOm15UGFzc3dvcmQ=',
            auth_basic.parse_header({
                'HTTP_AUTHORIZATION': auth_value
            })
        )

        # failed, missing Authorization header
        e = self.assertRaises(exception.Unauthorized,
                              auth_basic.parse_header,
                              {})
        self.assertEqual('Authorization required', str(e))

        # failed, missing token
        e = self.assertRaises(exception.BadRequest,
                              auth_basic.parse_header,
                              {'HTTP_AUTHORIZATION': 'Basic'})
        self.assertEqual('Could not parse Authorization header', str(e))

        # failed, type other than Basic
        digest_value = 'Digest username="myName" nonce="foobar"'
        e = self.assertRaises(exception.BadRequest,
                              auth_basic.parse_header,
                              {'HTTP_AUTHORIZATION': digest_value})
        self.assertEqual('Unsupported authorization type "Digest"', str(e))

    def test_unauthorized(self):
        e = self.assertRaises(exception.Unauthorized,
                              auth_basic.unauthorized, 'ouch')
        self.assertEqual('ouch', str(e))
        self.assertEqual({
            'WWW-Authenticate': 'Basic realm="Baremetal API"'
        }, e.headers)
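The token handling these tests exercise is standard HTTP Basic: base64-decode the credential and split on the first colon. A simplified, standalone sketch (this `parse_token` is a stand-in raising `ValueError`, not the ironic_lib function, which raises its own exception types):

```python
import base64
import binascii


def parse_token(token):
    """Decode a Basic auth token into (username, password_bytes)."""
    if isinstance(token, str):
        token = token.encode('utf-8')
    try:
        decoded = base64.b64decode(token, validate=True)
    except (TypeError, binascii.Error):
        raise ValueError('Could not decode authorization token')
    # Split on the first colon only; passwords may themselves contain ':'.
    name, sep, password = decoded.partition(b':')
    if not sep:
        raise ValueError('Could not decode authorization token')
    return name.decode('utf-8'), password


token = base64.b64encode(b'myName:myPassword')
print(parse_token(token))  # ('myName', b'myPassword')
```

Splitting only on the first colon matters because the password half of the credential is allowed to contain colons itself.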
@@ -1,73 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.


import re
from unittest import mock

from oslo_config import cfg

from ironic_lib import exception
from ironic_lib.tests import base

CONF = cfg.CONF


class Unserializable(object):
    def __str__(self):
        raise NotImplementedError('nostr')


class TestException(exception.IronicException):
    _msg_fmt = 'Some exception: %(spam)s, %(ham)s'


class TestIronicException(base.IronicLibTestCase):
    def test___init___json_serializable(self):
        exc = TestException(spam=[1, 2, 3], ham='eggs')
        self.assertIn('[1, 2, 3]', str(exc))
        self.assertEqual('[1, 2, 3]', exc.kwargs['spam'])

    def test___init___string_serializable(self):
        exc = TestException(
            spam=type('ni', (object,), dict(a=1, b=2))(), ham='eggs'
        )
        check_str = 'ni object at'
        self.assertIn(check_str, str(exc))
        self.assertIn(check_str, exc.kwargs['spam'])

    @mock.patch.object(exception.LOG, 'error', autospec=True)
    def test___init___invalid_kwarg(self, log_mock):
        CONF.set_override('fatal_exception_format_errors', False,
                          group='ironic_lib')
        e = TestException(spam=Unserializable(), ham='eggs')
        message = \
            log_mock.call_args_list[0][0][0] % log_mock.call_args_list[0][0][1]
        self.assertIsNotNone(
            re.search('spam: .*JSON.* NotImplementedError: nostr', message),
            message
        )
        self.assertEqual({'ham': '"eggs"', 'code': 500}, e.kwargs)

    @mock.patch.object(exception.LOG, 'error', autospec=True)
    def test___init___invalid_kwarg_reraise(self, log_mock):
        CONF.set_override('fatal_exception_format_errors', True,
                          group='ironic_lib')
        self.assertRaises(KeyError, TestException, spam=Unserializable(),
                          ham='eggs')
        message = \
            log_mock.call_args_list[0][0][0] % log_mock.call_args_list[0][0][1]
        self.assertIsNotNone(
            re.search('spam: .*JSON.* NotImplementedError: nostr', message),
            message
        )
@@ -1,734 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import copy
import os
import tempfile
from unittest import mock

import fixtures
import oslo_messaging
import webob

from ironic_lib import exception
from ironic_lib.json_rpc import client
from ironic_lib.json_rpc import server
from ironic_lib.tests import base


class FakeContext(server.EmptyContext):

    request_id = 'abcd'


class FakeManager(object):

    def success(self, context, x, y=0):
        assert isinstance(context, FakeContext)
        assert context.user_name == 'admin'
        return x - y

    def no_result(self, context):
        assert isinstance(context, FakeContext)
        return None

    def no_context(self):
        return 42

    def fail(self, context, message):
        assert isinstance(context, FakeContext)
        raise exception.IronicException(message)

    @oslo_messaging.expected_exceptions(exception.BadRequest)
    def expected(self, context, message):
        assert isinstance(context, FakeContext)
        raise exception.BadRequest(message)

    def crash(self, context):
        raise RuntimeError('boom')

    def copy(self, context, data):
        return copy.deepcopy(data)

    def init_host(self, context):
        assert False, "This should not be exposed"

    def _private(self, context):
        assert False, "This should not be exposed"

    # This should not be exposed either
    value = 42


class FakeSerializer:

    def serialize_entity(self, context, entity):
        return entity

    def deserialize_entity(self, context, data):
        return data


class TestService(base.IronicLibTestCase):

    def setUp(self):
        super(TestService, self).setUp()
        self.config(auth_strategy='noauth', group='json_rpc')
        self.server_mock = self.useFixture(fixtures.MockPatch(
            'oslo_service.wsgi.Server', autospec=True)).mock

        self.serializer = FakeSerializer()
        self.service = server.WSGIService(FakeManager(), self.serializer,
                                          FakeContext)
        self.app = self.service._application
        self.ctx = {'user_name': 'admin'}

    def _request(self, name=None, params=None, expected_error=None,
                 request_id='abcd', **kwargs):
        body = {
            'jsonrpc': '2.0',
        }
        if request_id is not None:
            body['id'] = request_id
        if name is not None:
            body['method'] = name
        if params is not None:
            body['params'] = params
        if 'json_body' not in kwargs:
            kwargs['json_body'] = body
        kwargs.setdefault('method', 'POST')
        kwargs.setdefault('headers', {'Content-Type': 'application/json'})

        request = webob.Request.blank("/", **kwargs)
        response = request.get_response(self.app)
        self.assertEqual(response.status_code,
                         expected_error or (200 if request_id else 204))
        if request_id is not None:
            if expected_error:
                self.assertEqual(expected_error,
                                 response.json_body['error']['code'])
            else:
                return response.json_body
        else:
            return response.text

    def _check(self, body, result=None, error=None, request_id='abcd'):
        self.assertEqual('2.0', body.pop('jsonrpc'))
        self.assertEqual(request_id, body.pop('id'))
        if error is not None:
            self.assertEqual({'error': error}, body)
        else:
            self.assertEqual({'result': result}, body)

    def _setup_http_basic(self):
        with tempfile.NamedTemporaryFile(mode='w', delete=False) as f:
            f.write('myName:$2y$05$lE3eGtyj41jZwrzS87KTqe6.'
                    'JETVCWBkc32C63UP2aYrGoYOEpbJm\n\n\n')
        self.addCleanup(os.remove, f.name)
        self.config(http_basic_auth_user_file=f.name, group='json_rpc')
        self.config(auth_strategy='http_basic', group='json_rpc')
        # self.config(http_basic_username='myUser', group='json_rpc')
        # self.config(http_basic_password='myPassword', group='json_rpc')
        self.service = server.WSGIService(FakeManager(), self.serializer,
|
||||
FakeContext)
|
||||
self.app = self.server_mock.call_args[0][2]
|
||||
|
||||
def test_http_basic_not_authenticated(self):
|
||||
self._setup_http_basic()
|
||||
self._request('success', {'context': self.ctx, 'x': 42},
|
||||
request_id=None, expected_error=401)
|
||||
|
||||
def test_http_basic(self):
|
||||
self._setup_http_basic()
|
||||
headers = {
|
||||
'Content-Type': 'application/json',
|
||||
'Authorization': 'Basic bXlOYW1lOm15UGFzc3dvcmQ='
|
||||
}
|
||||
body = self._request('success', {'context': self.ctx, 'x': 42},
|
||||
headers=headers)
|
||||
self._check(body, result=42)
|
||||
|
||||
def test_success(self):
|
||||
body = self._request('success', {'context': self.ctx, 'x': 42})
|
||||
self._check(body, result=42)
|
||||
|
||||
def test_success_no_result(self):
|
||||
body = self._request('no_result', {'context': self.ctx})
|
||||
self._check(body, result=None)
|
||||
|
||||
def test_notification(self):
|
||||
body = self._request('no_result', {'context': self.ctx},
|
||||
request_id=None)
|
||||
self.assertEqual('', body)
|
||||
|
||||
def test_no_context(self):
|
||||
body = self._request('no_context')
|
||||
self._check(body, result=42)
|
||||
|
||||
def test_non_json_body(self):
|
||||
for body in (b'', b'???', b"\xc3\x28"):
|
||||
request = webob.Request.blank("/", method='POST', body=body)
|
||||
response = request.get_response(self.app)
|
||||
self._check(
|
||||
response.json_body,
|
||||
error={
|
||||
'message': server.ParseError._msg_fmt,
|
||||
'code': -32700,
|
||||
},
|
||||
request_id=None)
|
||||
|
||||
def test_invalid_requests(self):
|
||||
bodies = [
|
||||
# Invalid requests with request ID.
|
||||
{'method': 'no_result', 'id': 'abcd',
|
||||
'params': {'context': self.ctx}},
|
||||
{'jsonrpc': '2.0', 'id': 'abcd', 'params': {'context': self.ctx}},
|
||||
# These do not count as notifications, since they're malformed.
|
||||
{'method': 'no_result', 'params': {'context': self.ctx}},
|
||||
{'jsonrpc': '2.0', 'params': {'context': self.ctx}},
|
||||
42,
|
||||
# We do not implement batched requests.
|
||||
[],
|
||||
[{'jsonrpc': '2.0', 'method': 'no_result',
|
||||
'params': {'context': self.ctx}}],
|
||||
]
|
||||
for body in bodies:
|
||||
body = self._request(json_body=body)
|
||||
self._check(
|
||||
body,
|
||||
error={
|
||||
'message': server.InvalidRequest._msg_fmt,
|
||||
'code': -32600,
|
||||
},
|
||||
request_id=body.get('id'))
|
||||
|
||||
def test_malformed_context(self):
|
||||
body = self._request(json_body={'jsonrpc': '2.0', 'id': 'abcd',
|
||||
'method': 'no_result',
|
||||
'params': {'context': 42}})
|
||||
self._check(
|
||||
body,
|
||||
error={
|
||||
'message': 'Context must be a dictionary, if provided',
|
||||
'code': -32602,
|
||||
})
|
||||
|
||||
def test_expected_failure(self):
|
||||
body = self._request('fail', {'context': self.ctx,
|
||||
'message': 'some error'})
|
||||
self._check(body,
|
||||
error={
|
||||
'message': 'some error',
|
||||
'code': 500,
|
||||
'data': {
|
||||
'class': 'ironic_lib.exception.IronicException'
|
||||
}
|
||||
})
|
||||
|
||||
def test_expected_failure_oslo(self):
|
||||
# Check that exceptions wrapped by oslo's expected_exceptions get
|
||||
# unwrapped correctly.
|
||||
body = self._request('expected', {'context': self.ctx,
|
||||
'message': 'some error'})
|
||||
self._check(body,
|
||||
error={
|
||||
'message': 'some error',
|
||||
'code': 400,
|
||||
'data': {
|
||||
'class': 'ironic_lib.exception.BadRequest'
|
||||
}
|
||||
})
|
||||
|
||||
@mock.patch.object(server.LOG, 'exception', autospec=True)
|
||||
def test_unexpected_failure(self, mock_log):
|
||||
body = self._request('crash', {'context': self.ctx})
|
||||
self._check(body,
|
||||
error={
|
||||
'message': 'boom',
|
||||
'code': 500,
|
||||
})
|
||||
self.assertTrue(mock_log.called)
|
||||
|
||||
def test_method_not_found(self):
|
||||
body = self._request('banana', {'context': self.ctx})
|
||||
self._check(body,
|
||||
error={
|
||||
'message': 'Method banana was not found',
|
||||
'code': -32601,
|
||||
})
|
||||
|
||||
def test_no_deny_methods(self):
|
||||
for name in ('__init__', '_private', 'init_host', 'value'):
|
||||
body = self._request(name, {'context': self.ctx})
|
||||
self._check(body,
|
||||
error={
|
||||
'message': 'Method %s was not found' % name,
|
||||
'code': -32601,
|
||||
})
|
||||
|
||||
def test_missing_argument(self):
|
||||
body = self._request('success', {'context': self.ctx})
|
||||
# The exact error message depends on the Python version
|
||||
self.assertEqual(-32602, body['error']['code'])
|
||||
self.assertNotIn('result', body)
|
||||
|
||||
def test_method_not_post(self):
|
||||
self._request('success', {'context': self.ctx, 'x': 42},
|
||||
method='GET', expected_error=405)
|
||||
|
||||
def test_authenticated(self):
|
||||
self.config(auth_strategy='keystone', group='json_rpc')
|
||||
self.service = server.WSGIService(FakeManager(), self.serializer,
|
||||
FakeContext)
|
||||
self.app = self.server_mock.call_args[0][2]
|
||||
self._request('success', {'context': self.ctx, 'x': 42},
|
||||
expected_error=401)
|
||||
|
||||
def test_authenticated_with_allowed_role(self):
|
||||
self.config(auth_strategy='keystone', group='json_rpc')
|
||||
self.config(allowed_roles=['allowed', 'ignored'], group='json_rpc')
|
||||
self.service = server.WSGIService(FakeManager(), self.serializer,
|
||||
FakeContext)
|
||||
self.app = self.server_mock.call_args[0][2]
|
||||
self._request('success', {'context': self.ctx, 'x': 42},
|
||||
expected_error=401,
|
||||
headers={'Content-Type': 'application/json',
|
||||
'X-Roles': 'allowed,denied'})
|
||||
|
||||
def test_authenticated_no_admin_role(self):
|
||||
self.config(auth_strategy='keystone', group='json_rpc')
|
||||
self._request('success', {'context': self.ctx, 'x': 42},
|
||||
expected_error=403)
|
||||
|
||||
def test_authenticated_no_allowed_role(self):
|
||||
self.config(auth_strategy='keystone', group='json_rpc')
|
||||
self.config(allowed_roles=['allowed', 'ignored'], group='json_rpc')
|
||||
self._request('success', {'context': self.ctx, 'x': 42},
|
||||
expected_error=403,
|
||||
headers={'Content-Type': 'application/json',
|
||||
'X-Roles': 'denied,notallowed'})
|
||||
|
||||
@mock.patch.object(server.LOG, 'debug', autospec=True)
|
||||
def test_mask_secrets(self, mock_log):
|
||||
data = {'ipmi_username': 'admin', 'ipmi_password': 'secret'}
|
||||
node = self.serializer.serialize_entity(self.ctx, data)
|
||||
body = self._request('copy', {'context': self.ctx, 'data': data})
|
||||
self.assertIsNone(body.get('error'))
|
||||
node = self.serializer.deserialize_entity(self.ctx, body['result'])
|
||||
logged_params = mock_log.call_args_list[0][0][2]
|
||||
logged_node = logged_params['data']
|
||||
self.assertEqual({'ipmi_username': 'admin', 'ipmi_password': '***'},
|
||||
logged_node)
|
||||
logged_resp = mock_log.call_args_list[1][0][2]
|
||||
self.assertEqual({'ipmi_username': 'admin', 'ipmi_password': '***'},
|
||||
logged_resp)
|
||||
# The result is not affected, only logging
|
||||
self.assertEqual(data, node)
|
||||
|
||||
|
||||
@mock.patch.object(client, '_get_session', autospec=True)
|
||||
class TestClient(base.IronicLibTestCase):
|
||||
|
||||
def setUp(self):
|
||||
super(TestClient, self).setUp()
|
||||
self.serializer = FakeSerializer()
|
||||
self.client = client.Client(self.serializer)
|
||||
self.context = FakeContext({'user_name': 'admin'})
|
||||
self.ctx_json = self.context.to_dict()
|
||||
|
||||
def test_can_send_version(self, mock_session):
|
||||
self.assertTrue(self.client.can_send_version('1.42'))
|
||||
self.client = client.Client(self.serializer, version_cap='1.42')
|
||||
self.assertTrue(self.client.can_send_version('1.42'))
|
||||
self.assertTrue(self.client.can_send_version('1.0'))
|
||||
self.assertFalse(self.client.can_send_version('1.99'))
|
||||
self.assertFalse(self.client.can_send_version('2.0'))
|
||||
|
||||
def test_call_success(self, mock_session):
|
||||
response = mock_session.return_value.post.return_value
|
||||
response.json.return_value = {
|
||||
'jsonrpc': '2.0',
|
||||
'result': 42
|
||||
}
|
||||
cctx = self.client.prepare('foo.example.com')
|
||||
self.assertEqual('example.com', cctx.host)
|
||||
result = cctx.call(self.context, 'do_something', answer=42)
|
||||
self.assertEqual(42, result)
|
||||
mock_session.return_value.post.assert_called_once_with(
|
||||
'http://example.com:8089',
|
||||
json={'jsonrpc': '2.0',
|
||||
'method': 'do_something',
|
||||
'params': {'answer': 42, 'context': self.ctx_json},
|
||||
'id': self.context.request_id})
|
||||
|
||||
def test_call_ipv4_success(self, mock_session):
|
||||
response = mock_session.return_value.post.return_value
|
||||
response.json.return_value = {
|
||||
'jsonrpc': '2.0',
|
||||
'result': 42
|
||||
}
|
||||
cctx = self.client.prepare('foo.192.0.2.1')
|
||||
self.assertEqual('192.0.2.1', cctx.host)
|
||||
result = cctx.call(self.context, 'do_something', answer=42)
|
||||
self.assertEqual(42, result)
|
||||
mock_session.return_value.post.assert_called_once_with(
|
||||
'http://192.0.2.1:8089',
|
||||
json={'jsonrpc': '2.0',
|
||||
'method': 'do_something',
|
||||
'params': {'answer': 42, 'context': self.ctx_json},
|
||||
'id': self.context.request_id})
|
||||
|
||||
def test_call_ipv6_success(self, mock_session):
|
||||
response = mock_session.return_value.post.return_value
|
||||
response.json.return_value = {
|
||||
'jsonrpc': '2.0',
|
||||
'result': 42
|
||||
}
|
||||
cctx = self.client.prepare('foo.2001:db8::1')
|
||||
self.assertEqual('2001:db8::1', cctx.host)
|
||||
self.assertEqual(8089, cctx.port)
|
||||
result = cctx.call(self.context, 'do_something', answer=42)
|
||||
self.assertEqual(42, result)
|
||||
mock_session.return_value.post.assert_called_once_with(
|
||||
'http://[2001:db8::1]:8089',
|
||||
json={'jsonrpc': '2.0',
|
||||
'method': 'do_something',
|
||||
'params': {'answer': 42, 'context': self.ctx_json},
|
||||
'id': self.context.request_id})
|
||||
|
||||
def test_call_ipv6_success_rfc2732(self, mock_session):
|
||||
response = mock_session.return_value.post.return_value
|
||||
response.json.return_value = {
|
||||
'jsonrpc': '2.0',
|
||||
'result': 42
|
||||
}
|
||||
cctx = self.client.prepare('foo.[2001:db8::1]:8192')
|
||||
self.assertEqual('2001:db8::1', cctx.host)
|
||||
result = cctx.call(self.context, 'do_something', answer=42)
|
||||
self.assertEqual(42, result)
|
||||
mock_session.return_value.post.assert_called_once_with(
|
||||
'http://[2001:db8::1]:8192',
|
||||
json={'jsonrpc': '2.0',
|
||||
'method': 'do_something',
|
||||
'params': {'answer': 42, 'context': self.ctx_json},
|
||||
'id': self.context.request_id})
|
||||
|
||||
def test_call_success_with_version(self, mock_session):
|
||||
response = mock_session.return_value.post.return_value
|
||||
response.json.return_value = {
|
||||
'jsonrpc': '2.0',
|
||||
'result': 42
|
||||
}
|
||||
cctx = self.client.prepare('foo.example.com:8192', version='1.42')
|
||||
self.assertEqual('example.com', cctx.host)
|
||||
result = cctx.call(self.context, 'do_something', answer=42)
|
||||
self.assertEqual(42, result)
|
||||
mock_session.return_value.post.assert_called_once_with(
|
||||
'http://example.com:8192',
|
||||
json={'jsonrpc': '2.0',
|
||||
'method': 'do_something',
|
||||
'params': {'answer': 42, 'context': self.ctx_json,
|
||||
'rpc.version': '1.42'},
|
||||
'id': self.context.request_id})
|
||||
|
||||
def test_call_success_with_version_and_cap(self, mock_session):
|
||||
self.client = client.Client(self.serializer, version_cap='1.99')
|
||||
response = mock_session.return_value.post.return_value
|
||||
response.json.return_value = {
|
||||
'jsonrpc': '2.0',
|
||||
'result': 42
|
||||
}
|
||||
cctx = self.client.prepare('foo.example.com', version='1.42')
|
||||
self.assertEqual('example.com', cctx.host)
|
||||
result = cctx.call(self.context, 'do_something', answer=42)
|
||||
self.assertEqual(42, result)
|
||||
mock_session.return_value.post.assert_called_once_with(
|
||||
'http://example.com:8089',
|
||||
json={'jsonrpc': '2.0',
|
||||
'method': 'do_something',
|
||||
'params': {'answer': 42, 'context': self.ctx_json,
|
||||
'rpc.version': '1.42'},
|
||||
'id': self.context.request_id})
|
||||
|
||||
def test_call_with_ssl(self, mock_session):
|
||||
self.config(use_ssl=True, group='json_rpc')
|
||||
response = mock_session.return_value.post.return_value
|
||||
response.json.return_value = {
|
||||
'jsonrpc': '2.0',
|
||||
'result': 42
|
||||
}
|
||||
cctx = self.client.prepare('foo.example.com')
|
||||
self.assertEqual('example.com', cctx.host)
|
||||
result = cctx.call(self.context, 'do_something', answer=42)
|
||||
self.assertEqual(42, result)
|
||||
mock_session.return_value.post.assert_called_once_with(
|
||||
'https://example.com:8089',
|
||||
json={'jsonrpc': '2.0',
|
||||
'method': 'do_something',
|
||||
'params': {'answer': 42, 'context': self.ctx_json},
|
||||
'id': self.context.request_id})
|
||||
|
||||
def test_cast_success(self, mock_session):
|
||||
cctx = self.client.prepare('foo.example.com')
|
||||
self.assertEqual('example.com', cctx.host)
|
||||
result = cctx.cast(self.context, 'do_something', answer=42)
|
||||
self.assertIsNone(result)
|
||||
mock_session.return_value.post.assert_called_once_with(
|
||||
'http://example.com:8089',
|
||||
json={'jsonrpc': '2.0',
|
||||
'method': 'do_something',
|
||||
'params': {'answer': 42, 'context': self.ctx_json}})
|
||||
|
||||
def test_cast_success_with_version(self, mock_session):
|
||||
cctx = self.client.prepare('foo.example.com', version='1.42')
|
||||
self.assertEqual('example.com', cctx.host)
|
||||
result = cctx.cast(self.context, 'do_something', answer=42)
|
||||
self.assertIsNone(result)
|
||||
mock_session.return_value.post.assert_called_once_with(
|
||||
'http://example.com:8089',
|
||||
json={'jsonrpc': '2.0',
|
||||
'method': 'do_something',
|
||||
'params': {'answer': 42, 'context': self.ctx_json,
|
||||
'rpc.version': '1.42'}})
|
||||
|
||||
def test_call_failure(self, mock_session):
|
||||
response = mock_session.return_value.post.return_value
|
||||
response.json.return_value = {
|
||||
'jsonrpc': '2.0',
|
||||
'error': {
|
||||
'code': 418,
|
||||
'message': 'I am a teapot',
|
||||
'data': {
|
||||
'class': 'ironic_lib.exception.BadRequest'
|
||||
}
|
||||
}
|
||||
}
|
||||
cctx = self.client.prepare('foo.example.com')
|
||||
self.assertEqual('example.com', cctx.host)
|
||||
# Make sure that the class is restored correctly for expected errors.
|
||||
exc = self.assertRaises(exception.BadRequest,
|
||||
cctx.call,
|
||||
self.context, 'do_something', answer=42)
|
||||
# Code from the body has priority over one in the class.
|
||||
self.assertEqual(418, exc.code)
|
||||
self.assertIn('I am a teapot', str(exc))
|
||||
mock_session.return_value.post.assert_called_once_with(
|
||||
'http://example.com:8089',
|
||||
json={'jsonrpc': '2.0',
|
||||
'method': 'do_something',
|
||||
'params': {'answer': 42, 'context': self.ctx_json},
|
||||
'id': self.context.request_id})
|
||||
|
||||
def test_call_unexpected_failure(self, mock_session):
|
||||
response = mock_session.return_value.post.return_value
|
||||
response.json.return_value = {
|
||||
'jsonrpc': '2.0',
|
||||
'error': {
|
||||
'code': 500,
|
||||
'message': 'AttributeError',
|
||||
}
|
||||
}
|
||||
cctx = self.client.prepare('foo.example.com')
|
||||
self.assertEqual('example.com', cctx.host)
|
||||
exc = self.assertRaises(exception.IronicException,
|
||||
cctx.call,
|
||||
self.context, 'do_something', answer=42)
|
||||
self.assertEqual(500, exc.code)
|
||||
self.assertIn('Unexpected error', str(exc))
|
||||
mock_session.return_value.post.assert_called_once_with(
|
||||
'http://example.com:8089',
|
||||
json={'jsonrpc': '2.0',
|
||||
'method': 'do_something',
|
||||
'params': {'answer': 42, 'context': self.ctx_json},
|
||||
'id': self.context.request_id})
|
||||
|
||||
def test_call_failure_with_foreign_class(self, mock_session):
|
||||
# This should not happen, but provide an additional safeguard
|
||||
response = mock_session.return_value.post.return_value
|
||||
response.json.return_value = {
|
||||
'jsonrpc': '2.0',
|
||||
'error': {
|
||||
'code': 500,
|
||||
'message': 'AttributeError',
|
||||
'data': {
|
||||
'class': 'AttributeError'
|
||||
}
|
||||
}
|
||||
}
|
||||
cctx = self.client.prepare('foo.example.com')
|
||||
self.assertEqual('example.com', cctx.host)
|
||||
exc = self.assertRaises(exception.IronicException,
|
||||
cctx.call,
|
||||
self.context, 'do_something', answer=42)
|
||||
self.assertEqual(500, exc.code)
|
||||
self.assertIn('Unexpected error', str(exc))
|
||||
mock_session.return_value.post.assert_called_once_with(
|
||||
'http://example.com:8089',
|
||||
json={'jsonrpc': '2.0',
|
||||
'method': 'do_something',
|
||||
'params': {'answer': 42, 'context': self.ctx_json},
|
||||
'id': self.context.request_id})
|
||||
|
||||
def test_cast_failure(self, mock_session):
|
||||
# Cast cannot return normal failures, but make sure we ignore them even
|
||||
# if server sends something in violation of the protocol (or because
|
||||
# it's a low-level error like HTTP Forbidden).
|
||||
response = mock_session.return_value.post.return_value
|
||||
response.json.return_value = {
|
||||
'jsonrpc': '2.0',
|
||||
'error': {
|
||||
'code': 418,
|
||||
'message': 'I am a teapot',
|
||||
'data': {
|
||||
'class': 'ironic.common.exception.IronicException'
|
||||
}
|
||||
}
|
||||
}
|
||||
cctx = self.client.prepare('foo.example.com')
|
||||
self.assertEqual('example.com', cctx.host)
|
||||
result = cctx.cast(self.context, 'do_something', answer=42)
|
||||
self.assertIsNone(result)
|
||||
mock_session.return_value.post.assert_called_once_with(
|
||||
'http://example.com:8089',
|
||||
json={'jsonrpc': '2.0',
|
||||
'method': 'do_something',
|
||||
'params': {'answer': 42, 'context': self.ctx_json}})
|
||||
|
||||
def test_call_failure_with_version_and_cap(self, mock_session):
|
||||
self.client = client.Client(self.serializer, version_cap='1.42')
|
||||
cctx = self.client.prepare('foo.example.com', version='1.99')
|
||||
self.assertRaisesRegex(RuntimeError,
|
||||
"requested version 1.99, maximum allowed "
|
||||
"version is 1.42",
|
||||
cctx.call, self.context, 'do_something',
|
||||
answer=42)
|
||||
self.assertFalse(mock_session.return_value.post.called)
|
||||
|
||||
@mock.patch.object(client.LOG, 'debug', autospec=True)
|
||||
def test_mask_secrets(self, mock_log, mock_session):
|
||||
request = {
|
||||
'redfish_username': 'admin',
|
||||
'redfish_password': 'passw0rd'
|
||||
}
|
||||
body = """{
|
||||
"jsonrpc": "2.0",
|
||||
"result": {
|
||||
"driver_info": {
|
||||
"ipmi_username": "admin",
|
||||
"ipmi_password": "passw0rd"
|
||||
}
|
||||
}
|
||||
}"""
|
||||
response = mock_session.return_value.post.return_value
|
||||
response.text = body
|
||||
cctx = self.client.prepare('foo.example.com')
|
||||
cctx.cast(self.context, 'do_something', node=request)
|
||||
mock_session.return_value.post.assert_called_once_with(
|
||||
'http://example.com:8089',
|
||||
json={'jsonrpc': '2.0',
|
||||
'method': 'do_something',
|
||||
'params': {'node': request, 'context': self.ctx_json}})
|
||||
self.assertEqual(2, mock_log.call_count)
|
||||
node = mock_log.call_args_list[0][0][3]['params']['node']
|
||||
self.assertEqual(node, {'redfish_username': 'admin',
|
||||
'redfish_password': '***'})
|
||||
resp_text = mock_log.call_args_list[1][0][3]
|
||||
self.assertEqual(body.replace('passw0rd', '***'), resp_text)
|
||||
|
||||
|
||||
@mock.patch('ironic_lib.json_rpc.client.keystone', autospec=True)
|
||||
class TestSession(base.IronicLibTestCase):
|
||||
|
||||
def setUp(self):
|
||||
super(TestSession, self).setUp()
|
||||
client._SESSION = None
|
||||
|
||||
def test_noauth(self, mock_keystone):
|
||||
self.config(auth_strategy='noauth', group='json_rpc')
|
||||
session = client._get_session()
|
||||
|
||||
mock_keystone.get_auth.assert_called_once_with('json_rpc')
|
||||
auth = mock_keystone.get_auth.return_value
|
||||
|
||||
mock_keystone.get_session.assert_called_once_with(
|
||||
'json_rpc', auth=auth)
|
||||
|
||||
internal_session = mock_keystone.get_session.return_value
|
||||
|
||||
mock_keystone.get_adapter.assert_called_once_with(
|
||||
'json_rpc',
|
||||
session=internal_session,
|
||||
additional_headers={
|
||||
'Content-Type': 'application/json'
|
||||
})
|
||||
self.assertEqual(mock_keystone.get_adapter.return_value, session)
|
||||
|
||||
def test_keystone(self, mock_keystone):
|
||||
self.config(auth_strategy='keystone', group='json_rpc')
|
||||
session = client._get_session()
|
||||
|
||||
mock_keystone.get_auth.assert_called_once_with('json_rpc')
|
||||
auth = mock_keystone.get_auth.return_value
|
||||
|
||||
mock_keystone.get_session.assert_called_once_with(
|
||||
'json_rpc', auth=auth)
|
||||
|
||||
internal_session = mock_keystone.get_session.return_value
|
||||
|
||||
mock_keystone.get_adapter.assert_called_once_with(
|
||||
'json_rpc',
|
||||
session=internal_session,
|
||||
additional_headers={
|
||||
'Content-Type': 'application/json'
|
||||
})
|
||||
self.assertEqual(mock_keystone.get_adapter.return_value, session)
|
||||
|
||||
def test_http_basic(self, mock_keystone):
|
||||
self.config(auth_strategy='http_basic', group='json_rpc')
|
||||
session = client._get_session()
|
||||
|
||||
mock_keystone.get_auth.assert_called_once_with('json_rpc')
|
||||
auth = mock_keystone.get_auth.return_value
|
||||
mock_keystone.get_session.assert_called_once_with(
|
||||
'json_rpc', auth=auth)
|
||||
|
||||
internal_session = mock_keystone.get_session.return_value
|
||||
|
||||
mock_keystone.get_adapter.assert_called_once_with(
|
||||
'json_rpc',
|
||||
session=internal_session,
|
||||
additional_headers={
|
||||
'Content-Type': 'application/json'
|
||||
})
|
||||
self.assertEqual(mock_keystone.get_adapter.return_value, session)
|
||||
|
||||
def test_http_basic_deprecated(self, mock_keystone):
|
||||
self.config(auth_strategy='http_basic', group='json_rpc')
|
||||
self.config(http_basic_username='myName', group='json_rpc')
|
||||
self.config(http_basic_password='myPassword', group='json_rpc')
|
||||
session = client._get_session()
|
||||
|
||||
mock_keystone.get_auth.assert_called_once_with(
|
||||
'json_rpc', username='myName', password='myPassword')
|
||||
auth = mock_keystone.get_auth.return_value
|
||||
mock_keystone.get_session.assert_called_once_with(
|
||||
'json_rpc', auth=auth)
|
||||
|
||||
internal_session = mock_keystone.get_session.return_value
|
||||
|
||||
mock_keystone.get_adapter.assert_called_once_with(
|
||||
'json_rpc',
|
||||
session=internal_session,
|
||||
additional_headers={
|
||||
'Content-Type': 'application/json'
|
||||
})
|
||||
self.assertEqual(mock_keystone.get_adapter.return_value, session)
|
@ -1,121 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from unittest import mock

from keystoneauth1 import loading as ks_loading
from oslo_config import cfg

from ironic_lib import exception
from ironic_lib import keystone
from ironic_lib.tests import base


class KeystoneTestCase(base.IronicLibTestCase):

    def setUp(self):
        super(KeystoneTestCase, self).setUp()
        self.test_group = 'test_group'
        self.cfg_fixture.conf.register_group(cfg.OptGroup(self.test_group))
        keystone.register_auth_opts(self.cfg_fixture.conf, self.test_group,
                                    service_type='vikings')
        self.config(auth_type='password',
                    group=self.test_group)
        # NOTE(pas-ha) this is due to auth_plugin options
        # being dynamically registered on first load,
        # but we need to set the config before
        plugin = ks_loading.get_plugin_loader('password')
        opts = ks_loading.get_auth_plugin_conf_options(plugin)
        self.cfg_fixture.register_opts(opts, group=self.test_group)
        self.config(auth_url='http://127.0.0.1:9898',
                    username='fake_user',
                    password='fake_pass',
                    project_name='fake_tenant',
                    group=self.test_group)

    def test_get_session(self):
        self.config(timeout=10, group=self.test_group)
        session = keystone.get_session(self.test_group, timeout=20)
        self.assertEqual(20, session.timeout)

    def test_get_auth(self):
        auth = keystone.get_auth(self.test_group)
        self.assertEqual('http://127.0.0.1:9898', auth.auth_url)

    def test_get_auth_fail(self):
        # NOTE(pas-ha) 'password' auth_plugin is used,
        # so when we set the required auth_url to None,
        # MissingOption is raised
        self.config(auth_url=None, group=self.test_group)
        self.assertRaises(exception.ConfigInvalid,
                          keystone.get_auth,
                          self.test_group)

    def test_get_adapter_from_config(self):
        self.config(valid_interfaces=['internal', 'public'],
                    group=self.test_group)
        session = keystone.get_session(self.test_group)
        adapter = keystone.get_adapter(self.test_group, session=session,
                                       interface='admin')
        self.assertEqual('admin', adapter.interface)
        self.assertEqual(session, adapter.session)

    @mock.patch('keystoneauth1.service_token.ServiceTokenAuthWrapper',
                autospec=True)
    @mock.patch('keystoneauth1.token_endpoint.Token', autospec=True)
    def test_get_service_auth(self, token_mock, service_auth_mock):
        ctxt = mock.Mock(spec=['auth_token'], auth_token='spam')
        mock_auth = mock.Mock()
        self.assertEqual(service_auth_mock.return_value,
                         keystone.get_service_auth(ctxt, 'ham', mock_auth))
        token_mock.assert_called_once_with('ham', 'spam')
        service_auth_mock.assert_called_once_with(
            user_auth=token_mock.return_value, service_auth=mock_auth)


class AuthConfTestCase(base.IronicLibTestCase):

    def setUp(self):
        super(AuthConfTestCase, self).setUp()
        self.test_group = 'test_group'
        self.cfg_fixture.conf.register_group(cfg.OptGroup(self.test_group))
        keystone.register_auth_opts(self.cfg_fixture.conf, self.test_group)
        self.config(auth_type='password',
                    group=self.test_group)
        # NOTE(pas-ha) this is due to auth_plugin options
        # being dynamically registered on first load,
        # but we need to set the config before
        plugin = ks_loading.get_plugin_loader('password')
        opts = ks_loading.get_auth_plugin_conf_options(plugin)
        self.cfg_fixture.register_opts(opts, group=self.test_group)
        self.config(auth_url='http://127.0.0.1:9898',
                    username='fake_user',
                    password='fake_pass',
                    project_name='fake_tenant',
                    group=self.test_group)

    def test_add_auth_opts(self):
        opts = keystone.add_auth_opts([])
        # check that there are no duplicates
        names = {o.dest for o in opts}
        self.assertEqual(len(names), len(opts))
        # NOTE(pas-ha) checking for most standard auth and session ones only
        expected = {'timeout', 'insecure', 'cafile', 'certfile', 'keyfile',
                    'auth_type', 'auth_url', 'username', 'password',
                    'tenant_name', 'project_name', 'trust_id',
                    'domain_id', 'user_domain_id', 'project_domain_id'}
        self.assertTrue(expected.issubset(names))

    def test_os_service_types_alias(self):
        keystone.register_auth_opts(self.cfg_fixture.conf, 'barbican')
        self.assertEqual(self.cfg_fixture.conf.barbican.service_type,
                         'key-manager')
@ -1,355 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import socket
from unittest import mock

from oslo_config import cfg
import zeroconf

from ironic_lib import exception
from ironic_lib import mdns
from ironic_lib.tests import base

CONF = cfg.CONF


@mock.patch.object(zeroconf, 'Zeroconf', autospec=True)
class RegisterServiceTestCase(base.IronicLibTestCase):

    def test_ok(self, mock_zc):
        zc = mdns.Zeroconf()
        zc.register_service('baremetal', 'https://127.0.0.1/baremetal')
        mock_zc.assert_called_once_with(
            interfaces=zeroconf.InterfaceChoice.All,
            ip_version=zeroconf.IPVersion.All)
        mock_zc.return_value.register_service.assert_called_once_with(mock.ANY)
        info = mock_zc.return_value.register_service.call_args[0][0]
        self.assertEqual('_openstack._tcp.local.', info.type)
        self.assertEqual('baremetal._openstack._tcp.local.', info.name)
        self.assertEqual('127.0.0.1', socket.inet_ntoa(info.addresses[0]))
        self.assertEqual({b'path': b'/baremetal'}, info.properties)

    def test_with_params(self, mock_zc):
        CONF.set_override('params', {'answer': 'none', 'foo': 'bar'},
                          group='mdns')
        zc = mdns.Zeroconf()
        zc.register_service('baremetal', 'https://127.0.0.1/baremetal',
                            params={'answer': b'42'})
        mock_zc.return_value.register_service.assert_called_once_with(mock.ANY)
        info = mock_zc.return_value.register_service.call_args[0][0]
        self.assertEqual('_openstack._tcp.local.', info.type)
        self.assertEqual('baremetal._openstack._tcp.local.', info.name)
        self.assertEqual('127.0.0.1', socket.inet_ntoa(info.addresses[0]))
        self.assertEqual({b'path': b'/baremetal',
                          b'answer': b'42',
                          b'foo': b'bar'},
                         info.properties)

    @mock.patch.object(mdns.time, 'sleep', autospec=True)
    def test_with_race(self, mock_sleep, mock_zc):
        mock_zc.return_value.register_service.side_effect = [
            zeroconf.NonUniqueNameException,
            zeroconf.NonUniqueNameException,
            zeroconf.NonUniqueNameException,
            None
        ]
        zc = mdns.Zeroconf()
        zc.register_service('baremetal', 'https://127.0.0.1/baremetal')
        mock_zc.return_value.register_service.assert_called_with(mock.ANY)
        self.assertEqual(4, mock_zc.return_value.register_service.call_count)
        mock_sleep.assert_has_calls([mock.call(i) for i in (0.1, 0.2, 0.4)])

    def test_with_interfaces(self, mock_zc):
        CONF.set_override('interfaces', ['10.0.0.1', '192.168.1.1'],
                          group='mdns')
        zc = mdns.Zeroconf()
        zc.register_service('baremetal', 'https://127.0.0.1/baremetal')
        mock_zc.assert_called_once_with(interfaces=['10.0.0.1', '192.168.1.1'],
                                        ip_version=None)
        mock_zc.return_value.register_service.assert_called_once_with(mock.ANY)
        info = mock_zc.return_value.register_service.call_args[0][0]
        self.assertEqual('_openstack._tcp.local.', info.type)
        self.assertEqual('baremetal._openstack._tcp.local.', info.name)
        self.assertEqual('127.0.0.1', socket.inet_ntoa(info.addresses[0]))
        self.assertEqual({b'path': b'/baremetal'}, info.properties)

    @mock.patch.object(mdns.time, 'sleep', autospec=True)
    def test_failure(self, mock_sleep, mock_zc):
        mock_zc.return_value.register_service.side_effect = (
            zeroconf.NonUniqueNameException
        )
        zc = mdns.Zeroconf()
        self.assertRaises(exception.ServiceRegistrationFailure,
                          zc.register_service,
                          'baremetal', 'https://127.0.0.1/baremetal')
        mock_zc.return_value.register_service.assert_called_with(mock.ANY)
        self.assertEqual(CONF.mdns.registration_attempts,
                         mock_zc.return_value.register_service.call_count)
        self.assertEqual(CONF.mdns.registration_attempts - 1,
                         mock_sleep.call_count)


class ParseEndpointTestCase(base.IronicLibTestCase):

    def test_simple(self):
        endpoint = mdns._parse_endpoint('http://127.0.0.1')
        self.assertEqual(1, len(endpoint.addresses))
        self.assertEqual('127.0.0.1', socket.inet_ntoa(endpoint.addresses[0]))
        self.assertEqual(80, endpoint.port)
        self.assertEqual({}, endpoint.params)
        self.assertIsNone(endpoint.hostname)

    def test_simple_https(self):
        endpoint = mdns._parse_endpoint('https://127.0.0.1')
        self.assertEqual(1, len(endpoint.addresses))
        self.assertEqual('127.0.0.1', socket.inet_ntoa(endpoint.addresses[0]))
        self.assertEqual(443, endpoint.port)
        self.assertEqual({}, endpoint.params)
        self.assertIsNone(endpoint.hostname)

    def test_with_path_and_port(self):
        endpoint = mdns._parse_endpoint('http://127.0.0.1:8080/bm')
        self.assertEqual(1, len(endpoint.addresses))
        self.assertEqual('127.0.0.1', socket.inet_ntoa(endpoint.addresses[0]))
        self.assertEqual(8080, endpoint.port)
        self.assertEqual({'path': '/bm', 'protocol': 'http'}, endpoint.params)
        self.assertIsNone(endpoint.hostname)

    @mock.patch.object(socket, 'getaddrinfo', autospec=True)
    def test_resolve(self, mock_resolve):
        mock_resolve.return_value = [
            (socket.AF_INET, None, None, None, ('1.2.3.4',)),
            (socket.AF_INET6, None, None, None, ('::2', 'scope')),
        ]
        endpoint = mdns._parse_endpoint('http://example.com')
        self.assertEqual(2, len(endpoint.addresses))
        self.assertEqual('1.2.3.4', socket.inet_ntoa(endpoint.addresses[0]))
        self.assertEqual('::2', socket.inet_ntop(socket.AF_INET6,
                                                 endpoint.addresses[1]))
        self.assertEqual(80, endpoint.port)
        self.assertEqual({}, endpoint.params)
        self.assertEqual('example.com.', endpoint.hostname)
        mock_resolve.assert_called_once_with('example.com', 80, mock.ANY,
                                             socket.IPPROTO_TCP)


@mock.patch('ironic_lib.utils.get_route_source', autospec=True)
@mock.patch('zeroconf.Zeroconf', autospec=True)
class GetEndpointTestCase(base.IronicLibTestCase):
    def test_simple(self, mock_zc, mock_route):
        mock_zc.return_value.get_service_info.return_value = mock.Mock(
            address=socket.inet_aton('192.168.1.1'),
            port=80,
            properties={},
            **{'parsed_addresses.return_value': ['192.168.1.1']}
        )

        endp, params = mdns.get_endpoint('baremetal')
        self.assertEqual('http://192.168.1.1:80', endp)
        self.assertEqual({}, params)
        mock_zc.return_value.get_service_info.assert_called_once_with(
            'baremetal._openstack._tcp.local.',
            'baremetal._openstack._tcp.local.'
        )
        mock_zc.return_value.close.assert_called_once_with()

    def test_v6(self, mock_zc, mock_route):
        mock_zc.return_value.get_service_info.return_value = mock.Mock(
            port=80,
            properties={},
            **{'parsed_addresses.return_value': ['::2']}
        )

        endp, params = mdns.get_endpoint('baremetal')
        self.assertEqual('http://[::2]:80', endp)
        self.assertEqual({}, params)
        mock_zc.return_value.get_service_info.assert_called_once_with(
            'baremetal._openstack._tcp.local.',
            'baremetal._openstack._tcp.local.'
        )
        mock_zc.return_value.close.assert_called_once_with()

    def test_skip_invalid(self, mock_zc, mock_route):
        mock_zc.return_value.get_service_info.return_value = mock.Mock(
            port=80,
            properties={},
            **{'parsed_addresses.return_value': ['::1', '::2', '::3']}
        )
        mock_route.side_effect = [None, '::4']

        endp, params = mdns.get_endpoint('baremetal')
        self.assertEqual('http://[::3]:80', endp)
        self.assertEqual({}, params)
        mock_zc.return_value.get_service_info.assert_called_once_with(
            'baremetal._openstack._tcp.local.',
            'baremetal._openstack._tcp.local.'
        )
        mock_zc.return_value.close.assert_called_once_with()
        self.assertEqual(2, mock_route.call_count)

    def test_fallback(self, mock_zc, mock_route):
        mock_zc.return_value.get_service_info.return_value = mock.Mock(
            port=80,
            properties={},
            **{'parsed_addresses.return_value': ['::2', '::3']}
        )
        mock_route.side_effect = [None, None]

        endp, params = mdns.get_endpoint('baremetal')
        self.assertEqual('http://[::2]:80', endp)
        self.assertEqual({}, params)
        mock_zc.return_value.get_service_info.assert_called_once_with(
            'baremetal._openstack._tcp.local.',
            'baremetal._openstack._tcp.local.'
        )
        mock_zc.return_value.close.assert_called_once_with()
        self.assertEqual(2, mock_route.call_count)

    def test_localhost_only(self, mock_zc, mock_route):
        mock_zc.return_value.get_service_info.return_value = mock.Mock(
            port=80,
            properties={},
            **{'parsed_addresses.return_value': ['::1']}
        )

        self.assertRaises(exception.ServiceLookupFailure,
                          mdns.get_endpoint, 'baremetal')
        mock_zc.return_value.get_service_info.assert_called_once_with(
            'baremetal._openstack._tcp.local.',
            'baremetal._openstack._tcp.local.'
        )
        mock_zc.return_value.close.assert_called_once_with()
        self.assertFalse(mock_route.called)

    def test_https(self, mock_zc, mock_route):
        mock_zc.return_value.get_service_info.return_value = mock.Mock(
            address=socket.inet_aton('192.168.1.1'),
            port=443,
            properties={},
            **{'parsed_addresses.return_value': ['192.168.1.1']}
        )

        endp, params = mdns.get_endpoint('baremetal')
        self.assertEqual('https://192.168.1.1:443', endp)
        self.assertEqual({}, params)
        mock_zc.return_value.get_service_info.assert_called_once_with(
            'baremetal._openstack._tcp.local.',
            'baremetal._openstack._tcp.local.'
        )

    def test_with_custom_port_and_path(self, mock_zc, mock_route):
        mock_zc.return_value.get_service_info.return_value = mock.Mock(
            address=socket.inet_aton('192.168.1.1'),
            port=8080,
            properties={b'path': b'/baremetal'},
            **{'parsed_addresses.return_value': ['192.168.1.1']}
        )

        endp, params = mdns.get_endpoint('baremetal')
        self.assertEqual('https://192.168.1.1:8080/baremetal', endp)
        self.assertEqual({}, params)
        mock_zc.return_value.get_service_info.assert_called_once_with(
            'baremetal._openstack._tcp.local.',
            'baremetal._openstack._tcp.local.'
        )

    def test_with_custom_port_path_and_protocol(self, mock_zc, mock_route):
        mock_zc.return_value.get_service_info.return_value = mock.Mock(
            address=socket.inet_aton('192.168.1.1'),
            port=8080,
            properties={b'path': b'/baremetal', b'protocol': b'http'},
            **{'parsed_addresses.return_value': ['192.168.1.1']}
        )

        endp, params = mdns.get_endpoint('baremetal')
        self.assertEqual('http://192.168.1.1:8080/baremetal', endp)
        self.assertEqual({}, params)
        mock_zc.return_value.get_service_info.assert_called_once_with(
            'baremetal._openstack._tcp.local.',
            'baremetal._openstack._tcp.local.'
        )

    def test_with_params(self, mock_zc, mock_route):
        mock_zc.return_value.get_service_info.return_value = mock.Mock(
            address=socket.inet_aton('192.168.1.1'),
            port=80,
            properties={b'ipa_debug': True},
            **{'parsed_addresses.return_value': ['192.168.1.1']}
        )

        endp, params = mdns.get_endpoint('baremetal')
        self.assertEqual('http://192.168.1.1:80', endp)
        self.assertEqual({'ipa_debug': True}, params)
        mock_zc.return_value.get_service_info.assert_called_once_with(
            'baremetal._openstack._tcp.local.',
            'baremetal._openstack._tcp.local.'
        )

    def test_binary_data(self, mock_zc, mock_route):
        mock_zc.return_value.get_service_info.return_value = mock.Mock(
            address=socket.inet_aton('192.168.1.1'),
            port=80,
            properties={b'ipa_debug': True, b'binary': b'\xe2\x28\xa1'},
            **{'parsed_addresses.return_value': ['192.168.1.1']}
        )

        endp, params = mdns.get_endpoint('baremetal')
        self.assertEqual('http://192.168.1.1:80', endp)
        self.assertEqual({'ipa_debug': True, 'binary': b'\xe2\x28\xa1'},
                         params)
        mock_zc.return_value.get_service_info.assert_called_once_with(
            'baremetal._openstack._tcp.local.',
            'baremetal._openstack._tcp.local.'
        )

    def test_invalid_key(self, mock_zc, mock_route):
        mock_zc.return_value.get_service_info.return_value = mock.Mock(
            address=socket.inet_aton('192.168.1.1'),
            port=80,
            properties={b'ipa_debug': True, b'\xc3\x28': b'value'},
            **{'parsed_addresses.return_value': ['192.168.1.1']}
        )

        self.assertRaisesRegex(exception.ServiceLookupFailure,
                               'Cannot decode key',
                               mdns.get_endpoint, 'baremetal')
        mock_zc.return_value.get_service_info.assert_called_once_with(
            'baremetal._openstack._tcp.local.',
            'baremetal._openstack._tcp.local.'
        )

    def test_with_server(self, mock_zc, mock_route):
        mock_zc.return_value.get_service_info.return_value = mock.Mock(
            address=socket.inet_aton('192.168.1.1'),
            port=443,
            server='openstack.example.com.',
            properties={},
            **{'parsed_addresses.return_value': ['192.168.1.1']}
        )

        endp, params = mdns.get_endpoint('baremetal')
        self.assertEqual('https://openstack.example.com:443', endp)
        self.assertEqual({}, params)
        mock_zc.return_value.get_service_info.assert_called_once_with(
            'baremetal._openstack._tcp.local.',
            'baremetal._openstack._tcp.local.'
        )

    @mock.patch('time.sleep', autospec=True)
    def test_not_found(self, mock_sleep, mock_zc, mock_route):
        mock_zc.return_value.get_service_info.return_value = None

        self.assertRaisesRegex(exception.ServiceLookupFailure,
                               'baremetal service',
                               mdns.get_endpoint, 'baremetal')
        self.assertEqual(CONF.mdns.lookup_attempts - 1, mock_sleep.call_count)
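The `test_with_race` case above encodes a retry contract: three `NonUniqueNameException` failures followed by success, with sleeps of 0.1, 0.2 and 0.4 seconds between attempts. A minimal standalone sketch of that doubling-backoff loop (the attempt count, base delay, and the injected `register`/`sleep` callables are illustrative assumptions, not the library's actual internals):

```python
import time


def register_with_backoff(register, attempts=4, base_delay=0.1,
                          sleep=time.sleep):
    """Retry ``register()`` with doubling delays between attempts.

    Mirrors the delays the race test asserts (0.1, 0.2, 0.4s before the
    fourth, successful attempt).  ``register`` and ``sleep`` are injected
    here only so the behaviour is easy to exercise in isolation.
    """
    delay = base_delay
    for attempt in range(attempts):
        try:
            return register()
        except Exception:
            # Re-raise once the attempt budget is exhausted.
            if attempt == attempts - 1:
                raise
            sleep(delay)
            delay *= 2
```

With a callable that fails three times, the recorded sleeps match the test's expectation exactly.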
@ -1,204 +0,0 @@
# Copyright 2016 Rackspace Hosting
# All Rights Reserved
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import types
from unittest import mock

from oslo_utils import reflection

from ironic_lib import metrics as metricslib
from ironic_lib import metrics_utils
from ironic_lib.tests import base


METRICS = metrics_utils.get_metrics_logger(prefix='foo', backend='noop')


@METRICS.timer('testing1')
def timer_check(run, timer=None):
    pass


@METRICS.counter('testing2')
def counter_check(run, counter=None):
    pass


@METRICS.gauge('testing2')
def gauge_check(run, gauge=None):
    pass


class MockedMetricLogger(metricslib.MetricLogger):
    _gauge = mock.Mock(spec_set=types.FunctionType)
    _counter = mock.Mock(spec_set=types.FunctionType)
    _timer = mock.Mock(spec_set=types.FunctionType)


class TestMetricReflection(base.IronicLibTestCase):
    def test_timer_reflection(self):
        # Ensure our decorator is done correctly (functools.wraps) and we can
        # get the arguments of our decorated function.
        expected = ['run', 'timer']
        signature = reflection.get_signature(timer_check)
        parameters = list(signature.parameters)
        self.assertEqual(expected, parameters)

    def test_counter_reflection(self):
        # Ensure our decorator is done correctly (functools.wraps) and we can
        # get the arguments of our decorated function.
        expected = ['run', 'counter']
        signature = reflection.get_signature(counter_check)
        parameters = list(signature.parameters)
        self.assertEqual(expected, parameters)

    def test_gauge_reflection(self):
        # Ensure our decorator is done correctly (functools.wraps) and we can
        # get the arguments of our decorated function.
        expected = ['run', 'gauge']
        signature = reflection.get_signature(gauge_check)
        parameters = list(signature.parameters)
        self.assertEqual(expected, parameters)


class TestMetricLogger(base.IronicLibTestCase):
    def setUp(self):
        super(TestMetricLogger, self).setUp()
        self.ml = MockedMetricLogger('prefix', '.')
        self.ml_no_prefix = MockedMetricLogger('', '.')
        self.ml_other_delim = MockedMetricLogger('prefix', '*')
        self.ml_default = MockedMetricLogger()

    def test_init(self):
        self.assertEqual(self.ml._prefix, 'prefix')
        self.assertEqual(self.ml._delimiter, '.')

        self.assertEqual(self.ml_no_prefix._prefix, '')
        self.assertEqual(self.ml_other_delim._delimiter, '*')
        self.assertEqual(self.ml_default._prefix, '')

    def test_get_metric_name(self):
        self.assertEqual(
            self.ml.get_metric_name('metric'),
            'prefix.metric')

        self.assertEqual(
            self.ml_no_prefix.get_metric_name('metric'),
            'metric')

        self.assertEqual(
            self.ml_other_delim.get_metric_name('metric'),
            'prefix*metric')

    def test_send_gauge(self):
        self.ml.send_gauge('prefix.metric', 10)
        self.ml._gauge.assert_called_once_with('prefix.metric', 10)

    def test_send_counter(self):
        self.ml.send_counter('prefix.metric', 10)
        self.ml._counter.assert_called_once_with(
            'prefix.metric', 10,
            sample_rate=None)
        self.ml._counter.reset_mock()

        self.ml.send_counter('prefix.metric', 10, sample_rate=1.0)
        self.ml._counter.assert_called_once_with(
            'prefix.metric', 10,
            sample_rate=1.0)
        self.ml._counter.reset_mock()

        self.ml.send_counter('prefix.metric', 10, sample_rate=0.0)
        self.assertFalse(self.ml._counter.called)

    def test_send_timer(self):
        self.ml.send_timer('prefix.metric', 10)
        self.ml._timer.assert_called_once_with('prefix.metric', 10)

    @mock.patch('ironic_lib.metrics._time', autospec=True)
    @mock.patch('ironic_lib.metrics.MetricLogger.send_timer', autospec=True)
    def test_decorator_timer(self, mock_timer, mock_time):
        mock_time.side_effect = [1, 43]

        @self.ml.timer('foo.bar.baz')
        def func(x):
            return x * x

        func(10)

        mock_timer.assert_called_once_with(self.ml, 'prefix.foo.bar.baz',
                                           42 * 1000)

    @mock.patch('ironic_lib.metrics.MetricLogger.send_counter', autospec=True)
    def test_decorator_counter(self, mock_counter):

        @self.ml.counter('foo.bar.baz')
        def func(x):
            return x * x

        func(10)

        mock_counter.assert_called_once_with(self.ml, 'prefix.foo.bar.baz', 1,
                                             sample_rate=None)

    @mock.patch('ironic_lib.metrics.MetricLogger.send_counter', autospec=True)
    def test_decorator_counter_sample_rate(self, mock_counter):

        @self.ml.counter('foo.bar.baz', sample_rate=0.5)
        def func(x):
            return x * x

        func(10)

        mock_counter.assert_called_once_with(self.ml, 'prefix.foo.bar.baz', 1,
                                             sample_rate=0.5)

    @mock.patch('ironic_lib.metrics.MetricLogger.send_gauge', autospec=True)
    def test_decorator_gauge(self, mock_gauge):
        @self.ml.gauge('foo.bar.baz')
        def func(x):
            return x

        func(10)

        mock_gauge.assert_called_once_with(self.ml, 'prefix.foo.bar.baz', 10)

    @mock.patch('ironic_lib.metrics._time', autospec=True)
    @mock.patch('ironic_lib.metrics.MetricLogger.send_timer', autospec=True)
    def test_context_mgr_timer(self, mock_timer, mock_time):
        mock_time.side_effect = [1, 43]

        with self.ml.timer('foo.bar.baz'):
            pass

        mock_timer.assert_called_once_with(self.ml, 'prefix.foo.bar.baz',
                                           42 * 1000)

    @mock.patch('ironic_lib.metrics.MetricLogger.send_counter', autospec=True)
    def test_context_mgr_counter(self, mock_counter):

        with self.ml.counter('foo.bar.baz'):
            pass

        mock_counter.assert_called_once_with(self.ml, 'prefix.foo.bar.baz', 1,
                                             sample_rate=None)

    @mock.patch('ironic_lib.metrics.MetricLogger.send_counter', autospec=True)
    def test_context_mgr_counter_sample_rate(self, mock_counter):

        with self.ml.counter('foo.bar.baz', sample_rate=0.5):
            pass

        mock_counter.assert_called_once_with(self.ml, 'prefix.foo.bar.baz', 1,
                                             sample_rate=0.5)
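The decorator tests above pin down the timer's behaviour: the wrapped function keeps its original signature (via `functools.wraps`, which the reflection tests verify) and elapsed wall time is reported in milliseconds, hence the `42 * 1000` expectation when the mocked clock returns 1 and then 43. A minimal sketch of such a decorator, with `send_timer` and `clock` injected purely for illustration:

```python
import functools
import time


def timed(send_timer, name, clock=time.time):
    """Wrap a function and report its elapsed wall time in milliseconds.

    ``functools.wraps`` preserves the wrapped function's metadata, so
    signature reflection still sees the original parameters.
    """
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = clock()
            try:
                return func(*args, **kwargs)
            finally:
                # Seconds -> milliseconds, matching the 42 * 1000 assertion.
                send_timer(name, (clock() - start) * 1000)
        return wrapper
    return decorator
```

Driving it with a fake clock that ticks 1 then 43 reproduces the test's expected value.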
@ -1,68 +0,0 @@
# Copyright 2016 Rackspace Hosting
# All Rights Reserved
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from unittest import mock


from ironic_lib import metrics_collector
from ironic_lib.tests import base


def connect(family=None, type=None, proto=None):
    """Dummy function to provide signature for autospec"""
    pass


class TestDictCollectionMetricLogger(base.IronicLibTestCase):
    def setUp(self):
        super(TestDictCollectionMetricLogger, self).setUp()
        self.ml = metrics_collector.DictCollectionMetricLogger(
            'prefix', '.')

    @mock.patch('ironic_lib.metrics_collector.'
                'DictCollectionMetricLogger._send',
                autospec=True)
    def test_gauge(self, mock_send):
        self.ml._gauge('metric', 10)
        mock_send.assert_called_once_with(self.ml, 'metric', 10, 'g')

    @mock.patch('ironic_lib.metrics_collector.'
                'DictCollectionMetricLogger._send',
                autospec=True)
    def test_counter(self, mock_send):
        self.ml._counter('metric', 10)
        mock_send.assert_called_once_with(self.ml, 'metric', 10, 'c',
                                          sample_rate=None)

    @mock.patch('ironic_lib.metrics_collector.'
                'DictCollectionMetricLogger._send',
                autospec=True)
    def test_timer(self, mock_send):
        self.ml._timer('metric', 10)
        mock_send.assert_called_once_with(self.ml, 'metric', 10, 'ms')

    def test_send(self):
        expected = {
            'part1.part1': {'count': 2, 'type': 'counter'},
            'part1.part2': {'type': 'gauge', 'value': 66},
            'part1.magic': {'count': 2, 'sum': 22, 'type': 'timer'},
        }
        self.ml._send('part1.part1', 1, 'c')
        self.ml._send('part1.part1', 1, 'c')
        self.ml._send('part1.part2', 66, 'g')
        self.ml._send('part1.magic', 2, 'ms')
        self.ml._send('part1.magic', 20, 'ms')
        results = self.ml.get_metrics_data()
        self.assertEqual(expected, results)
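`test_send` fixes the aggregation semantics the in-memory collector must honour: counters accumulate, gauges keep only the last value, and timers track both the number of samples and their sum. A standalone sketch of that output shape, under the assumption (consistent with the two `('part1.part1', 1, 'c')` sends yielding `count: 2`) that counter values are summed:

```python
def collect(samples):
    """Aggregate (name, value, kind) samples into the dict shape the
    test_send expectation encodes: 'c' -> counter, 'g' -> gauge (last
    value wins), 'ms' -> timer (count and sum of samples)."""
    data = {}
    for name, value, kind in samples:
        if kind == 'c':
            entry = data.setdefault(name, {'type': 'counter', 'count': 0})
            entry['count'] += value
        elif kind == 'g':
            data[name] = {'type': 'gauge', 'value': value}
        elif kind == 'ms':
            entry = data.setdefault(name, {'type': 'timer',
                                           'count': 0, 'sum': 0})
            entry['count'] += 1
            entry['sum'] += value
    return data
```

Feeding it the same five samples as the test reproduces the expected dictionary.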
@ -1,101 +0,0 @@
# Copyright 2016 Rackspace Hosting
# All Rights Reserved
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import socket
from unittest import mock


from ironic_lib import metrics_statsd
from ironic_lib.tests import base


def connect(family=None, type=None, proto=None):
    """Dummy function to provide signature for autospec"""
    pass


class TestStatsdMetricLogger(base.IronicLibTestCase):
    def setUp(self):
        super(TestStatsdMetricLogger, self).setUp()
        self.ml = metrics_statsd.StatsdMetricLogger('prefix', '.', 'test-host',
                                                    4321)

    def test_init(self):
        self.assertEqual(self.ml._host, 'test-host')
        self.assertEqual(self.ml._port, 4321)
        self.assertEqual(self.ml._target, ('test-host', 4321))

    @mock.patch('ironic_lib.metrics_statsd.StatsdMetricLogger._send',
                autospec=True)
    def test_gauge(self, mock_send):
        self.ml._gauge('metric', 10)
        mock_send.assert_called_once_with(self.ml, 'metric', 10, 'g')

    @mock.patch('ironic_lib.metrics_statsd.StatsdMetricLogger._send',
                autospec=True)
    def test_counter(self, mock_send):
        self.ml._counter('metric', 10)
        mock_send.assert_called_once_with(self.ml, 'metric', 10, 'c',
                                          sample_rate=None)
        mock_send.reset_mock()

        self.ml._counter('metric', 10, sample_rate=1.0)
        mock_send.assert_called_once_with(self.ml, 'metric', 10, 'c',
                                          sample_rate=1.0)

    @mock.patch('ironic_lib.metrics_statsd.StatsdMetricLogger._send',
                autospec=True)
    def test_timer(self, mock_send):
        self.ml._timer('metric', 10)
        mock_send.assert_called_once_with(self.ml, 'metric', 10, 'ms')

    @mock.patch('socket.socket', autospec=connect)
    def test_open_socket(self, mock_socket_constructor):
        self.ml._open_socket()
        mock_socket_constructor.assert_called_once_with(
            socket.AF_INET,
            socket.SOCK_DGRAM)

    @mock.patch('socket.socket', autospec=connect)
    def test_send(self, mock_socket_constructor):
        mock_socket = mock.Mock()
        mock_socket_constructor.return_value = mock_socket

        self.ml._send('part1.part2', 2, 'type')
        mock_socket.sendto.assert_called_once_with(
            b'part1.part2:2|type',
            ('test-host', 4321))
        mock_socket.close.assert_called_once_with()
        mock_socket.reset_mock()

        self.ml._send('part1.part2', 3.14159, 'type')
        mock_socket.sendto.assert_called_once_with(
            b'part1.part2:3.14159|type',
            ('test-host', 4321))
        mock_socket.close.assert_called_once_with()
        mock_socket.reset_mock()

        self.ml._send('part1.part2', 5, 'type')
        mock_socket.sendto.assert_called_once_with(
            b'part1.part2:5|type',
            ('test-host', 4321))
        mock_socket.close.assert_called_once_with()
        mock_socket.reset_mock()

        self.ml._send('part1.part2', 5, 'type', sample_rate=0.5)
        mock_socket.sendto.assert_called_once_with(
            b'part1.part2:5|type@0.5',
            ('test-host', 4321))
        mock_socket.close.assert_called_once_with()
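The `test_send` assertions above pin the exact UDP payload format: `name:value|type`, with the sample rate appended directly after the type as `@rate` (note this layout follows the asserted byte strings rather than the more common statsd `|@rate` variant). A small standalone encoder reproducing those payloads:

```python
def statsd_packet(name, value, kind, sample_rate=None):
    """Build the UDP payload the statsd tests assert on, e.g.
    b'part1.part2:5|type@0.5'.  The @rate suffix placement mirrors the
    test expectations exactly."""
    payload = '%s:%s|%s' % (name, value, kind)
    if sample_rate is not None:
        payload += '@%s' % sample_rate
    return payload.encode('ascii')
```

The three expected byte strings in the test fall straight out of this formatting rule.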
@ -1,104 +0,0 @@
# Copyright 2016 Rackspace Hosting
# All Rights Reserved
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from oslo_config import cfg

from ironic_lib import exception
from ironic_lib import metrics as metricslib
from ironic_lib import metrics_statsd
from ironic_lib import metrics_utils
from ironic_lib.tests import base

CONF = cfg.CONF


class TestGetLogger(base.IronicLibTestCase):
    def setUp(self):
        super(TestGetLogger, self).setUp()

    def test_default_backend(self):
        metrics = metrics_utils.get_metrics_logger('foo')
        self.assertIsInstance(metrics, metricslib.NoopMetricLogger)

    def test_statsd_backend(self):
        CONF.set_override('backend', 'statsd', group='metrics')

        metrics = metrics_utils.get_metrics_logger('foo')
        self.assertIsInstance(metrics, metrics_statsd.StatsdMetricLogger)
        CONF.clear_override('backend', group='metrics')

    def test_nonexisting_backend(self):
        self.assertRaises(exception.InvalidMetricConfig,
                          metrics_utils.get_metrics_logger, 'foo', 'test')

    def test_numeric_prefix(self):
        self.assertRaises(exception.InvalidMetricConfig,
                          metrics_utils.get_metrics_logger, 1)

    def test_numeric_list_prefix(self):
        self.assertRaises(exception.InvalidMetricConfig,
                          metrics_utils.get_metrics_logger, (1, 2))

    def test_default_prefix(self):
        metrics = metrics_utils.get_metrics_logger()
        self.assertIsInstance(metrics, metricslib.NoopMetricLogger)
        self.assertEqual(metrics.get_metric_name("bar"), "bar")

    def test_prepend_host_backend(self):
        CONF.set_override('prepend_host', True, group='metrics')
        CONF.set_override('prepend_host_reverse', False, group='metrics')

        metrics = metrics_utils.get_metrics_logger(prefix='foo',
                                                   host="host.example.com")
        self.assertIsInstance(metrics, metricslib.NoopMetricLogger)
        self.assertEqual(metrics.get_metric_name("bar"),
                         "host.example.com.foo.bar")

        CONF.clear_override('prepend_host', group='metrics')
        CONF.clear_override('prepend_host_reverse', group='metrics')

    def test_prepend_global_prefix_host_backend(self):
        CONF.set_override('prepend_host', True, group='metrics')
        CONF.set_override('prepend_host_reverse', False, group='metrics')
        CONF.set_override('global_prefix', 'global_pre', group='metrics')

        metrics = metrics_utils.get_metrics_logger(prefix='foo',
                                                   host="host.example.com")
        self.assertIsInstance(metrics, metricslib.NoopMetricLogger)
        self.assertEqual(metrics.get_metric_name("bar"),
                         "global_pre.host.example.com.foo.bar")

        CONF.clear_override('prepend_host', group='metrics')
        CONF.clear_override('prepend_host_reverse', group='metrics')
        CONF.clear_override('global_prefix', group='metrics')

    def test_prepend_other_delim(self):
        metrics = metrics_utils.get_metrics_logger('foo', delimiter='*')
        self.assertIsInstance(metrics, metricslib.NoopMetricLogger)
        self.assertEqual(metrics.get_metric_name("bar"),
                         "foo*bar")

    def test_prepend_host_reverse_backend(self):
        CONF.set_override('prepend_host', True, group='metrics')
        CONF.set_override('prepend_host_reverse', True, group='metrics')

        metrics = metrics_utils.get_metrics_logger('foo',
                                                   host="host.example.com")
        self.assertIsInstance(metrics, metricslib.NoopMetricLogger)
        self.assertEqual(metrics.get_metric_name("bar"),
                         "com.example.host.foo.bar")

        CONF.clear_override('prepend_host', group='metrics')
        CONF.clear_override('prepend_host_reverse', group='metrics')
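The naming tests above pin down how a full metric name is composed: optional global prefix, then the host labels (reversed when `prepend_host_reverse` is set, giving `com.example.host`), then the logger prefix, then the metric, all joined with the configured delimiter. A standalone sketch of just that composition rule (the function name and keyword arguments are illustrative, not the library's API):

```python
def metric_name(metric, prefix='', delimiter='.', host=None,
                reverse_host=False, global_prefix=None):
    """Compose a metric name per the prefixing rules the tests encode."""
    parts = []
    if global_prefix:
        parts.append(global_prefix)
    if host:
        labels = host.split('.')
        if reverse_host:
            # 'host.example.com' -> 'com.example.host'
            labels = labels[::-1]
        parts.extend(labels)
    if prefix:
        parts.append(prefix)
    parts.append(metric)
    return delimiter.join(parts)
```

Every asserted name in the test case follows from these rules.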
@ -1,599 +0,0 @@
# Copyright 2011 Justin Santa Barbara
# Copyright 2012 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import copy
import errno
import os
import os.path
from unittest import mock

from oslo_concurrency import processutils
from oslo_config import cfg

from ironic_lib import exception
from ironic_lib.tests import base
from ironic_lib import utils

CONF = cfg.CONF


class BareMetalUtilsTestCase(base.IronicLibTestCase):

    def test_unlink(self):
        with mock.patch.object(os, "unlink", autospec=True) as unlink_mock:
            unlink_mock.return_value = None
            utils.unlink_without_raise("/fake/path")
            unlink_mock.assert_called_once_with("/fake/path")

    def test_unlink_ENOENT(self):
        with mock.patch.object(os, "unlink", autospec=True) as unlink_mock:
            unlink_mock.side_effect = OSError(errno.ENOENT)
            utils.unlink_without_raise("/fake/path")
            unlink_mock.assert_called_once_with("/fake/path")
class ExecuteTestCase(base.IronicLibTestCase):
    # Allow calls to utils.execute() and related functions
    block_execute = False

    @mock.patch.object(processutils, 'execute', autospec=True)
    @mock.patch.object(os.environ, 'copy', return_value={}, autospec=True)
    def test_execute_use_standard_locale_no_env_variables(self, env_mock,
                                                          execute_mock):
        utils.execute('foo', use_standard_locale=True)
        execute_mock.assert_called_once_with('foo',
                                             env_variables={'LC_ALL': 'C'})

    @mock.patch.object(processutils, 'execute', autospec=True)
    def test_execute_use_standard_locale_with_env_variables(self,
                                                            execute_mock):
        utils.execute('foo', use_standard_locale=True,
                      env_variables={'foo': 'bar'})
        execute_mock.assert_called_once_with('foo',
                                             env_variables={'LC_ALL': 'C',
                                                            'foo': 'bar'})

    @mock.patch.object(processutils, 'execute', autospec=True)
    def test_execute_not_use_standard_locale(self, execute_mock):
        utils.execute('foo', use_standard_locale=False,
                      env_variables={'foo': 'bar'})
        execute_mock.assert_called_once_with('foo',
                                             env_variables={'foo': 'bar'})

    @mock.patch.object(utils, 'LOG', autospec=True)
    def _test_execute_with_log_stdout(self, log_mock, log_stdout=None):
        with mock.patch.object(
                processutils, 'execute', autospec=True) as execute_mock:
            execute_mock.return_value = ('stdout', 'stderr')
            if log_stdout is not None:
                utils.execute('foo', log_stdout=log_stdout)
            else:
                utils.execute('foo')
            execute_mock.assert_called_once_with('foo')
            name, args, kwargs = log_mock.debug.mock_calls[0]
            if log_stdout is False:
                self.assertEqual(1, log_mock.debug.call_count)
                self.assertNotIn('stdout', args[0])
            else:
                self.assertEqual(2, log_mock.debug.call_count)
                self.assertIn('stdout', args[0])

    def test_execute_with_log_stdout_default(self):
        self._test_execute_with_log_stdout()

    def test_execute_with_log_stdout_true(self):
        self._test_execute_with_log_stdout(log_stdout=True)

    def test_execute_with_log_stdout_false(self):
        self._test_execute_with_log_stdout(log_stdout=False)

    @mock.patch.object(utils, 'LOG', autospec=True)
    @mock.patch.object(processutils, 'execute', autospec=True)
    def test_execute_command_not_found(self, execute_mock, log_mock):
        execute_mock.side_effect = FileNotFoundError
        self.assertRaises(FileNotFoundError, utils.execute, 'foo')
        execute_mock.assert_called_once_with('foo')
        name, args, kwargs = log_mock.debug.mock_calls[0]
        self.assertEqual(1, log_mock.debug.call_count)
        self.assertIn('not found', args[0])
class MkfsTestCase(base.IronicLibTestCase):

    @mock.patch.object(utils, 'execute', autospec=True)
    def test_mkfs(self, execute_mock):
        utils.mkfs('ext4', '/my/block/dev')
        utils.mkfs('msdos', '/my/msdos/block/dev')
        utils.mkfs('swap', '/my/swap/block/dev')

        expected = [mock.call('mkfs', '-t', 'ext4', '-F', '/my/block/dev',
                              use_standard_locale=True),
                    mock.call('mkfs', '-t', 'msdos', '/my/msdos/block/dev',
                              use_standard_locale=True),
                    mock.call('mkswap', '/my/swap/block/dev',
                              use_standard_locale=True)]
        self.assertEqual(expected, execute_mock.call_args_list)

    @mock.patch.object(utils, 'execute', autospec=True)
    def test_mkfs_with_label(self, execute_mock):
        utils.mkfs('ext4', '/my/block/dev', 'ext4-vol')
        utils.mkfs('msdos', '/my/msdos/block/dev', 'msdos-vol')
        utils.mkfs('swap', '/my/swap/block/dev', 'swap-vol')

        expected = [mock.call('mkfs', '-t', 'ext4', '-F', '-L', 'ext4-vol',
                              '/my/block/dev',
                              use_standard_locale=True),
                    mock.call('mkfs', '-t', 'msdos', '-n', 'msdos-vol',
                              '/my/msdos/block/dev',
                              use_standard_locale=True),
                    mock.call('mkswap', '-L', 'swap-vol',
                              '/my/swap/block/dev',
                              use_standard_locale=True)]
        self.assertEqual(expected, execute_mock.call_args_list)

    @mock.patch.object(utils, 'execute', autospec=True,
                       side_effect=processutils.ProcessExecutionError(
                           stderr=os.strerror(errno.ENOENT)))
    def test_mkfs_with_unsupported_fs(self, execute_mock):
        self.assertRaises(exception.FileSystemNotSupported,
                          utils.mkfs, 'foo', '/my/block/dev')

    @mock.patch.object(utils, 'execute', autospec=True,
                       side_effect=processutils.ProcessExecutionError(
                           stderr='fake'))
    def test_mkfs_with_unexpected_error(self, execute_mock):
        self.assertRaises(processutils.ProcessExecutionError, utils.mkfs,
                          'ext4', '/my/block/dev', 'ext4-vol')
class IsHttpUrlTestCase(base.IronicLibTestCase):

    def test_is_http_url(self):
        self.assertTrue(utils.is_http_url('http://127.0.0.1'))
        self.assertTrue(utils.is_http_url('https://127.0.0.1'))
        self.assertTrue(utils.is_http_url('HTTP://127.1.2.3'))
        self.assertTrue(utils.is_http_url('HTTPS://127.3.2.1'))
        self.assertFalse(utils.is_http_url('Zm9vYmFy'))
        self.assertFalse(utils.is_http_url('11111111'))
class ParseRootDeviceTestCase(base.IronicLibTestCase):

    def test_parse_root_device_hints_without_operators(self):
        root_device = {
            'wwn': '123456', 'model': 'FOO model', 'size': 12345,
            'serial': 'foo-serial', 'vendor': 'foo VENDOR with space',
            'name': '/dev/sda', 'wwn_with_extension': '123456111',
            'wwn_vendor_extension': '111', 'rotational': True,
            'hctl': '1:0:0:0', 'by_path': '/dev/disk/by-path/1:0:0:0'}
        result = utils.parse_root_device_hints(root_device)
        expected = {
            'wwn': 's== 123456', 'model': 's== foo%20model',
            'size': '== 12345', 'serial': 's== foo-serial',
            'vendor': 's== foo%20vendor%20with%20space',
            'name': 's== /dev/sda', 'wwn_with_extension': 's== 123456111',
            'wwn_vendor_extension': 's== 111', 'rotational': True,
            'hctl': 's== 1%3A0%3A0%3A0',
            'by_path': 's== /dev/disk/by-path/1%3A0%3A0%3A0'}
        self.assertEqual(expected, result)

    def test_parse_root_device_hints_with_operators(self):
        root_device = {
            'wwn': 's== 123456', 'model': 's== foo MODEL', 'size': '>= 12345',
            'serial': 's!= foo-serial', 'vendor': 's== foo VENDOR with space',
            'name': '<or> /dev/sda <or> /dev/sdb',
            'wwn_with_extension': 's!= 123456111',
            'wwn_vendor_extension': 's== 111', 'rotational': True,
            'hctl': 's== 1:0:0:0', 'by_path': 's== /dev/disk/by-path/1:0:0:0'}

        # Validate strings being normalized
        expected = copy.deepcopy(root_device)
        expected['model'] = 's== foo%20model'
        expected['vendor'] = 's== foo%20vendor%20with%20space'
        expected['hctl'] = 's== 1%3A0%3A0%3A0'
        expected['by_path'] = 's== /dev/disk/by-path/1%3A0%3A0%3A0'

        result = utils.parse_root_device_hints(root_device)
        # The hints already contain the operators, make sure we keep it
        self.assertEqual(expected, result)

    def test_parse_root_device_hints_string_compare_operator_name(self):
        root_device = {'name': 's== /dev/sdb'}
        # Validate strings being normalized
        expected = copy.deepcopy(root_device)
        result = utils.parse_root_device_hints(root_device)
        # The hints already contain the operators, make sure we keep it
        self.assertEqual(expected, result)

    def test_parse_root_device_hints_no_hints(self):
        result = utils.parse_root_device_hints({})
        self.assertIsNone(result)

    def test_parse_root_device_hints_convert_size(self):
        for size in (12345, '12345'):
            result = utils.parse_root_device_hints({'size': size})
            self.assertEqual({'size': '== 12345'}, result)

    def test_parse_root_device_hints_invalid_size(self):
        for value in ('not-int', -123, 0):
            self.assertRaises(ValueError, utils.parse_root_device_hints,
                              {'size': value})

    def test_parse_root_device_hints_int_or(self):
        expr = '<or> 123 <or> 456 <or> 789'
        result = utils.parse_root_device_hints({'size': expr})
        self.assertEqual({'size': expr}, result)

    def test_parse_root_device_hints_int_or_invalid(self):
        expr = '<or> 123 <or> non-int <or> 789'
        self.assertRaises(ValueError, utils.parse_root_device_hints,
                          {'size': expr})

    def test_parse_root_device_hints_string_or_space(self):
        expr = '<or> foo <or> foo bar <or> bar'
        expected = '<or> foo <or> foo%20bar <or> bar'
        result = utils.parse_root_device_hints({'model': expr})
        self.assertEqual({'model': expected}, result)

    def _parse_root_device_hints_convert_rotational(self, values,
                                                    expected_value):
        for value in values:
            result = utils.parse_root_device_hints({'rotational': value})
            self.assertEqual({'rotational': expected_value}, result)

    def test_parse_root_device_hints_convert_rotational(self):
        self._parse_root_device_hints_convert_rotational(
            (True, 'true', 'on', 'y', 'yes'), True)

        self._parse_root_device_hints_convert_rotational(
            (False, 'false', 'off', 'n', 'no'), False)

    def test_parse_root_device_hints_invalid_rotational(self):
        self.assertRaises(ValueError, utils.parse_root_device_hints,
                          {'rotational': 'not-bool'})

    def test_parse_root_device_hints_invalid_wwn(self):
        self.assertRaises(ValueError, utils.parse_root_device_hints,
                          {'wwn': 123})

    def test_parse_root_device_hints_invalid_wwn_with_extension(self):
        self.assertRaises(ValueError, utils.parse_root_device_hints,
                          {'wwn_with_extension': 123})

    def test_parse_root_device_hints_invalid_wwn_vendor_extension(self):
        self.assertRaises(ValueError, utils.parse_root_device_hints,
                          {'wwn_vendor_extension': 123})

    def test_parse_root_device_hints_invalid_model(self):
        self.assertRaises(ValueError, utils.parse_root_device_hints,
                          {'model': 123})

    def test_parse_root_device_hints_invalid_serial(self):
        self.assertRaises(ValueError, utils.parse_root_device_hints,
                          {'serial': 123})

    def test_parse_root_device_hints_invalid_vendor(self):
        self.assertRaises(ValueError, utils.parse_root_device_hints,
                          {'vendor': 123})

    def test_parse_root_device_hints_invalid_name(self):
        self.assertRaises(ValueError, utils.parse_root_device_hints,
                          {'name': 123})

    def test_parse_root_device_hints_invalid_hctl(self):
        self.assertRaises(ValueError, utils.parse_root_device_hints,
                          {'hctl': 123})

    def test_parse_root_device_hints_invalid_by_path(self):
        self.assertRaises(ValueError, utils.parse_root_device_hints,
                          {'by_path': 123})

    def test_parse_root_device_hints_non_existent_hint(self):
        self.assertRaises(ValueError, utils.parse_root_device_hints,
                          {'non-existent': 'foo'})

    def test_extract_hint_operator_and_values_single_value(self):
        expected = {'op': '>=', 'values': ['123']}
        self.assertEqual(
            expected, utils._extract_hint_operator_and_values(
                '>= 123', 'size'))

    def test_extract_hint_operator_and_values_multiple_values(self):
        expected = {'op': '<or>', 'values': ['123', '456', '789']}
        expr = '<or> 123 <or> 456 <or> 789'
        self.assertEqual(
            expected, utils._extract_hint_operator_and_values(expr, 'size'))

    def test_extract_hint_operator_and_values_multiple_values_space(self):
        expected = {'op': '<or>', 'values': ['foo', 'foo bar', 'bar']}
        expr = '<or> foo <or> foo bar <or> bar'
        self.assertEqual(
            expected, utils._extract_hint_operator_and_values(expr, 'model'))

    def test_extract_hint_operator_and_values_no_operator(self):
        expected = {'op': '', 'values': ['123']}
        self.assertEqual(
            expected, utils._extract_hint_operator_and_values('123', 'size'))

    def test_extract_hint_operator_and_values_empty_value(self):
        self.assertRaises(
            ValueError, utils._extract_hint_operator_and_values, '', 'size')

    def test_extract_hint_operator_and_values_integer(self):
        expected = {'op': '', 'values': ['123']}
        self.assertEqual(
            expected, utils._extract_hint_operator_and_values(123, 'size'))

    def test__append_operator_to_hints(self):
        root_device = {'serial': 'foo', 'size': 12345,
                       'model': 'foo model', 'rotational': True}
        expected = {'serial': 's== foo', 'size': '== 12345',
                    'model': 's== foo model', 'rotational': True}

        result = utils._append_operator_to_hints(root_device)
        self.assertEqual(expected, result)

    def test_normalize_hint_expression_or(self):
        expr = '<or> foo <or> foo bar <or> bar'
        expected = '<or> foo <or> foo%20bar <or> bar'
        result = utils._normalize_hint_expression(expr, 'model')
        self.assertEqual(expected, result)

    def test_normalize_hint_expression_in(self):
        expr = '<in> foo <in> foo bar <in> bar'
        expected = '<in> foo <in> foo%20bar <in> bar'
        result = utils._normalize_hint_expression(expr, 'model')
        self.assertEqual(expected, result)

    def test_normalize_hint_expression_op_space(self):
        expr = 's== test string with space'
        expected = 's== test%20string%20with%20space'
        result = utils._normalize_hint_expression(expr, 'model')
        self.assertEqual(expected, result)

    def test_normalize_hint_expression_op_no_space(self):
        expr = 's!= SpongeBob'
        expected = 's!= spongebob'
        result = utils._normalize_hint_expression(expr, 'model')
        self.assertEqual(expected, result)

    def test_normalize_hint_expression_no_op_space(self):
        expr = 'no operators'
        expected = 'no%20operators'
        result = utils._normalize_hint_expression(expr, 'model')
        self.assertEqual(expected, result)

    def test_normalize_hint_expression_no_op_no_space(self):
        expr = 'NoSpace'
        expected = 'nospace'
        result = utils._normalize_hint_expression(expr, 'model')
        self.assertEqual(expected, result)

    def test_normalize_hint_expression_empty_value(self):
        self.assertRaises(
            ValueError, utils._normalize_hint_expression, '', 'size')
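The normalization these tests expect for string hints (lowercase, stripped, spaces and colons percent-encoded, and the `s==` string-equality operator prepended when none is given) can be sketched as follows. `normalize_string_hint` is a hypothetical stand-in, not the parse_root_device_hints implementation:

```python
from urllib import parse as urlparse


def normalize_string_hint(value):
    # Lowercase and strip the value, percent-encode it (space -> %20,
    # colon -> %3A; '/' stays literal by default), then prepend the
    # default string-equality operator used by the hint matcher.
    encoded = urlparse.quote(str(value).strip().lower())
    return 's== %s' % encoded
```

This reproduces the expected values in the tests above, e.g. `'FOO model'` becomes `'s== foo%20model'` and `'1:0:0:0'` becomes `'s== 1%3A0%3A0%3A0'`.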
class MatchRootDeviceTestCase(base.IronicLibTestCase):

    def setUp(self):
        super(MatchRootDeviceTestCase, self).setUp()
        self.devices = [
            {'name': '/dev/sda', 'size': 64424509440, 'model': 'ok model',
             'serial': 'fakeserial'},
            {'name': '/dev/sdb', 'size': 128849018880, 'model': 'big model',
             'serial': 'veryfakeserial', 'rotational': 'yes'},
            {'name': '/dev/sdc', 'size': 10737418240, 'model': 'small model',
             'serial': 'veryveryfakeserial', 'rotational': False},
        ]

    def test_match_root_device_hints_one_hint(self):
        root_device_hints = {'size': '>= 70'}
        dev = utils.match_root_device_hints(self.devices, root_device_hints)
        self.assertEqual('/dev/sdb', dev['name'])

    def test_match_root_device_hints_rotational(self):
        root_device_hints = {'rotational': False}
        dev = utils.match_root_device_hints(self.devices, root_device_hints)
        self.assertEqual('/dev/sdc', dev['name'])

    def test_match_root_device_hints_rotational_convert_devices_bool(self):
        root_device_hints = {'size': '>=100', 'rotational': True}
        dev = utils.match_root_device_hints(self.devices, root_device_hints)
        self.assertEqual('/dev/sdb', dev['name'])

    def test_match_root_device_hints_multiple_hints(self):
        root_device_hints = {'size': '>= 50', 'model': 's==big model',
                             'serial': 's==veryfakeserial'}
        dev = utils.match_root_device_hints(self.devices, root_device_hints)
        self.assertEqual('/dev/sdb', dev['name'])

    def test_match_root_device_hints_multiple_hints2(self):
        root_device_hints = {
            'size': '<= 20',
            'model': '<or> model 5 <or> foomodel <or> small model <or>',
            'serial': 's== veryveryfakeserial'}
        dev = utils.match_root_device_hints(self.devices, root_device_hints)
        self.assertEqual('/dev/sdc', dev['name'])

    def test_match_root_device_hints_multiple_hints3(self):
        root_device_hints = {'rotational': False, 'model': '<in> small'}
        dev = utils.match_root_device_hints(self.devices, root_device_hints)
        self.assertEqual('/dev/sdc', dev['name'])

    def test_match_root_device_hints_no_operators(self):
        root_device_hints = {'size': '120', 'model': 'big model',
                             'serial': 'veryfakeserial'}
        dev = utils.match_root_device_hints(self.devices, root_device_hints)
        self.assertEqual('/dev/sdb', dev['name'])

    def test_match_root_device_hints_no_device_found(self):
        root_device_hints = {'size': '>=50', 'model': 's==foo'}
        dev = utils.match_root_device_hints(self.devices, root_device_hints)
        self.assertIsNone(dev)

    @mock.patch.object(utils.LOG, 'warning', autospec=True)
    def test_match_root_device_hints_empty_device_attribute(self, mock_warn):
        empty_dev = [{'name': '/dev/sda', 'model': ' '}]
        dev = utils.match_root_device_hints(empty_dev, {'model': 'foo'})
        self.assertIsNone(dev)
        self.assertTrue(mock_warn.called)

    def test_find_devices_all(self):
        root_device_hints = {'size': '>= 10'}
        devs = list(utils.find_devices_by_hints(self.devices,
                                                root_device_hints))
        self.assertEqual(self.devices, devs)

    def test_find_devices_none(self):
        root_device_hints = {'size': '>= 100500'}
        devs = list(utils.find_devices_by_hints(self.devices,
                                                root_device_hints))
        self.assertEqual([], devs)

    def test_find_devices_name(self):
        root_device_hints = {'name': 's== /dev/sda'}
        devs = list(utils.find_devices_by_hints(self.devices,
                                                root_device_hints))
        self.assertEqual([self.devices[0]], devs)
@mock.patch.object(utils, 'execute', autospec=True)
class GetRouteSourceTestCase(base.IronicLibTestCase):

    def test_get_route_source_ipv4(self, mock_execute):
        mock_execute.return_value = ('XXX src 1.2.3.4 XXX\n cache', None)

        source = utils.get_route_source('XXX')
        self.assertEqual('1.2.3.4', source)

    def test_get_route_source_ipv6(self, mock_execute):
        mock_execute.return_value = ('XXX src 1:2::3:4 metric XXX\n cache',
                                     None)

        source = utils.get_route_source('XXX')
        self.assertEqual('1:2::3:4', source)

    def test_get_route_source_ipv6_linklocal(self, mock_execute):
        mock_execute.return_value = (
            'XXX src fe80::1234:1234:1234:1234 metric XXX\n cache', None)

        source = utils.get_route_source('XXX')
        self.assertIsNone(source)

    def test_get_route_source_ipv6_linklocal_allowed(self, mock_execute):
        mock_execute.return_value = (
            'XXX src fe80::1234:1234:1234:1234 metric XXX\n cache', None)

        source = utils.get_route_source('XXX', ignore_link_local=False)
        self.assertEqual('fe80::1234:1234:1234:1234', source)

    def test_get_route_source_indexerror(self, mock_execute):
        mock_execute.return_value = ('XXX src \n cache', None)

        source = utils.get_route_source('XXX')
        self.assertIsNone(source)
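The behaviour these tests pin down, taking the token after `src` from the first line of `ip route get` output and skipping link-local addresses by default, can be sketched with a hypothetical helper (not the ironic-lib implementation, which also runs the command):

```python
import ipaddress


def parse_route_source(output, ignore_link_local=True):
    # Find the token following 'src' on the first line of `ip route get`
    # output. Missing token (ValueError/IndexError) or a link-local
    # address (when ignore_link_local is set) yields None.
    tokens = output.splitlines()[0].split()
    try:
        source = tokens[tokens.index('src') + 1]
    except (ValueError, IndexError):
        return None
    if ignore_link_local and ipaddress.ip_address(source).is_link_local:
        return None
    return source
```

Note that this sketch also filters IPv4 link-local (169.254/16) sources; the tests above only exercise the IPv6 case.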
@mock.patch('shutil.rmtree', autospec=True)
@mock.patch.object(utils, 'execute', autospec=True)
@mock.patch('tempfile.mkdtemp', autospec=True)
class MountedTestCase(base.IronicLibTestCase):

    def test_temporary(self, mock_temp, mock_execute, mock_rmtree):
        with utils.mounted('/dev/fake') as path:
            self.assertIs(path, mock_temp.return_value)
        mock_execute.assert_has_calls([
            mock.call("mount", '/dev/fake', mock_temp.return_value,
                      attempts=1, delay_on_retry=True),
            mock.call("umount", mock_temp.return_value,
                      attempts=3, delay_on_retry=True),
        ])
        mock_rmtree.assert_called_once_with(mock_temp.return_value)

    def test_with_dest(self, mock_temp, mock_execute, mock_rmtree):
        with utils.mounted('/dev/fake', '/mnt/fake') as path:
            self.assertEqual('/mnt/fake', path)
        mock_execute.assert_has_calls([
            mock.call("mount", '/dev/fake', '/mnt/fake',
                      attempts=1, delay_on_retry=True),
            mock.call("umount", '/mnt/fake',
                      attempts=3, delay_on_retry=True),
        ])
        self.assertFalse(mock_temp.called)
        self.assertFalse(mock_rmtree.called)

    def test_with_opts(self, mock_temp, mock_execute, mock_rmtree):
        with utils.mounted('/dev/fake', '/mnt/fake',
                           opts=['ro', 'foo=bar']) as path:
            self.assertEqual('/mnt/fake', path)
        mock_execute.assert_has_calls([
            mock.call("mount", '/dev/fake', '/mnt/fake', '-o', 'ro,foo=bar',
                      attempts=1, delay_on_retry=True),
            mock.call("umount", '/mnt/fake',
                      attempts=3, delay_on_retry=True),
        ])

    def test_with_type(self, mock_temp, mock_execute, mock_rmtree):
        with utils.mounted('/dev/fake', '/mnt/fake',
                           fs_type='iso9660') as path:
            self.assertEqual('/mnt/fake', path)
        mock_execute.assert_has_calls([
            mock.call("mount", '/dev/fake', '/mnt/fake', '-t', 'iso9660',
                      attempts=1, delay_on_retry=True),
            mock.call("umount", '/mnt/fake',
                      attempts=3, delay_on_retry=True),
        ])

    def test_failed_to_mount(self, mock_temp, mock_execute, mock_rmtree):
        mock_execute.side_effect = OSError
        self.assertRaises(OSError, utils.mounted('/dev/fake').__enter__)
        mock_execute.assert_called_once_with("mount", '/dev/fake',
                                             mock_temp.return_value,
                                             attempts=1,
                                             delay_on_retry=True)
        mock_rmtree.assert_called_once_with(mock_temp.return_value)

    def test_failed_to_unmount(self, mock_temp, mock_execute, mock_rmtree):
        mock_execute.side_effect = [('', ''),
                                    processutils.ProcessExecutionError]
        with utils.mounted('/dev/fake', '/mnt/fake') as path:
            self.assertEqual('/mnt/fake', path)
        mock_execute.assert_has_calls([
            mock.call("mount", '/dev/fake', '/mnt/fake',
                      attempts=1, delay_on_retry=True),
            mock.call("umount", '/mnt/fake',
                      attempts=3, delay_on_retry=True),
        ])
        self.assertFalse(mock_rmtree.called)
class ParseDeviceTagsTestCase(base.IronicLibTestCase):

    def test_empty(self):
        result = utils.parse_device_tags("\n\n")
        self.assertEqual([], list(result))

    def test_parse(self):
        tags = """
PTUUID="00016a50" PTTYPE="dos" LABEL=""
TYPE="vfat" PART_ENTRY_SCHEME="gpt" PART_ENTRY_NAME="EFI System Partition"
"""
        result = list(utils.parse_device_tags(tags))
        self.assertEqual([
            {'PTUUID': '00016a50', 'PTTYPE': 'dos', 'LABEL': ''},
            {'TYPE': 'vfat', 'PART_ENTRY_SCHEME': 'gpt',
             'PART_ENTRY_NAME': 'EFI System Partition'}
        ], result)
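The parsing behaviour pinned down by the last test, turning `blkid -o export`-style `KEY="value"` lines into one dict per line, can be sketched as below. `parse_device_tags_sketch` is an illustrative stand-in for the utils function under test:

```python
import shlex


def parse_device_tags_sketch(output):
    # Each non-empty line holds KEY="value" pairs; shlex handles the
    # quoting, including values with spaces such as
    # PART_ENTRY_NAME="EFI System Partition".
    for line in output.strip().splitlines():
        if line.strip():
            yield dict(pair.split('=', 1) for pair in shlex.split(line))
```

On empty input (`"\n\n"`) this yields nothing, matching `test_empty` above.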
@ -1,582 +0,0 @@
# Copyright 2010 United States Government as represented by the
# Administrator of the National Aeronautics and Space Administration.
# Copyright 2011 Justin Santa Barbara
# Copyright (c) 2012 NTT DOCOMO, INC.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""Utilities and helper functions."""

import contextlib
import copy
import errno
import ipaddress
import logging
import os
import re
import shlex
import shutil
import tempfile
from urllib import parse as urlparse
import warnings

from oslo_concurrency import processutils
from oslo_utils import excutils
from oslo_utils import specs_matcher
from oslo_utils import strutils
from oslo_utils import units

from ironic_lib.common.i18n import _
from ironic_lib import exception


LOG = logging.getLogger(__name__)

# A dictionary in the form {hint name: hint type}
VALID_ROOT_DEVICE_HINTS = {
    'size': int, 'model': str, 'wwn': str, 'serial': str, 'vendor': str,
    'wwn_with_extension': str, 'wwn_vendor_extension': str, 'name': str,
    'rotational': bool, 'hctl': str, 'by_path': str,
}


ROOT_DEVICE_HINTS_GRAMMAR = specs_matcher.make_grammar()
def execute(*cmd, use_standard_locale=False, log_stdout=True, **kwargs):
    """Convenience wrapper around oslo's execute() method.

    Executes and logs results from a system command. See docs for
    oslo_concurrency.processutils.execute for usage.

    :param cmd: positional arguments to pass to processutils.execute()
    :param use_standard_locale: Defaults to False. If set to True,
                                execute command with standard locale
                                added to environment variables.
    :param log_stdout: Defaults to True. If set to True, logs the output.
    :param kwargs: keyword arguments to pass to processutils.execute()
    :returns: (stdout, stderr) from process execution
    :raises: UnknownArgumentError on receiving unknown arguments
    :raises: ProcessExecutionError
    :raises: OSError
    """
    if use_standard_locale:
        env = kwargs.pop('env_variables', os.environ.copy())
        env['LC_ALL'] = 'C'
        kwargs['env_variables'] = env

    if kwargs.pop('run_as_root', False):
        warnings.warn("run_as_root is deprecated and has no effect",
                      DeprecationWarning)

    def _log(stdout, stderr):
        if log_stdout:
            try:
                LOG.debug('Command stdout is: "%s"', stdout)
            except UnicodeEncodeError:
                LOG.debug('stdout contains invalid UTF-8 characters')
                stdout = (stdout.encode('utf8', 'surrogateescape')
                          .decode('utf8', 'ignore'))
                LOG.debug('Command stdout is: "%s"', stdout)
        try:
            LOG.debug('Command stderr is: "%s"', stderr)
        except UnicodeEncodeError:
            LOG.debug('stderr contains invalid UTF-8 characters')
            stderr = (stderr.encode('utf8', 'surrogateescape')
                      .decode('utf8', 'ignore'))
            LOG.debug('Command stderr is: "%s"', stderr)

    try:
        result = processutils.execute(*cmd, **kwargs)
    except FileNotFoundError:
        with excutils.save_and_reraise_exception():
            LOG.debug('Command not found: "%s"', ' '.join(map(str, cmd)))
    except processutils.ProcessExecutionError as exc:
        with excutils.save_and_reraise_exception():
            _log(exc.stdout, exc.stderr)
    else:
        _log(result[0], result[1])
        return result
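The `use_standard_locale` branch above merges `LC_ALL=C` into whatever environment the caller supplies. That handling can be isolated into a small standalone sketch (a hypothetical helper, extracted here only for illustration):

```python
import os


def standard_locale_env(env_variables=None):
    # Start from the caller-supplied environment (or a copy of os.environ
    # when none was given) and force LC_ALL=C, so tool output stays in the
    # default locale and is safe to parse on any system.
    if env_variables is not None:
        env = dict(env_variables)
    else:
        env = os.environ.copy()
    env['LC_ALL'] = 'C'
    return env
```

This matches the expectations in ExecuteTestCase above: caller-provided variables are preserved and `LC_ALL` is added alongside them.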
def try_execute(*cmd, **kwargs):
    """The same as execute but returns None on error.

    Executes and logs results from a system command. See docs for
    oslo_concurrency.processutils.execute for usage.

    Instead of raising an exception on failure, this method simply
    returns None in case of failure.

    :param cmd: positional arguments to pass to processutils.execute()
    :param kwargs: keyword arguments to pass to processutils.execute()
    :raises: UnknownArgumentError on receiving unknown arguments
    :returns: tuple of (stdout, stderr) or None in some error cases
    """
    try:
        return execute(*cmd, **kwargs)
    except (processutils.ProcessExecutionError, OSError) as e:
        LOG.debug('Command failed: %s', e)
def mkfs(fs, path, label=None):
    """Format a file or block device

    :param fs: Filesystem type (examples include 'swap', 'ext3', 'ext4',
               'btrfs', etc.)
    :param path: Path to file or block device to format
    :param label: Volume label to use
    """
    if fs == 'swap':
        args = ['mkswap']
    else:
        args = ['mkfs', '-t', fs]
    # add -F to force no interactive execute on non-block device.
    if fs in ('ext3', 'ext4'):
        args.extend(['-F'])
    if label:
        if fs in ('msdos', 'vfat'):
            label_opt = '-n'
        else:
            label_opt = '-L'
        args.extend([label_opt, label])
    args.append(path)
    try:
        execute(*args, use_standard_locale=True)
    except processutils.ProcessExecutionError as e:
        with excutils.save_and_reraise_exception() as ctx:
            if os.strerror(errno.ENOENT) in e.stderr:
                ctx.reraise = False
                LOG.exception('Failed to make file system. '
                              'File system %s is not supported.', fs)
                raise exception.FileSystemNotSupported(fs=fs)
            else:
                LOG.exception('Failed to create a file system '
                              'in %(path)s. Error: %(error)s',
                              {'path': path, 'error': e})


def unlink_without_raise(path):
    try:
        os.unlink(path)
    except OSError as e:
        if e.errno == errno.ENOENT:
            return
        else:
            LOG.warning("Failed to unlink %(path)s, error: %(e)s",
                        {'path': path, 'e': e})
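The command line `mkfs()` assembles can be reproduced without executing anything, which is useful for seeing what each filesystem type maps to. This is an illustrative sketch only; the function above passes the arguments straight to `execute()`:

```python
def mkfs_command(fs, path, label=None):
    # Rebuild the argument list mkfs() assembles: mkswap for swap,
    # otherwise mkfs -t <fs>; -F forces ext3/ext4 to skip the interactive
    # prompt on non-block devices; FAT variants take -n for the label,
    # everything else (including mkswap) takes -L.
    if fs == 'swap':
        args = ['mkswap']
    else:
        args = ['mkfs', '-t', fs]
    if fs in ('ext3', 'ext4'):
        args.append('-F')
    if label:
        args.extend(['-n' if fs in ('msdos', 'vfat') else '-L', label])
    args.append(path)
    return args
```

The resulting argument lists match the calls asserted in MkfsTestCase earlier in this commit.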
def dd(src, dst, *args):
|
||||
"""Execute dd from src to dst.
|
||||
|
||||
:param src: the input file for dd command.
|
||||
:param dst: the output file for dd command.
|
||||
:param args: a tuple containing the arguments to be
|
||||
passed to dd command.
|
||||
:raises: processutils.ProcessExecutionError if it failed
|
||||
to run the process.
|
||||
"""
|
||||
LOG.debug("Starting dd process.")
|
||||
execute('dd', 'if=%s' % src, 'of=%s' % dst, *args,
|
||||
use_standard_locale=True)
|
||||
|
||||
|


def is_http_url(url):
    url = url.lower()
    return url.startswith('http://') or url.startswith('https://')


def list_opts():
    """Entry point for oslo-config-generator."""
    return [('ironic_lib', [])]  # placeholder


def _extract_hint_operator_and_values(hint_expression, hint_name):
    """Extract the operator and value(s) of a root device hint expression.

    A root device hint expression could contain one or more values
    depending on the operator. This method extracts the operator and
    value(s) and returns a dictionary containing both.

    :param hint_expression: The hint expression string containing value(s)
        and operator (optionally).
    :param hint_name: The name of the hint. Used for logging.
    :raises: ValueError if the hint_expression is empty.
    :returns: A dictionary containing:

        :op: The operator. An empty string in case of None.
        :values: A list of values stripped and converted to lowercase.
    """
    expression = str(hint_expression).strip().lower()
    if not expression:
        raise ValueError(
            _('Root device hint "%s" expression is empty') % hint_name)

    # parseString() returns a list of tokens in which the operator (if
    # present) is always the first element.
    ast = ROOT_DEVICE_HINTS_GRAMMAR.parseString(expression)
    if len(ast) <= 1:
        # hint_expression had no operator
        return {'op': '', 'values': [expression]}

    op = ast[0]
    return {'values': [v.strip() for v in re.split(op, expression) if v],
            'op': op}
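A simplified sketch of the extraction above: instead of the pyparsing `ROOT_DEVICE_HINTS_GRAMMAR`, this hard-codes the `<or>` operator (`extract_op_and_values` is a hypothetical name for illustration only):

```python
# Simplified sketch of _extract_hint_operator_and_values, assuming "<or>"
# is the only operator in play (the real code uses a pyparsing grammar).
def extract_op_and_values(hint_expression, hint_name='hint'):
    expression = str(hint_expression).strip().lower()
    if not expression:
        raise ValueError('Root device hint "%s" expression is empty'
                         % hint_name)
    if '<or>' not in expression:
        # no operator: a single bare value
        return {'op': '', 'values': [expression]}
    return {'op': '<or>',
            'values': [v.strip() for v in expression.split('<or>')
                       if v.strip()]}
```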


def _normalize_hint_expression(hint_expression, hint_name):
    """Normalize a string type hint expression.

    A string-type hint expression contains one or more operators and
    one or more values: [<op>] <value> [<op> <value>]*. This normalizes
    the values by url-encoding white spaces and special characters. The
    operators are not normalized. For example: the hint value of "<or>
    foo bar <or> bar" will become "<or> foo%20bar <or> bar".

    :param hint_expression: The hint expression string containing value(s)
        and operator (optionally).
    :param hint_name: The name of the hint. Used for logging.
    :raises: ValueError if the hint_expression is empty.
    :returns: A normalized string.
    """
    hdict = _extract_hint_operator_and_values(hint_expression, hint_name)
    result = hdict['op'].join([' %s ' % urlparse.quote(t)
                               for t in hdict['values']])
    return (hdict['op'] + result).strip()
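The docstring example can be reproduced with a self-contained sketch covering the `<or>` case only (the real code detects the operator through the grammar; `normalize_or_hint` is a hypothetical name):

```python
from urllib.parse import quote

# Sketch of the normalization for "<or>"-joined string hints only;
# whitespace inside each value is url-encoded, operators are untouched.
def normalize_or_hint(expression):
    op = '<or>'
    values = [v.strip() for v in expression.split(op) if v.strip()]
    if op not in expression:
        return quote(values[0])
    result = op.join(' %s ' % quote(v) for v in values)
    return (op + result).strip()

normalize_or_hint('<or> foo bar <or> bar')  # '<or> foo%20bar <or> bar'
```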


def _append_operator_to_hints(root_device):
    """Add an equal (s== or ==) operator to the hints.

    For backwards compatibility, for root device hints where no operator
    means equal, this method adds the equal operator to the hint. This is
    needed when using oslo.utils.specs_matcher methods.

    :param root_device: The root device hints dictionary.
    """
    for name, expression in root_device.items():
        # NOTE(lucasagomes): The specs_matcher from oslo.utils does not
        # support boolean, so we don't need to append any operator
        # for it.
        if VALID_ROOT_DEVICE_HINTS[name] is bool:
            continue

        expression = str(expression)
        ast = ROOT_DEVICE_HINTS_GRAMMAR.parseString(expression)
        if len(ast) > 1:
            continue

        op = 's== %s' if VALID_ROOT_DEVICE_HINTS[name] is str else '== %s'
        root_device[name] = op % expression

    return root_device


def parse_root_device_hints(root_device):
    """Parse the root_device property of a node.

    Parses and validates the root_device property of a node. These are
    hints for how a node's root device is created. The 'size' hint
    should be a positive integer. The 'rotational' hint should be a
    Boolean value.

    :param root_device: the root_device dictionary from the node's property.
    :returns: a dictionary with the root device hints parsed or
        None if there are no hints.
    :raises: ValueError, if some information is invalid.
    """
    if not root_device:
        return

    root_device = copy.deepcopy(root_device)

    invalid_hints = set(root_device) - set(VALID_ROOT_DEVICE_HINTS)
    if invalid_hints:
        raise ValueError(
            _('The hints "%(invalid_hints)s" are invalid. '
              'Valid hints are: "%(valid_hints)s"') %
            {'invalid_hints': ', '.join(invalid_hints),
             'valid_hints': ', '.join(VALID_ROOT_DEVICE_HINTS)})

    for name, expression in root_device.items():
        hint_type = VALID_ROOT_DEVICE_HINTS[name]
        if hint_type is str:
            if not isinstance(expression, str):
                raise ValueError(
                    _('Root device hint "%(name)s" is not a string value. '
                      'Hint expression: %(expression)s') %
                    {'name': name, 'expression': expression})
            root_device[name] = _normalize_hint_expression(expression, name)

        elif hint_type is int:
            for v in _extract_hint_operator_and_values(expression,
                                                       name)['values']:
                try:
                    integer = int(v)
                except ValueError:
                    raise ValueError(
                        _('Root device hint "%(name)s" is not an integer '
                          'value. Current value: %(expression)s') %
                        {'name': name, 'expression': expression})

                if integer <= 0:
                    raise ValueError(
                        _('Root device hint "%(name)s" should be a positive '
                          'integer. Current value: %(expression)s') %
                        {'name': name, 'expression': expression})

        elif hint_type is bool:
            try:
                root_device[name] = strutils.bool_from_string(
                    expression, strict=True)
            except ValueError:
                raise ValueError(
                    _('Root device hint "%(name)s" is not a Boolean value. '
                      'Current value: %(expression)s') %
                    {'name': name, 'expression': expression})

    return _append_operator_to_hints(root_device)
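The implicit-equality step performed at the end of parsing can be sketched in isolation. The hint-type table here is an assumed subset of `VALID_ROOT_DEVICE_HINTS`, and the sketch assumes no operator is already present in the expressions:

```python
# Assumed subset of VALID_ROOT_DEVICE_HINTS, for illustration only.
HINT_TYPES = {'name': str, 'size': int, 'rotational': bool}

def append_equal_op(hints):
    """Sketch of _append_operator_to_hints for operator-less hints."""
    out = {}
    for name, expr in hints.items():
        typ = HINT_TYPES[name]
        if typ is bool:
            # specs_matcher does not handle booleans; leave untouched
            out[name] = expr
        else:
            # strings get "s==", numbers get "=="
            out[name] = ('s== %s' if typ is str else '== %s') % expr
    return out

append_equal_op({'name': '/dev/sda', 'size': 40, 'rotational': True})
# {'name': 's== /dev/sda', 'size': '== 40', 'rotational': True}
```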


def find_devices_by_hints(devices, root_device_hints):
    """Find all devices that match the root device hints.

    Try to find devices that match the root device hints. In order
    for a device to be matched it needs to satisfy all the given hints.

    :param devices: A list of dictionaries representing the devices
        containing one or more of the following keys:

        :name: (String) The device name, e.g. /dev/sda
        :size: (Integer) Size of the device in *bytes*
        :model: (String) Device model
        :vendor: (String) Device vendor name
        :serial: (String) Device serial number
        :wwn: (String) Unique storage identifier
        :wwn_with_extension: (String): Unique storage identifier with
            the vendor extension appended
        :wwn_vendor_extension: (String): Unique vendor storage identifier
        :rotational: (Boolean) Whether it's a rotational device or
            not. Useful to distinguish HDDs (rotational) and SSDs
            (not rotational).
        :hctl: (String): The SCSI address: Host, channel, target and lun.
            For example: '1:0:0:0'.
        :by_path: (String): The alternative device name,
            e.g. /dev/disk/by-path/pci-0000:00

    :param root_device_hints: A dictionary with the root device hints.
    :raises: ValueError, if some information is invalid.
    :returns: A generator with all matching devices as dictionaries.
    """
    LOG.debug('Trying to find devices from "%(devs)s" that match the '
              'device hints "%(hints)s"',
              {'devs': ', '.join([d.get('name') for d in devices]),
               'hints': root_device_hints})
    parsed_hints = parse_root_device_hints(root_device_hints)
    for dev in devices:
        device_name = dev.get('name')

        for hint in parsed_hints:
            hint_type = VALID_ROOT_DEVICE_HINTS[hint]
            device_value = dev.get(hint)
            hint_value = parsed_hints[hint]

            if hint_type is str:
                try:
                    device_value = _normalize_hint_expression(device_value,
                                                              hint)
                except ValueError:
                    LOG.warning(
                        'The attribute "%(attr)s" of the device "%(dev)s" '
                        'has an empty value. Skipping device.',
                        {'attr': hint, 'dev': device_name})
                    break

            if hint == 'size':
                # Since we don't support units yet we expect the size
                # in GiB for now
                device_value = device_value / units.Gi

            LOG.debug('Trying to match the device hint "%(hint)s" '
                      'with a value of "%(hint_value)s" against the same '
                      'device\'s (%(dev)s) attribute with a value of '
                      '"%(dev_value)s"', {'hint': hint, 'dev': device_name,
                                          'hint_value': hint_value,
                                          'dev_value': device_value})

            # NOTE(lucasagomes): Boolean hints are not supported by
            # specs_matcher.match(), so we need to do the comparison
            # ourselves
            if hint_type is bool:
                try:
                    device_value = strutils.bool_from_string(device_value,
                                                             strict=True)
                except ValueError:
                    LOG.warning('The attribute "%(attr)s" (with value '
                                '"%(value)s") of device "%(dev)s" is not '
                                'a valid Boolean. Skipping device.',
                                {'attr': hint, 'value': device_value,
                                 'dev': device_name})
                    break
                if device_value == hint_value:
                    continue

            elif specs_matcher.match(device_value, hint_value):
                continue

            LOG.debug('The attribute "%(attr)s" (with value "%(value)s") '
                      'of device "%(dev)s" does not match the hint %(hint)s',
                      {'attr': hint, 'value': device_value,
                       'dev': device_name, 'hint': hint_value})
            break
        else:
            yield dev


def match_root_device_hints(devices, root_device_hints):
    """Try to find a device that matches the root device hints.

    Try to find a device that matches the root device hints. In order
    for a device to be matched it needs to satisfy all the given hints.

    :param devices: A list of dictionaries representing the devices
        containing one or more of the following keys:

        :name: (String) The device name, e.g. /dev/sda
        :size: (Integer) Size of the device in *bytes*
        :model: (String) Device model
        :vendor: (String) Device vendor name
        :serial: (String) Device serial number
        :wwn: (String) Unique storage identifier
        :wwn_with_extension: (String): Unique storage identifier with
            the vendor extension appended
        :wwn_vendor_extension: (String): Unique vendor storage identifier
        :rotational: (Boolean) Whether it's a rotational device or
            not. Useful to distinguish HDDs (rotational) and SSDs
            (not rotational).
        :hctl: (String): The SCSI address: Host, channel, target and lun.
            For example: '1:0:0:0'.
        :by_path: (String): The alternative device name,
            e.g. /dev/disk/by-path/pci-0000:00

    :param root_device_hints: A dictionary with the root device hints.
    :raises: ValueError, if some information is invalid.
    :returns: The first device to match all the hints or None.
    """
    try:
        dev = next(find_devices_by_hints(devices, root_device_hints))
    except StopIteration:
        LOG.warning('No device found that matches the root device hints %s',
                    root_device_hints)
    else:
        LOG.info('Root device found! The device "%s" matches the root '
                 'device hints %s', dev, root_device_hints)
        return dev
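A minimal matching sketch, using plain equality instead of `oslo.utils` `specs_matcher` and returning the first device satisfying every hint (`match_devices` and the sample device list are illustrative, not part of the module):

```python
# Minimal sketch: every hint must hold for a device to match, and the
# first match wins, mirroring match_root_device_hints above.
def match_devices(devices, hints):
    for dev in devices:
        if all(dev.get(k) == v for k, v in hints.items()):
            return dev  # first matching device, or None implicitly

devs = [
    {'name': '/dev/sda', 'size': 500107862016, 'rotational': True},
    {'name': '/dev/nvme0n1', 'size': 256060514304, 'rotational': False},
]
match_devices(devs, {'rotational': False})  # picks the /dev/nvme0n1 entry
```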


def get_route_source(dest, ignore_link_local=True):
    """Get the IP address to send packets to destination."""
    try:
        out, _err = execute('ip', 'route', 'get', dest)
    except (EnvironmentError, processutils.ProcessExecutionError) as e:
        LOG.warning('Cannot get route to host %(dest)s: %(err)s',
                    {'dest': dest, 'err': e})
        return

    try:
        source = out.strip().split('\n')[0].split('src')[1].split()[0]
        if (ipaddress.ip_address(source).is_link_local
                and ignore_link_local):
            LOG.debug('Ignoring link-local source to %(dest)s: %(rec)s',
                      {'dest': dest, 'rec': out})
            return
        return source
    except (IndexError, ValueError):
        LOG.debug('No route to host %(dest)s, route record: %(rec)s',
                  {'dest': dest, 'rec': out})
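The source-address extraction above can be exercised on its own against a sample `ip route get` line (the sample text is illustrative; `extract_route_source` is a hypothetical name):

```python
# Same split-on-'src' extraction as in get_route_source above, without
# running the ip command.
def extract_route_source(out):
    try:
        return out.strip().split('\n')[0].split('src')[1].split()[0]
    except (IndexError, ValueError):
        return None

sample = '8.8.8.8 via 192.168.1.1 dev eth0 src 192.168.1.10 uid 1000'
extract_route_source(sample)  # '192.168.1.10'
```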


@contextlib.contextmanager
def mounted(source, dest=None, opts=None, fs_type=None,
            mount_attempts=1, umount_attempts=3):
    """A context manager for a temporary mount.

    :param source: A device to mount.
    :param dest: Mount destination. If not specified, a temporary directory
        will be created and removed afterwards. An existing destination is
        not removed.
    :param opts: Mount options (``-o`` argument).
    :param fs_type: File system type (``-t`` argument).
    :param mount_attempts: A number of attempts to mount the device.
    :param umount_attempts: A number of attempts to unmount the device.
    :returns: A generator yielding the destination.
    """
    params = []
    if opts:
        params.extend(['-o', ','.join(opts)])
    if fs_type:
        params.extend(['-t', fs_type])

    if dest is None:
        dest = tempfile.mkdtemp()
        clean_up = True
    else:
        clean_up = False

    mounted = False
    try:
        execute("mount", source, dest, *params, attempts=mount_attempts,
                delay_on_retry=True)
        mounted = True
        yield dest
    finally:
        if mounted:
            try:
                execute("umount", dest, attempts=umount_attempts,
                        delay_on_retry=True)
            except (EnvironmentError,
                    processutils.ProcessExecutionError) as exc:
                LOG.warning(
                    'Unable to unmount temporary location %(dest)s: %(err)s',
                    {'dest': dest, 'err': exc})
                # NOTE(dtantsur): don't try to remove a still mounted location
                clean_up = False

        if clean_up:
            try:
                shutil.rmtree(dest)
            except EnvironmentError as exc:
                LOG.warning(
                    'Unable to remove temporary location %(dest)s: %(err)s',
                    {'dest': dest, 'err': exc})
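The mount command assembled above can be reconstructed as a pure function for illustration; nothing is executed and `build_mount_cmd` is a hypothetical name:

```python
# Rebuilds the argv that mounted() passes to execute("mount", ...):
# source and dest first, then the optional -o/-t parameters.
def build_mount_cmd(source, dest, opts=None, fs_type=None):
    params = []
    if opts:
        params.extend(['-o', ','.join(opts)])
    if fs_type:
        params.extend(['-t', fs_type])
    return ['mount', source, dest] + params

build_mount_cmd('/dev/sda1', '/mnt', opts=['ro', 'noexec'], fs_type='ext4')
# ['mount', '/dev/sda1', '/mnt', '-o', 'ro,noexec', '-t', 'ext4']
```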


def parse_device_tags(output):
    """Parse tags from the lsblk/blkid output.

    Parses format KEY="VALUE" KEY2="VALUE2".

    :return: a generator yielding dicts with information from each line.
    """
    for line in output.strip().split('\n'):
        if line.strip():
            try:
                yield {key: value for key, value in
                       (v.split('=', 1) for v in shlex.split(line))}
            except ValueError as err:
                raise ValueError(
                    _("Malformed blkid/lsblk output line '%(line)s': %(err)s")
                    % {'line': line, 'err': err})
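A self-contained restatement of the parsing above, fed a sample blkid-style line (the sample values are illustrative):

```python
import shlex

# Same KEY="VALUE" parsing as parse_device_tags: shlex handles the
# quoting, then each token is split on the first '='.
def parse_tags(output):
    for line in output.strip().split('\n'):
        if line.strip():
            yield {key: value for key, value in
                   (v.split('=', 1) for v in shlex.split(line))}

list(parse_tags('UUID="8f1a" TYPE="ext4" LABEL="root disk"'))
# [{'UUID': '8f1a', 'TYPE': 'ext4', 'LABEL': 'root disk'}]
```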
@ -1,77 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import socket

from oslo_config import cfg
from oslo_service import service
from oslo_service import wsgi

from ironic_lib import utils

CONF = cfg.CONF


class WSGIService(service.ServiceBase):

    def __init__(self, name, app, conf):
        """Initialize, but do not start the WSGI server.

        :param name: The name of the WSGI server given to the loader.
        :param app: WSGI application to run.
        :param conf: Object to load configuration from.
        :returns: None
        """
        self.name = name
        self._conf = conf
        if conf.unix_socket:
            utils.unlink_without_raise(conf.unix_socket)
            self.server = wsgi.Server(CONF, name, app,
                                      socket_family=socket.AF_UNIX,
                                      socket_file=conf.unix_socket,
                                      socket_mode=conf.unix_socket_mode,
                                      use_ssl=conf.use_ssl)
        else:
            self.server = wsgi.Server(CONF, name, app,
                                      host=conf.host_ip,
                                      port=conf.port,
                                      use_ssl=conf.use_ssl)

    def start(self):
        """Start serving this service using loaded configuration.

        :returns: None
        """
        self.server.start()

    def stop(self):
        """Stop serving this API.

        :returns: None
        """
        self.server.stop()
        if self._conf.unix_socket:
            utils.unlink_without_raise(self._conf.unix_socket)

    def wait(self):
        """Wait for the service to stop serving this API.

        :returns: None
        """
        self.server.wait()

    def reset(self):
        """Reset server greenpool size to default.

        :returns: None
        """
        self.server.reset()
@ -1,3 +0,0 @@
[build-system]
requires = ["pbr>=6.0.0", "setuptools>=64.0.0"]
build-backend = "pbr.build"
@ -1,11 +0,0 @@
---
fixes:
  - |
    Adds an additional error to look for in the ``qemu-img`` image conversion
    retry logic to automatically retry if 'Cannot allocate memory' is
    encountered, as ``qemu-img`` makes a number of memory allocation requests
    and the most likely failure is upon creating the conversion thread,
    resulting in
    'qemu: qemu_thread_create_: Resource temporarily unavailable',
    but other memory allocation failures can result in
    'Failed to allocate memory: Cannot allocate memory'. Both types of errors
    are now checked and automatically retried upon.
@ -1,9 +0,0 @@
---
features:
  - |
    Adds a new metrics collection backend, ``collector``, to collect
    counter, gauge, and timer information, enabling the application to
    access these statistics during process runtime. Adds a new metrics method
    ``get_metrics_data`` to allow the dictionary structure containing
    the metrics data to be accessed. This feature may be enabled by setting
    the ``[metrics]backend`` option to ``collector``.
@ -1,6 +0,0 @@
---
features:
  - |
    Adds the capability for the ``json_rpc`` client to identify and utilize
    a specific port from the supplied ``topic`` field as opposed to the
    default configured port.
@ -1,18 +0,0 @@
---
features:
  - |
    Implement Basic HTTP authentication middleware.

    This middleware is added to ironic-lib so that it can eventually be
    used by ironic and ironic-inspector as an alternative to noauth in
    standalone environments.

    This middleware is passed a path to a file which supports the
    Apache htpasswd syntax [1]. This file is read for every request, so no
    service restart is required when changes are made.

    The only password digest supported is bcrypt, and the ``bcrypt``
    python library is used for password checks since it supports ``$2y$``
    prefixed bcrypt passwords as generated by the Apache htpasswd utility.

    [1] https://httpd.apache.org/docs/current/misc/password_encryptions.html
@ -1,7 +0,0 @@
---
fixes:
  - |
    Fixes a py3 compatibility issue in metrics_statsd where strings need to
    be explicitly converted to bytes before being sent over a socket.
    See `Story 2007537 <https://storyboard.openstack.org/#!/story/2007537>`_
    for details.
@ -1,6 +0,0 @@
---
upgrade:
  - |
    Python 2.7 support has been dropped. The last release of ironic-lib
    to support Python 2.7 is OpenStack Train. The minimum version of Python
    now supported by ironic-lib is Python 3.6.
@ -1,6 +0,0 @@
---
fixes:
  - |
    Fixes cleaning errors when trying to erase a GPT from a partition
    which is smaller than a GPT (33 sectors).
@ -1,6 +0,0 @@
---
fixes:
  - |
    Fixes an issue when parsing GPT partitions with names or multiple flags.
    See `story 2005322 <https://storyboard.openstack.org/#!/story/2005322>`_
    for details.
@ -1,6 +0,0 @@
---
fixes:
  - |
    Fixes a bug when erasing a partition table: the corresponding I/O needs
    to be synchronous in order to avoid masking failed write requests to
    broken devices.
@ -1,6 +0,0 @@
---
fixes:
  - |
    Fixes an issue where the incorrect partition naming was used for metadisk (md)
    devices. See `Story 2006154 <https://storyboard.openstack.org/#!/story/2006154>`_
    for details.
@ -1,12 +0,0 @@
---
other:
  - |
    The default size of EFI system partitions to be created, when writing a
    partition image, has been increased to 550 Megabytes from 200 Megabytes.
    If this change is undesirable, please utilize the
    ``efi_system_partition_size`` configuration option. This value is now
    also consistent with the internal default when creating ESP volumes
    for Software RAID with ``ironic-python-agent``, and the default carried
    by ``diskimage-builder``. The prime driver for the increased partition
    size is to enable OS-driven firmware updates and to provide appropriate
    space to house Unikernels.
@ -1,5 +0,0 @@
---
features:
  - |
    The new ``[json_rpc] allowed_roles`` parameter has been added. This
    parameter determines the list of roles allowed to use JSON RPC.
@ -1,7 +0,0 @@
---
upgrade:
  - |
    The configuration option ``[disk_utils]iscsi_verify_attempts`` was
    deprecated in Train and is now removed.
    Please use the ``[disk_utils]partition_detection_attempts``
    option instead.
@ -1,5 +0,0 @@
---
upgrade:
  - |
    Support for Python 3.8 has been removed. Now the minimum python version
    supported is 3.9.
@ -1,7 +0,0 @@
---
fixes:
  - |
    Fixes an issue with the ``disk_utils`` method ``make_partitions``,
    which is utilized to facilitate the write-out of partition images to disk.
    Previously when this method was invoked on a ramdisk, the partition may
    not have been found.
@ -1,5 +0,0 @@
---
fixes:
  - |
    Restores compatibility of the blkid command with CentOS 7. For more
    details see `Story 2009328 <https://storyboard.openstack.org/#!/story/2009328>`_.
@ -1,5 +0,0 @@
---
fixes:
  - |
    Adds support for disks with 4096 sector size when cleaning disk metadata.
    Previously, only 512 sector size disks were supported.
@ -1,7 +0,0 @@
---
fixes:
  - |
    Fixes an issue where CRC errors in the GPT partition information would
    cause cleaning to fail. A similar issue was previously encountered with
    ironic-python-agent. See `Story 1737556 <https://storyboard.openstack.org/#!/story/1737556>`_
    for details.
@ -1,12 +0,0 @@
# Requirements lower bounds listed here are our best effort to keep them up to
# date but we do not test them so no guarantee of having them all correct. If
# you find any incorrect lower bounds, let us know or propose a fix.

pbr>=6.0.0 # Apache-2.0
oslo.concurrency>=3.26.0 # Apache-2.0
oslo.config>=5.2.0 # Apache-2.0
oslo.i18n>=3.15.3 # Apache-2.0
oslo.utils>=3.34.0 # Apache-2.0
zeroconf>=0.24.0 # LGPL
bcrypt>=3.1.3 # Apache-2.0
WebOb>=1.7.1 # MIT
57
setup.cfg
@ -1,57 +0,0 @@
[metadata]
name = ironic-lib
summary = Ironic common library
description_file =
    README.rst
author = OpenStack Ironic
author_email = openstack-discuss@lists.openstack.org
home_page = https://docs.openstack.org/ironic-lib/
python_requires = >=3.9
classifier =
    Environment :: OpenStack
    Intended Audience :: Information Technology
    Intended Audience :: System Administrators
    License :: OSI Approved :: Apache Software License
    Operating System :: POSIX :: Linux
    Programming Language :: Python
    Programming Language :: Python :: Implementation :: CPython
    Programming Language :: Python :: 3 :: Only
    Programming Language :: Python :: 3
    Programming Language :: Python :: 3.9
    Programming Language :: Python :: 3.10
    Programming Language :: Python :: 3.11
    Programming Language :: Python :: 3.12

[files]
data_files =
    etc/ironic/rootwrap.d = etc/ironic/rootwrap.d/*
packages =
    ironic_lib

[entry_points]
oslo.config.opts =
    ironic_lib.disk_partitioner = ironic_lib.disk_partitioner:list_opts
    ironic_lib.disk_utils = ironic_lib.disk_utils:list_opts
    ironic_lib.exception = ironic_lib.exception:list_opts
    ironic_lib.json_rpc = ironic_lib.json_rpc:list_opts
    ironic_lib.mdns = ironic_lib.mdns:list_opts
    ironic_lib.metrics = ironic_lib.metrics_utils:list_opts
    ironic_lib.metrics_statsd = ironic_lib.metrics_statsd:list_opts
    ironic_lib.qemu_img = ironic_lib.qemu_img:list_opts
    ironic_lib.utils = ironic_lib.utils:list_opts

[extra]
keystone =
    keystoneauth1>=4.2.0 # Apache-2.0
    os-service-types>=1.2.0 # Apache-2.0
json_rpc =
    keystoneauth1>=4.2.0 # Apache-2.0
    os-service-types>=1.2.0 # Apache-2.0
    oslo.service!=1.28.1,>=1.24.0 # Apache-2.0

[codespell]
quiet-level = 4
# Words to ignore:
# crypted: Valid in some contexts, e.g. "crypted password"
# assertIn: Python's unittest method
ignore-words-list = crypted,assertIn
20
setup.py
@ -1,20 +0,0 @@
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import setuptools

setuptools.setup(
    setup_requires=['pbr>=6.0.0'],
    pbr=True)
@ -1,8 +0,0 @@
coverage>=4.0 # Apache-2.0
stestr>=2.0.0 # Apache-2.0
oslotest>=3.2.0 # Apache-2.0
fixtures>=3.0.0 # Apache-2.0/BSD

# used for JSON RPC unit tests
keystonemiddleware>=4.17.0 # Apache-2.0
oslo.messaging>=5.29.0 # Apache-2.0
84
tox.ini
@ -1,84 +0,0 @@
[tox]
minversion = 4.4.0
envlist = py3,pep8
ignore_basepython_conflict=true

[testenv]
constrain_package_deps = true
usedevelop = True
setenv = VIRTUAL_ENV={envdir}
         PYTHONDONTWRITEBYTECODE = 1
         LANGUAGE=en_US
         TESTS_DIR=./ironic_lib/tests/
deps =
    -c{env:TOX_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master}
    -r{toxinidir}/test-requirements.txt
    -r{toxinidir}/extra-requirements.txt
    -r{toxinidir}/requirements.txt
commands = stestr run {posargs}

[flake8]
show-source = True
# [E129] Visually indented line with same indent as next logical line.
# [W503] Line break occurred before a binary operator. Conflicts with W504.
ignore = E129,W503
exclude=.*,dist,doc,*lib/python*,*egg,build
import-order-style = pep8
application-import-names = ironic_lib
# [H106] Don't put vim configuration in source files.
# [H203] Use assertIs(Not)None to check for None.
# [H204] Use assert(Not)Equal to check for equality.
# [H205] Use assert(Greater|Less)(Equal) for comparison.
# [H210] Require 'autospec', 'spec', or 'spec_set' in mock.patch/mock.patch.object calls
# [H904] Delay string interpolations at logging calls.
enable-extensions=H106,H203,H204,H205,H210,H904

[testenv:pep8]
deps =
    flake8-import-order~=0.18.0 # LGPLv3
    hacking~=6.1.0 # Apache-2.0
    pycodestyle>=2.0.0,<3.0.0 # MIT
    doc8~=1.1.0 # Apache 2.0
    -c{env:TOX_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master}
commands =
    flake8 {posargs}
    doc8 README.rst doc/source --ignore D001

[testenv:cover]
setenv = VIRTUAL_ENV={envdir}
         LANGUAGE=en_US
         PYTHON=coverage run --source ironic_lib --omit='*tests*' --parallel-mode
commands =
    coverage erase
    stestr run {posargs}
    coverage combine
    coverage report --omit='*tests*'
    coverage html -d ./cover --omit='*tests*'

[testenv:venv]
commands = {posargs}

[testenv:docs]
deps =
    -c{env:TOX_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master}
    -r{toxinidir}/doc/requirements.txt
    -r{toxinidir}/extra-requirements.txt
commands =
    sphinx-build -W -b html doc/source doc/build/html


[testenv:pdf-docs]
deps = {[testenv:docs]deps}
allowlist_externals = make
commands = sphinx-build -b latex doc/source doc/build/pdf
           make -C doc/build/pdf


[testenv:codespell]
description =
    Run codespell to check spelling
deps = codespell
# note(adamcarthur): {posargs} lets us run `tox -ecodespell -- -w` to get
# codespell to correct spelling issues in our code it's aware of.
commands =
    codespell {posargs}
@ -1,57 +0,0 @@
- job:
    name: ironic-lib-base
    parent: ironic-base
    irrelevant-files:
      - ^test-requirements.txt$
      - ^.*\.rst$
      - ^api-ref/.*$
      - ^doc/.*$
      - ^ironic_lib/tests/.*$
      - ^releasenotes/.*$
      - ^setup.cfg$
      - ^tools/.*$
      - ^tox.ini$
    required-projects:
      - openstack/ironic-lib
    vars:
      tempest_test_timeout: 1800
      devstack_localrc:
        BUILD_TIMEOUT: 900
        IRONIC_BUILD_DEPLOY_RAMDISK: True
        IRONIC_TEMPEST_BUILD_TIMEOUT: 900
        SWIFT_ENABLE_TEMPURLS: True
        SWIFT_TEMPURL_KEY: secretkey

- job:
    name: ironic-lib-uefi-ipmi-src
    parent: ironic-lib-base
    timeout: 7200
    vars:
      devstack_services:
        s-account: True
        s-container: True
        s-object: True
        s-proxy: True
      devstack_localrc:
        IRONIC_TEMPEST_WHOLE_DISK_IMAGE: True
        IRONIC_VM_EPHEMERAL_DISK: 0

- job:
    name: ironic-lib-bios-ipmi-src
    parent: ironic-lib-base
    timeout: 7200
    vars:
      devstack_services:
        s-account: True
        s-container: True
        s-object: True
        s-proxy: True
      devstack_localrc:
        IRONIC_BOOT_MODE: bios

- job:
    name: ironic-lib-tox-codespell
    parent: openstack-tox
    timeout: 7200
    vars:
      tox_envlist: codespell
@ -1,19 +0,0 @@
- project:
    templates:
      - check-requirements
      - openstack-cover-jobs
      - openstack-python3-jobs
      - publish-openstack-docs-pti
    check:
      jobs:
        - ironic-lib-uefi-ipmi-src
        - ironic-lib-bios-ipmi-src
        - ironic-lib-tox-codespell:
            voting: false
    gate:
      jobs:
        - ironic-lib-uefi-ipmi-src
        - ironic-lib-bios-ipmi-src
    post:
      jobs:
        - ironic-python-agent-build-image-tinyipa