pki: Port Promenade's PKI catalog into Pegleg
This patch set implements the PKICatalog [0] requirements as well as the PeglegManagedDocument [1] generation requirements outlined in the spec [2].

Included in this patch set:

* New CLI entry point called "pegleg site secrets generate-pki"
* PeglegManagedDocument generation logic in engine.cache.managed_document
* Refactored PKICatalog logic in engine.cache.pki_catalog, derived from the
  Promenade PKI implementation [3], responsible for generating certificates,
  CAs, and keypairs
* Refactored PKIGenerator logic in engine.cache.pki_generator, derived from
  the Promenade Generator implementation [4], responsible for reading in
  pegleg/PKICatalog/v1 documents (as well as promenade/PKICatalog/v1
  documents for backwards compatibility), generating the required secrets,
  and storing them in the paths specified under [0]
* Unit tests for all of the above [5]
* Example pki-catalog.yaml document under pegleg/site_yamls
* Validation schema for pki-catalog.yaml (TODO: implement validation logic
  here: [6])
* Updates to the CLI documentation, plus new PKICatalog and
  PeglegManagedDocument documentation
* Documentation updates with PKI information [7]

TODO (in follow-up patch sets):

* Expand the overview documentation to cover the new Pegleg responsibilities
* Allow the original repository (not the copied one) to be the destination
  where the secrets are written
* Finish cert expiry/revocation logic

[0] https://airship-specs.readthedocs.io/en/latest/specs/approved/pegleg-secrets.html#document-generation
[1] https://airship-specs.readthedocs.io/en/latest/specs/approved/pegleg-secrets.html#peglegmanageddocument
[2] https://airship-specs.readthedocs.io/en/latest/specs/approved/pegleg-secrets.html
[3] https://github.com/openstack/airship-promenade/blob/master/promenade/pki.py
[4] https://github.com/openstack/airship-promenade/blob/master/promenade/generator.py
[5] https://review.openstack.org/#/c/611739/
[6] https://review.openstack.org/#/c/608159/
[7] https://review.openstack.org/#/c/611738/
Change-Id: I3010d04cac6d22c656d144f0dafeaa5e19a13068
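The diff below shows that generated secrets are wrapped in a ``pegleg/PeglegManagedDocument/v1`` envelope (e.g. ``gen_cert`` reads ``ca_cert['data']['managedDocument']['data']``). A simplified, hypothetical sketch of that wrap/unwrap shape, for orientation only (field names beyond ``data.managedDocument`` are assumptions, not Pegleg's actual implementation):

```python
def wrap_document(doc):
    """Wrap a bare Deckhand secret document in a PeglegManagedDocument
    envelope (simplified sketch; only data.managedDocument mirrors the
    diff below, the rest is illustrative)."""
    return {
        'schema': 'pegleg/PeglegManagedDocument/v1',
        'metadata': {
            'name': doc['metadata']['name'],
            'schema': 'metadata/Document/v1',
        },
        'data': {'managedDocument': doc},
    }


def unwrap_document(wrapper):
    """Recover the wrapped document, as gen_cert does in the diff."""
    return wrapper['data']['managedDocument']
```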
This commit is contained in:
parent 40da373023
commit 2a8d2638b3

.zuul.yaml (16 lines changed)
@@ -18,11 +18,13 @@
     check:
       jobs:
         - openstack-tox-pep8
+        - airship-pegleg-tox-py36
         - airship-pegleg-doc-build
         - airship-pegleg-docker-build-gate
     gate:
       jobs:
         - openstack-tox-pep8
+        - airship-pegleg-tox-py36
         - airship-pegleg-doc-build
         - airship-pegleg-docker-build-gate
     post:
@@ -35,6 +37,20 @@
         - name: primary
           label: ubuntu-xenial

+- job:
+    name: airship-pegleg-tox-py36
+    description: |
+      Executes unit tests under Python 3.6
+    parent: openstack-tox-py36
+    pre-run:
+      - tools/gate/playbooks/install-cfssl.yaml
+    irrelevant-files:
+      - ^.*\.rst$
+      - ^doc/.*$
+      - ^etc/.*$
+      - ^releasenotes/.*$
+      - ^setup.cfg$
+
 - job:
     name: airship-pegleg-doc-build
     description: |
@@ -81,10 +81,10 @@ CLI Options

     Enable debug logging.

-.. _site:
+.. _repo-group:

-Repo
-====
+Repo Group
+==========

 Allows you to perform repository-level operations.

@@ -127,8 +127,10 @@ a specific site, see :ref:`site-level linting <cli-site-lint>`.

 See :ref:`linting` for more information.

-Site
-====
+.. _site-group:
+
+Site Group
+==========

 Allows you to perform site-level operations.

@@ -303,7 +305,7 @@ Show details for one site.

     Name of site.

-**-o /--output** (Optional).
+**-o/--output** (Optional).

     Where to output.

@@ -331,7 +333,7 @@ Render documents via `Deckhand`_ for one site.

     Name of site.

-**-o /--output** (Optional).
+**-o/--output** (Optional).

     Where to output.
@@ -418,6 +420,39 @@ Usage:

     ./pegleg.sh site <options> upload <site_name> --context-marker=<uuid>

+Site Secrets Group
+==================
+
+Subgroup of :ref:`site-group`.
+
+Generate PKI
+------------
+
+Generate certificates and keys according to all PKICatalog documents in the
+site using the PKI module. Regenerating certificates can be
+accomplished by re-running this command.
+
+Pegleg places generated document files in ``<site>/secrets/passphrases``,
+``<site>/secrets/certificates``, or ``<site>/secrets/keypairs`` as
+appropriate:
+
+* The generated filenames for passphrases will follow the pattern
+  :file:`<passphrase-doc-name>.yaml`.
+* The generated filenames for certificate authorities will follow the pattern
+  :file:`<ca-name>_ca.yaml`.
+* The generated filenames for certificates will follow the pattern
+  :file:`<ca-name>_<certificate-doc-name>_certificate.yaml`.
+* The generated filenames for certificate keys will follow the pattern
+  :file:`<ca-name>_<certificate-doc-name>_key.yaml`.
+* The generated filenames for keypairs will follow the pattern
+  :file:`<keypair-doc-name>.yaml`.
+
+Dashes in the document names will be converted to underscores for consistency.
+
+**site_name** (Required).
+
+    Name of site.
+
 Examples
 ^^^^^^^^

@@ -427,6 +462,14 @@ Examples

     upload <site_name> <options>

+
+::
+
+    ./pegleg.sh site -r <site_repo> -e <extra_repo> \
+      secrets generate-pki \
+      <site_name> \
+      -o <output> \
+      -f <filename>
+
 .. _command-line-repository-overrides:

 Secrets
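The filename patterns documented above imply a simple mapping from document names to output paths. A hypothetical sketch of that derivation (this is an illustration, not Pegleg's actual code; the directory for CA files is assumed to be ``certificates/``):

```python
import os


def secret_file_path(site, kind, doc_name, ca_name=None):
    """Derive the output path for a generated secret, following the
    documented patterns. Dashes become underscores."""
    name = doc_name.replace('-', '_')
    if kind == 'passphrase':
        return os.path.join(site, 'secrets', 'passphrases', name + '.yaml')
    if kind == 'ca':
        # Assumption: CA documents land under certificates/.
        return os.path.join(site, 'secrets', 'certificates',
                            name + '_ca.yaml')
    ca = (ca_name or '').replace('-', '_')
    if kind == 'certificate':
        return os.path.join(site, 'secrets', 'certificates',
                            '%s_%s_certificate.yaml' % (ca, name))
    if kind == 'certificate_key':
        return os.path.join(site, 'secrets', 'certificates',
                            '%s_%s_key.yaml' % (ca, name))
    # Anything else is treated as a keypair document.
    return os.path.join(site, 'secrets', 'keypairs', name + '.yaml')
```

For example, a passphrase document named ``ucp-admin-pass`` would be written to ``<site>/secrets/passphrases/ucp_admin_pass.yaml``.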
@@ -571,13 +614,13 @@ Example:


 CLI Repository Overrides
-------------------------
+========================

 Repository overrides should only be used for entries included underneath
 the ``repositories`` field for a given :file:`site-definition.yaml`.

-Overrides are specified via the ``-e`` flag for all :ref:`site` commands. They
-have the following format:
+Overrides are specified via the ``-e`` flag for all :ref:`site-group` commands.
+They have the following format:

 ::

@@ -611,7 +654,7 @@ Where:
 .. _self-contained-repo:

 Self-Contained Repository
-^^^^^^^^^^^^^^^^^^^^^^^^^
+-------------------------

 For self-contained repositories, specification of extra repositories is
 unnecessary. The following command can be used to deploy the manifests in
@@ -100,8 +100,8 @@ directory):

 .. code-block:: console

-    # Quick way of building a venv and installing all required dependencies into
-    # it.
+    # Quick way of building a virtualenv and installing all required
+    # dependencies into it.
     tox -e py36 --notest
     source .tox/py36/bin/activate
     pip install -e .

@@ -128,11 +128,11 @@ Unit Tests

 To run all unit tests, execute::

-    $ tox -epy36
+    $ tox -e py36

 To run unit tests using a regex, execute::

-    $ tox -epy36 -- <regex>
+    $ tox -e py36 -- <regex>

 .. _Airship: https://airshipit.readthedocs.io
 .. _Deckhand: https://airship-deckhand.readthedocs.io/
@@ -63,3 +63,11 @@ Authentication Exceptions
 .. autoexception:: pegleg.engine.util.shipyard_helper.AuthValuesError
    :members:
    :undoc-members:
+
+PKI Exceptions
+--------------
+
+.. autoexception:: pegleg.engine.exceptions.IncompletePKIPairError
+   :members:
+   :show-inheritance:
+   :undoc-members:
@@ -21,13 +21,14 @@ Getting Started
 What is Pegleg?
 ---------------

-Pegleg is a document aggregator that will aggregate all the documents in a
-repository and pack them into a single YAML file. This allows for operators to
+Pegleg is a document aggregator that aggregates all the documents in a
+repository and packs them into a single YAML file. This allows for operators to
 structure their site definitions in a maintainable directory layout, while
 providing them with the automation and tooling needed to aggregate, lint, and
 render those documents for deployment.

-For more information on the documents that Pegleg works on see `Document Fundamentals`_.
+For more information on the documents that Pegleg works on see
+`Document Fundamentals`_.

 Basic Usage
 -----------
(Binary image file changed; not shown. Size before and after: 37 KiB.)
@@ -1,5 +1,6 @@
 ARG FROM=python:3.6
 FROM ${FROM}
+ARG CFSSLURL=https://pkg.cfssl.org/R1.2/cfssl_linux-amd64

 LABEL org.opencontainers.image.authors='airship-discuss@lists.airshipit.org, irc://#airshipit@freenode'
 LABEL org.opencontainers.image.url='https://airshipit.org'

@@ -14,5 +15,8 @@ WORKDIR /var/pegleg
 COPY requirements.txt /opt/pegleg/requirements.txt
 RUN pip3 install --no-cache-dir -r /opt/pegleg/requirements.txt

+COPY tools/install-cfssl.sh /opt/pegleg/tools/install-cfssl.sh
+RUN /opt/pegleg/tools/install-cfssl.sh ${CFSSLURL}
+
 COPY . /opt/pegleg
 RUN pip3 install -e /opt/pegleg
@@ -20,6 +20,7 @@ import click

 from pegleg import config
 from pegleg import engine
+from pegleg.engine import catalog
 from pegleg.engine.util.shipyard_helper import ShipyardHelper

 LOG = logging.getLogger(__name__)

@@ -130,7 +131,6 @@ def main(*, verbose):

     * site: site-level actions
     * repo: repository-level actions
-    * stub (DEPRECATED)

     """

@@ -208,7 +208,7 @@ def site(*, site_repository, clone_path, extra_repositories, repo_key,
     * list: list available sites in a manifests repo
     * lint: lint a site along with all its dependencies
     * render: render a site using Deckhand
-    * show: show a sites' files
+    * show: show a site's files

     """

@@ -375,6 +375,39 @@ def upload(ctx, *, os_project_domain_name,
     click.echo(ShipyardHelper(ctx).upload_documents())


+@site.group(name='secrets', help='Commands to manage site secrets documents')
+def secrets():
+    pass
+
+
+@secrets.command(
+    'generate-pki',
+    help="""
+Generate certificates and keys according to all PKICatalog documents in the
+site. Regenerating certificates can be accomplished by re-running this command.
+""")
+@click.option(
+    '-a',
+    '--author',
+    'author',
+    help="""Identifying name of the author generating new certificates. Used
+for tracking provenance information in the PeglegManagedDocuments. An attempt
+is made to automatically determine this value, but should be provided.""")
+@click.argument('site_name')
+def generate_pki(site_name, author):
+    """Generate certificates, certificate authorities and keypairs for a given
+    site.
+
+    """
+
+    engine.repository.process_repositories(site_name,
+                                           overwrite_existing=True)
+    pkigenerator = catalog.pki_generator.PKIGenerator(site_name, author=author)
+    output_paths = pkigenerator.generate()
+
+    click.echo("Generated PKI files written to:\n%s" % '\n'.join(output_paths))
+
+
 @main.group(help='Commands related to types')
 @MAIN_REPOSITORY_OPTION
 @REPOSITORY_CLONE_PATH_OPTION

@@ -409,11 +442,6 @@ def list_types(*, output_stream):
     engine.type.list_types(output_stream)


-@site.group(name='secrets', help='Commands to manage site secrets documents')
-def secrets():
-    pass
-
-
 @secrets.command(
     'encrypt',
     help='Command to encrypt and wrap site secrets '

@@ -437,7 +465,9 @@ def secrets():
     'documents')
 @click.argument('site_name')
 def encrypt(*, save_location, author, site_name):
-    engine.repository.process_repositories(site_name)
+    engine.repository.process_repositories(site_name, overwrite_existing=True)
+    if save_location is None:
+        save_location = config.get_site_repo()
     engine.secrets.encrypt(save_location, author, site_name)


@@ -453,4 +483,9 @@ def encrypt(*, save_location, author, site_name):
 @click.argument('site_name')
 def decrypt(*, file_name, site_name):
     engine.repository.process_repositories(site_name)
-    engine.secrets.decrypt(file_name, site_name)
+    try:
+        click.echo(engine.secrets.decrypt(file_name, site_name))
+    except FileNotFoundError:
+        raise click.exceptions.FileError("Couldn't find file %s, "
+                                         "check your arguments and try "
+                                         "again." % file_name)
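The revised ``decrypt`` command above translates a low-level ``FileNotFoundError`` into a user-facing CLI error rather than letting a traceback escape. A dependency-free sketch of the same pattern (``CLIError`` here is a stand-in for ``click.exceptions.FileError``):

```python
class CLIError(Exception):
    """Stand-in for click.exceptions.FileError."""


def read_secret_file(path):
    """Read a secrets file, converting a missing file into a friendly
    CLI error instead of a raw traceback."""
    try:
        with open(path) as f:
            return f.read()
    except FileNotFoundError:
        raise CLIError("Couldn't find file %s, check your arguments and "
                       "try again." % path)
```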
@@ -25,7 +25,8 @@ except NameError:
         'extra_repos': [],
         'clone_path': None,
         'site_path': 'site',
-        'type_path': 'type'
+        'site_rev': None,
+        'type_path': 'type',
     }


@@ -49,6 +50,16 @@ def set_clone_path(p):
     GLOBAL_CONTEXT['clone_path'] = p


+def get_site_rev():
+    """Get site revision derived from the site repo URL/path, if provided."""
+    return GLOBAL_CONTEXT['site_rev']
+
+
+def set_site_rev(r):
+    """Set site revision derived from the site repo URL/path."""
+    GLOBAL_CONTEXT['site_rev'] = r
+
+
 def get_extra_repo_overrides():
     """Get extra repository overrides specified via ``-e`` CLI flag."""
     return GLOBAL_CONTEXT.get('extra_repo_overrides', [])
pegleg/engine/catalog/__init__.py (new file, 17 lines)

# Copyright 2018 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# flake8: noqa
from pegleg.engine.catalog import pki_utility
from pegleg.engine.catalog import pki_generator
pegleg/engine/catalog/pki_generator.py (new file, 307 lines)

# Copyright 2018 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import collections
import itertools
import logging
import os

import yaml

from pegleg import config
from pegleg.engine.catalog import pki_utility
from pegleg.engine.common import managed_document as md
from pegleg.engine import exceptions
from pegleg.engine import util
from pegleg.engine.util.pegleg_managed_document import \
    PeglegManagedSecretsDocument

__all__ = ['PKIGenerator']

LOG = logging.getLogger(__name__)


class PKIGenerator(object):
    """Generates certificates, certificate authorities and keypairs using
    the ``PKIUtility`` class.

    Pegleg searches through a given "site" to derive all the documents
    of kind ``PKICatalog``, which are in turn parsed for information related
    to the above secret types and passed to ``PKIUtility`` for generation.

    These secrets are output to various subdirectories underneath
    ``<site>/secrets/<subpath>``.

    """

    def __init__(self, sitename, block_strings=True, author=None):
        """Constructor for ``PKIGenerator``.

        :param str sitename: Site name for which to retrieve documents used
            for certificate and keypair generation.
        :param bool block_strings: Whether to dump out certificate data as
            block-style YAML string. Defaults to true.
        :param str author: Identifying name of the author generating new
            certificates.

        """

        self._sitename = sitename
        self._documents = util.definition.documents_for_site(sitename)
        self._author = author

        self.keys = pki_utility.PKIUtility(block_strings=block_strings)
        self.outputs = collections.defaultdict(dict)

        # Maps certificates to CAs in order to derive certificate paths.
        self._cert_to_ca_map = {}

    def generate(self):
        for catalog in util.catalog.iterate(
                documents=self._documents, kind='PKICatalog'):
            for ca_name, ca_def in catalog['data'].get(
                    'certificate_authorities', {}).items():
                ca_cert, ca_key = self.get_or_gen_ca(ca_name)

                for cert_def in ca_def.get('certificates', []):
                    document_name = cert_def['document_name']
                    self._cert_to_ca_map.setdefault(document_name, ca_name)
                    cert, key = self.get_or_gen_cert(
                        document_name,
                        ca_cert=ca_cert,
                        ca_key=ca_key,
                        cn=cert_def['common_name'],
                        hosts=_extract_hosts(cert_def),
                        groups=cert_def.get('groups', []))

            for keypair_def in catalog['data'].get('keypairs', []):
                document_name = keypair_def['name']
                self.get_or_gen_keypair(document_name)

        return self._write(config.get_site_repo())

    def get_or_gen_ca(self, document_name):
        kinds = [
            'CertificateAuthority',
            'CertificateAuthorityKey',
        ]
        return self._get_or_gen(self.gen_ca, kinds, document_name)

    def get_or_gen_cert(self, document_name, **kwargs):
        kinds = [
            'Certificate',
            'CertificateKey',
        ]
        return self._get_or_gen(self.gen_cert, kinds, document_name, **kwargs)

    def get_or_gen_keypair(self, document_name):
        kinds = [
            'PublicKey',
            'PrivateKey',
        ]
        return self._get_or_gen(self.gen_keypair, kinds, document_name)

    def gen_ca(self, document_name, **kwargs):
        return self.keys.generate_ca(document_name, **kwargs)

    def gen_cert(self, document_name, *, ca_cert, ca_key, **kwargs):
        ca_cert_data = ca_cert['data']['managedDocument']['data']
        ca_key_data = ca_key['data']['managedDocument']['data']
        return self.keys.generate_certificate(
            document_name, ca_cert=ca_cert_data, ca_key=ca_key_data, **kwargs)

    def gen_keypair(self, document_name):
        return self.keys.generate_keypair(document_name)

    def _get_or_gen(self, generator, kinds, document_name, *args, **kwargs):
        docs = self._find_docs(kinds, document_name)
        if not docs:
            docs = generator(document_name, *args, **kwargs)
        else:
            docs = [PeglegManagedSecretsDocument(doc).pegleg_document
                    for doc in docs]

        # Adding these to output should be idempotent, so we use a dict.
        for wrapper_doc in docs:
            wrapped_doc = wrapper_doc['data']['managedDocument']
            schema = wrapped_doc['schema']
            name = wrapped_doc['metadata']['name']
            self.outputs[schema][name] = wrapper_doc

        return docs

    def _find_docs(self, kinds, document_name):
        schemas = ['deckhand/%s/v1' % k for k in kinds]
        docs = self._find_among_collected(schemas, document_name)
        if docs:
            if len(docs) == len(kinds):
                LOG.debug('Found docs in input config named %s, kinds: %s',
                          document_name, kinds)
                return docs
            else:
                raise exceptions.IncompletePKIPairError(
                    kinds=kinds, name=document_name)
        else:
            docs = self._find_among_outputs(schemas, document_name)
            if docs:
                LOG.debug('Found docs in current outputs named %s, kinds: %s',
                          document_name, kinds)
                return docs
        # TODO(felipemonteiro): Should this be a critical error?
        LOG.debug('No existing docs named %s, kinds: %s', document_name,
                  kinds)
        return []

    def _find_among_collected(self, schemas, document_name):
        result = []
        for schema in schemas:
            doc = _find_document_by(
                self._documents, schema=schema, name=document_name)
            # If the document wasn't found, then it means it needs to be
            # generated.
            if doc:
                result.append(doc)
        return result

    def _find_among_outputs(self, schemas, document_name):
        result = []
        for schema in schemas:
            if document_name in self.outputs.get(schema, {}):
                result.append(self.outputs[schema][document_name])
        return result

    def _write(self, output_dir):
        documents = self.get_documents()
        output_paths = set()

        # First, delete each of the output paths below because we do an append
        # action in the `open` call below. This means that for regeneration
        # of certs, the original paths must be deleted.
        for document in documents:
            output_file_path = md.get_document_path(
                sitename=self._sitename,
                wrapper_document=document,
                cert_to_ca_map=self._cert_to_ca_map)
            output_path = os.path.join(output_dir, 'site', output_file_path)
            # NOTE(felipemonteiro): This is currently an entirely safe
            # operation as these files are being removed in the temporarily
            # replicated versions of the local repositories.
            if os.path.exists(output_path):
                os.remove(output_path)

        # Next, generate (or regenerate) the certificates.
        for document in documents:
            output_file_path = md.get_document_path(
                sitename=self._sitename,
                wrapper_document=document,
                cert_to_ca_map=self._cert_to_ca_map)
            output_path = os.path.join(output_dir, 'site', output_file_path)
            dir_name = os.path.dirname(output_path)

            if not os.path.exists(dir_name):
                LOG.debug('Creating secrets path: %s', dir_name)
                os.makedirs(dir_name)

            with open(output_path, 'a') as f:
                # Don't use safe_dump so we can block format certificate
                # data.
                yaml.dump(
                    document,
                    stream=f,
                    default_flow_style=False,
                    explicit_start=True,
                    indent=2)

            output_paths.add(output_path)
        return output_paths

    def get_documents(self):
        return list(
            itertools.chain.from_iterable(
                v.values() for v in self.outputs.values()))


def get_host_list(service_names):
    service_list = []
    for service in service_names:
        parts = service.split('.')
        for i in range(len(parts)):
            service_list.append('.'.join(parts[:i + 1]))
    return service_list


def _extract_hosts(cert_def):
    hosts = cert_def.get('hosts', [])
    hosts.extend(get_host_list(cert_def.get('kubernetes_service_names', [])))
    return hosts


def _find_document_by(documents, **kwargs):
    try:
        return next(_iterate(documents, **kwargs))
    except StopIteration:
        return None


def _iterate(documents, *, kind=None, schema=None, labels=None, name=None):
    if kind is not None:
        if schema is not None:
            raise AssertionError('Logic error: specified both kind and schema')
        schema = 'promenade/%s/v1' % kind

    for document in documents:
        if _matches_filter(document, schema=schema, labels=labels, name=name):
            yield document


def _matches_filter(document, *, schema, labels, name):
    matches = True

    if md.is_managed_document(document):
        document = document['data']['managedDocument']
    else:
        document_schema = document['schema']
        if document_schema in md.SUPPORTED_SCHEMAS:
            # Can't use the filter values as they might not be an exact match.
            document_metadata = document['metadata']
            document_labels = document_metadata.get('labels', {})
            document_name = document_metadata['name']
            LOG.warning('Detected deprecated unmanaged document during PKI '
                        'generation. Details: schema=%s, name=%s, labels=%s.',
                        document_schema, document_labels, document_name)

    if schema is not None and not document.get('schema',
                                               '').startswith(schema):
        matches = False

    if labels is not None:
        document_labels = _mg(document, 'labels', [])
        for key, value in labels.items():
            if key not in document_labels:
                matches = False
            else:
                if document_labels[key] != value:
                    matches = False

    if name is not None:
        if _mg(document, 'name') != name:
            matches = False

    return matches


def _mg(document, field, default=None):
    return document.get('metadata', {}).get(field, default)
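``get_host_list`` above expands each Kubernetes service FQDN into the list of all its dot-separated prefixes, so a certificate's SAN list covers every shortened form of the service name. For example:

```python
def get_host_list(service_names):
    # Same logic as the function in the file above: build cumulative
    # dot-joined prefixes for each service name.
    service_list = []
    for service in service_names:
        parts = service.split('.')
        for i in range(len(parts)):
            service_list.append('.'.join(parts[:i + 1]))
    return service_list


print(get_host_list(['kubernetes.default.svc']))
# ['kubernetes', 'kubernetes.default', 'kubernetes.default.svc']
```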
330
pegleg/engine/catalog/pki_utility.py
Normal file
330
pegleg/engine/catalog/pki_utility.py
Normal file
@ -0,0 +1,330 @@
# Copyright 2018 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from datetime import datetime
import json
import logging
import os
# Ignore bandit false positive: B404:blacklist
# The purpose of this module is to safely encapsulate calls via fork.
import subprocess  # nosec
import tempfile

from dateutil import parser
import pytz
import yaml

from pegleg.engine.util.pegleg_managed_document import \
    PeglegManagedSecretsDocument

LOG = logging.getLogger(__name__)
_ONE_YEAR_IN_HOURS = '8760h'  # 365 * 24

__all__ = ['PKIUtility']


# TODO(felipemonteiro): Create an abstract base class for other future Catalog
# classes.


class PKIUtility(object):
    """Public Key Infrastructure utility class.

    Responsible for generating certificate and CA documents using ``cfssl`` and
    keypairs using ``openssl``. These secrets are all wrapped in instances
    of ``pegleg/PeglegManagedDocument/v1``.

    """

    @staticmethod
    def cfssl_exists():
        """Checks whether the cfssl command exists. Useful for testing."""
        try:
            subprocess.check_output(  # nosec
                ['which', 'cfssl'], stderr=subprocess.STDOUT)
            return True
        except subprocess.CalledProcessError:
            return False

    def __init__(self, *, block_strings=True):
        self.block_strings = block_strings
        self._ca_config_string = None

    @property
    def ca_config(self):
        if not self._ca_config_string:
            self._ca_config_string = json.dumps({
                'signing': {
                    'default': {
                        # TODO(felipemonteiro): Make this configurable.
                        'expiry': _ONE_YEAR_IN_HOURS,
                        'usages': [
                            'signing', 'key encipherment', 'server auth',
                            'client auth'
                        ],
                    },
                },
            })
        return self._ca_config_string

    def generate_ca(self, ca_name):
        """Generate CA cert and associated key.

        :param str ca_name: Name of Certificate Authority in wrapped document.
        :returns: Tuple of (wrapped CA cert, wrapped CA key)
        :rtype: tuple[dict, dict]

        """
        result = self._cfssl(
            ['gencert', '-initca', 'csr.json'],
            files={
                'csr.json': self.csr(name=ca_name),
            })

        return (self._wrap_ca(ca_name, result['cert']),
                self._wrap_ca_key(ca_name, result['key']))

    def generate_keypair(self, name):
        """Generate keypair.

        :param str name: Name of keypair in wrapped document.
        :returns: Tuple of (wrapped public key, wrapped private key)
        :rtype: tuple[dict, dict]

        """
        priv_result = self._openssl(['genrsa', '-out', 'priv.pem'])
        pub_result = self._openssl(
            ['rsa', '-in', 'priv.pem', '-pubout', '-out', 'pub.pem'],
            files={
                'priv.pem': priv_result['priv.pem'],
            })

        return (self._wrap_pub_key(name, pub_result['pub.pem']),
                self._wrap_priv_key(name, priv_result['priv.pem']))

    def generate_certificate(self,
                             name,
                             *,
                             ca_cert,
                             ca_key,
                             cn,
                             groups=None,
                             hosts=None):
        """Generate certificate and associated key given CA cert and key.

        :param str name: Name of certificate in wrapped document.
        :param str ca_cert: CA certificate.
        :param str ca_key: CA certificate key.
        :param str cn: Common name associated with certificate.
        :param list groups: List of groups associated with certificate.
        :param list hosts: List of hosts associated with certificate.
        :returns: Tuple of (wrapped certificate, wrapped certificate key)
        :rtype: tuple[dict, dict]

        """
        if groups is None:
            groups = []
        if hosts is None:
            hosts = []

        result = self._cfssl(
            [
                'gencert', '-ca', 'ca.pem', '-ca-key', 'ca-key.pem', '-config',
                'ca-config.json', 'csr.json'
            ],
            files={
                'ca-config.json': self.ca_config,
                'ca.pem': ca_cert,
                'ca-key.pem': ca_key,
                'csr.json': self.csr(name=cn, groups=groups, hosts=hosts),
            })

        return (self._wrap_cert(name, result['cert']),
                self._wrap_cert_key(name, result['key']))

    def csr(self,
            *,
            name,
            groups=None,
            hosts=None,
            key={
                'algo': 'rsa',
                'size': 2048
            }):
        if groups is None:
            groups = []
        if hosts is None:
            hosts = []

        return json.dumps({
            'CN': name,
            'key': key,
            'hosts': hosts,
            'names': [{
                'O': g
            } for g in groups],
        })

    def cert_info(self, cert):
        """Retrieve certificate info via ``cfssl``.

        :param str cert: Client certificate that contains the public key.
        :returns: Information related to certificate.
        :rtype: dict

        """
        return self._cfssl(
            ['certinfo', '-cert', 'cert.pem'], files={
                'cert.pem': cert,
            })

    def check_expiry(self, cert):
        """Check whether a given certificate is expired.

        :param str cert: Client certificate that contains the public key.
        :returns: True if certificate is expired, else False.
        :rtype: bool

        """
        info = self.cert_info(cert)
        expiry_str = info['not_after']
        expiry = parser.parse(expiry_str)
        # expiry is timezone-aware; do the same for `now`.
        now = pytz.utc.localize(datetime.utcnow())
        return now > expiry

    def _cfssl(self, command, *, files=None):
        """Executes ``cfssl`` command via ``subprocess`` call."""
        if not files:
            files = {}
        with tempfile.TemporaryDirectory() as tmp:
            for filename, data in files.items():
                with open(os.path.join(tmp, filename), 'w') as f:
                    f.write(data)

            # Ignore bandit false positive:
            #   B603:subprocess_without_shell_equals_true
            # This method wraps cfssl calls originating from this module.
            result = subprocess.check_output(  # nosec
                ['cfssl'] + command, cwd=tmp, stderr=subprocess.PIPE)
            if not isinstance(result, str):
                result = result.decode('utf-8')
            return json.loads(result)

    def _openssl(self, command, *, files=None):
        """Executes ``openssl`` command via ``subprocess`` call."""
        if not files:
            files = {}

        with tempfile.TemporaryDirectory() as tmp:
            for filename, data in files.items():
                with open(os.path.join(tmp, filename), 'w') as f:
                    f.write(data)

            # Ignore bandit false positive:
            #   B603:subprocess_without_shell_equals_true
            # This method wraps openssl calls originating from this module.
            subprocess.check_call(  # nosec
                ['openssl'] + command,
                cwd=tmp,
                stderr=subprocess.PIPE)

            result = {}
            for filename in os.listdir(tmp):
                if filename not in files:
                    with open(os.path.join(tmp, filename)) as f:
                        result[filename] = f.read()

            return result

    def _wrap_ca(self, name, data):
        return self.wrap_document(kind='CertificateAuthority', name=name,
                                  data=data, block_strings=self.block_strings)

    def _wrap_ca_key(self, name, data):
        return self.wrap_document(kind='CertificateAuthorityKey', name=name,
                                  data=data, block_strings=self.block_strings)

    def _wrap_cert(self, name, data):
        return self.wrap_document(kind='Certificate', name=name, data=data,
                                  block_strings=self.block_strings)

    def _wrap_cert_key(self, name, data):
        return self.wrap_document(kind='CertificateKey', name=name, data=data,
                                  block_strings=self.block_strings)

    def _wrap_priv_key(self, name, data):
        return self.wrap_document(kind='PrivateKey', name=name, data=data,
                                  block_strings=self.block_strings)

    def _wrap_pub_key(self, name, data):
        return self.wrap_document(kind='PublicKey', name=name, data=data,
                                  block_strings=self.block_strings)

    @staticmethod
    def wrap_document(kind, name, data, block_strings=True):
        """Wrap document ``data`` with PeglegManagedDocument pattern.

        :param str kind: The kind of document (found in ``schema``).
        :param str name: Name of the document.
        :param dict data: Document data.
        :param bool block_strings: Whether to dump out certificate data as
            block-style YAML string. Defaults to true.
        :return: the wrapped document
        :rtype: dict
        """
        wrapped_schema = 'deckhand/%s/v1' % kind
        wrapped_metadata = {
            'schema': 'metadata/Document/v1',
            'name': name,
            'layeringDefinition': {
                'abstract': False,
                'layer': 'site',
            }
        }
        wrapped_data = PKIUtility._block_literal(
            data, block_strings=block_strings)

        document = {
            "schema": wrapped_schema,
            "metadata": wrapped_metadata,
            "data": wrapped_data
        }

        return PeglegManagedSecretsDocument(document).pegleg_document

    @staticmethod
    def _block_literal(data, block_strings=True):
        if block_strings:
            return block_literal(data)
        else:
            return data


class block_literal(str):
    pass


def block_literal_representer(dumper, data):
    return dumper.represent_scalar('tag:yaml.org,2002:str', data, style='|')


yaml.add_representer(block_literal, block_literal_representer)
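The `block_literal` machinery at the end of the module is what makes PEM blobs render readably in the output manifests: a `str` subclass tags a value, and a custom representer tells PyYAML to emit it as a `|` block scalar. A minimal, self-contained demonstration of the same pattern (the sample certificate text is a placeholder):

```python
import yaml


class block_literal(str):
    """Marker type: values of this type are dumped in block-literal style."""


def block_literal_representer(dumper, data):
    # Represent the string as a YAML block scalar ('|') instead of a
    # quoted or folded inline scalar.
    return dumper.represent_scalar('tag:yaml.org,2002:str', data, style='|')


yaml.add_representer(block_literal, block_literal_representer)

pem = block_literal(
    '-----BEGIN CERTIFICATE-----\nMIIB...\n-----END CERTIFICATE-----\n')
rendered = yaml.dump({'data': pem}, default_flow_style=False)
```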
0	pegleg/engine/common/__init__.py	Normal file

115	pegleg/engine/common/managed_document.py	Normal file
@@ -0,0 +1,115 @@
# Copyright 2018 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import os

from pegleg import config
from pegleg.engine.util import git

MANAGED_DOCUMENT_SCHEMA = 'pegleg/PeglegManagedDocument/v1'
SUPPORTED_SCHEMAS = (
    'deckhand/CertificateAuthority/v1',
    'deckhand/CertificateAuthorityKey/v1',
    'deckhand/Certificate/v1',
    'deckhand/CertificateKey/v1',
    'deckhand/PublicKey/v1',
    'deckhand/PrivateKey/v1',
)

_KIND_TO_PATH = {
    'CertificateAuthority': 'certificates',
    'CertificateAuthorityKey': 'certificates',
    'Certificate': 'certificates',
    'CertificateKey': 'certificates',
    'PublicKey': 'keypairs',
    'PrivateKey': 'keypairs'
}


def is_managed_document(document):
    """Utility for determining whether a document is wrapped by the
    ``pegleg/PeglegManagedDocument/v1`` pattern.

    :param dict document: Document to check.
    :returns: True if document is managed, else False.
    :rtype: bool

    """
    return document.get('schema') == "pegleg/PeglegManagedDocument/v1"


def get_document_path(sitename, wrapper_document, cert_to_ca_map=None):
    """Get path for outputting generated certificates or keys to.

    Also updates the provenance path (``data.generated.specifiedBy.path``)
    for ``wrapper_document``.

    * Certificates are written to: ``<site>/secrets/certificates``
    * Keypairs are written to: ``<site>/secrets/keypairs``
    * Passphrases are written to: ``<site>/secrets/passphrases``

    * The generated filenames for passphrases will follow the pattern
      ``<passphrase-doc-name>.yaml``.
    * The generated filenames for certificate authorities will follow the
      pattern ``<ca-name>_ca.yaml``.
    * The generated filenames for certificates will follow the pattern
      ``<ca-name>_<certificate-doc-name>_certificate.yaml``.
    * The generated filenames for certificate keys will follow the pattern
      ``<ca-name>_<certificate-doc-name>_key.yaml``.
    * The generated filenames for keypairs will follow the pattern
      ``<keypair-doc-name>.yaml``.

    :param str sitename: Name of site.
    :param dict wrapper_document: Generated ``PeglegManagedDocument``.
    :param dict cert_to_ca_map: Dict that maps certificate names to
        their respective CA name.
    :returns: Path to write document out to.
    :rtype: str

    """
    cert_to_ca_map = cert_to_ca_map or {}

    managed_document = wrapper_document['data']['managedDocument']
    kind = managed_document['schema'].split("/")[1]
    name = managed_document['metadata']['name']

    path = "%s/secrets/%s" % (sitename, _KIND_TO_PATH[kind])

    if 'authority' in kind.lower():
        filename_structure = '%s_ca.yaml'
    elif 'certificate' in kind.lower():
        ca_name = cert_to_ca_map[name]
        filename_structure = ca_name + '_%s_certificate.yaml'
    elif 'public' in kind.lower() or 'private' in kind.lower():
        filename_structure = '%s.yaml'

    # Dashes in the document names are converted to underscores for
    # consistency.
    filename = (filename_structure % name).replace('-', '_')
    fullpath = os.path.join(path, filename)

    # Not all managed documents are generated. Only update path provenance
    # information for those that are.
    if wrapper_document['data'].get('generated'):
        wrapper_document['data']['generated']['specifiedBy']['path'] = fullpath
    return fullpath


def _get_repo_url_and_rev():
    repo_path_or_url = config.get_site_repo()
    repo_url = git.repo_url(repo_path_or_url)
    repo_rev = config.get_site_rev()
    return repo_url, repo_rev
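The path rules enumerated in `get_document_path` can be exercised with a compact, self-contained sketch (the function name `document_path` and the sample site/document names are illustrative, not part of the patch):

```python
import os

# Mirror of the patch's kind -> subdirectory mapping.
_KIND_TO_PATH = {
    'CertificateAuthority': 'certificates',
    'CertificateAuthorityKey': 'certificates',
    'Certificate': 'certificates',
    'CertificateKey': 'certificates',
    'PublicKey': 'keypairs',
    'PrivateKey': 'keypairs',
}


def document_path(sitename, kind, name, cert_to_ca_map=None):
    """Compute the output path per the filename rules above (sketch)."""
    cert_to_ca_map = cert_to_ca_map or {}
    path = '%s/secrets/%s' % (sitename, _KIND_TO_PATH[kind])
    if 'authority' in kind.lower():
        filename_structure = '%s_ca.yaml'
    elif 'certificate' in kind.lower():
        filename_structure = cert_to_ca_map[name] + '_%s_certificate.yaml'
    else:
        filename_structure = '%s.yaml'
    # Dashes become underscores for consistency.
    return os.path.join(path, (filename_structure % name).replace('-', '_'))
```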
@@ -65,3 +65,13 @@ class GitConfigException(PeglegBaseException):
 class GitInvalidRepoException(PeglegBaseException):
     """Exception raised when an invalid repository is detected."""
     message = 'The repository path or URL is invalid: %(repo_path)s'
+
+
+#
+# PKI EXCEPTIONS
+#
+
+
+class IncompletePKIPairError(PeglegBaseException):
+    """Exception for incomplete private/public keypair."""
+    message = ("Incomplete keypair set %(kinds)s for name: %(name)s")
@@ -42,18 +42,19 @@ def _clean_temp_folders():
         shutil.rmtree(r, ignore_errors=True)
 
 
-def process_repositories(site_name):
+def process_repositories(site_name, overwrite_existing=False):
     """Process and setup all repositories including ensuring we are at the
     right revision based on the site's own site-definition.yaml file.
 
     :param site_name: Site name for which to clone relevant repos.
+    :param overwrite_existing: Whether to overwrite an existing directory
 
     """
 
     # Only tracks extra repositories - not the site (primary) repository.
     extra_repos = []
 
-    site_repo = process_site_repository()
+    site_repo = process_site_repository(overwrite_existing=overwrite_existing)
 
     # Retrieve extra repo data from site-definition.yaml files.
     site_data = util.definition.load_as_params(
@@ -94,7 +95,9 @@ def process_repositories(site_name):
                 "repo_username=%s, revision=%s", repo_alias, repo_url_or_path,
                 repo_key, repo_user, repo_revision)
 
-        temp_extra_repo = _process_repository(repo_url_or_path, repo_revision)
+        temp_extra_repo = _process_repository(
+            repo_url_or_path, repo_revision,
+            overwrite_existing=overwrite_existing)
         extra_repos.append(temp_extra_repo)
 
     # Overwrite the site repo and extra repos in the config because further
@@ -105,12 +108,13 @@ def process_repositories(site_name):
     config.set_extra_repo_list(extra_repos)
 
 
-def process_site_repository(update_config=False):
+def process_site_repository(update_config=False, overwrite_existing=False):
     """Process and setup site repository including ensuring we are at the right
     revision based on the site's own site-definition.yaml file.
 
     :param bool update_config: Whether to update Pegleg config with computed
         site repo path.
+    :param overwrite_existing: Whether to overwrite an existing directory
 
     """
 
@@ -122,8 +126,10 @@ def process_site_repository(update_config=False):
 
     repo_url_or_path, repo_revision = _extract_repo_url_and_revision(
         site_repo_or_path)
+    config.set_site_rev(repo_revision)
     repo_url_or_path = _format_url_with_repo_username(repo_url_or_path)
-    new_repo_path = _process_repository(repo_url_or_path, repo_revision)
+    new_repo_path = _process_repository(repo_url_or_path, repo_revision,
+                                        overwrite_existing=overwrite_existing)
 
     if update_config:
         # Overwrite the site repo in the config because further processing will
|
|||||||
return new_repo_path
|
return new_repo_path
|
||||||
|
|
||||||
|
|
||||||
def _process_repository(repo_url_or_path, repo_revision):
|
def _process_repository(repo_url_or_path, repo_revision,
|
||||||
|
overwrite_existing=False):
|
||||||
"""Process a repository located at ``repo_url_or_path``.
|
"""Process a repository located at ``repo_url_or_path``.
|
||||||
|
|
||||||
:param str repo_url_or_path: Path to local repo or URL of remote URL.
|
:param str repo_url_or_path: Path to local repo or URL of remote URL.
|
||||||
:param str repo_revision: branch, commit or ref in the repo to checkout.
|
:param str repo_revision: branch, commit or ref in the repo to checkout.
|
||||||
|
:param overwrite_existing: Whether to overwrite an existing directory
|
||||||
|
|
||||||
"""
|
"""
|
||||||
|
|
||||||
global __REPO_FOLDERS
|
global __REPO_FOLDERS
|
||||||
|
|
||||||
if os.path.exists(repo_url_or_path):
|
if os.path.exists(repo_url_or_path) and not overwrite_existing:
|
||||||
repo_name = util.git.repo_name(repo_url_or_path)
|
repo_name = util.git.repo_name(repo_url_or_path)
|
||||||
parent_temp_path = tempfile.mkdtemp()
|
parent_temp_path = tempfile.mkdtemp()
|
||||||
__REPO_FOLDERS.setdefault(repo_name, parent_temp_path)
|
__REPO_FOLDERS.setdefault(repo_name, parent_temp_path)
|
||||||
|
@@ -75,12 +75,13 @@ def decrypt(file_path, site_name):
     :type file_path: string
     :param site_name: The name of the site to search for the file.
     :type site_name: string
+    :return: The decrypted secrets
+    :rtype: list
     """
 
     LOG.info('Started decrypting...')
     if (os.path.isfile(file_path) and
             [s for s in file_path.split(os.path.sep) if s == site_name]):
-        PeglegSecretManagement(file_path).decrypt_secrets()
+        return PeglegSecretManagement(file_path).decrypt_secrets()
     else:
         LOG.info('File: {} was not found. Check your file path and name, '
                  'and try again.'.format(file_path))
@@ -13,7 +13,8 @@
 # limitations under the License.
 
 # flake8: noqa
-from . import definition
-from . import files
-from . import deckhand
-from . import git
+from pegleg.engine.util import catalog
+from pegleg.engine.util import definition
+from pegleg.engine.util import deckhand
+from pegleg.engine.util import files
+from pegleg.engine.util import git
52	pegleg/engine/util/catalog.py	Normal file
@@ -0,0 +1,52 @@
# Copyright 2018 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""Utility functions for catalog files such as pki-catalog.yaml."""

import logging

from pegleg.engine.util import definition

LOG = logging.getLogger(__name__)

__all__ = ('iterate', )


def iterate(kind, sitename=None, documents=None):
    """Retrieve the list of catalog documents by catalog schema ``kind``.

    :param str kind: The schema kind of the catalog. For example, for schema
        ``pegleg/PKICatalog/v1`` kind should be "PKICatalog".
    :param str sitename: (optional) Site name for retrieving documents.
        Mutually exclusive with ``documents``.
    :param list documents: (optional) Documents to search through. Mutually
        exclusive with ``sitename``.
    :return: All catalog documents for ``kind``.
    :rtype: generator[dict]

    """

    if not any([sitename, documents]):
        raise ValueError('Either `sitename` or `documents` must be specified')

    documents = documents or definition.documents_for_site(sitename)
    for document in documents:
        schema = document.get('schema')
        # TODO(felipemonteiro): Remove 'promenade/%s/v1' once site manifest
        # documents switch to new 'pegleg' namespace.
        if schema == 'pegleg/%s/v1' % kind:
            yield document
        elif schema == 'promenade/%s/v1' % kind:
            LOG.warning('The schema promenade/%s/v1 is deprecated. Use '
                        'pegleg/%s/v1 instead.', kind, kind)
            yield document
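`iterate` is a plain generator filter over document schemas. A trimmed, self-contained sketch of its matching behavior (the deprecation warning is elided, and the sample documents are illustrative):

```python
def iterate(kind, documents):
    # Match the new pegleg namespace plus the deprecated promenade one,
    # which is kept for backwards compatibility.
    accepted = ('pegleg/%s/v1' % kind, 'promenade/%s/v1' % kind)
    for document in documents:
        if document.get('schema') in accepted:
            yield document


docs = [
    {'schema': 'pegleg/PKICatalog/v1', 'metadata': {'name': 'site-pki'}},
    {'schema': 'promenade/PKICatalog/v1', 'metadata': {'name': 'old-pki'}},
    {'schema': 'deckhand/Certificate/v1', 'metadata': {'name': 'unrelated'}},
]
matched = [d['metadata']['name'] for d in iterate('PKICatalog', docs)]
```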
@@ -41,10 +41,10 @@ def load_schemas_from_docs(documents):
     return schema_set, errors
 
 
-def deckhand_render(documents=[],
+def deckhand_render(documents=None,
                     fail_on_missing_sub_src=False,
                     validate=False):
+    documents = documents or []
     errors = []
     rendered_documents = []
 
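The `documents=[]` to `documents=None` change above fixes Python's shared-mutable-default pitfall: a list default is created once at function definition time and reused across calls. A small illustration of why the `None` sentinel idiom is safer (function names are illustrative):

```python
def buggy(item, acc=[]):
    # The same list object is reused on every call.
    acc.append(item)
    return acc


def fixed(item, acc=None):
    # A fresh list is created per call unless one is passed in.
    acc = acc or []
    acc.append(item)
    return acc


first, second = buggy(1), buggy(2)
f_first, f_second = fixed(1), fixed(2)
```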
@@ -11,6 +11,7 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+"""Utility functions for site-definition.yaml files."""
 
 import os
 
@@ -26,7 +26,8 @@ from pegleg.engine import exceptions
 
 LOG = logging.getLogger(__name__)
 
-__all__ = ('git_handler', )
+__all__ = ('git_handler', 'is_repository', 'is_equal', 'repo_url', 'repo_name',
+           'normalize_repo_path')
 
 
 def git_handler(repo_url,
@@ -377,21 +378,26 @@ def is_equal(first_repo, other_repo):
     return False
 
 
-def repo_name(repo_path):
-    """Get the repository name for local repo at ``repo_path``.
+def repo_url(repo_url_or_path):
+    """Get the repository URL for the local or remote repo at
+    ``repo_url_or_path``.
 
-    :param repo_path: Path to local Git repo.
+    :param repo_url_or_path: URL of remote Git repo or path to local Git repo.
     :returns: Corresponding repo name.
     :rtype: str
     :raises GitConfigException: If the path is not a valid Git repo.
 
     """
 
-    if not is_repository(normalize_repo_path(repo_path)[0]):
-        raise exceptions.GitConfigException(repo_path=repo_path)
+    # If ``repo_url_or_path`` is already a URL, no point in checking.
+    if not os.path.exists(repo_url_or_path):
+        return repo_url_or_path
+
+    if not is_repository(normalize_repo_path(repo_url_or_path)[0]):
+        raise exceptions.GitConfigException(repo_url=repo_url_or_path)
 
     # TODO(felipemonteiro): Support this for remote URLs too?
-    repo = Repo(repo_path, search_parent_directories=True)
+    repo = Repo(repo_url_or_path, search_parent_directories=True)
     config_reader = repo.config_reader()
     section = 'remote "origin"'
     option = 'url'
@@ -408,9 +414,24 @@ def repo_name(repo_path):
     else:
         return repo_url.split('/')[-1]
     except Exception:
-        raise exceptions.GitConfigException(repo_path=repo_path)
+        raise exceptions.GitConfigException(repo_url=repo_url_or_path)
 
-    raise exceptions.GitConfigException(repo_path=repo_path)
+    raise exceptions.GitConfigException(repo_url=repo_url_or_path)
+
+
+def repo_name(repo_url_or_path):
+    """Get the repository name for the local or remote repo at
+    ``repo_url_or_path``.
+
+    :param repo_url_or_path: URL of remote Git repo or path to local Git repo.
+    :returns: Corresponding repo name.
+    :rtype: str
+    :raises GitConfigException: If the path is not a valid Git repo.
+
+    """
+
+    _repo_url = repo_url(repo_url_or_path)
+    return _repo_url.split('/')[-1].split('.git')[0]
 
 
 def normalize_repo_path(repo_url_or_path):
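The new `repo_name` reduces to a string manipulation on whatever `repo_url` returns. A standalone sketch of that derivation (the helper name and example URL are illustrative):

```python
def repo_name_from(url_or_path):
    # Last path component, with any trailing '.git' suffix stripped,
    # mirroring the split chain in the new repo_name() above.
    return url_or_path.split('/')[-1].split('.git')[0]


https_name = repo_name_from('https://git.example.com/org/airship-pegleg.git')
local_name = repo_name_from('/tmp/repos/airship-pegleg')
```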
@@ -435,7 +456,7 @@ def normalize_repo_path(repo_url_or_path):
     """
 
     repo_url_or_path = repo_url_or_path.rstrip('/')
-    orig_repo_path = repo_url_or_path
+    orig_repo_url_or_path = repo_url_or_path
     sub_path = ""
     is_local_repo = os.path.exists(repo_url_or_path)
 
@ -455,8 +476,10 @@ def normalize_repo_path(repo_url_or_path):
|
|||||||
repo_url_or_path = os.path.abspath(repo_url_or_path)
|
repo_url_or_path = os.path.abspath(repo_url_or_path)
|
||||||
|
|
||||||
if not repo_url_or_path or not is_repository(repo_url_or_path):
|
if not repo_url_or_path or not is_repository(repo_url_or_path):
|
||||||
msg = "The repo_path=%s is not a valid Git repo" % (orig_repo_path)
|
msg = "The repo_path=%s is not a valid Git repo" % (
|
||||||
|
orig_repo_url_or_path)
|
||||||
LOG.error(msg)
|
LOG.error(msg)
|
||||||
raise exceptions.GitInvalidRepoException(repo_path=repo_url_or_path)
|
raise exceptions.GitInvalidRepoException(
|
||||||
|
repo_path=orig_repo_url_or_path)
|
||||||
|
|
||||||
return repo_url_or_path, sub_path
|
return repo_url_or_path, sub_path
|
||||||
|
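The rename above threads `repo_url_or_path` through the git helpers so they accept either a remote URL or a local path. The name extraction the new `repo_name` performs (last path segment, minus a trailing `.git`) can be sketched on its own; `guess_repo_name` here is an illustrative stand-alone helper, not Pegleg's actual function, which delegates to `repo_url` first:

```python
def guess_repo_name(repo_url_or_path):
    """Derive a repository name from a remote URL or local path by taking
    the last path segment and stripping a trailing '.git' suffix."""
    trimmed = repo_url_or_path.rstrip('/')
    return trimmed.split('/')[-1].split('.git')[0]
```

Both `https://github.com/openstack/airship-treasuremap.git` and a local checkout path like `/opt/airship-treasuremap/` resolve to the same name, which is why one code path can serve both inputs.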
@@ -15,7 +15,6 @@
 import logging
 import os
 import re
-import sys

 import click
 import yaml
@@ -130,9 +129,10 @@ class PeglegSecretManagement(object):
         included in a site secrets file, and print the result to the standard
         out."""

-        yaml.safe_dump_all(
-            self.get_decrypted_secrets(),
-            sys.stdout,
+        secrets = self.get_decrypted_secrets()
+
+        return yaml.safe_dump_all(
+            secrets,
             explicit_start=True,
             explicit_end=True,
             default_flow_style=False)
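The hunk above switches from dumping decrypted secrets straight to `sys.stdout` to returning the serialized string, leaving the caller to decide where the text goes. A minimal stdlib-only sketch of the same calling convention (illustrative helper, not Pegleg's API; `yaml.safe_dump_all` behaves analogously, returning a string only when no stream is passed):

```python
import io


def dump_documents(docs, stream=None):
    # Serialize each document as an explicit '---' ... '...' block.
    # When no stream is supplied, return the text instead of writing it,
    # mirroring the yaml.safe_dump_all() stream/return convention.
    target = stream if stream is not None else io.StringIO()
    for doc in docs:
        target.write("---\n%s\n...\n" % doc)
    if stream is None:
        return target.getvalue()
```

Returning the string makes the method testable without capturing stdout, which the new unit tests below rely on.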
pegleg/schemas/PKICatalog.yaml (new file, 44 lines)
@@ -0,0 +1,44 @@
+# TODO(felipemonteiro): Implement validation and use this.
+---
+schema: deckhand/DataSchema/v1
+metadata:
+  schema: metadata/Control/v1
+  name: pegleg/PKICatalog/v1
+  labels:
+    application: pegleg
+data:
+  $schema: http://json-schema.org/schema#
+  certificate_authorities:
+    type: array
+    items:
+      type: object
+      properties:
+        description:
+          type: string
+        certificates:
+          type: array
+          items:
+            type: object
+            properties:
+              document_name:
+                type: string
+              description:
+                type: string
+              common_name:
+                type: string
+              hosts:
+                type: array
+                items: string
+              groups:
+                type: array
+                items: string
+  keypairs:
+    type: array
+    items:
+      type: object
+      properties:
+        name:
+          type: string
+        description:
+          type: string
+...
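The schema above carries a TODO to wire validation in. Until the jsonschema-based check lands, the intended shape can be spot-checked structurally; this is a hypothetical helper following the mapping layout of the example pki-catalog.yaml in this patch, not the planned implementation:

```python
def check_pki_catalog(data):
    """Return a list of error strings for a pki-catalog ``data`` section.

    Spot-checks only: every CA needs a description, and every certificate
    needs a document_name and a common_name.
    """
    errors = []
    for ca_name, ca in (data.get('certificate_authorities') or {}).items():
        if 'description' not in ca:
            errors.append('CA %s: missing description' % ca_name)
        for cert in ca.get('certificates', []):
            for required in ('document_name', 'common_name'):
                if required not in cert:
                    errors.append('CA %s: certificate missing %s'
                                  % (ca_name, required))
    return errors
```

Note that the schema declares `certificate_authorities` as an array while the example document uses a mapping keyed by CA name; the validation work tracked in [6] will need to settle that discrepancy.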
@@ -3,5 +3,8 @@ click==6.7
 jsonschema==2.6.0
 pyyaml==3.12
 cryptography==2.3.1
+python-dateutil==2.7.3
+
+# External dependencies
 git+https://github.com/openstack/airship-deckhand.git@7d697012fcbd868b14670aa9cf895acfad5a7f8d
 git+https://github.com/openstack/airship-shipyard.git@44f7022df6438de541501c2fdd5c46df198b82bf#egg=shipyard_client&subdirectory=src/bin/shipyard_client
site_yamls/site/pki-catalog.yaml (new file, 23 lines)
@@ -0,0 +1,23 @@
+# Basic example of pki-catalog.yaml for k8s.
+---
+schema: promenade/PKICatalog/v1
+metadata:
+  schema: metadata/Document/v1
+  name: cluster-certificates-addition
+  layeringDefinition:
+    abstract: false
+    layer: site
+  storagePolicy: cleartext
+data:
+  certificate_authorities:
+    kubernetes:
+      description: CA for Kubernetes components
+      certificates:
+        - document_name: kubelet-n3
+          common_name: system:node:n3
+          hosts:
+            - n3
+            - 192.168.77.13
+          groups:
+            - system:nodes
+...
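Before calling out to cfssl, a generator reading a document like the example above might first flatten it into per-certificate work items. A sketch of that traversal (hypothetical helper; the real `PKIGenerator` in this patch does considerably more, including CA keypair generation and output-path handling):

```python
def iter_certificates(catalog_data):
    # Yield (ca_name, certificate_spec) pairs from a PKICatalog ``data``
    # section shaped like the example document above.
    for ca_name, ca in catalog_data.get('certificate_authorities', {}).items():
        for cert in ca.get('certificates', []):
            yield ca_name, cert
```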
@@ -1,3 +1,4 @@
+# TODO(felipemonteiro): Update `data` section below with new values.
 ---
 data:
   revision: v1.0
@@ -12,23 +12,29 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-import click
 import os
-import tempfile
+from os import listdir
+
+import click
 import mock
 import pytest
 import yaml
+import tempfile

-from pegleg.engine.util import encryption as crypt
-from tests.unit import test_utils
+from pegleg import config
+from pegleg.engine import secrets
+from pegleg.engine.catalog import pki_utility
+from pegleg.engine.catalog.pki_generator import PKIGenerator
+from pegleg.engine.util import encryption as crypt, catalog, git
+from pegleg.engine.util import files
 from pegleg.engine.util.pegleg_managed_document import \
     PeglegManagedSecretsDocument
-from pegleg.engine.util.pegleg_secret_management import PeglegSecretManagement
 from pegleg.engine.util.pegleg_secret_management import ENV_PASSPHRASE
 from pegleg.engine.util.pegleg_secret_management import ENV_SALT
-from tests.unit.fixtures import temp_path
-from pegleg.engine.util import files
+from pegleg.engine.util.pegleg_secret_management import PeglegSecretManagement
+from tests.unit import test_utils
+from tests.unit.fixtures import temp_path, create_tmp_deployment_files, _gen_document
+from tests.unit.test_cli import TestSiteSecretsActions, BaseCLIActionTest, TEST_PARAMS

 TEST_DATA = """
 ---
@@ -69,6 +75,44 @@ def test_short_passphrase():
         PeglegSecretManagement('file_path')


+@mock.patch.dict(os.environ, {
+    ENV_PASSPHRASE: 'ytrr89erARAiPE34692iwUMvWqqBvC',
+    ENV_SALT: 'MySecretSalt'})
+def test_secret_encrypt_and_decrypt(create_tmp_deployment_files, tmpdir):
+    site_dir = tmpdir.join("deployment_files", "site", "cicd")
+    passphrase_doc = """---
+schema: deckhand/Passphrase/v1
+metadata:
+  schema: metadata/Document/v1
+  name: {0}
+  storagePolicy: {1}
+  layeringDefinition:
+    abstract: False
+    layer: {2}
+data: {0}-password
+...
+""".format("cicd-passphrase-encrypted", "encrypted",
+           "site")
+    with open(os.path.join(str(site_dir), 'secrets',
+                           'passphrases',
+                           'cicd-passphrase-encrypted.yaml'), "w") \
+            as outfile:
+        outfile.write(passphrase_doc)
+
+    save_location = tmpdir.mkdir("encrypted_files")
+    save_location_str = str(save_location)
+
+    secrets.encrypt(save_location_str, "pytest", "cicd")
+    encrypted_files = listdir(save_location_str)
+    assert len(encrypted_files) > 0
+
+    decrypted = secrets.decrypt(str(save_location.join(
+        "site/cicd/secrets/passphrases/"
+        "cicd-passphrase-encrypted.yaml")), "cicd")
+    assert yaml.load(decrypted) == yaml.load(passphrase_doc)
+
+
 def test_pegleg_secret_management_constructor():
     test_data = yaml.load(TEST_DATA)
     doc = PeglegManagedSecretsDocument(test_data)
@@ -141,3 +185,52 @@ def test_encrypt_decrypt_using_docs(temp_path):
         'name']
     assert test_data[0]['metadata']['storagePolicy'] == decrypted_data[0][
         'metadata']['storagePolicy']
+
+
+@pytest.mark.skipif(
+    not pki_utility.PKIUtility.cfssl_exists(),
+    reason='cfssl must be installed to execute these tests')
+def test_generate_pki_using_local_repo_path(create_tmp_deployment_files):
+    """Validates ``generate-pki`` action using local repo path."""
+    # Scenario:
+    #
+    # 1) Generate PKI using local repo path
+
+    repo_path = str(git.git_handler(TEST_PARAMS["repo_url"],
+                                    ref=TEST_PARAMS["repo_rev"]))
+    with mock.patch.dict(config.GLOBAL_CONTEXT, {"site_repo": repo_path}):
+        pki_generator = PKIGenerator(sitename=TEST_PARAMS["site_name"])
+        generated_files = pki_generator.generate()
+
+        assert len(generated_files), 'No secrets were generated'
+        for generated_file in generated_files:
+            with open(generated_file, 'r') as f:
+                result = yaml.safe_load_all(f)  # Validate valid YAML.
+                assert list(result), "%s file is empty" % generated_file.name
+
+
+@pytest.mark.skipif(
+    not pki_utility.PKIUtility.cfssl_exists(),
+    reason='cfssl must be installed to execute these tests')
+def test_check_expiry(create_tmp_deployment_files):
+    """Validates check_expiry."""
+    repo_path = str(git.git_handler(TEST_PARAMS["repo_url"],
+                                    ref=TEST_PARAMS["repo_rev"]))
+    with mock.patch.dict(config.GLOBAL_CONTEXT, {"site_repo": repo_path}):
+        pki_generator = PKIGenerator(sitename=TEST_PARAMS["site_name"])
+        generated_files = pki_generator.generate()
+
+        pki_util = pki_utility.PKIUtility()
+
+        assert len(generated_files), 'No secrets were generated'
+        for generated_file in generated_files:
+            if "certificate" not in generated_file:
+                continue
+            with open(generated_file, 'r') as f:
+                results = yaml.safe_load_all(f)  # Validate valid YAML.
+                for result in results:
+                    if result['data']['managedDocument']['schema'] == \
+                            "deckhand/Certificate/v1":
+                        cert = result['data']['managedDocument']['data']
+                        assert not pki_util.check_expiry(cert), \
+                            "%s is expired!" % generated_file.name
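`test_check_expiry` above asserts that freshly generated certificates are not past their notAfter date, and requirements.txt gains python-dateutil for the timestamp parsing. The comparison at the heart of such a check can be sketched with the stdlib alone; the field format here is an assumption (an RFC 3339-style timestamp), not necessarily what `PKIUtility.check_expiry` consumes:

```python
from datetime import datetime, timedelta


def is_expired(not_after, now=None, fmt="%Y-%m-%dT%H:%M:%SZ"):
    # Compare a certificate's notAfter timestamp against the current UTC
    # time. The format string is an assumption; cfssl's certificate-info
    # output may use a different representation.
    now = now or datetime.utcnow()
    return datetime.strptime(not_after, fmt) <= now
```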
@@ -30,7 +30,7 @@ schema: deckhand/Passphrase/v1
 metadata:
   schema: metadata/Document/v1
   name: %(name)s
-  storagePolicy: cleartext
+  storagePolicy: %(storagePolicy)s
   layeringDefinition:
     abstract: False
     layer: %(layer)s
@@ -40,6 +40,8 @@ data: %(name)s-password


 def _gen_document(**kwargs):
+    if "storagePolicy" not in kwargs:
+        kwargs["storagePolicy"] = "cleartext"
     test_document = TEST_DOCUMENT % kwargs
     return yaml.load(test_document)

@@ -154,7 +156,7 @@ schema: pegleg/SiteDefinition/v1
     cicd_path = os.path.join(str(p), files._site_path(site))
     files._create_tree(cicd_path, tree=test_structure)

-    yield
+    yield tmpdir


 @pytest.fixture()
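The `_gen_document` change above backfills a default `storagePolicy` so call sites written before the template gained the placeholder keep working. `dict.setdefault` expresses the same idea more compactly; this is a stand-alone sketch with a trimmed template, not the fixture itself:

```python
TEST_DOCUMENT = """\
schema: deckhand/Passphrase/v1
metadata:
  name: %(name)s
  storagePolicy: %(storagePolicy)s
data: %(name)s-password
"""


def gen_document(**kwargs):
    # Default storagePolicy to cleartext so older callers that never
    # passed the key keep working after the template gained the
    # %(storagePolicy)s placeholder.
    kwargs.setdefault("storagePolicy", "cleartext")
    return TEST_DOCUMENT % kwargs
```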
@@ -19,14 +19,25 @@ from click.testing import CliRunner
 from mock import ANY
 import mock
 import pytest
+import yaml

 from pegleg import cli
+from pegleg.engine.catalog import pki_utility
 from pegleg.engine import errorcodes
 from pegleg.engine.util import git
 from tests.unit import test_utils
 from tests.unit.fixtures import temp_path


+TEST_PARAMS = {
+    "site_name": "airship-seaworthy",
+    "site_type": "foundry",
+    "repo_rev": '6b183e148b9bb7ba6f75c98dd13451088255c60b',
+    "repo_name": "airship-treasuremap",
+    "repo_url": "https://github.com/openstack/airship-treasuremap.git",
+}
+
+
 @pytest.mark.skipif(
     not test_utils.is_connected(),
     reason='git clone requires network connectivity.')
@@ -50,13 +61,13 @@ class BaseCLIActionTest(object):
         cls.runner = CliRunner()

         # Pin so we know that airship-seaworthy is a valid site.
-        cls.site_name = "airship-seaworthy"
-        cls.site_type = "foundry"
+        cls.site_name = TEST_PARAMS["site_name"]
+        cls.site_type = TEST_PARAMS["site_type"]

-        cls.repo_rev = '6b183e148b9bb7ba6f75c98dd13451088255c60b'
-        cls.repo_name = "airship-treasuremap"
-        repo_url = "https://github.com/openstack/%s.git" % cls.repo_name
-        cls.treasuremap_path = git.git_handler(repo_url, ref=cls.repo_rev)
+        cls.repo_rev = TEST_PARAMS["repo_rev"]
+        cls.repo_name = TEST_PARAMS["repo_name"]
+        cls.treasuremap_path = git.git_handler(TEST_PARAMS["repo_url"],
+                                               ref=TEST_PARAMS["repo_rev"])


 class TestSiteCLIOptions(BaseCLIActionTest):
@@ -428,6 +439,94 @@ class TestRepoCliActions(BaseCLIActionTest):
         assert not result.output


+class TestSiteSecretsActions(BaseCLIActionTest):
+    """Tests site secrets-related CLI actions."""
+
+    def _validate_generate_pki_action(self, result):
+        assert result.exit_code == 0
+
+        generated_files = []
+        output_lines = result.output.split("\n")
+        for line in output_lines:
+            if self.repo_name in line:
+                generated_files.append(line)
+
+        assert len(generated_files), 'No secrets were generated'
+        for generated_file in generated_files:
+            with open(generated_file, 'r') as f:
+                result = yaml.safe_load_all(f)  # Validate valid YAML.
+                assert list(result), "%s file is empty" % generated_file
+
+    @pytest.mark.skipif(
+        not pki_utility.PKIUtility.cfssl_exists(),
+        reason='cfssl must be installed to execute these tests')
+    def test_site_secrets_generate_pki_using_remote_repo_url(self):
+        """Validates ``generate-pki`` action using remote repo URL."""
+        # Scenario:
+        #
+        # 1) Generate PKI using remote repo URL
+
+        repo_url = 'https://github.com/openstack/%s@%s' % (self.repo_name,
+                                                           self.repo_rev)
+
+        secrets_opts = ['secrets', 'generate-pki', self.site_name]
+
+        result = self.runner.invoke(cli.site, ['-r', repo_url] + secrets_opts)
+        self._validate_generate_pki_action(result)
+
+    @pytest.mark.skipif(
+        not pki_utility.PKIUtility.cfssl_exists(),
+        reason='cfssl must be installed to execute these tests')
+    def test_site_secrets_generate_pki_using_local_repo_path(self):
+        """Validates ``generate-pki`` action using local repo path."""
+        # Scenario:
+        #
+        # 1) Generate PKI using local repo path
+
+        repo_path = self.treasuremap_path
+        secrets_opts = ['secrets', 'generate-pki', self.site_name]
+
+        result = self.runner.invoke(cli.site, ['-r', repo_path] + secrets_opts)
+        self._validate_generate_pki_action(result)
+
+    @pytest.mark.skipif(
+        not pki_utility.PKIUtility.cfssl_exists(),
+        reason='cfssl must be installed to execute these tests')
+    @mock.patch.dict(os.environ, {
+        "PEGLEG_PASSPHRASE": "123456789012345678901234567890",
+        "PEGLEG_SALT": "123456"
+    })
+    def test_site_secrets_encrypt_local_repo_path(self):
+        """Validates ``encrypt`` action using local repo path."""
+        # Scenario:
+        #
+        # 1) Encrypt a file in a local repo
+
+        repo_path = self.treasuremap_path
+        with open(os.path.join(repo_path, "site", "airship-seaworthy",
+                               "secrets", "passphrases", "ceph_fsid.yaml"), "r") \
+                as ceph_fsid_fi:
+            ceph_fsid = yaml.load(ceph_fsid_fi)
+            ceph_fsid["metadata"]["storagePolicy"] = "encrypted"
+
+        with open(os.path.join(repo_path, "site", "airship-seaworthy",
+                               "secrets", "passphrases", "ceph_fsid.yaml"), "w") \
+                as ceph_fsid_fi:
+            yaml.dump(ceph_fsid, ceph_fsid_fi)
+
+        secrets_opts = ['secrets', 'encrypt', '-a', 'test', self.site_name]
+        result = self.runner.invoke(cli.site, ['-r', repo_path] + secrets_opts)
+
+        assert result.exit_code == 0
+
+        with open(os.path.join(repo_path, "site", "airship-seaworthy",
+                               "secrets", "passphrases", "ceph_fsid.yaml"), "r") \
+                as ceph_fsid_fi:
+            ceph_fsid = yaml.load(ceph_fsid_fi)
+            assert "encrypted" in ceph_fsid["data"]
+            assert "managedDocument" in ceph_fsid["data"]
+
+
 class TestTypeCliActions(BaseCLIActionTest):
     """Tests type-level CLI actions."""
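The remote-URL test above passes `-r` a value of the form `URL@ref`, so somewhere the URL and revision must be split apart again. A sketch of that parsing (hypothetical helper, not Pegleg's actual `-r` handling):

```python
def split_repo_ref(repo_arg):
    """Split a '-r'-style repository argument of the form URL[@ref] into
    (url, ref); ref is None when no '@' revision suffix is present.

    Assumes the last '@' is the revision separator, which holds for the
    https URLs used in the tests above.
    """
    if '@' in repo_arg:
        url, _, ref = repo_arg.rpartition('@')
        return url, ref
    return repo_arg, None
```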
tools/gate/playbooks/install-cfssl.yaml (new file, 23 lines)
@@ -0,0 +1,23 @@
+# Copyright 2018 AT&T Intellectual Property. All other rights reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+- hosts: all
+  gather_facts: False
+  tasks:
+    - name: Install cfssl for Ubuntu
+      shell: |-
+        ./tools/install-cfssl.sh
+      become: yes
+      args:
+        chdir: "{{ zuul.project.src_dir }}"
@@ -8,6 +8,7 @@ RES=$(find . \
     -not -path "*/htmlcov/*" \
     -not -name "*.tgz" \
     -not -name "*.pyc" \
+    -not -name "*.html" \
     -type f -exec egrep -l " +$" {} \;)

 if [[ -n $RES ]]; then
tools/install-cfssl.sh (new executable file, 22 lines)
@@ -0,0 +1,22 @@
+#!/usr/bin/env bash
+
+set -ex
+
+if [ $# -eq 1 ]; then
+    CFSSLURL=$1
+else
+    CFSSLURL=${CFSSLURL:="http://pkg.cfssl.org/R1.2/cfssl_linux-amd64"}
+fi
+
+if [ -z $(which cfssl) ]; then
+    if [ $(whoami) == "root" ]; then
+        curl -Lo /usr/local/bin/cfssl ${CFSSLURL}
+        chmod 555 /usr/local/bin/cfssl
+    else
+        if [ ! -d ~/.local/bin ]; then
+            mkdir -p ~/.local/bin
+        fi
+        curl -Lo ~/.local/bin/cfssl ${CFSSLURL}
+        chmod 555 ~/.local/bin/cfssl
+    fi
+fi
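The script above installs cfssl into /usr/local/bin for root or ~/.local/bin otherwise, and the tests skip themselves via `PKIUtility.cfssl_exists()` when the binary is absent. A presence check of that kind can be written with `shutil.which`; this is a sketch, not Pegleg's implementation:

```python
import os
import shutil


def cfssl_exists(extra_path="~/.local/bin"):
    # Look for the cfssl binary on PATH, also considering ~/.local/bin,
    # where the install script above places it for non-root users (the
    # tox change below prepends that directory to PATH for the same
    # reason).
    search = os.environ.get("PATH", "") + os.pathsep + \
        os.path.expanduser(extra_path)
    return shutil.which("cfssl", path=search) is not None
```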
tox.ini
@@ -57,7 +57,12 @@ deps =
     -r{toxinidir}/requirements.txt
     -r{toxinidir}/test-requirements.txt
 commands =
-    pytest --cov=pegleg --cov-report html:cover --cov-report xml:cover/coverage.xml --cov-report term --cov-fail-under 84 tests/
+    {toxinidir}/tools/install-cfssl.sh
+    bash -c 'PATH=$PATH:~/.local/bin; pytest --cov=pegleg --cov-report \
+        html:cover --cov-report xml:cover/coverage.xml --cov-report term \
+        --cov-fail-under 84 tests/'
+whitelist_externals =
+    bash

 [testenv:releasenotes]
 basepython = python3