Add git and branch revision support to Pegleg CLI
https://review.openstack.org/#/c/582652/ added support for cloning from
repositories using various reference types and allowing checkouts from
already-cloned repositories. This patch set expands on that behavior by
enhancing the CLI to use revisioned repositories rather than relying on
pseudo-revision subfolders. Previously, Pegleg achieved revision behavior
through subdirectories. With this patch set, Pegleg treats every manifests
directory it encounters as a proper Git repository, so revision support for
clones/checkouts is now available. This patch set is therefore backwards
incompatible: it breaks the previous behavior of treating every directory as
a flat directory from which revision sub-folders were referenced. Going
forward, only repositories will be supported.

* Remove directory revision support
* Add support for repository definitions within the site-definition:

    repositories:
      global:
        revision: 0.9
        url: ssh://REPO_USERNAME@gerrit:29418/aic-clcp-manifests.git
      secrets:
        revision: master
        url: ssh://REPO_USERNAME@gerrit:29418/aic-clcp-security-manifests.git

  where REPO_USERNAME is replaced with the value from -u and the public SSH
  key supplied with -k is used.
* Remove primary and aux repository logic and replace with site repository
  and extra repositories
* Allow overriding definitions in the above site-definition on the command
  line: -e global=/mydir@revision
* Move linting into a site action, even though more work remains to restrict
  it to a site

Co-Authored-By: Felipe Monteiro <felipe.monteiro@att.com>
Change-Id: Iaa928ec2f777ed6f899d3b1790f5f9de613da9bb
parent ac6297eae6
commit 922083448c
@@ -67,17 +67,45 @@ to meet requirements.::

    schema: metadata/Document/v1
    storagePolicy: cleartext
    data:
      revision: 'v1.0'
      site_type: 'cicd'
      repositories: # Optional field.
        global:
          revision: 47676764d3935e4934624bf9593e9115984fe668
          url: ssh://REPO_USERNAME@gerrit:29418/aic-clcp-manifests.git
        secrets:
          revision: master
          url: ssh://REPO_USERNAME@gerrit:29418/aic-clcp-security-manifests.git

The ``revision`` field is used to select the definition libraries in the
``global`` layer. This layer is composed of a union of documents in the
``common`` definition library and the definition library for the
``revision``. The ``revision`` and ``site_type`` fields select the definition
library from the ``type`` layer, and the ``site`` layer is defined by the
single definition library under the site name.

The ``repositories`` field (optional) maps default authentication information
for each of the supported manifest repositories, for example:

* global
* secrets
* site

Each of the above fields must have 2 pieces of information:

* ``revision`` (required) - specifies a valid :ref:`git-reference`.

* ``url`` (required) - specifies the repository remote path. Consists of the
  following required segments:

  * protocol - http, https, or ssh.
  * REPO_USERNAME - must be included for ssh only. Can be overridden with the
    CLI via :ref:`command-line-repository-overrides`.
  * port - e.g. 29418 - must be included for ssh only.
  * repository name - e.g. aic-clcp-manifests
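A quick way to see these segments for an example remote is to parse it with
Python's standard library (illustrative only; the patch's git utilities import
``urlparse`` for similar handling)::

    from urllib.parse import urlparse

    parts = urlparse('ssh://REPO_USERNAME@gerrit:29418/aic-clcp-manifests.git')
    print(parts.scheme)    # 'ssh'  -> protocol
    print(parts.username)  # 'REPO_USERNAME'
    print(parts.port)      # 29418  -> port (ssh only)
    print(parts.path)      # '/aic-clcp-manifests.git' -> repository name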
Self-Contained Repository
^^^^^^^^^^^^^^^^^^^^^^^^^

Note that if the ``repositories`` field is omitted, then this implies that the
repository contains **all** the manifests required for site deployment. One
such example is `Airship in a Bottle`_, which hosts all the manifests required
for deploying a minimum OpenStack development environment.

Please see the :ref:`related CLI documentation <self-contained-repo>` for
information on how to issue relevant commands.

Definition Library Layout
=========================

@@ -157,3 +185,4 @@ site definition contains a set of documents.::

in whatever way makes sense. The best practice here to define
them by racks is only a suggestion.

.. _Airship in a Bottle: https://github.com/openstack/airship-in-a-bottle
@@ -49,29 +49,76 @@ CLI Options

Enable debug logging.

.. _site:

Site
----

This allows you to set the primary and auxiliary repositories.

**-p / --primary**

Path to the root of the primary (containing site_definition.yaml) repo.
(Required).

**-a / --auxiliary**

Path to the root of an auxiliary repo.

::

    ./pegleg.sh site -p <primary_repo> -a <auxiliary_repo> <command> <options>
    ./pegleg.sh site -r <site_repo> -e <extra_repo> <command> <options>

Example:

    ./pegleg.sh site -p /workspace/repo_1 -a /workspace/repo_2

**-r / --site-repository** (mandatory)

Path to the root of the site repository (containing site_definition.yaml).
(Required). For example: /opt/aic-site-clcp-manifests.

**-e / --extra-repository** (optional)

Path to the root of extra repositories used for overriding those specified
under the ``repositories`` field in a given :file:`site-definition.yaml`.

These should be named per the site-definition file, e.g.:

::

    -e global=/opt/global -e secrets=/opt/secrets

By default, the revision specified in the :file:`site-definition.yaml` for the
site will be leveraged but can be
:ref:`overridden <command-line-repository-overrides>` using:

::

    -e global=/opt/global@revision

Example usage:

::

    ./pegleg.sh site -r /opt/aic-clcp-site-manifests/ \
      -u <AUTH_USER> \
      -k /opt/.ssh/gerrit.pub \
      -e global=ssh://REPO_USERNAME@<GERRIT URL>:29418/aic-clcp-manifests.git@master \
      <command> <options>

.. _self-contained-repo:

Self-Contained Repository
^^^^^^^^^^^^^^^^^^^^^^^^^

For self-contained repositories, specification of extra repositories is
unnecessary. The following command can be used to deploy the manifests in
the example repository ``/opt/airship-in-a-bottle`` for the *currently checked
out revision*:

::

    pegleg site -r /opt/airship-in-a-bottle <command> <options>

To specify a specific revision at run time, execute:

::

    pegleg site -r /opt/airship-in-a-bottle@<REVISION> <command> <options>

Where ``<REVISION>`` must be a valid :ref:`git-reference`.

Collect
-------

Output complete config for one site.

**site_name**

@@ -80,7 +127,8 @@ Name of the site. (Required).

**-s / --save-location**

Where to output collected documents.
Where to output collected documents. If omitted, the results will be dumped
to ``stdout``.

**-x <code>** (Optional, validation only).

@@ -101,25 +149,28 @@ Usage:

::

    ./pegleg.sh <command> <options> collect site_name -s save_location
        -x P001 -w P002 --validate
    ./pegleg.sh <command> <options> collect <site_name> -s <save_location> \
        -x P001 -w P002 --validate

Example without validation:

::

    ./pegleg.sh site -p /workspace/repo_1 -a /workspace/repo_2
        collect site_name -s /workspace
    ./pegleg.sh site -r /opt/aic-clcp-site-manifests \
      -e global=/opt/aic-clcp-manifests \
      collect <site_name> -s /workspace

Example with validation:

::

    ./pegleg.sh site -p /workspace/repo_1 -a /workspace/repo_2
        collect site_name -s /workspace -x P004 --validate
    ./pegleg.sh site -r /opt/aic-clcp-site-manifests \
      -e global=/opt/aic-clcp-manifests \
      collect <site_name> -s /workspace -x P004 --validate

Impacted
--------

Find sites impacted by changed files.

**-i / --input**

@@ -136,6 +187,7 @@ Where to output.

List
----

List known sites.

**-o/--output**

@@ -147,10 +199,11 @@ Where to output.

    ./pegleg <command> <options> list

Example:

    ./pegleg site -p /workspace/repo_1 list -o /workspace
    ./pegleg site -r /opt/aic-clcp-site-manifests list -o /workspace

Show
----

Show details for one site.

**site_name**

@@ -166,33 +219,38 @@ Where to output.

    ./pegleg <command> <options> show site_name

Example:

    ./pegleg site -p /workspace/repo_1 show site_name -o /workspace
    ./pegleg site -r /opt/aic-clcp-site-manifests show site_name -o /workspace

.. _linting:

Lint
----

Sanity checks for repository content. Validations for linting are done
utilizing `Deckhand Validations`_.

Example:

::

    ./pegleg.sh lint -p <primary_repo> -a <auxiliary_repo>
        -f -x <lint_code> -w <lint_code>
    ./pegleg.sh site -r <site_repo> -e <extra_repo> \
      lint <site_name> \
      -f -x <lint_code> -w <lint_code>

Example:
The most basic way to lint a document set is as follows:

    ./pegleg.sh lint -p /workspace/site-repo -a /workspace/secondary-repo
        -x P001 -x P002 -w P003

::

**-p / --primary**

    ./pegleg.sh site -r <site_repo> -e <extra_repo> lint <site_name>

Path to the root of the primary (containing site_definition.yaml) repo.
(Required).

A more complex example involves excluding certain linting checks:

**-a / --auxiliary**

::

Path to the root of an auxiliary repo.

    ./pegleg.sh site -r /opt/aic-clcp-site-manifests \
      -e global=/opt/aic-clcp-manifests \
      lint <site_name> \
      -x P001 -x P002 -w P003

**-f / --fail-on-missing-sub-src**

@@ -224,5 +282,55 @@ Will warn of lint failures from the specified lint options.

P002 - Deckhand rendering is expected to complete without errors.
P003 - All repos contain expected directories.

.. _command-line-repository-overrides:

.. _`Deckhand Validations`: https://airship-deckhand.readthedocs.io/en/latest/validation.html

CLI Repository Overrides
------------------------

Repository overrides should only be used for entries included underneath
the ``repositories`` field for a given :file:`site-definition.yaml`.

Overrides are specified via the ``-e`` flag for all :ref:`site` commands. They
have the following format:

::

    -e <REPO_NAME>=<REPO_PATH_OR_URL>@<REVISION>

Where:

* REPO_NAME is one of: ``global``, ``secrets`` or ``site``.
* REPO_PATH_OR_URL is one of:

  * path (relative or absolute) - /opt/global or ../global, though absolute is
    recommended
  * url (fully qualified) - must have one of the following formats:

    * ssh - <PROTOCOL>://<REPO_USERNAME>@<GERRIT URL>:<PORT>/<REPO_NAME>.git
    * http|https - <PROTOCOL>://<GERRIT URL>/<REPO_NAME>.git

  Where:

  * <PROTOCOL> must be a valid authentication protocol: ssh, https, or http
  * <REPO_USERNAME> must be a user with access rights to the repository.
    This value will replace the literal string REPO_USERNAME in the
    corresponding entry under the ``repositories`` field in the relevant
    :file:`site-definition.yaml`, if applicable
  * <GERRIT_URL> must be a valid Gerrit URL
  * <PORT> must be a valid authentication port for SSH
  * <REVISION> must be a valid :ref:`git-reference`
  * <REPO_NAME> must be a valid Git repository name,
    e.g. aic-clcp-site-manifests
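As a rough sketch (not the patch's exact code), an override string is split
into its name, URL/path, and optional revision the same way the new
``pegleg/engine/repository.py`` does it: first on ``=``, then on the last
``@``. The sketch below assumes a revision is present whenever ``@`` appears::

    def split_override(override):
        """Split '-e' override strings like 'global=/opt/global@v1.0'."""
        name, url_or_path = override.split('=', 1)
        revision = None
        if '@' in url_or_path:
            # rsplit so the '@' in ssh://user@host is not mistaken for a revision
            url_or_path, revision = url_or_path.rsplit('@', 1)
        return name, url_or_path, revision

    print(split_override('global=/opt/global@v1.0'))
    # ('global', '/opt/global', 'v1.0')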
.. _git-reference:

Git Reference
^^^^^^^^^^^^^

Valid Git references for checking out repositories include:

* 47676764d3935e4934624bf9593e9115984fe668 (commit ID)
* refs/changes/79/47079/12 (ref)
* master (branch name)

.. _Deckhand Validations: https://airship-deckhand.readthedocs.io/en/latest/validation.html
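Illustrative only (not part of this patch): any of the reference forms above
can be checked out with GitPython, which the ``git_handler`` changes below
build on::

    from git import Repo

    repo = Repo('/opt/aic-clcp-manifests', search_parent_directories=True)
    # A branch name or commit ID works directly; Gerrit change refs such as
    # refs/changes/79/47079/12 must be fetched first.
    repo.git.checkout('master')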
@@ -46,21 +46,42 @@ def main(*, verbose):

@main.group(help='Commands related to sites')
@click.option(
    '-p',
    '--primary',
    'primary_repo',
    '-r',
    '--site-repository',
    'site_repository',
    required=True,
    help=
    'Path to the root of the primary (containing site_definition.yaml) repo.')
    'Path or URL to the primary repository (containing site_definition.yaml) '
    'repo.')
@click.option(
    '-a',
    '--auxiliary',
    'aux_repo',
    '-e',
    '--extra-repository',
    'extra_repositories',
    multiple=True,
    help='Path to the root of an auxiliary repo.')
def site(primary_repo, aux_repo):
    config.set_primary_repo(primary_repo)
    config.set_auxiliary_repo_list(aux_repo or [])
    help='Path or URL of additional repositories. These should be named per '
    'the site-definition file, e.g. -e global=/opt/global -e '
    'secrets=/opt/secrets. By default, the revision specified in the '
    'site-definition for the site will be leveraged but can be overridden '
    'using -e global=/opt/global@revision.')
@click.option(
    '-k',
    '--repo-key',
    'repo_key',
    help='The SSH public key to use when cloning remote authenticated '
    'repositories.')
@click.option(
    '-u',
    '--repo-username',
    'repo_username',
    help=
    'The SSH username to use when cloning remote authenticated repositories '
    'specified in the site-definition file. Any occurrences of REPO_USERNAME '
    'will be replaced with this value.')
def site(*, site_repository, extra_repositories, repo_key, repo_username):
    config.set_site_repo(site_repository)
    config.set_extra_repo_store(extra_repositories or [])
    config.set_repo_key(repo_key)
    config.set_repo_username(repo_username)


@site.command(help='Output complete config for one site')

@@ -104,6 +125,9 @@ def collect(*, save_location, validate, exclude_lint, warn_lint, site_name):
    Collect can lint documents prior to collection if the ``--validate``
    flag is optionally included.
    """

    engine.repository.process_repositories(site_name)

    if validate:
        # Lint the primary repo prior to document collection.
        _lint(

@@ -140,6 +164,7 @@ def impacted(*, input_stream, output_stream):
    default=sys.stdout,
    help='Where to output')
def list_(*, output_stream):
    engine.repository.process_site_repository(update_config=True)
    engine.site.list_(output_stream)


@@ -153,6 +178,7 @@ def list_(*, output_stream):
    help='Where to output')
@click.argument('site_name')
def show(*, output_stream, site_name):
    engine.repository.process_repositories(site_name)
    engine.site.show(site_name, output_stream)


@@ -166,9 +192,41 @@ def show(*, output_stream, site_name):
    help='Where to output')
@click.argument('site_name')
def render(*, output_stream, site_name):
    engine.repository.process_repositories(site_name)
    engine.site.render(site_name, output_stream)


@site.command('lint', help='Lint a site')
@click.option(
    '-f',
    '--fail-on-missing-sub-src',
    'fail_on_missing_sub_src',
    required=False,
    type=click.BOOL,
    default=True,
    help='Fail when there is missing substitution source.')
@click.option(
    '-x',
    '--exclude',
    'exclude_lint',
    multiple=True,
    help='Excludes specified linting checks. Warnings will still be issued. '
    '-w takes priority over -x.')
@click.option(
    '-w',
    '--warn',
    'warn_lint',
    multiple=True,
    help='Warn if linting check fails. -w takes priority over -x.')
@click.argument('site_name')
def lint(*, fail_on_missing_sub_src, exclude_lint, warn_lint, site_name):
    engine.repository.process_repositories(site_name)
    _lint(
        fail_on_missing_sub_src=fail_on_missing_sub_src,
        exclude_lint=exclude_lint,
        warn_lint=warn_lint)


def _validate_revision_callback(_ctx, _param, value):
    if value is not None and value.startswith('v'):
        return value

@@ -232,41 +290,3 @@ def _lint(*, fail_on_missing_sub_src, exclude_lint, warn_lint):
        click.echo("Linting passed, but produced some warnings.")
        for w in warns:
            click.echo(w)


@LINT_OPTION
@main.command(help='Sanity checks for repository content')
@click.option(
    '-p',
    '--primary',
    'primary_repo',
    required=True,
    help=
    'Path to the root of the primary (containing site_definition.yaml) repo.')
@click.option(
    '-a',
    '--auxiliary',
    'aux_repo',
    multiple=True,
    help='Path to the root of a auxiliary repo.')
@click.option(
    '-x',
    '--exclude',
    'exclude_lint',
    multiple=True,
    help='Excludes specified linting checks. Warnings will still be issued. '
    '-w takes priority over -x.')
@click.option(
    '-w',
    '--warn',
    'warn_lint',
    multiple=True,
    help='Warn if linting check fails. -w takes priority over -x.')
def lint(*, fail_on_missing_sub_src, primary_repo, aux_repo, exclude_lint,
         warn_lint):
    config.set_primary_repo(primary_repo)
    config.set_auxiliary_repo_list(aux_repo or [])
    _lint(
        fail_on_missing_sub_src=fail_on_missing_sub_src,
        exclude_lint=exclude_lint,
        warn_lint=warn_lint)
@@ -17,40 +17,64 @@ try:
        pass
    except NameError:
        GLOBAL_CONTEXT = {
            'primary_repo': './',
            'aux_repos': [],
            'site_repo': './',
            'extra_repos': [],
            'site_path': 'site'
        }


def get_primary_repo():
    return GLOBAL_CONTEXT['primary_repo']
def get_site_repo():
    return GLOBAL_CONTEXT['site_repo']


def set_primary_repo(r):
    GLOBAL_CONTEXT['primary_repo'] = r.rstrip('/') + '/'
def set_site_repo(r):
    GLOBAL_CONTEXT['site_repo'] = r.rstrip('/') + '/'


def set_auxiliary_repo_list(a):
    GLOBAL_CONTEXT['aux_repos'] = [r.rstrip('/') + '/' for r in a]
def get_extra_repo_store():
    return GLOBAL_CONTEXT.get('extra_repo_store', [])


def add_auxiliary_repo(a):
    GLOBAL_CONTEXT['aux_repos'].append(a.rstrip('/') + '/')
def set_extra_repo_store(r):
    GLOBAL_CONTEXT['extra_repo_store'] = r


def get_auxiliary_repo_list():
    return GLOBAL_CONTEXT['aux_repos']
def set_repo_key(k):
    GLOBAL_CONTEXT['repo_key'] = k


def each_auxiliary_repo():
    for a in GLOBAL_CONTEXT['aux_repos']:
def get_repo_key():
    return GLOBAL_CONTEXT.get('repo_key', None)


def set_repo_username(u):
    GLOBAL_CONTEXT['repo_username'] = u


def get_repo_username():
    return GLOBAL_CONTEXT.get('repo_username', None)


def set_extra_repo_list(a):
    GLOBAL_CONTEXT['extra_repos'] = [r.rstrip('/') + '/' for r in a]


def add_extra_repo(a):
    GLOBAL_CONTEXT['extra_repos'].append(a.rstrip('/') + '/')


def get_extra_repo_list():
    return GLOBAL_CONTEXT['extra_repos']


def each_extra_repo():
    for a in GLOBAL_CONTEXT['extra_repos']:
        yield a


def all_repos():
    repos = [get_primary_repo()]
    repos.extend(get_auxiliary_repo_list())
    repos = [get_site_repo()]
    repos.extend(get_extra_repo_list())
    return repos
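A minimal sketch (not part of this diff) of how the new config helpers above
are exercised by the ``site`` command group in the CLI changes, using
hypothetical values:

    from pegleg import config

    config.set_site_repo('/opt/aic-clcp-site-manifests')
    config.set_extra_repo_store(['global=/opt/aic-clcp-manifests@master'])
    config.set_repo_key('/opt/.ssh/gerrit.pub')
    config.set_repo_username('jdoe')
    # engine.repository later reads these back via get_site_repo(),
    # get_extra_repo_store(), get_repo_key() and get_repo_username().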
@@ -14,5 +14,6 @@

# flake8: noqa
from . import lint
from . import repository
from . import site
from . import stub
@@ -47,8 +47,16 @@ def full(fail_on_missing_sub_src=False, exclude_lint=None, warn_lint=None):
    # be added to the error list if SCHEMA_STORAGE_POLICY_MISMATCH_FLAG
    messages.extend(_verify_file_contents())

    # FIXME(felipemonteiro): Now that we are using revisioned repositories
    # instead of flat directories with subfolders mirroring "revisions",
    # this lint check analyzes ALL the directories (including these
    # no-longer-valid "revision directories") against the new subset of
    # relevant directories. We need to rewrite this check so that it works
    # after site definitions have been refactored to move the manifests
    # under fake repository folders into the common/ folders.
    #
    # All repos contain expected directories
    messages.extend(_verify_no_unexpected_files())
    # messages.extend(_verify_no_unexpected_files())

    # Deckhand Rendering completes without error
    messages.extend(_verify_deckhand_render(fail_on_missing_sub_src))

@@ -71,8 +79,9 @@ def _verify_no_unexpected_files():
    expected_directories = set()
    for site_name in util.files.list_sites():
        params = util.definition.load_as_params(site_name)
        expected_directories.update(util.files.directories_for(**params))

        expected_directories.update(
            util.files.directories_for(
                site_name=params['site_name'], site_type=params['site_type']))
    LOG.debug('expected_directories: %s', expected_directories)
    found_directories = util.files.existing_directories()
    LOG.debug('found_directories: %s', found_directories)

@@ -174,7 +183,8 @@ def _gather_relevant_documents_per_site():

    for sitename in sitenames:
        params = util.definition.load_as_params(sitename)
        paths = util.files.directories_for(**params)
        paths = util.files.directories_for(
            site_name=params['site_name'], site_type=params['site_type'])
        filenames = set(util.files.search(paths))
        for filename in filenames:
            with open(filename) as f:
src/bin/pegleg/pegleg/engine/repository.py (new file, 311 lines)
@@ -0,0 +1,311 @@
# Copyright 2018 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import atexit
import logging
import os
import shutil
import tempfile

import click

from pegleg import config
from pegleg.engine import exceptions
from pegleg.engine import util

__all__ = ('process_repositories', 'process_site_repository')

__REPO_FOLDERS = {}
_INVALID_FORMAT_MSG = ("The repository %s must be in the form of "
                       "name=repoUrl[@revision]")

LOG = logging.getLogger(__name__)


@atexit.register
def _clean_temp_folders():
    global __REPO_FOLDERS

    for r in __REPO_FOLDERS.values():
        shutil.rmtree(r, ignore_errors=True)


def process_repositories(site_name):
    """Process and setup all repositories including ensuring we are at the
    right revision based on the site's own site-definition.yaml file.

    :param site_name: Site name for which to clone relevant repos.

    """

    # Only tracks extra repositories - not the site (primary) repository.
    extra_repos = []

    site_repo = process_site_repository()

    # Retrieve extra repo data from site-definition.yaml files.
    site_data = util.definition.load_as_params(
        site_name, primary_repo_base=site_repo)
    site_def_repos = _get_and_validate_site_repositories(site_name, site_data)

    # Dict mapping repository names to associated URL/revision info for clone.
    repo_overrides = _process_repository_overrides(site_def_repos)
    if not site_def_repos:
        LOG.info('No repositories found in site-definition.yaml for site: %s. '
                 'Defaulting to specified repository overrides.', site_name)
        site_def_repos = repo_overrides

    # Extract user/key that we will use for all repositories.
    repo_key = config.get_repo_key()
    repo_user = config.get_repo_username()

    for repo_alias in site_def_repos.keys():
        if repo_alias == "site":
            LOG.warning("The primary site repository path must be specified "
                        "via the -r flag. Ignoring the provided "
                        "site-definition entry: %s",
                        site_def_repos[repo_alias])
            continue

        # Extract URL and revision, prioritizing overrides over the defaults
        # in the site-definition.yaml.
        if repo_alias in repo_overrides:
            repo_path_or_url = repo_overrides[repo_alias]['url']
            repo_revision = repo_overrides[repo_alias]['revision']
        else:
            repo_path_or_url = site_def_repos[repo_alias]['url']
            repo_revision = site_def_repos[repo_alias]['revision']

        # If a repo user is provided, do the necessary replacements.
        if repo_user:
            if "REPO_USERNAME" not in repo_path_or_url:
                LOG.warning(
                    "A repository username was specified but no REPO_USERNAME "
                    "string found in repository url %s", repo_path_or_url)
            else:
                repo_path_or_url = repo_path_or_url.replace(
                    'REPO_USERNAME', repo_user)

        LOG.info("Processing repository %s with url=%s, repo_key=%s, "
                 "repo_username=%s, revision=%s", repo_alias, repo_path_or_url,
                 repo_key, repo_user, repo_revision)

        temp_extra_repo = _copy_to_temp_folder(repo_path_or_url, repo_alias)
        temp_extra_repo = _handle_repository(
            temp_extra_repo, ref=repo_revision, auth_key=repo_key)
        extra_repos.append(temp_extra_repo)

    # Overwrite the site repo and extra repos in the config because further
    # processing will fail if they contain revision info in their paths.
    LOG.debug("Updating site_repo=%s extra_repo_list=%s in config", site_repo,
              extra_repos)
    config.set_site_repo(site_repo)
    config.set_extra_repo_list(extra_repos)


def process_site_repository(update_config=False):
    """Process and setup site repository including ensuring we are at the
    right revision based on the site's own site-definition.yaml file.

    :param bool update_config: Whether to update Pegleg config with computed
        site repo path.

    """

    # Retrieve the main site repository and validate it.
    site_repo_or_path = config.get_site_repo()
    if not site_repo_or_path:
        raise ValueError("Site repository directory (%s) must be specified" %
                         site_repo_or_path)

    repo_path_or_url, repo_revision = _extract_repo_url_and_revision(
        site_repo_or_path)
    temp_site_repo = _copy_to_temp_folder(repo_path_or_url, "site")
    _process_site_repository(temp_site_repo, repo_revision)

    if update_config:
        # Overwrite the site repo in the config because further processing
        # will fail if it contains revision info in its path.
        LOG.debug("Updating site_repo=%s in config", temp_site_repo)
        config.set_site_repo(temp_site_repo)

    return temp_site_repo


def _process_site_repository(repo_url_or_path, repo_revision):
    """Process the primary or site repository located at ``repo_url_or_path``.

    Also validate that the provided ``repo_url_or_path`` is a valid Git
    repository. If ``repo_url_or_path`` doesn't already exist, clone it.
    If it does, extract the appropriate revision and check it out.

    :param repo_url_or_path: Repo URL and associated auth information. E.g.:

        * ssh://REPO_USERNAME@<GERRIT_URL>:29418/aic-clcp-manifests.git@<ref>
        * https://<GERRIT_URL>/aic-clcp-manifests.git@<ref>
        * http://<GERRIT_URL>/aic-clcp-manifests.git@<ref>
        * <LOCAL_REPO_PATH>@<ref>
        * same values as above without @<ref>

    """

    repo_alias = 'site'  # We are processing the site repo necessarily.
    repo_key = config.get_repo_key()
    repo_user = config.get_repo_username()

    LOG.info("Processing repository %s with url=%s, repo_key=%s, "
             "repo_username=%s, revision=%s", repo_alias, repo_url_or_path,
             repo_key, repo_user, repo_revision)
    _handle_repository(repo_url_or_path, ref=repo_revision, auth_key=repo_key)


def _get_and_validate_site_repositories(site_name, site_data):
    """Validate that repositories entry exists in ``site_data``."""
    if 'repositories' not in site_data:
        LOG.info("The repository for site_name: %s does not contain a "
                 "site-definition.yaml with a 'repositories' key. Ensure "
                 "your repository is self-contained and doesn't require "
                 "extra repositories for correct rendering." % site_name)
    return site_data.get('repositories', {})


def _copy_to_temp_folder(repo_path_or_url, repo_alias):
    """Helper to ensure that local repos remain untouched by Pegleg processing.
    This is accomplished by copying local repos into temp folders.

    """

    global __REPO_FOLDERS

    if os.path.exists(repo_path_or_url):
        repo_name = util.git.repo_name(repo_path_or_url)
        new_temp_path = os.path.join(tempfile.mkdtemp(), repo_name)
        norm_path, sub_path = util.git.normalize_repo_path(repo_path_or_url)
        shutil.copytree(src=norm_path, dst=new_temp_path, symlinks=True)
        __REPO_FOLDERS.setdefault(repo_name, new_temp_path)
        return os.path.join(new_temp_path, sub_path)
    else:
        return repo_path_or_url


def _process_repository_overrides(site_def_repos):
    """Process specified repository overrides via the CLI ``-e`` flag.

    This will resolve ``-e`` site CLI arguments and override the corresponding
    values in the relevant :file:`site-definition.yaml`, if applicable.

    For example, given CLI override of::

        -e global=/opt/global@foo

    And site-definition.yaml ``repositories`` value of::

        repositories:
          global:
            revision: bar
            url: /opt/global

    Then the resulting dictionary that is returned will be::

        {"global": {"url": "/opt/global", "revision": "foo"}}

    :param site_def_repos: Dictionary of ``repositories`` field included
        in relevant :file:`site-definition.yaml`.
    :returns: Dictionary with above format.

    """

    # Extra repositories to process.
    provided_repo_overrides = config.get_extra_repo_store()
    # Map repository names to the associated URL/revision for cloning.
    repo_overrides = {}

    for repo_override in provided_repo_overrides:
        # break apart global=repoUrl
        try:
            repo_alias, repo_url_or_path = repo_override.split('=', 1)
        except ValueError:
            # TODO(felipemonteiro): Use internal exceptions for this.
            raise click.ClickException(_INVALID_FORMAT_MSG % repo_override)

        if repo_alias == "site":
            LOG.warning("The primary site repository path must be specified "
                        "via the -r flag. Ignoring the provided override: %s",
                        repo_override)
            continue

        if repo_alias not in site_def_repos:
            # If we are overriding a value that doesn't exist in the
            # site-definition.yaml make a note of it in case the override
            # is something bogus, but we won't make this a hard requirement,
            # so just log the discrepancy.
            LOG.debug("Repo override: %s not found under `repositories` for "
                      "site-definition.yaml. Site def repositories: %s",
                      repo_override, ", ".join(site_def_repos.keys()))

        repo_url, revision = _extract_repo_url_and_revision(repo_url_or_path)

        # store what we've learned
        repo_overrides.setdefault(repo_alias, {})
        repo_overrides[repo_alias]['url'] = repo_url
        repo_overrides[repo_alias]['revision'] = revision

    return repo_overrides


def _extract_repo_url_and_revision(repo_path_or_url):
    """Break up repository path/url into the repo URL and revision.

    :param repo_path_or_url: Repo URL and associated auth information. E.g.:

        * ssh://REPO_USERNAME@<GERRIT_URL>:29418/aic-clcp-manifests.git@<ref>
        * https://<GERRIT_URL>/aic-clcp-manifests.git@<ref>
        * http://<GERRIT_URL>/aic-clcp-manifests.git@<ref>
        * <LOCAL_REPO_PATH>@<ref>
        * same values as above without @<ref>

    """

    # they've forced a revision using @revision - careful not to confuse
    # this with auth
    revision = None
    try:
        if '@' in repo_path_or_url:
            # extract revision from repo URL or path
            repo_url_or_path, revision = repo_path_or_url.rsplit('@', 1)
            revision = revision[:-1] if revision.endswith('/') else revision
        else:
            repo_url_or_path = repo_path_or_url
    except Exception:
        # TODO(felipemonteiro): Use internal exceptions for this.
        raise click.ClickException(_INVALID_FORMAT_MSG % repo_path_or_url)

    return repo_url_or_path, revision


def _handle_repository(repo_url_or_path, *args, **kwargs):
    """Clone the remote repository (if ``repo_url_or_path`` is a remote URL)
    and check out the specified reference.

    """

    try:
        return util.git.git_handler(repo_url_or_path, *args, **kwargs)
    except exceptions.GitException as e:
        raise click.ClickException(e)
    except Exception as e:
        LOG.exception('Unknown exception was raised during git clone/checkout:'
                      ' %s', e)
        # TODO(felipemonteiro): Use internal exceptions for this.
        raise click.ClickException(e)
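A minimal sketch (not part of this diff) of how the module above is driven by
the CLI changes: each site action first resolves repositories for the named
site, then operates on the now-local repository paths:

    from pegleg import engine

    # Clones/checks out the site repository plus any extra repositories
    # declared in site-definition.yaml (or overridden via -e) ...
    engine.repository.process_repositories('<site_name>')
    # ... after which the relevant engine.site action (collect/show/render/
    # lint) reads the resolved paths back from pegleg.config.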
@@ -12,17 +12,18 @@
# See the License for the specific language governing permissions and
# limitations under the License.

import os
import click
import collections
import csv
import json
import yaml
import logging
import os

import click
import yaml

from pegleg.engine import util

__all__ = ['collect', 'impacted', 'list_', 'show', 'render']
__all__ = ('collect', 'impacted', 'list_', 'show', 'render')

LOG = logging.getLogger(__name__)

@@ -62,7 +63,7 @@ def _collect_to_file(site_name, save_location):
    # In case save_location already exists and isn't a directory.
    if not os.path.isdir(save_location):
        raise click.ClickException('save_location %s already exists, but must '
                                   'be a directory')
                                   'be a directory' % save_location)

    save_files = dict()
    try:

@@ -122,11 +123,24 @@ def render(site_name, output_stream):


def list_(output_stream):
    fieldnames = ['site_name', 'site_type', 'revision']
    """List site names for a given repository."""

    # TODO(felipemonteiro): This should output a formatted table, not rows of
    # data without delimited columns.
    fieldnames = ['site_name', 'site_type', 'repositories']
    writer = csv.DictWriter(
        output_stream, fieldnames=fieldnames, delimiter=' ')
    for site_name in util.files.list_sites():
        params = util.definition.load_as_params(site_name)
        # TODO(felipemonteiro): This is a temporary hack around legacy
        # manifest repositories containing the name of a directory that
        # symbolizes a repository. Once all these manifest repositories
        # migrate over to Git references instead, remove this hack.
        # NOTE(felipemonteiro): The 'revision' information can instead be
        # computed using :func:`process_site_repository` and storing into
        # a configuration via a "set_site_revision" function, for example.
        if 'revision' in params:
            params.pop('revision')
        writer.writerow(params)
@@ -63,7 +63,7 @@ def load_as_params(site_name, primary_repo_base=None):

def path(site_name, primary_repo_base=None):
    if not primary_repo_base:
        primary_repo_base = config.get_primary_repo()
        primary_repo_base = config.get_site_repo()
    return os.path.join(primary_repo_base, 'site', site_name,
                        'site-definition.yaml')

@@ -80,14 +80,17 @@ def pluck(site_definition, key):

def site_files(site_name):
    params = load_as_params(site_name)
    for filename in files.search(files.directories_for(**params)):
    for filename in files.search(
            files.directories_for(
                site_name=params['site_name'],
                site_type=params['site_type'])):
        yield filename


def site_files_by_repo(site_name):
    """Yield tuples of repo_base, file_name."""
    params = load_as_params(site_name)
    dir_map = files.directories_for_each_repo(**params)
    dir_map = files.directories_for_each_repo(
        site_name=params['site_name'], site_type=params['site_type'])
    for repo, dl in dir_map.items():
        for filename in files.search(dl):
            yield (repo, filename)
@@ -48,18 +48,18 @@ def all():
    ])


def create_global_directories(revision):
def create_global_directories():
    _create_tree(_global_root_path())
    _create_tree(_global_common_path())
    _create_tree(_global_revision_path(revision))


def create_site_directories(*, site_name, revision, **_kwargs):
def create_site_directories(*, site_name, **_kwargs):
    _create_tree(_site_path(site_name))


def create_site_type_directories(*, revision, site_type):
def create_site_type_directories(*, site_type):
    _create_tree(_site_type_common_path(site_type))
    _create_tree(_site_type_revision_path(site_type, revision))
    _create_tree(_site_type_root_path(site_type))


FULL_STRUCTURE = {

@@ -110,12 +110,10 @@ def _create_tree(root_path, *, tree=FULL_STRUCTURE):
            yaml.safe_dump(yaml_data, f)


def directories_for(*, site_name, revision, site_type):
def directories_for(*, site_name, site_type):
    library_list = [
        _global_common_path(),
        _global_revision_path(revision),
        _site_type_common_path(site_type),
        _site_type_revision_path(site_type, revision),
        _global_root_path(),
        _site_type_root_path(site_type),
        _site_path(site_name),
    ]

@@ -124,7 +122,7 @@ def directories_for(*, site_name, revision, site_type):
    ]


def directories_for_each_repo(*, site_name, revision, site_type):
def directories_for_each_repo(*, site_name, site_type):
    """Provide directories for each repo.

    When producing bucketized output files, the documents collected

@@ -132,10 +130,8 @@ def directories_for_each_repo(*, site_name, revision, site_type):
    by repo.
    """
    library_list = [
        _global_common_path(),
        _global_revision_path(revision),
        _site_type_common_path(site_type),
        _site_type_revision_path(site_type, revision),
        _global_root_path(),
        _site_type_root_path(site_type),
        _site_path(site_name),
    ]

@@ -150,16 +146,16 @@ def _global_common_path():
    return 'global/common'


def _global_revision_path(revision):
    return 'global/%s' % revision
def _global_root_path():
    return 'global'


def _site_type_common_path(site_type):
    return 'type/%s/common' % site_type


def _site_type_revision_path(site_type, revision):
    return 'type/%s/%s' % (site_type, revision)
def _site_type_root_path(site_type):
    return 'type/%s' % site_type


def _site_path(site_name):

@@ -169,7 +165,7 @@ def _site_path(site_name):

def list_sites(primary_repo_base=None):
    """Get a list of site definition directories in the primary repo."""
    if not primary_repo_base:
        primary_repo_base = config.get_primary_repo()
        primary_repo_base = config.get_site_repo()
    full_site_path = os.path.join(primary_repo_base,
                                  config.get_rel_site_path())
    for path in os.listdir(full_site_path):
@@ -12,11 +12,13 @@
# See the License for the specific language governing permissions and
# limitations under the License.

import atexit
import logging
import os
import tempfile
from urllib.parse import urlparse

import click
from git import exc as git_exc
from git import Git
from git import Repo

@@ -29,6 +31,8 @@ __all__ = [
    'git_handler',
]

__MODIFIED_REPOS = []


def git_handler(repo_url, ref=None, proxy_server=None, auth_key=None):
    """Handle directories that are Git repositories.

@@ -78,7 +82,6 @@ def git_handler(repo_url, ref=None, proxy_server=None, auth_key=None):
            raise ValueError('repo_url=%s must use one of the following '
                             'protocols: %s' %
                             (repo_url, ', '.join(supported_clone_protocols)))

    # otherwise, we're dealing with a local directory so although
    # we do not need to clone, we may need to process the reference
    # by checking that out and returning the directory they passed in

@@ -87,13 +90,13 @@ def git_handler(repo_url, ref=None, proxy_server=None, auth_key=None):
                  'Attempting to checkout ref=%s', repo_url, ref)
        try:
            # get absolute path of what is probably a directory
            repo_url = os.path.abspath(repo_url)
            repo_url, _ = normalize_repo_path(repo_url)
        except Exception:
            msg = "The repo_url=%s is not a valid directory" % repo_url
            msg = "The repo_url=%s is not a valid Git repo" % repo_url
            LOG.error(msg)
            raise NotADirectoryError(msg)

        repo = Repo(repo_url)
        repo = Repo(repo_url, search_parent_directories=True)
        if repo.is_dirty(untracked_files=True):
            LOG.error('The locally cloned repo_url=%s is dirty. Manual clean '
                      'up of tracked/untracked files required.', repo_url)

@@ -127,7 +130,7 @@ def _get_current_ref(repo_url):
    """

    try:
        repo = Repo(repo_url)
        repo = Repo(repo_url, search_parent_directories=True)
        current_ref = repo.head.ref.name
        LOG.debug('ref for repo_url=%s not specified, defaulting to currently '
                  'checked out ref=%s', repo_url, current_ref)

@@ -305,7 +308,7 @@ def _create_local_ref(g, branches, ref, newref, reftype=None):
        branches.append(newref)


def is_repository(path):
def is_repository(path, *args, **kwargs):
    """Checks whether the directory ``path`` is a Git repository.

    :param str path: Directory path to check.

@@ -313,7 +316,99 @@ def is_repository(path):
    :rtype: boolean
    """
    try:
        Repo(path).git_dir
        Repo(path, *args, **kwargs).git_dir
        return True
    except git_exc.InvalidGitRepositoryError:
        return False


def is_equal(first_repo, other_repo):
    """Compares whether two repositories are the same.

    Sameness is defined as whether they point to the same remote repository.

    :param str first_repo: Path or URL of first repository.
    :param str other_repo: Path or URL of other repository.
    :returns: True if both are the same, else False.
    :rtype: boolean

    """

    try:
        # Compare whether the first reference from each repository is the
        # same: by doing so we know the repositories are the same.
        first = Repo(first_repo, search_parent_directories=True)
        other = Repo(other_repo, search_parent_directories=True)
        first_rev = first.git.rev_list('master').splitlines()[-1]
        other_rev = other.git.rev_list('master').splitlines()[-1]
        return first_rev == other_rev
    except Exception:
        return False


def repo_name(repo_url_or_path):
    """Get the repository name for the local or remote repo at
    ``repo_url_or_path``.

    :param repo_url: URL of remote Git repo or path to local Git repo.
    :returns: Corresponding repo name.
    :rtype: str
    :raises GitConfigException: If the path is not a valid Git repo.

    """

    repo = Repo(repo_url_or_path, search_parent_directories=True)
    config_reader = repo.config_reader()
    section = 'remote "origin"'
    option = 'url'

    if config_reader.has_section(section):
        repo_url = config_reader.get_value(section, option)
        return repo_url.split('/')[-1].split('.git')[0]
    raise click.ClickException(
        "Repo=%s is not a valid Git repository" % repo_url_or_path)


def normalize_repo_path(repo_path):
    """A utility function for retrieving the root repo path when the site
    repository path contains subfolders.

    Given (for example): ../airship-in-a-bottle/deployment_files@master

    It is necessary to pass ../airship-in-a-bottle to Git for checkout/clone
    as that is the actual repository path.

    Yet it is necessary to pass ../airship-in-a-bottle/deployment_files to
    :func:`util.definition.site_files_by_repo` for discovering the
    site-definition.yaml.

    """

    orig_repo_path = repo_path
    sub_path = ""

    # Only resolve the root path if it's not a URL and exists.
    if os.path.exists(repo_path):
        repo_path = os.path.abspath(repo_path)
        while (repo_path and os.path.exists(repo_path)
               and not is_repository(repo_path)):
            paths = repo_path.rsplit("/", 1)
            if not all(paths):
                break
            repo_path = os.path.abspath(paths[0])
            sub_path = os.path.join(sub_path, paths[1])
        if not repo_path or not is_repository(repo_path):
            raise click.ClickException(
                "Specified site repo path=%s exists but isn't a Git "
                "repository" % orig_repo_path)

    return repo_path, sub_path


@atexit.register
def clean_repo():
    global __MODIFIED_REPOS

    for r in __MODIFIED_REPOS:
        repo = Repo(r)
        repo.head.reset(index=True, working_tree=True)
@@ -127,8 +127,10 @@ def test_verify_deckhand_render_site_documents_separately(
        'schema': 'deckhand/Passphrase/v1'
    }, {
        'data': {
            'revision': 'v1.0',
            'site_type': sitename
            'site_type': sitename,
            'repositories': {
                'global': mock.ANY
            }
        },
        'metadata': {
            'layeringDefinition': {
@@ -19,12 +19,17 @@ import pytest
from pegleg import config
from pegleg.engine import lint

_SKIP_P003_REASON = """Currently broken with revisioned repositories
directory layout changes. The old pseudo-revision folders like 'v4.0' is
no longer relevant and so the lint logic for this rule needs to be updated.
For more information, see: https://storyboard.openstack.org/#!/story/2003762
"""

@mock.patch.object(lint, '_verify_deckhand_render', return_value=[])
@mock.patch.object(lint, '_verify_no_unexpected_files', return_value=[])
def test_lint_excludes_P001(*args):
    exclude_lint = ['P001']
    config.set_primary_repo('../pegleg/site_yamls/')
    config.set_site_repo('../pegleg/site_yamls/')

    code_1 = 'X001'
    msg_1 = 'is a secret, but has unexpected storagePolicy: "cleartext"'

@@ -43,7 +48,7 @@ def test_lint_excludes_P001(*args):
@mock.patch.object(lint, '_verify_no_unexpected_files', return_value=[])
def test_lint_excludes_P002(*args):
    exclude_lint = ['P002']
    config.set_primary_repo('../pegleg/site_yamls/')
    config.set_site_repo('../pegleg/site_yamls/')
    with mock.patch.object(
            lint,
            '_verify_deckhand_render',

@@ -52,6 +57,7 @@ def test_lint_excludes_P002(*args):
        mock_method.assert_called()


@pytest.mark.skip(reason=_SKIP_P003_REASON)
@mock.patch.object(lint, '_verify_deckhand_render', return_value=[])
def test_lint_excludes_P003(*args):
    exclude_lint = ['P003']

@@ -67,7 +73,7 @@ def test_lint_excludes_P003(*args):
@mock.patch.object(lint, '_verify_no_unexpected_files', return_value=[])
def test_lint_warns_P001(*args):
    warn_lint = ['P001']
    config.set_primary_repo('../pegleg/site_yamls/')
    config.set_site_repo('../pegleg/site_yamls/')

    code_1 = 'X001'
    msg_1 = 'is a secret, but has unexpected storagePolicy: "cleartext"'

@@ -86,17 +92,18 @@ def test_lint_warns_P001(*args):
@mock.patch.object(lint, '_verify_no_unexpected_files', return_value=[])
def test_lint_warns_P002(*args):
    warn_lint = ['P002']
    config.set_primary_repo('../pegleg/site_yamls/')
    config.set_site_repo('../pegleg/site_yamls/')

    with mock.patch.object(lint, '_verify_deckhand_render') as mock_method:
        lint.full(False, [], warn_lint)
        mock_method.assert_called()


@pytest.mark.skip(reason=_SKIP_P003_REASON)
@mock.patch.object(lint, '_verify_deckhand_render', return_value=[])
def test_lint_warns_P003(*args):
    warn_lint = ['P003']
    config.set_primary_repo('../pegleg/site_yamls/')
    config.set_site_repo('../pegleg/site_yamls/')

    with mock.patch.object(lint, '_verify_no_unexpected_files') as mock_method:
        lint.full(False, [], warn_lint)
428
src/bin/pegleg/tests/unit/engine/test_site_repository.py
Normal file
428
src/bin/pegleg/tests/unit/engine/test_site_repository.py
Normal file
@ -0,0 +1,428 @@
|
||||
# Copyright 2018 AT&T Intellectual Property. All other rights reserved.
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
import copy
|
||||
import mock
|
||||
import yaml
|
||||
|
||||
import click
|
||||
import pytest
|
||||
|
||||
from pegleg import config
|
||||
from pegleg.engine import repository
|
||||
from pegleg.engine import util
|
||||
|
||||
TEST_REPOSITORIES = {
|
||||
'repositories': {
|
||||
'global': {
|
||||
'revision': '843d1a50106e1f17f3f722e2ef1634ae442fe68f',
|
||||
'url': 'ssh://REPO_USERNAME@gerrit:29418/aic-clcp-manifests.git'
|
||||
},
|
||||
'secrets': {
|
||||
'revision':
|
||||
'master',
|
||||
'url': ('ssh://REPO_USERNAME@gerrit:29418/aic-clcp-security-'
|
||||
'manifests.git')
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def clean_temp_folders():
|
||||
try:
|
||||
yield
|
||||
finally:
|
||||
repository._clean_temp_folders()
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def stub_out_copy_functionality():
|
||||
try:
|
||||
mock.patch.object(
|
||||
repository,
|
||||
'_copy_to_temp_folder',
|
||||
autospec=True,
|
||||
side_effect=lambda x, *a, **k: x).start()
|
||||
yield
|
||||
finally:
|
||||
mock.patch.stopall()
|
||||
|
||||
|
||||
def _repo_name(repo_url):
|
||||
repo_name = repo_url.split('/')[-1]
|
||||
if repo_name.endswith('.git'):
|
||||
return repo_name[:-4]
|
||||
return repo_name
|
||||
|
||||
|
||||
def _test_process_repositories_inner(site_name="test_site",
|
||||
expected_extra_repos=None):
|
||||
repository.process_repositories(site_name)
|
||||
actual_repo_list = config.get_extra_repo_list()
|
||||
expected_repos = expected_extra_repos.get('repositories', {})
|
||||
|
||||
assert len(expected_repos) == len(actual_repo_list)
|
||||
for repo in expected_repos.values():
|
||||
repo_name = _repo_name(repo['url'])
|
||||
assert any(repo_name in r for r in actual_repo_list)
|
||||
|
||||
|
||||
def _test_process_repositories(site_repo=None,
                               repo_username=None,
                               repo_overrides=None):
    """Validate :func:`repository.process_repositories`.

    :param site_repo: Primary site repository.
    :param repo_username: Auth username that replaces REPO_USERNAME.
    :param dict repo_overrides: Overrides with format: -e global=/opt/global,
        keyed with name of override, e.g. global.

    All params above are mutually exclusive. Can only test one at a time.

    """

    @mock.patch.object(
        util.definition,
        'load_as_params',
        autospec=True,
        return_value=TEST_REPOSITORIES)
    @mock.patch.object(
        util.git, 'is_repository', autospec=True, return_value=True)
    @mock.patch.object(
        repository,
        '_handle_repository',
        autospec=True,
        side_effect=lambda repo_url, *a, **k: _repo_name(repo_url))
    def do_test(m_clone_repo, *_):
        _test_process_repositories_inner(
            expected_extra_repos=TEST_REPOSITORIES)

        if site_repo:
            # Validate that the primary site repository is cloned, in addition
            # to the extra repositories.
            repo_revision = None
            repo_url = site_repo.rsplit('@', 1)
            if len(repo_url) == 1:  # Case: local repo path.
                repo_url = repo_url[0]
            elif len(repo_url) == 2:  # Case: remote repo URL.
                repo_url, repo_revision = repo_url
            mock_calls = [
                mock.call(repo_url, ref=repo_revision, auth_key=None)
            ]
            mock_calls.extend([
                mock.call(r['url'], ref=r['revision'], auth_key=None)
                for r in TEST_REPOSITORIES['repositories'].values()
            ])
            m_clone_repo.assert_has_calls(mock_calls)
        elif repo_username:
            # Validate that the REPO_USERNAME placeholder is replaced by
            # repo_username.
            m_clone_repo.assert_has_calls([
                mock.call(
                    r['url'].replace('REPO_USERNAME', repo_username),
                    ref=r['revision'],
                    auth_key=None)
                for r in TEST_REPOSITORIES['repositories'].values()
            ])
        elif repo_overrides:
            # This is computed from: len(cloned extra repos) +
            # len(cloned primary repo), which is len(cloned extra repos) + 1
            expected_call_count = len(TEST_REPOSITORIES['repositories']) + 1
            assert (expected_call_count == m_clone_repo.call_count)

            for x, r in TEST_REPOSITORIES['repositories'].items():
                if x in repo_overrides:
                    ref = None
                    repo_url = repo_overrides[x].rsplit('@', 1)
                    if len(repo_url) == 1:  # Case: local repo path.
                        repo_url = repo_url[0]
                    elif len(repo_url) == 2:  # Case: remote repo URL.
                        repo_url, ref = repo_url
                    repo_url = repo_url.split('=')[-1]
                else:
                    repo_url = r['url']
                    ref = r['revision']
                m_clone_repo.assert_any_call(repo_url, ref=ref, auth_key=None)
        else:
            m_clone_repo.assert_has_calls([
                mock.call(r['url'], ref=r['revision'], auth_key=None)
                for r in TEST_REPOSITORIES['repositories'].values()
            ])

    if site_repo:
        # Set a test site repo, call the test and clean up.
        with mock.patch.object(
                config, 'get_site_repo', autospec=True,
                return_value=site_repo):
            do_test()
    elif repo_username:
        # Set a test repo username, call the test and clean up.
        with mock.patch.object(
                config,
                'get_repo_username',
                autospec=True,
                return_value=repo_username):
            do_test()
    elif repo_overrides:
        with mock.patch.object(
                config,
                'get_extra_repo_store',
                autospec=True,
                return_value=list(repo_overrides.values())):
            do_test()
    else:
        do_test()
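
# A minimal sketch (illustrative only, not part of the patch) of the override
# parsing the helper above exercises: a '-e name=repoUrl[@revision]' value is
# split from the right on '@' and the 'name=' prefix is dropped, e.g.:
#
#   override = 'global=ssh://gerrit:29418/aic-clcp-manifests.git@12345'
#   url, ref = override.rsplit('@', 1)   # ['global=ssh://...git', '12345']
#   url = url.split('=')[-1]             # 'ssh://gerrit:29418/aic-clcp-manifests.git'
#
# A local path without a revision, e.g. 'global=/opt/aic-clcp-manifests', has
# no trailing '@<ref>', so the ref stays unset and only the prefix is removed.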


def test_process_repositories():
    _test_process_repositories()


def test_process_repositories_with_site_repo_url():
    """Test process_repositories when the site repo is a remote URL."""
    site_repo = (
        'ssh://REPO_USERNAME@gerrit:29418/aic-clcp-site-manifests.git@333')
    _test_process_repositories(site_repo=site_repo)


def test_process_repositories_handles_local_site_repo_path():
    site_repo = '/opt/aic-clcp-site-manifests'
    _test_process_repositories(site_repo=site_repo)


def test_process_repositories_handles_local_site_repo_path_with_revision():
    site_repo = '/opt/aic-clcp-site-manifests@333'
    _test_process_repositories(site_repo=site_repo)


@mock.patch.object(
    util.definition,
    'load_as_params',
    autospec=True,
    return_value=TEST_REPOSITORIES)
@mock.patch('os.path.exists', autospec=True, return_value=True)
@mock.patch.object(
    util.git, 'is_repository', autospec=True, return_value=False)
def test_process_repositories_with_local_site_path_exists_not_repo(*_):
    """Validate that an error is raised when the site repo path already
    exists but is not a Git repository.
    """
    with pytest.raises(click.ClickException) as exc:
        _test_process_repositories_inner(
            expected_extra_repos=TEST_REPOSITORIES)
    assert "is not a valid Git repo" in str(exc.value)


def test_process_repositories_with_repo_username():
    _test_process_repositories(repo_username='test_username')
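
# Illustrative only (not part of the patch): with '-u test_username' the
# REPO_USERNAME placeholder from the site-definition URL is substituted before
# the clone is attempted, i.e. the mocked call above is expected to see:
#
#   'ssh://REPO_USERNAME@gerrit:29418/aic-clcp-manifests.git'.replace(
#       'REPO_USERNAME', 'test_username')
#   # -> 'ssh://test_username@gerrit:29418/aic-clcp-manifests.git'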


def test_process_repositories_with_repo_overrides_remote_urls():
    # Same URL, different revision (than TEST_REPOSITORIES).
    overrides = {
        'global':
        'global=ssh://REPO_USERNAME@gerrit:29418/aic-clcp-manifests.git@12345'
    }
    _test_process_repositories(repo_overrides=overrides)

    # Different URL, different revision (than TEST_REPOSITORIES).
    overrides = {
        'global': 'global=https://gerrit/aic-clcp-manifests.git@12345'
    }
    _test_process_repositories(repo_overrides=overrides)


def test_process_repositories_with_repo_overrides_local_paths():
    # Local path without revision.
    overrides = {'global': 'global=/opt/aic-clcp-manifests'}
    _test_process_repositories(repo_overrides=overrides)

    # Local path with revision.
    overrides = {'global': 'global=/opt/aic-clcp-manifests@12345'}
    _test_process_repositories(repo_overrides=overrides)


def test_process_repositories_with_multiple_repo_overrides_remote_urls():
    overrides = {
        'global':
        'global=ssh://gerrit:29418/aic-clcp-manifests.git@12345',
        'secrets':
        'secrets=ssh://gerrit:29418/aic-clcp-security-manifests.git@54321'
    }
    _test_process_repositories(repo_overrides=overrides)


def test_process_repositories_with_multiple_repo_overrides_local_paths():
    overrides = {
        'global': 'global=/opt/aic-clcp-manifests@12345',
        'secrets': 'secrets=/opt/aic-clcp-security-manifests.git@54321'
    }
    _test_process_repositories(repo_overrides=overrides)


@mock.patch.object(
    util.definition,
    'load_as_params',
    autospec=True,
    return_value=TEST_REPOSITORIES)
@mock.patch.object(util.git, 'is_repository', autospec=True, return_value=True)
@mock.patch.object(
    repository,
    '_handle_repository',
    autospec=True,
    side_effect=lambda repo_url, *a, **k: _repo_name(repo_url))
@mock.patch.object(repository, 'LOG', autospec=True)
def test_process_repositories_extraneous_user_repo_value(m_log, *_):
    repo_overrides = ['global=ssh://gerrit:29418/aic-clcp-manifests.git']

    # Provide a repo user value.
    with mock.patch.object(
            config,
            'get_repo_username',
            autospec=True,
            return_value='test_username'):
        # Get rid of REPO_USERNAME through an override.
        with mock.patch.object(
                config,
                'get_extra_repo_store',
                autospec=True,
                return_value=repo_overrides):
            _test_process_repositories_inner(
                expected_extra_repos=TEST_REPOSITORIES)

    msg = ("A repository username was specified but no REPO_USERNAME "
           "string found in repository url %s",
           repo_overrides[0].split('=')[-1])
    m_log.warning.assert_any_call(*msg)


@mock.patch.object(
    util.definition, 'load_as_params', autospec=True,
    return_value={})  # No repositories in site definition.
@mock.patch.object(util.git, 'is_repository', autospec=True, return_value=True)
@mock.patch.object(
    repository,
    '_handle_repository',
    autospec=True,
    side_effect=lambda repo_url, *a, **k: _repo_name(repo_url))
def test_process_repositories_no_site_def_repos(*_):
    _test_process_repositories_inner(expected_extra_repos={})


@mock.patch.object(
    util.definition, 'load_as_params', autospec=True,
    return_value={})  # No repositories in site definition.
@mock.patch.object(util.git, 'is_repository', autospec=True, return_value=True)
@mock.patch.object(
    repository,
    '_handle_repository',
    autospec=True,
    side_effect=lambda repo_url, *a, **k: _repo_name(repo_url))
@mock.patch.object(repository, 'LOG', autospec=True)
def test_process_repositories_no_site_def_repos_with_extraneous_overrides(
        m_log, *_):
    """Validate that overrides which don't match any site-definition entry
    are still used as defaults when the site definition lists no repositories.
    """
    site_name = mock.sentinel.site
    repo_overrides = ['global=ssh://gerrit:29418/aic-clcp-manifests.git']
    expected_overrides = {
        'repositories': {
            'global': {
                'revision': '843d1a50106e1f17f3f722e2ef1634ae442fe68f',
                'url': repo_overrides[0]
            }
        }
    }

    # Provide repo overrides.
    with mock.patch.object(
            config,
            'get_extra_repo_store',
            autospec=True,
            return_value=repo_overrides):
        _test_process_repositories_inner(
            site_name=site_name, expected_extra_repos=expected_overrides)

    debug_msg = ("Repo override: %s not found under `repositories` for "
                 "site-definition.yaml. Site def repositories: %s",
                 repo_overrides[0], "")
    info_msg = ("No repositories found in site-definition.yaml for site: %s. "
                "Defaulting to specified repository overrides.", site_name)
    m_log.debug.assert_any_call(*debug_msg)
    m_log.info.assert_any_call(*info_msg)


@mock.patch.object(
    util.definition, 'load_as_params', autospec=True,
    return_value={})  # No repositories in site definition.
@mock.patch.object(util.git, 'is_repository', autospec=True, return_value=True)
@mock.patch.object(repository, 'LOG', autospec=True)
def test_process_repositories_without_repositories_key_in_site_definition(
        m_log, *_):
    # Stub this out since default config site repo is '.' and local repo might
    # be dirty.
    with mock.patch.object(repository, '_handle_repository', autospec=True):
        _test_process_repositories_inner(
            site_name=mock.sentinel.site, expected_extra_repos={})
    msg = ("The repository for site_name: %s does not contain a "
           "site-definition.yaml with a 'repositories' key" % str(
               mock.sentinel.site))
    assert any(msg in x[1][0] for x in m_log.info.mock_calls)


@mock.patch.object(
    util.definition,
    'load_as_params',
    autospec=True,
    return_value=TEST_REPOSITORIES)
@mock.patch.object(util.git, 'is_repository', autospec=True, return_value=True)
@mock.patch.object(config, 'get_extra_repo_store', autospec=True)
def test_process_extra_repositories_malformed_format_raises_exception(
        m_get_extra_repo_store, *_):
    # Will fail since it doesn't contain "=".
    broken_repo_url = 'broken_url'
    m_get_extra_repo_store.return_value = [broken_repo_url]
    error = ("The repository %s must be in the form of "
             "name=repoUrl[@revision]" % broken_repo_url)

    # Stub this out since default config site repo is '.' and local repo might
    # be dirty.
    with mock.patch.object(repository, '_handle_repository', autospec=True):
        with pytest.raises(click.ClickException) as exc:
            repository.process_repositories(mock.sentinel.site)
        assert error == str(exc.value)
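
# Illustrative only (not part of the patch): the ClickException above is the
# expected outcome whenever an override value lacks the 'name=' prefix. A
# minimal pre-check, assuming a hypothetical helper, could look like:
#
#   def _is_well_formed(override):
#       # Loosely matches the documented "name=repoUrl[@revision]" form.
#       return '=' in override
#
#   _is_well_formed('global=/opt/aic-clcp-manifests')  # True
#   _is_well_formed('broken_url')                      # False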


def test_process_site_repository():
    def _do_test(site_repo):
        expected = site_repo.rsplit('@', 1)[0]

        with mock.patch.object(
                config, 'get_site_repo', autospec=True,
                return_value=site_repo) as mock_get_site_repo:

            with mock.patch.object(
                    repository, '_handle_repository', autospec=True):
                result = repository.process_site_repository()
        assert expected == result

    # Ensure that the reference is always pruned.
    _do_test('http://github.com/openstack/treasuremap@master')
    _do_test('http://github.com/openstack/treasuremap')
    _do_test('https://github.com/openstack/treasuremap@master')
    _do_test('https://github.com/openstack/treasuremap')
    _do_test('ssh://foo@github.com/openstack/treasuremap:12345@master')
    _do_test('ssh://foo@github.com/openstack/treasuremap:12345')
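
# Illustrative only (not part of the patch): the assertions above depend on the
# revision always being pruned from the site repository value, e.g.:
#
#   'https://github.com/openstack/treasuremap@master'.rsplit('@', 1)[0]
#   # -> 'https://github.com/openstack/treasuremap'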


@ -24,7 +24,7 @@ def test_no_non_yamls(tmpdir):
    for x in range(3):  # Create 3 YAML files
        p.join("good-%d.yaml" % x).write('fake-content')
    p.join("bad.txt").write("fake-content")
    config.set_primary_repo(str(tmpdir.listdir()[0]))
    config.set_site_repo(str(tmpdir.listdir()[0]))
    results = list(files.all())

    assert 3 == len(results)

@ -67,13 +67,10 @@ def clean_git_repos():

    """

    RELEVANT_REPOS = ('airship-armada', 'openstack-helm')

    root_tempdir = tempfile.gettempdir()
    for tempdir in os.listdir(root_tempdir):
        path = os.path.join(root_tempdir, tempdir)
        if git.is_repository(path) and any(
                path.endswith(x) for x in RELEVANT_REPOS):
            if git.is_repository(path):
                shutil.rmtree(path, ignore_errors=True)

@ -101,7 +98,9 @@ def _validate_git_clone(repo_dir, fetched_ref=None, checked_out_ref=None):
        assert checked_out_ref in git_file.read()


def _assert_repo_url_was_cloned(mock_log, git_dir):
def _assert_check_out_from_local_repo(mock_log, git_dir):
    """Validates that check out happened from local repo, without a reclone.
    """
    expected_msg = ('Treating repo_url=%s as an already-cloned repository')
    assert mock_log.debug.called
    mock_calls = mock_log.debug.mock_calls

@ -180,7 +179,7 @@ def test_git_clone_existing_directory_checks_out_earlier_ref_from_local(
    ref = 'refs/changes/15/536215/34'
    git_dir = git.git_handler(git_dir, ref)
    _validate_git_clone(git_dir, checked_out_ref=ref)
    _assert_repo_url_was_cloned(mock_log, git_dir)
    _assert_check_out_from_local_repo(mock_log, git_dir)


@pytest.mark.skipif(

@ -200,7 +199,7 @@ def test_git_clone_existing_directory_checks_out_master_from_local(mock_log):
    ref = 'master'
    git_dir = git.git_handler(git_dir, ref)
    _validate_git_clone(git_dir, checked_out_ref=ref)
    _assert_repo_url_was_cloned(mock_log, git_dir)
    _assert_check_out_from_local_repo(mock_log, git_dir)


@pytest.mark.skipif(

@ -220,14 +219,14 @@ def test_git_clone_checkout_refpath_saves_references_locally(mock_log):
    ref = 'refs/changes/15/536215/34'
    git_dir = git.git_handler(git_dir, ref)
    _validate_git_clone(git_dir, checked_out_ref=ref)
    _assert_repo_url_was_cloned(mock_log, git_dir)
    _assert_check_out_from_local_repo(mock_log, git_dir)

    # Verify that passing in the hexsha variation of refpath
    # 'refs/changes/15/536215/34' also works.
    hexref = '276102a115dac3c0a6e91f9047d8b086bc8d2ff0'
    git_dir = git.git_handler(git_dir, hexref)
    _validate_git_clone(git_dir, checked_out_ref=hexref)
    _assert_repo_url_was_cloned(mock_log, git_dir)
    _assert_check_out_from_local_repo(mock_log, git_dir)


@pytest.mark.skipif(

@ -251,13 +250,13 @@ def test_git_clone_checkout_hexsha_saves_references_locally(mock_log):
    ref = 'bf126f46b1c175a8038949a87dafb0a716e3b9b6'
    git_dir = git.git_handler(git_dir, ref)
    _validate_git_clone(git_dir, checked_out_ref=ref)
    _assert_repo_url_was_cloned(mock_log, git_dir)
    _assert_check_out_from_local_repo(mock_log, git_dir)

    # Verify that passing in the refpath variation of hexsha also works.
    hexref = 'refs/changes/15/536215/35'
    git_dir = git.git_handler(git_dir, hexref)
    _validate_git_clone(git_dir, checked_out_ref=hexref)
    _assert_repo_url_was_cloned(mock_log, git_dir)
    _assert_check_out_from_local_repo(mock_log, git_dir)


@pytest.mark.skipif(

@ -278,7 +277,7 @@ def test_git_clone_existing_directory_checks_out_next_local_ref(mock_log):
    ref = 'refs/changes/54/457754/74'
    git_dir = git.git_handler(git_dir, ref)
    _validate_git_clone(git_dir, ref)
    _assert_repo_url_was_cloned(mock_log, git_dir)
    _assert_check_out_from_local_repo(mock_log, git_dir)


@pytest.mark.skipif(

@ -295,7 +294,7 @@ def test_git_checkout_without_reference_defaults_to_current(mock_log):

    git_dir = git.git_handler(git_dir, ref=None)  # Defaults to commit ref.
    _validate_git_clone(git_dir, commit)  # Validate with the original ref.
    _assert_repo_url_was_cloned(mock_log, git_dir)
    _assert_check_out_from_local_repo(mock_log, git_dir)


@pytest.mark.skipif(

@ -355,7 +354,7 @@ def test_git_clone_existing_directory_raises_exc_for_invalid_ref(mock_log):
    ref = 'refs/changes/54/457754/9000'
    with pytest.raises(exceptions.GitException):
        git_dir = git.git_handler(git_dir, ref)
    _assert_repo_url_was_cloned(mock_log, git_dir)
    _assert_check_out_from_local_repo(mock_log, git_dir)


@pytest.mark.skipif(

@ -476,3 +475,53 @@ def test_git_clone_ssh_auth_method_missing_ssh_key(_):
    with pytest.raises(exceptions.GitSSHException):
        git.git_handler(
            url, ref='refs/changes/17/388517/5', auth_key='/home/user/.ssh/')


@pytest.mark.skipif(
    not is_connected(), reason='git clone requires network connectivity.')
def test_is_repository():

    cloned_directories = {}

    def do_test(url, ref, subpath=None):
        nonlocal cloned_directories

        if url not in cloned_directories:
            git_dir = git.git_handler(url, ref=ref)
            _validate_git_clone(git_dir)
            cloned_directories.setdefault(url, git_dir)
        else:
            git_dir = cloned_directories[url]

        assert os.path.exists(git_dir)
        assert git.is_repository(git_dir)

        if subpath:
            assert git.is_repository(
                os.path.join(git_dir, subpath), search_parent_directories=True)

    # airship-treasuremap
    do_test(
        url='http://github.com/openstack/airship-treasuremap',
        ref='refs/changes/17/597217/1')
    do_test(
        url='http://github.com/openstack/airship-treasuremap',
        ref='refs/changes/17/597217/1',
        subpath='site')

    # airship-in-a-bottle
    do_test(
        url='http://github.com/openstack/airship-in-a-bottle',
        ref='refs/changes/39/596439/1')
    do_test(
        url='http://github.com/openstack/airship-in-a-bottle',
        ref='refs/changes/39/596439/1',
        subpath='deployment_files')
    do_test(
        url='http://github.com/openstack/airship-in-a-bottle',
        ref='refs/changes/39/596439/1',
        subpath='deployment_files/site')


def test_is_repository_negative():
    assert not git.is_repository(tempfile.mkdtemp())

@ -69,7 +69,7 @@ def create_tmp_deployment_files(tmpdir):
    }

    p = tmpdir.mkdir("deployment_files")
    config.set_primary_repo(str(p))
    config.set_site_repo(str(p))

    # Create global directories and files.
    files._create_tree(

@ -123,7 +123,10 @@ def create_tmp_deployment_files(tmpdir):
    site_definition = """
---
data:
  revision: v1.0
  repositories:
    global:
      revision: v1.0
      url: http://nothing.com
  site_type: %s
metadata:
  layeringDefinition: {abstract: false, layer: site}