Updated the py_pkgs lookup plugin for multi-source support

The py_pkgs lookup plugin will now handle multiple sources. This change
was needed to enable the system to allow overrides of repo_package
resources and of the general Python packaging found throughout the stack.
The plugin applies precedence as it loads and searches for packages: the
item loaded last takes the highest precedence.
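In effect, the sources are merged in load order with the last definition winning. A minimal sketch of that behaviour (the variable name and branch values here are purely illustrative):

```python
# Minimal sketch of last-loaded-wins precedence; names/values are illustrative.
sources = [
    {"swift_git_install_branch": "master"},       # main playbook directory
    {"swift_git_install_branch": "stable/kilo"},  # role defaults
    {"swift_git_install_branch": "11.2.1"},       # user_variables override
]
resolved = {}
for source in sources:
    resolved.update(source)  # the item loaded last overwrites earlier entries
```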

The repo-build play has been updated to search for compatible packages
in the main playbook directory, the root Ansible role directory, and the
user_variables file.

Documentation has been added regarding this capability and how to override
and set items to meet the needs of the deployment.

Closes-Bug: #1510575
Implements: blueprint independent-role-repositories
Change-Id: Ib7cda14db18c0ca58bb5ac495d1c201812cf0f48
Signed-off-by: Kevin Carter <kevin.carter@rackspace.com>
This commit is contained in:
Kevin Carter
2015-11-03 23:39:21 -06:00
parent fa0772aa1b
commit 2f35e36e54
7 changed files with 372 additions and 112 deletions


@@ -139,6 +139,47 @@ This module has been `submitted for consideration`_ into Ansible Core.
.. _Install Guide: ../install-guide/configure-openstack.html
.. _submitted for consideration: https://github.com/ansible/ansible/pull/12555
Build the environment with additional python packages
+++++++++++++++++++++++++++++++++++++++++++++++++++++
The system will allow you to install and build any package that is a Python
installable. The repository infrastructure will look for and create any
git-based or PyPi-installable package. When the package is built, the
repo-build role will create the sources as Python wheels to extend the base
system and requirements.

While the packages pre-built in the repository infrastructure are
comprehensive, it may be necessary to change the source locations and
versions of packages to suit different deployment needs. Adding additional
repositories as overrides is as simple as listing entries within the
variable file of your choice. Any ``user_.*.yml`` file within the
"/etc/openstack_deploy" directory will work to facilitate the addition of
new packages.

.. code-block:: yaml

    swift_git_repo: https://private-git.example.org/example-org/swift
    swift_git_install_branch: master
Additional lists of python packages can also be overridden using a
``user_.*.yml`` variable file.

.. code-block:: yaml

    swift_requires_pip_packages:
      - virtualenv
      - virtualenv-tools
      - python-keystoneclient
      - NEW-SPECIAL-PACKAGE
Once the variables are set, run the ``repo-build.yml`` play to build all of
the wheels within the repository infrastructure. When ready, run the target
plays to deploy your overridden source code.
Module documentation
++++++++++++++++++++


@@ -19,9 +19,15 @@ For the sake of anyone else editing this file:
* If you add clients to this file please do so in alphabetical order.
* Every entry should be name spaced with the name of the client followed by an "_"
Repository data can be set in any of the following locations by default.
- <MAIN REPO LOCATION>
- /etc/ansible/roles
- /etc/openstack_deploy
The basic structure of all of these files:
* git_repo: ``string`` URI to the git repository to clone from.
* git_fallback_repo: ``string`` URI to an alternative git repository to clone from when **git_repo** fails.
* git_dest: ``string`` full path to place a cloned git repository. This will normally incorporate the **repo_path** variable for consistency purposes.
* git_install_branch: ``string`` branch, tag or SHA of a git repository to clone into.
* git_repo_plugins: ``list`` of ``hashes`` with keys: path, package | This is used to install additional packages which may be installable from the same base repository.
* git_package_name: ``string`` that will override the "egg" name given for the repo.
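Put together, a repository entry using these keys might look like the following (the URLs, names, and plugin are hypothetical; per the lookup plugin's variable derivation, the plugin list is read from ``<name>_repo_plugins``):

```yaml
## Hypothetical repository entry; URLs and names are illustrative.
swift_git_repo: https://git.openstack.org/openstack/swift
swift_git_fallback_repo: https://github.com/openstack/swift
swift_git_dest: "{{ repo_path }}/swift"
swift_git_install_branch: master
swift_git_package_name: swift
swift_repo_plugins:
  - path: contrib
    package: swift3
```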


@@ -27,6 +27,28 @@ Simple filters that may be useful from within the stack
""" """
def _pip_requirement_split(requirement):
version_descriptors = "(>=|<=|>|<|==|~=|!=)"
requirement = requirement.split(';')
requirement_info = re.split(r'%s\s*' % version_descriptors, requirement[0])
name = requirement_info[0]
marker = None
if len(requirement) > 1:
marker = requirement[1]
versions = None
if len(requirement_info) > 1:
versions = requirement_info[1]
return name, versions, marker
def _lower_set_lists(list_one, list_two):
_list_one = set([i.lower() for i in list_one])
_list_two = set([i.lower() for i in list_two])
return _list_one, _list_two
def bit_length_power_of_2(value): def bit_length_power_of_2(value):
"""Return the smallest power of 2 greater than a numeric value. """Return the smallest power of 2 greater than a numeric value.
@@ -120,17 +142,33 @@ def pip_requirement_names(requirements):
:return: ``str`` :return: ``str``
""" """
version_descriptors = "(>=|<=|>|<|==|~=|!=)"
named_requirements = list() named_requirements = list()
for requirement in requirements: for requirement in requirements:
requirement = requirement.split(';')[0] name = _pip_requirement_split(requirement)[0]
name = re.split(r'%s\s*' % version_descriptors, requirement)[0]
if name and not name.startswith('#'): if name and not name.startswith('#'):
named_requirements.append(name.lower()) named_requirements.append(name.lower())
return sorted(set(named_requirements)) return sorted(set(named_requirements))
def pip_constraint_update(list_one, list_two):
_list_one, _list_two = _lower_set_lists(list_one, list_two)
_list_one, _list_two = list(_list_one), list(_list_two)
for item2 in _list_two:
item2_name, item2_versions, _ = _pip_requirement_split(item2)
if item2_versions:
for item1 in _list_one:
if item2_name == _pip_requirement_split(item1)[0]:
item1_index = _list_one.index(item1)
_list_one[item1_index] = item2
break
else:
_list_one.append(item2)
return sorted(_list_one)
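The new ``pip_constraint_update`` filter replaces an existing constraint when an override pins the same package, and appends overrides for packages not yet listed. A standalone sketch of the filter and its helpers (the package names below are illustrative):

```python
import re

# Standalone copy of the new filter helpers, for illustration only.
def _pip_requirement_split(requirement):
    # Split a pip requirement into (name, versions, marker).
    version_descriptors = "(>=|<=|>|<|==|~=|!=)"
    requirement = requirement.split(';')
    requirement_info = re.split(r'%s\s*' % version_descriptors, requirement[0])
    name = requirement_info[0]
    marker = requirement[1] if len(requirement) > 1 else None
    versions = requirement_info[1] if len(requirement_info) > 1 else None
    return name, versions, marker

def _lower_set_lists(list_one, list_two):
    # Lower-case and de-duplicate both input lists.
    return set(i.lower() for i in list_one), set(i.lower() for i in list_two)

def pip_constraint_update(list_one, list_two):
    # Overlay versioned entries from list_two onto list_one by package name.
    _list_one, _list_two = _lower_set_lists(list_one, list_two)
    _list_one, _list_two = list(_list_one), list(_list_two)
    for item2 in _list_two:
        item2_name, item2_versions, _ = _pip_requirement_split(item2)
        if item2_versions:
            for item1 in _list_one:
                if item2_name == _pip_requirement_split(item1)[0]:
                    item1_index = _list_one.index(item1)
                    _list_one[item1_index] = item2  # replace the old constraint
                    break
            else:
                _list_one.append(item2)  # no match found: append the override
    return sorted(_list_one)

updated = pip_constraint_update(['pip>=6.0', 'requests==2.8.1'], ['pip==8.1.2'])
added = pip_constraint_update(['requests==2.8.1'], ['pbr==1.8.1'])
```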
def splitlines(string_with_lines): def splitlines(string_with_lines):
"""Return a ``list`` from a string with lines.""" """Return a ``list`` from a string with lines."""
@@ -139,8 +177,7 @@ def splitlines(string_with_lines):
def filtered_list(list_one, list_two): def filtered_list(list_one, list_two):
_list_one = set([i.lower() for i in list_one]) _list_one, _list_two = _lower_set_lists(list_one, list_two)
_list_two = set([i.lower() for i in list_two])
return list(_list_one-_list_two) return list(_list_one-_list_two)
@@ -199,6 +236,7 @@ class FilterModule(object):
'netorigin': get_netorigin, 'netorigin': get_netorigin,
'string_2_int': string_2_int, 'string_2_int': string_2_int,
'pip_requirement_names': pip_requirement_names, 'pip_requirement_names': pip_requirement_names,
'pip_constraint_update': pip_constraint_update,
'splitlines': splitlines, 'splitlines': splitlines,
'filtered_list': filtered_list, 'filtered_list': filtered_list,
'git_link_parse': git_link_parse, 'git_link_parse': git_link_parse,


@@ -15,6 +15,7 @@
# (c) 2014, Kevin Carter <kevin.carter@rackspace.com> # (c) 2014, Kevin Carter <kevin.carter@rackspace.com>
import os import os
import re
import traceback import traceback
from ansible import errors from ansible import errors
@@ -22,13 +23,18 @@ from ansible import utils
import yaml import yaml
VERSION_DESCRIPTORS = ['>=', '<=', '==', '!=', '<', '>'] # Used to keep track of git package parts as various files are processed
GIT_PACKAGE_DEFAULT_PARTS = dict()
ROLE_PACKAGES = dict()
REQUIREMENTS_FILE_TYPES = [ REQUIREMENTS_FILE_TYPES = [
'global-requirements.txt', 'global-requirements.txt',
'test-requirements.txt', 'test-requirements.txt',
'dev-requirements.txt', 'dev-requirements.txt',
'requirements.txt',
'global-requirement-pins.txt' 'global-requirement-pins.txt'
] ]
@@ -47,29 +53,36 @@ def git_pip_link_parse(repo):
"""Return a tuple containing the parts of a git repository. """Return a tuple containing the parts of a git repository.
Example parsing a standard git repo: Example parsing a standard git repo:
>>> git_pip_link_parse('git+https://github.com/username/repo@tag') >>> git_pip_link_parse('git+https://github.com/username/repo-name@tag')
('repo', ('repo-name',
'tag', 'tag',
None, None,
'https://github.com/username/repo', 'https://github.com/username/repo',
'git+https://github.com/username/repo@tag') 'git+https://github.com/username/repo@tag',
'repo_name')
Example parsing a git repo that uses an installable from a subdirectory: Example parsing a git repo that uses an installable from a subdirectory:
>>> git_pip_link_parse( >>> git_pip_link_parse(
... 'git+https://github.com/username/repo@tag#egg=plugin.name' ... 'git+https://github.com/username/repo@tag#egg=plugin.name'
... '&subdirectory=remote_path/plugin.name' ... '&subdirectory=remote_path/plugin.name'
... ) ... )
('repo', ('plugin.name',
'tag', 'tag',
'remote_path/plugin.name', 'remote_path/plugin.name',
'https://github.com/username/repo', 'https://github.com/username/repo',
'git+https://github.com/username/repo@tag#egg=plugin.name&' 'git+https://github.com/username/repo@tag#egg=plugin.name&'
'subdirectory=remote_path/plugin.name') 'subdirectory=remote_path/plugin.name',
'plugin.name')
:param repo: git repo string to parse. :param repo: git repo string to parse.
:type repo: ``str`` :type repo: ``str``
:returns: ``tuple`` :returns: ``tuple``
""" """
def _meta_return(meta_data, item):
"""Return the value of an item in meta data."""
return meta_data.lstrip('#').split('%s=' % item)[-1].split('&')[0]
_git_url = repo.split('+') _git_url = repo.split('+')
if len(_git_url) >= 2: if len(_git_url) >= 2:
@@ -78,23 +91,58 @@ def git_pip_link_parse(repo):
_git_url = _git_url[0] _git_url = _git_url[0]
git_branch_sha = _git_url.split('@') git_branch_sha = _git_url.split('@')
if len(git_branch_sha) > 1: if len(git_branch_sha) > 2:
branch = git_branch_sha.pop()
url = '@'.join(git_branch_sha)
elif len(git_branch_sha) > 1:
url, branch = git_branch_sha url, branch = git_branch_sha
else: else:
url = git_branch_sha[0] url = git_branch_sha[0]
branch = 'master' branch = 'master'
name = os.path.basename(url.rstrip('/')) egg_name = name = os.path.basename(url.rstrip('/'))
egg_name = egg_name.replace('-', '_')
_branch = branch.split('#') _branch = branch.split('#')
branch = _branch[0] branch = _branch[0]
plugin_path = None plugin_path = None
# Determine if the package is a plugin type # Determine if the package is a plugin type
if len(_branch) > 1: if len(_branch) > 1:
if 'subdirectory' in _branch[-1]: if 'subdirectory=' in _branch[-1]:
plugin_path = _branch[1].split('subdirectory=')[1].split('&')[0] plugin_path = _meta_return(_branch[-1], 'subdirectory')
name = os.path.basename(plugin_path)
return name.lower(), branch, plugin_path, url, repo if 'egg=' in _branch[-1]:
egg_name = _meta_return(_branch[-1], 'egg')
egg_name = egg_name.replace('-', '_')
if 'gitname=' in _branch[-1]:
name = _meta_return(_branch[-1], 'gitname')
return name.lower(), branch, plugin_path, url, repo, egg_name
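The ``_meta_return`` helper pulls individual values out of a URL fragment. It can be exercised standalone (the fragment string below is illustrative):

```python
# Standalone copy of the fragment helper, for illustration.
def _meta_return(meta_data, item):
    """Return the value of an item in meta data."""
    return meta_data.lstrip('#').split('%s=' % item)[-1].split('&')[0]

fragment = '#egg=plugin.name&subdirectory=remote_path/plugin.name&gitname=repo-name'
egg = _meta_return(fragment, 'egg')
subdir = _meta_return(fragment, 'subdirectory')
gitname = _meta_return(fragment, 'gitname')
```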
def _pip_requirement_split(requirement):
"""Split pip versions from a given requirement.
The method will return the package name, versions, and any markers.
:type requirement: ``str``
:returns: ``tuple``
"""
version_descriptors = "(>=|<=|>|<|==|~=|!=)"
requirement = requirement.split(';')
requirement_info = re.split(r'%s\s*' % version_descriptors, requirement[0])
name = requirement_info[0]
marker = None
if len(requirement) > 1:
marker = requirement[-1]
versions = None
if len(requirement_info) > 1:
versions = ''.join(requirement_info[1:])
return name, versions, marker
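Unlike the filter-module variant, this copy joins every version token, so compound specifiers survive intact. A standalone demonstration (the requirement string is illustrative):

```python
import re

# Standalone copy of the lookup plugin's requirement splitter.
def _pip_requirement_split(requirement):
    version_descriptors = "(>=|<=|>|<|==|~=|!=)"
    requirement = requirement.split(';')
    requirement_info = re.split(r'%s\s*' % version_descriptors, requirement[0])
    name = requirement_info[0]
    marker = None
    if len(requirement) > 1:
        marker = requirement[-1]  # environment marker after the ';'
    versions = None
    if len(requirement_info) > 1:
        versions = ''.join(requirement_info[1:])  # keep compound specifiers whole
    return name, versions, marker

name, versions, marker = _pip_requirement_split("pip>=7.1.0,<8;python_version<'3'")
```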
class DependencyFileProcessor(object): class DependencyFileProcessor(object):
@@ -107,14 +155,25 @@ class DependencyFileProcessor(object):
self.pip = dict() self.pip = dict()
self.pip['git_package'] = list() self.pip['git_package'] = list()
self.pip['py_package'] = list() self.pip['py_package'] = list()
self.pip['role_packages'] = dict() self.pip['git_data'] = list()
self.git_pip_install = 'git+%s@%s' self.git_pip_install = 'git+%s@%s'
self.file_names = self._get_files(path=local_path) self.file_names = self._get_files(path=local_path)
# Process everything simply by calling the method # Process everything simply by calling the method
self._process_files(ext=('yaml', 'yml')) self._process_files()
def _filter_files(self, file_names, ext): def _py_pkg_extend(self, packages):
for pkg in packages:
pkg_name = _pip_requirement_split(pkg)[0]
for py_pkg in self.pip['py_package']:
py_pkg_name = _pip_requirement_split(py_pkg)[0]
if pkg_name == py_pkg_name:
self.pip['py_package'].remove(py_pkg)
else:
self.pip['py_package'].extend([i.lower() for i in packages])
@staticmethod
def _filter_files(file_names, ext):
"""Filter the files and return a sorted list. """Filter the files and return a sorted list.
:type file_names: :type file_names:
@@ -122,22 +181,14 @@ class DependencyFileProcessor(object):
:returns: ``list`` :returns: ``list``
""" """
_file_names = list() _file_names = list()
file_name_words = ['/defaults/', '/vars/', '/user_']
file_name_words.extend(REQUIREMENTS_FILE_TYPES)
for file_name in file_names: for file_name in file_names:
if file_name.endswith(ext): if file_name.endswith(ext):
if '/defaults/' in file_name or '/vars/' in file_name: if any(i in file_name for i in file_name_words):
_file_names.append(file_name) _file_names.append(file_name)
else: else:
continue return _file_names
elif os.path.basename(file_name) in REQUIREMENTS_FILE_TYPES:
with open(file_name, 'rb') as f:
packages = [
i.split()[0] for i in f.read().splitlines()
if i
if not i.startswith('#')
]
self.pip['py_package'].extend(packages)
else:
return sorted(_file_names, reverse=True)
@staticmethod @staticmethod
def _get_files(path): def _get_files(path):
@@ -161,25 +212,44 @@ class DependencyFileProcessor(object):
:type git_data: ``dict`` :type git_data: ``dict``
""" """
for repo_plugin in git_repo_plugins: for repo_plugin in git_repo_plugins:
strip_plugin_path = repo_plugin['package'].lstrip('/')
plugin = '%s/%s' % ( plugin = '%s/%s' % (
repo_plugin['path'].strip('/'), repo_plugin['path'].strip('/'),
repo_plugin['package'].lstrip('/') strip_plugin_path
) )
name = git_data['name'] = os.path.basename(strip_plugin_path)
git_data['egg_name'] = name.replace('-', '_')
package = self.git_pip_install % ( package = self.git_pip_install % (
git_data['repo'], git_data['repo'], git_data['branch']
'%s#egg=%s&subdirectory=%s' % (
git_data['branch'],
repo_plugin['package'].strip('/'),
plugin
) )
) package += '#egg=%s' % git_data['egg_name']
package += '&subdirectory=%s' % plugin
package += '&gitname=%s' % name
if git_data['fragments']: if git_data['fragments']:
package = '%s&%s' % (package, git_data['fragments']) package += '&%s' % git_data['fragments']
self.pip['git_data'].append(git_data)
self.pip['git_package'].append(package) self.pip['git_package'].append(package)
if name not in GIT_PACKAGE_DEFAULT_PARTS:
GIT_PACKAGE_DEFAULT_PARTS[name] = git_data.copy()
else:
GIT_PACKAGE_DEFAULT_PARTS[name].update(git_data.copy())
@staticmethod
def _check_defaults(git_data, name, item):
"""Check if a default exists and use it if an item is undefined.
:type git_data: ``dict``
:type name: ``str``
:type item: ``str``
"""
if not git_data[item] and name in GIT_PACKAGE_DEFAULT_PARTS:
check_item = GIT_PACKAGE_DEFAULT_PARTS[name].get(item)
if check_item:
git_data[item] = check_item
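The default back-fill can be exercised on its own; when a later file leaves an item unset, the value recorded from an earlier pass is used (the package name and branch below are illustrative):

```python
# Standalone sketch of _check_defaults; the recorded parts are illustrative.
GIT_PACKAGE_DEFAULT_PARTS = {
    'swift': {'branch': 'stable/kilo', 'fragments': None}
}

def _check_defaults(git_data, name, item):
    """Use the previously recorded default when the current item is unset."""
    if not git_data[item] and name in GIT_PACKAGE_DEFAULT_PARTS:
        check_item = GIT_PACKAGE_DEFAULT_PARTS[name].get(item)
        if check_item:
            git_data[item] = check_item

git_data = {'branch': None}
_check_defaults(git_data, 'swift', 'branch')
```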
def _process_git(self, loaded_yaml, git_item): def _process_git(self, loaded_yaml, git_item):
"""Process git repos. """Process git repos.
@@ -188,53 +258,69 @@ class DependencyFileProcessor(object):
""" """
git_data = dict() git_data = dict()
if git_item.split('_')[0] == 'git': if git_item.split('_')[0] == 'git':
var_name = 'git' prefix = ''
else: else:
var_name = git_item.split('_git_repo')[0] prefix = '%s_' % git_item.split('_git_repo')[0].replace('.', '_')
git_data['repo'] = loaded_yaml.get(git_item) # Set the various variable definitions
git_data['branch'] = loaded_yaml.get( repo_var = prefix + 'git_repo'
'%s_git_install_branch' % var_name.replace('.', '_') name_var = prefix + 'git_package_name'
branch_var = prefix + 'git_install_branch'
fragment_var = prefix + 'git_install_fragments'
plugins_var = prefix + 'repo_plugins'
# get the repo definition
git_data['repo'] = loaded_yaml.get(repo_var)
# get the repo name definition
name = git_data['name'] = loaded_yaml.get(name_var)
if not name:
name = git_data['name'] = os.path.basename(
git_data['repo'].rstrip('/')
) )
git_data['egg_name'] = name.replace('-', '_')
# get the repo branch definition
git_data['branch'] = loaded_yaml.get(branch_var)
self._check_defaults(git_data, name, 'branch')
if not git_data['branch']: if not git_data['branch']:
git_data['branch'] = loaded_yaml.get( git_data['branch'] = 'master'
'git_install_branch',
'master'
)
package = self.git_pip_install % ( package = self.git_pip_install % (git_data['repo'], git_data['branch'])
git_data['repo'], git_data['branch']
) # get the repo fragment definitions, if any
package = '%s#egg=%s' % (package, git_pip_link_parse(package)[0].replace('-', '_')) git_data['fragments'] = loaded_yaml.get(fragment_var)
git_data['fragments'] = loaded_yaml.get( self._check_defaults(git_data, name, 'fragments')
'%s_git_install_fragments' % var_name.replace('.', '_')
) package += '#egg=%s' % git_data['egg_name']
package += '&gitname=%s' % name
if git_data['fragments']: if git_data['fragments']:
package = '%s#%s' % (package, git_data['fragments']) package += '&%s' % git_data['fragments']
self.pip['git_package'].append(package) self.pip['git_package'].append(package)
self.pip['git_data'].append(git_data.copy())
git_repo_plugins = loaded_yaml.get('%s_repo_plugins' % var_name) # Set the default package parts to track data during the run
if git_repo_plugins: if name not in GIT_PACKAGE_DEFAULT_PARTS:
GIT_PACKAGE_DEFAULT_PARTS[name] = git_data.copy()
else:
GIT_PACKAGE_DEFAULT_PARTS[name].update()
# get the repo plugin definitions, if any
git_data['plugins'] = loaded_yaml.get(plugins_var)
self._check_defaults(git_data, name, 'plugins')
if git_data['plugins']:
self._check_plugins( self._check_plugins(
git_repo_plugins=git_repo_plugins, git_repo_plugins=git_data['plugins'],
git_data=git_data git_data=git_data
) )
def _process_files(self, ext): def _process_files(self):
"""Process files. """Process files."""
:type ext: ``tuple``
"""
file_names = self._filter_files(
file_names=self.file_names,
ext=ext
)
role_name = None role_name = None
for file_name in file_names: for file_name in self._filter_files(self.file_names, ('yaml', 'yml')):
with open(file_name, 'rb') as f: with open(file_name, 'r') as f:
# If there is an exception loading the file, continue;
# and if the loaded_config is None, continue. This makes
# sure no bad config gets passed to the rest of the process.
@@ -250,12 +336,11 @@ class DependencyFileProcessor(object):
_role_name = file_name.split('roles%s' % os.sep)[-1] _role_name = file_name.split('roles%s' % os.sep)[-1]
role_name = _role_name.split(os.sep)[0] role_name = _role_name.split(os.sep)[0]
for key, values in loaded_config.items(): for key, values in loaded_config.items():
# This conditional is set to ensure we're not processing git
# repos from the defaults file which may conflict with what is
# being set in the repo_packages files.
if not '/defaults/main' in file_name: if '/defaults/main' not in file_name:
if key.endswith('git_repo'): if key.endswith('git_repo'):
self._process_git( self._process_git(
loaded_yaml=loaded_config, loaded_yaml=loaded_config,
@@ -263,18 +348,31 @@ class DependencyFileProcessor(object):
) )
if [i for i in BUILT_IN_PIP_PACKAGE_VARS if i in key]: if [i for i in BUILT_IN_PIP_PACKAGE_VARS if i in key]:
self.pip['py_package'].extend(values) self._py_pkg_extend(values)
if role_name: if role_name:
if not role_name in self.pip['role_packages']: if role_name in ROLE_PACKAGES:
self.pip['role_packages'][role_name] = values role_pkgs = ROLE_PACKAGES[role_name]
else: else:
self.pip['role_packages'][role_name].extend(values) role_pkgs = ROLE_PACKAGES[role_name] = dict()
self.pip['role_packages'][role_name] = list(
set( pkgs = role_pkgs.get(key, list())
self.pip['role_packages'][role_name] pkgs.extend(values)
) ROLE_PACKAGES[role_name][key] = pkgs
) else:
for k, v in ROLE_PACKAGES.items():
for item_name in v.keys():
if key == item_name:
ROLE_PACKAGES[k][item_name].extend(values)
for file_name in self._filter_files(self.file_names, 'txt'):
if os.path.basename(file_name) in REQUIREMENTS_FILE_TYPES:
with open(file_name, 'r') as f:
packages = [
i.split()[0] for i in f.read().splitlines()
if i
if not i.startswith('#')
]
self._py_pkg_extend(packages)
def _abs_path(path): def _abs_path(path):
@@ -308,11 +406,13 @@ class LookupModule(object):
terms = [terms] terms = [terms]
return_data = { return_data = {
'packages': list(), 'packages': set(),
'remote_packages': list() 'remote_packages': set(),
'remote_package_parts': list(),
'role_packages': dict()
} }
return_list = list()
for term in terms: for term in terms:
return_list = list()
try: try:
dfp = DependencyFileProcessor( dfp = DependencyFileProcessor(
local_path=_abs_path(str(term)) local_path=_abs_path(str(term))
@@ -328,30 +428,92 @@ class LookupModule(object):
) )
) )
for item in sorted(set(return_list)): for item in return_list:
if item.startswith(('http:', 'https:', 'git+')): if item.startswith(('http:', 'https:', 'git+')):
if '@' not in item: if '@' not in item:
return_data['packages'].append(item) return_data['packages'].add(item)
else: else:
return_data['remote_packages'].append(item) git_parts = git_pip_link_parse(item)
item_name = git_parts[-1]
if not item_name:
item_name = git_pip_link_parse(item)[0]
for rpkg in list(return_data['remote_packages']):
rpkg_name = git_pip_link_parse(rpkg)[-1]
if not rpkg_name:
rpkg_name = git_pip_link_parse(item)[0]
if rpkg_name == item_name:
return_data['remote_packages'].remove(rpkg)
return_data['remote_packages'].add(item)
break
else: else:
return_data['packages'].append(item) return_data['remote_packages'].add(item)
else: else:
return_data['packages'] = list( return_data['packages'].add(item)
set([i.lower() for i in return_data['packages']]) else:
) keys = [
return_data['remote_packages'] = list( 'name',
set(return_data['remote_packages']) 'version',
) 'fragment',
keys = ['name', 'version', 'fragment', 'url', 'original'] 'url',
remote_package_parts = [ 'original',
'egg_name'
]
remote_pkg_parts = [
dict( dict(
zip( zip(
keys, git_pip_link_parse(i) keys, git_pip_link_parse(i)
) )
) for i in return_data['remote_packages'] ) for i in return_data['remote_packages']
] ]
return_data['remote_package_parts'] = remote_package_parts return_data['remote_package_parts'].extend(remote_pkg_parts)
return_data['role_packages'] = dfp.pip['role_packages'] return_data['remote_package_parts'] = list(
dict(
(i['name'], i)
for i in return_data['remote_package_parts']
).values()
)
else:
for k, v in ROLE_PACKAGES.items():
role_pkgs = return_data['role_packages'][k] = list()
for pkg_list in v.values():
role_pkgs.extend(pkg_list)
else:
return_data['role_packages'][k] = sorted(set(role_pkgs))
check_pkgs = dict()
base_packages = sorted(list(return_data['packages']))
for pkg in base_packages:
name, versions, markers = _pip_requirement_split(pkg)
if versions and markers:
versions = '%s;%s' % (versions, markers)
elif not versions and markers:
versions = ';%s' % markers
if name in check_pkgs:
if versions and not check_pkgs[name]:
check_pkgs[name] = versions
else:
check_pkgs[name] = versions
else:
return_pkgs = list()
for k, v in check_pkgs.items():
if v:
return_pkgs.append('%s%s' % (k, v))
else:
return_pkgs.append(k)
return_data['packages'] = set(return_pkgs)
# Sort everything within the returned data
for key, value in return_data.items():
if isinstance(value, (list, set)):
return_data[key] = sorted(value)
return [return_data] return [return_data]
# Used for testing and debugging. Usage: `python plugins/lookups/py_pkgs.py ../`
if __name__ == '__main__':
import sys
import json
print(json.dumps(LookupModule().run(terms=sys.argv[1:]), indent=4))


@@ -21,7 +21,10 @@
- name: Load local packages - name: Load local packages
debug: debug:
msg: "Loading Packages" msg: "Loading Packages"
with_py_pkgs: ../ with_py_pkgs:
- ../
- /etc/ansible/roles
- /etc/openstack_deploy
register: local_packages register: local_packages
tags: tags:
- repo-clone-repos - repo-clone-repos


@@ -57,7 +57,15 @@
- name: Set upper constraints - name: Set upper constraints
set_fact: set_fact:
upper_constraints: "{{ slurp_upper_constraints.content | b64decode | splitlines }}" _upper_constraints: "{{ slurp_upper_constraints.content | b64decode | splitlines }}"
when: slurp_upper_constraints | success
tags:
- repo-set-constraints
- repo-build-constraints-file
- name: Set upper constraints
set_fact:
upper_constraints: "{{ _upper_constraints | pip_constraint_update(local_packages.results.0.item.packages) }}"
when: slurp_upper_constraints | success when: slurp_upper_constraints | success
tags: tags:
- repo-set-constraints - repo-set-constraints


@@ -1,8 +1,10 @@
# Computed constraints # Computed constraints
{% set constraint_pkgs = [] -%} {% set constraint_pkgs = [] -%}
{% for clone_item in local_packages.results.0.item.remote_package_parts -%} {% for clone_item in local_packages.results.0.item.remote_package_parts -%}
{% if 'ignorerequirements=true' not in clone_item['original'] %}
git+file://{{ repo_build_git_dir }}/{{ clone_item['name'] }}@{{ clone_item['version'] }}#egg={{ clone_item['name'] | replace('-', '_') | lower }} git+file://{{ repo_build_git_dir }}/{{ clone_item['name'] }}@{{ clone_item['version'] }}#egg={{ clone_item['name'] | replace('-', '_') | lower }}
{% set _ = constraint_pkgs.append(clone_item['name'] | replace('-', '_') | lower) %} {% set _ = constraint_pkgs.append(clone_item['name'] | replace('-', '_') | lower) %}
{% endif %}
{% endfor %} {% endfor %}
# upper boundary constraints from requirements repo.
{% for constraint_item in upper_constraints %} {% for constraint_item in upper_constraints %}
@@ -10,10 +12,10 @@ git+file://{{ repo_build_git_dir }}/{{ clone_item['name'] }}@{{ clone_item['vers
{%- set constraint_name = constraint_split[0] %} {%- set constraint_name = constraint_split[0] %}
{%- set constraint_name_normalized = constraint_name | replace('-', '_') | lower %} {%- set constraint_name_normalized = constraint_name | replace('-', '_') | lower %}
{% if constraint_name_normalized not in constraint_pkgs %} {% if constraint_name_normalized not in constraint_pkgs %}
{% if repo_build_use_upper_constraints | bool %} {% if repo_build_use_upper_constraints | bool and (constraint_split | length) > 1 %}
{{ constraint_split[0] | replace('-', '_') | lower }}<={{ constraint_split[1] }} {{ constraint_split[0] | replace('-', '_') | lower }}<={{ constraint_split[1] }}
{% else %} {% elif (constraint_split | length) == 1 %}
# {{ constraint_split[0] | replace('-', '_') | lower }}<={{ constraint_split[1] }} {{ constraint_item }}
{% endif %} {% endif %}
{% endif %} {% endif %}
{% endfor %} {% endfor %}
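The template's per-constraint logic above can be sketched in plain Python (a simplified illustration assuming ``==``-pinned upper constraints; the package names are hypothetical):

```python
# Simplified sketch of the template's per-line constraint handling.
constraint_pkgs = ['python_swiftclient']  # normalized names already built from git

def render_constraint(line, use_upper_constraints=True):
    parts = line.split('==')
    name = parts[0].replace('-', '_').lower()
    if name in constraint_pkgs:
        return None  # a locally built git package supersedes the upper constraint
    if use_upper_constraints and len(parts) > 1:
        return '%s<=%s' % (name, parts[1])  # pin to the published upper bound
    if len(parts) == 1:
        return line  # pass unversioned entries through untouched
    return None

rendered = [render_constraint(i) for i in ('python-swiftclient==2.6.0',
                                           'requests==2.8.1')]
```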