Add override control
This allows certain job attributes to be explicitly overridden (or inherited).

Change-Id: Iaa4292488c5a4f66926c91b24068500c168e3a35
parent f11ebcc250
commit 2c792ca34e
@@ -18,10 +18,45 @@ starting with very basic jobs which describe characteristics that all
jobs on the system should have, progressing through stages of
specialization before arriving at a particular job. A job may inherit
from any other job in any project (however, if the other job is marked
as :attr:`job.final`, jobs may not inherit from it). Generally,
attributes on child jobs will override (or completely replace)
attributes on the parent, however some attributes are combined. See
the documentation for individual attributes for these exceptions.
as :attr:`job.final`, jobs may not inherit from it).

Generally, if an attribute is set on a child job, it will override (or
completely replace) attributes on the parent. This is always true for
attributes that only accept single values, but attributes that accept
multiple values (lists, or mappings) are sometimes combined.
The default behavior varies; see the documentation for individual
attributes for details. A special YAML tag may be used to control the
behavior explicitly. For example, in order to specify that the tags
in the present job should override those in the parent:

.. code-block:: yaml

   - job:
       name: child
       tags: !override
         - foo

Or to indicate that they should be combined with those in the parent:

.. code-block:: yaml

   - job:
       name: child
       tags: !inherit
         - foo

Attributes which support this feature are indicated in this
documentation with "Supports override control".

When lists are combined, they are merged without duplication.

When mappings (or dictionaries, for example, those used for job
variables) are combined, they are deeply merged. This means a leaf
node (an entry whose value is not another mapping) with the same name
will override a previous entry, but non-leaf nodes (entries whose
values are mappings) will have their entries updated in the same
manner, recursively. New entries with unique names will be added to
mappings.
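
As an illustrative sketch (the job and variable names here are
hypothetical), deep-merging the following definitions with the default
``!inherit`` behavior:

.. code-block:: yaml

   - job:
       name: parent
       vars:
         common:
           timeout: 30
           retries: 2

   - job:
       name: child
       vars:
         common:
           retries: 5
         debug: true

would leave the child job with ``common.timeout: 30``,
``common.retries: 5`` (the leaf entry is overridden), and
``debug: true`` (a new entry is added).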

A job with no parent is called a *base job* and may only be defined in
a :term:`config-project`. Every other job must have a parent, and so
@@ -242,7 +277,8 @@ Here is an example of two job definitions:

When inheriting jobs or applying variants, the list of
semaphores is extended (semaphores specified in a job definition
are added to any supplied by their parents).
are added to any supplied by their parents). This can not be
changed via override control.

.. attr:: name
:required:
@@ -269,10 +305,10 @@ Here is an example of two job definitions:
that subsystem, and if the job's results are reported into a
database, then the results of all jobs affecting that subsystem
could be queried. This attribute is specified as a list of
strings, and when inheriting jobs or applying variants, tags
accumulate in a set, so the result is always a set of all the
tags from all the jobs and variants used in constructing the
frozen job, with no duplication.
strings.

Supports override control. The default is ``!inherit``: values
are merged without duplication.

.. attr:: provides

@@ -280,9 +316,8 @@ Here is an example of two job definitions:
by this job which may be used by other jobs for other changes
using the :attr:`job.requires` attribute.

When inheriting jobs or applying variants, the list of
`provides` is extended (`provides` specified in a job definition
are added to any supplied by their parents).
Supports override control. The default is ``!inherit``: values
are merged without duplication.

.. attr:: requires

@@ -314,10 +349,6 @@ Here is an example of two job definitions:
items, e.g. for branch items in `supercedent` pipeline, branch items
in periodic `independent` pipeline, tag items in `independent` pipeline.

When inheriting jobs or applying variants, the list of
`requires` is extended (`requires` specified in a job definition
are added to any supplied by their parents).

For example, a job which produces a builder container image in
one project that is then consumed by a container image build job
in another project might look like this:
@@ -344,6 +375,9 @@ Here is an example of two job definitions:
jobs:
  - build-final-image

Supports override control. The default is ``!inherit``: values
are merged without duplication.
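
For example (the artifact name is hypothetical), a child job that
should replace, rather than extend, the artifacts required from its
parent could specify:

.. code-block:: yaml

   - job:
       name: child
       requires: !override
         - other-artifact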

.. attr:: secrets

A list of secrets which may be used by the job. A
@@ -821,15 +855,13 @@ Here is an example of two job definitions:
of the time the item was enqueued will be frozen and used for
all jobs for a given change (see :ref:`global_repo_state`).

This attribute is not overridden by inheritance; instead it is
the union of all applicable parents and variants (i.e., jobs can
expand but not reduce the set of required projects when they
inherit).

The format for this attribute is either a list of strings or
dictionaries. Strings are interpreted as project names,
dictionaries, if used, may have the following attributes:

Supports override control. The default is ``!inherit``: values
are merged without duplication.
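
For example (the project name is hypothetical), to replace the
parent's required projects entirely rather than extending them:

.. code-block:: yaml

   - job:
       name: child
       required-projects: !override
         - example/other-project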

.. attr:: name
:required:

@@ -856,11 +888,7 @@ Here is an example of two job definitions:

.. attr:: vars

A dictionary of variables to supply to Ansible. When inheriting
from a job (or creating a variant of a job) vars are merged with
previous definitions. This means a variable definition with the
same name will override a previously defined variable, but new
variable names will be added to the set of defined variables.
A dictionary of variables to supply to Ansible.

When running a trusted playbook, the value of variables will be
frozen at the start of the job. Therefore if the value of the
@@ -878,6 +906,9 @@ Here is an example of two job definitions:
it is not recommended to do so except in the most controlled of
circumstances. They are almost impossible to render safely.

Supports override control. The default is ``!inherit``: values
are deep-merged.

.. attr:: extra-vars

A dictionary of variables to supply to Ansible with higher
@@ -885,6 +916,9 @@ Here is an example of two job definitions:
the name this is not passed to Ansible using the `--extra-vars`
flag.

Supports override control. The default is ``!inherit``: values
are deep-merged.

.. attr:: host-vars

A dictionary of host variables to supply to Ansible. The keys
@@ -892,6 +926,9 @@ Here is an example of two job definitions:
:ref:`nodeset`, and the values are dictionaries of variables,
just as in :attr:`job.vars`.

Supports override control. The default is ``!inherit``: values
are deep-merged.

.. attr:: group-vars

A dictionary of group variables to supply to Ansible. The keys
@@ -899,6 +936,9 @@ Here is an example of two job definitions:
:ref:`nodeset`, and the values are dictionaries of variables,
just as in :attr:`job.vars`.

Supports override control. The default is ``!inherit``: values
are deep-merged.

An example of three kinds of variables:

.. code-block:: yaml
@@ -926,6 +966,7 @@ Here is an example of two job definitions:
group-vars:
  api:
    baz: "this variable is visible on api1 and api2"

.. attr:: dependencies

A list of other jobs upon which this job depends. Zuul will not
@@ -959,6 +1000,9 @@ Here is an example of two job definitions:
*soft* dependency will simply be ignored if the dependent job
is not run.

Supports override control. The default is ``!override``: values
are overridden.
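
For example (the job name ``additional-job`` is hypothetical), a child
job that should depend on its parent's dependencies as well as its own
would need to request combining explicitly:

.. code-block:: yaml

   - job:
       name: child
       dependencies: !inherit
         - additional-job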

.. attr:: allowed-projects

A list of Zuul projects which may use this job. By default, a
@@ -1099,6 +1143,9 @@ Here is an example of two job definitions:
value is used to determine if the job should run. This is a
:ref:`regular expression <regex>` or list of regular expressions.

Supports override control. The default is ``!override``: values
are overridden.
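
For example (the pattern shown is hypothetical), to combine the
parent's file matchers with the child's instead of replacing them:

.. code-block:: yaml

   - job:
       name: child
       files: !inherit
         - ^docs/.*$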

.. warning::

File filters will be ignored for refs that don't have any
@@ -1116,6 +1163,9 @@ Here is an example of two job definitions:
are in the docs directory. A :ref:`regular expression <regex>`
or list of regular expressions.

Supports override control. The default is ``!override``: values
are overridden.

.. warning::

File filters will be ignored for refs that don't have any
@@ -1186,14 +1236,13 @@ Here is an example of two job definitions:
In this case, it will be able to restart jobs for changes behind
it in a dependent pipeline.

When inheriting or applying variants this option is combined
so that regular expressions from all parents and variants used
will be applied.

Use caution when specifying this option. If an early failure is
triggered, the job result will be recorded as FAILURE even if
the job playbooks ultimately succeed.

Supports override control. The default is ``!inherit``: values
are merged without duplication.
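
For example (the pattern shown is hypothetical), to discard any
patterns inherited from the parent and match only a specific string:

.. code-block:: yaml

   - job:
       name: child
       failure-output: !override
         - "^FATAL:"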

.. attr:: workspace-scheme
:default: golang

@ -87,7 +87,7 @@ class TestAbstractMatcherCollection(BaseTestMatcher):
|
||||
|
||||
def test_repr(self):
|
||||
matcher = cm.MatchAll([])
|
||||
self.assertEqual(repr(matcher), '<MatchAll>')
|
||||
self.assertEqual(repr(matcher), '<MatchAll []>')
|
||||
|
||||
|
||||
class BaseTestFilesMatcher(BaseTestMatcher):
|
||||
|
@ -18,12 +18,14 @@ import configparser
|
||||
import collections
|
||||
import os
|
||||
import random
|
||||
import textwrap
|
||||
import types
|
||||
import uuid
|
||||
from unittest import mock
|
||||
|
||||
import fixtures
|
||||
import testtools
|
||||
import voluptuous
|
||||
|
||||
from zuul import model
|
||||
from zuul import configloader
|
||||
@ -481,6 +483,372 @@ class TestJob(BaseTestCase):
|
||||
job.secrets[run_idx]['encrypted_data'])
|
||||
self.assertEqual(run_secret, secret2_data)
|
||||
|
||||
def _test_job_override_control(self, attr, job_attr,
|
||||
default, default_value,
|
||||
inherit, inherit_value,
|
||||
override, override_value,
|
||||
errors,
|
||||
):
|
||||
# Default behavior
|
||||
data = configloader.safe_load_yaml(default, self.context)
|
||||
parent = self.pcontext.job_parser.fromYaml(data[0]['job'])
|
||||
child = self.pcontext.job_parser.fromYaml(data[1]['job'])
|
||||
job = parent.copy()
|
||||
job.applyVariant(child, self.layout, None)
|
||||
self.assertEqual(default_value, getattr(job, job_attr))
|
||||
|
||||
# Explicit inherit
|
||||
data = configloader.safe_load_yaml(inherit, self.context)
|
||||
parent = self.pcontext.job_parser.fromYaml(data[0]['job'])
|
||||
child = self.pcontext.job_parser.fromYaml(data[1]['job'])
|
||||
job = parent.copy()
|
||||
job.applyVariant(child, self.layout, None)
|
||||
self.assertEqual(inherit_value, getattr(job, job_attr))
|
||||
|
||||
# Explicit override
|
||||
data = configloader.safe_load_yaml(override, self.context)
|
||||
parent = self.pcontext.job_parser.fromYaml(data[0]['job'])
|
||||
child = self.pcontext.job_parser.fromYaml(data[1]['job'])
|
||||
job = parent.copy()
|
||||
job.applyVariant(child, self.layout, None)
|
||||
self.assertEqual(override_value, getattr(job, job_attr))
|
||||
|
||||
# Make sure we can't put the override in the wrong place
|
||||
for conf, exc in errors:
|
||||
with testtools.ExpectedException(exc):
|
||||
data = configloader.safe_load_yaml(conf, self.context)
|
||||
child = self.pcontext.job_parser.fromYaml(data[0]['job'])
|
||||
|
||||
def _test_job_override_control_set(
|
||||
self, attr, job_attr=None,
|
||||
default_override=False,
|
||||
value_factory=lambda values: {v for v in values}):
|
||||
if job_attr is None:
|
||||
job_attr = attr
|
||||
default = textwrap.dedent(
|
||||
f"""
|
||||
- job:
|
||||
name: parent
|
||||
{attr}: parent-{attr}
|
||||
- job:
|
||||
name: child
|
||||
{attr}: child-{attr}
|
||||
""")
|
||||
|
||||
inherit = textwrap.dedent(
|
||||
f"""
|
||||
- job:
|
||||
name: parent
|
||||
{attr}: parent-{attr}
|
||||
- job:
|
||||
name: child
|
||||
{attr}: !inherit child-{attr}
|
||||
""")
|
||||
inherit_value = value_factory([f'parent-{attr}', f'child-{attr}'])
|
||||
|
||||
override = textwrap.dedent(
|
||||
f"""
|
||||
- job:
|
||||
name: parent
|
||||
{attr}: parent-{attr}
|
||||
- job:
|
||||
name: child
|
||||
{attr}: !override [child-{attr}]
|
||||
""")
|
||||
override_value = value_factory([f'child-{attr}'])
|
||||
|
||||
if default_override:
|
||||
default_value = override_value
|
||||
else:
|
||||
default_value = inherit_value
|
||||
|
||||
errors = [(textwrap.dedent(
|
||||
f"""
|
||||
- job:
|
||||
name: child
|
||||
{attr}: [!override child-{attr}]
|
||||
"""), voluptuous.error.MultipleInvalid)]
|
||||
self._test_job_override_control(attr, job_attr,
|
||||
default, default_value,
|
||||
inherit, inherit_value,
|
||||
override, override_value,
|
||||
errors)
|
||||
|
||||
def test_job_override_control_tags(self):
|
||||
self._test_job_override_control_set('tags')
|
||||
|
||||
def test_job_override_control_requires(self):
|
||||
self._test_job_override_control_set('requires')
|
||||
|
||||
def test_job_override_control_provides(self):
|
||||
self._test_job_override_control_set('provides')
|
||||
|
||||
def test_job_override_control_dependencies(self):
|
||||
self._test_job_override_control_set(
|
||||
'dependencies',
|
||||
default_override=True,
|
||||
value_factory=lambda values:
|
||||
{model.JobDependency(v) for v in values})
|
||||
|
||||
def test_job_override_control_failure_output(self):
|
||||
self._test_job_override_control_set(
|
||||
'failure-output',
|
||||
job_attr='failure_output',
|
||||
value_factory=lambda values: tuple(v for v in sorted(values)))
|
||||
|
||||
def test_job_override_control_files(self):
|
||||
self._test_job_override_control_set(
|
||||
'files',
|
||||
job_attr='file_matcher',
|
||||
default_override=True,
|
||||
value_factory=lambda values: change_matcher.MatchAnyFiles(
|
||||
[change_matcher.FileMatcher(ZuulRegex(v))
|
||||
for v in sorted(values)]))
|
||||
|
||||
def test_job_override_control_irrelevant_files(self):
|
||||
self._test_job_override_control_set(
|
||||
'irrelevant-files',
|
||||
job_attr='irrelevant_file_matcher',
|
||||
default_override=True,
|
||||
value_factory=lambda values: change_matcher.MatchAllFiles(
|
||||
[change_matcher.FileMatcher(ZuulRegex(v))
|
||||
for v in sorted(values)]))
|
||||
|
||||
def _test_job_override_control_dict(
|
||||
self, attr, job_attr=None,
|
||||
default_override=False):
|
||||
if job_attr is None:
|
||||
job_attr = attr
|
||||
default = textwrap.dedent(
|
||||
f"""
|
||||
- job:
|
||||
name: parent
|
||||
{attr}:
|
||||
parent: 1
|
||||
- job:
|
||||
name: child
|
||||
{attr}:
|
||||
child: 2
|
||||
""")
|
||||
|
||||
inherit = textwrap.dedent(
|
||||
f"""
|
||||
- job:
|
||||
name: parent
|
||||
{attr}:
|
||||
parent: 1
|
||||
- job:
|
||||
name: child
|
||||
{attr}: !inherit
|
||||
child: 2
|
||||
""")
|
||||
inherit_value = {'parent': 1, 'child': 2}
|
||||
|
||||
override = textwrap.dedent(
|
||||
f"""
|
||||
- job:
|
||||
name: parent
|
||||
{attr}:
|
||||
parent: 1
|
||||
- job:
|
||||
name: child
|
||||
{attr}: !override
|
||||
child: 2
|
||||
""")
|
||||
override_value = {'child': 2}
|
||||
|
||||
if default_override:
|
||||
default_value = override_value
|
||||
else:
|
||||
default_value = inherit_value
|
||||
|
||||
errors = [
|
||||
(textwrap.dedent(
|
||||
f"""
|
||||
- job:
|
||||
name: child
|
||||
{attr}:
|
||||
!override child: 2
|
||||
"""), voluptuous.error.MultipleInvalid),
|
||||
(textwrap.dedent(
|
||||
f"""
|
||||
- job:
|
||||
name: child
|
||||
{attr}:
|
||||
child: !override 2
|
||||
"""), Exception),
|
||||
]
|
||||
self._test_job_override_control(attr, job_attr,
|
||||
default, default_value,
|
||||
inherit, inherit_value,
|
||||
override, override_value,
|
||||
errors)
|
||||
|
||||
def test_job_override_control_vars(self):
|
||||
self._test_job_override_control_dict(
|
||||
'vars', job_attr='variables')
|
||||
|
||||
def test_job_override_control_extra_vars(self):
|
||||
self._test_job_override_control_dict(
|
||||
'extra-vars', job_attr='extra_variables')
|
||||
|
||||
def _test_job_override_control_host_dict(
|
||||
self, attr, job_attr=None,
|
||||
default_override=False):
|
||||
if job_attr is None:
|
||||
job_attr = attr
|
||||
default = textwrap.dedent(
|
||||
f"""
|
||||
- job:
|
||||
name: parent
|
||||
{attr}:
|
||||
host:
|
||||
parent: 1
|
||||
- job:
|
||||
name: child
|
||||
{attr}:
|
||||
host:
|
||||
child: 2
|
||||
""")
|
||||
|
||||
inherit = textwrap.dedent(
|
||||
f"""
|
||||
- job:
|
||||
name: parent
|
||||
{attr}:
|
||||
host:
|
||||
parent: 1
|
||||
- job:
|
||||
name: child
|
||||
{attr}: !inherit
|
||||
host:
|
||||
child: 2
|
||||
""")
|
||||
inherit_value = {'host': {'parent': 1, 'child': 2}}
|
||||
|
||||
override = textwrap.dedent(
|
||||
f"""
|
||||
- job:
|
||||
name: parent
|
||||
{attr}:
|
||||
host:
|
||||
parent: 1
|
||||
- job:
|
||||
name: child
|
||||
{attr}: !override
|
||||
host:
|
||||
child: 2
|
||||
""")
|
||||
override_value = {'host': {'child': 2}}
|
||||
|
||||
if default_override:
|
||||
default_value = override_value
|
||||
else:
|
||||
default_value = inherit_value
|
||||
|
||||
errors = [
|
||||
(textwrap.dedent(
|
||||
f"""
|
||||
- job:
|
||||
name: child
|
||||
{attr}:
|
||||
!override host:
|
||||
child: 2
|
||||
"""), voluptuous.error.MultipleInvalid),
|
||||
(textwrap.dedent(
|
||||
f"""
|
||||
- job:
|
||||
name: child
|
||||
{attr}:
|
||||
host: !override
|
||||
child: 2
|
||||
"""), voluptuous.error.MultipleInvalid),
|
||||
(textwrap.dedent(
|
||||
f"""
|
||||
- job:
|
||||
name: child
|
||||
{attr}:
|
||||
host:
|
||||
child: !override 2
|
||||
"""), Exception),
|
||||
]
|
||||
self._test_job_override_control(attr, job_attr,
|
||||
default, default_value,
|
||||
inherit, inherit_value,
|
||||
override, override_value,
|
||||
errors)
|
||||
|
||||
def test_job_override_control_host_vars(self):
|
||||
self._test_job_override_control_host_dict(
|
||||
'host-vars', job_attr='host_variables')
|
||||
|
||||
def test_job_override_control_group_vars(self):
|
||||
self._test_job_override_control_host_dict(
|
||||
'group-vars', job_attr='group_variables')
|
||||
|
||||
def test_job_override_control_required_projects(self):
|
||||
parent = model.Project('parent-project', self.source)
|
||||
child = model.Project('child-project', self.source)
|
||||
parent_tpc = model.TenantProjectConfig(parent)
|
||||
child_tpc = model.TenantProjectConfig(child)
|
||||
self.tenant.addTPC(parent_tpc)
|
||||
self.tenant.addTPC(child_tpc)
|
||||
|
||||
default = textwrap.dedent(
|
||||
"""
|
||||
- job:
|
||||
name: parent
|
||||
required-projects: parent-project
|
||||
- job:
|
||||
name: child
|
||||
required-projects: child-project
|
||||
""")
|
||||
|
||||
inherit = textwrap.dedent(
|
||||
"""
|
||||
- job:
|
||||
name: parent
|
||||
required-projects: parent-project
|
||||
- job:
|
||||
name: child
|
||||
required-projects: !inherit child-project
|
||||
""")
|
||||
inherit_value = {
|
||||
'git.example.com/parent-project': model.JobProject(
|
||||
'git.example.com/parent-project'),
|
||||
'git.example.com/child-project': model.JobProject(
|
||||
'git.example.com/child-project'),
|
||||
}
|
||||
|
||||
override = textwrap.dedent(
|
||||
"""
|
||||
- job:
|
||||
name: parent
|
||||
required-projects: parent-project
|
||||
- job:
|
||||
name: child
|
||||
required-projects: !override [child-project]
|
||||
""")
|
||||
override_value = {
|
||||
'git.example.com/child-project': model.JobProject(
|
||||
'git.example.com/child-project'),
|
||||
}
|
||||
|
||||
default_value = inherit_value
|
||||
|
||||
errors = [(textwrap.dedent(
|
||||
"""
|
||||
- job:
|
||||
name: child
|
||||
required-projects: [!override child-project]
|
||||
"""), voluptuous.error.MultipleInvalid)]
|
||||
self._test_job_override_control('required-projects',
|
||||
'required_projects',
|
||||
default, default_value,
|
||||
inherit, inherit_value,
|
||||
override, override_value,
|
||||
errors)
|
||||
|
||||
|
||||
class FakeFrozenJob(model.Job):
|
||||
|
||||
|
@ -18,6 +18,8 @@ This module defines classes used in matching changes based on job
|
||||
configuration.
|
||||
"""
|
||||
|
||||
import json
|
||||
|
||||
from zuul.lib.re2util import ZuulRegex
|
||||
|
||||
COMMIT_MSG = '/COMMIT_MSG'
|
||||
@ -54,6 +56,9 @@ class AbstractChangeMatcher(object):
|
||||
def __ne__(self, other):
|
||||
return not self.__eq__(other)
|
||||
|
||||
def __hash__(self):
|
||||
return hash(json.dumps(self.regex.toDict(), sort_keys=True))
|
||||
|
||||
def __str__(self):
|
||||
return '{%s:%s}' % (self.__class__.__name__, self._regex)
|
||||
|
||||
@ -127,7 +132,8 @@ class AbstractMatcherCollection(AbstractChangeMatcher):
|
||||
','.join([str(x) for x in self.matchers]))
|
||||
|
||||
def __repr__(self):
|
||||
return '<%s>' % self.__class__.__name__
|
||||
return '<%s [%s]>' % (self.__class__.__name__,
|
||||
', '.join([repr(x) for x in self.matchers]))
|
||||
|
||||
def copy(self):
|
||||
return self.__class__(self.matchers[:])
|
||||
|
@ -76,6 +76,22 @@ def to_list(x):
|
||||
return vs.Any([x], x)
|
||||
|
||||
|
||||
def override_list(x):
|
||||
def validator(v):
|
||||
if isinstance(v, yaml.OverrideValue):
|
||||
v = v.value
|
||||
vs.Any([x], x)(v)
|
||||
return validator
|
||||
|
||||
|
||||
def override_value(x):
|
||||
def validator(v):
|
||||
if isinstance(v, yaml.OverrideValue):
|
||||
v = v.value
|
||||
vs.Schema(x)(v)
|
||||
return validator
|
||||
|
||||
|
||||
def as_list(item):
|
||||
if not item:
|
||||
return []
|
||||
@ -729,8 +745,8 @@ class JobParser(object):
|
||||
'abstract': bool,
|
||||
'protected': bool,
|
||||
'intermediate': bool,
|
||||
'requires': to_list(str),
|
||||
'provides': to_list(str),
|
||||
'requires': override_list(str),
|
||||
'provides': override_list(str),
|
||||
'failure-message': str,
|
||||
'success-message': str,
|
||||
# TODO: ignored, remove for v5
|
||||
@ -741,11 +757,12 @@ class JobParser(object):
|
||||
'voting': bool,
|
||||
'semaphore': vs.Any(semaphore, str),
|
||||
'semaphores': to_list(vs.Any(semaphore, str)),
|
||||
'tags': to_list(str),
|
||||
'tags': override_list(str),
|
||||
'branches': to_list(vs.Any(ZUUL_REGEX, str)),
|
||||
'files': to_list(vs.Any(ZUUL_REGEX, str)),
|
||||
'files': override_list(vs.Any(ZUUL_REGEX, str)),
|
||||
'secrets': to_list(vs.Any(secret, str)),
|
||||
'irrelevant-files': to_list(vs.Any(ZUUL_REGEX, str)),
|
||||
'irrelevant-files': override_list(
|
||||
vs.Any(ZUUL_REGEX, str)),
|
||||
# validation happens in NodeSetParser
|
||||
'nodeset': vs.Any(dict, str),
|
||||
'timeout': int,
|
||||
@ -760,12 +777,14 @@ class JobParser(object):
|
||||
'_source_context': model.SourceContext,
|
||||
'_start_mark': model.ZuulMark,
|
||||
'roles': to_list(role),
|
||||
'required-projects': to_list(vs.Any(job_project, str)),
|
||||
'vars': ansible_vars_dict,
|
||||
'extra-vars': ansible_vars_dict,
|
||||
'host-vars': {str: ansible_vars_dict},
|
||||
'group-vars': {str: ansible_vars_dict},
|
||||
'dependencies': to_list(vs.Any(job_dependency, str)),
|
||||
'required-projects': override_list(
|
||||
vs.Any(job_project, str)),
|
||||
'vars': override_value(ansible_vars_dict),
|
||||
'extra-vars': override_value(ansible_vars_dict),
|
||||
'host-vars': override_value({str: ansible_vars_dict}),
|
||||
'group-vars': override_value({str: ansible_vars_dict}),
|
||||
'dependencies': override_list(
|
||||
vs.Any(job_dependency, str)),
|
||||
'allowed-projects': to_list(str),
|
||||
'override-branch': str,
|
||||
'override-checkout': str,
|
||||
@ -775,7 +794,7 @@ class JobParser(object):
|
||||
'match-on-config-updates': bool,
|
||||
'workspace-scheme': vs.Any('golang', 'flat', 'unique'),
|
||||
'deduplicate': vs.Any(bool, 'auto'),
|
||||
'failure-output': to_list(str),
|
||||
'failure-output': override_list(str),
|
||||
}
|
||||
|
||||
job_name = {vs.Required('name'): str}
|
||||
@ -990,52 +1009,68 @@ class JobParser(object):
|
||||
job.nodeset = ns
|
||||
|
||||
if 'required-projects' in conf:
|
||||
new_projects = {}
|
||||
projects = as_list(conf.get('required-projects', []))
|
||||
unknown_projects = []
|
||||
for project in projects:
|
||||
if isinstance(project, dict):
|
||||
project_name = project['name']
|
||||
project_override_branch = project.get('override-branch')
|
||||
project_override_checkout = project.get(
|
||||
'override-checkout')
|
||||
else:
|
||||
project_name = project
|
||||
project_override_branch = None
|
||||
project_override_checkout = None
|
||||
(trusted, project) = self.pcontext.tenant.getProject(
|
||||
project_name)
|
||||
if project is None:
|
||||
unknown_projects.append(project_name)
|
||||
continue
|
||||
job_project = model.JobProject(project.canonical_name,
|
||||
project_override_branch,
|
||||
project_override_checkout)
|
||||
new_projects[project.canonical_name] = job_project
|
||||
with self.pcontext.confAttr(
|
||||
conf, 'required-projects') as conf_projects:
|
||||
if isinstance(conf_projects, yaml.OverrideValue):
|
||||
job.override_control['required_projects'] =\
|
||||
conf_projects.override
|
||||
conf_projects = conf_projects.value
|
||||
new_projects = {}
|
||||
projects = as_list(conf_projects)
|
||||
unknown_projects = []
|
||||
for project in projects:
|
||||
if isinstance(project, dict):
|
||||
project_name = project['name']
|
||||
project_override_branch = project.get(
|
||||
'override-branch')
|
||||
project_override_checkout = project.get(
|
||||
'override-checkout')
|
||||
else:
|
||||
project_name = project
|
||||
project_override_branch = None
|
||||
project_override_checkout = None
|
||||
(trusted, project) = self.pcontext.tenant.getProject(
|
||||
project_name)
|
||||
if project is None:
|
||||
unknown_projects.append(project_name)
|
||||
continue
|
||||
job_project = model.JobProject(project.canonical_name,
|
||||
project_override_branch,
|
||||
project_override_checkout)
|
||||
new_projects[project.canonical_name] = job_project
|
||||
|
||||
# NOTE(mnaser): We accumulate all unknown projects and throw an
|
||||
# exception only once to capture all of them in the
|
||||
# error message.
|
||||
if unknown_projects:
|
||||
raise ProjectNotFoundError(unknown_projects)
|
||||
# We accumulate all unknown projects and throw an
|
||||
# exception only once to capture all of them in the
|
||||
# error message.
|
||||
if unknown_projects:
|
||||
raise ProjectNotFoundError(unknown_projects)
|
||||
|
||||
job.required_projects = new_projects
|
||||
job.required_projects = new_projects
|
||||
|
||||
if 'dependencies' in conf:
|
||||
new_dependencies = []
|
||||
dependencies = as_list(conf.get('dependencies', []))
|
||||
for dep in dependencies:
|
||||
if isinstance(dep, dict):
|
||||
dep_name = dep['name']
|
||||
dep_soft = dep.get('soft', False)
|
||||
else:
|
||||
dep_name = dep
|
||||
dep_soft = False
|
||||
job_dependency = model.JobDependency(dep_name, dep_soft)
|
||||
new_dependencies.append(job_dependency)
|
||||
job.dependencies = new_dependencies
|
||||
with self.pcontext.confAttr(conf, 'dependencies') as conf_deps:
|
||||
if isinstance(conf_deps, yaml.OverrideValue):
|
||||
job.override_control['dependencies'] = conf_deps.override
|
||||
conf_deps = conf_deps.value
|
||||
new_dependencies = []
|
||||
dependencies = as_list(conf_deps)
|
||||
for dep in dependencies:
|
||||
if isinstance(dep, dict):
|
||||
dep_name = dep['name']
|
||||
dep_soft = dep.get('soft', False)
|
||||
else:
|
||||
dep_name = dep
|
||||
dep_soft = False
|
||||
job_dependency = model.JobDependency(dep_name, dep_soft)
|
||||
new_dependencies.append(job_dependency)
|
||||
job.dependencies = frozenset(new_dependencies)
|
||||
|
||||
semaphores = as_list(conf.get('semaphores', conf.get('semaphore', [])))
|
||||
with self.pcontext.confAttr(conf, 'semaphore') as conf_semaphore:
|
||||
semaphores = as_list(conf_semaphore)
|
||||
if 'semaphores' in conf:
|
||||
with self.pcontext.confAttr(conf, 'semaphores') as conf_semaphores:
|
||||
semaphores = as_list(conf_semaphores)
|
||||
# TODO: after removing semaphore, indent this section
|
||||
job_semaphores = []
|
||||
for semaphore in semaphores:
|
||||
if isinstance(semaphore, str):
|
||||
@ -1060,28 +1095,31 @@ class JobParser(object):
|
||||
"be used for one")
|
||||
|
||||
for k in ('tags', 'requires', 'provides'):
|
||||
v = frozenset(as_list(conf.get(k)))
|
||||
if v:
|
||||
setattr(job, k, v)
|
||||
with self.pcontext.confAttr(conf, k) as conf_k:
|
||||
if isinstance(conf_k, yaml.OverrideValue):
|
||||
job.override_control[k] = conf_k.override
|
||||
conf_k = conf_k.value
|
||||
v = frozenset(as_list(conf_k))
|
||||
if v:
|
||||
setattr(job, k, v)
|
||||
|
||||
variables = conf.get('vars', None)
|
||||
if variables:
|
||||
check_varnames(variables)
|
||||
job.variables = variables
|
||||
extra_variables = conf.get('extra-vars', None)
|
||||
if extra_variables:
|
||||
check_varnames(extra_variables)
|
||||
job.extra_variables = extra_variables
|
||||
host_variables = conf.get('host-vars', None)
|
||||
if host_variables:
|
||||
for host, hvars in host_variables.items():
|
||||
check_varnames(hvars)
|
||||
job.host_variables = host_variables
|
||||
group_variables = conf.get('group-vars', None)
|
||||
if group_variables:
|
||||
for group, gvars in group_variables.items():
|
||||
check_varnames(gvars)
|
||||
job.group_variables = group_variables
|
||||
for (yaml_attr, job_attr, hostvars) in [
|
||||
('vars', 'variables', False),
|
||||
('extra-vars', 'extra_variables', False),
|
||||
('host-vars', 'host_variables', True),
|
||||
('group-vars', 'group_variables', True),
|
||||
]:
|
||||
with self.pcontext.confAttr(conf, yaml_attr) as conf_vars:
|
||||
if isinstance(conf_vars, yaml.OverrideValue):
|
||||
job.override_control[job_attr] = conf_vars.override
|
||||
conf_vars = conf_vars.value
|
||||
if conf_vars:
|
||||
if hostvars:
|
||||
for host, hvars in conf_vars.items():
|
||||
check_varnames(hvars)
|
||||
else:
|
||||
check_varnames(conf_vars)
|
||||
setattr(job, job_attr, conf_vars)
|
||||
|
||||
allowed_projects = conf.get('allowed-projects', None)
|
||||
# See note above at "post-review".
|
||||
@ -1108,6 +1146,9 @@ class JobParser(object):
|
||||
job.setBranchMatcher(branches)
|
||||
if 'files' in conf:
|
||||
with self.pcontext.confAttr(conf, 'files') as conf_files:
|
||||
if isinstance(conf_files, yaml.OverrideValue):
|
||||
job.override_control['file_matcher'] = conf_files.override
|
||||
conf_files = conf_files.value
|
||||
job.setFileMatcher([
|
||||
make_regex(x, self.pcontext)
|
||||
for x in as_list(conf_files)
|
||||
@ -1115,18 +1156,28 @@ class JobParser(object):
|
||||
if 'irrelevant-files' in conf:
|
||||
with self.pcontext.confAttr(conf,
|
||||
'irrelevant-files') as conf_ifiles:
|
||||
if isinstance(conf_ifiles, yaml.OverrideValue):
|
||||
job.override_control['irrelevant_file_matcher'] =\
|
||||
conf_ifiles.override
|
||||
conf_ifiles = conf_ifiles.value
|
||||
job.setIrrelevantFileMatcher([
|
||||
make_regex(x, self.pcontext)
|
||||
for x in as_list(conf_ifiles)
|
||||
])
|
||||
if 'failure-output' in conf:
|
||||
failure_output = as_list(conf['failure-output'])
|
||||
with self.pcontext.confAttr(conf,
|
||||
'failure-output') as conf_output:
|
||||
if isinstance(conf_output, yaml.OverrideValue):
|
||||
job.override_control['failure_output'] =\
|
||||
conf_output.override
|
||||
conf_output = conf_output.value
|
||||
failure_output = as_list(conf_output)
|
||||
# Test compilation to detect errors, but the zuul_stream
|
||||
# callback plugin is what actually needs re objects, so we
|
||||
# let it recompile them later.
|
||||
for x in failure_output:
|
||||
re2.compile(x)
|
||||
job.failure_output = tuple(failure_output)
|
||||
job.failure_output = tuple(sorted(failure_output))
|
||||
|
||||
job.freeze()
|
||||
return job
|
||||
|
@ -14,7 +14,9 @@
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
import json
|
||||
import re
|
||||
|
||||
import re2
|
||||
|
||||
|
||||
@ -78,6 +80,9 @@ class ZuulRegex:
|
||||
def __ne__(self, other):
|
||||
return not self.__eq__(other)
|
||||
|
||||
def __hash__(self):
|
||||
return hash(json.dumps(self.toDict(), sort_keys=True))
|
||||
|
||||
def match(self, subject):
|
||||
if self.negate:
|
||||
return not self.re.match(subject)
|
||||
|
@ -32,6 +32,56 @@ except ImportError:
|
||||
Mark = yaml.Mark
|
||||
|
||||
|
||||
class OverrideValue:
|
||||
def __init__(self, value, override):
|
||||
self.value = value
|
||||
self.override = override
|
||||
|
||||
|
||||
# Generally it only makes sense to override lists or dicts, but
|
||||
# because of the to_list construction, we might end up with strings
|
||||
# too.
|
||||
class OverrideStr(OverrideValue):
|
||||
pass
|
||||
|
||||
|
||||
class OverrideList(OverrideValue):
|
||||
pass
|
||||
|
||||
|
||||
class OverrideDict(OverrideValue):
|
||||
pass
|
||||
|
||||
|
||||
class Override:
|
||||
yaml_tag = u'!override'
|
||||
override_value = True
|
||||
|
||||
@classmethod
|
||||
def from_yaml(cls, loader, node):
|
||||
if isinstance(node, yaml.MappingNode):
|
||||
return OverrideDict(loader.construct_mapping(node),
|
||||
cls.override_value)
|
||||
elif isinstance(node, yaml.SequenceNode):
|
||||
return OverrideList(loader.construct_sequence(node),
|
||||
cls.override_value)
|
||||
elif isinstance(node, yaml.ScalarNode):
|
||||
tag = loader.resolve(yaml.ScalarNode, node.value, (True, False))
|
||||
node = yaml.ScalarNode(tag, node.value,
|
||||
node.start_mark, node.end_mark)
|
||||
raw_value = loader.construct_object(node)
|
||||
if isinstance(raw_value, str):
|
||||
return OverrideStr(raw_value, cls.override_value)
|
||||
raise Exception("Unsupported type for scalar override control: "
|
||||
f"{type(raw_value)}")
|
||||
raise Exception(f"Unsupported type for override control: {type(node)}")
|
||||
|
||||
|
||||
class Inherit(Override):
|
||||
yaml_tag = u'!inherit'
|
||||
override_value = False
|
||||
|
||||
|
||||
class EncryptedPKCS1_OAEP:
|
||||
yaml_tag = u'!encrypted/pkcs1-oaep'
|
||||
|
||||
@ -123,6 +173,11 @@ EncryptedDumper.add_representer(
|
||||
EncryptedDumper.add_representer(
|
||||
ZuulConfigKey,
|
||||
ZuulConfigKey.to_yaml)
|
||||
# Add support for override control
|
||||
EncryptedLoader.add_constructor(Override.yaml_tag,
|
||||
Override.from_yaml)
|
||||
EncryptedLoader.add_constructor(Inherit.yaml_tag,
|
||||
Inherit.from_yaml)
|
||||
|
||||
|
||||
def encrypted_dump(data, *args, **kwargs):
|
||||
|
133 zuul/model.py
@ -3410,6 +3410,20 @@ class Job(ConfigObject):
|
||||
failure_output=(),
|
||||
)
|
||||
|
||||
override_control = defaultdict(lambda: True)
|
||||
override_control['file_matcher'] = True
|
||||
override_control['irrelevant_file_matcher'] = True
|
||||
override_control['tags'] = False
|
||||
override_control['provides'] = False
|
||||
override_control['requires'] = False
|
||||
override_control['dependencies'] = True
|
||||
override_control['variables'] = False
|
||||
override_control['extra_variables'] = False
|
||||
override_control['host_variables'] = False
|
||||
override_control['group_variables'] = False
|
||||
override_control['required_projects'] = False
|
||||
override_control['failure_output'] = False
|
||||
|
||||
# These are generally internal attributes which are not
|
||||
# accessible via configuration.
|
||||
self.other_attributes = dict(
|
||||
@ -3426,6 +3440,8 @@ class Job(ConfigObject):
|
||||
secrets=(), # secrets aren't inheritable
|
||||
queued=False,
|
||||
waiting_status=None, # Text description of why its waiting
|
||||
# Override settings for context attributes:
|
||||
override_control=override_control,
|
||||
)
|
||||
|
||||
self.attributes = {}
|
||||
@ -3772,42 +3788,56 @@ class Job(ConfigObject):
|
||||
# Set the file matcher to match any of the change files
|
||||
# Input is a list of ZuulRegex objects
|
||||
self._files = [x.toDict() for x in files]
|
||||
matchers = []
|
||||
for zuul_regex in files:
|
||||
matchers.append(change_matcher.FileMatcher(zuul_regex))
|
||||
self.file_matcher = change_matcher.MatchAnyFiles(matchers)
|
||||
self.file_matcher = change_matcher.MatchAnyFiles(
|
||||
[change_matcher.FileMatcher(zuul_regex)
|
||||
for zuul_regex in sorted(files, key=lambda x: x.pattern)])
|
||||
|
||||
def setIrrelevantFileMatcher(self, irrelevant_files):
|
||||
# Set the irrelevant file matcher to match any of the change files
|
||||
# Input is a list of ZuulRegex objects
|
||||
self._irrelevant_files = [x.toDict() for x in irrelevant_files]
|
||||
matchers = []
|
||||
for zuul_regex in irrelevant_files:
|
||||
matchers.append(change_matcher.FileMatcher(zuul_regex))
|
||||
self.irrelevant_file_matcher = change_matcher.MatchAllFiles(matchers)
|
||||
self.irrelevant_file_matcher = change_matcher.MatchAllFiles(
|
||||
[change_matcher.FileMatcher(zuul_regex)
|
||||
for zuul_regex in sorted(irrelevant_files,
|
||||
key=lambda x: x.pattern)])
|
||||
|
||||
def updateVariables(self, other_vars, other_extra_vars, other_host_vars,
|
||||
other_group_vars):
|
||||
if other_vars is not None:
|
||||
self.variables = Job._deepUpdate(self.variables, other_vars)
|
||||
if other_extra_vars is not None:
|
||||
self.extra_variables = Job._deepUpdate(
|
||||
self.extra_variables, other_extra_vars)
|
||||
if other_host_vars is not None:
|
||||
self.host_variables = Job._deepUpdate(
|
||||
self.host_variables, other_host_vars)
|
||||
if other_group_vars is not None:
|
||||
self.group_variables = Job._deepUpdate(
|
||||
self.group_variables, other_group_vars)
|
||||
def updateVariables(self, other):
|
||||
if other.variables is not None:
|
||||
if other.override_control['variables']:
|
||||
self.variables = other.variables
|
||||
else:
|
||||
self.variables = Job._deepUpdate(
|
||||
self.variables, other.variables)
|
||||
if other.extra_variables is not None:
|
||||
if other.override_control['extra_variables']:
|
||||
self.extra_variables = other.extra_variables
|
||||
else:
|
||||
self.extra_variables = Job._deepUpdate(
|
||||
self.extra_variables, other.extra_variables)
|
||||
if other.host_variables is not None:
|
||||
if other.override_control['host_variables']:
|
||||
self.host_variables = other.host_variables
|
||||
else:
|
||||
self.host_variables = Job._deepUpdate(
|
||||
self.host_variables, other.host_variables)
|
||||
if other.group_variables is not None:
|
||||
if other.override_control['group_variables']:
|
||||
self.group_variables = other.group_variables
|
||||
else:
|
||||
self.group_variables = Job._deepUpdate(
|
||||
self.group_variables, other.group_variables)
|
||||
|
||||
def updateProjectVariables(self, project_vars):
|
||||
# Merge project/template variables directly into the job
|
||||
# variables. Job variables override project variables.
|
||||
self.variables = Job._deepUpdate(project_vars, self.variables)
|
||||
|
||||
def updateProjects(self, other_projects):
|
||||
required_projects = self.required_projects.copy()
|
||||
required_projects.update(other_projects)
|
||||
def updateProjects(self, other):
|
||||
if other.override_control['required_projects']:
|
||||
required_projects = {}
|
||||
else:
|
||||
required_projects = self.required_projects.copy()
|
||||
required_projects.update(other.required_projects)
|
||||
self.required_projects = required_projects
|
||||
|
||||
@staticmethod
|
||||
@ -3980,10 +4010,9 @@ class Job(ConfigObject):
|
||||
other_cleanup_run = self.freezePlaybooks(
|
||||
other.cleanup_run, layout, semaphore_handler)
|
||||
self.cleanup_run = other_cleanup_run + self.cleanup_run
|
||||
self.updateVariables(other.variables, other.extra_variables,
|
||||
other.host_variables, other.group_variables)
|
||||
self.updateVariables(other)
|
||||
if other._get('required_projects') is not None:
|
||||
self.updateProjects(other.required_projects)
|
||||
self.updateProjects(other)
|
||||
if (other._get('allowed_projects') is not None and
|
||||
self._get('allowed_projects') is not None):
|
||||
self.allowed_projects = frozenset(
|
||||
@ -3996,11 +4025,16 @@ class Job(ConfigObject):
|
||||
# contention (where two jobs try to start at the same time
|
||||
# and fail due to acquiring the same semaphores but in
|
||||
# reverse order.
|
||||
self.semaphores = tuple(
|
||||
sorted(other.semaphores + self.semaphores,
|
||||
key=lambda x: x.name))
|
||||
# Override control is explicitly not supported.
|
||||
semaphores = set(self.semaphores).union(set(other.semaphores))
|
||||
self.semaphores = tuple(sorted(semaphores, key=lambda x: x.name))
|
||||
if other._get('failure_output') is not None:
|
||||
self.failure_output = self.failure_output + other.failure_output
|
||||
if other.override_control['failure_output']:
|
||||
failure_output = other.failure_output
|
||||
else:
|
||||
failure_output = set(self.failure_output).union(
|
||||
set(other.failure_output))
|
||||
self.failure_output = tuple(sorted(failure_output))
|
||||
|
||||
pb_semaphores = set()
|
||||
for pb in self.run + self.pre_run + self.post_run + self.cleanup_run:
|
||||
@ -4014,13 +4048,28 @@ class Job(ConfigObject):
|
||||
"be used for one")
|
||||
|
||||
for k in self.context_attributes:
|
||||
if (other._get(k) is not None and
|
||||
k not in set(['tags', 'requires', 'provides'])):
|
||||
setattr(self, k, other._get(k))
|
||||
|
||||
for k in ('tags', 'requires', 'provides'):
|
||||
if other._get(k) is not None:
|
||||
setattr(self, k, getattr(self, k).union(other._get(k)))
|
||||
if (v := other._get(k)) is None:
|
||||
continue
|
||||
if other.override_control[k]:
|
||||
setattr(self, k, v)
|
||||
else:
|
||||
if isinstance(v, (set, frozenset)):
|
||||
setattr(self, k, getattr(self, k).union(v))
|
||||
elif isinstance(v, change_matcher.AbstractMatcherCollection):
|
||||
ours = getattr(self, k)
|
||||
ours = ours and set(ours.matchers) or set()
|
||||
matchers = ours.union(set(v.matchers))
|
||||
if k == 'file_matcher':
|
||||
self.setFileMatcher([m.regex for m in matchers])
|
||||
elif k == 'irrelevant_file_matcher':
|
||||
self.setIrrelevantFileMatcher(
|
||||
[m.regex for m in matchers])
|
||||
else:
|
||||
raise NotImplementedError()
|
||||
elif k in ('_files', '_irrelevant_files',):
|
||||
pass
|
||||
else:
|
||||
raise NotImplementedError()
|
||||
|
||||
self.inheritance_path = self.inheritance_path + (repr(other),)
|
||||
|
||||
@ -4091,6 +4140,10 @@ class JobSemaphore(ConfigObject):
|
||||
self.name = semaphore_name
|
||||
self.resources_first = resources_first
|
||||
|
||||
def __repr__(self):
|
||||
first = self.resources_first and ' resources first' or ''
|
||||
return '<JobSemaphore %s%s>' % (self.name, first)
|
||||
|
||||
def toDict(self):
|
||||
d = dict()
|
||||
d['name'] = self.name
|
||||
@ -4138,6 +4191,10 @@ class JobDependency(ConfigObject):
|
||||
self.name = name
|
||||
self.soft = soft
|
||||
|
||||
def __repr__(self):
|
||||
soft = self.soft and ' soft' or ''
|
||||
return '<JobDependency %s%s>' % (self.name, soft)
|
||||
|
||||
def __ne__(self, other):
|
||||
return not self.__eq__(other)
|
||||
|
||||
|
@@ -0,0 +1,24 @@
---
features:
  - |
    Several job attributes may now have their inheritance behavior
    changed through "override control". This introduces two new YAML
    tags, ``!override`` and ``!inherit``, which may be used to
    explicitly specify whether certain job attributes should inherit
    values from parent jobs or override them.

    See the general job documentation at :ref:`job` and also
    documentation for the following individual attributes:

    * :attr:`job.dependencies`
    * :attr:`job.extra-vars`
    * :attr:`job.failure-output`
    * :attr:`job.files`
    * :attr:`job.group-vars`
    * :attr:`job.host-vars`
    * :attr:`job.irrelevant-files`
    * :attr:`job.provides`
    * :attr:`job.required-projects`
    * :attr:`job.requires`
    * :attr:`job.tags`
    * :attr:`job.vars`