Add "debug" flag to triggers

This allows users to create pipeline triggers that provide expanded
debug information in reports.  This includes not only the debug
info currently available with the project-pipeline-config "debug" flag
(which covers decisions about running jobs), but also
information about why the change may not have been enqueued (and in
these cases, we will exceptionally report a non-enqueued item in order
to get this information to the user).

The intended workflow is:

1) User approves change
2) User is baffled by why change does not appear in pipeline
3) User leaves a comment with "gate debug" or a similar label/tag/etc
4) Zuul reports with a comment indicating why it was not enqueued
   (depends on abandoned change, etc).
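
   The workflow above might be wired up with a trigger such as the
   following sketch (the pipeline name, approval, and comment regex
   are illustrative, not taken from this change):

   ```yaml
   - pipeline:
       name: gate
       manager: dependent
       trigger:
         gerrit:
           # Normal gating trigger: no debug output in reports.
           - event: comment-added
             approval:
               - Approved: 1
           # Debug trigger: a "gate debug" comment requests an
           # expanded report explaining enqueue decisions.
           - event: comment-added
             comment: '^gate debug$'
             debug: true
   ```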

To make this more useful, the pipeline manager will now add several more
bits of information to the "warnings" list for non-enqueue conditions.
Also, the canMerge methods of the connections may now return a
FalseWithReason so that their reasoning can be included in these messages.
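
The FalseWithReason pattern can be sketched as follows; this is a
simplified illustration of the idea (the real class lives in
zuul.model and may differ in detail):

```python
class FalseWithReason:
    """A falsy value that carries a human-readable explanation."""

    def __init__(self, reason):
        self.reason = reason

    def __bool__(self):
        # Existing "if not source.canMerge(...)" checks keep working,
        # because this object evaluates as False.
        return False

    def __str__(self):
        # Callers that want the explanation can interpolate it
        # into a report message.
        return self.reason


result = FalseWithReason("missing labels: {'Verified'}")
if not result:
    message = f"can not be merged due to: {result}"
```

Because the object is falsy, existing boolean checks are unchanged;
only callers that opt in (here, the pipeline manager building the
warnings list) format the reason into the report.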

Tests are added for each of the four code-review systems, since the
attribute is set on the trigger itself, which is driver-specific.
These tests cover the parsing and also exercise the canMerge
method.  The Gerrit driver test includes several more variations to
exercise each of the new points in the pipeline manager that can
report a non-enqueued item.

While the git, timer, and zuul drivers also support this attribute for
completeness, tests are not added for them since it is unlikely to be
used there.

Change-Id: Ifa59406bda40c2ede98ba275894b4bd05663cd6c
Author: James E. Blair
Date:   2025-05-26 12:11:55 -07:00
Parent: 0ced2e68d7
Commit: 8c804042fb
38 changed files with 649 additions and 72 deletions


@@ -539,6 +539,26 @@ Trigger Configuration
   question.  It follows the same syntax as
   :ref:`gerrit_requirements`.

.. attr:: debug
   :default: false

   When set to `true`, this will cause debug messages to be
   included when the queue item is reported.  These debug messages
   may be used to help diagnose why certain jobs did or did not
   run, and in many cases, why the item was not ultimately enqueued
   into the pipeline.

   Setting this value also effectively sets
   :attr:`project.<pipeline>.debug` for affected queue items.

   This only applies to items that arrive at a pipeline via this
   particular trigger.  Since the output is very verbose and
   typically not needed or desired, this allows for a configuration
   where typical pipeline triggers omit the debug output, but
   triggers that match certain specific criteria may be used to
   request debug information.

Reporter Configuration
----------------------


@@ -57,3 +57,22 @@ Trigger Configuration
   newrev of all zeros specified.  The ``ignore-deletes`` field is a
   boolean value that describes whether or not these newrevs
   trigger ref-updated events.

.. attr:: debug
   :default: false

   When set to `true`, this will cause debug messages to be
   included when the queue item is reported.  These debug messages
   may be used to help diagnose why certain jobs did or did not
   run, and in many cases, why the item was not ultimately enqueued
   into the pipeline.

   Setting this value also effectively sets
   :attr:`project.<pipeline>.debug` for affected queue items.

   This only applies to items that arrive at a pipeline via this
   particular trigger.  Since the output is very verbose and
   typically not needed or desired, this allows for a configuration
   where typical pipeline triggers omit the debug output, but
   triggers that match certain specific criteria may be used to
   request debug information.


@@ -478,6 +478,25 @@ the following options.
   question.  It follows the same syntax as
   :ref:`github_requirements`.

.. attr:: debug
   :default: false

   When set to `true`, this will cause debug messages to be
   included when the queue item is reported.  These debug messages
   may be used to help diagnose why certain jobs did or did not
   run, and in many cases, why the item was not ultimately enqueued
   into the pipeline.

   Setting this value also effectively sets
   :attr:`project.<pipeline>.debug` for affected queue items.

   This only applies to items that arrive at a pipeline via this
   particular trigger.  Since the output is very verbose and
   typically not needed or desired, this allows for a configuration
   where typical pipeline triggers omit the debug output, but
   triggers that match certain specific criteria may be used to
   request debug information.

Reporter Configuration
----------------------


@@ -207,6 +207,25 @@ the following options.
   always sends full ref name, eg. ``refs/heads/bar`` and this
   string is matched against the regular expression.

.. attr:: debug
   :default: false

   When set to `true`, this will cause debug messages to be
   included when the queue item is reported.  These debug messages
   may be used to help diagnose why certain jobs did or did not
   run, and in many cases, why the item was not ultimately enqueued
   into the pipeline.

   Setting this value also effectively sets
   :attr:`project.<pipeline>.debug` for affected queue items.

   This only applies to items that arrive at a pipeline via this
   particular trigger.  Since the output is very verbose and
   typically not needed or desired, this allows for a configuration
   where typical pipeline triggers omit the debug output, but
   triggers that match certain specific criteria may be used to
   request debug information.

Reporter Configuration
----------------------


@@ -194,6 +194,26 @@ the following options.
   always sends full ref name, eg. ``refs/tags/bar`` and this
   string is matched against the regular expression.

.. attr:: debug
   :default: false

   When set to `true`, this will cause debug messages to be
   included when the queue item is reported.  These debug messages
   may be used to help diagnose why certain jobs did or did not
   run, and in many cases, why the item was not ultimately enqueued
   into the pipeline.

   Setting this value also effectively sets
   :attr:`project.<pipeline>.debug` for affected queue items.

   This only applies to items that arrive at a pipeline via this
   particular trigger.  Since the output is very verbose and
   typically not needed or desired, this allows for a configuration
   where typical pipeline triggers omit the debug output, but
   triggers that match certain specific criteria may be used to
   request debug information.

Reporter Configuration
----------------------
Zuul reports back to Pagure via Pagure API. Available reports include a PR


@@ -73,6 +73,25 @@ Zuul implements the timer using `apscheduler`_, Please check the
   Be aware the day-of-week value differs from cron.
   The first weekday is Monday (0), and the last is Sunday (6).

.. attr:: debug
   :default: false

   When set to `true`, this will cause debug messages to be
   included when the queue item is reported.  These debug messages
   may be used to help diagnose why certain jobs did or did not
   run, and in many cases, why the item was not ultimately enqueued
   into the pipeline.

   Setting this value also effectively sets
   :attr:`project.<pipeline>.debug` for affected queue items.

   This only applies to items that arrive at a pipeline via this
   particular trigger.  Since the output is very verbose and
   typically not needed or desired, this allows for a configuration
   where typical pipeline triggers omit the debug output, but
   triggers that match certain specific criteria may be used to
   request debug information.

.. _apscheduler: https://apscheduler.readthedocs.io/
.. _apscheduler documentation: https://apscheduler.readthedocs.io/en/3.x/modules/triggers/cron.html#module-apscheduler.triggers.cron


@@ -52,3 +52,22 @@ can simply be used by listing ``zuul`` as the trigger.
   Only available for ``parent-change-enqueued`` events.  This is
   the name of the pipeline in which the parent change was
   enqueued.

.. attr:: debug
   :default: false

   When set to `true`, this will cause debug messages to be
   included when the queue item is reported.  These debug messages
   may be used to help diagnose why certain jobs did or did not
   run, and in many cases, why the item was not ultimately enqueued
   into the pipeline.

   Setting this value also effectively sets
   :attr:`project.<pipeline>.debug` for affected queue items.

   This only applies to items that arrive at a pipeline via this
   particular trigger.  Since the output is very verbose and
   typically not needed or desired, this allows for a configuration
   where typical pipeline triggers omit the debug output, but
   triggers that match certain specific criteria may be used to
   request debug information.


@@ -0,0 +1,10 @@
---
features:
  - |
    Most pipeline triggers now accept a ``debug`` attribute which may
    be used to add a trigger to allow users to request extra debug
    information about why a change may or may not have been enqueued
    into a pipeline and why certain jobs ran or did not run.

    For example, a site may add a trigger that matches the comment
    "check debug" to request an expanded report on a change.


@@ -0,0 +1,60 @@
- pipeline:
    name: check
    manager: independent
    trigger:
      gerrit:
        - event: patchset-created
          debug: true
    success:
      gerrit:
        Verified: 1
    failure:
      gerrit:
        Verified: -1

- pipeline:
    name: gate
    manager: dependent
    success-message: Build succeeded (gate).
    require:
      gerrit:
        open: true
    trigger:
      gerrit:
        - event: comment-added
          debug: true
          approval:
            - Approved: 1
    success:
      gerrit:
        Verified: 2
        submit: true
    failure:
      gerrit:
        Verified: -2
    start:
      gerrit:
        Verified: 0
    precedence: high

- job:
    name: base
    parent: null
    run: playbooks/base.yaml
    nodeset:
      nodes:
        - label: ubuntu-xenial
          name: controller

- job:
    name: check-job
    run: playbooks/check.yaml

- project:
    name: org/project
    check:
      jobs:
        - check-job
    gate:
      jobs:
        - check-job


@@ -0,0 +1,54 @@
- pipeline:
    name: check
    description: Standard check
    manager: independent
    trigger:
      github:
        - event: pull_request
          action: opened
          debug: true
    start:
      github:
        status: pending
        comment: false
    success:
      github:
        status: success

- pipeline:
    name: gate
    manager: dependent
    trigger:
      github:
        - event: pull_request
          action: comment
          comment: merge me
          debug: true
    success:
      github:
        status: success
        merge: true
    failure:
      github: {}

- job:
    name: base
    parent: null
    run: playbooks/base.yaml
    nodeset:
      nodes:
        - label: ubuntu-xenial
          name: controller

- job:
    name: check-job
    run: playbooks/check.yaml

- project:
    name: org/project
    check:
      jobs:
        - check-job
    gate:
      jobs:
        - check-job


@@ -0,0 +1,52 @@
- pipeline:
    name: check
    manager: independent
    trigger:
      gitlab:
        - event: gl_merge_request
          action:
            - opened
          debug: true
    success:
      gitlab:
        comment: true

- pipeline:
    name: gate
    manager: dependent
    trigger:
      gitlab:
        - event: gl_merge_request
          action:
            - labeled
          labels:
            - gateit
          debug: true
    success:
      gitlab:
        merge: true
        comment: true
    failure:
      gitlab: {}

- job:
    name: base
    parent: null
    run: playbooks/base.yaml
    nodeset:
      nodes:
        - label: ubuntu-xenial
          name: controller

- job:
    name: check-job
    run: playbooks/check.yaml

- project:
    name: org/project
    check:
      jobs:
        - check-job
    gate:
      jobs:
        - check-job


@@ -0,0 +1,52 @@
- pipeline:
    name: check
    manager: independent
    trigger:
      pagure:
        - event: pg_pull_request
          action:
            - opened
            - changed
          debug: true
    success:
      pagure:
        comment: true

- pipeline:
    name: gate
    manager: dependent
    trigger:
      pagure:
        - event: pg_pull_request
          action: tagged
          tag:
            - gateit
          debug: true
    success:
      pagure:
        status: 'success'
        merge: true
    failure:
      pagure: {}

- job:
    name: base
    parent: null
    run: playbooks/base.yaml
    nodeset:
      nodes:
        - label: ubuntu-xenial
          name: controller

- job:
    name: check-job
    run: playbooks/check.yaml

- project:
    name: org/project
    check:
      jobs:
        - check-job
    gate:
      jobs:
        - check-job


@@ -489,6 +489,70 @@ class TestGerritWeb(ZuulTestCase):
        project = source.getProject('org/project')
        self.assertIsNotNone(source.getProjectBranchSha(project, 'master'))

    @simple_layout('layouts/gerrit-pipeline-trigger-debug.yaml')
    def test_gerrit_pipeline_trigger_debug(self):
        # Test that we get debug info for a pipeline debug trigger
        A = self.fake_gerrit.addFakeChange('org/project', 'master', 'A')
        self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
        self.waitUntilSettled()
        self.assertEqual(A.data['status'], 'NEW')
        self.assertEqual(A.reported, 1)
        self.assertIn('Debug information:',
                      A.messages[0])

        # Test that we get debug info for a pipeline debug trigger
        # when a change depends on an abandoned change
        B = self.fake_gerrit.addFakeChange('org/project', 'master', 'B')
        # We check approvals before state, so set those
        B.addApproval('Code-Review', 2)
        B.addApproval('Approved', 1)
        B.setAbandoned()
        A.setDependsOn(B, 1)
        A.addApproval('Code-Review', 2)
        self.fake_gerrit.addEvent(A.addApproval('Approved', 1))
        self.waitUntilSettled()
        self.assertEqual(A.data['status'], 'NEW')
        self.assertEqual(A.reported, 2)
        self.assertIn('does not match pipeline requirement',
                      A.messages[1])

        # Test that we get debug info for a pipeline debug trigger
        # when a change is missing a pipeline requirement
        self.fake_gerrit.addEvent(B.addApproval('Approved', 1))
        self.waitUntilSettled()
        self.assertEqual(B.data['status'], 'ABANDONED')
        self.assertEqual(B.reported, 1)
        self.assertIn('does not match pipeline requirement',
                      B.messages[0])

        # Test that we get debug info for a pipeline debug trigger
        # when a change is missing a merge requirement
        C = self.fake_gerrit.addFakeChange('org/project', 'master', 'C')
        self.fake_gerrit.addEvent(C.addApproval('Approved', 1))
        self.waitUntilSettled()
        self.assertEqual(C.data['status'], 'NEW')
        self.assertEqual(C.reported, 1)
        self.assertIn('can not be merged due to: missing labels:',
                      C.messages[0])

        # Test that we get debug info for a pipeline debug trigger
        # when a change depends on a change missing a merge requirement
        D = self.fake_gerrit.addFakeChange('org/project', 'master', 'D')
        D.setDependsOn(C, 1)
        D.addApproval('Code-Review', 2)
        self.fake_gerrit.addEvent(D.addApproval('Approved', 1))
        self.waitUntilSettled()
        self.assertEqual(D.data['status'], 'NEW')
        self.assertEqual(D.reported, 1)
        self.assertIn(
            'is needed but can not be merged due to: missing labels:',
            D.messages[0])


class TestFileComments(AnsibleZuulTestCase):
    config_file = 'zuul-gerrit-web.conf'


@@ -1633,6 +1633,30 @@ class TestGithubDriver(ZuulTestCase):
        project = source.getProject('org/project')
        self.assertIsNotNone(source.getProjectBranchSha(project, 'master'))

    @simple_layout('layouts/github-pipeline-trigger-debug.yaml',
                   driver='github')
    def test_github_pipeline_trigger_debug(self):
        # Test that we get debug info for a pipeline debug trigger
        A = self.fake_github.openFakePullRequest("org/project", "master", "A")
        self.fake_github.emitEvent(A.getPullRequestOpenedEvent())
        self.waitUntilSettled()
        self.assertFalse(A.is_merged)
        self.assertEqual(1, len(A.comments))
        self.assertIn('Debug information:',
                      A.comments[0])

        # Test that we get a reason from canMerge.
        B = self.fake_github.openFakePullRequest("org/project", "master", "B")
        B.draft = True
        self.fake_github.emitEvent(B.getCommentAddedEvent('merge me'))
        self.waitUntilSettled()
        self.assertFalse(B.is_merged)
        self.assertEqual(1, len(B.comments))
        self.assertIn("can not be merged due to: draft state",
                      B.comments[0])


class TestMultiGithubDriver(ZuulTestCase):
    config_file = 'zuul-multi-github.conf'


@@ -1031,6 +1031,31 @@ class TestGitlabDriver(ZuulTestCase):
        project = source.getProject('org/project')
        self.assertIsNotNone(source.getProjectBranchSha(project, 'master'))

    @simple_layout('layouts/gitlab-pipeline-trigger-debug.yaml',
                   driver='gitlab')
    def test_gitlab_pipeline_trigger_debug(self):
        # Test that we get debug info for a pipeline debug trigger
        A = self.fake_gitlab.openFakeMergeRequest("org/project", "master", "A")
        self.fake_gitlab.emitEvent(A.getMergeRequestOpenedEvent())
        self.waitUntilSettled()
        self.assertFalse(A.is_merged)
        self.assertEqual(1, len(A.notes))
        self.assertIn('Debug information:',
                      A.notes[0]['body'])

        # Test that we get a reason from canMerge.
        B = self.fake_gitlab.openFakeMergeRequest("org/project", "master", "B")
        B.blocking_discussions_resolved = False
        self.fake_gitlab.emitEvent(B.getMergeRequestLabeledEvent(
            add_labels=('gateit', )))
        self.waitUntilSettled()
        self.assertFalse(B.is_merged)
        self.assertEqual(1, len(B.notes))
        self.assertIn("can not be merged due to: blocking discussions",
                      B.notes[0]['body'])


class TestGitlabUnprotectedBranches(ZuulTestCase):
    config_file = 'zuul-gitlab-driver.conf'


@@ -695,6 +695,31 @@ class TestPagureDriver(ZuulTestCase):
        project = source.getProject('org/project')
        self.assertIsNotNone(source.getProjectBranchSha(project, 'master'))

    @simple_layout('layouts/pagure-pipeline-trigger-debug.yaml',
                   driver='pagure')
    def test_pagure_pipeline_trigger_debug(self):
        # Test that we get debug info for a pipeline debug trigger
        A = self.fake_pagure.openFakePullRequest("org/project", "master", "A")
        self.fake_pagure.emitEvent(A.getPullRequestOpenedEvent())
        self.waitUntilSettled()
        self.assertFalse(A.is_merged)
        self.assertEqual(1, len(A.comments))
        self.assertIn('Debug information:',
                      A.comments[0]['comment'])

        # Test the canMerge code path
        B = self.fake_pagure.openFakePullRequest("org/project", "master", "B")
        B.draft = True
        self.fake_pagure.emitEvent(
            B.getPullRequestTagAddedEvent(["gateit"]))
        self.waitUntilSettled()
        self.assertFalse(B.is_merged)
        self.assertEqual(2, len(B.comments))
        self.assertIn("can not be merged",
                      B.comments[1]['comment'])


class TestPagureToGerritCRD(ZuulTestCase):
    config_file = 'zuul-crd-pagure.conf'


@@ -61,7 +61,7 @@ from zuul.driver.gerrit.gerriteventgcloudpubsub import (
 from zuul.driver.git.gitwatcher import GitWatcher
 from zuul.lib import tracing
 from zuul.lib.logutil import get_annotated_logger
-from zuul.model import Ref, Tag, Branch, Project
+from zuul.model import Ref, Tag, Branch, Project, FalseWithReason
 from zuul.zk.branch_cache import BranchCache, BranchFlag, BranchInfo
 from zuul.zk.change_cache import (
     AbstractChangeCache,
@@ -1321,12 +1321,13 @@ class GerritConnection(ZKChangeCacheMixin, ZKBranchCacheMixin, BaseConnection):
             # means it's merged.
             return True
         if change.wip:
-            return False
+            self.log.debug("Unable to merge due to WIP")
+            return FalseWithReason("work in progress flag")
         missing_labels = change.missing_labels - set(allow_needs)
         if missing_labels:
             self.log.debug("Unable to merge due to "
                            "missing labels: %s", missing_labels)
-            return False
+            return FalseWithReason(f"missing labels: {missing_labels}")
         for sr in change.submit_requirements:
             if sr.get('status') == 'UNSATISFIED':
                 # Otherwise, we don't care and should skip.
@@ -1349,7 +1350,7 @@ class GerritConnection(ZKChangeCacheMixin, ZKBranchCacheMixin, BaseConnection):
             if not expr_contains_allow:
                 self.log.debug("Unable to merge due to "
                                "submit requirement: %s", sr)
-                return False
+                return FalseWithReason(f"submit requirement: {sr}")
         return True

     def getProjectOpenChanges(self, project: Project) -> List[GerritChange]:


@@ -401,9 +401,9 @@ class GerritEventFilter(EventFilter):
                  comments=[], emails=[], usernames=[], required_approvals=[],
                  reject_approvals=[], added=[], removed=[], uuid=None,
                  scheme=None, ignore_deletes=True, require=None, reject=None,
-                 parse_context=None):
+                 debug=None, parse_context=None):
-        EventFilter.__init__(self, connection_name, trigger)
+        EventFilter.__init__(self, connection_name, trigger, debug)

         # TODO: Backwards compat, remove after 9.x:
         if required_approvals and require is None:


@@ -117,6 +117,7 @@ class GerritTrigger(BaseTrigger):
                 ignore_deletes=ignore_deletes,
                 require=trigger.get('require'),
                 reject=trigger.get('reject'),
                 debug=trigger.get('debug'),
                 parse_context=parse_context,
             )
             efilters.append(f)
@@ -165,6 +166,7 @@ def getSchema():
         'removed': scalar_or_list(v.Any(ZUUL_REGEX, str)),
         'require': gerritsource.getRequireSchema(),
         'reject': gerritsource.getRejectSchema(),
         'debug': bool,
     }

     return gerrit_trigger


@@ -39,9 +39,9 @@ class GitTriggerEvent(TriggerEvent):
 class GitEventFilter(EventFilter):
     def __init__(self, connection_name, trigger, types=None, refs=None,
-                 ignore_deletes=True):
+                 ignore_deletes=True, debug=None):
-        super().__init__(connection_name, trigger)
+        super().__init__(connection_name, trigger, debug)

         refs = refs if refs is not None else []
         self._refs = [x.pattern for x in refs]


@@ -39,7 +39,8 @@ class GitTrigger(BaseTrigger):
                 types=to_list(trigger['event']),
                 refs=refs,
-                ignore_deletes=trigger.get(
-                    'ignore-deletes', True)
+                ignore_deletes=trigger.get(
+                    'ignore-deletes', True),
+                debug=trigger.get('debug'),
             )
             efilters.append(f)
@@ -52,6 +53,7 @@ def getSchema():
             scalar_or_list(v.Any('ref-updated')),
         'ref': scalar_or_list(v.Any(ZUUL_REGEX, str)),
         'ignore-deletes': bool,
+        'debug': bool,
     }

     return git_trigger


@@ -52,7 +52,7 @@ from zuul.lib import tracing
 from zuul.web.handler import BaseWebController
 from zuul.lib.logutil import get_annotated_logger
 from zuul import model
-from zuul.model import Ref, Branch, Tag, Project
+from zuul.model import Ref, Branch, Tag, Project, FalseWithReason
 from zuul.exceptions import MergeFailure
 from zuul.driver.github.githubmodel import PullRequest, GithubTriggerEvent
 from zuul.model import DequeueEvent
@@ -1961,31 +1961,32 @@ class GithubConnection(ZKChangeCacheMixin, ZKBranchCacheMixin, BaseConnection):
         # If the PR is a draft it cannot be merged.
         if change.draft:
             log.debug('Change %s can not merge because it is a draft', change)
-            return False
+            return FalseWithReason('draft state')

         if not change.mergeable:
             log.debug('Change %s can not merge because Github detected a '
                       'merge conflict', change)
-            return False
+            return FalseWithReason('Github detected a merge conflict')

         missing_status_checks = self._getMissingStatusChecks(
             change, allow_needs)
         if missing_status_checks:
             log.debug('Change %s can not merge because required status checks '
                       'are missing: %s', change, missing_status_checks)
-            return False
+            return FalseWithReason(
+                f'required status checks are missing {missing_status_checks}')

         if change.review_decision and change.review_decision != 'APPROVED':
             # If we got a review decision it must be approved
             log.debug('Change %s can not merge because it is not approved',
                       change)
-            return False
+            return FalseWithReason('approval state')

         if change.unresolved_conversations:
             log.debug('Change %s can not merge because '
                       'it has unresolved conversations',
                       change)
-            return False
+            return FalseWithReason('unresolved conversations')

         return True


@@ -185,9 +185,9 @@ class GithubEventFilter(EventFilter):
                  labels=[], unlabels=[], states=[], statuses=[],
                  required_statuses=[], check_runs=[],
                  ignore_deletes=True,
-                 require=None, reject=None):
+                 require=None, reject=None, debug=None):
-        EventFilter.__init__(self, connection_name, trigger)
+        EventFilter.__init__(self, connection_name, trigger, debug)

         # TODO: Backwards compat, remove after 9.x:
         if required_statuses and require is None:


@@ -127,6 +127,7 @@ class GithubTrigger(BaseTrigger):
                 required_statuses=to_list(trigger.get('require-status')),
                 require=trigger.get('require'),
                 reject=trigger.get('reject'),
                 debug=trigger.get('debug'),
             )
             efilters.append(f)
@@ -147,6 +148,7 @@ def getNewSchema():
                            'check_run'),
         'require': githubsource.getRequireSchema(),
         'reject': githubsource.getRejectSchema(),
         'debug': bool,
     })

     # Pull request
@@ -270,6 +272,7 @@ def getSchema():
         'reject': githubsource.getRejectSchema(),
         'status': scalar_or_list(str),
         'check': scalar_or_list(str),
         'debug': bool,
     })

     return old_schema


@@ -39,7 +39,7 @@ from zuul.lib.http import ZuulHTTPAdapter
 from zuul.lib.logutil import get_annotated_logger
 from zuul.lib.config import any_to_bool
 from zuul.exceptions import MergeFailure
-from zuul.model import Branch, Ref, Tag
+from zuul.model import Branch, Ref, Tag, FalseWithReason
 from zuul.driver.gitlab.gitlabmodel import GitlabTriggerEvent, MergeRequest
 from zuul.zk.branch_cache import BranchCache, BranchFlag, BranchInfo
 from zuul.zk.change_cache import (
@@ -814,12 +814,11 @@ class GitlabConnection(ZKChangeCacheMixin, ZKBranchCacheMixin, BaseConnection):
     def canMerge(self, change, allow_needs, event=None):
         log = get_annotated_logger(self.log, event)
-        can_merge = False
-        if (
-                change.merge_status == "can_be_merged" and
-                change.blocking_discussions_resolved
-        ):
-            can_merge = True
+        can_merge = True
+        if not change.blocking_discussions_resolved:
+            can_merge = FalseWithReason('blocking discussions not resolved')
+        elif change.merge_status != "can_be_merged":
+            can_merge = FalseWithReason('GitLab mergeability')

         log.info('Check MR %s#%s mergeability can_merge: %s'
                  ' (merge status: %s, blocking discussions resolved: %s)',


@@ -159,8 +159,8 @@ class GitlabEventFilter(EventFilter):
     def __init__(
             self, connection_name, trigger, types=None, actions=None,
             comments=None, refs=None, labels=None, unlabels=None,
-            ignore_deletes=True):
-        super().__init__(connection_name, trigger)
+            ignore_deletes=True, debug=None):
+        super().__init__(connection_name, trigger, debug)
         types = types if types is not None else []
         refs = refs if refs is not None else []


@@ -48,6 +48,7 @@ class GitlabTrigger(BaseTrigger):
                 refs=refs,
                 labels=to_list(trigger.get('labels')),
                 unlabels=to_list(trigger.get('unlabels')),
                 debug=trigger.get('debug'),
             )
             efilters.append(f)
         return efilters
@@ -69,5 +70,6 @@ def getSchema():
         'ref': scalar_or_list(v.Any(ZUUL_REGEX, str)),
         'labels': scalar_or_list(str),
         'unlabels': scalar_or_list(str),
         'debug': bool,
     }
     return gitlab_trigger


@@ -137,9 +137,9 @@ class PagureTriggerEvent(TriggerEvent):
 class PagureEventFilter(EventFilter):
     def __init__(self, connection_name, trigger, types=[], refs=[],
                  statuses=[], comments=[], actions=[], tags=[],
-                 ignore_deletes=True):
+                 ignore_deletes=True, debug=None):
-        EventFilter.__init__(self, connection_name, trigger)
+        EventFilter.__init__(self, connection_name, trigger, debug)

         self._types = [x.pattern for x in types]
         self._refs = [x.pattern for x in refs]


@@ -48,6 +48,7 @@ class PagureTrigger(BaseTrigger):
                 comments=comments,
                 statuses=to_list(trigger.get('status')),
                 tags=to_list(trigger.get('tag')),
+                debug=trigger.get('debug'),
             )
             efilters.append(f)
@@ -71,7 +72,8 @@ def getSchema():
         'ref': scalar_or_list(v.Any(ZUUL_REGEX, str)),
         'comment': scalar_or_list(v.Any(ZUUL_REGEX, str)),
         'status': scalar_or_list(str),
-        'tag': scalar_or_list(str)
+        'tag': scalar_or_list(str),
+        'debug': bool,
     }

     return pagure_trigger


@@ -18,8 +18,8 @@ from zuul.model import EventFilter, TriggerEvent
 class TimerEventFilter(EventFilter):
     def __init__(self, connection_name, trigger, types=[], timespecs=[],
-                 dereference=False):
-        EventFilter.__init__(self, connection_name, trigger)
+                 dereference=False, debug=None):
+        EventFilter.__init__(self, connection_name, trigger, debug)
         self._types = [x.pattern for x in types]
         self.types = types


@@ -35,7 +35,9 @@ class TimerTrigger(BaseTrigger):
                 types=types,
                 timespecs=to_list(trigger['time']),
                 dereference=trigger.get('dereference', False),
                 debug=trigger.get('debug'),
             )
             efilters.append(f)
         return efilters
@@ -45,5 +47,6 @@ def getSchema():
     timer_trigger = {
         v.Required('time'): str,
         'dereference': bool,
         'debug': bool,
     }
     return timer_trigger


@@ -16,8 +16,9 @@ from zuul.model import EventFilter, TriggerEvent
 class ZuulEventFilter(EventFilter):
-    def __init__(self, connection_name, trigger, types=[], pipelines=[]):
-        EventFilter.__init__(self, connection_name, trigger)
+    def __init__(self, connection_name, trigger, types=[], pipelines=[],
+                 debug=None):
+        EventFilter.__init__(self, connection_name, trigger, debug)
         self._types = [x.pattern for x in types]
         self._pipelines = [x.pattern for x in pipelines]


@@ -45,6 +45,7 @@ class ZuulTrigger(BaseTrigger):
                 trigger=self,
                 types=types,
                 pipelines=pipelines,
                 debug=trigger.get('debug'),
             )
             efilters.append(f)
@@ -61,6 +62,7 @@ def getSchema():
             'image-validate',
         )),
         'pipeline': scalar_or_list(v.Any(ZUUL_REGEX, str)),
         'debug': bool,
     }
     return zuul_trigger


@@ -66,6 +66,9 @@ class StaticChangeQueueContextManager(object):
         pass


+EventMatchInfo = collections.namedtuple('EventMatchInfo', ['debug'])
+
+
 class PipelineManager(metaclass=ABCMeta):
     """Abstract Base Class for enqueing and processing Changes in a Pipeline"""
@@ -199,12 +202,16 @@ class PipelineManager(metaclass=ABCMeta):
         return allow_needs

     def eventMatches(self, event, change):
+        # Return False if the event does not match, return a
+        # EventMatchInfo (which will eval to True) if it does.  The
+        # EventMatchInfo further indicates whether pipeline debugging
+        # should be enabled.
         log = get_annotated_logger(self.log, event)
         if event.forced_pipeline:
             if event.forced_pipeline == self.pipeline.name:
                 log.debug("Event %s for change %s was directly assigned "
                           "to pipeline %s" % (event, change, self))
-                return True
+                return EventMatchInfo(debug=False)
             else:
                 return False
         for ef in self.pipeline.event_filters:
@@ -212,7 +219,7 @@ class PipelineManager(metaclass=ABCMeta):
             if match_result:
                 log.debug("Event %s for change %s matched %s "
                           "in pipeline %s" % (event, change, ef, self))
-                return True
+                return EventMatchInfo(debug=ef.debug)
             else:
                 log.debug("Event %s for change %s does not match %s "
                           "in pipeline %s because %s" % (
@@ -436,12 +443,13 @@ class PipelineManager(metaclass=ABCMeta):
                 report_errors.append(str(e))
         return report_errors

-    def areChangesReadyToBeEnqueued(self, changes, event):
+    def areChangesReadyToBeEnqueued(self, changes, event,
+                                    warnings=None, debug=False):
         return True

     def enqueueChangesAhead(self, change, event, quiet, ignore_requirements,
                             change_queue, history=None, dependency_graph=None,
-                            warnings=None):
+                            warnings=None, debug=False):
         return True

     def enqueueChangesBehind(self, change, event, quiet, ignore_requirements,
@@ -660,7 +668,8 @@ class PipelineManager(metaclass=ABCMeta):
     def addChange(self, change, event, quiet=False, enqueue_time=None,
                   ignore_requirements=False, live=True,
                   change_queue=None, history=None, dependency_graph=None,
-                  skip_presence_check=False, warnings=None):
+                  skip_presence_check=False, warnings=None,
+                  debug=False):
         log = get_annotated_logger(self.log, event)
         log.debug("Considering adding change %s" % change)
@@ -720,14 +729,28 @@ class PipelineManager(metaclass=ABCMeta):
                     continue
                 match_result = f.matches(cycle_change)
                 if not match_result:
-                    log.debug("Change %s does not match pipeline "
-                              "requirement %s because %s",
-                              cycle_change, f, str(match_result))
+                    msg = (
+                        f"Change {cycle_change} "
+                        "does not match pipeline "
+                        f"requirement {f} because {match_result}"
+                    )
+                    log.debug(msg)
+                    if debug:
+                        warnings.append(msg)
+                        if not history:
+                            self._reportNonEnqueuedItem(
+                                change_queue, change, event, warnings)
                     return False

-        if not self.areChangesReadyToBeEnqueued(cycle, event):
+        if not self.areChangesReadyToBeEnqueued(
+                cycle, event,
+                warnings=warnings, debug=debug):
             log.debug("Cycle %s is not ready to be enqueued, ignoring" %
                       cycle)
+            if warnings:
+                if not history:
+                    self._reportNonEnqueuedItem(change_queue, change,
+                                                event, warnings)
             return False

         if len(cycle) > 1:
@@ -764,7 +787,7 @@ class PipelineManager(metaclass=ABCMeta):
                                          ignore_requirements,
                                          change_queue, history=history,
                                          dependency_graph=dependency_graph,
-                                         warnings=warnings):
+                                         warnings=warnings, debug=debug):
                 log.debug("Failed to enqueue changes ahead of %s" % change)
                 if warnings:
                     self._reportNonEnqueuedItem(change_queue, change,
@@ -793,7 +816,8 @@ class PipelineManager(metaclass=ABCMeta):
             item = change_queue.enqueueChanges(cycle, event,
                                                span_info=span_info,
-                                               enqueue_time=enqueue_time)
+                                               enqueue_time=enqueue_time,
+                                               debug=debug)
             with item.activeContext(self.current_context):
                 if enqueue_time:

@ -48,13 +48,24 @@ class DependentPipelineManager(SharedQueuePipelineManager):
def getNodePriority(self, item, change):
return item.queue.queue.index(item)
def areChangesReadyToBeEnqueued(self, changes, event):
def areChangesReadyToBeEnqueued(self, changes, event,
warnings=None, debug=False):
log = get_annotated_logger(self.log, event)
for change in changes:
source = change.project.source
if not source.canMerge(change, self.getSubmitAllowNeeds(),
event=event):
log.debug("Change %s can not merge", change)
can_merge = source.canMerge(change, self.getSubmitAllowNeeds(),
event=event)
if not can_merge:
msg = (
f"Change {change._id()} "
f"in project {change.project} "
"can not be merged"
)
if isinstance(can_merge, model.FalseWithReason):
msg += f" due to: {can_merge}"
log.debug(" " + msg)
if debug and warnings is not None:
warnings.append(msg)
return False
return True
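The `can_merge` result checked above is no longer a plain bool: per the commit message, `canMerge` may now return a `FalseWithReason` so the reason lands in the report. A minimal self-contained sketch of that pattern (the `wip` field and `can_merge` helper are illustrative assumptions, not Zuul's connection API):

```python
class FalseWithReason:
    """Falsy in boolean context, but carries a human-readable reason."""

    def __init__(self, reason):
        self.reason = reason

    def __bool__(self):
        return False

    def __str__(self):
        return self.reason


def can_merge(change):
    # A connection can now return a reason instead of a bare False.
    if change.get("wip"):
        return FalseWithReason("change is marked work-in-progress")
    return True


result = can_merge({"wip": True})
msg = "Change can not be merged"
if isinstance(result, FalseWithReason):
    msg += f" due to: {result}"
```

Because the object is falsy, existing `if not source.canMerge(...)` call sites keep working unchanged; only callers that want the reason need the `isinstance` check.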
@@ -146,7 +157,7 @@ class DependentPipelineManager(SharedQueuePipelineManager):
def enqueueChangesAhead(self, changes, event, quiet, ignore_requirements,
change_queue, history=None, dependency_graph=None,
warnings=None):
warnings=None, debug=False):
log = get_annotated_logger(self.log, event)
history = history if history is not None else []
@@ -160,7 +171,7 @@ class DependentPipelineManager(SharedQueuePipelineManager):
abort, needed_changes = self.getMissingNeededChanges(
changes, change_queue, event,
dependency_graph=dependency_graph,
warnings=warnings)
warnings=warnings, debug=debug)
if abort:
return False
@@ -178,14 +189,14 @@ class DependentPipelineManager(SharedQueuePipelineManager):
ignore_requirements=ignore_requirements,
change_queue=change_queue, history=history,
dependency_graph=dependency_graph,
warnings=warnings)
warnings=warnings, debug=debug)
if not r:
return False
return True
def getMissingNeededChanges(self, changes, change_queue, event,
dependency_graph=None, warnings=None,
item=None):
item=None, debug=False):
log = get_annotated_logger(self.log, event)
changes_needed = []
abort = False
@@ -215,7 +226,7 @@ class DependentPipelineManager(SharedQueuePipelineManager):
msg = ("Change %s in project %s does not "
"share a change queue with %s "
"in project %s" %
(needed_change.number,
(needed_change._id(),
needed_change.project,
change.number,
change.project))
@@ -225,9 +236,15 @@ class DependentPipelineManager(SharedQueuePipelineManager):
changes_needed.append(needed_change)
abort = True
if not needed_change.is_current_patchset:
log.debug(" Needed change is not "
"the current patchset")
msg = (
f"Needed change {needed_change._id()} "
f"in project {needed_change.project} is not "
"the current patchset"
)
log.debug(" " + msg)
changes_needed.append(needed_change)
if debug and warnings is not None:
warnings.append(msg)
abort = True
if needed_change in changes:
log.debug(" Needed change is in cycle")
@@ -237,18 +254,30 @@ class DependentPipelineManager(SharedQueuePipelineManager):
log.debug(" Needed change is already "
"ahead in the queue")
continue
if needed_change.project.source.canMerge(
needed_change, self.getSubmitAllowNeeds(),
event=event):
log.debug(" Change %s is needed", needed_change)
can_merge = needed_change.project.source.canMerge(
needed_change, self.getSubmitAllowNeeds(),
event=event)
if can_merge:
msg = (
f"Change {needed_change._id()} "
f"in project {needed_change.project} is needed "
)
log.debug(" " + msg)
if needed_change not in changes_needed:
changes_needed.append(needed_change)
continue
else:
# The needed change can't be merged.
log.debug(" Change %s is needed "
"but can not be merged",
needed_change)
msg = (
f"Change {needed_change._id()} "
f"in project {needed_change.project} is needed "
"but can not be merged"
)
if isinstance(can_merge, model.FalseWithReason):
msg += f" due to: {can_merge}"
log.debug(" " + msg)
if debug and warnings is not None:
warnings.append(msg)
changes_needed.append(needed_change)
abort = True
return abort, changes_needed
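The recurring guard in this hunk, `if debug and warnings is not None: warnings.append(msg)`, lets one code path serve three callers: no debug, debug without a warnings list, and debug with one. A small standalone sketch of that accumulation pattern (data shapes are assumed for illustration):

```python
def check_needed(needed_changes, warnings=None, debug=False):
    """Return True if all needed changes are current; optionally record why not."""
    ok = True
    for change in needed_changes:
        if not change.get("current_patchset", True):
            msg = (f"Needed change {change['id']} is not "
                   "the current patchset")
            # Mirrors the diff: only collect when the trigger asked for
            # debug AND the caller supplied a list to collect into.
            if debug and warnings is not None:
                warnings.append(msg)
            ok = False
    return ok


warnings = []
ok = check_needed([{"id": "42,1", "current_patchset": False}],
                  warnings=warnings, debug=True)
```

The `warnings` list is then what `_reportNonEnqueuedItem` would hand to the reporter, so the user's "why didn't this enqueue?" question is answered in the review comment.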


@@ -42,7 +42,7 @@ class IndependentPipelineManager(PipelineManager):
def enqueueChangesAhead(self, changes, event, quiet, ignore_requirements,
change_queue, history=None, dependency_graph=None,
warnings=None):
warnings=None, debug=False):
log = get_annotated_logger(self.log, event)
history = history if history is not None else []
@@ -55,7 +55,8 @@ class IndependentPipelineManager(PipelineManager):
abort, needed_changes = self.getMissingNeededChanges(
changes, change_queue, event,
dependency_graph=dependency_graph)
dependency_graph=dependency_graph,
warnings=warnings, debug=debug)
if abort:
return False
@@ -82,7 +83,8 @@ class IndependentPipelineManager(PipelineManager):
return True
def getMissingNeededChanges(self, changes, change_queue, event,
dependency_graph=None, item=None):
dependency_graph=None, item=None,
warnings=None, debug=False):
log = get_annotated_logger(self.log, event)
if self.pipeline.ignore_dependencies:


@@ -1225,7 +1225,7 @@ class ChangeQueue(zkobject.ZKObject):
return (project_cname, branch) in self.project_branches
def enqueueChanges(self, changes, event, span_info=None,
enqueue_time=None):
enqueue_time=None, debug=False):
if enqueue_time is None:
enqueue_time = time.time()
@@ -1257,7 +1257,8 @@ class ChangeQueue(zkobject.ZKObject):
changes=changes,
event=event_info,
span_info=span_info,
enqueue_time=enqueue_time)
enqueue_time=enqueue_time,
debug=debug)
self.enqueueItem(item)
return item
@@ -6386,7 +6387,7 @@ class QueueItem(zkobject.ZKObject):
layout_uuid=None,
_cached_sql_results={},
event=None, # Info about the event that lead to this queue item
debug=False,
# Additional container for connection specifig information to be
# used by reporters throughout the lifecycle
dynamic_state=defaultdict(dict),
@@ -6467,6 +6468,7 @@ class QueueItem(zkobject.ZKObject):
},
"dynamic_state": self.dynamic_state,
"first_job_start_time": self.first_job_start_time,
"debug": self.debug,
}
return json.dumps(data, sort_keys=True).encode("utf8")
@@ -8678,10 +8680,11 @@ class BaseFilter(ConfigObject):
class EventFilter(BaseFilter):
"""Allows a Pipeline to only respond to certain events."""
def __init__(self, connection_name, trigger):
def __init__(self, connection_name, trigger, debug=None):
super(EventFilter, self).__init__()
self.connection_name = connection_name
self.trigger = trigger
self.debug = bool(debug)
def matches(self, event, ref):
# TODO(jeblair): consider removing ref argument
@@ -10138,7 +10141,7 @@ class Layout(object):
"""Find or create actual matching jobs for this item's change and
store the resulting job tree."""
enable_debug = False
enable_debug = item.debug
fail_fast = item.current_build_set.fail_fast
debug_messages = []
if old:
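The one-line change above (`enable_debug = item.debug` instead of a hardcoded `False`) means job-selection decisions are recorded only for items enqueued by a debug trigger. A rough sketch of that gating, with an invented `freeze_jobs` helper and simplified job dicts standing in for the real job-graph code:

```python
def freeze_jobs(jobs, item_debug):
    """Select jobs; collect per-job decisions only when the item has debug set."""
    debug_messages = []
    selected = []
    for job in jobs:
        matches = job.get("branch_matches", True)
        if item_debug:
            debug_messages.append(
                f"Job {job['name']} "
                f"{'matched' if matches else 'did not match'}")
        if matches:
            selected.append(job["name"])
    return selected, debug_messages


selected, msgs = freeze_jobs(
    [{"name": "a"}, {"name": "b", "branch_matches": False}], True)
```

Keeping the messages off by default avoids paying the bookkeeping cost on every ordinary item.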


@@ -2715,8 +2715,8 @@ class Scheduler(threading.Thread):
if event.isPatchsetCreated() or event.isMessageChanged():
manager.refreshDeps(change, event)
if manager.eventMatches(event, change):
manager.addChange(change, event)
if match_info := manager.eventMatches(event, change):
manager.addChange(change, event, debug=match_info.debug)
def process_tenant_management_queue(self, tenant):
try:
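The scheduler hunk above switches `eventMatches` from returning a bare truthy value to a match object whose `debug` attribute is forwarded to `addChange`, bound with the walrus operator. A minimal sketch of that shape; `MatchInfo` and `event_matches` here are illustrative assumptions, not Zuul's actual classes:

```python
class MatchInfo:
    """Truthy match result that also carries the trigger's debug flag."""

    def __init__(self, debug=False):
        self.debug = debug

    def __bool__(self):
        return True


def event_matches(event, trigger_debug=False):
    # Return a match object on success, None otherwise, so callers can
    # still write `if event_matches(...)` as before.
    if event.get("type") == "comment-added":
        return MatchInfo(debug=trigger_debug)
    return None


results = []
if match_info := event_matches({"type": "comment-added"}, trigger_debug=True):
    results.append(("add_change", match_info.debug))
```

This keeps the boolean call sites compatible while letting the one caller that cares pull extra data off the match, which is why the diff only touches the scheduler's `if` line.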