Merge branch 'master' into v3_merge

Includes minor py3 fixes (for pep8 on py3).

 Conflicts:
	tests/base.py
	tests/test_model.py
	tests/test_scheduler.py
	tox.ini
	zuul/model.py
	zuul/reporter/__init__.py
	zuul/scheduler.py
	zuul/source/gerrit.py

Change-Id: I99daf9acd746767967b42396881a2dff82134a07
Joshua Hesketh 2016-07-14 00:12:25 +10:00
commit 0aa7e8bdbf
60 changed files with 3217 additions and 206 deletions

View File

@ -1,4 +1,4 @@
[DEFAULT]
test_command=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} OS_LOG_CAPTURE=${OS_LOG_CAPTURE:-1} ${PYTHON:-python} -m subunit.run discover -t ./ tests $LISTOPT $IDOPTION
test_command=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} OS_LOG_CAPTURE=${OS_LOG_CAPTURE:-1} OS_LOG_DEFAULTS=${OS_LOG_DEFAULTS:-""} ${PYTHON:-python} -m subunit.run discover -t ./ tests $LISTOPT $IDOPTION
test_id_option=--load-list $IDFILE
test_list_option=--list

View File

@ -13,6 +13,7 @@ Contents:
.. toctree::
:maxdepth: 2
quick-start
gating
connections
triggers

View File

@ -6,7 +6,7 @@
https://wiki.jenkins-ci.org/display/JENKINS/Gearman+Plugin
.. _`Turbo-Hipster`:
http://git.openstack.org/cgit/stackforge/turbo-hipster/
https://git.openstack.org/cgit/openstack/turbo-hipster/
.. _`Turbo-Hipster Documentation`:
http://turbo-hipster.rtfd.org/

View File

@ -58,3 +58,17 @@ instance, a clone will produce a repository in an unpredictable state
depending on what the state of Zuul's repository is when the clone
happens). They are, however, suitable for automated systems that
respond to Zuul triggers.
Clearing old references
~~~~~~~~~~~~~~~~~~~~~~~
The references created under refs/zuul are not garbage collected. Since
git fetch sends them all to Gerrit when syncing the repositories, the time
spent on merges will slowly grow over time and eventually become noticeable.
To clean them up you can use the ``tools/zuul-clear-refs.py`` script on
each repository. It will delete Zuul references that point to commits
whose commit date is older than a given number of days (default
360)::
./tools/zuul-clear-refs.py /path/to/zuul/git/repo
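Not part of Zuul itself, but as a minimal sketch the cleanup could be looped
over every repository the merger manages. The ``git_dir`` location, the
180 day threshold and the relative script path (run from a Zuul source
checkout) below are assumptions; ``--until`` is the script's own option::

    import os
    import subprocess

    GIT_ROOT = '/var/lib/zuul/git'   # assumed merger git_dir

    for root, dirs, files in os.walk(GIT_ROOT):
        if '.git' not in dirs:
            continue
        dirs[:] = []  # this directory is a repository; do not descend further
        subprocess.check_call(
            ['./tools/zuul-clear-refs.py', '--until', '180', root])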

doc/source/quick-start.rst (new file, 162 lines)
View File

@ -0,0 +1,162 @@
Quick Start Guide
=================
System Requirements
-------------------
For most deployments zuul only needs 1-2GB. OpenStack uses a 30GB setup.
Install Zuul
------------
You can get zuul from pypi via::
pip install zuul
Zuul Components
---------------
Zuul provides the following components:
- **zuul-server**: scheduler daemon which communicates with Gerrit and
Gearman. Handles receiving events, launching jobs, collecting results
and posting reports.
- **zuul-merger**: speculative-merger which communicates with Gearman.
Prepares Git repositories for jobs to test against. This additionally
requires a web server hosting the Git repositories which can be cloned
by the jobs.
- **zuul-cloner**: client-side script used to set up the job workspace. It is
used to clone the repositories prepared by the zuul-merger described
previously.
- **gearmand**: optional builtin gearman daemon provided by zuul-server
External components:
- Jenkins Gearman plugin: Used by Jenkins to connect to Gearman
Zuul Communication
------------------
All the Zuul components communicate with each other using Gearman, as well as
over the following communication channels:
zuul-server:
- Gerrit
- Gearman Daemon
zuul-merger:
- Gerrit
- Gearman Daemon
zuul-cloner:
- http hosted zuul-merger git repos
Jenkins:
- Gearman Daemon via Jenkins Gearman Plugin
Zuul Setup
----------
At minimum we need to provide **zuul.conf** and **layout.yaml**, placed in
the /etc/zuul/ directory. You will also need a zuul user and an SSH key for
the zuul user in Gerrit. The following example uses the builtin gearmand
service in Zuul.
**zuul.conf**::
[zuul]
layout_config=/etc/zuul/layout.yaml
[merger]
git_dir=/git
zuul_url=http://zuul.example.com/p
[gearman_server]
start=true
[gearman]
server=127.0.0.1
[connection gerrit]
driver=gerrit
server=git.example.com
port=29418
baseurl=https://git.example.com/gerrit/
user=zuul
sshkey=/home/zuul/.ssh/id_rsa
See :doc:`zuul` for more details.
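As a quick sanity check (an illustrative sketch only, not something Zuul
ships), the file above can be parsed with the stdlib config parser::

    from six.moves import configparser

    config = configparser.ConfigParser()
    config.read('/etc/zuul/zuul.conf')

    # The sections used in the example above.
    for section in ('zuul', 'merger', 'gearman_server', 'gearman',
                    'connection gerrit'):
        assert config.has_section(section), 'missing [%s] section' % section

    print(config.get('zuul', 'layout_config'))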
The following sets up a basic timer triggered job using zuul.
**layout.yaml**::
pipelines:
- name: periodic
source: gerrit
manager: IndependentPipelineManager
trigger:
timer:
- time: '0 * * * *'
projects:
- name: aproject
periodic:
- aproject-periodic-build
Starting Zuul
-------------
You can run zuul-server with the **-d** option to make it not daemonize. It's
a good idea at first to confirm there are no issues with your configuration.
Simply run::
zuul-server
Once run you should have 2 zuul-server processes::
zuul 12102 1 0 Jan21 ? 00:15:45 /home/zuul/zuulvenv/bin/python /home/zuul/zuulvenv/bin/zuul-server -d
zuul 12107 12102 0 Jan21 ? 00:00:01 /home/zuul/zuulvenv/bin/python /home/zuul/zuulvenv/bin/zuul-server -d
Note: In this example zuul was installed in a virtualenv.
The second zuul-server process is the gearmand server, if you are using the
builtin gearmand; otherwise there will only be one process.
Zuul won't actually process your job queue, however, unless you also have a
zuul-merger process running.
Simply run::
zuul-merger
Zuul should now be able to process your periodic job as configured above once
the Jenkins side of things is configured.
Jenkins Setup
-------------
Install the Jenkins Gearman Plugin via Jenkins Plugin management interface.
Then navigate to **Manage > Configuration > Gearman** and set up the
hostname/IP and port Jenkins should use to connect to Gearman.
At this point gearman should be running your Jenkins jobs.
Troubleshooting
---------------
To check Gearman function registration (jobs), you can use telnet to connect
to Gearman and verify that Jenkins is registering your configured jobs::
telnet <gearman_ip> 4730
Useful commands are **workers** and **status** which you can run by just
typing those commands once connected to gearman. Every job in your Jenkins
master must appear when you run **workers** for Zuul to be able to run jobs
against your Jenkins instance.
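The same checks can be scripted. Below is a minimal sketch that speaks the
Gearman text admin protocol directly (assuming the default admin port 4730
used above)::

    import socket

    def gearman_admin(command, host='127.0.0.1', port=4730):
        """Send a Gearman admin command ('status' or 'workers') and return the reply."""
        sock = socket.create_connection((host, port))
        try:
            sock.sendall((command + '\n').encode('ascii'))
            response = b''
            while not response.endswith(b'.\n'):  # admin replies end with a lone '.'
                chunk = sock.recv(4096)
                if not chunk:
                    break
                response += chunk
            return response.decode('ascii')
        finally:
            sock.close()

    print(gearman_admin('status'))   # registered functions with queued/running counts
    print(gearman_admin('workers'))  # every connected worker and its functions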

View File

@ -31,7 +31,7 @@ Metrics
The metrics are emitted by the Zuul scheduler (`zuul/scheduler.py`):
**gerrit.events.<type> (counters)**
**gerrit.event.<type> (counters)**
Gerrit emits different kinds of messages over its `stream-events` interface. As
a convenience, Zuul emits metrics to statsd which saves you from having to use
a different daemon to measure Gerrit events.
@ -52,6 +52,18 @@ The metrics are emitted by the Zuul scheduler (`zuul/scheduler.py`):
Refer to your Gerrit installation documentation for an exhaustive list of
Gerrit event types.
**zuul.node_type.**
Holds metrics specific to build nodes per label. The hierarchy is:
#. **<build node label>** each of the labels associated to a build in
Jenkins. It contains:
#. **job.<jobname>** subtree detailing per job statistics:
#. **wait_time** counter and timer of the wait time, i.e. the
difference between the job start time and the launch time, in
milliseconds.
**zuul.pipeline.**
Holds metrics specific to jobs. The hierarchy is:
@ -75,10 +87,13 @@ The metrics are emitted by the Zuul scheduler (`zuul/scheduler.py`):
known by Zuul (which includes build time and Zuul overhead).
#. **total_changes** counter of the number of changes processed since
Zuul started.
#. **wait_time** counter and timer of the wait time, i.e. the difference
between the job start time and the launch time, in milliseconds.
Additionally, the `zuul.pipeline.<pipeline name>` hierarchy contains
`current_changes` and `resident_time` metrics for each projects. The slash
separator used in Gerrit name being replaced by dots.
`current_changes` (gauge), `resident_time` (timing) and `total_changes`
(counter) metrics for each project. The slash separator used in Gerrit
project names is replaced by dots.
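Purely as an illustration of the naming (the real emission happens in
`zuul/scheduler.py`), a per-project key is built roughly like this::

    def project_metric_key(pipeline, project, metric):
        # 'openstack/nova' in the 'gate' pipeline ->
        # zuul.pipeline.gate.openstack.nova.resident_time
        return 'zuul.pipeline.%s.%s.%s' % (
            pipeline, project.replace('/', '.'), metric)

    print(project_metric_key('gate', 'openstack/nova', 'resident_time'))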
As an example, given a job named `myjob` triggered by the `gate` pipeline
which took 40 seconds to build, the Zuul scheduler will emit the following

View File

@ -10,11 +10,11 @@ Zuul has three configuration files:
**zuul.conf**
Connection information for Gerrit and Gearman, locations of the
other config files.
other config files. (required)
**layout.yaml**
Project and pipeline configuration -- what Zuul does.
Project and pipeline configuration -- what Zuul does. (required)
**logging.conf**
Python logging config.
Python logging config. (optional)
Examples of each of the three files can be found in the etc/ directory
of the source distribution.
@ -41,17 +41,28 @@ You can also find an example zuul.conf file in the git
gearman
"""""""
Client connection information for gearman. If using Zuul's builtin gearmand
server just set **server** to 127.0.0.1.
**server**
Hostname or IP address of the Gearman server.
``server=gearman.example.com``
``server=gearman.example.com`` (required)
**port**
Port on which the Gearman server is listening.
``port=4730``
``port=4730`` (optional)
**check_job_registration**
Check to see if the job is registered with Gearman or not. When True,
a build result of NOT_REGISTERED will be returned if the job is not found.
``check_job_registration=True``
gearman_server
""""""""""""""
The builtin gearman server. Zuul can fork a gearman process from itself rather
than connecting to an external one.
**start**
Whether to start the internal Gearman server (default: False).
``start=true``
@ -64,9 +75,25 @@ gearman_server
Path to log config file for internal Gearman server.
``log_config=/etc/zuul/gearman-logging.yaml``
webapp
""""""
**listen_address**
IP address or domain name on which to listen (default: 0.0.0.0).
``listen_address=127.0.0.1``
**port**
Port on which the webapp is listening (default: 8001).
``port=8008``
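To quickly verify the webapp is answering, here is a small sketch mirroring
what the test suite does (the address and port below assume the defaults;
adjust them to your **listen_address** and **port** values)::

    import json
    from six.moves import urllib

    req = urllib.request.Request('http://127.0.0.1:8001/status.json')
    data = json.loads(urllib.request.urlopen(req).read().decode('utf-8'))
    for pipeline in data['pipelines']:
        print(pipeline['name'])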
zuul
""""
Zuul's main configuration section. At minimum zuul must be able to find
layout.yaml to be useful.
.. note:: Must be provided when running zuul-server
.. _layout_config:
**layout_config**
@ -118,6 +145,13 @@ zuul
merger
""""""
The zuul-merger process configuration. Detailed documentation on this process
can be found on the :doc:`merger` page.
.. note:: Must be provided when running zuul-merger. Both services may share the
same configuration (and even host) or otherwise have an individual
zuul.conf.
**git_dir**
Directory that Zuul should clone local git repositories to.
``git_dir=/var/lib/zuul/git``
@ -394,11 +428,12 @@ explanation of each of the parameters::
approval matching all specified requirements.
*username*
If present, an approval from this username is required.
If present, an approval from this username is required. It is
treated as a regular expression.
*email*
If present, an approval with this email address is required. It
is treated as a regular expression as above.
is treated as a regular expression.
*email-filter* (deprecated)
A deprecated alternate spelling of *email*. Only one of *email* or
@ -759,7 +794,10 @@ each job as it builds a list from the project specification.
expressions.
The pattern for '/COMMIT_MSG' is always matched on and does not
have to be included.
have to be included. The exception is merge commits (without modified
files); in this case '/COMMIT_MSG' is not matched, and the job is not
skipped. For merge commits it is assumed that the list of modified
files isn't predictable and CI should be run.
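A rough sketch of that matching rule (the real implementation lives in
zuul/change_matcher.py; the function name here is illustrative)::

    import re

    def all_files_match_any(changed_files, patterns):
        commit_msg = re.compile('^/COMMIT_MSG$')
        if not changed_files:
            return False
        if len(changed_files) == 1 and commit_msg.match(changed_files[0]):
            # A merge commit reports only /COMMIT_MSG, so the job is not skipped.
            return False
        regexes = [re.compile(p) for p in patterns] + [commit_msg]
        return all(any(r.match(f) for r in regexes) for f in changed_files)

    # Mirrors the behaviour exercised by the updated tests/test_change_matcher.py:
    assert all_files_match_any(['/COMMIT_MSG', 'docs/foo'], ['^docs/'])
    assert not all_files_match_any(['/COMMIT_MSG', 'docs/foo', 'foo/bar'], ['^docs/'])
    assert not all_files_match_any(['/COMMIT_MSG'], ['^docs/'])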
**voting (optional)**
Boolean value (``true`` or ``false``) that indicates whether
@ -997,9 +1035,8 @@ normal operation, omit ``-d`` and let Zuul run as a daemon.
If you send signal 1 (SIGHUP) to the zuul-server process, Zuul will
stop executing new jobs, wait until all executing jobs are finished,
reload its configuration, and resume. Any values in any of the
configuration files may be changed, except the location of Zuul's PID
file (a change to that will be ignored until Zuul is restarted).
reload its layout.yaml, and resume. Changes to any connections or
the PID file will be ignored until Zuul is restarted.
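For example, a reload can be triggered like this (sketch only; the pid file
path is an assumption and depends on your **pidfile** setting)::

    import os
    import signal

    # Assumed pid file location; match it to the pidfile option in zuul.conf.
    with open('/var/run/zuul/zuul.pid') as f:
        pid = int(f.read().strip())

    os.kill(pid, signal.SIGHUP)  # reload layout.yaml without aborting running jobs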
If you send a SIGUSR1 to the zuul-server process, Zuul will stop
executing new jobs, wait until all executing jobs are finished,

View File

@ -490,10 +490,12 @@
$header_div.append($heading);
if (typeof pipeline.description === 'string') {
var descr = $('<small />')
$.each( pipeline.description.split(/\r?\n\r?\n/), function(index, descr_part){
descr.append($('<p />').text(descr_part));
});
$header_div.append(
$('<p />').append(
$('<small />').text(pipeline.description)
)
$('<p />').append(descr)
);
}
return $header_div;

View File

@ -26,6 +26,10 @@ default_container=logs
region_name=EXP
logserver_prefix=http://logs.example.org/server.app/
[webapp]
listen_address=0.0.0.0
port=8001
[connection gerrit]
driver=gerrit
server=review.example.com

other-requirements.txt (new file, 4 lines)
View File

@ -0,0 +1,4 @@
mysql-client [test]
mysql-server [test]
postgresql [test]
postgresql-client [test]

View File

@ -3,7 +3,7 @@ pbr>=1.1.0
PyYAML>=3.1.0
Paste
WebOb>=1.2.3
paramiko>=1.8.0
paramiko>=1.8.0,<2.0.0
GitPython>=0.3.3
ordereddict
python-daemon>=2.0.4,<2.1.0

View File

@ -25,6 +25,7 @@ console_scripts =
zuul-merger = zuul.cmd.merger:main
zuul = zuul.cmd.client:main
zuul-cloner = zuul.cmd.cloner:main
zuul-launcher = zuul.cmd.launcher:main
[build_sphinx]
source-dir = doc/source

View File

@ -22,10 +22,12 @@ import logging
import os
import pprint
from six.moves import queue as Queue
from six.moves import urllib
import random
import re
import select
import shutil
from six.moves import reload_module
import socket
import string
import subprocess
@ -33,12 +35,10 @@ import swiftclient
import tempfile
import threading
import time
import urllib2
import git
import gear
import fixtures
import six.moves.urllib.parse as urlparse
import statsd
import testtools
from git import GitCommandError
@ -482,7 +482,7 @@ class FakeURLOpener(object):
self.url = url
def read(self):
res = urlparse.urlparse(self.url)
res = urllib.parse.urlparse(self.url)
path = res.path
project = '/'.join(path.split('/')[2:-2])
ret = '001e# service=git-upload-pack\n'
@ -882,6 +882,28 @@ class BaseTestCase(testtools.TestCase):
format='%(asctime)s %(name)-32s '
'%(levelname)-8s %(message)s'))
# NOTE(notmorgan): Extract logging overrides for specific libraries
# from the OS_LOG_DEFAULTS env and create FakeLogger fixtures for
# each. This is used to limit the output during test runs from
# libraries that zuul depends on such as gear.
log_defaults_from_env = os.environ.get('OS_LOG_DEFAULTS')
if log_defaults_from_env:
for default in log_defaults_from_env.split(','):
try:
name, level_str = default.split('=', 1)
level = getattr(logging, level_str, logging.DEBUG)
self.useFixture(fixtures.FakeLogger(
name=name,
level=level,
format='%(asctime)s %(name)-32s '
'%(levelname)-8s %(message)s'))
except ValueError:
# NOTE(notmorgan): Invalid format of the log default,
# skip and don't try and apply a logger for the
# specified module
pass
class ZuulTestCase(BaseTestCase):
config_file = 'zuul.conf'
@ -897,11 +919,13 @@ class ZuulTestCase(BaseTestCase):
self.test_root = os.path.join(tmp_root, "zuul-test")
self.upstream_root = os.path.join(self.test_root, "upstream")
self.git_root = os.path.join(self.test_root, "git")
self.state_root = os.path.join(self.test_root, "lib")
if os.path.exists(self.test_root):
shutil.rmtree(self.test_root)
os.makedirs(self.test_root)
os.makedirs(self.upstream_root)
os.makedirs(self.state_root)
# Make per test copy of Configuration.
self.setup_config()
@ -909,6 +933,7 @@ class ZuulTestCase(BaseTestCase):
os.path.join(FIXTURE_DIR,
self.config.get('zuul', 'tenant_config')))
self.config.set('merger', 'git_dir', self.git_root)
self.config.set('zuul', 'state_dir', self.state_root)
# For each project in config:
self.init_repo("org/project")
@ -937,8 +962,8 @@ class ZuulTestCase(BaseTestCase):
os.environ['STATSD_PORT'] = str(self.statsd.port)
self.statsd.start()
# the statsd client object is configured in the statsd module import
reload(statsd)
reload(zuul.scheduler)
reload_module(statsd)
reload_module(zuul.scheduler)
self.gearman_server = FakeGearmanServer()
@ -967,12 +992,12 @@ class ZuulTestCase(BaseTestCase):
self.ansible_server.start()
def URLOpenerFactory(*args, **kw):
if isinstance(args[0], urllib2.Request):
if isinstance(args[0], urllib.request.Request):
return old_urlopen(*args, **kw)
return FakeURLOpener(self.upstream_root, *args, **kw)
old_urlopen = urllib2.urlopen
urllib2.urlopen = URLOpenerFactory
old_urlopen = urllib.request.urlopen
urllib.request.urlopen = URLOpenerFactory
self.launcher = zuul.launcher.client.LaunchClient(
self.config, self.sched, self.swift)
@ -982,7 +1007,8 @@ class ZuulTestCase(BaseTestCase):
self.sched.setLauncher(self.launcher)
self.sched.setMerger(self.merge_client)
self.webapp = zuul.webapp.WebApp(self.sched, port=0)
self.webapp = zuul.webapp.WebApp(
self.sched, port=0, listen_address='127.0.0.1')
self.rpc = zuul.rpclistener.RPCListener(self.config, self.sched)
self.sched.start()
@ -1179,6 +1205,17 @@ class ZuulTestCase(BaseTestCase):
zuul.merger.merger.reset_repo_to_head(repo)
repo.git.clean('-x', '-f', '-d')
def create_commit(self, project):
path = os.path.join(self.upstream_root, project)
repo = git.Repo(path)
repo.head.reference = repo.heads['master']
file_name = os.path.join(path, 'README')
with open(file_name, 'a') as f:
f.write('creating fake commit\n')
repo.index.add([file_name])
commit = repo.index.commit('Creating a fake commit')
return commit.hexsha
def ref_has_change(self, ref, change):
path = os.path.join(self.git_root, change.project)
repo = git.Repo(path)
@ -1325,9 +1362,11 @@ class ZuulTestCase(BaseTestCase):
start = time.time()
while True:
if time.time() - start > 10:
print 'queue status:',
print ' '.join(self.eventQueuesEmpty())
print self.areAllBuildsWaiting()
self.log.debug("Queue status:")
for queue in self.event_queues:
self.log.debug(" %s: %s" % (queue, queue.empty()))
self.log.debug("All builds waiting: %s" %
(self.areAllBuildsWaiting(),))
raise Exception("Timeout waiting for Zuul to settle")
# Make sure no new events show up while we're checking
# have all build states propagated to zuul?
@ -1369,8 +1408,8 @@ class ZuulTestCase(BaseTestCase):
for pipeline in tenant.layout.pipelines.values():
for queue in pipeline.queues:
if len(queue.queue) != 0:
print 'pipeline %s queue %s contents %s' % (
pipeline.name, queue.name, queue.queue)
print('pipeline %s queue %s contents %s' % (
pipeline.name, queue.name, queue.queue))
self.assertEqual(len(queue.queue), 0,
"Pipelines queues should be empty")

View File

@ -3,7 +3,7 @@ pipelines:
manager: IndependentPipelineManager
require:
approval:
- username: jenkins
- username: ^(jenkins|zuul)$
trigger:
gerrit:
- event: comment-added

View File

@ -0,0 +1,21 @@
pipelines:
- name: check
manager: IndependentPipelineManager
trigger:
gerrit:
- event: patchset-created
success:
smtp:
to: me@example.org
jobs:
- name: docs-draft-test
success-pattern: http://docs-draft.example.org/{build.parameters[LOG_PATH]}/publish-docs/
- name: docs-draft-test2
success-pattern: http://docs-draft.example.org/{NOPE}/{build.parameters[BAD]}/publish-docs/
projects:
- name: org/docs
check:
- docs-draft-test:
- docs-draft-test2

View File

@ -123,13 +123,13 @@ class TestMatchAllFiles(BaseTestMatcher):
self._test_matches(False)
def test_matches_returns_false_when_not_all_files_match(self):
self._test_matches(False, files=['docs/foo', 'foo/bar'])
self._test_matches(False, files=['/COMMIT_MSG', 'docs/foo', 'foo/bar'])
def test_matches_returns_true_when_commit_message_matches(self):
self._test_matches(True, files=['/COMMIT_MSG'])
def test_matches_returns_false_when_commit_message_matches(self):
self._test_matches(False, files=['/COMMIT_MSG'])
def test_matches_returns_true_when_all_files_match(self):
self._test_matches(True, files=['docs/foo'])
self._test_matches(True, files=['/COMMIT_MSG', 'docs/foo'])
class TestMatchAll(BaseTestMatcher):

View File

@ -568,3 +568,57 @@ class TestCloner(ZuulTestCase):
self.worker.hold_jobs_in_build = False
self.worker.release()
self.waitUntilSettled()
def test_post_checkout(self):
project = "org/project"
path = os.path.join(self.upstream_root, project)
repo = git.Repo(path)
repo.head.reference = repo.heads['master']
commits = []
for i in range(0, 3):
commits.append(self.create_commit(project))
newRev = commits[1]
cloner = zuul.lib.cloner.Cloner(
git_base_url=self.upstream_root,
projects=[project],
workspace=self.workspace_root,
zuul_branch=None,
zuul_ref='master',
zuul_url=self.git_root,
zuul_project=project,
zuul_newrev=newRev,
)
cloner.execute()
repos = self.getWorkspaceRepos([project])
cloned_sha = repos[project].rev_parse('HEAD').hexsha
self.assertEqual(newRev, cloned_sha)
def test_post_and_master_checkout(self):
project = "org/project1"
master_project = "org/project2"
path = os.path.join(self.upstream_root, project)
repo = git.Repo(path)
repo.head.reference = repo.heads['master']
commits = []
for i in range(0, 3):
commits.append(self.create_commit(project))
newRev = commits[1]
cloner = zuul.lib.cloner.Cloner(
git_base_url=self.upstream_root,
projects=[project, master_project],
workspace=self.workspace_root,
zuul_branch=None,
zuul_ref='master',
zuul_url=self.git_root,
zuul_project=project,
zuul_newrev=newRev
)
cloner.execute()
repos = self.getWorkspaceRepos([project, master_project])
cloned_sha = repos[project].rev_parse('HEAD').hexsha
self.assertEqual(newRev, cloned_sha)
self.assertEqual(
repos[master_project].rev_parse('HEAD').hexsha,
repos[master_project].rev_parse('master').hexsha)

View File

@ -14,7 +14,7 @@
# License for the specific language governing permissions and limitations
# under the License.
import ConfigParser
from six.moves import configparser as ConfigParser
import os
import re
@ -36,13 +36,13 @@ class TestLayoutValidator(testtools.TestCase):
def test_layouts(self):
"""Test layout file validation"""
print
print()
errors = []
for fn in os.listdir(os.path.join(FIXTURE_DIR, 'layouts')):
m = LAYOUT_RE.match(fn)
if not m:
continue
print fn
print(fn)
# Load any .conf file by the same name but .conf extension.
config_file = ("%s.conf" %
@ -72,7 +72,7 @@ class TestLayoutValidator(testtools.TestCase):
fn)
except voluptuous.Invalid as e:
error = str(e)
print ' ', error
print(' ', error)
if error in errors:
raise Exception("Error has already been tested: %s" %
error)

View File

@ -12,6 +12,12 @@
# License for the specific language governing permissions and limitations
# under the License.
import os
import random
import fixtures
from zuul import model
from zuul import configloader
@ -32,12 +38,12 @@ class TestJob(BaseTestCase):
def test_change_matches_returns_false_for_matched_skip_if(self):
change = model.Change('project')
change.files = ['docs/foo']
change.files = ['/COMMIT_MSG', 'docs/foo']
self.assertFalse(self.job.changeMatches(change))
def test_change_matches_returns_true_for_unmatched_skip_if(self):
change = model.Change('project')
change.files = ['foo']
change.files = ['/COMMIT_MSG', 'foo']
self.assertTrue(self.job.changeMatches(change))
def test_job_sets_defaults_for_boolean_attributes(self):
@ -98,3 +104,76 @@ class TestJob(BaseTestCase):
job = item.getJobs()[0]
self.assertEqual(job.name, 'python27')
self.assertEqual(job.timeout, 50)
class TestJobTimeData(BaseTestCase):
def setUp(self):
super(TestJobTimeData, self).setUp()
self.tmp_root = self.useFixture(fixtures.TempDir(
rootdir=os.environ.get("ZUUL_TEST_ROOT"))
).path
def test_empty_timedata(self):
path = os.path.join(self.tmp_root, 'job-name')
self.assertFalse(os.path.exists(path))
self.assertFalse(os.path.exists(path + '.tmp'))
td = model.JobTimeData(path)
self.assertEqual(td.success_times, [0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
self.assertEqual(td.failure_times, [0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
self.assertEqual(td.results, [0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
def test_save_reload(self):
path = os.path.join(self.tmp_root, 'job-name')
self.assertFalse(os.path.exists(path))
self.assertFalse(os.path.exists(path + '.tmp'))
td = model.JobTimeData(path)
self.assertEqual(td.success_times, [0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
self.assertEqual(td.failure_times, [0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
self.assertEqual(td.results, [0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
success_times = []
failure_times = []
results = []
for x in range(10):
success_times.append(int(random.random() * 1000))
failure_times.append(int(random.random() * 1000))
results.append(0)
results.append(1)
random.shuffle(results)
s = f = 0
for result in results:
if result:
td.add(failure_times[f], 'FAILURE')
f += 1
else:
td.add(success_times[s], 'SUCCESS')
s += 1
self.assertEqual(td.success_times, success_times)
self.assertEqual(td.failure_times, failure_times)
self.assertEqual(td.results, results[10:])
td.save()
self.assertTrue(os.path.exists(path))
self.assertFalse(os.path.exists(path + '.tmp'))
td = model.JobTimeData(path)
td.load()
self.assertEqual(td.success_times, success_times)
self.assertEqual(td.failure_times, failure_times)
self.assertEqual(td.results, results[10:])
class TestTimeDataBase(BaseTestCase):
def setUp(self):
super(TestTimeDataBase, self).setUp()
self.tmp_root = self.useFixture(fixtures.TempDir(
rootdir=os.environ.get("ZUUL_TEST_ROOT"))
).path
self.db = model.TimeDataBase(self.tmp_root)
def test_timedatabase(self):
self.assertEqual(self.db.getEstimatedTime('job-name'), 0)
self.db.update('job-name', 50, 'SUCCESS')
self.assertEqual(self.db.getEstimatedTime('job-name'), 50)
self.db.update('job-name', 100, 'SUCCESS')
self.assertEqual(self.db.getEstimatedTime('job-name'), 75)
for x in range(10):
self.db.update('job-name', 100, 'SUCCESS')
self.assertEqual(self.db.getEstimatedTime('job-name'), 100)

View File

@ -20,11 +20,10 @@ import os
import re
import shutil
import time
import urllib
import urllib2
import yaml
import git
from six.moves import urllib
import testtools
import zuul.change_matcher
@ -501,6 +500,46 @@ class TestScheduler(ZuulTestCase):
self.assertEqual(B.reported, 2)
self.assertEqual(C.reported, 2)
def _test_time_database(self, iteration):
self.worker.hold_jobs_in_build = True
A = self.fake_gerrit.addFakeChange('org/project', 'master', 'A')
A.addApproval('CRVW', 2)
self.fake_gerrit.addEvent(A.addApproval('APRV', 1))
self.waitUntilSettled()
time.sleep(2)
data = json.loads(self.sched.formatStatusJSON())
found_job = None
for pipeline in data['pipelines']:
if pipeline['name'] != 'gate':
continue
for queue in pipeline['change_queues']:
for head in queue['heads']:
for item in head:
for job in item['jobs']:
if job['name'] == 'project-merge':
found_job = job
break
self.assertIsNotNone(found_job)
if iteration == 1:
self.assertIsNotNone(found_job['estimated_time'])
self.assertIsNone(found_job['remaining_time'])
else:
self.assertIsNotNone(found_job['estimated_time'])
self.assertTrue(found_job['estimated_time'] >= 2)
self.assertIsNotNone(found_job['remaining_time'])
self.worker.hold_jobs_in_build = False
self.worker.release()
self.waitUntilSettled()
def test_time_database(self):
"Test the time database"
self._test_time_database(1)
self._test_time_database(2)
def test_two_failed_changes_at_head(self):
"Test that changes are reparented correctly if 2 fail at head"
@ -606,6 +645,36 @@ class TestScheduler(ZuulTestCase):
self.assertEqual(B.reported, 2)
self.assertEqual(C.reported, 2)
def test_parse_skip_if(self):
job_yaml = """
jobs:
- name: job_name
skip-if:
- project: ^project_name$
branch: ^stable/icehouse$
all-files-match-any:
- ^filename$
- project: ^project2_name$
all-files-match-any:
- ^filename2$
""".strip()
data = yaml.load(job_yaml)
config_job = data.get('jobs')[0]
cm = zuul.change_matcher
expected = cm.MatchAny([
cm.MatchAll([
cm.ProjectMatcher('^project_name$'),
cm.BranchMatcher('^stable/icehouse$'),
cm.MatchAllFiles([cm.FileMatcher('^filename$')]),
]),
cm.MatchAll([
cm.ProjectMatcher('^project2_name$'),
cm.MatchAllFiles([cm.FileMatcher('^filename2$')]),
]),
])
matcher = self.sched._parseSkipIf(config_job)
self.assertEqual(expected, matcher)
def test_patch_order(self):
"Test that dependent patches are tested in the right order"
A = self.fake_gerrit.addFakeChange('org/project', 'master', 'A')
@ -1455,7 +1524,7 @@ class TestScheduler(ZuulTestCase):
self.worker.build_history = []
path = os.path.join(self.git_root, "org/project")
print repack_repo(path)
print(repack_repo(path))
A = self.fake_gerrit.addFakeChange('org/project', 'master', 'A')
A.addApproval('CRVW', 2)
@ -1480,9 +1549,9 @@ class TestScheduler(ZuulTestCase):
A = self.fake_gerrit.addFakeChange('org/project1', 'master', 'A')
A.addPatchset(large=True)
path = os.path.join(self.upstream_root, "org/project1")
print repack_repo(path)
print(repack_repo(path))
path = os.path.join(self.git_root, "org/project1")
print repack_repo(path)
print(repack_repo(path))
A.addApproval('CRVW', 2)
self.fake_gerrit.addEvent(A.addApproval('APRV', 1))
@ -2241,15 +2310,18 @@ class TestScheduler(ZuulTestCase):
self.fake_gerrit.addEvent(A.addApproval('APRV', 1))
self.waitUntilSettled()
self.worker.release('project-merge')
self.waitUntilSettled()
port = self.webapp.server.socket.getsockname()[1]
req = urllib2.Request("http://localhost:%s/status.json" % port)
f = urllib2.urlopen(req)
req = urllib.request.Request("http://localhost:%s/status.json" % port)
f = urllib.request.urlopen(req)
headers = f.info()
self.assertIn('Content-Length', headers)
self.assertIn('Content-Type', headers)
self.assertEqual(headers['Content-Type'],
'application/json; charset=UTF-8')
self.assertIsNotNone(re.match('^application/json(; charset=UTF-8)?$',
headers['Content-Type']))
self.assertIn('Access-Control-Allow-Origin', headers)
self.assertIn('Cache-Control', headers)
self.assertIn('Last-Modified', headers)
@ -2261,7 +2333,7 @@ class TestScheduler(ZuulTestCase):
self.waitUntilSettled()
data = json.loads(data)
status_jobs = set()
status_jobs = []
for p in data['pipelines']:
for q in p['change_queues']:
if p['name'] in ['gate', 'conflict']:
@ -2273,10 +2345,24 @@ class TestScheduler(ZuulTestCase):
self.assertTrue(change['active'])
self.assertEqual(change['id'], '1,1')
for job in change['jobs']:
status_jobs.add(job['name'])
self.assertIn('project-merge', status_jobs)
self.assertIn('project-test1', status_jobs)
self.assertIn('project-test2', status_jobs)
status_jobs.append(job)
self.assertEqual('project-merge', status_jobs[0]['name'])
self.assertEqual('https://server/job/project-merge/0/',
status_jobs[0]['url'])
self.assertEqual('http://logs.example.com/1/1/gate/project-merge/0',
status_jobs[0]['report_url'])
self.assertEqual('project-test1', status_jobs[1]['name'])
self.assertEqual('https://server/job/project-test1/1/',
status_jobs[1]['url'])
self.assertEqual('http://logs.example.com/1/1/gate/project-test1/1',
status_jobs[1]['report_url'])
self.assertEqual('project-test2', status_jobs[2]['name'])
self.assertEqual('https://server/job/project-test2/2/',
status_jobs[2]['url'])
self.assertEqual('http://logs.example.com/1/1/gate/project-test2/2',
status_jobs[2]['report_url'])
def test_merging_queues(self):
"Test that transitively-connected change queues are merged"
@ -2829,7 +2915,8 @@ class TestScheduler(ZuulTestCase):
port = self.webapp.server.socket.getsockname()[1]
f = urllib.urlopen("http://localhost:%s/status.json" % port)
req = urllib.request.Request("http://localhost:%s/status.json" % port)
f = urllib.request.urlopen(req)
data = f.read()
self.worker.hold_jobs_in_build = False
@ -4215,6 +4302,45 @@ For CI problems and help debugging, contact ci@example.org"""
self.waitUntilSettled()
self.assertEqual(self.history[-1].changes, '3,2 2,1 1,2')
def test_crd_cycle_join(self):
"Test an updated change creates a cycle"
A = self.fake_gerrit.addFakeChange('org/project2', 'master', 'A')
self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
self.waitUntilSettled()
# Create B->A
B = self.fake_gerrit.addFakeChange('org/project1', 'master', 'B')
B.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
B.subject, A.data['id'])
self.fake_gerrit.addEvent(B.getPatchsetCreatedEvent(1))
self.waitUntilSettled()
# Update A to add A->B (a cycle).
A.addPatchset()
A.data['commitMessage'] = '%s\n\nDepends-On: %s\n' % (
A.subject, B.data['id'])
# Normally we would submit the patchset-created event for
# processing here, however, we have no way of noting whether
# the dependency cycle detection correctly raised an
# exception, so instead, we reach into the source driver and
# call the method that would ultimately be called by the event
# processing.
source = self.sched.layout.pipelines['gate'].source
with testtools.ExpectedException(
Exception, "Dependency cycle detected"):
source._getChange(u'1', u'2', True)
self.log.debug("Got expected dependency cycle exception")
# Now if we update B to remove the depends-on, everything
# should be okay. B; A->B
B.addPatchset()
B.data['commitMessage'] = '%s\n' % (B.subject,)
source._getChange(u'1', u'2', True)
source._getChange(u'2', u'2', True)
def test_disable_at(self):
"Test a pipeline will only report to the disabled trigger when failing"
@ -4336,3 +4462,38 @@ For CI problems and help debugging, contact ci@example.org"""
self.assertIn('Build failed.', K.messages[0])
# No more messages reported via smtp
self.assertEqual(3, len(self.smtp_messages))
def test_success_pattern(self):
"Ensure bad build params are ignored"
# Use SMTP reporter to grab the result message easier
self.init_repo("org/docs")
self.config.set('zuul', 'layout_config',
'tests/fixtures/layout-success-pattern.yaml')
self.sched.reconfigure(self.config)
self.worker.hold_jobs_in_build = True
self.registerJobs()
A = self.fake_gerrit.addFakeChange('org/docs', 'master', 'A')
self.fake_gerrit.addEvent(A.getPatchsetCreatedEvent(1))
self.waitUntilSettled()
# Grab build id
self.assertEqual(len(self.builds), 1)
uuid = self.builds[0].unique[:7]
self.worker.hold_jobs_in_build = False
self.worker.release()
self.waitUntilSettled()
self.assertEqual(len(self.smtp_messages), 1)
body = self.smtp_messages[0]['body'].splitlines()
self.assertEqual('Build succeeded.', body[0])
self.assertIn(
'- docs-draft-test http://docs-draft.example.org/1/1/1/check/'
'docs-draft-test/%s/publish-docs/' % uuid,
body[2])
self.assertIn(
'- docs-draft-test2 https://server/job/docs-draft-test2/1/',
body[3])

View File

@ -16,7 +16,8 @@
# under the License.
import json
import urllib2
from six.moves import urllib
from tests.base import ZuulTestCase
@ -46,41 +47,41 @@ class TestWebapp(ZuulTestCase):
def test_webapp_status(self):
"Test that we can filter to only certain changes in the webapp."
req = urllib2.Request(
req = urllib.request.Request(
"http://localhost:%s/status" % self.port)
f = urllib2.urlopen(req)
f = urllib.request.urlopen(req)
data = json.loads(f.read())
self.assertIn('pipelines', data)
def test_webapp_status_compat(self):
# testing compat with status.json
req = urllib2.Request(
req = urllib.request.Request(
"http://localhost:%s/status.json" % self.port)
f = urllib2.urlopen(req)
f = urllib.request.urlopen(req)
data = json.loads(f.read())
self.assertIn('pipelines', data)
def test_webapp_bad_url(self):
# do we 404 correctly
req = urllib2.Request(
req = urllib.request.Request(
"http://localhost:%s/status/foo" % self.port)
self.assertRaises(urllib2.HTTPError, urllib2.urlopen, req)
self.assertRaises(urllib.error.HTTPError, urllib.request.urlopen, req)
def test_webapp_find_change(self):
# can we filter by change id
req = urllib2.Request(
req = urllib.request.Request(
"http://localhost:%s/status/change/1,1" % self.port)
f = urllib2.urlopen(req)
f = urllib.request.urlopen(req)
data = json.loads(f.read())
self.assertEqual(1, len(data), data)
self.assertEqual("org/project", data[0]['project'])
req = urllib2.Request(
req = urllib.request.Request(
"http://localhost:%s/status/change/2,1" % self.port)
f = urllib2.urlopen(req)
f = urllib.request.urlopen(req)
data = json.loads(f.read())
self.assertEqual(1, len(data), data)

View File

@ -68,7 +68,7 @@ def main():
job = gear.Job("build:%s" % args.job,
json.dumps(data),
unique=data['ZUUL_UUID'])
c.submitJob(job)
c.submitJob(job, precedence=gear.PRECEDENCE_HIGH)
while not job.complete:
time.sleep(1)

View File

@ -35,7 +35,7 @@ for pipeline in data['pipelines']:
if not change['live']:
continue
cid, cps = change['id'].split(',')
print (
print(
"zuul enqueue --trigger gerrit --pipeline %s "
"--project %s --change %s,%s" % (
options.pipeline_name,

tools/zuul-clear-refs.py (new executable file, 94 lines)
View File

@ -0,0 +1,94 @@
#!/usr/bin/env python
# Copyright 2014-2015 Antoine "hashar" Musso
# Copyright 2014-2015 Wikimedia Foundation Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
# pylint: disable=locally-disabled, invalid-name
"""
Zuul references cleaner.
Clear up references under /refs/zuul/ by inspecting the age of the commit the
reference points to. If the commit date is older than a number of days
specified by --until, the reference is deleted from the git repository.
Use --dry-run --verbose to finely inspect the script behavior.
"""
import argparse
import git
import logging
import time
import sys
NOW = int(time.time())
DEFAULT_DAYS = 360
ZUUL_REF_PREFIX = 'refs/zuul/'
parser = argparse.ArgumentParser(
description=__doc__,
formatter_class=argparse.RawDescriptionHelpFormatter,
)
parser.add_argument('--until', dest='days_ago', default=DEFAULT_DAYS, type=int,
help='references older than this number of days will '
'be deleted. Default: %s' % DEFAULT_DAYS)
parser.add_argument('-n', '--dry-run', dest='dryrun', action='store_true',
help='do not delete references')
parser.add_argument('-v', '--verbose', dest='verbose', action='store_true',
help='set log level from info to debug')
parser.add_argument('gitrepo', help='path to a Zuul git repository')
args = parser.parse_args()
logging.basicConfig()
log = logging.getLogger('zuul-clear-refs')
if args.verbose:
log.setLevel(logging.DEBUG)
else:
log.setLevel(logging.INFO)
try:
repo = git.Repo(args.gitrepo)
except git.exc.InvalidGitRepositoryError:
log.error("Invalid git repo: %s" % args.gitrepo)
sys.exit(1)
for ref in repo.references:
if not ref.path.startswith(ZUUL_REF_PREFIX):
continue
if type(ref) is not git.refs.reference.Reference:
# Paranoia: ignore heads/tags/remotes ..
continue