In the normal course of script execution, clone_repo.sh routinely logs
the following error message:
error: pathspec 'xxx' did not match any file(s) known to git
This error can be safely ignored, as the script moves on to checking out
master instead. But it can be (and has been) mistakenly interpreted as
the reason why the script fails.
This change avoids logging the error message and logs a clearer
explanation instead.
Change-Id: Iaaaed7b0ea343bd81b8ad898654658545942a3d0
Use python3 when running the script that adds the release note page as
part of the release process.
Without that we get the following error:
```
2020-12-26 16:58:15.569890 | ubuntu-focal | + python -c 'print('\''victoria'\''.title())'
2020-12-26 16:58:15.570573 | ubuntu-focal |
/home/zuul/scripts/release-tools/add_release_note_page.sh: line 51:
python: command not found
```
cf. http://lists.openstack.org/pipermail/release-job-failures/2020-December/001499.html
Change-Id: I6a5a68570b8948692aa48f09003d26590ee621e4
Not all teams have PTLs now. Don't assume we will get PTL information as
we check for approvals on release patches.
Change-Id: Id0d6eeaf66394806374781d1cf26c087a6e90f87
Signed-off-by: Sean McGinnis <sean.mcginnis@gmail.com>
Octavia has a diskimage-builder element to install octavia-lib. When
creating an amphora image from an Octavia stable branch, the expectation
is that the octavia-lib code will match the same branch (master or stable).
This patch updates the branch name and upper constraints for octavia-lib
in Octavia on stable branch creation.
This change is in line with I8eba64c886c187c8652f94735ca6153702203d17.
Depends-On: https://review.opendev.org/#/c/745506
Change-Id: I31b762f631304636bafabec9c54cd31c6c91f124
We cannot speculatively test changes to the wheel build jobs while they
live in project-config. These jobs are in the gate for the requirements
project, which runs them when bindep.txt changes.
https://review.opendev.org/731394 moved the build-wheel-mirror-base
job/playbook/role into openstack-zuul-jobs, renamed as
build-wheel-cache-* to a) distinguish the name and b) make it a little
clearer that we're building the wheels that go on the mirror, rather
than just mirroring something that already exists.
This updates the job names in project-config to reference these new
jobs.
Note the publishing steps are kept here along with the AFS secret.
Depends-On: https://review.opendev.org/732085
Depends-On: https://review.opendev.org/732087
Depends-On: https://review.opendev.org/732083
Depends-On: https://review.opendev.org/732084
Depends-On: https://review.opendev.org/732086
Change-Id: Iac3a906803177e8365f4cfb611800b5ccaed4a6e
It seems that pip has updated its verbose output to say "Downloading
$filename" rather than "Downloading from url $URL". We need to update
our wheel mirror processing to handle this change so that it can
properly remove any pypi hosted wheels before publishing to our wheel
mirrors. Make it more liberal so we match pretty much any line with
"Downloading" and then eventually something which looks like a whl
file, just in case they keep fiddling with it.
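Roughly, the more liberal match could look like this (a sketch in
Python, not the actual mirror-processing code; the sample lines are
illustrative):
```
import re

# Liberal match: any line with "Downloading" followed, eventually, by
# something that looks like a wheel filename.
DOWNLOADED_WHEEL = re.compile(r'Downloading\b.*?(?P<wheel>[^\s/]+\.whl)\b')

samples = [
    # older pip verbose output
    "Downloading from url https://example.org/simple/six/six-1.15.0-py2.py3-none-any.whl",
    # newer pip verbose output
    "  Downloading six-1.15.0-py2.py3-none-any.whl (10 kB)",
]
for line in samples:
    match = DOWNLOADED_WHEEL.search(line)
    if match:
        print(match.group('wheel'))  # six-1.15.0-py2.py3-none-any.whl, twice
```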
Change-Id: I7965f296cdb1241c96293edd68e84e1d3d78b331
Patch I8eba64c886c187c8652f94735ca6153702203d17 added logic to update
the constraints URL for a project-specific file, but adding the file
before committing and submitting the changes was missed, resulting in
git review failing during the rebase step of submitting the review due
to unstaged files.
Change-Id: I4d87546f12c4a866e9c666e83c42ad8d07a637a3
Signed-off-by: Sean McGinnis <sean.mcginnis@gmail.com>
This reverts commit 8b3532c562.
The change mentioned inline has been incorporated upstream with [1]. I
don't think we've managed to replicate the corrupt wheel situation
since.
[1] 0dbab23df9
Change-Id: I58fa0a53939207427f286584e91e1c59d4863992
Some instances have failed due to the repo not being configured for the
expected series job templates. This changes the modified file detection
to only look for the Zuul files we care about, then not error out on git
operations if it turns out we cannot actually commit and propose changes.
Change-Id: Ic301b039d080dfc0bcbefeecce099c8fd00ad8c5
Signed-off-by: Sean McGinnis <sean.mcginnis@gmail.com>
The check-release-approval job currently considers a change approved
if a release liaison is the change owner, or if a release liaison
approves it in a subsequent vote.
This means that if a release liaison pushes the original change (and
is therefore the change owner), any patchset pushed on top of the
original one will retain the owner and therefore the approval.
Instead of the change owner, use the committer of the last revision.
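For illustration, the committer email of the current revision can be
fetched from the Gerrit REST API roughly like this (a sketch only, not
the actual job code):
```
import json
import requests

GERRIT_URL = 'https://review.opendev.org'

def current_committer_email(change_id):
    """Fetch the committer email of a change's latest revision.

    Sketch only: the endpoint and the ")]}'"-prefix stripping follow
    Gerrit's documented REST API.
    """
    resp = requests.get(
        '%s/changes/%s/revisions/current/commit' % (GERRIT_URL, change_id))
    resp.raise_for_status()
    # Gerrit prefixes JSON responses with ")]}'" to prevent XSSI.
    payload = json.loads(resp.text.split('\n', 1)[1])
    return payload['committer']['email']
```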
Change-Id: I2f7730626a8785b63f88fdf5d507fbd97717b8f3
Local testing passed, but when run in the gate, script execution failed
because the sed expression was not preceded by '-e'.
Change-Id: I02545e1f976f220f197556bdfa6e6225d9bd8d36
Signed-off-by: Sean McGinnis <sean.mcginnis@gmail.com>
Missed in Id58f439052b4ea6b092b87682576a746433dcc27 that the branch name
passed in when determining the next series name will be of the form
'stable/series', resulting in the next name not being found and the job
update logic being skipped. This adjusts the branch name to properly
match the series name.
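The adjustment boils down to stripping the 'stable/' prefix before
looking up the series, roughly (illustrative sketch, not the actual
script code):
```
def series_from_branch(branch):
    """Map 'stable/victoria' to 'victoria'; leave plain series names alone."""
    prefix = 'stable/'
    return branch[len(prefix):] if branch.startswith(prefix) else branch
```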
Change-Id: Ie4102ceb0d12b7d98919ddb89b8e17df1859fa6a
Signed-off-by: Sean McGinnis <sean.mcginnis@gmail.com>
Currently, we need to update jobs manually after a branch is
created.
When a project is branched, the master branch should then
point to the newly named python3 tests.
This should do it.
Change-Id: Id58f439052b4ea6b092b87682576a746433dcc27
When no review is posted yet, Gerrit returns simplified 'labels'
data. In particular it's missing the ['labels']['Code-Review']['all']
dictionary, which we assumed would always be present.
We should only add approvers from the reviews if the 'all' key is
provided, and otherwise just work from the owner email.
Remove all changes pushed to investigate the issue: use a narrow
query again (rather than the /detail call) and no longer catch the
exception.
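A minimal sketch of the tolerant parsing described above (the
dictionary layout follows Gerrit's JSON; the positive-vote filter is
illustrative, not the actual check code):
```
def candidate_approvers(change):
    """Collect emails that count as approvals for this change.

    'change' is the parsed Gerrit JSON. When no review has been posted,
    ['labels']['Code-Review'] has no 'all' entry, so fall back to the
    owner email only.
    """
    emails = set()
    owner_email = change.get('owner', {}).get('email')
    if owner_email:
        emails.add(owner_email)
    code_review = change.get('labels', {}).get('Code-Review', {})
    for vote in code_review.get('all', []):
        if vote.get('value', 0) > 0 and vote.get('email'):
            emails.add(vote['email'])
    return emails
```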
Change-Id: I13fa07754e38281c63dcf0eceaa4c3b3c2715618
- Use the zuul_return fake module to make ansible-lint happy; this allows
us to remove the Zuul import.
- While ansible-lint 4.2.0 is now able to detect playbooks/roles, that
detection is broken, so don't use it.
- Move its config out of tox.ini so it can be used by other tools, or
without tox.
Change-Id: Ie8935f47db855647e19ae091044e5ac1871f1551
Co-Authored-By: Sorin Sbarnea <ssbarnea@redhat.com>
Co-Authored-By: Andreas Jaeger <aj@suse.com>
Gerrit sometimes returns a partial JSON answer, missing
the details that o=DETAILED_LABELS should trigger. This
results in false negatives in check-release-approval tests.
This cannot be reproduced easily. We put in place a retry
but the issue seems to persist across immediate retries.
As an experiment, this change switches the API call from
/changes/ID with o=DETAILED_LABELS&o=DETAILED_ACCOUNTS to
/changes/ID/detail (which includes these two options, amongst
others), to see if that would work around the Gerrit issue.
We also remove the retry since it does not significantly improve
the situation.
Change-Id: I4de49da1b48f7b87879102a0e18e97168e39406b
The current update-constraints patches replace all entries of a package
with an entry for the new version. But the upper-constraints file now
needs to have multiple entries with specific ";python_version=='x.x'"
markers to be able to handle differing requirements for packages that no
longer support Python 2, or even earlier Python 3 versions.
This adds awareness to the update script to only update the version for
matching python_version specifiers if they are present. It then falls
back to replacing the full line if the entry for the package does not
have any python_version markers at all.
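A simplified sketch of the marker-aware update described above (not the
actual release-tools code; the line parsing is illustrative):
```
def apply_new_entry(lines, new_entry):
    """Apply one updated upper-constraints entry, respecting markers.

    A marked entry only replaces the existing line carrying the same
    python_version marker, while unmarked lines for the package are
    replaced wholesale.
    """
    package, _, rest = new_entry.partition('===')
    marker = rest.partition(';')[2]  # e.g. "python_version=='3.6'" or ''
    result = []
    for line in lines:
        line_pkg, _, line_rest = line.partition('===')
        line_marker = line_rest.partition(';')[2]
        if line_pkg.strip().lower() != package.strip().lower():
            result.append(line)
        elif marker and line_marker and line_marker != marker:
            result.append(line)  # same package, different marker: keep it
        else:
            result.append(new_entry)
    return result

# Only the line with the matching marker is bumped:
print(apply_new_entry(
    ["foo===1.0.0;python_version=='2.7'", "foo===1.2.0;python_version=='3.6'"],
    "foo===2.0.0;python_version=='3.6'"))
```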
Change-Id: I5e5e604fe9e461e45af0aa4446edd0af89d63381
Signed-off-by: Sean McGinnis <sean.mcginnis@gmail.com>
We've had a number of occurrences where Gerrit failed to return a
complete JSON answer. In particular, the ['labels']['Code-Review']['all']
content is not provided, even though the query requests o=DETAILED_LABELS.
Since this is a rare occurrence (which could not be reproduced in
manual testing), let's retry once to load the results from Gerrit.
Change-Id: I98d1010e9586d2329137d5d600c66eeb8343fa97
We've had hard-to-track exceptions due to missing keys in Gerrit
query answers. Let's log the JSON when a KeyError occurs while parsing
the Gerrit response.
Change-Id: I495c225022250f9fa055c12c422071b1f42ff753
Octavia has a diskimage-builder element to install the amphora-agent.
When creating an amphora image from an Octavia stable branch, the
expectation is that the amphora-agent code will match the same branch (master
or stable). However, the amphora-agent always pulls master versions of
upper-constraints.txt and of the Octavia Git repository.
This patch updates the branch name and upper constraints for the
amphora-agent in Octavia on stable branch creation.
Change-Id: I8eba64c886c187c8652f94735ca6153702203d17
Limit check-release-approval to openstack/releases since no
other repository should be using it.
Also fix a corner case where release requests combined with
other changes would crash the check code (only changes in
deliverable files should be taken into account).
Change-Id: I2d744c1d4dfe13a795c0d6754520d1bbf39162b5
A recent change assumed $LOGS was a relative path, but it starts
with $(pwd) so it is absolute. Correct this broken assumption, which
was causing the script to exit non-zero and fail the
publish-wheel-mirror-* jobs.
Change-Id: I837a0657aa9ae11211599b019711cba8f17ee094
In the build-wheels role, parse stdout from all the build logs to
build up a unique list of all wheels downloaded from PyPI and delete
them from the wheelhouse, since this is only meant to provide built
wheels which are absent from PyPI. Those will be retrieved from our
caching proxy of PyPI served from the same mirror hosts anyway.
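In outline, the clean-up could look like this (a sketch assuming pip's
older "Downloading from url ..." log format; paths and names are
illustrative, not the role's actual tasks):
```
import os

def prune_pypi_wheels(build_log_paths, wheelhouse):
    """Delete wheels that pip merely downloaded from PyPI from the wheelhouse."""
    downloaded = set()
    for path in build_log_paths:
        with open(path) as log:
            for line in log:
                line = line.strip()
                # Older pip verbose output ends the line with the wheel URL.
                if 'Downloading' in line and line.endswith('.whl'):
                    downloaded.add(line.rsplit('/', 1)[-1])
    for name in sorted(downloaded):
        target = os.path.join(wheelhouse, name)
        if os.path.exists(target):
            os.remove(target)
```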
Change-Id: Id021ba5fd55bf6d43e99f9f3a7121aee8b0d0a6f
Creating the index has turned out to be much more difficult than first
expected. Checksumming the files on AFS to make the index.html takes
too long and times out the jobs.
We've identified other issues such as the jobs only ever appending
wheels, meaning far more data is kept than is probably necessary, and
the possibility of skipping those wheels already available on PyPI.
Disable this while we reconsider the approach.
Change-Id: I86a922d59b396a059b255a4d87d8c033b79f7564
I swear this should work now ... $f is the full path to the wheel
directory
Additionally ... f-strings aren't available on Python 3.5, and we
still build for Trusty which has Python 3.5 ... sigh :/ so convert to
regular string formatting.
Change-Id: Ia03b97198f9f45914b8b99c222680d732c2cc8f8
wheel-copy.sh is the wrong place to do the index of the package
directory. Move it to wheel-index.sh where we are walking the final
wheel directories.
Change-Id: I58329650962b4dbe24f79a1f4e4ea54e709233c1
Iterating on this further; fix incorrect yum install and match for the
default Python3 install YAML.
Change-Id: Ic1f902297d869ddafaa57f5910d99461a8a004f3
Some things have become evident when generating the indexes, requiring
some larger changes.
Firstly, the indexer script needs Python 3 on the host. Since we're
still building CentOS 7 wheels, we need to install Python 3 from EPEL
there.
Secondly, because part of the PEP 503 index page is the file hash,
reading all the files back over AFS is quite slow. It's also quite
slow having Ansible loop a task each time, which all adds up to job
timeouts.
Instead, make the indexes on the local disk before we copy the results
to AFS. This requires copying both scripts to the host for execution
(rather than relying on "script:") so the wheel-copy.sh script can
call wheel-indexer.py.
While we are there, a small refactor of wheel-indexer.py to use
os.walk() (which makes it easier to have this as a stand-alone
recursive script later, if something changes). Also update the output
to use <ul><li> for the filenames, so it looks a little better in the
output HTML.
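In outline, the local indexing could look like this (a sketch of the
approach, not the actual wheel-indexer.py):
```
import hashlib
import os

def write_indexes(wheelhouse):
    """Write an index.html in every directory that contains wheels.

    Walk and hash on local disk, then the whole tree is copied to AFS.
    """
    for dirpath, _dirnames, filenames in os.walk(wheelhouse):
        wheels = sorted(name for name in filenames if name.endswith('.whl'))
        if not wheels:
            continue
        items = []
        for name in wheels:
            with open(os.path.join(dirpath, name), 'rb') as wheel:
                digest = hashlib.sha256(wheel.read()).hexdigest()
            items.append(
                '<li><a href="{0}#sha256={1}">{0}</a></li>'.format(name, digest))
        with open(os.path.join(dirpath, 'index.html'), 'w') as index:
            index.write('<html><body><ul>\n%s\n</ul></body></html>\n'
                        % '\n'.join(items))
```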
Change-Id: I85f9e132bc55fd8d33583a698e15c47665e5cf8d
From the Ansible manual:
"The difference between single quotes and double quotes is that in
double quotes you can use escapes:"
This turns out to be important when you're splitting based on \n ...
Change-Id: Ia033426bf05a63034fb11882e92d29e3ede8fc53
What I didn't notice is that on disk the projects are in a/ b/ c/ type
subdirectories, while from the mirror side we paper over this with
mod_rewrite rules so the projects all look like they're at the top level.
This should make the script descend into each directory correctly to
generate the index.
Change-Id: I89d255e2bcd5b5c86a33dec1a77086c97bde4c7a
PEP 503 [1] defines the simple repository API format that we should
follow for our mirrors publishing .whl files.
This loops over the output directories and creates an index.html for each
project with the required tags.
We do not really have gate testing for this, so to avoid destroying
the whole mirror we write out a temporary index file first, and ignore
errors. When this works, we will revert to index.html and do a single
manual removal of the temporary indexes.
[1] https://www.python.org/dev/peps/pep-0503/
Change-Id: I0541fb58535f27589b7e3bc1462cb083e3d12f56
This version of hacking doesn't recognize f-strings as valid in
Python 3. Update to the latest and fix current issues, which are all
just formatting fixes.
Change-Id: I0a7d6f93f07477b6dd29ab143130dd9064c250be
A recent 45.0.0 release of Setuptools dropped support for (the now
EOL) Python 2.7 interpreter. It publishes to PyPI with
options.python_requires set to >=3.5 in its setup.cfg which causes
PyPI's simple index to serve that via a data-requires-python
attribute (see PEP 503) so that pip knows not to pick that one on
unsupported interpreters. Our wheel cache is just served via Apache
with mod_autoindex for its file listing, so lacks an ability to
convey this information.
In the short term, stop copying new versions of the pip, setuptools
and virtualenv wheels into the cache we publish, so that we can
safely remove broken versions of them manually and not have to worry
about them reappearing. Longer term, it may make sense to filter out
any wheels with manylinux or none-any platform specifiers, since the
first already exist on PyPI and the second are trivial to build at job
runtime if they don't. We really only
care about making sure we serve platform-specific wheels we've
built, as those will save the most time in jobs.
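As a sketch, the publishing filter described above could look like this
(the project-name parsing and pattern checks are illustrative, not the
job's actual code):
```
def should_publish(wheel_filename):
    """Decide whether a built wheel goes into the published cache."""
    project = wheel_filename.split('-', 1)[0].lower().replace('_', '-')
    if project in ('pip', 'setuptools', 'virtualenv'):
        return False
    # Longer-term idea from above: manylinux wheels already exist on PyPI
    # and pure none-any wheels are trivial to rebuild at job runtime.
    if 'manylinux' in wheel_filename or wheel_filename.endswith('-none-any.whl'):
        return False
    return True
```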
Change-Id: I02fac6beef9bb94cdeb1d0a84774c1b93357deb1