zuul-jobs/test-playbooks/ensure-pip.yaml
Ian Wienand 67f223b53a Partial revert "Ensure wheel exists for build-release-python"; move to ensure-pip
This partially reverts commit
3f961ce202.

This alternative installs wheel with the ensure-pip role instead of in
a separate role.  wheel is very closely tied to pip install
operations, so this isn't a large overreach of the role's scope.

I suggest this for several reasons.  Firstly, the python-wheel role
doesn't try to install distribution packages, so most of the time we
end up with a mix of the system pip and an upstream version of wheel;
this type of mismatch has proven problematic in the past.  Secondly,
it installs via "pip --user", something that has already caused
problems with tox when, for various reasons, roles want to run it as
a non-zuul user.  Using ensure-pip keeps the packaged versions
together.

[1] did try to install wheel as root, but at runtime, which didn't
work because sudo had already been revoked.  The approach here should
work for the existing build-python-release job, because that job
already includes ensure-pip in pre-run via playbooks/python/pre.yaml.
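For reference, the job wiring is roughly as below (a hypothetical
sketch: only the pre-run path is named in this change; the run
playbook path is a placeholder):

  # Hypothetical job definition sketch; only pre-run: is taken from
  # this change, the run: path is a placeholder.
  - job:
      name: build-python-release
      pre-run: playbooks/python/pre.yaml
      run: playbooks/python/run.yaml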

I believe our conclusion on the ensure-* roles was that requiring
root/become: for installation is OK, but that there should be a no-op
path if the tools are already found.  This is consistent with that
approach (i.e. if you want wheel and can't sudo, you should
pre-install it on your image with whatever tool you build the image
with).

This extends the existing "is pip installed?" check to also verify
that the wheel package is available.  If it is not, we trigger the
install path.
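For illustration, the combined check might look something like this
(the task names, the import test and the per-distro dispatch are
assumptions, not the role's actual internals):

  # Hypothetical check: installation is only skipped if both pip and
  # wheel are importable by the target Python.
  - name: Check for pip and wheel
    command: "{{ ansible_python.executable }} -c 'import pip, wheel'"
    failed_when: false
    changed_when: false
    register: _pip_wheel_check

  # Hypothetical dispatch to the per-distro install tasks.
  - name: Install pip and wheel
    include_tasks: "{{ ansible_os_family }}.yaml"
    when: _pip_wheel_check.rc != 0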

This revealed some issues with RedHat.yaml -- we can always install
Python 3 (packages are available even for CentOS 7), so that check is
removed; and if Ansible itself is running under Python 2, we now
install the Python 2 dependencies as well (not only when Python 2 is
explicitly forced).
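A rough sketch of the resulting logic (package names here are
assumptions; the actual RedHat.yaml may differ):

  # Hypothetical RedHat.yaml excerpt: Python 3 packages are installed
  # unconditionally; Python 2 packages are additionally installed
  # when Ansible itself runs under Python 2.
  - name: Install python3 pip and wheel
    become: yes
    package:
      name:
        - python3-pip
        - python3-wheel

  - name: Install python2 pip and wheel
    become: yes
    package:
      name:
        - python2-pip
        - python2-wheel
    when: ansible_python.version.major == 2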

Update the documentation to describe that this enables bdist_wheel
support, and add a basic sanity test (in the playbook below) that
wheels are produced by pip.  The existing build-python-release job is
kept, although it is modified to use the playbooks/python/pre.yaml
playbook as the build job does.

Change-Id: I2ab11bb45b6b2a49d54db39195228ab40141185c
[1] https://review.opendev.org/#/c/736001/5/roles/build-python-release/tasks/main.yaml
2020-06-18 12:51:56 +00:00

- hosts: all
  tasks:
    # ensure-pip
    - name: Include ensure-pip
      include_role:
        name: ensure-pip

    - name: Create temp directory
      tempfile:
        state: directory
        suffix: venv-test
      register: _tmp_venv

    - name: Sanity check provided virtualenv command installs
      pip:
        name: tox
        virtualenv_command: '{{ ensure_pip_virtualenv_command }}'
        virtualenv: '{{ _tmp_venv.path }}'

    - name: Sanity check installed command runs without error
      command: '{{ _tmp_venv.path }}/bin/tox --version'

    - name: Remove tmpdir
      file:
        path: '{{ _tmp_venv.path }}'
        state: absent

    - name: Sanity check pip wheel generation
      shell: |
        cd {{ ansible_user_dir }}/src/opendev.org/zuul/zuul
        # This should run anywhere without too much logic ...
        run_pip=$(command -v pip3 || command -v pip2 || command -v pip)
        $run_pip wheel --no-deps .
        ls zuul-*.whl || exit 1

    # ensure-virtualenv
    - name: Include ensure-virtualenv
      include_role:
        name: ensure-virtualenv

    - name: Sanity check virtualenv command works
      shell: |
        tmp_venv=$(mktemp -d -t venv-XXXXXXXXXX)
        trap "rm -rf $tmp_venv" EXIT
        virtualenv $tmp_venv
        $tmp_venv/bin/pip install tox
      failed_when: false
      register: _virtualenv_sanity

    - name: Assert sanity check
      fail:
        msg: 'The virtualenv command does not appear to work!'
      when:
        - _virtualenv_sanity.rc != 0

# NOTE(ianw): this does not play nicely with pip-and-virtualenv, which
# has already installed from source.  We might be able to test this
# once it's gone...
# - hosts: all
#   roles:
#     - role: ensure-pip
#       vars:
#         ensure_pip_from_upstream: True