Split out oslo.packaging.

Include logic taken from d2to1 to allow us to inject into setup.py,
combined with the old openstack/common/setup.py code.

Change-Id: I27b341403bb8245e38f8e3c386f1a835b90b1843
This commit is contained in:
  parent 43e71193fb
  commit aa4641f0e6
27  .gitignore  vendored  Normal file
@@ -0,0 +1,27 @@
# Compiled files
*.py[co]
*.a
*.o
*.so

# Sphinx
_build

# Packages/installer info
*.egg
*.egg-info
dist
build
eggs
parts
bin
var
sdist
develop-eggs
.installed.cfg

# Other
.tox
.*.swp
.coverage
cover
4  .gitreview  Normal file
@@ -0,0 +1,4 @@
[gerrit]
host=review.openstack.org
port=29418
project=openstack/oslo-packaging.git
176  LICENSE  Normal file
@@ -0,0 +1,176 @@

                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.
75  README.rst  Normal file
@@ -0,0 +1,75 @@
Introduction
============

oslo.packaging provides a set of default python packaging configuration and
behaviors. It started off as a combination of an invasive fork of d2to1
and the openstack.common.setup module.

Behaviors
=========

Version strings will be inferred from git. If a given revision is tagged,
that's the version. If it's not, and you don't provide a version, the version
will be very similar to git describe. If you do, then we'll assume that's the
version you are working towards, and will generate alpha version strings
based on commits since last tag and the current git sha.

requirements.txt and test-requirements.txt will be used to populate
install requirements as needed.

Sphinx documentation setups are altered to generate man pages by default. They
also have several pieces of information that are known to setup.py injected
into the sphinx config.

Usage
=====
oslo.packaging requires a distribution to use distribute. Your distribution
must include a distutils2-like setup.cfg file, and a minimal
setup.py script. The purpose is not to actually support distutils2; that
effort has turned into a boondoggle. However, having declarative setup is a
great idea, and we can have that now if we don't care about distutils2 ever
being finished.

A simple sample can be found in oslo.packaging's own setup.cfg
(it uses its own machinery to install itself)::

    [metadata]
    name = oslo.packaging
    author = OpenStack Foundation
    author-email = openstack-dev@lists.openstack.org
    summary = OpenStack's setup automation in a reusable form
    description-file = README
    license = Apache-2
    classifier =
        Development Status :: 4 - Beta
        Environment :: Console
        Environment :: OpenStack
        Intended Audience :: Developers
        Intended Audience :: Information Technology
        License :: OSI Approved :: Apache Software License
        Operating System :: OS Independent
        Programming Language :: Python
    keywords =
        setup
        distutils

    [files]
    packages =
        oslo
        oslo.packaging

The minimal setup.py should look something like this::

    #!/usr/bin/env python

    from setuptools import setup

    setup(
        setup_requires=['oslo.packaging'],
        oslo_packaging=True
    )

Note that it's important to specify oslo_packaging=True or else the
oslo.packaging functionality will not be enabled.

It should also work fine if additional arguments are passed to `setup()`,
but it should be noted that they will be clobbered by any options in the
setup.cfg file.
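As a rough illustration of what the declarative form in the README buys you, the sketch below parses a `[metadata]` section into `setup()`-style keyword arguments using the stdlib `configparser`. The helper name `cfg_metadata_to_kwargs` is hypothetical; oslo.packaging's real parser (`util.cfg_to_args`) handles many more sections and options than this.

```python
# Hypothetical simplification: shows only the [metadata] -> setup() kwargs
# idea, not oslo.packaging's actual cfg_to_args implementation.
import configparser
import io

SAMPLE_CFG = """\
[metadata]
name = oslo.packaging
author = OpenStack Foundation
description-file = README
"""


def cfg_metadata_to_kwargs(text):
    """Turn the [metadata] section of a setup.cfg-style string into a dict
    usable as setuptools.setup(**kwargs)."""
    parser = configparser.ConfigParser()
    parser.read_file(io.StringIO(text))
    # dashed option names become underscored setup() keywords
    return {opt.replace('-', '_'): parser.get('metadata', opt)
            for opt in parser.options('metadata')}


print(cfg_metadata_to_kwargs(SAMPLE_CFG)['name'])  # oslo.packaging
```

The point of the design is that the same file is machine-readable without executing any project code, which arbitrary setup.py logic can never guarantee.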
16  oslo/__init__.py  Normal file
@@ -0,0 +1,16 @@
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.

__import__('pkg_resources').declare_namespace(__name__)
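The `declare_namespace` call above makes `oslo` a shared namespace package, so `oslo.packaging` can coexist with other `oslo.*` distributions that each ship their own copy of `oslo/__init__.py`. Here is a self-contained sketch of the same idea using the stdlib `pkgutil` analogue (not the `pkg_resources` mechanism this file actually uses); the `ns`, `mod_a`, and `mod_b` names are invented for the demo.

```python
# Sketch of the namespace-package idea via the stdlib pkgutil analogue.
# Two separate install roots each contribute a piece of the same
# top-level package "ns".
import os
import sys
import tempfile

root = tempfile.mkdtemp()
for part in ('a', 'b'):
    pkg_dir = os.path.join(root, part, 'ns')
    os.makedirs(pkg_dir)
    # every distribution ships the same tiny ns/__init__.py
    with open(os.path.join(pkg_dir, '__init__.py'), 'w') as f:
        f.write("from pkgutil import extend_path\n"
                "__path__ = extend_path(__path__, __name__)\n")
    with open(os.path.join(pkg_dir, 'mod_%s.py' % part), 'w') as f:
        f.write("VALUE = %r\n" % part)

sys.path[:0] = [os.path.join(root, 'a'), os.path.join(root, 'b')]
import ns
from ns import mod_a, mod_b  # both resolve despite living in different roots
print(mod_a.VALUE, mod_b.VALUE)  # a b
```

Whichever copy of `ns/__init__.py` gets imported first extends `ns.__path__` to cover every `ns/` directory on `sys.path`, which is exactly what `declare_namespace` arranges for `oslo`.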
14  oslo/packaging/__init__.py  Normal file
@@ -0,0 +1,14 @@
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
144  oslo/packaging/core.py  Normal file
@@ -0,0 +1,144 @@
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Copyright (C) 2013 Association of Universities for Research in
# Astronomy (AURA)

# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#    1. Redistributions of source code must retain the above copyright
#       notice, this list of conditions and the following disclaimer.
#
#    2. Redistributions in binary form must reproduce the above
#       copyright notice, this list of conditions and the following
#       disclaimer in the documentation and/or other materials provided
#       with the distribution.
#
#    3. The name of AURA and its representatives may not be used to
#       endorse or promote products derived from this software without
#       specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY AURA ``AS IS'' AND ANY EXPRESS OR IMPLIED
# WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
# MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL AURA BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
# BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS
# OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
# TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
# USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH
# DAMAGE.

from distutils.core import Distribution as _Distribution
from distutils.errors import DistutilsFileError, DistutilsSetupError
import os
import sys
import warnings

from setuptools.dist import _get_unpatched
import six

from oslo.packaging import util


_Distribution = _get_unpatched(_Distribution)


def setup(dist, attr, value):
    """Implements the actual oslo.packaging setup() keyword.

    When used, this should be the only keyword in your setup() aside from
    `setup_requires`.

    This works by reading the setup.cfg file, parsing out the supported
    metadata and command options, and using them to rebuild the
    `DistributionMetadata` object and set the newly added command options.

    The reason for doing things this way is that a custom `Distribution` class
    will not play nicely with setup_requires; however, this implementation may
    not work well with distributions that do use a `Distribution` subclass.
    """

    if not value:
        return
    path = os.path.abspath('setup.cfg')
    if not os.path.exists(path):
        raise DistutilsFileError(
            'The setup.cfg file %s does not exist.' % path)

    # Converts the setup.cfg file to setup() arguments
    try:
        attrs = util.cfg_to_args(path)
    except:
        e = sys.exc_info()[1]
        raise DistutilsSetupError(
            'Error parsing %s: %s: %s' % (path, e.__class__.__name__,
                                          six.u(e)))

    # Repeat some of the Distribution initialization code with the newly
    # provided attrs
    if attrs:

        # Handle additional setup processing
        if 'setup_requires' in attrs:
            attrs = chainload_setups(dist, attrs['setup_requires'], attrs)

        # Skips 'options' and 'licence' support which are rarely used; may add
        # back in later if demanded
        for key, val in six.iteritems(attrs):
            if hasattr(dist.metadata, 'set_' + key):
                getattr(dist.metadata, 'set_' + key)(val)
            elif hasattr(dist.metadata, key):
                setattr(dist.metadata, key, val)
            elif hasattr(dist, key):
                setattr(dist, key, val)
            else:
                msg = 'Unknown distribution option: %s' % repr(key)
                warnings.warn(msg)

    # Re-finalize the underlying Distribution
    _Distribution.finalize_options(dist)

    # This bit comes out of distribute/setuptools
    if isinstance(dist.metadata.version, six.integer_types + (float,)):
        # Some people apparently take "version number" too literally :)
        dist.metadata.version = str(dist.metadata.version)

    dist.command_options = util.DefaultGetDict(lambda: util.IgnoreDict(ignore))


def chainload_setups(dist, requires_list, attrs):
    try:
        import pip.command.install
    except ImportError:
        from setuptools.command.easy_install import easy_install
        cmd = easy_install(dist, args=["x"], install_dir=os.curdir,
                           exclude_scripts=True, always_copy=False,
                           build_directory=None, editable=False,
                           upgrade=False, multi_version=True, no_report=True)
        cmd.ensure_finalized()
        cmd.easy_install("req")
        import pip.command.install
    import pkg_resources

    pip_install = pip.command.install.InstallCommand()
    pip_install.run({}, requires_list)

    for ep in pkg_resources.iter_entry_points('oslo.packaging.attr_filters'):
        filter_method = ep.load()
        attrs = filter_method(attrs)

    return attrs
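The attribute-dispatch loop inside `setup()` above (prefer a `set_<key>` method on the metadata object, fall back to a metadata attribute, then a `Distribution` attribute, and otherwise warn) can be exercised in isolation. The `Fake*` classes below are stand-ins for the distutils types, and `apply_attrs` collects unknown keys instead of warning, so the sketch stays deterministic.

```python
# Stand-in sketch of the set_<key> / metadata-attr / dist-attr dispatch
# order used by the setup() keyword handler above (not distutils itself).
class FakeMetadata:
    def __init__(self):
        self.version = None
        self._keywords = None

    def set_keywords(self, value):
        # a set_<key> method wins over plain attribute assignment
        self._keywords = value.split()


class FakeDist:
    def __init__(self):
        self.metadata = FakeMetadata()
        self.zip_safe = None


def apply_attrs(dist, attrs):
    """Apply parsed attrs in the same priority order as setup() above;
    return the keys nothing claimed."""
    unknown = []
    for key, val in attrs.items():
        if hasattr(dist.metadata, 'set_' + key):
            getattr(dist.metadata, 'set_' + key)(val)
        elif hasattr(dist.metadata, key):
            setattr(dist.metadata, key, val)
        elif hasattr(dist, key):
            setattr(dist, key, val)
        else:
            unknown.append(key)
    return unknown


dist = FakeDist()
leftover = apply_attrs(dist, {'version': '1.0',
                              'keywords': 'setup distutils',
                              'zip_safe': False,
                              'bogus': 1})
print(leftover)  # ['bogus']
```

This ordering matters because some metadata fields (like `keywords`) need parsing on assignment, which the `set_<key>` methods provide.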
369  oslo/packaging/packaging.py  Normal file
@ -0,0 +1,369 @@
|
||||
# vim: tabstop=4 shiftwidth=4 softtabstop=4
|
||||
|
||||
# Copyright 2011 OpenStack LLC.
|
||||
# Copyright 2012-2013 Hewlett-Packard Development Company, L.P.
|
||||
# All Rights Reserved.
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
"""
|
||||
Utilities with minimum-depends for use in setup.py
|
||||
"""
|
||||
|
||||
import email
|
||||
import os
|
||||
import re
|
||||
import subprocess
|
||||
import sys
|
||||
|
||||
from setuptools.command import sdist
|
||||
|
||||
|
||||
def parse_mailmap(mailmap='.mailmap'):
|
||||
mapping = {}
|
||||
if os.path.exists(mailmap):
|
||||
with open(mailmap, 'r') as fp:
|
||||
for l in fp:
|
||||
try:
|
||||
canonical_email, alias = re.match(
|
||||
r'[^#]*?(<.+>).*(<.+>).*', l).groups()
|
||||
except AttributeError:
|
||||
continue
|
||||
mapping[alias] = canonical_email
|
||||
return mapping
|
||||
|
||||
|
||||
def _parse_git_mailmap(git_dir, mailmap='.mailmap'):
|
||||
mailmap = os.path.join(os.path.dirname(git_dir), mailmap)
|
||||
return parse_mailmap(mailmap)
|
||||
|
||||
|
||||
def canonicalize_emails(changelog, mapping):
|
||||
"""Takes in a string and an email alias mapping and replaces all
|
||||
instances of the aliases in the string with their real email.
|
||||
"""
|
||||
for alias, email_address in mapping.iteritems():
|
||||
changelog = changelog.replace(alias, email_address)
|
||||
return changelog
|
||||
|
||||
|
||||
# Get requirements from the first file that exists
|
||||
def get_reqs_from_files(requirements_files):
|
||||
for requirements_file in requirements_files:
|
||||
if os.path.exists(requirements_file):
|
||||
with open(requirements_file, 'r') as fil:
|
||||
return fil.read().split('\n')
|
||||
return []
|
||||
|
||||
|
||||
def parse_requirements(requirements_files=['requirements.txt',
|
||||
'tools/pip-requires']):
|
||||
requirements = []
|
||||
for line in get_reqs_from_files(requirements_files):
|
||||
# For the requirements list, we need to inject only the portion
|
||||
# after egg= so that distutils knows the package it's looking for
|
||||
# such as:
|
||||
# -e git://github.com/openstack/nova/master#egg=nova
|
||||
if re.match(r'\s*-e\s+', line):
|
||||
requirements.append(re.sub(r'\s*-e\s+.*#egg=(.*)$', r'\1',
|
||||
line))
|
||||
# such as:
|
||||
# http://github.com/openstack/nova/zipball/master#egg=nova
|
||||
elif re.match(r'\s*https?:', line):
|
||||
requirements.append(re.sub(r'\s*https?:.*#egg=(.*)$', r'\1',
|
||||
line))
|
||||
# -f lines are for index locations, and don't get used here
|
||||
elif re.match(r'\s*-f\s+', line):
|
||||
pass
|
||||
# argparse is part of the standard library starting with 2.7
|
||||
# adding it to the requirements list screws distro installs
|
||||
elif line == 'argparse' and sys.version_info >= (2, 7):
|
||||
pass
|
||||
else:
|
||||
requirements.append(line)
|
||||
|
||||
return requirements
|
||||
|
||||
|
||||
def parse_dependency_links(requirements_files=['requirements.txt',
|
||||
'tools/pip-requires']):
|
||||
dependency_links = []
|
||||
# dependency_links inject alternate locations to find packages listed
|
||||
# in requirements
|
||||
for line in get_reqs_from_files(requirements_files):
|
||||
# skip comments and blank lines
|
||||
if re.match(r'(\s*#)|(\s*$)', line):
|
||||
continue
|
||||
# lines with -e or -f need the whole line, minus the flag
|
||||
if re.match(r'\s*-[ef]\s+', line):
|
||||
dependency_links.append(re.sub(r'\s*-[ef]\s+', '', line))
|
||||
# lines that are only urls can go in unmolested
|
||||
elif re.match(r'\s*https?:', line):
|
||||
dependency_links.append(line)
|
||||
return dependency_links
|
||||
|
||||
|
||||
def _run_shell_command(cmd, throw_on_error=False):
|
||||
if os.name == 'nt':
|
||||
output = subprocess.Popen(["cmd.exe", "/C", cmd],
|
||||
stdout=subprocess.PIPE,
|
||||
stderr=subprocess.PIPE)
|
||||
else:
|
||||
output = subprocess.Popen(["/bin/sh", "-c", cmd],
|
||||
stdout=subprocess.PIPE,
|
||||
stderr=subprocess.PIPE)
|
||||
out = output.communicate()
|
||||
if output.returncode and throw_on_error:
|
||||
raise Exception("%s returned %d" % cmd, output.returncode)
|
||||
if len(out) == 0:
|
||||
return None
|
||||
if len(out[0].strip()) == 0:
|
||||
return None
|
||||
return out[0].strip()
|
||||
|
||||
|
||||
def _get_git_directory():
|
||||
parent_dir = os.path.dirname(__file__)
|
||||
while True:
|
||||
git_dir = os.path.join(parent_dir, '.git')
|
||||
if os.path.exists(git_dir):
|
||||
return git_dir
|
||||
parent_dir, child = os.path.split(parent_dir)
|
||||
if not child: # reached to root dir
|
||||
return None
|
||||
|
||||
|
||||
def write_git_changelog():
|
||||
"""Write a changelog based on the git changelog."""
|
||||
new_changelog = 'ChangeLog'
|
||||
git_dir = _get_git_directory()
|
||||
if not os.getenv('SKIP_WRITE_GIT_CHANGELOG'):
|
||||
if git_dir:
|
||||
git_log_cmd = 'git --git-dir=%s log --stat' % git_dir
|
||||
changelog = _run_shell_command(git_log_cmd)
|
||||
mailmap = _parse_git_mailmap(git_dir)
|
||||
with open(new_changelog, "w") as changelog_file:
|
||||
changelog_file.write(canonicalize_emails(changelog, mailmap))
|
||||
else:
|
||||
open(new_changelog, 'w').close()
|
||||
|
||||
|
||||
def generate_authors():
|
||||
"""Create AUTHORS file using git commits."""
|
||||
jenkins_email = 'jenkins@review.(openstack|stackforge).org'
|
||||
old_authors = 'AUTHORS.in'
|
||||
new_authors = 'AUTHORS'
|
||||
git_dir = _get_git_directory()
|
||||
if not os.getenv('SKIP_GENERATE_AUTHORS'):
|
||||
if git_dir:
|
||||
# don't include jenkins email address in AUTHORS file
|
||||
git_log_cmd = ("git --git-dir=" + git_dir +
|
||||
" log --format='%aN <%aE>' | sort -u | "
|
||||
"egrep -v '" + jenkins_email + "'")
|
||||
changelog = _run_shell_command(git_log_cmd)
|
||||
mailmap = _parse_git_mailmap(git_dir)
|
||||
with open(new_authors, 'w') as new_authors_fh:
|
||||
new_authors_fh.write(canonicalize_emails(changelog, mailmap))
|
||||
if os.path.exists(old_authors):
|
||||
with open(old_authors, "r") as old_authors_fh:
|
||||
new_authors_fh.write('\n' + old_authors_fh.read())
|
||||
else:
|
||||
open(new_authors, 'w').close()
|
||||
|
||||
|
||||
_rst_template = """%(heading)s
|
||||
%(underline)s
|
||||
|
||||
.. automodule:: %(module)s
|
||||
:members:
|
||||
:undoc-members:
|
||||
:show-inheritance:
|
||||
"""
|
||||
|
||||
|
||||
def get_cmdclass():
|
||||
"""Return dict of commands to run from setup.py."""
|
||||
|
||||
cmdclass = dict()
|
||||
|
||||
def _find_modules(arg, dirname, files):
|
||||
for filename in files:
|
||||
if filename.endswith('.py') and filename != '__init__.py':
|
||||
arg["%s.%s" % (dirname.replace('/', '.'),
|
||||
filename[:-3])] = True
|
||||
|
||||
class LocalSDist(sdist.sdist):
|
||||
"""Builds the ChangeLog and Authors files from VC first."""
|
||||
|
||||
def run(self):
|
||||
write_git_changelog()
|
||||
generate_authors()
|
||||
# sdist.sdist is an old style class, can't use super()
|
||||
sdist.sdist.run(self)
|
||||
|
||||
cmdclass['sdist'] = LocalSDist
|
||||
|
||||
# If Sphinx is installed on the box running setup.py,
|
||||
# enable setup.py to build the documentation, otherwise,
|
||||
# just ignore it
|
||||
try:
|
||||
from sphinx.setup_command import BuildDoc
|
||||
|
||||
class LocalBuildDoc(BuildDoc):
|
||||
|
||||
builders = ['html', 'man']
|
||||
|
||||
def generate_autoindex(self):
|
||||
print "**Autodocumenting from %s" % os.path.abspath(os.curdir)
|
||||
modules = {}
|
||||
option_dict = self.distribution.get_option_dict('build_sphinx')
|
||||
source_dir = os.path.join(option_dict['source_dir'][1], 'api')
|
||||
if not os.path.exists(source_dir):
|
||||
os.makedirs(source_dir)
|
||||
for pkg in self.distribution.packages:
|
||||
if '.' not in pkg:
|
||||
os.path.walk(pkg, _find_modules, modules)
|
||||
module_list = modules.keys()
|
||||
module_list.sort()
|
||||
autoindex_filename = os.path.join(source_dir, 'autoindex.rst')
|
||||
with open(autoindex_filename, 'w') as autoindex:
|
||||
autoindex.write(""".. toctree::
|
||||
:maxdepth: 1
|
||||
|
||||
""")
|
||||
for module in module_list:
|
||||
output_filename = os.path.join(source_dir,
|
||||
"%s.rst" % module)
|
||||
heading = "The :mod:`%s` Module" % module
|
||||
underline = "=" * len(heading)
|
||||
values = dict(module=module, heading=heading,
|
||||
underline=underline)
|
||||
|
||||
print "Generating %s" % output_filename
|
||||
with open(output_filename, 'w') as output_file:
|
||||
output_file.write(_rst_template % values)
|
||||
autoindex.write(" %s.rst\n" % module)
|
||||
|
||||
def run(self):
|
||||
if not os.getenv('SPHINX_DEBUG'):
|
||||
self.generate_autoindex()
|
||||
|
||||
for builder in self.builders:
|
||||
self.builder = builder
|
||||
self.finalize_options()
|
||||
self.project = self.distribution.get_name()
|
||||
self.version = self.distribution.get_version()
|
||||
self.release = self.distribution.get_version()
|
||||
BuildDoc.run(self)
|
||||
|
||||
class LocalBuildLatex(LocalBuildDoc):
|
||||
builders = ['latex']
|
||||
|
||||
cmdclass['build_sphinx'] = LocalBuildDoc
|
||||
cmdclass['build_sphinx_latex'] = LocalBuildLatex
|
||||
except ImportError:
|
||||
pass
|
||||
|
||||
return cmdclass
|

def _get_revno(git_dir):
    """Return the number of commits since the most recent tag.

    We use git-describe to find this out, but if there are no
    tags then we fall back to counting commits since the beginning
    of time.
    """
    describe = _run_shell_command(
        "git --git-dir=%s describe --always" % git_dir)
    if "-" in describe:
        return describe.rsplit("-", 2)[-2]

    # no tags found
    revlist = _run_shell_command(
        "git --git-dir=%s rev-list --abbrev-commit HEAD" % git_dir)
    return len(revlist.splitlines())
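As a standalone sketch (not part of the diff), the `git describe` parsing above works like this: a describe string such as `1.2.0-5-g1a2b3c4` carries the commits-since-tag count in its second-to-last hyphen-separated field. The helper name `revno_from_describe` is hypothetical, introduced only for illustration.

```python
# Standalone sketch of _get_revno's parsing step; revno_from_describe
# is a hypothetical helper, not part of the module.
def revno_from_describe(describe):
    if "-" in describe:
        # "1.2.0-5-g1a2b3c4" -> "5" (5 commits since the 1.2.0 tag)
        return describe.rsplit("-", 2)[-2]
    return None  # bare tag or bare SHA: caller falls back to rev-list

assert revno_from_describe("1.2.0-5-g1a2b3c4") == "5"
assert revno_from_describe("1a2b3c4") is None
```

Using `rsplit` with a limit of 2 keeps hyphens inside the tag name itself (e.g. `havana-rc1-5-gabc`) from splitting the count incorrectly.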


def _get_version_from_git(pre_version):
    """Return a version which is equal to the tag that's on the current
    revision if there is one, or tag plus number of additional revisions
    if the current revision has no tag."""

    git_dir = _get_git_directory()
    if git_dir:
        if pre_version:
            try:
                return _run_shell_command(
                    "git --git-dir=" + git_dir + " describe --exact-match",
                    throw_on_error=True).replace('-', '.')
            except Exception:
                sha = _run_shell_command(
                    "git --git-dir=" + git_dir + " log -n1 --pretty=format:%h")
                return "%s.a%s.g%s" % (pre_version, _get_revno(git_dir), sha)
        else:
            return _run_shell_command(
                "git --git-dir=" + git_dir + " describe --always").replace(
                    '-', '.')
    return None


def _get_version_from_pkg_info(package_name):
    """Get the version from PKG-INFO file if we can."""
    try:
        pkg_info_file = open('PKG-INFO', 'r')
    except (IOError, OSError):
        return None
    try:
        pkg_info = email.message_from_file(pkg_info_file)
    except email.MessageError:
        return None
    # Check to make sure we're in our own dir
    if pkg_info.get('Name', None) != package_name:
        return None
    return pkg_info.get('Version', None)
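The trick above relies on PKG-INFO being RFC 822-style headers, which the stdlib `email` parser reads directly. A minimal standalone illustration (the sample PKG-INFO text is invented for the example):

```python
import email

# PKG-INFO files are RFC 822-style header blocks, so the stdlib email
# parser can read them the same way _get_version_from_pkg_info does.
PKG_INFO = """\
Metadata-Version: 1.0
Name: oslo.packaging
Version: 0.1
"""

pkg_info = email.message_from_string(PKG_INFO)
assert pkg_info.get('Name') == 'oslo.packaging'
assert pkg_info.get('Version') == '0.1'
```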


def get_version(package_name, pre_version=None):
    """Get the version of the project. First, try getting it from PKG-INFO, if
    it exists. If it does, that means we're in a distribution tarball or that
    install has happened. Otherwise, if there is no PKG-INFO file, pull the
    version from git.

    We do not support setup.py version sanity in git archive tarballs, nor do
    we support packagers directly sucking our git repo into theirs. We expect
    that a source tarball be made from our git repo - or that if someone wants
    to make a source tarball from a fork of our repo with additional tags in it
    that they understand and desire the results of doing that.
    """
    version = os.environ.get("OSLO_PACKAGE_VERSION", None)
    if version:
        return version
    version = _get_version_from_pkg_info(package_name)
    if version:
        return version
    version = _get_version_from_git(pre_version)
    if version:
        return version
    raise Exception("Versioning for this project requires either an sdist"
                    " tarball, or access to an upstream git repository.")


def attr_filter(attrs):
    """Filter attrs parsed from a setup.cfg to inject our defaults."""
    attrs['version'] = get_version(attrs['name'], attrs.get('version', None))
    attrs['cmdclass'] = get_cmdclass()
    attrs['install_requires'] = parse_requirements()
    attrs['dependency_links'] = parse_dependency_links()
    attrs['include_package_data'] = True
    return attrs
629
oslo/packaging/util.py
Normal file
@@ -0,0 +1,629 @@
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Copyright (C) 2013 Association of Universities for Research in Astronomy (AURA)
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# 1. Redistributions of source code must retain the above copyright
#    notice, this list of conditions and the following disclaimer.
#
# 2. Redistributions in binary form must reproduce the above
#    copyright notice, this list of conditions and the following
#    disclaimer in the documentation and/or other materials provided
#    with the distribution.
#
# 3. The name of AURA and its representatives may not be used to
#    endorse or promote products derived from this software without
#    specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY AURA ``AS IS'' AND ANY EXPRESS OR IMPLIED
# WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
# MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL AURA BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
# BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS
# OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
# TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
# USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH
# DAMAGE.

"""The code in this module is mostly copy/pasted out of the distutils2 source
code, as recommended by Tarek Ziade. As such, it may be subject to some change
as distutils2 development continues, and will have to be kept up to date.

I didn't want to use it directly from distutils2 itself, since I do not want it
to be an installation dependency for our packages yet--it is still too unstable
(the latest version on PyPI doesn't even install).
"""

# These first two imports are not used, but are needed to get around an
# irritating Python bug that can crop up when using ./setup.py test.
# See: http://www.eby-sarna.com/pipermail/peak/2010-May/003355.html
try:
    import multiprocessing
    _ = multiprocessing
except ImportError:
    pass
import logging
_ = logging

import os
import re
import sys
import traceback

from collections import defaultdict

import distutils.ccompiler

from distutils import log
from distutils.errors import (DistutilsOptionError, DistutilsModuleError,
                              DistutilsFileError)
from setuptools.command.egg_info import manifest_maker
from setuptools.dist import Distribution
from setuptools.extension import Extension

try:
    from configparser import RawConfigParser
except ImportError:
    from ConfigParser import RawConfigParser

from oslo.packaging import packaging

# A simplified RE for this; just checks that the line ends with version
# predicates in ()
_VERSION_SPEC_RE = re.compile(r'\s*(.*?)\s*\((.*)\)\s*$')


# Mappings from setup() keyword arguments to setup.cfg options;
# The values are (section, option) tuples, or simply (section,) tuples if
# the option has the same name as the setup() argument
D1_D2_SETUP_ARGS = {
    "name": ("metadata",),
    "version": ("metadata",),
    "author": ("metadata",),
    "author_email": ("metadata",),
    "maintainer": ("metadata",),
    "maintainer_email": ("metadata",),
    "url": ("metadata", "home_page"),
    "description": ("metadata", "summary"),
    "keywords": ("metadata",),
    "long_description": ("metadata", "description"),
    "download-url": ("metadata",),
    "classifiers": ("metadata", "classifier"),
    "platforms": ("metadata", "platform"),  # **
    "license": ("metadata",),
    # Use setuptools install_requires, not
    # broken distutils requires
    "install_requires": ("metadata", "requires_dist"),
    "tests_require": ("metadata",),
    "setup_requires": ("metadata",),
    "provides": ("metadata", "provides_dist"),  # **
    "obsoletes": ("metadata", "obsoletes_dist"),  # **
    "package_dir": ("files", 'packages_root'),
    "packages": ("files",),
    "package_data": ("files",),
    "data_files": ("files",),
    "scripts": ("files",),
    "namespace_packages": ("files",),
    "py_modules": ("files", "modules"),  # **
    "cmdclass": ("global", "commands"),
    "use_2to3": ("metadata",),
    "zip_safe": ("metadata",),
}

# setup() arguments that can have multiple values in setup.cfg
MULTI_FIELDS = ("classifiers",
                "platforms",
                "install_requires",
                "provides",
                "obsoletes",
                "packages",
                "namespace_packages",
                "package_data",
                "data_files",
                "scripts",
                "py_modules",
                "tests_require",
                "setup_requires",
                "cmdclass")

# setup() arguments that contain boolean values
BOOL_FIELDS = ("use_2to3", "zip_safe")


CSV_FIELDS = ("keywords",)


log.set_verbosity(log.INFO)


def resolve_name(name):
    """Resolve a name like ``module.object`` to an object and return it.

    Raise ImportError if the module or name is not found.
    """

    parts = name.split('.')
    cursor = len(parts) - 1
    module_name = parts[:cursor]
    attr_name = parts[-1]

    while cursor > 0:
        try:
            ret = __import__('.'.join(module_name), fromlist=[attr_name])
            break
        except ImportError:
            if cursor == 0:
                raise
            cursor -= 1
            module_name = parts[:cursor]
            attr_name = parts[cursor]
            ret = ''

    for part in parts[cursor:]:
        try:
            ret = getattr(ret, part)
        except AttributeError:
            raise ImportError(name)

    return ret


def cfg_to_args(path='setup.cfg'):
    """Distutils2 to distutils1 compatibility util.

    This method uses an existing setup.cfg to generate a dictionary of
    keywords that can be used by distutils.core.setup(**kwargs).

    :param path:
        The setup.cfg path.
    :raises DistutilsFileError:
        When the setup.cfg file is not found.

    """

    # The method source code really starts here.
    parser = RawConfigParser()
    if not os.path.exists(path):
        raise DistutilsFileError("file '%s' does not exist" %
                                 os.path.abspath(path))
    parser.read(path)
    config = {}
    for section in parser.sections():
        config[section] = dict(parser.items(section))

    # Run setup_hooks, if configured
    setup_hooks = has_get_option(config, 'global', 'setup_hooks')
    package_dir = has_get_option(config, 'files', 'packages_root')

    # Add the source package directory to sys.path in case it contains
    # additional hooks, and to make sure it's on the path before any existing
    # installations of the package
    if package_dir:
        package_dir = os.path.abspath(package_dir)
        sys.path.insert(0, package_dir)

    try:
        if setup_hooks:
            setup_hooks = split_multiline(setup_hooks)
            for hook in setup_hooks:
                hook_fn = resolve_name(hook)
                try:
                    hook_fn(config)
                except:
                    e = sys.exc_info()[1]
                    log.error('setup hook %s raised exception: %s\n' %
                              (hook, e))
                    log.error(traceback.format_exc())
                    sys.exit(1)

        kwargs = setup_cfg_to_setup_kwargs(config)

        register_custom_compilers(config)

        ext_modules = get_extension_modules(config)
        if ext_modules:
            kwargs['ext_modules'] = ext_modules

        entry_points = get_entry_points(config)
        if entry_points:
            kwargs['entry_points'] = entry_points

        wrap_commands(kwargs)

        # Handle the [files]/extra_files option
        extra_files = has_get_option(config, 'files', 'extra_files')
        if extra_files:
            extra_files = split_multiline(extra_files)
            # Let's do a sanity check
            for filename in extra_files:
                if not os.path.exists(filename):
                    raise DistutilsFileError(
                        '%s from the extra_files option in setup.cfg does not '
                        'exist' % filename)
            # Unfortunately the only really sensible way to do this is to
            # monkey-patch the manifest_maker class
            @monkeypatch_method(manifest_maker)
            def add_defaults(self, extra_files=extra_files, log=log):
                log.info('[oslo.packaging] running patched manifest_maker'
                         ' command with extra_files support')
                add_defaults._orig(self)
                self.filelist.extend(extra_files)

    finally:
        # Perform cleanup if any paths were added to sys.path
        if package_dir:
            sys.path.pop(0)

    return kwargs
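The first step of `cfg_to_args` (flattening the parsed config into a `{section: {option: value}}` dict) can be exercised standalone; the sample config text here is modeled on this project's own `setup.cfg`:

```python
try:
    from configparser import RawConfigParser  # Python 3
except ImportError:
    from ConfigParser import RawConfigParser  # Python 2

CFG = """
[metadata]
name = oslo.packaging
author = OpenStack

[files]
packages =
    oslo
    oslo.packaging
"""

parser = RawConfigParser()
try:
    parser.read_string(CFG)       # Python 3
except AttributeError:
    import io
    parser.readfp(io.StringIO(CFG))  # Python 2 fallback

# The same flattening cfg_to_args performs after parser.read(path).
config = dict((section, dict(parser.items(section)))
              for section in parser.sections())

assert config['metadata']['name'] == 'oslo.packaging'
assert 'oslo.packaging' in config['files']['packages'].split()
```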


def filtered_args(attr):
    """Process and pass on the attrs dict."""
    return packaging.attr_filter(attr)


def setup_cfg_to_setup_kwargs(config):
    """Processes the setup.cfg options and converts them to arguments accepted
    by setuptools' setup() function.
    """

    kwargs = {}

    for arg in D1_D2_SETUP_ARGS:
        if len(D1_D2_SETUP_ARGS[arg]) == 2:
            # The distutils field name is different than distutils2's.
            section, option = D1_D2_SETUP_ARGS[arg]

        elif len(D1_D2_SETUP_ARGS[arg]) == 1:
            # The distutils field name is the same as distutils2's.
            section = D1_D2_SETUP_ARGS[arg][0]
            option = arg

        in_cfg_value = has_get_option(config, section, option)
        if not in_cfg_value:
            # There is no such option in the setup.cfg
            if arg == "long_description":
                in_cfg_value = has_get_option(config, section,
                                              "description_file")
                if in_cfg_value:
                    in_cfg_value = split_multiline(in_cfg_value)
                    value = ''
                    for filename in in_cfg_value:
                        description_file = open(filename)
                        try:
                            value += description_file.read().strip() + '\n\n'
                        finally:
                            description_file.close()
                    in_cfg_value = value
            else:
                continue

        if arg in CSV_FIELDS:
            in_cfg_value = split_csv(in_cfg_value)
        if arg in MULTI_FIELDS:
            in_cfg_value = split_multiline(in_cfg_value)
        elif arg in BOOL_FIELDS:
            # Provide some flexibility here...
            if in_cfg_value.lower() in ('true', 't', '1', 'yes', 'y'):
                in_cfg_value = True
            else:
                in_cfg_value = False

        if in_cfg_value:
            if arg in ('install_requires', 'tests_require'):
                # Replaces PEP345-style version specs with the sort expected by
                # setuptools
                in_cfg_value = [_VERSION_SPEC_RE.sub(r'\1\2', pred)
                                for pred in in_cfg_value]
            elif arg == 'package_dir':
                in_cfg_value = {'': in_cfg_value}
            elif arg in ('package_data', 'data_files'):
                data_files = {}
                firstline = True
                prev = None
                for line in in_cfg_value:
                    if '=' in line:
                        key, value = line.split('=', 1)
                        key, value = (key.strip(), value.strip())
                        if key in data_files:
                            # Multiple duplicates of the same package name;
                            # this is for backwards compatibility of the old
                            # format prior to d2to1 0.2.6.
                            prev = data_files[key]
                            prev.extend(value.split())
                        else:
                            prev = data_files[key.strip()] = value.split()
                    elif firstline:
                        raise DistutilsOptionError(
                            'malformed package_data first line %r (misses '
                            '"=")' % line)
                    else:
                        prev.extend(line.strip().split())
                    firstline = False
                if arg == 'data_files':
                    # the data_files value is a pointlessly different structure
                    # from the package_data value
                    data_files = data_files.items()
                in_cfg_value = data_files
            elif arg == 'cmdclass':
                cmdclass = {}
                dist = Distribution()
                for cls in in_cfg_value:
                    cls = resolve_name(cls)
                    cmd = cls(dist)
                    cmdclass[cmd.get_command_name()] = cls
                in_cfg_value = cmdclass

        kwargs[arg] = in_cfg_value

    return kwargs


def register_custom_compilers(config):
    """Handle custom compilers; this has no real equivalent in distutils, where
    additional compilers could only be added programmatically, so we have to
    hack it in somehow.
    """

    compilers = has_get_option(config, 'global', 'compilers')
    if compilers:
        compilers = split_multiline(compilers)
        for compiler in compilers:
            compiler = resolve_name(compiler)

            # In distutils2 compilers these class attributes exist; for
            # distutils1 we just have to make something up
            if hasattr(compiler, 'name'):
                name = compiler.name
            else:
                name = compiler.__name__
            if hasattr(compiler, 'description'):
                desc = compiler.description
            else:
                desc = 'custom compiler %s' % name

            module_name = compiler.__module__
            # Note; this *will* override built in compilers with the same name
            # TODO: Maybe display a warning about this?
            cc = distutils.ccompiler.compiler_class
            cc[name] = (module_name, compiler.__name__, desc)

            # HACK!!!! Distutils assumes all compiler modules are in the
            # distutils package
            sys.modules['distutils.' + module_name] = sys.modules[module_name]


def get_extension_modules(config):
    """Handle extension modules"""

    EXTENSION_FIELDS = ("sources",
                        "include_dirs",
                        "define_macros",
                        "undef_macros",
                        "library_dirs",
                        "libraries",
                        "runtime_library_dirs",
                        "extra_objects",
                        "extra_compile_args",
                        "extra_link_args",
                        "export_symbols",
                        "swig_opts",
                        "depends")

    ext_modules = []
    for section in config:
        if ':' in section:
            labels = section.split(':', 1)
        else:
            # Backwards compatibility for old syntax; don't use this though
            labels = section.split('=', 1)
        labels = [l.strip() for l in labels]
        if (len(labels) == 2) and (labels[0] == 'extension'):
            ext_args = {}
            for field in EXTENSION_FIELDS:
                value = has_get_option(config, section, field)
                # All extension module options besides name can have multiple
                # values
                if not value:
                    continue
                value = split_multiline(value)
                if field == 'define_macros':
                    macros = []
                    for macro in value:
                        macro = macro.split('=', 1)
                        if len(macro) == 1:
                            macro = (macro[0].strip(), None)
                        else:
                            macro = (macro[0].strip(), macro[1].strip())
                        macros.append(macro)
                    value = macros
                ext_args[field] = value
            if ext_args:
                if 'name' not in ext_args:
                    ext_args['name'] = labels[1]
                ext_modules.append(Extension(ext_args.pop('name'),
                                             **ext_args))
    return ext_modules


def get_entry_points(config):
    """Process the [entry_points] section of setup.cfg to handle setuptools
    entry points. This is, of course, not a standard feature of
    distutils2/packaging, but as there is not currently a standard alternative
    in packaging, we provide support for them.
    """

    if 'entry_points' not in config:
        return {}

    return dict((option, split_multiline(value))
                for option, value in config['entry_points'].items())


def wrap_commands(kwargs):
    dist = Distribution()

    # This should suffice to get the same config values and command classes
    # that the actual Distribution will see (not counting cmdclass, which is
    # handled below)
    dist.parse_config_files()

    for cmd, _ in dist.get_command_list():
        hooks = {}
        for opt, val in dist.get_option_dict(cmd).items():
            val = val[1]
            if opt.startswith('pre_hook.') or opt.startswith('post_hook.'):
                hook_type, alias = opt.split('.', 1)
                hook_dict = hooks.setdefault(hook_type, {})
                hook_dict[alias] = val
        if not hooks:
            continue

        if 'cmdclass' in kwargs and cmd in kwargs['cmdclass']:
            cmdclass = kwargs['cmdclass'][cmd]
        else:
            cmdclass = dist.get_command_class(cmd)

        new_cmdclass = wrap_command(cmd, cmdclass, hooks)
        kwargs.setdefault('cmdclass', {})[cmd] = new_cmdclass


def wrap_command(cmd, cmdclass, hooks):
    def run(self, cmdclass=cmdclass):
        self.run_command_hooks('pre_hook')
        cmdclass.run(self)
        self.run_command_hooks('post_hook')

    return type(cmd, (cmdclass, object),
                {'run': run, 'run_command_hooks': run_command_hooks,
                 'pre_hook': hooks.get('pre_hook'),
                 'post_hook': hooks.get('post_hook')})


def run_command_hooks(cmd_obj, hook_kind):
    """Run hooks registered for that command and phase.

    *cmd_obj* is a finalized command object; *hook_kind* is either
    'pre_hook' or 'post_hook'.
    """

    if hook_kind not in ('pre_hook', 'post_hook'):
        raise ValueError('invalid hook kind: %r' % hook_kind)

    hooks = getattr(cmd_obj, hook_kind, None)

    if hooks is None:
        return

    for hook in hooks.values():
        if isinstance(hook, str):
            try:
                hook_obj = resolve_name(hook)
            except ImportError:
                err = sys.exc_info()[1]  # For py3k
                raise DistutilsModuleError('cannot find hook %s: %s' %
                                           (hook, err))
        else:
            hook_obj = hook

        if not hasattr(hook_obj, '__call__'):
            raise DistutilsOptionError('hook %r is not callable' % hook)

        log.info('running %s %s for command %s',
                 hook_kind, hook, cmd_obj.get_command_name())

        try:
            hook_obj(cmd_obj)
        except:
            e = sys.exc_info()[1]
            log.error('hook %s raised exception: %s\n' % (hook, e))
            log.error(traceback.format_exc())
            sys.exit(1)


def has_get_option(config, section, option):
    if section in config and option in config[section]:
        return config[section][option]
    elif section in config and option.replace('_', '-') in config[section]:
        return config[section][option.replace('_', '-')]
    else:
        return False


def split_multiline(value):
    """Special behaviour when we have multi-line options"""

    value = [element for element in
             (line.strip() for line in value.split('\n'))
             if element]
    return value
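For illustration, `split_multiline` turns a multi-line setup.cfg value into a clean list, stripping surrounding whitespace and dropping blank lines:

```python
# Same logic as split_multiline above, exercised standalone.
def split_multiline(value):
    return [element for element in
            (line.strip() for line in value.split('\n'))
            if element]

assert split_multiline('\n  oslo\n\n  oslo.packaging\n') == [
    'oslo', 'oslo.packaging']
```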


def split_csv(value):
    """Special behaviour when we have comma-separated options"""

    value = [element for element in
             (chunk.strip() for chunk in value.split(','))
             if element]
    return value


def monkeypatch_method(cls):
    """A function decorator to monkey-patch a method of the same name on the
    given class.
    """

    def wrapper(func):
        orig = getattr(cls, func.__name__, None)
        if orig and not hasattr(orig, '_orig'):  # not already patched
            setattr(func, '_orig', orig)
            setattr(cls, func.__name__, func)
        return func

    return wrapper
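The decorator above keeps a reference to the original method in `_orig` so the replacement can delegate, which is how the patched `manifest_maker.add_defaults` in `cfg_to_args` calls through to the stock behaviour. A standalone sketch with an invented `Greeter` class:

```python
# Same decorator as monkeypatch_method above; Greeter is a made-up
# example class, not part of the module.
def monkeypatch_method(cls):
    def wrapper(func):
        orig = getattr(cls, func.__name__, None)
        if orig and not hasattr(orig, '_orig'):  # not already patched
            setattr(func, '_orig', orig)
            setattr(cls, func.__name__, func)
        return func
    return wrapper

class Greeter(object):
    def greet(self):
        return 'hello'

@monkeypatch_method(Greeter)
def greet(self):
    # delegate to the original via the saved _orig reference
    return greet._orig(self) + ', patched'

assert Greeter().greet() == 'hello, patched'
```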


# The following classes are used to hack Distribution.command_options a bit
class DefaultGetDict(defaultdict):
    """Like defaultdict, but the get() method also sets and returns the default
    value.
    """

    def get(self, key, default=None):
        if default is None:
            default = self.default_factory()
        return super(DefaultGetDict, self).setdefault(key, default)
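Unlike plain `dict.get()`, the `get()` above also stores the default, so later lookups see the same object. A standalone sketch:

```python
from collections import defaultdict

# Same class as DefaultGetDict above, exercised standalone.
class DefaultGetDict(defaultdict):
    def get(self, key, default=None):
        if default is None:
            default = self.default_factory()
        return super(DefaultGetDict, self).setdefault(key, default)

d = DefaultGetDict(dict)
# get() materialises and stores the default, so this mutation sticks.
d.get('build_sphinx')['source_dir'] = 'doc/source'
assert d['build_sphinx'] == {'source_dir': 'doc/source'}
```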


class IgnoreDict(dict):
    """A dictionary that ignores any insertions in which the key is a string
    matching any string in `ignore`. The ignore list can also contain wildcard
    patterns using '*'.
    """

    def __init__(self, ignore):
        self.__ignore = re.compile(r'(%s)' % ('|'.join(
                                   [pat.replace('*', '.*')
                                    for pat in ignore])))

    def __setitem__(self, key, val):
        if self.__ignore.match(key):
            return
        super(IgnoreDict, self).__setitem__(key, val)
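Assignments whose keys match one of the ignore patterns (with `*` as a wildcard) are silently dropped, which is how hook options can be filtered out of command option dicts. A standalone sketch:

```python
import re

# Same class as IgnoreDict above, exercised standalone.
class IgnoreDict(dict):
    def __init__(self, ignore):
        self.__ignore = re.compile(
            r'(%s)' % '|'.join(pat.replace('*', '.*') for pat in ignore))

    def __setitem__(self, key, val):
        if self.__ignore.match(key):
            return  # silently drop ignored keys
        super(IgnoreDict, self).__setitem__(key, val)

d = IgnoreDict(['pre_hook.*', 'post_hook.*'])
d['pre_hook.check'] = 'some.hook'   # dropped
d['verbose'] = '1'                  # kept
assert d == {'verbose': '1'}
```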
4
requirements.txt
Normal file
@@ -0,0 +1,4 @@
distribute>=0.6.24
six
sphinx>=1.1.2
setuptools_git>=0.4
30
setup.cfg
Normal file
@@ -0,0 +1,30 @@
[metadata]
name = oslo.packaging
author = OpenStack
author-email = openstack-dev@lists.openstack.org
summary = OpenStack's setup automation in a reusable form
description-file =
    README.rst
home-page = http://pypi.python.org/pypi/oslo.packaging
classifier =
    Development Status :: 4 - Beta
    Environment :: Console
    Environment :: OpenStack
    Intended Audience :: Developers
    Intended Audience :: Information Technology
    License :: OSI Approved :: Apache Software License
    Operating System :: OS Independent
    Programming Language :: Python

[files]
packages =
    oslo
    oslo.packaging
namespace_packages =
    oslo

[entry_points]
distutils.setup_keywords =
    oslo_packaging = oslo.packaging.core:setup
oslo.packaging.attr_filters =
    oslo_packaging = oslo.packaging.packaging:attr_filter
368
setup.py
Normal file → Executable file
@@ -1,359 +1,23 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright 2011 OpenStack LLC.
# Copyright 2012-2013 Hewlett-Packard Development Company, L.P.
# All Rights Reserved.
#!/usr/bin/env python
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""
Utilities with minimum-depends for use in setup.py
"""
from setuptools import setup

import email
import os
import re
import subprocess
import sys

from setuptools.command import sdist
# See setup.cfg for the project metadata.
from oslo.packaging.util import filtered_args
def parse_mailmap(mailmap='.mailmap'):
    mapping = {}
    if os.path.exists(mailmap):
        with open(mailmap, 'r') as fp:
            for l in fp:
                try:
                    canonical_email, alias = re.match(
                        r'[^#]*?(<.+>).*(<.+>).*', l).groups()
                except AttributeError:
                    continue
                mapping[alias] = canonical_email
    return mapping
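For illustration, the regex above pulls the canonical email and the alias out of a `.mailmap`-style line (the sample addresses are invented):

```python
import re

# Same regex as parse_mailmap above: first <...> is the canonical
# address, second <...> is the alias to replace.
line = 'Jane Doe <jane@example.org> Jane <jdoe@old.example.org>'
canonical_email, alias = re.match(r'[^#]*?(<.+>).*(<.+>).*', line).groups()

assert canonical_email == '<jane@example.org>'
assert alias == '<jdoe@old.example.org>'
```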


def _parse_git_mailmap(git_dir, mailmap='.mailmap'):
    mailmap = os.path.join(os.path.dirname(git_dir), mailmap)
    return parse_mailmap(mailmap)


def canonicalize_emails(changelog, mapping):
    """Takes in a string and an email alias mapping and replaces all
    instances of the aliases in the string with their real email.
    """
    for alias, email_address in mapping.iteritems():
        changelog = changelog.replace(alias, email_address)
    return changelog


# Get requirements from the first file that exists
def get_reqs_from_files(requirements_files):
    for requirements_file in requirements_files:
        if os.path.exists(requirements_file):
            with open(requirements_file, 'r') as fil:
                return fil.read().split('\n')
    return []


def parse_requirements(requirements_files=['requirements.txt',
                                           'tools/pip-requires']):
    requirements = []
    for line in get_reqs_from_files(requirements_files):
        # For the requirements list, we need to inject only the portion
        # after egg= so that distutils knows the package it's looking for
        # such as:
        # -e git://github.com/openstack/nova/master#egg=nova
        if re.match(r'\s*-e\s+', line):
            requirements.append(re.sub(r'\s*-e\s+.*#egg=(.*)$', r'\1',
                                line))
        # such as:
        # http://github.com/openstack/nova/zipball/master#egg=nova
        elif re.match(r'\s*https?:', line):
            requirements.append(re.sub(r'\s*https?:.*#egg=(.*)$', r'\1',
                                line))
        # -f lines are for index locations, and don't get used here
        elif re.match(r'\s*-f\s+', line):
            pass
        # argparse is part of the standard library starting with 2.7
        # adding it to the requirements list screws distro installs
        elif line == 'argparse' and sys.version_info >= (2, 7):
            pass
        else:
            requirements.append(line)

    return requirements
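The substitutions above reduce editable and URL requirement lines to the bare package name after `#egg=`, which is what distutils needs. A standalone illustration:

```python
import re

# Same substitutions as parse_requirements above.
editable = '-e git://github.com/openstack/nova.git#egg=nova'
assert re.sub(r'\s*-e\s+.*#egg=(.*)$', r'\1', editable) == 'nova'

url_line = 'http://github.com/openstack/nova/zipball/master#egg=nova'
assert re.sub(r'\s*https?:.*#egg=(.*)$', r'\1', url_line) == 'nova'
```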


def parse_dependency_links(requirements_files=['requirements.txt',
                                               'tools/pip-requires']):
    dependency_links = []
    # dependency_links inject alternate locations to find packages listed
    # in requirements
    for line in get_reqs_from_files(requirements_files):
        # skip comments and blank lines
        if re.match(r'(\s*#)|(\s*$)', line):
            continue
        # lines with -e or -f need the whole line, minus the flag
        if re.match(r'\s*-[ef]\s+', line):
            dependency_links.append(re.sub(r'\s*-[ef]\s+', '', line))
        # lines that are only urls can go in unmolested
        elif re.match(r'\s*https?:', line):
            dependency_links.append(line)
    return dependency_links


def _run_shell_command(cmd, throw_on_error=False):
    if os.name == 'nt':
        output = subprocess.Popen(["cmd.exe", "/C", cmd],
                                  stdout=subprocess.PIPE,
                                  stderr=subprocess.PIPE)
    else:
        output = subprocess.Popen(["/bin/sh", "-c", cmd],
                                  stdout=subprocess.PIPE,
                                  stderr=subprocess.PIPE)
    out = output.communicate()
    if output.returncode and throw_on_error:
        raise Exception("%s returned %d" % (cmd, output.returncode))
    if len(out) == 0:
        return None
    if len(out[0].strip()) == 0:
        return None
    return out[0].strip()
|
||||
|
||||
|
||||
def _get_git_directory():
    parent_dir = os.path.dirname(__file__)
    while True:
        git_dir = os.path.join(parent_dir, '.git')
        if os.path.exists(git_dir):
            return git_dir
        parent_dir, child = os.path.split(parent_dir)
        if not child:  # reached the filesystem root
            return None

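The walk-up loop above terminates because os.path.split() yields an empty last component once the path has been reduced to the filesystem root; a small illustration (POSIX paths assumed):

```python
import os

# One path component is peeled off per iteration...
parent, child = os.path.split('/some/deep/path')
print(parent, child)  # /some/deep path

# ...and at the root the child is empty, which ends the loop.
root_parent, root_child = os.path.split('/')
print(repr(root_child))  # ''
```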
def write_git_changelog():
    """Write a changelog based on the git changelog."""
    new_changelog = 'ChangeLog'
    git_dir = _get_git_directory()
    if not os.getenv('SKIP_WRITE_GIT_CHANGELOG'):
        if git_dir:
            git_log_cmd = 'git --git-dir=%s log --stat' % git_dir
            changelog = _run_shell_command(git_log_cmd)
            mailmap = _parse_git_mailmap(git_dir)
            with open(new_changelog, "w") as changelog_file:
                changelog_file.write(canonicalize_emails(changelog, mailmap))
    else:
        open(new_changelog, 'w').close()

def generate_authors():
    """Create AUTHORS file using git commits."""
    jenkins_email = 'jenkins@review.(openstack|stackforge).org'
    old_authors = 'AUTHORS.in'
    new_authors = 'AUTHORS'
    git_dir = _get_git_directory()
    if not os.getenv('SKIP_GENERATE_AUTHORS'):
        if git_dir:
            # don't include jenkins email address in AUTHORS file
            git_log_cmd = ("git --git-dir=" + git_dir +
                           " log --format='%aN <%aE>' | sort -u | "
                           "egrep -v '" + jenkins_email + "'")
            changelog = _run_shell_command(git_log_cmd)
            mailmap = _parse_git_mailmap(git_dir)
            with open(new_authors, 'w') as new_authors_fh:
                new_authors_fh.write(canonicalize_emails(changelog, mailmap))
                if os.path.exists(old_authors):
                    with open(old_authors, "r") as old_authors_fh:
                        new_authors_fh.write('\n' + old_authors_fh.read())
    else:
        open(new_authors, 'w').close()

_rst_template = """%(heading)s
|
||||
%(underline)s
|
||||
|
||||
.. automodule:: %(module)s
|
||||
:members:
|
||||
:undoc-members:
|
||||
:show-inheritance:
|
||||
"""
|
||||
|
||||
|
||||
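Filling the template with sample values shows the per-module page it produces; the module name below is illustrative, and the template is reproduced so the sketch is self-contained:

```python
# Same template as in the module above.
_rst_template = """%(heading)s
%(underline)s

.. automodule:: %(module)s
  :members:
  :undoc-members:
  :show-inheritance:
"""

heading = "The :mod:`example.module` Module"
values = dict(module='example.module', heading=heading,
              underline='=' * len(heading))
page = _rst_template % values
print(page)
```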
def get_cmdclass():
    """Return dict of commands to run from setup.py."""

    cmdclass = dict()

    def _find_modules(arg, dirname, files):
        for filename in files:
            if filename.endswith('.py') and filename != '__init__.py':
                arg["%s.%s" % (dirname.replace('/', '.'),
                               filename[:-3])] = True

    class LocalSDist(sdist.sdist):
        """Builds the ChangeLog and Authors files from VC first."""

        def run(self):
            write_git_changelog()
            generate_authors()
            # sdist.sdist is an old style class, can't use super()
            sdist.sdist.run(self)

    cmdclass['sdist'] = LocalSDist

    # If Sphinx is installed on the box running setup.py,
    # enable setup.py to build the documentation; otherwise,
    # just ignore it
    try:
        from sphinx.setup_command import BuildDoc

        class LocalBuildDoc(BuildDoc):

            builders = ['html', 'man']

            def generate_autoindex(self):
                print "**Autodocumenting from %s" % os.path.abspath(os.curdir)
                modules = {}
                option_dict = self.distribution.get_option_dict('build_sphinx')
                source_dir = os.path.join(option_dict['source_dir'][1], 'api')
                if not os.path.exists(source_dir):
                    os.makedirs(source_dir)
                for pkg in self.distribution.packages:
                    if '.' not in pkg:
                        os.path.walk(pkg, _find_modules, modules)
                module_list = modules.keys()
                module_list.sort()
                autoindex_filename = os.path.join(source_dir, 'autoindex.rst')
                with open(autoindex_filename, 'w') as autoindex:
                    autoindex.write(""".. toctree::
   :maxdepth: 1

""")
                    for module in module_list:
                        output_filename = os.path.join(source_dir,
                                                       "%s.rst" % module)
                        heading = "The :mod:`%s` Module" % module
                        underline = "=" * len(heading)
                        values = dict(module=module, heading=heading,
                                      underline=underline)

                        print "Generating %s" % output_filename
                        with open(output_filename, 'w') as output_file:
                            output_file.write(_rst_template % values)
                        autoindex.write("   %s.rst\n" % module)

            def run(self):
                if not os.getenv('SPHINX_DEBUG'):
                    self.generate_autoindex()

                for builder in self.builders:
                    self.builder = builder
                    self.finalize_options()
                    self.project = self.distribution.get_name()
                    self.version = self.distribution.get_version()
                    self.release = self.distribution.get_version()
                    BuildDoc.run(self)

        class LocalBuildLatex(LocalBuildDoc):
            builders = ['latex']

        cmdclass['build_sphinx'] = LocalBuildDoc
        cmdclass['build_sphinx_latex'] = LocalBuildLatex
    except ImportError:
        pass

    return cmdclass

def _get_revno(git_dir):
    """Return the number of commits since the most recent tag.

    We use git-describe to find this out, but if there are no
    tags then we fall back to counting commits since the beginning
    of time.
    """
    describe = _run_shell_command(
        "git --git-dir=%s describe --always" % git_dir)
    if "-" in describe:
        return describe.rsplit("-", 2)[-2]

    # no tags found
    revlist = _run_shell_command(
        "git --git-dir=%s rev-list --abbrev-commit HEAD" % git_dir)
    return len(revlist.splitlines())

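For a tagged history, git-describe output has the shape `<tag>-<commits-since-tag>-g<sha>`, and the rsplit above picks out the middle field; the sample output below is illustrative:

```python
# e.g. tag 2012.2, 42 commits later, abbreviated sha 8aa4641
describe = '2012.2-42-g8aa4641'
revno = describe.rsplit('-', 2)[-2]
print(revno)  # 42
```

Splitting from the right with a limit of 2 keeps this robust for tags that themselves contain hyphens.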
def _get_version_from_git(pre_version):
    """Return a version which is equal to the tag that's on the current
    revision if there is one, or tag plus number of additional revisions
    if the current revision has no tag."""

    git_dir = _get_git_directory()
    if git_dir:
        if pre_version:
            try:
                return _run_shell_command(
                    "git --git-dir=" + git_dir + " describe --exact-match",
                    throw_on_error=True).replace('-', '.')
            except Exception:
                sha = _run_shell_command(
                    "git --git-dir=" + git_dir + " log -n1 --pretty=format:%h")
                return "%s.a%s.g%s" % (pre_version, _get_revno(git_dir), sha)
        else:
            return _run_shell_command(
                "git --git-dir=" + git_dir + " describe --always").replace(
                    '-', '.')
    return None

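A sketch of the pre-release version string the except branch above builds when the current revision has no exact tag (all values illustrative):

```python
pre_version = '2013.1'   # the upcoming release version
revno = '42'             # commits since the last tag, from _get_revno()
sha = '8aa4641'          # abbreviated sha of the head commit
version = "%s.a%s.g%s" % (pre_version, revno, sha)
print(version)  # 2013.1.a42.g8aa4641
```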
def _get_version_from_pkg_info(package_name):
    """Get the version from PKG-INFO file if we can."""
    try:
        pkg_info_file = open('PKG-INFO', 'r')
    except (IOError, OSError):
        return None
    try:
        pkg_info = email.message_from_file(pkg_info_file)
    except email.MessageError:
        return None
    # Check to make sure we're in our own dir
    if pkg_info.get('Name', None) != package_name:
        return None
    return pkg_info.get('Version', None)

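PKG-INFO is RFC-822-style header metadata, which is why the email module can parse it as above; a self-contained sketch with an illustrative header block:

```python
import email

# A minimal PKG-INFO body (illustrative values).
pkg_info = email.message_from_string(
    "Metadata-Version: 1.0\n"
    "Name: oslo.packaging\n"
    "Version: 0.1\n")
print(pkg_info.get('Name'))     # oslo.packaging
print(pkg_info.get('Version'))  # 0.1
```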
def get_version(package_name, pre_version=None):
    """Get the version of the project. First, try getting it from PKG-INFO, if
    it exists. If it does, that means we're in a distribution tarball or that
    install has happened. Otherwise, if there is no PKG-INFO file, pull the
    version from git.

    We do not support setup.py version sanity in git archive tarballs, nor do
    we support packagers directly sucking our git repo into theirs. We expect
    that a source tarball be made from our git repo - or that if someone wants
    to make a source tarball from a fork of our repo with additional tags in it
    that they understand and desire the results of doing that.
    """
    version = os.environ.get("OSLO_PACKAGE_VERSION", None)
    if version:
        return version
    version = _get_version_from_pkg_info(package_name)
    if version:
        return version
    version = _get_version_from_git(pre_version)
    if version:
        return version
    raise Exception("Versioning for this project requires either an sdist"
                    " tarball, or access to an upstream git repository.")


setup(**filtered_args())
8
test-requirements.txt
Normal file
@ -0,0 +1,8 @@
coverage>=3.6
discover
fixtures>=0.3.12
pep8==1.3.3
pyflakes
python-subunit
testrepository>=0.0.13
testtools>=0.9.27
33
tox.ini
Normal file
@ -0,0 +1,33 @@
[tox]
envlist = py26,py27,pep8

[testenv]
setenv = VIRTUAL_ENV={envdir}
         LANG=en_US.UTF-8
         LANGUAGE=en_US:en
         LC_ALL=C
deps = -r{toxinidir}/requirements.txt
       -r{toxinidir}/test-requirements.txt
commands =
  python setup.py testr --slowest --testr-args='{posargs}'

[tox:jenkins]
sitepackages = True
downloadcache = ~/cache/pip

[testenv:pep8]
commands =
  pep8 --repeat --show-source --exclude=.venv,.tox,dist,doc .

[testenv:pyflakes]
deps = pyflakes
commands = pyflakes oslo

[testenv:cover]
setenv = VIRTUAL_ENV={envdir}
commands =
  python setup.py testr --coverage \
    --testr-args='^(?!.*test.*coverage).*$'

[testenv:venv]
commands = {posargs}