Initial commit
.gitignore vendored Normal file (65 lines added)
@@ -0,0 +1,65 @@
# Byte-compiled / optimized / DLL files
.*
!.gitignore
.*.sw?
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
env/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
*.egg-info/
.installed.cfg
*.egg

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*,cover
.hypothesis/

# Translations
*.mo
*.pot

# Django stuff:
*.log

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# pyenv python configuration file
.python-version
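Entries such as `*.py[cod]` and `.*.sw?` use glob character classes and single-character wildcards. A quick way to sanity-check what such a pattern matches is Python's `fnmatch` (a sketch only; note gitignore matching is not identical to `fnmatch`, e.g. gitignore's `*` does not cross `/`):

```python
from fnmatch import fnmatch

# *.py[cod] matches compiled/optimized Python artifacts, not the source file
assert fnmatch("module.pyc", "*.py[cod]")
assert fnmatch("module.pyo", "*.py[cod]")
assert not fnmatch("module.py", "*.py[cod]")

# .*.sw? matches Vim swap files such as .module.py.swp
assert fnmatch(".module.py.swp", ".*.sw?")
```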
AUTHORS.rst Normal file (13 lines added)
@@ -0,0 +1,13 @@
=======
Credits
=======

Development Lead
----------------

* Gorka Eguileor <geguileo@redhat.com>

Contributors
------------

None yet. Why not be the first?
CONTRIBUTING.rst Normal file (114 lines added)
@@ -0,0 +1,114 @@
.. highlight:: shell

============
Contributing
============

Contributions are welcome, and they are greatly appreciated! Every
little bit helps, and credit will always be given.

You can contribute in many ways:

Types of Contributions
----------------------

Report Bugs
~~~~~~~~~~~

Report bugs at https://github.com/akrog/cinderlib/issues.

If you are reporting a bug, please include:

* Your operating system name and version.
* Any details about your local setup that might be helpful in troubleshooting.
* Detailed steps to reproduce the bug.

Fix Bugs
~~~~~~~~

Look through the GitHub issues for bugs. Anything tagged with "bug"
and "help wanted" is open to whoever wants to implement it.

Implement Features
~~~~~~~~~~~~~~~~~~

Look through the GitHub issues for features. Anything tagged with "enhancement"
and "help wanted" is open to whoever wants to implement it.

Write Documentation
~~~~~~~~~~~~~~~~~~~

Cinder Library could always use more documentation, whether as part of the
official Cinder Library docs, in docstrings, or even on the web in blog posts,
articles, and such.

Submit Feedback
~~~~~~~~~~~~~~~

The best way to send feedback is to file an issue at
https://github.com/akrog/cinderlib/issues.

If you are proposing a feature:

* Explain in detail how it would work.
* Keep the scope as narrow as possible, to make it easier to implement.
* Remember that this is a volunteer-driven project, and that contributions
  are welcome :)

Get Started!
------------

Ready to contribute? Here's how to set up `cinderlib` for local development.

1. Fork the `cinderlib` repo on GitHub.
2. Clone your fork locally::

    $ git clone git@github.com:your_name_here/cinderlib.git

3. Install your local copy into a virtualenv. Assuming you have
   virtualenvwrapper installed, this is how you set up your fork for local
   development::

    $ mkvirtualenv cinderlib
    $ cd cinderlib/
    $ python setup.py develop

4. Create a branch for local development::

    $ git checkout -b name-of-your-bugfix-or-feature

   Now you can make your changes locally.

5. When you're done making changes, check that your changes pass flake8 and
   the tests, including testing other Python versions with tox::

    $ flake8 cinderlib tests
    $ python setup.py test or py.test
    $ tox

   To get flake8 and tox, just pip install them into your virtualenv.

6. Commit your changes and push your branch to GitHub::

    $ git add .
    $ git commit -m "Your detailed description of your changes."
    $ git push origin name-of-your-bugfix-or-feature

7. Submit a pull request through the GitHub website.

Pull Request Guidelines
-----------------------

Before you submit a pull request, check that it meets these guidelines:

1. The pull request should include tests.
2. If the pull request adds functionality, the docs should be updated. Put
   your new functionality into a function with a docstring, and add the
   feature to the list in README.rst.
3. The pull request should work for Python 2.7, 3.3, 3.4 and 3.5, and for
   PyPy. Check https://travis-ci.org/akrog/cinderlib/pull_requests
   and make sure that the tests pass for all supported Python versions.

Tips
----

To run a subset of tests::

    $ python -m unittest tests.test_cinderlib
HISTORY.rst Normal file (8 lines added)
@@ -0,0 +1,8 @@
=======
History
=======

0.1.0 (2017-11-03)
------------------

* First release on PyPI.
LICENSE Normal file (17 lines added)
@@ -0,0 +1,17 @@
Apache Software License 2.0

Copyright (c) 2017, Red Hat, Inc.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
MANIFEST.in Normal file (13 lines added)
@@ -0,0 +1,13 @@
include AUTHORS.rst

include CONTRIBUTING.rst
include HISTORY.rst
include LICENSE
include README.rst

recursive-include tests *
recursive-exclude * __pycache__
recursive-exclude * *.py[co]

recursive-include docs *.rst conf.py Makefile make.bat *.jpg *.png *.gif
Makefile Normal file (92 lines added)
@@ -0,0 +1,92 @@
.PHONY: clean clean-test clean-pyc clean-build docs help
.DEFAULT_GOAL := help
define BROWSER_PYSCRIPT
import os, webbrowser, sys
try:
	from urllib import pathname2url
except:
	from urllib.request import pathname2url

webbrowser.open("file://" + pathname2url(os.path.abspath(sys.argv[1])))
endef
export BROWSER_PYSCRIPT

define PRINT_HELP_PYSCRIPT
import re, sys

for line in sys.stdin:
	match = re.match(r'^([a-zA-Z_-]+):.*?## (.*)$$', line)
	if match:
		target, help = match.groups()
		print("%-20s %s" % (target, help))
endef
export PRINT_HELP_PYSCRIPT
BROWSER := python -c "$$BROWSER_PYSCRIPT"

help:
	@python -c "$$PRINT_HELP_PYSCRIPT" < $(MAKEFILE_LIST)

clean: clean-build clean-pyc clean-test ## remove all build, test, coverage and Python artifacts

clean-build: ## remove build artifacts
	rm -fr build/
	rm -fr dist/
	rm -fr .eggs/
	find . -name '*.egg-info' -exec rm -fr {} +
	find . -name '*.egg' -exec rm -f {} +

clean-pyc: ## remove Python file artifacts
	find . -name '*.pyc' -exec rm -f {} +
	find . -name '*.pyo' -exec rm -f {} +
	find . -name '*~' -exec rm -f {} +
	find . -name '__pycache__' -exec rm -fr {} +

clean-test: ## remove test and coverage artifacts
	rm -fr .tox/
	rm -f .coverage
	rm -fr htmlcov/

lint: ## check style with flake8
	flake8 cinderlib tests

test: ## run tests quickly with the default Python
	python setup.py test

test-all: ## run tests on every Python version with tox
	tox

coverage: ## check code coverage quickly with the default Python
	coverage run --source cinderlib setup.py test
	coverage report -m
	coverage html
	$(BROWSER) htmlcov/index.html

docs: ## generate Sphinx HTML documentation, including API docs
	rm -f docs/cinderlib.rst
	rm -f docs/modules.rst
	sphinx-apidoc -o docs/ cinderlib
	$(MAKE) -C docs clean
	$(MAKE) -C docs html
	$(BROWSER) docs/_build/html/index.html

servedocs: docs ## compile the docs watching for changes
	watchmedo shell-command -p '*.rst' -c '$(MAKE) -C docs html' -R -D .

register: ## register package in pypi
	python setup.py register --repository pypi

release: clean ## package and upload a release
	python setup.py sdist upload --repository pypi
	python setup.py bdist_wheel upload --repository pypi

dist: clean ## builds source and wheel package
	python setup.py sdist
	python setup.py bdist_wheel
	ls -l dist

install: clean ## install the package to the active Python's site-packages
	python setup.py install
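The `help` target works by feeding the Makefile itself through `PRINT_HELP_PYSCRIPT`, whose regex picks up any `target: ## description` line (the `$$` in the Makefile is Make's escape for a single literal `$`). The same matching logic can be tried standalone:

```python
import re

# Lines as they appear in the Makefile above; the recipe line (tab-indented)
# deliberately does not match the pattern.
lines = [
    "clean: clean-build clean-pyc clean-test ## remove all build, test, coverage and Python artifacts",
    "lint: ## check style with flake8",
    "\tflake8 cinderlib tests",
]

entries = []
for line in lines:
    # Written with a single $ here because this is plain Python, not a Makefile
    match = re.match(r'^([a-zA-Z_-]+):.*?## (.*)$', line)
    if match:
        target, help = match.groups()
        entries.append("%-20s %s" % (target, help))

print("\n".join(entries))
```

Only the two target lines produce help entries; the tab-indented recipe is skipped because the regex anchors on a target name at the start of the line.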
README.rst Normal file (109 lines added)
@@ -0,0 +1,109 @@
Cinder Library
==============


.. image:: https://img.shields.io/pypi/v/cinderlib.svg
        :target: https://pypi.python.org/pypi/cinderlib

.. image:: https://img.shields.io/travis/akrog/cinderlib.svg
        :target: https://travis-ci.org/akrog/cinderlib

.. image:: https://readthedocs.org/projects/cinderlib/badge/?version=latest
        :target: https://cinderlib.readthedocs.io/en/latest/?badge=latest
        :alt: Documentation Status

.. image:: https://img.shields.io/pypi/pyversions/cinderlib.svg
        :target: https://pypi.python.org/pypi/cinderlib

.. image:: https://pyup.io/repos/github/akrog/cinderlib/shield.svg
        :target: https://pyup.io/repos/github/akrog/cinderlib/
        :alt: Updates

.. image:: https://img.shields.io/:license-apache-blue.svg
        :target: http://www.apache.org/licenses/LICENSE-2.0


Introduction
------------

Cinder Library is a Python library that allows using storage drivers outside
of Cinder.

* Free software: Apache Software License 2.0
* Documentation: https://cinderlib.readthedocs.io.

This library is currently at an Alpha status and is primarily intended as a
proof of concept at this stage. While some drivers have been manually
validated, most have not, so there's a good chance that they could experience
issues.

When using this library one should be aware that this is in no way close to
the robustness or feature richness that the Cinder project provides. Some of
the more obvious limitations are:

* There is no argument validation on the methods, so it's a classic GIGO_
  library.
* The logic has been kept to a minimum and higher functioning logic is
  expected to be in the caller. For example, you can delete a volume that
  still has snapshots, and the end result will depend on the Cinder driver
  and the storage array: some will delete the snapshots and others will
  leave them there.
* There is no CI, or unit tests for that matter, and certainly nothing so
  fancy as third party vendor CIs, so things being broken could be
  considered the norm.
* Only a subset of the basic operations is supported by the library.

The minimum version of Cinder required by this library is Pike; although,
depending on my availability, I may make the library support Ocata as well.

Since it's using Cinder's code the library is still bound by the same
restrictions and behaviors of the drivers running under the standard Cinder
services, which means that not all operations will behave consistently across
drivers. For example, you can find drivers where cloning is a cheap operation
performed by the storage array, whereas others will actually create a new
volume, attach the source and the new volume, and perform a full copy of the
data.

If a driver in Cinder requires external libraries or packages, they will also
be required by the library and will need to be manually installed.

For more detailed information please refer to the `official project
documentation`_ and `OpenStack's Cinder volume driver configuration
documentation`_.

Due to the limited access to Cinder backends and time constraints, the
drivers that have been manually tested are (I'll try to test more):

- LVM
- XtremIO
- Kaminario

If you try the library with another storage array I would appreciate a note
on the library version, Cinder release, and results of your testing.

Features
--------

* Use a Cinder driver without running a DBMS, Message broker, or Cinder
  service.
* Use multiple simultaneous drivers in the same program.
* Stateless: Support full serialization of objects and context to json or
  string so the state can be restored.
* Basic operations support:

  - Create volume
  - Delete volume
  - Extend volume
  - Clone volume
  - Create snapshot
  - Delete snapshot
  - Create volume from snapshot
  - Connect volume
  - Disconnect volume
  - Local attach
  - Local detach
  - Validate connector


.. _GIGO: https://en.wikipedia.org/wiki/Garbage_in,_garbage_out
.. _official project documentation: https://cinderlib.readthedocs.io
.. _OpenStack's Cinder volume driver configuration documentation: https://docs.openstack.org/cinder/latest/configuration/block-storage/volume-drivers.html
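The "Stateless" feature in the README rests on a simple pattern visible in `cinderlib/cinderlib.py`: every object serializes to JSON with a `class` key, and the module-level `load` helper dispatches on that key to rebuild the right object. The snippet below is a simplified, self-contained sketch of that round-trip (class names and IDs here are illustrative, not cinderlib's real objects):

```python
import json

class Volume(object):
    def __init__(self, vol_id):
        self.id = vol_id

    @property
    def jsons(self):
        # Serialize with a 'class' key so load() knows what to rebuild
        return json.dumps({'class': type(self).__name__, 'id': self.id})

    @classmethod
    def load(cls, data):
        return cls(data['id'])

# Registry playing the role of cinderlib's globals() lookup
CLASSES = {'Volume': Volume}

def load(json_src):
    # Accept either a JSON string or an already-parsed dict,
    # mirroring cinderlib's load()
    data = json.loads(json_src) if isinstance(json_src, str) else json_src
    return CLASSES[data['class']].load(data)

vol = Volume("vol-1")
serialized = vol.jsons        # state leaves the process as a plain string
restored = load(serialized)   # ...and comes back as a live object
assert restored.id == vol.id
```

In cinderlib itself the serialized payload also carries the backend configuration and the versioned-object primitive, so a freshly started process can reattach to the same storage resources.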
cinderlib/__init__.py Normal file (8 lines added)
@@ -0,0 +1,8 @@
from __future__ import absolute_import
import cinderlib.cinderlib as clib
from cinderlib.cinderlib import *  # noqa

__author__ = """Gorka Eguileor"""
__email__ = 'geguileo@redhat.com'
__version__ = '0.1.0'
__all__ = clib.__all__
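`cinderlib/__init__.py` re-exports only what `cinderlib.cinderlib` lists in `__all__`, so private helpers stay out of the package namespace. A tiny self-contained illustration of how `__all__` limits a star import (the module name `mylib` and its functions are hypothetical, written to a temp dir just for the demo):

```python
import os
import sys
import tempfile
import textwrap

# Write a throwaway module that declares __all__, mimicking cinderlib.cinderlib
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "mylib.py"), "w") as f:
    f.write(textwrap.dedent("""\
        __all__ = ['setup', 'load']
        def setup(): pass
        def load(): pass
        def _private(): pass
        def helper(): pass
    """))

sys.path.insert(0, tmpdir)
ns = {}
# Star import at module scope, captured into a dict so we can inspect it
exec("from mylib import *", ns)

assert 'setup' in ns and 'load' in ns   # listed in __all__
assert 'helper' not in ns               # defined but not exported
assert '_private' not in ns             # underscore names never exported
```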
cinderlib/cinderlib.py Normal file (970 lines added)
@@ -0,0 +1,970 @@
|
|||||||
|
# Copyright (c) 2017, Red Hat, Inc.
|
||||||
|
# All Rights Reserved.
|
||||||
|
#
|
||||||
|
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||||
|
# not use this file except in compliance with the License. You may obtain
|
||||||
|
# a copy of the License at
|
||||||
|
#
|
||||||
|
# http://www.apache.org/licenses/LICENSE-2.0
|
||||||
|
#
|
||||||
|
# Unless required by applicable law or agreed to in writing, software
|
||||||
|
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||||
|
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||||
|
# License for the specific language governing permissions and limitations
|
||||||
|
# under the License.
|
||||||
|
|
||||||
|
from __future__ import absolute_import
|
||||||
|
import collections
|
||||||
|
import functools
|
||||||
|
import json as json_lib
|
||||||
|
import logging
|
||||||
|
import os
|
||||||
|
import requests
|
||||||
|
import uuid
|
||||||
|
|
||||||
|
from cinder import coordination
|
||||||
|
# NOTE(geguileo): If we want to prevent eventlet from monkey_patching we would
|
||||||
|
# need to do something about volume's L27-32.
|
||||||
|
# NOTE(geguileo): Provably a good idea not to depend on cinder.cmd.volume
|
||||||
|
# having all the other imports as they could change.
|
||||||
|
from cinder.cmd import volume as volume_cmd
|
||||||
|
from cinder import context
|
||||||
|
from cinder import exception
|
||||||
|
from cinder.objects import base as cinder_base_ovo
|
||||||
|
from cinder import utils
|
||||||
|
from cinder.volume import configuration
|
||||||
|
from oslo_utils import importutils
|
||||||
|
from oslo_versionedobjects import base as base_ovo
|
||||||
|
from os_brick import exception as brick_exception
|
||||||
|
from os_brick.privileged import rootwrap
|
||||||
|
import six
|
||||||
|
|
||||||
|
|
||||||
|
__all__ = ['setup', 'load', 'json', 'jsons', 'Backend', 'Volume', 'Snapshot',
|
||||||
|
'Connection']
|
||||||
|
|
||||||
|
|
||||||
|
volume_cmd.objects.register_all()
|
||||||
|
|
||||||
|
|
||||||
|
class Backend(object):
|
||||||
|
"""Representation of a Cinder Driver.
|
||||||
|
|
||||||
|
User facing attributes are:
|
||||||
|
|
||||||
|
- __init__
|
||||||
|
- json
|
||||||
|
- jsons
|
||||||
|
- load
|
||||||
|
- stats
|
||||||
|
- create_volume
|
||||||
|
- global_setup
|
||||||
|
- validate_connector
|
||||||
|
"""
|
||||||
|
backends = {}
|
||||||
|
global_initialization = False
|
||||||
|
context = context.get_admin_context()
|
||||||
|
|
||||||
|
def __init__(self, volume_backend_name, **driver_cfg):
|
||||||
|
if not self.global_initialization:
|
||||||
|
self.global_setup()
|
||||||
|
driver_cfg['volume_backend_name'] = volume_backend_name
|
||||||
|
Backend.backends[volume_backend_name] = self
|
||||||
|
|
||||||
|
conf = self._get_config(**driver_cfg)
|
||||||
|
self.driver = importutils.import_object(
|
||||||
|
conf.volume_driver,
|
||||||
|
configuration=conf,
|
||||||
|
db=OVO.fake_db,
|
||||||
|
host=volume_cmd.CONF.host,
|
||||||
|
cluster_name=None, # No clusters for now: volume_cmd.CONF.cluster,
|
||||||
|
active_backend_id=None) # No failover for now
|
||||||
|
self.driver.do_setup(self.context)
|
||||||
|
self.driver.check_for_setup_error()
|
||||||
|
self.driver.init_capabilities()
|
||||||
|
self.driver.set_throttle()
|
||||||
|
self.driver.set_initialized()
|
||||||
|
self.volumes = set()
|
||||||
|
self._driver_cfg = driver_cfg
|
||||||
|
|
||||||
|
def __repr__(self):
|
||||||
|
return '<cinderlib.Backend %s>' % self.id
|
||||||
|
|
||||||
|
def __getattr__(self, name):
|
||||||
|
return getattr(self.driver, name)
|
||||||
|
|
||||||
|
@property
|
||||||
|
def id(self):
|
||||||
|
return self._driver_cfg['volume_backend_name']
|
||||||
|
|
||||||
|
@property
|
||||||
|
def config(self):
|
||||||
|
if self.output_all_backend_info:
|
||||||
|
return self._driver_cfg
|
||||||
|
return {'volume_backend_name': self._driver_cfg['volume_backend_name']}
|
||||||
|
|
||||||
|
@property
|
||||||
|
def json(self):
|
||||||
|
result = [volume.json for volume in self.volumes]
|
||||||
|
# We only need to output the full backend configuration once
|
||||||
|
if self.output_all_backend_info:
|
||||||
|
backend = {'volume_backend_name': self.id}
|
||||||
|
for volume in result:
|
||||||
|
volume['backend'] = backend
|
||||||
|
return {'class': type(self).__name__,
|
||||||
|
'backend': self.config,
|
||||||
|
'volumes': result}
|
||||||
|
|
||||||
|
@property
|
||||||
|
def jsons(self):
|
||||||
|
return json_lib.dumps(self.json)
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def load(cls, json_src):
|
||||||
|
backend = Backend.load_backend(json_src['backend'])
|
||||||
|
for volume in json_src['volumes']:
|
||||||
|
Volume.load(volume)
|
||||||
|
return backend
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def load_backend(cls, backend_data):
|
||||||
|
backend_name = backend_data['volume_backend_name']
|
||||||
|
if backend_name in cls.backends:
|
||||||
|
return cls.backends[backend_name]
|
||||||
|
|
||||||
|
if len(backend_data) > 1:
|
||||||
|
return cls(**backend_data)
|
||||||
|
|
||||||
|
raise Exception('Backend not present in system or json.')
|
||||||
|
|
||||||
|
def stats(self, refresh=False):
|
||||||
|
stats = self.driver.get_volume_stats(refresh=refresh)
|
||||||
|
return stats
|
||||||
|
|
||||||
|
def create_volume(self, size, name='', description='', bootable=False,
|
||||||
|
**kwargs):
|
||||||
|
vol = Volume(self, size=size, name=name, description=description,
|
||||||
|
bootable=bootable, **kwargs)
|
||||||
|
vol.create()
|
||||||
|
return vol
|
||||||
|
|
||||||
|
def validate_connector(self, connector_dict):
|
||||||
|
"""Raise exception if missing info for volume's connect call."""
|
||||||
|
self.driver.validate_connector(connector_dict)
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def global_setup(cls, file_locks_path=None, disable_sudo=False,
|
||||||
|
suppress_requests_ssl_warnings=True, disable_logs=True,
|
||||||
|
non_uuid_ids=False, output_all_backend_info=False,
|
||||||
|
**log_params):
|
||||||
|
# Global setup can only be set once
|
||||||
|
if cls.global_initialization:
|
||||||
|
raise Exception('Already setup')
|
||||||
|
|
||||||
|
# Prevent driver dynamic loading clearing configuration options
|
||||||
|
volume_cmd.CONF._ConfigOpts__cache = MyDict()
|
||||||
|
|
||||||
|
volume_cmd.CONF.version = volume_cmd.version.version_string()
|
||||||
|
volume_cmd.CONF.register_opt(
|
||||||
|
configuration.cfg.StrOpt('stateless_cinder'),
|
||||||
|
group=configuration.SHARED_CONF_GROUP)
|
||||||
|
|
||||||
|
OVO._ovo_init(non_uuid_ids)
|
||||||
|
|
||||||
|
cls._set_logging(disable_logs, **log_params)
|
||||||
|
cls._set_priv_helper()
|
||||||
|
cls._set_coordinator(file_locks_path)
|
||||||
|
|
||||||
|
if suppress_requests_ssl_warnings:
|
||||||
|
requests.packages.urllib3.disable_warnings(
|
||||||
|
requests.packages.urllib3.exceptions.InsecureRequestWarning)
|
||||||
|
requests.packages.urllib3.disable_warnings(
|
||||||
|
requests.packages.urllib3.exceptions.InsecurePlatformWarning)
|
||||||
|
|
||||||
|
cls.global_initialization = True
|
||||||
|
cls.output_all_backend_info = output_all_backend_info
|
||||||
|
|
||||||
|
def _get_config(self, volume_backend_name, **kwargs):
|
||||||
|
volume_cmd.CONF.register_opt(volume_cmd.host_opt,
|
||||||
|
group=volume_backend_name)
|
||||||
|
backend_opts = getattr(volume_cmd.CONF, volume_backend_name)
|
||||||
|
for key, value in kwargs.items():
|
||||||
|
setattr(backend_opts, key, value)
|
||||||
|
config = configuration.Configuration([],
|
||||||
|
config_group=volume_backend_name)
|
||||||
|
return config
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def _set_logging(cls, disable_logs, **log_params):
|
||||||
|
if disable_logs:
|
||||||
|
logging.Logger.disabled = property(lambda s: True,
|
||||||
|
lambda s, x: None)
|
||||||
|
return
|
||||||
|
|
||||||
|
for key, value in log_params.items():
|
||||||
|
setattr(volume_cmd.CONF, key, value)
|
||||||
|
volume_cmd.logging.setup(volume_cmd.CONF, 'cinder')
|
||||||
|
volume_cmd.python_logging.captureWarnings(True)
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def _set_priv_helper(cls):
|
||||||
|
root_helper = 'sudo'
|
||||||
|
utils.get_root_helper = lambda: root_helper
|
||||||
|
volume_cmd.priv_context.init(root_helper=[root_helper])
|
||||||
|
|
||||||
|
existing_bgcp = utils.connector.get_connector_properties
|
||||||
|
existing_bcp = utils.connector.InitiatorConnector.factory
|
||||||
|
|
||||||
|
def my_bgcp(*args, **kwargs):
|
||||||
|
if len(args):
|
||||||
|
args = list(args)
|
||||||
|
args[0] = root_helper
|
||||||
|
else:
|
||||||
|
kwargs['root_helper'] = root_helper
|
||||||
|
kwargs['execute'] = rootwrap.custom_execute
|
||||||
|
return existing_bgcp(*args, **kwargs)
|
||||||
|
|
||||||
|
def my_bgc(*args, **kwargs):
|
||||||
|
if len(args) >= 2:
|
||||||
|
args = list(args)
|
||||||
|
args[1] = root_helper
|
||||||
|
else:
|
||||||
|
kwargs['root_helper'] = root_helper
|
||||||
|
kwargs['execute'] = rootwrap.custom_execute
|
||||||
|
return existing_bcp(*args, **kwargs)
|
||||||
|
|
||||||
|
utils.connector.get_connector_properties = my_bgcp
|
||||||
|
utils.connector.InitiatorConnector.factory = staticmethod(my_bgc)
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def _set_coordinator(cls, file_locks_path):
|
||||||
|
file_locks_path = file_locks_path or os.getcwd()
|
||||||
|
volume_cmd.CONF.oslo_concurrency.lock_path = file_locks_path
|
||||||
|
volume_cmd.CONF.coordination.backend_url = 'file://' + file_locks_path
|
||||||
|
coordination.COORDINATOR.start()
|
||||||
|
|
||||||
|
|
||||||
|
setup = Backend.global_setup
|
||||||
|
|
||||||
|
|
||||||
|
def load(json_src):
|
||||||
|
"""Load any json serialized cinderlib object."""
|
||||||
|
if isinstance(json_src, six.string_types):
|
||||||
|
json_src = json_lib.loads(json_src)
|
||||||
|
|
||||||
|
if isinstance(json_src, list):
|
||||||
|
return [globals()[obj['class']].load(obj) for obj in json_src]
|
||||||
|
|
||||||
|
return globals()[json_src['class']].load(json_src)
|
||||||
|
|
||||||
|
|
||||||
|
def json():
|
||||||
|
"""Conver to Json everything we have in this system."""
|
||||||
|
return [backend.json for backend in Backend.backends.values()]
|
||||||
|
|
||||||
|
|
||||||
|
def jsons():
|
||||||
|
"""Convert to a Json string everything we have in this system."""
|
||||||
|
return json_lib.dumps(json())
|
||||||
|
|
||||||
|
|
||||||
|
class Object(object):
|
||||||
|
"""Base class for our resource representation objects."""
|
||||||
|
DEFAULT_FIELDS_VALUES = {}
|
||||||
|
objects = collections.defaultdict(dict)
|
||||||
|
context = context.get_admin_context()
|
||||||
|
|
||||||
|
def __init__(self, backend, **fields_data):
|
||||||
|
self.backend = backend
|
||||||
|
|
||||||
|
__ovo = fields_data.get('__ovo')
|
||||||
|
if __ovo:
|
||||||
|
self._ovo = __ovo
|
||||||
|
else:
|
||||||
|
self._ovo = self._create_ovo(**fields_data)
|
||||||
|
|
||||||
|
cls = type(self)
|
||||||
|
cls.objects = Object.objects[cls.__name__]
|
||||||
|
# TODO: Don't replace if present is newer
|
||||||
|
self.objects[self._ovo.id] = self
|
||||||
|
|
||||||
|
def _to_primitive(self):
|
||||||
|
return None
|
||||||
|
|
||||||
|
def _create_ovo(self, **fields_data):
|
||||||
|
# The base are the default values we define on our own classes
|
||||||
|
fields_values = self.DEFAULT_FIELDS_VALUES.copy()
|
||||||
|
|
||||||
|
# Apply the values defined by the caller
|
||||||
|
fields_values.update(fields_data)
|
||||||
|
|
||||||
|
# We support manually setting the id, so set only if not already set
|
||||||
|
fields_values.setdefault('id', self.new_uuid())
|
||||||
|
|
||||||
|
# Set non set field values based on OVO's default value and on whether
|
||||||
|
# it is nullable or not.
|
||||||
|
for field_name, field in self.OVO_CLASS.fields.items():
|
||||||
|
if field.default != cinder_base_ovo.fields.UnspecifiedDefault:
|
||||||
|
fields_values.setdefault(field_name, field.default)
|
||||||
|
elif field.nullable:
|
||||||
|
fields_values.setdefault(field_name, None)
|
||||||
|
|
||||||
|
return self.OVO_CLASS(context=self.context, **fields_values)
|
||||||
|
|
||||||
|
@property
|
||||||
|
def json(self):
|
||||||
|
ovo = self._ovo.obj_to_primitive()
|
||||||
|
return {'class': type(self).__name__,
|
||||||
|
'backend': self.backend.config,
|
||||||
|
'ovo': ovo}
|
||||||
|
|
||||||
|
@property
|
||||||
|
def jsons(self):
|
||||||
|
return json_lib.dumps(self.json)
|
||||||
|
|
||||||
|
def __repr__(self):
|
||||||
|
return ('<cinderlib.%s object %s on backend %s>' %
|
||||||
|
(type(self).__name__,
|
||||||
|
self.id,
|
||||||
|
self.backend.id))
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def load(cls, json_src):
|
||||||
|
backend = Backend.load_backend(json_src['backend'])
|
||||||
|
|
||||||
|
backend_name = json_src['backend']['volume_backend_name']
|
||||||
|
if backend_name in Backend.backends:
|
||||||
|
backend = Backend.backends[backend_name]
|
||||||
|
elif len(json_src['backend']) == 1:
|
||||||
|
raise Exception('Backend not present in system or json.')
|
||||||
|
else:
|
||||||
|
backend = Backend(**json_src['backend'])
|
||||||
|
|
||||||
|
ovo = cinder_base_ovo.CinderObject.obj_from_primitive(json_src['ovo'],
|
||||||
|
cls.context)
|
||||||
|
return cls._load(backend, ovo)
|
||||||
|
|
||||||
|
def _replace_ovo(self, ovo):
|
||||||
|
self._ovo = ovo
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def new_uuid():
|
||||||
|
return str(uuid.uuid4())
|
||||||
|
|
||||||
|
def __getattr__(self, name):
|
||||||
|
if name == '_ovo':
|
||||||
|
raise AttributeError('Attribute _ovo is not yet set')
|
||||||
|
return getattr(self._ovo, name)
|
||||||
|
|
||||||
|
|
||||||
|
class Volume(Object):
    OVO_CLASS = volume_cmd.objects.Volume
    DEFAULT_FIELDS_VALUES = {
        'size': 1,
        'user_id': Object.context.user_id,
        'project_id': Object.context.project_id,
        'host': volume_cmd.CONF.host,
        'status': 'creating',
        'attach_status': 'detached',
        'metadata': {},
        'admin_metadata': {},
        'glance_metadata': {},
    }

    _ignore_keys = ('id', 'volume_attachment', 'snapshots')

    def __init__(self, backend_or_vol, **kwargs):
        # Accept a volume as additional source data
        if isinstance(backend_or_vol, Volume):
            for key in backend_or_vol._ovo.fields:
                if (backend_or_vol._ovo.obj_attr_is_set(key) and
                        key not in self._ignore_keys):
                    kwargs.setdefault(key, getattr(backend_or_vol._ovo, key))
            backend_or_vol = backend_or_vol.backend

        if '__ovo' not in kwargs:
            if 'description' in kwargs:
                kwargs['display_description'] = kwargs.pop('description')
            if 'name' in kwargs:
                kwargs['display_name'] = kwargs.pop('name')
            kwargs.setdefault(
                'volume_attachment',
                volume_cmd.objects.VolumeAttachmentList(context=self.context))
            kwargs.setdefault(
                'snapshots',
                volume_cmd.objects.SnapshotList(context=self.context))

        super(Volume, self).__init__(backend_or_vol, **kwargs)
        self.snapshots = set()
        self.connections = []
        self._populate_data()
        self.backend.volumes.add(self)

    def _to_primitive(self):
        local_attach = self.local_attach.id if self.local_attach else None
        return {'local_attach': local_attach,
                'exported': self.exported}

    @classmethod
    def _load(cls, backend, ovo):
        # Restore the snapshots' circular reference removed on serialization
        for snap in ovo.snapshots:
            snap.volume = ovo

        # If this object is already present it will be replaced
        obj = Object.objects['Volume'].get(ovo.id)
        if obj:
            obj._replace_ovo(ovo)
        else:
            obj = cls(backend, __ovo=ovo)
        return obj

    def _replace_ovo(self, ovo):
        super(Volume, self)._replace_ovo(ovo)
        self._populate_data()

    def _populate_data(self):
        old_snapshots = {snap.id: snap for snap in self.snapshots}

        for snap_ovo in self._ovo.snapshots:
            snap = Object.objects['Snapshot'].get(snap_ovo.id)
            if snap:
                snap._replace_ovo(snap_ovo)
                del old_snapshots[snap.id]
            else:
                snap = Snapshot(self, __ovo=snap_ovo)
                self.snapshots.add(snap)

        for snap_id, snap in old_snapshots.items():
            self.snapshots.discard(snap)
            # We leave snapshots in the global DB just in case...
            # del Object.objects['Snapshot'][snap_id]

        old_connections = {conn.id: conn for conn in self.connections}

        for conn_ovo in self._ovo.volume_attachment:
            conn = Object.objects['Connection'].get(conn_ovo.id)
            if conn:
                conn._replace_ovo(conn_ovo)
                del old_connections[conn.id]
            else:
                conn = Connection(self.backend, volume=self, __ovo=conn_ovo)
                self.connections.append(conn)

        for conn_id, conn in old_connections.items():
            self.connections.remove(conn)
            # We leave connections in the global DB just in case...
            # del Object.objects['Connection'][conn_id]

        data = getattr(self._ovo, 'cinderlib_data', {})
        self.exported = data.get('exported', False)
        self.local_attach = data.get('local_attach', None)
        if self.local_attach:
            self.local_attach = Object.objects['Connection'][self.local_attach]

    def create(self):
        try:
            model_update = self.backend.driver.create_volume(self._ovo)
            self._ovo.status = 'available'
            if model_update:
                self._ovo.update(model_update)
        except Exception:
            self._ovo.status = 'error'
            # TODO: raise with the vol info
            raise

    def delete(self):
        # Some backends delete existing snapshots while others leave them
        try:
            self.backend.driver.delete_volume(self._ovo)
            self._ovo.status = 'deleted'
            self._ovo.deleted = True
            # volume.deleted_at =
        except Exception:
            self._ovo.status = 'error'
            # TODO: raise with the vol info
            raise
        self.backend.volumes.discard(self)

    def extend(self, size):
        volume = self._ovo
        volume.previous_status = volume.status
        volume.status = 'extending'
        try:
            self.backend.driver.extend_volume(volume, size)
        except Exception:
            volume.status = 'error'
            # TODO: raise with the vol info
            raise

        volume.size = size
        volume.status = volume.previous_status
        volume.previous_status = None

    def clone(self, **new_vol_attrs):
        new_vol_attrs['source_vol_id'] = self.id
        new_vol = Volume(self, **new_vol_attrs)
        try:
            model_update = self.backend.driver.create_cloned_volume(
                new_vol._ovo, self._ovo)
            new_vol.status = 'available'
            if model_update:
                new_vol.update(model_update)
        except Exception:
            new_vol.status = 'error'
            # TODO: raise with the new volume info
            raise
        return new_vol

    def create_snapshot(self, name='', description='', **kwargs):
        snap = Snapshot(self, name=name, description=description, **kwargs)
        snap.create()
        self.snapshots.add(snap)
        self._ovo.snapshots.objects.append(snap._ovo)
        return snap

    def attach(self):
        connector_dict = utils.brick_get_connector_properties(
            self.backend.configuration.use_multipath_for_image_xfer,
            self.backend.configuration.enforce_multipath_for_image_xfer)
        conn = self.connect(connector_dict)
        try:
            conn.attach()
        except Exception:
            self.disconnect(conn)
            raise
        return conn

    def detach(self, force=False, ignore_errors=False):
        if not self.local_attach:
            raise Exception('Not attached')
        exc = brick_exception.ExceptionChainer()

        conn = self.local_attach
        try:
            conn.detach(force, ignore_errors, exc)
        except Exception:
            if not force:
                raise

        with exc.context(force, 'Unable to disconnect'):
            conn.disconnect(force)

        if exc and not ignore_errors:
            raise exc

    def connect(self, connector_dict):
        if not self.exported:
            model_update = self.backend.driver.create_export(self.context,
                                                             self._ovo,
                                                             connector_dict)
            if model_update:
                self._ovo.update(model_update)
            self.exported = True

        try:
            conn = Connection.connect(self, connector_dict)
            self.connections.append(conn)
            self._ovo.status = 'in-use'
        except Exception:
            if not self.connections:
                self._remove_export()
            # TODO: Improve raised exception
            raise
        return conn

    def _disconnect(self, connection):
        self.connections.remove(connection)
        if not self.connections:
            self._remove_export()
            self._ovo.status = 'available'

    def disconnect(self, connection, force=False):
        connection._disconnect(force)
        self._disconnect(connection)

    def cleanup(self):
        for attach in self.attachments:
            attach.detach()
        self._remove_export()

    def _remove_export(self):
        self.backend.driver.remove_export(self._context, self._ovo)
        self.exported = False


class Connection(Object):
    OVO_CLASS = volume_cmd.objects.VolumeAttachment

    @classmethod
    def connect(cls, volume, connector):
        conn_info = volume.backend.driver.initialize_connection(
            volume._ovo, connector)
        conn = cls(volume.backend,
                   connector=connector,
                   volume=volume,
                   status='attached',
                   attach_mode='rw',
                   connection_info=conn_info)
        volume._ovo.volume_attachment.objects.append(conn._ovo)
        return conn

    def __init__(self, *args, **kwargs):
        self.connected = True
        self.volume = kwargs.pop('volume')
        self.connector = kwargs.pop('connector', None)
        self.attach_info = kwargs.pop('attach_info', None)
        if '__ovo' not in kwargs:
            kwargs['volume'] = self.volume._ovo
            kwargs['volume_id'] = self.volume._ovo.id

        super(Connection, self).__init__(*args, **kwargs)

        self._populate_data()

    def _to_primitive(self):
        attach_info = self.attach_info.copy()
        connector = self.attach_info['connector']
        attach_info['connector'] = {
            'use_multipath': connector.use_multipath,
            'device_scan_attempts': connector.device_scan_attempts,
        }

        return {'connector': self.connector,
                'attach_info': attach_info}

    def _populate_data(self):
        # Ensure the circular reference is set
        self._ovo.volume = self.volume._ovo

        data = getattr(self._ovo, 'cinderlib_data', None)
        if data:
            self.connector = data.get('connector', None)
            self.attach_info = data.get('attach_info', None)
        conn = (self.attach_info or {}).get('connector')
        if isinstance(conn, dict):
            self.attach_info['connector'] = utils.brick_get_connector(
                self.connection_info['driver_volume_type'],
                conn=self.connection_info,
                **conn)

    def _replace_ovo(self, ovo):
        super(Connection, self)._replace_ovo(ovo)
        self._populate_data()

    @classmethod
    def _load(cls, backend, ovo):
        # Turn this around and do a Volume load
        volume = ovo.volume
        volume.volume_attachment.objects.append(ovo)
        # Remove circular reference
        delattr(ovo, base_ovo._get_attrname('volume'))
        Volume._load(backend, volume)
        return Connection.objects[ovo.id]

    def _disconnect(self, force=False):
        self.backend.driver.terminate_connection(self._ovo.volume,
                                                 self.connector,
                                                 force=force)
        self.connected = False

        self._ovo.volume.volume_attachment.objects.remove(self._ovo)
        self._ovo.status = 'detached'
        self._ovo.deleted = True

    def disconnect(self, force=False):
        self._disconnect(force)
        self.volume._disconnect(self)

    def attach(self):
        self.attach_info = self.backend.driver._connect_device(
            self.connection_info)
        self.attached = True
        self.volume.local_attach = self

    def detach(self, force=False, ignore_errors=False, exc=None):
        if not exc:
            exc = brick_exception.ExceptionChainer()
        connector = self.attach_info['connector']
        with exc.context(force, 'Disconnect failed'):
            connector.disconnect_volume(self.connection_info['data'],
                                        self.attach_info['device'],
                                        force=force,
                                        ignore_errors=ignore_errors)
            self.attached = False
            self.volume.local_attach = None

        if exc and not ignore_errors:
            raise exc

    @property
    def path(self):
        if self.attach_info:
            return self.attach_info['device']['path']
        return None


class Snapshot(Object):
    OVO_CLASS = volume_cmd.objects.Snapshot
    DEFAULT_FIELDS_VALUES = {
        'status': 'creating',
        'metadata': {},
    }

    def __init__(self, volume, **kwargs):
        self.volume = volume
        if '__ovo' in kwargs:
            # Ensure the circular reference is set
            kwargs['__ovo'].volume = volume._ovo
        else:
            kwargs.setdefault('user_id', volume.user_id)
            kwargs.setdefault('project_id', volume.project_id)
            kwargs['volume_id'] = volume.id
            kwargs['volume_size'] = volume.size
            kwargs['volume_type_id'] = volume.volume_type_id
            kwargs['volume'] = volume._ovo

        if 'description' in kwargs:
            kwargs['display_description'] = kwargs.pop('description')
        if 'name' in kwargs:
            kwargs['display_name'] = kwargs.pop('name')

        super(Snapshot, self).__init__(volume.backend, **kwargs)

    @classmethod
    def _load(cls, backend, ovo):
        # Turn this around and do a Volume load
        volume = ovo.volume
        volume.snapshots.objects.append(ovo)
        # Remove circular reference
        delattr(ovo, base_ovo._get_attrname('volume'))
        Volume._load(backend, volume)
        return Snapshot.objects[ovo.id]

    def _replace_ovo(self, ovo):
        super(Snapshot, self)._replace_ovo(ovo)
        # Ensure the circular reference is set
        self._ovo.volume = self.volume._ovo

    def create(self):
        try:
            model_update = self.backend.driver.create_snapshot(self._ovo)
            self._ovo.status = 'available'
            if model_update:
                self._ovo.update(model_update)
        except Exception:
            self._ovo.status = 'error'
            # TODO: raise with the snap info
            raise

    def delete(self):
        try:
            self.backend.driver.delete_snapshot(self._ovo)
            self._ovo.status = 'deleted'
            self._ovo.deleted = True
            # snapshot.deleted_at =
        except Exception:
            self._ovo.status = 'error'
            # TODO: raise with the snap info
            raise
        self.volume.snapshots.discard(self)
        try:
            self.volume._ovo.snapshots.objects.remove(self._ovo)
        except ValueError:
            pass

    def create_volume(self, **new_vol_params):
        new_vol_params['snapshot_id'] = self.id
        new_vol = Volume(self.volume, **new_vol_params)
        try:
            model_update = self.backend.driver.create_volume_from_snapshot(
                new_vol._ovo, self._ovo)
            new_vol._ovo.status = 'available'
            if model_update:
                new_vol._ovo.update(model_update)
        except Exception:
            new_vol._ovo.status = 'error'
            # TODO: raise with the new volume info
            raise
        return new_vol


class OVO(object):
    """Oslo Versioned Objects helper class.

    This class prevents OVOs from actually trying to save to the DB on
    request, replaces some 'get_by_id' methods to keep them from going to
    the DB while still returning the expected data, removes circular
    references when saving objects (for example a Volume OVO has a
    'snapshots' field holding Snapshot OVOs that have a 'volume' back
    reference), and piggybacks on the OVOs' serialization mechanism to
    add/get the additional data we want.
    """
    OBJ_NAME_MAPS = {'VolumeAttachment': 'Connection'}

    @classmethod
    def _ovo_init(cls, non_uuid_ids):
        # Create fake DB for drivers
        cls.fake_db = DB()

        # Replace the standard DB configuration for code that doesn't use the
        # driver.db attribute (ie: OVOs).
        volume_cmd.session.IMPL = cls.fake_db

        # Replace get_by_id methods with something that will return expected
        # data
        volume_cmd.objects.Volume.get_by_id = DB.volume_get
        volume_cmd.objects.Snapshot.get_by_id = DB.snapshot_get

        # Use custom dehydration methods that prevent maximum recursion errors
        # due to circular references:
        # ie: snapshot -> volume -> snapshots -> snapshot
        base_ovo.VersionedObject.obj_to_primitive = cls.obj_to_primitive
        cinder_base_ovo.CinderObject.obj_from_primitive = classmethod(
            cls.obj_from_primitive)

        fields = base_ovo.obj_fields
        fields.Object.to_primitive = staticmethod(cls.field_ovo_to_primitive)
        fields.Field.to_primitive = cls.field_to_primitive
        fields.List.to_primitive = cls.iterable_to_primitive
        fields.Set.to_primitive = cls.iterable_to_primitive
        fields.Dict.to_primitive = cls.dict_to_primitive
        cls.wrap_to_primitive(fields.FieldType)
        cls.wrap_to_primitive(fields.DateTime)
        cls.wrap_to_primitive(fields.IPAddress)

        # Disable saving in ovos
        for ovo_name in cinder_base_ovo.CinderObjectRegistry.obj_classes():
            ovo_cls = getattr(volume_cmd.objects, ovo_name)
            ovo_cls.save = lambda *args, **kwargs: None
            if non_uuid_ids and 'id' in ovo_cls.fields:
                ovo_cls.fields['id'] = cinder_base_ovo.fields.StringField()

    @staticmethod
    def wrap_to_primitive(cls):
        method = getattr(cls, 'to_primitive')

        @functools.wraps(method)
        def to_primitive(obj, attr, value, visited=None):
            return method(obj, attr, value)
        setattr(cls, 'to_primitive', staticmethod(to_primitive))

    @staticmethod
    def obj_to_primitive(self, target_version=None, version_manifest=None,
                         visited=None):
        # No target_version, version_manifest, or changes support
        if visited is None:
            visited = set()
        visited.add(id(self))

        primitive = {}
        for name, field in self.fields.items():
            if self.obj_attr_is_set(name):
                value = getattr(self, name)
                # Skip cycles
                if id(value) in visited:
                    continue
                primitive[name] = field.to_primitive(self, name, value,
                                                     visited)

        obj_name = self.obj_name()
        obj = {
            self._obj_primitive_key('name'): obj_name,
            self._obj_primitive_key('namespace'): self.OBJ_PROJECT_NAMESPACE,
            self._obj_primitive_key('version'): self.VERSION,
            self._obj_primitive_key('data'): primitive
        }

        # Piggyback to store our own data
        my_obj_name = OVO.OBJ_NAME_MAPS.get(obj_name, obj_name)
        if 'id' in primitive and my_obj_name in Object.objects:
            my_obj = Object.objects[my_obj_name][primitive['id']]
            obj['cinderlib.data'] = my_obj._to_primitive()

        return obj

    @staticmethod
    def obj_from_primitive(
            cls, primitive, context=None,
            original_method=cinder_base_ovo.CinderObject.obj_from_primitive):
        result = original_method(primitive, context)
        result.cinderlib_data = primitive.get('cinderlib.data')
        return result

    @staticmethod
    def field_ovo_to_primitive(obj, attr, value, visited=None):
        return value.obj_to_primitive(visited=visited)

    @staticmethod
    def field_to_primitive(self, obj, attr, value, visited=None):
        if value is None:
            return None
        return self._type.to_primitive(obj, attr, value, visited)

    @staticmethod
    def iterable_to_primitive(self, obj, attr, value, visited=None):
        if visited is None:
            visited = set()
        visited.add(id(value))
        result = []
        for elem in value:
            if id(elem) in visited:
                continue
            visited.add(id(elem))
            r = self._element_type.to_primitive(obj, attr, elem, visited)
            result.append(r)
        return result

    @staticmethod
    def dict_to_primitive(self, obj, attr, value, visited=None):
        if visited is None:
            visited = set()
        visited.add(id(value))

        primitive = {}
        for key, elem in value.items():
            if id(elem) in visited:
                continue
            visited.add(id(elem))
            primitive[key] = self._element_type.to_primitive(
                obj, '%s["%s"]' % (attr, key), elem, visited)
        return primitive


class DB(object):
    """Replacement for DB access methods.

    This will serve as a replacement for methods used by:

    - Drivers
    - OVOs' get_by_id method
    - DB implementation

    Data will be retrieved based on the objects we store in our own Volume
    and Snapshot classes.
    """

    @classmethod
    def volume_get(cls, context, volume_id, *args, **kwargs):
        if volume_id not in Volume.objects:
            raise exception.VolumeNotFound(volume_id=volume_id)
        return Volume.objects[volume_id]._ovo

    @classmethod
    def snapshot_get(cls, context, snapshot_id, *args, **kwargs):
        if snapshot_id not in Snapshot.objects:
            raise exception.SnapshotNotFound(snapshot_id=snapshot_id)
        return Snapshot.objects[snapshot_id]._ovo


class MyDict(dict):
    """Custom non-clearable dictionary.

    Required to overcome the nature of oslo.config, where configuration
    comes from files and command line input.

    Using this dictionary we can load everything from memory, and it won't
    be cleared when we dynamically load a driver that brings in new
    configuration options.
    """
    def clear(self):
        pass
177
docs/Makefile
Normal file
@@ -0,0 +1,177 @@
# Makefile for Sphinx documentation
#

# You can set these variables from the command line.
SPHINXOPTS    =
SPHINXBUILD   = sphinx-build
PAPER         =
BUILDDIR      = _build

# User-friendly check for sphinx-build
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
endif

# Internal variables.
PAPEROPT_a4     = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS   = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS  = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .

.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext

help:
	@echo "Please use \`make <target>' where <target> is one of"
	@echo "  html       to make standalone HTML files"
	@echo "  dirhtml    to make HTML files named index.html in directories"
	@echo "  singlehtml to make a single large HTML file"
	@echo "  pickle     to make pickle files"
	@echo "  json       to make JSON files"
	@echo "  htmlhelp   to make HTML files and a HTML help project"
	@echo "  qthelp     to make HTML files and a qthelp project"
	@echo "  devhelp    to make HTML files and a Devhelp project"
	@echo "  epub       to make an epub"
	@echo "  latex      to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
	@echo "  latexpdf   to make LaTeX files and run them through pdflatex"
	@echo "  latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
	@echo "  text       to make text files"
	@echo "  man        to make manual pages"
	@echo "  texinfo    to make Texinfo files"
	@echo "  info       to make Texinfo files and run them through makeinfo"
	@echo "  gettext    to make PO message catalogs"
	@echo "  changes    to make an overview of all changed/added/deprecated items"
	@echo "  xml        to make Docutils-native XML files"
	@echo "  pseudoxml  to make pseudoxml-XML files for display purposes"
	@echo "  linkcheck  to check all external links for integrity"
	@echo "  doctest    to run all doctests embedded in the documentation (if enabled)"

clean:
	rm -rf $(BUILDDIR)/*

html:
	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."

dirhtml:
	$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."

singlehtml:
	$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
	@echo
	@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."

pickle:
	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
	@echo
	@echo "Build finished; now you can process the pickle files."

json:
	$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
	@echo
	@echo "Build finished; now you can process the JSON files."

htmlhelp:
	$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
	@echo
	@echo "Build finished; now you can run HTML Help Workshop with the" \
	      ".hhp project file in $(BUILDDIR)/htmlhelp."

qthelp:
	$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
	@echo
	@echo "Build finished; now you can run "qcollectiongenerator" with the" \
	      ".qhcp project file in $(BUILDDIR)/qthelp, like this:"
	@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/cinderlib.qhcp"
	@echo "To view the help file:"
	@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/cinderlib.qhc"

devhelp:
	$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
	@echo
	@echo "Build finished."
	@echo "To view the help file:"
	@echo "# mkdir -p $$HOME/.local/share/devhelp/cinderlib"
	@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/cinderlib"
	@echo "# devhelp"

epub:
	$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
	@echo
	@echo "Build finished. The epub file is in $(BUILDDIR)/epub."

latex:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo
	@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
	@echo "Run \`make' in that directory to run these through (pdf)latex" \
	      "(use \`make latexpdf' here to do that automatically)."

latexpdf:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through pdflatex..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

latexpdfja:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through platex and dvipdfmx..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

text:
	$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
	@echo
	@echo "Build finished. The text files are in $(BUILDDIR)/text."

man:
	$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
	@echo
	@echo "Build finished. The manual pages are in $(BUILDDIR)/man."

texinfo:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo
	@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
	@echo "Run \`make' in that directory to run these through makeinfo" \
	      "(use \`make info' here to do that automatically)."

info:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo "Running Texinfo files through makeinfo..."
	make -C $(BUILDDIR)/texinfo info
	@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."

gettext:
	$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
	@echo
	@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."

changes:
	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
	@echo
	@echo "The overview file is in $(BUILDDIR)/changes."

linkcheck:
	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
	@echo
	@echo "Link check complete; look for any errors in the above output " \
	      "or in $(BUILDDIR)/linkcheck/output.txt."

doctest:
	$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
	@echo "Testing of doctests in the sources finished, look at the " \
	      "results in $(BUILDDIR)/doctest/output.txt."

xml:
	$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
	@echo
	@echo "Build finished. The XML files are in $(BUILDDIR)/xml."

pseudoxml:
	$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
	@echo
	@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."
1
docs/authors.rst
Normal file
@@ -0,0 +1 @@
.. include:: ../AUTHORS.rst
275
docs/conf.py
Executable file
@@ -0,0 +1,275 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# cinderlib documentation build configuration file, created by
# sphinx-quickstart on Tue Jul 9 22:26:36 2013.
#
# This file is execfile()d with the current directory set to its
# containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.

import sys
import os

# If extensions (or modules to document with autodoc) are in another
# directory, add these directories to sys.path here. If the directory is
# relative to the documentation root, use os.path.abspath to make it
# absolute, like shown here.
#sys.path.insert(0, os.path.abspath('.'))

# Get the project root dir, which is the parent dir of this
cwd = os.getcwd()
project_root = os.path.dirname(cwd)

# Insert the project root dir as the first element in the PYTHONPATH.
# This lets us ensure that the source package is imported, and that its
# version is used.
sys.path.insert(0, project_root)

import cinderlib

# -- General configuration ---------------------------------------------

# If your documentation needs a minimal Sphinx version, state it here.
#needs_sphinx = '1.0'

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = ['sphinx.ext.autodoc', 'sphinx.ext.viewcode']

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

# The suffix of source filenames.
source_suffix = '.rst'

# The encoding of source files.
#source_encoding = 'utf-8-sig'

# The master toctree document.
master_doc = 'index'

# General information about the project.
project = u'Cinder Library'
copyright = u"2017, Gorka Eguileor"

# The version info for the project you're documenting, acts as replacement
# for |version| and |release|, also used in various other places throughout
# the built documents.
#
# The short X.Y version.
||||||
|
version = cinderlib.__version__
|
||||||
|
# The full version, including alpha/beta/rc tags.
|
||||||
|
release = cinderlib.__version__
|
||||||
|
|
||||||
|
# The language for content autogenerated by Sphinx. Refer to documentation
|
||||||
|
# for a list of supported languages.
|
||||||
|
#language = None
|
||||||
|
|
||||||
|
# There are two options for replacing |today|: either, you set today to
|
||||||
|
# some non-false value, then it is used:
|
||||||
|
#today = ''
|
||||||
|
# Else, today_fmt is used as the format for a strftime call.
|
||||||
|
#today_fmt = '%B %d, %Y'
|
||||||
|
|
||||||
|
# List of patterns, relative to source directory, that match files and
|
||||||
|
# directories to ignore when looking for source files.
|
||||||
|
exclude_patterns = ['_build']
|
||||||
|
|
||||||
|
# The reST default role (used for this markup: `text`) to use for all
|
||||||
|
# documents.
|
||||||
|
#default_role = None
|
||||||
|
|
||||||
|
# If true, '()' will be appended to :func: etc. cross-reference text.
|
||||||
|
#add_function_parentheses = True
|
||||||
|
|
||||||
|
# If true, the current module name will be prepended to all description
|
||||||
|
# unit titles (such as .. function::).
|
||||||
|
#add_module_names = True
|
||||||
|
|
||||||
|
# If true, sectionauthor and moduleauthor directives will be shown in the
|
||||||
|
# output. They are ignored by default.
|
||||||
|
#show_authors = False
|
||||||
|
|
||||||
|
# The name of the Pygments (syntax highlighting) style to use.
|
||||||
|
pygments_style = 'sphinx'
|
||||||
|
|
||||||
|
# A list of ignored prefixes for module index sorting.
|
||||||
|
#modindex_common_prefix = []
|
||||||
|
|
||||||
|
# If true, keep warnings as "system message" paragraphs in the built
|
||||||
|
# documents.
|
||||||
|
#keep_warnings = False
|
||||||
|
|
||||||
|
|
||||||
|
# -- Options for HTML output -------------------------------------------
|
||||||
|
|
||||||
|
# The theme to use for HTML and HTML Help pages. See the documentation for
|
||||||
|
# a list of builtin themes.
|
||||||
|
html_theme = 'default'
|
||||||
|
|
||||||
|
# Theme options are theme-specific and customize the look and feel of a
|
||||||
|
# theme further. For a list of options available for each theme, see the
|
||||||
|
# documentation.
|
||||||
|
#html_theme_options = {}
|
||||||
|
|
||||||
|
# Add any paths that contain custom themes here, relative to this directory.
|
||||||
|
#html_theme_path = []
|
||||||
|
|
||||||
|
# The name for this set of Sphinx documents. If None, it defaults to
|
||||||
|
# "<project> v<release> documentation".
|
||||||
|
#html_title = None
|
||||||
|
|
||||||
|
# A shorter title for the navigation bar. Default is the same as
|
||||||
|
# html_title.
|
||||||
|
#html_short_title = None
|
||||||
|
|
||||||
|
# The name of an image file (relative to this directory) to place at the
|
||||||
|
# top of the sidebar.
|
||||||
|
#html_logo = None
|
||||||
|
|
||||||
|
# The name of an image file (within the static path) to use as favicon
|
||||||
|
# of the docs. This file should be a Windows icon file (.ico) being
|
||||||
|
# 16x16 or 32x32 pixels large.
|
||||||
|
#html_favicon = None
|
||||||
|
|
||||||
|
# Add any paths that contain custom static files (such as style sheets)
|
||||||
|
# here, relative to this directory. They are copied after the builtin
|
||||||
|
# static files, so a file named "default.css" will overwrite the builtin
|
||||||
|
# "default.css".
|
||||||
|
html_static_path = ['_static']
|
||||||
|
|
||||||
|
# If not '', a 'Last updated on:' timestamp is inserted at every page
|
||||||
|
# bottom, using the given strftime format.
|
||||||
|
#html_last_updated_fmt = '%b %d, %Y'
|
||||||
|
|
||||||
|
# If true, SmartyPants will be used to convert quotes and dashes to
|
||||||
|
# typographically correct entities.
|
||||||
|
#html_use_smartypants = True
|
||||||
|
|
||||||
|
# Custom sidebar templates, maps document names to template names.
|
||||||
|
#html_sidebars = {}
|
||||||
|
|
||||||
|
# Additional templates that should be rendered to pages, maps page names
|
||||||
|
# to template names.
|
||||||
|
#html_additional_pages = {}
|
||||||
|
|
||||||
|
# If false, no module index is generated.
|
||||||
|
#html_domain_indices = True
|
||||||
|
|
||||||
|
# If false, no index is generated.
|
||||||
|
#html_use_index = True
|
||||||
|
|
||||||
|
# If true, the index is split into individual pages for each letter.
|
||||||
|
#html_split_index = False
|
||||||
|
|
||||||
|
# If true, links to the reST sources are added to the pages.
|
||||||
|
#html_show_sourcelink = True
|
||||||
|
|
||||||
|
# If true, "Created using Sphinx" is shown in the HTML footer.
|
||||||
|
# Default is True.
|
||||||
|
#html_show_sphinx = True
|
||||||
|
|
||||||
|
# If true, "(C) Copyright ..." is shown in the HTML footer.
|
||||||
|
# Default is True.
|
||||||
|
#html_show_copyright = True
|
||||||
|
|
||||||
|
# If true, an OpenSearch description file will be output, and all pages
|
||||||
|
# will contain a <link> tag referring to it. The value of this option
|
||||||
|
# must be the base URL from which the finished HTML is served.
|
||||||
|
#html_use_opensearch = ''
|
||||||
|
|
||||||
|
# This is the file name suffix for HTML files (e.g. ".xhtml").
|
||||||
|
#html_file_suffix = None
|
||||||
|
|
||||||
|
# Output file base name for HTML help builder.
|
||||||
|
htmlhelp_basename = 'cinderlibdoc'
|
||||||
|
|
||||||
|
|
||||||
|
# -- Options for LaTeX output ------------------------------------------
|
||||||
|
|
||||||
|
latex_elements = {
|
||||||
|
# The paper size ('letterpaper' or 'a4paper').
|
||||||
|
#'papersize': 'letterpaper',
|
||||||
|
|
||||||
|
# The font size ('10pt', '11pt' or '12pt').
|
||||||
|
#'pointsize': '10pt',
|
||||||
|
|
||||||
|
# Additional stuff for the LaTeX preamble.
|
||||||
|
#'preamble': '',
|
||||||
|
}
|
||||||
|
|
||||||
|
# Grouping the document tree into LaTeX files. List of tuples
|
||||||
|
# (source start file, target name, title, author, documentclass
|
||||||
|
# [howto/manual]).
|
||||||
|
latex_documents = [
|
||||||
|
('index', 'cinderlib.tex',
|
||||||
|
u'Cinder Library Documentation',
|
||||||
|
u'Gorka Eguileor', 'manual'),
|
||||||
|
]
|
||||||
|
|
||||||
|
# The name of an image file (relative to this directory) to place at
|
||||||
|
# the top of the title page.
|
||||||
|
#latex_logo = None
|
||||||
|
|
||||||
|
# For "manual" documents, if this is true, then toplevel headings
|
||||||
|
# are parts, not chapters.
|
||||||
|
#latex_use_parts = False
|
||||||
|
|
||||||
|
# If true, show page references after internal links.
|
||||||
|
#latex_show_pagerefs = False
|
||||||
|
|
||||||
|
# If true, show URL addresses after external links.
|
||||||
|
#latex_show_urls = False
|
||||||
|
|
||||||
|
# Documents to append as an appendix to all manuals.
|
||||||
|
#latex_appendices = []
|
||||||
|
|
||||||
|
# If false, no module index is generated.
|
||||||
|
#latex_domain_indices = True
|
||||||
|
|
||||||
|
|
||||||
|
# -- Options for manual page output ------------------------------------
|
||||||
|
|
||||||
|
# One entry per manual page. List of tuples
|
||||||
|
# (source start file, name, description, authors, manual section).
|
||||||
|
man_pages = [
|
||||||
|
('index', 'cinderlib',
|
||||||
|
u'Cinder Library Documentation',
|
||||||
|
[u'Gorka Eguileor'], 1)
|
||||||
|
]
|
||||||
|
|
||||||
|
# If true, show URL addresses after external links.
|
||||||
|
#man_show_urls = False
|
||||||
|
|
||||||
|
|
||||||
|
# -- Options for Texinfo output ----------------------------------------
|
||||||
|
|
||||||
|
# Grouping the document tree into Texinfo files. List of tuples
|
||||||
|
# (source start file, target name, title, author,
|
||||||
|
# dir menu entry, description, category)
|
||||||
|
texinfo_documents = [
|
||||||
|
('index', 'cinderlib',
|
||||||
|
u'Cinder Library Documentation',
|
||||||
|
u'Gorka Eguileor',
|
||||||
|
'cinderlib',
|
||||||
|
'One line description of project.',
|
||||||
|
'Miscellaneous'),
|
||||||
|
]
|
||||||
|
|
||||||
|
# Documents to append as an appendix to all manuals.
|
||||||
|
#texinfo_appendices = []
|
||||||
|
|
||||||
|
# If false, no module index is generated.
|
||||||
|
#texinfo_domain_indices = True
|
||||||
|
|
||||||
|
# How to display URL addresses: 'footnote', 'no', or 'inline'.
|
||||||
|
#texinfo_show_urls = 'footnote'
|
||||||
|
|
||||||
|
# If true, do not generate a @detailmenu in the "Top" node's menu.
|
||||||
|
#texinfo_no_detailmenu = False
|
||||||
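conf.py above single-sources the documentation version: it prepends the project root to `sys.path`, imports the package, and reuses `cinderlib.__version__` for both `version` and `release`. A minimal standalone sketch of that pattern (using a fake stand-in module so it runs without cinderlib installed):

```python
import sys
import types

# Stand-in for the real package: fakes what cinderlib/__init__.py provides
# in this commit, namely a __version__ attribute ("fakelib" is our name).
fakelib_mod = types.ModuleType("fakelib")
fakelib_mod.__version__ = "0.1.0"
sys.modules["fakelib"] = fakelib_mod

# conf.py's pattern: import the package and reuse its version string for
# both the short X.Y version and the full release.
import fakelib
version = fakelib.__version__
release = fakelib.__version__
print(version, release)
```

Because Sphinx substitutes `|version|` and `|release|` from these variables, bumping `__version__` in the package automatically updates the built docs.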
docs/contributing.rst (1 line, new file)
@@ -0,0 +1 @@
.. include:: ../CONTRIBUTING.rst
docs/history.rst (1 line, new file)
@@ -0,0 +1 @@
.. include:: ../HISTORY.rst
docs/index.rst (20 lines, new file)
@@ -0,0 +1,20 @@
Welcome to Cinder Library's documentation!
==========================================

Contents:

.. toctree::
   :maxdepth: 2

   readme
   installation
   usage
   contributing
   authors
   history

Indices and tables
==================

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
docs/installation.rst (51 lines, new file)
@@ -0,0 +1,51 @@
.. highlight:: shell

============
Installation
============


Stable release
--------------

To install Cinder Library, run this command in your terminal:

.. code-block:: console

    $ pip install cinderlib

This is the preferred method to install Cinder Library, as it will always install the most recent stable release.

If you don't have `pip`_ installed, this `Python installation guide`_ can guide
you through the process.

.. _pip: https://pip.pypa.io
.. _Python installation guide: http://docs.python-guide.org/en/latest/starting/installation/


From sources
------------

The sources for Cinder Library can be downloaded from the `Github repo`_.

You can either clone the public repository:

.. code-block:: console

    $ git clone git://github.com/akrog/cinderlib

Or download the `tarball`_:

.. code-block:: console

    $ curl -OL https://github.com/akrog/cinderlib/tarball/master

Once you have a copy of the source, you can install it with:

.. code-block:: console

    $ python setup.py install


.. _Github repo: https://github.com/akrog/cinderlib
.. _tarball: https://github.com/akrog/cinderlib/tarball/master
docs/make.bat (242 lines, new file)
@@ -0,0 +1,242 @@
@ECHO OFF

REM Command file for Sphinx documentation

if "%SPHINXBUILD%" == "" (
	set SPHINXBUILD=sphinx-build
)
set BUILDDIR=_build
set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% .
set I18NSPHINXOPTS=%SPHINXOPTS% .
if NOT "%PAPER%" == "" (
	set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS%
	set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS%
)

if "%1" == "" goto help

if "%1" == "help" (
	:help
	echo.Please use `make ^<target^>` where ^<target^> is one of
	echo.  html        to make standalone HTML files
	echo.  dirhtml     to make HTML files named index.html in directories
	echo.  singlehtml  to make a single large HTML file
	echo.  pickle      to make pickle files
	echo.  json        to make JSON files
	echo.  htmlhelp    to make HTML files and a HTML help project
	echo.  qthelp      to make HTML files and a qthelp project
	echo.  devhelp     to make HTML files and a Devhelp project
	echo.  epub        to make an epub
	echo.  latex       to make LaTeX files, you can set PAPER=a4 or PAPER=letter
	echo.  text        to make text files
	echo.  man         to make manual pages
	echo.  texinfo     to make Texinfo files
	echo.  gettext     to make PO message catalogs
	echo.  changes     to make an overview over all changed/added/deprecated items
	echo.  xml         to make Docutils-native XML files
	echo.  pseudoxml   to make pseudoxml-XML files for display purposes
	echo.  linkcheck   to check all external links for integrity
	echo.  doctest     to run all doctests embedded in the documentation if enabled
	goto end
)

if "%1" == "clean" (
	for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i
	del /q /s %BUILDDIR%\*
	goto end
)


%SPHINXBUILD% 2> nul
if errorlevel 9009 (
	echo.
	echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
	echo.installed, then set the SPHINXBUILD environment variable to point
	echo.to the full path of the 'sphinx-build' executable. Alternatively you
	echo.may add the Sphinx directory to PATH.
	echo.
	echo.If you don't have Sphinx installed, grab it from
	echo.http://sphinx-doc.org/
	exit /b 1
)

if "%1" == "html" (
	%SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html
	if errorlevel 1 exit /b 1
	echo.
	echo.Build finished. The HTML pages are in %BUILDDIR%/html.
	goto end
)

if "%1" == "dirhtml" (
	%SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml
	if errorlevel 1 exit /b 1
	echo.
	echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml.
	goto end
)

if "%1" == "singlehtml" (
	%SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml
	if errorlevel 1 exit /b 1
	echo.
	echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml.
	goto end
)

if "%1" == "pickle" (
	%SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle
	if errorlevel 1 exit /b 1
	echo.
	echo.Build finished; now you can process the pickle files.
	goto end
)

if "%1" == "json" (
	%SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json
	if errorlevel 1 exit /b 1
	echo.
	echo.Build finished; now you can process the JSON files.
	goto end
)

if "%1" == "htmlhelp" (
	%SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp
	if errorlevel 1 exit /b 1
	echo.
	echo.Build finished; now you can run HTML Help Workshop with the ^
.hhp project file in %BUILDDIR%/htmlhelp.
	goto end
)

if "%1" == "qthelp" (
	%SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp
	if errorlevel 1 exit /b 1
	echo.
	echo.Build finished; now you can run "qcollectiongenerator" with the ^
.qhcp project file in %BUILDDIR%/qthelp, like this:
	echo.^> qcollectiongenerator %BUILDDIR%\qthelp\cinderlib.qhcp
	echo.To view the help file:
	echo.^> assistant -collectionFile %BUILDDIR%\qthelp\cinderlib.qhc
	goto end
)

if "%1" == "devhelp" (
	%SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp
	if errorlevel 1 exit /b 1
	echo.
	echo.Build finished.
	goto end
)

if "%1" == "epub" (
	%SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub
	if errorlevel 1 exit /b 1
	echo.
	echo.Build finished. The epub file is in %BUILDDIR%/epub.
	goto end
)

if "%1" == "latex" (
	%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
	if errorlevel 1 exit /b 1
	echo.
	echo.Build finished; the LaTeX files are in %BUILDDIR%/latex.
	goto end
)

if "%1" == "latexpdf" (
	%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
	cd %BUILDDIR%/latex
	make all-pdf
	cd %BUILDDIR%/..
	echo.
	echo.Build finished; the PDF files are in %BUILDDIR%/latex.
	goto end
)

if "%1" == "latexpdfja" (
	%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
	cd %BUILDDIR%/latex
	make all-pdf-ja
	cd %BUILDDIR%/..
	echo.
	echo.Build finished; the PDF files are in %BUILDDIR%/latex.
	goto end
)

if "%1" == "text" (
	%SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text
	if errorlevel 1 exit /b 1
	echo.
	echo.Build finished. The text files are in %BUILDDIR%/text.
	goto end
)

if "%1" == "man" (
	%SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man
	if errorlevel 1 exit /b 1
	echo.
	echo.Build finished. The manual pages are in %BUILDDIR%/man.
	goto end
)

if "%1" == "texinfo" (
	%SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo
	if errorlevel 1 exit /b 1
	echo.
	echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo.
	goto end
)

if "%1" == "gettext" (
	%SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale
	if errorlevel 1 exit /b 1
	echo.
	echo.Build finished. The message catalogs are in %BUILDDIR%/locale.
	goto end
)

if "%1" == "changes" (
	%SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes
	if errorlevel 1 exit /b 1
	echo.
	echo.The overview file is in %BUILDDIR%/changes.
	goto end
)

if "%1" == "linkcheck" (
	%SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck
	if errorlevel 1 exit /b 1
	echo.
	echo.Link check complete; look for any errors in the above output ^
or in %BUILDDIR%/linkcheck/output.txt.
	goto end
)

if "%1" == "doctest" (
	%SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest
	if errorlevel 1 exit /b 1
	echo.
	echo.Testing of doctests in the sources finished, look at the ^
results in %BUILDDIR%/doctest/output.txt.
	goto end
)

if "%1" == "xml" (
	%SPHINXBUILD% -b xml %ALLSPHINXOPTS% %BUILDDIR%/xml
	if errorlevel 1 exit /b 1
	echo.
	echo.Build finished. The XML files are in %BUILDDIR%/xml.
	goto end
)

if "%1" == "pseudoxml" (
	%SPHINXBUILD% -b pseudoxml %ALLSPHINXOPTS% %BUILDDIR%/pseudoxml
	if errorlevel 1 exit /b 1
	echo.
	echo.Build finished. The pseudo-XML files are in %BUILDDIR%/pseudoxml.
	goto end
)

:end
docs/readme.rst (1 line, new file)
@@ -0,0 +1 @@
.. include:: ../README.rst
docs/usage.rst (7 lines, new file)
@@ -0,0 +1,7 @@
=====
Usage
=====

To use Cinder Library in a project::

    import cinderlib
requirements_dev.txt (10 lines, new file)
@@ -0,0 +1,10 @@
pip==8.1.2
bumpversion==0.5.3
wheel==0.29.0
watchdog==0.8.3
flake8==2.6.0
tox==2.3.1
coverage==4.1
Sphinx==1.4.8
setup.cfg (18 lines, new file)
@@ -0,0 +1,18 @@
[bumpversion]
current_version = 0.1.0
commit = True
tag = True

[bumpversion:file:setup.py]
search = version='{current_version}'
replace = version='{new_version}'

[bumpversion:file:cinderlib/__init__.py]
search = __version__ = '{current_version}'
replace = __version__ = '{new_version}'

[bdist_wheel]
universal = 1

[flake8]
exclude = docs
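The `[bumpversion:file:...]` sections in setup.cfg work by literal search-and-replace: bumpversion fills the `{current_version}`/`{new_version}` placeholders and substitutes the resulting strings in each listed file. A minimal sketch of that substitution (the file content below is a stand-in, not the real setup.py):

```python
current_version = "0.1.0"
new_version = "0.1.1"  # what e.g. `bumpversion patch` would compute

# Stand-in for the contents of setup.py.
text = "setup(\n    name='cinderlib',\n    version='0.1.0',\n)"

# The search/replace templates from setup.cfg, with the versions filled in.
search = "version='{current_version}'".format(current_version=current_version)
replace = "version='{new_version}'".format(new_version=new_version)

updated = text.replace(search, replace)
print(updated)
```

Because the match is literal, the templates must exactly reproduce the quoting and spacing used in setup.py and cinderlib/__init__.py, which is why both `search` patterns above mirror those files character for character.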
setup.py (53 lines, new file)
@@ -0,0 +1,53 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from setuptools import setup

with open('README.rst') as readme_file:
    readme = readme_file.read()

with open('HISTORY.rst') as history_file:
    history = history_file.read()

requirements = [
    'cinder>=11.0',
]

test_requirements = [
    # TODO: put package test requirements here
]

setup(
    name='cinderlib',
    version='0.1.0',
    description=("Cinder Library allows using storage drivers outside of "
                 "Cinder."),
    long_description=readme + '\n\n' + history,
    author="Gorka Eguileor",
    author_email='geguileo@redhat.com',
    url='https://github.com/akrog/cinderlib',
    packages=[
        'cinderlib',
    ],
    package_dir={'cinderlib':
                 'cinderlib'},
    include_package_data=True,
    install_requires=requirements,
    license="Apache Software License 2.0",
    zip_safe=False,
    keywords='cinderlib',
    classifiers=[
        'Development Status :: 2 - Pre-Alpha',
        'Intended Audience :: Developers',
        'License :: OSI Approved :: Apache Software License',
        'Natural Language :: English',
        "Programming Language :: Python :: 2",
        'Programming Language :: Python :: 2.7',
        'Programming Language :: Python :: 3',
        'Programming Language :: Python :: 3.3',
        'Programming Language :: Python :: 3.4',
        'Programming Language :: Python :: 3.5',
    ],
    test_suite='tests',
    tests_require=test_requirements
)
tests/__init__.py (1 line, new file)
@@ -0,0 +1 @@
# -*- coding: utf-8 -*-
tests/test_cinderlib.py (28 lines, new file)
@@ -0,0 +1,28 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

"""
test_cinderlib
----------------------------------

Tests for `cinderlib` module.
"""


import sys
import unittest

from cinderlib import cinderlib


class TestCinderlib(unittest.TestCase):

    def setUp(self):
        pass

    def tearDown(self):
        pass

    def test_000_something(self):
        pass
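The placeholder test class above is what `python setup.py test` (via `test_suite='tests'` in setup.py) and the tox environments will discover and run. A self-contained sketch of the same skeleton, run programmatically through the standard unittest runner (the class here is a stand-in that omits the cinderlib import):

```python
import unittest


class TestSkeleton(unittest.TestCase):
    """Mirrors the empty setUp/tearDown/test skeleton from the commit."""

    def setUp(self):
        pass

    def tearDown(self):
        pass

    def test_000_something(self):
        pass


# Load and run the case the same way discovery-based runners would.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestSkeleton)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())
```

An empty `test_000_something` passes trivially, so the suite reports success until real assertions are added.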
tox.ini (18 lines, new file)
@@ -0,0 +1,18 @@
[tox]
envlist = py26, py27, py33, py34, py35, flake8

[testenv:flake8]
basepython=python
deps=flake8
commands=flake8 cinderlib

[testenv]
setenv =
    PYTHONPATH = {toxinidir}:{toxinidir}/cinderlib

commands = python setup.py test

; If you want to make tox run the tests with the same versions, create a
; requirements.txt with the pinned versions and uncomment the following lines:
; deps =
;     -r{toxinidir}/requirements.txt