Retire craton

Remove everything, add a README with explanation.

Change-Id: I7c8fcfac0d904f144b6a3838d25db0987d1e74a5
Andreas Jaeger 2018-01-18 15:22:28 +01:00
parent 555d0c8856
commit e88f6b747a
137 changed files with 8 additions and 22469 deletions

View File

@ -1,7 +0,0 @@
[run]
branch = True
source = craton
omit = craton/openstack/*
[report]
ignore_errors = True

.gitignore
View File

@ -1,55 +0,0 @@
*.py[cod]
# C extensions
*.so
# Packages
*.egg*
*.egg-info
dist
build
eggs
parts
bin
var
sdist
develop-eggs
.installed.cfg
lib
lib64
# Installer logs
pip-log.txt
# Unit test / coverage reports
cover/
.coverage*
!.coveragerc
.tox
nosetests.xml
.testrepository
.venv
# Translations
*.mo
# Mr Developer
.mr.developer.cfg
.project
.pydevproject
# Complexity
output/*.html
output/*/index.html
# Sphinx
doc/build
# pbr generates these
AUTHORS
ChangeLog
# Editors
*~
.*.swp
.*sw?

View File

@ -1,4 +0,0 @@
[gerrit]
host=review.openstack.org
port=29418
project=openstack/craton.git

View File

@ -1,3 +0,0 @@
# Format is:
# <preferred e-mail> <other e-mail 1>
# <preferred e-mail> <other e-mail 2>

View File

@ -1,7 +0,0 @@
[DEFAULT]
test_command=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \
OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \
OS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-60} \
${PYTHON:-python} -m subunit.run discover -t ./ . $LISTOPT $IDOPTION
test_id_option=--load-list $IDFILE
test_list_option=--list

View File

@ -1,23 +0,0 @@
Developer's Guide
-----------------
If you would like to contribute to the development of OpenStack, you must
follow the steps in this page:
http://docs.openstack.org/infra/manual/developers.html
Development Workflow
--------------------
If you already have a good understanding of how the system works and your
OpenStack accounts are set up, you can skip to the development workflow
section of this documentation to learn how changes to OpenStack should be
submitted for review via the Gerrit tool:
http://docs.openstack.org/infra/manual/developers.html#development-workflow
Engagement
----------
Pull requests submitted through GitHub will be ignored.
Bugs should be filed on Launchpad, not GitHub:
https://bugs.launchpad.net/craton

View File

@ -1,85 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
############################################################################
## Usage:
## docker build --pull -t craton-api:latest .
## docker run -t --name craton-api -p 127.0.0.1:7780:7780 -d craton-api:latest
## docker logs <container> and copy the username, api_key, and project_id
## python tools/generate_fake_data.py --url http://127.0.0.1:7780/v1 --user bootstrap --project <project-id from above> --key <api_key from above>
## Use the credentials from above to try different commands using python-cratonclient.
##############################################################################
# Get Ubuntu base image
FROM ubuntu:16.04
# File Author / Maintainer
MAINTAINER Sulochan Acharya
# Install required software and tools
RUN apt-get update \
&& apt-get install -y \
gcc \
git \
curl \
build-essential \
python3.5 \
python3.5-dev
# Get pip
ADD https://bootstrap.pypa.io/get-pip.py /root/get-pip.py
# Install pip
RUN python3.5 /root/get-pip.py
# Install MySQL 5.7
ENV MYSQL_ROOTPW root
RUN echo "mysql-server mysql-server/root_password password $MYSQL_ROOTPW" | debconf-set-selections && \
echo "mysql-server mysql-server/root_password_again password $MYSQL_ROOTPW" | debconf-set-selections
RUN apt-get install -y mysql-server-5.7 mysql-client-5.7
RUN service mysql start && \
mysqladmin -u root -p"$MYSQL_ROOTPW" password '' && \
service mysql stop
# Change mysql bind address
RUN sed -i -e"s/^bind-address\s*=\s*127.0.0.1/bind-address = 0.0.0.0/" /etc/mysql/mysql.conf.d/mysqld.cnf
# Install MySQL-python
RUN apt-get install -y libmysqlclient-dev python-mysqldb
# pip install virtualenv
RUN pip3 install virtualenv
# Expose port
EXPOSE 7780 3306
ADD ./requirements.txt /requirements.txt
# Init virtualenv
RUN virtualenv -p /usr/bin/python3.5 /craton
# Change Working Dir
WORKDIR /craton
# Install requirements
RUN bin/pip install -r /requirements.txt
# pip install mysql-python
RUN bin/pip install mysqlclient
# Add Craton
ADD . /craton
# Install Craton
RUN bin/pip install .
CMD ["tools/docker_run.sh"]

View File

@ -1,4 +0,0 @@
craton Style Commandments
===============================================
Read the OpenStack Style Commandments http://docs.openstack.org/developer/hacking/

LICENSE
View File

@ -1,176 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.

View File

View File

@ -1,29 +1,10 @@
-Craton
-======
-Craton is a new project we plan to propose for OpenStack inclusion.
-Craton supports deploying and operating OpenStack clouds by providing
-scalable fleet management:
-* Inventory of configurable physical devices/hosts (the fleet)
-* Audit and remediation workflows against this inventory
-* REST APIs, CLI, and Python client to manage
-Support for workflows, CLI, and the Python client is in progress.
-For more information, please refer to the following project resources:
-* **Free software:** under the `Apache license <http://www.apache.org/licenses/LICENSE-2.0>`_
-* **Documentation:** https://craton.readthedocs.io
-* **Source:** https://github.com/openstack/craton
-* **Blueprints:** https://blueprints.launchpad.net/craton
-* **Bugs:** https://bugs.launchpad.net/craton
-For information on how to contribute to Craton, please see the
-contents of the `CONTRIBUTING.rst file <CONTRIBUTING.rst>`_.
-For information on how to setup a Developer's Environment, please
-see the contents of `INSTALL.RST file <doc/source/dev/install.rst>`_.
-For more information on Craton distribution license, please see
-the contents of the `LICENSE file <LICENSE>`_.
+This project is no longer maintained.
+The contents of this repository are still available in the Git
+source code management system. To see the contents of this
+repository before it reached its end of life, please check out the
+previous commit with "git checkout HEAD^1".
+For any further questions, please email
+openstack-dev@lists.openstack.org or join #openstack-dev on
+Freenode.

View File

@ -1,273 +0,0 @@
.. -*- rst -*-
======
Cells
======
Definition of cell
Create Cell
============
.. rest_method:: POST /v1/cells
Create a new Cell
Normal response codes: OK(201)
Error response codes: invalid request(400), validation exception(405)
Request
-------
.. rest_parameters:: parameters.yaml
- name: cell_name
- region_id: region_id_body
- project_id: project_id
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
**Example Create Cell** (TO-DO)
.. literalinclude:: ../../doc/api_samples/cells/cells-create-req.json
:language: javascript
Response
--------
.. rest_parameters:: parameters.yaml
- cell: cell
- id: cell_id_body
- name: cell_name
- region_id: region_id_body
- project_id: project_id
- note: note
- variables: variables
**Example Create Cell** (TO-DO)
.. literalinclude:: ../../doc/api_samples/cells/cells-create-resp.json
:language: javascript
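Because the sample JSON files referenced above are still marked TO-DO, here is a minimal request sketch using the python-requests library; the URL, credentials, and IDs are illustrative only and not taken from the original samples.
import requests
# Hypothetical local-auth credentials: the token is the user's API key and
# X-Auth-Project must be the project's UUID.
headers = {
    "Content-Type": "application/json",
    "X-Auth-Token": "demo-api-key",
    "X-Auth-User": "demo",
    "X-Auth-Project": "2757a1b4-cd90-4891-886c-a246fd4e7064",
}
# project_id is filled in server-side from the auth context, so the body only
# needs the cell name and its region.
body = {"name": "C0001", "region_id": 1}
resp = requests.post("http://127.0.0.1:7780/v1/cells", json=body, headers=headers)
print(resp.status_code, resp.json())  # expect 201 and the created cell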
List Cells
==========
.. rest_method:: GET /v1/cells
Gets all Cells
Normal response codes: OK(200)
Error response codes: invalid request(400), cell not found(404), validation exception(405)
Default response: unexpected error
Request
--------
.. rest_parameters:: parameters.yaml
- cell: cell_name_query
- region: region_name_query
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
Response
--------
.. rest_parameters:: parameters.yaml
- cells: cells
- id: cell_id_body
- name: cell_name
- region_id: region_id_body
- project_id: project_id
- note: note
- variables: variables
**Example List Cells** (TO-DO)
.. literalinclude:: ../../doc/api_samples/cells/cells-list-resp.json
:language: javascript
**Example Unexpected Error**
.. literalinclude:: ../../doc/api_samples/errors/errors-unexpected-resp.json
:language: javascript
Update Cells
============
.. rest_method:: PUT /v1/cells/{cell_id}
Update an existing cell
Normal response codes: OK(200)
Error response codes: invalid request(400), cell not found(404), validation exception(405)
Request
-------
.. rest_parameters:: parameters.yaml
- id: cell_id_body
- name: cell_name
- region_id: region_id_body
- project_id: project_id
- note: note
- variables: variables
- cell_id: cell_id
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
**Example Update Cell** (TO-DO)
.. literalinclude:: ../../doc/api_samples/cells/cells-update-req.json
:language: javascript
Response
--------
.. rest_parameters:: parameters.yaml
- cell: cell
- id: cell_id_body
- name: cell_name
- region_id: region_id_body
- project_id: project_id
- note: note
- variables: variables
**Example Update Cell** (TO-DO)
.. literalinclude:: ../../doc/api_samples/cells/cells-update-resp.json
:language: javascript
Update Cell Data
==================
.. rest_method:: PUT /v1/cells/{cell_id}/variables
Update user defined variables for the cell
Normal response codes: OK(200)
Error response codes: invalid request(400), cell not found(404), validation exception(405)
Request
-------
.. rest_parameters:: parameters.yaml
- key: key
- value: value
- cell_id: cell_id
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
**Example Update Cell Data** (TO-DO)
.. literalinclude:: ../../doc/api_samples/cells/cells-update-data-req.json
:language: javascript
Response
--------
.. rest_parameters:: parameters.yaml
- key: key
- value: value
**Example Update Cell Data** (TO-DO)
.. literalinclude:: ../../doc/api_samples/cells/cells-update-data-resp.json
:language: javascript
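To make the variables payload concrete, a hedged sketch of the request above; the endpoint, cell ID, and key/value pairs are made up for illustration.
import requests
headers = {
    "Content-Type": "application/json",
    "X-Auth-Token": "demo-api-key",
    "X-Auth-User": "demo",
    "X-Auth-Project": "2757a1b4-cd90-4891-886c-a246fd4e7064",
}
# Arbitrary key/value pairs become user-defined variables on cell 42.
variables = {"console_port": 5901, "rack": "c-19"}
resp = requests.put("http://127.0.0.1:7780/v1/cells/42/variables",
                    json=variables, headers=headers)
print(resp.status_code, resp.json())  # expect 200 with the stored pairs echoed back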
Delete Cell
===========
.. rest_method:: DELETE /v1/cells/{cell_id}
Deletes an existing record of a Cell
Normal response codes: OK(200)
Error response codes: invalid request(400), cell not found(404)
Request
-------
.. rest_parameters:: parameters.yaml
- cell_id: cell_id
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
Response
--------
No body content is returned on a successful DELETE
Delete Cell Data
================
.. rest_method:: DELETE /v1/cells/{cell_id}/variables
Delete existing key/value variable for the cell
Normal response codes: OK(200)
Error response codes: invalid request(400), cell not found(404) validation exception(405)
Request
-------
.. rest_parameters:: parameters.yaml
- cell_id: cell_id
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
Response
--------
No body content is returned on a successful DELETE

View File

@ -1,301 +0,0 @@
.. -*- rst -*-
=====
Hosts
=====
Definition of host
Create Host
============
.. rest_method:: POST /v1/hosts
Create a new host
Normal response codes: OK(201)
Error response codes: invalid request(400), validation exception(405)
Request
-------
.. rest_parameters:: parameters.yaml
- name: host_name
- region_id: region_id_body
- project_id: project_id
- ip_address: ip_address
- device_type: device_type
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
**Example Create Host** (TO-DO)
.. literalinclude:: ../../doc/api_samples/hosts/hosts-create-req.json
:language: javascript
Response
--------
.. rest_parameters:: parameters.yaml
- host: host
- id: host_id_body
- name: host_name
- cell_id: cell_id_body
- parent_id: parent_id
- project_id: project_id
- region_id: region_id_body
- ip_address: ip_address
- device_type: device_type
- labels: labels
- note: note
- variables: variables
**Example Create Host** (TO-DO)
.. literalinclude:: ../../doc/api_samples/hosts/hosts-create-resp.json
:language: javascript
List Hosts
==========
.. rest_method:: GET /v1/hosts
Gets all Hosts
Normal response codes: OK(200)
Error response codes: invalid request(400), host not found(404), validation exception(405)
Default response: unexpected error
Request
--------
.. rest_parameters:: parameters.yaml
- limit: limit
- name: host_name_query
- id: host_id_query
- region: region_name_query
- cell: cell_name_query
- ip_address: ip_address_query
- service: service
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
Response
--------
.. rest_parameters:: parameters.yaml
- hosts: hosts
- id: host_id_body
- name: host_name
- cell_id: cell_id_body
- parent_id: parent_id
- project_id: project_id
- region_id: region_id_body
- ip_address: ip_address
- device_type: device_type
- labels: labels
- note: note
- variables: variables
**Example List Hosts** (TO-DO)
.. literalinclude:: ../../doc/api_samples/hosts/hosts-list-resp.json
:language: javascript
**Example Unexpected Error**
.. literalinclude:: ../../doc/api_samples/errors/errors-unexpected-resp.json
:language: javascript
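As a quick illustration of the query parameters listed above (limit, name, region, cell, ip_address), a minimal listing sketch with python-requests; the URL, credentials, and filter values are illustrative assumptions.
import requests
headers = {
    "X-Auth-Token": "demo-api-key",
    "X-Auth-User": "demo",
    "X-Auth-Project": "2757a1b4-cd90-4891-886c-a246fd4e7064",
}
# Filter hosts by region name and cap the page size.
params = {"region": "ORD135", "limit": 50}
resp = requests.get("http://127.0.0.1:7780/v1/hosts", params=params, headers=headers)
for host in resp.json().get("hosts", []):
    print(host["id"], host["name"], host["ip_address"])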
Update Hosts
============
.. rest_method:: PUT /v1/hosts/{host_id}
Update an existing host
Normal response codes: OK(200)
Error response codes: invalid request(400), host not found(404), validation exception(405)
Request
-------
.. rest_parameters:: parameters.yaml
- id: host_id_body
- name: host_name
- cell_id: cell_id_body
- parent_id: parent_id
- project_id: project_id
- region_id: region_id_body
- ip_address: ip_address
- device_type: device_type
- labels: labels
- note: note
- variables: variables
- host_id: host_id
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
**Example Update Host** (TO-DO)
.. literalinclude:: ../../doc/api_samples/hosts/hosts-update-req.json
:language: javascript
Response
--------
.. rest_parameters:: parameters.yaml
- host: host
- id: host_id_body
- name: host_name
- cell_id: cell_id_body
- parent_id: parent_id
- project_id: project_id
- region_id: region_id_body
- ip_address: ip_address
- device_type: device_type
- labels: labels
- note: note
- variables: variables
**Example Update Host** (TO-DO)
.. literalinclude:: ../../doc/api_samples/hosts/hosts-update-resp.json
:language: javascript
Update Host Data
==================
.. rest_method:: PUT /v1/hosts/{host_id}/variables
Update user defined variables for the host
Normal response codes: OK(200)
Error response codes: invalid request(400), host not found(404), validation exception(405)
Request
-------
.. rest_parameters:: parameters.yaml
- key: key
- value: value
- host_id: host_id
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
**Example Update Host Data** (TO-DO)
.. literalinclude:: ../../doc/api_samples/hosts/hosts-update-data-req.json
:language: javascript
Response
--------
.. rest_parameters:: parameters.yaml
- key: key
- value: value
**Example Update Host Data** (TO-DO)
.. literalinclude:: ../../doc/api_samples/hosts/hosts-update-data-resp.json
:language: javascript
Delete Host
===========
.. rest_method:: DELETE /v1/hosts/{host_id}
Deletes an existing record of a Host
Normal response codes: OK(200)
Error response codes: invalid request(400), host not found(404)
Request
-------
.. rest_parameters:: parameters.yaml
- host_id: host_id
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
Response
--------
No body content is returned on a successful DELETE
Delete Host Data
================
.. rest_method:: DELETE /v1/hosts/{host_id}/variables
Delete existing key/value variables for the Host
Normal response codes: OK(200)
Error response codes: invalid request(400), host not found(404) validation exception(405)
Request
-------
.. rest_parameters:: parameters.yaml
- host_id: host_id
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
Response
--------
No body content is returned on a successful DELETE

View File

@ -1,11 +0,0 @@
:tocdepth: 2
===========
Craton API
===========
.. rest_expand_all::
.. include:: cells.inc
.. include:: hosts.inc
.. include:: regions.inc

View File

@ -1,201 +0,0 @@
# variables in header
Content-Type:
description: |
Type of content sent using cURL
in: header
required: true
type: string
X-Auth-Project:
description: |
ID of the project this user is assigned to.
in: header
required: true
type: integer
X-Auth-Token:
description: |
User authentication token for the current session
in: header
required: true
type: string
X-Auth-User:
description: |
User of the current session
in: header
required: true
type: string
# variables in path
cell_id:
description: |
The unique ID of the cell
in: path
required: true
type: integer
host_id:
description: |
The unique ID of the host
in: path
required: true
type: integer
region_id:
description: |
The unique ID of the region
in: path
required: true
type: integer
# variables in body
cell:
description: |
A cell object
in: body
required: false
type: object
cell_id_body:
description: |
Unique ID of the cell
in: body
required: false
type: integer
cell_name:
description: |
Unique name of the cell
in: body
required: true
type: string
cells:
description: |
An array of cell objects
in: body
required: false
type: array
variables:
description: |
User defined information
in: body
required: false
type: object
device_type:
description: |
Type of host
in: body
required: false
type: string
host:
description: |
A host object
in: body
required: false
type: object
host_id_body:
description: |
Unique ID of the host
in: body
required: false
type: integer
host_name:
description: |
Unique name of the host
in: body
required: true
type: string
hosts:
description: |
An array of host objects
in: body
required: false
type: array
ip_address:
description: |
IP address
in: body
type: string
labels:
description: |
User defined labels
in: body
required: false
type: string
parent_id:
description: |
Parent ID of this host
in: body
required: false
type: integer
project_id:
description: |
ID of the project
in: body
required: true
type: integer
note:
description: |
Note used for governance
in: body
required: false
type: string
region:
description: |
A region object
in: body
required: false
type: object
region_id_body:
description: |
The unique ID of the region
in: body
required: false
type: integer
region_name:
description: |
Unique name of the region
in: body
required: true
type: string
regions:
description: |
An array of region objects
in: body
required: true
type: array
# variables in query
cell_name_query:
description: |
Name of the cell to get
in: query
required: false
type: string
ip_address_query:
description: |
IP address to get
in: query
required: false
type: string
host_id_query:
description: |
ID of the host to get
in: query
required: false
type: integer
host_name_query:
description: |
Name of host to get
in: query
required: false
type: string
limit:
description: |
Number of hosts to return, ranging from 1 - 10000. Default = 1000
in: query
required: false
type: integer
region_id_query:
description: |
ID of the region to get
in: query
required: false
type: string
region_name_query:
description: |
Name of the region to get
in: query
required: false
type: string
service:
description: |
OpenStack service to query hosts by
in: query
required: false
type: array

View File

@ -1,260 +0,0 @@
.. -*- rst -*-
=======
Regions
=======
Definition of region
Create Region
==============
.. rest_method:: POST /v1/regions
Creates a new Region
Normal response codes: OK(201)
Error response codes: invalid request(400), validation exception(405)
Request
-------
.. rest_parameters:: parameters.yaml
- name: region_name
- project_id: project_id
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
**Example Create Region**
.. literalinclude:: ../../doc/api_samples/regions/regions-create-req.json
:language: javascript
Response
--------
.. rest_parameters:: parameters.yaml
- region: region
- id: region_id_body
- name: region_name
- project_id: project_id
- cells: cells
- variables: variables
**Example Create Region**
.. literalinclude:: ../../doc/api_samples/regions/regions-create-resp.json
:language: javascript
List Regions
==============
.. rest_method:: GET /v1/regions
Gets all Regions
Normal response codes: OK(200)
Error response codes: invalid request(400), validation exception(405)
Default response: unexpected error
Request
--------
.. rest_parameters:: parameters.yaml
- name: region_name_query
- id: region_id_query
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
Response
--------
.. rest_parameters:: parameters.yaml
- region: region
- id: region_id_body
- name: region_name
- project_id: project_id
- cells: cells
- variables: variables
**Example List Regions**
.. literalinclude:: ../../doc/api_samples/regions/regions-list-resp.json
:language: javascript
**Example Unexpected Error**
.. literalinclude:: ../../doc/api_samples/errors/errors-unexpected-resp.json
:language: javascript
Update Region
=============
.. rest_method:: PUT /v1/regions/{region_id}
Update an existing region
Normal response codes: OK(200)
Error response codes: invalid request(400), region not found(404), validation exception(405)
Request
-------
.. rest_parameters:: parameters.yaml
- id: region_id_body
- name: region_name
- project_id: project_id
- cells: cells
- variables: variables
- region_id: region_id
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
**Example Update Region** (TO-DO)
.. literalinclude:: ../../doc/api_samples/regions/regions-update-req.json
:language: javascript
Response
--------
.. rest_parameters:: parameters.yaml
- region: region
- id: region_id_body
- name: region_name
- project_id: project_id
- cells: cells
- variables: variables
**Example Update Region** (TO-DO)
.. literalinclude:: ../../doc/api_samples/regions/regions-update-resp.json
:language: javascript
Update Region Data
==================
.. rest_method:: PUT /v1/regions/{region_id}/variables
Update user defined variables for the region
Normal response codes: OK(200)
Error response codes: invalid request(400), region not found(404), validation exception(405)
Request
-------
.. rest_parameters:: parameters.yaml
- key: key
- value: value
- region_id: region_id
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
**Example Update Region Data** (TO-DO)
.. literalinclude:: ../../doc/api_samples/regions/regions-update-data-req.json
:language: javascript
Response
--------
.. rest_parameters:: parameters.yaml
- key: key
- value: value
**Example Update Region Data** (TO-DO)
.. literalinclude:: ../../doc/api_samples/regions/regions-update-data-resp.json
:language: javascript
Delete Region
==============
.. rest_method:: DELETE /v1/regions/{region_id}
Deletes an existing record of a Region
Normal response codes: OK(200)
Error response codes: invalid request(400), region not found(404)
Request
-------
.. rest_parameters:: parameters.yaml
- region_id: region_id
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
Response
--------
No body content is returned on a successful DELETE
Delete Region Data
==================
.. rest_method:: DELETE /v1/regions/{region_id}/variables
Delete existing key/value variables for the region
Normal response codes: OK(200)
Error response codes: invalid request(400), region not found(404) validation exception(405)
Request
-------
.. rest_parameters:: parameters.yaml
- region_id: region_id
Required Header
^^^^^^^^^^^^^^^
- Content-Type: Content_Type
- X-Auth-Token: X-Auth-Token
- X-Auth-User: X-Auth-User
- X-Auth-Project: X-Auth-Project
Response
--------
No body content is returned on a successful DELETE

View File

@ -1,2 +0,0 @@
[python: **.py]

View File

@ -1 +0,0 @@
docker.io [platform:dpkg]

View File

@ -1,19 +0,0 @@
# -*- coding: utf-8 -*-
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import pbr.version
__version__ = pbr.version.VersionInfo(
'craton').version_string()

View File

@ -1,19 +0,0 @@
"""oslo.i18n integration module.
See http://docs.openstack.org/developer/oslo.i18n/usage.html
"""
import oslo_i18n
_translators = oslo_i18n.TranslatorFactory(domain='craton')
# The primary translation function using the well-known name "_"
_ = _translators.primary
# The contextual translation function using the name "_C"
_C = _translators.contextual_form
# The plural translation function using the name "_P"
_P = _translators.plural_form

View File

@ -1,75 +0,0 @@
import os
from paste import deploy
from flask import Flask
from oslo_config import cfg
from oslo_log import log as logging
from craton.api import v1
from craton.util import JSON_KWARGS
LOG = logging.getLogger(__name__)
api_opts = [
cfg.StrOpt('api_paste_config',
default="api-paste.ini",
help="Configuration file for API service."),
cfg.StrOpt('paste_pipeline',
default="local-auth",
choices=["local-auth", "keystone-auth"],
help="""\
The name of the Paste pipeline to use for Craton.
Pipelines are organized according to authentication scheme. The available
choices are:
- ``local-auth`` (the default) Uses Craton's default authentication and
authorization scheme
- ``keystone-auth`` Uses Keystone for identity, authentication, and
authorization
"""),
cfg.StrOpt('host',
default="127.0.0.1",
help="API host IP"),
cfg.IntOpt('port',
default=5000,
help="API port to use.")
]
CONF = cfg.CONF
opt_group = cfg.OptGroup(name='api',
title='Craton API service group options')
CONF.register_group(opt_group)
CONF.register_opts(api_opts, opt_group)
def create_app(global_config, **local_config):
return setup_app()
def setup_app(config=None):
app = Flask(__name__, static_folder=None)
app.config.update(
PROPAGATE_EXCEPTIONS=True,
RESTFUL_JSON=JSON_KWARGS,
)
app.register_blueprint(v1.bp, url_prefix='/v1')
return app
def load_app():
cfg_file = None
cfg_path = CONF.api.api_paste_config
paste_pipeline = CONF.api.paste_pipeline
if not os.path.isabs(cfg_path):
cfg_file = CONF.find_file(cfg_path)
elif os.path.exists(cfg_path):
cfg_file = cfg_path
if not cfg_file:
raise cfg.ConfigFilesNotFoundError([cfg.CONF.api.api_paste_config])
LOG.info("Loading craton-api with pipeline %(pipeline)s and WSGI config:"
"%(conf)s", {'conf': cfg_file, 'pipeline': paste_pipeline})
return deploy.loadapp("config:%s" % cfg_file, name=paste_pipeline)
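As a sketch of how the [api] options registered above could be selected at runtime; in a real deployment they would normally live in the service's configuration file, and the values below are illustrative assumptions.
from oslo_config import cfg
from craton import api  # importing the module above registers the [api] options
# Pick the Keystone-backed Paste pipeline and a non-default port.
cfg.CONF.set_override('paste_pipeline', 'keystone-auth', group='api')
cfg.CONF.set_override('port', 7780, group='api')
application = api.load_app()  # still requires api-paste.ini to be findable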

View File

@ -1,136 +0,0 @@
from oslo_middleware import base
from oslo_middleware import request_id
from oslo_context import context
from oslo_log import log
from oslo_utils import uuidutils
from craton.db import api as dbapi
from craton import exceptions
from craton.util import handle_all_exceptions_decorator
LOG = log.getLogger(__name__)
class RequestContext(context.RequestContext):
def __init__(self, **kwargs):
self.using_keystone = kwargs.pop('using_keystone', False)
self.token_info = kwargs.pop('token_info', None)
super(RequestContext, self).__init__(**kwargs)
class ContextMiddleware(base.Middleware):
def make_context(self, request, *args, **kwargs):
req_id = request.environ.get(request_id.ENV_REQUEST_ID)
kwargs.setdefault('request_id', req_id)
# TODO(sulo): Insert Craton specific context here if needed,
# for now we are using generic context object.
ctxt = RequestContext(*args, **kwargs)
request.environ['context'] = ctxt
return ctxt
class NoAuthContextMiddleware(ContextMiddleware):
def __init__(self, application):
self.application = application
@handle_all_exceptions_decorator
def process_request(self, request):
# Simply insert some dummy context info
self.make_context(
request,
auth_token='noauth-token',
user='noauth-user',
tenant=None,
is_admin=True,
is_admin_project=True,
)
@classmethod
def factory(cls, global_config, **local_config):
def _factory(application):
return cls(application)
return _factory
class LocalAuthContextMiddleware(ContextMiddleware):
def __init__(self, application):
self.application = application
@handle_all_exceptions_decorator
def process_request(self, request):
headers = request.headers
project_id = headers.get('X-Auth-Project')
if not uuidutils.is_uuid_like(project_id):
raise exceptions.AuthenticationError(
message="Project ID ('{}') is not a valid UUID".format(
project_id
)
)
ctx = self.make_context(
request,
auth_token=headers.get('X-Auth-Token', None),
user=headers.get('X-Auth-User', None),
tenant=project_id,
)
# NOTE(sulo): this means every api call hits the db
# at least once for auth. Better way to handle this?
try:
user_info = dbapi.get_user_info(ctx,
headers.get('X-Auth-User', None))
if user_info.api_key != headers.get('X-Auth-Token', None):
raise exceptions.AuthenticationError
if user_info.is_root:
ctx.is_admin = True
ctx.is_admin_project = True
elif user_info.is_admin:
ctx.is_admin = True
ctx.is_admin_project = False
else:
ctx.is_admin = False
ctx.is_admin_project = False
except exceptions.NotFound:
raise exceptions.AuthenticationError
@classmethod
def factory(cls, global_config, **local_config):
def _factory(application):
return cls(application)
return _factory
class KeystoneContextMiddleware(ContextMiddleware):
@handle_all_exceptions_decorator
def process_request(self, request):
headers = request.headers
environ = request.environ
if headers.get('X-Identity-Status', '').lower() != 'confirmed':
raise exceptions.AuthenticationError
token_info = environ['keystone.token_info']['token']
roles = (role['name'] for role in token_info['roles'])
self.make_context(
request,
auth_token=headers.get('X-Auth-Token'),
is_admin=any(name == 'admin' for name in roles),
is_admin_project=environ['HTTP_X_IS_ADMIN_PROJECT'],
user=token_info['user']['name'],
tenant=token_info['project']['id'],
using_keystone=True,
token_info=token_info,
)
@classmethod
def factory(cls, global_config, **local_config):
def _factory(application):
return cls(application)
return _factory

View File

@ -1,21 +0,0 @@
from flask import Blueprint
import flask_restful as restful
from craton.api.v1.routes import routes
from craton.util import handle_all_exceptions
class CratonApi(restful.Api):
def error_router(self, _, e):
return self.handle_error(e)
def handle_error(self, e):
return handle_all_exceptions(e)
bp = Blueprint('v1', __name__)
api = CratonApi(bp, catch_all_404s=False)
for route in routes:
api.add_resource(route.pop('resource'), *route.pop('urls'), **route)

View File

@ -1,85 +0,0 @@
import functools
import re
import urllib.parse as urllib
import flask
import flask_restful as restful
from craton.api.v1.validators import ensure_project_exists
from craton.api.v1.validators import request_validate
from craton.api.v1.validators import response_filter
SORT_KEY_SPLITTER = re.compile('[ ,]')
class Resource(restful.Resource):
method_decorators = [request_validate, ensure_project_exists,
response_filter]
def pagination_context(function):
@functools.wraps(function)
def wrapper(self, context, request_args):
pagination_parameters = {
'limit': limit_from(request_args),
'marker': request_args.pop('marker', None),
}
sort_keys = request_args.get('sort_keys')
if sort_keys is not None:
request_args['sort_keys'] = SORT_KEY_SPLITTER.split(sort_keys)
return function(self, context, request_args=request_args,
pagination_params=pagination_parameters)
return wrapper
def limit_from(filters, minimum=10, default=30, maximum=100):
"""Retrieve the limit from query filters."""
limit_str = filters.pop('limit', None)
if limit_str is None:
return default
limit = int(limit_str)
# NOTE(sigmavirus24): If our limit falls within in our constraints, just
# return that
if minimum <= limit <= maximum:
return limit
if limit < minimum:
return minimum
# NOTE(sigmavirus24): If our limit isn't within the constraints, and it
# isn't too small, then it must be too big. In that case, let's just
# return the maximum.
return maximum
def links_from(link_params):
"""Generate the list of hypermedia link relations from their parameters.
This uses the request thread-local to determine the endpoint and generate
URLs from that.
:param dict link_params:
A dictionary mapping the relation name to the query parameters.
:returns:
List of dictionaries to represent hypermedia link relations.
:rtype:
list
"""
links = []
relations = ["first", "prev", "self", "next"]
base_url = flask.request.base_url
for relation in relations:
query_params = link_params.get(relation)
if not query_params:
continue
link_rel = {
"rel": relation,
"href": base_url + "?" + urllib.urlencode(query_params),
}
links.append(link_rel)
return links
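A small usage sketch of limit_from above, assuming the module is importable as craton.api.v1.base from this tree.
from craton.api.v1.base import limit_from
# The 'limit' query filter is clamped to [10, 100] and defaults to 30.
assert limit_from({}) == 30
assert limit_from({'limit': '5'}) == 10
assert limit_from({'limit': '50'}) == 50
assert limit_from({'limit': '500'}) == 100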

View File

@ -1,105 +0,0 @@
from collections import OrderedDict
from operator import attrgetter
from oslo_serialization import jsonutils
from oslo_log import log
from craton.api.v1 import base
from craton import db as dbapi
from craton import exceptions
LOG = log.getLogger(__name__)
class AnsibleInventory(base.Resource):
def get_hierarchy(self, devices):
regions = set()
cells = set()
labels = set()
for device in devices:
if device.region not in regions:
regions.add(device.region)
if device.cell:
if device.cell not in cells:
cells.add(device.cell)
for label in device.labels:
if label not in labels:
labels.add(label)
regions = sorted(regions, key=attrgetter('name'))
cells = sorted(cells, key=attrgetter('name'))
labels = sorted(labels, key=attrgetter('label'))
devices = sorted(devices, key=attrgetter('ip_address'))
return regions, cells, labels, devices
def generate_ansible_inventory(self, hosts):
"""Generate and return Ansible inventory in json format
for hosts given by provided filters.
"""
regions, cells, labels, hosts = self.get_hierarchy(hosts)
hosts_set = set(hosts)
# Set group 'all' and set '_meta'
inventory = OrderedDict(
[('all', {'hosts': []}),
('_meta', {'hostvars': OrderedDict()})]
)
for host in hosts:
ip = str(host.ip_address)
inventory['all']['hosts'].append(ip)
inventory['_meta']['hostvars'][ip] = host.resolved
def matching_hosts(obj):
return sorted(
[str(device.ip_address) for device in obj.devices
if device in hosts_set])
# Group hosts by label
# TODO(sulo): make sure we have a specified label to
# identify host group. Fix this after label refactoring.
for label in labels:
inventory[label.label] = {
'hosts': matching_hosts(label),
'vars': label.variables
}
for cell in cells:
inventory['%s-%s' % (cell.region.name, cell.name)] = {
'hosts': matching_hosts(cell),
'vars': cell.variables
}
for region in regions:
ch = ['%s-%s' % (region.name, cell.name) for cell in region.cells]
inventory['%s' % region.name] = {
'children': ch,
'vars': region.variables
}
return inventory
def get(self, context, request_args):
region_id = request_args["region_id"]
cell_id = request_args["cell_id"]
filters = {}
if region_id:
filters['region_id'] = region_id
# TODO(sulo): allow other filters based on services
if cell_id:
filters['cell_id'] = cell_id
try:
hosts_obj = dbapi.hosts_get_all(context, filters)
except exceptions.NotFound:
return self.error_response(404, 'Not Found')
except Exception as err:
LOG.error("Error during host get: %s" % err)
return self.error_response(500, 'Unknown Error')
_inventory = self.generate_ansible_inventory(hosts_obj)
inventory = jsonutils.to_primitive(_inventory)
return inventory, 200, None
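For reference, the inventory built by generate_ansible_inventory above has roughly this shape; the names, IP, and variables are hypothetical, one group per label, one per region-cell pair, and one per region whose children are its cells.
example_inventory = {
    "all": {"hosts": ["10.1.1.1"]},
    "_meta": {"hostvars": {"10.1.1.1": {"role": "compute"}}},  # host.resolved vars
    "compute": {"hosts": ["10.1.1.1"], "vars": {}},            # label group
    "ORD135-C0001": {"hosts": ["10.1.1.1"], "vars": {}},       # region-cell group
    "ORD135": {"children": ["ORD135-C0001"], "vars": {}},      # region group
}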

View File

@ -1,65 +0,0 @@
from oslo_serialization import jsonutils
from oslo_log import log
from craton.api import v1
from craton.api.v1 import base
from craton.api.v1.resources import utils
from craton import db as dbapi
from craton import util
LOG = log.getLogger(__name__)
class Cells(base.Resource):
@base.pagination_context
def get(self, context, request_args, pagination_params):
"""Get all cells, with optional filtering."""
details = request_args.get("details")
cells_obj, link_params = dbapi.cells_get_all(
context, request_args, pagination_params,
)
if details:
cells_obj = [utils.get_resource_with_vars(request_args, cell)
for cell in cells_obj]
links = base.links_from(link_params)
response_body = {'cells': cells_obj, 'links': links}
return jsonutils.to_primitive(response_body), 200, None
def post(self, context, request_data):
"""Create a new cell."""
json = util.copy_project_id_into_json(context, request_data)
cell_obj = dbapi.cells_create(context, json)
cell = jsonutils.to_primitive(cell_obj)
if 'variables' in json:
cell["variables"] = jsonutils.to_primitive(cell_obj.variables)
else:
cell["variables"] = {}
location = v1.api.url_for(
CellById, id=cell_obj.id, _external=True
)
headers = {'Location': location}
return cell, 201, headers
class CellById(base.Resource):
def get(self, context, id, request_args):
cell_obj = dbapi.cells_get_by_id(context, id)
cell = utils.get_resource_with_vars(request_args, cell_obj)
return cell, 200, None
def put(self, context, id, request_data):
"""Update existing cell."""
cell_obj = dbapi.cells_update(context, id, request_data)
return jsonutils.to_primitive(cell_obj), 200, None
def delete(self, context, id):
"""Delete existing cell."""
dbapi.cells_delete(context, id)
return None, 204, None

View File

@ -1,82 +0,0 @@
from oslo_serialization import jsonutils
from oslo_log import log
from craton.api import v1
from craton.api.v1 import base
from craton.api.v1.resources import utils
from craton import db as dbapi
from craton import util
LOG = log.getLogger(__name__)
class Clouds(base.Resource):
@base.pagination_context
def get(self, context, request_args, pagination_params):
"""Get cloud(s) for the project. Get cloud details if
for a particular cloud.
"""
cloud_id = request_args.get("id")
cloud_name = request_args.get("name")
details = request_args.get("details")
if not (cloud_id or cloud_name):
# Get all clouds for this project
clouds_obj, link_params = dbapi.clouds_get_all(
context, request_args, pagination_params,
)
if details:
clouds_obj = [utils.get_resource_with_vars(request_args, c)
for c in clouds_obj]
else:
if cloud_name:
cloud_obj = dbapi.clouds_get_by_name(context, cloud_name)
cloud_obj.data = cloud_obj.variables
if cloud_id:
cloud_obj = dbapi.clouds_get_by_id(context, cloud_id)
cloud_obj.data = cloud_obj.variables
clouds_obj = [cloud_obj]
link_params = {}
links = base.links_from(link_params)
response_body = {'clouds': clouds_obj, 'links': links}
return jsonutils.to_primitive(response_body), 200, None
def post(self, context, request_data):
"""Create a new cloud."""
json = util.copy_project_id_into_json(context, request_data)
cloud_obj = dbapi.clouds_create(context, json)
cloud = jsonutils.to_primitive(cloud_obj)
if 'variables' in json:
cloud["variables"] = jsonutils.to_primitive(cloud_obj.variables)
else:
cloud["variables"] = {}
location = v1.api.url_for(
CloudsById, id=cloud_obj.id, _external=True
)
headers = {'Location': location}
return cloud, 201, headers
class CloudsById(base.Resource):
def get(self, context, id):
cloud_obj = dbapi.clouds_get_by_id(context, id)
cloud = jsonutils.to_primitive(cloud_obj)
cloud['variables'] = jsonutils.to_primitive(cloud_obj.variables)
return cloud, 200, None
def put(self, context, id, request_data):
"""Update existing cloud."""
cloud_obj = dbapi.clouds_update(context, id, request_data)
return jsonutils.to_primitive(cloud_obj), 200, None
def delete(self, context, id):
"""Delete existing cloud."""
dbapi.clouds_delete(context, id)
return None, 204, None

View File

@ -1,49 +0,0 @@
from oslo_serialization import jsonutils
from oslo_log import log
from craton.api.v1 import base
from craton.api.v1.resources import utils
from craton import exceptions
from craton import db as dbapi
from craton.db.sqlalchemy import models
LOG = log.getLogger(__name__)
class Devices(base.Resource):
@base.pagination_context
def get(self, context, request_args, pagination_params):
"""Get all devices, with optional filtering."""
details = request_args.get("details")
device_objs, link_params = dbapi.devices_get_all(
context, request_args, pagination_params,
)
links = base.links_from(link_params)
devices = {"hosts": [], "network-devices": []}
for device_obj in device_objs:
if details:
device = utils.get_resource_with_vars(request_args,
device_obj)
else:
device = jsonutils.to_primitive(device_obj)
utils.add_up_link(context, device)
if isinstance(device_obj, models.Host):
devices["hosts"].append(device)
elif isinstance(device_obj, models.NetworkDevice):
devices["network-devices"].append(device)
else:
LOG.error(
"The device is of unknown type: '%s'", device_obj
)
raise exceptions.UnknownException
response_body = jsonutils.to_primitive(
{'devices': devices, 'links': links}
)
return response_body, 200, None
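The response body assembled above therefore looks roughly like this; the IDs, names, and link values are illustrative only.
example_response = {
    "devices": {
        "hosts": [{"id": 1, "name": "host1.example.com", "device_type": "server"}],
        "network-devices": [{"id": 7, "name": "switch1.example.com"}],
    },
    "links": [{"rel": "self", "href": "http://127.0.0.1:7780/v1/devices?limit=30"}],
}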

View File

@ -1,104 +0,0 @@
from oslo_serialization import jsonutils
from oslo_log import log
from craton.api import v1
from craton.api.v1 import base
from craton.api.v1.resources import utils
from craton import db as dbapi
from craton import util
LOG = log.getLogger(__name__)
class Hosts(base.Resource):
@base.pagination_context
def get(self, context, request_args, pagination_params):
"""Get all hosts for region, with optional filtering."""
details = request_args.get("details")
hosts_obj, link_params = dbapi.hosts_get_all(
context, request_args, pagination_params,
)
if details:
hosts_obj = [utils.get_resource_with_vars(request_args, h)
for h in hosts_obj]
links = base.links_from(link_params)
response_body = jsonutils.to_primitive(
{'hosts': hosts_obj, 'links': links}
)
for host in response_body["hosts"]:
utils.add_up_link(context, host)
return response_body, 200, None
def post(self, context, request_data):
"""Create a new host."""
json = util.copy_project_id_into_json(context, request_data)
host_obj = dbapi.hosts_create(context, json)
host = jsonutils.to_primitive(host_obj)
if 'variables' in json:
host["variables"] = jsonutils.to_primitive(host_obj.variables)
else:
host["variables"] = {}
utils.add_up_link(context, host)
location = v1.api.url_for(
HostById, id=host_obj.id, _external=True
)
headers = {'Location': location}
return host, 201, headers
class HostById(base.Resource):
def get(self, context, id, request_args):
"""Get host by given id"""
host_obj = dbapi.hosts_get_by_id(context, id)
host = utils.get_resource_with_vars(request_args, host_obj)
utils.add_up_link(context, host)
return host, 200, None
def put(self, context, id, request_data):
"""Update existing host data, or create if it does not exist."""
host_obj = dbapi.hosts_update(context, id, request_data)
host = jsonutils.to_primitive(host_obj)
utils.add_up_link(context, host)
return host, 200, None
def delete(self, context, id):
"""Delete existing host."""
dbapi.hosts_delete(context, id)
return None, 204, None
class HostsLabels(base.Resource):
def get(self, context, id):
"""Get labels for given host device."""
host_obj = dbapi.hosts_get_by_id(context, id)
response = {"labels": list(host_obj.labels)}
return response, 200, None
def put(self, context, id, request_data):
"""
Update existing device label entirely, or add if it does
not exist.
"""
resp = dbapi.hosts_labels_update(context, id, request_data)
response = {"labels": list(resp.labels)}
return response, 200, None
def delete(self, context, id, request_data):
"""Delete device label entirely."""
dbapi.hosts_labels_delete(context, id, request_data)
return None, 204, None

View File

@ -1,210 +0,0 @@
from oslo_serialization import jsonutils
from oslo_log import log
from craton.api import v1
from craton.api.v1 import base
from craton.api.v1.resources import utils
from craton import db as dbapi
from craton import util
LOG = log.getLogger(__name__)
class Networks(base.Resource):
"""Controller for Networks resources."""
@base.pagination_context
def get(self, context, request_args, pagination_params):
"""Get all networks, with optional filtering."""
details = request_args.get("details")
networks_obj, link_params = dbapi.networks_get_all(
context, request_args, pagination_params,
)
if details:
networks_obj = [utils.get_resource_with_vars(request_args, n)
for n in networks_obj]
links = base.links_from(link_params)
response_body = {'networks': networks_obj, 'links': links}
return jsonutils.to_primitive(response_body), 200, None
def post(self, context, request_data):
"""Create a new network."""
json = util.copy_project_id_into_json(context, request_data)
network_obj = dbapi.networks_create(context, json)
network = jsonutils.to_primitive(network_obj)
if 'variables' in json:
network["variables"] = jsonutils.to_primitive(
network_obj.variables)
else:
network["variables"] = {}
location = v1.api.url_for(
NetworkById, id=network_obj.id, _external=True
)
headers = {'Location': location}
return network, 201, headers
class NetworkById(base.Resource):
"""Controller for Networks by ID."""
def get(self, context, id):
"""Get network by given id"""
obj = dbapi.networks_get_by_id(context, id)
device = jsonutils.to_primitive(obj)
device['variables'] = jsonutils.to_primitive(obj.variables)
return device, 200, None
def put(self, context, id, request_data):
"""Update existing network values."""
net_obj = dbapi.networks_update(context, id, request_data)
return jsonutils.to_primitive(net_obj), 200, None
def delete(self, context, id):
"""Delete existing network."""
dbapi.networks_delete(context, id)
return None, 204, None
class NetworkDevices(base.Resource):
"""Controller for Network Device resources."""
@base.pagination_context
def get(self, context, request_args, pagination_params):
"""Get all network devices."""
details = request_args.get("details")
devices_obj, link_params = dbapi.network_devices_get_all(
context, request_args, pagination_params,
)
if details:
devices_obj = [utils.get_resource_with_vars(request_args, d)
for d in devices_obj]
links = base.links_from(link_params)
response_body = jsonutils.to_primitive(
{'network_devices': devices_obj, 'links': links}
)
for device in response_body["network_devices"]:
utils.add_up_link(context, device)
return response_body, 200, None
def post(self, context, request_data):
"""Create a new network device."""
json = util.copy_project_id_into_json(context, request_data)
obj = dbapi.network_devices_create(context, json)
device = jsonutils.to_primitive(obj)
if 'variables' in json:
device["variables"] = jsonutils.to_primitive(obj.variables)
else:
device["variables"] = {}
utils.add_up_link(context, device)
location = v1.api.url_for(
NetworkDeviceById, id=obj.id, _external=True
)
headers = {'Location': location}
return device, 201, headers
class NetworkDeviceById(base.Resource):
"""Controller for Network Devices by ID."""
def get(self, context, id, request_args):
"""Get network device by given id"""
obj = dbapi.network_devices_get_by_id(context, id)
obj = utils.format_variables(request_args, obj)
device = jsonutils.to_primitive(obj)
device['variables'] = jsonutils.to_primitive(obj.vars)
utils.add_up_link(context, device)
return device, 200, None
def put(self, context, id, request_data):
"""Update existing device values."""
net_obj = dbapi.network_devices_update(context, id, request_data)
device = jsonutils.to_primitive(net_obj)
utils.add_up_link(context, device)
return device, 200, None
def delete(self, context, id):
"""Delete existing network device."""
dbapi.network_devices_delete(context, id)
return None, 204, None
class NetworkDeviceLabels(base.Resource):
"""Controller for Netowrk Device Labels."""
def get(self, context, id):
"""Get labels for given network device."""
obj = dbapi.network_devices_get_by_id(context, id)
response = {"labels": list(obj.labels)}
return response, 200, None
def put(self, context, id, request_data):
"""Update existing device label. Adds if it does not exist."""
resp = dbapi.network_devices_labels_update(context, id, request_data)
response = {"labels": list(resp.labels)}
return response, 200, None
def delete(self, context, id, request_data):
"""Delete device label(s)."""
dbapi.network_devices_labels_delete(context, id, request_data)
return None, 204, None
class NetworkInterfaces(base.Resource):
"""Controller for Netowrk Interfaces."""
@base.pagination_context
def get(self, context, request_args, pagination_params):
"""Get all network interfaces."""
interfaces_obj, link_params = dbapi.network_interfaces_get_all(
context, request_args, pagination_params,
)
links = base.links_from(link_params)
response_body = {'network_interfaces': interfaces_obj, 'links': links}
return jsonutils.to_primitive(response_body), 200, None
def post(self, context, request_data):
"""Create a new network interface."""
json = util.copy_project_id_into_json(context, request_data)
obj = dbapi.network_interfaces_create(context, json)
interface = jsonutils.to_primitive(obj)
location = v1.api.url_for(
NetworkInterfaceById, id=obj.id, _external=True
)
headers = {'Location': location}
return interface, 201, headers
class NetworkInterfaceById(base.Resource):
def get(self, context, id):
"""Get network interface by given id"""
obj = dbapi.network_interfaces_get_by_id(context, id)
interface = jsonutils.to_primitive(obj)
interface['variables'] = jsonutils.to_primitive(obj.variables)
return interface, 200, None
def put(self, context, id, request_data):
"""Update existing network interface values."""
net_obj = dbapi.network_interfaces_update(context, id, request_data)
return jsonutils.to_primitive(net_obj), 200, None
def delete(self, context, id):
"""Delete existing network interface."""
dbapi.network_interfaces_delete(context, id)
return None, 204, None

View File

@ -1,81 +0,0 @@
from oslo_serialization import jsonutils
from oslo_log import log
from craton.api import v1
from craton.api.v1 import base
from craton.api.v1.resources import utils
from craton import db as dbapi
from craton import util
LOG = log.getLogger(__name__)
class Regions(base.Resource):
@base.pagination_context
def get(self, context, request_args, pagination_params):
"""Get region(s) for the project. Get region details if
for a particular region.
"""
region_id = request_args.get("id")
region_name = request_args.get("name")
details = request_args.get("details")
if not (region_id or region_name):
# Get all regions for this tenant
regions_obj, link_params = dbapi.regions_get_all(
context, request_args, pagination_params,
)
if details:
regions_obj = [utils.get_resource_with_vars(request_args, r)
for r in regions_obj]
else:
if region_name:
region_obj = dbapi.regions_get_by_name(context, region_name)
region_obj.data = region_obj.variables
if region_id:
region_obj = dbapi.regions_get_by_id(context, region_id)
region_obj.data = region_obj.variables
regions_obj = [region_obj]
link_params = {}
links = base.links_from(link_params)
response_body = {'regions': regions_obj, 'links': links}
return jsonutils.to_primitive(response_body), 200, None
def post(self, context, request_data):
"""Create a new region."""
json = util.copy_project_id_into_json(context, request_data)
region_obj = dbapi.regions_create(context, json)
region = jsonutils.to_primitive(region_obj)
if 'variables' in json:
region["variables"] = jsonutils.to_primitive(region_obj.variables)
else:
region["variables"] = {}
location = v1.api.url_for(
RegionsById, id=region_obj.id, _external=True
)
headers = {'Location': location}
return region, 201, headers
class RegionsById(base.Resource):
def get(self, context, id, request_args):
region_obj = dbapi.regions_get_by_id(context, id)
region = utils.get_resource_with_vars(request_args, region_obj)
return region, 200, None
def put(self, context, id, request_data):
"""Update existing region."""
region_obj = dbapi.regions_update(context, id, request_data)
return jsonutils.to_primitive(region_obj), 200, None
def delete(self, context, id):
"""Delete existing region."""
dbapi.regions_delete(context, id)
return None, 204, None

View File

@ -1,67 +0,0 @@
from oslo_serialization import jsonutils
from oslo_log import log
from craton.api import v1
from craton.api.v1 import base
from craton.api.v1.resources import utils
from craton import db as dbapi
LOG = log.getLogger(__name__)
class Projects(base.Resource):
@base.pagination_context
def get(self, context, request_args, pagination_params):
"""Get all projects. Requires super admin privileges."""
project_name = request_args["name"]
details = request_args.get("details")
if project_name:
projects_obj, link_params = dbapi.projects_get_by_name(
context, project_name, request_args, pagination_params,
)
else:
projects_obj, link_params = dbapi.projects_get_all(
context, request_args, pagination_params,
)
if details:
projects_obj = [utils.get_resource_with_vars(request_args, p)
for p in projects_obj]
links = base.links_from(link_params)
response_body = {'projects': projects_obj, 'links': links}
return jsonutils.to_primitive(response_body), 200, None
def post(self, context, request_data):
"""Create a new project. Requires super admin privileges."""
project_obj = dbapi.projects_create(context, request_data)
location = v1.api.url_for(
ProjectById, id=project_obj.id, _external=True
)
headers = {'Location': location}
project = jsonutils.to_primitive(project_obj)
if 'variables' in request_data:
project["variables"] = \
jsonutils.to_primitive(project_obj.variables)
else:
project["variables"] = {}
return project, 201, headers
class ProjectById(base.Resource):
def get(self, context, id):
"""Get a project details by id. Requires super admin privileges."""
project_obj = dbapi.projects_get_by_id(context, id)
project = jsonutils.to_primitive(project_obj)
project['variables'] = jsonutils.to_primitive(project_obj.variables)
return project, 200, None
def delete(self, context, id):
"""Delete existing project. Requires super admin privileges."""
dbapi.projects_delete(context, id)
return None, 204, None

View File

@ -1,68 +0,0 @@
from oslo_serialization import jsonutils
from oslo_log import log
from oslo_utils import uuidutils
from craton.api import v1
from craton.api.v1 import base
from craton import db as dbapi
LOG = log.getLogger(__name__)
class Users(base.Resource):
@base.pagination_context
def get(self, context, request_args, pagination_params):
"""Get all users. Requires project admin privileges."""
user_id = request_args["id"]
user_name = request_args["name"]
if user_id:
user_obj = dbapi.users_get_by_id(context, user_id)
user_obj.data = user_obj.variables
users_obj = [user_obj]
link_params = {}
if user_name:
users_obj, link_params = dbapi.users_get_by_name(
context, user_name, request_args, pagination_params,
)
else:
users_obj, link_params = dbapi.users_get_all(
context, request_args, pagination_params,
)
links = base.links_from(link_params)
response_body = {'users': users_obj, 'links': links}
return jsonutils.to_primitive(response_body), 200, None
def post(self, context, request_data):
"""Create a new user. Requires project admin privileges."""
# NOTE(sulo): Instead of using the project_id from the context
# header, we always ensure that user creation gets its project_id
# from the request parameters.
project_id = request_data["project_id"]
dbapi.projects_get_by_id(context, project_id)
api_key = uuidutils.generate_uuid()
request_data["api_key"] = api_key
user_obj = dbapi.users_create(context, request_data)
location = v1.api.url_for(
UserById, id=user_obj.id, _external=True
)
headers = {'Location': location}
return jsonutils.to_primitive(user_obj), 201, headers
class UserById(base.Resource):
def get(self, context, id):
"""Get a user details by id. Requires project admin privileges."""
user_obj = dbapi.users_get_by_id(context, id)
return jsonutils.to_primitive(user_obj), 200, None
def delete(self, context, id):
"""Delete existing user. Requires project admin privileges."""
dbapi.users_delete(context, id)
return None, 204, None

View File

@ -1,69 +0,0 @@
import binascii
import os
from flask import url_for
from oslo_serialization import jsonutils
from craton import db as dbapi
def format_variables(args, obj):
"""Update resource response with requested type of variables."""
if args:
resolved_values = args.get("resolved-values", None)
else:
resolved_values = None
if resolved_values:
obj.vars = obj.resolved
else:
obj.vars = obj.variables
return obj
def get_resource_with_vars(args, obj):
"""Get resource in json primitive with variables."""
obj = format_variables(args, obj)
res = jsonutils.to_primitive(obj)
res['variables'] = jsonutils.to_primitive(obj.vars)
return res
def get_device_type(context, device_id):
device = dbapi.resource_get_by_id(context, "devices", device_id)
return device.type
def get_resource_url(resource_type, resource_id):
resources = {
"cells": "v1.cells_id",
"hosts": "v1.hosts_id",
"network_devices": "v1.network_devices_id",
"regions": "v1.regions_id",
}
return url_for(resources[resource_type], id=resource_id, _external=True)
def add_up_link(context, device):
if device["parent_id"]:
device_type = get_device_type(context, device["parent_id"])
link_url = get_resource_url(device_type, device["parent_id"])
elif device["cell_id"]:
link_url = get_resource_url("cells", device["cell_id"])
else:
link_url = get_resource_url("regions", device["region_id"])
link = {
"href": link_url,
"rel": "up",
}
links = device.setdefault("links", [])
links.append(link)
def gen_api_key():
"""Generates crypto strong 16 bytes api key."""
# NOTE(sulo): this implementation is taken from secrets
# moudule available in python 3.6
tbytes = os.urandom(16)
return binascii.hexlify(tbytes).decode('ascii')
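
For comparison only (not part of the deleted module): on Python 3.6 and later the same key can be produced directly with the standard-library secrets module.

import secrets

def gen_api_key_py36():
    """Equivalent of gen_api_key() using the stdlib secrets module."""
    # token_hex(16) hex-encodes 16 cryptographically strong random bytes,
    # matching binascii.hexlify(os.urandom(16)).decode('ascii') above.
    return secrets.token_hex(16)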

View File

@ -1,43 +0,0 @@
from oslo_serialization import jsonutils
from oslo_log import log
from craton.api.v1 import base
from craton.api.v1.resources import utils
from craton import db as dbapi
# NOTE(thomasem): LOG must exist for the craton.api.v1.base module to
# introspect and use this module's LOG.
LOG = log.getLogger(__name__)
class Variables(base.Resource):
def get(self, context, resources, id, request_args=None):
"""Get variables for given resource."""
obj = dbapi.resource_get_by_id(context, resources, id)
obj = utils.format_variables(request_args, obj)
resp = {"variables": jsonutils.to_primitive(obj.vars)}
return resp, 200, None
def put(self, context, resources, id, request_data):
"""
Update existing resource variables, or create if it does
not exist.
"""
obj = dbapi.variables_update_by_resource_id(
context, resources, id, request_data
)
resp = {"variables": jsonutils.to_primitive(obj.variables)}
return resp, 200, None
def delete(self, context, resources, id, request_data):
"""Delete resource variables."""
# NOTE(sulo): this is not ideal; find a better way to do this.
# We can pass multiple keys, such as key1=one key2=two etc., but this
# is not the best way to express a delete.
dbapi.variables_delete_by_resource_id(
context, resources, id, request_data
)
return None, 204, None
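
For illustration only, a client might drive these variable endpoints roughly as follows. The base URL and the request-body shapes are assumptions; the authoritative JSON schemas live in craton.api.v1.schemas, which is not shown in this hunk.

import requests

BASE = 'http://127.0.0.1:7780/v1'  # assumed local craton-api endpoint; auth headers omitted

# Set (or create) two variables on host 1; the handler above passes the
# request body straight through to the db layer.
requests.put(BASE + '/hosts/1/variables',
             json={'hw_profile': 'compute-v2', 'rack': 'c-12'})

# Read them back, optionally resolved up the cell/region hierarchy
# (see the 'resolved-values' handling in format_variables above).
resp = requests.get(BASE + '/hosts/1/variables',
                    params={'resolved-values': 'true'})
print(resp.json()['variables'])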

View File

@ -1,93 +0,0 @@
from craton.api.v1.resources import users
from craton.api.v1.resources import projects
from craton.api.v1.resources import variables
from craton.api.v1.resources.inventory import ansible_inventory
from craton.api.v1.resources.inventory import cells
from craton.api.v1.resources.inventory import clouds
from craton.api.v1.resources.inventory import devices
from craton.api.v1.resources.inventory import hosts
from craton.api.v1.resources.inventory import regions
from craton.api.v1.resources.inventory import networks
VARS_RESOLVE = ", ".join(map(repr, ("hosts", )))
VARS_NOT_RESOLVE = ", ".join(
map(repr, ("network-devices", "cells", "regions", "networks", "projects",
"clouds"))
)
routes = [
dict(resource=ansible_inventory.AnsibleInventory,
urls=['/ansible-inventory'],
endpoint='ansible_inventory'),
dict(resource=devices.Devices,
urls=['/devices'],
endpoint='devices'),
dict(resource=hosts.HostsLabels,
urls=['/hosts/<id>/labels'],
endpoint='hosts_labels'),
dict(resource=hosts.HostById,
urls=['/hosts/<id>'],
endpoint='hosts_id'),
dict(resource=hosts.Hosts,
urls=['/hosts'],
endpoint='hosts'),
dict(resource=regions.Regions,
urls=['/regions'],
endpoint='regions'),
dict(resource=regions.RegionsById,
urls=['/regions/<id>'],
endpoint='regions_id'),
dict(resource=clouds.Clouds,
urls=['/clouds'],
endpoint='clouds'),
dict(resource=clouds.CloudsById,
urls=['/clouds/<id>'],
endpoint='clouds_id'),
dict(resource=cells.CellById,
urls=['/cells/<id>'],
endpoint='cells_id'),
dict(resource=cells.Cells,
urls=['/cells'],
endpoint='cells'),
dict(resource=projects.Projects,
urls=['/projects'],
endpoint='projects'),
dict(resource=projects.ProjectById,
urls=['/projects/<id>'],
endpoint='projects_id'),
dict(resource=users.Users,
urls=['/users'],
endpoint='users'),
dict(resource=users.UserById,
urls=['/users/<id>'],
endpoint='users_id'),
dict(resource=networks.Networks,
urls=['/networks'],
endpoint='networks'),
dict(resource=networks.NetworkById,
urls=['/networks/<id>'],
endpoint='networks_id'),
dict(resource=networks.NetworkInterfaces,
urls=['/network-interfaces'],
endpoint='network_interfaces'),
dict(resource=networks.NetworkInterfaceById,
urls=['/network-interfaces/<id>'],
endpoint='network_interfaces_id'),
dict(resource=networks.NetworkDevices,
urls=['/network-devices'],
endpoint='network_devices'),
dict(resource=networks.NetworkDeviceById,
urls=['/network-devices/<id>'],
endpoint='network_devices_id'),
dict(resource=networks.NetworkDeviceLabels,
urls=['/network-devices/<id>/labels'],
endpoint='network_devices_labels'),
dict(resource=variables.Variables,
urls=['/<any({}):resources>/<id>/variables'.format(VARS_RESOLVE)],
endpoint='variables_with_resolve'),
dict(resource=variables.Variables,
urls=['/<any({}):resources>/<id>/variables'.format(VARS_NOT_RESOLVE)],
endpoint='variables_without_resolve'),
]
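
The routes list above is plain data; as an assumption-level sketch (craton's actual registration code is not part of this hunk), wiring it up with Flask-RESTful would be a loop that binds each resource class to its URL rules and endpoint name:

from flask import Blueprint
import flask_restful

v1_blueprint = Blueprint('v1', __name__)
api = flask_restful.Api(v1_blueprint)

for route in routes:
    # Each dict carries the resource class, its URL rules and endpoint name.
    api.add_resource(route['resource'], *route['urls'],
                     endpoint=route['endpoint'])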

File diff suppressed because it is too large

View File

@ -1,285 +0,0 @@
# This code is auto-generated; any manual change will be overwritten
# the next time the code is generated.
from functools import wraps
from werkzeug.datastructures import MultiDict, Headers
from flask import request
from jsonschema import Draft4Validator
from oslo_log import log
from craton.api.v1.schemas import filters
from craton.api.v1.schemas import validators
from craton import db as dbapi
from craton import exceptions
LOG = log.getLogger(__name__)
def merge_default(schema, value):
# TODO: support more types
type_defaults = {
'integer': 9573,
'string': 'something',
'object': {},
'array': [],
'boolean': False
}
return normalize(schema, value, type_defaults)[0]
def normalize(schema, data, required_defaults=None):
if required_defaults is None:
required_defaults = {}
errors = []
class DataWrapper(object):
def __init__(self, data):
super(DataWrapper, self).__init__()
self.data = data
def get(self, key, default=None):
if isinstance(self.data, dict):
return self.data.get(key, default)
if hasattr(self.data, key):
return getattr(self.data, key)
else:
return default
def has(self, key):
if isinstance(self.data, dict):
return key in self.data
return hasattr(self.data, key)
def keys(self):
if isinstance(self.data, dict):
return self.data.keys()
return vars(self.data).keys()
def _normalize_dict(schema, data):
result = {}
if not isinstance(data, DataWrapper):
data = DataWrapper(data)
for pattern, _schema in (schema.get('patternProperties', {})).items():
if pattern == "^.+":
for key in data.keys():
result[key] = _normalize(_schema, data.get(key))
for key, _schema in schema.get('properties', {}).items():
# set default
type_ = _schema.get('type', 'object')
if ('default' not in _schema and
key in schema.get('required', []) and
type_ in required_defaults):
_schema['default'] = required_defaults[type_]
# get value
if data.has(key):
result[key] = _normalize(_schema, data.get(key))
elif 'default' in _schema:
result[key] = _schema['default']
elif key in schema.get('required', []):
errors.append(dict(name='property_missing',
message='`%s` is required' % key))
for _schema in schema.get('allOf', []):
rs_component = _normalize(_schema, data)
rs_component.update(result)
result = rs_component
if schema.get('anyOf'):
# In the case of anyOf simply return the data, since we don't
# care about normalizing it as long as it has been verified.
result = data.data
additional_properties_schema = schema.get('additionalProperties',
False)
if additional_properties_schema:
aproperties_set = set(data.keys()) - set(result.keys())
for pro in aproperties_set:
result[pro] = _normalize(additional_properties_schema,
data.get(pro))
return result
def _normalize_list(schema, data):
result = []
if hasattr(data, '__iter__') and not isinstance(data, dict):
for item in data:
result.append(_normalize(schema.get('items'), item))
elif 'default' in schema:
result = schema['default']
return result
def _normalize_default(schema, data):
if data is None:
return schema.get('default')
else:
return data
def _normalize(schema, data):
if not schema:
return None
funcs = {
'object': _normalize_dict,
'array': _normalize_list,
'default': _normalize_default,
}
type_ = schema.get('type', 'object')
if type_ not in funcs:
type_ = 'default'
return funcs[type_](schema, data)
return _normalize(schema, data), errors
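# Illustrative only (not part of the generated module): merge_default fills
# required-but-missing keys with the per-type placeholders defined above.
# For a hypothetical schema
#   {'type': 'object',
#    'properties': {'name': {'type': 'string'},
#                   'count': {'type': 'integer'}},
#    'required': ['count']}
# merge_default(schema, {'name': 'web01'}) returns
#   {'name': 'web01', 'count': 9573}
# because 'count' is required, absent, and of integer type.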
class FlaskValidatorAdaptor(object):
def __init__(self, schema):
self.validator = Draft4Validator(schema)
def type_convert(self, obj):
if obj is None:
return None
if isinstance(obj, (dict, list)) and not isinstance(obj, MultiDict):
return obj
if isinstance(obj, Headers):
obj = MultiDict(obj)
result = dict()
convert_funs = {
'integer': lambda v: int(v[0]),
'boolean': lambda v: v[0].lower() not in ['n', 'no',
'false', '', '0'],
'null': lambda v: None,
'number': lambda v: float(v[0]),
'string': lambda v: v[0]
}
def convert_array(type_, v):
func = convert_funs.get(type_, lambda v: v[0])
return [func([i]) for i in v]
for k, values in obj.lists():
prop = self.validator.schema['properties'].get(k, {})
type_ = prop.get('type')
fun = convert_funs.get(type_, lambda v: v[0])
if type_ == 'array':
item_type = prop.get('items', {}).get('type')
result[k] = convert_array(item_type, values)
else:
result[k] = fun(values)
return result
def validate(self, value):
value = self.type_convert(value)
errors = sorted(e.message for e in self.validator.iter_errors(value))
if errors:
msg = "The request included the following errors:\n- {}".format(
"\n- ".join(errors)
)
raise exceptions.BadRequest(message=msg)
return merge_default(self.validator.schema, value)
def request_validate(view):
@wraps(view)
def wrapper(*args, **kwargs):
endpoint = request.endpoint.partition('.')[-1]
# data
method = request.method
if method == 'HEAD':
method = 'GET'
locations = validators.get((endpoint, method), {})
data_type = {"json": "request_data", "args": "request_args"}
for location, schema in locations.items():
value = getattr(request, location, MultiDict())
validator = FlaskValidatorAdaptor(schema)
result = validator.validate(value)
LOG.info("Validated request %s: %s" % (location, result))
if schema.get("maxProperties") == 0:
continue
else:
kwargs[data_type[location]] = result
context = request.environ['context']
return view(*args, context=context, **kwargs)
return wrapper
def ensure_project_exists(view):
@wraps(view)
def wrapper(*args, **kwargs):
context = request.environ['context']
if context.using_keystone:
find_or_create_project(request, context)
return view(*args, **kwargs)
return wrapper
def response_filter(view):
@wraps(view)
def wrapper(*args, **kwargs):
resp = view(*args, **kwargs)
endpoint = request.endpoint.partition('.')[-1]
method = request.method
if method == 'HEAD':
method = 'GET'
try:
resp_filter = filters[(endpoint, method)]
except KeyError:
LOG.error(
'"(%(endpoint)s, %(method)s)" is not defined in the response '
'filters.',
{"endpoint": endpoint, "method": method}
)
raise exceptions.UnknownException
body, status, headers = resp
try:
schemas = resp_filter[status]
except KeyError:
LOG.error(
'The status code %(status)d is not defined in the response '
'filter "(%(endpoint)s, %(method)s)".',
{"status": status, "endpoint": endpoint, "method": method}
)
raise exceptions.UnknownException
body, errors = normalize(schemas['schema'], body)
if schemas['headers']:
headers, header_errors = normalize(
{'properties': schemas['headers']}, headers)
errors.extend(header_errors)
if errors:
LOG.error('Expectation Failed: %s', errors)
raise exceptions.UnknownException
return body, status, headers
return wrapper
def find_or_create_project(request, context):
project_id = context.tenant
token_info = context.token_info
try:
dbapi.projects_get_by_id(context, project_id)
except exceptions.NotFound:
LOG.info('Adding Project "%s" to projects table', project_id)
dbapi.projects_create(context,
{'id': project_id,
'name': token_info['project']['name']})

View File

@ -1,3 +0,0 @@
##########
Craton CLI
##########

View File

View File

@ -1,30 +0,0 @@
import os
import sys
from wsgiref import simple_server
from oslo_config import cfg
from oslo_log import log as logging
from craton import api
LOG = logging.getLogger(__name__)
CONF = cfg.CONF
def main():
logging.register_options(CONF)
CONF(sys.argv[1:],
project='craton-api',
default_config_files=[])
logging.setup(CONF, 'craton-api')
app = api.load_app()
host, port = cfg.CONF.api.host, cfg.CONF.api.port
srv = simple_server.make_server(host, port, app)
LOG.info("Starting API server in PID: %s" % os.getpid())
srv.serve_forever()
if __name__ == "__main__":
main()

View File

@ -1,86 +0,0 @@
from oslo_config import cfg
from craton.db.sqlalchemy import migration
CONF = cfg.CONF
class DBCommand(object):
def upgrade(self):
migration.upgrade(CONF.command.revision)
def revision(self):
migration.revision(CONF.command.message, CONF.command.autogenerate)
def stamp(self):
migration.stamp(CONF.command.revision)
def version(self):
print(migration.version())
def create_schema(self):
migration.create_schema()
def bootstrap_project(self):
name = 'bootstrap'
project = migration.create_bootstrap_project(
name,
db_uri=CONF.database.connection)
user = migration.create_bootstrap_user(
project.id,
name,
db_uri=CONF.database.connection)
msg = ("\nProjectId: %s\nUsername: %s\nAPIKey: %s"
% (user.project_id, user.username, user.api_key))
print(msg)
def add_command_parsers(subparsers):
command_object = DBCommand()
parser = subparsers.add_parser(
'upgrade',
help=("Upgrade the database schema to the latest version. "
"Optionally, use --revision to specify an alembic revision "
"string to upgrade to."))
parser.set_defaults(func=command_object.upgrade)
parser.add_argument('--revision', nargs='?')
parser = subparsers.add_parser('stamp')
parser.add_argument('--revision', nargs='?')
parser.set_defaults(func=command_object.stamp)
parser = subparsers.add_parser(
'revision',
help=("Create a new alembic revision. "
"Use --message to set the message string."))
parser.add_argument('-m', '--message')
parser.add_argument('--autogenerate', action='store_true')
parser.set_defaults(func=command_object.revision)
parser = subparsers.add_parser(
'version',
help=("Print the current version information and exit."))
parser.set_defaults(func=command_object.version)
parser = subparsers.add_parser(
'create_schema',
help=("Create the database schema."))
parser.set_defaults(func=command_object.create_schema)
parser = subparsers.add_parser('bootstrap')
parser.set_defaults(func=command_object.bootstrap_project)
def main():
command_opt = cfg.SubCommandOpt('command',
title='Command',
help=('Available commands'),
handler=add_command_parsers)
CONF.register_cli_opt(command_opt)
CONF(project='craton-api')
CONF.command.func()

View File

@ -1,84 +0,0 @@
import contextlib
import signal
import sys
from oslo_config import cfg
from oslo_log import log as logging
from oslo_utils import uuidutils
from stevedore import driver
from taskflow import engines
from taskflow.persistence import models
from craton.workflow import worker
LOG = logging.getLogger(__name__)
CONF = cfg.CONF
# This needs to be a globally accessible (i.e. top-level) function, so
# flow recovery can execute it to re-create the intended workflows.
def workflow_factory(name, *args, **kwargs):
mgr = driver.DriverManager(
namespace='craton.workflow', name=name,
invoke_on_load=True, invoke_args=args, invoke_kwds=kwargs)
return mgr.driver.workflow()
def main():
logging.register_options(CONF)
CONF(sys.argv[1:], project='craton-worker', default_config_files=[])
logging.setup(CONF, 'craton')
persistence, board, conductor = worker.start(CONF)
def stop(signum, _frame):
LOG.info('Caught signal %s, gracefully exiting', signum)
conductor.stop()
signal.signal(signal.SIGTERM, stop)
# TODO(gus): eventually feeding in jobs will happen elsewhere and
# main() will end here.
#
# conductor.wait()
# sys.exit(0)
def make_save_book(persistence, job_id,
flow_plugin, plugin_args=(), plugin_kwds={}):
flow_id = book_id = job_id # Do these need to be different?
book = models.LogBook(book_id)
detail = models.FlowDetail(flow_id, uuidutils.generate_uuid())
book.add(detail)
factory_args = [flow_plugin] + list(plugin_args)
factory_kwargs = plugin_kwds
engines.save_factory_details(detail, workflow_factory,
factory_args, factory_kwargs)
with contextlib.closing(persistence.get_connection()) as conn:
conn.save_logbook(book)
return book
# Feed in example task
job_uuid = uuidutils.generate_uuid()
LOG.debug('Posting job %s', job_uuid)
details = {
'store': {
'foo': 'bar',
},
}
job = board.post(
job_uuid,
book=make_save_book(
persistence, job_uuid,
'testflow', plugin_kwds=dict(task_delay=2)),
details=details)
# Run forever. TODO(gus): This is what we want to do in production
# conductor.wait()
job.wait()
LOG.debug('Job finished: %s', job.state)
conductor.stop()
if __name__ == '__main__':
main()

View File

@ -1,5 +0,0 @@
"""
DB abstraction for Craton Inventory
"""
from craton.db.api import * # noqa

View File

@ -1,335 +0,0 @@
"""Defines interface for DB access."""
from collections import namedtuple
from oslo_config import cfg
from oslo_db import api as db_api
db_opts = [
cfg.StrOpt('db_backend', default='sqlalchemy',
help='The backend to use for DB.'),
]
CONF = cfg.CONF
CONF.register_opts(db_opts)
# entrypoint namespace for db backend
BACKEND_MAPPING = {'sqlalchemy': 'craton.db.sqlalchemy.api'}
IMPL = db_api.DBAPI.from_config(cfg.CONF, backend_mapping=BACKEND_MAPPING,
lazy=True)
# Blame supports generic blame tracking for variables
# TODO(jimbaker) add additional governance support, such as
# versioning, user, notes
Blame = namedtuple('Blame', ['source', 'variable'])
def devices_get_all(context, filters, pagination_params):
"""Get all available devices."""
return IMPL.devices_get_all(context, filters, pagination_params)
def get_user_info(context, user):
return IMPL.get_user_info(context, user)
def resource_get_by_id(context, resources, resource_id):
"""Get resource for the unique resource id."""
return IMPL.resource_get_by_id(context, resources, resource_id)
def variables_update_by_resource_id(context, resources, resource_id, data):
"""Update/create existing resource's variables."""
return IMPL.variables_update_by_resource_id(
context,
resources,
resource_id,
data,
)
def variables_delete_by_resource_id(context, resources, resource_id, data):
"""Delete the existing variables, if present, from resource's data."""
return IMPL.variables_delete_by_resource_id(
context,
resources,
resource_id,
data,
)
# Cells
def cells_get_all(context, filters, pagination_params):
"""Get all available cells."""
return IMPL.cells_get_all(context, filters, pagination_params)
def cells_get_by_id(context, cell_id):
"""Get cell detail for the unique cell id."""
return IMPL.cells_get_by_id(context, cell_id)
def cells_create(context, values):
"""Create a new cell."""
return IMPL.cells_create(context, values)
def cells_update(context, cell_id, values):
"""Update an existing cell."""
return IMPL.cells_update(context, cell_id, values)
def cells_delete(context, cell_id):
"""Delete an existing cell."""
return IMPL.cells_delete(context, cell_id)
# Regions
def regions_get_all(context, filters, pagination_params):
"""Get all available regions."""
return IMPL.regions_get_all(context, filters, pagination_params)
def regions_get_by_name(context, name):
"""Get cell detail for the region with given name."""
return IMPL.regions_get_by_name(context, name)
def regions_get_by_id(context, region_id):
"""Get cell detail for the region with given id."""
return IMPL.regions_get_by_id(context, region_id)
def regions_create(context, values):
"""Create a new region."""
return IMPL.regions_create(context, values)
def regions_update(context, region_id, values):
"""Update an existing region."""
return IMPL.regions_update(context, region_id, values)
def regions_delete(context, region_id):
"""Delete an existing region."""
return IMPL.regions_delete(context, region_id)
# Clouds
def clouds_get_all(context, filters, pagination_params):
"""Get all available clouds."""
return IMPL.clouds_get_all(context, filters, pagination_params)
def clouds_get_by_name(context, name):
"""Get clouds with given name."""
return IMPL.clouds_get_by_name(context, name)
def clouds_get_by_id(context, cloud_id):
"""Get cloud detail for the cloud with given id."""
return IMPL.clouds_get_by_id(context, cloud_id)
def clouds_create(context, values):
"""Create a new cloud."""
return IMPL.clouds_create(context, values)
def clouds_update(context, cloud_id, values):
"""Update an existing cloud."""
return IMPL.clouds_update(context, cloud_id, values)
def clouds_delete(context, cloud_id):
"""Delete an existing cloud."""
return IMPL.clouds_delete(context, cloud_id)
# Hosts
def hosts_get_all(context, filters, pagination_params):
"""Get all hosts."""
return IMPL.hosts_get_all(context, filters, pagination_params)
def hosts_get_by_id(context, host_id):
"""Get details for the host with given id."""
return IMPL.hosts_get_by_id(context, host_id)
def hosts_create(context, values):
"""Create a new host."""
return IMPL.hosts_create(context, values)
def hosts_update(context, host_id, values):
"""Update an existing host."""
return IMPL.hosts_update(context, host_id, values)
def hosts_delete(context, host_id):
"""Delete an existing host."""
return IMPL.hosts_delete(context, host_id)
def hosts_labels_delete(context, host_id, labels):
"""Delete existing device label(s)."""
return IMPL.hosts_labels_delete(context, host_id, labels)
def hosts_labels_update(context, host_id, labels):
"""Update existing device label entirely."""
return IMPL.hosts_labels_update(context, host_id, labels)
# Projects
def projects_get_all(context, filters, pagination_params):
"""Get all the projects."""
return IMPL.projects_get_all(context, filters, pagination_params)
def projects_get_by_name(context, project_name, filters, pagination_params):
"""Get all projects that match the given name."""
return IMPL.projects_get_by_name(context, project_name, filters,
pagination_params)
def projects_get_by_id(context, project_id):
"""Get project by its id."""
return IMPL.projects_get_by_id(context, project_id)
def projects_create(context, values):
"""Create a new project with given values."""
return IMPL.projects_create(context, values)
def projects_delete(context, project_id):
"""Delete an existing project given by its id."""
return IMPL.projects_delete(context, project_id)
# Users
def users_get_all(context, filters, pagination_params):
"""Get all the users."""
return IMPL.users_get_all(context, filters, pagination_params)
def users_get_by_name(context, user_name, filters, pagination_params):
"""Get all users that match the given username."""
return IMPL.users_get_by_name(context, user_name, filters,
pagination_params)
def users_get_by_id(context, user_id):
"""Get user by its id."""
return IMPL.users_get_by_id(context, user_id)
def users_create(context, values):
"""Create a new user with given values."""
return IMPL.users_create(context, values)
def users_delete(context, user_id):
"""Delete an existing user given by its id."""
return IMPL.users_delete(context, user_id)
# Networks
def networks_get_all(context, filters, pagination_params):
"""Get all networks for the given region."""
return IMPL.networks_get_all(context, filters, pagination_params)
def networks_get_by_id(context, network_id):
"""Get a given network by its id."""
return IMPL.networks_get_by_id(context, network_id)
def networks_create(context, values):
"""Create a new network."""
return IMPL.networks_create(context, values)
def networks_update(context, network_id, values):
"""Update an existing network."""
return IMPL.networks_update(context, network_id, values)
def networks_delete(context, network_id):
"""Delete existing network."""
return IMPL.networks_delete(context, network_id)
def network_devices_get_all(context, filters, pagination_params):
"""Get all network devices."""
return IMPL.network_devices_get_all(context, filters, pagination_params)
def network_devices_get_by_id(context, network_device_id):
"""Get a given network device by its id."""
return IMPL.network_devices_get_by_id(context, network_device_id)
def network_devices_create(context, values):
"""Create a new network device."""
return IMPL.network_devices_create(context, values)
def network_devices_update(context, network_device_id, values):
"""Update an existing network device"""
return IMPL.network_devices_update(context, network_device_id, values)
def network_devices_delete(context, network_device_id):
"""Delete existing network device."""
return IMPL.network_devices_delete(context, network_device_id)
def network_devices_labels_delete(context, network_device_id, labels):
"""Delete network device labels."""
return IMPL.network_devices_labels_delete(context, network_device_id,
labels)
def network_devices_labels_update(context, network_device_id, labels):
"""Update network device labels."""
return IMPL.network_devices_labels_update(context, network_device_id,
labels)
def network_interfaces_get_all(context, filters, pagination_params):
"""Get all network interfaces."""
return IMPL.network_interfaces_get_all(
context, filters, pagination_params,
)
def network_interfaces_get_by_id(context, interface_id):
"""Get a given network interface by its id."""
return IMPL.network_interfaces_get_by_id(context, interface_id)
def network_interfaces_create(context, values):
"""Create a new network interface."""
return IMPL.network_interfaces_create(context, values)
def network_interfaces_update(context, interface_id, values):
"""Update an existing network interface."""
return IMPL.network_interfaces_update(context, interface_id, values)
def network_interfaces_delete(context, interface_id):
"""Delete existing network interface."""
return IMPL.network_interfaces_delete(context, interface_id)

View File

@ -1,68 +0,0 @@
# A generic, single database configuration.
[alembic]
# path to migration scripts
script_location = %(here)s/alembic
# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s
# max length of characters to apply to the
# "slug" field
#truncate_slug_length = 40
# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false
# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false
# version location specification; this defaults
# to alembic/versions. When using multiple version
# directories, initial revisions must be specified with --version-path
# version_locations = %(here)s/bar %(here)s/bat alembic/versions
# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8
#sqlalchemy.url = driver://user:pass@localhost/dbname
# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARN
handlers = console
qualname =
[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S

View File

@ -1,11 +0,0 @@
Please see https://alembic.readthedocs.org/en/latest/index.html for general documentation
To create alembic migrations use:
$ craton-dbsync --config-file=craton.conf revision --message "revision description" --autogenerate
Stamp db with most recent migration version, without actually running migrations
$ craton-dbsync --config-file=craton.conf stamp head
Upgrade can be performed by:
$ craton-dbsync --config-file=craton.conf upgrade
$ craton-dbsync --config-file=craton.conf upgrade head

View File

@ -1,66 +0,0 @@
from __future__ import with_statement
from alembic import context
from logging.config import fileConfig
from craton.db.sqlalchemy import api as sa_api
from craton.db.sqlalchemy import models as db_models
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)
# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = db_models.Base.metadata
# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
def run_migrations_offline():
"""Run migrations in 'offline' mode.
This configures the context with just a URL
and not an Engine, though an Engine is acceptable
here as well. By skipping the Engine creation
we don't even need a DBAPI to be available.
Calls to context.execute() here emit the given string to the
script output.
"""
url = config.get_main_option("sqlalchemy.url")
context.configure(
url=url, target_metadata=target_metadata, literal_binds=True)
with context.begin_transaction():
context.run_migrations()
def run_migrations_online():
"""Run migrations in 'online' mode.
In this scenario we need to create an Engine
and associate a connection with the context.
"""
engine = sa_api.get_engine()
with engine.connect() as connection:
context.configure(
connection=connection,
target_metadata=target_metadata
)
with context.begin_transaction():
context.run_migrations()
run_migrations_online()

View File

@ -1,24 +0,0 @@
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}
def upgrade():
${upgrades if upgrades else "pass"}
def downgrade():
${downgrades if downgrades else "pass"}

View File

@ -1,407 +0,0 @@
"""craton_inventory_init
Revision ID: ffdc1a500db1
Revises:
Create Date: 2016-06-03 09:52:55.302936
"""
# revision identifiers, used by Alembic.
revision = 'ffdc1a500db1'
down_revision = None
branch_labels = None
depends_on = None
from alembic import op
import sqlalchemy as sa
import sqlalchemy_utils
def upgrade():
op.create_table(
'variable_association',
sa.Column('created_at', sa.DateTime, nullable=False),
sa.Column('updated_at', sa.DateTime, nullable=True),
sa.Column('id', sa.Integer, primary_key=True),
sa.Column('discriminator', sa.String(length=50), nullable=False),
)
op.create_table(
'variables',
sa.Column('created_at', sa.DateTime, nullable=False),
sa.Column('updated_at', sa.DateTime, nullable=True),
sa.Column(
'association_id', sa.Integer,
sa.ForeignKey(
'variable_association.id',
name='fk_variables__variables_association',
ondelete='cascade'),
primary_key=True),
sa.Column('key_', sa.String(length=255), primary_key=True),
sa.Column('value_', sa.JSON, nullable=True),
)
op.create_table(
'projects',
sa.Column('created_at', sa.DateTime, nullable=False),
sa.Column('updated_at', sa.DateTime, nullable=True),
sa.Column(
'id', sqlalchemy_utils.types.UUIDType(binary=False),
primary_key=True),
sa.Column(
'variable_association_id', sa.Integer,
sa.ForeignKey(
'variable_association.id',
name='fk_projects__variable_association'),
nullable=False),
sa.Column('name', sa.String(length=255), nullable=True),
)
op.create_table(
'users',
sa.Column('created_at', sa.DateTime, nullable=False),
sa.Column('updated_at', sa.DateTime, nullable=True),
sa.Column('id', sa.Integer, primary_key=True),
sa.Column(
'project_id', sqlalchemy_utils.types.UUIDType(binary=False),
sa.ForeignKey(
'projects.id', name='fk_users__projects', ondelete='cascade'),
nullable=False),
sa.Column(
'variable_association_id', sa.Integer,
sa.ForeignKey(
'variable_association.id',
name='fk_users__variable_association'),
nullable=False),
sa.Column('username', sa.String(length=255), nullable=True),
sa.Column('api_key', sa.String(length=36), nullable=True),
sa.Column('is_root', sa.Boolean, nullable=True),
sa.Column('is_admin', sa.Boolean, nullable=True),
sa.UniqueConstraint(
'username', 'project_id',
name='uq_users_username_project_id'),
)
op.create_index(
op.f('ix_users_project_id'), 'users', ['project_id'], unique=False)
op.create_table(
'clouds',
sa.Column('created_at', sa.DateTime, nullable=False),
sa.Column('updated_at', sa.DateTime, nullable=True),
sa.Column('id', sa.Integer, primary_key=True),
sa.Column(
'project_id', sqlalchemy_utils.types.UUIDType(binary=False),
sa.ForeignKey(
'projects.id', name='fk_clouds__projects', ondelete='cascade'),
nullable=False),
sa.Column(
'variable_association_id', sa.Integer,
sa.ForeignKey(
'variable_association.id',
name='fk_clouds__variable_association'),
nullable=False),
sa.Column('name', sa.String(length=255), nullable=True),
sa.Column('note', sa.Text, nullable=True),
sa.UniqueConstraint(
'project_id', 'name',
name='uq_clouds__project_id__name'),
)
op.create_index(
op.f('ix_clouds_project_id'), 'clouds', ['project_id'], unique=False)
op.create_table(
'regions',
sa.Column('created_at', sa.DateTime, nullable=False),
sa.Column('updated_at', sa.DateTime, nullable=True),
sa.Column('id', sa.Integer, primary_key=True),
sa.Column(
'project_id', sqlalchemy_utils.types.UUIDType(binary=False),
sa.ForeignKey(
'projects.id',
name='fk_projects__regions', ondelete='cascade')),
sa.Column(
'cloud_id', sa.Integer,
sa.ForeignKey(
'clouds.id', name='fk_regions__clouds', ondelete='cascade'),
nullable=False),
sa.Column(
'variable_association_id', sa.Integer,
sa.ForeignKey(
'variable_association.id',
name='fk_regions__variable_association'),
nullable=False),
sa.Column('name', sa.String(length=255), nullable=True),
sa.Column('note', sa.Text, nullable=True),
sa.UniqueConstraint(
'cloud_id', 'name', name='uq_regions__cloud_id__name'),
)
op.create_index(
op.f('ix_regions_project_id'), 'regions', ['project_id'], unique=False)
op.create_index(
op.f('ix_regions_cloud_id'), 'regions', ['cloud_id'], unique=False)
op.create_table(
'cells',
sa.Column('created_at', sa.DateTime, nullable=False),
sa.Column('updated_at', sa.DateTime, nullable=True),
sa.Column('id', sa.Integer, primary_key=True),
sa.Column(
'project_id', sqlalchemy_utils.types.UUIDType(binary=False),
sa.ForeignKey(
'projects.id', name='fk_cells__projects', ondelete='cascade'),
nullable=False),
sa.Column(
'cloud_id', sa.Integer,
sa.ForeignKey(
'clouds.id', name='fk_cells__clouds', ondelete='cascade'),
nullable=False),
sa.Column(
'region_id', sa.Integer,
sa.ForeignKey(
'regions.id', name='fk_cells__regions', ondelete='cascade'),
nullable=False),
sa.Column(
'variable_association_id', sa.Integer,
sa.ForeignKey(
'variable_association.id',
name='fk_cells__variable_association'),
nullable=False),
sa.Column('name', sa.String(length=255), nullable=True),
sa.Column('note', sa.Text, nullable=True),
sa.UniqueConstraint(
'region_id', 'name', name='uq_cells__region_id__name'),
)
op.create_index(
op.f('ix_cells_project_id'), 'cells', ['project_id'], unique=False)
op.create_index(
op.f('ix_cells_cloud_id'), 'cells', ['cloud_id'], unique=False)
op.create_index(
op.f('ix_cells_region_id'), 'cells', ['region_id'], unique=False)
op.create_table(
'networks',
sa.Column('created_at', sa.DateTime, nullable=False),
sa.Column('updated_at', sa.DateTime, nullable=True),
sa.Column('id', sa.Integer, primary_key=True),
sa.Column(
'project_id', sqlalchemy_utils.types.UUIDType(binary=False),
sa.ForeignKey(
'projects.id',
name='fk_networks__projects', ondelete='cascade'),
nullable=False),
sa.Column(
'cloud_id', sa.Integer,
sa.ForeignKey(
'clouds.id', name='fk_networks__clouds', ondelete='cascade'),
nullable=False),
sa.Column(
'region_id', sa.Integer,
sa.ForeignKey(
'regions.id', name='fk_networks__regions', ondelete='cascade'),
nullable=False),
sa.Column(
'cell_id', sa.Integer,
sa.ForeignKey(
'cells.id', name='fk_networks__cells', ondelete='cascade'),
nullable=True),
sa.Column(
'variable_association_id', sa.Integer,
sa.ForeignKey(
'variable_association.id',
name='fk_networks__variable_association'),
nullable=False),
sa.Column('name', sa.String(length=255), nullable=True),
sa.Column('cidr', sa.String(length=255), nullable=True),
sa.Column('gateway', sa.String(length=255), nullable=True),
sa.Column('netmask', sa.String(length=255), nullable=True),
sa.Column('ip_block_type', sa.String(length=255), nullable=True),
sa.Column('nss', sa.String(length=255), nullable=True),
sa.UniqueConstraint(
'name', 'project_id', 'region_id',
name='uq_networks__name__project_id__region_id'),
)
op.create_index(
op.f('ix_networks_cell_id'), 'networks', ['cell_id'], unique=False)
op.create_index(
op.f('ix_networks_project_id'), 'networks', ['project_id'],
unique=False)
op.create_index(
op.f('ix_networks_region_id'), 'networks', ['region_id'],
unique=False)
op.create_table(
'devices',
sa.Column('created_at', sa.DateTime, nullable=False),
sa.Column('updated_at', sa.DateTime, nullable=True),
sa.Column('id', sa.Integer, primary_key=True),
sa.Column(
'project_id', sqlalchemy_utils.types.UUIDType(binary=False),
sa.ForeignKey(
'projects.id',
name='fk_devices__projects', ondelete='cascade'),
nullable=False),
sa.Column(
'cloud_id', sa.Integer,
sa.ForeignKey(
'clouds.id', name='fk_devices__clouds', ondelete='cascade'),
nullable=False),
sa.Column(
'region_id', sa.Integer,
sa.ForeignKey(
'regions.id', name='fk_devices__regions', ondelete='cascade'),
nullable=False),
sa.Column(
'cell_id', sa.Integer,
sa.ForeignKey(
'cells.id', name='fk_devices__cells', ondelete='cascade'),
nullable=True),
sa.Column(
'parent_id', sa.Integer,
sa.ForeignKey('devices.id', name='fk_devices__devices'),
nullable=True),
sa.Column(
'variable_association_id', sa.Integer,
sa.ForeignKey(
'variable_association.id',
name='fk_devices__variable_association'),
nullable=False),
sa.Column('type', sa.String(length=50), nullable=False),
sa.Column('device_type', sa.String(length=255), nullable=False),
sa.Column('name', sa.String(length=255), nullable=False),
sa.Column(
'ip_address',
sqlalchemy_utils.types.IPAddressType(length=64),
nullable=False),
sa.Column('active', sa.Boolean(), nullable=True),
sa.Column('note', sa.Text(), nullable=True),
sa.UniqueConstraint('region_id', 'name',
name='uq_device0regionid0name'),
)
op.create_index(
op.f('ix_devices_cell_id'), 'devices', ['cell_id'], unique=False)
op.create_index(
op.f('ix_devices_project_id'), 'devices', ['project_id'], unique=False)
op.create_index(
op.f('ix_devices_region_id'), 'devices', ['region_id'], unique=False)
op.create_index(
op.f('ix_devices_cloud_id'), 'devices', ['cloud_id'], unique=False)
op.create_table(
'hosts',
sa.Column(
'id', sa.Integer,
sa.ForeignKey(
'devices.id', name='fk_hosts__devices', ondelete='cascade'),
primary_key=True)
)
op.create_table(
'network_devices',
sa.Column(
'id', sa.Integer,
sa.ForeignKey(
'devices.id',
name='fk_network_devices__devices', ondelete='cascade'),
primary_key=True),
sa.Column('model_name', sa.String(length=255), nullable=True),
sa.Column('os_version', sa.String(length=255), nullable=True),
sa.Column('vlans', sa.JSON,
nullable=True)
)
op.create_table(
'labels',
sa.Column('created_at', sa.DateTime, nullable=False),
sa.Column('updated_at', sa.DateTime, nullable=True),
sa.Column(
'device_id', sa.Integer,
sa.ForeignKey(
'devices.id',
name='fk_labels__devices', ondelete='cascade'),
primary_key=True),
sa.Column('label', sa.String(length=255), primary_key=True),
)
op.create_index(
op.f('ix_devices_labels'), 'labels', ['label', 'device_id'])
op.create_table(
'network_interfaces',
sa.Column('created_at', sa.DateTime, nullable=False),
sa.Column('updated_at', sa.DateTime, nullable=True),
sa.Column('id', sa.Integer, primary_key=True),
sa.Column(
'project_id', sqlalchemy_utils.types.UUIDType(binary=False),
sa.ForeignKey(
'projects.id',
name='fk_network_interfaces__projects', ondelete='cascade'),
nullable=False),
sa.Column(
'device_id', sa.Integer,
sa.ForeignKey(
'devices.id',
name='fk_network_interfaces__devices', ondelete='cascade'),
nullable=False),
sa.Column(
'network_id', sa.Integer,
sa.ForeignKey(
'networks.id',
name='fk_network_interfaces__networks'),
nullable=True),
sa.Column(
'variable_association_id', sa.Integer,
sa.ForeignKey(
'variable_association.id',
name='fk_network_interfaces__variable_association'),
nullable=False),
sa.Column('name', sa.String(length=255), nullable=True),
sa.Column('interface_type', sa.String(length=255), nullable=True),
sa.Column('vlan_id', sa.Integer, nullable=True),
sa.Column('port', sa.Integer, nullable=True),
sa.Column('vlan', sa.String(length=255), nullable=True),
sa.Column('duplex', sa.String(length=255), nullable=True),
sa.Column('speed', sa.String(length=255), nullable=True),
sa.Column('link', sa.String(length=255), nullable=True),
sa.Column('cdp', sa.String(length=255), nullable=True),
sa.Column('security', sa.String(length=255), nullable=True),
sa.Column(
'ip_address',
sqlalchemy_utils.types.IPAddressType(length=64),
nullable=False),
sa.UniqueConstraint(
'device_id', 'name',
name='uq_network_interfaces__device_id__name'),
)
def downgrade():
op.drop_table('network_interfaces')
op.drop_index(op.f('ix_devices_labels'), table_name='labels')
op.drop_table('labels')
op.drop_table('network_devices')
op.drop_table('hosts')
op.drop_index(op.f('ix_networks_region_id'), table_name='networks')
op.drop_index(op.f('ix_networks_cloud_id'), table_name='networks')
op.drop_index(op.f('ix_networks_project_id'), table_name='networks')
op.drop_index(op.f('ix_networks_cell_id'), table_name='networks')
op.drop_table('networks')
op.drop_index(op.f('ix_devices_region_id'), table_name='devices')
op.drop_index(op.f('ix_devices_cloud_id'), table_name='devices')
op.drop_index(op.f('ix_devices_project_id'), table_name='devices')
op.drop_index(op.f('ix_devices_cell_id'), table_name='devices')
op.drop_table('devices')
op.drop_index(op.f('ix_cells_region_id'), table_name='cells')
op.drop_index(op.f('ix_cells_cloud_id'), table_name='cells')
op.drop_index(op.f('ix_cells_project_id'), table_name='cells')
op.drop_table('cells')
op.drop_index(op.f('ix_users_project_id'), table_name='users')
op.drop_index(op.f('ix_regions_project_id'), table_name='regions')
op.drop_index(op.f('ix_regions_cloud_id'), table_name='regions')
op.drop_table('regions')
op.drop_index(op.f('ix_clouds_project_id'), table_name='clouds')
op.drop_table('clouds')
op.drop_table('users')
op.drop_table('projects')
op.drop_index(op.f('ix_variable_keys'), table_name='variables')
op.drop_table('variables')
op.drop_table('variable_association')

File diff suppressed because it is too large

View File

@ -1,114 +0,0 @@
import alembic
import os
import uuid
from alembic import config as alembic_config
import alembic.migration as alembic_migration
from sqlalchemy import create_engine
from sqlalchemy import exc
from sqlalchemy.orm import sessionmaker
import sqlalchemy.orm.exc as sa_exc
from oslo_db.sqlalchemy import enginefacade
from craton.api.v1.resources import utils
from craton.db.sqlalchemy import models
def _alembic_config():
path = os.path.join(os.path.dirname(__file__), 'alembic.ini')
config = alembic_config.Config(path)
return config
def version(config=None, engine=None):
"""Current database version."""
if engine is None:
engine = enginefacade.get_legacy_facade().get_engine()
with engine.connect() as conn:
context = alembic_migration.MigrationContext.configure(conn)
return context.get_current_revision()
def upgrade(revision, config=None):
"""Used for upgrading database.
:param version: Desired database version
"""
revision = revision or 'head'
config = config or _alembic_config()
alembic.command.upgrade(config, revision)
def stamp(revision, config=None):
"""Stamps database with provided revision.
Don't run any migrations.
:param revision: Should match one from repository or head - to stamp
database with most recent revision
"""
config = config or _alembic_config()
return alembic.command.stamp(config, revision=revision)
def revision(message=None, autogenerate=False, config=None):
"""Creates template for migration.
:param message: Text that will be used for migration title
:param autogenerate: If True - generates diff based on current database
state
"""
config = config or _alembic_config()
return alembic.command.revision(config, message=message,
autogenerate=autogenerate)
def create_bootstrap_project(name, project_id=None, db_uri=None):
"""Creates a new project.
:param name: Name of the new project
"""
if not project_id:
project_id = str(uuid.uuid4())
engine = create_engine(db_uri)
Session = sessionmaker(bind=engine)
session = Session()
project = models.Project(name=name,
id=project_id)
try:
project = session.query(models.Project).filter_by(name=name).one()
except sa_exc.NoResultFound:
session.add(project)
session.commit()
return project
def create_bootstrap_user(project_id, username, db_uri=None):
"""Creates a new project.
:param username: Username for new user
:param project_id: Project ID for the user
"""
engine = create_engine(db_uri)
Session = sessionmaker(bind=engine)
session = Session()
api_key = utils.gen_api_key()
user = models.User(project_id=project_id,
username=username,
api_key=api_key,
is_admin=True,
is_root=True)
try:
session.add(user)
session.commit()
except exc.IntegrityError as err:
if err.orig.args[0] == 1062:
# NOTE(sulo): 1062 is the normal sql duplicate error code
# also see pymysql/constants/ER.py#L65
session.rollback()
user = session.query(models.User).filter_by(username=username)
user = user.filter_by(project_id=project_id).one()
return user
else:
raise
return user
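
For illustration only (this is what the craton-dbsync bootstrap command shown earlier drives), the helpers above could be called directly; the database URI is an assumed example value.

DB_URI = 'mysql+pymysql://craton:craton@localhost/craton'  # assumed example URI

project = create_bootstrap_project('bootstrap', db_uri=DB_URI)
user = create_bootstrap_user(project.id, 'bootstrap', db_uri=DB_URI)
print(project.id, user.username, user.api_key)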

View File

@ -1,553 +0,0 @@
"""Models inventory, as defined using SQLAlchemy ORM
Craton uses the following related aspects of inventory:
* Device inventory, with devices further organized by region,
cell, and labels. Variables are associated with all of these
entities, with the ability to override via resolution and to track
with blaming. This in turn forms the foundation of an *inventory
fabric*, which is implemented above this level.
* Workflows are run against this inventory, taking into account the
variable configuration, as well as any specifics baked into the
workflow itself.
"""
from collections import ChainMap, deque, OrderedDict
import itertools
from oslo_db.sqlalchemy import models
from sqlalchemy import (
Boolean, Column, ForeignKey, Integer, String, Text, UniqueConstraint, JSON)
from sqlalchemy.ext.associationproxy import association_proxy
from sqlalchemy.ext.declarative import declarative_base, declared_attr
from sqlalchemy.ext.declarative.api import _declarative_constructor
from sqlalchemy.orm import backref, object_mapper, relationship, validates
from sqlalchemy.orm.collections import attribute_mapped_collection
from sqlalchemy_utils.types.ip_address import IPAddressType
from sqlalchemy_utils.types.uuid import UUIDType
from craton import exceptions
from craton.db.api import Blame
# TODO(jimbaker) set up table args for a given database/storage
# engine, as configured. See
# https://github.com/rackerlabs/craton/issues/19
class CratonBase(models.ModelBase, models.TimestampMixin):
def __repr__(self):
mapper = object_mapper(self)
cols = getattr(self, '_repr_columns', mapper.primary_key)
items = [(p.key, getattr(self, p.key))
for p in [
mapper.get_property_by_column(c) for c in cols]]
return "{0}({1})".format(
self.__class__.__name__,
', '.join(['{0}={1!r}'.format(*item) for item in items]))
def _variable_mixin_aware_constructor(self, **kwargs):
# The standard default for the underlying relationship for
# variables sets it to None, which means it cannot directly be
# used as a mappable collection. Work around that by defaulting
# to an empty mapping instead.
if isinstance(self, VariableMixin):
kwargs.setdefault('variables', {})
return _declarative_constructor(self, **kwargs)
Base = declarative_base(
cls=CratonBase, constructor=_variable_mixin_aware_constructor)
class VariableAssociation(Base):
"""Associates a collection of Variable key-value objects
with a particular parent.
"""
__tablename__ = "variable_association"
id = Column(Integer, primary_key=True)
discriminator = Column(String(50), nullable=False)
"""Refers to the type of parent, such as 'cell' or 'device'"""
variables = relationship(
'Variable',
collection_class=attribute_mapped_collection('key'),
back_populates='association',
cascade='all, delete-orphan', lazy='joined',
)
def _variable_creator(key, value):
# Necessary to create a single key/value setting, even once
# the corresponding variable association has been set up.
return Variable(key=key, value=value)
values = association_proxy(
'variables', 'value', creator=_variable_creator)
__mapper_args__ = {
'polymorphic_on': discriminator,
}
class Variable(Base):
"""The Variable class.
This represents all variable records in a single table.
"""
__tablename__ = 'variables'
association_id = Column(
Integer,
ForeignKey(VariableAssociation.id,
name='fk_variables_variable_association'),
primary_key=True)
# Use "key_", "value_" to avoid the use of reserved keywords in
# MySQL. This difference in naming is only visible in the use of
# raw SQL.
key = Column('key_', String(255), primary_key=True)
value = Column('value_', JSON)
association = relationship(
VariableAssociation, back_populates='variables',
)
parent = association_proxy('association', 'parent')
def __repr__(self):
return '%s(key=%r, value=%r)' % \
(self.__class__.__name__, self.key, self.value)
# The VariableMixin mixin is adapted from this example code:
# http://docs.sqlalchemy.org/en/latest/_modules/examples/generic_associations/discriminator_on_association.html
# This blog post goes into more details about the underlying modeling:
# http://techspot.zzzeek.org/2007/05/29/polymorphic-associations-with-sqlalchemy/
class VariableMixin(object):
"""VariableMixin mixin, creates a relationship to
the variable_association table for each parent.
"""
@declared_attr
def variable_association_id(cls):
return Column(
Integer,
ForeignKey(VariableAssociation.id,
name='fk_%ss_variable_association' %
cls.__name__.lower()))
@declared_attr
def variable_association(cls):
name = cls.__name__
discriminator = name.lower()
# Defines a polymorphic class to distinguish variables stored
# for regions, cells, etc.
cls.variable_assoc_cls = assoc_cls = type(
"%sVariableAssociation" % name,
(VariableAssociation,),
{
'__tablename__': None, # because mapping into a shared table
'__mapper_args__': {
'polymorphic_identity': discriminator
}
})
def _assoc_creator(kv):
assoc = assoc_cls()
for key, value in kv.items():
assoc.variables[key] = Variable(key=key, value=value)
return assoc
cls._variables = association_proxy(
'variable_association', 'variables', creator=_assoc_creator)
# Using a composite associative proxy here enables returning the
# underlying values for a given key, as opposed to the
# Variable object; we need both.
cls.variables = association_proxy(
'variable_association', 'values', creator=_assoc_creator)
def with_characteristic(self, key, value):
return self._variables.any(key=key, value=value)
cls.with_characteristic = classmethod(with_characteristic)
rel = relationship(
assoc_cls,
collection_class=attribute_mapped_collection('key'),
cascade='all, delete-orphan', lazy='joined',
single_parent=True,
backref=backref('parent', uselist=False))
return rel
# For resolution ordering, the default is to just include
# self. Override as desired to implement other resolution policies.
@property
def resolution_order(self):
return [self]
@property
def resolution_order_variables(self):
return [obj.variables for obj in self.resolution_order]
@property
def resolved(self):
"""Provides a mapping that uses scope resolution for variables"""
return ChainMap(*self.resolution_order_variables)
def blame(self, keys=None):
"""Determines the sources of how variables have been set.
:param keys: keys to check sourcing, or all keys if None
Returns the (source, variable) in a named tuple; note that
variable contains certain audit/governance information
(created_at, modified_at).
TODO(jimbaker) further extend schema on mixed-in variable tables
to capture additional governance, such as user who set the key;
this will then transparently become available in the blame.
"""
if keys is None:
keys = self.resolved.keys()
blamed = {}
for key in keys:
for source in self.resolution_order:
try:
blamed[key] = Blame(source, source._variables[key])
break
except KeyError:
pass
return blamed
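To make the resolution and blame machinery concrete, here is a hedged, in-memory sketch using the model classes defined below; no database session is involved, and the names and values are invented for illustration:
project = Project(name='demo', variables={'dns': '8.8.8.8'})
cloud = Cloud(name='cloud-1', project=project)
region = Region(name='region-1', project=project, cloud=cloud,
                variables={'tz': 'UTC'})
host = Host(name='host-1', project=project, cloud=cloud, region=region,
            ip_address='192.168.1.10', device_type='server',
            variables={'tz': 'US/Central'})
# The child-most value wins; anything missing falls through the chain.
assert host.resolved['tz'] == 'US/Central'
assert host.resolved['dns'] == '8.8.8.8'
# blame() reports which level actually supplied each key.
source, variable = host.blame(['dns'])['dns']
assert source is project and variable.value == '8.8.8.8'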
class Project(Base, VariableMixin):
"""Supports multitenancy for all other schema elements."""
__tablename__ = 'projects'
id = Column(UUIDType(binary=False), primary_key=True)
name = Column(String(255))
_repr_columns = [id, name]
# TODO(jimbaker) we will surely need to define more columns, but
# this suffices to define multitenancy for MVP
# one-to-many relationship with the following objects
clouds = relationship('Cloud', back_populates='project')
regions = relationship('Region', back_populates='project')
cells = relationship('Cell', back_populates='project')
devices = relationship('Device', back_populates='project')
users = relationship('User', back_populates='project')
networks = relationship('Network', back_populates='project')
interfaces = relationship('NetworkInterface', back_populates='project')
class User(Base, VariableMixin):
__tablename__ = 'users'
__table_args__ = (
UniqueConstraint("username", "project_id",
name="uq_user0username0project"),
)
id = Column(Integer, primary_key=True)
project_id = Column(
UUIDType(binary=False), ForeignKey('projects.id'), index=True,
nullable=False)
username = Column(String(255))
api_key = Column(String(36))
# root = craton admin that can create other projects/users
is_root = Column(Boolean, default=False)
# admin = project context admin
is_admin = Column(Boolean, default=False)
_repr_columns = [id, username]
project = relationship('Project', back_populates='users')
class Cloud(Base, VariableMixin):
__tablename__ = 'clouds'
__table_args__ = (
UniqueConstraint("project_id", "name",
name="uq_cloud0projectid0name"),
)
id = Column(Integer, primary_key=True)
project_id = Column(
UUIDType(binary=False), ForeignKey('projects.id'), index=True,
nullable=False)
name = Column(String(255))
note = Column(Text)
_repr_columns = [id, name]
project = relationship('Project', back_populates='clouds')
regions = relationship('Region', back_populates='cloud')
cells = relationship('Cell', back_populates='cloud')
devices = relationship('Device', back_populates='cloud')
networks = relationship('Network', back_populates='cloud')
class Region(Base, VariableMixin):
__tablename__ = 'regions'
__table_args__ = (
UniqueConstraint("cloud_id", "name",
name="uq_region0cloudid0name"),
)
id = Column(Integer, primary_key=True)
project_id = Column(
UUIDType(binary=False), ForeignKey('projects.id'), index=True,
nullable=False)
cloud_id = Column(
Integer, ForeignKey('clouds.id'), index=True, nullable=False)
name = Column(String(255))
note = Column(Text)
_repr_columns = [id, name]
project = relationship('Project', back_populates='regions')
cloud = relationship('Cloud', back_populates='regions')
cells = relationship('Cell', back_populates='region')
devices = relationship('Device', back_populates='region')
networks = relationship('Network', back_populates='region')
@property
def resolution_order(self):
return list(itertools.chain(
[self],
[self.cloud],
[self.project]))
class Cell(Base, VariableMixin):
__tablename__ = 'cells'
__table_args__ = (
UniqueConstraint("region_id", "name",
name="uq_cell0regionid0name"),
)
id = Column(Integer, primary_key=True)
region_id = Column(
Integer, ForeignKey('regions.id'), index=True, nullable=False)
cloud_id = Column(
Integer, ForeignKey('clouds.id'), index=True, nullable=False)
project_id = Column(
UUIDType(binary=False), ForeignKey('projects.id'), index=True,
nullable=False)
name = Column(String(255))
note = Column(Text)
_repr_columns = [id, name]
project = relationship('Project', back_populates='cells')
cloud = relationship('Cloud', back_populates='cells')
region = relationship('Region', back_populates='cells')
devices = relationship('Device', back_populates='cell')
networks = relationship('Network', back_populates='cell')
@property
def resolution_order(self):
return list(itertools.chain(
[self],
[self.region],
[self.cloud],
[self.project]))
class Device(Base, VariableMixin):
"""Base class for all devices."""
__tablename__ = 'devices'
__table_args__ = (
UniqueConstraint("region_id", "name",
name="uq_device0regionid0name"),
)
id = Column(Integer, primary_key=True)
type = Column(String(50)) # discriminant for joined table inheritance
name = Column(String(255), nullable=False)
cloud_id = Column(
Integer, ForeignKey('clouds.id'), index=True, nullable=False)
region_id = Column(
Integer, ForeignKey('regions.id'), index=True, nullable=False)
cell_id = Column(
Integer, ForeignKey('cells.id'), index=True, nullable=True)
project_id = Column(
UUIDType(binary=False), ForeignKey('projects.id'), index=True,
nullable=False)
parent_id = Column(Integer, ForeignKey('devices.id'))
ip_address = Column(IPAddressType, nullable=False)
device_type = Column(String(255), nullable=False)
# TODO(jimbaker) generalize `note` for supporting governance
active = Column(Boolean, default=True)
note = Column(Text)
_repr_columns = [id, name]
project = relationship('Project', back_populates='devices')
cloud = relationship('Cloud', back_populates='devices')
region = relationship('Region', back_populates='devices')
cell = relationship('Cell', back_populates='devices')
related_labels = relationship(
'Label', back_populates='device', collection_class=set,
cascade='all, delete-orphan', lazy='joined')
labels = association_proxy('related_labels', 'label')
interfaces = relationship('NetworkInterface', back_populates='device')
children = relationship(
'Device', backref=backref('parent', remote_side=[id]))
@validates("parent_id")
def validate_parent_id(self, _, parent_id):
if parent_id is None:
return parent_id
elif parent_id == self.id:
msg = (
"A device cannot be its own parent. The id for '{name}'"
" cannot be used as its parent_id."
).format(name=self.name)
raise exceptions.ParentIDError(msg)
elif parent_id in (descendant.id for descendant in self.descendants):
msg = (
"A device cannot have a descendant as its parent. The"
" parent_id for '{name}' cannot be set to the id '{bad_id}'."
).format(name=self.name, bad_id=parent_id)
raise exceptions.ParentIDError(msg)
else:
return parent_id
@property
def ancestors(self):
lineage = []
ancestor = self.parent
while ancestor:
lineage.append(ancestor)
ancestor = ancestor.parent
return lineage
@property
def descendants(self):
marked = OrderedDict()
descent = deque(self.children)
while descent:
descendant = descent.popleft()
marked[descendant] = True
descent.extend(
child for child in descendant.children if child not in marked
)
return list(marked.keys())
@property
def resolution_order(self):
return list(itertools.chain(
[self],
self.ancestors,
[self.cell] if self.cell else [],
[self.region],
[self.cloud],
[self.project]))
__mapper_args__ = {
'polymorphic_on': type,
'polymorphic_identity': 'devices',
'with_polymorphic': '*'
}
class Host(Device):
__tablename__ = 'hosts'
id = Column(Integer, ForeignKey('devices.id'), primary_key=True)
hostname = Device.name
__mapper_args__ = {
'polymorphic_identity': 'hosts',
'inherit_condition': (id == Device.id)
}
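A short hedged sketch of the parent/child bookkeeping, again on plain in-memory instances with invented names:
hypervisor = Host(name='hv-1', ip_address='10.0.0.1', device_type='server')
guest = Host(name='vm-1', ip_address='10.0.0.2', device_type='server',
             parent=hypervisor)
assert guest.ancestors == [hypervisor]
assert hypervisor.descendants == [guest]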
class NetworkInterface(Base, VariableMixin):
__tablename__ = 'network_interfaces'
__table_args__ = (
UniqueConstraint("device_id", "name",
name="uq_netinter0deviceid0name"),
)
id = Column(Integer, primary_key=True)
name = Column(String(255), nullable=True)
interface_type = Column(String(255), nullable=True)
vlan_id = Column(Integer, nullable=True)
port = Column(Integer, nullable=True)
vlan = Column(String(255), nullable=True)
duplex = Column(String(255), nullable=True)
speed = Column(String(255), nullable=True)
link = Column(String(255), nullable=True)
cdp = Column(String(255), nullable=True)
security = Column(String(255), nullable=True)
ip_address = Column(IPAddressType, nullable=False)
project_id = Column(UUIDType(binary=False), ForeignKey('projects.id'),
index=True, nullable=False)
device_id = Column(Integer, ForeignKey('devices.id'))
network_id = Column(Integer, ForeignKey('networks.id'), nullable=True)
project = relationship(
'Project', back_populates='interfaces', cascade='all', lazy='joined')
network = relationship(
'Network', back_populates='interfaces', cascade='all', lazy='joined')
device = relationship(
'Device', back_populates='interfaces', cascade='all', lazy='joined')
class Network(Base, VariableMixin):
__tablename__ = 'networks'
__table_args__ = (
UniqueConstraint("name", "project_id", "region_id",
name="uq_name0projectid0regionid"),
)
id = Column(Integer, primary_key=True)
name = Column(String(255), nullable=True)
cidr = Column(String(255), nullable=True)
gateway = Column(String(255), nullable=True)
netmask = Column(String(255), nullable=True)
ip_block_type = Column(String(255), nullable=True)
nss = Column(String(255), nullable=True)
cloud_id = Column(
Integer, ForeignKey('clouds.id'), index=True, nullable=False)
region_id = Column(
Integer, ForeignKey('regions.id'), index=True, nullable=False)
cell_id = Column(
Integer, ForeignKey('cells.id'), index=True, nullable=True)
project_id = Column(
UUIDType(binary=False), ForeignKey('projects.id'), index=True,
nullable=False)
project = relationship('Project', back_populates='networks')
cloud = relationship('Cloud', back_populates='networks')
region = relationship('Region', back_populates='networks')
cell = relationship('Cell', back_populates='networks')
interfaces = relationship('NetworkInterface', back_populates='network')
class NetworkDevice(Device):
__tablename__ = 'network_devices'
id = Column(Integer, ForeignKey('devices.id'), primary_key=True)
hostname = Device.name
# network device specific properties
model_name = Column(String(255), nullable=True)
os_version = Column(String(255), nullable=True)
vlans = Column(JSON)
__mapper_args__ = {
'polymorphic_identity': 'network_devices',
'inherit_condition': (id == Device.id)
}
class Label(Base):
"""Models arbitrary labeling (aka tagging) of devices."""
__tablename__ = 'labels'
device_id = Column(
Integer,
ForeignKey(Device.id, name='fk_labels_devices'),
primary_key=True)
label = Column(String(255), primary_key=True)
_repr_columns = [device_id, label]
def __init__(self, label):
self.label = label
device = relationship("Device", back_populates="related_labels")
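Finally, a hedged sketch of labeling an in-memory host; the label text is invented:
host = Host(name='host-1', ip_address='10.0.0.5', device_type='server')
host.labels.add('compute')   # the association proxy wraps this in Label('compute')
assert 'compute' in host.labels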

View File

@ -1,100 +0,0 @@
"""Exceptions for Craton Inventory system."""
from oslo_log import log as logging
LOG = logging.getLogger(__name__)
class Base(Exception):
"""Base Exception for Craton Inventory."""
code = 500
message = "An unknown exception occurred"
def __str__(self):
return self.message
def __init__(self, code=None, message=None, **kwargs):
if code:
self.code = code
if not message:
try:
message = self.msg % kwargs
except Exception:
LOG.exception('Error in formatting exception message')
message = self.msg
self.message = message
super(Base, self).__init__(
'%s: %s' % (self.code, self.message))
class DuplicateCloud(Base):
code = 409
msg = "A cloud with the given name already exists."
class DuplicateRegion(Base):
code = 409
msg = "A region with the given name already exists."
class DuplicateCell(Base):
code = 409
msg = "A cell with the given name already exists."
class DuplicateDevice(Base):
code = 409
msg = "A device with the given name already exists."
class DuplicateNetwork(Base):
code = 409
msg = "Network with the given name already exists in this region."
class NetworkNotFound(Base):
code = 404
msg = "Network not found for ID %(id)s."
class DeviceNotFound(Base):
code = 404
msg = "%(device_type)s device not found for ID %(id)s."
class AuthenticationError(Base):
code = 401
msg = "The request could not be authenticated."
class AdminRequired(Base):
code = 401
msg = "This action requires the 'admin' role"
class BadRequest(Base):
code = 400
class InvalidJSONPath(BadRequest):
msg = "The query contains an invalid JSON Path expression."
class InvalidJSONValue(BadRequest):
msg = "An invalid JSON value was specified."
class NotFound(Base):
code = 404
msg = "Not Found"
class UnknownException(Base):
code = 500
class ParentIDError(ValueError):
pass
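A hedged sketch of how the Base formatting above behaves for these subclasses; the id value is invented:
err = DeviceNotFound(device_type='Host', id=42)
assert err.code == 404
assert str(err) == 'Host device not found for ID 42.'
dup = DuplicateRegion()
assert dup.code == 409
assert str(dup) == 'A region with the given name already exists.'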

View File

@ -1,30 +0,0 @@
import mock
import testtools
from oslo_middleware import base
class TestContext(base.Middleware):
def __init__(self, auth_token=None, user=None, tenant=None,
is_admin=False, is_admin_project=False):
self.auth_token = auth_token
self.user = user
self.tenant = tenant
self.is_admin = is_admin
self.is_admin_project = is_admin_project
def make_context(*args, **kwargs):
return TestContext(*args, **kwargs)
class TestCase(testtools.TestCase):
def setUp(self):
super(TestCase, self).setUp()
self.addCleanup(mock.patch.stopall)
self.context = make_context(auth_token='fake-token',
user='fake-user',
tenant='fake-tenant',
is_admin=True,
is_admin_project=True)

View File

@ -1,493 +0,0 @@
import contextlib
import copy
import json
import threading
import docker
from oslo_log import log as logging
from oslo_utils import uuidutils
import requests
from retrying import retry
from sqlalchemy import create_engine
from sqlalchemy import MetaData
from sqlalchemy.orm import sessionmaker
import testtools
from craton.db.sqlalchemy import models
LOG = logging.getLogger(__name__)
FAKE_DATA_GEN_USERNAME = 'demo'
FAKE_DATA_GEN_TOKEN = 'demo'
FAKE_DATA_GEN_PROJECT_ID = 'b9f10eca66ac4c279c139d01e65f96b4'
FAKE_DATA_GEN_BOOTSTRAP_USERNAME = 'bootstrap'
FAKE_DATA_GEN_BOOTSTRAP_TOKEN = 'bootstrap'
HEADER_TOKEN = 'X-Auth-Token'
HEADER_USERNAME = 'X-Auth-User'
HEADER_PROJECT = 'X-Auth-Project'
def get_root_headers():
return {
HEADER_USERNAME: FAKE_DATA_GEN_BOOTSTRAP_USERNAME,
HEADER_TOKEN: FAKE_DATA_GEN_BOOTSTRAP_TOKEN
}
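For orientation, a hedged sketch of how these constants are used to talk to a running container; the address and port are assumptions, not taken from this file:
session = requests.Session()
session.headers[HEADER_PROJECT] = FAKE_DATA_GEN_PROJECT_ID
session.headers[HEADER_USERNAME] = FAKE_DATA_GEN_USERNAME
session.headers[HEADER_TOKEN] = FAKE_DATA_GEN_TOKEN
resp = session.get('http://127.0.0.1:7780/v1/hosts')   # assumed endpoint address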
class DockerSetup(threading.Thread):
def __init__(self):
self.container = None
self.container_is_ready = threading.Event()
self.error = None
self.client = None
self.repo_dir = './'
super(DockerSetup, self).__init__()
def run(self):
"""Build a docker container from the given Dockerfile and start
the container in a separate thread."""
try:
self.client = docker.Client(version='auto')
is_ok = self.client.ping()
if is_ok != 'OK':
msg = 'Docker daemon ping failed.'
self.error = msg
LOG.error(self.error)
self.container_is_ready.set()
return
except Exception as err:
self.error = err
LOG.error(self.error)
self.container_is_ready.set()
return
# Create Docker image for Craton
build_output = self.client.build(path=self.repo_dir,
tag='craton-functional-testing-api',
dockerfile='Dockerfile',
pull=True,
forcerm=True)
LOG.debug(build_output)
output_last_line = ""
for output_last_line in build_output:
pass
message = output_last_line.decode("utf-8")
if "Successfully built" not in message:
msg = 'Failed to build docker image.'
self.error = msg
self.container_is_ready.set()
return
# create and start the container
container_tag = 'craton-functional-testing-api'
self.container = self.client.create_container(container_tag)
self.client.start(self.container)
self.container_data = self.client.inspect_container(self.container)
if self.container_data['State']['Status'] != 'running':
msg = 'Container is not running.'
self.error = msg
self.container_is_ready.set()
return
self.container_is_ready.set()
def stop(self):
"""Stop a running container."""
if self.container is not None:
self.client.stop(self.container, timeout=30)
def remove(self):
"""Remove/Delete a stopped container."""
if self.container is not None:
self.client.remove_container(self.container)
def remove_image(self):
"""Remove the image we created."""
if self.client:
self.client.remove_image('craton-functional-testing-api')
@retry(wait_fixed=1000, stop_max_attempt_number=20)
def ensure_running_endpoint(container_data):
service_ip = container_data['NetworkSettings']['IPAddress']
url = 'http://{}:7780/v1'.format(service_ip)
headers = {"Content-Type": "application/json"}
requests.get(url, headers=headers)
_container = None
def setup_container():
global _container
_container = DockerSetup()
_container.daemon = True
_container.start()
_container.container_is_ready.wait()
if _container.error:
teardown_container()
else:
try:
ensure_running_endpoint(_container.container_data)
except Exception:
msg = 'Error waiting for the API endpoint to become available.'
_container.error = msg
teardown_container()
def teardown_container():
if _container:
_container.stop()
_container.remove()
_container.remove_image()
def setUpModule():
setup_container()
def tearDownModule():
teardown_container()
def setup_database(container_ip):
mysqldb = "mysql+pymysql://craton:craton@{}/craton".format(container_ip)
engine = create_engine(mysqldb)
meta = MetaData()
meta.reflect(engine)
# NOTE(sulo, jimbaker): First clean the db up for tests, and do
# our own bootstrapping to isolate all tests from any external
# scripts.
with contextlib.closing(engine.connect()) as conn:
transaction = conn.begin()
conn.execute("SET foreign_key_checks = 0")
for table in reversed(meta.sorted_tables):
conn.execute(table.delete())
conn.execute("SET foreign_key_checks = 1")
transaction.commit()
# NOTE(sulo, jimbaker): now bootstrap user and project; using the
# SA model allows us to respect the additional constraints in the
# model, vs having to duplicate logic if working against the
# database directly.
Session = sessionmaker(bind=engine)
session = Session()
project = models.Project(
name=FAKE_DATA_GEN_USERNAME,
id=FAKE_DATA_GEN_PROJECT_ID)
bootstrap_user = models.User(
project=project,
username=FAKE_DATA_GEN_BOOTSTRAP_USERNAME,
api_key=FAKE_DATA_GEN_BOOTSTRAP_TOKEN,
is_admin=True,
is_root=True)
demo_user = models.User(
project=project,
username=FAKE_DATA_GEN_USERNAME,
api_key=FAKE_DATA_GEN_TOKEN,
is_admin=True)
session.add(project)
session.add(bootstrap_user)
session.add(demo_user)
# NOTE(jimbaker) simple assumption: either this commit succeeds,
# or we need to fail fast - there's no recovery allowed in this
# testing setup.
session.commit()
class TestCase(testtools.TestCase):
def setUp(self):
"""Base setup provides container data back individual tests."""
super(TestCase, self).setUp()
self.container_setup_error = _container.error
self.session = requests.Session()
if not self.container_setup_error:
data = _container.container_data
self.service_ip = data['NetworkSettings']['IPAddress']
self.url = 'http://{}:7780'.format(self.service_ip)
self.session.headers[HEADER_PROJECT] = FAKE_DATA_GEN_PROJECT_ID
self.session.headers[HEADER_USERNAME] = FAKE_DATA_GEN_USERNAME
self.session.headers[HEADER_TOKEN] = FAKE_DATA_GEN_TOKEN
self.root_headers = copy.deepcopy(self.session.headers)
self.root_headers.update(get_root_headers())
setup_database(self.service_ip)
def tearDown(self):
super(TestCase, self).tearDown()
def assertSuccessOk(self, response):
self.assertEqual(requests.codes.OK, response.status_code)
def assertSuccessCreated(self, response):
self.assertEqual(requests.codes.CREATED, response.status_code)
def assertNoContent(self, response):
self.assertEqual(requests.codes.NO_CONTENT, response.status_code)
def assertBadRequest(self, response):
self.assertEqual(requests.codes.BAD_REQUEST, response.status_code)
def assertJSON(self, response):
if response.text:
try:
data = json.loads(response.text)
except json.JSONDecodeError:
self.fail("Response data is not JSON.")
else:
reference = "{formatted_data}\n".format(
formatted_data=json.dumps(
data, indent=2, sort_keys=True, separators=(',', ': ')
)
)
self.assertEqual(
reference,
response.text
)
def assertFailureFormat(self, response):
if response.status_code >= 400:
body = response.json()
self.assertEqual(2, len(body))
self.assertEqual(response.status_code, body["status"])
self.assertIn("message", body)
def get(self, url, headers=None, **params):
resp = self.session.get(
url, verify=False, headers=headers, params=params,
)
self.assertJSON(resp)
self.assertFailureFormat(resp)
return resp
def post(self, url, headers=None, data=None):
resp = self.session.post(
url, verify=False, headers=headers, json=data,
)
self.assertJSON(resp)
self.assertFailureFormat(resp)
return resp
def put(self, url, headers=None, data=None):
resp = self.session.put(
url, verify=False, headers=headers, json=data,
)
self.assertJSON(resp)
self.assertFailureFormat(resp)
return resp
def delete(self, url, headers=None, body=None):
resp = self.session.delete(
url, verify=False, headers=headers, json=body,
)
self.assertJSON(resp)
self.assertFailureFormat(resp)
return resp
def create_project(self, name, variables=None):
url = self.url + '/v1/projects'
payload = {'name': name}
if variables:
payload['variables'] = variables
response = self.post(url, headers=self.root_headers, data=payload)
self.assertEqual(201, response.status_code)
self.assertIn('Location', response.headers)
project = response.json()
self.assertTrue(uuidutils.is_uuid_like(project['id']))
self.assertEqual(
response.headers['Location'],
"{}/{}".format(url, project['id'])
)
return project
def create_cloud(self, name, variables=None):
url = self.url + '/v1/clouds'
values = {'name': name}
if variables:
values['variables'] = variables
resp = self.post(url, data=values)
self.assertSuccessCreated(resp)
self.assertIn('Location', resp.headers)
json = resp.json()
self.assertEqual(
resp.headers['Location'],
"{}/{}".format(url, json['id'])
)
return json
def delete_clouds(self, clouds):
base_url = self.url + '/v1/clouds/{}'
for cloud in clouds:
url = base_url.format(cloud['id'])
resp = self.delete(url)
self.assertNoContent(resp)
def create_region(self, name, cloud, variables=None):
url = self.url + '/v1/regions'
values = {'name': name, 'cloud_id': cloud['id']}
if variables:
values['variables'] = variables
resp = self.post(url, data=values)
self.assertSuccessCreated(resp)
self.assertIn('Location', resp.headers)
json = resp.json()
self.assertEqual(
resp.headers['Location'],
"{}/{}".format(url, json['id'])
)
return json
def delete_regions(self, regions):
base_url = self.url + '/v1/regions/{}'
for region in regions:
url = base_url.format(region['id'])
resp = self.delete(url)
self.assertNoContent(resp)
def create_cell(self, name, cloud, region, variables=None):
url = self.url + '/v1/cells'
payload = {'name': name, 'region_id': region['id'],
'cloud_id': cloud['id']}
if variables:
payload['variables'] = variables
cell = self.post(url, data=payload)
self.assertEqual(201, cell.status_code)
self.assertIn('Location', cell.headers)
self.assertEqual(
cell.headers['Location'],
"{}/{}".format(url, cell.json()['id'])
)
return cell.json()
def create_network(
self, name, cloud, region, cidr, gateway, netmask, variables=None
):
url = self.url + '/v1/networks'
payload = {
'name': name,
'cidr': cidr,
'gateway': gateway,
'netmask': netmask,
'region_id': region['id'],
'cloud_id': cloud['id'],
}
if variables:
payload['variables'] = variables
network = self.post(url, data=payload)
self.assertEqual(201, network.status_code)
self.assertIn('Location', network.headers)
self.assertEqual(
network.headers['Location'],
"{}/{}".format(url, network.json()['id'])
)
return network.json()
def create_host(self, name, cloud, region, hosttype, ip_address,
parent_id=None, **variables):
url = self.url + '/v1/hosts'
payload = {
'name': name,
'device_type': hosttype,
'ip_address': ip_address,
'region_id': region['id'],
'cloud_id': cloud['id']
}
if parent_id:
payload['parent_id'] = parent_id
if variables:
payload['variables'] = variables
host = self.post(url, data=payload)
self.assertEqual(201, host.status_code)
self.assertIn('Location', host.headers)
self.assertEqual(
host.headers['Location'],
"{}/{}".format(url, host.json()['id'])
)
return host.json()
def create_network_device(
self, name, cloud, region, device_type, ip_address, parent_id=None,
**variables
):
url = self.url + '/v1/network-devices'
payload = {
'name': name,
'device_type': device_type,
'ip_address': ip_address,
'region_id': region['id'],
'cloud_id': cloud['id'],
}
if parent_id:
payload['parent_id'] = parent_id
if variables:
payload['variables'] = variables
network_device = self.post(url, data=payload)
self.assertEqual(201, network_device.status_code)
self.assertIn('Location', network_device.headers)
self.assertEqual(
network_device.headers['Location'],
"{}/{}".format(url, network_device.json()['id'])
)
return network_device.json()
class DeviceTestBase(TestCase):
def setUp(self):
super(DeviceTestBase, self).setUp()
self.cloud = self.create_cloud()
self.region = self.create_region()
def create_cloud(self, name='cloud-1'):
return super(DeviceTestBase, self).create_cloud(name=name)
def create_region(self, name='region-1', cloud=None, variables=None):
return super(DeviceTestBase, self).create_region(
name=name,
cloud=cloud if cloud else self.cloud,
variables=variables,
)
def create_network_device(self, name, device_type, ip_address, region=None,
cloud=None, parent_id=None, **variables):
return super(DeviceTestBase, self).create_network_device(
name=name,
cloud=cloud if cloud else self.cloud,
region=region if region else self.region,
device_type=device_type,
ip_address=ip_address,
parent_id=parent_id,
**variables
)
def create_host(self, name, hosttype, ip_address, region=None, cloud=None,
parent_id=None, **variables):
return super(DeviceTestBase, self).create_host(
name=name,
cloud=cloud if cloud else self.cloud,
region=region if region else self.region,
hosttype=hosttype,
ip_address=ip_address,
parent_id=parent_id,
**variables
)

View File

@ -1,216 +0,0 @@
from craton.tests.functional.test_variable_calls import \
APIV1ResourceWithVariablesTestCase
class APIV1CellTest(APIV1ResourceWithVariablesTestCase):
resource = 'cells'
def setUp(self):
super(APIV1CellTest, self).setUp()
self.cloud = self.create_cloud()
self.region = self.create_region()
def tearDown(self):
super(APIV1CellTest, self).tearDown()
def create_cloud(self):
return super(APIV1CellTest, self).create_cloud(name='cloud-1')
def create_region(self):
return super(APIV1CellTest, self).create_region(
name='region-1',
cloud=self.cloud,
variables={"region": "one"},
)
def create_cell(self, name, variables=None):
return super(APIV1CellTest, self).create_cell(
name=name,
cloud=self.cloud,
region=self.region,
variables=variables
)
def test_cell_create_with_variables(self):
variables = {'a': 'b'}
cell = self.create_cell('cell-a', variables=variables)
self.assertEqual('cell-a', cell['name'])
self.assertEqual(variables, cell['variables'])
def test_create_cell_supports_vars_ops(self):
cell = self.create_cell('new-cell', {'a': 'b'})
self.assert_vars_get_expected(cell['id'], {'a': 'b'})
self.assert_vars_can_be_set(cell['id'])
self.assert_vars_can_be_deleted(cell['id'])
def test_cell_create_with_no_name_fails(self):
url = self.url + '/v1/cells'
payload = {'region_id': self.region['id']}
cell = self.post(url, data=payload)
self.assertEqual(400, cell.status_code)
def test_cell_create_with_duplicate_name_fails(self):
self.create_cell('test-cell')
url = self.url + '/v1/cells'
payload = {'name': 'test-cell', 'region_id': self.region['id'],
"cloud_id": self.cloud['id']}
cell = self.post(url, data=payload)
self.assertEqual(409, cell.status_code)
def test_cell_create_with_extra_id_property_fails(self):
url = self.url + '/v1/cells'
payload = {'region_id': self.region['id'],
'cloud_id': self.cloud['id'], 'name': 'a', 'id': 3}
cell = self.post(url, data=payload)
self.assertEqual(400, cell.status_code)
msg = (
"The request included the following errors:\n"
"- Additional properties are not allowed ('id' was unexpected)"
)
self.assertEqual(cell.json()['message'], msg)
def test_cell_create_with_extra_created_at_property_fails(self):
url = self.url + '/v1/cells'
payload = {'region_id': self.region['id'],
'cloud_id': self.cloud['id'], 'name': 'a',
'created_at': "some date"}
cell = self.post(url, data=payload)
self.assertEqual(400, cell.status_code)
msg = (
"The request included the following errors:\n"
"- Additional properties are not allowed "
"('created_at' was unexpected)"
)
self.assertEqual(cell.json()['message'], msg)
def test_cell_create_with_extra_updated_at_property_fails(self):
url = self.url + '/v1/cells'
payload = {'region_id': self.region['id'],
'cloud_id': self.cloud['id'], 'name': 'a',
'updated_at': "some date"}
cell = self.post(url, data=payload)
self.assertEqual(400, cell.status_code)
msg = (
"The request included the following errors:\n"
"- Additional properties are not allowed "
"('updated_at' was unexpected)"
)
self.assertEqual(cell.json()['message'], msg)
def test_cell_create_missing_all_properties_fails(self):
url = self.url + '/v1/cells'
cell = self.post(url, data={})
self.assertEqual(400, cell.status_code)
msg = (
"The request included the following errors:\n"
"- 'cloud_id' is a required property\n"
"- 'name' is a required property\n"
"- 'region_id' is a required property"
)
self.assertEqual(cell.json()['message'], msg)
def test_cells_get_all_with_details(self):
self.create_cell('cell1', variables={'a': 'b'})
self.create_cell('cell2', variables={'c': 'd'})
url = self.url + '/v1/cells?details=all'
resp = self.get(url)
cells = resp.json()['cells']
self.assertEqual(2, len(cells))
for cell in cells:
self.assertTrue('variables' in cell)
for cell in cells:
if cell['name'] == 'cell1':
expected = {'a': 'b', "region": "one"}
self.assertEqual(expected, cell['variables'])
if cell['name'] == 'cell2':
expected = {'c': 'd', "region": "one"}
self.assertEqual(expected, cell['variables'])
def test_cells_get_all_for_region(self):
# Create a cell first
self.create_cell('cell-1')
url = self.url + '/v1/cells?region_id={}'.format(self.region['id'])
resp = self.get(url)
cells = resp.json()['cells']
self.assertEqual(1, len(cells))
self.assertEqual(['cell-1'], [i['name'] for i in cells])
def test_cells_get_all_for_cloud(self):
# Create cells first
for i in range(2):
self.create_cell('cell-{}'.format(str(i)))
url = self.url + '/v1/cells?cloud_id={}'.format(self.cloud['id'])
resp = self.get(url)
cells = resp.json()['cells']
self.assertEqual(2, len(cells))
self.assertEqual(['cell-0', 'cell-1'], [i['name'] for i in cells])
def test_cell_get_all_with_name_filter(self):
self.create_cell('cell1')
self.create_cell('cell2')
url = self.url + '/v1/cells?name=cell2'
resp = self.get(url)
cells = resp.json()['cells']
self.assertEqual(1, len(cells))
self.assertEqual({'cell2'}, {cell['name'] for cell in cells})
def test_get_cell_details(self):
cellvars = {"who": "that"}
cell = self.create_cell('cell1', variables=cellvars)
url = self.url + '/v1/cells/{}'.format(cell['id'])
resp = self.get(url)
cell_with_detail = resp.json()
self.assertEqual('cell1', cell_with_detail['name'])
def test_get_cell_resolved_vars(self):
cellvars = {"who": "that"}
cell = self.create_cell('cell1', variables=cellvars)
url = self.url + '/v1/cells/{}'.format(cell['id'])
resp = self.get(url)
cell_with_detail = resp.json()
self.assertEqual('cell1', cell_with_detail['name'])
self.assertEqual({"who": "that", "region": "one"},
cell_with_detail['variables'])
def test_get_cell_unresolved_vars(self):
cellvars = {"who": "that"}
cell = self.create_cell('cell1', variables=cellvars)
cell_id = cell['id']
url = self.url + '/v1/cells/{}?resolved-values=false'.format(cell_id)
resp = self.get(url)
cell_with_detail = resp.json()
self.assertEqual('cell1', cell_with_detail['name'])
self.assertEqual({"who": "that"}, cell_with_detail['variables'])
def test_cell_update(self):
cell = self.create_cell('cell-1')
url = self.url + '/v1/cells/{}'.format(cell['id'])
data = {'note': 'Updated cell note.'}
resp = self.put(url, data=data)
self.assertEqual(200, resp.status_code)
cell = resp.json()
self.assertEqual(data['note'], cell['note'])
def test_cell_delete(self):
cell1 = self.create_cell('cell-1')
self.create_cell('cell-2')
url = self.url + '/v1/cells'
resp = self.get(url)
self.assertEqual(200, resp.status_code)
cells = resp.json()['cells']
self.assertEqual(2, len(cells))
self.assertEqual({'cell-1', 'cell-2'},
{cell['name'] for cell in cells})
delurl = self.url + '/v1/cells/{}'.format(cell1['id'])
resp = self.delete(delurl)
self.assertEqual(204, resp.status_code)
resp = self.get(url)
self.assertEqual(200, resp.status_code)
cells = resp.json()['cells']
self.assertEqual(1, len(cells))
self.assertEqual({'cell-2'},
{cell['name'] for cell in cells})

View File

@ -1,204 +0,0 @@
import urllib.parse
from craton.tests.functional import TestCase
class APIV1CloudTest(TestCase):
"""Test cases for /cloud calls.
"""
def test_create_cloud_full_data(self):
# Test with full set of allowed parameters
values = {"name": "cloud-new",
"note": "This is cloud-new.",
"variables": {"a": "b"}}
url = self.url + '/v1/clouds'
resp = self.post(url, data=values)
self.assertEqual(201, resp.status_code)
self.assertIn('Location', resp.headers)
self.assertEqual(
resp.headers['Location'],
"{}/{}".format(url, resp.json()['id'])
)
self.assertEqual(values['name'], resp.json()['name'])
def test_create_cloud_without_variables(self):
values = {"name": "cloud-two",
"note": "This is cloud-two"}
url = self.url + '/v1/clouds'
resp = self.post(url, data=values)
self.assertEqual(201, resp.status_code)
self.assertIn('Location', resp.headers)
self.assertEqual(
resp.headers['Location'],
"{}/{}".format(url, resp.json()['id'])
)
self.assertEqual("cloud-two", resp.json()['name'])
def test_create_cloud_with_no_name_fails(self):
values = {"note": "This is cloud one."}
url = self.url + '/v1/clouds'
resp = self.post(url, data=values)
self.assertEqual(resp.status_code, 400)
err_msg = (
"The request included the following errors:\n"
"- 'name' is a required property"
)
self.assertEqual(resp.json()['message'], err_msg)
def test_create_cloud_with_duplicate_name_fails(self):
self.create_cloud("ORD135")
values = {"name": "ORD135"}
url = self.url + '/v1/clouds'
resp = self.post(url, data=values)
self.assertEqual(409, resp.status_code)
def test_create_cloud_with_extra_id_property_fails(self):
values = {"name": "test", "id": 101}
url = self.url + '/v1/clouds'
resp = self.post(url, data=values)
self.assertEqual(resp.status_code, 400)
msg = (
"The request included the following errors:\n"
"- Additional properties are not allowed ('id' was unexpected)"
)
self.assertEqual(resp.json()['message'], msg)
def test_create_cloud_with_extra_created_at_property_fails(self):
values = {"name": "test", "created_at": "some date"}
url = self.url + '/v1/clouds'
resp = self.post(url, data=values)
self.assertEqual(resp.status_code, 400)
msg = (
"The request included the following errors:\n"
"- Additional properties are not allowed "
"('created_at' was unexpected)"
)
self.assertEqual(resp.json()['message'], msg)
def test_create_cloud_with_extra_updated_at_property_fails(self):
values = {"name": "test", "updated_at": "some date"}
url = self.url + '/v1/clouds'
resp = self.post(url, data=values)
self.assertEqual(resp.status_code, 400)
msg = (
"The request included the following errors:\n"
"- Additional properties are not allowed "
"('updated_at' was unexpected)"
)
self.assertEqual(resp.json()['message'], msg)
def test_cloud_create_missing_all_properties_fails(self):
url = self.url + '/v1/clouds'
cloud = self.post(url, data={})
self.assertEqual(400, cloud.status_code)
msg = (
"The request included the following errors:\n"
"- 'name' is a required property"
)
self.assertEqual(cloud.json()['message'], msg)
def test_clouds_get_all(self):
self.create_cloud("ORD1")
self.create_cloud("ORD2")
url = self.url + '/v1/clouds'
resp = self.get(url)
self.assertEqual(200, resp.status_code)
self.assertEqual(2, len(resp.json()))
def test_clouds_get_all_with_details_filter(self):
c1 = self.create_cloud("ORD1", variables={'a': 'b'})
c2 = self.create_cloud("ORD2", variables={'c': 'd'})
url = self.url + '/v1/clouds?details=all'
resp = self.get(url)
self.assertEqual(200, resp.status_code)
clouds = resp.json()['clouds']
self.assertEqual(2, len(clouds))
for cloud in clouds:
self.assertTrue('variables' in cloud)
for cloud in clouds:
if cloud['name'] == 'ORD1':
self.assertEqual(c1['variables'], {'a': 'b'})
if cloud['name'] == 'ORD2':
self.assertEqual(c2['variables'], {'c': 'd'})
def test_clouds_get_all_with_name_filter(self):
self.create_cloud("ORD1")
self.create_cloud("ORD2")
url = self.url + '/v1/clouds?name=ORD1'
resp = self.get(url)
self.assertEqual(200, resp.status_code)
clouds = resp.json()['clouds']
self.assertEqual(1, len(clouds))
self.assertEqual('ORD1', clouds[0]['name'])
def test_cloud_with_non_existing_filters(self):
self.create_cloud("ORD1")
url = self.url + '/v1/clouds?name=idontexist'
resp = self.get(url)
self.assertEqual(404, resp.status_code)
def test_cloud_get_details_for_cloud(self):
regvars = {"a": "b", "one": "two"}
cloud = self.create_cloud("ORD1", variables=regvars)
url = self.url + '/v1/clouds/{}'.format(cloud['id'])
resp = self.get(url)
cloud = resp.json()
self.assertEqual(cloud['name'], 'ORD1')
self.assertEqual(regvars, cloud['variables'])
class TestPagination(TestCase):
def setUp(self):
super(TestPagination, self).setUp()
self.clouds = [self.create_cloud('cloud-{}'.format(i))
for i in range(0, 61)]
self.addCleanup(self.delete_clouds, self.clouds)
def test_list_first_thirty_clouds(self):
url = self.url + '/v1/clouds'
response = self.get(url)
self.assertSuccessOk(response)
json = response.json()
self.assertIn('clouds', json)
self.assertEqual(30, len(json['clouds']))
self.assertListEqual([r['id'] for r in self.clouds[:30]],
[r['id'] for r in json['clouds']])
def test_get_returns_correct_next_link(self):
url = self.url + '/v1/clouds'
thirtieth_cloud = self.clouds[29]
response = self.get(url)
self.assertSuccessOk(response)
json = response.json()
self.assertIn('links', json)
for link_rel in json['links']:
if link_rel['rel'] == 'next':
break
else:
self.fail("No 'next' link was returned in response")
parsed_next = urllib.parse.urlparse(link_rel['href'])
self.assertIn('marker={}'.format(thirtieth_cloud['id']),
parsed_next.query)
def test_get_returns_correct_prev_link(self):
first_cloud = self.clouds[0]
thirtieth_cloud = self.clouds[29]
url = self.url + '/v1/clouds?marker={}'.format(thirtieth_cloud['id'])
response = self.get(url)
self.assertSuccessOk(response)
json = response.json()
self.assertIn('links', json)
for link_rel in json['links']:
if link_rel['rel'] == 'prev':
break
else:
self.fail("No 'prev' link was returned in response")
parsed_prev = urllib.parse.urlparse(link_rel['href'])
self.assertIn('marker={}'.format(first_cloud['id']),
parsed_prev.query)
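As a hedged summary of the pagination contract these tests rely on, the response shape (inferred from the assertions above, not copied from the API) looks roughly like:
page = {
    'clouds': ['... at most 30 cloud objects ...'],
    'links': [
        {'rel': 'next', 'href': '.../v1/clouds?marker=<id of the 30th cloud>'},
        {'rel': 'prev', 'href': '.../v1/clouds?marker=<id of the 1st cloud>'},
    ],
}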

View File

@ -1,190 +0,0 @@
from itertools import count, cycle
import urllib.parse
from craton.tests.functional import DeviceTestBase
class DeviceTests(DeviceTestBase):
def count_devices(self, devices):
num_devices = (
len(devices['hosts']) +
len(devices['network-devices'])
)
return num_devices
class APIV1DeviceTest(DeviceTests):
def setUp(self):
super().setUp()
self.net_device1 = self.create_network_device(
'network_device1', 'switch', '192.168.1.1'
)
self.net_device2 = self.create_network_device(
'network_device2', 'switch', '192.168.1.2',
parent_id=self.net_device1['id'],
)
self.host1 = self.create_host(
'host1', 'server', '192.168.1.3', parent_id=self.net_device2['id']
)
self.container1 = self.create_host(
'host1container1', 'container', '192.168.1.4',
parent_id=self.host1['id'],
)
def test_device_get_all(self):
url = self.url + '/v1/devices'
resp = self.get(url)
self.assertEqual(200, resp.status_code)
devices = resp.json()['devices']
self.assertEqual(4, self.count_devices(devices))
def test_device_get_by_parent_id_no_descendants(self):
url = '{}/v1/devices?parent_id={}'.format(
self.url, self.net_device1['id']
)
resp = self.get(url)
self.assertEqual(200, resp.status_code)
devices = resp.json()['devices']
self.assertEqual(1, self.count_devices(devices))
self.assertEqual(
self.net_device1['id'], devices['network-devices'][0]['parent_id']
)
def test_device_get_by_parent_id_with_descendants(self):
url = '{}/v1/devices?parent_id={}&descendants=true'.format(
self.url, self.net_device1['id']
)
resp = self.get(url)
self.assertEqual(200, resp.status_code)
devices = resp.json()['devices']
self.assertEqual(3, self.count_devices(devices))
self.assertEqual(
self.net_device1['id'], devices['network-devices'][0]['parent_id']
)
self.assertEqual(
self.net_device2['id'], devices['hosts'][0]['parent_id']
)
self.assertEqual(self.host1['id'], devices['hosts'][1]['parent_id'])
def test_device_by_missing_filter(self):
url = self.url + '/v1/devices?active=false'
resp = self.get(url)
self.assertEqual(200, resp.status_code)
devices = resp.json()['devices']
self.assertEqual(0, self.count_devices(devices))
class TestPagination(DeviceTests):
def setUp(self):
super().setUp()
self.devices = []
last_octet = count(1)
first_network_device = self.create_network_device(
'network-device0',
'switch',
'192.168.1.{}'.format(next(last_octet)),
)
self.devices.append(first_network_device)
for i in range(1, 3):
network_device = self.create_network_device(
'network-device{}'.format(i),
'switch',
'192.168.1.{}'.format(next(last_octet)),
)
self.devices.append(network_device)
host_parents = (
self.devices[1],
self.devices[2],
)
for i, host_parent in zip(range(12), cycle(host_parents)):
host = self.create_host(
'host{}'.format(i),
'server',
'192.168.1.{}'.format(next(last_octet)),
parent_id=host_parent['id'],
)
self.devices.append(host)
for j in range(4):
container = self.create_host(
'host{}container{}'.format(i, j),
'container',
'192.168.1.{}'.format(next(last_octet)),
parent_id=host['id'],
)
self.devices.append(container)
def test_get_returns_a_default_list_of_thirty_devices(self):
response = self.get(self.url + '/v1/devices')
self.assertSuccessOk(response)
devices = response.json()
self.assertIn('devices', devices)
self.assertEqual(30, self.count_devices(devices['devices']))
returned_device_ids = sorted(
device['id']
for dt in devices['devices'].values()
for device in dt
)
self.assertListEqual(
[d['id'] for d in self.devices[:30]],
returned_device_ids
)
def test_get_returns_correct_next_link(self):
thirtieth_device = self.devices[29]
response = self.get(self.url + '/v1/devices')
self.assertSuccessOk(response)
devices = response.json()
self.assertIn('links', devices)
for link_rel in devices['links']:
if link_rel['rel'] == 'next':
break
else:
self.fail("No 'next' link was returned in response")
parsed_next = urllib.parse.urlparse(link_rel['href'])
self.assertIn('marker={}'.format(thirtieth_device['id']),
parsed_next.query)
def test_get_returns_correct_prev_link(self):
first_device = self.devices[0]
thirtieth_device = self.devices[29]
url = self.url + '/v1/devices?marker={}'.format(thirtieth_device['id'])
response = self.get(url)
self.assertSuccessOk(response)
devices = response.json()
self.assertIn('links', devices)
for link_rel in devices['links']:
if link_rel['rel'] == 'prev':
break
else:
self.fail("No 'prev' link was returned in response")
parsed_prev = urllib.parse.urlparse(link_rel['href'])
self.assertIn(
'marker={}'.format(first_device['id']), parsed_prev.query
)
def test_ascending_sort_by_name(self):
response = self.get(self.url + '/v1/devices',
sort_keys='name', sort_dir='asc')
self.assertSuccessOk(response)
devices = response.json()['devices']
self.assertEqual(30, self.count_devices(devices))
def test_ascending_sort_by_name_and_id(self):
response = self.get(self.url + '/v1/devices',
sort_keys='name,id', sort_dir='asc')
self.assertSuccessOk(response)
devices = response.json()['devices']
self.assertEqual(30, self.count_devices(devices))
def test_ascending_sort_by_name_and_id_space_separated(self):
response = self.get(self.url + '/v1/devices',
sort_keys='name id', sort_dir='asc')
self.assertSuccessOk(response)
devices = response.json()['devices']
self.assertEqual(30, self.count_devices(devices))

View File

@ -1,552 +0,0 @@
import urllib.parse
from craton.tests.functional import DeviceTestBase
from craton.tests.functional.test_variable_calls import \
APIV1ResourceWithVariablesTestCase
class APIV1HostTest(DeviceTestBase, APIV1ResourceWithVariablesTestCase):
resource = 'hosts'
def test_create_host_supports_vars_ops(self):
host = self.create_host('host1', 'server', '192.168.1.1')
self.assert_vars_get_expected(host['id'], {})
self.assert_vars_can_be_set(host['id'])
self.assert_vars_can_be_deleted(host['id'])
def test_host_get_by_vars_filter(self):
vars1 = {"a": "b", "host": "one"}
self.create_host('host1', 'server', '192.168.1.1', **vars1)
vars2 = {"a": "b"}
self.create_host('host2', 'server', '192.168.1.2', **vars2)
url = self.url + '/v1/hosts'
resp = self.get(url, vars='a:"b"')
self.assertEqual(200, resp.status_code)
hosts = resp.json()['hosts']
self.assertEqual(2, len(hosts))
self.assertEqual({'192.168.1.1', '192.168.1.2'},
{host['ip_address'] for host in hosts})
url = self.url + '/v1/hosts'
resp = self.get(url, vars='host:"one"')
self.assertEqual(200, resp.status_code)
hosts = resp.json()['hosts']
self.assertEqual(1, len(hosts))
self.assertEqual('192.168.1.1', hosts[0]['ip_address'])
self.assert_vars_get_expected(hosts[0]['id'], vars1)
def test_create_host(self):
host = self.create_host('host1', 'server', '192.168.1.1')
self.assertEqual('host1', host['name'])
def test_create_with_missing_name_fails(self):
url = self.url + '/v1/hosts'
payload = {'device_type': 'server', 'ip_address': '192.168.1.1',
'region_id': self.region['id']}
host = self.post(url, data=payload)
self.assertEqual(400, host.status_code)
def test_create_with_missing_ip_fails(self):
url = self.url + '/v1/hosts'
payload = {'name': 'test', 'device_type': 'server',
'region_id': self.region['id']}
host = self.post(url, data=payload)
self.assertEqual(400, host.status_code)
def test_create_with_missing_type_fails(self):
url = self.url + '/v1/hosts'
payload = {'name': 'who', 'ip_address': '192.168.1.1',
'region_id': self.region['id']}
host = self.post(url, data=payload)
self.assertEqual(400, host.status_code)
def test_create_with_extra_id_property_fails(self):
url = self.url + '/v1/hosts'
payload = {'device_type': 'server', 'ip_address': '192.168.1.1',
'region_id': self.region['id'],
'cloud_id': self.cloud['id'], 'name': 'a', 'id': 1}
host = self.post(url, data=payload)
self.assertEqual(400, host.status_code)
msg = (
"The request included the following errors:\n"
"- Additional properties are not allowed ('id' was unexpected)"
)
self.assertEqual(host.json()['message'], msg)
def test_create_with_extra_created_at_property_fails(self):
url = self.url + '/v1/hosts'
payload = {'device_type': 'server', 'ip_address': '192.168.1.1',
'region_id': self.region['id'],
'cloud_id': self.cloud['id'], 'name': 'a',
'created_at': 'some date'}
host = self.post(url, data=payload)
self.assertEqual(400, host.status_code)
msg = (
"The request included the following errors:\n"
"- Additional properties are not allowed "
"('created_at' was unexpected)"
)
self.assertEqual(host.json()['message'], msg)
def test_create_with_extra_updated_at_property_fails(self):
url = self.url + '/v1/hosts'
payload = {'device_type': 'server', 'ip_address': '192.168.1.1',
'region_id': self.region['id'],
'cloud_id': self.cloud['id'], 'name': 'a',
'updated_at': 'some date'}
host = self.post(url, data=payload)
self.assertEqual(400, host.status_code)
msg = (
"The request included the following errors:\n"
"- Additional properties are not allowed "
"('updated_at' was unexpected)"
)
self.assertEqual(host.json()['message'], msg)
def test_create_missing_all_properties_fails(self):
url = self.url + '/v1/hosts'
host = self.post(url, data={})
self.assertEqual(400, host.status_code)
msg = (
"The request included the following errors:\n"
"- 'cloud_id' is a required property\n"
"- 'device_type' is a required property\n"
"- 'ip_address' is a required property\n"
"- 'name' is a required property\n"
"- 'region_id' is a required property"
)
self.assertEqual(host.json()['message'], msg)
def test_create_with_parent_id(self):
parent = self.create_host(
name='test1',
cloud=self.cloud,
region=self.region,
hosttype='server',
ip_address='192.168.1.1',
)
child = self.create_host(
name='test2',
cloud=self.cloud,
region=self.region,
hosttype='server',
ip_address='192.168.1.2',
parent_id=parent['id'],
)
self.assertEqual(parent['id'], child['parent_id'])
def test_update_with_parent_id(self):
parent = self.create_host(
name='test1',
cloud=self.cloud,
region=self.region,
hosttype='server',
ip_address='192.168.1.1',
)
child = self.create_host(
name='test2',
cloud=self.cloud,
region=self.region,
hosttype='server',
ip_address='192.168.1.2',
)
self.assertIsNone(child['parent_id'])
url = '{}/v1/hosts/{}'.format(self.url, child['id'])
child_update_resp = self.put(
url, data={'parent_id': parent['id']}
)
self.assertEqual(200, child_update_resp.status_code)
child_update = child_update_resp.json()
self.assertEqual(parent['id'], child_update['parent_id'])
def test_update_with_parent_id_equal_id_fails(self):
host = self.create_host(
name='test1',
cloud=self.cloud,
region=self.region,
hosttype='server',
ip_address='192.168.1.1',
)
url = '{}/v1/hosts/{}'.format(self.url, host['id'])
host_update_resp = self.put(
url, data={'parent_id': host['id']}
)
self.assertEqual(400, host_update_resp.status_code)
def test_update_with_parent_id_equal_descendant_id_fails(self):
parent = self.create_host(
name='test1',
cloud=self.cloud,
region=self.region,
hosttype='server',
ip_address='192.168.1.1',
)
self.assertIsNone(parent['parent_id'])
child = self.create_host(
name='test2',
cloud=self.cloud,
region=self.region,
hosttype='server',
ip_address='192.168.1.2',
parent_id=parent['id'],
)
self.assertEqual(parent['id'], child['parent_id'])
grandchild = self.create_host(
name='test3',
cloud=self.cloud,
region=self.region,
hosttype='server',
ip_address='192.168.1.3',
parent_id=child['id'],
)
self.assertEqual(child['id'], grandchild['parent_id'])
url = '{}/v1/hosts/{}'.format(self.url, parent['id'])
parent_update_resp = self.put(
url, data={'parent_id': grandchild['id']}
)
self.assertEqual(400, parent_update_resp.status_code)
def test_get_all_hosts_with_details(self):
region_vars = {'x': 'y'}
region = self.create_region(name='region1', variables=region_vars)
variables = {"a": "b"}
self.create_host('host1', 'server', '192.168.1.1', region=region,
**variables)
self.create_host('host2', 'server', '192.168.1.2', region=region,
**variables)
url = self.url + '/v1/hosts?details=all'
resp = self.get(url)
self.assertEqual(200, resp.status_code)
hosts = resp.json()['hosts']
self.assertEqual(2, len(hosts))
for host in hosts:
self.assertTrue('variables' in host)
self.assertEqual({'a': 'b', 'x': 'y'}, host['variables'])
def test_host_get_by_ip_filter(self):
self.create_host('host1', 'server', '192.168.1.1')
self.create_host('host2', 'server', '192.168.1.2')
url = self.url + '/v1/hosts?ip_address=192.168.1.1'
resp = self.get(url)
self.assertEqual(200, resp.status_code)
hosts = resp.json()['hosts']
self.assertEqual(1, len(hosts))
self.assertEqual('192.168.1.1', hosts[0]['ip_address'])
def test_host_by_missing_filter(self):
self.create_host('host1', 'server', '192.168.1.1')
url = self.url + '/v1/hosts?ip_address=192.168.1.2'
resp = self.get(url)
self.assertEqual(200, resp.status_code)
self.assertEqual(0, len(resp.json()['hosts']))
def test_host_create_labels(self):
res = self.create_host('host1', 'server', '192.168.1.1')
url = self.url + '/v1/hosts/{}/labels'.format(res['id'])
data = {"labels": ["compute"]}
resp = self.put(url, data=data)
self.assertEqual(200, resp.status_code)
resp = self.get(url)
self.assertEqual(data, resp.json())
def test_host_by_label_filter_match_one(self):
labels_route_mask = '/v1/hosts/{}/labels'
host1 = self.create_host('host1', 'server', '192.168.1.1')
host2 = self.create_host('host2', 'server', '192.168.1.2')
host3 = self.create_host('host3', 'server', '192.168.1.3')
# set labels on hosts
data = {"labels": ["compute"]}
for host in (host1, host2, host3):
url = self.url + labels_route_mask.format(host['id'])
resp = self.put(url, data=data)
self.assertEqual(200, resp.status_code)
# set one of them with extra labels
data = {"labels": ["compute", "scheduler"]}
url = self.url + labels_route_mask.format(host3['id'])
resp = self.put(url, data=data)
self.assertEqual(200, resp.status_code)
# get hosts by its label
url = self.url + '/v1/hosts?label=scheduler'
resp = self.get(url)
self.assertEqual(200, resp.status_code)
hosts = resp.json()['hosts']
self.assertEqual(1, len(hosts))
self.assertEqual(host3['id'], hosts[0]['id'])
def test_host_by_label_filters_match_all(self):
labels_route_mask = '/v1/hosts/{}/labels'
host1 = self.create_host('host1', 'server', '192.168.1.1')
host2 = self.create_host('host2', 'server', '192.168.1.2')
host3 = self.create_host('host3', 'server', '192.168.1.3')
# set labels on hosts
data = {"labels": ["compute"]}
for host in (host1, host2, host3):
url = self.url + labels_route_mask.format(host['id'])
resp = self.put(url, data=data)
self.assertEqual(200, resp.status_code)
# set one of them with extra labels
data = {"labels": ["compute", "scheduler"]}
url = self.url + labels_route_mask.format(host2['id'])
resp = self.put(url, data=data)
self.assertEqual(200, resp.status_code)
# get hosts matching both labels
url = self.url + '/v1/hosts?label=scheduler&label=compute'
resp = self.get(url)
self.assertEqual(200, resp.status_code)
hosts = resp.json()['hosts']
self.assertEqual(1, len(hosts))
self.assertEqual(host2['id'], hosts[0]['id'])
def test_host_by_label_filters_match_one_common(self):
labels_route_mask = '/v1/hosts/{}/labels'
test_hosts = [
self.create_host('host1', 'server', '192.168.1.1'),
self.create_host('host2', 'server', '192.168.1.2'),
self.create_host('host3', 'server', '192.168.1.3'),
]
# set labels on hosts
data = {"labels": ["compute"]}
for host in test_hosts:
url = self.url + labels_route_mask.format(host['id'])
resp = self.put(url, data=data)
self.assertEqual(200, resp.status_code)
# set one of them with extra labels
data = {"labels": ["compute", "scheduler"]}
url = self.url + labels_route_mask.format(test_hosts[1]['id'])
resp = self.put(url, data=data)
self.assertEqual(200, resp.status_code)
# get hosts by the shared 'compute' label
url = self.url + '/v1/hosts?label=compute'
resp = self.get(url)
self.assertEqual(200, resp.status_code)
hosts = resp.json()['hosts']
self.assertEqual(3, len(hosts))
self.assertEqual(sorted([host['id'] for host in test_hosts]),
sorted([host['id'] for host in hosts]))
def test_host_get_all_vars_filter_resolved_region(self):
region_vars = {'foo': 'bar'}
region = self.create_region(name='region-2', variables=region_vars)
host_vars = {'baz': 'zoo'}
self.create_host('host1', 'server', '192.168.1.1', **host_vars)
host2 = self.create_host('host2', 'server', '192.168.1.2',
region=region, **host_vars)
url = self.url + '/v1/hosts'
resp = self.get(url, vars='foo:"bar",baz:"zoo"')
hosts = resp.json()['hosts']
self.assertEqual(1, len(hosts))
self.assertEqual(host2['id'], hosts[0]['id'])
def test_host_get_all_vars_filter_resolved_region_and_host(self):
region_vars = {'foo': 'bar'}
region = self.create_region(name='region-2', variables=region_vars)
host_vars = {'baz': 'zoo'}
host1 = self.create_host('host1', 'server', '192.168.1.1',
**region_vars)
host2 = self.create_host('host2', 'server', '192.168.1.2',
region=region, **host_vars)
url = self.url + '/v1/hosts'
resp = self.get(url, vars='foo:"bar"')
hosts = resp.json()['hosts']
self.assertEqual(2, len(hosts))
self.assertListEqual(sorted([host1['id'], host2['id']]),
sorted([host['id'] for host in hosts]))
def test_host_get_all_vars_filter_resolved_region_child_override(self):
region_vars = {'foo': 'bar'}
region = self.create_region(name='region-2', variables=region_vars)
host1 = self.create_host('host1', 'server', '192.168.1.1',
region=region, foo='baz')
host2 = self.create_host('host2', 'server', '192.168.1.2',
region=region)
url = self.url + '/v1/hosts'
resp = self.get(url, vars='foo:"baz"')
hosts = resp.json()['hosts']
self.assertEqual(1, len(hosts))
self.assertEqual(host1['id'], hosts[0]['id'])
resp = self.get(url, vars='foo:"bar"')
hosts = resp.json()['hosts']
self.assertEqual(1, len(hosts))
self.assertEqual(host2['id'], hosts[0]['id'])
def test_host_get_all_vars_filter_resolved_host_child_override(self):
host1 = self.create_host('host1', 'server', '192.168.1.1',
baz='zoo')
host2 = self.create_host('host2', 'server', '192.168.1.2',
parent_id=host1['id'], baz='boo')
url = self.url + '/v1/hosts'
resp = self.get(url, vars='baz:"zoo"')
hosts = resp.json()['hosts']
self.assertEqual(1, len(hosts))
self.assertEqual(host1['id'], hosts[0]['id'])
resp = self.get(url, vars='baz:"boo"')
hosts = resp.json()['hosts']
self.assertEqual(1, len(hosts))
self.assertEqual(host2['id'], hosts[0]['id'])
def test_host_get_all_vars_filter_unresolved(self):
host1 = self.create_host('host1', 'server', '192.168.1.1',
foo='bar', baz='zoo')
self.create_host('host2', 'server', '192.168.1.2', foo='bar')
# NOTE(thomasem): Unfortunately, we use resolved-values instead of
# resolved_values, so we can't pass this in as kwargs to self.get(...),
# see https://bugs.launchpad.net/craton/+bug/1672880.
url = self.url + \
'/v1/hosts?resolved-values=false&vars=foo:"bar",baz:"zoo"'
resp = self.get(url)
hosts = resp.json()['hosts']
self.assertEqual(1, len(hosts))
self.assertEqual(host1['id'], hosts[0]['id'])
def test_host_delete(self):
host = self.create_host('host1', 'server', '192.168.1.1')
url = self.url + '/v1/hosts/{}'.format(host['id'])
resp = self.delete(url)
self.assertEqual(204, resp.status_code)
resp = self.get(url)
self.assertEqual(404, resp.status_code)
self.assertEqual({'status': 404, 'message': 'Not Found'},
resp.json())
class TestPagination(DeviceTestBase):
def setUp(self):
super(TestPagination, self).setUp()
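# Create 61 hosts so the default page size of 30 yields two full pages plus a final page of one host.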
self.hosts = [
self.create_host('host{}'.format(i), 'server',
'192.168.1.{}'.format(i + 1))
for i in range(0, 61)
]
def test_get_returns_a_default_list_of_thirty_hosts(self):
response = self.get(self.url + '/v1/hosts')
self.assertSuccessOk(response)
hosts = response.json()
self.assertIn('hosts', hosts)
self.assertEqual(30, len(hosts['hosts']))
self.assertListEqual([h['id'] for h in self.hosts[:30]],
[h['id'] for h in hosts['hosts']])
def test_get_returns_correct_next_link(self):
thirtieth_host = self.hosts[29]
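# The 'next' link should carry a marker equal to the id of the last host on the first page (the 30th created).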
response = self.get(self.url + '/v1/hosts')
self.assertSuccessOk(response)
hosts = response.json()
self.assertIn('links', hosts)
for link_rel in hosts['links']:
if link_rel['rel'] == 'next':
break
else:
self.fail("No 'next' link was returned in response")
parsed_next = urllib.parse.urlparse(link_rel['href'])
self.assertIn('marker={}'.format(thirtieth_host['id']),
parsed_next.query)
def test_get_returns_correct_prev_link(self):
first_host = self.hosts[0]
thirtieth_host = self.hosts[29]
url = self.url + '/v1/hosts?marker={}'.format(thirtieth_host['id'])
response = self.get(url)
self.assertSuccessOk(response)
hosts = response.json()
self.assertIn('links', hosts)
for link_rel in hosts['links']:
if link_rel['rel'] == 'prev':
break
else:
self.fail("No 'prev' link was returned in response")
parsed_prev = urllib.parse.urlparse(link_rel['href'])
self.assertIn('marker={}'.format(first_host['id']), parsed_prev.query)
def test_get_all_for_region(self):
region = self.create_region('region-2')
self.create_host('host1', 'server', '192.168.1.1', region=region)
self.create_host('host2', 'server', '192.168.1.2', region=region)
url = self.url + '/v1/hosts?region_id={}'.format(region['id'])
resp = self.get(url)
self.assertSuccessOk(resp)
hosts = resp.json()
self.assertEqual(2, len(hosts['hosts']))
def test_get_all_for_cloud(self):
cloud = self.create_cloud('cloud-2')
region = self.create_region(cloud=cloud)
self.create_host('host1', 'server', '192.168.1.1', cloud=cloud,
region=region)
self.create_host('host2', 'server', '192.168.1.2', cloud=cloud,
region=region)
url = self.url + '/v1/hosts?cloud_id={}'.format(cloud['id'])
resp = self.get(url)
self.assertSuccessOk(resp)
hosts = resp.json()['hosts']
self.assertEqual(2, len(hosts))
self.assertEqual(['host1', 'host2'], [h['name'] for h in hosts])
def test_ascending_sort_by_name(self):
response = self.get(self.url + '/v1/hosts',
sort_keys='name', sort_dir='asc')
self.assertSuccessOk(response)
hosts = response.json()['hosts']
self.assertEqual(30, len(hosts))
def test_ascending_sort_by_name_and_id(self):
response = self.get(self.url + '/v1/hosts',
sort_keys='name,id', sort_dir='asc')
self.assertSuccessOk(response)
hosts = response.json()['hosts']
self.assertEqual(30, len(hosts))
def test_ascending_sort_by_name_and_id_space_separated(self):
response = self.get(self.url + '/v1/hosts',
sort_keys='name id', sort_dir='asc')
self.assertSuccessOk(response)
hosts = response.json()['hosts']
self.assertEqual(30, len(hosts))
def test_follows_next_link(self):
url = self.url + '/v1/hosts'
response = self.get(url)
self.assertSuccessOk(response)
json = response.json()
hosts = json['hosts']
while hosts:
for link in json['links']:
if link['rel'] == 'next':
break
else:
break
response = self.get(link['href'])
self.assertSuccessOk(response)
json = response.json()
hosts = json['hosts']

View File

@ -1,550 +0,0 @@
from craton import exceptions
from craton.tests import functional
TEST_STRING = "I'm just a string"
TEST_ARRAY = [
1,
23.4,
True,
False,
'false',
TEST_STRING,
{
'bumbleywump': 'cucumberpatch',
'literal_boolean': 'true'
},
['sub', 'array', True]
]
TEST_DICT = {
'foo': {
'nested_string': 'Bumbleywump Cucumberpatch',
'nested_bool': True,
'nested_null': None,
'nested_int': 1,
'nested_float': 3.14,
'nested_boolstr': 'false',
'hyphenated-key': 'look-at-all-these-hyphens!',
},
'bar': TEST_ARRAY,
'baz': 'zoo'
}
def _get_variables_for(name):
return {
'{}_dict'.format(name): TEST_DICT,
'{}_array'.format(name): TEST_ARRAY,
'{}_string'.format(name): TEST_STRING,
}
class JSONPathResolvedSearchTestCase(functional.TestCase):
def setUp(self):
super(JSONPathResolvedSearchTestCase, self).setUp()
self.cloud = self.create_cloud(
name='cloud1',
variables=_get_variables_for('cloud1'),
)
self.region = self.create_region(
name='region1',
cloud=self.cloud,
variables=_get_variables_for('region1'),
)
self.cell = self.create_cell(
name='cell1',
cloud=self.cloud,
region=self.region,
variables=_get_variables_for('cell1')
)
self.switches = []
for i in range(2):
name = 'netdev{}'.format(str(i))
self.switches.append(self.create_network_device(
name=name,
cloud=self.cloud,
region=self.region,
cell=self.cell,
device_type='switch',
ip_address='192.168.{}.1'.format(i),
**_get_variables_for(name)
))
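# Hosts created below are parented to switches[i % 2], so each switch ends up with exactly three child hosts.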
self.hosts = []
for i in range(len(self.switches) * 3):
name = 'host{}'.format(i)
self.hosts.append(self.create_host(
name=name,
cloud=self.cloud,
region=self.region,
cell=self.cell,
hosttype='server',
ip_address='192.168.{}.2'.format(i),
parent_id=self.switches[i % len(self.switches)]['id'],
**_get_variables_for(name)
))
def test_jsonpath_search_device_parent(self):
url = self.url + '/v1/hosts'
queries = [
'netdev1_dict.foo."hyphenated-key":"look-at-all-these-hyphens!"',
]
expected_names = ['host1', 'host3', 'host5']
resp = self.get(url, vars=','.join(queries))
hosts = resp.json()['hosts']
parent_ids = set([h['parent_id'] for h in hosts])
self.assertEqual(3, len(hosts))
self.assertEqual(1, len(parent_ids))
self.assertEqual(self.switches[1]['id'], parent_ids.pop())
self.assertListEqual(
sorted(expected_names),
sorted([h['name'] for h in hosts])
)
def test_jsonpath_search_device_parent_override(self):
url = self.url + '/v1/hosts'
queries = [
'netdev1_dict.foo."hyphenated-key":"look-at-all-these-hyphens!"',
]
variables_put = {
'netdev1_dict': {
'foo': {
'hyphenated-key': 'look-at-all-these-hyphens'
}
}
}
self.put('{}/{}/variables'.format(url, self.hosts[3]['id']),
data=variables_put)
resp = self.get(url, vars=','.join(queries))
hosts = resp.json()['hosts']
parent_ids = set([h['parent_id'] for h in hosts])
self.assertEqual(2, len(hosts))
self.assertEqual(1, len(parent_ids))
self.assertEqual(self.switches[1]['id'], parent_ids.pop())
def test_jsonpath_search_device_child_vars_included(self):
url = self.url + '/v1/hosts'
queries = [
'netdev1_dict.foo."hyphenated-key":"look-at-all-these-hyphens!"',
]
modified_id = self.hosts[0]['id']
variables_put = {
'netdev1_dict': {
'foo': {
'hyphenated-key': 'look-at-all-these-hyphens!'
}
}
}
self.put('{}/{}/variables'.format(url, modified_id),
data=variables_put)
expected_names = ['host0', 'host1', 'host3', 'host5']
resp = self.get(url, vars=','.join(queries))
hosts = resp.json()['hosts']
self.assertEqual(4, len(hosts))
self.assertListEqual(
sorted(expected_names),
sorted([h['name'] for h in hosts])
)
def test_jsonpath_search_device_conjunctive_parent_vars(self):
url = self.url + '/v1/hosts'
queries = [
'netdev1_dict.foo."hyphenated-key":"look-at-all-these-hyphens!"',
'region1_array[2]:true',
'cloud1_dict.bar[3]:false',
]
resp = self.get(url, vars=','.join(queries))
hosts = resp.json()['hosts']
parent_ids = set([h['parent_id'] for h in hosts])
self.assertEqual(3, len(hosts))
self.assertEqual(1, len(parent_ids))
self.assertEqual(self.switches[1]['id'], parent_ids.pop())
class JSONPathSearchTestCaseMixin(object):
resource = '<resource>'
def get_resource_url(self):
return '{}/v1/{}'.format(self.url, self.resource)
def setup_projects(self, projects):
created = []
for name, variables in projects:
created.append(self.create_project(
name=name,
variables=variables
))
return created
def setup_clouds(self, clouds):
created = []
for name, variables in clouds:
created.append(self.create_cloud(
name=name,
variables=variables
))
return created
def setup_regions(self, regions):
created = []
cloud = self.create_cloud(name='cloud1')
for name, variables in regions:
created.append(self.create_region(
name=name,
cloud=cloud,
variables=variables
))
return created
def setup_cells(self, cells):
created = []
cloud = self.create_cloud(name='cloud1')
region = self.create_region(
name='region1',
cloud=cloud
)
for name, variables in cells:
created.append(self.create_cell(
name=name,
cloud=cloud,
region=region,
variables=variables
))
return created
def setup_networks(self, networks):
created = []
cloud = self.create_cloud(name='cloud1')
region = self.create_region(
name='region1',
cloud=cloud
)
for name, variables in networks:
created.append(self.create_network(
name=name,
cloud=cloud,
region=region,
cidr='192.168.0.0/24',
gateway='192.168.0.1',
netmask='255.255.255.0',
variables=variables
))
return created
def setup_network_devices(self, network_devices):
created = []
cloud = self.create_cloud(name='cloud1')
region = self.create_region(
name='region1',
cloud=cloud
)
for name, variables in network_devices:
created.append(self.create_network_device(
name=name,
cloud=cloud,
region=region,
device_type='switch',
ip_address='192.168.0.1',
**variables
))
return created
def setup_hosts(self, hosts):
created = []
cloud = self.create_cloud(name='cloud1')
region = self.create_region(
name='region1',
cloud=cloud
)
for name, variables in hosts:
created.append(self.create_host(
name=name,
cloud=cloud,
region=region,
hosttype='server',
ip_address='192.168.0.1',
**variables
))
return created
def setup_resources(self, resources):
setup_fn = {
"projects": self.setup_projects,
"clouds": self.setup_clouds,
"regions": self.setup_regions,
"cells": self.setup_cells,
"networks": self.setup_networks,
"network-devices": self.setup_network_devices,
"hosts": self.setup_hosts,
}
return setup_fn[self.resource](resources)
def resources_from_response(self, resp):
return resp.json()[self.resource.replace('-', '_')]
def get_resources(self, **params):
headers = None
if self.resource in ('projects',):
headers = self.root_headers
resp = self.get(self.get_resource_url(), headers=headers,
details='all', **params)
return resp
def test_jsonpath_search_nested_string(self):
resources = (
('resource1', {'foo': TEST_DICT}),
('resource2', {'foo': {'baz': 'nope'}})
)
created = self.setup_resources(resources)
found = self.resources_from_response(self.get_resources(
vars='foo.foo.nested_string:"Bumbleywump Cucumberpatch"'))
self.assertEqual(1, len(found))
self.assertEqual(created[0]['id'], found[0]['id'])
self.assertEqual(created[0]['variables'], found[0]['variables'])
def test_jsonpath_search_nested_string_wildcard(self):
resources = (
('resource1', {'foo': TEST_DICT}),
('resource2', {'foo': {"baz": "zoom"}})
)
created = self.setup_resources(resources)
found = self.resources_from_response(
self.get_resources(vars='foo.*:"zoo"'))
self.assertEqual(1, len(found))
self.assertEqual(created[0]['id'], found[0]['id'])
self.assertEqual(created[0]['variables'], found[0]['variables'])
def test_jsonpath_search_array_string(self):
resources = (
('resource1', {'foo': TEST_ARRAY}),
('resource2', {'foo': TEST_ARRAY}),
('resource3', {'foo': ["I'm just a string", 1, 2, 3, 4, 'foo']}),
)
created = self.setup_resources(resources)
found = self.resources_from_response(
self.get_resources(vars='foo[5]:"I\'m just a string"'))
self.assertEqual(2, len(found))
self.assertListEqual(sorted([c['id'] for c in created[:2]]),
sorted([f['id'] for f in found]))
def test_jsonpath_search_array_string_wildcard(self):
resources = (
('resource1', {'foo': TEST_ARRAY}),
('resource2', {'foo': TEST_ARRAY}),
('resource3', {'foo': ["I'm just a string", True]}),
('resource4', {'foo': ['Bumbleywump Cucumberpatch']}),
)
created = self.setup_resources(resources)
found = self.resources_from_response(
self.get_resources(vars='foo[*]:"I\'m just a string"'))
self.assertEqual(3, len(found))
self.assertListEqual(sorted([c['id'] for c in created[:3]]),
sorted([f['id'] for f in found]))
def test_jsonpath_search_nested_array_string(self):
resources = (
('resource1', {'foo': TEST_DICT}),
('resource2', {'foo': TEST_DICT}),
('resource3', {'foo': {"bar": ["I'm just a string", True]}}),
('resource4', {'foo': TEST_ARRAY}),
)
created = self.setup_resources(resources)
found = self.resources_from_response(
self.get_resources(vars='foo.bar[*]:"I\'m just a string"'))
self.assertEqual(3, len(found))
self.assertListEqual(sorted([c['id'] for c in created[:3]]),
sorted([f['id'] for f in found]))
def test_jsonpath_search_nested_int(self):
resources = (
('resource1', {'foo': TEST_DICT}),
('resource2', {'foo': {"foo": {"nested_int": "1"}}})
)
created = self.setup_resources(resources)
found = self.resources_from_response(
self.get_resources(vars='foo.foo.nested_int:1'))
self.assertEqual(1, len(found))
self.assertEqual(created[0]['id'], found[0]['id'])
self.assertEqual(created[0]['variables'], found[0]['variables'])
def test_jsonpath_search_nested_float(self):
resources = (
('resource1', {'foo': TEST_DICT}),
('resource2', {'foo': {"foo": {"nested_float": 3}}})
)
created = self.setup_resources(resources)
found = self.resources_from_response(
self.get_resources(vars='foo.foo.nested_float:3.14'))
self.assertEqual(1, len(found))
self.assertEqual(created[0]['id'], found[0]['id'])
self.assertEqual(created[0]['variables'], found[0]['variables'])
def test_jsonpath_search_nested_bool(self):
resources = (
('resource1', {'foo': TEST_DICT}),
('resource2', {'foo': {"foo": {"nested_bool": 'true'}}})
)
created = self.setup_resources(resources)
found = self.resources_from_response(
self.get_resources(vars='foo.foo.nested_bool:true'))
self.assertEqual(1, len(found))
self.assertEqual(created[0]['id'], found[0]['id'])
self.assertEqual(created[0]['variables'], found[0]['variables'])
def test_jsonpath_search_nested_boolstr(self):
resources = (
('resource1', {'foo': TEST_DICT}),
('resource2', {'foo': {"foo": {"nested_boolstr": False}}})
)
created = self.setup_resources(resources)
found = self.resources_from_response(
self.get_resources(vars='foo.foo.nested_boolstr:"false"'))
self.assertEqual(1, len(found))
self.assertEqual(created[0]['id'], found[0]['id'])
self.assertEqual(created[0]['variables'], found[0]['variables'])
def test_jsonpath_search_nested_null(self):
resources = (
('resource1', {'foo': TEST_DICT}),
('resource2', {'foo': {"foo": {"nested_null": 'test'}}})
)
created = self.setup_resources(resources)
found = self.resources_from_response(
self.get_resources(vars='foo.foo.nested_null:null'))
self.assertEqual(1, len(found))
self.assertEqual(created[0]['id'], found[0]['id'])
self.assertEqual(created[0]['variables'], found[0]['variables'])
def test_jsonpath_search_hyphenated(self):
resources = (
('resource1', {'foo': TEST_DICT}),
('resource2', {'foo': {"foo": {"hyphenated-key": 'test-test'}}})
)
created = self.setup_resources(resources)
found = self.resources_from_response(self.get_resources(
vars='foo.foo."hyphenated-key":"look-at-all-these-hyphens!"'))
self.assertEqual(1, len(found))
self.assertEqual(created[0]['id'], found[0]['id'])
self.assertEqual(created[0]['variables'], found[0]['variables'])
def test_jsonpath_search_key_with_period(self):
resources = (
('resource1', {'v3.0': TEST_DICT}),
('resource2', {'v3.0': {"foo": {"hyphenated-key": 'test-test'}}})
)
created = self.setup_resources(resources)
found = self.resources_from_response(self.get_resources(
vars='"v3.0".foo."hyphenated-key":"look-at-all-these-hyphens!"'))
self.assertEqual(1, len(found))
self.assertEqual(created[0]['id'], found[0]['id'])
self.assertEqual(created[0]['variables'], found[0]['variables'])
def test_jsonpath_search_non_string_member(self):
self.setup_resources((
('resource1', {'v3.0': TEST_DICT}),
))
resp = self.get_resources(
vars='v3.0.foo."hyphenated-key":"look-at-all-these-hyphens!"')
self.assertBadRequest(resp)
self.assertEqual(exceptions.InvalidJSONPath.msg,
resp.json()['message'])
def test_jsonpath_search_hyphenated_without_quotes(self):
self.setup_resources((
('resource1', {'v3.0': TEST_DICT}),
))
resp = self.get_resources(
vars='foo.hyphenated-key:"look-at-all-these-hyphens!"')
self.assertBadRequest(resp)
self.assertEqual(exceptions.InvalidJSONPath.msg,
resp.json()['message'])
def test_jsonpath_search_invalid_first_key(self):
self.setup_resources((
('resource1', {'v3.0': TEST_DICT}),
))
resp = self.get_resources(vars='[*]foo.bar:"string"')
self.assertBadRequest(resp)
self.assertEqual(exceptions.InvalidJSONPath.msg,
resp.json()['message'])
def test_jsonpath_search_bad_json_string_value(self):
self.setup_resources((
('resource1', {'v3.0': TEST_DICT}),
))
resp = self.get_resources(vars='foo.bar:string')
self.assertBadRequest(resp)
self.assertEqual(exceptions.InvalidJSONValue.msg,
resp.json()['message'])
class ProjectsJSONPathSearchTestCase(functional.TestCase,
JSONPathSearchTestCaseMixin):
resource = 'projects'
class CloudsJSONPathSearchTestCase(functional.TestCase,
JSONPathSearchTestCaseMixin):
resource = 'clouds'
class RegionsJSONPathSearchTestCase(functional.TestCase,
JSONPathSearchTestCaseMixin):
resource = 'regions'
class CellsJSONPathSearchTestCase(functional.TestCase,
JSONPathSearchTestCaseMixin):
resource = 'cells'
class NetworksJSONPathSearchTestCase(functional.TestCase,
JSONPathSearchTestCaseMixin):
resource = 'networks'
class NetworkDevicesJSONPathSearchTestCase(functional.TestCase,
JSONPathSearchTestCaseMixin):
resource = 'network-devices'
class HostsJSONPathSearchTestCase(functional.TestCase,
JSONPathSearchTestCaseMixin):
resource = 'hosts'

View File

@ -1,162 +0,0 @@
from craton.tests.functional import TestCase
class APIV1NetworkSchemaTest(TestCase):
def setUp(self):
super(APIV1NetworkSchemaTest, self).setUp()
self.cloud = self.create_cloud(name='cloud-1')
self.region = self.create_region(name='region-1', cloud=self.cloud)
self.networks_url = self.url + '/v1/networks'
self.cidr = '192.168.0.0/24'
self.netmask = '255.255.255.0'
self.gateway = '192.168.0.1'
def test_network_create_with_required_works(self):
payload = {
'cloud_id': self.cloud['id'],
'region_id': self.region['id'],
'name': 'a',
'cidr': self.cidr,
'netmask': self.netmask,
'gateway': self.gateway,
}
resp = self.post(self.networks_url, data=payload)
self.assertEqual(201, resp.status_code)
network = resp.json()
self.assertEqual('a', network['name'])
self.assertEqual(self.cloud['id'], network['cloud_id'])
self.assertEqual(self.region['id'], network['region_id'])
self.assertEqual(self.cidr, network['cidr'])
self.assertEqual(self.gateway, network['gateway'])
self.assertEqual(self.netmask, network['netmask'])
def test_network_create_without_region_id_fails(self):
payload = {
'cloud_id': self.cloud['id'],
'name': 'a',
'cidr': self.cidr,
'netmask': self.netmask,
'gateway': self.gateway,
}
network = self.post(self.networks_url, data=payload)
self.assertEqual(400, network.status_code)
msg = (
"The request included the following errors:\n"
"- 'region_id' is a required property"
)
self.assertEqual(network.json()['message'], msg)
def test_network_create_without_cloud_id_fails(self):
payload = {
'region_id': self.region['id'],
'name': 'a',
'cidr': self.cidr,
'netmask': self.netmask,
'gateway': self.gateway,
}
network = self.post(self.networks_url, data=payload)
self.assertEqual(400, network.status_code)
msg = (
"The request included the following errors:\n"
"- 'cloud_id' is a required property"
)
self.assertEqual(network.json()['message'], msg)
def test_network_create_with_extra_id_property_fails(self):
payload = {
'region_id': self.region['id'],
'cloud_id': self.cloud['id'],
'name': 'a',
'cidr': self.cidr,
'netmask': self.netmask,
'gateway': self.gateway,
'id': 3
}
network = self.post(self.networks_url, data=payload)
self.assertEqual(400, network.status_code)
msg = (
"The request included the following errors:\n"
"- Additional properties are not allowed ('id' was unexpected)"
)
self.assertEqual(network.json()['message'], msg)
def test_network_create_with_extra_created_at_property_fails(self):
payload = {
'region_id': self.region['id'],
'cloud_id': self.cloud['id'],
'name': 'a',
'cidr': self.cidr,
'netmask': self.netmask,
'gateway': self.gateway,
'created_at': 'This should not work'
}
network = self.post(self.networks_url, data=payload)
self.assertEqual(400, network.status_code)
msg = (
"The request included the following errors:\n"
"- Additional properties are not allowed ('created_at' was "
"unexpected)"
)
self.assertEqual(network.json()['message'], msg)
def test_network_create_with_extra_updated_at_property_fails(self):
payload = {
'region_id': self.region['id'],
'cloud_id': self.cloud['id'],
'name': 'a',
'cidr': self.cidr,
'netmask': self.netmask,
'gateway': self.gateway,
'updated_at': 'This should not work'
}
network = self.post(self.networks_url, data=payload)
self.assertEqual(400, network.status_code)
msg = (
"The request included the following errors:\n"
"- Additional properties are not allowed ('updated_at' was "
"unexpected)"
)
self.assertEqual(network.json()['message'], msg)
def test_network_create_missing_all_properties_fails(self):
url = self.url + '/v1/networks'
network = self.post(url, data={})
self.assertEqual(400, network.status_code)
msg = (
"The request included the following errors:\n"
"- 'cidr' is a required property\n"
"- 'cloud_id' is a required property\n"
"- 'gateway' is a required property\n"
"- 'name' is a required property\n"
"- 'netmask' is a required property\n"
"- 'region_id' is a required property"
)
self.assertEqual(network.json()['message'], msg)
def test_network_get_all_with_details(self):
payload = {
'cloud_id': self.cloud['id'],
'region_id': self.region['id'],
'name': 'a',
'cidr': self.cidr,
'netmask': self.netmask,
'gateway': self.gateway,
'variables': {'a': 'b'},
}
resp = self.post(self.networks_url, data=payload)
self.assertEqual(201, resp.status_code)
payload['name'] = 'b'
resp = self.post(self.networks_url, data=payload)
self.assertEqual(201, resp.status_code)
url = self.networks_url + '?details=all'
resp = self.get(url)
self.assertEqual(200, resp.status_code)
networks = resp.json()['networks']
for network in networks:
self.assertTrue('variables' in network)
self.assertEqual({'a': 'b'}, network['variables'])

View File

@ -1,115 +0,0 @@
from craton.tests.functional import DeviceTestBase
class APIV1NetworkDeviceTest(DeviceTestBase):
resource = 'network-devices'
def test_create_with_parent_id(self):
parent = self.create_network_device(
name='test1',
cloud=self.cloud,
region=self.region,
device_type='switch',
ip_address='192.168.1.1',
)
child = self.create_network_device(
name='test2',
cloud=self.cloud,
region=self.region,
device_type='switch',
ip_address='192.168.1.2',
parent_id=parent['id'],
)
self.assertEqual(parent['id'], child['parent_id'])
def test_update_with_parent_id(self):
parent = self.create_network_device(
name='test1',
cloud=self.cloud,
region=self.region,
device_type='switch',
ip_address='192.168.1.1',
)
child = self.create_network_device(
name='test2',
cloud=self.cloud,
region=self.region,
device_type='switch',
ip_address='192.168.1.2',
)
self.assertIsNone(child['parent_id'])
url = '{}/v1/network-devices/{}'.format(self.url, child['id'])
child_update_resp = self.put(
url, data={'parent_id': parent['id']}
)
self.assertEqual(200, child_update_resp.status_code)
child_update = child_update_resp.json()
self.assertEqual(parent['id'], child_update['parent_id'])
def test_update_with_parent_id_equal_id_fails(self):
network_device = self.create_network_device(
name='test1',
cloud=self.cloud,
region=self.region,
device_type='switch',
ip_address='192.168.1.1',
)
url = '{}/v1/network-devices/{}'.format(self.url, network_device['id'])
network_device_update_resp = self.put(
url, data={'parent_id': network_device['id']}
)
self.assertEqual(400, network_device_update_resp.status_code)
def test_update_with_parent_id_equal_descendant_id_fails(self):
parent = self.create_network_device(
name='test1',
cloud=self.cloud,
region=self.region,
device_type='switch',
ip_address='192.168.1.1',
)
self.assertIsNone(parent['parent_id'])
child = self.create_network_device(
name='test2',
cloud=self.cloud,
region=self.region,
device_type='switch',
ip_address='192.168.1.2',
parent_id=parent['id'],
)
self.assertEqual(parent['id'], child['parent_id'])
grandchild = self.create_network_device(
name='test3',
cloud=self.cloud,
region=self.region,
device_type='switch',
ip_address='192.168.1.3',
parent_id=child['id'],
)
self.assertEqual(child['id'], grandchild['parent_id'])
url = '{}/v1/network-devices/{}'.format(self.url, parent['id'])
parent_update_resp = self.put(
url, data={'parent_id': grandchild['id']}
)
self.assertEqual(400, parent_update_resp.status_code)
def test_network_device_create_missing_all_properties_fails(self):
url = self.url + '/v1/network-devices'
network_device = self.post(url, data={})
self.assertEqual(400, network_device.status_code)
msg = (
"The request included the following errors:\n"
"- 'cloud_id' is a required property\n"
"- 'device_type' is a required property\n"
"- 'ip_address' is a required property\n"
"- 'name' is a required property\n"
"- 'region_id' is a required property"
)
self.assertEqual(network_device.json()['message'], msg)

View File

@ -1,70 +0,0 @@
from craton.tests import functional
class APIv1NetworkInterfacesTest(functional.DeviceTestBase):
def setUp(self):
super(APIv1NetworkInterfacesTest, self).setUp()
self.interfaces_url = self.url + '/v1/network-interfaces'
def test_associate_network_device_with_a_host(self):
host = self.create_host('host-0', 'server', '127.0.0.1')
payload = {
'name': 'lo',
'ip_address': '127.0.0.1',
'device_id': host['id'],
'interface_type': 'loopback',
}
response = self.post(self.interfaces_url, data=payload)
self.assertSuccessCreated(response)
self.assertIn('Location', response.headers)
interface = response.json()
self.assertEqual(
'{}/{}'.format(self.interfaces_url, interface['id']),
response.headers['Location']
)
def test_port_must_be_an_integer_on_create(self):
host = self.create_host('host-0', 'server', '127.0.0.1')
payload = {
'name': 'lo',
'ip_address': '127.0.0.1',
'device_id': host['id'],
'interface_type': 'loopback',
'port': 'asdf',
}
response = self.post(self.interfaces_url, data=payload)
self.assertBadRequest(response)
def test_port_must_be_an_integer_on_update(self):
host = self.create_host('host-0', 'server', '127.0.0.1')
payload = {
'name': 'lo',
'ip_address': '127.0.0.1',
'device_id': host['id'],
'interface_type': 'loopback',
'port': 80,
}
response = self.post(self.interfaces_url, data=payload)
self.assertSuccessCreated(response)
interface = response.json()
url = self.interfaces_url + '/{}'.format(interface['id'])
payload = {'port': 'asdf'}
response = self.put(url, data=payload)
self.assertBadRequest(response)
def test_network_interface_create_missing_all_properties_fails(self):
url = self.url + '/v1/network-interfaces'
network_interface = self.post(url, data={})
self.assertEqual(400, network_interface.status_code)
msg = (
"The request included the following errors:\n"
"- 'device_id' is a required property\n"
"- 'interface_type' is a required property\n"
"- 'ip_address' is a required property\n"
"- 'name' is a required property"
)
self.assertEqual(network_interface.json()['message'], msg)

View File

@ -1,129 +0,0 @@
from craton.tests import functional
from craton.tests.functional.test_variable_calls import \
APIV1ResourceWithVariablesTestCase
class TestPaginationOfProjects(functional.TestCase):
def setUp(self):
super(TestPaginationOfProjects, self).setUp()
self.projects = [
self.create_project('project-{}'.format(i))
for i in range(0, 61)
]
def test_lists_first_thirty_projects(self):
response = self.get(self.url + '/v1/projects',
headers=self.root_headers)
self.assertSuccessOk(response)
json = response.json()
self.assertIn('projects', json)
projects = json['projects']
self.assertEqual(30, len(projects))
def test_lists_projects_with_the_same_name(self):
self.create_project('project-0')
response = self.get(self.url + '/v1/projects',
name='project-0',
headers=self.root_headers)
self.assertSuccessOk(response)
projects = response.json()['projects']
self.assertEqual(2, len(projects))
class APIV1ProjectTest(APIV1ResourceWithVariablesTestCase):
resource = 'projects'
def test_project_create_with_variables(self):
variables = {'a': 'b'}
project_name = 'test'
project = self.create_project(project_name, variables=variables)
self.assertEqual(project_name, project['name'])
self.assertEqual(variables, project['variables'])
def test_create_project_supports_vars_ops(self):
project = self.create_project('test', variables={'a': 'b'})
self.assert_vars_get_expected(project['id'], {'a': 'b'})
self.assert_vars_can_be_set(project['id'])
self.assert_vars_can_be_deleted(project['id'])
def test_project_create_with_duplicate_name_works(self):
project_name = 'test'
self.create_project(project_name)
url = self.url + '/v1/projects'
payload = {'name': project_name}
project = self.post(url, headers=self.root_headers, data=payload)
self.assertEqual(201, project.status_code)
def test_project_get_all_with_name_filter(self):
proj1 = 'test1'
proj2 = 'test2'
self.create_project(proj2)
for i in range(3):
self.create_project(proj1)
url = self.url + '/v1/projects?name={}'.format(proj1)
resp = self.get(url, headers=self.root_headers)
projects = resp.json()['projects']
self.assertEqual(3, len(projects))
for project in projects:
self.assertEqual(proj1, project['name'])
def test_get_project_details(self):
project_name = 'test'
project_vars = {"who": "that"}
project = self.create_project(project_name, variables=project_vars)
url = self.url + '/v1/projects/{}'.format(project['id'])
project_with_detail = self.get(url, headers=self.root_headers)
self.assertEqual(project_name, project_with_detail.json()['name'])
self.assertEqual(project_vars, project_with_detail.json()['variables'])
def test_project_delete(self):
project1 = self.create_project('test1')
url = self.url + '/v1/projects'
projects = self.get(url, headers=self.root_headers)
# NOTE(thomasem): Have to include the default project created by
# test setup.
self.assertEqual(2, len(projects.json()['projects']))
delurl = self.url + '/v1/projects/{}'.format(project1['id'])
self.delete(delurl, headers=self.root_headers)
projects = self.get(url, headers=self.root_headers)
self.assertEqual(1, len(projects.json()['projects']))
def test_project_variables_update(self):
project_name = 'test'
project = self.create_project(project_name)
variables = {"bumbleywump": "cucumberpatch"}
put_url = self.url + '/v1/projects/{}/variables'.format(project['id'])
resp = self.put(put_url, headers=self.root_headers, data=variables)
self.assertEqual(200, resp.status_code)
get_url = self.url + '/v1/projects/{}'.format(project['id'])
project = self.get(get_url, headers=self.root_headers)
self.assertEqual(variables, project.json()['variables'])
def test_project_variables_delete(self):
project_name = 'test'
delete_key = 'bumbleywump'
variables = {
delete_key: 'cucumberpatch'
}
expected_vars = {'foo': 'bar'}
variables.update(expected_vars)
project = self.create_project(project_name, variables=variables)
self.assert_vars_get_expected(project['id'], variables)
self.assert_vars_can_be_deleted(project['id'])
def test_project_create_missing_all_properties_fails(self):
url = self.url + '/v1/projects'
project = self.post(url, data={})
self.assertEqual(400, project.status_code)
msg = (
"The request included the following errors:\n"
"- 'name' is a required property"
)
self.assertEqual(project.json()['message'], msg)

View File

@ -1,288 +0,0 @@
import urllib.parse
from craton.tests.functional import TestCase
class RegionTests(TestCase):
def setUp(self):
super(RegionTests, self).setUp()
self.cloud = self.create_cloud()
def create_cloud(self):
return super(RegionTests, self).create_cloud(
name='cloud-1',
variables={'version': 'x'},
)
def create_region(self, name, variables=None):
return super(RegionTests, self).create_region(
name=name,
cloud=self.cloud,
variables=variables
)
class APIV1RegionTest(RegionTests):
"""Test cases for /region calls.
One set of data for these tests is generated by the fake-data
generation script during test module setup.
"""
def test_create_region_full_data(self):
# Test with full set of allowed parameters
values = {"name": "region-new",
"note": "This is region-new.",
"cloud_id": self.cloud['id'],
"variables": {"a": "b"}}
url = self.url + '/v1/regions'
resp = self.post(url, data=values)
self.assertEqual(201, resp.status_code)
self.assertIn('Location', resp.headers)
self.assertEqual(
resp.headers['Location'],
"{}/{}".format(url, resp.json()['id'])
)
self.assertEqual(values['name'], resp.json()['name'])
def test_create_region_without_variables(self):
values = {"name": "region-two",
"note": "This is region-two",
"cloud_id": self.cloud['id']}
url = self.url + '/v1/regions'
resp = self.post(url, data=values)
self.assertEqual(201, resp.status_code)
self.assertIn('Location', resp.headers)
self.assertEqual(
resp.headers['Location'],
"{}/{}".format(url, resp.json()['id'])
)
self.assertEqual("region-two", resp.json()['name'])
def test_create_region_with_no_name_fails(self):
values = {"note": "This is region one.", "cloud_id": self.cloud['id']}
url = self.url + '/v1/regions'
resp = self.post(url, data=values)
self.assertEqual(resp.status_code, 400)
err_msg = (
"The request included the following errors:\n"
"- 'name' is a required property"
)
self.assertEqual(resp.json()['message'], err_msg)
def test_create_region_with_no_cloud_id_fails(self):
values = {"name": "I don't work at all, you know."}
url = self.url + '/v1/regions'
resp = self.post(url, data=values)
self.assertEqual(resp.status_code, 400)
err_msg = (
"The request included the following errors:\n"
"- 'cloud_id' is a required property"
)
self.assertEqual(resp.json()['message'], err_msg)
def test_create_region_with_duplicate_name_fails(self):
self.create_region("ORD135")
values = {"name": "ORD135", "cloud_id": self.cloud['id']}
url = self.url + '/v1/regions'
resp = self.post(url, data=values)
self.assertEqual(409, resp.status_code)
def test_create_region_with_extra_id_property_fails(self):
values = {"name": "test", 'cloud_id': self.cloud['id'], "id": 101}
url = self.url + '/v1/regions'
resp = self.post(url, data=values)
self.assertEqual(resp.status_code, 400)
msg = (
"The request included the following errors:\n"
"- Additional properties are not allowed ('id' was unexpected)"
)
self.assertEqual(resp.json()['message'], msg)
def test_create_region_with_extra_created_at_property_fails(self):
values = {"name": "test", 'cloud_id': self.cloud['id'],
"created_at": "some date"}
url = self.url + '/v1/regions'
resp = self.post(url, data=values)
self.assertEqual(resp.status_code, 400)
msg = (
"The request included the following errors:\n"
"- Additional properties are not allowed "
"('created_at' was unexpected)"
)
self.assertEqual(resp.json()['message'], msg)
def test_create_region_with_extra_updated_at_property_fails(self):
values = {"name": "test", 'cloud_id': self.cloud['id'],
"updated_at": "some date"}
url = self.url + '/v1/regions'
resp = self.post(url, data=values)
self.assertEqual(resp.status_code, 400)
msg = (
"The request included the following errors:\n"
"- Additional properties are not allowed "
"('updated_at' was unexpected)"
)
self.assertEqual(resp.json()['message'], msg)
def test_region_create_missing_all_properties_fails(self):
url = self.url + '/v1/regions'
region = self.post(url, data={})
self.assertEqual(400, region.status_code)
msg = (
"The request included the following errors:\n"
"- 'cloud_id' is a required property\n"
"- 'name' is a required property"
)
self.assertEqual(region.json()['message'], msg)
def test_regions_get_all(self):
self.create_region("ORD1")
self.create_region("ORD2")
url = self.url + '/v1/regions'
resp = self.get(url)
self.assertEqual(200, resp.status_code)
self.assertEqual(2, len(resp.json()['regions']))
def test_regions_get_all_with_details(self):
self.create_region('ORD1', variables={'a': 'b'})
self.create_region('ORD2', variables={'c': 'd'})
url = self.url + '/v1/regions?details=all'
resp = self.get(url)
self.assertEqual(200, resp.status_code)
regions = resp.json()['regions']
self.assertEqual(2, len(regions))
for region in regions:
self.assertTrue('variables' in region)
for region in regions:
if region['name'] == 'ORD1':
self.assertEqual({'a': 'b', 'version': 'x'},
region['variables'])
if region['name'] == 'ORD2':
self.assertEqual({'c': 'd', 'version': 'x'},
region['variables'])
def test_regions_get_all_with_name_filter(self):
self.create_region("ORD1")
self.create_region("ORD2")
url = self.url + '/v1/regions?name=ORD1'
resp = self.get(url)
self.assertEqual(200, resp.status_code)
regions = resp.json()['regions']
self.assertEqual(1, len(regions))
self.assertEqual('ORD1', regions[0]['name'])
def test_regions_get_all_for_cloud(self):
for i in range(2):
self.create_region("ORD{}".format(str(i)))
url = self.url + '/v1/regions?cloud_id={}'.format(self.cloud['id'])
resp = self.get(url)
self.assertEqual(200, resp.status_code)
regions = resp.json()['regions']
self.assertEqual(2, len(regions))
self.assertEqual(['ORD0', 'ORD1'], [r['name'] for r in regions])
def test_region_with_non_existing_filters(self):
self.create_region("ORD1")
url = self.url + '/v1/regions?name=idontexist'
resp = self.get(url)
self.assertEqual(404, resp.status_code)
def test_region_get_details_for_region(self):
regvars = {"a": "b", "one": "two"}
region = self.create_region("ORD1", variables=regvars)
url = self.url + '/v1/regions/{}'.format(region['id'])
resp = self.get(url)
region = resp.json()
self.assertEqual(region['name'], 'ORD1')
def test_region_get_details_has_resolved_vars(self):
regvars = {"a": "b", "one": "two"}
region = self.create_region("ORD1", variables=regvars)
url = self.url + '/v1/regions/{}'.format(region['id'])
resp = self.get(url)
region = resp.json()
self.assertEqual(region['name'], 'ORD1')
expected = {"a": "b", "one": "two", "version": "x"}
self.assertEqual(expected, region['variables'])
def test_region_get_details_with_unresolved_vars(self):
regvars = {"a": "b", "one": "two"}
region = self.create_region("ORD1", variables=regvars)
r_id = region['id']
url = self.url + '/v1/regions/{}?resolved-values=false'.format(r_id)
resp = self.get(url)
region = resp.json()
self.assertEqual(region['name'], 'ORD1')
self.assertEqual(regvars, region['variables'])
class TestPagination(RegionTests):
def setUp(self):
super(TestPagination, self).setUp()
self.regions = [self.create_region('region-{}'.format(i))
for i in range(0, 61)]
self.addCleanup(self.delete_regions, self.regions)
def test_list_first_thirty_regions(self):
url = self.url + '/v1/regions'
response = self.get(url)
self.assertSuccessOk(response)
json = response.json()
self.assertIn('regions', json)
self.assertEqual(30, len(json['regions']))
self.assertListEqual([r['id'] for r in self.regions[:30]],
[r['id'] for r in json['regions']])
def test_get_returns_correct_next_link(self):
url = self.url + '/v1/regions'
thirtieth_region = self.regions[29]
response = self.get(url)
self.assertSuccessOk(response)
json = response.json()
self.assertIn('links', json)
for link_rel in json['links']:
if link_rel['rel'] == 'next':
break
else:
self.fail("No 'next' link was returned in response")
parsed_next = urllib.parse.urlparse(link_rel['href'])
self.assertIn('marker={}'.format(thirtieth_region['id']),
parsed_next.query)
def test_get_returns_correct_prev_link(self):
first_region = self.regions[0]
thirtieth_region = self.regions[29]
url = self.url + '/v1/regions?marker={}'.format(thirtieth_region['id'])
response = self.get(url)
self.assertSuccessOk(response)
json = response.json()
self.assertIn('links', json)
for link_rel in json['links']:
if link_rel['rel'] == 'prev':
break
else:
self.fail("No 'prev' link was returned in response")
parsed_prev = urllib.parse.urlparse(link_rel['href'])
self.assertIn('marker={}'.format(first_region['id']),
parsed_prev.query)
def test_follow_all_region_links(self):
url = self.url + '/v1/regions'
response = self.get(url)
self.assertSuccessOk(response)
json = response.json()
regions = json['regions']
while regions:
for link in json['links']:
if link['rel'] == 'next':
break
else:
break
response = self.get(link['href'])
self.assertSuccessOk(response)
json = response.json()
regions = json['regions']

View File

@ -1,29 +0,0 @@
from craton.tests import functional
class UserTests(functional.TestCase):
def test_create_user(self):
project = self.create_project('test')
url = self.url + '/v1/users'
payload = {'username': 'testuser', 'project_id': project['id']}
user = self.post(url, data=payload)
self.assertEqual(201, user.status_code)
self.assertEqual(payload['username'], user.json()['username'])
self.assertEqual(payload['project_id'], user.json()['project_id'])
def test_create_user_with_admin_priv(self):
project = self.create_project('test')
url = self.url + '/v1/users'
payload = {'username': 'testuser', 'project_id': project['id'],
'is_admin': True}
user = self.post(url, headers=self.root_headers, data=payload)
self.assertEqual(201, user.status_code)
self.assertEqual(payload['username'], user.json()['username'])
self.assertEqual(payload['is_admin'], user.json()['is_admin'])
def test_create_user_with_no_project_id_fails(self):
url = self.url + '/v1/users'
payload = {'username': 'testuser'}
user = self.post(url, headers=self.root_headers, data=payload)
self.assertEqual(400, user.status_code)

View File

@ -1,57 +0,0 @@
from craton.tests.functional import TestCase
class APIV1ResourceWithVariablesTestCase(TestCase):
"""Base test case for resources that have variables mixed in"""
resource = '<resource>' # Test classes that mix in should set
path = '/v1/{resource}/{resource_id}/variables'
def get_vars_url(self, resource_id):
return self.url + self.path.format(
resource=self.resource, resource_id=resource_id)
def get_current_vars(self, resource_id):
url = self.get_vars_url(resource_id)
response = self.get(url)
self.assertEqual(200, response.status_code)
return response.json()['variables']
def assert_vars_get_expected(self, resource_id, expected_vars):
self.assertEqual(expected_vars, self.get_current_vars(resource_id))
def assert_vars_can_be_set(self, resource_id):
"""Asserts new vars can be added to the existing vars, if any"""
# track the expected current state of vars for this resource,
# verifying expectations
current_vars = self.get_current_vars(resource_id)
payload = {'string-key': 'string-value', 'num-key': 47,
'boolean-key': False, 'none-key': None,
'object-key': {'a': 1, 'b': 2},
'list-key': ['a', 'b', 1, 2, 3, True, None]}
url = self.get_vars_url(resource_id)
response = self.put(url, data=payload)
current_vars.update(payload)
self.assertEqual(200, response.status_code)
self.assertEqual(current_vars, response.json()['variables'])
self.assertEqual(current_vars, self.get_current_vars(resource_id))
def assert_vars_can_be_deleted(self, resource_id):
"""Asserts that new vars can be added, then deleted"""
# track the expected current state of vars for this resource,
# verifying expectations
current_vars = self.get_current_vars(resource_id)
url = self.get_vars_url(resource_id)
added_vars = {'will-keep': 42, 'will-delete': 47}
response = self.put(url, data=added_vars)
current_vars.update(added_vars)
self.assertEqual(200, response.status_code)
self.assertEqual(current_vars, response.json()['variables'])
self.assertEqual(current_vars, self.get_current_vars(resource_id))
response = self.delete(url, body=['will-delete', 'non-existent-key'])
del current_vars['will-delete']
self.assertEqual(204, response.status_code)
self.assertEqual(current_vars, self.get_current_vars(resource_id))

View File

@ -1,38 +0,0 @@
import fixtures
from craton.db.sqlalchemy import api as sa_api
from craton.db.sqlalchemy import models
from craton.tests import TestCase
_DB_SCHEMA = None
class Database(fixtures.Fixture):
def __init__(self):
self.engine = sa_api.get_engine()
self.engine.dispose()
conn = self.engine.connect()
self.setup_sqlite()
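# Dump the freshly created schema to SQL text once; _setUp() replays it into each test's connection instead of re-running create_all().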
self._DB = "".join(line for line in conn.connection.iterdump())
self.engine.dispose()
def setup_sqlite(self):
# NOTE(sulo): there is no version here. We will be using
# Alembic in the near future to manage migrations.
models.Base.metadata.create_all(self.engine)
def _setUp(self):
conn = self.engine.connect()
conn.connection.executescript(self._DB)
self.addCleanup(self.engine.dispose)
class DBTestCase(TestCase):
def setUp(self):
super(DBTestCase, self).setUp()
global _DB_SCHEMA
if not _DB_SCHEMA:
_DB_SCHEMA = Database()
self.useFixture(_DB_SCHEMA)

View File

@ -1,15 +0,0 @@
from craton.db.sqlalchemy import api as dbapi
from craton.tests.unit.db import base
class TestProjectsGetAll(base.DBTestCase):
def test_link_params_dictionary(self):
_, links = dbapi.projects_get_all(
self.context,
filters={'name': None, 'id': None,
'sort_keys': ['id', 'created_at'], 'sort_dir': 'asc'},
pagination_params={'limit': 30, 'marker': None},
)
self.assertNotIn('name', links)
self.assertNotIn('id', links)

View File

@ -1,116 +0,0 @@
import uuid
from craton import exceptions
from craton.db import api as dbapi
from craton.tests.unit.db import base
project_id1 = uuid.uuid4().hex
cloud_id1 = uuid.uuid4().hex
cell1 = {'region_id': 1, 'project_id': project_id1, 'name': 'cell1',
"cloud_id": cloud_id1}
cell1_region2 = {'region_id': 2, 'project_id': project_id1, 'name': 'cell1',
"cloud_id": cloud_id1}
cell2 = {'region_id': 1, 'project_id': project_id1, 'name': 'cell2',
"cloud_id": cloud_id1}
cells = (cell1, cell1_region2, cell2)
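# cell1 and cell1_region2 share a name but live in different regions, so name-based filters should match both.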
default_pagination = {'limit': 30, 'marker': None}
class CellsDBTestCase(base.DBTestCase):
def test_cells_create(self):
try:
dbapi.cells_create(self.context, cell1)
except Exception:
self.fail("Cell create raised unexpected exception")
def test_duplicate_cell_create_raises_409(self):
dbapi.cells_create(self.context, cell1)
self.assertRaises(exceptions.DuplicateCell, dbapi.cells_create,
self.context, cell1)
def test_cells_get_all(self):
dbapi.cells_create(self.context, cell1)
filters = {
"region_id": cell1["region_id"],
}
res, _ = dbapi.cells_get_all(self.context, filters, default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0]['name'], 'cell1')
def test_cells_get_all_filter_name(self):
for cell in cells:
dbapi.cells_create(self.context, cell)
setup_res, _ = dbapi.cells_get_all(self.context, {},
default_pagination)
self.assertGreater(len(setup_res), 2)
filters = {
"name": cell1["name"],
}
res, _ = dbapi.cells_get_all(self.context, filters, default_pagination)
self.assertEqual(len(res), 2)
for cell in res:
self.assertEqual(cell['name'], 'cell1')
def test_cells_get_all_filter_id(self):
for cell in cells:
dbapi.cells_create(self.context, cell)
setup_res, _ = dbapi.cells_get_all(self.context, {},
default_pagination)
self.assertGreater(len(setup_res), 2)
self.assertEqual(
len([cell for cell in setup_res if cell['id'] == 1]), 1
)
filters = {
"id": 1,
}
res, _ = dbapi.cells_get_all(self.context, filters, default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0]['id'], 1)
def test_cells_get_all_with_filters(self):
res = dbapi.cells_create(self.context, cell1)
variables = {"key1": "value1", "key2": "value2"}
dbapi.variables_update_by_resource_id(
self.context, "cells", res.id, variables
)
filters = {
"vars": "key2:value2",
"region_id": cell1["region_id"],
}
res, _ = dbapi.cells_get_all(self.context, filters, default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0]['name'], 'cell1')
def test_cells_get_all_with_filters_noexist(self):
res = dbapi.cells_create(self.context, cell1)
variables = {"key1": "value1", "key2": "value2"}
dbapi.variables_update_by_resource_id(
self.context, "cells", res.id, variables
)
filters = {}
filters["vars"] = "key2:value5"
res, _ = dbapi.cells_get_all(self.context, filters, default_pagination)
self.assertEqual(len(res), 0)
def test_cell_delete(self):
create_res = dbapi.cells_create(self.context, cell1)
# First make sure we have the cell
res = dbapi.cells_get_by_id(self.context, create_res.id)
self.assertEqual(res.name, 'cell1')
dbapi.cells_delete(self.context, res.id)
self.assertRaises(exceptions.NotFound, dbapi.cells_get_by_id,
self.context, res.id)
def test_cell_update(self):
create_res = dbapi.cells_create(self.context, cell1)
res = dbapi.cells_get_by_id(self.context, create_res.id)
self.assertEqual(res.name, 'cell1')
new_name = 'cell1_New'
res = dbapi.cells_update(self.context, res.id, {'name': 'cell1_New'})
self.assertEqual(res.name, new_name)

View File

@ -1,89 +0,0 @@
import uuid
from craton.db import api as dbapi
from craton.tests.unit.db import base
from craton import exceptions
default_pagination = {'limit': 30, 'marker': None}
project_id1 = uuid.uuid4().hex
cloud1 = {'project_id': project_id1, 'name': 'cloud1'}
class CloudsDBTestCase(base.DBTestCase):
def test_cloud_create(self):
try:
dbapi.clouds_create(self.context, cloud1)
except Exception:
self.fail("Cloud create raised unexpected exception")
def test_cloud_create_duplicate_name_raises(self):
dbapi.clouds_create(self.context, cloud1)
self.assertRaises(exceptions.DuplicateCloud, dbapi.clouds_create,
self.context, cloud1)
def test_clouds_get_all(self):
dbapi.clouds_create(self.context, cloud1)
filters = {}
res, _ = dbapi.clouds_get_all(self.context, filters,
default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0]['name'], 'cloud1')
def test_clouds_get_all_with_var_filters(self):
res = dbapi.clouds_create(self.context, cloud1)
variables = {"key1": "value1", "key2": "value2"}
dbapi.variables_update_by_resource_id(
self.context, "clouds", res.id, variables
)
filters = {}
filters["vars"] = "key1:value1"
clouds, _ = dbapi.clouds_get_all(
self.context, filters, default_pagination,
)
self.assertEqual(len(clouds), 1)
self.assertEqual(clouds[0].name, cloud1['name'])
def test_clouds_get_all_with_var_filters_noexist(self):
res = dbapi.clouds_create(self.context, cloud1)
variables = {"key1": "value1", "key2": "value2"}
dbapi.variables_update_by_resource_id(
self.context, "clouds", res.id, variables
)
filters = {}
filters["vars"] = "key1:value12"
clouds, _ = dbapi.clouds_get_all(
self.context, filters, default_pagination,
)
self.assertEqual(len(clouds), 0)
def test_cloud_get_by_name(self):
dbapi.clouds_create(self.context, cloud1)
res = dbapi.clouds_get_by_name(self.context, cloud1['name'])
self.assertEqual(res.name, 'cloud1')
def test_cloud_get_by_id(self):
dbapi.clouds_create(self.context, cloud1)
res = dbapi.clouds_get_by_id(self.context, 1)
self.assertEqual(res.name, 'cloud1')
def test_cloud_update(self):
dbapi.clouds_create(self.context, cloud1)
res = dbapi.clouds_get_by_id(self.context, 1)
self.assertEqual(res.name, 'cloud1')
new_name = "cloud_New1"
res = dbapi.clouds_update(self.context, res.id,
{'name': 'cloud_New1'})
self.assertEqual(res.name, new_name)
def test_cloud_delete(self):
dbapi.clouds_create(self.context, cloud1)
# First make sure we have the cloud
res = dbapi.clouds_get_by_name(self.context, cloud1['name'])
self.assertEqual(res.name, 'cloud1')
dbapi.clouds_delete(self.context, res.id)
self.assertRaises(exceptions.NotFound,
dbapi.clouds_get_by_name,
self.context, 'fake-cloud')

View File

@ -1,704 +0,0 @@
import uuid
from netaddr import IPAddress
from craton import exceptions
from craton.db import api as dbapi
from craton.tests.unit.db import base
default_pagination = {'limit': 30, 'marker': None}
class BaseDevicesDBTestCase(base.DBTestCase):
mock_project_id = uuid.uuid4().hex
def make_project(self, name, **variables):
project = dbapi.projects_create(
self.context,
{"name": name,
"variables": variables})
return str(project.id)
def make_cloud(self, project_id, name, **variables):
cloud = dbapi.clouds_create(
self.context,
{'name': name,
'project_id': project_id,
'variables': variables})
return cloud.id
def make_region(self, project_id, cloud_id, name, **variables):
region = dbapi.regions_create(
self.context,
{'name': name,
'project_id': project_id,
'cloud_id': cloud_id,
'variables': variables})
return region.id
def make_cell(self, project_id, cloud_id, region_id, name, **variables):
cell = dbapi.cells_create(
self.context,
{'name': name,
'project_id': project_id,
'cloud_id': cloud_id,
'region_id': region_id,
'variables': variables})
return cell.id
def make_host(self, project_id, cloud_id, region_id, name, ip_address,
host_type, cell_id=None, parent_id=None, labels=None,
**variables):
if cell_id:
host = {'name': name,
'project_id': project_id,
'cloud_id': cloud_id,
'region_id': region_id,
'cell_id': cell_id,
'ip_address': ip_address,
'parent_id': parent_id,
'device_type': host_type,
'active': True,
'labels': set() if labels is None else labels,
'variables': variables}
else:
host = {'name': name,
'project_id': project_id,
'cloud_id': cloud_id,
'region_id': region_id,
'ip_address': ip_address,
'parent_id': parent_id,
'device_type': host_type,
'active': True,
'labels': set() if labels is None else labels,
'variables': variables}
host = dbapi.hosts_create(self.context, host)
return host.id
def make_network_device(
self, project_id, cloud_id, region_id, name, ip_address,
device_type, cell_id=None, parent_id=None, **variables
):
network_device_data = {
'name': name,
'project_id': project_id,
'cloud_id': cloud_id,
'region_id': region_id,
'cell_id': cell_id,
'ip_address': ip_address,
'parent_id': parent_id,
'device_type': device_type,
'active': True,
'variables': variables,
}
network_device = dbapi.network_devices_create(
self.context, network_device_data
)
return network_device.id
class DevicesDBTestCase(BaseDevicesDBTestCase):
def setUp(self):
super().setUp()
project_id = self.make_project('project_1')
cloud_id = self.make_cloud(project_id, 'cloud_1')
region_id = self.make_region(project_id, cloud_id, 'region_1')
net_device1_id = self.make_network_device(
project_id, cloud_id, region_id, 'switch1.example.com',
IPAddress('10.1.2.101'), 'switch'
)
net_device2_id = self.make_network_device(
project_id, cloud_id, region_id, 'switch2.example.com',
IPAddress('10.1.2.102'), 'switch', parent_id=net_device1_id
)
host1_id = self.make_host(
project_id, cloud_id, region_id, 'www1.example.com',
IPAddress(u'10.1.2.103'), 'server', parent_id=net_device2_id
)
host2_id = self.make_host(
project_id, cloud_id, region_id, 'www2.example.com',
IPAddress(u'10.1.2.104'), 'container', parent_id=host1_id
)
host3_id = self.make_host(
project_id, cloud_id, region_id, 'www3.example.com',
IPAddress(u'10.1.2.105'), 'server'
)
self.parent = net_device1_id
self.children = [net_device2_id]
self.descendants = [net_device2_id, host1_id, host2_id]
self.all = [
net_device1_id, net_device2_id, host1_id, host2_id, host3_id
]
def test_devices_get_all(self):
devices, _ = dbapi.devices_get_all(
self.context, {}, default_pagination
)
self.assertEqual(self.all, [device.id for device in devices])
def test_devices_get_all_children(self):
devices, _ = dbapi.devices_get_all(
self.context, {'parent_id': self.parent}, default_pagination
)
self.assertEqual(self.children, [device.id for device in devices])
def test_devices_get_all_descendants(self):
devices, _ = dbapi.devices_get_all(
self.context,
{'parent_id': self.parent, 'descendants': True},
default_pagination
)
self.assertEqual(self.descendants, [device.id for device in devices])
class HostsDBTestCase(BaseDevicesDBTestCase):
def make_very_small_cloud(self, with_cell=False):
project_id = self.make_project('project_1', foo='P1', zoo='P2',
boo='P3')
cloud_id = self.make_cloud(project_id, 'cloud_1', zoo='CL1')
region_id = self.make_region(
project_id,
cloud_id,
'region_1',
foo='R1', bar='R2', bax='R3')
if with_cell:
cell_id = self.make_cell(project_id, cloud_id, region_id, 'cell_1',
bar='C2')
else:
cell_id = None
host_id = self.make_host(project_id, cloud_id, region_id,
'www1.example.com',
IPAddress(u'10.1.2.101'), 'server',
cell_id=cell_id, foo='H1', baz='H3')
return project_id, cloud_id, region_id, cell_id, host_id
def test_hosts_create(self):
# Need to do this query despite creation above because other
# elements (cell, region) were in separate committed sessions
# when the host was created. Verify these linked elements load
# correctly
project_id, cloud_id, region_id, cell_id, host_id = \
self.make_very_small_cloud(with_cell=True)
host = dbapi.hosts_get_by_id(self.context, host_id)
self.assertEqual(host.region.id, region_id)
self.assertEqual(host.region.name, 'region_1')
self.assertEqual(host.cell.id, cell_id)
self.assertEqual(host.cell.name, 'cell_1')
# Verify resolved variables/blames override properly
self.assertEqual(
[obj.id for obj in host.resolution_order],
[host_id, cell_id, region_id, cloud_id, uuid.UUID(project_id)])
self.assertEqual(
[variables for variables in host.resolution_order_variables],
[{'foo': 'H1', 'baz': 'H3'},
{'bar': 'C2'},
{'foo': 'R1', 'bar': 'R2', 'bax': 'R3'},
{'zoo': 'CL1'},
{'foo': 'P1', 'zoo': 'P2', 'boo': 'P3'}])
self.assertEqual(
host.resolved,
{'foo': 'H1', 'bar': 'C2', 'baz': 'H3', 'bax': 'R3', 'zoo': 'CL1',
'boo': 'P3'})
blame = host.blame(['foo', 'bar', 'zoo', 'boo'])
self.assertEqual(blame['foo'].source.name, 'www1.example.com')
self.assertEqual(blame['foo'].variable.value, 'H1')
self.assertEqual(blame['bar'].source.name, 'cell_1')
self.assertEqual(blame['bar'].variable.value, 'C2')
self.assertEqual(blame['zoo'].source.name, 'cloud_1')
self.assertEqual(blame['zoo'].variable.value, 'CL1')
self.assertEqual(blame['boo'].source.name, 'project_1')
self.assertEqual(blame['boo'].variable.value, 'P3')
def test_hosts_create_duplicate_raises(self):
cloud_id = self.make_cloud(self.mock_project_id, 'cloud_1')
region_id = self.make_region(self.mock_project_id, cloud_id,
'region_1')
self.make_host(self.mock_project_id, cloud_id, region_id,
'www1.example.com',
IPAddress(u'10.1.2.101'), 'server')
new_host = {'name': 'www1.example.com', 'region_id': region_id,
'ip_address': IPAddress(u'10.1.2.101'),
'device_type': 'server',
'cloud_id': cloud_id, 'project_id': self.mock_project_id}
self.assertRaises(exceptions.DuplicateDevice, dbapi.hosts_create,
self.context, new_host)
def test_hosts_create_without_cell(self):
project_id, cloud_id, region_id, _, host_id = \
self.make_very_small_cloud()
host = dbapi.hosts_get_by_id(self.context, host_id)
self.assertEqual(host.cloud_id, cloud_id)
self.assertEqual(host.region.id, region_id)
self.assertEqual(host.region.name, 'region_1')
self.assertIsNone(host.cell)
# Verify resolved variables/blames override properly
self.assertEqual(
[obj.id for obj in host.resolution_order],
[host_id, region_id, cloud_id, uuid.UUID(project_id)])
self.assertEqual(
[variables for variables in host.resolution_order_variables],
[{'foo': 'H1', 'baz': 'H3'},
{'foo': 'R1', 'bar': 'R2', 'bax': 'R3'},
{'zoo': 'CL1'},
{'foo': 'P1', 'zoo': 'P2', 'boo': 'P3'}])
self.assertEqual(
host.resolved,
{'foo': 'H1', 'bar': 'R2', 'baz': 'H3', 'bax': 'R3', 'zoo': 'CL1',
'boo': 'P3'})
blame = host.blame(['foo', 'bar', 'zoo', 'boo'])
self.assertEqual(blame['foo'].source.name, 'www1.example.com')
self.assertEqual(blame['foo'].variable.value, 'H1')
self.assertEqual(blame['bar'].source.name, 'region_1')
self.assertEqual(blame['bar'].variable.value, 'R2')
self.assertEqual(blame['zoo'].source.name, 'cloud_1')
self.assertEqual(blame['zoo'].variable.value, 'CL1')
self.assertEqual(blame['boo'].source.name, 'project_1')
self.assertEqual(blame['boo'].variable.value, 'P3')
def test_hosts_update(self):
cloud_id = self.make_cloud(self.mock_project_id, 'cloud_1')
region_id = self.make_region(self.mock_project_id, cloud_id,
'region_1')
host_id = self.make_host(self.mock_project_id, cloud_id, region_id,
'example',
IPAddress(u'10.1.2.101'), 'server',
bar='bar2')
name = "Host_New"
res = dbapi.hosts_update(self.context, host_id, {'name': 'Host_New'})
self.assertEqual(res.name, name)
def test_hosts_variable_resolved_with_parent(self):
project_id = self.make_project(
'project_1',
foo='P1', zoo='P2', boo='P3')
cloud_id = self.make_cloud(
project_id,
'cloud_1',
zoo='CL1', zab='CL2')
region_id = self.make_region(
project_id,
cloud_id,
'region_1',
foo='R1', bar='R2', bax='R3')
cell_id = self.make_cell(project_id, cloud_id, region_id, 'cell_1',
bar='C2')
host1_id = self.make_host(project_id, cloud_id, region_id,
'www1.example.com',
IPAddress(u'10.1.2.101'), 'server',
cell_id=cell_id, foo='H1', baz='H3')
host2_id = self.make_host(project_id, cloud_id, region_id,
'www1.example2.com',
IPAddress(u'10.1.2.102'), 'server',
cell_id=cell_id, parent_id=host1_id)
host2 = dbapi.hosts_get_by_id(self.context, host2_id)
# Verify resolved variables/blames override properly
self.assertEqual(
[obj.id for obj in host2.resolution_order],
[host2_id, host1_id, cell_id, region_id, cloud_id,
uuid.UUID(project_id)])
self.assertEqual(
[variables for variables in host2.resolution_order_variables],
[{},
{'baz': 'H3', 'foo': 'H1'},
{'bar': 'C2'},
{'bar': 'R2', 'foo': 'R1', 'bax': 'R3'},
{'zoo': 'CL1', 'zab': 'CL2'},
{'foo': 'P1', 'zoo': 'P2', 'boo': 'P3'}])
self.assertEqual(
host2.resolved,
{'foo': 'H1', 'bar': 'C2', 'baz': 'H3', 'bax': 'R3', 'zoo': 'CL1',
'boo': 'P3', 'zab': 'CL2'})
blame = host2.blame(['foo', 'bar', 'zoo', 'boo', 'zab'])
self.assertEqual(blame['foo'].source.name, 'www1.example.com')
self.assertEqual(blame['foo'].variable.value, 'H1')
self.assertEqual(blame['bar'].source.name, 'cell_1')
self.assertEqual(blame['bar'].variable.value, 'C2')
self.assertEqual(blame['zoo'].source.name, 'cloud_1')
self.assertEqual(blame['zoo'].variable.value, 'CL1')
self.assertEqual(blame['zab'].source.name, 'cloud_1')
self.assertEqual(blame['zab'].variable.value, 'CL2')
self.assertEqual(blame['boo'].source.name, 'project_1')
self.assertEqual(blame['boo'].variable.value, 'P3')
def test_hosts_variables_no_resolved(self):
project_id = self.make_project('project_1', zoo='P2')
cloud_id = self.make_cloud(project_id, 'cloud_1')
region_id = self.make_region(project_id, cloud_id, 'region_1',
foo='R1')
host_id = self.make_host(project_id, cloud_id, region_id,
'www.example.xyz',
IPAddress(u'10.1.2.101'),
'server', bar='bar2')
host = dbapi.hosts_get_by_id(self.context, host_id)
self.assertEqual(host.name, 'www.example.xyz')
self.assertEqual(host.variables, {'bar': 'bar2'})
def test_hosts_resolved_vars_no_cells(self):
project_id = self.make_project('project_1')
cloud_id = self.make_cloud(project_id, 'cloud_1')
region_id = self.make_region(project_id, cloud_id, 'region_1',
foo='R1')
host_id = self.make_host(project_id, cloud_id, region_id,
'www.example.xyz',
IPAddress(u'10.1.2.101'),
'server', bar='bar2')
host = dbapi.hosts_get_by_id(self.context, host_id)
self.assertEqual(host.name, 'www.example.xyz')
self.assertEqual(host.resolved, {'bar': 'bar2', 'foo': 'R1'})
def test_host_labels_create(self):
cloud_id = self.make_cloud(self.mock_project_id, 'cloud_1')
region_id = self.make_region(self.mock_project_id, cloud_id,
'region_1',
foo='R1')
host_id = self.make_host(self.mock_project_id, cloud_id, region_id,
'www.example.xyz',
IPAddress(u'10.1.2.101'),
'server', bar='bar2')
labels = {"labels": ["tom", "jerry"]}
dbapi.hosts_labels_update(self.context, host_id, labels)
def test_host_labels_delete(self):
cloud_id = self.make_cloud(self.mock_project_id, 'cloud_1')
region_id = self.make_region(self.mock_project_id, cloud_id,
'region_1',
foo='R1')
host_id = self.make_host(self.mock_project_id, cloud_id, region_id,
'www.example.xyz',
IPAddress(u'10.1.2.101'),
'server', bar='bar2')
_labels = {"labels": ["tom", "jerry", "jones"]}
dbapi.hosts_labels_update(self.context, host_id, _labels)
host = dbapi.hosts_get_by_id(self.context, host_id)
self.assertEqual(sorted(host.labels), sorted(_labels["labels"]))
_dlabels = {"labels": ["tom"]}
dbapi.hosts_labels_delete(self.context, host_id, _dlabels)
host = dbapi.hosts_get_by_id(self.context, host_id)
self.assertEqual(host.labels, {"jerry", "jones"})
def test_hosts_get_all_with_label_filters(self):
cloud_id = self.make_cloud(self.mock_project_id, 'cloud_1')
region_id = self.make_region(self.mock_project_id, cloud_id,
'region_1')
labels = {"labels": ["compute"]}
host1 = self.make_host(
self.mock_project_id,
cloud_id,
region_id,
'www1.example.com',
IPAddress(u'10.1.2.101'),
'server',
)
dbapi.hosts_labels_update(self.context, host1, labels)
self.make_host(
self.mock_project_id,
cloud_id,
region_id,
'www1.example2.com',
IPAddress(u'10.1.2.102'),
'server',
)
res, _ = dbapi.hosts_get_all(self.context, {"label": "compute"},
default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0].name, 'www1.example.com')
def test_hosts_get_all_with_filter_cell_id(self):
project_id = self.make_project('project_1', foo='P1', zoo='P2')
cloud_id = self.make_cloud(project_id, 'cloud_1')
region_id = self.make_region(project_id, cloud_id, 'region_1',
foo='R1')
cell_id1 = self.make_cell(project_id, cloud_id, region_id, 'cell_1',
bar='C2')
cell_id2 = self.make_cell(project_id, cloud_id, region_id, 'cell_2',
bar='C2')
self.assertNotEqual(cell_id1, cell_id2)
self.make_host(
project_id,
cloud_id,
region_id,
'www.example.xyz',
IPAddress(u'10.1.2.101'),
'server',
cell_id=cell_id1,
)
self.make_host(
project_id,
cloud_id,
region_id,
'www.example.abc',
IPAddress(u'10.1.2.102'),
'server',
cell_id=cell_id2,
)
all_res, _ = dbapi.hosts_get_all(self.context, {}, default_pagination)
self.assertEqual(len(all_res), 2)
self.assertEqual(
len([host for host in all_res if host['cell_id'] == cell_id1]), 1
)
filters = {
"cell_id": cell_id1,
}
res, _ = dbapi.hosts_get_all(self.context, filters,
default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0].name, 'www.example.xyz')
def test_hosts_get_all_with_filters(self):
project_id = self.make_project('project_1', foo='P1', zoo='P2')
cloud_id = self.make_cloud(project_id, 'cloud_1')
region_id = self.make_region(project_id, cloud_id, 'region_1',
foo='R1')
host_id = self.make_host(project_id, cloud_id, region_id,
'www.example.xyz',
IPAddress(u'10.1.2.101'),
'server')
variables = {"key1": "value1", "key2": "value2"}
dbapi.variables_update_by_resource_id(
self.context, "hosts", host_id, variables
)
filters = {
"region_id": region_id,
"vars": "key2:value2",
}
res, _ = dbapi.hosts_get_all(self.context, filters,
default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0].name, 'www.example.xyz')
def test_hosts_get_with_key_value_filters(self):
project_id = self.make_project('project_1', foo='P1', zoo='P2')
cloud_id = self.make_cloud(project_id, 'cloud_1')
region_id = self.make_region(project_id, cloud_id, 'region_1',
foo='R1')
host1 = self.make_host(project_id, cloud_id, region_id,
'www.example.xyz',
IPAddress(u'10.1.2.101'),
'server')
variables = {"key1": "example1", "key2": "Tom"}
dbapi.variables_update_by_resource_id(
self.context, "hosts", host1, variables
)
# Second host with own variables
host2 = self.make_host(project_id, cloud_id, region_id,
'www.example2.xyz',
IPAddress(u'10.1.2.102'),
'server')
variables = {"key1": "example2", "key2": "Tom"}
dbapi.variables_update_by_resource_id(
self.context, "hosts", host2, variables
)
filters = {"vars": "key1:example2"}
res, _ = dbapi.hosts_get_all(self.context, filters, default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual('www.example2.xyz', res[0].name)
filters = {"vars": "key2:Tom"}
res, _ = dbapi.hosts_get_all(self.context, filters, default_pagination)
self.assertEqual(len(res), 2)
def test_hosts_get_all_with_filters_noexist(self):
project_id = self.make_project('project_1', foo='P1', zoo='P2')
cloud_id = self.make_cloud(project_id, 'cloud_1')
region_id = self.make_region(project_id, cloud_id, 'region_1',
foo='R1')
host_id = self.make_host(project_id, cloud_id, region_id,
'www.example.xyz',
IPAddress(u'10.1.2.101'),
'server')
variables = {"key1": "value1", "key2": "value2"}
dbapi.variables_update_by_resource_id(
self.context, "hosts", host_id, variables
)
filters = {
"region_id": 1,
"vars": "key1:value5",
}
res, _ = dbapi.hosts_get_all(self.context, filters,
default_pagination)
self.assertEqual(len(res), 0)
def test_hosts_create_sets_parent_id(self):
project_id = self.make_project('project_1')
cloud_id = self.make_cloud(project_id, 'cloud_1')
region_id = self.make_region(project_id, cloud_id, 'region_1')
parent_id = self.make_host(
project_id, cloud_id, region_id, '1.www.example.com',
IPAddress(u'10.1.2.101'), 'server'
)
child = dbapi.hosts_create(
self.context,
{
'project_id': project_id,
'cloud_id': cloud_id,
'region_id': region_id,
'hostname': '2.www.example.com',
'ip_address': IPAddress(u'10.1.2.102'),
'device_type': 'server',
'parent_id': parent_id,
}
)
self.assertEqual(parent_id, child.parent_id)
def test_hosts_update_sets_parent_id(self):
project_id = self.make_project('project_1')
cloud_id = self.make_cloud(project_id, 'cloud_1')
region_id = self.make_region(project_id, cloud_id, 'region_1')
parent_id = self.make_host(
project_id, cloud_id, region_id, '1.www.example.com',
IPAddress(u'10.1.2.101'), 'server'
)
child = dbapi.hosts_create(
self.context,
{
'project_id': project_id,
'cloud_id': cloud_id,
'region_id': region_id,
'hostname': '2.www.example.com',
'ip_address': IPAddress(u'10.1.2.102'),
'device_type': 'server',
'parent_id': None,
}
)
self.assertIsNone(child.parent_id)
child_update = dbapi.hosts_update(
self.context,
child.id,
{
'parent_id': parent_id,
}
)
self.assertEqual(parent_id, child_update.parent_id)
def test_hosts_update_fails_when_parent_id_set_to_own_id(self):
project_id = self.make_project('project_1')
cloud_id = self.make_cloud(project_id, 'cloud_1')
region_id = self.make_region(project_id, cloud_id, 'region_1')
host1 = dbapi.hosts_create(
self.context,
{
'project_id': project_id,
'cloud_id': cloud_id,
'region_id': region_id,
'hostname': '1.www.example.com',
'ip_address': IPAddress(u'10.1.2.101'),
'device_type': 'server',
'parent_id': None,
}
)
self.assertRaises(
exceptions.BadRequest,
dbapi.hosts_update,
self.context,
host1.id,
{
'parent_id': host1.id,
}
)
def test_hosts_update_fails_when_parent_set_to_descendant(self):
project_id = self.make_project('project_1')
cloud_id = self.make_cloud(project_id, 'cloud_1')
region_id = self.make_region(project_id, cloud_id, 'region_1')
parent = dbapi.hosts_create(
self.context,
{
'project_id': project_id,
'cloud_id': cloud_id,
'region_id': region_id,
'hostname': '1.www.example.com',
'ip_address': IPAddress(u'10.1.2.101'),
'device_type': 'server',
'parent_id': None,
}
)
child = dbapi.hosts_create(
self.context,
{
'project_id': project_id,
'cloud_id': cloud_id,
'region_id': region_id,
'hostname': '2.www.example.com',
'ip_address': IPAddress(u'10.1.2.102'),
'device_type': 'server',
'parent_id': parent.id,
}
)
grandchild = dbapi.hosts_create(
self.context,
{
'project_id': project_id,
'cloud_id': cloud_id,
'region_id': region_id,
'hostname': '3.www.example.com',
'ip_address': IPAddress(u'10.1.2.103'),
'device_type': 'server',
'parent_id': child.id,
}
)
self.assertRaises(
exceptions.BadRequest,
dbapi.hosts_update,
self.context,
parent.id,
{
'parent_id': grandchild.id,
}
)
def test_hosts_get_all_with_resolved_var_filters(self):
project_id = self.make_project('project_1', foo='P1', zoo='P2')
cloud_id = self.make_cloud(project_id, 'cloud_1')
region_id = self.make_region(
project_id, cloud_id, 'region_1', foo='R1')
switch_id = self.make_network_device(
project_id, cloud_id, region_id,
'switch1.example.com', IPAddress('10.1.2.101'), 'switch',
zoo='S1', bar='S2')
self.make_host(
project_id, cloud_id, region_id,
'www.example.xyz', IPAddress(u'10.1.2.101'), 'server',
parent_id=switch_id,
key1="value1", key2="value2")
self.make_host(
project_id, cloud_id, region_id,
'www2.example.xyz', IPAddress(u'10.1.2.102'), 'server',
parent_id=switch_id,
key1="value-will-not-match", key2="value2")
filters = {
"region_id": 1,
"vars": "key1:value1,zoo:S1,foo:R1",
"resolved-values": True,
}
res, _ = dbapi.hosts_get_all(
self.context, filters, default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0].name, 'www.example.xyz')
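The resolution_order, resolved, and blame assertions above all encode one scoping rule: a host's variables are looked up host, then parent device, then cell, then region, then cloud, then project, and the most specific scope wins on key collisions. A stand-alone restatement of that merge over plain dicts, mirroring the data in test_hosts_create (the resolve helper is illustrative only, not craton's implementation):
def resolve(scopes):
    # scopes is ordered most specific first, like resolution_order_variables;
    # applying them in reverse lets later (more specific) scopes overwrite.
    resolved = {}
    for variables in reversed(scopes):
        resolved.update(variables)
    return resolved

scopes = [
    {'foo': 'H1', 'baz': 'H3'},               # host
    {'bar': 'C2'},                            # cell
    {'foo': 'R1', 'bar': 'R2', 'bax': 'R3'},  # region
    {'zoo': 'CL1'},                           # cloud
    {'foo': 'P1', 'zoo': 'P2', 'boo': 'P3'},  # project
]
assert resolve(scopes) == {'foo': 'H1', 'bar': 'C2', 'baz': 'H3',
                           'bax': 'R3', 'zoo': 'CL1', 'boo': 'P3'}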


View File

@ -1,522 +0,0 @@
import uuid
from craton import exceptions
from craton.db import api as dbapi
from craton.tests.unit.db import base
default_pagination = {'limit': 30, 'marker': None}
project_id1 = uuid.uuid4().hex
cloud_id1 = uuid.uuid4().hex
network1 = {"name": "test network",
"cidr": "192.168.1.0/24",
"gateway": "192.168.1.1",
"netmask": "255.255.255.0",
"region_id": 1,
"project_id": project_id1,
"cloud_id": cloud_id1}
network2 = {"name": "test network2",
"cidr": "192.168.1.0/24",
"gateway": "192.168.1.1",
"netmask": "255.255.255.0",
"region_id": 2,
"project_id": project_id1,
"cloud_id": cloud_id1}
device1 = {"hostname": "switch1",
"model_name": "Model-X",
"region_id": 1,
"project_id": project_id1,
"device_type": "switch",
"ip_address": "192.168.1.1",
"cloud_id": cloud_id1}
device2 = {"hostname": "switch2",
"model_name": "Model-X",
"region_id": 2,
"project_id": project_id1,
"device_type": "switch",
"ip_address": "192.168.1.1",
"cloud_id": cloud_id1}
device3 = {"hostname": "foo1",
"model_name": "Model-Bar",
"region_id": 1,
"project_id": project_id1,
"device_type": "foo",
"ip_address": "192.168.1.2",
"cloud_id": cloud_id1}
network_interface1 = {"device_id": 1,
"project_id": project_id1,
"name": "eth1",
"ip_address": "192.168.0.2",
"interface_type": "ethernet"}
network_interface2 = {"device_id": 2,
"project_id": project_id1,
"name": "eth1",
"ip_address": "192.168.0.3",
"interface_type": "ethernet"}
class NetworksDBTestCase(base.DBTestCase):
def test_networks_create(self):
try:
dbapi.networks_create(self.context, network1)
except Exception:
self.fail("Networks create raised unexpected exception")
def test_network_create_duplicate_name_raises(self):
dbapi.networks_create(self.context, network1)
self.assertRaises(exceptions.DuplicateNetwork, dbapi.networks_create,
self.context, network1)
def test_networks_get_all(self):
dbapi.networks_create(self.context, network1)
dbapi.networks_create(self.context, network2)
filters = {}
res, _ = dbapi.networks_get_all(self.context, filters,
default_pagination)
self.assertEqual(len(res), 2)
def test_networks_get_all_filter_region(self):
dbapi.networks_create(self.context, network1)
dbapi.networks_create(self.context, network2)
filters = {
'region_id': network1['region_id'],
}
res, _ = dbapi.networks_get_all(self.context, filters,
default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0]['name'], 'test network')
def test_networks_get_by_id(self):
network = dbapi.networks_create(self.context, network1)
res = dbapi.networks_get_by_id(self.context, network.id)
self.assertEqual(res.name, 'test network')
    def test_networks_get_by_name_filter_no_exist(self):
dbapi.networks_create(self.context, network1)
filters = {"name": "foo", "region_id": network1['region_id']}
res, _ = dbapi.networks_get_all(self.context, filters,
default_pagination)
self.assertEqual(res, [])
def test_network_update(self):
network = dbapi.networks_create(self.context, network1)
res = dbapi.networks_get_by_id(self.context, network.id)
self.assertEqual(res.name, 'test network')
new_name = 'test_network1'
res = dbapi.networks_update(self.context, res.id,
{'name': 'test_network1'})
self.assertEqual(res.name, new_name)
def test_networks_get_by_id_no_exist_raises(self):
# Since no network is created, any id should raise
self.assertRaises(exceptions.NotFound, dbapi.networks_get_by_id,
self.context, 4)
def test_networks_delete(self):
network = dbapi.networks_create(self.context, network1)
# First make sure we have the network created
res = dbapi.networks_get_by_id(self.context, network.id)
self.assertEqual(res.id, network.id)
# Delete the network
dbapi.networks_delete(self.context, res.id)
self.assertRaises(exceptions.NotFound, dbapi.networks_get_by_id,
self.context, res.id)
class NetworkDevicesDBTestCase(base.DBTestCase):
def test_network_devices_create(self):
try:
dbapi.network_devices_create(self.context, device1)
except Exception:
self.fail("Network device create raised unexpected exception")
def test_network_devices_get_all(self):
dbapi.network_devices_create(self.context, device1)
dbapi.network_devices_create(self.context, device2)
filters = {}
res, _ = dbapi.network_devices_get_all(self.context, filters,
default_pagination)
self.assertEqual(len(res), 2)
def test_network_device_get_all_filter_region(self):
dbapi.network_devices_create(self.context, device1)
dbapi.network_devices_create(self.context, device2)
filters = {
'region_id': device1['region_id'],
}
res, _ = dbapi.network_devices_get_all(self.context, filters,
default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0]['hostname'], 'switch1')
def test_network_device_get_all_filter_name(self):
dbapi.network_devices_create(self.context, device1)
dbapi.network_devices_create(self.context, device2)
name = device1['hostname']
setup_res, _ = dbapi.network_devices_get_all(self.context, {},
default_pagination)
self.assertEqual(len(setup_res), 2)
matches = [dev for dev in setup_res if dev['hostname'] == name]
self.assertEqual(len(matches), 1)
filters = {
'name': name,
}
res, _ = dbapi.network_devices_get_all(self.context, filters,
default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0]['hostname'], name)
def test_network_device_get_all_filter_cell_id(self):
region_id = 1
cell1 = dbapi.cells_create(
self.context,
{
'name': 'cell1',
'project_id': project_id1,
'cloud_id': cloud_id1,
'region_id': region_id,
}
)
cell2 = dbapi.cells_create(
self.context,
{
'name': 'cell2',
'project_id': project_id1,
'cloud_id': cloud_id1,
'region_id': region_id,
}
)
dbapi.network_devices_create(
self.context, dict(cell_id=cell1.id, **device1)
)
dbapi.network_devices_create(
self.context, dict(cell_id=cell2.id, **device2)
)
setup_res, _ = dbapi.network_devices_get_all(self.context, {},
default_pagination)
self.assertEqual(len(setup_res), 2)
matches = [dev for dev in setup_res if dev['cell_id'] == cell1.id]
self.assertEqual(len(matches), 1)
filters = {
'cell_id': cell1.id,
}
res, _ = dbapi.network_devices_get_all(self.context, filters,
default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0]['cell_id'], cell1.id)
def test_network_device_get_all_filter_device_type(self):
dbapi.network_devices_create(self.context, device1)
dbapi.network_devices_create(self.context, device3)
dev_type = device1['device_type']
setup_res, _ = dbapi.network_devices_get_all(self.context, {},
default_pagination)
self.assertEqual(len(setup_res), 2)
matches = [dev for dev in setup_res if dev['device_type'] == dev_type]
self.assertEqual(len(matches), 1)
filters = {
'device_type': dev_type,
}
res, _ = dbapi.network_devices_get_all(self.context, filters,
default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0]['device_type'], dev_type)
def test_network_device_get_all_filter_id(self):
dbapi.network_devices_create(self.context, device1)
dbapi.network_devices_create(self.context, device2)
setup_res, _ = dbapi.network_devices_get_all(self.context, {},
default_pagination)
self.assertEqual(len(setup_res), 2)
dev_id = setup_res[0]['id']
self.assertNotEqual(dev_id, setup_res[1]['id'])
filters = {
'id': dev_id
}
res, _ = dbapi.network_devices_get_all(self.context, filters,
default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0]['id'], dev_id)
def test_network_device_get_all_filter_ip_address(self):
dbapi.network_devices_create(self.context, device1)
dbapi.network_devices_create(self.context, device3)
ip = device1['ip_address']
setup_res, _ = dbapi.network_devices_get_all(self.context, {},
default_pagination)
self.assertEqual(len(setup_res), 2)
matches = [dev for dev in setup_res if str(dev['ip_address']) == ip]
self.assertEqual(len(matches), 1)
filters = {
'ip_address': ip,
}
res, _ = dbapi.network_devices_get_all(self.context, filters,
default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(str(res[0]['ip_address']), ip)
def test_network_devices_get_by_id(self):
device = dbapi.network_devices_create(self.context, device1)
res = dbapi.network_devices_get_by_id(self.context, device.id)
self.assertEqual(res.hostname, 'switch1')
    def test_network_devices_get_by_filter_no_exist(self):
dbapi.network_devices_create(self.context, device1)
filters = {"hostname": "foo"}
res, _ = dbapi.networks_get_all(self.context, filters,
default_pagination)
self.assertEqual(res, [])
def test_network_devices_delete(self):
device = dbapi.network_devices_create(self.context, device1)
# First make sure we have the device
res = dbapi.network_devices_get_by_id(self.context, device.id)
self.assertEqual(res.id, device.id)
# Delete the device
dbapi.network_devices_delete(self.context, res.id)
self.assertRaises(exceptions.NotFound, dbapi.network_devices_get_by_id,
self.context, res.id)
def test_network_devices_labels_create(self):
device = dbapi.network_devices_create(self.context, device1)
labels = {"labels": ["tom", "jerry"]}
dbapi.network_devices_labels_update(self.context, device.id, labels)
def test_network_devices_update(self):
device = dbapi.network_devices_create(self.context, device1)
res = dbapi.network_devices_get_by_id(self.context, device.id)
self.assertEqual(res.hostname, 'switch1')
new_name = 'switch2'
res = dbapi.network_devices_update(self.context, res.id,
{'name': 'switch2'})
self.assertEqual(res.name, new_name)
def test_network_devices_labels_delete(self):
device = dbapi.network_devices_create(self.context, device1)
_labels = {"labels": ["tom", "jerry"]}
dbapi.network_devices_labels_update(self.context, device.id, _labels)
ndevice = dbapi.network_devices_get_by_id(self.context, device.id)
self.assertEqual(sorted(ndevice.labels), sorted(_labels["labels"]))
_dlabels = {"labels": ["tom"]}
dbapi.network_devices_labels_delete(self.context, ndevice.id, _dlabels)
ndevice = dbapi.network_devices_get_by_id(self.context, ndevice.id)
self.assertEqual(ndevice.labels, {"jerry"})
def test_network_devices_create_sets_parent_id(self):
parent = dbapi.network_devices_create(
self.context,
{
'project_id': project_id1,
'cloud_id': cloud_id1,
'region_id': 1,
'name': '1.www.example.com',
'ip_address': '10.1.2.102',
'device_type': 'switch',
}
)
child = dbapi.network_devices_create(
self.context,
{
'project_id': project_id1,
'cloud_id': cloud_id1,
'region_id': 1,
'name': '2.www.example.com',
'ip_address': '10.1.2.102',
'device_type': 'switch',
'parent_id': parent.id,
}
)
self.assertEqual(parent.id, child.parent_id)
def test_network_devices_update_sets_parent_id(self):
parent = dbapi.network_devices_create(
self.context,
{
'project_id': project_id1,
'cloud_id': cloud_id1,
'region_id': 1,
'name': '1.www.example.com',
'ip_address': '10.1.2.102',
'device_type': 'switch',
}
)
child = dbapi.network_devices_create(
self.context,
{
'project_id': project_id1,
'cloud_id': cloud_id1,
'region_id': 1,
'name': '2.www.example.com',
'ip_address': '10.1.2.102',
'device_type': 'switch',
'parent_id': None,
}
)
self.assertIsNone(child.parent_id)
child_update = dbapi.network_devices_update(
self.context,
child.id,
{
'parent_id': parent.id,
}
)
self.assertEqual(parent.id, child_update.parent_id)
def test_network_devices_update_fails_when_parent_id_set_to_own_id(self):
network_device1 = dbapi.network_devices_create(
self.context,
{
'project_id': project_id1,
'cloud_id': cloud_id1,
'region_id': 1,
'name': '1.www.example.com',
'ip_address': '10.1.2.101',
'device_type': 'switch',
'parent_id': None,
}
)
self.assertRaises(
exceptions.BadRequest,
dbapi.network_devices_update,
self.context,
network_device1.id,
{
'parent_id': network_device1.id,
}
)
def test_network_devices_update_fails_when_parent_set_to_descendant(self):
parent = dbapi.network_devices_create(
self.context,
{
'project_id': project_id1,
'cloud_id': cloud_id1,
'region_id': 1,
'name': '1.www.example.com',
'ip_address': '10.1.2.101',
'device_type': 'switch',
'parent_id': None,
}
)
child = dbapi.network_devices_create(
self.context,
{
'project_id': project_id1,
'cloud_id': cloud_id1,
'region_id': 1,
'name': '2.www.example.com',
'ip_address': '10.1.2.102',
'device_type': 'switch',
'parent_id': parent.id,
}
)
grandchild = dbapi.network_devices_create(
self.context,
{
'project_id': project_id1,
'cloud_id': cloud_id1,
'region_id': 1,
'name': '3.www.example.com',
'ip_address': '10.1.2.103',
'device_type': 'switch',
'parent_id': child.id,
}
)
self.assertRaises(
exceptions.BadRequest,
dbapi.network_devices_update,
self.context,
parent.id,
{
'parent_id': grandchild.id,
}
)
class NetworkInterfacesDBTestCase(base.DBTestCase):
def test_network_interfaces_create(self):
try:
dbapi.network_interfaces_create(self.context, network_interface1)
except Exception:
self.fail("Network interface create raised unexpected exception")
def test_network_interfaces_get_all(self):
dbapi.network_interfaces_create(self.context, network_interface1)
dbapi.network_interfaces_create(self.context, network_interface2)
filters = {}
res, _ = dbapi.network_interfaces_get_all(self.context, filters,
default_pagination)
self.assertEqual(len(res), 2)
self.assertEqual(
str(res[0]['ip_address']), network_interface1['ip_address']
)
self.assertEqual(
str(res[1]['ip_address']), network_interface2['ip_address']
)
def test_interface_get_all_filter_device_id(self):
dbapi.network_interfaces_create(self.context, network_interface1)
dbapi.network_interfaces_create(self.context, network_interface2)
filters = {
"device_id": 1,
}
res, _ = dbapi.network_interfaces_get_all(self.context, filters,
default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0]['name'], 'eth1')
def test_network_interfaces_get_by_id(self):
interface = dbapi.network_interfaces_create(self.context,
network_interface1)
res = dbapi.network_interfaces_get_by_id(self.context, interface.id)
self.assertEqual(res.name, 'eth1')
self.assertEqual(str(res.ip_address), network_interface1['ip_address'])
def test_network_interfaces_update(self):
interface = dbapi.network_interfaces_create(self.context,
network_interface1)
res = dbapi.network_interfaces_get_by_id(self.context, interface.id)
self.assertEqual(res.name, 'eth1')
new_name = 'eth2'
res = dbapi.network_interfaces_update(self.context, interface.id,
{'name': 'eth2'})
self.assertEqual(res.name, new_name)
self.assertEqual(str(res.ip_address), network_interface1['ip_address'])
def test_network_interfaces_delete(self):
interface = dbapi.network_interfaces_create(self.context,
network_interface1)
# First make sure we have the interface created
res = dbapi.network_interfaces_get_by_id(self.context, interface.id)
self.assertEqual(res.id, interface.id)
# Delete the device
dbapi.network_interfaces_delete(self.context, res.id)
self.assertRaises(exceptions.NotFound,
dbapi.network_interfaces_get_by_id,
self.context, res.id)
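The BadRequest cases above guard the device tree against cycles: a device may not become its own parent, nor be re-parented under one of its descendants. A small sketch of such an ancestor walk over plain ids (illustrative only; the real check lives inside the dbapi update path):
def would_create_cycle(device_id, new_parent_id, parent_of):
    # parent_of maps device id -> parent id (or None for a root device).
    # Walk up from the proposed parent; hitting the device itself means
    # the update would introduce a cycle.
    current = new_parent_id
    while current is not None:
        if current == device_id:
            return True
        current = parent_of.get(current)
    return False

parents = {1: None, 2: 1, 3: 2}               # 1 <- 2 <- 3
assert would_create_cycle(1, 3, parents)      # re-parenting 1 under its grandchild
assert not would_create_cycle(3, 1, parents)  # 1 is already a valid ancestor of 3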

View File

@ -1,99 +0,0 @@
import copy
import uuid
from craton import exceptions
from craton.db import api as dbapi
from craton.tests.unit.db import base
default_pagination = {'limit': 30, 'marker': None}
project1 = {'name': 'project1'}
project2 = {'name': 'project2'}
class ProjectsDBTestCase(base.DBTestCase):
def test_create_project(self):
# Set root, as only admin project can create other projects
project = dbapi.projects_create(self.context, project1)
self.assertEqual(project['name'], project1['name'])
def test_create_project_no_root_fails(self):
context = copy.deepcopy(self.context)
context.is_admin_project = False
self.assertRaises(exceptions.AdminRequired,
dbapi.projects_create,
context,
project1)
def test_project_get_all(self):
dbapi.projects_create(self.context, project1)
dbapi.projects_create(self.context, project2)
res, _ = dbapi.projects_get_all(self.context, {}, default_pagination)
self.assertEqual(len(res), 2)
def test_project_get_no_admin_project_raises(self):
self.context.is_admin_project = True
dbapi.projects_create(self.context, project1)
dbapi.projects_create(self.context, project2)
# Now set admin_project = false to become normal project user
self.context.is_admin_project = False
self.assertRaises(exceptions.AdminRequired,
dbapi.projects_get_all,
self.context,
{}, default_pagination)
def test_project_get_by_name(self):
dbapi.projects_create(self.context, project1)
dbapi.projects_create(self.context, project2)
res, _ = dbapi.projects_get_by_name(self.context, project1['name'], {},
default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0].name, project1['name'])
def test_project_get_by_id(self):
project = dbapi.projects_create(self.context, project1)
res = dbapi.projects_get_by_id(self.context, project['id'])
self.assertEqual(str(res['id']), str(project['id']))
def test_project_create_id_uuid_type(self):
project = dbapi.projects_create(self.context, project1)
self.assertEqual(type(project['id']), uuid.UUID)
def test_project_get_id_uuid_type(self):
project = dbapi.projects_create(self.context, project1)
res = dbapi.projects_get_by_id(self.context, project['id'])
self.assertEqual(type(res['id']), uuid.UUID)
def test_project_variables_update_does_update_variables(self):
create_res = dbapi.projects_create(self.context, project1)
res = dbapi.projects_get_by_id(self.context, create_res.id)
self.assertEqual(res.variables, {})
variables = {"key1": "value1", "key2": "value2"}
res = dbapi.variables_update_by_resource_id(
self.context, "projects", res.id, variables
)
self.assertEqual(res.variables, variables)
new_variables = {"key1": "tom", "key2": "cat"}
res = dbapi.variables_update_by_resource_id(
self.context, "projects", res.id, new_variables
)
self.assertEqual(res.variables, new_variables)
def test_project_variables_delete(self):
create_res = dbapi.projects_create(self.context, project1)
res = dbapi.projects_get_by_id(self.context, create_res.id)
self.assertEqual(res.variables, {})
variables = {"key1": "value1", "key2": "value2"}
res = dbapi.variables_update_by_resource_id(
self.context, "projects", res.id, variables
)
self.assertEqual(res.variables, variables)
# NOTE(sulo): we delete variables by their key
res = dbapi.variables_delete_by_resource_id(
self.context, "projects", res.id, {"key1": "key1"}
)
self.assertEqual(res.variables, {"key2": "value2"})

View File

@ -1,98 +0,0 @@
import uuid
from craton.db import api as dbapi
from craton.tests.unit.db import base
from craton import exceptions
default_pagination = {'limit': 30, 'marker': None}
project_id1 = uuid.uuid4().hex
cloud_id1 = uuid.uuid4().hex
region1 = {'project_id': project_id1, 'cloud_id': cloud_id1, 'name': 'region1'}
class RegionsDBTestCase(base.DBTestCase):
def test_region_create(self):
try:
dbapi.regions_create(self.context, region1)
except Exception:
self.fail("Region create raised unexpected exception")
def test_region_create_duplicate_name_raises(self):
dbapi.regions_create(self.context, region1)
self.assertRaises(exceptions.DuplicateRegion, dbapi.regions_create,
self.context, region1)
def test_regions_get_all(self):
dbapi.regions_create(self.context, region1)
filters = {}
res, _ = dbapi.regions_get_all(self.context, filters,
default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0]['name'], 'region1')
def test_regions_get_all_with_var_filters(self):
res = dbapi.regions_create(self.context, region1)
variables = {"key1": "value1", "key2": "value2"}
dbapi.variables_update_by_resource_id(
self.context, "regions", res.id, variables
)
filters = {}
filters["vars"] = "key1:value1"
regions, _ = dbapi.regions_get_all(
self.context, filters, default_pagination,
)
self.assertEqual(len(regions), 1)
self.assertEqual(regions[0].name, region1['name'])
def test_regions_get_all_with_var_filters_noexist(self):
res = dbapi.regions_create(self.context, region1)
variables = {"key1": "value1", "key2": "value2"}
dbapi.variables_update_by_resource_id(
self.context, "regions", res.id, variables
)
filters = {}
filters["vars"] = "key1:value12"
regions, _ = dbapi.regions_get_all(
self.context, filters, default_pagination,
)
self.assertEqual(len(regions), 0)
def test_region_get_by_name(self):
dbapi.regions_create(self.context, region1)
res = dbapi.regions_get_by_name(self.context, region1['name'])
self.assertEqual(res.name, 'region1')
def test_region_get_by_id(self):
dbapi.regions_create(self.context, region1)
res = dbapi.regions_get_by_id(self.context, 1)
self.assertEqual(res.name, 'region1')
    def test_region_get_by_name_no_exist_raises(self):
# TODO(sulo): fix sqlalchemy api first
pass
def test_region_get_by_id_no_exist_raises(self):
# TODO(sulo): fix sqlalchemy api first
pass
def test_region_update(self):
dbapi.regions_create(self.context, region1)
res = dbapi.regions_get_by_id(self.context, 1)
self.assertEqual(res.name, 'region1')
new_name = "region_New1"
res = dbapi.regions_update(self.context, res.id,
{'name': 'region_New1'})
self.assertEqual(res.name, new_name)
def test_region_delete(self):
dbapi.regions_create(self.context, region1)
# First make sure we have the region
res = dbapi.regions_get_by_name(self.context, region1['name'])
self.assertEqual(res.name, 'region1')
dbapi.regions_delete(self.context, res.id)
self.assertRaises(exceptions.NotFound,
dbapi.regions_get_by_name,
self.context, 'fake-region')

View File

@ -1,74 +0,0 @@
import uuid
from craton import exceptions
from craton.db import api as dbapi
from craton.tests.unit.db import base
default_pagination = {'limit': 30, 'marker': None}
project_id1 = uuid.uuid4().hex
project_id2 = uuid.uuid4().hex
root = {'project_id': project_id1, 'username': 'root', "is_admin": True,
"is_root": True}
user1 = {'project_id': project_id2, 'username': 'user1', "is_admin": True}
user2 = {'project_id': project_id2, 'username': 'user2', "is_admin": False}
class UsersDBTestCase(base.DBTestCase):
def make_user(self, user, is_admin=True, is_root=False):
# Set admin context first
self.context.is_admin = is_admin
self.context.is_admin_project = is_root
user = dbapi.users_create(self.context, user)
return user
def test_user_create(self):
user = self.make_user(user1)
self.assertEqual(user['username'], 'user1')
def test_user_create_no_admin_context_fails(self):
self.assertRaises(exceptions.AdminRequired,
self.make_user,
user1,
is_admin=False)
def test_users_get_all(self):
        # Ensure the context tenant is the same one that will make the
        # request; the test context has fake-tenant set by default.
self.context.tenant = user1['project_id']
dbapi.users_create(self.context, user1)
dbapi.users_create(self.context, user2)
        res, _ = dbapi.users_get_all(self.context, {}, default_pagination)
self.assertEqual(len(res), 2)
def test_user_get_all_no_project_context(self):
        # Ensure that when the request has no root context and is not for
        # the same project, no user info is given back.
self.make_user(user1)
self.context.tenant = uuid.uuid4().hex
res, _ = dbapi.users_get_all(self.context, {}, default_pagination)
self.assertEqual(len(res), 0)
def test_user_get_no_admin_context_raises(self):
self.make_user(user1)
self.context.is_admin = False
self.assertRaises(exceptions.AdminRequired,
dbapi.users_get_all,
self.context,
{}, default_pagination)
def test_user_get_by_name(self):
dbapi.users_create(self.context, user1)
dbapi.users_create(self.context, user2)
self.context.tenant = user1['project_id']
res, _ = dbapi.users_get_by_name(self.context, user1['username'], {},
default_pagination)
self.assertEqual(len(res), 1)
self.assertEqual(res[0]['username'], user1['username'])
def test_user_get_by_id(self):
user = self.make_user(user1)
res = dbapi.users_get_by_id(self.context, user["id"])
self.assertEqual(res["username"], user["username"])

View File

@ -1,473 +0,0 @@
from copy import deepcopy
from craton import exceptions
from craton.db import api as dbapi
from craton.tests.unit.db import base
class VariablesDBTestCase:
def _get_mock_resource_id(self):
# NOTE(thomasem): Project IDs are UUIDs not integers
if self.resources_type in ("projects",):
return "5a4e32e1-8571-4c2c-a088-a11f98900355"
return 1
def create_project(self, name, variables=None):
project = dbapi.projects_create(
self.context,
{
"name": name,
"variables": variables or {},
},
)
return project.id
def create_cloud(self, name, project_id, variables=None):
cloud = dbapi.clouds_create(
self.context,
{
'name': name,
'project_id': project_id,
'variables': variables or {},
},
)
return cloud.id
def create_region(self, name, project_id, cloud_id, variables=None):
region = dbapi.regions_create(
self.context,
{
'name': name,
'project_id': project_id,
'cloud_id': cloud_id,
'variables': variables or {},
},
)
return region.id
def create_cell(self, name, project_id, cloud_id, region_id,
variables=None):
cell = dbapi.cells_create(
self.context,
{
'name': name,
'project_id': project_id,
'cloud_id': cloud_id,
'region_id': region_id,
'variables': variables or {}
},
)
return cell.id
def create_host(
self, name, project_id, cloud_id, region_id, ip_address, host_type,
cell_id=None, parent_id=None, labels=None, variables=None,
):
host = {
'name': name,
'project_id': project_id,
'cloud_id': cloud_id,
'region_id': region_id,
'cell_id': cell_id,
'ip_address': ip_address,
'parent_id': parent_id,
'device_type': host_type,
'active': True,
'labels': labels or (),
'variables': variables or {},
}
host = dbapi.hosts_create(self.context, host)
self.assertEqual(variables, host.variables)
return host.id
def create_network(
self, name, project_id, cloud_id, region_id, cidr, gateway,
netmask, cell_id=None, variables=None,
):
network = {
'name': name,
'project_id': project_id,
'cloud_id': cloud_id,
'region_id': region_id,
'cell_id': cell_id,
'cidr': cidr,
'gateway': gateway,
'netmask': netmask,
'variables': variables or {},
}
network = dbapi.networks_create(self.context, network)
self.assertEqual(variables, network.variables)
return network.id
def create_network_device(
self, name, project_id, cloud_id, region_id, ip_address,
network_device_type, cell_id=None, parent_id=None, labels=None,
variables=None,
):
network_device = {
'name': name,
'project_id': project_id,
'cloud_id': cloud_id,
'region_id': region_id,
'cell_id': cell_id,
'ip_address': ip_address,
'parent_id': parent_id,
'device_type': network_device_type,
'active': True,
'labels': labels or (),
'variables': variables or {},
}
network_device = dbapi.network_devices_create(
self.context, network_device
)
self.assertEqual(variables, network_device.variables)
return network_device.id
def setup_host(self, variables):
project_id = self.create_project(name='project1')
cloud_id = self.create_cloud(name='cloud1', project_id=project_id)
region_id = self.create_region(
name='region1',
project_id=project_id,
cloud_id=cloud_id,
)
cell_id = self.create_cell(
name="cell1",
project_id=project_id,
cloud_id=cloud_id,
region_id=region_id,
)
host_id = self.create_host(
name="host1",
project_id=project_id,
cloud_id=cloud_id,
region_id=region_id,
ip_address="192.168.2.1",
host_type="server",
cell_id=cell_id,
parent_id=None,
labels=None,
variables=variables,
)
return host_id
def setup_network_device(self, variables):
project_id = self.create_project(name='project1')
cloud_id = self.create_cloud(name='cloud1', project_id=project_id)
region_id = self.create_region(
name='region1',
project_id=project_id,
cloud_id=cloud_id,
)
cell_id = self.create_cell(
name="cell1",
project_id=project_id,
cloud_id=cloud_id,
region_id=region_id
)
network_device_id = self.create_network_device(
name="network_device1",
project_id=project_id,
cloud_id=cloud_id,
region_id=region_id,
ip_address="192.168.2.1",
network_device_type="switch",
cell_id=cell_id,
parent_id=None,
labels=None,
variables=variables,
)
return network_device_id
def setup_network(self, variables):
project_id = self.create_project(name='project1')
cloud_id = self.create_cloud(name='cloud1', project_id=project_id)
region_id = self.create_region(
name='region1',
project_id=project_id,
cloud_id=cloud_id,
)
cell_id = self.create_cell(
name="cell1",
project_id=project_id,
cloud_id=cloud_id,
region_id=region_id,
)
network_id = self.create_network(
name="network1",
project_id=project_id,
cloud_id=cloud_id,
region_id=region_id,
cell_id=cell_id,
cidr="192.168.2.0/24",
gateway="192.168.2.1",
netmask="255.255.255.0",
variables=variables,
)
return network_id
def setup_cell(self, variables):
project_id = self.create_project(name='project1')
cloud_id = self.create_cloud(name='cloud1', project_id=project_id)
region_id = self.create_region(
name='region1',
project_id=project_id,
cloud_id=cloud_id,
)
cell_id = self.create_cell(
name="cell1",
project_id=project_id,
cloud_id=cloud_id,
region_id=region_id,
variables=variables,
)
return cell_id
def setup_region(self, variables):
project_id = self.create_project(name='project1')
cloud_id = self.create_cloud(name='cloud1', project_id=project_id)
region_id = self.create_region(
name='region1',
project_id=project_id,
cloud_id=cloud_id,
variables=variables,
)
return region_id
def setup_cloud(self, variables):
project_id = self.create_project(name='project1')
cloud_id = self.create_cloud(
name='cloud1',
project_id=project_id,
variables=variables,
)
return cloud_id
def setup_project(self, variables):
project_id = self.create_project(name='project1', variables=variables)
return project_id
def setup_resource(self, *args, **kwargs):
setup_fn = {
"cells": self.setup_cell,
"hosts": self.setup_host,
"networks": self.setup_network,
"network-devices": self.setup_network_device,
"regions": self.setup_region,
"clouds": self.setup_cloud,
"projects": self.setup_project,
}
        return setup_fn[self.resources_type](*args, **kwargs)
def test_get_resource_by_id_with_variables(self):
variables = {
"key1": "value1",
"key2": "value2",
"key3": "value3",
}
resource_id = self.setup_resource(deepcopy(variables))
test = dbapi.resource_get_by_id(
self.context, self.resources_type, resource_id
)
self.assertEqual(resource_id, test.id)
self.assertEqual(variables, test.variables)
def test_get_resource_by_id_not_found(self):
self.assertRaises(
exceptions.NotFound,
dbapi.resource_get_by_id,
context=self.context,
resources=self.resources_type,
resource_id=self._get_mock_resource_id(),
)
def test_variables_update_by_resource_id_existing_empty(self):
existing_variables = {}
resource_id = self.setup_resource(existing_variables)
variables = {
"key1": "value1",
"key2": "value2",
"key3": "value3",
}
test = dbapi.variables_update_by_resource_id(
self.context, self.resources_type, resource_id, deepcopy(variables)
)
self.assertEqual(resource_id, test.id)
self.assertEqual(variables, test.variables)
validate = dbapi.resource_get_by_id(
self.context, self.resources_type, resource_id
)
self.assertEqual(resource_id, validate.id)
self.assertEqual(variables, validate.variables)
def test_variables_update_by_resource_id_not_found(self):
self.assertRaises(
exceptions.NotFound,
dbapi.variables_update_by_resource_id,
context=self.context,
resources=self.resources_type,
resource_id=self._get_mock_resource_id(),
data={"key1": "value1"},
)
def test_variables_update_by_resource_id_modify_existing(self):
existing_variables = {
"key1": "value1",
"key2": "value2",
"key3": "value3",
}
update_variables = {
"key3": "newvalue3",
"key4": "value4",
}
result_variables = deepcopy(existing_variables)
result_variables.update(deepcopy(update_variables))
resource_id = self.setup_resource(existing_variables)
test = dbapi.variables_update_by_resource_id(
context=self.context,
resources=self.resources_type,
resource_id=resource_id,
data=deepcopy(update_variables)
)
self.assertEqual(resource_id, test.id)
self.assertEqual(result_variables, test.variables)
validate = dbapi.resource_get_by_id(
self.context, self.resources_type, resource_id
)
self.assertEqual(resource_id, validate.id)
self.assertEqual(result_variables, validate.variables)
def test_variables_delete_by_resource_id(self):
existing_variables = {
"key1": "value1",
"key2": "value2",
"key3": "value3",
}
delete_variables = [
"key2",
"key3",
]
result_variables = {"key1": "value1"}
resource_id = self.setup_resource(existing_variables)
test = dbapi.variables_delete_by_resource_id(
context=self.context,
resources=self.resources_type,
resource_id=resource_id,
data=delete_variables
)
self.assertEqual(resource_id, test.id)
self.assertEqual(result_variables, test.variables)
validate = dbapi.resource_get_by_id(
self.context, self.resources_type, resource_id
)
self.assertEqual(resource_id, validate.id)
self.assertEqual(result_variables, validate.variables)
def test_variables_delete_by_resource_id_resource_not_found(self):
self.assertRaises(
exceptions.NotFound,
dbapi.variables_delete_by_resource_id,
context=self.context,
resources=self.resources_type,
resource_id=self._get_mock_resource_id(),
data={"key1": "value1"},
)
def test_variables_delete_by_resource_id_variable_not_found(self):
existing_variables = {
"key1": "value1",
"key2": "value2",
"key3": "value3",
}
delete_variables = [
"key4",
]
result_variables = deepcopy(existing_variables)
resource_id = self.setup_resource(existing_variables)
test = dbapi.variables_delete_by_resource_id(
context=self.context,
resources=self.resources_type,
resource_id=resource_id,
data=delete_variables
)
self.assertEqual(resource_id, test.id)
self.assertEqual(result_variables, test.variables)
validate = dbapi.resource_get_by_id(
self.context, self.resources_type, resource_id
)
self.assertEqual(resource_id, validate.id)
self.assertEqual(result_variables, validate.variables)
class HostsVariablesDBTestCase(VariablesDBTestCase, base.DBTestCase):
resources_type = "hosts"
class NetworkDevicesVariablesDBTestCase(VariablesDBTestCase, base.DBTestCase):
resources_type = "network-devices"
class CellsVariablesDBTestCase(VariablesDBTestCase, base.DBTestCase):
resources_type = "cells"
class RegionsVariablesDBTestCase(VariablesDBTestCase, base.DBTestCase):
resources_type = "regions"
class NetworksVariablesDBTestCase(VariablesDBTestCase, base.DBTestCase):
resources_type = "networks"
class ProjectsVariablesDBTestCase(VariablesDBTestCase, base.DBTestCase):
resources_type = "projects"
class CloudsVariablesDBTestCase(VariablesDBTestCase, base.DBTestCase):
resources_type = "clouds"
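Stripped of the dbapi plumbing, the variable update and delete test cases above assert simple dictionary semantics; a plain-Python restatement (illustrative only):
# Updates via variables_update_by_resource_id merge into the stored
# variables, overwriting only the keys named in the request.
existing = {'key1': 'value1', 'key2': 'value2', 'key3': 'value3'}
existing.update({'key3': 'newvalue3', 'key4': 'value4'})
assert existing == {'key1': 'value1', 'key2': 'value2',
                    'key3': 'newvalue3', 'key4': 'value4'}

# Deletes via variables_delete_by_resource_id remove variables by key
# name; unknown keys are ignored rather than treated as an error (see
# the "variable_not_found" case above).
for key in ('key2', 'key3', 'key4', 'no-such-key'):
    existing.pop(key, None)
assert existing == {'key1': 'value1'}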

View File

@ -1,232 +0,0 @@
import copy
import uuid
"""
Provides some fake resources - region, cell, host and other related
objects for test.
"""
class Project(object):
def __init__(self, id, name, variables):
self.id = uuid.UUID(id)
self.name = name
self.variables = variables
def items(self):
return iter(self.__dict__.items())
PROJECT1 = Project("4534dcb4-dacd-474f-8afc-8bd5ab2d26e8",
"project1", {"key1": "value1", "key2": "value2"})
PROJECT2 = Project("77c527cb-837d-4fcb-bafb-af37ba3d13a4",
"project2", {"key1": "value1", "key2": "value2"})
class Cloud(object):
def __init__(self, id, name, project_id, variables, labels=None):
self.id = id
self.name = name
self.project_id = project_id
self.variables = variables
self.labels = labels
def items(self):
return iter(self.__dict__.items())
CLOUD1 = Cloud(1, "cloud1", "abcd", {"key1": "value1", "key2": "value2"})
CLOUD2 = Cloud(2, "cloud2", "abcd", {"key3": "value3", "key4": "value4"})
CLOUDS_LIST = [CLOUD1, CLOUD2]
class User(object):
def __init__(self, id, username, project_id, is_admin, is_root,
api_key, roles=None):
self.id = id
self.username = username
self.project_id = project_id
self.is_admin = is_admin
self.is_root = is_root
self.api_key = api_key
self.roles = roles
def items(self):
return iter(self.__dict__.items())
USER1 = User(1, 'user1', "2757a1b4-cd90-4891-886c-a246fd4e7064", True, False,
'xx-yy-zz')
USER2 = User(2, 'user2', "05d081ca-dcf5-4e96-b132-23b94d665799", False, False,
'aa-bb-cc')
class Cell(object):
def __init__(self, id, name, status, region_id, cloud_id, project_id,
variables, labels=None):
self.id = id
self.name = name
self.status = status
self.region_id = region_id
self.cloud_id = cloud_id
self.project_id = project_id
self.variables = variables
self.resolved = variables
self.labels = labels
def items(self):
return iter(self.__dict__.items())
CELL1 = Cell(1, "cell1", "active", 1, 1, 1, {"key1": "value1",
"key2": "value2"})
CELL2 = Cell(2, "cell2", "active", "2", "1", "abcd", {"key3": "value3",
"key4": "value4"})
CELL3 = Cell(3, "cell1", "active", 2, 1, 1, {"key1": "value1",
"key2": "value2"})
CELL_LIST = [CELL1, CELL2]
CELL_LIST2 = [CELL1, CELL3]
class Region(object):
def __init__(self, id, name, project_id, cloud_id, variables, labels=None):
self.id = id
self.name = name
self.project_id = project_id
self.cloud_id = cloud_id
self.variables = variables
self.resolved = variables
self.labels = labels
def items(self):
return iter(self.__dict__.items())
REGION1 = Region(1, "region1", "abcd", 1, {"key1": "value1", "key2": "value2"})
REGION2 = Region(2, "region2", "abcd", 1, {"key3": "value3", "key4": "value4"})
REGIONS_LIST = [REGION1, REGION2]
class Host(object):
def __init__(self, id, name, project_id, cloud_id, region_id, ip_address,
device_type, variables, labels=None, cell_id=None,
parent_id=None):
self.id = id
self.name = name
self.project_id = project_id
self.cloud_id = cloud_id
self.region_id = region_id
self.ip_address = ip_address
self.variables = variables
self.resolved = copy.copy(variables)
self.device_type = device_type
self.labels = labels
self.cell_id = cell_id
self.parent_id = parent_id
def items(self):
return iter(self.__dict__.items())
HOST1 = Host(1, "www.craton.com", 1, 1, 1, "192.168.1.1", "server",
{"key1": "value1", "key2": "value2"})
HOST2 = Host(2, "www.example.com", "1", "1", "1", "192.168.1.2", "server",
{"key1": "value1", "key2": "value2"})
HOST3 = Host(3, "www.example.net", "1", "1", "2", "10.10.0.1", "server",
{"key1": "value1", "key2": "value2"})
HOST4 = Host(4, "www.example.net", "1", "1", "2", "10.10.0.1", "server",
{"key1": "value1", "key2": "value2"}, labels=["a", "b"])
HOSTS_LIST_R1 = [HOST1, HOST2]
HOSTS_LIST_R2 = [HOST3]
HOSTS_LIST_R3 = [HOST1, HOST2, HOST3]
class Networks(object):
def __init__(self, id, name, project_id, cidr, gateway, netmask,
variables, cloud_id, region_id, labels=None):
self.id = id
self.name = name
self.project_id = project_id
self.cidr = cidr
self.gateway = gateway
self.netmask = netmask
self.variables = variables
self.resolved = copy.copy(variables)
self.labels = labels
self.cloud_id = cloud_id
self.region_id = region_id
def items(self):
return iter(self.__dict__.items())
NETWORK1 = Networks(1, "PrivateNetwork", 1, "192.168.1.0/24", "192.168.1.1",
"255.255.255.0", {"key1": "value1"}, 1, 1)
NETWORK2 = Networks(2, "PublicNetwork", 1, "10.10.1.0/24", "10.10.1.1",
"255.255.255.0", {"pkey1": "pvalue1"}, 1, 1)
NETWORK3 = Networks(3, "OtherNetwork", 1, "10.10.1.0/24", "10.10.1.2",
"255.255.255.0", {"okey1": "ovalue1"}, 1, 2)
NETWORKS_LIST = [NETWORK1, NETWORK2]
NETWORKS_LIST2 = [NETWORK1, NETWORK2, NETWORK3]
class NetworkDevice():
def __init__(self, id, name, project_id, cloud_id, region_id, device_type,
ip_address, variables, labels=None, cell_id=None,
parent_id=None):
self.name = name
self.id = id
self.project_id = project_id
self.region_id = region_id
self.device_type = device_type
self.ip_address = ip_address
self.variables = variables
self.resolved = copy.copy(variables)
self.labels = labels
self.cloud_id = cloud_id
self.cell_id = cell_id
self.parent_id = parent_id
def items(self):
return iter(self.__dict__.items())
NETWORK_DEVICE1 = NetworkDevice(1, "NetDevices1", 1, 1, 1, "Server",
"10.10.0.1",
{"key1": "value1", "key2": "value2"},
labels=["a", "b"])
NETWORK_DEVICE2 = NetworkDevice(2, "NetDevices2", 1, 1, 2, "Server",
"10.10.0.2",
{"key1": "value1", "key2": "value2"},
labels=["a", "b"])
NETWORK_DEVICE_LIST1 = [NETWORK_DEVICE1]
NETWORK_DEVICE_LIST2 = [NETWORK_DEVICE1, NETWORK_DEVICE2]
class NetworkInterface():
def __init__(self, id, name, device_id, project_id, interface_type,
ip_address, variables):
self.id = id
self.name = name
self.device_id = device_id
self.project_id = project_id
self.interface_type = interface_type
self.ip_address = ip_address
self.variables = variables
def items(self):
return iter(self.__dict__.items())
NETWORK_INTERFACE1 = NetworkInterface(1, "NetInterface", 1, 1,
"interface_type1", "10.10.0.1",
{"key1": "value1", "key2": "value2"})
NETWORK_INTERFACE2 = NetworkInterface(2, "NetInterface", 2, 1,
"interface_type2", "10.10.0.2",
{"key1": "value1", "key2": "value2"})
NETWORK_INTERFACE_LIST1 = [NETWORK_INTERFACE1]
NETWORK_INTERFACE_LIST2 = [NETWORK_INTERFACE1, NETWORK_INTERFACE2]
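A minimal sketch of how these fixtures might be consumed in a unit test; the import path is an assumption based on this diff's layout, not something stated in it.

import unittest

# Path is an assumption; adjust to wherever fake_resources lives in a checkout.
from craton.tests.unit import fake_resources as fake


class TestFakeResources(unittest.TestCase):
    def test_region_fixture_shape(self):
        # The fixtures expose items() so they can stand in for DB rows.
        region = dict(fake.REGION1.items())
        self.assertEqual("region1", region["name"])
        self.assertEqual({"key1": "value1", "key2": "value2"},
                         region["variables"])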

File diff suppressed because it is too large

View File

@ -1,27 +0,0 @@
from craton import api
from craton.tests import TestCase
class TestRouteURLNaming(TestCase):
pass
def generate_route_naming_functions(cls):
def gen_test(endpoint, url):
def test(self):
pattern = (
"^/v1/([a-z-]+|<any\('[a-z-]+'(, '[a-z-]+')*\):resources>)"
"(/<id>(/[a-z-]+)?)?"
)
self.assertRegex(url, pattern)
test_name = 'test_route_naming_{}'.format(endpoint)
setattr(cls, test_name, test)
app = api.setup_app()
for rule in app.url_map.iter_rules():
endpoint = rule.endpoint[3:]
url = rule.rule
gen_test(endpoint, url)
generate_route_naming_functions(TestRouteURLNaming)

View File

@ -1,207 +0,0 @@
import jsonschema
from craton import api
from craton.api.v1.schemas import filters, validators
from craton.tests import TestCase
VALIDATORS = {
"with_schema": [
('ansible_inventory', 'GET'),
('cells', 'GET'),
('cells', 'POST'),
('cells_id', 'GET'),
('cells_id', 'PUT'),
('devices', 'GET'),
('hosts', 'GET'),
('hosts', 'POST'),
('hosts_id', 'GET'),
('hosts_id', 'PUT'),
('hosts_labels', 'DELETE'),
('hosts_labels', 'GET'),
('hosts_labels', 'PUT'),
('network_devices', 'GET'),
('network_devices', 'POST'),
('network_devices_id', 'GET'),
('network_devices_id', 'PUT'),
('network_devices_labels', 'GET'),
('network_devices_labels', 'PUT'),
('network_devices_labels', 'DELETE'),
('network_interfaces', 'GET'),
('network_interfaces', 'POST'),
("network_interfaces_id", "GET"),
('network_interfaces_id', 'PUT'),
('networks', 'GET'),
('networks', 'POST'),
("networks_id", "GET"),
('networks_id', 'PUT'),
('projects', 'GET'),
('projects', 'POST'),
("projects_id", "GET"),
('regions', 'GET'),
('regions', 'POST'),
("regions_id", "GET"),
('regions_id', 'PUT'),
('clouds', 'GET'),
('clouds', 'POST'),
("clouds_id", "GET"),
('clouds_id', 'PUT'),
('users', 'GET'),
('users', 'POST'),
("users_id", "GET"),
('variables_with_resolve', 'DELETE'),
('variables_with_resolve', 'GET'),
('variables_with_resolve', 'PUT'),
('variables_without_resolve', 'DELETE'),
('variables_without_resolve', 'GET'),
('variables_without_resolve', 'PUT'),
],
"without_schema": [
('cells_id', 'DELETE'),
('hosts_id', 'DELETE'),
('network_devices_id', 'DELETE'),
("network_interfaces_id", "DELETE"),
("networks_id", "DELETE"),
("projects_id", "DELETE"),
("users_id", "DELETE"),
("regions_id", "DELETE"),
("clouds_id", "DELETE"),
]
}
class TestAPISchema(TestCase):
"""Confirm that valid schema are defined."""
def test_all_validators_have_test(self):
known = set(VALIDATORS["with_schema"] + VALIDATORS["without_schema"])
defined = set(validators.keys())
self.assertSetEqual(known, defined)
def generate_schema_validation_functions(cls):
def gen_validator_schema_test(endpoint, method):
def test(self):
try:
loc_schema = validators[(endpoint, method)]
except KeyError:
self.fail(
'The validator {} is missing from the schemas '
'validators object.'.format((endpoint, method))
)
self.assertEqual(len(loc_schema), 1)
locations = {
'GET': 'args',
'DELETE': 'json',
'PUT': 'json',
'POST': 'json',
}
location, schema = loc_schema.popitem()
self.assertIn(method, locations)
self.assertEqual(locations[method], location)
self.assertIs(
jsonschema.Draft4Validator.check_schema(schema), None
)
if 'type' not in schema or schema['type'] == 'object':
self.assertFalse(schema['additionalProperties'])
name = '_'.join(('validator', endpoint, method))
setattr(cls, 'test_valid_schema_{}'.format(name), test)
for (endpoint, method) in VALIDATORS["with_schema"]:
gen_validator_schema_test(endpoint, method)
def gen_no_validator_schema_test(endpoint, method):
def test(self):
try:
loc_schema = validators[(endpoint, method)]
except KeyError:
self.fail(
'The validator {} is missing from the schemas '
'validators object.'.format((endpoint, method))
)
self.assertEqual({}, loc_schema)
name = '_'.join(('validator', endpoint, method))
setattr(cls, 'test_no_schema_{}'.format(name), test)
for (endpoint, method) in VALIDATORS["without_schema"]:
gen_no_validator_schema_test(endpoint, method)
def gen_filter_test(name, schema):
def test(self):
self.assertIs(
jsonschema.Draft4Validator.check_schema(schema), None
)
if 'type' not in schema or schema['type'] == 'object':
self.assertFalse(schema['additionalProperties'])
setattr(cls, 'test_valid_schema_{}'.format(name), test)
for (endpoint, method), responses in filters.items():
for return_code, json in responses.items():
if json['schema']:
name = '_'.join(('filter', endpoint, method, str(return_code)))
gen_filter_test(name, json['schema'])
generate_schema_validation_functions(TestAPISchema)
class TestSchemaLocationInRoute(TestCase):
def setUp(self):
super().setUp()
self.app = api.setup_app()
def generate_endpoint_method_validation_functions(cls):
def gen_test(test_type, endpoint, method):
def test(self):
rules = [
rule for rule in self.app.url_map.iter_rules()
if rule.endpoint == endpoint and method in rule.methods
]
self.assertEqual(len(rules), 1)
test_name = 'test_{}_endpoint_method_in_routes_{}_{}'.format(
test_type, endpoint, method
)
setattr(cls, test_name, test)
for (_endpoint, method) in validators:
endpoint = "v1.{}".format(_endpoint)
gen_test('validators', endpoint, method)
for (_endpoint, method) in filters:
endpoint = "v1.{}".format(_endpoint)
gen_test('filters', endpoint, method)
generate_endpoint_method_validation_functions(TestSchemaLocationInRoute)
class TestRoutesInValidators(TestCase):
pass
def generate_route_validation_functions(cls):
def gen_test(test_type, checker, endpoint, method):
def test(self):
self.assertIn((endpoint, method), checker)
test_name = 'test_route_in_{}_{}_{}'.format(
test_type, endpoint, method
)
setattr(cls, test_name, test)
app = api.setup_app()
for rule in app.url_map.iter_rules():
# remove 'v1.' from start of endpoint
endpoint = rule.endpoint[3:]
for method in rule.methods:
if method == 'OPTIONS':
continue
elif method == 'HEAD' and 'GET' in rule.methods:
continue
else:
gen_test('validators', validators, endpoint, method)
gen_test('filters', filters, endpoint, method)
generate_route_validation_functions(TestRoutesInValidators)

View File

@ -1,22 +0,0 @@
"""Tests for craton.util module."""
import uuid
from craton import tests
from craton import util
class TestProjectIdUtilities(tests.TestCase):
"""Unit tests for the copy_project_id_into_json function."""
def test_adds_project_id_to_json(self):
"""Verify we add the project_id to the json body."""
project_id = uuid.uuid4().hex
self.context.tenant = project_id
json = util.copy_project_id_into_json(self.context, {})
self.assertDictEqual({'project_id': project_id}, json)
def test_defaults_project_id_to_empty_string(self):
"""Verify we default to an empty string if the context has no tenant."""
del self.context.tenant
json = util.copy_project_id_into_json(self.context, {})
self.assertDictEqual({'project_id': ''}, json)

View File

@ -1,82 +0,0 @@
"""Module containing generic utilies for Craton."""
from datetime import date
from decorator import decorator
from flask import json, Response
import werkzeug.exceptions
from oslo_log import log
import craton.exceptions as exceptions
LOG = log.getLogger(__name__)
def copy_project_id_into_json(context, json, project_id_key='project_id'):
"""Copy the project_id from the context into the JSON request body.
:param context:
The request context object.
:param json:
The parsed JSON request body.
:returns:
The JSON with the project-id from the headers added as the
"project_id" value in the JSON.
:rtype:
dict
"""
json[project_id_key] = getattr(context, 'tenant', '')
return json
class JSONEncoder(json.JSONEncoder):
def default(self, o):
if isinstance(o, date):
return o.isoformat()
return json.JSONEncoder.default(self, o)
JSON_KWARGS = {
"indent": 2,
"sort_keys": True,
"cls": JSONEncoder,
"separators": (",", ": "),
}
def handle_all_exceptions(e):
"""Generate error Flask response object from exception."""
headers = [("Content-Type", "application/json")]
if isinstance(e, exceptions.Base):
message = e.message
status = e.code
elif isinstance(e, werkzeug.exceptions.HTTPException):
message = e.description
status = e.code
# Werkzeug exceptions can include additional headers, those should be
# kept unless the header is "Content-Type" which is set by this
# function.
headers.extend(
h for h in e.get_headers(None) if h[0].lower() != "content-type"
)
else:
LOG.exception(e)
e_ = exceptions.UnknownException
message = e_.message
status = e_.code
body = {
"message": message,
"status": status,
}
body_ = "{}\n".format(json.dumps(body, **JSON_KWARGS))
return Response(body_, status, headers)
@decorator
def handle_all_exceptions_decorator(fn, *args, **kwargs):
try:
return fn(*args, **kwargs)
except Exception as e:
return handle_all_exceptions(e)

View File

@ -1,4 +0,0 @@
"""Tasks for managing the execution of Ansible playbooks
Takes failures into account.
"""

View File

@ -1,11 +0,0 @@
import abc
class WorkflowFactory(object, metaclass=abc.ABCMeta):
@abc.abstractmethod
def workflow(self):
"""Construct appropriate taskflow flow object.
:returns: A flow.Flow subclass
"""

View File

@ -1,40 +0,0 @@
import time
from oslo_log import log as logging
from taskflow import task
from taskflow.patterns import linear_flow
from craton.workflow import base
LOG = logging.getLogger(__name__)
class Sleep(task.Task):
def __init__(self, delay=10, **kwargs):
super(Sleep, self).__init__(**kwargs)
self.delay = delay
def execute(self):
LOG.info('Doing task %s', self)
time.sleep(self.delay)
class Fail(task.Task):
def execute(self):
LOG.info('Failing task %s', self)
raise RuntimeError('failure in task %s' % self)
class TestFlow(base.WorkflowFactory):
def __init__(self, task_delay=5):
super(TestFlow, self).__init__()
self.task_delay = task_delay
def workflow(self):
f = linear_flow.Flow('example')
f.add(
Sleep(name='step 1', delay=self.task_delay),
Sleep(name='step 2', delay=self.task_delay),
Fail(name='step 3'),
)
return f
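A sketch of driving the example flow directly with a taskflow engine; the module path is an assumption, and the final Fail task raises RuntimeError by design.

from taskflow import engines

# Module path is an assumption; TestFlow is the factory defined above.
from craton.workflow.testflow import TestFlow

flow = TestFlow(task_delay=1).workflow()
try:
    # Runs 'step 1' and 'step 2', then 'step 3' fails on purpose.
    engines.run(flow)
except RuntimeError as exc:
    print("flow failed as expected:", exc)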

View File

@ -1,74 +0,0 @@
import contextlib
import threading
from oslo_config import cfg
from oslo_log import log as logging
from oslo_utils import uuidutils
from taskflow.conductors import backends as conductors
from taskflow.jobs import backends as boards
from taskflow.persistence import backends as persistence_backends
from zake import fake_client
LOG = logging.getLogger(__name__)
CONF = cfg.CONF
OPTS = [
cfg.StrOpt('job_board_name', default='craton_jobs',
help='Name of job board used to store outstanding jobs.'),
cfg.IntOpt('max_simultaneous_jobs', default=9,
help='Number of tasks to run in parallel on this worker.'),
]
CONF.register_opts(OPTS)
TASKFLOW_OPTS = [
cfg.StrOpt('connection', default='memory',
help='Taskflow backend used for persisting task state.'),
cfg.StrOpt('job_board_url',
default='zookeeper://localhost?path=/taskflow/craton/jobs',
help='URL used to store outstanding jobs'),
cfg.BoolOpt('db_upgrade', default=True,
help='Upgrade DB schema on startup.'),
]
CONF.register_opts(TASKFLOW_OPTS, group='taskflow')
def _get_persistence_backend(conf):
return persistence_backends.fetch({
'connection': conf.taskflow.connection,
})
def _get_jobboard_backend(conf, persistence=None):
client = None
if conf.taskflow.connection == 'memory':
client = fake_client.FakeClient()
return boards.fetch(conf.job_board_name,
{'board': conf.taskflow.job_board_url},
client=client, persistence=persistence)
def start(conf):
persistence = _get_persistence_backend(conf)
if conf.taskflow.db_upgrade:
with contextlib.closing(persistence.get_connection()) as conn:
LOG.info('Checking for database schema upgrade')
conn.upgrade()
my_name = uuidutils.generate_uuid()
LOG.info('I am %s', my_name)
board = _get_jobboard_backend(conf, persistence=persistence)
conductor = conductors.fetch(
'nonblocking', my_name, board,
engine='parallel',
max_simultaneous_jobs=conf.max_simultaneous_jobs,
persistence=persistence)
board.connect()
LOG.debug('Starting taskflow conductor loop')
threading.Thread(target=conductor.run).start()
return persistence, board, conductor
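A sketch of how this helper might be driven from a small entry point; the option registration and project name shown here are assumptions and are not part of this diff.

import sys

from oslo_config import cfg
from oslo_log import log as logging

from craton.workflow import worker


def main(argv=sys.argv[1:]):
    # Register logging/config options, parse the command line, then hand
    # control to the conductor loop defined above.
    logging.register_options(cfg.CONF)
    cfg.CONF(argv, project='craton')
    logging.setup(cfg.CONF, 'craton')
    worker.start(cfg.CONF)


if __name__ == '__main__':
    main()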

View File

@ -1,21 +0,0 @@
Craton's API Reference Guide
============================
Resources:
.. toctree::
:maxdepth: 2
cells
devices
hosts
networks
net-devices
net-interfaces
regions
API Usage:
.. toctree::
:maxdepth: 2
filtering-by-variables

View File

@ -1,167 +0,0 @@
digraph structs {
node [shape=plaintext]
# overlap=false;
# splines=true;
# layout="neato";
Cli [label=<
<TABLE BORDER="0" CELLBORDER="1" CELLSPACING="0">
<TR><TD PORT="Cli"><font face="Helvetica" point-size="12">CLI<br/></font>
</TD></TR>
</TABLE>>];
PythonApi [label=<
<TABLE BORDER="0" CELLBORDER="1" CELLSPACING="0">
<TR><TD PORT="PythonApi"><font face="Helvetica" point-size="12">Python API<br/></font>
</TD></TR>
</TABLE>>];
CratonCore [label=<
<TABLE BORDER="0" CELLBORDER="0" CELLSPACING="4">
<TR>
<TD>
<TABLE BORDER="0" CELLBORDER="1" CELLSPACING="8">
<TR>
<TD PORT="Horizon"><font face="Helvetica" point-size="12">
<font face="Helvetica" point-size="12">Horizon UI<br/><font point-size="8">
Inventory,<br/>Workflow Panels</font></font>
</font></TD>
</TR>
<TR><TD PORT="Keystone"><font face="Helvetica" point-size="12">Keystone<br/><font point-size="8">
Principals, roles,<br/>privileges,<br/>catalog endpoints</font></font>
</TD></TR>
<TR><TD PORT="Barbican"><font face="Helvetica" point-size="12">Barbican<br/><font point-size="8">
Key Storage for<br/>TaskFlow Workers</font></font>
</TD></TR>
</TABLE>
</TD>
<TD>
<TABLE BORDER="1" CELLBORDER="1" CELLSPACING="4">
<!--font face="Helvetica"-->
<TR>
<TD rowspan="5" PORT="Rbac"><font face="Helvetica" point-size="12">RBAC</font></TD>
<TD colspan="4" PORT="RestApi"><font face="Helvetica" point-size="12">REST API Service (Flask)</font></TD>
</TR>
<TR>
<TD colspan="3" PORT="PythonObjectModel"><font face="Helvetica" point-size="12">Python Object Model</font></TD>
<TD colspan="1" PORT="OsloCache"><font face="Helvetica" point-size="12">oslo.cache</font></TD>
</TR>
<TR>
<TD colspan="2" PORT="InventoryFabric"><font face="Helvetica" point-size="12">Inventory Fabric</font></TD>
<TD colspan="2" PORT="Workflows"><font face="Helvetica" point-size="12">Workflows</font></TD>
</TR>
<TR>
<TD colspan="1" PORT="VirtualizedVariables"><font face="Helvetica" point-size="12">Virtualized <br/>Variables</font></TD>
<TD colspan="2" PORT="DefaultInventoryModel"><font face="Helvetica" point-size="12">Default<br/>Inventory<br/>Model</font></TD>
<TD colspan="1" PORT="TaskFlowController"><font face="Helvetica" point-size="12">TaskFlow<br/>Controller</font></TD>
</TR>
<TR>
<TD colspan="1" PORT="VariablePlugin"><font face="Helvetica" point-size="12">Variable<br/>Plugin<br/>(Stevedore)</font></TD>
<TD colspan="2" PORT="SqlAlchemy"><font face="Helvetica" point-size="12">SQL<br/>Alchemy</font></TD>
<TD colspan="1" PORT="WorkflowPlugin"><font face="Helvetica" point-size="12">Workflow<br/>Plugin<br/>(Stevedore)</font></TD>
</TR>
<!--/font-->
</TABLE>
</TD>
<TD>
<TABLE BORDER="0" CELLBORDER="1" CELLSPACING="8">
<TR><TD COLSPAN="2" PORT="Redis"><font face="Helvetica" point-size="12">REDIS<br/></font>
</TD></TR>
<TR><TD COLSPAN="2" PORT="MySqlGalera"><font face="Helvetica" point-size="12">MySQL/Galera<br/></font>
</TD></TR>
<TR>
<TD PORT="TfJobBoard"><font face="Helvetica" point-size="12">TF<br/>JobBoard<br/></font>
</TD>
<TD PORT="WaLogCapture" bgcolor="#D6DBDF"><font face="Helvetica" point-size="12">WA Log<br/>Capture<br/></font>
</TD>
</TR>
<TR>
<TD ><font face="Helvetica" point-size="12">TF<br/>Worker<br/>Pool<br/></font>
</TD>
<TD bgcolor="#D7BDE2"><font face="Helvetica" point-size="12" >ZooKeeper<br/></font>
</TD>
</TR>
</TABLE>
</TD>
</TR>
</TABLE>
>];
NovaPlugin [label=<
<TABLE BORDER="0" CELLBORDER="1" CELLSPACING="0">
<TR><TD PORT="NovaPlugin"><font face="Helvetica" point-size="12">Nova Plugin<br/><font point-size="8">
(Inventory)</font></font>
</TD></TR>
</TABLE>>];
HistoryPlugin [label=<
<TABLE BORDER="0" CELLBORDER="1" CELLSPACING="0">
<TR><TD PORT="HistoryPlugin" bgcolor="#D6DBDF"><font face="Helvetica" point-size="12">History Plugin<br/><font point-size="8">
(Inventory)</font></font>
</TD></TR>
</TABLE>>];
AnsiblePlugin [label=<
<TABLE BORDER="0" CELLBORDER="1" CELLSPACING="0">
<TR><TD PORT="AnsiblePlugin"><font face="Helvetica" point-size="12">Ansible Plugin<br/><font point-size="8">
(Workflow)</font></font>
</TD></TR>
</TABLE>>];
HistoricalData [label=<
<TABLE BORDER="0" CELLBORDER="1" CELLSPACING="0">
<TR><TD PORT="HistoricalData" bgcolor="#D6DBDF"><font face="Helvetica" point-size="12">Historica lData</font>
</TD></TR>
</TABLE>>];
Legend [label=<
<TABLE BORDER="0" CELLBORDER="0" CELLSPACING="0">
<TR><TD><font face="Helvetica" point-size="12">Legend</font>
</TD></TR>
<TR><TD border="1" bgcolor="#D7BDE2"><font face="Helvetica" point-size="10">Used For Scaling</font>
</TD></TR>
<TR><TD border="1" bgcolor="#D6DBDF"><font face="Helvetica" point-size="10">Future Work</font>
</TD></TR>
</TABLE>>];
//UndercloudIntegrations [pos="1,1"];
#subgraph cluster1 {
# style=invis;
# Barbican;
# Horizon;
# Keystone;
# }
ranksep=.25;
#size = "8,8";
#{ rank = same; Horizon; CratonCore:PythonObjectModel; }
#{ rank = same; UndercloudIntegrations; CratonCore; }
#Horizon -> Keystone [style=invis]
NovaPlugin -> Legend [style=invis];
CratonCore:Barbican -> Legend [style=invis];
CratonCore:WaLogCapture -> HistoricalData:HistoricalData;
HistoryPlugin:HistoryPlugin -> HistoricalData:HistoricalData;
CratonCore:Horizon -> PythonApi:PythonApi [constraint=false];
CratonCore:Rbac -> CratonCore:Keystone;
PythonApi:PythonApi -> CratonCore:RestApi;
Cli:Cli -> PythonApi:PythonApi;
CratonCore:VariablePlugin -> NovaPlugin:NovaPlugin;
CratonCore:VariablePlugin -> HistoryPlugin:HistoryPlugin;
CratonCore:WorkflowPlugin -> AnsiblePlugin:AnsiblePlugin;
CratonCore:OsloCache -> CratonCore:Redis [constraint=false];
CratonCore:SqlAlchemy -> CratonCore:MySqlGalera;
}

View File

@ -1,75 +0,0 @@
Architecture
============
.. graphviz:: arch-diagram.dot
CLI
---
TODO: Add Documentation
Python API
----------
TODO: Add Documentation
RBAC
----
TODO: Add Documentation
REST API Service (Flask)
------------------------
TODO: Add Documentation
Python Object Model
-------------------
TODO: Add Documentation
oslo.cache
----------
TODO: Add Documentation
Inventory Fabric
----------------
TODO: Add Documentation
Workflows
---------
TODO: Add Documentation
Virtualized Variables
---------------------
TODO: Add Documentation
Default Inventory Model
-----------------------
TODO: Add Documentation
TaskFlow Controller
-------------------
TODO: Add Documentation
Variable Plugin (Stevedore)
---------------------------
TODO: Add Documentation
SQL Alchemy
-----------
TODO: Add Documentation
Workflow Plugin (Stevedore)
---------------------------
TODO: Add Documentation
Nova Plugin
-----------
TODO: Add Documentation
History Plugin
--------------
TODO: Add Documentation
Ansible Plugin
--------------
TODO: Add Documentation

View File

@ -1,402 +0,0 @@
.. _cells:
=====
Cells
=====
Definition of cell
Create Cell
===========
:POST: /v1/cells
Create a new Cell
Normal response codes: OK(201)
Error response codes: invalid request(400), validation exception(405)
Request
-------
+------------+------+---------+-------------------------+
| Name | In | Type | Description |
+============+======+=========+=========================+
| name | body | string | Unique name of the cell |
+------------+------+---------+-------------------------+
| region_id | body | integer | Unique ID of the region |
+------------+------+---------+-------------------------+
| labels | body | string | User defined labels |
+------------+------+---------+-------------------------+
| note | body | string | Note used for governance|
+------------+------+---------+-------------------------+
| variables | body | object | User defined variables |
+------------+------+---------+-------------------------+
Required Header
^^^^^^^^^^^^^^^
- Content-Type: application/json
- X-Auth-Token
- X-Auth-User
- X-Auth-Project
Example Cell Create
*******************
.. code-block:: bash
curl -i "http://${MY_IP}:7780/v1/cells" \
-d '{"name": "myCell", "region_id": 1}' \
-H "Content-Type: application/json" \
-H "X-Auth-Token: demo" \
-H "X-Auth-User: demo" \
-H "X-Auth-Project: 717e9a216e2d44e0bc848398563bda06"
Response
--------
+-----------+------+---------+-------------------------------+
| Name | In | Type | Description |
+===========+======+=========+===============================+
| cell | body | object | - id |
| | | | - name |
| | | | - region_id |
| | | | - labels |
| | | | - note |
| | | | - variables |
+-----------+------+---------+-------------------------------+
| id | body | integer | Unique ID of the cell |
+-----------+------+---------+-------------------------------+
| name | body | string | Unique name of the cell |
+-----------+------+---------+-------------------------------+
| region_id | body | integer | Unique ID of the cell's region|
+-----------+------+---------+-------------------------------+
| labels | body | string | User defined labels |
+-----------+------+---------+-------------------------------+
| note | body | string | Note used for governance |
+-----------+------+---------+-------------------------------+
| variables | body | object | User defined variables |
+-----------+------+---------+-------------------------------+
Example Cell Create
*******************
.. code-block:: json
{
"id": 1,
"name": "myCell",
"note": null,
"region_id": 1
}
List Cells
==========
:GET: /v1/cells?region_id=
Gets all Cells
Normal response codes: OK(200)
Error response codes: invalid request(400), cell not found(404), validation exception(405)
Default response: unexpected error
Request
-------
+-----------+-------+--------+---------+----------------------------------+
| Name | In | Type | Required| Description |
+===========+=======+========+=========+==================================+
| region_id | query | string | Yes | ID of the region to get cells for|
+-----------+-------+--------+---------+----------------------------------+
Required Header
^^^^^^^^^^^^^^^
- Content-Type: application/json
- X-Auth-Token
- X-Auth-User
- X-Auth-Project
Example Cell List
*****************
.. code-block:: bash
curl -i "http://${MY_IP}:7780/v1/cells?region_id=1" \
-H "Content-Type: application/json" \
-H "X-Auth-Token: demo" \
-H "X-Auth-User: demo" \
-H "X-Auth-Project: 717e9a216e2d44e0bc848398563bda06"
Response
--------
+------------+------+---------+-------------------------------+
| Name | In | Type | Description |
+============+======+=========+===============================+
| cells | body | array | Array of cell objects |
+------------+------+---------+-------------------------------+
| id | body | integer | Unique ID of the cell |
+------------+------+---------+-------------------------------+
| name | body | string | Unique name of the cell |
+------------+------+---------+-------------------------------+
| region_id | body | integer | Unique ID of the cell's region|
+------------+------+---------+-------------------------------+
| labels | body | string | User defined labels |
+------------+------+---------+-------------------------------+
| note | body | string | Note used for governance |
+------------+------+---------+-------------------------------+
| variables | body | object | User defined variables |
+------------+------+---------+-------------------------------+
Example Cell List
*****************
.. code-block:: json
[
{
"id": 2,
"name": "cellJr",
"note": null,
"region_id": 1
},
{
"id": 1,
"name": "myCell",
"note": null,
"region_id": 1
}
]
.. todo:: **Example Unexpected Error**

   .. literalinclude:: ./api_samples/errors/errors-unexpected-resp.json
      :language: javascript
Update Cell
===========
:PUT: /v1/cells/{id}
Update an existing cell
Normal response codes: OK(200)
Error response codes: invalid request(400), cell not found(404), validation exception(405)
Request
-------
+----------+------+---------+------------------------------------+
| Name | In | Type | Description |
+==========+======+=========+====================================+
| name | body | string | Unique name of the cell |
+----------+------+---------+------------------------------------+
| labels | body | string | User defined labels |
+----------+------+---------+------------------------------------+
| note | body | string | Note used for governance |
+----------+------+---------+------------------------------------+
Required Header
^^^^^^^^^^^^^^^
- Content-Type: application/json
- X-Auth-Token
- X-Auth-User
- X-Auth-Project
Example Cell Update
*******************
.. code-block:: bash
curl -i "http://${MY_IP}:7780/v1/cells/1" \
-XPUT \
-d '{"name": "changedName"}' \
-H "Content-Type: application/json" \
-H "X-Auth-Token: demo" \
-H "X-Auth-User: demo" \
-H "X-Auth-Project: 717e9a216e2d44e0bc848398563bda06"
Response
--------
+----------+------+---------+-------------------------------+
| Name | In | Type | Description |
+==========+======+=========+===============================+
| cell | body | object | - id |
| | | | - name |
| | | | - region_id |
| | | | - labels |
| | | | - note |
| | | | - variables |
+----------+------+---------+-------------------------------+
| id | body | integer | Unique ID of the cell |
+----------+------+---------+-------------------------------+
| name | body | string | Unique name of the cell |
+----------+------+---------+-------------------------------+
| region_id| body | integer | Unique ID of the cell's region|
+----------+------+---------+-------------------------------+
| labels | body | string | User defined labels |
+----------+------+---------+-------------------------------+
| note | body | string | Note used for governance |
+----------+------+---------+-------------------------------+
| variables| body | object | User defined variables |
+----------+------+---------+-------------------------------+
Example Cell Update
*******************
.. code-block:: json
{
"id": 1,
"name": "changedName",
"note": null,
"project_id": "717e9a21-6e2d-44e0-bc84-8398563bda06",
"region_id": 1
}
Update Cell Variables
=====================
:PUT: /v1/cells/{id}/variables
Update user defined variables for the cell
Normal response codes: OK(200)
Error response codes: invalid request(400), cell not found(404), validation exception(405)
Request
-------
+--------+------+---------+------------------------------------+
| Name | In | Type | Description |
+========+======+=========+====================================+
| key | body | string | Identifier |
+--------+------+---------+------------------------------------+
| value | body | object | Data |
+--------+------+---------+------------------------------------+
| id | path | integer | Unique ID of the cell to be updated|
+--------+------+---------+------------------------------------+
Required Header
^^^^^^^^^^^^^^^
- Content-Type: application/json
- X-Auth-Token
- X-Auth-User
- X-Auth-Project
Example Cell Update Variables
*****************************
.. code-block:: bash
curl -i "http://${MY_IP}:7780/v1/cells/1/variables" \
-XPUT \
-d '{"newKey": "sampleKey"}' \
-H "Content-Type: application/json" \
-H "X-Auth-Token: demo" \
-H "X-Auth-User: demo" \
-H "X-Auth-Project: 717e9a216e2d44e0bc848398563bda06"
Response
--------
+--------+------+---------+-------------------------+
| Name | In | Type | Description |
+========+======+=========+=========================+
| key | body | string | Identifier |
+--------+------+---------+-------------------------+
| value | body | object | Data |
+--------+------+---------+-------------------------+
Example Cell Update Variables
*****************************
.. code-block:: json
{
"variables":
{
"newKey": “sampleKey”
}
}
Delete Cell
===========
:DELETE: /v1/cells/{id}
Deletes an existing record of a Cell
Normal response codes: no content(204)
Error response codes: invalid request(400), cell not found(404)
Request
-------
+--------+------+---------+------------------------------------+
| Name | In | Type | Description |
+========+======+=========+====================================+
| id | path | integer | Unique ID of the cell to be deleted|
+--------+------+---------+------------------------------------+
Required Header
^^^^^^^^^^^^^^^
- Content-Type: application/json
- X-Auth-Token
- X-Auth-User
- X-Auth-Project
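Example Cell Delete
*******************
The following request is illustrative; it follows the same endpoint and header pattern as the calls above.
.. code-block:: bash
curl -i "http://${MY_IP}:7780/v1/cells/1" \
-XDELETE \
-H "Content-Type: application/json" \
-H "X-Auth-Token: demo" \
-H "X-Auth-User: demo" \
-H "X-Auth-Project: 717e9a216e2d44e0bc848398563bda06"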
Response
--------
No body content is returned on a successful DELETE
Delete Cell Variables
=====================
:DELETE: /v1/cells/{id}/variables
Delete existing key/value variables for the cell
Normal response codes: no content(204)
Error response codes: invalid request(400), cell not found(404), validation exception(405)
Request
-------
+--------+------+---------+-------------------------+
| Name | In | Type | Description |
+========+======+=========+=========================+
| id | path | integer | Unique ID of the cell |
+--------+------+---------+-------------------------+
| key | body | string | Identifier to be deleted|
+--------+------+---------+-------------------------+
| value | body | object | Data to be deleted |
+--------+------+---------+-------------------------+
Required Header
^^^^^^^^^^^^^^^
- Content-Type: application/json
- X-Auth-Token
- X-Auth-User
- X-Auth-Project
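Example Cell Delete Variables
*****************************
The following request is illustrative; the body shown here (a JSON list of the keys to remove) is an assumption based on the request table above.
.. code-block:: bash
curl -i "http://${MY_IP}:7780/v1/cells/1/variables" \
-XDELETE \
-d '["newKey"]' \
-H "Content-Type: application/json" \
-H "X-Auth-Token: demo" \
-H "X-Auth-User: demo" \
-H "X-Auth-Project: 717e9a216e2d44e0bc848398563bda06"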
Response
--------
No body content is returned on a successful DELETE

Some files were not shown because too many files have changed in this diff