Retire repository

Fuel (in the openstack namespace) and fuel-ccp (in the x namespace)
repositories are unused and ready to retire.

This change removes all content from the repository and adds the usual
README file pointing out that the repository is retired, following the
process from
https://docs.openstack.org/infra/manual/drivers.html#retiring-a-project

See also
http://lists.openstack.org/pipermail/openstack-discuss/2019-December/011647.html

Depends-On: https://review.opendev.org/699362
Change-Id: I6a9128d5c87d7d4a2ca77db96ba6c8d9d2d91b17
.gitignore (62 changes, vendored)
@@ -1,62 +0,0 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]

# C extensions
*.so

# Distribution / packaging
.Python
env/
build/
develop-eggs/
dist/
downloads/
eggs/
lib/
lib64/
parts/
sdist/
var/
*.egg-info/
.installed.cfg
*.egg

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.cache
nosetests.xml
coverage.xml

# Translations
*.mo
*.pot

# Django stuff:
*.log

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Custom options
.venv/
.idea
*.swp
analytics/static/bower_components
analytics/static/node_modules
analytics/static/js/libs
LICENSE (176 changes)
@@ -1,176 +0,0 @@

                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.
MAINTAINERS (59 changes)
@@ -1,59 +0,0 @@
---
description:
  For Fuel team structure and contribution policy, see [1].

  This is repository level MAINTAINERS file. All contributions to this
  repository must be approved by one or more Core Reviewers [2].
  If you are contributing to files (or create new directories) in
  root folder of this repository, please contact Core Reviewers for
  review and merge requests.

  If you are contributing to subfolders of this repository, please
  check 'maintainers' section of this file in order to find maintainers
  for those specific modules.

  It is mandatory to get +1 from one or more maintainers before asking
  Core Reviewers for review/merge in order to decrease a load on Core
  Reviewers [3]. Exceptions are when maintainers are actually cores, or
  when maintainers are not available for some reason (e.g. on vacation).

  [1] https://github.com/stackforge/fuel-specs/blob/e9946cbacd9470bf9f78f3b3fc44dc50f84f231b/policy/team-structure.rst
  [2] https://review.openstack.org/#/admin/groups/663,members
  [3] http://lists.openstack.org/pipermail/openstack-dev/2015-August/072406.html

  Please keep this file in YAML format in order to allow helper scripts
  to read this as a configuration data.

maintainers:

- analytics/:
  - name: Alexander Kislitsky
    email: akislitsky@mirantis.com
    IRC: akislitsky

- analytics/static/:
  - name: Ekaterina Pimenova
    email: kpimenova@mirantis.com
    IRC: kat_pimenova

  - name: Vitaly Kramskikh
    email: vkramskikh@mirantis.com
    IRC: vkramskikh

- collector/:
  - name: Aleksei Kasatkin
    email: akasatkin@mirantis.com
    IRC: akasatkin

  - name: Artem Roma
    email: aroma@mirantis.com
    IRC: a_teem

  - name: Alexander Kislitsky
    email: akislitsky@mirantis.com
    IRC: akislitsky

- migration/:
  - name: Alexander Kislitsky
    email: akislitsky@mirantis.com
    IRC: akislitsky
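The MAINTAINERS file above is deliberately kept in YAML so that helper scripts can read it as configuration data. A minimal sketch of how such a script might load it; the inline fragment mirrors the shape of the removed file, and the field access is illustrative, assuming PyYAML is available:

```python
import yaml  # PyYAML, assumed available

# A simplified fragment in the same shape as the removed MAINTAINERS file.
MAINTAINERS_YAML = """
---
maintainers:
- analytics/:
  - name: Alexander Kislitsky
    email: akislitsky@mirantis.com
    IRC: akislitsky
- migration/:
  - name: Alexander Kislitsky
    email: akislitsky@mirantis.com
    IRC: akislitsky
"""

data = yaml.safe_load(MAINTAINERS_YAML)

# Each list item maps one subfolder to its list of maintainer records.
for entry in data['maintainers']:
    for folder, people in entry.items():
        names = [person['name'] for person in people]
        print(folder, names)
```

A reviewer bot could walk `maintainers` this way to find whom to ping for a change touching a given subfolder.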
@@ -1,7 +0,0 @@
include *.txt
include collector/collector/test/logs
graft analytics/static
graft collector/collector/api/db/migrations
recursive-include collector/api/schemas *.json
prune analytics/static/bower_components
prune analytics/static/node_modules
README.rst (246 changes)
@@ -1,240 +1,10 @@

New README:

This project is no longer maintained.

The contents of this repository are still available in the Git
source code management system. To see the contents of this
repository before it reached its end of life, please check out the
previous commit with "git checkout HEAD^1".

For any further questions, please email
openstack-discuss@lists.openstack.org or join #openstack-dev on
Freenode.

Removed README:

========================
Team and repository tags
========================

.. image:: http://governance.openstack.org/badges/fuel-stats.svg
    :target: http://governance.openstack.org/reference/tags/index.html

.. Change things from this point on

==========
Fuel stats
==========

Project purpose
---------------

* collects stats information about OpenStack installations made by Fuel_,
* generates stat reports in the CSV format,
* provides API for fetching raw data in the JSON format,
* provides Web UI for reports generation and basic stats charts/histograms.

Components
----------

Collector is the service for collecting stats. It has REST API and DB storage.
Analytics is the service for generating reports. It has REST API.
Migrator is the tool for migrating data from the DB to the Elasticsearch_.

The collector and analytics services are started by uWSGI_. Migrator is
started by cron to migrate the fresh data into Elasticsearch_.

Collector
---------

Data origin for collector is the Fuel_ master node. Stats collecting daemons
collect and send data to the collector if allowed by the cloud operator.

Stats data is stored in the PostgreSQL_ DB.

Migrator
--------

Migrator periodically migrates data from the fuel-stats DB to the
Elasticsearch_ storage. This storage is used to generate basic stats charts
and histograms for the Web UI.

Analytics
---------

There are two sub-components in the analytics:

* analytics service
* Web UI

The analytics service API provides generation of CSV reports for installation
info, plugins, and OpenStack workloads. The analytics API also provides
export of data from DB as JSON.

The analytics Web UI provides basic summary stats charts and histograms with
the possibility of filtering data by the Fuel_ release version. Also, in the
Web UI we can generate and download stats reports for a selected time period.

.. _howto configure dev environment:

How to configure development environment
----------------------------------------

To start fuel-stats on a localhost we need to have:

* PostgreSQL_ of version 9.3 or greater,
* Elasticsearch_ of version 1.3.4 or greater,
* Nginx_.

Install PostgreSQL_ and development libraries: ::

    sudo apt-get install --yes postgresql postgresql-server-dev-all

Configure Elasticsearch_ repo as described in the `elasticsearch docs`_ and
install Elasticsearch_: ::

    sudo apt-get install --yes elasticsearch

Install pip and development tools: ::

    sudo apt-get install --yes python-dev python-pip

Install virtualenv. This step increases flexibility when dealing with
environment settings and package installation: ::

    sudo pip install virtualenv virtualenvwrapper

You can add '. /usr/local/bin/virtualenvwrapper.sh' to .bashrc or just
execute it: ::

    . /usr/local/bin/virtualenvwrapper.sh

Create and activate virtual env for fuel-stats: ::

    mkvirtualenv stats
    workon stats

You can use any name for the virtual env instead of 'stats'.

Install the fuel-stats requirements: ::

    pip install -r test-requirements.txt

Create a DB user for fuel-stats: ::

    sudo -u postgres psql
    CREATE ROLE collector WITH NOSUPERUSER NOCREATEDB NOCREATEROLE LOGIN ENCRYPTED PASSWORD 'collector';

Create a DB and grant privileges to it: ::

    sudo -u postgres psql
    CREATE DATABASE collector;
    GRANT ALL ON DATABASE collector TO collector;

Check that all tests pass: ::

    cd fuel-stats/collector && tox
    cd fuel-stats/migration && tox
    cd fuel-stats/analytics && tox

**NOTE:** The collector tests must be run first.

Now you are ready to develop fuel-stats.

How to configure Web UI
-----------------------

We assume that you have already configured a virtual env as described in
`howto configure dev environment`_.

Install the elasticsearch library and create sample data: ::

    workon stats
    pip install elasticsearch
    cd migration
    nosetests migration.test.report.test_reports:Reports.test_libvirt_type_distribution

Install nodejs: ::

    sudo apt-get remove nodejs nodejs-legacy npm
    sudo add-apt-repository -y ppa:chris-lea/node.js
    sudo apt-get update
    sudo apt-get install nodejs

Install nodejs and bower packages: ::

    cd fuel-stats/analytics/static
    npm install
    gulp bower

You can lint your code at any time by running: ::

    gulp lint

Add site configuration to Nginx_: ::

    server {
        listen 8888;
        location / {
            root /your-path-to-fuel-stats/fuel-stats/analytics/static;
        }
        location ~ ^(/fuel)?(/[A-Za-z_0-9])?/(_count|_search) {
            proxy_pass http://127.0.0.1:9200;
        }
    }

Then restart Nginx: ::

    service nginx restart

After this, your local server will be available at 0.0.0.0:8888
or any other port you've set up.

How to start local collector
----------------------------

You can use uWSGI_ to start the collector. A sample config can be found in
collector/uwsgi/collector_example.yaml.

Alternatively, a test web service can be used. To start it, run: ::

    python collector/manage_collector.py --mode test runserver

How to start local analytics
----------------------------

You can use uWSGI_ to start analytics. A sample config can be found in
analytics/uwsgi/analytics_example.yaml.

Alternatively, a test web service can be used. To start it, run: ::

    python analytics/manage_analytics.py --mode test runserver

How to deal with DB migrations
------------------------------

Create a new DB migration: ::

    python manage_collector.py --mode test db migrate -m "Migration comment" \
        -d collector/api/db/migrations/

Apply all DB migrations: ::

    python manage_collector.py --mode test db upgrade -d collector/api/db/migrations/

Revert all migrations: ::

    python manage_collector.py --mode test db downgrade -d collector/api/db/migrations/

Switching off Elasticsearch
---------------------------

Elasticsearch was chosen as data storage for the dynamically generated
statistics reports, but now only CSV reports are used for analytical purposes.
Thus, Elasticsearch is an unnecessary complication of the infrastructure and
data flow.

Without Elasticsearch, we use memcached as the cache for the Web UI. Data
expiration is configured by the parameter MEMCACHED_JSON_REPORTS_EXPIRATION
for fuel_analytics.

Changes in the Nginx config: ::

    # Add this to the block 'server'
    location /api/ {
        proxy_pass http://IP_OF_ANALYTICS_SERVICE:PORT_OF_ANALYTICS_SERVICE/api/;
    }


.. _Fuel: https://docs.mirantis.com/openstack/fuel/
.. _Elasticsearch: https://www.elastic.co/
.. _uWSGI: https://pypi.python.org/pypi/uWSGI/
.. _PostgreSQL: http://www.postgresql.org/
.. _Nginx: http://nginx.org/
.. _elasticsearch docs: https://www.elastic.co/guide/en/elasticsearch/reference/current/setup-repositories.html
@@ -1,4 +0,0 @@
include *.txt
graft static
prune static/bower_components
prune static/node_modules
@@ -1,13 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
@@ -1,13 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
@@ -1,48 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from flask import Flask
from flask import make_response
import flask_sqlalchemy
import six

from fuel_analytics.api.errors import DateExtractionError
from sqlalchemy.orm.exc import NoResultFound

app = Flask(__name__)
db = flask_sqlalchemy.SQLAlchemy(app)

# Registering blueprints
from fuel_analytics.api.resources.csv_exporter import bp as csv_exporter_bp
from fuel_analytics.api.resources.json_exporter import bp as json_exporter_bp
from fuel_analytics.api.resources.json_reports import bp as json_reports_bp

app.register_blueprint(csv_exporter_bp, url_prefix='/api/v1/csv')
app.register_blueprint(json_exporter_bp, url_prefix='/api/v1/json')
app.register_blueprint(json_reports_bp, url_prefix='/api/v1/json/report')


@app.errorhandler(DateExtractionError)
def date_parsing_error(error):
    return make_response(six.text_type(error), 400)


@app.errorhandler(NoResultFound)
def db_object_not_found(error):
    return make_response(six.text_type(error), 404)


@app.teardown_appcontext
def shutdown_session(exception=None):
    db.session.remove()
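The removed application module maps domain exceptions to HTTP status codes via Flask's errorhandler decorator. A stripped-down sketch of that pattern, runnable without the rest of fuel_analytics; here DateExtractionError is redefined locally and the route is invented for illustration, assuming Flask is installed:

```python
from flask import Flask, make_response

app = Flask(__name__)


class DateExtractionError(Exception):
    """Stand-in for fuel_analytics.api.errors.DateExtractionError."""


@app.errorhandler(DateExtractionError)
def date_parsing_error(error):
    # Same pattern as the removed app.py: translate a domain-level
    # failure into an HTTP 400 with the error text as the body.
    return make_response(str(error), 400)


@app.route('/api/v1/csv/report')  # hypothetical route for the demo
def report():
    raise DateExtractionError('bad date range')


client = app.test_client()
resp = client.get('/api/v1/csv/report')
print(resp.status_code)  # -> 400
```

Registering a handler per exception class keeps the views free of try/except blocks; any view under the app can raise DateExtractionError and get a consistent 400 response.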
@@ -1,21 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from fuel_analytics.api.app import app
from fuel_analytics.api import log


app.config.from_object('fuel_analytics.api.config.Production')
app.config.from_envvar('ANALYTICS_SETTINGS', silent=True)
log.init_logger()
@@ -1,21 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from fuel_analytics.api.app import app
from fuel_analytics.api import log


app.config.from_object('fuel_analytics.api.config.Testing')
app.config.from_envvar('ANALYTICS_SETTINGS', silent=True)
log.init_logger()
@@ -1,13 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
@@ -1,34 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from collections import namedtuple


def make_enum(*values, **kwargs):
    names = kwargs.get('names')
    if names:
        return namedtuple('Enum', names)(*values)
    return namedtuple('Enum', values)(*values)


OSWL_RESOURCE_TYPES = make_enum(
    'vm',
    'tenant',
    'volume',
    'image',
    'security_group',
    'keystone_user',
    'flavor',
    'cluster_stats'
)
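For reference, the removed `make_enum` helper builds a lightweight, immutable enum from a namedtuple. A minimal standalone sketch of its behavior (the `COLORS` and `STATUS` names are illustrative, not from the original code):

```python
from collections import namedtuple


def make_enum(*values, **kwargs):
    # Same idea as the removed helper: field names default to the values
    # themselves, or can be supplied separately via the 'names' keyword.
    names = kwargs.get('names')
    if names:
        return namedtuple('Enum', names)(*values)
    return namedtuple('Enum', values)(*values)


# Field names double as values:
COLORS = make_enum('red', 'green', 'blue')
print(COLORS.red)  # -> 'red'

# Explicit names mapped onto arbitrary values:
STATUS = make_enum(0, 1, names=('down', 'up'))
print(STATUS.up)  # -> 1
```

Because the result is a namedtuple instance, membership tests (`'vm' in OSWL_RESOURCE_TYPES`) and iteration work out of the box.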
@@ -1,53 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import logging
import os


class Production(object):
    DEBUG = False
    LOG_FILE = '/var/log/fuel-stats/analytics.log'
    LOG_LEVEL = logging.ERROR
    LOG_ROTATION = False
    LOGGER_NAME = 'analytics'
    ELASTIC_HOST = 'product-stats.mirantis.com'
    ELASTIC_PORT = 443
    ELASTIC_USE_SSL = True
    ELASTIC_INDEX_FUEL = 'fuel'
    ELASTIC_DOC_TYPE_STRUCTURE = 'structure'
    SQLALCHEMY_DATABASE_URI = \
        'postgresql://collector:*****@localhost/collector'
    CSV_DEFAULT_FROM_DATE_DAYS = 90
    CSV_DB_YIELD_PER = 100
    JSON_DB_DEFAULT_LIMIT = 1000
    CSV_DEFAULT_LIST_ITEMS_NUM = 5
    MEMCACHED_HOSTS = ['localhost:11211']
    MEMCACHED_JSON_REPORTS_EXPIRATION = 3600


class Testing(Production):
    DEBUG = True
    LOG_FILE = os.path.realpath(os.path.join(
        os.path.dirname(__file__), '..', 'test', 'logs', 'analytics.log'))
    LOG_LEVEL = logging.DEBUG
    LOG_ROTATION = True
    LOG_FILE_SIZE = 2048000
    LOG_FILES_COUNT = 5
    ELASTIC_HOST = 'localhost'
    ELASTIC_PORT = 9200
    ELASTIC_USE_SSL = False
    SQLALCHEMY_DATABASE_URI = \
        'postgresql://collector:collector@localhost/collector'
    SQLALCHEMY_ECHO = True
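The `Production`/`Testing` classes above are plain attribute holders; Flask's `config.from_object` copies only the uppercase attributes into the application config, which is why inheritance works as an override mechanism. A rough stdlib sketch of that loading pattern (the `load_config` helper is illustrative, not Flask's actual implementation):

```python
import logging


class Production(object):
    DEBUG = False
    LOG_LEVEL = logging.ERROR


class Testing(Production):
    # Overrides only what differs from Production.
    DEBUG = True
    LOG_LEVEL = logging.DEBUG


def load_config(obj):
    # Mimics Flask's from_object: pick up only UPPERCASE attributes, so
    # helper methods and private names are never treated as settings.
    # Inherited attributes are included because dir() walks base classes.
    return {key: getattr(obj, key) for key in dir(obj) if key.isupper()}


print(load_config(Testing)['DEBUG'])  # -> True
```

Calling `app.config.from_envvar('ANALYTICS_SETTINGS', silent=True)` afterwards, as the removed manage scripts did, lets a deployment-specific file override these class defaults.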
@@ -1,13 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
@@ -1,54 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from sqlalchemy.dialects.postgresql import JSON

from fuel_analytics.api.app import db


class OpenStackWorkloadStats(db.Model):
    __tablename__ = 'oswl_stats'
    id = db.Column(db.Integer, primary_key=True)
    master_node_uid = db.Column(db.Text)
    external_id = db.Column(db.Integer)
    cluster_id = db.Column(db.Integer)
    created_date = db.Column(db.Date)
    updated_time = db.Column(db.Time)
    resource_type = db.Column(db.Text)
    resource_data = db.Column(JSON)
    resource_checksum = db.Column(db.Text)
    version_info = db.Column(JSON)


class InstallationStructure(db.Model):
    __tablename__ = 'installation_structures'
    id = db.Column(db.Integer, primary_key=True)
    master_node_uid = db.Column(db.String, nullable=False, unique=True)
    structure = db.Column(JSON)
    creation_date = db.Column(db.DateTime)
    modification_date = db.Column(db.DateTime)
    is_filtered = db.Column(db.Boolean)
    release = db.Column(db.Text)


class ActionLog(db.Model):
    __tablename__ = 'action_logs'
    id = db.Column(db.Integer, primary_key=True)
    master_node_uid = db.Column(db.String, nullable=False)
    external_id = db.Column(db.Integer, nullable=False)
    body = db.Column(JSON)
    action_type = db.Column(db.Text)
    action_name = db.Column(db.Text)
    db.Index('ix_action_logs_action_name_action_type',
             'action_name', 'action_type')
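The `oswl_stats` table stores resource snapshots with JSON payloads. On PostgreSQL the `JSON` column type is native; the same storage pattern can be sketched with the stdlib `sqlite3` module by serializing JSON to text (the table layout here is abbreviated to a few of the columns above):

```python
import json
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute("""CREATE TABLE oswl_stats (
    id INTEGER PRIMARY KEY,
    master_node_uid TEXT,
    resource_type TEXT,
    resource_data TEXT)""")  # JSON payload serialized as text

row = ('uid-1', 'vm', json.dumps({'added': [], 'current': [{'id': 42}]}))
conn.execute("INSERT INTO oswl_stats "
             "(master_node_uid, resource_type, resource_data) "
             "VALUES (?, ?, ?)", row)

# Round-trip the payload back out of the database.
data = conn.execute("SELECT resource_data FROM oswl_stats "
                    "WHERE resource_type = 'vm'").fetchone()[0]
print(json.loads(data)['current'][0]['id'])  # -> 42
```

On the real schema, Postgres JSON operators (as used in the raw action-log queries later in this change) query into the payload server-side instead of deserializing in Python.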
@@ -1,23 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.


class FuelAnalyticsException(Exception):
    """Base fuel analytics exception
    """


class DateExtractionError(FuelAnalyticsException):
    """Error of extracting date from HTTP request parameters
    """
@@ -1,49 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from logging import FileHandler
from logging import Formatter
from logging.handlers import RotatingFileHandler
import os

from fuel_analytics.api.app import app


def get_file_handler():
    if app.config.get('LOG_ROTATION'):
        file_handler = RotatingFileHandler(
            app.config.get('LOG_FILE'),
            maxBytes=app.config.get('LOG_FILE_SIZE'),
            backupCount=app.config.get('LOG_FILES_COUNT')
        )
    else:
        file_handler = FileHandler(app.config.get('LOG_FILE'))
    file_handler.setLevel(app.config.get('LOG_LEVEL'))
    formatter = get_formatter()
    file_handler.setFormatter(formatter)
    return file_handler


def get_formatter():
    DATE_FORMAT = "%Y-%m-%d %H:%M:%S"
    LOG_FORMAT = "%(asctime)s.%(msecs)03d %(levelname)s " \
                 "[%(thread)x] (%(module)s) %(message)s"
    return Formatter(fmt=LOG_FORMAT, datefmt=DATE_FORMAT)


def init_logger():
    log_dir = os.path.dirname(app.config.get('LOG_FILE'))
    if not os.path.exists(log_dir):
        os.mkdir(log_dir, 0o750)  # mode must be octal, not decimal 750
    app.logger.addHandler(get_file_handler())
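`get_file_handler` switches between a plain `FileHandler` and a size-capped `RotatingFileHandler` depending on the `LOG_ROTATION` setting. A self-contained sketch of the rotating variant using a temporary directory (the paths and logger name are illustrative, and the format string mirrors the one above minus the thread id):

```python
import logging
import os
import tempfile
from logging.handlers import RotatingFileHandler

log_dir = tempfile.mkdtemp()
log_file = os.path.join(log_dir, 'analytics.log')

# Size-based rotation: keep up to 5 backups of ~2 MB each,
# matching the Testing config values above.
handler = RotatingFileHandler(log_file, maxBytes=2048000, backupCount=5)
handler.setLevel(logging.DEBUG)
handler.setFormatter(logging.Formatter(
    fmt="%(asctime)s.%(msecs)03d %(levelname)s (%(module)s) %(message)s",
    datefmt="%Y-%m-%d %H:%M:%S"))

logger = logging.getLogger('analytics-demo')
logger.setLevel(logging.DEBUG)
logger.propagate = False  # keep demo output off the root logger
logger.addHandler(handler)
logger.error("something went wrong")
handler.flush()

print(os.path.exists(log_file))  # -> True
```

When the file exceeds `maxBytes`, the handler renames it to `analytics.log.1` (shifting older backups up) and starts a fresh file.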
@@ -1,13 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
@@ -1,405 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from datetime import datetime
from datetime import timedelta
import io
import six
import tarfile

from flask import Blueprint
from flask import request
from flask import Response
from sqlalchemy import distinct
from sqlalchemy import or_
from sqlalchemy import sql

from fuel_analytics.api.app import app
from fuel_analytics.api.app import db
from fuel_analytics.api.db.model import InstallationStructure as IS
from fuel_analytics.api.db.model import OpenStackWorkloadStats as OSWS
from fuel_analytics.api.errors import DateExtractionError
from fuel_analytics.api.resources.utils.oswl_stats_to_csv import OswlStatsToCsv
from fuel_analytics.api.resources.utils.stats_to_csv import StatsToCsv

bp = Blueprint('clusters_to_csv', __name__)

CLUSTERS_REPORT_FILE = 'clusters.csv'
PLUGINS_REPORT_FILE = 'plugins.csv'
NODES_REPORT_FILE = 'nodes.csv'


def extract_date(field_name, default_value=None, date_format='%Y-%m-%d'):
    if field_name not in request.args:
        return default_value
    date_string = request.args.get(field_name, default_value)
    try:
        result = datetime.strptime(date_string, date_format).date()
        app.logger.debug("Extracted '{}' value {}".format(field_name, result))
        return result
    except ValueError:
        msg = "Date '{}' value in wrong format. Use format: {}".format(
            field_name, date_format)
        app.logger.debug(msg)
        raise DateExtractionError(msg)
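`extract_date` wraps `datetime.strptime` and converts a parse failure into a domain-specific error. The core parsing rule can be sketched standalone (the `parse_date` name is illustrative; the original also reads the value from `request.args`):

```python
from datetime import datetime


class DateExtractionError(Exception):
    """Stand-in for fuel_analytics.api.errors.DateExtractionError."""


def parse_date(value, date_format='%Y-%m-%d'):
    # Same contract as extract_date: a well-formed string yields a
    # datetime.date, anything else raises DateExtractionError.
    try:
        return datetime.strptime(value, date_format).date()
    except ValueError:
        raise DateExtractionError(
            "Date '{}' value in wrong format. Use format: {}".format(
                value, date_format))


print(parse_date('2015-06-01'))  # -> 2015-06-01
```

Raising a single domain error type lets the HTTP layer map any malformed `from_date`/`to_date` parameter to one error response without caring about `strptime` internals.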


def get_from_date():
    default_value = datetime.utcnow().date() - \
        timedelta(days=app.config.get('CSV_DEFAULT_FROM_DATE_DAYS'))
    return extract_date('from_date', default_value=default_value)


def get_to_date():
    return extract_date('to_date',
                        default_value=datetime.utcnow().date())


def get_inst_structures_query(from_date=None, to_date=None, fields=()):
    """Composes a query fetching non-filtered installation
    structures info, filtered by from and to dates and
    ordered by id. An installation structure is considered
    non-filtered if is_filtered is False or None.

    :param from_date: filter from creation or modification date
    :param to_date: filter to creation or modification date
    :param fields: fields to be selected in the query. All fields
                   are fetched if the parameter is empty.
    :return: SQLAlchemy query
    """
    if fields:
        query = db.session.query(*fields)
    else:
        query = db.session.query(IS)
    query = query.filter(or_(
        IS.is_filtered == bool(False),  # workaround for PEP8 error E712
        IS.is_filtered.is_(None)))
    if from_date is not None:
        query = query.filter(or_(IS.creation_date >= from_date,
                                 IS.modification_date >= from_date))
    if to_date is not None:
        # modification_date is a datetime field, so we need to
        # increase to_date for correct filtering
        to_date += timedelta(days=1)
        query = query.filter(or_(IS.creation_date <= to_date,
                                 IS.modification_date <= to_date))
    return query.order_by(IS.id)


def get_inst_structures():
    yield_per = app.config['CSV_DB_YIELD_PER']
    from_date = get_from_date()
    to_date = get_to_date()
    query = get_inst_structures_query(from_date=from_date,
                                      to_date=to_date)
    return query.yield_per(yield_per)


def get_clusters_version_info():
    """Returns dict of version info from clusters.

    :return: dict of saved cluster versions with
             structure {mn_uid: {cluster_id: version_info}}
    """
    yield_per = app.config['CSV_DB_YIELD_PER']
    from_date = get_from_date()
    to_date = get_to_date()
    query = get_inst_structures_query(
        from_date=from_date,
        to_date=to_date,
        fields=(IS.master_node_uid, IS.structure['clusters'].label('clusters'))
    )
    clusters_version_info = {}
    for info in query.yield_per(yield_per):
        _add_oswl_to_clusters_versions_cache(info, clusters_version_info)
    return clusters_version_info


def get_action_logs_query():
    """Selects only the last network verification task for each
    master node cluster.

    :return: SQLAlchemy query
    """
    query = "SELECT DISTINCT ON (master_node_uid, body->>'cluster_id') " \
            "external_id, master_node_uid, body->'cluster_id' cluster_id, " \
            "body->'additional_info'->'ended_with_status' status, " \
            "to_timestamp(body->>'end_timestamp', 'YYYY-MM-DD')::TIMESTAMP " \
            "WITHOUT TIME ZONE end_timestamp, " \
            "body->>'action_name' action_name " \
            "FROM action_logs " \
            "WHERE action_type='nailgun_task' " \
            "AND action_name='verify_networks' " \
            "AND to_timestamp(body->>'end_timestamp', 'YYYY-MM-DD')::" \
            "TIMESTAMP WITHOUT TIME ZONE >= :from_date " \
            "AND to_timestamp(body->>'end_timestamp', 'YYYY-MM-DD')::" \
            "TIMESTAMP WITHOUT TIME ZONE <= :to_date " \
            "ORDER BY master_node_uid, body->>'cluster_id', external_id DESC"
    return sql.text(query)


def get_action_logs():
    from_date = get_from_date()
    to_date = get_to_date()
    query = get_action_logs_query()
    return db.session.execute(query, {'from_date': from_date,
                                      'to_date': to_date})


@bp.route('/clusters', methods=['GET'])
def clusters_to_csv():
    app.logger.debug("Handling clusters_to_csv get request")
    inst_structures = get_inst_structures()
    action_logs = get_action_logs()
    exporter = StatsToCsv()
    result = exporter.export_clusters(inst_structures, action_logs)

    # NOTE: result is a generator, but streaming can not work with some
    # WSGI middlewares: http://flask.pocoo.org/docs/0.10/patterns/streaming/
    app.logger.debug("Get request for clusters_to_csv handled")
    headers = {
        'Content-Disposition': 'attachment; filename={}'.format(
            CLUSTERS_REPORT_FILE)
    }
    return Response(result, mimetype='text/csv', headers=headers)


@bp.route('/nodes', methods=['GET'])
def nodes_to_csv():
    app.logger.debug("Handling nodes_to_csv get request")
    inst_structures = get_inst_structures()
    exporter = StatsToCsv()
    result = exporter.export_nodes(inst_structures)

    # NOTE: result is a generator, but streaming can not work with some
    # WSGI middlewares: http://flask.pocoo.org/docs/0.10/patterns/streaming/
    app.logger.debug("Get request for nodes_to_csv handled")
    headers = {
        'Content-Disposition': 'attachment; filename={}'.format(
            NODES_REPORT_FILE)
    }
    return Response(result, mimetype='text/csv', headers=headers)


@bp.route('/plugins', methods=['GET'])
def plugins_to_csv():
    app.logger.debug("Handling plugins_to_csv get request")
    inst_structures = get_inst_structures()
    exporter = StatsToCsv()
    result = exporter.export_plugins(inst_structures)

    # NOTE: result is a generator, but streaming can not work with some
    # WSGI middlewares: http://flask.pocoo.org/docs/0.10/patterns/streaming/
    app.logger.debug("Get request for plugins_to_csv handled")
    headers = {
        'Content-Disposition': 'attachment; filename={}'.format(
            PLUGINS_REPORT_FILE)
    }
    return Response(result, mimetype='text/csv', headers=headers)


def get_oswls_query(resource_type, from_date=None, to_date=None):
    """Composes a query fetching oswls with installation
    info creation and update dates, ordered by created_date

    :param resource_type: resource type
    :param from_date: filter from date
    :param to_date: filter to date
    :return: SQLAlchemy query
    """
    query = db.session.query(
        OSWS.id, OSWS.master_node_uid, OSWS.cluster_id,
        OSWS.created_date,  # for checking if row is duplicated in CSV
        OSWS.created_date.label('stats_on_date'),  # for showing in CSV
        OSWS.resource_type, OSWS.resource_data, OSWS.resource_checksum,
        OSWS.version_info,
        IS.creation_date.label('installation_created_date'),
        IS.modification_date.label('installation_updated_date'),
        IS.structure['fuel_release'].label('fuel_release_from_inst_info'),
        IS.is_filtered).\
        join(IS, IS.master_node_uid == OSWS.master_node_uid).\
        filter(OSWS.resource_type == resource_type).\
        filter(or_(IS.is_filtered == bool(False), IS.is_filtered.is_(None)))
    if from_date is not None:
        query = query.filter(OSWS.created_date >= from_date)
    if to_date is not None:
        query = query.filter(OSWS.created_date <= to_date)
    # For proper handling of paging we must use additional ordering by id.
    # Otherwise we would lose some OSWLs from the execution result.
    query = query.order_by(OSWS.created_date, OSWS.id)
    return query
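The comment in `get_oswls_query` notes that paging needs a secondary ordering by `id`: rows sharing the same `created_date` have no stable relative order otherwise, so successive limit/offset windows can repeat or drop them. The idea in miniature, with plain tuples standing in for DB rows:

```python
# Rows as (created_date, id); several share the same date.
rows = [('2015-06-01', 3), ('2015-06-01', 1), ('2015-06-02', 4),
        ('2015-06-01', 2), ('2015-06-02', 5)]

# Deterministic ordering: date first, id as the tie-breaker.
ordered = sorted(rows)

# Limit/offset paging over the stable ordering.
page_size = 2
pages = [ordered[i:i + page_size] for i in range(0, len(ordered), page_size)]
print(pages[0])  # -> [('2015-06-01', 1), ('2015-06-01', 2)]

# Every row appears exactly once across all pages.
flat = [row for page in pages for row in page]
print(flat == ordered)  # -> True
```

Without the `id` tie-breaker the database is free to return equal-date rows in a different order on each page query, breaking exactly this invariant.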


def get_oswls(resource_type):
    yield_per = app.config['CSV_DB_YIELD_PER']
    app.logger.debug("Fetching %s oswls with yield per %d",
                     resource_type, yield_per)
    from_date = get_from_date()
    to_date = get_to_date()
    query = get_oswls_query(resource_type, from_date=from_date,
                            to_date=to_date)
    return query.yield_per(yield_per)


def _add_oswl_to_clusters_versions_cache(inst_structure, clusters_versions):
    """Adds oswl clusters version_info into the clusters_versions cache.

    :param inst_structure: InstallationStructure DB object
    :type inst_structure: fuel_analytics.api.db.model.InstallationStructure
    :param clusters_versions: cache for saving cluster versions with
           structure {mn_uid: {cluster_id: version_info}}
    :type clusters_versions: dict
    """

    mn_uid = inst_structure.master_node_uid

    clusters = inst_structure.clusters or []
    clusters_versions[mn_uid] = {}

    for cluster in clusters:
        fuel_version = cluster.get('fuel_version')
        if not fuel_version:
            continue

        version_info = {'fuel_version': fuel_version}
        release = cluster.get('release')
        if release:
            version_info['release_version'] = release.get('version')
            version_info['release_os'] = release.get('os')
            version_info['release_name'] = release.get('name')

        clusters_versions[mn_uid][cluster['id']] = version_info


@bp.route('/<resource_type>', methods=['GET'])
def oswl_to_csv(resource_type):
    app.logger.debug("Handling oswl_to_csv get request for resource %s",
                     resource_type)

    exporter = OswlStatsToCsv()
    oswls = get_oswls(resource_type)

    clusters_version_info = get_clusters_version_info()
    result = exporter.export(resource_type, oswls, get_to_date(),
                             clusters_version_info)

    # NOTE: result is a generator, but streaming can not work with some
    # WSGI middlewares: http://flask.pocoo.org/docs/0.10/patterns/streaming/
    app.logger.debug("Request oswl_to_csv for resource %s handled",
                     resource_type)
    headers = {
        'Content-Disposition': 'attachment; filename={}.csv'.format(
            resource_type)
    }
    return Response(result, mimetype='text/csv', headers=headers)


def get_resources_types():
    """Gets all available resource types

    :return: generator of resource type names
    """
    result = db.session.query(distinct(OSWS.resource_type))
    return (row[0] for row in result)


def get_all_reports(from_date, to_date, clusters_version_info):
    """Returns a generator over all reports info.

    :param from_date: get reports from date
    :param to_date: get reports to date
    :param clusters_version_info: dict with version_info fetched from
           clusters
    :return: generator of tuples (report data, report name)
    """
    app.logger.debug("Getting all reports")
    stats_exporter = StatsToCsv()
    oswl_exporter = OswlStatsToCsv()

    resources_types = sorted(get_resources_types())
    app.logger.debug("Resources reports list: %s", resources_types)

    # OSWLs reports
    for resource_type in resources_types:
        app.logger.debug("Getting report '%s'", resource_type)
        oswls = get_oswls_query(resource_type, from_date=from_date,
                                to_date=to_date)
        report = oswl_exporter.export(resource_type, oswls, to_date,
                                      clusters_version_info)
        app.logger.debug("Report '%s' fetched", resource_type)
        yield report, '{}.csv'.format(resource_type)

    # Clusters report
    app.logger.debug("Getting clusters report")
    inst_structures = get_inst_structures_query(from_date=from_date,
                                                to_date=to_date)
    query_action_logs = get_action_logs_query()
    action_logs = db.session.execute(query_action_logs,
                                     {'from_date': from_date,
                                      'to_date': to_date})
    clusters = stats_exporter.export_clusters(inst_structures,
                                              action_logs)
    app.logger.debug("Clusters report fetched")
    yield clusters, CLUSTERS_REPORT_FILE

    # Plugins report
    app.logger.debug("Getting plugins report")
    plugins = stats_exporter.export_plugins(inst_structures)
    app.logger.debug("Plugins report fetched")
    yield plugins, PLUGINS_REPORT_FILE

    app.logger.debug("All reports fetched")


def stream_reports_tar(reports):
    """Streams reports data as a tar archive.

    :param reports: generator of tuples
           (report data, report name)
    :return: streamed tar archive of the reports
    """
    app.logger.debug("Streaming reports as tar archive started")
    tar_stream = six.StringIO()
    with tarfile.open(None, mode='w', fileobj=tar_stream) as f:
        for report, report_name in reports:
            app.logger.debug("Streaming report %s", report_name)
            stream = six.StringIO()
            info = tarfile.TarInfo(report_name)
            for row in report:
                stream.write(row)
            info.size = stream.tell()
            stream.seek(io.SEEK_SET)
            f.addfile(info, stream)

            tar_stream.seek(io.SEEK_SET)
            yield tar_stream.getvalue()

            tar_stream.seek(io.SEEK_SET)
            tar_stream.truncate()

    app.logger.debug("Streaming reports as tar archive finished")
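`stream_reports_tar` appends each report to an in-memory tar archive and yields the bytes accumulated so far, truncating the buffer after every member so memory use stays bounded by the largest single report. A Python 3 sketch of the same pattern with `io.BytesIO` (the original used Python 2's `six.StringIO`; the `stream_tar` name is illustrative):

```python
import io
import tarfile


def stream_tar(reports):
    # reports: iterable of (name, text) pairs; yields tar chunks whose
    # concatenation is the archive body.
    tar_stream = io.BytesIO()
    with tarfile.open(None, mode='w', fileobj=tar_stream) as tar:
        for name, text in reports:
            payload = io.BytesIO(text.encode('utf-8'))
            info = tarfile.TarInfo(name)
            info.size = len(payload.getvalue())
            tar.addfile(info, payload)

            tar_stream.seek(0)
            yield tar_stream.getvalue()  # everything written so far

            tar_stream.seek(0)
            tar_stream.truncate()        # drop yielded bytes from memory


chunks = list(stream_tar([('clusters.csv', 'id\n1\n'),
                          ('plugins.csv', 'name\nfoo\n')]))
print(len(chunks))       # -> 2 (one chunk per report)
print(chunks[0][:12])    # -> b'clusters.csv' (tar header starts with the name)
```

Each chunk is a whole number of 512-byte tar blocks (header plus padded data), which is what makes the concatenation of chunks a valid member sequence.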
@bp.route('/all', methods=['GET'])
def all_reports():
    """Single report for all resource types, clusters and plugins info

    :return: tar archive of CSV reports
    """
    app.logger.debug("Handling all_reports get request")
    from_date = get_from_date()
    to_date = get_to_date()

    clusters_version_info = get_clusters_version_info()
    reports = get_all_reports(from_date, to_date, clusters_version_info)

    name = 'reports_from{}_to{}'.format(get_from_date(), get_to_date())
    headers = {
        'Content-Disposition': 'attachment; filename={}.tar'.format(name)
    }
    return Response(stream_reports_tar(reports),
                    mimetype='application/x-tar', headers=headers)
@@ -1,215 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import datetime
import json
from sqlalchemy import and_
from sqlalchemy import func

from flask import Blueprint
from flask import request
from flask import Response

from fuel_analytics.api.app import app
from fuel_analytics.api.app import db
from fuel_analytics.api.db.model import ActionLog as AL
from fuel_analytics.api.db.model import InstallationStructure as IS
from fuel_analytics.api.db.model import OpenStackWorkloadStats as OSWL

bp = Blueprint('dto', __name__)


def row_as_serializable_dict(row):
    """Converts a SqlAlchemy object to a dict serializable to json

    :param row: SqlAlchemy object
    :return: dict serializable to json
    """
    result = {}
    for c in row.__table__.columns:
        name = c.name
        value = getattr(row, c.name)
        if isinstance(value, (datetime.datetime, datetime.date,
                              datetime.time)):
            value = value.isoformat()
        result[name] = value
    return result
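`row_as_serializable_dict` flattens a mapped row into a JSON-safe dict by converting date/time values with `isoformat()`, since those types are not JSON-serializable directly. The conversion rule in isolation (the `to_jsonable` name and the dict-based row are illustrative; the original walks `row.__table__.columns`):

```python
import datetime
import json


def to_jsonable(value):
    # Dates, datetimes and times become ISO-8601 strings;
    # everything else passes through unchanged.
    if isinstance(value, (datetime.datetime, datetime.date, datetime.time)):
        return value.isoformat()
    return value


row = {'id': 1, 'creation_date': datetime.date(2015, 6, 1)}
safe = {key: to_jsonable(value) for key, value in row.items()}
print(json.dumps(safe, sort_keys=True))
# -> {"creation_date": "2015-06-01", "id": 1}
```

ISO-8601 strings sort lexicographically in chronological order, which also makes the serialized values convenient to compare client-side.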


def get_dict_param(name):
    params = request.args.get(name)
    try:
        params = json.loads(params)
    except Exception:
        pass
    finally:
        if not isinstance(params, dict):
            params = {}
    return params
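`get_dict_param` tolerates absent or malformed query parameters by always falling back to an empty dict. The same guard in isolation (the `safe_dict` name is illustrative; the original reads the raw value from `request.args`):

```python
import json


def safe_dict(raw):
    # Accepts a JSON string (or None) and always returns a dict:
    # parse errors and non-dict JSON both collapse to {}.
    try:
        parsed = json.loads(raw)
    except (TypeError, ValueError):
        return {}
    return parsed if isinstance(parsed, dict) else {}


print(safe_dict('{"limit": 10}'))  # -> {'limit': 10}
print(safe_dict(None))             # -> {}
print(safe_dict('[1, 2]'))         # -> {}
```

This is what lets `get_paging_params` call `.get('limit', ...)` unconditionally: the caller never sees `None` or a list.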
|
|
||||||
def get_paging_params():
|
|
||||||
params = get_dict_param('paging_params')
|
|
||||||
return {
|
|
||||||
'limit': params.get('limit', app.config['JSON_DB_DEFAULT_LIMIT']),
|
|
||||||
'offset': params.get('offset', 0)
|
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
@bp.route('/installation_info/<master_node_uid>', methods=['GET'])
|
|
||||||
def get_installation_info(master_node_uid):
|
|
||||||
app.logger.debug("Fetching installation info for: %s", master_node_uid)
|
|
||||||
result = db.session.query(IS).filter(
|
|
||||||
IS.master_node_uid == master_node_uid).one()
|
|
||||||
dict_result = row_as_serializable_dict(result)
|
|
||||||
app.logger.debug("Installation info for: %s fetched", master_node_uid)
|
|
||||||
return Response(json.dumps(dict_result), mimetype='application/json')
|
|
||||||
|
|
||||||
|
|
||||||
def _get_db_objs_count(model, sql_clauses):
|
|
||||||
return db.session.query(model).filter(and_(*sql_clauses)).count()
|
|
||||||
|
|
||||||
|
|
||||||
def _get_db_objs_data(model, sql_clauses, order_by, paging_params):
    """Gets DB objects selected by sql_clauses

    :param model: DB model
    :param sql_clauses: collection of clauses for selecting DB objects
    :param order_by: tuple of orderings for DB objects
    :param paging_params: dictionary with limit, offset values
    :return: generator of dicts with DB objects data
    """
    query = db.session.query(model).filter(and_(*sql_clauses))
    for order in order_by:
        query = query.order_by(order)
    result = query.limit(paging_params['limit']).\
        offset(paging_params['offset']).all()
    return (row_as_serializable_dict(obj) for obj in result)

def _jsonify_collection(collection_iter):
    """Jsonifies a collection. Used for streaming a
    list of JSON objects into a Flask application response

    :param collection_iter: iterator on input collection
    :return: generator on chunks of the jsonified result
    """
    yield '['
    try:
        yield json.dumps(next(collection_iter))
        while True:
            d = next(collection_iter)
            yield ', {}'.format(json.dumps(d))
    except StopIteration:
        pass
    finally:
        yield ']'

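A minimal standalone sketch of the same streaming pattern, using the builtin `next()` (which works on both Python 2 and 3, unlike the iterator's `.next()` method):

```python
import json


def jsonify_collection(collection_iter):
    """Stream a JSON array chunk by chunk from an iterator of objects."""
    yield '['
    try:
        yield json.dumps(next(collection_iter))
        while True:
            yield ', {}'.format(json.dumps(next(collection_iter)))
    except StopIteration:
        pass
    finally:
        yield ']'


chunks = list(jsonify_collection(iter([{'a': 1}, {'b': 2}])))
print(''.join(chunks))
# → [{"a": 1}, {"b": 2}]
```

An empty iterator yields just `'['` and `']'`, so the response is always valid JSON.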
def _jsonify_paged_collection(collection_iter, paging_params, total):
    page_info = paging_params.copy()
    page_info['total'] = total
    yield '{{"paging_params": {0}, "objs": '.format(json.dumps(page_info))
    for item in _jsonify_collection(collection_iter):
        yield item
    yield '}'

@bp.route('/oswls/<master_node_uid>', methods=['GET'])
def get_oswls(master_node_uid):
    paging_params = get_paging_params()
    app.logger.debug("Fetching oswl info for: %s, paging params: %s",
                     master_node_uid, paging_params)
    sql_clauses = (OSWL.master_node_uid == master_node_uid,)
    oswls_data = _get_db_objs_data(OSWL, sql_clauses,
                                   (OSWL.id.asc(),), paging_params)
    oswls_count = _get_db_objs_count(OSWL, sql_clauses)
    jsons_data = _jsonify_paged_collection(oswls_data, paging_params,
                                           oswls_count)
    app.logger.debug("Oswl info for: %s, paging params: %s fetched",
                     master_node_uid, paging_params)
    return Response(jsons_data, mimetype='application/json')

@bp.route('/oswls/<master_node_uid>/<resource_type>', methods=['GET'])
def get_oswls_by_resource_type(master_node_uid, resource_type):
    paging_params = get_paging_params()
    app.logger.debug("Fetching oswl info for: %s, %s, paging params: %s",
                     master_node_uid, resource_type, paging_params)
    sql_clauses = (OSWL.master_node_uid == master_node_uid,
                   OSWL.resource_type == resource_type)
    oswls_data = _get_db_objs_data(
        OSWL, sql_clauses, (OSWL.id.asc(), OSWL.resource_type.asc()),
        paging_params)
    oswls_total = _get_db_objs_count(OSWL, sql_clauses)
    jsons_data = _jsonify_paged_collection(oswls_data, paging_params,
                                           oswls_total)
    app.logger.debug("Oswl info for: %s, %s, paging params: %s fetched",
                     master_node_uid, resource_type, paging_params)
    return Response(jsons_data, mimetype='application/json')

@bp.route('/action_logs/<master_node_uid>', methods=['GET'])
def get_action_logs(master_node_uid):
    paging_params = get_paging_params()
    app.logger.debug("Fetching action_logs for: %s, paging params: %s",
                     master_node_uid, paging_params)
    sql_clauses = (AL.master_node_uid == master_node_uid,)
    action_logs_data = _get_db_objs_data(AL, sql_clauses,
                                         (AL.id.asc(),), paging_params)
    action_logs_total = _get_db_objs_count(AL, sql_clauses)
    jsons_data = _jsonify_paged_collection(action_logs_data, paging_params,
                                           action_logs_total)
    app.logger.debug("Action_logs for: %s, paging params: %s fetched",
                     master_node_uid, paging_params)
    return Response(jsons_data, mimetype='application/json')

@bp.route('/installation_infos/filtered', methods=['GET'])
def get_filtered_installation_infos():
    paging_params = get_paging_params()
    app.logger.debug("Fetching filtered installation_info, paging params: %s",
                     paging_params)
    sql_clauses = (IS.is_filtered == bool(1),)  # Workaround for PEP8 E712
    inst_infos = _get_db_objs_data(IS, sql_clauses, (IS.id.asc(),),
                                   paging_params)
    inst_infos_total = _get_db_objs_count(IS, sql_clauses)
    jsons_data = _jsonify_paged_collection(inst_infos, paging_params,
                                           inst_infos_total)
    app.logger.debug("Filtered installation_info: %s fetched", paging_params)
    return Response(jsons_data, mimetype='application/json')

@bp.route('/summary', methods=['GET'])
def get_db_summary():
    app.logger.debug("Getting DB summary")
    summary = {}
    for model in (IS, AL, OSWL):
        count = db.session.query(model).count()
        summary[model.__tablename__] = {'total': count}

    # Counting filtered installation info
    filtered_summary = db.session.query(
        IS.is_filtered, func.count(IS.id)).group_by(IS.is_filtered).all()
    filtered_num = 0
    not_filtered_num = 0
    for is_filtered, count in filtered_summary:
        if is_filtered is False:
            not_filtered_num += count
        else:
            filtered_num += count
    summary[IS.__tablename__]['not_filtered'] = not_filtered_num
    summary[IS.__tablename__]['filtered'] = filtered_num

    app.logger.debug("DB summary fetched")
    return Response(json.dumps(summary), mimetype='application/json')
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from collections import defaultdict
import copy
import json

from flask import Blueprint
from flask import request
from flask import Response
import memcache
import sqlalchemy

from fuel_analytics.api.app import app
from fuel_analytics.api.app import db

bp = Blueprint('reports', __name__)

@bp.route('/installations', methods=['GET'])
def get_installations_info():
    release = request.args.get('release')
    refresh = request.args.get('refresh')
    cache_key_prefix = 'fuel-stats-installations-info'
    mc = memcache.Client(app.config.get('MEMCACHED_HOSTS'))
    app.logger.debug("Fetching installations info for release: %s", release)

    # Checking cache
    if not refresh:
        cache_key = '{0}{1}'.format(cache_key_prefix, release)
        app.logger.debug("Checking installations info by key: %s in cache",
                         cache_key)
        cached_result = mc.get(cache_key)
        if cached_result:
            app.logger.debug("Installations info cache found by key: %s",
                             cache_key)
            return Response(cached_result, mimetype='application/json')
        else:
            app.logger.debug("No cached installations info for key: %s",
                             cache_key)
    else:
        app.logger.debug("Enforce refresh cache of installations info "
                         "for release: %s", release)

    # Fetching data from DB
    info_from_db = get_installations_info_from_db(release)

    # Saving fetched data to cache
    for for_release, info in info_from_db.items():
        cache_key = '{0}{1}'.format(cache_key_prefix, for_release)
        app.logger.debug("Caching installations info for key: %s, data: %s",
                         cache_key, info)
        mc.set(cache_key, json.dumps(info),
               app.config.get('MEMCACHED_JSON_REPORTS_EXPIRATION'))

    return Response(json.dumps(info_from_db[release]),
                    mimetype='application/json')

def get_installations_info_from_db(release):
    """Extracts and aggregates installation and environments info

    We have a list of clusters in the DB field installations_info.structure.
    The cluster data is stored as a dict. Unfortunately we have no way at
    the DB layer to extract only the required fields from the dicts in the
    list. To decrease memory consumption we select only the required fields
    from the clusters data.

    For instance, if we want to extract only the statuses of the clusters:
    {"clusters": [{"status": "error", ...}, {"status": "new", ...},
                  {"status": "operational", ...}]}

    The only way to fetch only the required data is to expand the cluster
    data into separate rows in the SQL query result and extract only the
    required fields. For this purpose we select FROM
    installation_structures, json_array_elements(...).

    Unfortunately rows with an empty clusters list wouldn't be in the
    output. As a workaround we add empty cluster data in this case: [{}].
    We also order the rows by id.

    Now we are able to select only the required fields in rows, and the
    rows are ordered by id, so clusters are grouped by the installation id.
    While iterating over the clusters, a change of id marks a change of
    installation.

    :param release: filter data by Fuel release
    :return: aggregated installations and environments info
    """

    params = {'is_filtered': False}
    # For counting installations without clusters we are
    # adding empty cluster data into the SQL result: [{}]
    query = "SELECT id, release, " \
            "cluster_data->>'status' status, " \
            "structure->>'clusters_num' clusters_num, " \
            "cluster_data->>'nodes_num' nodes_num, " \
            "cluster_data->'attributes'->>'libvirt_type' hypervisor, " \
            "cluster_data->'release'->>'os' os_name " \
            "FROM installation_structures, " \
            "json_array_elements(CASE " \
            "  WHEN structure->>'clusters' = '[]' THEN '[{}]' " \
            "  ELSE structure->'clusters' " \
            "  END" \
            ") AS cluster_data " \
            "WHERE is_filtered = :is_filtered"
    if release:
        params['release'] = release
        query += " AND release = :release"
    query += " ORDER BY id"
    query = sqlalchemy.text(query)

    info_template = {
        'installations': {
            'count': 0,
            'environments_num': defaultdict(int)
        },
        'environments': {
            'count': 0,
            'operable_envs_count': 0,
            'statuses': defaultdict(int),
            'nodes_num': defaultdict(int),
            'hypervisors_num': defaultdict(int),
            'oses_num': defaultdict(int)
        }
    }

    info = defaultdict(lambda: copy.deepcopy(info_template))

    app.logger.debug("Fetching installations info from DB for release: %s",
                     release)

    last_id = None
    for row in db.session.execute(query, params):
        extract_installation_info(row, info[release], last_id)
        cur_release = row[1]

        # Splitting info by release if fetching for all releases
        if not release and cur_release != release:
            extract_installation_info(row, info[cur_release], last_id)

        last_id = row[0]

    app.logger.debug("Fetched installations info from DB for release: "
                     "%s, info: %s", release, info)
    return info

def extract_installation_info(row, result, last_id):
    """Extracts installation info from structure

    :param row: row with data from DB
    :type row: tuple
    :param result: placeholder for extracted data
    :type result: dict
    :param last_id: DB id of last processed installation
    :type last_id: int
    """

    (cur_id, cur_release, status, clusters_num, nodes_num,
     hypervisor, os_name) = row

    inst_info = result['installations']
    env_info = result['environments']

    production_statuses = ('operational', 'error')

    if last_id != cur_id:
        inst_info['count'] += 1
        inst_info['environments_num'][clusters_num] += 1

    # For empty clusters data we don't increase the environments count
    try:
        if int(clusters_num):
            env_info['count'] += 1
    except (ValueError, TypeError):
        app.logger.exception("Value of clusters_num %s "
                             "can't be cast to int", clusters_num)

    if status in production_statuses:
        if nodes_num:
            env_info['nodes_num'][nodes_num] += 1
        env_info['operable_envs_count'] += 1

        if hypervisor:
            env_info['hypervisors_num'][hypervisor.lower()] += 1

        if os_name:
            env_info['oses_num'][os_name.lower()] += 1

    if status is not None:
        env_info['statuses'][status] += 1
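A minimal standalone sketch of the per-row aggregation idea used above: installations are counted when the id changes between ordered rows, while per-cluster fields are tallied on every row. Plain tuples stand in for DB rows and the field set is trimmed for illustration:

```python
from collections import defaultdict


def aggregate(rows):
    """Count installations (by id change) and environment statuses."""
    info = {'installations': 0, 'statuses': defaultdict(int)}
    last_id = None
    for cur_id, status in rows:
        if last_id != cur_id:
            info['installations'] += 1
        if status is not None:
            info['statuses'][status] += 1
        last_id = cur_id
    return info


rows = [(1, 'operational'), (1, 'error'), (2, None), (3, 'new')]
print(aggregate(rows))
```

This relies on the rows being ordered by id, which is why the SQL query above ends with `ORDER BY id`.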
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import csv
import io
import itertools

import six
from sqlalchemy.util import KeyedTuple

from fuel_analytics.api.app import app

def get_keys_paths(skeleton):
    """Gets paths to leaf keys in the data

    :param skeleton: data skeleton
    :return: list of lists of dict keys
    """
    def _keys_paths_helper(keys, skel):
        result = []

        if isinstance(skel, dict):
            for k in sorted(six.iterkeys(skel)):
                result.extend(_keys_paths_helper(keys + [k], skel[k]))
        elif isinstance(skel, (list, tuple)):
            # For lists in the skeleton we can specify a repeats value.
            # For instance, if we want to show 3 roles in the CSV report,
            # the skeleton for roles will be {'roles': [None, 3]}
            if len(skel) > 1:
                repeats = skel[1]
            else:
                repeats = app.config['CSV_DEFAULT_LIST_ITEMS_NUM']

            if len(skel):
                for idx in six.moves.xrange(repeats):
                    result.extend(_keys_paths_helper(keys + [idx], skel[0]))
            else:
                result.append(keys)

        elif callable(skel):
            # Handling aggregate functions in the skeleton. For instance,
            # if we want to show the number of networks we will have the
            # following skeleton: {'networks': count}
            result.append(keys + [skel])
        else:
            result.append(keys)
        return result
    return _keys_paths_helper([], skeleton)

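A minimal standalone sketch of the key-path expansion, with the `repeats` default hard-coded instead of read from `app.config['CSV_DEFAULT_LIST_ITEMS_NUM']` and the callable/aggregate case omitted:

```python
def keys_paths(skel, keys=None, default_repeats=2):
    """Return the paths to all leaf keys of a nested skeleton."""
    keys = keys or []
    result = []
    if isinstance(skel, dict):
        for k in sorted(skel):
            result.extend(keys_paths(skel[k], keys + [k], default_repeats))
    elif isinstance(skel, (list, tuple)) and skel:
        # Lists expand into indexed paths, repeated `repeats` times
        repeats = skel[1] if len(skel) > 1 else default_repeats
        for idx in range(repeats):
            result.extend(keys_paths(skel[0], keys + [idx], default_repeats))
    else:
        result.append(keys)
    return result


skeleton = {'name': None, 'roles': [None, 3]}
print(keys_paths(skeleton))
# → [['name'], ['roles', 0], ['roles', 1], ['roles', 2]]
```

These paths later drive both CSV column-name generation and value extraction, so the fixed `repeats` count keeps the column set stable across rows.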
def get_flatten_data(keys_paths, data):
    """Creates flattened data from data by keys_paths

    :param keys_paths: list of dict keys lists
    :param data: dict with nested structures
    :return: list of flattened data values
    """
    flatten_data = []
    for key_path in keys_paths:
        d = data
        for key in key_path:
            if callable(key):
                # Handling aggregate functions in the skeleton
                d = key(d)
                break
            if isinstance(d, dict):
                d = d.get(key, None)
            elif isinstance(d, KeyedTuple):
                # If we specify DB fields in the query SQLAlchemy
                # returns KeyedTuple inherited from tuple
                d = getattr(d, key, None)
            elif isinstance(d, (list, tuple)):
                d = d[key] if key < len(d) else None
            else:
                d = getattr(d, key, None)
            if d is None:
                break
        if isinstance(d, (list, tuple)):
            # If the type for list items is not specified, values
            # will be shown as joined text
            flatten_data.append(' '.join(map(six.text_type, d)))
        else:
            flatten_data.append(d)
    return flatten_data

def construct_skeleton(data):
    """Creates a structure for searching all key paths in given data

    :param data: dict fetched from ES
    :return: skeleton of the data structure
    """
    if isinstance(data, dict):
        result = {}
        for k in sorted(data.keys()):
            result[k] = construct_skeleton(data[k])
        return result
    elif isinstance(data, (list, tuple)):
        list_result = []
        dict_result = {}
        for d in data:
            if isinstance(d, dict):
                dict_result.update(construct_skeleton(d))
            elif isinstance(d, (list, tuple)):
                if not list_result:
                    list_result.append(construct_skeleton(d))
                else:
                    list_result[0].extend(construct_skeleton(d))
        if dict_result:
            list_result.append(dict_result)
        return list_result
    else:
        return None

def get_data_skeleton(structures):
    """Constructs and merges skeletons from raw data

    :param structures: list of data
    :return: skeleton for provided data structures
    """
    def _merge_skeletons(lh, rh):
        keys_paths = get_keys_paths(rh)
        for keys_path in keys_paths:
            merge_point = lh
            data_point = rh
            for key in keys_path:
                data_point = data_point[key]
                if isinstance(data_point, dict):
                    if key not in merge_point:
                        merge_point[key] = {}
                elif isinstance(data_point, (list, tuple)):
                    if key not in merge_point:
                        merge_point[key] = [get_data_skeleton(data_point)]
                    else:
                        _merge_skeletons(merge_point[key][0],
                                         get_data_skeleton(data_point))
                    break
                else:
                    merge_point[key] = None
                merge_point = merge_point[key]

    skeleton = {}
    for structure in structures:
        app.logger.debug("Constructing skeleton by data: %s", structure)
        new_skeleton = construct_skeleton(structure)
        app.logger.debug("Updating skeleton by %s", new_skeleton)
        _merge_skeletons(skeleton, new_skeleton)
    app.logger.debug("Result skeleton is %s", skeleton)
    return skeleton

def flatten_data_as_csv(keys_paths, flatten_data):
    """Returns flattened data in CSV

    :param keys_paths: list of dict keys lists used for
        column names generation
    :param flatten_data: list of flattened data rows
    :return: stream with data in CSV format
    """
    app.logger.debug("Saving flatten data as CSV started")
    names = []
    for key_path in keys_paths:
        # Handling functions and list indexes in key_path
        key_texts = (getattr(k, '__name__', six.text_type(k))
                     for k in key_path)
        names.append('.'.join(key_texts))

    output = six.BytesIO()
    writer = csv.writer(output)

    def read_and_flush():
        output.seek(io.SEEK_SET)
        data = output.read()
        output.seek(io.SEEK_SET)
        output.truncate()
        return data

    for d in itertools.chain((names,), flatten_data):
        encoded_d = [s.encode("utf-8") if isinstance(s, six.text_type) else s
                     for s in d]
        writer.writerow(encoded_d)
        yield read_and_flush()
    app.logger.debug("Saving flatten data as CSV finished")

def get_index(target, *fields):
    """Gets the value of an index for the target object

    :param target: target object
    :param fields: field names for index creation
    :return: tuple of attribute values of target from 'fields'
    """
    if isinstance(target, dict):
        return tuple(target[field_name] for field_name in fields)
    else:
        return tuple(getattr(target, field_name) for field_name in fields)
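The helper above works uniformly on dicts and attribute-bearing objects, which is what lets the OSWL horizon index both raw dicts and ORM rows by the same field names. A quick standalone illustration (the sample field values are made up):

```python
def get_index(target, *fields):
    """Build an index tuple from the named fields of a dict or object."""
    if isinstance(target, dict):
        return tuple(target[field_name] for field_name in fields)
    return tuple(getattr(target, field_name) for field_name in fields)


oswl = {'master_node_uid': 'x1', 'cluster_id': 7, 'resource_type': 'vm'}
print(get_index(oswl, 'master_node_uid', 'cluster_id', 'resource_type'))
# → ('x1', 7, 'vm')
```

Because the result is a tuple, it is hashable and can be used directly as a dictionary key.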
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import copy
import datetime
import itertools

import six

from fuel_analytics.api.app import app
from fuel_analytics.api.resources.utils import export_utils
from fuel_analytics.api.resources.utils.skeleton import OSWL_SKELETONS


class OswlStatsToCsv(object):

    OSWL_INDEX_FIELDS = ('master_node_uid', 'cluster_id', 'resource_type')

    def get_additional_keys_paths(self, resource_type):
        """Returns additional key paths for resource type info."""
        return [[resource_type, 'is_added'],
                [resource_type, 'is_modified'],
                [resource_type, 'is_removed']]

    def get_resource_keys_paths(self, resource_type):
        """Gets key paths for a resource type. The CSV key paths are a
        combination of the oswl, resource type and additional key paths

        :return: tuple of lists of oswl, resource type, CSV key paths
        """
        app.logger.debug("Getting %s keys paths", resource_type)
        oswl_key_paths = export_utils.get_keys_paths(OSWL_SKELETONS['general'])
        resource_key_paths = export_utils.get_keys_paths(
            {resource_type: OSWL_SKELETONS[resource_type]})

        resource_additional_key_paths = self.get_additional_keys_paths(
            resource_type)

        result_key_paths = oswl_key_paths + resource_key_paths + \
            resource_additional_key_paths

        app.logger.debug("%s keys paths fetched: %s", resource_type,
                         result_key_paths)
        return oswl_key_paths, resource_key_paths, result_key_paths

    def get_additional_resource_info(self, resource, resource_type,
                                     added_ids, modified_ids, removed_ids):
        """Gets additional info about operations with the resource

        :param resource: resource info
        :param resource_type: resource type
        :param added_ids: set of added ids from oswl
        :param modified_ids: set of modified ids from oswl
        :param removed_ids: set of removed ids from oswl
        :return: list of boolean flags: is_added, is_modified, is_removed
        """
        id_val = resource.get('id')
        is_added = id_val in added_ids
        is_modified = id_val in modified_ids
        is_removed = id_val in removed_ids
        result = [is_added, is_modified, is_removed]

        return result

    def handle_empty_version_info(self, oswl, clusters_versions):
        """Handles empty version info in the oswl object

        For OSWLs with empty version_info data we compose version_info
        from InstallationStructure data and assign it to the oswl object.
        We bind InstallationStructure.structure.clusters to the oswl
        and extract fuel_version and fuel_release from the clusters data.
        If fuel_version info isn't provided by the clusters data then
        InstallationStructure.structure.fuel_release is used.

        :param oswl: OSWL DB object
        :type oswl: fuel_analytics.api.db.model.OpenStackWorkloadStats
        :param clusters_versions: cache for saving cluster versions with
            structure {mn_uid: {cluster_id: fuel_release}}
        :type clusters_versions: dict
        """
        if oswl.version_info:
            return

        mn_uid = oswl.master_node_uid
        cluster_id = oswl.cluster_id

        # Fetching version_info from the cache
        version_info = clusters_versions.get(mn_uid, {}).get(cluster_id)

        # If the clusters data doesn't contain fuel_version we use
        # data from the installation info
        if not version_info:
            release = oswl.fuel_release_from_inst_info or {}
            version_info = {
                'fuel_version': release.get('release'),
                'release_version': release.get('openstack_version')
            }

        oswl.version_info = version_info

    def get_flatten_resources(self, resource_type, oswl_keys_paths,
                              resource_keys_paths, oswls,
                              clusters_version_info):
        """Gets flattened resources data

        :param oswl_keys_paths: list of keys paths in the OpenStack
            workload info
        :param resource_keys_paths: list of keys paths in the resource
        :param oswls: list of OpenStack workloads
        :param clusters_version_info: clusters version info cache.
            The cache is used only if version_info is not provided in
            the oswl. Cache structure: {mn_uid: {cluster_id: fuel_release}}
        :return: generator on flattened resources info collection
        """
        app.logger.debug("Getting OSWL flatten %s info started",
                         resource_type)

        for oswl in oswls:
            try:
                self.handle_empty_version_info(oswl, clusters_version_info)
                flatten_oswl = export_utils.get_flatten_data(oswl_keys_paths,
                                                             oswl)
                resource_data = oswl.resource_data
                current = resource_data.get('current', [])
                added = resource_data.get('added', [])
                modified = resource_data.get('modified', [])
                removed = resource_data.get('removed', [])
                # Filtering wrongly formatted removed data
                # delivered by old Fuel versions
                removed = [res for res in removed if len(res) > 2]

                # Extracting ids of oswl resources
                added_ids = set(item['id'] for item in added)
                modified_ids = set(item['id'] for item in modified)
                removed_ids = set(item['id'] for item in removed)

                # If a resource is removed and added several times it
                # would be present in both current and removed. We should
                # exclude duplicates from the flattened rows of the same
                # resource.
                current_ids = set(item['id'] for item in current)
                finally_removed = (res for res in removed
                                   if res['id'] not in current_ids)

                for resource in itertools.chain(current, finally_removed):
                    flatten_resource = export_utils.get_flatten_data(
                        resource_keys_paths, {resource_type: resource})
                    additional_info = self.get_additional_resource_info(
                        resource, oswl.resource_type,
                        added_ids, modified_ids, removed_ids)
                    yield flatten_oswl + flatten_resource + additional_info
            except Exception as e:
                # Generation of the report should be reliable
                app.logger.error("Getting OSWL flatten data failed. "
                                 "Id: %s, master node uid: %s, "
                                 "resource_data: %s, error: %s",
                                 oswl.id, oswl.master_node_uid,
                                 oswl.resource_data, six.text_type(e))
        app.logger.debug("Getting flatten %s info finished", resource_type)

    def get_last_sync_datetime(self, oswl):
        """Gets the datetime of the last synchronization of the master
        node with the stats collector.

        :param oswl: OpenStackWorkloadStats object with mixed info
            from InstallationStructure
        :return: datetime
        """
        return max(filter(
            lambda x: x is not None,
            (oswl.installation_created_date, oswl.installation_updated_date)))

    def stream_horizon_content(self, horizon, on_date):
        """Streams the content of the horizon of oswls. Obsolete
        oswls with master node last sync date < on_date will be removed
        from the horizon.

        :param horizon: dictionary of oswls, indexed by OSWL_INDEX_FIELDS
        :param on_date: date for which the oswls time series is generated
        :return: generator on the horizon of oswls on on_date
        """
        app.logger.debug("Streaming oswls content on date: %s", on_date)
        # Copying keys to a list to make dictionary modification possible
        # during iteration through it
        keys = list(six.iterkeys(horizon))
        for key in keys:
            last_value = horizon[key]
            update_date = self.get_last_sync_datetime(last_value)
            if on_date is not None and update_date.date() < on_date:
                # Removing obsolete oswl from the horizon
                horizon.pop(key)
            else:
                return_value = copy.deepcopy(last_value)
                return_value.stats_on_date = on_date
                # Removing modified, removed, added from resource data
                # if we are duplicating the oswl in the CSV
                if return_value.created_date != on_date:
                    return_value.resource_data['added'] = {}
                    return_value.resource_data['removed'] = {}
                    return_value.resource_data['modified'] = {}
                yield return_value

def _add_oswl_to_horizon(self, horizon, oswl):
|
|
||||||
idx = export_utils.get_index(oswl, *self.OSWL_INDEX_FIELDS)
|
|
||||||
|
|
||||||
# We can have duplication of the oswls in the DB with the same
|
|
||||||
# checksum but with different external_id. Checksum is calculated
|
|
||||||
# by the resource_data['current'] only. Thus we shouldn't add
|
|
||||||
# the same oswl into horizon if it already present in it and
|
|
||||||
# has no differences in checksum and added, modified, removed
|
|
||||||
# resources.
|
|
||||||
old_oswl = horizon.get(idx)
|
|
||||||
if (
|
|
||||||
old_oswl is None or
|
|
||||||
old_oswl.resource_checksum != oswl.resource_checksum or
|
|
||||||
old_oswl.resource_data != oswl.resource_data
|
|
||||||
):
|
|
||||||
horizon[idx] = oswl
|
|
||||||
|
|
||||||
def fill_date_gaps(self, oswls, to_date):
|
|
||||||
"""Fills the gaps of stats info. If masternode sends stats on
|
|
||||||
on_date and we haven't oswl on this date - the last one oswl for
|
|
||||||
this master_node, cluster_id, resource_type will be used
|
|
||||||
:param oswls: collection of SQLAlchemy oswl objects ordered by
|
|
||||||
creation_date in ascending order
|
|
||||||
:param to_date: fill gaps until this date
|
|
||||||
:return: generator on seamless by dates oswls collection
|
|
||||||
"""
|
|
||||||
app.logger.debug("Filling gaps in oswls started")
|
|
||||||
horizon = {}
|
|
||||||
last_date = None
|
|
||||||
|
|
||||||
# Filling horizon of oswls on last date. Oswls are ordered by
|
|
||||||
# created_date so, then last_date is changed we can assume horizon
|
|
||||||
# of oswls is filled and can be shown
|
|
||||||
for oswl in oswls:
|
|
||||||
last_date = last_date or oswl.created_date
|
|
||||||
if last_date != oswl.created_date:
|
|
||||||
# Filling gaps in created_dates of oswls
|
|
||||||
while last_date != oswl.created_date:
|
|
||||||
app.logger.debug("Filling gap on date: %s for oswl: %s",
|
|
||||||
last_date, oswl.id)
|
|
||||||
for content in self.stream_horizon_content(
|
|
||||||
horizon, last_date):
|
|
||||||
yield content
|
|
||||||
last_date += datetime.timedelta(days=1)
|
|
||||||
if last_date > to_date:
|
|
||||||
break
|
|
||||||
|
|
||||||
self._add_oswl_to_horizon(horizon, oswl)
|
|
||||||
|
|
||||||
# Filling gaps if oswls exhausted on date before to_date
|
|
||||||
if last_date is not None:
|
|
||||||
while last_date <= to_date:
|
|
||||||
for content in self.stream_horizon_content(
|
|
||||||
horizon, last_date):
|
|
||||||
yield content
|
|
||||||
last_date += datetime.timedelta(days=1)
|
|
||||||
|
|
||||||
app.logger.debug("Filling gaps in oswls finished")
|
|
||||||
|
|
||||||
def export(self, resource_type, oswls, to_date, clusters_version_info):
|
|
||||||
app.logger.info("Export oswls %s info into CSV started",
|
|
||||||
resource_type)
|
|
||||||
oswl_keys_paths, resource_keys_paths, csv_keys_paths = \
|
|
||||||
self.get_resource_keys_paths(resource_type)
|
|
||||||
seamless_oswls = self.fill_date_gaps(
|
|
||||||
oswls, to_date)
|
|
||||||
flatten_resources = self.get_flatten_resources(
|
|
||||||
resource_type, oswl_keys_paths, resource_keys_paths,
|
|
||||||
seamless_oswls, clusters_version_info)
|
|
||||||
result = export_utils.flatten_data_as_csv(
|
|
||||||
csv_keys_paths, flatten_resources)
|
|
||||||
app.logger.info("Export oswls %s info into CSV finished",
|
|
||||||
resource_type)
|
|
||||||
return result
|
|
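The `fill_date_gaps` generator removed above carries the last known oswl per index forward one day at a time until `to_date`. A minimal standalone sketch of the same carry-forward idea, using hypothetical simplified `(created_date, key, value)` records rather than the removed module's SQLAlchemy objects:

```python
import datetime


def fill_date_gaps(records, to_date):
    """Carry the last-seen value per key forward one day at a time,
    yielding (date, key, value) rows until to_date (simplified sketch)."""
    horizon = {}        # key -> last seen value
    last_date = None
    for created_date, key, value in records:  # must be sorted by created_date
        last_date = last_date or created_date
        while last_date != created_date:      # emit horizon for missing days
            for k, v in horizon.items():
                yield last_date, k, v
            last_date += datetime.timedelta(days=1)
            if last_date > to_date:
                break
        horizon[key] = value
    if last_date is not None:
        while last_date <= to_date:           # pad series out to to_date
            for k, v in horizon.items():
                yield last_date, k, v
            last_date += datetime.timedelta(days=1)


d = datetime.date
rows = [(d(2015, 1, 1), 'vm', 1), (d(2015, 1, 3), 'vm', 2)]
out = list(fill_date_gaps(rows, d(2015, 1, 3)))
# Jan 2 has no record, so the Jan 1 value is repeated for that day.
```

The missing day (Jan 2) is filled with the previous day's value, which is exactly why the real code later blanks `added`/`removed`/`modified` on duplicated rows.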
@ -1,312 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from fuel_analytics.api.common import consts


def count(items):
    return len(items) if items is not None else None


INSTALLATION_INFO_SKELETON = {
    'structure': {
        'allocated_nodes_num': None,
        'clusters': [
            {
                'attributes': {
                    'assign_public_to_all_nodes': None,
                    'auto_assign_floating_ip': None,
                    'ceilometer': None,
                    'corosync_verified': None,
                    'debug_mode': None,
                    'ephemeral_ceph': None,
                    'external_mongo_replset': None,
                    'external_ntp_list': None,
                    'heat': None,
                    'images_ceph': None,
                    'images_vcenter': None,
                    'ironic': None,
                    'iser': None,
                    'kernel_params': None,
                    'libvirt_type': None,
                    'mellanox': None,
                    'mellanox_vf_num': None,
                    'mongo': None,
                    'murano': None,
                    'murano-cfapi': None,
                    'murano_glance_artifacts_plugin': None,
                    'neutron_dvr': None,
                    'neutron_l2_pop': None,
                    'neutron_l3_ha': None,
                    'neutron_qos': None,
                    'nova_quota': None,
                    'nsx': None,
                    'nsx_replication': None,
                    'nsx_transport': None,
                    'objects_ceph': None,
                    'osd_pool_size': None,
                    'provision_method': None,
                    'public_ssl_cert_source': None,
                    'public_ssl_horizon': None,
                    'public_ssl_services': None,
                    'puppet_debug': None,
                    'repos': None,
                    'resume_guests_state_on_host_boot': None,
                    'sahara': None,
                    'syslog_transport': None,
                    'task_deploy': None,
                    'use_cow_images': None,
                    'volumes_block_device': None,
                    'vcenter': None,
                    'vlan_splinters': None,
                    'vlan_splinters_ovs': None,
                    'volumes_ceph': None,
                    'volumes_lvm': None,
                    'volumes_vmdk': None,
                    'workloads_collector_enabled': None,
                },
                'vmware_attributes': {
                    'vmware_az_cinder_enable': None,
                    'vmware_az_nova_computes_num': None
                },
                'fuel_version': None,
                'id': None,
                'is_customized': None,
                'mode': None,
                'net_provider': None,
                'node_groups': [{'id': None}],
                'nodes': [
                    {
                        'bond_interfaces': count,
                        'nic_interfaces': count,
                        'group_id': None,
                        'id': None,
                        'online': None,
                        'os': None,
                        'pending_addition': None,
                        'pending_deletion': None,
                        'roles': [None],
                        'pending_roles': [None],
                        'status': None,
                        'meta': {
                            'cpu': {
                                'real': None,
                                'total': None,
                                'spec': [
                                    {
                                        'frequency': None,
                                        'model': None,
                                    },
                                    10  # number of showing items
                                ]
                            },
                            'memory': {
                                'slots': None,
                                'total': None,
                                'maximum_capacity': None,
                                'devices': [
                                    {
                                        'frequency': None,
                                        'type': None,
                                        'size': None
                                    },
                                    10  # number of showing items
                                ]
                            },
                            'disks': [
                                {
                                    'name': None,
                                    'removable': None,
                                    'model': None,
                                    'size': None
                                },
                                10  # number of showing items
                            ],
                            'system': {
                                'product': None,
                                'family': None,
                                'version': None,
                                'manufacturer': None
                            },
                            'interfaces': [
                                {
                                    'pxe': None,
                                    'name': None,
                                    'driver': None,
                                    'state': None,
                                    'max_speed': None,
                                    'current_speed': None,
                                    'offloading_modes': [
                                        {
                                            'state': None,
                                            'name': None,
                                            'sub': [
                                                {
                                                    'state': None,
                                                    'name': None,
                                                }
                                            ]
                                        }
                                    ],
                                    'interface_properties': {
                                        'mtu': None,
                                        'numa_node': None,
                                        'disable_offloading': None,
                                        'sriov': {
                                            'available': None,
                                            'enabled': None,
                                            'physnet': None,
                                            'sriov_numvfs': None,
                                            'sriov_totalvfs': None
                                        },
                                        'dpdk': {
                                            'enabled': None
                                        }
                                    }
                                }
                            ],
                            'numa_topology': {
                                'numa_nodes': [
                                    {
                                        'memory': None,
                                        'id': None,
                                        'cpus': count
                                    }
                                ],
                                'supported_hugepages': [None],
                                'distances': [None]
                            }
                        }
                    }
                ],
                'nodes_num': None,
                'network_configuration': {
                    'segmentation_type': None,
                    'net_l23_provider': None,
                    'net_manager': None,
                    'fixed_networks_vlan_start': None,
                    'fixed_network_size': None,
                    'fixed_networks_amount': None
                },
                'installed_plugins': [
                    {
                        'name': None,
                        'version': None,
                        'releases': [{
                            'deployment_scripts_path': None,
                            'repository_path': None,
                            'mode': [None],
                            'os': None,
                            'version': None,
                        }],
                        'fuel_version': None,
                        'package_version': None,
                        'is_hotpluggable': None,
                        'groups': [None],
                        'licenses': [None]
                    }
                ],
                'release': {'name': None, 'os': None},
                'status': None
            }
        ],
        'clusters_num': None,
        'fuel_release': {
            'api': None,
            'build_id': None,
            'build_number': None,
            'feature_groups': [None],
            'release': None
        },
        'fuel_packages': [None],
        'unallocated_nodes_num': None,
        'user_information': {
            'company': None,
            'contact_info_provided': None,
            'email': None,
            'name': None
        }
    },
    'master_node_uid': None,
    'modification_date': None,
    'creation_date': None
}

OSWL_SKELETONS = {
    'general': {
        'master_node_uid': None,
        'cluster_id': None,
        'stats_on_date': None,
        'resource_type': None,
        'version_info': {
            'fuel_version': None,
            'release_version': None,
            'release_os': None,
            'release_name': None,
            'environment_version': None
        }
    },
    consts.OSWL_RESOURCE_TYPES.vm: {
        'id': None,
        'status': None,
        'tenant_id': None,
        'host_id': None,
        'created_at': None,
        'power_state': None,
        'flavor_id': None,
        'image_id': None
    },
    consts.OSWL_RESOURCE_TYPES.flavor: {
        'id': None,
        'ram': None,
        'vcpus': None,
        'ephemeral': None,
        'disk': None,
        'swap': None,
    },
    consts.OSWL_RESOURCE_TYPES.volume: {
        'id': None,
        'availability_zone': None,
        'encrypted_flag': None,
        'bootable_flag': None,
        'status': None,
        'volume_type': None,
        'size': None,
        'snapshot_id': None,
        'tenant_id': None,
        'attachments': count
    },
    'volume_attachment': {
        "device": None,
        "server_id": None,
        "id": None
    },
    consts.OSWL_RESOURCE_TYPES.image: {
        'id': None,
        'minDisk': None,
        'minRam': None,
        'sizeBytes': None,
        'created_at': None,
        'updated_at': None,
    },
    consts.OSWL_RESOURCE_TYPES.tenant: {
        'id': None,
        'enabled_flag': None,
    },
    consts.OSWL_RESOURCE_TYPES.keystone_user: {
        'id': None,
        'enabled_flag': None,
        'tenant_id': None
    }
}
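The skeleton dicts above exist to drive column discovery for the CSV export: helpers in `export_utils` (removed elsewhere in this change and not shown here) walk a skeleton to collect key paths, then extract values along those paths from real payloads. An illustrative reimplementation of that idea, assuming nothing about the real `export_utils` signatures:

```python
def get_keys_paths(skeleton, prefix=()):
    """Recursively collect key paths from a nested skeleton dict."""
    paths = []
    for key, value in sorted(skeleton.items()):
        if isinstance(value, dict):
            paths.extend(get_keys_paths(value, prefix + (key,)))
        else:
            paths.append(prefix + (key,))
    return paths


def flatten(keys_paths, data):
    """Extract values along each path; anything missing becomes None."""
    row = []
    for path in keys_paths:
        node = data
        for key in path:
            node = node.get(key) if isinstance(node, dict) else None
            if node is None:
                break
        row.append(node)
    return row


skeleton = {'release': {'name': None, 'os': None}, 'status': None}
paths = get_keys_paths(skeleton)
row = flatten(paths, {'release': {'name': 'Juno'}, 'status': 'operational'})
```

Because every payload is projected through the same skeleton-derived paths, rows with missing fields still align column-for-column in the resulting CSV.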
@ -1,263 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import collections
import copy
import six

from fuel_analytics.api.app import app
from fuel_analytics.api.resources.utils import export_utils
from fuel_analytics.api.resources.utils.skeleton import \
    INSTALLATION_INFO_SKELETON


ActionLogInfo = collections.namedtuple(
    'ActionLogInfo', ['external_id', 'master_node_uid', 'cluster_id',
                      'status', 'end_datetime', 'action_name'])


class StatsToCsv(object):

    MANUFACTURERS_NUM = 3
    PLATFORM_NAMES_NUM = 3

    ACTION_LOG_INDEX_FIELDS = ('master_node_uid', 'cluster_id', 'action_name')
    NETWORK_VERIFICATION_COLUMN = 'verify_networks_status'
    NETWORK_VERIFICATION_ACTION = 'verify_networks'

    def get_cluster_keys_paths(self):
        app.logger.debug("Getting cluster keys paths")
        structure_skeleton = copy.deepcopy(INSTALLATION_INFO_SKELETON)
        clusters = structure_skeleton['structure'].pop('clusters')
        structure_key_paths = export_utils.get_keys_paths(structure_skeleton)
        cluster_skeleton = clusters[0]

        # Removing lists of dicts from cluster skeleton
        cluster_skeleton.pop('nodes', None)
        cluster_skeleton.pop('installed_plugins', None)
        cluster_key_paths = export_utils.get_keys_paths(cluster_skeleton)

        result_key_paths = cluster_key_paths + structure_key_paths

        # Handling network verification check
        result_key_paths.append([self.NETWORK_VERIFICATION_COLUMN])
        app.logger.debug("Cluster keys paths got")
        return structure_key_paths, cluster_key_paths, result_key_paths

    def _get_subcluster_keys_paths(self, skeleton):
        key_paths = export_utils.get_keys_paths(skeleton)
        structure_key_paths = [['master_node_uid'],
                               ['structure', 'fuel_packages']]
        cluster_key_paths = [['cluster_id'], ['cluster_fuel_version']]
        result_key_paths = key_paths + cluster_key_paths + structure_key_paths
        return structure_key_paths, cluster_key_paths, \
            key_paths, result_key_paths

    def get_plugin_keys_paths(self):
        app.logger.debug("Getting plugin keys paths")
        structure_skeleton = copy.deepcopy(INSTALLATION_INFO_SKELETON)
        clusters = structure_skeleton['structure']['clusters']
        plugin_skeleton = clusters[0]['installed_plugins'][0]
        plugin_skeleton.pop('releases', None)

        result = self._get_subcluster_keys_paths(plugin_skeleton)
        app.logger.debug("Plugin keys paths got")
        return result

    def get_node_keys_paths(self):
        app.logger.debug("Getting node keys paths")
        structure_skeleton = copy.deepcopy(INSTALLATION_INFO_SKELETON)
        clusters = structure_skeleton['structure']['clusters']
        node_skeleton = clusters[0]['nodes'][0]

        result = self._get_subcluster_keys_paths(node_skeleton)
        app.logger.debug("Node keys paths got")
        return result

    def build_action_logs_idx(self, action_logs):
        app.logger.debug("Building action logs index started")
        action_logs_idx = {}
        for action_log in action_logs:
            idx = export_utils.get_index(
                action_log, *self.ACTION_LOG_INDEX_FIELDS)
            action_logs_idx[idx] = ActionLogInfo(*action_log)
        app.logger.debug("Building action logs index finished")
        return action_logs_idx

    def get_flatten_clusters(self, structure_keys_paths, cluster_keys_paths,
                             inst_structures, action_logs):
        """Gets flatten clusters data from installation structures collection
        :param structure_keys_paths: list of keys paths in the
        installation structure
        :param cluster_keys_paths: list of keys paths in the cluster
        :param inst_structures: list of installation structures
        :param action_logs: list of action logs
        :return: list of flatten clusters info
        """
        app.logger.debug("Getting flatten clusters info is started")
        action_logs_idx = self.build_action_logs_idx(action_logs)

        for inst_structure in inst_structures:
            try:
                structure = inst_structure.structure
                clusters = structure.pop('clusters', [])
                flatten_structure = export_utils.get_flatten_data(
                    structure_keys_paths, inst_structure)

                for cluster in clusters:
                    cluster.pop('installed_plugins', None)
                    flatten_cluster = export_utils.get_flatten_data(
                        cluster_keys_paths, cluster)
                    flatten_cluster.extend(flatten_structure)

                    # Adding network verification status
                    idx = export_utils.get_index(
                        {'master_node_uid': inst_structure.master_node_uid,
                         'cluster_id': cluster['id'],
                         'action_name': self.NETWORK_VERIFICATION_ACTION},
                        *self.ACTION_LOG_INDEX_FIELDS
                    )
                    al_info = action_logs_idx.get(idx)
                    nv_status = None if al_info is None else al_info.status
                    flatten_cluster.append(nv_status)
                    yield flatten_cluster
            except Exception as e:
                # Generation of report should be reliable
                app.logger.error("Getting flatten cluster data failed. "
                                 "Installation info id: %s, "
                                 "master node uid: %s, error: %s",
                                 inst_structure.id,
                                 inst_structure.master_node_uid,
                                 six.text_type(e))
        app.logger.debug("Flatten clusters info is got")

    def get_flatten_plugins(self, structure_keys_paths, cluster_keys_paths,
                            plugin_keys_paths, inst_structures):
        """Gets flatten plugins data from clusters from installation
        structures collection
        :param structure_keys_paths: list of keys paths in the
        installation structure
        :param cluster_keys_paths: list of keys paths in the cluster
        :param plugin_keys_paths: list of keys paths in the plugin
        :param inst_structures: list of installation structures
        :return: list of flatten plugins info
        """

        return self._get_flatten_subcluster_data(
            'installed_plugins',
            structure_keys_paths,
            cluster_keys_paths,
            plugin_keys_paths,
            inst_structures
        )

    def _get_flatten_subcluster_data(self, data_path, structure_keys_paths,
                                     cluster_keys_paths, keys_paths,
                                     inst_structures):
        """Gets flatten data from clusters from installation
        structures collection
        :param structure_keys_paths: list of keys paths in the
        installation structure
        :param cluster_keys_paths: list of keys paths in the cluster
        :param keys_paths: list of keys paths in the data
        :param inst_structures: list of installation structures
        :return: list of flatten data info
        """
        app.logger.debug("Getting flatten %s info started", data_path)

        for inst_structure in inst_structures:
            try:
                structure = inst_structure.structure
                clusters = structure.pop('clusters', [])
                flatten_structure = export_utils.get_flatten_data(
                    structure_keys_paths, inst_structure)

                for cluster in clusters:
                    cluster['cluster_id'] = cluster['id']
                    cluster['cluster_fuel_version'] = \
                        cluster.get('fuel_version')
                    flatten_cluster = export_utils.get_flatten_data(
                        cluster_keys_paths, cluster)
                    data = cluster.pop(data_path, [])
                    for item in data:
                        flatten_data = export_utils.get_flatten_data(
                            keys_paths, item)
                        flatten_data.extend(flatten_cluster)
                        flatten_data.extend(flatten_structure)
                        yield flatten_data
            except Exception as e:
                # Generation of report should be reliable
                app.logger.error("Getting flatten %s data failed. "
                                 "Installation info id: %s, "
                                 "master node uid: %s, error: %s",
                                 data_path,
                                 inst_structure.id,
                                 inst_structure.master_node_uid,
                                 six.text_type(e))
        app.logger.debug("Getting flatten %s info finished", data_path)

    def get_flatten_nodes(self, structure_keys_paths, cluster_keys_paths,
                          node_keys_paths, inst_structures):
        """Gets flatten nodes data from clusters from installation
        structures collection
        :param structure_keys_paths: list of keys paths in the
        installation structure
        :param cluster_keys_paths: list of keys paths in the cluster
        :param node_keys_paths: list of keys paths in the node
        :param inst_structures: list of installation structures
        :return: list of flatten nodes info
        """
        return self._get_flatten_subcluster_data(
            'nodes',
            structure_keys_paths,
            cluster_keys_paths,
            node_keys_paths,
            inst_structures
        )

    def export_clusters(self, inst_structures, action_logs):
        app.logger.info("Export clusters info into CSV started")
        structure_keys_paths, cluster_keys_paths, csv_keys_paths = \
            self.get_cluster_keys_paths()
        flatten_clusters = self.get_flatten_clusters(
            structure_keys_paths, cluster_keys_paths,
            inst_structures, action_logs)
        result = export_utils.flatten_data_as_csv(
            csv_keys_paths, flatten_clusters)
        app.logger.info("Export clusters info into CSV finished")
        return result

    def export_plugins(self, inst_structures):
        app.logger.info("Export plugins info into CSV started")
        (structure_keys_paths, cluster_keys_paths,
         plugin_keys_paths, csv_keys_paths) = self.get_plugin_keys_paths()
        flatten_plugins = self.get_flatten_plugins(
            structure_keys_paths, cluster_keys_paths,
            plugin_keys_paths, inst_structures)
        result = export_utils.flatten_data_as_csv(
            csv_keys_paths, flatten_plugins)
        app.logger.info("Export plugins info into CSV finished")
        return result

    def export_nodes(self, inst_structures):
        app.logger.info("Export nodes info into CSV started")
        (structure_keys_paths, cluster_keys_paths,
         node_keys_paths, csv_keys_paths) = self.get_node_keys_paths()
        flatten_nodes = self.get_flatten_nodes(
            structure_keys_paths, cluster_keys_paths,
            node_keys_paths, inst_structures)
        result = export_utils.flatten_data_as_csv(
            csv_keys_paths, flatten_nodes)
        app.logger.info("Export nodes info into CSV finished")
        return result
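`build_action_logs_idx` above joins action logs to clusters through a composite key so the per-cluster lookup is O(1). A small sketch of that index-then-lookup pattern, using a hypothetical simplified `ActionLogInfo` (the real namedtuple has more fields):

```python
import collections

# Simplified record for illustration only; not the removed module's schema.
ActionLogInfo = collections.namedtuple(
    'ActionLogInfo', ['master_node_uid', 'cluster_id', 'action_name', 'status'])


def build_index(action_logs, *fields):
    """Index records by a tuple of field values for constant-time joins."""
    return {tuple(getattr(al, f) for f in fields): al for al in action_logs}


logs = [ActionLogInfo('uid-1', 42, 'verify_networks', 'ready')]
idx = build_index(logs, 'master_node_uid', 'cluster_id', 'action_name')

# Look up the network-verification result for one cluster; a miss yields None,
# mirroring how the exporter leaves the CSV column empty.
hit = idx.get(('uid-1', 42, 'verify_networks'))
status = None if hit is None else hit.status
```

Building the index once and probing it per cluster avoids rescanning the action-log list for every row of the report.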
@ -1,13 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
@ -1,13 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
@ -1,13 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
@ -1,13 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
@ -1,256 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from datetime import datetime
from datetime import timedelta
import random
import six
from six.moves import xrange
import uuid

from fuel_analytics.test.base import BaseTest

from fuel_analytics.api.app import db
from fuel_analytics.api.db.model import ActionLog
from fuel_analytics.api.db.model import InstallationStructure


class InstStructureTest(BaseTest):

    def gen_id(self, id_range=(0, 1000000)):
        return random.randint(*id_range)

    def generate_node(
            self,
            roles_range=(0, 5),
            node_roles=('compute', 'controller', 'cinder', 'ceph-osd',
                        'zabbix', 'mongo'),
            oses=('Ubuntu', 'CentOs', 'Ubuntu LTS XX'),
            node_statuses=('ready', 'discover', 'provisioning',
                           'provisioned', 'deploying', 'error'),
            manufacturers=('Dell Inc.', 'VirtualBox', 'QEMU',
                           'VirtualBox', 'Supermicro', 'Cisco Systems Inc',
                           'KVM', 'VMWARE', 'HP')
    ):
        roles = []
        for _ in xrange(random.randint(*roles_range)):
            roles.append(random.choice(node_roles))
        node = {
            'id': self.gen_id(),
            'roles': roles,
            'os': random.choice(oses),
            'status': random.choice(node_statuses),
            'manufacturer': random.choice(manufacturers)
        }
        return node

    def generate_cluster(
            self,
            nodes_range=(0, 100),
            oses=('Ubuntu', 'CentOs', 'Ubuntu LTS XX'),
            release_names=('Juno on CentOS 6.5', 'Juno on Ubuntu 12.04.4'),
            release_versions=('6.0 TechPreview', '6.0 GA', '6.1'),
            cluster_statuses=('new', 'deployment', 'stopped', 'operational',
                              'error', 'remove', 'update', 'update_error'),
            libvirt_names=('qemu', 'kvm', 'vCenter'),
            plugins_num_range=(0, 5)
    ):
        nodes_num = random.randint(*nodes_range)
        cluster = {
            'id': self.gen_id(),
            'nodes_num': nodes_num,
            'release': {
                'os': random.choice(oses),
                'name': random.choice(release_names),
                'version': random.choice(release_versions),
            },
            'status': random.choice(cluster_statuses),
            'nodes': [],
            'attributes': {
                'libvirt_type': random.choice(libvirt_names),
                'heat': random.choice((True, False)),
            },
            'vmware_attributes': {
                'vmware_az_cinder_enable': [True, False],
            }
        }
        network_configuration = self.generate_network_configuration()
        cluster.update(network_configuration)
        cluster['installed_plugins'] = self.generate_installed_plugins(
            plugins_num_range=plugins_num_range)
        for _ in six.moves.range(nodes_num):
            cluster['nodes'].append(self.generate_node())
        return cluster

    def generate_network_configuration(self):
        return random.choice((
            {'network_configuration': {
                'segmentation_type': random.choice(("gre", "vlan")),
                'net_l23_provider': random.choice(("ovs", "nsx")),
            }},
            {'network_configuration': {
                'net_manager': random.choice(('FlatDHCPManager',
                                              'VlanManager')),
                'fixed_networks_vlan_start': random.choice((2, 3, None)),
                'fixed_network_size': random.randint(0, 255),
                'fixed_networks_amount': random.randint(0, 10),
            }},
            {'network_configuration': {}},
            {}
        ))

    def generate_installed_plugins(
            self,
            plugins_num_range=(0, 5),
            plugins_names=('fuel-plugin-gluster-fs', 'fuel-plugin-vpnaas')
    ):
        plugins_info = []
        for i in six.moves.range(random.randint(*plugins_num_range)):
            plugins_info.append({
                'id': i,
                'name': random.choice(plugins_names)
            })
        return plugins_info

    def _fuel_release_gen(self, releases):
        return {
            'release': random.choice(releases),
            'api': 1,
            'nailgun_sha': "Unknown build nailgun",
            'astute_sha': "Unknown build astute",
            'fuellib_sha': "Unknown build fuellib",
            'ostf_sha': "Unknown build ostf",
            'fuelmain_sha': "Unknown build fuelmain",
            'feature_groups': ['experimental', 'mirantis']
        }

    def _fuel_release_gen_2015_04(self, releases):
        return {
            'release': random.choice(releases),
            'api': 1,
            'nailgun_sha': "Unknown build nailgun",
|
|
||||||
'astute_sha': "Unknown build astute astute",
|
|
||||||
'fuel-ostf_sha': "Unknown build fuel-ostf",
|
|
||||||
'python-fuelclient_sha': "Unknown build python-fuelclient",
|
|
||||||
'fuel-library_sha': "Unknown build fuel-library",
|
|
||||||
'fuelmain_sha': "Unknown build fuelmain",
|
|
||||||
'feature_groups': ['experimental', 'mirantis']
|
|
||||||
}
|
|
||||||
|
|
||||||
def generate_structure(self, clusters_num_range=(0, 10),
|
|
||||||
unallocated_nodes_num_range=(0, 20),
|
|
||||||
plugins_num_range=(0, 5),
|
|
||||||
release_generators=('_fuel_release_gen',
|
|
||||||
'_fuel_release_gen_2015_04'),
|
|
||||||
releases=("6.0-techpreview", "6.0-ga")):
|
|
||||||
clusters_num = random.randint(*clusters_num_range)
|
|
||||||
|
|
||||||
release_generator = random.choice(release_generators)
|
|
||||||
fuel_release = getattr(self, release_generator)(releases)
|
|
||||||
|
|
||||||
structure = {
|
|
||||||
'fuel_release': fuel_release,
|
|
||||||
'clusters_num': clusters_num,
|
|
||||||
'clusters': [],
|
|
||||||
'unallocated_nodes_num_range': random.randint(
|
|
||||||
*unallocated_nodes_num_range),
|
|
||||||
'allocated_nodes_num': 0
|
|
||||||
}
|
|
||||||
|
|
||||||
for _ in xrange(clusters_num):
|
|
||||||
cluster = self.generate_cluster(
|
|
||||||
plugins_num_range=plugins_num_range)
|
|
||||||
structure['clusters'].append(cluster)
|
|
||||||
structure['allocated_nodes_num'] += cluster['nodes_num']
|
|
||||||
return structure
|
|
||||||
|
|
||||||
def generate_inst_structures(
|
|
||||||
self, installations_num=100, creation_date_range=(1, 10),
|
|
||||||
modification_date_range=(1, 10), clusters_num_range=(0, 10),
|
|
||||||
plugins_num_range=(0, 5), releases=("6.0-techpreview", "6.0-ga"),
|
|
||||||
release_generators=('_fuel_release_gen',
|
|
||||||
'_fuel_release_gen_2015_04'),
|
|
||||||
is_filtered_values=(False, None)):
|
|
||||||
for _ in xrange(installations_num):
|
|
||||||
mn_uid = '{}'.format(uuid.uuid4())
|
|
||||||
structure = self.generate_structure(
|
|
||||||
clusters_num_range=clusters_num_range,
|
|
||||||
plugins_num_range=plugins_num_range,
|
|
||||||
releases=releases,
|
|
||||||
release_generators=release_generators)
|
|
||||||
creation_date = datetime.utcnow() - timedelta(
|
|
||||||
days=random.randint(*creation_date_range))
|
|
||||||
modification_date = datetime.utcnow() - timedelta(
|
|
||||||
days=random.randint(*modification_date_range))
|
|
||||||
obj = InstallationStructure(
|
|
||||||
master_node_uid=mn_uid,
|
|
||||||
structure=structure,
|
|
||||||
creation_date=creation_date,
|
|
||||||
modification_date=modification_date,
|
|
||||||
is_filtered=random.choice(is_filtered_values)
|
|
||||||
)
|
|
||||||
yield obj
|
|
||||||
|
|
||||||
def _get_saved_objs(self, generator_func, *args, **kwargs):
|
|
||||||
objs = generator_func(*args, **kwargs)
|
|
||||||
result = []
|
|
||||||
for obj in objs:
|
|
||||||
db.session.add(obj)
|
|
||||||
result.append(obj)
|
|
||||||
return result
|
|
||||||
|
|
||||||
def get_saved_inst_structures(self, *args, **kwargs):
|
|
||||||
return self._get_saved_objs(self.generate_inst_structures,
|
|
||||||
*args, **kwargs)
|
|
||||||
|
|
||||||
def generate_action_logs(
|
|
||||||
self, inst_sturctures, num_per_struct_range=(1, 100),
|
|
||||||
action_types=('nailgun_task',),
|
|
||||||
action_groups=('cluster_changes', 'cluster_checking',
|
|
||||||
'operations'),
|
|
||||||
action_names=('deploy', 'deployment', 'provision',
|
|
||||||
'stop_deployment', 'reset_environment',
|
|
||||||
'update', 'node_deletion', 'cluster_deletion',
|
|
||||||
'check_before_deployment', 'check_networks',
|
|
||||||
'verify_networks')):
|
|
||||||
for struct in inst_sturctures:
|
|
||||||
for idx in six.moves.range(random.randint(*num_per_struct_range)):
|
|
||||||
action_type = random.choice(action_types)
|
|
||||||
action_name = random.choice(action_names)
|
|
||||||
body = {
|
|
||||||
"id": idx,
|
|
||||||
"actor_id": six.text_type(uuid.uuid4()),
|
|
||||||
"action_group": random.choice(action_groups),
|
|
||||||
"action_name": random.choice(action_names),
|
|
||||||
"action_type": action_type,
|
|
||||||
"start_timestamp": datetime.utcnow().isoformat(),
|
|
||||||
"end_timestamp": datetime.utcnow().isoformat(),
|
|
||||||
"additional_info": {
|
|
||||||
"parent_task_id": None,
|
|
||||||
"subtasks_ids": [],
|
|
||||||
"operation": action_name
|
|
||||||
},
|
|
||||||
"is_sent": False,
|
|
||||||
"cluster_id": idx
|
|
||||||
}
|
|
||||||
obj = ActionLog(
|
|
||||||
master_node_uid=struct.master_node_uid,
|
|
||||||
external_id=idx,
|
|
||||||
body=body
|
|
||||||
)
|
|
||||||
yield obj
|
|
||||||
|
|
||||||
def get_saved_action_logs(self, *args, **kwargs):
|
|
||||||
return self._get_saved_objs(self.generate_action_logs,
|
|
||||||
*args, **kwargs)
|
|
@ -1,354 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from datetime import datetime
from datetime import timedelta
import random
import six
from six.moves import range
import uuid

from fuel_analytics.test.base import BaseTest

from fuel_analytics.api.app import db
from fuel_analytics.api.common import consts
from fuel_analytics.api.db.model import InstallationStructure
from fuel_analytics.api.db.model import OpenStackWorkloadStats


class OswlTest(BaseTest):

    RESOURCE_TYPES = (
        consts.OSWL_RESOURCE_TYPES.vm,
        consts.OSWL_RESOURCE_TYPES.flavor,
        consts.OSWL_RESOURCE_TYPES.volume,
        consts.OSWL_RESOURCE_TYPES.image,
        consts.OSWL_RESOURCE_TYPES.tenant,
        consts.OSWL_RESOURCE_TYPES.keystone_user
    )

    RESOURCE_GENERATORS = {
        consts.OSWL_RESOURCE_TYPES.vm: ('generate_vms',
                                        'generate_modified_vms'),
        consts.OSWL_RESOURCE_TYPES.flavor: ('generate_flavors',
                                            'generate_modified_flavors'),
        consts.OSWL_RESOURCE_TYPES.volume: ('generate_volumes',
                                            'generate_modified_volumes'),
        consts.OSWL_RESOURCE_TYPES.image: ('generate_images',
                                           'generate_modified_images'),
        consts.OSWL_RESOURCE_TYPES.tenant: ('generate_tenants',
                                            'generate_modified_tenants'),
        consts.OSWL_RESOURCE_TYPES.keystone_user: (
            'generate_keystone_users', 'generate_modified_keystone_users')
    }

    def generate_removed_resources(self, num, gen_func):
        result = []
        for resource in gen_func(num):
            resource['time'] = datetime.utcnow().time().isoformat()
            result.append(resource)
        return result

    def generate_added_resources(self, num):
        result = []
        for i in range(num):
            result.append({
                'id': i,
                'time': datetime.utcnow().time().isoformat()
            })
        return result

    def generate_vms(self, vms_num, statuses=('ACTIVE', 'ERROR', 'BUILT'),
                     created_at_range=(1, 10),
                     power_states_range=(1, 10)):
        result = []
        for i in range(vms_num):
            result.append({
                'id': i,
                'status': random.choice(statuses),
                'tenant_id': 'tenant_id_{}'.format(i),
                'host_id': 'host_id_{}'.format(i),
                'created_at': (datetime.utcnow() - timedelta(
                    days=random.randint(*created_at_range))).isoformat(),
                'power_state': random.randint(*power_states_range),
                'flavor_id': 'flavor_id_{}'.format(i),
                'image_id': 'image_id_{}'.format(i),
            })
        return result

    def generate_modified_vms(self, vms_num, modifs_num_range=(1, 10),
                              power_states_range=(1, 10)):
        result = []
        for i in range(vms_num):
            for _ in range(random.randint(*modifs_num_range)):
                result.append({
                    'id': i,
                    'time': datetime.utcnow().time().isoformat(),
                    'power_state': random.choice(power_states_range)
                })
        return result

    def generate_flavors(self, num, ram_range=(64, 24000),
                         vcpus_range=(1, 64), ephemeral_range=(1, 30),
                         disk_range=(1, 2048), swap_range=(1, 128)):
        result = []
        for i in range(num):
            result.append({
                'id': i,
                'ram': random.randint(*ram_range),
                'vcpus': random.randint(*vcpus_range),
                'ephemeral': random.randint(*ephemeral_range),
                'disk': random.randint(*disk_range),
                'swap': random.randint(*swap_range),
            })
        return result

    def generate_modified_flavors(self, num, modifs_num_range=(1, 3),
                                  swap_range=(1, 128),
                                  disk_range=(13, 23)):
        result = []
        for i in range(num):
            for _ in range(random.randint(*modifs_num_range)):
                result.append({
                    'id': i,
                    'time': datetime.utcnow().time().isoformat(),
                    'swap': random.randint(*swap_range),
                    'disk': random.randint(*disk_range)
                })
        return result

    def _generate_volume_attachments(self, num, volume_id):
        result = []
        for i in six.moves.range(num):
            result.append({
                'device': '/dev/vdb{}'.format(i),
                'server_id': six.text_type(uuid.uuid4()),
                'volume_id': volume_id,
                'host_name': six.text_type(uuid.uuid4()),
                'id': six.text_type(uuid.uuid4()),
            })
        return result

    def generate_volumes(self, num, avail_zones=('zone_0', 'zone_1', 'zone_2'),
                         statuses=('available', 'error'),
                         volume_types=('test_0', 'test_1'),
                         size_range=(1, 1024),
                         attachments_range=(1, 3)):
        result = []
        for i in range(num):
            attachments = self._generate_volume_attachments(
                random.randint(*attachments_range), i)
            result.append({
                'id': i,
                'availability_zone': random.choice(avail_zones),
                'encrypted_flag': random.choice((False, True)),
                'bootable_flag': random.choice((False, True)),
                'status': random.choice(statuses),
                'volume_type': random.choice(volume_types),
                'size': random.randint(*size_range),
                'host': six.text_type(uuid.uuid4()),
                'snapshot_id': random.choice((
                    None, six.text_type(uuid.uuid4()))),
                'attachments': attachments,
                'tenant_id': six.text_type(uuid.uuid4())
            })
        return result

    def generate_modified_volumes(self, num, modifs_num_range=(1, 3),
                                  size_range=(1, 1024),
                                  volume_types=('test_0', 'test_1')):
        result = []
        for i in range(num):
            for _ in range(random.randint(*modifs_num_range)):
                result.append({
                    'id': i,
                    'time': datetime.utcnow().time().isoformat(),
                    'size': random.randint(*size_range),
                    'volume_type': random.choice(volume_types)
                })
        return result

    def generate_images(self, num, min_disk_range=(1, 1024),
                        min_ram_range=(1, 2048), size_range=(1, 4096),
                        created_at_range=(1, 10), updated_at_range=(1, 10)):
        result = []
        for i in range(num):
            result.append({
                'id': i,
                'minDisk': random.randint(*min_disk_range),
                'minRam': random.randint(*min_ram_range),
                'sizeBytes': random.randint(*size_range),
                'created_at': (datetime.utcnow() - timedelta(
                    days=random.randint(*created_at_range))).isoformat(),
                'updated_at': (datetime.utcnow() - timedelta(
                    days=random.randint(*updated_at_range))).isoformat(),
            })
        return result

    def generate_modified_images(self, num, modifs_num_range=(1, 3),
                                 min_disk_range=(1, 1024),
                                 min_ram_range=(1, 4096),
                                 size_bytes_range=(1, 1024000)):
        result = []
        for i in range(num):
            for _ in range(random.randint(*modifs_num_range)):
                result.append({
                    'id': i,
                    'time': datetime.utcnow().time().isoformat(),
                    'minDisk': random.randint(*min_disk_range),
                    'minRam': random.randint(*min_ram_range),
                    'sizeBytes': random.randint(*size_bytes_range),
                })
        return result

    def generate_tenants(self, num, enabled_flag_values=(True, False)):
        result = []
        for i in range(num):
            result.append({
                'id': i,
                'enabled_flag': random.choice(enabled_flag_values)
            })
        return result

    def generate_modified_tenants(self, num, modifs_num_range=(1, 3),
                                  enabled_flag_values=(True, False)):
        result = []
        for i in range(num):
            for _ in range(random.randint(*modifs_num_range)):
                result.append({
                    'id': i,
                    'time': datetime.utcnow().time().isoformat(),
                    'enabled_flag': random.choice(enabled_flag_values)
                })
        return result

    def generate_keystone_users(self, num, enabled_flag_values=(True, False)):
        result = []
        for i in range(num):
            result.append({
                'id': i,
                'enabled_flag': random.choice(enabled_flag_values),
                'tenant_id': six.text_type(uuid.uuid4())
            })
        return result

    def generate_modified_keystone_users(self, num, modifs_num_range=(1, 3),
                                         enabled_flag_values=(True, False)):
        result = []
        for i in range(num):
            for _ in range(random.randint(*modifs_num_range)):
                result.append({
                    'id': i,
                    'time': datetime.utcnow().time().isoformat(),
                    'enabled_flag': random.choice(enabled_flag_values)
                })
        return result

    def generate_oswls(self, oswl_num, resource_type,
                       current_num_range=(0, 7),
                       created_date_range=(1, 10),
                       added_num_range=(0, 5),
                       removed_num_range=(0, 3),
                       modified_num_range=(0, 15),
                       stats_per_mn_range=(1, 10),
                       cluster_ids_range=(1, 5)):
        i = 1
        current_mn_stats = 0
        gen_name, gen_modified_name = self.RESOURCE_GENERATORS[resource_type]
        gen = getattr(self, gen_name)
        gen_modified = getattr(self, gen_modified_name)
        while i <= oswl_num:
            if not current_mn_stats:
                mn_uid = six.text_type(uuid.uuid4())
                current_mn_stats = random.randint(*stats_per_mn_range)

            if current_mn_stats:
                i += 1
                created_date = (datetime.utcnow() - timedelta(
                    days=random.randint(*created_date_range))).date()
                obj = OpenStackWorkloadStats(
                    master_node_uid=mn_uid,
                    external_id=i,
                    cluster_id=random.choice(cluster_ids_range),
                    created_date=created_date,
                    updated_time=datetime.utcnow().time(),
                    resource_type=resource_type,
                    resource_checksum=six.text_type(uuid.uuid4()),
                    resource_data={
                        'current': gen(
                            random.randint(*current_num_range)),
                        'added': self.generate_added_resources(
                            random.randint(*added_num_range)),
                        'modified': gen_modified(
                            random.randint(*modified_num_range)),
                        'removed': self.generate_removed_resources(
                            random.randint(*removed_num_range),
                            gen)
                    }
                )
                current_mn_stats -= 1
                yield obj

    def get_saved_oswls(self, num, resource_type, *args, **kwargs):
        oswls = self.generate_oswls(num, resource_type, *args, **kwargs)
        result = []
        for oswl in oswls:
            db.session.add(oswl)
            result.append(oswl)
        db.session.commit()
        return result

    def generate_inst_structs(self, oswls,
                              creation_date_range=(2, 10),
                              modification_date_range=(2, 5),
                              is_modified_date_nullable=True,
                              is_filtered_values=(False, None),
                              releases=('6.0', '6.1')):

        mn_uids = set()
        for oswl in oswls:
            if oswl.master_node_uid not in mn_uids:
                creation_date = (datetime.utcnow() - timedelta(
                    days=random.randint(*creation_date_range))).\
                    date().isoformat()
                if random.choice((False, is_modified_date_nullable)):
                    modification_date = None
                else:
                    modification_date = (datetime.utcnow() - timedelta(
                        days=random.randint(*modification_date_range))).\
                        date().isoformat()
                structure = {
                    'fuel_release': {
                        'release': random.choice(releases)
                    }
                }
                obj = InstallationStructure(
                    master_node_uid=oswl.master_node_uid,
                    creation_date=creation_date,
                    modification_date=modification_date,
                    is_filtered=random.choice(is_filtered_values),
                    structure=structure,
                )
                mn_uids.add(oswl.master_node_uid)
                yield obj

    def get_saved_inst_structs(self, oswls, *args, **kwargs):
        inst_structs = self.generate_inst_structs(oswls, *args, **kwargs)
        result = []
        for inst_struct in inst_structs:
            db.session.add(inst_struct)
            result.append(inst_struct)
        db.session.commit()
        return result
@ -1,302 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from datetime import datetime
from datetime import timedelta

from fuel_analytics.test.api.resources.utils.oswl_test import OswlTest
from fuel_analytics.test.base import DbTest

from fuel_analytics.api.app import app
from fuel_analytics.api.app import db
from fuel_analytics.api.common import consts
from fuel_analytics.api.db.model import ActionLog
from fuel_analytics.api.errors import DateExtractionError
from fuel_analytics.api.resources import csv_exporter as ce
from fuel_analytics.api.resources.csv_exporter import extract_date
from fuel_analytics.api.resources.csv_exporter import get_action_logs
from fuel_analytics.api.resources.csv_exporter import get_from_date
from fuel_analytics.api.resources.csv_exporter import get_inst_structures
from fuel_analytics.api.resources.csv_exporter import get_inst_structures_query
from fuel_analytics.api.resources.csv_exporter import get_oswls_query
from fuel_analytics.api.resources.csv_exporter import get_resources_types
from fuel_analytics.api.resources.csv_exporter import get_to_date
from fuel_analytics.api.resources.utils.stats_to_csv import ActionLogInfo
from fuel_analytics.api.resources.utils.stats_to_csv import StatsToCsv


class CsvExporterTest(OswlTest, DbTest):

    def test_get_oswls_query(self):
        num = 2
        for resource_type in self.RESOURCE_TYPES:
            # Fetching oswls count
            count_before = get_oswls_query(resource_type).count()

            # Generating oswls without installation info
            oswls = self.get_saved_oswls(num, resource_type)

            # Checking the count of fetched oswls is not changed
            count_after = get_oswls_query(resource_type).count()
            self.assertEqual(count_before, count_after)

            # Saving inst structures
            self.get_saved_inst_structs(oswls)

            # Checking the count of fetched oswls is changed
            count_after = get_oswls_query(resource_type).count()
            self.assertEqual(num + count_before, count_after)

    def test_extract_date(self):
        with app.test_request_context():
            self.assertIsNone(extract_date('fake_name'))
        with app.test_request_context('/?from_date=2015-02-24'):
            self.assertEqual(datetime(2015, 2, 24).date(),
                             extract_date('from_date'))
        with app.test_request_context('/?from_date=20150224'):
            self.assertRaises(DateExtractionError, extract_date,
                              'from_date')

    def test_get_from_date(self):
        with app.test_request_context():
            expected = datetime.utcnow().date() - \
                timedelta(days=app.config['CSV_DEFAULT_FROM_DATE_DAYS'])
            actual = get_from_date()
            self.assertEqual(expected, actual)

        with app.test_request_context('/?from_date=2015-02-24'):
            expected = datetime(2015, 2, 24).date()
            actual = get_from_date()
            self.assertEqual(expected, actual)

    def test_to_date(self):
        with app.test_request_context():
            actual = get_to_date()
            self.assertEqual(datetime.utcnow().date(), actual)

        with app.test_request_context('/?to_date=2015-02-25'):
            expected = datetime(2015, 2, 25).date()
            actual = get_to_date()
            self.assertEqual(expected, actual)

    def test_get_oswls_query_with_dates(self):
        num = 20
        for resource_type in self.RESOURCE_TYPES:
            # Fetching oswls count
            count_before = get_oswls_query(resource_type, None, None).count()

            # Generating oswls without installation info
            oswls = self.get_saved_oswls(num, resource_type)
            self.get_saved_inst_structs(oswls)

            # Checking the count of fetched oswls
            count_after = get_oswls_query(
                resource_type, None, None).count()
            self.assertEqual(num + count_before, count_after)
            count_after = get_oswls_query(
                resource_type, None, datetime.utcnow().date()).count()
            self.assertEqual(num + count_before, count_after)
            count_after = get_oswls_query(
                resource_type,
                datetime.utcnow().date() - timedelta(days=100),
                datetime.utcnow().date()).count()
            self.assertEqual(num + count_before, count_after)
            count_after = get_oswls_query(
                resource_type,
                datetime.utcnow().date(),
                datetime.utcnow().date() - timedelta(days=100)).count()
            self.assertEqual(0, count_after)

    def test_get_inst_structures_query(self):
        # Fetching inst structures count
        count_before = get_inst_structures_query().count()

        oswls = self.get_saved_oswls(200, consts.OSWL_RESOURCE_TYPES.vm)
        inst_structures = self.get_saved_inst_structs(oswls)
        num = len(inst_structures)

        # Checking the count of fetched inst structures
        count_after = get_inst_structures_query(None, None).count()
        self.assertEqual(num + count_before, count_after)
        count_after = get_inst_structures_query(
            None, datetime.utcnow().date()).count()
        self.assertEqual(num + count_before, count_after)
        count_after = get_inst_structures_query(
            datetime.utcnow().date() - timedelta(days=100),
            datetime.utcnow().date()).count()
        self.assertEqual(num + count_before, count_after)
        count_after = get_inst_structures_query(
            datetime.utcnow().date(),
            datetime.utcnow().date() - timedelta(days=100)).count()
        self.assertEqual(0, count_after)

    def test_get_inst_structures_query_not_returns_filtered(self):
        # Fetching inst structures count
        count_initial = get_inst_structures_query().count()

        # Generating filtered inst structures
        oswls = self.get_saved_oswls(10, consts.OSWL_RESOURCE_TYPES.vm,
                                     stats_per_mn_range=(1, 1))
        self.get_saved_inst_structs(oswls, is_filtered_values=(True,))

        # Checking filtered inst structures are not fetched
        count_with_filtered = get_inst_structures_query(None, None).count()
        self.assertEqual(count_initial, count_with_filtered)

        # Generating not filtered inst structures
        oswls = self.get_saved_oswls(20, consts.OSWL_RESOURCE_TYPES.vm,
                                     stats_per_mn_range=(1, 1))
        inst_structures = self.get_saved_inst_structs(
            oswls, is_filtered_values=(None, False))
        not_filtered_num = len(inst_structures)

        # Checking not filtered inst structures are fetched
        count_with_not_filtered = get_inst_structures_query(None, None).count()
        get_inst_structures_query(None, None).all()
        self.assertEqual(count_initial + not_filtered_num,
                         count_with_not_filtered)

    def test_no_filtered_structures(self):
        oswls = self.get_saved_oswls(100, consts.OSWL_RESOURCE_TYPES.vm,
                                     stats_per_mn_range=(1, 1))
        self.get_saved_inst_structs(
            oswls, is_filtered_values=(True, False, None))
        with app.test_request_context():
            for inst_structure in get_inst_structures():
                self.assertNotEqual(True, inst_structure.is_filtered)

    def test_get_resources_types(self):
        for resource_type in self.RESOURCE_TYPES:
            self.get_saved_oswls(1, resource_type)
        resources_names = get_resources_types()
        self.assertItemsEqual(self.RESOURCE_TYPES, resources_names)

    def test_get_all_reports(self):
        oswls = []
        for resource_type in self.RESOURCE_TYPES:
            oswls.extend(self.get_saved_oswls(10, resource_type))
        self.get_saved_inst_structs(oswls)

        to_date = datetime.utcnow()
        from_date = to_date - timedelta(days=30)
        reports = ce.get_all_reports(from_date, to_date, {})

        expected_reports = [
            ce.CLUSTERS_REPORT_FILE,
            ce.PLUGINS_REPORT_FILE
        ]
        for resource_type in self.RESOURCE_TYPES:
            expected_reports.append('{}.csv'.format(resource_type))

        actual_reports = [name for _, name in reports]
        self.assertItemsEqual(expected_reports, actual_reports)

    def test_get_all_reports_with_future_dates(self):
        oswls = []
        for resource_type in self.RESOURCE_TYPES:
            oswls.extend(self.get_saved_oswls(10, resource_type))
        self.get_saved_inst_structs(oswls)

        from_date = datetime.utcnow()
        to_date = from_date + timedelta(days=7)

        reports_generators = ce.get_all_reports(from_date, to_date, {})

        # Checking no exception is raised
        for report_generator, report_name in reports_generators:
            for _ in report_generator:
                pass

    def test_get_action_logs(self):
        action_name = StatsToCsv.NETWORK_VERIFICATION_ACTION
        action_logs = [
            ActionLog(
                master_node_uid='ids_order',
                external_id=200,
                body={'cluster_id': 1,
                      'end_timestamp': datetime.utcnow().isoformat(),
                      'action_type': 'nailgun_task',
                      'action_name': action_name,
                      'additional_info': {'ended_with_status': 'error'}}
            ),
            ActionLog(
                master_node_uid='ids_order',
                external_id=1,
                body={'cluster_id': 1,
                      'end_timestamp': datetime.utcnow().isoformat(),
                      'action_type': 'nailgun_task',
                      'action_name': action_name,
                      'additional_info': {'ended_with_status': 'ready'}}
            ),
            ActionLog(
                master_node_uid='normal',
                external_id=200,
                body={'cluster_id': 1,
                      'end_timestamp': datetime.utcnow().isoformat(),
                      'action_type': 'nailgun_task',
                      'action_name': action_name,
                      'additional_info': {'ended_with_status': 'ready'}}
            ),
            ActionLog(
                master_node_uid='yesterday',
                external_id=1,
                body={'cluster_id': 1,
                      'end_timestamp': (datetime.utcnow() -
                                        timedelta(days=-1)).isoformat(),
                      'action_type': 'nailgun_task',
                      'action_name': action_name,
                      'additional_info': {'ended_with_status': 'ready'}}
            ),
            ActionLog(
                master_node_uid='wrong_name',
                external_id=1,
                body={'cluster_id': 1,
                      'end_timestamp': (datetime.utcnow() -
                                        timedelta(days=-1)).isoformat(),
                      'action_type': 'nailgun_task',
                      'action_name': 'fake_name',
                      'additional_info': {'ended_with_status': 'ready'}}
            ),
            ActionLog(
                master_node_uid='no_end_ts',
                external_id=1,
                body={'cluster_id': 1, 'action_type': 'nailgun_task',
                      'action_name': action_name,
                      'additional_info': {'ended_with_status': 'ready'}}
            ),
        ]
        for action_log in action_logs:
            db.session.add(action_log)
        db.session.commit()
        to_date = from_date = datetime.utcnow().date().strftime('%Y-%m-%d')

        req_params = '/?from_date={0}&to_date={1}'.format(from_date, to_date)
        with app.test_request_context(req_params):
            action_logs = list(get_action_logs())

        # Checking no old and no no_end_ts action logs
        for action_log in action_logs:
            al = ActionLogInfo(*action_log)
            self.assertNotIn(al.master_node_uid, ('no_end_ts', 'yesterday'))

        # Checking the right action logs are selected
        # self.assertEqual(2, len(action_logs))
        for action_log in action_logs:
            al = ActionLogInfo(*action_log)
            self.assertIn(al.master_node_uid, ('normal', 'ids_order'), al)
|
|
||||||
|
|
||||||
# Checking last action log is selected
|
|
||||||
for action_log in action_logs:
|
|
||||||
al = ActionLogInfo(*action_log)
|
|
||||||
self.assertEqual(200, al.external_id, al)
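The three assertion blocks above encode one rule: within the date window, the surviving log per master node is the one with the greatest `external_id`. A standalone sketch of that selection (the dict records are hypothetical stand-ins for `ActionLogInfo` rows; this is an illustration of the expected behaviour, not the actual `get_action_logs` query):

```python
def latest_action_logs(action_logs):
    """Keep only the newest log (highest external_id) per master node (sketch)."""
    latest = {}
    for log in action_logs:
        uid = log['master_node_uid']
        current = latest.get(uid)
        if current is None or log['external_id'] > current['external_id']:
            latest[uid] = log
    # Sorted output just makes the result deterministic for inspection.
    return sorted(latest.values(), key=lambda log: log['master_node_uid'])
```

Applied to the fixtures above, both surviving logs carry `external_id == 200`, which is exactly what the final loop asserts.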
@@ -1,278 +0,0 @@
# -*- coding: utf-8 -*-
#
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from fuel_analytics.test.base import BaseTest

from fuel_analytics.api.resources.utils import export_utils


class O(object):
    """Helper object."""
    def __init__(self, a, b, c):
        self.a = a
        self.b = b
        self.c = c


class ExportUtilsTest(BaseTest):

    def test_get_key_paths(self):
        skeleton = {'a': 'b', 'c': 'd'}
        paths = export_utils.get_keys_paths(skeleton)
        self.assertListEqual([['a'], ['c']], paths)

        skeleton = {'a': {'e': 'f', 'g': None}}
        paths = export_utils.get_keys_paths(skeleton)
        self.assertListEqual([['a', 'e'], ['a', 'g']], paths)

    def test_get_key_paths_for_lists(self):
        skeleton = {'a': [{'b': None}, 2], 'c': [None, 2]}
        actual = export_utils.get_keys_paths(skeleton)
        expected = [['a', 0, 'b'], ['a', 1, 'b'], ['c', 0], ['c', 1]]
        self.assertListEqual(expected, actual)

        skeleton = {'h': [{'a': 'b', 'c': 'd'}, 1], 't': None}
        actual = export_utils.get_keys_paths(skeleton)
        self.assertListEqual([['h', 0, 'a'], ['h', 0, 'c'], ['t']], actual)

    def test_get_key_paths_for_empty_lists(self):
        skeleton = {'h': [], 't': None}
        actual = export_utils.get_keys_paths(skeleton)
        self.assertListEqual([['h'], ['t']], actual)
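The path-expansion rules these assertions pin down (nested dict keys concatenate, a list skeleton is an `[item, repeats]` pair, an empty list terminates the path) can be sketched as a small recursive function. This is an illustrative reimplementation, not the actual `export_utils.get_keys_paths`; key ordering is simplified to sorted order:

```python
def get_keys_paths(skeleton):
    """Expand a skeleton into an ordered list of key paths (sketch)."""
    if isinstance(skeleton, dict):
        paths = []
        for key in sorted(skeleton):
            for sub in get_keys_paths(skeleton[key]):
                paths.append([key] + sub)
        return paths
    if isinstance(skeleton, list) and skeleton:
        # A non-empty list skeleton is an [item_skeleton, repeats] pair.
        item_skeleton, repeats = skeleton
        paths = []
        for idx in range(repeats):
            for sub in get_keys_paths(item_skeleton):
                paths.append([idx] + sub)
        return paths
    # Scalars, None and empty lists all end the path here.
    return [[]]
```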

    def test_get_flatten_data(self):
        data = [
            {'a': 'b', 'b': {'e': 2.1}},
            {'a': 'ee\nxx', 'b': {'e': 3.1415}, 'c': ['z', 'zz']},
            O('y', {'e': 44}, None),
            O('yy', {'e': 45}, ['b', 'e'])
        ]
        skeleton = {'a': None, 'b': {'e': None}, 'c': [None, 2]}
        expected_flatten_data = [
            ['b', 2.1, None, None],
            ['ee\nxx', 3.1415, 'z', 'zz'],
            ['y', 44, None, None],
            ['yy', 45, 'b', 'e']
        ]

        key_paths = export_utils.get_keys_paths(skeleton)

        for idx, expected in enumerate(expected_flatten_data):
            actual = export_utils.get_flatten_data(key_paths, data[idx])
            self.assertListEqual(expected, actual)

        for idx, data in enumerate(data):
            actual = export_utils.get_flatten_data(key_paths, data)
            self.assertListEqual(expected_flatten_data[idx], actual)
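The flattening behaviour asserted above can be sketched as a path walk that falls back from dict lookup to attribute lookup, which is how both plain dicts and `O` instances yield rows. A simplified illustration, not the actual `export_utils.get_flatten_data` (the callable-skeleton and list-joining cases exercised by the later tests are omitted):

```python
def get_flatten_data(key_paths, obj):
    """Walk each key path into obj and collect one flat row (sketch)."""
    row = []
    for path in key_paths:
        value = obj
        for key in path:
            if value is None:
                break
            if isinstance(value, dict):
                value = value.get(key)
            elif isinstance(value, (list, tuple)):
                # Integer indexes past the end of the data yield None.
                value = value[key] if key < len(value) else None
            else:
                # Objects such as O expose their data as attributes.
                value = getattr(value, key, None)
        row.append(value)
    return row
```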

    def test_get_flatten_data_for_functions(self):
        skeleton = {'a': None, 'b': len, 'c': max}
        data = [
            O('y', [1, 2], [0, 42, -1]),
            {'a': 'yy', 'b': {'e': 45}, 'c': ['z', 'e']}
        ]
        expected_flatten_data = [
            ['y', 2, 42],
            ['yy', 1, 'z']
        ]

        key_paths = export_utils.get_keys_paths(skeleton)

        for idx, expected in enumerate(expected_flatten_data):
            actual = export_utils.get_flatten_data(key_paths, data[idx])
            self.assertEqual(expected, actual)

        for idx, data in enumerate(data):
            actual = export_utils.get_flatten_data(key_paths, data)
            self.assertEqual(expected_flatten_data[idx], actual)

    def test_get_flatten_data_for_list(self):
        b_repeats = 1
        e_repeats = 2
        skeleton = {
            'a': None,
            'b': [
                {'d': None, 'e': [{'f': None}, e_repeats]},
                b_repeats
            ],
            'c': []
        }

        expected_keys = [
            ['a'],
            ['b', 0, 'd'], ['b', 0, 'e', 0, 'f'], ['b', 0, 'e', 1, 'f'],
            ['c']
        ]
        self.assertEqual(expected_keys, export_utils.get_keys_paths(skeleton))

        data = [
            O('a_val_o', [{'d': 'd_0_o', 'e': [{'f': 'f_0_o'}]}],
              ['c_o_0', 'c_o_1']),
            {'a': 'a_val', 'b': [{'d': 'd_0', 'e': []}, {'d': 'ignored'}],
             'c': 'c_val'}
        ]

        expected_flatten_data = [
            ['a_val_o', 'd_0_o', 'f_0_o', None, 'c_o_0 c_o_1'],
            ['a_val', 'd_0', None, None, 'c_val'],
        ]

        key_paths = export_utils.get_keys_paths(skeleton)

        for idx, expected in enumerate(expected_flatten_data):
            actual = export_utils.get_flatten_data(key_paths, data[idx])
            self.assertEqual(expected, actual)

        for idx, data in enumerate(data):
            actual = export_utils.get_flatten_data(key_paths, data)
            self.assertEqual(expected_flatten_data[idx], actual)

    def test_get_flatten_as_csv_unicode(self):
        data = [
            {'a': u'b'},
            {'a': 'tt', u'эюя': 'x'},
        ]
        expected_csv = [
            'a,эюя\r\n',
            'b,\r\n',
            'tt,x\r\n'
        ]
        skeleton = export_utils.get_data_skeleton(data)
        key_paths = export_utils.get_keys_paths(skeleton)
        flatten_data = []
        for d in data:
            flatten_data.append(export_utils.get_flatten_data(key_paths, d))

        result = export_utils.flatten_data_as_csv(key_paths, flatten_data)
        for idx, actual_csv in enumerate(result):
            self.assertEqual(expected_csv[idx], actual_csv)
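The CSV shape asserted above (a header row of column names first, empty cells for missing values, `\r\n` line endings) can be reproduced with the stdlib `csv` module. A sketch of the generator interface, not the actual `export_utils.flatten_data_as_csv`:

```python
import csv
import io


def flatten_data_as_csv(key_paths, flatten_data):
    """Yield CSV lines: a header of joined key paths, then one line per row (sketch)."""
    names = ['.'.join(str(k) for k in path) for path in key_paths]
    buf = io.StringIO()
    writer = csv.writer(buf)  # the default excel dialect terminates lines with \r\n
    for row in [names] + list(flatten_data):
        writer.writerow(['' if v is None else v for v in row])
        yield buf.getvalue()
        # Reset the buffer so each yielded chunk is exactly one CSV line.
        buf.seek(0)
        buf.truncate(0)
```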

    def test_dict_construct_skeleton(self):
        data = {'a': 'b'}
        expected = {'a': None}
        actual = export_utils.construct_skeleton(data)
        self.assertDictEqual(expected, actual)

        data = {'a': 'b', 'x': None}
        expected = {'a': None, 'x': None}
        actual = export_utils.construct_skeleton(data)
        self.assertDictEqual(expected, actual)

    def test_list_construct_skeleton(self):
        data = ['a', 'b', 'c']
        actual = export_utils.construct_skeleton(data)
        self.assertListEqual([], actual)

        data = []
        actual = export_utils.construct_skeleton(data)
        self.assertListEqual([], actual)

        data = [{'a': None}, {'b': 'x'}, {'a': 4, 'c': 'xx'}, {}]
        actual = export_utils.construct_skeleton(data)
        self.assertItemsEqual(
            actual[0].keys(),
            ['a', 'b', 'c']
        )

        data = [
            'a',
            ['a', 'b', []],
            [],
            [{'x': 'z'}, 'zz', {'a': 'b'}],
            ['a'],
            {'p': 'q'}
        ]
        actual = export_utils.construct_skeleton(data)
        expected = [[[], {'a': None, 'x': None}], {'p': None}]
        self.assertListEqual(expected, actual)

    def test_construct_skeleton(self):
        data = {'a': 'b', 'c': [[{'d': 'e'}], 'f']}
        expected = {'a': None, 'c': [[{'d': None}]]}
        actual = export_utils.construct_skeleton(data)
        self.assertEqual(expected, actual)

        data = {'a': {'b': []}}
        expected = {'a': {'b': []}}
        actual = export_utils.construct_skeleton(data)
        self.assertEqual(expected, actual)

        data = {'a': {'b': [{'c': 'd'}, {'e': 'f'}]}}
        expected = {'a': {'b': [{'c': None, 'e': None}]}}
        actual = export_utils.construct_skeleton(data)
        self.assertEqual(expected, actual)

    def test_get_skeleton_for_dicts(self):
        data = [
            {'ci': {'p': True, 'e': '@', 'n': 'n'}},
            # reducing fields in nested dict
            {'ci': {'p': False}},
            # adding new value
            {'a': 'b'},
            # checking empty dict
            {}
        ]
        actual = export_utils.get_data_skeleton(data)
        expected = {'a': None, 'ci': {'p': None, 'e': None, 'n': None}}
        self.assertEqual(expected, actual)
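`get_data_skeleton` merges many heterogeneous records into one superset skeleton, which the assertion above checks for nested dicts. A minimal merging sketch, illustrative only (list merging, which the next test covers, is left out, as is the conflicting-types case):

```python
def get_data_skeleton(structures):
    """Merge dict records into a skeleton containing every key seen (sketch)."""
    skeleton = {}
    for struct in structures:
        for key, value in struct.items():
            if isinstance(value, dict):
                # Recurse so nested keys from all records are merged too.
                sub = skeleton.setdefault(key, {})
                sub.update(get_data_skeleton([value]))
            else:
                # Scalar values collapse to a None placeholder.
                skeleton.setdefault(key, None)
    return skeleton
```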

    def test_get_skeleton_for_lists(self):
        data = [
            {'c': [{'s': 'v', 'n': 2}, {'s': 'vv', 'n': 22}]},
            # adding new value in the list
            {'c': [{'z': 'p'}]},
            # checking empty list
            {'c': []},
        ]
        actual = export_utils.get_data_skeleton(data)
        expected = {'c': [{'s': None, 'n': None, 'z': None}]}
        self.assertEqual(expected, actual)

    def test_get_skeleton(self):
        data = [
            {'ci': {'p': True, 'e': '@', 'n': 'n'}},
            # reducing fields in nested dict
            {'ci': {'p': False}},
            # adding list values
            {'c': [{'s': 'v', 'n': 2}, {'s': 'vv', 'n': 22}]},
            # adding new value in the list
            {'c': [{'z': 'p'}]},
            # checking empty list
            {'c': []},
            # adding new value
            {'a': 'b'},
        ]
        actual = export_utils.get_data_skeleton(data)
        expected = {'a': None, 'ci': {'p': None, 'e': None, 'n': None},
                    'c': [{'s': None, 'n': None, 'z': None}]}
        self.assertEqual(expected, actual)

    def test_get_index(self):

        class Indexed(object):
            """Helper object for testing indexing of objects."""
            def __init__(self, **kwds):
                self.__dict__.update(kwds)

        checks = [
            (Indexed(**{'one': 1, 'two': 2}), ('one',), (1,)),
            (Indexed(**{'one': 1, 'two': 2}), ('one', 'two'), (1, 2)),
            (Indexed(**{'one': 1, 'two': 2}), (), ()),
        ]
        for obj, fields, idx in checks:
            self.assertTupleEqual(idx, export_utils.get_index(obj, *fields))
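For attribute-bearing objects like `Indexed`, the behaviour checked above reduces to a `getattr` comprehension. A sketch consistent with these checks (the real fuel_analytics helper may handle more cases):

```python
def get_index(obj, *fields):
    """Return the values of the named attributes of obj as a tuple (sketch)."""
    return tuple(getattr(obj, field) for field in fields)


class Indexed(object):
    """Helper mirroring the test fixture: attributes come from kwargs."""
    def __init__(self, **kwds):
        self.__dict__.update(kwds)
```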
@@ -1,220 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import datetime
import json
import six

from fuel_analytics.test.api.resources.utils.inst_structure_test import \
    InstStructureTest
from fuel_analytics.test.api.resources.utils.oswl_test import \
    OswlTest
from fuel_analytics.test.base import DbTest

from fuel_analytics.api.app import app
from fuel_analytics.api.app import db
from fuel_analytics.api.common import consts
from fuel_analytics.api.db import model
from fuel_analytics.api.resources import json_exporter


class JsonExporterTest(InstStructureTest, OswlTest, DbTest):

    def test_jsonify_collection(self):
        variants = [[], [{}], [{'a': 'b'}, {'c': 'd'}]]
        for variant in variants:
            it = iter(variant)
            jsonified = six.text_type(''.join(
                json_exporter._jsonify_collection(it)))
            restored = json.loads(jsonified)
            self.assertItemsEqual(variant, restored)

    def test_get_installation_info_not_found(self):
        with app.test_request_context():
            resp = self.client.get('/api/v1/json/installation_info/xxxx')
            self.check_response_error(resp, 404)

    def test_get_installation_info(self):
        structs = self.get_saved_inst_structures(installations_num=10)
        with app.test_request_context():
            for struct in structs:
                url = '/api/v1/json/installation_info/{}'.format(
                    struct.master_node_uid)
                resp = self.client.get(url)
                self.check_response_ok(resp)
                # Checking the response is json
                json.loads(resp.data)

    def test_get_oswls(self):
        num = 10
        for resource_type in self.RESOURCE_TYPES:
            oswls = self.get_saved_oswls(num, resource_type)
            structs = self.get_saved_inst_structs(oswls)
            with app.test_request_context():
                for struct in structs:
                    url = '/api/v1/json/oswls/{}'.format(
                        struct.master_node_uid)
                    resp = self.client.get(url)
                    self.check_response_ok(resp)
                    # Checking the response is json
                    json.loads(resp.data)

    def test_get_oswls_by_resource_type(self):
        num = 10
        for resource_type in self.RESOURCE_TYPES:
            oswls = self.get_saved_oswls(num, resource_type)
            structs = self.get_saved_inst_structs(oswls)
            with app.test_request_context():
                for struct in structs:
                    url = '/api/v1/json/oswls/{}/{}'.format(
                        struct.master_node_uid, resource_type)
                    resp = self.client.get(url)
                    self.check_response_ok(resp)
                    # Checking the response is json
                    json.loads(resp.data)

    def test_get_action_logs(self):
        structs = self.get_saved_inst_structures(installations_num=10)
        self.get_saved_action_logs(structs)
        with app.test_request_context():
            for struct in structs:
                url = '/api/v1/json/action_logs/{}'.format(
                    struct.master_node_uid)
                resp = self.client.get(url)
                self.check_response_ok(resp)
                # Checking the response is json
                json.loads(resp.data)

    def test_get_dict_param(self):
        # Triples of param_name, param_value, expected
        name = 'param_name'
        variants = (
            ('wrong_name', {}, {}),
            (name, {}, {}), (name, None, {}), (name, 'a', {}),
            (name, 1, {}), (name, [], {}), (name, (), {}),
            (name, {'a': 'b'}, {'a': 'b'})
        )
        for param_name, param_value, expected in variants:
            req_params = '/?{0}={1}'.format(
                param_name, json.dumps(param_value))
            with app.test_request_context(req_params):
                self.assertDictEqual(
                    expected,
                    json_exporter.get_dict_param(name)
                )
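The variants table documents the contract: anything that does not decode to a dict collapses to `{}`. Detached from Flask, the decoding step can be sketched like this (a hypothetical stand-in that takes the raw query-string value directly instead of reading it from `flask.request`):

```python
import json


def get_dict_param(raw_value):
    """Decode a JSON-encoded query parameter, defaulting to {} (sketch)."""
    try:
        value = json.loads(raw_value) if raw_value else {}
    except (TypeError, ValueError):
        # Unparseable input ('a', b'\xff', ...) falls back to an empty dict.
        value = {}
    # Valid JSON that is not an object (1, [], "x") also falls back.
    return value if isinstance(value, dict) else {}
```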

    def test_get_paging_params(self):
        name = 'paging_params'
        limit_default = app.config.get('JSON_DB_DEFAULT_LIMIT')
        variants = (
            (name, {}, {'limit': limit_default, 'offset': 0}),
            (name, [], {'limit': limit_default, 'offset': 0}),
            (name, 4, {'limit': limit_default, 'offset': 0}),
            ('wrong_name', 4, {'limit': limit_default, 'offset': 0}),
            (name, {'trash': 'x'}, {'limit': limit_default, 'offset': 0}),
            (name, {'limit': limit_default + 1}, {'limit': limit_default + 1,
                                                 'offset': 0}),
            (name, {'limit': limit_default + 1, 'offset': 50},
             {'limit': limit_default + 1, 'offset': 50}),
        )

        for param_name, param_value, expected in variants:
            req_params = '/?{0}={1}'.format(
                param_name, json.dumps(param_value))

            with app.test_request_context(req_params):
                self.assertEqual(
                    expected,
                    json_exporter.get_paging_params()
                )
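The normalization these variants assert — unknown keys dropped, `limit` and `offset` defaulted — can be sketched without the Flask request machinery. The `DEFAULT_LIMIT` constant is a made-up stand-in for `app.config['JSON_DB_DEFAULT_LIMIT']`, and the function takes the already-decoded value rather than reading the query string:

```python
DEFAULT_LIMIT = 100  # hypothetical stand-in for app.config['JSON_DB_DEFAULT_LIMIT']


def get_paging_params(param_value):
    """Normalize an arbitrary decoded value into paging params (sketch)."""
    params = param_value if isinstance(param_value, dict) else {}
    return {
        'limit': params.get('limit', DEFAULT_LIMIT),
        'offset': params.get('offset', 0),
    }
```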

    def test_row_as_serializable_dict(self):
        dt_now = datetime.datetime.utcnow()
        d_now = dt_now.date()
        t_now = dt_now.time()
        objs = [
            model.InstallationStructure(
                id=0, master_node_uid='xx', structure={'a': [], 'b': 'c'},
                creation_date=dt_now, modification_date=dt_now),
            model.ActionLog(id=0, master_node_uid='yy', external_id=33,
                            body={'c': 4}),
            model.OpenStackWorkloadStats(
                id=0, master_node_uid='zz', external_id=45, cluster_id=44,
                created_date=d_now, updated_time=t_now,
                resource_type='vm', resource_data={},
                resource_checksum='chk'
            )
        ]

        for expected in objs:
            # Checking the object is json serializable
            obj_json = json_exporter.row_as_serializable_dict(expected)
            json.dumps(obj_json)

            # Checking the object is serialized properly
            actual = expected.__class__(**obj_json)

            # Saving the object for proper types conversion
            db.session.add(actual)
            db.session.commit()

            # Checking SqlAlchemy objects equality
            for c in expected.__table__.columns:
                column_name = c.name
                self.assertEqual(
                    getattr(expected, column_name),
                    getattr(actual, column_name)
                )

    def test_get_filtered_installation_infos(self):
        total_num = 10
        filtered_num = 4
        structs = self.get_saved_inst_structures(installations_num=total_num)
        for struct in structs[:filtered_num]:
            struct.is_filtered = True
        db.session.flush()

        with app.test_request_context():
            url = '/api/v1/json/installation_infos/filtered'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            result = json.loads(resp.data)

        self.assertEqual(filtered_num, result['paging_params']['total'])
        for struct in result['objs']:
            self.assertTrue(struct['is_filtered'])

    def test_get_db_summary(self):
        oswls = self.get_saved_oswls(100, consts.OSWL_RESOURCE_TYPES.volume)
        inst_infos = self.get_saved_inst_structs(
            oswls, is_filtered_values=(True, False, None))
        not_filtered_num = len(filter(lambda x: x.is_filtered is False,
                                      inst_infos))
        filtered_num = len(inst_infos) - not_filtered_num
        action_logs = self.get_saved_action_logs(inst_infos)
        expected = {
            'oswl_stats': {'total': len(oswls)},
            'installation_structures': {
                'total': len(inst_infos),
                'filtered': filtered_num,
                'not_filtered': not_filtered_num
            },
            'action_logs': {'total': len(action_logs)}
        }
        with app.test_request_context():
            url = '/api/v1/json/summary'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            actual = json.loads(resp.data)
        self.assertEqual(expected, actual)
@@ -1,527 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import json
import memcache
import mock

from fuel_analytics.test.base import DbTest

from fuel_analytics.api.app import app
from fuel_analytics.api.app import db
from fuel_analytics.api.db import model


class JsonReportsTest(DbTest):

    @mock.patch.object(memcache.Client, 'get', return_value=None)
    def test_get_installations_num(self, _):
        structures = [
            model.InstallationStructure(
                master_node_uid='x0',
                structure={'clusters': [], 'clusters_num': 0},
                is_filtered=False,
                release='9.0'
            ),
            model.InstallationStructure(
                master_node_uid='x1',
                structure={'clusters': [], 'clusters_num': 0},
                is_filtered=False,
                release='8.0'
            ),
            model.InstallationStructure(
                master_node_uid='x2',
                structure={'clusters': [], 'clusters_num': 0},
                is_filtered=True,
                release='8.0'
            ),
        ]
        for structure in structures:
            db.session.add(structure)
        db.session.flush()

        with app.test_request_context():
            url = '/api/v1/json/report/installations'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual(2, resp['installations']['count'])

        with app.test_request_context():
            url = '/api/v1/json/report/installations?release=8.0'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual(1, resp['installations']['count'])

        with app.test_request_context():
            url = '/api/v1/json/report/installations?release=xxx'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual(0, resp['installations']['count'])

    @mock.patch.object(memcache.Client, 'get', return_value=None)
    def test_get_env_statuses(self, _):
        structures = [
            model.InstallationStructure(
                master_node_uid='x0',
                structure={
                    'clusters': [
                        {'status': 'new'},
                        {'status': 'operational'},
                        {'status': 'error'}
                    ]
                },
                is_filtered=False,
                release='9.0'
            ),
            model.InstallationStructure(
                master_node_uid='x1',
                structure={
                    'clusters': [
                        {'status': 'deployment'},
                        {'status': 'operational'},
                        {'status': 'operational'},
                    ]
                },
                is_filtered=False,
                release='8.0'
            ),
            model.InstallationStructure(
                master_node_uid='x2',
                structure={
                    'clusters': [
                        {'status': 'deployment'},
                        {'status': 'operational'},
                    ]
                },
                is_filtered=True,
                release='8.0'
            ),
        ]
        for structure in structures:
            db.session.add(structure)
        db.session.flush()

        with app.test_request_context():
            url = '/api/v1/json/report/installations'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual(
                {'new': 1, 'deployment': 1, 'error': 1, 'operational': 3},
                resp['environments']['statuses']
            )

        with app.test_request_context():
            url = '/api/v1/json/report/installations?release=8.0'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual(
                {'deployment': 1, 'operational': 2},
                resp['environments']['statuses']
            )

        with app.test_request_context():
            url = '/api/v1/json/report/installations?release=xxx'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual({}, resp['environments']['statuses'])

    @mock.patch.object(memcache.Client, 'get', return_value=None)
    def test_get_env_num(self, _):
        structures = [
            model.InstallationStructure(
                master_node_uid='x0',
                structure={'clusters': [{}, {}, {}], 'clusters_num': 3},
                is_filtered=False,
                release='9.0'
            ),
            model.InstallationStructure(
                master_node_uid='x1',
                structure={'clusters': [{}, {}], 'clusters_num': 2},
                is_filtered=False,
                release='8.0'
            ),
            model.InstallationStructure(
                master_node_uid='x2',
                structure={'clusters': [], 'clusters_num': 0},
                is_filtered=False,
                release='8.0'
            ),
            model.InstallationStructure(
                master_node_uid='x3',
                structure={'clusters': [], 'clusters_num': 0},
                is_filtered=False,
                release='8.0'
            ),
            model.InstallationStructure(
                master_node_uid='x4',
                structure={'clusters': [{}, {}, {}], 'clusters_num': 3},
                is_filtered=True,
                release='8.0'
            ),
        ]
        for structure in structures:
            db.session.add(structure)
        db.session.flush()

        with app.test_request_context():
            url = '/api/v1/json/report/installations'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual(5, resp['environments']['count'])
            self.assertEqual(
                {'0': 2, '2': 1, '3': 1},
                resp['installations']['environments_num']
            )

        with app.test_request_context():
            url = '/api/v1/json/report/installations?release=8.0'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual(2, resp['environments']['count'])
            self.assertEqual(
                {'0': 2, '2': 1},
                resp['installations']['environments_num']
            )

        with app.test_request_context():
            url = '/api/v1/json/report/installations?release=xxx'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual(0, resp['environments']['count'])
            self.assertEqual({}, resp['installations']['environments_num'])

    @mock.patch.object(memcache.Client, 'set')
    @mock.patch.object(memcache.Client, 'get', return_value=None)
    def test_caching(self, cached_mc_get, cached_mc_set):
        structures = [
            model.InstallationStructure(
                master_node_uid='x0',
                structure={'clusters': [{}, {}, {}]},
                is_filtered=False,
                release='9.0'
            ),
            model.InstallationStructure(
                master_node_uid='x1',
                structure={'clusters': [{}, {}]},
                is_filtered=False,
                release='8.0'
            ),
            model.InstallationStructure(
                master_node_uid='x2',
                structure={'clusters': [{}]},
                is_filtered=True,
                release='8.0'
            )
        ]
        for structure in structures:
            db.session.add(structure)
        db.session.flush()

        with app.test_request_context():
            url = '/api/v1/json/report/installations'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            self.assertEqual(1, cached_mc_get.call_count)

            # Checking that mc.set was called for each release and
            # for all releases summary info

            # Mock call args have structure (args, kwargs)
            expected_cache_keys = ['fuel-stats-installations-info8.0',
                                   'fuel-stats-installations-info9.0',
                                   'fuel-stats-installations-infoNone']

            actual_cache_keys = [call_args[0][0] for call_args in
                                 cached_mc_set.call_args_list]
            self.assertItemsEqual(expected_cache_keys, actual_cache_keys)

            # cached_mc_set.assert_has_calls(calls, any_order=True)
            self.assertEqual(len(expected_cache_keys),
                             cached_mc_set.call_count)

        with app.test_request_context():
            url = '/api/v1/json/report/installations?release=8.0'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            self.assertEqual(2, cached_mc_get.call_count)
            self.assertEqual(4, cached_mc_set.call_count)
            expected_cache_key = 'fuel-stats-installations-info8.0'
            actual_cache_key = cached_mc_set.call_args[0][0]
            self.assertEqual(expected_cache_key, actual_cache_key)
|
|
||||||
|
|
||||||
    @mock.patch.object(memcache.Client, 'set')
    @mock.patch.object(memcache.Client, 'get')
    def test_refresh_cached_data(self, cached_mc_get, cached_mc_set):
        structures = [
            model.InstallationStructure(
                master_node_uid='x0',
                structure={'clusters': [{}, {}, {}]},
                is_filtered=False,
                release='9.0'
            ),
            model.InstallationStructure(
                master_node_uid='x1',
                structure={'clusters': [{}, {}]},
                is_filtered=False,
                release='8.0'
            ),
            model.InstallationStructure(
                master_node_uid='x2',
                structure={'clusters': [{}]},
                is_filtered=True,
                release='8.0'
            )
        ]
        for structure in structures:
            db.session.add(structure)
        db.session.flush()

        with app.test_request_context():
            url = '/api/v1/json/report/installations?refresh=1'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            self.assertEqual(0, cached_mc_get.call_count)
            self.assertEqual(3, cached_mc_set.call_count)

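The `expected_cache_keys` assertions above imply a simple per-release key scheme: a fixed prefix plus the stringified release filter, with `None` producing the all-releases summary key. A minimal sketch, using a hypothetical helper name (`installations_info_cache_key`) that is not taken from the source:

```python
def installations_info_cache_key(release=None):
    # Key is the fixed prefix plus the stringified release filter;
    # release=None yields the all-releases summary key.
    return 'fuel-stats-installations-info{0}'.format(release)

# One key per release present in the data, plus one summary key
keys = [installations_info_cache_key(r) for r in ('8.0', '9.0', None)]
```

This matches the three keys the test expects `memcache.Client.set` to be called with on a cold cache.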
    @mock.patch.object(memcache.Client, 'get', return_value=None)
    def test_get_nodes_num(self, _):
        structures = [
            model.InstallationStructure(
                master_node_uid='x0',
                structure={
                    'clusters': [
                        {'status': 'operational', 'nodes_num': 3},
                        {'status': 'new', 'nodes_num': 2},
                        {'status': 'error', 'nodes_num': 1},
                    ]
                },
                is_filtered=False,
                release='9.0'
            ),
            model.InstallationStructure(
                master_node_uid='x1',
                structure={
                    'clusters': [
                        {'status': 'operational', 'nodes_num': 3}
                    ],
                },
                is_filtered=False,
                release='8.0'
            ),
            model.InstallationStructure(
                master_node_uid='x2',
                structure={
                    'clusters': [
                        {'status': 'operational', 'nodes_num': 5},
                        {'status': 'new', 'nodes_num': 6},
                        {'status': 'error', 'nodes_num': 7},
                    ]
                },
                is_filtered=True,
                release='8.0'
            ),
        ]
        for structure in structures:
            db.session.add(structure)
        db.session.flush()

        with app.test_request_context():
            url = '/api/v1/json/report/installations'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual(3, resp['environments']['operable_envs_count'])
            self.assertEqual(
                {'3': 2, '1': 1},
                resp['environments']['nodes_num']
            )

        with app.test_request_context():
            url = '/api/v1/json/report/installations?release=9.0'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual(2, resp['environments']['operable_envs_count'])
            self.assertEqual(
                {'3': 1, '1': 1},
                resp['environments']['nodes_num']
            )

        with app.test_request_context():
            url = '/api/v1/json/report/installations?release=xxx'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual(0, resp['environments']['operable_envs_count'])
            self.assertEqual({}, resp['environments']['nodes_num'])

    @mock.patch.object(memcache.Client, 'get', return_value=None)
    def test_get_hypervisors_num(self, _):
        structures = [
            model.InstallationStructure(
                master_node_uid='x0',
                structure={
                    'clusters': [
                        {'status': 'operational', 'attributes':
                            {'libvirt_type': 'kvm'}},
                        {'status': 'operational', 'attributes':
                            {'libvirt_type': 'Qemu'}},
                        {'status': 'operational', 'attributes':
                            {'libvirt_type': 'Kvm'}},
                        {'status': 'new', 'attributes':
                            {'libvirt_type': 'kvm'}},
                        {'status': 'error', 'attributes':
                            {'libvirt_type': 'qemu'}},
                    ]
                },
                is_filtered=False,
                release='9.0'
            ),
            model.InstallationStructure(
                master_node_uid='x1',
                structure={
                    'clusters': [
                        {'status': 'new', 'attributes':
                            {'libvirt_type': 'qemu'}},
                        {'status': 'error', 'attributes':
                            {'libvirt_type': 'Kvm'}},
                        {'status': 'error', 'attributes':
                            {'libvirt_type': 'vcenter'}},
                    ],
                },
                is_filtered=False,
                release='8.0'
            ),
            model.InstallationStructure(
                master_node_uid='x2',
                structure={
                    'clusters': [
                        {'status': 'operational', 'attributes':
                            {'libvirt_type': 'kvm'}},
                        {'status': 'new', 'attributes':
                            {'libvirt_type': 'kvm'}},
                        {'status': 'error', 'attributes':
                            {'libvirt_type': 'qemu'}},
                    ]
                },
                is_filtered=True,
                release='8.0'
            ),
        ]
        for structure in structures:
            db.session.add(structure)
        db.session.flush()

        with app.test_request_context():
            url = '/api/v1/json/report/installations'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual(
                {'kvm': 3, 'vcenter': 1, 'qemu': 2},
                resp['environments']['hypervisors_num']
            )

        with app.test_request_context():
            url = '/api/v1/json/report/installations?release=8.0'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual(
                {'kvm': 1, 'vcenter': 1},
                resp['environments']['hypervisors_num']
            )

        with app.test_request_context():
            url = '/api/v1/json/report/installations?release=xxx'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual({}, resp['environments']['hypervisors_num'])

    @mock.patch.object(memcache.Client, 'get', return_value=None)
    def test_get_oses_num(self, _):
        structures = [
            model.InstallationStructure(
                master_node_uid='x0',
                structure={
                    'clusters': [
                        {'status': 'operational', 'release': {'os': 'Ubuntu'}},
                        {'status': 'error', 'release': {'os': 'ubuntu'}},
                        {'status': 'error', 'release': {'os': 'Centos'}}
                    ]
                },
                is_filtered=False,
                release='9.0'
            ),
            model.InstallationStructure(
                master_node_uid='x1',
                structure={
                    'clusters': [
                        {'status': 'new', 'release': {'os': 'Ubuntu'}},
                        {'status': 'operational', 'release': {'os': 'ubuntu'}}
                    ],
                },
                is_filtered=False,
                release='8.0'
            ),
            model.InstallationStructure(
                master_node_uid='x2',
                structure={
                    'clusters': [
                        {'status': 'new', 'release': {'os': 'centos'}},
                        {'status': 'operational', 'release': {'os': 'centos'}},
                        {'status': 'operational', 'release': {'os': 'centos'}}
                    ]
                },
                is_filtered=True,
                release='8.0'
            ),
        ]
        for structure in structures:
            db.session.add(structure)
        db.session.flush()

        with app.test_request_context():
            url = '/api/v1/json/report/installations'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual(
                {'ubuntu': 3, 'centos': 1},
                resp['environments']['oses_num']
            )

        with app.test_request_context():
            url = '/api/v1/json/report/installations?release=8.0'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual({'ubuntu': 1}, resp['environments']['oses_num'])

        with app.test_request_context():
            url = '/api/v1/json/report/installations?release=xxx'
            resp = self.client.get(url)
            self.check_response_ok(resp)
            resp = json.loads(resp.data)
            self.assertEqual({}, resp['environments']['oses_num'])

@@ -1,188 +0,0 @@
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import csv
import datetime
import six
import types

from fuel_analytics.test.api.resources.utils.inst_structure_test import \
    InstStructureTest
from fuel_analytics.test.base import DbTest

from fuel_analytics.api.app import app
from fuel_analytics.api.app import db
from fuel_analytics.api.db import model
from fuel_analytics.api.resources.utils.stats_to_csv import StatsToCsv

class NodesToCsvExportTest(InstStructureTest, DbTest):

    def test_get_node_keys_paths(self):
        exporter = StatsToCsv()
        _, _, _, csv_keys_paths = exporter.get_node_keys_paths()
        self.assertNotIn(['manufacturer'], csv_keys_paths)
        self.assertNotIn(['platform_name'], csv_keys_paths)

        self.assertIn(['id'], csv_keys_paths)
        self.assertIn(['group_id'], csv_keys_paths)
        self.assertIn(['cluster_fuel_version'], csv_keys_paths)
        self.assertIn(['master_node_uid'], csv_keys_paths)
        self.assertIn(['os'], csv_keys_paths)
        self.assertIn(['roles', 0], csv_keys_paths)
        self.assertIn(['pending_addition'], csv_keys_paths)
        self.assertIn(['pending_deletion'], csv_keys_paths)
        self.assertIn(['pending_roles', 0], csv_keys_paths)
        self.assertIn(['status'], csv_keys_paths)
        self.assertIn(['online'], csv_keys_paths)

        self.assertIn(['meta', 'cpu', 'real'], csv_keys_paths)
        self.assertIn(['meta', 'cpu', 'total'], csv_keys_paths)
        self.assertIn(['meta', 'cpu', 'spec', 0, 'frequency'], csv_keys_paths)
        self.assertIn(['meta', 'cpu', 'spec', 0, 'model'], csv_keys_paths)

        self.assertIn(['meta', 'memory', 'slots'], csv_keys_paths)
        self.assertIn(['meta', 'memory', 'total'], csv_keys_paths)
        self.assertIn(['meta', 'memory', 'maximum_capacity'], csv_keys_paths)
        self.assertIn(['meta', 'memory', 'devices', 0, 'frequency'],
                      csv_keys_paths)
        self.assertIn(['meta', 'memory', 'devices', 0, 'type'], csv_keys_paths)
        self.assertIn(['meta', 'memory', 'devices', 0, 'size'], csv_keys_paths)

        self.assertIn(['meta', 'disks', 0, 'name'], csv_keys_paths)
        self.assertIn(['meta', 'disks', 0, 'removable'], csv_keys_paths)
        self.assertIn(['meta', 'disks', 0, 'model'], csv_keys_paths)
        self.assertIn(['meta', 'disks', 0, 'size'], csv_keys_paths)

        self.assertIn(['meta', 'system', 'product'], csv_keys_paths)
        self.assertIn(['meta', 'system', 'family'], csv_keys_paths)
        self.assertIn(['meta', 'system', 'version'], csv_keys_paths)
        self.assertIn(['meta', 'system', 'manufacturer'], csv_keys_paths)

        self.assertIn(['meta', 'numa_topology', 'numa_nodes', 0, 'memory'],
                      csv_keys_paths)
        self.assertIn(['meta', 'numa_topology', 'numa_nodes', 0, 'id'],
                      csv_keys_paths)
        self.assertIn(['meta', 'numa_topology', 'supported_hugepages', 0],
                      csv_keys_paths)
        self.assertIn(['meta', 'numa_topology', 'distances', 0],
                      csv_keys_paths)

        self.assertIn(['meta', 'interfaces', 0, 'name'], csv_keys_paths)
        self.assertIn(['meta', 'interfaces', 0, 'pxe'], csv_keys_paths)
        self.assertIn(['meta', 'interfaces', 0, 'driver'], csv_keys_paths)
        self.assertIn(['meta', 'interfaces', 0, 'max_speed'], csv_keys_paths)
        self.assertIn(['meta', 'interfaces', 0, 'offloading_modes',
                       0, 'state'], csv_keys_paths)
        self.assertIn(['meta', 'interfaces', 0, 'offloading_modes',
                       0, 'name'], csv_keys_paths)
        self.assertIn(['meta', 'interfaces', 0, 'offloading_modes', 0, 'sub',
                       0, 'name'], csv_keys_paths)

        self.assertIn(['meta', 'interfaces', 0, 'interface_properties',
                       'sriov', 'available'], csv_keys_paths)
        self.assertIn(['meta', 'interfaces', 0, 'interface_properties',
                       'sriov', 'enabled'], csv_keys_paths)
        self.assertIn(['meta', 'interfaces', 0, 'interface_properties',
                       'sriov', 'physnet'], csv_keys_paths)
        self.assertIn(['meta', 'interfaces', 0, 'interface_properties',
                       'sriov', 'sriov_numvfs'], csv_keys_paths)
        self.assertIn(['meta', 'interfaces', 0, 'interface_properties',
                       'sriov', 'sriov_totalvfs'], csv_keys_paths)
        self.assertIn(['meta', 'interfaces', 0, 'interface_properties',
                       'dpdk', 'enabled'], csv_keys_paths)
        self.assertIn(['meta', 'interfaces', 0, 'interface_properties',
                       'mtu'], csv_keys_paths)
        self.assertIn(['meta', 'interfaces', 0, 'interface_properties',
                       'disable_offloading'], csv_keys_paths)
        self.assertIn(['meta', 'interfaces', 0, 'interface_properties',
                       'numa_node'], csv_keys_paths)

    def test_get_flatten_nodes(self):
        installations_num = 10
        inst_structures = self.get_saved_inst_structures(
            installations_num=installations_num)
        exporter = StatsToCsv()
        structure_paths, cluster_paths, node_paths, csv_paths = \
            exporter.get_node_keys_paths()
        flatten_nodes = exporter.get_flatten_nodes(
            structure_paths, cluster_paths, node_paths, inst_structures)
        self.assertIsInstance(flatten_nodes, types.GeneratorType)
        pos_mn_uid = csv_paths.index(['master_node_uid'])
        pos_cluster_id = csv_paths.index(['cluster_id'])
        pos_status = csv_paths.index(['status'])
        for flatten_node in flatten_nodes:
            self.assertIsNotNone(flatten_node[pos_mn_uid])
            self.assertIsNotNone(flatten_node[pos_cluster_id])
            self.assertIsNotNone(flatten_node[pos_status])
            self.assertEqual(len(csv_paths), len(flatten_node))

    def test_export_nodes(self):
        installations_num = 100
        exporter = StatsToCsv()
        with app.test_request_context('/?from_date=2015-02-01'):
            # Creating installation structures
            inst_structures = self.get_saved_inst_structures(
                installations_num=installations_num)
            # Filtering installation structures
            result = exporter.export_nodes(inst_structures)
            self.assertIsInstance(result, types.GeneratorType)
            output = six.StringIO(list(result))
            reader = csv.reader(output)
            for _ in reader:
                pass

    def test_fuel_release_info_in_flatten_nodes(self):
        inst_fuel_version = '8.0'
        cluster_fuel_version = '7.0'
        packages = ['z', 'a', 'c']
        inst_structures = [
            model.InstallationStructure(
                master_node_uid='one',
                creation_date=datetime.datetime.utcnow(),
                is_filtered=False,
                structure={
                    'fuel_release': {'release': inst_fuel_version},
                    'fuel_packages': packages,
                    'clusters': [{
                        'id': 1, 'nodes': [],
                        'fuel_version': cluster_fuel_version,
                        'installed_plugins': [{
                            'name': 'plugin_a',
                            'version': 'plugin_version_0',
                            'releases': [],
                            'fuel_version': ['8.0', '7.0'],
                            'package_version': 'package_version_0'
                        }],
                    }]
                }
            )
        ]
        for structure in inst_structures:
            db.session.add(structure)
        db.session.flush()

        exporter = StatsToCsv()
        structure_paths, cluster_paths, node_paths, csv_paths = \
            exporter.get_node_keys_paths()
        flatten_nodes = exporter.get_flatten_nodes(
            structure_paths, cluster_paths, node_paths, inst_structures)

        pos_fuel_version = csv_paths.index(['cluster_fuel_version'])
        pos_packages = csv_paths.index(['structure', 'fuel_packages'])
        for flatten_node in flatten_nodes:
            self.assertEqual(cluster_fuel_version,
                             flatten_node[pos_fuel_version])
            self.assertEqual(' '.join(packages),
                             flatten_node[pos_packages])

@@ -1,994 +0,0 @@
# -*- coding: utf-8 -*-
#
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import csv
from datetime import datetime
from datetime import timedelta
import mock
import six
import types
import uuid

from fuel_analytics.test.api.resources.utils.oswl_test import OswlTest
from fuel_analytics.test.base import DbTest

from fuel_analytics.api.app import app
from fuel_analytics.api.app import db
from fuel_analytics.api.common import consts
from fuel_analytics.api.db.model import InstallationStructure
from fuel_analytics.api.db.model import OpenStackWorkloadStats
from fuel_analytics.api.resources.csv_exporter import get_clusters_version_info
from fuel_analytics.api.resources.csv_exporter import get_oswls
from fuel_analytics.api.resources.csv_exporter import get_oswls_query
from fuel_analytics.api.resources.utils import export_utils
from fuel_analytics.api.resources.utils.oswl_stats_to_csv import OswlStatsToCsv
from fuel_analytics.api.resources.utils.skeleton import OSWL_SKELETONS


class OswlStatsToCsvTest(OswlTest, DbTest):

    def test_get_keys_paths(self):
        for resource_type in self.RESOURCE_TYPES:
            exporter = OswlStatsToCsv()
            oswl_keys_paths, resource_keys_paths, csv_keys_paths = \
                exporter.get_resource_keys_paths(resource_type)
            self.assertNotIn(['external_id'], oswl_keys_paths)
            self.assertNotIn(['updated_time'], oswl_keys_paths)
            self.assertNotIn(['release'], oswl_keys_paths)
            self.assertIn(['version_info', 'fuel_version'], oswl_keys_paths)
            self.assertIn(['version_info', 'release_version'],
                          oswl_keys_paths)
            self.assertIn(['version_info', 'release_name'], oswl_keys_paths)
            self.assertIn(['version_info', 'release_os'], oswl_keys_paths)
            self.assertIn(['version_info', 'environment_version'],
                          oswl_keys_paths)
            self.assertIn([resource_type, 'id'], resource_keys_paths)
            self.assertIn([resource_type, 'is_added'], csv_keys_paths)
            self.assertIn([resource_type, 'is_modified'], csv_keys_paths)
            self.assertIn([resource_type, 'is_removed'], csv_keys_paths)

    def test_get_flatten_resources(self):
        for resource_type in self.RESOURCE_TYPES:
            exporter = OswlStatsToCsv()
            oswl_keys_paths, resource_keys_paths, csv_keys_paths = \
                exporter.get_resource_keys_paths(resource_type)
            oswls = self.generate_oswls(2, resource_type)
            flatten_resources = exporter.get_flatten_resources(
                resource_type, oswl_keys_paths, resource_keys_paths, oswls, {})
            self.assertIsInstance(flatten_resources, types.GeneratorType)
            for _ in flatten_resources:
                pass

    def test_flavor_ephemeral_in_flatten(self):
        exporter = OswlStatsToCsv()
        resource_type = consts.OSWL_RESOURCE_TYPES.flavor
        oswl_keys_paths, resource_keys_paths, csv_keys_paths = \
            exporter.get_resource_keys_paths(resource_type)
        oswls = self.generate_oswls(1, resource_type)
        flatten_resources = exporter.get_flatten_resources(
            resource_type, oswl_keys_paths, resource_keys_paths, oswls, {})

        ephemeral_idx = csv_keys_paths.index(['flavor', 'ephemeral'])
        for fr in flatten_resources:
            self.assertIsNotNone(fr[ephemeral_idx])

    def test_get_additional_info(self):
        exporter = OswlStatsToCsv()
        added_num = 0
        modified_num = 3
        removed_num = 5
        num = 1
        for resource_type in self.RESOURCE_TYPES:
            oswls = self.generate_oswls(
                num,
                resource_type,
                added_num_range=(added_num, added_num),
                modified_num_range=(modified_num, modified_num),
                removed_num_range=(removed_num, removed_num)
            )
            oswl = oswls.next()

            # Saving data for true JSON loading from DB object
            db.session.add(oswl)
            resource_data = oswl.resource_data
            added_ids = set(d['id'] for d in resource_data['added'])
            modified_ids = set(d['id'] for d in resource_data['modified'])
            removed_ids = set(d['id'] for d in resource_data['removed'])
            for resource in resource_data['current']:
                resource_id = resource['id']
                expected = [
                    resource_id in added_ids,
                    resource_id in modified_ids,
                    resource_id in removed_ids
                ]
                actual = exporter.get_additional_resource_info(
                    resource, oswl.resource_type,
                    added_ids, modified_ids, removed_ids)
                self.assertListEqual(expected, actual)

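The `expected` list in the test above reduces to plain membership checks of the resource id against the added/modified/removed id sets. A simplified standalone sketch of that logic (the real `get_additional_resource_info` also receives the resource type; the signature here is an assumption for illustration):

```python
def get_additional_resource_info(resource, added_ids, modified_ids,
                                 removed_ids):
    # Each flag is a membership test of the resource id in the
    # corresponding change set collected from resource_data.
    resource_id = resource['id']
    return [
        resource_id in added_ids,
        resource_id in modified_ids,
        resource_id in removed_ids,
    ]
```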
    def test_export(self):
        exporter = OswlStatsToCsv()
        num = 200
        with app.test_request_context():
            for resource_type in self.RESOURCE_TYPES:
                # Saving data for true JSON loading from DB object
                oswls_saved = self.get_saved_oswls(num, resource_type)
                # Saving installation structures for proper oswls filtering
                self.get_saved_inst_structs(oswls_saved)
                # Checking oswls filtered properly
                oswls = list(get_oswls(resource_type))
                self.assertEqual(num, len(oswls))
                # Checking export
                result = exporter.export(resource_type, oswls,
                                         datetime.utcnow().date(), {})
                self.assertIsInstance(result, types.GeneratorType)
                output = six.StringIO(list(result))
                reader = csv.reader(output)
                for _ in reader:
                    pass

    def test_export_on_empty_data(self):
        exporter = OswlStatsToCsv()
        for resource_type in self.RESOURCE_TYPES:
            result = exporter.export(resource_type, [], {}, None)
            self.assertIsInstance(result, types.GeneratorType)
            output = six.StringIO(list(result))
            reader = csv.reader(output)
            for _ in reader:
                pass

    def test_get_last_sync_datetime(self):
        exporter = OswlStatsToCsv()
        for resource_type in self.RESOURCE_TYPES:
            oswls_saved = self.get_saved_oswls(1, resource_type)
            inst_structs = self.get_saved_inst_structs(oswls_saved)
            inst_struct = inst_structs[0]
            inst_struct.modification_date = None

            oswls = get_oswls_query(resource_type).all()
            oswl = oswls[0]
            self.assertEqual(
                inst_struct.creation_date,
                exporter.get_last_sync_datetime(oswl)
            )

            inst_struct.modification_date = datetime.utcnow()
            oswls = get_oswls_query(resource_type).all()
            oswl = oswls[0]
            self.assertEqual(
                inst_struct.modification_date,
                exporter.get_last_sync_datetime(oswl)
            )

    def test_stream_horizon_content(self):
        exporter = OswlStatsToCsv()
        created_days = 2
        for resource_type in self.RESOURCE_TYPES:
            # Generating oswl on specified date
            oswls_saved = self.get_saved_oswls(
                1, resource_type, created_date_range=(created_days,
                                                      created_days))
            self.get_saved_inst_structs(
                oswls_saved, creation_date_range=(created_days, created_days))

            oswls = get_oswls_query(resource_type).all()
            oswl = oswls[0]
            oswl_idx = export_utils.get_index(
                oswl, *exporter.OSWL_INDEX_FIELDS)
            horizon = {
                oswl_idx: oswl
            }

            # Checking horizon size is changed on date greater than specified
            # date. Checks list format: [(on_date, horizon_size)]
            checks = [
                (datetime.utcnow().date() - timedelta(days=created_days + 1),
                 1),
                (datetime.utcnow().date() - timedelta(days=created_days),
                 1),
                (datetime.utcnow().date() - timedelta(days=created_days - 1),
                 0),
            ]
            for on_date, horizon_size in checks:
                for _ in exporter.stream_horizon_content(horizon, on_date):
                    pass
                self.assertEqual(horizon_size, len(horizon))

    def test_fill_date_gaps(self):
        exporter = OswlStatsToCsv()
        created_days = 5
        for resource_type in self.RESOURCE_TYPES:
            # Generating resource time series for one master node
            oswls_saved = self.get_saved_oswls(
                1, resource_type, created_date_range=(created_days,
                                                      created_days))
            inst_structs = self.get_saved_inst_structs(
                oswls_saved, creation_date_range=(created_days, created_days))
            inst_struct = inst_structs[0]

            # Checking only one record is present
            inst_struct.modification_date = None
            db.session.add(inst_struct)
            oswls = get_oswls_query(resource_type).all()
            oswl = oswls[0]
            self.assertIsNotNone(oswl.installation_created_date)
            self.assertIsNone(oswl.installation_updated_date)

            oswls_seamless = exporter.fill_date_gaps(
                oswls, datetime.utcnow().date())
            self.assertEqual(1, len(list(oswls_seamless)))

            # Checking record is duplicated
            inst_struct.modification_date = datetime.utcnow()
            db.session.add(inst_struct)

            oswls = get_oswls_query(resource_type).all()
            oswl = oswls[0]
            self.assertIsNotNone(oswl.installation_created_date)
            self.assertIsNotNone(oswl.installation_updated_date)

            on_date_days = 1
            on_date = (datetime.utcnow() - timedelta(days=on_date_days)).date()
            oswls_seamless = list(exporter.fill_date_gaps(oswls, on_date))
            self.assertEqual(created_days - on_date_days + 1,
                             len(oswls_seamless))

            # Checking dates are seamless and grow
            expected_date = oswls_seamless[0].stats_on_date
            for oswl in oswls_seamless:
                self.assertEqual(expected_date, oswl.stats_on_date)
                expected_date += timedelta(days=1)

    def test_fill_date_gaps_empty_data_is_not_failed(self):
        exporter = OswlStatsToCsv()
        oswls = exporter.fill_date_gaps([], datetime.utcnow().date())
        self.assertIsInstance(oswls, types.GeneratorType)

    def test_resource_data_on_oswl_duplication(self):
        exporter = OswlStatsToCsv()
        num = 20
        for resource_type in self.RESOURCE_TYPES:
            oswls_before = get_oswls_query(resource_type).count()
            oswls_saved = self.get_saved_oswls(
                num, resource_type,
                added_num_range=(1, 5), removed_num_range=(1, 3),
                modified_num_range=(1, 15)
            )
            self.get_saved_inst_structs(oswls_saved,
                                        creation_date_range=(0, 0))
            oswls = get_oswls_query(resource_type).all()
            self.assertEqual(oswls_before + num, len(list(oswls)))

            # Checking added, modified, removed not empty
            for oswl in oswls:
                resource_data = oswl.resource_data
                self.assertTrue(len(resource_data['added']) > 0)
                self.assertTrue(len(resource_data['modified']) > 0)
                self.assertTrue(len(resource_data['removed']) > 0)

            # Checking added, modified, removed empty on duplicated oswls
            oswls_seamless = exporter.fill_date_gaps(
                oswls, datetime.utcnow().date())
            for oswl in oswls_seamless:
                if oswl.created_date != oswl.stats_on_date:
                    resource_data = oswl.resource_data
                    self.assertEqual(0, len(resource_data['added']))
                    self.assertEqual(0, len(resource_data['modified']))
                    self.assertEqual(0, len(resource_data['removed']))

    def test_fill_date_gaps_for_set_of_clusters(self):
        exporter = OswlStatsToCsv()
        created_days = 3
        clusters_num = 2
        insts_num = 3
        for resource_type in self.RESOURCE_TYPES[:1]:
            # Generating oswls
            for _ in six.moves.range(insts_num):
                mn_uid = six.text_type(uuid.uuid4())
                oswls_saved = []
                for cluster_id in six.moves.range(clusters_num):
                    created_date = datetime.utcnow().date() - \
                        timedelta(days=created_days)
                    oswl = OpenStackWorkloadStats(
                        master_node_uid=mn_uid,
                        external_id=cluster_id,
                        cluster_id=cluster_id,
                        created_date=created_date,
                        updated_time=datetime.utcnow().time(),
                        resource_type=resource_type,
                        resource_checksum='',
                        resource_data={}
                    )
                    db.session.add(oswl)
                    oswls_saved.append(oswl)
                self.get_saved_inst_structs(oswls_saved,
                                            creation_date_range=(0, 0))
            # Checking all resources in seamless oswls
            oswls = get_oswls_query(resource_type).all()
            self.assertEqual(insts_num * clusters_num, len(oswls))
            oswls_seamless = list(exporter.fill_date_gaps(
                oswls, datetime.utcnow().date()))
            self.assertEqual(insts_num * clusters_num * (created_days + 1),
                             len(list(oswls_seamless)))

            # Checking dates do not decrease in seamless oswls
            dates = [oswl.stats_on_date for oswl in oswls_seamless]
            self.assertListEqual(sorted(dates), dates)

    def test_filter_by_date(self):
        exporter = OswlStatsToCsv()
        num = 10
        with app.test_request_context('/?from_date=2015-02-01'):
            for resource_type in self.RESOURCE_TYPES:
                # Creating oswls
                oswls_saved = self.get_saved_oswls(num, resource_type)
                self.get_saved_inst_structs(oswls_saved)
                # Filtering oswls
                oswls = get_oswls(resource_type)
                result = exporter.export(resource_type, oswls,
                                         datetime.utcnow().date(), {})
                self.assertIsInstance(result, types.GeneratorType)
                output = six.StringIO(list(result))
                reader = csv.reader(output)
                for _ in reader:
                    pass

    def test_seamless_dates(self):
        exporter = OswlStatsToCsv()
        # Creating oswls with not continuous created dates
        resource_type = consts.OSWL_RESOURCE_TYPES.vm
        old_days = 7
        new_days = 2
        oswls_saved = [
            OpenStackWorkloadStats(
                master_node_uid='x',
                external_id=1,
                cluster_id=1,
                created_date=(datetime.utcnow().date() -
                              timedelta(days=old_days)),
                updated_time=datetime.utcnow().time(),
                resource_type=resource_type,
                resource_checksum='',
                resource_data={}
            ),
            OpenStackWorkloadStats(
                master_node_uid='x',
                external_id=2,
                cluster_id=2,
                created_date=(datetime.utcnow().date() -
                              timedelta(days=new_days)),
                updated_time=datetime.utcnow().time(),
                resource_type=resource_type,
                resource_checksum='',
                resource_data={}
            ),
        ]
        for oswl in oswls_saved:
            db.session.add(oswl)
        self.get_saved_inst_structs(oswls_saved, creation_date_range=(0, 0))

        with app.test_request_context():
            oswls = get_oswls(resource_type)
            oswls_seamless = list(exporter.fill_date_gaps(
                oswls, datetime.utcnow().date()))

        # Checking size of seamless report
        single_record = old_days - new_days
        number_of_records = new_days + 1  # current date should be in report
        expected_num = single_record + number_of_records * len(oswls_saved)
        actual_num = len(oswls_seamless)
        self.assertEqual(expected_num, actual_num)

        # Checking no gaps in dates
        stats_on_date = oswls_seamless[0].stats_on_date
        for o in oswls_seamless:
            self.assertIn(
                o.stats_on_date - stats_on_date,
                (timedelta(days=0), timedelta(days=1))
            )
            stats_on_date = o.stats_on_date

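The `expected_num` arithmetic in `test_seamless_dates` can be checked standalone. The sketch below is not the project's `fill_date_gaps` implementation (which yields model objects interleaved by date); it only illustrates, under that simplifying assumption, why the count works out to `(old_days - new_days) + (new_days + 1) * len(oswls_saved)`:

```python
from datetime import date, timedelta

def fill_date_gaps(records, to_date):
    # records: (cluster_id, created_date) pairs. Yield one
    # (cluster_id, stats_on_date) row per cluster for every day
    # from its created_date up to and including to_date.
    for cluster_id, created in records:
        day = created
        while day <= to_date:
            yield (cluster_id, day)
            day += timedelta(days=1)

today = date(2015, 6, 10)
old_days, new_days = 7, 2
records = [(1, today - timedelta(days=old_days)),
           (2, today - timedelta(days=new_days))]
rows = list(fill_date_gaps(records, today))
# The older cluster alone contributes (old_days - new_days) rows;
# after that, both clusters contribute (new_days + 1) rows each.
assert len(rows) == (old_days - new_days) + (new_days + 1) * len(records)
```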
    def test_dates_filtering(self):
        updated_at = datetime.utcnow()
        base_date = updated_at.date() - timedelta(days=1)
        exporter = OswlStatsToCsv()
        resource_type = consts.OSWL_RESOURCE_TYPES.vm
        oswl = OpenStackWorkloadStats(
            master_node_uid='x',
            external_id=1,
            cluster_id=1,
            created_date=base_date,
            updated_time=updated_at.time(),
            resource_type=resource_type,
            resource_checksum='',
            resource_data={'current': [{'id': 444, 'status': 'ACTIVE'}]}
        )
        db.session.add(oswl)
        self.get_saved_inst_structs([oswl], creation_date_range=(0, 0))

        req_params = '/?from_date={0}&to_date={1}'.format(
            (base_date - timedelta(days=2)).isoformat(),
            (base_date - timedelta(days=1)).isoformat()
        )
        with app.test_request_context(req_params):
            oswls = list(get_oswls(resource_type))
            self.assertEqual(0, len(oswls))
            result = exporter.export(
                resource_type,
                oswls,
                (base_date - timedelta(days=1)),
                {}
            )
            # Only column names in result
            self.assertEqual(1, len(list(result)))

        req_params = '/?to_date={0}'.format(
            (base_date - timedelta(days=1)).isoformat()
        )
        with app.test_request_context(req_params):
            oswls = list(get_oswls(resource_type))
            self.assertEqual(0, len(oswls))
            result = exporter.export(
                resource_type,
                oswls,
                base_date - timedelta(days=1),
                {}
            )
            # Only column names in result
            self.assertEqual(1, len(list(result)))

        req_params = '/?from_date={0}'.format(
            base_date.isoformat()
        )
        with app.test_request_context(req_params):
            oswls = list(get_oswls(resource_type))
            self.assertEqual(1, len(oswls))
            result = exporter.export(
                resource_type,
                oswls,
                base_date + timedelta(days=1),
                {}
            )
            # Not only column names in result
            self.assertEqual(1 + 2, len(list(result)))

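The `from_date`/`to_date` request filtering exercised by `test_dates_filtering` can be sketched as a plain function. The name, signature, and inclusive-boundary semantics here are assumptions for illustration, not the actual `get_oswls` database query:

```python
from datetime import date, timedelta

def filter_by_dates(created_dates, from_date=None, to_date=None):
    # Keep only dates that fall inside the inclusive [from_date, to_date]
    # window; a missing bound leaves that side of the window open.
    result = []
    for created in created_dates:
        if from_date is not None and created < from_date:
            continue
        if to_date is not None and created > to_date:
            continue
        result.append(created)
    return result

base = date(2015, 2, 10)
saved = [base]
# Window entirely before base -> nothing matches:
assert filter_by_dates(saved, base - timedelta(days=2),
                       base - timedelta(days=1)) == []
# Open-ended window starting exactly at base -> the record matches:
assert filter_by_dates(saved, from_date=base) == [base]
```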
    def test_resource_data_structure(self):
        num = 20
        for resource_type in self.RESOURCE_TYPES:
            oswls = self.get_saved_oswls(num, resource_type)
            for oswl in oswls:
                for res_data in oswl.resource_data['current']:
                    # Checking all data required for the report is present
                    # in the resource data
                    for key in six.iterkeys(OSWL_SKELETONS[resource_type]):
                        self.assertIn(key, res_data)

    def test_oswl_invalid_data(self):
        exporter = OswlStatsToCsv()
        num = 10
        for resource_type in self.RESOURCE_TYPES:
            oswls_saved = self.get_saved_oswls(
                num, resource_type, current_num_range=(1, 1),
                removed_num_range=(0, 0), added_num_range=(0, 0),
                modified_num_range=(0, 0))
            # Saving installation structures for proper oswls filtering
            self.get_saved_inst_structs(oswls_saved)

            with app.test_request_context():
                oswls = get_oswls(resource_type)
                oswl_keys_paths, vm_keys_paths, csv_keys_paths = \
                    exporter.get_resource_keys_paths(resource_type)

                side_effect = [[]] * num
                side_effect[num // 2] = Exception
                with mock.patch.object(exporter,
                                       'get_additional_resource_info',
                                       side_effect=side_effect):
                    flatten_resources = exporter.get_flatten_resources(
                        resource_type, oswl_keys_paths, vm_keys_paths,
                        oswls, {})
                    # Checking only invalid data is not exported
                    self.assertEqual(num - 1, len(list(flatten_resources)))

    def test_volume_host_not_in_keys_paths(self):
        exporter = OswlStatsToCsv()
        resource_type = consts.OSWL_RESOURCE_TYPES.volume
        oswl_keys_paths, resource_keys_paths, csv_keys_paths = \
            exporter.get_resource_keys_paths(resource_type)
        self.assertNotIn(['volume', 'host'], csv_keys_paths)

    def test_is_filtered_oswls_export(self):
        for resource_type in self.RESOURCE_TYPES:
            # Creating filtered OSWLs
            filtered_num = 15
            filtered_oswls = self.get_saved_oswls(
                filtered_num,
                resource_type, current_num_range=(1, 1))
            self.get_saved_inst_structs(filtered_oswls,
                                        is_filtered_values=(True,))
            # Creating not filtered OSWLs
            not_filtered_num = 10
            not_filtered_oswls = self.get_saved_oswls(
                not_filtered_num,
                resource_type, current_num_range=(1, 1))
            self.get_saved_inst_structs(not_filtered_oswls)

            # Checking only not filtered resources fetched
            with app.test_request_context():
                oswls = get_oswls_query(resource_type).all()
                self.assertEqual(not_filtered_num, len(oswls))
                for oswl in oswls:
                    self.assertIn(oswl.is_filtered, (False, None))

    def test_release_info_in_oswl(self):
        exporter = OswlStatsToCsv()
        releases = ('6.0', '6.1', None)
        num = 30

        for resource_type in self.RESOURCE_TYPES:
            # Creating OSWLs
            oswls = self.get_saved_oswls(num, resource_type)
            self.get_saved_inst_structs(oswls, releases=releases)

            with app.test_request_context():
                oswls = get_oswls_query(resource_type).all()
                oswl_keys_paths, resource_keys_paths, csv_keys_paths = \
                    exporter.get_resource_keys_paths(resource_type)

                # Checking release value in flatten resources
                release_pos = csv_keys_paths.index(
                    ['version_info', 'fuel_version'])
                flatten_resources = exporter.get_flatten_resources(
                    resource_type, oswl_keys_paths, resource_keys_paths,
                    oswls, {})
                for flatten_resource in flatten_resources:
                    release = flatten_resource[release_pos]
                    self.assertIn(release, releases)

    def test_duplicated_oswls_skipped(self):
        exporter = OswlStatsToCsv()
        # Creating oswls duplicates
        resource_type = consts.OSWL_RESOURCE_TYPES.vm
        old_days = 7
        new_days = 2
        old_created_date = datetime.utcnow().date() - timedelta(days=old_days)
        oswls_saved = [
            OpenStackWorkloadStats(
                master_node_uid='x',
                external_id=1,
                cluster_id=1,
                created_date=old_created_date,
                updated_time=datetime.utcnow().time(),
                resource_type=resource_type,
                resource_checksum='checksum',
                resource_data={'current': [{'id': 1}], 'added': [{'id': 1}]}
            ),
            OpenStackWorkloadStats(
                master_node_uid='x',
                external_id=2,
                cluster_id=1,
                created_date=(datetime.utcnow().date() -
                              timedelta(days=new_days)),
                updated_time=datetime.utcnow().time(),
                resource_type=resource_type,
                resource_checksum='checksum',
                resource_data={'current': [{'id': 1}], 'added': [{'id': 1}]}
            ),
        ]
        for oswl in oswls_saved:
            db.session.add(oswl)
        self.get_saved_inst_structs(oswls_saved, creation_date_range=(0, 0))

        with app.test_request_context():
            oswls = get_oswls(resource_type)
            oswls_seamless = list(exporter.fill_date_gaps(
                oswls, datetime.utcnow().date()))

        # Checking size of seamless report
        expected_num = old_days + 1  # current date should be in report
        actual_num = len(oswls_seamless)
        self.assertEqual(expected_num, actual_num)

        # Checking only old oswl in seamless_oswls
        for o in oswls_seamless:
            self.assertEqual(old_created_date, o.created_date)

    def test_version_info_in_flatten_resource(self):
        exporter = OswlStatsToCsv()
        resource_type = consts.OSWL_RESOURCE_TYPES.vm
        oswls_saved = [
            OpenStackWorkloadStats(
                master_node_uid='x',
                external_id=1,
                cluster_id=1,
                created_date=datetime.utcnow().date(),
                updated_time=datetime.utcnow().time(),
                resource_type=resource_type,
                resource_checksum='no_version_info',
                resource_data={'current': [{'id': 1}]}
            ),
            OpenStackWorkloadStats(
                master_node_uid='y',
                external_id=2,
                cluster_id=2,
                created_date=datetime.utcnow().date(),
                updated_time=datetime.utcnow().time(),
                resource_type=resource_type,
                resource_checksum='empty_version_info',
                resource_data={'current': [{'id': 1}]},
                version_info={}
            ),
            OpenStackWorkloadStats(
                master_node_uid='z',
                external_id=3,
                cluster_id=3,
                created_date=datetime.utcnow().date(),
                updated_time=datetime.utcnow().time(),
                resource_type=resource_type,
                resource_checksum='with_version_info',
                resource_data={'current': [{'id': 1}]},
                version_info={
                    'release_version': 'liberty-9.0',
                    'release_os': 'Ubuntu',
                    'release_name': 'Liberty on Ubuntu 14.04',
                    'fuel_version': '9.0',
                    'environment_version': '9.0'
                }
            ),
        ]
        for oswl in oswls_saved:
            db.session.add(oswl)
        self.get_saved_inst_structs(oswls_saved, creation_date_range=(0, 0))

        with app.test_request_context():
            oswls = list(get_oswls(resource_type))

        oswl_keys_paths, resource_keys_paths, csv_keys_paths = \
            exporter.get_resource_keys_paths(resource_type)
        fuel_release_pos = csv_keys_paths.index(
            ['version_info', 'fuel_version'])
        flatten_resources = list(exporter.get_flatten_resources(
            resource_type, oswl_keys_paths, resource_keys_paths, oswls, {}))

        # Checking all oswls are in flatten resources
        external_uid_pos = csv_keys_paths.index(['master_node_uid'])
        expected_uids = set([oswl.master_node_uid for oswl in oswls])
        actual_uids = set([d[external_uid_pos] for d in flatten_resources])
        self.assertEqual(expected_uids, actual_uids)

        # Checking every flatten resource contains fuel release info
        self.assertTrue(all(d[fuel_release_pos] for d in flatten_resources))

    def test_all_resource_statuses_are_shown(self):
        exporter = OswlStatsToCsv()
        resource_type = consts.OSWL_RESOURCE_TYPES.vm
        updated_time_str = datetime.utcnow().time().isoformat()
        oswls_saved = [
            OpenStackWorkloadStats(
                master_node_uid='x',
                external_id=1,
                cluster_id=1,
                created_date=(datetime.utcnow().date() -
                              timedelta(days=8)),
                updated_time=datetime.utcnow().time(),
                resource_type=resource_type,
                resource_checksum='checksum',
                resource_data={'current': [{'id': 1, 'status': 'enabled',
                                            'tenant_id': 'first'}],
                               'added': [{'id': 1, 'time': updated_time_str}],
                               'modified': [], 'removed': []}
            ),
            # Removing and adding back the same resource.
            OpenStackWorkloadStats(
                master_node_uid='x',
                external_id=2,
                cluster_id=1,
                created_date=(datetime.utcnow().date() -
                              timedelta(days=6)),
                updated_time=datetime.utcnow().time(),
                resource_type=resource_type,
                resource_checksum='checksum',
                resource_data={
                    'current': [{'id': 1, 'status': 'enabled',
                                 'tenant_id': 'second'}],
                    'added': [{'id': 1, 'time': updated_time_str}],
                    'modified': [],
                    'removed': [{'id': 1, 'status': 'enabled',
                                 'time': updated_time_str,
                                 'tenant_id': 'second'}]}
            ),
            # Changing and restoring back the resource
            OpenStackWorkloadStats(
                master_node_uid='x',
                external_id=3,
                cluster_id=1,
                created_date=(datetime.utcnow().date() -
                              timedelta(days=4)),
                updated_time=datetime.utcnow().time(),
                resource_type=resource_type,
                resource_checksum='checksum',
                resource_data={
                    'current': [{'id': 1, 'enabled': True,
                                 'tenant_id': 'third'}],
                    'added': [],
                    'modified': [
                        {'id': 1, 'enabled': False, 'time': updated_time_str},
                        {'id': 1, 'enabled': True, 'time': updated_time_str},
                    ],
                    'removed': []
                }
            ),
            # Resource modified and finally deleted
            OpenStackWorkloadStats(
                master_node_uid='x',
                external_id=4,
                cluster_id=1,
                created_date=(datetime.utcnow().date() -
                              timedelta(days=2)),
                updated_time=datetime.utcnow().time(),
                resource_type=resource_type,
                resource_checksum='another_checksum',
                resource_data={
                    'current': [],
                    'added': [],
                    'modified': [
                        {'id': 1, 'enabled': False, 'time': updated_time_str},
                        {'id': 1, 'enabled': True, 'time': updated_time_str},
                    ],
                    'removed': [{'id': 1, 'enabled': True,
                                 'tenant_id': 'fourth'}]
                }
            ),
        ]
        for oswl in oswls_saved:
            db.session.add(oswl)
        self.get_saved_inst_structs(oswls_saved, creation_date_range=(0, 0))

        with app.test_request_context():
            oswls = get_oswls(resource_type)

            oswls_seamless = list(exporter.fill_date_gaps(
                oswls, datetime.utcnow().date()))

        oswl_keys_paths, resource_keys_paths, csv_keys_paths = \
            exporter.get_resource_keys_paths(resource_type)

        flatten_resources = list(exporter.get_flatten_resources(
            resource_type, oswl_keys_paths, resource_keys_paths,
            oswls_seamless, {}))

        # Expected oswls num: 2 for 'first', 2 for 'second', 2 for 'third'
        # and only one for the finally removed 'fourth'
        expected_oswls_num = 7
        self.assertEqual(expected_oswls_num, len(flatten_resources))

        is_added_pos = csv_keys_paths.index([resource_type, 'is_added'])
        is_modified_pos = csv_keys_paths.index([resource_type, 'is_modified'])
        is_removed_pos = csv_keys_paths.index([resource_type, 'is_removed'])
        tenant_id_pos = csv_keys_paths.index([resource_type, 'tenant_id'])

        def check_resource_state(resource, tenant_id, is_added,
                                 is_modified, is_removed):
            self.assertEqual(is_added, resource[is_added_pos])
            self.assertEqual(is_modified, resource[is_modified_pos])
            self.assertEqual(is_removed, resource[is_removed_pos])
            self.assertEqual(tenant_id, resource[tenant_id_pos])

        # The first oswl status is True only in is_added
        check_resource_state(flatten_resources[0], 'first',
                             True, False, False)

        # The first oswl status on the next day is False for
        # is_added, is_modified, is_removed
        check_resource_state(flatten_resources[1], 'first',
                             False, False, False)

        # The second oswl status is True in is_added, is_removed
        check_resource_state(flatten_resources[2], 'second',
                             True, False, True)

        # The second oswl status on the next day is False for
        # is_added, is_modified, is_removed
        check_resource_state(flatten_resources[3], 'second',
                             False, False, False)

        # The third oswl status is True only in is_modified
        check_resource_state(flatten_resources[4], 'third',
                             False, True, False)

        # The third oswl status on the next day is False for
        # is_added, is_modified, is_removed
        check_resource_state(flatten_resources[5], 'third',
                             False, False, False)

        # The fourth oswl status is True in is_modified, is_removed
        check_resource_state(flatten_resources[6], 'fourth',
                             False, True, True)

    def test_fuel_version_from_clusters_data_is_used(self):
        master_node_uid = 'x'
        exporter = OswlStatsToCsv()
        resource_type = consts.OSWL_RESOURCE_TYPES.vm
        version_from_cluster = '7.0'
        release_version_from_cluster = 'from_cluster_7.0'
        version_from_version_info = '9.0'
        release_version_from_version_info = 'from_version_info_9.0'

        version_from_installation_info = '8.0'
        release_version_from_inst_info = 'from_inst_info_8.0'
        installation_date = datetime.utcnow().date() - timedelta(days=3)

        # Upgraded Fuel and not upgraded cluster
        structure = InstallationStructure(
            master_node_uid=master_node_uid,
            structure={
                'fuel_release': {
                    'release': version_from_installation_info,
                    'openstack_version': release_version_from_inst_info
                },
                'clusters_num': 2,
                'clusters': [
                    {'id': 1, 'fuel_version': version_from_cluster,
                     'release': {'version': release_version_from_cluster}},
                    {'id': 2}
                ],
                'unallocated_nodes_num_range': 0,
                'allocated_nodes_num': 0
            },
            creation_date=installation_date,
            is_filtered=False
        )
        db.session.add(structure)

        oswls = [
            OpenStackWorkloadStats(
                master_node_uid=master_node_uid,
                external_id=1,
                cluster_id=1,
                created_date=installation_date,
                updated_time=datetime.utcnow().time(),
                resource_type=resource_type,
                resource_checksum='info_from_cluster',
                resource_data={'current': [{'id': 1, 'status': 'enabled'}],
                               'added': [], 'modified': [], 'removed': []},
                version_info=None
            ),
            OpenStackWorkloadStats(
                master_node_uid=master_node_uid,
                external_id=3,
                cluster_id=1,
                created_date=installation_date + timedelta(days=1),
                updated_time=datetime.utcnow().time(),
                resource_type=resource_type,
                resource_checksum='info_from_version_info',
                resource_data={'current': [{'id': 1}],
                               'added': [], 'modified': [], 'removed': []},
                version_info={
                    'fuel_version': version_from_version_info,
                    'release_version': release_version_from_version_info
                }
            ),
            OpenStackWorkloadStats(
                master_node_uid=master_node_uid,
                external_id=2,
                cluster_id=2,
                created_date=installation_date + timedelta(days=2),
                updated_time=datetime.utcnow().time(),
                resource_type=resource_type,
                resource_checksum='info_from_installation_info',
                resource_data={'current': [{'id': 1}],
                               'added': [], 'modified': [], 'removed': []},
                version_info=None
            )
        ]
        for oswl in oswls:
            db.session.add(oswl)

        with app.test_request_context():
            oswls_data = list(get_oswls(resource_type))
            clusters_version_info = get_clusters_version_info()

        oswl_keys_paths, resource_keys_paths, csv_keys_paths = \
            exporter.get_resource_keys_paths(resource_type)
        fuel_release_pos = csv_keys_paths.index(
            ['version_info', 'fuel_version'])
        release_version_pos = csv_keys_paths.index(
            ['version_info', 'release_version'])
        flatten_resources = list(exporter.get_flatten_resources(
            resource_type, oswl_keys_paths,
            resource_keys_paths, oswls_data, clusters_version_info
        ))

        self.assertEqual(len(oswls), len(flatten_resources))

        # Checking version info fetched from cluster
        self.assertEqual(version_from_cluster,
                         flatten_resources[0][fuel_release_pos])
        self.assertEqual(release_version_from_cluster,
                         flatten_resources[0][release_version_pos])

        # Checking version info fetched from oswl.version_info
        self.assertEqual(version_from_version_info,
                         flatten_resources[1][fuel_release_pos])
        self.assertEqual(release_version_from_version_info,
                         flatten_resources[1][release_version_pos])

        # Checking version info fetched from installation info
        self.assertEqual(version_from_installation_info,
                         flatten_resources[2][fuel_release_pos])
        self.assertEqual(release_version_from_inst_info,
                         flatten_resources[2][release_version_pos])

    def test_get_clusters_version_info(self):
        mn_uid = 'x'
        cluster_id = 1
        empty_cluster_id = 2
        mn_uid_no_clusters = 'xx'
        release_name = 'release name'
        version_from_cluster = '7.0'
        release_version_from_cluster = 'from_cluster_7.0'
        installation_date = datetime.utcnow().date() - timedelta(days=3)

        expected_version_info = {
            'release_version': release_version_from_cluster,
            'release_os': None,
            'release_name': release_name,
            'fuel_version': version_from_cluster
        }

        structures = [
            InstallationStructure(
                master_node_uid=mn_uid,
                structure={
                    'clusters': [
                        {'id': cluster_id,
                         'fuel_version': version_from_cluster,
                         'release': {'version': release_version_from_cluster,
                                     'name': release_name}},
                        {'id': empty_cluster_id}
                    ]
                },
                creation_date=installation_date,
                is_filtered=False
            ),
            InstallationStructure(
                master_node_uid=mn_uid_no_clusters,
                structure={'clusters': []},
                creation_date=installation_date,
                is_filtered=False
            )
        ]
        for structure in structures:
            db.session.add(structure)

        with app.test_request_context():
            clusters_version_info = get_clusters_version_info()

        self.assertIn(mn_uid, clusters_version_info)
        self.assertIn(cluster_id, clusters_version_info[mn_uid])
        self.assertNotIn(empty_cluster_id, clusters_version_info[mn_uid])
        self.assertIn(mn_uid_no_clusters, clusters_version_info)

        actual_version_info = clusters_version_info[mn_uid][cluster_id]
        self.assertEqual(expected_version_info, actual_version_info)
        self.assertEqual({}, clusters_version_info[mn_uid_no_clusters])
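The mapping shape that `test_get_clusters_version_info` asserts on can be sketched standalone. The function below is a hypothetical re-creation inferred from the assertions (the real `get_clusters_version_info` reads installation structures from the database rather than taking them as an argument); it shows the assumed `{master_node_uid: {cluster_id: version_info}}` shape and why a cluster with no version data is skipped:

```python
def get_clusters_version_info(structures):
    # structures: (master_node_uid, clusters) pairs, where clusters is a
    # list of dicts as stored in InstallationStructure.structure['clusters'].
    # Build {master_node_uid: {cluster_id: version_info}}, skipping
    # clusters that carry no version data at all.
    result = {}
    for mn_uid, clusters in structures:
        infos = {}
        for cluster in clusters:
            release = cluster.get('release', {})
            version_info = {
                'fuel_version': cluster.get('fuel_version'),
                'release_version': release.get('version'),
                'release_os': release.get('os'),
                'release_name': release.get('name'),
            }
            if any(version_info.values()):
                infos[cluster['id']] = version_info
        result[mn_uid] = infos
    return result

structures = [
    ('x', [{'id': 1, 'fuel_version': '7.0',
            'release': {'version': 'from_cluster_7.0',
                        'name': 'release name'}},
           {'id': 2}]),   # no version data -> skipped
    ('xx', []),           # no clusters -> empty mapping
]
info = get_clusters_version_info(structures)
assert 1 in info['x'] and 2 not in info['x']
assert info['xx'] == {}
```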
@@ -1,141 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import csv
import datetime
import mock
import six
import types

from fuel_analytics.test.api.resources.utils.inst_structure_test import \
    InstStructureTest
from fuel_analytics.test.base import DbTest

from fuel_analytics.api.app import app
from fuel_analytics.api.app import db
from fuel_analytics.api.db import model
from fuel_analytics.api.resources.utils.stats_to_csv import StatsToCsv


class PluginsToCsvExportTest(InstStructureTest, DbTest):

    def test_get_plugin_keys_paths(self):
        exporter = StatsToCsv()
        _, _, _, csv_keys_paths = exporter.get_plugin_keys_paths()
        self.assertTrue(['cluster_id'] in csv_keys_paths)
        self.assertTrue(['cluster_fuel_version'] in csv_keys_paths)
        self.assertTrue(['master_node_uid'] in csv_keys_paths)
        self.assertTrue(['name'] in csv_keys_paths)
        self.assertTrue(['version'] in csv_keys_paths)
        self.assertTrue(['fuel_version'] in csv_keys_paths)
        self.assertTrue(['package_version'] in csv_keys_paths)
        self.assertTrue(['structure', 'fuel_packages'] in csv_keys_paths)

    def test_get_flatten_plugins(self):
        installations_num = 10
        inst_structures = self.get_saved_inst_structures(
            installations_num=installations_num)
        exporter = StatsToCsv()
        structure_paths, cluster_paths, plugins_paths, csv_paths = \
            exporter.get_plugin_keys_paths()
        flatten_plugins = exporter.get_flatten_plugins(
            structure_paths, cluster_paths, plugins_paths, inst_structures)
        self.assertIsInstance(flatten_plugins, types.GeneratorType)
        pos_mn_uid = csv_paths.index(['master_node_uid'])
        pos_cluster_id = csv_paths.index(['cluster_id'])
        for flatten_plugin in flatten_plugins:
            self.assertIsNotNone(flatten_plugin[pos_mn_uid])
            self.assertIsNotNone(flatten_plugin[pos_cluster_id])
            self.assertEqual(len(csv_paths), len(flatten_plugin))

    def test_export_plugins(self):
        installations_num = 100
        exporter = StatsToCsv()
        with app.test_request_context('/?from_date=2015-02-01'):
            # Creating installation structures
            inst_structures = self.get_saved_inst_structures(
                installations_num=installations_num)
            # Filtering installation structures
            result = exporter.export_plugins(inst_structures)
            self.assertIsInstance(result, types.GeneratorType)
            output = six.StringIO(list(result))
            reader = csv.reader(output)
            for _ in reader:
                pass

    def test_plugin_invalid_data(self):
        exporter = StatsToCsv()
        num = 10
        inst_structures = self.get_saved_inst_structures(
            installations_num=num, clusters_num_range=(1, 1),
            plugins_num_range=(1, 1))

        with app.test_request_context():
            # get_flatten_data is called 3 times inside get_flatten_plugins
            side_effect = [[]] * num * 3
            side_effect[num // 2] = Exception
            with mock.patch('fuel_analytics.api.resources.utils.'
                            'export_utils.get_flatten_data',
                            side_effect=side_effect):
                structure_paths, cluster_paths, plugins_paths, csv_paths = \
                    exporter.get_plugin_keys_paths()
                flatten_plugins = exporter.get_flatten_plugins(
                    structure_paths, cluster_paths,
                    plugins_paths, inst_structures)
                # Checking only invalid data is not exported
                self.assertEqual(num - 1, len(list(flatten_plugins)))

def test_fuel_release_info_in_flatten_plugins(self):
|
|
||||||
inst_fuel_version = '8.0'
|
|
||||||
cluster_fuel_version = '7.0'
|
|
||||||
packages = ['z', 'a', 'c']
|
|
||||||
inst_structures = [
|
|
||||||
model.InstallationStructure(
|
|
||||||
master_node_uid='one',
|
|
||||||
creation_date=datetime.datetime.utcnow(),
|
|
||||||
is_filtered=False,
|
|
||||||
structure={
|
|
||||||
'fuel_release': {'release': inst_fuel_version},
|
|
||||||
'fuel_packages': packages,
|
|
||||||
'clusters': [{
|
|
||||||
'id': 1, 'nodes': [],
|
|
||||||
'fuel_version': cluster_fuel_version,
|
|
||||||
'installed_plugins': [{
|
|
||||||
'name': 'plugin_a',
|
|
||||||
'version': 'plugin_version_0',
|
|
||||||
'releases': [],
|
|
||||||
'fuel_version': ['8.0', '7.0'],
|
|
||||||
'package_version': 'package_version_0'
|
|
||||||
}],
|
|
||||||
}]
|
|
||||||
}
|
|
||||||
)
|
|
||||||
]
|
|
||||||
for structure in inst_structures:
|
|
||||||
db.session.add(structure)
|
|
||||||
db.session.flush()
|
|
||||||
|
|
||||||
exporter = StatsToCsv()
|
|
||||||
structure_paths, cluster_paths, plugins_paths, csv_paths = \
|
|
||||||
exporter.get_plugin_keys_paths()
|
|
||||||
flatten_plugins = exporter.get_flatten_plugins(
|
|
||||||
structure_paths, cluster_paths, plugins_paths, inst_structures)
|
|
||||||
|
|
||||||
pos_fuel_version = csv_paths.index(['cluster_fuel_version'])
|
|
||||||
pos_packages = csv_paths.index(['structure', 'fuel_packages'])
|
|
||||||
for flatten_plugin in flatten_plugins:
|
|
||||||
self.assertEqual(cluster_fuel_version,
|
|
||||||
flatten_plugin[pos_fuel_version])
|
|
||||||
self.assertEqual(' '.join(packages),
|
|
||||||
flatten_plugin[pos_packages])
|
|
@ -1,233 +0,0 @@
# -*- coding: utf-8 -*-

# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import csv
from datetime import datetime
import mock
import six
import types

from fuel_analytics.test.api.resources.utils.inst_structure_test import \
    InstStructureTest
from fuel_analytics.test.base import DbTest

from fuel_analytics.api.app import app
from fuel_analytics.api.app import db
from fuel_analytics.api.db.model import ActionLog
from fuel_analytics.api.resources.csv_exporter import get_action_logs
from fuel_analytics.api.resources.csv_exporter import get_inst_structures
from fuel_analytics.api.resources.utils import export_utils
from fuel_analytics.api.resources.utils.stats_to_csv import StatsToCsv


class StatsToCsvExportTest(InstStructureTest, DbTest):

    def test_get_cluster_keys_paths(self):
        exporter = StatsToCsv()
        _, _, csv_keys_paths = exporter.get_cluster_keys_paths()
        self.assertIn(['attributes', 'heat'], csv_keys_paths)
        self.assertIn(['attributes', 'auto_assign_floating_ip'],
                      csv_keys_paths)
        self.assertIn(['attributes', 'corosync_verified'], csv_keys_paths)
        self.assertIn(['attributes', 'external_mongo_replset'], csv_keys_paths)
        self.assertIn(['attributes', 'external_ntp_list'], csv_keys_paths)
        self.assertIn(['attributes', 'mongo'], csv_keys_paths)
        self.assertIn(['attributes', 'nova_quota'], csv_keys_paths)
        self.assertIn(['attributes', 'repos'], csv_keys_paths)
        self.assertIn(['attributes', 'resume_guests_state_on_host_boot'],
                      csv_keys_paths)
        self.assertIn(['attributes', 'workloads_collector_enabled'],
                      csv_keys_paths)
        self.assertIn(['attributes', 'ironic'], csv_keys_paths)
        self.assertIn(['attributes', 'murano-cfapi'], csv_keys_paths)
        self.assertIn(['attributes', 'murano_glance_artifacts_plugin'],
                      csv_keys_paths)
        self.assertIn(['attributes', 'neutron_dvr'], csv_keys_paths)
        self.assertIn(['attributes', 'neutron_l2_pop'], csv_keys_paths)
        self.assertIn(['attributes', 'neutron_l3_ha'], csv_keys_paths)
        self.assertIn(['attributes', 'neutron_qos'], csv_keys_paths)
        self.assertIn(['attributes', 'public_ssl_cert_source'], csv_keys_paths)
        self.assertIn(['attributes', 'public_ssl_horizon'], csv_keys_paths)
        self.assertIn(['attributes', 'public_ssl_services'], csv_keys_paths)
        self.assertIn(['attributes', 'puppet_debug'], csv_keys_paths)
        self.assertIn(['attributes', 'task_deploy'], csv_keys_paths)

        self.assertIn(['vmware_attributes', 'vmware_az_cinder_enable'],
                      csv_keys_paths)
        self.assertIn(['vmware_attributes', 'vmware_az_nova_computes_num'],
                      csv_keys_paths)
        self.assertIn(['structure', 'fuel_packages', 0], csv_keys_paths)
        self.assertNotIn(['structure', 'clusters'], csv_keys_paths)
        self.assertNotIn(['installed_plugins'], csv_keys_paths)

    def test_get_flatten_clusters(self):
        installations_num = 200
        inst_structures = self.get_saved_inst_structures(
            installations_num=installations_num)
        exporter = StatsToCsv()
        structure_paths, cluster_paths, csv_paths = \
            exporter.get_cluster_keys_paths()
        flatten_clusters = exporter.get_flatten_clusters(
            structure_paths, cluster_paths, inst_structures, [])
        self.assertIsInstance(flatten_clusters, types.GeneratorType)
        for flatten_cluster in flatten_clusters:
            self.assertEqual(len(csv_paths), len(flatten_cluster))

    def test_flatten_data_as_csv(self):
        installations_num = 100
        inst_structures = self.get_saved_inst_structures(
            installations_num=installations_num)

        exporter = StatsToCsv()
        structure_paths, cluster_paths, csv_paths = \
            exporter.get_cluster_keys_paths()
        flatten_clusters = exporter.get_flatten_clusters(
            structure_paths, cluster_paths, inst_structures, [])
        self.assertIsInstance(flatten_clusters, types.GeneratorType)
        result = export_utils.flatten_data_as_csv(csv_paths, flatten_clusters)
        self.assertIsInstance(result, types.GeneratorType)
        output = six.StringIO(list(result))
        reader = csv.reader(output)
        # Pop columns names from reader
        _ = reader.next()

        # Checking reading result CSV
        for _ in reader:
            pass

    def test_unicode_as_csv(self):
        installations_num = 10
        inst_structures = self.get_saved_inst_structures(
            installations_num=installations_num)

        exporter = StatsToCsv()
        structure_paths, cluster_paths, csv_paths = \
            exporter.get_cluster_keys_paths()
        flatten_clusters = exporter.get_flatten_clusters(
            structure_paths, cluster_paths, inst_structures, [])
        flatten_clusters = list(flatten_clusters)
        flatten_clusters[1][0] = u'эюя'
        list(export_utils.flatten_data_as_csv(csv_paths, flatten_clusters))

    def test_export_clusters(self):
        installations_num = 100
        inst_structures = self.get_saved_inst_structures(
            installations_num=installations_num)
        exporter = StatsToCsv()
        result = exporter.export_clusters(inst_structures, [])
        self.assertIsInstance(result, types.GeneratorType)

    def test_filter_by_date(self):
        exporter = StatsToCsv()
        num = 10
        with app.test_request_context('/?from_date=2015-02-01'):

            # Creating installation structures
            inst_structures = self.get_saved_inst_structures(
                installations_num=num)
            # Filtering installation structures
            result = exporter.export_clusters(inst_structures, [])
            self.assertIsInstance(result, types.GeneratorType)
            output = six.StringIO(list(result))
            reader = csv.reader(output)
            for _ in reader:
                pass

    def test_network_verification_status(self):
        num = 2
        inst_structures = self.get_saved_inst_structures(
            installations_num=num, clusters_num_range=(2, 2))
        inst_structure = inst_structures[0]
        clusters = inst_structure.structure['clusters']
        exporter = StatsToCsv()
        expected_als = [
            ActionLog(
                master_node_uid=inst_structure.master_node_uid,
                external_id=1,
                action_type='nailgun_task',
                action_name=exporter.NETWORK_VERIFICATION_ACTION,
                body={'cluster_id': clusters[0]['id'],
                      'end_timestamp': datetime.utcnow().isoformat(),
                      'action_type': 'nailgun_task',
                      'action_name': exporter.NETWORK_VERIFICATION_ACTION,
                      'additional_info': {'ended_with_status': 'error'}}
            ),
            ActionLog(
                master_node_uid=inst_structure.master_node_uid,
                external_id=2,
                action_type='nailgun_task',
                action_name=exporter.NETWORK_VERIFICATION_ACTION,
                body={'cluster_id': clusters[1]['id'],
                      'end_timestamp': datetime.utcnow().isoformat(),
                      'action_type': 'nailgun_task',
                      'action_name': exporter.NETWORK_VERIFICATION_ACTION,
                      'additional_info': {'ended_with_status': 'ready'}}
            )
        ]

        with app.test_request_context():
            for action_log in expected_als:
                db.session.add(action_log)
            db.session.commit()

            action_logs = get_action_logs()
            inst_structures = get_inst_structures()
            structure_keys_paths, cluster_keys_paths, csv_keys_paths = \
                exporter.get_cluster_keys_paths()
            flatten_clusters = list(exporter.get_flatten_clusters(
                structure_keys_paths, cluster_keys_paths,
                inst_structures, action_logs))
            self.assertIn([exporter.NETWORK_VERIFICATION_COLUMN],
                          csv_keys_paths)
            nv_column_pos = csv_keys_paths.index(
                [exporter.NETWORK_VERIFICATION_COLUMN])

            # Checking cluster network verification statuses
            for al_pos, expected_al in enumerate(expected_als):
                self.assertEqual(
                    expected_al.body['additional_info']['ended_with_status'],
                    flatten_clusters[al_pos][nv_column_pos]
                )
            # Checking empty network verification status
            for flatten_cluster in flatten_clusters[2:]:
                self.assertIsNone(flatten_cluster[nv_column_pos])

    def test_vmware_attributes(self):
        exporter = StatsToCsv()
        inst_structures = self.generate_inst_structures(
            clusters_num_range=(1, 1))
        result = exporter.export_clusters(inst_structures, [])
        for _ in result:
            pass

    def test_cluster_invalid_data(self):
        exporter = StatsToCsv()
        num = 10
        inst_structures = self.get_saved_inst_structures(
            installations_num=num, clusters_num_range=(1, 1))

        with app.test_request_context():
            # get_flatten_data 2 times called inside get_flatten_plugins
            side_effect = [[]] * num * 2
            side_effect[num / 2] = Exception
            with mock.patch('fuel_analytics.api.resources.utils.'
                            'export_utils.get_flatten_data',
                            side_effect=side_effect):
                structure_paths, cluster_paths, csv_paths = \
                    exporter.get_cluster_keys_paths()
                flatten_clusters = exporter.get_flatten_clusters(
                    structure_paths, cluster_paths, inst_structures, [])
                self.assertEqual(num - 1, len(list(flatten_clusters)))
@ -1,71 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from sqlalchemy.orm import scoped_session
from sqlalchemy.orm import sessionmaker
from unittest2.case import TestCase

from fuel_analytics.api.app import app
from fuel_analytics.api.app import db
from fuel_analytics.api.db.model import ActionLog
from fuel_analytics.api.db.model import InstallationStructure
from fuel_analytics.api.db.model import OpenStackWorkloadStats
from fuel_analytics.api.log import init_logger

# Configuring app for the test environment
app.config.from_object('fuel_analytics.api.config.Testing')
app.config.from_envvar('ANALYTICS_SETTINGS', silent=True)
init_logger()


class BaseTest(TestCase):

    def setUp(self):
        super(BaseTest, self).setUp()
        self.client = app.test_client()

    def check_response_ok(self, resp, codes=(200, 201)):
        self.assertIn(resp.status_code, codes)

    def check_response_error(self, resp, code):
        self.assertEqual(code, resp.status_code)


class DbTest(BaseTest):

    def setUp(self):
        super(DbTest, self).setUp()
        # connect to the database
        self.connection = db.session.connection()

        # begin a non-ORM transaction
        self.trans = self.connection.begin()

        # bind an individual Session to the connection
        db.session = scoped_session(sessionmaker(bind=self.connection))

        # Cleaning DB
        OpenStackWorkloadStats.query.delete()
        InstallationStructure.query.delete()
        ActionLog.query.delete()
        db.session.flush()

    def tearDown(self):
        # rollback - everything that happened with the
        # Session above (including calls to commit())
        # is rolled back.
        self.trans.rollback()
        db.session.close()

        super(DbTest, self).tearDown()
@ -1,4 +0,0 @@
# Ignore everything in this directory
*
# Except this file
!.gitignore
@ -1,22 +0,0 @@
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from fuel_analytics.test.base import BaseTest


class TestCommon(BaseTest):

    def test_unknown_resource(self):
        resp = self.client.get('/xxx')
        self.check_response_error(resp, 404)
@ -1,40 +0,0 @@
#!/usr/bin/env python

# Copyright 2014 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from flask_script import Manager

from fuel_analytics.api import log
from fuel_analytics.api.app import app


def configure_app(mode=None):
    mode_map = {
        'test': 'fuel_analytics.api.config.Testing',
        'prod': 'fuel_analytics.api.config.Production'
    }
    app.config.from_object(mode_map.get(mode))
    app.config.from_envvar('ANALYTICS_SETTINGS', silent=True)
    log.init_logger()
    return app


manager = Manager(configure_app)
manager.add_option('--mode', help="Acceptable modes. Default: 'test'",
                   choices=('test', 'prod'), default='prod', dest='mode')


if __name__ == '__main__':
    manager.run()
@ -1,7 +0,0 @@
psycopg2==2.5.4
Flask==0.10.1
Flask-Script==2.0.5
Flask-SQLAlchemy==2.0
python-memcached>=1.56
SQLAlchemy==0.9.8
six>=1.8.0
@ -1,55 +0,0 @@
# Copyright 2014 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import os

from setuptools import find_packages
from setuptools import setup


def parse_requirements_txt():
    root = os.path.dirname(os.path.abspath(__file__))
    requirements = []
    with open(os.path.join(root, 'requirements.txt'), 'r') as f:
        for line in f.readlines():
            line = line.rstrip()
            if not line or line.startswith('#'):
                continue
            requirements.append(line)
    return requirements


setup(
    name='analytics',
    version='0.0.1',
    description="Service of analytics reports",
    long_description="""Service of analytics reports""",
    license="http://www.apache.org/licenses/LICENSE-2.0",
    classifiers=[
        "License :: OSI Approved :: Apache Software License",
        "Development Status :: 3 - Alpha",
        "Programming Language :: Python",
        "Topic :: Internet :: WWW/HTTP",
        "Topic :: Internet :: WWW/HTTP :: WSGI :: Application",
    ],
    author='Mirantis Inc.',
    author_email='product@mirantis.com',
    url='https://mirantis.com',
    keywords='fuel statistics analytics mirantis',
    packages=find_packages(),
    zip_safe=False,
    install_requires=parse_requirements_txt(),
    include_package_data=True,
    scripts=['manage_analytics.py']
)
@ -1,50 +0,0 @@
{
  "name": "fuel-stats",
  "dependencies": {
    "jquery": "2.1.1",
    "jquery-ui": "1.11.4",
    "bootstrap": "3.2.0",
    "d3": "3.4.13",
    "d3-tip": "0.6.3",
    "d3pie": "0.1.4",
    "nvd3": "1.1.15-beta",
    "requirejs": "2.1.15"
  },
  "overrides": {
    "nvd3": {
      "main": "nv.d3.js"
    },
    "bootstrap": {
      "main": [
        "dist/css/bootstrap-theme.css",
        "dist/css/bootstrap-theme.css.map",
        "dist/css/bootstrap.css",
        "dist/css/bootstrap.css.map",
        "dist/js/bootstrap.js"
      ],
      "normalize": {
        "css": ["*.css", "*.map"]
      }
    },
    "jquery-ui": {
      "main": [
        "themes/base/*",
        "themes/base/images/*",
        "jquery-ui.js"
      ],
      "normalize": {
        "css": ["*.css"],
        "css/images": ["*.png"]
      }
    }
  },
  "ignore": [
    "**/.*",
    "node_modules",
    "tests"
  ],
  "resolutions": {
    "d3": "~3.4.13"
  },
  "private": true
}
@ -1,332 +0,0 @@
/* Fonts */
@font-face {
    font-family: 'open sans light';
    src: url('../fonts/opensans-light-webfont.eot');
    src: url('../fonts/opensans-light-webfont.eot?#iefix') format('embedded-opentype'), url('../fonts/opensans-light-webfont.woff2') format('woff2'), url('../fonts/opensans-light-webfont.woff') format('woff'), url('../fonts/opensans-light-webfont.ttf') format('truetype'), url('../fonts/opensans-light-webfont.svg#open_sanslight') format('svg');
    font-weight: normal;
    font-style: normal;
}
@font-face {
    font-family: 'open sans';
    src: url('../fonts/opensans-regular-webfont.eot');
    src: url('../fonts/opensans-regular-webfont.eot?#iefix') format('embedded-opentype'), url('../fonts/opensans-regular-webfont.woff2') format('woff2'), url('../fonts/opensans-regular-webfont.woff') format('woff'), url('../fonts/opensans-regular-webfont.ttf') format('truetype'), url('../fonts/opensans-regular-webfont.svg#open_sansregular') format('svg');
    font-weight: normal;
    font-style: normal;
}
@font-face {
    font-family: 'open sans semibold';
    src: url('../fonts/opensans-semibold-webfont.eot');
    src: url('../fonts/opensans-semibold-webfont.eot?#iefix') format('embedded-opentype'), url('../fonts/opensans-semibold-webfont.woff2') format('woff2'), url('../fonts/opensans-semibold-webfont.woff') format('woff'), url('../fonts/opensans-semibold-webfont.ttf') format('truetype'), url('../fonts/opensans-semibold-webfont.svg#open_sanssemibold') format('svg');
    font-weight: normal;
    font-style: normal;
}
@font-face {
    font-family: 'open sans bold';
    src: url('../fonts/opensans-bold-webfont.eot');
    src: url('../fonts/opensans-bold-webfont.eot?#iefix') format('embedded-opentype'), url('../fonts/opensans-bold-webfont.woff2') format('woff2'), url('../fonts/opensans-bold-webfont.woff') format('woff'), url('../fonts/opensans-bold-webfont.ttf') format('truetype'), url('../fonts/opensans-bold-webfont.svg#open_sansbold') format('svg');
    font-weight: normal;
    font-style: normal;
}
@font-face {
    font-family: 'open sans extrabold';
    src: url('../fonts/opensans-extrabold-webfont.eot');
    src: url('../fonts/opensans-extrabold-webfont.eot?#iefix') format('embedded-opentype'), url('../fonts/opensans-extrabold-webfont.woff2') format('woff2'), url('../fonts/opensans-extrabold-webfont.woff') format('woff'), url('../fonts/opensans-extrabold-webfont.ttf') format('truetype'), url('../fonts/opensans-extrabold-webfont.svg#open_sansextrabold') format('svg');
    font-weight: normal;
    font-style: normal;
}
html, body {
    font-family: 'open sans';
    font-size: 14px;
    /* background-color: #2f4050; */
    background-color: #f3f3f4;
    height: 100%;
}

#loader {
    background-color: #415766;
    position: absolute;
    top: 0;
    width: 100%;
    height: 100%;
}
#load-error {
    margin-left: 82px;
    padding-top: 20px;
    font-size: 20px;
    color: #F1594C;
}
@keyframes loading {
    from {transform:rotate(0deg);}
    to {transform:rotate(360deg);}
}
.loading:before {
    content: "";
    display: block;
    width: 98px;
    height: 98px;
    overflow: hidden;
    margin: 200px auto 0;
    background: url(../img/loader-bg.svg);
    animation: loading 6s linear infinite;
}
.loading:after {
    content: "";
    display: block;
    width: 98px;
    height: 98px;
    margin: -124px auto 0;
    position: relative;
    z-index: 9999;
    background: url(../img/loader-logo.svg);
}

.hidden {
    display: none;
}
.nav-pannel {
    width: 70px;
    height: 100%;
    background-color: #2f4050;
    position: fixed;
    z-index: 2;
    top: 0px;
    left: 0px;
    overflow: hidden;
    transition: width .5s linear;
}
.nav-pannel:hover {
    width: 200px;
    transition: width .5s linear;
}
.nav-pannel > .logo {
    width: 100%;
    height: 70px;
    background: url(../img/stat-icons.png) no-repeat 0 0;
}
.nav-pannel > .logo > a {
    display: block;
    width: 100%;
    height: 70px;
}
.nav-item {
    width: 100%;
    height: 70px;
    overflow: hidden;
    font-family: 'open sans semibold';
}
.nav-item > a {
    display: block;
    width: 100%;
    height: 70px;
    border-left: 4px solid transparent;
    color: #a7b1c2;
    opacity: 0.5;
}
.nav-item > a:hover {
    border-left: 4px solid #d55656;
    opacity: 1;
}
.nav-item > .active {
    background-color: #23303c;
    border-left: 4px solid #19aa8d;
    color: #ffffff;
    opacity: 1;
}
.logout-button {
    width: inherit;
    height: 70px;
    /* background-color: rgba(0, 0, 0, 0.25); */
    position: fixed;
    bottom: 0px;
    left: 0px;
}
.logout-button > a {
    color: #a7b1c2;
    opacity: 0.5;
}
.logout-button > a:hover {
    opacity: 1;
}
.icon-logout {
    width: 70px;
    height: 70px;
    float: left;
    background: url(../img/stat-icons.png) no-repeat 0px -70px;
}
.icon-graphs {
    width: 70px;
    height: 70px;
    float: left;
    background: url(../img/stat-icons.png) no-repeat 0px -140px;
}
.icon-reports {
    width: 70px;
    height: 70px;
    float: left;
    background: url(../img/stat-icons.png) no-repeat 0px -210px;
}
.nav-item-name {
    height: 44px;
    width: 106px;
    float: left;
    padding-top: 26px;
    overflow: hidden;
    -webkit-box-sizing: border-box;
    -moz-box-sizing: border-box;
    box-sizing: border-box;
}
.top-graph {
    margin: 20px 0;
}
.base-box {
    width: 100%;
    height: 100%;
    min-width: 1270px;
    padding-left: 70px;
    background-color: #f3f3f4;
    -webkit-box-sizing: border-box;
    -moz-box-sizing: border-box;
    box-sizing: border-box;
}
.small-graphs-box {margin-top: 15px;}
.graph-item-box {
    background-color: #ffffff;
    -webkit-box-sizing: border-box;
    -moz-box-sizing: border-box;
    box-sizing: border-box;
    min-width: 380px;
    overflow: hidden;
}
.graph-item-box > .header {
    height: 49px;
    border-top: 4px solid #e7eaec;
    border-bottom: 1px solid #e7eaec;
    overflow: hidden;
}
.graph-item-box > .header > .title {
    font-size: 14px;
    font-weight: 600;
    color: #676a6c;
    margin: 12px 12px;
    width: 70%;
    float: left;
    height: 24px;
    overflow: hidden;
}
.graph-item-box > .header > .icon {
    width: 12px;
    height: 12px;
    float: right;
    margin: 16px 16px 0px 0px;
    background: url(../img/expand-icon.png) no-repeat top left;
}
.graph-item-box > .header > .icon > a {
    width: 12px;
    height: 12px;
    display: block;
    cursor: pointer;
}
.graph-item-box > .content {
    padding: 14px;
    font-weight: 300;
    font-size: 14px;
    -webkit-box-sizing: border-box;
    -moz-box-sizing: border-box;
    box-sizing: border-box;
}
.graph-item-box > .content > .image {
    display: block;
    height: 300px;
    margin-left: auto;
    margin-right: auto;
    overflow: hidden;
    padding: 0;
    width: 400px;
}
.graph-item-box > .content > .image > img {
    width: 100%;
}
.graph-item-box > .content > .description {
    margin-left: auto;
    margin-right: auto;
    margin-bottom: 10px;
    padding: 0;
    width: 400px;
    color: #6E6E6E;
    font-size: 12px;
}
.titul-graph-box {
    border-top: 4px solid #e7eaec;
    border-bottom: 1px solid #e7eaec;
    background-color: #ffffff;
    margin: 15px;
}
.standard-graph-box {
    margin-bottom: 15px;
}
.info-ticket {
    display: inline-block;
    padding: 12px 20px;
    -webkit-box-sizing: border-box;
    -moz-box-sizing: border-box;
    box-sizing: border-box;
}
.installation-counts {
    font-size: 14px;
    /* margin: 12px 0px 10px 6px; */
    font-family: "open sans semibold";
    color: #666666;
}
.installation-counts > b {
    color: #6fb6b9;
    font-weight: inherit;
    font-family: "open sans semibold";
}

/* nvd3 styles customization*/
svg text {
    font-family: 'open sans' !important;
    font-size: 12px;
    font-weight: normal;
}

.nv-series text, .p0_title, .p1_title {
    font-size: 13px !important;
}

#release-filter {
    margin: 10px 0 0 13px;
}

.reports .content {
    padding: 20px;
}

.reports ul {
    list-style-type: none;
    margin: 0;
    padding: 0;
}

.reports ul li {
    font-size: 14px;
    margin-top: 10px;
}

.reports .notice {
    font-size: 18px;
    margin: 20px 0 10px 0;
    color: gray;
}

.reports ul, .reports .row {
    margin-left: 20px;
}

.reports label {
    font-weight: normal;
}

.reports input {
    margin-bottom: 3px;
}
@ -1,770 +0,0 @@

/********************
 * HTML CSS
 */


.chartWrap {
  margin: 0;
  padding: 0;
  overflow: hidden;
}

/********************
  Box shadow and border radius styling
*/
.nvtooltip.with-3d-shadow, .with-3d-shadow .nvtooltip {
  -moz-box-shadow: 0 5px 10px rgba(0,0,0,.2);
  -webkit-box-shadow: 0 5px 10px rgba(0,0,0,.2);
  box-shadow: 0 5px 10px rgba(0,0,0,.2);

  -webkit-border-radius: 6px;
  -moz-border-radius: 6px;
  border-radius: 6px;
}

/********************
 * TOOLTIP CSS
 */

.nvtooltip {
  position: absolute;
  background-color: rgba(255,255,255,1.0);
  padding: 1px;
  border: 1px solid rgba(0,0,0,.2);
  z-index: 10000;

  font-family: Arial;
  font-size: 13px;
  text-align: left;
  pointer-events: none;

  white-space: nowrap;

  -webkit-touch-callout: none;
  -webkit-user-select: none;
  -khtml-user-select: none;
  -moz-user-select: none;
  -ms-user-select: none;
  user-select: none;
}

/* Give tooltips that old fade in transition by
   putting a "with-transitions" class on the container div.
*/
.nvtooltip.with-transitions, .with-transitions .nvtooltip {
  transition: opacity 250ms linear;
  -moz-transition: opacity 250ms linear;
  -webkit-transition: opacity 250ms linear;

  transition-delay: 250ms;
  -moz-transition-delay: 250ms;
  -webkit-transition-delay: 250ms;
}

.nvtooltip.x-nvtooltip,
.nvtooltip.y-nvtooltip {
  padding: 8px;
}

.nvtooltip h3 {
  margin: 0;
  font-size: 13px;
  padding: 4px 14px;
  line-height: 18px;
  font-weight: normal;
  background-color: rgba(247,247,247,0.75);
  text-align: center;

  border-bottom: 1px solid #ebebeb;

  -webkit-border-radius: 5px 5px 0 0;
  -moz-border-radius: 5px 5px 0 0;
  border-radius: 5px 5px 0 0;
}

.nvtooltip p {
  margin: 0;
  padding: 5px 14px;
  text-align: center;
}

.nvtooltip span {
  display: inline-block;
  margin: 2px 0;
}

.nvtooltip table {
  margin: 6px;
  border-spacing: 0;
}


.nvtooltip table td {
  padding: 2px 9px 2px 0;
  vertical-align: middle;
}

.nvtooltip table td.key {
  font-weight: normal;
}
.nvtooltip table td.value {
  text-align: right;
  font-weight: bold;
}

.nvtooltip table tr.highlight td {
  padding: 1px 9px 1px 0;
  border-bottom-style: solid;
  border-bottom-width: 1px;
  border-top-style: solid;
  border-top-width: 1px;
}

.nvtooltip table td.legend-color-guide div {
  width: 8px;
  height: 8px;
  vertical-align: middle;
}

.nvtooltip .footer {
  padding: 3px;
  text-align: center;
}


.nvtooltip-pending-removal {
  position: absolute;
  pointer-events: none;
}


/********************
 * SVG CSS
 */


svg {
  -webkit-touch-callout: none;
  -webkit-user-select: none;
  -khtml-user-select: none;
  -moz-user-select: none;
  -ms-user-select: none;
  user-select: none;
  /* Trying to get SVG to act like a greedy block in all browsers */
  display: block;
  width: 100%;
  height: 100%;
}


svg text {
  font: normal 12px Arial;
}

svg .title {
  font: bold 14px Arial;
}

.nvd3 .nv-background {
  fill: white;
  fill-opacity: 0;
  /*
  pointer-events: none;
  */
}

.nvd3.nv-noData {
  font-size: 18px;
  font-weight: bold;
}


/**********
 * Brush
 */

.nv-brush .extent {
  fill-opacity: .125;
  shape-rendering: crispEdges;
}



/**********
 * Legend
 */

.nvd3 .nv-legend .nv-series {
  cursor: pointer;
}

.nvd3 .nv-legend .disabled circle {
  fill-opacity: 0;
}



/**********
 * Axes
 */
.nvd3 .nv-axis {
  pointer-events: none;
}

.nvd3 .nv-axis path {
  fill: none;
  stroke: #000;
  stroke-opacity: .75;
  shape-rendering: crispEdges;
}

.nvd3 .nv-axis path.domain {
  stroke-opacity: .75;
}

.nvd3 .nv-axis.nv-x path.domain {
  stroke-opacity: 0;
}

.nvd3 .nv-axis line {
  fill: none;
  stroke: #e5e5e5;
  shape-rendering: crispEdges;
}

.nvd3 .nv-axis .zero line,
/* this selector may not be necessary */ .nvd3 .nv-axis line.zero {
  stroke-opacity: .75;
}

.nvd3 .nv-axis .nv-axisMaxMin text {
  font-weight: bold;
}

.nvd3 .x .nv-axis .nv-axisMaxMin text,
.nvd3 .x2 .nv-axis .nv-axisMaxMin text,
.nvd3 .x3 .nv-axis .nv-axisMaxMin text {
  text-anchor: middle
}



/**********
 * Brush
 */

.nv-brush .resize path {
  fill: #eee;
  stroke: #666;
}



/**********
 * Bars
 */

.nvd3 .nv-bars .negative rect {
  zfill: brown;
}

.nvd3 .nv-bars rect {
  zfill: steelblue;
  fill-opacity: .75;

  transition: fill-opacity 250ms linear;
  -moz-transition: fill-opacity 250ms linear;
  -webkit-transition: fill-opacity 250ms linear;
}

.nvd3 .nv-bars rect.hover {
  fill-opacity: 1;
}

.nvd3 .nv-bars .hover rect {
  fill: lightblue;
}

.nvd3 .nv-bars text {
  fill: rgba(0,0,0,0);
}

.nvd3 .nv-bars .hover text {
  fill: rgba(0,0,0,1);
}


/**********
 * Bars
 */

.nvd3 .nv-multibar .nv-groups rect,
.nvd3 .nv-multibarHorizontal .nv-groups rect,
.nvd3 .nv-discretebar .nv-groups rect {
  stroke-opacity: 0;

  transition: fill-opacity 250ms linear;
  -moz-transition: fill-opacity 250ms linear;
  -webkit-transition: fill-opacity 250ms linear;
}

.nvd3 .nv-multibar .nv-groups rect:hover,
.nvd3 .nv-multibarHorizontal .nv-groups rect:hover,
.nvd3 .nv-discretebar .nv-groups rect:hover {
  fill-opacity: 1;
}

.nvd3 .nv-discretebar .nv-groups text,
.nvd3 .nv-multibarHorizontal .nv-groups text {
  font-weight: normal;
  fill: rgba(0,0,0,1);
  stroke: rgba(0,0,0,0);
}

/***********
 * Pie Chart
 */

.nvd3.nv-pie path {
  stroke-opacity: 0;
  transition: fill-opacity 250ms linear, stroke-width 250ms linear, stroke-opacity 250ms linear;
  -moz-transition: fill-opacity 250ms linear, stroke-width 250ms linear, stroke-opacity 250ms linear;
  -webkit-transition: fill-opacity 250ms linear, stroke-width 250ms linear, stroke-opacity 250ms linear;

}

.nvd3.nv-pie .nv-slice text {
  stroke: #000;
  stroke-width: 0;
}

.nvd3.nv-pie path {
  stroke: #fff;
  stroke-width: 1px;
  stroke-opacity: 1;
}

.nvd3.nv-pie .hover path {
  fill-opacity: .7;
}
.nvd3.nv-pie .nv-label {
  pointer-events: none;
}
.nvd3.nv-pie .nv-label rect {
  fill-opacity: 0;
  stroke-opacity: 0;
}

/**********
 * Lines
 */

.nvd3 .nv-groups path.nv-line {
  fill: none;
  stroke-width: 1.5px;
  /*
  stroke-linecap: round;
  shape-rendering: geometricPrecision;

  transition: stroke-width 250ms linear;
  -moz-transition: stroke-width 250ms linear;
  -webkit-transition: stroke-width 250ms linear;

  transition-delay: 250ms
  -moz-transition-delay: 250ms;
  -webkit-transition-delay: 250ms;
  */
}

.nvd3 .nv-groups path.nv-line.nv-thin-line {
  stroke-width: 1px;
}


.nvd3 .nv-groups path.nv-area {
  stroke: none;
  /*
  stroke-linecap: round;
  shape-rendering: geometricPrecision;

  stroke-width: 2.5px;
  transition: stroke-width 250ms linear;
  -moz-transition: stroke-width 250ms linear;
  -webkit-transition: stroke-width 250ms linear;

  transition-delay: 250ms
  -moz-transition-delay: 250ms;
  -webkit-transition-delay: 250ms;
  */
}

.nvd3 .nv-line.hover path {
  stroke-width: 6px;
}

/*
.nvd3.scatter .groups .point {
  fill-opacity: 0.1;
  stroke-opacity: 0.1;
}
*/

.nvd3.nv-line .nvd3.nv-scatter .nv-groups .nv-point {
  fill-opacity: 0;
  stroke-opacity: 0;
}

.nvd3.nv-scatter.nv-single-point .nv-groups .nv-point {
  fill-opacity: .5 !important;
  stroke-opacity: .5 !important;
}


.with-transitions .nvd3 .nv-groups .nv-point {
  transition: stroke-width 250ms linear, stroke-opacity 250ms linear;
  -moz-transition: stroke-width 250ms linear, stroke-opacity 250ms linear;
  -webkit-transition: stroke-width 250ms linear, stroke-opacity 250ms linear;

}

.nvd3.nv-scatter .nv-groups .nv-point.hover,
.nvd3 .nv-groups .nv-point.hover {
  stroke-width: 7px;
  fill-opacity: .95 !important;
  stroke-opacity: .95 !important;
}


.nvd3 .nv-point-paths path {
  stroke: #aaa;
  stroke-opacity: 0;
  fill: #eee;
  fill-opacity: 0;
}



.nvd3 .nv-indexLine {
  cursor: ew-resize;
}


/**********
 * Distribution
 */

.nvd3 .nv-distribution {
  pointer-events: none;
}



/**********
 * Scatter
 */

/* **Attempting to remove this for useVoronoi(false), need to see if it's required anywhere
.nvd3 .nv-groups .nv-point {
  pointer-events: none;
}
*/

.nvd3 .nv-groups .nv-point.hover {
  stroke-width: 20px;
  stroke-opacity: .5;
}

.nvd3 .nv-scatter .nv-point.hover {
  fill-opacity: 1;
}

/*
.nv-group.hover .nv-point {
  fill-opacity: 1;
}
*/


/**********
 * Stacked Area
 */

.nvd3.nv-stackedarea path.nv-area {
  fill-opacity: .7;
  /*
  stroke-opacity: .65;
  fill-opacity: 1;
  */
  stroke-opacity: 0;

  transition: fill-opacity 250ms linear, stroke-opacity 250ms linear;
  -moz-transition: fill-opacity 250ms linear, stroke-opacity 250ms linear;
  -webkit-transition: fill-opacity 250ms linear, stroke-opacity 250ms linear;

  /*
  transition-delay: 500ms;
  -moz-transition-delay: 500ms;
  -webkit-transition-delay: 500ms;
  */

}

.nvd3.nv-stackedarea path.nv-area.hover {
  fill-opacity: .9;
  /*
  stroke-opacity: .85;
  */
}
/*
.d3stackedarea .groups path {
  stroke-opacity: 0;
}
*/



.nvd3.nv-stackedarea .nv-groups .nv-point {
  stroke-opacity: 0;
  fill-opacity: 0;
}

/*
.nvd3.nv-stackedarea .nv-groups .nv-point.hover {
  stroke-width: 20px;
  stroke-opacity: .75;
  fill-opacity: 1;
}*/



/**********
 * Line Plus Bar
 */

.nvd3.nv-linePlusBar .nv-bar rect {
  fill-opacity: .75;
}

.nvd3.nv-linePlusBar .nv-bar rect:hover {
  fill-opacity: 1;
}


/**********
 * Bullet
 */

.nvd3.nv-bullet { font: 10px sans-serif; }
.nvd3.nv-bullet .nv-measure { fill-opacity: .8; }
.nvd3.nv-bullet .nv-measure:hover { fill-opacity: 1; }
.nvd3.nv-bullet .nv-marker { stroke: #000; stroke-width: 2px; }
.nvd3.nv-bullet .nv-markerTriangle { stroke: #000; fill: #fff; stroke-width: 1.5px; }
.nvd3.nv-bullet .nv-tick line { stroke: #666; stroke-width: .5px; }
.nvd3.nv-bullet .nv-range.nv-s0 { fill: #eee; }
.nvd3.nv-bullet .nv-range.nv-s1 { fill: #ddd; }
.nvd3.nv-bullet .nv-range.nv-s2 { fill: #ccc; }
.nvd3.nv-bullet .nv-title { font-size: 14px; font-weight: bold; }
.nvd3.nv-bullet .nv-subtitle { fill: #999; }


.nvd3.nv-bullet .nv-range {
  fill: #bababa;
  fill-opacity: .4;
}
.nvd3.nv-bullet .nv-range:hover {
  fill-opacity: .7;
}



/**********
 * Sparkline
 */

.nvd3.nv-sparkline path {
  fill: none;
}

.nvd3.nv-sparklineplus g.nv-hoverValue {
  pointer-events: none;
}

.nvd3.nv-sparklineplus .nv-hoverValue line {
  stroke: #333;
  stroke-width: 1.5px;
}

.nvd3.nv-sparklineplus,
.nvd3.nv-sparklineplus g {
  pointer-events: all;
}

.nvd3 .nv-hoverArea {
  fill-opacity: 0;
  stroke-opacity: 0;
}

.nvd3.nv-sparklineplus .nv-xValue,
.nvd3.nv-sparklineplus .nv-yValue {
  /*
  stroke: #666;
  */
  stroke-width: 0;
  font-size: .9em;
  font-weight: normal;
}

.nvd3.nv-sparklineplus .nv-yValue {
  stroke: #f66;
}

.nvd3.nv-sparklineplus .nv-maxValue {
  stroke: #2ca02c;
  fill: #2ca02c;
}

.nvd3.nv-sparklineplus .nv-minValue {
  stroke: #d62728;
  fill: #d62728;
}

.nvd3.nv-sparklineplus .nv-currentValue {
  /*
  stroke: #444;
  fill: #000;
  */
  font-weight: bold;
  font-size: 1.1em;
}

/**********
 * historical stock
 */

.nvd3.nv-ohlcBar .nv-ticks .nv-tick {
  stroke-width: 2px;
}

.nvd3.nv-ohlcBar .nv-ticks .nv-tick.hover {
  stroke-width: 4px;
}

.nvd3.nv-ohlcBar .nv-ticks .nv-tick.positive {
  stroke: #2ca02c;
}

.nvd3.nv-ohlcBar .nv-ticks .nv-tick.negative {
  stroke: #d62728;
}

.nvd3.nv-historicalStockChart .nv-axis .nv-axislabel {
  font-weight: bold;
}

.nvd3.nv-historicalStockChart .nv-dragTarget {
  fill-opacity: 0;
  stroke: none;
  cursor: move;
}

.nvd3 .nv-brush .extent {
  /*
  cursor: ew-resize !important;
  */
  fill-opacity: 0 !important;
}

.nvd3 .nv-brushBackground rect {
  stroke: #000;
  stroke-width: .4;
  fill: #fff;
  fill-opacity: .7;
}



/**********
 * Indented Tree
 */


/**
 * TODO: the following 3 selectors are based on classes used in the example. I should either make them standard and leave them here, or move to a CSS file not included in the library
 */
.nvd3.nv-indentedtree .name {
  margin-left: 5px;
}

.nvd3.nv-indentedtree .clickable {
  color: #08C;
  cursor: pointer;
}

.nvd3.nv-indentedtree span.clickable:hover {
  color: #005580;
  text-decoration: underline;
}


.nvd3.nv-indentedtree .nv-childrenCount {
  display: inline-block;
  margin-left: 5px;
}

.nvd3.nv-indentedtree .nv-treeicon {
  cursor: pointer;
  /*
  cursor: n-resize;
  */
}

.nvd3.nv-indentedtree .nv-treeicon.nv-folded {
  cursor: pointer;
  /*
  cursor: s-resize;
  */
}

/**********
 * Parallel Coordinates
 */

.nvd3 .background path {
  fill: none;
  stroke: #ccc;
  stroke-opacity: .4;
  shape-rendering: crispEdges;
}

.nvd3 .foreground path {
  fill: none;
  stroke: steelblue;
  stroke-opacity: .7;
}

.nvd3 .brush .extent {
  fill-opacity: .3;
  stroke: #fff;
  shape-rendering: crispEdges;
}

.nvd3 .axis line, .axis path {
  fill: none;
  stroke: #000;
  shape-rendering: crispEdges;
}

.nvd3 .axis text {
  text-shadow: 0 1px 0 #fff;
}

/****
  Interactive Layer
*/
.nvd3 .nv-interactiveGuideLine {
  pointer-events: none;
}
.nvd3 line.nv-guideline {
  stroke: #ccc;
}

Before Width: | Height: | Size: 1.1 KiB

@ -1,229 +0,0 @@
<?xml version="1.0" standalone="no"?>
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd" >
<svg xmlns="http://www.w3.org/2000/svg">
<metadata></metadata>
<defs>
<font id="glyphicons_halflingsregular" horiz-adv-x="1200" >
<font-face units-per-em="1200" ascent="960" descent="-240" />
<missing-glyph horiz-adv-x="500" />
<glyph />
<glyph />
<glyph unicode="
" />
<glyph unicode=" " />
<glyph unicode="*" d="M100 500v200h259l-183 183l141 141l183 -183v259h200v-259l183 183l141 -141l-183 -183h259v-200h-259l183 -183l-141 -141l-183 183v-259h-200v259l-183 -183l-141 141l183 183h-259z" />
<glyph unicode="+" d="M0 400v300h400v400h300v-400h400v-300h-400v-400h-300v400h-400z" />
<glyph unicode=" " />
<glyph unicode=" " horiz-adv-x="652" />
<glyph unicode=" " horiz-adv-x="1304" />
<glyph unicode=" " horiz-adv-x="652" />
<glyph unicode=" " horiz-adv-x="1304" />
<glyph unicode=" " horiz-adv-x="434" />
<glyph unicode=" " horiz-adv-x="326" />
<glyph unicode=" " horiz-adv-x="217" />
<glyph unicode=" " horiz-adv-x="217" />
<glyph unicode=" " horiz-adv-x="163" />
<glyph unicode=" " horiz-adv-x="260" />
<glyph unicode=" " horiz-adv-x="72" />
<glyph unicode=" " horiz-adv-x="260" />
<glyph unicode=" " horiz-adv-x="326" />
<glyph unicode="€" d="M100 500l100 100h113q0 47 5 100h-218l100 100h135q37 167 112 257q117 141 297 141q242 0 354 -189q60 -103 66 -209h-181q0 55 -25.5 99t-63.5 68t-75 36.5t-67 12.5q-24 0 -52.5 -10t-62.5 -32t-65.5 -67t-50.5 -107h379l-100 -100h-300q-6 -46 -6 -100h406l-100 -100 h-300q9 -74 33 -132t52.5 -91t62 -54.5t59 -29t46.5 -7.5q29 0 66 13t75 37t63.5 67.5t25.5 96.5h174q-31 -172 -128 -278q-107 -117 -274 -117q-205 0 -324 158q-36 46 -69 131.5t-45 205.5h-217z" />
<glyph unicode="−" d="M200 400h900v300h-900v-300z" />
<glyph unicode="◼" horiz-adv-x="500" d="M0 0z" />
<glyph unicode="☁" d="M-14 494q0 -80 56.5 -137t135.5 -57h750q120 0 205 86.5t85 207.5t-85 207t-205 86q-46 0 -90 -14q-44 97 -134.5 156.5t-200.5 59.5q-152 0 -260 -107.5t-108 -260.5q0 -25 2 -37q-66 -14 -108.5 -67.5t-42.5 -122.5z" />
<glyph unicode="✉" d="M0 100l400 400l200 -200l200 200l400 -400h-1200zM0 300v600l300 -300zM0 1100l600 -603l600 603h-1200zM900 600l300 300v-600z" />
<glyph unicode="✏" d="M-13 -13l333 112l-223 223zM187 403l214 -214l614 614l-214 214zM887 1103l214 -214l99 92q13 13 13 32.5t-13 33.5l-153 153q-15 13 -33 13t-33 -13z" />
<glyph unicode="" d="M0 1200h1200l-500 -550v-550h300v-100h-800v100h300v550z" />
<glyph unicode="" d="M14 84q18 -55 86 -75.5t147 5.5q65 21 109 69t44 90v606l600 155v-521q-64 16 -138 -7q-79 -26 -122.5 -83t-25.5 -111q18 -55 86 -75.5t147 4.5q70 23 111.5 63.5t41.5 95.5v881q0 10 -7 15.5t-17 2.5l-752 -193q-10 -3 -17 -12.5t-7 -19.5v-689q-64 17 -138 -7 q-79 -25 -122.5 -82t-25.5 -112z" />
<glyph unicode="" d="M23 693q0 200 142 342t342 142t342 -142t142 -342q0 -142 -78 -261l300 -300q7 -8 7 -18t-7 -18l-109 -109q-8 -7 -18 -7t-18 7l-300 300q-119 -78 -261 -78q-200 0 -342 142t-142 342zM176 693q0 -136 97 -233t234 -97t233.5 96.5t96.5 233.5t-96.5 233.5t-233.5 96.5 t-234 -97t-97 -233z" />
<glyph unicode="" d="M100 784q0 64 28 123t73 100.5t104.5 64t119 20.5t120 -38.5t104.5 -104.5q48 69 109.5 105t121.5 38t118.5 -20.5t102.5 -64t71 -100.5t27 -123q0 -57 -33.5 -117.5t-94 -124.5t-126.5 -127.5t-150 -152.5t-146 -174q-62 85 -145.5 174t-149.5 152.5t-126.5 127.5 t-94 124.5t-33.5 117.5z" />
<glyph unicode="" d="M-72 800h479l146 400h2l146 -400h472l-382 -278l145 -449l-384 275l-382 -275l146 447zM168 71l2 1z" />
<glyph unicode="" d="M-72 800h479l146 400h2l146 -400h472l-382 -278l145 -449l-384 275l-382 -275l146 447zM168 71l2 1zM237 700l196 -142l-73 -226l192 140l195 -141l-74 229l193 140h-235l-77 211l-78 -211h-239z" />
<glyph unicode="" d="M0 0v143l400 257v100q-37 0 -68.5 74.5t-31.5 125.5v200q0 124 88 212t212 88t212 -88t88 -212v-200q0 -51 -31.5 -125.5t-68.5 -74.5v-100l400 -257v-143h-1200z" />
<glyph unicode="" d="M0 0v1100h1200v-1100h-1200zM100 100h100v100h-100v-100zM100 300h100v100h-100v-100zM100 500h100v100h-100v-100zM100 700h100v100h-100v-100zM100 900h100v100h-100v-100zM300 100h600v400h-600v-400zM300 600h600v400h-600v-400zM1000 100h100v100h-100v-100z M1000 300h100v100h-100v-100zM1000 500h100v100h-100v-100zM1000 700h100v100h-100v-100zM1000 900h100v100h-100v-100z" />
<glyph unicode="" d="M0 50v400q0 21 14.5 35.5t35.5 14.5h400q21 0 35.5 -14.5t14.5 -35.5v-400q0 -21 -14.5 -35.5t-35.5 -14.5h-400q-21 0 -35.5 14.5t-14.5 35.5zM0 650v400q0 21 14.5 35.5t35.5 14.5h400q21 0 35.5 -14.5t14.5 -35.5v-400q0 -21 -14.5 -35.5t-35.5 -14.5h-400 q-21 0 -35.5 14.5t-14.5 35.5zM600 50v400q0 21 14.5 35.5t35.5 14.5h400q21 0 35.5 -14.5t14.5 -35.5v-400q0 -21 -14.5 -35.5t-35.5 -14.5h-400q-21 0 -35.5 14.5t-14.5 35.5zM600 650v400q0 21 14.5 35.5t35.5 14.5h400q21 0 35.5 -14.5t14.5 -35.5v-400 q0 -21 -14.5 -35.5t-35.5 -14.5h-400q-21 0 -35.5 14.5t-14.5 35.5z" />
<glyph unicode="" d="M0 50v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM0 450v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200 q-21 0 -35.5 14.5t-14.5 35.5zM0 850v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM400 50v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5 t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM400 450v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM400 850v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5 v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM800 50v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM800 450v200q0 21 14.5 35.5t35.5 14.5h200 q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM800 850v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5z" />
<glyph unicode="" d="M0 50v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM0 450q0 -21 14.5 -35.5t35.5 -14.5h200q21 0 35.5 14.5t14.5 35.5v200q0 21 -14.5 35.5t-35.5 14.5h-200q-21 0 -35.5 -14.5 t-14.5 -35.5v-200zM0 850v200q0 21 14.5 35.5t35.5 14.5h200q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5zM400 50v200q0 21 14.5 35.5t35.5 14.5h700q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5 t-35.5 -14.5h-700q-21 0 -35.5 14.5t-14.5 35.5zM400 450v200q0 21 14.5 35.5t35.5 14.5h700q21 0 35.5 -14.5t14.5 -35.5v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-700q-21 0 -35.5 14.5t-14.5 35.5zM400 850v200q0 21 14.5 35.5t35.5 14.5h700q21 0 35.5 -14.5t14.5 -35.5 v-200q0 -21 -14.5 -35.5t-35.5 -14.5h-700q-21 0 -35.5 14.5t-14.5 35.5z" />
<glyph unicode="" d="M29 454l419 -420l818 820l-212 212l-607 -607l-206 207z" />
<glyph unicode="" d="M106 318l282 282l-282 282l212 212l282 -282l282 282l212 -212l-282 -282l282 -282l-212 -212l-282 282l-282 -282z" />
<glyph unicode="" d="M23 693q0 200 142 342t342 142t342 -142t142 -342q0 -142 -78 -261l300 -300q7 -8 7 -18t-7 -18l-109 -109q-8 -7 -18 -7t-18 7l-300 300q-119 -78 -261 -78q-200 0 -342 142t-142 342zM176 693q0 -136 97 -233t234 -97t233.5 96.5t96.5 233.5t-96.5 233.5t-233.5 96.5 t-234 -97t-97 -233zM300 600v200h100v100h200v-100h100v-200h-100v-100h-200v100h-100z" />
<glyph unicode="" d="M23 694q0 200 142 342t342 142t342 -142t142 -342q0 -141 -78 -262l300 -299q7 -7 7 -18t-7 -18l-109 -109q-8 -8 -18 -8t-18 8l-300 300q-119 -78 -261 -78q-200 0 -342 142t-142 342zM176 694q0 -136 97 -233t234 -97t233.5 97t96.5 233t-96.5 233t-233.5 97t-234 -97 t-97 -233zM300 601h400v200h-400v-200z" />
<glyph unicode="" d="M23 600q0 183 105 331t272 210v-166q-103 -55 -165 -155t-62 -220q0 -177 125 -302t302 -125t302 125t125 302q0 120 -62 220t-165 155v166q167 -62 272 -210t105 -331q0 -118 -45.5 -224.5t-123 -184t-184 -123t-224.5 -45.5t-224.5 45.5t-184 123t-123 184t-45.5 224.5 zM500 750q0 -21 14.5 -35.5t35.5 -14.5h100q21 0 35.5 14.5t14.5 35.5v400q0 21 -14.5 35.5t-35.5 14.5h-100q-21 0 -35.5 -14.5t-14.5 -35.5v-400z" />
<glyph unicode="" d="M100 1h200v300h-200v-300zM400 1v500h200v-500h-200zM700 1v800h200v-800h-200zM1000 1v1200h200v-1200h-200z" />
<glyph unicode="" d="M26 601q0 -33 6 -74l151 -38l2 -6q14 -49 38 -93l3 -5l-80 -134q45 -59 105 -105l133 81l5 -3q45 -26 94 -39l5 -2l38 -151q40 -5 74 -5q27 0 74 5l38 151l6 2q46 13 93 39l5 3l134 -81q56 44 104 105l-80 134l3 5q24 44 39 93l1 6l152 38q5 40 5 74q0 28 -5 73l-152 38 l-1 6q-16 51 -39 93l-3 5l80 134q-44 58 -104 105l-134 -81l-5 3q-45 25 -93 39l-6 1l-38 152q-40 5 -74 5q-27 0 -74 -5l-38 -152l-5 -1q-50 -14 -94 -39l-5 -3l-133 81q-59 -47 -105 -105l80 -134l-3 -5q-25 -47 -38 -93l-2 -6l-151 -38q-6 -48 -6 -73zM385 601 q0 88 63 151t152 63t152 -63t63 -151q0 -89 -63 -152t-152 -63t-152 63t-63 152z" />
<glyph unicode="" d="M100 1025v50q0 10 7.5 17.5t17.5 7.5h275v100q0 41 29.5 70.5t70.5 29.5h300q41 0 70.5 -29.5t29.5 -70.5v-100h275q10 0 17.5 -7.5t7.5 -17.5v-50q0 -11 -7 -18t-18 -7h-1050q-11 0 -18 7t-7 18zM200 100v800h900v-800q0 -41 -29.5 -71t-70.5 -30h-700q-41 0 -70.5 30 t-29.5 71zM300 100h100v700h-100v-700zM500 100h100v700h-100v-700zM500 1100h300v100h-300v-100zM700 100h100v700h-100v-700zM900 100h100v700h-100v-700z" />
<glyph unicode="" d="M1 601l656 644l644 -644h-200v-600h-300v400h-300v-400h-300v600h-200z" />
<glyph unicode="" d="M100 25v1150q0 11 7 18t18 7h475v-500h400v-675q0 -11 -7 -18t-18 -7h-850q-11 0 -18 7t-7 18zM700 800v300l300 -300h-300z" />
<glyph unicode="" d="M4 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM186 600q0 -171 121.5 -292.5t292.5 -121.5t292.5 121.5t121.5 292.5t-121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM500 500v400h100 v-300h200v-100h-300z" />
<glyph unicode="" d="M-100 0l431 1200h209l-21 -300h162l-20 300h208l431 -1200h-538l-41 400h-242l-40 -400h-539zM488 500h224l-27 300h-170z" />
<glyph unicode="" d="M0 0v400h490l-290 300h200v500h300v-500h200l-290 -300h490v-400h-1100zM813 200h175v100h-175v-100z" />
<glyph unicode="" d="M1 600q0 122 47.5 233t127.5 191t191 127.5t233 47.5t233 -47.5t191 -127.5t127.5 -191t47.5 -233t-47.5 -233t-127.5 -191t-191 -127.5t-233 -47.5t-233 47.5t-191 127.5t-127.5 191t-47.5 233zM188 600q0 -170 121 -291t291 -121t291 121t121 291t-121 291t-291 121 t-291 -121t-121 -291zM350 600h150v300h200v-300h150l-250 -300z" />
<glyph unicode="" d="M4 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM186 600q0 -171 121.5 -292.5t292.5 -121.5t292.5 121.5t121.5 292.5t-121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM350 600l250 300 l250 -300h-150v-300h-200v300h-150z" />
<glyph unicode="" d="M0 25v475l200 700h800l199 -700l1 -475q0 -11 -7 -18t-18 -7h-1150q-11 0 -18 7t-7 18zM200 500h200l50 -200h300l50 200h200l-97 500h-606z" />
<glyph unicode="" d="M4 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM186 600q0 -172 121.5 -293t292.5 -121t292.5 121t121.5 293q0 171 -121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM500 397v401 l297 -200z" />
<glyph unicode="" d="M23 600q0 -118 45.5 -224.5t123 -184t184 -123t224.5 -45.5t224.5 45.5t184 123t123 184t45.5 224.5h-150q0 -177 -125 -302t-302 -125t-302 125t-125 302t125 302t302 125q136 0 246 -81l-146 -146h400v400l-145 -145q-157 122 -355 122q-118 0 -224.5 -45.5t-184 -123 t-123 -184t-45.5 -224.5z" />
<glyph unicode="" d="M23 600q0 118 45.5 224.5t123 184t184 123t224.5 45.5q198 0 355 -122l145 145v-400h-400l147 147q-112 80 -247 80q-177 0 -302 -125t-125 -302h-150zM100 0v400h400l-147 -147q112 -80 247 -80q177 0 302 125t125 302h150q0 -118 -45.5 -224.5t-123 -184t-184 -123 t-224.5 -45.5q-198 0 -355 122z" />
<glyph unicode="" d="M100 0h1100v1200h-1100v-1200zM200 100v900h900v-900h-900zM300 200v100h100v-100h-100zM300 400v100h100v-100h-100zM300 600v100h100v-100h-100zM300 800v100h100v-100h-100zM500 200h500v100h-500v-100zM500 400v100h500v-100h-500zM500 600v100h500v-100h-500z M500 800v100h500v-100h-500z" />
<glyph unicode="" d="M0 100v600q0 41 29.5 70.5t70.5 29.5h100v200q0 82 59 141t141 59h300q82 0 141 -59t59 -141v-200h100q41 0 70.5 -29.5t29.5 -70.5v-600q0 -41 -29.5 -70.5t-70.5 -29.5h-900q-41 0 -70.5 29.5t-29.5 70.5zM400 800h300v150q0 21 -14.5 35.5t-35.5 14.5h-200 q-21 0 -35.5 -14.5t-14.5 -35.5v-150z" />
<glyph unicode="" d="M100 0v1100h100v-1100h-100zM300 400q60 60 127.5 84t127.5 17.5t122 -23t119 -30t110 -11t103 42t91 120.5v500q-40 -81 -101.5 -115.5t-127.5 -29.5t-138 25t-139.5 40t-125.5 25t-103 -29.5t-65 -115.5v-500z" />
<glyph unicode="" d="M0 275q0 -11 7 -18t18 -7h50q11 0 18 7t7 18v300q0 127 70.5 231.5t184.5 161.5t245 57t245 -57t184.5 -161.5t70.5 -231.5v-300q0 -11 7 -18t18 -7h50q11 0 18 7t7 18v300q0 116 -49.5 227t-131 192.5t-192.5 131t-227 49.5t-227 -49.5t-192.5 -131t-131 -192.5 t-49.5 -227v-300zM200 20v460q0 8 6 14t14 6h160q8 0 14 -6t6 -14v-460q0 -8 -6 -14t-14 -6h-160q-8 0 -14 6t-6 14zM800 20v460q0 8 6 14t14 6h160q8 0 14 -6t6 -14v-460q0 -8 -6 -14t-14 -6h-160q-8 0 -14 6t-6 14z" />
<glyph unicode="" d="M0 400h300l300 -200v800l-300 -200h-300v-400zM688 459l141 141l-141 141l71 71l141 -141l141 141l71 -71l-141 -141l141 -141l-71 -71l-141 141l-141 -141z" />
<glyph unicode="" d="M0 400h300l300 -200v800l-300 -200h-300v-400zM700 857l69 53q111 -135 111 -310q0 -169 -106 -302l-67 54q86 110 86 248q0 146 -93 257z" />
<glyph unicode="" d="M0 401v400h300l300 200v-800l-300 200h-300zM702 858l69 53q111 -135 111 -310q0 -170 -106 -303l-67 55q86 110 86 248q0 145 -93 257zM889 951l7 -8q123 -151 123 -344q0 -189 -119 -339l-7 -8l81 -66l6 8q142 178 142 405q0 230 -144 408l-6 8z" />
<glyph unicode="" d="M0 0h500v500h-200v100h-100v-100h-200v-500zM0 600h100v100h400v100h100v100h-100v300h-500v-600zM100 100v300h300v-300h-300zM100 800v300h300v-300h-300zM200 200v100h100v-100h-100zM200 900h100v100h-100v-100zM500 500v100h300v-300h200v-100h-100v-100h-200v100 h-100v100h100v200h-200zM600 0v100h100v-100h-100zM600 1000h100v-300h200v-300h300v200h-200v100h200v500h-600v-200zM800 800v300h300v-300h-300zM900 0v100h300v-100h-300zM900 900v100h100v-100h-100zM1100 200v100h100v-100h-100z" />
<glyph unicode="" d="M0 200h100v1000h-100v-1000zM100 0v100h300v-100h-300zM200 200v1000h100v-1000h-100zM500 0v91h100v-91h-100zM500 200v1000h200v-1000h-200zM700 0v91h100v-91h-100zM800 200v1000h100v-1000h-100zM900 0v91h200v-91h-200zM1000 200v1000h200v-1000h-200z" />
<glyph unicode="" d="M0 700l1 475q0 10 7.5 17.5t17.5 7.5h474l700 -700l-500 -500zM148 953q0 -42 29 -71q30 -30 71.5 -30t71.5 30q29 29 29 71t-29 71q-30 30 -71.5 30t-71.5 -30q-29 -29 -29 -71z" />
<glyph unicode="" d="M1 700l1 475q0 11 7 18t18 7h474l700 -700l-500 -500zM148 953q0 -42 30 -71q29 -30 71 -30t71 30q30 29 30 71t-30 71q-29 30 -71 30t-71 -30q-30 -29 -30 -71zM701 1200h100l700 -700l-500 -500l-50 50l450 450z" />
<glyph unicode="" d="M100 0v1025l175 175h925v-1000l-100 -100v1000h-750l-100 -100h750v-1000h-900z" />
<glyph unicode="" d="M200 0l450 444l450 -443v1150q0 20 -14.5 35t-35.5 15h-800q-21 0 -35.5 -15t-14.5 -35v-1151z" />
<glyph unicode="" d="M0 100v700h200l100 -200h600l100 200h200v-700h-200v200h-800v-200h-200zM253 829l40 -124h592l62 124l-94 346q-2 11 -10 18t-18 7h-450q-10 0 -18 -7t-10 -18zM281 24l38 152q2 10 11.5 17t19.5 7h500q10 0 19.5 -7t11.5 -17l38 -152q2 -10 -3.5 -17t-15.5 -7h-600 q-10 0 -15.5 7t-3.5 17z" />
<glyph unicode="" d="M0 200q0 -41 29.5 -70.5t70.5 -29.5h1000q41 0 70.5 29.5t29.5 70.5v600q0 41 -29.5 70.5t-70.5 29.5h-150q-4 8 -11.5 21.5t-33 48t-53 61t-69 48t-83.5 21.5h-200q-41 0 -82 -20.5t-70 -50t-52 -59t-34 -50.5l-12 -20h-150q-41 0 -70.5 -29.5t-29.5 -70.5v-600z M356 500q0 100 72 172t172 72t172 -72t72 -172t-72 -172t-172 -72t-172 72t-72 172zM494 500q0 -44 31 -75t75 -31t75 31t31 75t-31 75t-75 31t-75 -31t-31 -75zM900 700v100h100v-100h-100z" />
<glyph unicode="" d="M53 0h365v66q-41 0 -72 11t-49 38t1 71l92 234h391l82 -222q16 -45 -5.5 -88.5t-74.5 -43.5v-66h417v66q-34 1 -74 43q-18 19 -33 42t-21 37l-6 13l-385 998h-93l-399 -1006q-24 -48 -52 -75q-12 -12 -33 -25t-36 -20l-15 -7v-66zM416 521l178 457l46 -140l116 -317h-340 z" />
<glyph unicode="" d="M100 0v89q41 7 70.5 32.5t29.5 65.5v827q0 28 -1 39.5t-5.5 26t-15.5 21t-29 14t-49 14.5v71l471 -1q120 0 213 -88t93 -228q0 -55 -11.5 -101.5t-28 -74t-33.5 -47.5t-28 -28l-12 -7q8 -3 21.5 -9t48 -31.5t60.5 -58t47.5 -91.5t21.5 -129q0 -84 -59 -156.5t-142 -111 t-162 -38.5h-500zM400 200h161q89 0 153 48.5t64 132.5q0 90 -62.5 154.5t-156.5 64.5h-159v-400zM400 700h139q76 0 130 61.5t54 138.5q0 82 -84 130.5t-239 48.5v-379z" />
<glyph unicode="" d="M200 0v57q77 7 134.5 40.5t65.5 80.5l173 849q10 56 -10 74t-91 37q-6 1 -10.5 2.5t-9.5 2.5v57h425l2 -57q-33 -8 -62 -25.5t-46 -37t-29.5 -38t-17.5 -30.5l-5 -12l-128 -825q-10 -52 14 -82t95 -36v-57h-500z" />
<glyph unicode="" d="M-75 200h75v800h-75l125 167l125 -167h-75v-800h75l-125 -167zM300 900v300h150h700h150v-300h-50q0 29 -8 48.5t-18.5 30t-33.5 15t-39.5 5.5t-50.5 1h-200v-850l100 -50v-100h-400v100l100 50v850h-200q-34 0 -50.5 -1t-40 -5.5t-33.5 -15t-18.5 -30t-8.5 -48.5h-49z " />
<glyph unicode="" d="M33 51l167 125v-75h800v75l167 -125l-167 -125v75h-800v-75zM100 901v300h150h700h150v-300h-50q0 29 -8 48.5t-18 30t-33.5 15t-40 5.5t-50.5 1h-200v-650l100 -50v-100h-400v100l100 50v650h-200q-34 0 -50.5 -1t-39.5 -5.5t-33.5 -15t-18.5 -30t-8 -48.5h-50z" />
<glyph unicode="" d="M0 50q0 -20 14.5 -35t35.5 -15h1100q21 0 35.5 15t14.5 35v100q0 21 -14.5 35.5t-35.5 14.5h-1100q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM0 350q0 -20 14.5 -35t35.5 -15h800q21 0 35.5 15t14.5 35v100q0 21 -14.5 35.5t-35.5 14.5h-800q-21 0 -35.5 -14.5t-14.5 -35.5 v-100zM0 650q0 -20 14.5 -35t35.5 -15h1000q21 0 35.5 15t14.5 35v100q0 21 -14.5 35.5t-35.5 14.5h-1000q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM0 950q0 -20 14.5 -35t35.5 -15h600q21 0 35.5 15t14.5 35v100q0 21 -14.5 35.5t-35.5 14.5h-600q-21 0 -35.5 -14.5 t-14.5 -35.5v-100z" />
<glyph unicode="" d="M0 50q0 -20 14.5 -35t35.5 -15h1100q21 0 35.5 15t14.5 35v100q0 21 -14.5 35.5t-35.5 14.5h-1100q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM0 650q0 -20 14.5 -35t35.5 -15h1100q21 0 35.5 15t14.5 35v100q0 21 -14.5 35.5t-35.5 14.5h-1100q-21 0 -35.5 -14.5t-14.5 -35.5 v-100zM200 350q0 -20 14.5 -35t35.5 -15h700q21 0 35.5 15t14.5 35v100q0 21 -14.5 35.5t-35.5 14.5h-700q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM200 950q0 -20 14.5 -35t35.5 -15h700q21 0 35.5 15t14.5 35v100q0 21 -14.5 35.5t-35.5 14.5h-700q-21 0 -35.5 -14.5 t-14.5 -35.5v-100z" />
<glyph unicode="" d="M0 50v100q0 21 14.5 35.5t35.5 14.5h1100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-1100q-21 0 -35.5 15t-14.5 35zM100 650v100q0 21 14.5 35.5t35.5 14.5h1000q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-1000q-21 0 -35.5 15 t-14.5 35zM300 350v100q0 21 14.5 35.5t35.5 14.5h800q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-800q-21 0 -35.5 15t-14.5 35zM500 950v100q0 21 14.5 35.5t35.5 14.5h600q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-600 q-21 0 -35.5 15t-14.5 35z" />
<glyph unicode="" d="M0 50v100q0 21 14.5 35.5t35.5 14.5h1100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-1100q-21 0 -35.5 15t-14.5 35zM0 350v100q0 21 14.5 35.5t35.5 14.5h1100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-1100q-21 0 -35.5 15 t-14.5 35zM0 650v100q0 21 14.5 35.5t35.5 14.5h1100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-1100q-21 0 -35.5 15t-14.5 35zM0 950v100q0 21 14.5 35.5t35.5 14.5h1100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-1100 q-21 0 -35.5 15t-14.5 35z" />
<glyph unicode="" d="M0 50v100q0 21 14.5 35.5t35.5 14.5h100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-100q-21 0 -35.5 15t-14.5 35zM0 350v100q0 21 14.5 35.5t35.5 14.5h100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-100q-21 0 -35.5 15 t-14.5 35zM0 650v100q0 21 14.5 35.5t35.5 14.5h100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-100q-21 0 -35.5 15t-14.5 35zM0 950v100q0 21 14.5 35.5t35.5 14.5h100q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-100q-21 0 -35.5 15 t-14.5 35zM300 50v100q0 21 14.5 35.5t35.5 14.5h800q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-800q-21 0 -35.5 15t-14.5 35zM300 350v100q0 21 14.5 35.5t35.5 14.5h800q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-800 q-21 0 -35.5 15t-14.5 35zM300 650v100q0 21 14.5 35.5t35.5 14.5h800q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15h-800q-21 0 -35.5 15t-14.5 35zM300 950v100q0 21 14.5 35.5t35.5 14.5h800q21 0 35.5 -14.5t14.5 -35.5v-100q0 -20 -14.5 -35t-35.5 -15 h-800q-21 0 -35.5 15t-14.5 35z" />
<glyph unicode="" d="M-101 500v100h201v75l166 -125l-166 -125v75h-201zM300 0h100v1100h-100v-1100zM500 50q0 -20 14.5 -35t35.5 -15h600q20 0 35 15t15 35v100q0 21 -15 35.5t-35 14.5h-600q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM500 350q0 -20 14.5 -35t35.5 -15h300q20 0 35 15t15 35 v100q0 21 -15 35.5t-35 14.5h-300q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM500 650q0 -20 14.5 -35t35.5 -15h500q20 0 35 15t15 35v100q0 21 -15 35.5t-35 14.5h-500q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM500 950q0 -20 14.5 -35t35.5 -15h100q20 0 35 15t15 35v100 q0 21 -15 35.5t-35 14.5h-100q-21 0 -35.5 -14.5t-14.5 -35.5v-100z" />
<glyph unicode="" d="M1 50q0 -20 14.5 -35t35.5 -15h600q20 0 35 15t15 35v100q0 21 -15 35.5t-35 14.5h-600q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM1 350q0 -20 14.5 -35t35.5 -15h300q20 0 35 15t15 35v100q0 21 -15 35.5t-35 14.5h-300q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM1 650 q0 -20 14.5 -35t35.5 -15h500q20 0 35 15t15 35v100q0 21 -15 35.5t-35 14.5h-500q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM1 950q0 -20 14.5 -35t35.5 -15h100q20 0 35 15t15 35v100q0 21 -15 35.5t-35 14.5h-100q-21 0 -35.5 -14.5t-14.5 -35.5v-100zM801 0v1100h100v-1100 h-100zM934 550l167 -125v75h200v100h-200v75z" />
<glyph unicode="" d="M0 275v650q0 31 22 53t53 22h750q31 0 53 -22t22 -53v-650q0 -31 -22 -53t-53 -22h-750q-31 0 -53 22t-22 53zM900 600l300 300v-600z" />
<glyph unicode="" d="M0 44v1012q0 18 13 31t31 13h1112q19 0 31.5 -13t12.5 -31v-1012q0 -18 -12.5 -31t-31.5 -13h-1112q-18 0 -31 13t-13 31zM100 263l247 182l298 -131l-74 156l293 318l236 -288v500h-1000v-737zM208 750q0 56 39 95t95 39t95 -39t39 -95t-39 -95t-95 -39t-95 39t-39 95z " />
<glyph unicode="" d="M148 745q0 124 60.5 231.5t165 172t226.5 64.5q123 0 227 -63t164.5 -169.5t60.5 -229.5t-73 -272q-73 -114 -166.5 -237t-150.5 -189l-57 -66q-10 9 -27 26t-66.5 70.5t-96 109t-104 135.5t-100.5 155q-63 139 -63 262zM342 772q0 -107 75.5 -182.5t181.5 -75.5 q107 0 182.5 75.5t75.5 182.5t-75.5 182t-182.5 75t-182 -75.5t-75 -181.5z" />
<glyph unicode="" d="M1 600q0 122 47.5 233t127.5 191t191 127.5t233 47.5t233 -47.5t191 -127.5t127.5 -191t47.5 -233t-47.5 -233t-127.5 -191t-191 -127.5t-233 -47.5t-233 47.5t-191 127.5t-127.5 191t-47.5 233zM173 600q0 -177 125.5 -302t301.5 -125v854q-176 0 -301.5 -125 t-125.5 -302z" />
<glyph unicode="" d="M117 406q0 94 34 186t88.5 172.5t112 159t115 177t87.5 194.5q21 -71 57.5 -142.5t76 -130.5t83 -118.5t82 -117t70 -116t50 -125.5t18.5 -136q0 -89 -39 -165.5t-102 -126.5t-140 -79.5t-156 -33.5q-114 6 -211.5 53t-161.5 139t-64 210zM243 414q14 -82 59.5 -136 t136.5 -80l16 98q-7 6 -18 17t-34 48t-33 77q-15 73 -14 143.5t10 122.5l9 51q-92 -110 -119.5 -185t-12.5 -156z" />
<glyph unicode="" d="M0 400v300q0 165 117.5 282.5t282.5 117.5q366 -6 397 -14l-186 -186h-311q-41 0 -70.5 -29.5t-29.5 -70.5v-500q0 -41 29.5 -70.5t70.5 -29.5h500q41 0 70.5 29.5t29.5 70.5v125l200 200v-225q0 -165 -117.5 -282.5t-282.5 -117.5h-300q-165 0 -282.5 117.5 t-117.5 282.5zM436 341l161 50l412 412l-114 113l-405 -405zM995 1015l113 -113l113 113l-21 85l-92 28z" />
<glyph unicode="" d="M0 400v300q0 165 117.5 282.5t282.5 117.5h261l2 -80q-133 -32 -218 -120h-145q-41 0 -70.5 -29.5t-29.5 -70.5v-500q0 -41 29.5 -70.5t70.5 -29.5h500q41 0 70.5 29.5t29.5 70.5l200 153v-53q0 -165 -117.5 -282.5t-282.5 -117.5h-300q-165 0 -282.5 117.5t-117.5 282.5 zM423 524q30 38 81.5 64t103 35.5t99 14t77.5 3.5l29 -1v-209l360 324l-359 318v-216q-7 0 -19 -1t-48 -8t-69.5 -18.5t-76.5 -37t-76.5 -59t-62 -88t-39.5 -121.5z" />
<glyph unicode="" d="M0 400v300q0 165 117.5 282.5t282.5 117.5h300q61 0 127 -23l-178 -177h-349q-41 0 -70.5 -29.5t-29.5 -70.5v-500q0 -41 29.5 -70.5t70.5 -29.5h500q41 0 70.5 29.5t29.5 70.5v69l200 200v-169q0 -165 -117.5 -282.5t-282.5 -117.5h-300q-165 0 -282.5 117.5 t-117.5 282.5zM342 632l283 -284l567 567l-137 137l-430 -431l-146 147z" />
<glyph unicode="" d="M0 603l300 296v-198h200v200h-200l300 300l295 -300h-195v-200h200v198l300 -296l-300 -300v198h-200v-200h195l-295 -300l-300 300h200v200h-200v-198z" />
<glyph unicode="" d="M200 50v1000q0 21 14.5 35.5t35.5 14.5h100q21 0 35.5 -14.5t14.5 -35.5v-437l500 487v-1100l-500 488v-438q0 -21 -14.5 -35.5t-35.5 -14.5h-100q-21 0 -35.5 14.5t-14.5 35.5z" />
<glyph unicode="" d="M0 50v1000q0 21 14.5 35.5t35.5 14.5h100q21 0 35.5 -14.5t14.5 -35.5v-437l500 487v-487l500 487v-1100l-500 488v-488l-500 488v-438q0 -21 -14.5 -35.5t-35.5 -14.5h-100q-21 0 -35.5 14.5t-14.5 35.5z" />
<glyph unicode="" d="M136 550l564 550v-487l500 487v-1100l-500 488v-488z" />
<glyph unicode="" d="M200 0l900 550l-900 550v-1100z" />
<glyph unicode="" d="M200 150q0 -21 14.5 -35.5t35.5 -14.5h200q21 0 35.5 14.5t14.5 35.5v800q0 21 -14.5 35.5t-35.5 14.5h-200q-21 0 -35.5 -14.5t-14.5 -35.5v-800zM600 150q0 -21 14.5 -35.5t35.5 -14.5h200q21 0 35.5 14.5t14.5 35.5v800q0 21 -14.5 35.5t-35.5 14.5h-200 q-21 0 -35.5 -14.5t-14.5 -35.5v-800z" />
<glyph unicode="" d="M200 150q0 -20 14.5 -35t35.5 -15h800q21 0 35.5 15t14.5 35v800q0 21 -14.5 35.5t-35.5 14.5h-800q-21 0 -35.5 -14.5t-14.5 -35.5v-800z" />
<glyph unicode="" d="M0 0v1100l500 -487v487l564 -550l-564 -550v488z" />
<glyph unicode="" d="M0 0v1100l500 -487v487l500 -487v437q0 21 14.5 35.5t35.5 14.5h100q21 0 35.5 -14.5t14.5 -35.5v-1000q0 -21 -14.5 -35.5t-35.5 -14.5h-100q-21 0 -35.5 14.5t-14.5 35.5v438l-500 -488v488z" />
<glyph unicode="" d="M300 0v1100l500 -487v437q0 21 14.5 35.5t35.5 14.5h100q21 0 35.5 -14.5t14.5 -35.5v-1000q0 -21 -14.5 -35.5t-35.5 -14.5h-100q-21 0 -35.5 14.5t-14.5 35.5v438z" />
<glyph unicode="" d="M100 250v100q0 21 14.5 35.5t35.5 14.5h1000q21 0 35.5 -14.5t14.5 -35.5v-100q0 -21 -14.5 -35.5t-35.5 -14.5h-1000q-21 0 -35.5 14.5t-14.5 35.5zM100 500h1100l-550 564z" />
<glyph unicode="" d="M185 599l592 -592l240 240l-353 353l353 353l-240 240z" />
<glyph unicode="" d="M272 194l353 353l-353 353l241 240l572 -571l21 -22l-1 -1v-1l-592 -591z" />
<glyph unicode="" d="M3 600q0 162 80 299.5t217.5 217.5t299.5 80t299.5 -80t217.5 -217.5t80 -299.5t-80 -299.5t-217.5 -217.5t-299.5 -80t-299.5 80t-217.5 217.5t-80 299.5zM300 500h200v-200h200v200h200v200h-200v200h-200v-200h-200v-200z" />
<glyph unicode="" d="M3 600q0 162 80 299.5t217.5 217.5t299.5 80t299.5 -80t217.5 -217.5t80 -299.5t-80 -299.5t-217.5 -217.5t-299.5 -80t-299.5 80t-217.5 217.5t-80 299.5zM300 500h600v200h-600v-200z" />
<glyph unicode="" d="M3 600q0 162 80 299.5t217.5 217.5t299.5 80t299.5 -80t217.5 -217.5t80 -299.5t-80 -299.5t-217.5 -217.5t-299.5 -80t-299.5 80t-217.5 217.5t-80 299.5zM246 459l213 -213l141 142l141 -142l213 213l-142 141l142 141l-213 212l-141 -141l-141 142l-212 -213l141 -141 z" />
<glyph unicode="" d="M3 600q0 162 80 299.5t217.5 217.5t299.5 80t299.5 -80t217.5 -217.5t80 -299.5t-80 -299.5t-217.5 -217.5t-299.5 -80t-299.5 80t-217.5 217.5t-80 299.5zM270 551l276 -277l411 411l-175 174l-236 -236l-102 102z" />
<glyph unicode="" d="M3 600q0 162 80 299.5t217.5 217.5t299.5 80t299.5 -80t217.5 -217.5t80 -299.5t-80 -299.5t-217.5 -217.5t-299.5 -80t-299.5 80t-217.5 217.5t-80 299.5zM364 700h143q4 0 11.5 -1t11 -1t6.5 3t3 9t1 11t3.5 8.5t3.5 6t5.5 4t6.5 2.5t9 1.5t9 0.5h11.5h12.5 q19 0 30 -10t11 -26q0 -22 -4 -28t-27 -22q-5 -1 -12.5 -3t-27 -13.5t-34 -27t-26.5 -46t-11 -68.5h200q5 3 14 8t31.5 25.5t39.5 45.5t31 69t14 94q0 51 -17.5 89t-42 58t-58.5 32t-58.5 15t-51.5 3q-50 0 -90.5 -12t-75 -38.5t-53.5 -74.5t-19 -114zM500 300h200v100h-200 v-100z" />
<glyph unicode="" d="M3 600q0 162 80 299.5t217.5 217.5t299.5 80t299.5 -80t217.5 -217.5t80 -299.5t-80 -299.5t-217.5 -217.5t-299.5 -80t-299.5 80t-217.5 217.5t-80 299.5zM400 300h400v100h-100v300h-300v-100h100v-200h-100v-100zM500 800h200v100h-200v-100z" />
<glyph unicode="" d="M0 500v200h195q31 125 98.5 199.5t206.5 100.5v200h200v-200q54 -20 113 -60t112.5 -105.5t71.5 -134.5h203v-200h-203q-25 -102 -116.5 -186t-180.5 -117v-197h-200v197q-140 27 -208 102.5t-98 200.5h-194zM290 500q24 -73 79.5 -127.5t130.5 -78.5v206h200v-206 q149 48 201 206h-201v200h200q-25 74 -75.5 127t-124.5 77v-204h-200v203q-75 -23 -130 -77t-79 -126h209v-200h-210z" />
<glyph unicode="" d="M4 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM186 600q0 -171 121.5 -292.5t292.5 -121.5t292.5 121.5t121.5 292.5t-121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM356 465l135 135 l-135 135l109 109l135 -135l135 135l109 -109l-135 -135l135 -135l-109 -109l-135 135l-135 -135z" />
<glyph unicode="" d="M4 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM186 600q0 -171 121.5 -292.5t292.5 -121.5t292.5 121.5t121.5 292.5t-121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM322 537l141 141 l87 -87l204 205l142 -142l-346 -345z" />
<glyph unicode="" d="M4 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM186 600q0 -115 62 -215l568 567q-100 62 -216 62q-171 0 -292.5 -121.5t-121.5 -292.5zM391 245q97 -59 209 -59q171 0 292.5 121.5t121.5 292.5 q0 112 -59 209z" />
<glyph unicode="" d="M0 547l600 453v-300h600v-300h-600v-301z" />
<glyph unicode="" d="M0 400v300h600v300l600 -453l-600 -448v301h-600z" />
<glyph unicode="" d="M204 600l450 600l444 -600h-298v-600h-300v600h-296z" />
<glyph unicode="" d="M104 600h296v600h300v-600h298l-449 -600z" />
<glyph unicode="" d="M0 200q6 132 41 238.5t103.5 193t184 138t271.5 59.5v271l600 -453l-600 -448v301q-95 -2 -183 -20t-170 -52t-147 -92.5t-100 -135.5z" />
<glyph unicode="" d="M0 0v400l129 -129l294 294l142 -142l-294 -294l129 -129h-400zM635 777l142 -142l294 294l129 -129v400h-400l129 -129z" />
<glyph unicode="" d="M34 176l295 295l-129 129h400v-400l-129 130l-295 -295zM600 600v400l129 -129l295 295l142 -141l-295 -295l129 -130h-400z" />
<glyph unicode="" d="M23 600q0 118 45.5 224.5t123 184t184 123t224.5 45.5t224.5 -45.5t184 -123t123 -184t45.5 -224.5t-45.5 -224.5t-123 -184t-184 -123t-224.5 -45.5t-224.5 45.5t-184 123t-123 184t-45.5 224.5zM456 851l58 -302q4 -20 21.5 -34.5t37.5 -14.5h54q20 0 37.5 14.5 t21.5 34.5l58 302q4 20 -8 34.5t-32 14.5h-207q-21 0 -33 -14.5t-8 -34.5zM500 300h200v100h-200v-100z" />
<glyph unicode="" d="M0 800h100v-200h400v300h200v-300h400v200h100v100h-111q1 1 1 6.5t-1.5 15t-3.5 17.5l-34 172q-11 39 -41.5 63t-69.5 24q-32 0 -61 -17l-239 -144q-22 -13 -40 -35q-19 24 -40 36l-238 144q-33 18 -62 18q-39 0 -69.5 -23t-40.5 -61l-35 -177q-2 -8 -3 -18t-1 -15v-6 h-111v-100zM100 0h400v400h-400v-400zM200 900q-3 0 14 48t36 96l18 47l213 -191h-281zM700 0v400h400v-400h-400zM731 900l202 197q5 -12 12 -32.5t23 -64t25 -72t7 -28.5h-269z" />
<glyph unicode="" d="M0 -22v143l216 193q-9 53 -13 83t-5.5 94t9 113t38.5 114t74 124q47 60 99.5 102.5t103 68t127.5 48t145.5 37.5t184.5 43.5t220 58.5q0 -189 -22 -343t-59 -258t-89 -181.5t-108.5 -120t-122 -68t-125.5 -30t-121.5 -1.5t-107.5 12.5t-87.5 17t-56.5 7.5l-99 -55z M238.5 300.5q19.5 -6.5 86.5 76.5q55 66 367 234q70 38 118.5 69.5t102 79t99 111.5t86.5 148q22 50 24 60t-6 19q-7 5 -17 5t-26.5 -14.5t-33.5 -39.5q-35 -51 -113.5 -108.5t-139.5 -89.5l-61 -32q-369 -197 -458 -401q-48 -111 -28.5 -117.5z" />
<glyph unicode="" d="M111 408q0 -33 5 -63q9 -56 44 -119.5t105 -108.5q31 -21 64 -16t62 23.5t57 49.5t48 61.5t35 60.5q32 66 39 184.5t-13 157.5q79 -80 122 -164t26 -184q-5 -33 -20.5 -69.5t-37.5 -80.5q-10 -19 -14.5 -29t-12 -26t-9 -23.5t-3 -19t2.5 -15.5t11 -9.5t19.5 -5t30.5 2.5 t42 8q57 20 91 34t87.5 44.5t87 64t65.5 88.5t47 122q38 172 -44.5 341.5t-246.5 278.5q22 -44 43 -129q39 -159 -32 -154q-15 2 -33 9q-79 33 -120.5 100t-44 175.5t48.5 257.5q-13 -8 -34 -23.5t-72.5 -66.5t-88.5 -105.5t-60 -138t-8 -166.5q2 -12 8 -41.5t8 -43t6 -39.5 t3.5 -39.5t-1 -33.5t-6 -31.5t-13.5 -24t-21 -20.5t-31 -12q-38 -10 -67 13t-40.5 61.5t-15 81.5t10.5 75q-52 -46 -83.5 -101t-39 -107t-7.5 -85z" />
<glyph unicode="" d="M-61 600l26 40q6 10 20 30t49 63.5t74.5 85.5t97 90t116.5 83.5t132.5 59t145.5 23.5t145.5 -23.5t132.5 -59t116.5 -83.5t97 -90t74.5 -85.5t49 -63.5t20 -30l26 -40l-26 -40q-6 -10 -20 -30t-49 -63.5t-74.5 -85.5t-97 -90t-116.5 -83.5t-132.5 -59t-145.5 -23.5 t-145.5 23.5t-132.5 59t-116.5 83.5t-97 90t-74.5 85.5t-49 63.5t-20 30zM120 600q7 -10 40.5 -58t56 -78.5t68 -77.5t87.5 -75t103 -49.5t125 -21.5t123.5 20t100.5 45.5t85.5 71.5t66.5 75.5t58 81.5t47 66q-1 1 -28.5 37.5t-42 55t-43.5 53t-57.5 63.5t-58.5 54 q49 -74 49 -163q0 -124 -88 -212t-212 -88t-212 88t-88 212q0 85 46 158q-102 -87 -226 -258zM377 656q49 -124 154 -191l105 105q-37 24 -75 72t-57 84l-20 36z" />
<glyph unicode="" d="M-61 600l26 40q6 10 20 30t49 63.5t74.5 85.5t97 90t116.5 83.5t132.5 59t145.5 23.5q61 0 121 -17l37 142h148l-314 -1200h-148l37 143q-82 21 -165 71.5t-140 102t-109.5 112t-72 88.5t-29.5 43zM120 600q210 -282 393 -336l37 141q-107 18 -178.5 101.5t-71.5 193.5 q0 85 46 158q-102 -87 -226 -258zM377 656q49 -124 154 -191l47 47l23 87q-30 28 -59 69t-44 68l-14 26zM780 161l38 145q22 15 44.5 34t46 44t40.5 44t41 50.5t33.5 43.5t33 44t24.5 34q-97 127 -140 175l39 146q67 -54 131.5 -125.5t87.5 -103.5t36 -52l26 -40l-26 -40 q-7 -12 -25.5 -38t-63.5 -79.5t-95.5 -102.5t-124 -100t-146.5 -79z" />
<glyph unicode="" d="M-97.5 34q13.5 -34 50.5 -34h1294q37 0 50.5 35.5t-7.5 67.5l-642 1056q-20 34 -48 36.5t-48 -29.5l-642 -1066q-21 -32 -7.5 -66zM155 200l445 723l445 -723h-345v100h-200v-100h-345zM500 600l100 -300l100 300v100h-200v-100z" />
<glyph unicode="" d="M100 262v41q0 20 11 44.5t26 38.5l363 325v339q0 62 44 106t106 44t106 -44t44 -106v-339l363 -325q15 -14 26 -38.5t11 -44.5v-41q0 -20 -12 -26.5t-29 5.5l-359 249v-263q100 -91 100 -113v-64q0 -20 -13 -28.5t-32 0.5l-94 78h-222l-94 -78q-19 -9 -32 -0.5t-13 28.5 v64q0 22 100 113v263l-359 -249q-17 -12 -29 -5.5t-12 26.5z" />
<glyph unicode="" d="M0 50q0 -20 14.5 -35t35.5 -15h1000q21 0 35.5 15t14.5 35v750h-1100v-750zM0 900h1100v150q0 21 -14.5 35.5t-35.5 14.5h-150v100h-100v-100h-500v100h-100v-100h-150q-21 0 -35.5 -14.5t-14.5 -35.5v-150zM100 100v100h100v-100h-100zM100 300v100h100v-100h-100z M100 500v100h100v-100h-100zM300 100v100h100v-100h-100zM300 300v100h100v-100h-100zM300 500v100h100v-100h-100zM500 100v100h100v-100h-100zM500 300v100h100v-100h-100zM500 500v100h100v-100h-100zM700 100v100h100v-100h-100zM700 300v100h100v-100h-100zM700 500 v100h100v-100h-100zM900 100v100h100v-100h-100zM900 300v100h100v-100h-100zM900 500v100h100v-100h-100z" />
<glyph unicode="" d="M0 200v200h259l600 600h241v198l300 -295l-300 -300v197h-159l-600 -600h-341zM0 800h259l122 -122l141 142l-181 180h-341v-200zM678 381l141 142l122 -123h159v198l300 -295l-300 -300v197h-241z" />
<glyph unicode="" d="M0 400v600q0 41 29.5 70.5t70.5 29.5h1000q41 0 70.5 -29.5t29.5 -70.5v-600q0 -41 -29.5 -70.5t-70.5 -29.5h-596l-304 -300v300h-100q-41 0 -70.5 29.5t-29.5 70.5z" />
<glyph unicode="" d="M100 600v200h300v-250q0 -113 6 -145q17 -92 102 -117q39 -11 92 -11q37 0 66.5 5.5t50 15.5t36 24t24 31.5t14 37.5t7 42t2.5 45t0 47v25v250h300v-200q0 -42 -3 -83t-15 -104t-31.5 -116t-58 -109.5t-89 -96.5t-129 -65.5t-174.5 -25.5t-174.5 25.5t-129 65.5t-89 96.5 t-58 109.5t-31.5 116t-15 104t-3 83zM100 900v300h300v-300h-300zM800 900v300h300v-300h-300z" />
<glyph unicode="" d="M-30 411l227 -227l352 353l353 -353l226 227l-578 579z" />
<glyph unicode="" d="M70 797l580 -579l578 579l-226 227l-353 -353l-352 353z" />
<glyph unicode="" d="M-198 700l299 283l300 -283h-203v-400h385l215 -200h-800v600h-196zM402 1000l215 -200h381v-400h-198l299 -283l299 283h-200v600h-796z" />
<glyph unicode="" d="M18 939q-5 24 10 42q14 19 39 19h896l38 162q5 17 18.5 27.5t30.5 10.5h94q20 0 35 -14.5t15 -35.5t-15 -35.5t-35 -14.5h-54l-201 -961q-2 -4 -6 -10.5t-19 -17.5t-33 -11h-31v-50q0 -20 -14.5 -35t-35.5 -15t-35.5 15t-14.5 35v50h-300v-50q0 -20 -14.5 -35t-35.5 -15 t-35.5 15t-14.5 35v50h-50q-21 0 -35.5 15t-14.5 35q0 21 14.5 35.5t35.5 14.5h535l48 200h-633q-32 0 -54.5 21t-27.5 43z" />
<glyph unicode="" d="M0 0v800h1200v-800h-1200zM0 900v100h200q0 41 29.5 70.5t70.5 29.5h300q41 0 70.5 -29.5t29.5 -70.5h500v-100h-1200z" />
<glyph unicode="" d="M1 0l300 700h1200l-300 -700h-1200zM1 400v600h200q0 41 29.5 70.5t70.5 29.5h300q41 0 70.5 -29.5t29.5 -70.5h500v-200h-1000z" />
<glyph unicode="" d="M302 300h198v600h-198l298 300l298 -300h-198v-600h198l-298 -300z" />
<glyph unicode="" d="M0 600l300 298v-198h600v198l300 -298l-300 -297v197h-600v-197z" />
<glyph unicode="" d="M0 100v100q0 41 29.5 70.5t70.5 29.5h1000q41 0 70.5 -29.5t29.5 -70.5v-100q0 -41 -29.5 -70.5t-70.5 -29.5h-1000q-41 0 -70.5 29.5t-29.5 70.5zM31 400l172 739q5 22 23 41.5t38 19.5h672q19 0 37.5 -22.5t23.5 -45.5l172 -732h-1138zM800 100h100v100h-100v-100z M1000 100h100v100h-100v-100z" />
<glyph unicode="" d="M-101 600v50q0 24 25 49t50 38l25 13v-250l-11 5.5t-24 14t-30 21.5t-24 27.5t-11 31.5zM100 500v250v8v8v7t0.5 7t1.5 5.5t2 5t3 4t4.5 3.5t6 1.5t7.5 0.5h200l675 250v-850l-675 200h-38l47 -276q2 -12 -3 -17.5t-11 -6t-21 -0.5h-8h-83q-20 0 -34.5 14t-18.5 35 q-55 337 -55 351zM1100 200v850q0 21 14.5 35.5t35.5 14.5q20 0 35 -14.5t15 -35.5v-850q0 -20 -15 -35t-35 -15q-21 0 -35.5 15t-14.5 35z" />
<glyph unicode="" d="M74 350q0 21 13.5 35.5t33.5 14.5h18l117 173l63 327q15 77 76 140t144 83l-18 32q-6 19 3 32t29 13h94q20 0 29 -10.5t3 -29.5q-18 -36 -18 -37q83 -19 144 -82.5t76 -140.5l63 -327l118 -173h17q20 0 33.5 -14.5t13.5 -35.5q0 -20 -13 -40t-31 -27q-8 -3 -23 -8.5 t-65 -20t-103 -25t-132.5 -19.5t-158.5 -9q-125 0 -245.5 20.5t-178.5 40.5l-58 20q-18 7 -31 27.5t-13 40.5zM497 110q12 -49 40 -79.5t63 -30.5t63 30.5t39 79.5q-48 -6 -102 -6t-103 6z" />
<glyph unicode="" d="M21 445l233 -45l-78 -224l224 78l45 -233l155 179l155 -179l45 233l224 -78l-78 224l234 45l-180 155l180 156l-234 44l78 225l-224 -78l-45 233l-155 -180l-155 180l-45 -233l-224 78l78 -225l-233 -44l179 -156z" />
<glyph unicode="" d="M0 200h200v600h-200v-600zM300 275q0 -75 100 -75h61q124 -100 139 -100h250q46 0 83 57l238 344q29 31 29 74v100q0 44 -30.5 84.5t-69.5 40.5h-328q28 118 28 125v150q0 44 -30.5 84.5t-69.5 40.5h-50q-27 0 -51 -20t-38 -48l-96 -198l-145 -196q-20 -26 -20 -63v-400z M400 300v375l150 213l100 212h50v-175l-50 -225h450v-125l-250 -375h-214l-136 100h-100z" />
<glyph unicode="" d="M0 400v600h200v-600h-200zM300 525v400q0 75 100 75h61q124 100 139 100h250q46 0 83 -57l238 -344q29 -31 29 -74v-100q0 -44 -30.5 -84.5t-69.5 -40.5h-328q28 -118 28 -125v-150q0 -44 -30.5 -84.5t-69.5 -40.5h-50q-27 0 -51 20t-38 48l-96 198l-145 196 q-20 26 -20 63zM400 525l150 -212l100 -213h50v175l-50 225h450v125l-250 375h-214l-136 -100h-100v-375z" />
<glyph unicode="" d="M8 200v600h200v-600h-200zM308 275v525q0 17 14 35.5t28 28.5l14 9l362 230q14 6 25 6q17 0 29 -12l109 -112q14 -14 14 -34q0 -18 -11 -32l-85 -121h302q85 0 138.5 -38t53.5 -110t-54.5 -111t-138.5 -39h-107l-130 -339q-7 -22 -20.5 -41.5t-28.5 -19.5h-341 q-7 0 -90 81t-83 94zM408 289l100 -89h293l131 339q6 21 19.5 41t28.5 20h203q16 0 25 15t9 36q0 20 -9 34.5t-25 14.5h-457h-6.5h-7.5t-6.5 0.5t-6 1t-5 1.5t-5.5 2.5t-4 4t-4 5.5q-5 12 -5 20q0 14 10 27l147 183l-86 83l-339 -236v-503z" />
<glyph unicode="" d="M-101 651q0 72 54 110t139 38l302 -1l-85 121q-11 16 -11 32q0 21 14 34l109 113q13 12 29 12q11 0 25 -6l365 -230q7 -4 17 -10.5t26.5 -26t16.5 -36.5v-526q0 -13 -86 -93.5t-94 -80.5h-341q-16 0 -29.5 20t-19.5 41l-130 339h-107q-84 0 -139 39t-55 111zM-1 601h222 q15 0 28.5 -20.5t19.5 -40.5l131 -339h293l107 89v502l-343 237l-87 -83l145 -184q10 -11 10 -26q0 -11 -5 -20q-1 -3 -3.5 -5.5l-4 -4t-5 -2.5t-5.5 -1.5t-6.5 -1t-6.5 -0.5h-7.5h-6.5h-476v-100zM1000 201v600h200v-600h-200z" />
<glyph unicode="" d="M97 719l230 -363q4 -6 10.5 -15.5t26 -25t36.5 -15.5h525q13 0 94 83t81 90v342q0 15 -20 28.5t-41 19.5l-339 131v106q0 84 -39 139t-111 55t-110 -53.5t-38 -138.5v-302l-121 84q-15 12 -33.5 11.5t-32.5 -13.5l-112 -110q-22 -22 -6 -53zM172 739l83 86l183 -146 q22 -18 47 -5q3 1 5.5 3.5l4 4t2.5 5t1.5 5.5t1 6.5t0.5 6.5v7.5v6.5v456q0 22 25 31t50 -0.5t25 -30.5v-202q0 -16 20 -29.5t41 -19.5l339 -130v-294l-89 -100h-503zM400 0v200h600v-200h-600z" />
<glyph unicode="" d="M2 585q-16 -31 6 -53l112 -110q13 -13 32 -13.5t34 10.5l121 85q0 -51 -0.5 -153.5t-0.5 -148.5q0 -84 38.5 -138t110.5 -54t111 55t39 139v106l339 131q20 6 40.5 19.5t20.5 28.5v342q0 7 -81 90t-94 83h-525q-17 0 -35.5 -14t-28.5 -28l-10 -15zM77 565l236 339h503 l89 -100v-294l-340 -130q-20 -6 -40 -20t-20 -29v-202q0 -22 -25 -31t-50 0t-25 31v456v14.5t-1.5 11.5t-5 12t-9.5 7q-24 13 -46 -5l-184 -146zM305 1104v200h600v-200h-600z" />
<glyph unicode="" d="M5 597q0 122 47.5 232.5t127.5 190.5t190.5 127.5t232.5 47.5q162 0 299.5 -80t217.5 -218t80 -300t-80 -299.5t-217.5 -217.5t-299.5 -80t-300 80t-218 217.5t-80 299.5zM298 701l2 -201h300l-2 -194l402 294l-402 298v-197h-300z" />
<glyph unicode="" d="M0 597q0 122 47.5 232.5t127.5 190.5t190.5 127.5t231.5 47.5q122 0 232.5 -47.5t190.5 -127.5t127.5 -190.5t47.5 -232.5q0 -162 -80 -299.5t-218 -217.5t-300 -80t-299.5 80t-217.5 217.5t-80 299.5zM200 600l402 -294l-2 194h300l2 201h-300v197z" />
<glyph unicode="" d="M5 597q0 122 47.5 232.5t127.5 190.5t190.5 127.5t232.5 47.5q162 0 299.5 -80t217.5 -218t80 -300t-80 -299.5t-217.5 -217.5t-299.5 -80t-300 80t-218 217.5t-80 299.5zM300 600h200v-300h200v300h200l-300 400z" />
<glyph unicode="" d="M5 597q0 122 47.5 232.5t127.5 190.5t190.5 127.5t232.5 47.5q162 0 299.5 -80t217.5 -218t80 -300t-80 -299.5t-217.5 -217.5t-299.5 -80t-300 80t-218 217.5t-80 299.5zM300 600l300 -400l300 400h-200v300h-200v-300h-200z" />
<glyph unicode="" d="M5 597q0 122 47.5 232.5t127.5 190.5t190.5 127.5t232.5 47.5q121 0 231.5 -47.5t190.5 -127.5t127.5 -190.5t47.5 -232.5q0 -162 -80 -299.5t-217.5 -217.5t-299.5 -80t-300 80t-218 217.5t-80 299.5zM254 780q-8 -33 5.5 -92.5t7.5 -87.5q0 -9 17 -44t16 -60 q12 0 23 -5.5t23 -15t20 -13.5q24 -12 108 -42q22 -8 53 -31.5t59.5 -38.5t57.5 -11q8 -18 -15 -55t-20 -57q42 -71 87 -80q0 -6 -3 -15.5t-3.5 -14.5t4.5 -17q104 -3 221 112q30 29 47 47t34.5 49t20.5 62q-14 9 -37 9.5t-36 7.5q-14 7 -49 15t-52 19q-9 0 -39.5 -0.5 t-46.5 -1.5t-39 -6.5t-39 -16.5q-50 -35 -66 -12q-4 2 -3.5 25.5t0.5 25.5q-6 13 -26.5 17t-24.5 7q2 22 -2 41t-16.5 28t-38.5 -20q-23 -25 -42 4q-19 28 -8 58q6 16 22 22q6 -1 26 -1.5t33.5 -4t19.5 -13.5q12 -19 32 -37.5t34 -27.5l14 -8q0 3 9.5 39.5t5.5 57.5 q-4 23 14.5 44.5t22.5 31.5q5 14 10 35t8.5 31t15.5 22.5t34 21.5q-6 18 10 37q8 0 23.5 -1.5t24.5 -1.5t20.5 4.5t20.5 15.5q-10 23 -30.5 42.5t-38 30t-49 26.5t-43.5 23q11 39 2 44q31 -13 58 -14.5t39 3.5l11 4q7 36 -16.5 53.5t-64.5 28.5t-56 23q-19 -3 -37 0 q-15 -12 -36.5 -21t-34.5 -12t-44 -8t-39 -6q-15 -3 -45.5 0.5t-45.5 -2.5q-21 -7 -52 -26.5t-34 -34.5q-3 -11 6.5 -22.5t8.5 -18.5q-3 -34 -27.5 -90.5t-29.5 -79.5zM518 916q3 12 16 30t16 25q10 -10 18.5 -10t14 6t14.5 14.5t16 12.5q0 -24 17 -66.5t17 -43.5 q-9 2 -31 5t-36 5t-32 8t-30 14zM692 1003h1h-1z" />
<glyph unicode="" d="M0 164.5q0 21.5 15 37.5l600 599q-33 101 6 201.5t135 154.5q164 92 306 -9l-259 -138l145 -232l251 126q13 -175 -151 -267q-123 -70 -253 -23l-596 -596q-15 -16 -36.5 -16t-36.5 16l-111 110q-15 15 -15 36.5z" />
<glyph unicode="" horiz-adv-x="1220" d="M0 196v100q0 41 29.5 70.5t70.5 29.5h1000q41 0 70.5 -29.5t29.5 -70.5v-100q0 -41 -29.5 -70.5t-70.5 -29.5h-1000q-41 0 -70.5 29.5t-29.5 70.5zM0 596v100q0 41 29.5 70.5t70.5 29.5h1000q41 0 70.5 -29.5t29.5 -70.5v-100q0 -41 -29.5 -70.5t-70.5 -29.5h-1000 q-41 0 -70.5 29.5t-29.5 70.5zM0 996v100q0 41 29.5 70.5t70.5 29.5h1000q41 0 70.5 -29.5t29.5 -70.5v-100q0 -41 -29.5 -70.5t-70.5 -29.5h-1000q-41 0 -70.5 29.5t-29.5 70.5zM600 596h500v100h-500v-100zM800 196h300v100h-300v-100zM900 996h200v100h-200v-100z" />
<glyph unicode="" d="M100 1100v100h1000v-100h-1000zM150 1000h900l-350 -500v-300l-200 -200v500z" />
<glyph unicode="" d="M0 200v200h1200v-200q0 -41 -29.5 -70.5t-70.5 -29.5h-1000q-41 0 -70.5 29.5t-29.5 70.5zM0 500v400q0 41 29.5 70.5t70.5 29.5h300v100q0 41 29.5 70.5t70.5 29.5h200q41 0 70.5 -29.5t29.5 -70.5v-100h300q41 0 70.5 -29.5t29.5 -70.5v-400h-500v100h-200v-100h-500z M500 1000h200v100h-200v-100z" />
<glyph unicode="" d="M0 0v400l129 -129l200 200l142 -142l-200 -200l129 -129h-400zM0 800l129 129l200 -200l142 142l-200 200l129 129h-400v-400zM729 329l142 142l200 -200l129 129v-400h-400l129 129zM729 871l200 200l-129 129h400v-400l-129 129l-200 -200z" />
<glyph unicode="" d="M0 596q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM182 596q0 -172 121.5 -293t292.5 -121t292.5 121t121.5 293q0 171 -121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM291 655 q0 23 15.5 38.5t38.5 15.5t39 -16t16 -38q0 -23 -16 -39t-39 -16q-22 0 -38 16t-16 39zM400 850q0 22 16 38.5t39 16.5q22 0 38 -16t16 -39t-16 -39t-38 -16q-23 0 -39 16.5t-16 38.5zM514 609q0 32 20.5 56.5t51.5 29.5l122 126l1 1q-9 14 -9 28q0 22 16 38.5t39 16.5 q22 0 38 -16t16 -39t-16 -39t-38 -16q-14 0 -29 10l-55 -145q17 -22 17 -51q0 -36 -25.5 -61.5t-61.5 -25.5t-61.5 25.5t-25.5 61.5zM800 655q0 22 16 38t39 16t38.5 -15.5t15.5 -38.5t-16 -39t-38 -16q-23 0 -39 16t-16 39z" />
<glyph unicode="" d="M-40 375q-13 -95 35 -173q35 -57 94 -89t129 -32q63 0 119 28q33 16 65 40.5t52.5 45.5t59.5 64q40 44 57 61l394 394q35 35 47 84t-3 96q-27 87 -117 104q-20 2 -29 2q-46 0 -78.5 -16.5t-67.5 -51.5l-389 -396l-7 -7l69 -67l377 373q20 22 39 38q23 23 50 23 q38 0 53 -36q16 -39 -20 -75l-547 -547q-52 -52 -125 -52q-55 0 -100 33t-54 96q-5 35 2.5 66t31.5 63t42 50t56 54q24 21 44 41l348 348q52 52 82.5 79.5t84 54t107.5 26.5q25 0 48 -4q95 -17 154 -94.5t51 -175.5q-7 -101 -98 -192l-252 -249l-253 -256l7 -7l69 -60 l517 511q67 67 95 157t11 183q-16 87 -67 154t-130 103q-69 33 -152 33q-107 0 -197 -55q-40 -24 -111 -95l-512 -512q-68 -68 -81 -163z" />
<glyph unicode="" d="M80 784q0 131 98.5 229.5t230.5 98.5q143 0 241 -129q103 129 246 129q129 0 226 -98.5t97 -229.5q0 -46 -17.5 -91t-61 -99t-77 -89.5t-104.5 -105.5q-197 -191 -293 -322l-17 -23l-16 23q-43 58 -100 122.5t-92 99.5t-101 100q-71 70 -104.5 105.5t-77 89.5t-61 99 t-17.5 91zM250 784q0 -27 30.5 -70t61.5 -75.5t95 -94.5l22 -22q93 -90 190 -201q82 92 195 203l12 12q64 62 97.5 97t64.5 79t31 72q0 71 -48 119.5t-105 48.5q-74 0 -132 -83l-118 -171l-114 174q-51 80 -123 80q-60 0 -109.5 -49.5t-49.5 -118.5z" />
<glyph unicode="" d="M57 353q0 -95 66 -159l141 -142q68 -66 159 -66q93 0 159 66l283 283q66 66 66 159t-66 159l-141 141q-8 9 -19 17l-105 -105l212 -212l-389 -389l-247 248l95 95l-18 18q-46 45 -75 101l-55 -55q-66 -66 -66 -159zM269 706q0 -93 66 -159l141 -141q7 -7 19 -17l105 105 l-212 212l389 389l247 -247l-95 -96l18 -17q47 -49 77 -100l29 29q35 35 62.5 88t27.5 96q0 93 -66 159l-141 141q-66 66 -159 66q-95 0 -159 -66l-283 -283q-66 -64 -66 -159z" />
<glyph unicode="" d="M200 100v953q0 21 30 46t81 48t129 38t163 15t162 -15t127 -38t79 -48t29 -46v-953q0 -41 -29.5 -70.5t-70.5 -29.5h-600q-41 0 -70.5 29.5t-29.5 70.5zM300 300h600v700h-600v-700zM496 150q0 -43 30.5 -73.5t73.5 -30.5t73.5 30.5t30.5 73.5t-30.5 73.5t-73.5 30.5 t-73.5 -30.5t-30.5 -73.5z" />
<glyph unicode="" d="M0 0l303 380l207 208l-210 212h300l267 279l-35 36q-15 14 -15 35t15 35q14 15 35 15t35 -15l283 -282q15 -15 15 -36t-15 -35q-14 -15 -35 -15t-35 15l-36 35l-279 -267v-300l-212 210l-208 -207z" />
<glyph unicode="" d="M295 433h139q5 -77 48.5 -126.5t117.5 -64.5v335q-6 1 -15.5 4t-11.5 3q-46 14 -79 26.5t-72 36t-62.5 52t-40 72.5t-16.5 99q0 92 44 159.5t109 101t144 40.5v78h100v-79q38 -4 72.5 -13.5t75.5 -31.5t71 -53.5t51.5 -84t24.5 -118.5h-159q-8 72 -35 109.5t-101 50.5 v-307l64 -14q34 -7 64 -16.5t70 -31.5t67.5 -52t47.5 -80.5t20 -112.5q0 -139 -89 -224t-244 -96v-77h-100v78q-152 17 -237 104q-40 40 -52.5 93.5t-15.5 139.5zM466 889q0 -29 8 -51t16.5 -34t29.5 -22.5t31 -13.5t38 -10q7 -2 11 -3v274q-61 -8 -97.5 -37.5t-36.5 -102.5 zM700 237q170 18 170 151q0 64 -44 99.5t-126 60.5v-311z" />
<glyph unicode="" d="M100 600v100h166q-24 49 -44 104q-10 26 -14.5 55.5t-3 72.5t25 90t68.5 87q97 88 263 88q129 0 230 -89t101 -208h-153q0 52 -34 89.5t-74 51.5t-76 14q-37 0 -79 -14.5t-62 -35.5q-41 -44 -41 -101q0 -28 16.5 -69.5t28 -62.5t41.5 -72h241v-100h-197q8 -50 -2.5 -115 t-31.5 -94q-41 -59 -99 -113q35 11 84 18t70 7q33 1 103 -16t103 -17q76 0 136 30l50 -147q-41 -25 -80.5 -36.5t-59 -13t-61.5 -1.5q-23 0 -128 33t-155 29q-39 -4 -82 -17t-66 -25l-24 -11l-55 145l16.5 11t15.5 10t13.5 9.5t14.5 12t14.5 14t17.5 18.5q48 55 54 126.5 t-30 142.5h-221z" />
<glyph unicode="" d="M2 300l298 -300l298 300h-198v900h-200v-900h-198zM602 900l298 300l298 -300h-198v-900h-200v900h-198z" />
<glyph unicode="" d="M2 300h198v900h200v-900h198l-298 -300zM700 0v200h100v-100h200v-100h-300zM700 400v100h300v-200h-99v-100h-100v100h99v100h-200zM700 700v500h300v-500h-100v100h-100v-100h-100zM801 900h100v200h-100v-200z" />
<glyph unicode="" d="M2 300h198v900h200v-900h198l-298 -300zM700 0v500h300v-500h-100v100h-100v-100h-100zM700 700v200h100v-100h200v-100h-300zM700 1100v100h300v-200h-99v-100h-100v100h99v100h-200zM801 200h100v200h-100v-200z" />
<glyph unicode="" d="M2 300l298 -300l298 300h-198v900h-200v-900h-198zM800 100v400h300v-500h-100v100h-200zM800 1100v100h200v-500h-100v400h-100zM901 200h100v200h-100v-200z" />
<glyph unicode="" d="M2 300l298 -300l298 300h-198v900h-200v-900h-198zM800 400v100h200v-500h-100v400h-100zM800 800v400h300v-500h-100v100h-200zM901 900h100v200h-100v-200z" />
<glyph unicode="" d="M2 300l298 -300l298 300h-198v900h-200v-900h-198zM700 100v200h500v-200h-500zM700 400v200h400v-200h-400zM700 700v200h300v-200h-300zM700 1000v200h200v-200h-200z" />
<glyph unicode="" d="M2 300l298 -300l298 300h-198v900h-200v-900h-198zM700 100v200h200v-200h-200zM700 400v200h300v-200h-300zM700 700v200h400v-200h-400zM700 1000v200h500v-200h-500z" />
<glyph unicode="" d="M0 400v300q0 165 117.5 282.5t282.5 117.5h300q162 0 281 -118.5t119 -281.5v-300q0 -165 -118.5 -282.5t-281.5 -117.5h-300q-165 0 -282.5 117.5t-117.5 282.5zM200 300q0 -41 29.5 -70.5t70.5 -29.5h500q41 0 70.5 29.5t29.5 70.5v500q0 41 -29.5 70.5t-70.5 29.5 h-500q-41 0 -70.5 -29.5t-29.5 -70.5v-500z" />
<glyph unicode="" d="M0 400v300q0 163 119 281.5t281 118.5h300q165 0 282.5 -117.5t117.5 -282.5v-300q0 -165 -117.5 -282.5t-282.5 -117.5h-300q-163 0 -281.5 117.5t-118.5 282.5zM200 300q0 -41 29.5 -70.5t70.5 -29.5h500q41 0 70.5 29.5t29.5 70.5v500q0 41 -29.5 70.5t-70.5 29.5 h-500q-41 0 -70.5 -29.5t-29.5 -70.5v-500zM400 300l333 250l-333 250v-500z" />
<glyph unicode="" d="M0 400v300q0 163 117.5 281.5t282.5 118.5h300q163 0 281.5 -119t118.5 -281v-300q0 -165 -117.5 -282.5t-282.5 -117.5h-300q-165 0 -282.5 117.5t-117.5 282.5zM200 300q0 -41 29.5 -70.5t70.5 -29.5h500q41 0 70.5 29.5t29.5 70.5v500q0 41 -29.5 70.5t-70.5 29.5 h-500q-41 0 -70.5 -29.5t-29.5 -70.5v-500zM300 700l250 -333l250 333h-500z" />
<glyph unicode="" d="M0 400v300q0 165 117.5 282.5t282.5 117.5h300q165 0 282.5 -117.5t117.5 -282.5v-300q0 -162 -118.5 -281t-281.5 -119h-300q-165 0 -282.5 118.5t-117.5 281.5zM200 300q0 -41 29.5 -70.5t70.5 -29.5h500q41 0 70.5 29.5t29.5 70.5v500q0 41 -29.5 70.5t-70.5 29.5 h-500q-41 0 -70.5 -29.5t-29.5 -70.5v-500zM300 400h500l-250 333z" />
<glyph unicode="" d="M0 400v300h300v200l400 -350l-400 -350v200h-300zM500 0v200h500q41 0 70.5 29.5t29.5 70.5v500q0 41 -29.5 70.5t-70.5 29.5h-500v200h400q165 0 282.5 -117.5t117.5 -282.5v-300q0 -165 -117.5 -282.5t-282.5 -117.5h-400z" />
<glyph unicode="" d="M217 519q8 -19 31 -19h302q-155 -438 -160 -458q-5 -21 4 -32l9 -8h9q14 0 26 15q11 13 274.5 321.5t264.5 308.5q14 19 5 36q-8 17 -31 17l-301 -1q1 4 78 219.5t79 227.5q2 15 -5 27l-9 9h-9q-15 0 -25 -16q-4 -6 -98 -111.5t-228.5 -257t-209.5 -237.5q-16 -19 -6 -41 z" />
<glyph unicode="" d="M0 400q0 -165 117.5 -282.5t282.5 -117.5h300q47 0 100 15v185h-500q-41 0 -70.5 29.5t-29.5 70.5v500q0 41 29.5 70.5t70.5 29.5h500v185q-14 4 -114 7.5t-193 5.5l-93 2q-165 0 -282.5 -117.5t-117.5 -282.5v-300zM600 400v300h300v200l400 -350l-400 -350v200h-300z " />
<glyph unicode="" d="M0 400q0 -165 117.5 -282.5t282.5 -117.5h300q163 0 281.5 117.5t118.5 282.5v98l-78 73l-122 -123v-148q0 -41 -29.5 -70.5t-70.5 -29.5h-500q-41 0 -70.5 29.5t-29.5 70.5v500q0 41 29.5 70.5t70.5 29.5h156l118 122l-74 78h-100q-165 0 -282.5 -117.5t-117.5 -282.5 v-300zM496 709l353 342l-149 149h500v-500l-149 149l-342 -353z" />
<glyph unicode="" d="M4 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM186 600q0 -171 121.5 -292.5t292.5 -121.5t292.5 121.5t121.5 292.5t-121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM406 600 q0 80 57 137t137 57t137 -57t57 -137t-57 -137t-137 -57t-137 57t-57 137z" />
<glyph unicode="" d="M0 0v275q0 11 7 18t18 7h1048q11 0 19 -7.5t8 -17.5v-275h-1100zM100 800l445 -500l450 500h-295v400h-300v-400h-300zM900 150h100v50h-100v-50z" />
<glyph unicode="" d="M0 0v275q0 11 7 18t18 7h1048q11 0 19 -7.5t8 -17.5v-275h-1100zM100 700h300v-300h300v300h295l-445 500zM900 150h100v50h-100v-50z" />
<glyph unicode="" d="M0 0v275q0 11 7 18t18 7h1048q11 0 19 -7.5t8 -17.5v-275h-1100zM100 705l305 -305l596 596l-154 155l-442 -442l-150 151zM900 150h100v50h-100v-50z" />
<glyph unicode="" d="M0 0v275q0 11 7 18t18 7h1048q11 0 19 -7.5t8 -17.5v-275h-1100zM100 988l97 -98l212 213l-97 97zM200 400l697 1l3 699l-250 -239l-149 149l-212 -212l149 -149zM900 150h100v50h-100v-50z" />
<glyph unicode="" d="M0 0v275q0 11 7 18t18 7h1048q11 0 19 -7.5t8 -17.5v-275h-1100zM200 612l212 -212l98 97l-213 212zM300 1200l239 -250l-149 -149l212 -212l149 148l249 -237l-1 697zM900 150h100v50h-100v-50z" />
<glyph unicode="" d="M23 415l1177 784v-1079l-475 272l-310 -393v416h-392zM494 210l672 938l-672 -712v-226z" />
<glyph unicode="" d="M0 150v1000q0 20 14.5 35t35.5 15h250v-300h500v300h100l200 -200v-850q0 -21 -15 -35.5t-35 -14.5h-150v400h-700v-400h-150q-21 0 -35.5 14.5t-14.5 35.5zM600 1000h100v200h-100v-200z" />
<glyph unicode="" d="M0 150v1000q0 20 14.5 35t35.5 15h250v-300h500v300h100l200 -200v-218l-276 -275l-120 120l-126 -127h-378v-400h-150q-21 0 -35.5 14.5t-14.5 35.5zM581 306l123 123l120 -120l353 352l123 -123l-475 -476zM600 1000h100v200h-100v-200z" />
<glyph unicode="" d="M0 150v1000q0 20 14.5 35t35.5 15h250v-300h500v300h100l200 -200v-269l-103 -103l-170 170l-298 -298h-329v-400h-150q-21 0 -35.5 14.5t-14.5 35.5zM600 1000h100v200h-100v-200zM700 133l170 170l-170 170l127 127l170 -170l170 170l127 -128l-170 -169l170 -170 l-127 -127l-170 170l-170 -170z" />
<glyph unicode="" d="M0 150v1000q0 20 14.5 35t35.5 15h250v-300h500v300h100l200 -200v-300h-400v-200h-500v-400h-150q-21 0 -35.5 14.5t-14.5 35.5zM600 300l300 -300l300 300h-200v300h-200v-300h-200zM600 1000v200h100v-200h-100z" />
<glyph unicode="" d="M0 150v1000q0 20 14.5 35t35.5 15h250v-300h500v300h100l200 -200v-402l-200 200l-298 -298h-402v-400h-150q-21 0 -35.5 14.5t-14.5 35.5zM600 300h200v-300h200v300h200l-300 300zM600 1000v200h100v-200h-100z" />
<glyph unicode="" d="M0 250q0 -21 14.5 -35.5t35.5 -14.5h1100q21 0 35.5 14.5t14.5 35.5v550h-1200v-550zM0 900h1200v150q0 21 -14.5 35.5t-35.5 14.5h-1100q-21 0 -35.5 -14.5t-14.5 -35.5v-150zM100 300v200h400v-200h-400z" />
<glyph unicode="" d="M0 400l300 298v-198h400v-200h-400v-198zM100 800v200h100v-200h-100zM300 800v200h100v-200h-100zM500 800v200h400v198l300 -298l-300 -298v198h-400zM800 300v200h100v-200h-100zM1000 300h100v200h-100v-200z" />
<glyph unicode="" d="M100 700v400l50 100l50 -100v-300h100v300l50 100l50 -100v-300h100v300l50 100l50 -100v-400l-100 -203v-447q0 -21 -14.5 -35.5t-35.5 -14.5h-200q-21 0 -35.5 14.5t-14.5 35.5v447zM800 597q0 -29 10.5 -55.5t25 -43t29 -28.5t25.5 -18l10 -5v-397q0 -21 14.5 -35.5 t35.5 -14.5h200q21 0 35.5 14.5t14.5 35.5v1106q0 31 -18 40.5t-44 -7.5l-276 -116q-25 -17 -43.5 -51.5t-18.5 -65.5v-359z" />
<glyph unicode="" d="M100 0h400v56q-75 0 -87.5 6t-12.5 44v394h500v-394q0 -38 -12.5 -44t-87.5 -6v-56h400v56q-4 0 -11 0.5t-24 3t-30 7t-24 15t-11 24.5v888q0 22 25 34.5t50 13.5l25 2v56h-400v-56q75 0 87.5 -6t12.5 -44v-394h-500v394q0 38 12.5 44t87.5 6v56h-400v-56q4 0 11 -0.5 t24 -3t30 -7t24 -15t11 -24.5v-888q0 -22 -25 -34.5t-50 -13.5l-25 -2v-56z" />
<glyph unicode="" d="M0 300q0 -41 29.5 -70.5t70.5 -29.5h300q41 0 70.5 29.5t29.5 70.5v500q0 41 -29.5 70.5t-70.5 29.5h-300q-41 0 -70.5 -29.5t-29.5 -70.5v-500zM100 100h400l200 200h105l295 98v-298h-425l-100 -100h-375zM100 300v200h300v-200h-300zM100 600v200h300v-200h-300z M100 1000h400l200 -200v-98l295 98h105v200h-425l-100 100h-375zM700 402v163l400 133v-163z" />
<glyph unicode="" d="M16.5 974.5q0.5 -21.5 16 -90t46.5 -140t104 -177.5t175 -208q103 -103 207.5 -176t180 -103.5t137 -47t92.5 -16.5l31 1l163 162q17 18 13.5 41t-22.5 37l-192 136q-19 14 -45 12t-42 -19l-118 -118q-142 101 -268 227t-227 268l118 118q17 17 20 41.5t-11 44.5 l-139 194q-14 19 -36.5 22t-40.5 -14l-162 -162q-1 -11 -0.5 -32.5z" />
<glyph unicode="" d="M0 50v212q0 20 10.5 45.5t24.5 39.5l365 303v50q0 4 1 10.5t12 22.5t30 28.5t60 23t97 10.5t97 -10t60 -23.5t30 -27.5t12 -24l1 -10v-50l365 -303q14 -14 24.5 -39.5t10.5 -45.5v-212q0 -21 -14.5 -35.5t-35.5 -14.5h-1100q-20 0 -35 14.5t-15 35.5zM0 712 q0 -21 14.5 -33.5t34.5 -8.5l202 33q20 4 34.5 21t14.5 38v146q141 24 300 24t300 -24v-146q0 -21 14.5 -38t34.5 -21l202 -33q20 -4 34.5 8.5t14.5 33.5v200q-6 8 -19 20.5t-63 45t-112 57t-171 45t-235 20.5q-92 0 -175 -10.5t-141.5 -27t-108.5 -36.5t-81.5 -40 t-53.5 -36.5t-31 -27.5l-9 -10v-200z" />
<glyph unicode="" d="M100 0v100h1100v-100h-1100zM175 200h950l-125 150v250l100 100v400h-100v-200h-100v200h-200v-200h-100v200h-200v-200h-100v200h-100v-400l100 -100v-250z" />
<glyph unicode="" d="M100 0h300v400q0 41 -29.5 70.5t-70.5 29.5h-100q-41 0 -70.5 -29.5t-29.5 -70.5v-400zM500 0v1000q0 41 29.5 70.5t70.5 29.5h100q41 0 70.5 -29.5t29.5 -70.5v-1000h-300zM900 0v700q0 41 29.5 70.5t70.5 29.5h100q41 0 70.5 -29.5t29.5 -70.5v-700h-300z" />
<glyph unicode="" d="M-100 300v500q0 124 88 212t212 88h700q124 0 212 -88t88 -212v-500q0 -124 -88 -212t-212 -88h-700q-124 0 -212 88t-88 212zM100 200h900v700h-900v-700zM200 300h300v300h-200v100h200v100h-300v-300h200v-100h-200v-100zM600 300h200v100h100v300h-100v100h-200v-500 zM700 400v300h100v-300h-100z" />
<glyph unicode="" d="M-100 300v500q0 124 88 212t212 88h700q124 0 212 -88t88 -212v-500q0 -124 -88 -212t-212 -88h-700q-124 0 -212 88t-88 212zM100 200h900v700h-900v-700zM200 300h100v200h100v-200h100v500h-100v-200h-100v200h-100v-500zM600 300h200v100h100v300h-100v100h-200v-500 zM700 400v300h100v-300h-100z" />
<glyph unicode="" d="M-100 300v500q0 124 88 212t212 88h700q124 0 212 -88t88 -212v-500q0 -124 -88 -212t-212 -88h-700q-124 0 -212 88t-88 212zM100 200h900v700h-900v-700zM200 300h300v100h-200v300h200v100h-300v-500zM600 300h300v100h-200v300h200v100h-300v-500z" />
<glyph unicode="" d="M-100 300v500q0 124 88 212t212 88h700q124 0 212 -88t88 -212v-500q0 -124 -88 -212t-212 -88h-700q-124 0 -212 88t-88 212zM100 200h900v700h-900v-700zM200 550l300 -150v300zM600 400l300 150l-300 150v-300z" />
<glyph unicode="" d="M-100 300v500q0 124 88 212t212 88h700q124 0 212 -88t88 -212v-500q0 -124 -88 -212t-212 -88h-700q-124 0 -212 88t-88 212zM100 200h900v700h-900v-700zM200 300v500h700v-500h-700zM300 400h130q41 0 68 42t27 107t-28.5 108t-66.5 43h-130v-300zM575 549 q0 -65 27 -107t68 -42h130v300h-130q-38 0 -66.5 -43t-28.5 -108z" />
<glyph unicode="" d="M-100 300v500q0 124 88 212t212 88h700q124 0 212 -88t88 -212v-500q0 -124 -88 -212t-212 -88h-700q-124 0 -212 88t-88 212zM100 200h900v700h-900v-700zM200 300h300v300h-200v100h200v100h-300v-300h200v-100h-200v-100zM601 300h100v100h-100v-100zM700 700h100 v-400h100v500h-200v-100z" />
<glyph unicode="" d="M-100 300v500q0 124 88 212t212 88h700q124 0 212 -88t88 -212v-500q0 -124 -88 -212t-212 -88h-700q-124 0 -212 88t-88 212zM100 200h900v700h-900v-700zM200 300h300v400h-200v100h-100v-500zM301 400v200h100v-200h-100zM601 300h100v100h-100v-100zM700 700h100 v-400h100v500h-200v-100z" />
<glyph unicode="" d="M-100 300v500q0 124 88 212t212 88h700q124 0 212 -88t88 -212v-500q0 -124 -88 -212t-212 -88h-700q-124 0 -212 88t-88 212zM100 200h900v700h-900v-700zM200 700v100h300v-300h-99v-100h-100v100h99v200h-200zM201 300v100h100v-100h-100zM601 300v100h100v-100h-100z M700 700v100h200v-500h-100v400h-100z" />
<glyph unicode="" d="M4 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM186 600q0 -171 121.5 -292.5t292.5 -121.5t292.5 121.5t121.5 292.5t-121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM400 500v200 l100 100h300v-100h-300v-200h300v-100h-300z" />
<glyph unicode="" d="M0 600q0 162 80 299t217 217t299 80t299 -80t217 -217t80 -299t-80 -299t-217 -217t-299 -80t-299 80t-217 217t-80 299zM182 600q0 -171 121.5 -292.5t292.5 -121.5t292.5 121.5t121.5 292.5t-121.5 292.5t-292.5 121.5t-292.5 -121.5t-121.5 -292.5zM400 400v400h300 l100 -100v-100h-100v100h-200v-100h200v-100h-200v-100h-100zM700 400v100h100v-100h-100z" />
<glyph unicode="" d="M-14 494q0 -80 56.5 -137t135.5 -57h222v300h400v-300h128q120 0 205 86.5t85 207.5t-85 207t-205 86q-46 0 -90 -14q-44 97 -134.5 156.5t-200.5 59.5q-152 0 -260 -107.5t-108 -260.5q0 -25 2 -37q-66 -14 -108.5 -67.5t-42.5 -122.5zM300 200h200v300h200v-300h200 l-300 -300z" />
<glyph unicode="" d="M-14 494q0 -80 56.5 -137t135.5 -57h8l414 414l403 -403q94 26 154.5 104.5t60.5 178.5q0 120 -85 206.5t-205 86.5q-46 0 -90 -14q-44 97 -134.5 156.5t-200.5 59.5q-152 0 -260 -107.5t-108 -260.5q0 -25 2 -37q-66 -14 -108.5 -67.5t-42.5 -122.5zM300 200l300 300 l300 -300h-200v-300h-200v300h-200z" />
<glyph unicode="" d="M100 200h400v-155l-75 -45h350l-75 45v155h400l-270 300h170l-270 300h170l-300 333l-300 -333h170l-270 -300h170z" />
<glyph unicode="" d="M121 700q0 -53 28.5 -97t75.5 -65q-4 -16 -4 -38q0 -74 52.5 -126.5t126.5 -52.5q56 0 100 30v-306l-75 -45h350l-75 45v306q46 -30 100 -30q74 0 126.5 52.5t52.5 126.5q0 24 -9 55q50 32 79.5 83t29.5 112q0 90 -61.5 155.5t-150.5 71.5q-26 89 -99.5 145.5 t-167.5 56.5q-116 0 -197.5 -81.5t-81.5 -197.5q0 -4 1 -11.5t1 -11.5q-14 2 -23 2q-74 0 -126.5 -52.5t-52.5 -126.5z" />
</font>
</defs></svg>
@ -1,108 +0,0 @@
var gulp = require('gulp');
var jshint = require('gulp-jshint');
var bower = require('gulp-bower');
var bowerMainFiles = require('main-bower-files');
var bowerNormalizer = require('gulp-bower-normalize');
var jscs = require('gulp-jscs');
var lintspaces = require('gulp-lintspaces');

gulp.task('lint', function() {
    return gulp.src('js/*.js')
        .pipe(jshint({
            eqeqeq: false,
            browser: true,
            bitwise: true,
            laxbreak: true,
            newcap: false,
            undef: true,
            unused: true,
            predef: ['requirejs', 'require', 'define', 'app', '$'],
            strict: true,
            lastsemic: true,
            scripturl: true,
            '-W041': false
        }))
        .pipe(jshint.reporter('jshint-stylish'))
        .pipe(jscs({
            requireParenthesesAroundIIFE: true,
            requireSpaceAfterKeywords: ['do', 'for', 'if', 'else', 'switch', 'case', 'try', 'while', 'return', 'typeof'],
            requireSpaceBeforeBlockStatements: true,
            requireSpacesInConditionalExpression: true,
            requireSpacesInFunction: {beforeOpeningCurlyBrace: true},
            disallowSpacesInFunction: {beforeOpeningRoundBrace: true},
            requireBlocksOnNewline: 1,
            disallowPaddingNewlinesInBlocks: true,
            disallowEmptyBlocks: true,
            disallowSpacesInsideObjectBrackets: 'all',
            disallowSpacesInsideArrayBrackets: 'all',
            disallowSpacesInsideParentheses: true,
            disallowQuotedKeysInObjects: true,
            disallowSpaceAfterObjectKeys: true,
            requireSpaceBeforeObjectValues: true,
            requireCommaBeforeLineBreak: true,
            requireOperatorBeforeLineBreak: true,
            disallowSpaceAfterPrefixUnaryOperators: true,
            disallowSpaceBeforePostfixUnaryOperators: true,
            requireSpaceBeforeBinaryOperators: true,
            requireSpaceAfterBinaryOperators: true,
            disallowImplicitTypeConversion: ['numeric', 'string'],
            requireCamelCaseOrUpperCaseIdentifiers: 'ignoreProperties',
            disallowKeywords: ['with'],
            disallowMultipleLineStrings: true,
            disallowMultipleLineBreaks: true,
            disallowMixedSpacesAndTabs: true,
            disallowTrailingComma: true,
            disallowKeywordsOnNewLine: ['else'],
            requireCapitalizedConstructors: true,
            requireDotNotation: true,
            disallowYodaConditions: true,
            disallowNewlineBeforeBlockStatements: true,
            validateLineBreaks: 'LF',
            validateParameterSeparator: ', '
        }))
        .pipe(lintspaces({
            styles: {
                options: {
                    showValid: true,
                    newline: true,
                    indentation: 'spaces',
                    spaces: 2,
                    newlineMaximum: 2,
                    trailingspaces: true,
                    ignores: ['js-comments']
                },
                src: [
                    'static/**/*.css'
                ]
            },
            javascript: {
                options: {
                    showValid: true,
                    newline: true,
                    indentation: 'spaces',
                    spaces: 4,
                    trailingspaces: true,
                    ignores: ['js-comments']
                },
                src: [
                    'static/**/*.js',
                    '!static/js/libs/**'
                ]
            }
        }));
});

gulp.task('bower-build', function() {
    return bower();
});

gulp.task('bower', ['bower-build'], function() {
    return gulp.src(bowerMainFiles({
        checkExistence: true
    }), {base: './bower_components'})
        .pipe(bowerNormalizer({bowerJson: './bower.json'}))
        .pipe(gulp.dest('./js/libs/'));
});
@ -1,94 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 16.0.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
width="220px" height="65px" viewBox="0 0 220 65" enable-background="new 0 0 220 65" xml:space="preserve">
<linearGradient id="SVGID_1_" gradientUnits="userSpaceOnUse" x1="34.0596" y1="64.167" x2="34.0596" y2="64.167">
<stop offset="0" style="stop-color:#FFFFFF"/>
<stop offset="1" style="stop-color:#000000"/>
</linearGradient>
<line display="none" fill="url(#SVGID_1_)" stroke="#FFFFFF" stroke-width="4" stroke-miterlimit="10" x1="34.06" y1="6" x2="34.06" y2="122.334"/>
<linearGradient id="SVGID_2_" gradientUnits="userSpaceOnUse" x1="53.0596" y1="64.167" x2="53.0596" y2="64.167">
<stop offset="0" style="stop-color:#FFFFFF"/>
<stop offset="1" style="stop-color:#000000"/>
</linearGradient>
<line display="none" fill="url(#SVGID_2_)" stroke="#FFFFFF" stroke-width="4" stroke-miterlimit="10" x1="53.06" y1="6" x2="53.06" y2="122.334"/>
<linearGradient id="SVGID_3_" gradientUnits="userSpaceOnUse" x1="72.0596" y1="64.167" x2="72.0596" y2="64.167">
<stop offset="0" style="stop-color:#FFFFFF"/>
<stop offset="1" style="stop-color:#000000"/>
</linearGradient>
<line display="none" fill="url(#SVGID_3_)" stroke="#FFFFFF" stroke-width="4" stroke-miterlimit="10" x1="72.06" y1="6" x2="72.06" y2="122.334"/>
<linearGradient id="SVGID_4_" gradientUnits="userSpaceOnUse" x1="91.0596" y1="64.167" x2="91.0596" y2="64.167">
<stop offset="0" style="stop-color:#FFFFFF"/>
<stop offset="1" style="stop-color:#000000"/>
</linearGradient>
<line display="none" fill="url(#SVGID_4_)" stroke="#FFFFFF" stroke-width="4" stroke-miterlimit="10" x1="91.06" y1="6" x2="91.06" y2="122.334"/>
<linearGradient id="SVGID_5_" gradientUnits="userSpaceOnUse" x1="110.0596" y1="64.167" x2="110.0596" y2="64.167">
<stop offset="0" style="stop-color:#FFFFFF"/>
<stop offset="1" style="stop-color:#000000"/>
</linearGradient>
<line display="none" fill="url(#SVGID_5_)" stroke="#FFFFFF" stroke-width="4" stroke-miterlimit="10" x1="110.06" y1="6" x2="110.06" y2="122.334"/>
<linearGradient id="SVGID_6_" gradientUnits="userSpaceOnUse" x1="129.0596" y1="64.167" x2="129.0596" y2="64.167">
<stop offset="0" style="stop-color:#FFFFFF"/>
<stop offset="1" style="stop-color:#000000"/>
</linearGradient>
<line display="none" fill="url(#SVGID_6_)" stroke="#FFFFFF" stroke-width="4" stroke-miterlimit="10" x1="129.06" y1="6" x2="129.06" y2="122.334"/>
<linearGradient id="SVGID_7_" gradientUnits="userSpaceOnUse" x1="148.0596" y1="64.167" x2="148.0596" y2="64.167">
<stop offset="0" style="stop-color:#FFFFFF"/>
<stop offset="1" style="stop-color:#000000"/>
</linearGradient>
<line display="none" fill="url(#SVGID_7_)" stroke="#FFFFFF" stroke-width="4" stroke-miterlimit="10" x1="148.06" y1="6" x2="148.06" y2="122.334"/>
<linearGradient id="SVGID_8_" gradientUnits="userSpaceOnUse" x1="167.0596" y1="64.167" x2="167.0596" y2="64.167">
<stop offset="0" style="stop-color:#FFFFFF"/>
<stop offset="1" style="stop-color:#000000"/>
</linearGradient>
<line display="none" fill="url(#SVGID_8_)" stroke="#FFFFFF" stroke-width="4" stroke-miterlimit="10" x1="167.06" y1="6" x2="167.06" y2="122.334"/>
<linearGradient id="SVGID_9_" gradientUnits="userSpaceOnUse" x1="186.0596" y1="64.167" x2="186.0596" y2="64.167">
<stop offset="0" style="stop-color:#FFFFFF"/>
<stop offset="1" style="stop-color:#000000"/>
</linearGradient>
<line display="none" fill="url(#SVGID_9_)" stroke="#FFFFFF" stroke-width="4" stroke-miterlimit="10" x1="186.06" y1="6" x2="186.06" y2="122.334"/>
<g>
<g>
<path fill="#D65656" d="M33.723,41.414h4.729v3.331h-4.729v9.625H30.06V31.057h9.692v3.331h-6.028V41.414z"/>
<path fill="#D65656" d="M45.007,31.057v17.985c0,1.665,0.733,2.265,1.898,2.265c1.166,0,1.898-0.6,1.898-2.265V31.057h3.464v17.75
c0,3.731-1.865,5.863-5.462,5.863c-3.597,0-5.462-2.132-5.462-5.863v-17.75H45.007z"/>
<path fill="#D65656" d="M58.049,40.882h5.029v3.33h-5.029v6.829h6.328v3.329h-9.992V31.057h9.992v3.331h-6.328V40.882z"/>
<path fill="#D65656" d="M66.288,31.057h3.664v19.984h6.028v3.329h-9.692V31.057z"/>
<path fill="#D65656" d="M87.669,30.791c3.564,0,5.396,2.131,5.396,5.862v0.731h-3.464v-0.965c0-1.665-0.666-2.298-1.832-2.298
c-1.166,0-1.832,0.633-1.832,2.298c0,4.796,7.161,5.694,7.161,12.356c0,3.73-1.865,5.862-5.462,5.862
c-3.597,0-5.462-2.132-5.462-5.862v-1.434h3.464v1.667c0,1.665,0.733,2.265,1.898,2.265s1.898-0.6,1.898-2.265
c0-4.797-7.16-5.695-7.16-12.356C82.275,32.922,84.106,30.791,87.669,30.791z"/>
<path fill="#D65656" d="M94.317,31.057h11.324v3.331h-3.831V54.37h-3.664V34.388h-3.83V31.057z"/>
|
|
||||||
<path fill="#D65656" d="M117.957,54.37h-3.698l-0.632-4.229h-4.496l-0.632,4.229h-3.365l3.73-23.313h5.362L117.957,54.37z
|
|
||||||
M109.597,46.977h3.53l-1.765-11.79L109.597,46.977z"/>
|
|
||||||
<path fill="#D65656" d="M117.451,31.057h11.324v3.331h-3.831V54.37h-3.663V34.388h-3.83V31.057z"/>
|
|
||||||
<path fill="#D65656" d="M130.365,31.057h3.664V54.37h-3.664V31.057z"/>
|
|
||||||
<path fill="#D65656" d="M141.339,30.791c3.564,0,5.396,2.131,5.396,5.862v0.731h-3.464v-0.965c0-1.665-0.667-2.298-1.832-2.298
|
|
||||||
c-1.166,0-1.831,0.633-1.831,2.298c0,4.796,7.16,5.694,7.16,12.356c0,3.73-1.865,5.862-5.462,5.862
|
|
||||||
c-3.598,0-5.463-2.132-5.463-5.862v-1.434h3.464v1.667c0,1.665,0.732,2.265,1.898,2.265s1.898-0.6,1.898-2.265
|
|
||||||
c0-4.797-7.16-5.695-7.16-12.356C135.944,32.922,137.775,30.791,141.339,30.791z"/>
|
|
||||||
<path fill="#D65656" d="M147.986,31.057h11.324v3.331h-3.83V54.37h-3.663V34.388h-3.831V31.057z"/>
|
|
||||||
<path fill="#D65656" d="M160.901,31.057h3.664V54.37h-3.664V31.057z"/>
|
|
||||||
<path fill="#D65656" d="M177.511,45.678v3.098c0,3.73-1.865,5.862-5.463,5.862c-3.597,0-5.462-2.132-5.462-5.862V36.652
|
|
||||||
c0-3.73,1.865-5.862,5.462-5.862c3.598,0,5.463,2.131,5.463,5.862v2.265h-3.464v-2.498c0-1.665-0.732-2.298-1.898-2.298
|
|
||||||
s-1.899,0.633-1.899,2.298v12.59c0,1.665,0.733,2.265,1.899,2.265s1.898-0.6,1.898-2.265v-3.331H177.511z"/>
|
|
||||||
<path fill="#D65656" d="M184.63,30.791c3.564,0,5.396,2.131,5.396,5.862v0.731h-3.464v-0.965c0-1.665-0.666-2.298-1.831-2.298
|
|
||||||
c-1.166,0-1.832,0.633-1.832,2.298c0,4.796,7.16,5.694,7.16,12.356c0,3.73-1.865,5.862-5.462,5.862
|
|
||||||
c-3.598,0-5.462-2.132-5.462-5.862v-1.434h3.464v1.667c0,1.665,0.732,2.265,1.897,2.265c1.167,0,1.899-0.6,1.899-2.265
|
|
||||||
c0-4.797-7.161-5.695-7.161-12.356C179.235,32.922,181.066,30.791,184.63,30.791z"/>
|
|
||||||
</g>
|
|
||||||
<g>
|
|
||||||
<rect x="30.06" y="18.195" fill="#D65656" width="12.903" height="9.462"/>
|
|
||||||
<polygon fill="#D65656" points="59.307,15.614 59.307,11.313 46.404,11.313 46.404,14.754 46.404,27.657 59.307,27.657 "/>
|
|
||||||
<polygon fill="#D65656" points="108.339,14.754 95.436,14.754 95.436,27.657 108.339,27.657 108.339,15.614 "/>
|
|
||||||
<rect x="62.748" y="19.055" fill="#D65656" width="12.903" height="8.602"/>
|
|
||||||
<rect x="111.78" y="19.055" fill="#D65656" width="12.903" height="8.602"/>
|
|
||||||
<polygon fill="#D65656" points="160.812,13.034 160.812,13.894 160.812,27.657 173.716,27.657 173.716,13.034 "/>
|
|
||||||
<polygon fill="#D65656" points="144.468,17.334 144.468,19.055 144.468,27.657 157.371,27.657 157.371,17.334 "/>
|
|
||||||
<rect x="177.156" y="9.593" fill="#D65656" width="12.903" height="18.064"/>
|
|
||||||
<rect x="128.124" y="22.496" fill="#D65656" width="12.903" height="5.161"/>
|
|
||||||
<polygon fill="#D65656" points="91.995,11.313 91.995,8.732 79.092,8.732 79.092,15.614 79.092,27.657 91.995,27.657 "/>
|
|
||||||
</g>
|
|
||||||
</g>
|
|
||||||
</svg>
@ -1,26 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 16.0.4, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
width="98px" height="98px" viewBox="0 0 98 98" enable-background="new 0 0 98 98" xml:space="preserve">
<path fill="#FFFFFF" d="M97.498,43.062l-10.425-3.683c-0.236-0.085-0.455-0.282-0.52-0.526c-0.878-3.292-2.215-6.471-3.936-9.45
c-0.061-0.104-0.101-0.221-0.108-0.337c0-0.114,0.015-0.228,0.064-0.334l4.716-9.935c-0.004-0.005-0.01-0.01-0.014-0.016
l0.013-0.027c-2.438-3.114-5.248-5.925-8.36-8.368l-9.925,4.723c-0.104,0.049-0.217,0.073-0.33,0.073
c-0.133,0-0.244-0.034-0.362-0.104C65.678,13.555,63,12.343,60,11.474v0.063c0-0.111-0.758-0.237-1.133-0.337
c-0.242-0.064-0.449-0.246-0.534-0.484l-3.686-10.36C52.688,0.12,50.7,0,48.736,0c-1.966,0-3.952,0.12-5.912,0.356l-3.681,10.361
c-0.084,0.238-0.036,0.418-0.279,0.485C38.491,11.302,38,11.428,38,11.539v-0.064c-3,0.869-5.944,2.08-8.578,3.605
c-0.118,0.069-0.374,0.104-0.507,0.104c-0.111,0-0.287-0.025-0.391-0.075l-9.957-4.722c-3.11,2.443-5.934,5.254-8.375,8.369
l0.006,0.027c-0.004,0.005-0.013,0.01-0.017,0.016l4.716,9.935c0.051,0.106,0.068,0.222,0.068,0.335
c-0.008,0.116-0.038,0.231-0.098,0.335c-1.721,2.976-3.039,6.156-3.917,9.451c-0.066,0.244-0.247,0.441-0.484,0.526l-10.23,3.684
C-0.001,45.019,0,47.007,0,48.978C0,48.985,0,48.992,0,49s0,0.015,0,0.022c0,1.967,0,3.954,0.235,5.915l10.291,3.684
c0.236,0.085,0.389,0.281,0.454,0.525c0.878,3.292,2.181,6.472,3.902,9.45c0.061,0.104,0.083,0.22,0.09,0.337
c0,0.113-0.021,0.229-0.071,0.334l-4.721,9.936c0.004,0.005,0.008,0.011,0.012,0.016l-0.014,0.027
c2.438,3.113,5.247,5.925,8.359,8.368l9.925-4.724c0.104-0.049,0.217-0.073,0.33-0.073c0.133,0,0.512,0.034,0.63,0.104
C32.056,84.444,35,85.657,38,86.526v-0.064c0,0.112,0.491,0.238,0.865,0.338c0.243,0.064,0.316,0.246,0.401,0.483l3.618,10.361
C44.844,97.88,46.799,98,48.763,98c1.965,0,3.936-0.12,5.896-0.355l3.672-10.362c0.084-0.237,0.299-0.418,0.542-0.484
c0.373-0.1,1.127-0.225,1.127-0.337v0.065c3-0.869,5.677-2.081,8.311-3.605c0.118-0.069,0.24-0.104,0.374-0.104
c0.111,0,0.22,0.025,0.323,0.075l9.924,4.723c3.11-2.443,5.916-5.255,8.357-8.369l-0.013-0.027c0.004-0.005,0.008-0.011,0.012-0.016
l-4.718-9.936c-0.051-0.106-0.069-0.221-0.068-0.335c0.007-0.116,0.037-0.231,0.097-0.335c1.721-2.976,3.038-6.157,3.917-9.451
c0.066-0.244,0.247-0.44,0.484-0.525l10.498-3.685C97.734,52.981,98,50.993,98,49.022c0-0.008,0-0.015,0-0.022s0-0.015,0-0.022
C98,47.011,97.732,45.023,97.498,43.062z"/>
</svg>
@ -1,17 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 16.0.4, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
width="98px" height="98px" viewBox="0 0 98 98" enable-background="new 0 0 98 98" xml:space="preserve">
<path fill="#415766" d="M79.271,36.387c-0.006-0.005-0.015-0.01-0.021-0.015l1.052-0.543c0,0-7.029-5.242-15.771-8.287L70.051,20
H27.447l5.657,7.727c-8.439,2.993-15.26,8.305-15.26,8.305l3.488,1.804l7.17,3.755c0.104,0.055,0.219,0.083,0.331,0.083
c0.068,0,0.13-0.04,0.196-0.06c2.789-1.803,5.8-3.227,8.971-4.234v2.48v4v18.064L23.572,93.822c-0.387,0.873-0.431,1.984,0.09,2.785
S24.941,98,25.896,98H71.57c0.953,0,1.836-0.592,2.354-1.393c0.521-0.801,0.62-1.855,0.233-2.729L60,61.925V43.86v-4v-2.37
c0.225,0.073,0.445,0.151,0.669,0.229c0.384,0.13,0.767,0.262,1.146,0.403c2.371,0.919,4.643,2.077,6.781,3.455
c0.104,0.054,0.213,0.097,0.324,0.097c0.114,0,0.229-0.028,0.332-0.081l7.268-3.809l1.472-0.761l1.237-0.628l0.041,0.021
c-0.014-0.011-0.025-0.022-0.041-0.033l0.02-0.009L79.271,36.387z"/>
<circle fill="#DA3C3B" cx="44.682" cy="25.03" r="6.987"/>
<polygon fill="#DA3C3B" points="57.486,62.521 57.486,37.451 40.661,37.451 40.661,62.521 26.237,94.964 71.91,94.964 "/>
<circle fill="#DA3C3B" cx="57.676" cy="17.171" r="4.527"/>
<circle fill="#DA3C3B" cx="47.525" cy="6.342" r="3.37"/>
</svg>
@ -1,168 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <meta http-equiv="X-UA-Compatible" content="IE=edge">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <meta name="description" content="">
  <meta name="author" content="">
  <link rel="icon" href="favicon.ico">

  <title>Fuel Statistics</title>

  <!-- Bootstrap core CSS -->
  <link href="js/libs/bootstrap/css/bootstrap.css" rel="stylesheet">

  <!-- Custom styles for this template -->
  <link href="css/nv.d3.css" rel="stylesheet">
  <link href="css/fuel-stat.css" rel="stylesheet">

  <script src="https://oss.maxcdn.com/html5shiv/3.7.2/html5shiv.min.js"></script>
  <script src="https://oss.maxcdn.com/respond/1.4.2/respond.min.js"></script>
  <script data-main="js/main" src="js/libs/requirejs/js/require.js"></script>
</head>
<body>

<!-- Start Left Panel -->
<div class="nav-pannel">
  <div class="logo"><a href="#"></a></div>

  <div class="nav-item"><a href="#" class="active">
    <div class="icon-graphs"></div>
    <div class="nav-item-name">Graphics</div>
  </a></div>

  <div class="nav-item"><a href="reports.html">
    <div class="icon-reports"></div>
    <div class="nav-item-name">Reports</div>
  </a></div>

</div>
<!-- End Left Panel -->

<div id="loader">
  <div class="loading"></div>
</div>
<div id="load-error" class="hidden">
  Error: Service is unavailable.
</div>

<!-- Start Base Layout -->
<div class="base-box">
  <select id="release-filter"></select>

  <div id="main" class="hidden">
    <!-- TOP BIG GRAPH -->
    <div class="container-fluid titul-graph-box">
      <div class="row top-graph">
        <div class="col-md-4">
          <div class="header">
            <p>
              <div class="title">
                Number of installations: <b><span id="installations-count"></span></b>
              </div>
            </p>
            <p>
              <div class="title">Total number of environments: <b><span id="environments-count"></span></b>
              </div>
            </p>
          </div>
        </div>
        <div class="col-md-4">
          <div class="title">Distribution of installations by number of environments</div>
          <div id="env-distribution">
            <svg style="height: 280px;"></svg>
          </div>
        </div>
        <div class="col-md-4">
          <div class="title">Environments statuses distribution</div>
          <div id="clusters-distribution">
            <svg style="height: 280px;"></svg>
          </div>
        </div>
      </div>
    </div>

    <!-- Start Block with small graphics -->
    <div class="container-fluid small-graphs-box">
      <div class="row">

        <!-- Start Graphics Items -->
        <div class="container-fluid standard-graph-box">
          <div class="row">

            <!-- Start Item 1 -->
            <div class="col-md-4">
              <div class="graph-item-box">
                <!-- Start Item Header -->
                <div class="header">
                  <div class="title">Environment Size</div>
                  <!-- <div class="icon"><a href="#"></a></div> -->
                </div>
                <!-- End Item Header -->
                <!-- Start Item Content -->
                <div class="content">
                  <div class="image">
                    <div id="nodes-distribution">
                      <svg style="height: 280px;"></svg>
                    </div>
                  </div>
                  <div class="description">Number of environments: <span
                      id="count-nodes-distribution"></span>
                  </div>
                </div>
                <!-- End Item Content -->
              </div>
            </div>
            <!-- End Item 1 -->

            <!-- Start Item 2 -->
            <div class="col-md-4">
              <div class="graph-item-box">
                <!-- Start Item Header -->
                <div class="header">
                  <div class="title">Hypervisor</div>
                </div>
                <!-- End Item Header -->
                <!-- Start Item Content -->
                <div class="content">
                  <div class="image">
                    <div id="releases-distribution"></div>
                  </div>
                  <div class="description">Number of environments: <span
                      id="count-releases-distribution"></span>
                  </div>
                </div>
                <!-- End Item Content -->
              </div>
            </div>
            <!-- End Item 2 -->

            <!-- Start Item 3 -->
            <div class="col-md-4">
              <div class="graph-item-box">
                <!-- Start Item Header -->
                <div class="header">
                  <div class="title">Operating System</div>
                </div>
                <!-- End Item Header -->
                <!-- Start Item Content -->
                <div class="content">
                  <div class="image">
                    <div id="distribution-of-oses"></div>
                  </div>
                  <div class="description">Number of environments: <span
                      id="count-distribution-of-oses"></span>
                  </div>
                </div>
                <!-- End Item Content -->
              </div>
            </div>
            <!-- End Item 3 -->
          </div>
        </div>
        <!-- End Graphics Items -->
      </div>
    </div>
  </div>
</div>

</body>
</html>
@ -1,379 +0,0 @@
define(
[
    'jquery',
    'd3',
    'd3pie',
    'd3tip',
    'nv'
],
function($, d3, D3pie, d3tip, nv) {
    'use strict';

    var releases = [
        {name: 'All', filter: ''},
        {name: '6.0 Technical Preview', filter: '6.0-techpreview'},
        {name: '6.0 GA', filter: '6.0'},
        {name: '6.1', filter: '6.1'},
        {name: '7.0', filter: '7.0'},
        {name: '8.0', filter: '8.0'},
        {name: '9.0', filter: '9.0'}
    ];
    var currentRelease = releases[0].filter;

    var releaseFilter = $('#release-filter');
    releases.forEach(function(release) {
        releaseFilter.append($('<option/>', {text: release.name, value: release.filter}));
    });
    releaseFilter.on('change', function(e) {
        var newRelease = $(e.currentTarget).val();
        currentRelease = newRelease;
        statsPage();
    });

    var showLoader = function() {
        $('#main').addClass('hidden');
        $('#loader').removeClass('hidden');
    };

    var hideLoader = function() {
        $('#main').removeClass('hidden');
        $('#loader').addClass('hidden');
    };

    var statsPage = function() {
        var url = '/api/v1/json/report/installations';
        var data = {};

        if (currentRelease) {
            data['release'] = currentRelease;
        }
        $('#load-error').addClass('hidden');
        showLoader();

        $.get(url, data, function(resp) {
            installationsCount(resp);
            environmentsCount(resp);
            distributionOfInstallations(resp);
            nodesDistributionChart(resp);
            hypervisorDistributionChart(resp);
            osesDistributionChart(resp);
        })
        .done(function() {
            hideLoader();
        })
        .fail(function() {
            $('#load-error').removeClass('hidden');
            $('#loader').addClass('hidden');
        });
    };

    var installationsCount = function(resp) {
        $('#installations-count').html(resp.installations.count);
    };

    var environmentsCount = function(resp) {
        $('#environments-count').html(resp.environments.count);

        var colors = [
            {status: 'new', code: '#999999'},
            {status: 'operational', code: '#51851A'},
            {status: 'error', code: '#FF7372'},
            {status: 'deployment', code: '#2783C0'},
            {status: 'remove', code: '#000000'},
            {status: 'stopped', code: '#FFB014'},
            {status: 'update', code: '#775575'},
            {status: 'update_error', code: '#F5007B'}
        ];
        var chartData = [];

        $.each(colors, function(index, color) {
            var in_status = resp.environments.statuses[color.status];
            if (in_status) {
                chartData.push({label: color.status, value: in_status, color: color.code});
            }
        });

        var data = [{
            key: 'Distribution of environments by statuses',
            values: chartData
        }];

        nv.addGraph(function() {
            var chart = nv.models.discreteBarChart()
                .x(function(d) { return d.label; })
                .y(function(d) { return d.value; })
                .margin({top: 30, bottom: 60})
                .staggerLabels(true)
                .transitionDuration(350);

            chart.xAxis
                .axisLabel('Statuses');

            chart.yAxis
                .axisLabel('Environments')
                .axisLabelDistance(30)
                .tickFormat(d3.format('d'));

            chart.tooltipContent(function(key, x, y) {
                return '<h3>Status: "' + x + '"</h3>' + '<p>' + parseInt(y) + ' environments</p>';
            });

            d3.select('#clusters-distribution svg')
                .datum(data)
                .call(chart);

            nv.utils.windowResize(chart.update);

            return chart;
        });
    };

    var distributionOfInstallations = function(resp) {
        var chartData = [];
        $.each(resp.installations.environments_num, function(key, value) {
            chartData.push({label: key, value: value});
        });
        var data = [{
            color: '#1DA489',
            values: chartData
        }];

        nv.addGraph(function() {
            var chart = nv.models.multiBarChart()
                .x(function(d) { return d.label; })
                .y(function(d) { return d.value; })
                .margin({top: 30, bottom: 60})
                .transitionDuration(350)
                .reduceXTicks(false)   // If 'false', every single x-axis tick label will be rendered.
                .rotateLabels(0)       // Angle to rotate x-axis labels.
                .showControls(false)   // Allow user to switch between 'Grouped' and 'Stacked' mode.
                .showLegend(false)
                .groupSpacing(0.5);    // Distance between each group of bars.

            chart.xAxis
                .axisLabel('Environments count');

            chart.yAxis
                .axisLabel('Installations')
                .axisLabelDistance(30)
                .tickFormat(d3.format('d'));

            chart.tooltipContent(function(key, x, y) {
                return '<h3>' + parseInt(y) + ' installations</h3>' + '<p>with ' + x + ' environments</p>';
            });

            d3.select('#env-distribution svg')
                .datum(data)
                .call(chart);

            nv.utils.windowResize(chart.update);

            return chart;
        });
    };

    var nodesDistributionChart = function(resp) {
        var total = resp.environments.operable_envs_count;
        var ranges = [
            {from: 1, to: 5, count: 0},
            {from: 5, to: 10, count: 0},
            {from: 10, to: 20, count: 0},
            {from: 20, to: 50, count: 0},
            {from: 50, to: 100, count: 0},
            {from: 100, to: null, count: 0}
        ];
        var chartData = [];

        $('#count-nodes-distribution').html(total);
        $.each(resp.environments.nodes_num, function(nodes_num, count) {
            $.each(ranges, function(index, range) {
                var num = parseInt(nodes_num);
                if (
                    num >= range.from &&
                    (num < range.to || range.to == null)
                ) {
                    range.count += count;
                }
            });
        });

        $.each(ranges, function(index, range) {
            var labelText = range.from + (range.to == null ? '+' : '-' + range.to);
            chartData.push({label: labelText, value: range.count});
        });

        var data = [{
            key: 'Environment size distribution by number of nodes',
            color: '#1DA489',
            values: chartData
        }];

        nv.addGraph(function() {
            var chart = nv.models.multiBarChart()
                .x(function(d) { return d.label; })
                .y(function(d) { return d.value; })
                .margin({top: 30})
                .transitionDuration(350)
                .reduceXTicks(false)   // If 'false', every single x-axis tick label will be rendered.
                .rotateLabels(0)       // Angle to rotate x-axis labels.
                .showControls(false)   // Allow user to switch between 'Grouped' and 'Stacked' mode.
                .groupSpacing(0.2);    // Distance between each group of bars.

            chart.xAxis
                .axisLabel('Number of nodes');

            chart.yAxis
                .axisLabel('Environments')
                .axisLabelDistance(30)
                .tickFormat(d3.format('d'));

            chart.tooltipContent(function(key, x, y) {
                return '<h3>' + x + ' nodes</h3>' + '<p>' + parseInt(y) + '</p>';
            });

            d3.select('#nodes-distribution svg')
                .datum(data)
                .call(chart);

            nv.utils.windowResize(chart.update);

            return chart;
        });
    };

    var hypervisorDistributionChart = function(resp) {
        var totalCounted = 0,
            total = resp.environments.operable_envs_count,
            chartData = [];
        $.each(resp.environments.hypervisors_num, function(hypervisor, count) {
            chartData.push({label: hypervisor, value: count});
            totalCounted += count;
        });
        var unknownHypervisorsCount = total - totalCounted;
        if (unknownHypervisorsCount) {
            chartData.push({label: 'unknown', value: unknownHypervisorsCount});
        }
        $('#count-releases-distribution').html(total);
        $('#releases-distribution').html('');
        new D3pie("releases-distribution", {
            header: {
                title: {
                    text: 'Distribution of deployed hypervisor',
                    fontSize: 15
                },
                location: 'top-left',
                titleSubtitlePadding: 9
            },
            size: {
                canvasWidth: 330,
                canvasHeight: 300,
                pieInnerRadius: '40%',
                pieOuterRadius: '55%'
            },
            labels: {
                outer: {
                    format: 'label-value2',
                    pieDistance: 10
                },
                inner: {
                    format: "percentage",
                    hideWhenLessThanPercentage: 5
                },
                mainLabel: {
                    fontSize: 14
                },
                percentage: {
                    color: '#ffffff',
                    decimalPlaces: 2
                },
                value: {
                    color: '#adadad',
                    fontSize: 11
                },
                lines: {
                    enabled: true
                }
            },
            data: {
                content: chartData
            },
            tooltips: {
                enabled: true,
                type: 'placeholder',
                string: '{label}: {value} pcs, {percentage}%',
                styles: {
                    borderRadius: 3,
                    fontSize: 12,
                    padding: 6
                }
            }
        });
    };

    var osesDistributionChart = function(resp) {
        var total = resp.environments.operable_envs_count,
            chartData = [];
        $('#count-distribution-of-oses').html(total);
        $.each(resp.environments.oses_num, function(os, count) {
            chartData.push({label: os, value: count});
        });
        $('#distribution-of-oses').html('');
        new D3pie("distribution-of-oses", {
            header: {
                title: {
                    text: 'Distribution of deployed operating system',
                    fontSize: 15
                },
                location: 'top-left',
                titleSubtitlePadding: 9
            },
            size: {
                canvasWidth: 330,
                canvasHeight: 300,
                pieInnerRadius: '40%',
                pieOuterRadius: '55%'
            },
            labels: {
                outer: {
                    format: 'label-value2',
                    pieDistance: 10
                },
                inner: {
                    format: "percentage",
                    hideWhenLessThanPercentage: 5
                },
                mainLabel: {
                    fontSize: 14
                },
                percentage: {
                    color: '#ffffff',
                    decimalPlaces: 2
                },
                value: {
                    color: '#adadad',
                    fontSize: 11
                },
                lines: {
                    enabled: true
                }
            },
            data: {
                content: chartData
            },
            tooltips: {
                enabled: true,
                type: 'placeholder',
                string: '{label}: {value} pcs, {percentage}%',
                styles: {
                    borderRadius: 3,
                    fontSize: 12,
                    padding: 6
                }
            }
        });
    };

    return statsPage();
});
@ -1,30 +0,0 @@
requirejs.config({
    baseUrl: "js",
    waitSeconds: 60,
    urlArgs: '_=' + (new Date()).getTime(),
    paths: {
        jquery: 'libs/jquery/js/jquery',
        elasticsearch: 'libs/elasticsearch/js/elasticsearch',
        d3: 'libs/d3/js/d3',
        d3pie: 'libs/d3pie/js/d3pie',
        d3tip: 'libs/d3-tip/js/index',
        nv: 'libs/nvd3/js/nv.d3',
        app: 'app'
    },
    shim: {
        d3tip: {
            deps: ['d3'],
            exports: 'd3tip'
        },
        d3pie: {
            deps: ['d3'],
            exports: 'd3pie'
        },
        nv: {
            deps: ['d3'],
            exports: 'nv'
        }
    }
});

require(['app']);
@ -1,31 +0,0 @@
$(function() {
    var defaultDate = new Date(),
        reportUrl = '/api/v1/csv/';

    $('#from').datepicker({
        defaultDate: '-1m',
        changeMonth: true,
        numberOfMonths: 3,
        dateFormat: "yy-mm-dd",
        onClose: function(selectedDate) {
            $('#to').datepicker('option', 'minDate', selectedDate);
        }
    });
    $('#to').datepicker({
        changeMonth: true,
        numberOfMonths: 3,
        dateFormat: "yy-mm-dd",
        onClose: function(selectedDate) {
            $('#from').datepicker('option', 'maxDate', selectedDate);
        }
    });
    $('#to').datepicker('setDate', defaultDate);
    defaultDate.setMonth(defaultDate.getMonth() - 1);
    $('#from').datepicker('setDate', defaultDate);

    $('.btn-link').click(function(e) {
        e.preventDefault();
        var url = reportUrl + $(this).attr('id') + '?from_date=' + $('#from').val() + '&to_date=' + $('#to').val();
        window.open(url);
    });
});
@ -1,18 +0,0 @@
{
  "name": "fuel-stats",
  "version": "0.0.1",
  "repository": {
    "type": "git",
    "url": "https://github.com/stackforge/fuel-stats.git"
  },
  "devDependencies": {
    "gulp": "~3.8.10",
    "gulp-jshint": "~1.9.0",
    "gulp-jscs": "~1.3.0",
    "gulp-lintspaces": "~0.3.2",
    "gulp-bower": "~0.0.7",
    "main-bower-files": "~2.4.0",
    "gulp-bower-normalize": "~1.0.6",
    "jshint-stylish": "~1.0.0"
  }
}
@ -1,73 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <meta http-equiv="X-UA-Compatible" content="IE=edge">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <meta name="description" content="">
  <meta name="author" content="">
  <link rel="icon" href="favicon.ico">

  <title>Fuel Statistics</title>

  <!-- Bootstrap core CSS -->
  <link href="js/libs/bootstrap/css/bootstrap.css" rel="stylesheet">
  <link href="css/fuel-stat.css" rel="stylesheet">
  <link rel="stylesheet" href="js/libs/jquery-ui/css/jquery-ui.css">

  <script src="js/libs/jquery/js/jquery.js"></script>
  <script src="js/libs/jquery-ui/js/jquery-ui.js"></script>
  <script src="js/reports.js"></script>
</head>
<body>

<!-- Start Left Panel -->
<div class="nav-pannel">
  <div class="logo"><a href="#"></a></div>

  <div class="nav-item"><a href="index.html">
    <div class="icon-graphs"></div>
    <div class="nav-item-name">Graphics</div>
  </a></div>

  <div class="nav-item"><a href="#" class="active">
    <div class="icon-reports"></div>
    <div class="nav-item-name">Reports</div>
  </a></div>

</div>
<!-- End Left Panel -->

<!-- Start Base Layout -->
<div class="base-box">

  <div class="container-fluid reports titul-graph-box">
    <div class="content">
      <h3>Reports</h3>
      <div class="notice">Set time period for reporting:</div>
      <div class="row">
        <label class="col-md-1" for="from">from</label>
        <div class="col-md-11"><input type="text" id="from" name="from"></div>
        <label class="col-md-1" for="to">to</label>
        <div class="col-md-11"><input type="text" id="to" name="to"></div>
      </div>
      <div class="notice">Click the link to download report:</div>
      <ul>
        <li><button class="btn-link" id="clusters">Installations info</button></li>
        <li><button class="btn-link" id="plugins">Plugins</button></li>
        <li><button class="btn-link" id="nodes">Nodes</button></li>
        <li><button class="btn-link" id="flavor">Flavors</button></li>
        <li><button class="btn-link" id="image">Images</button></li>
        <li><button class="btn-link" id="keystone_user">Keystone users</button></li>
        <li><button class="btn-link" id="tenant">Tenants</button></li>
        <li><button class="btn-link" id="vm">Vms</button></li>
        <li><button class="btn-link" id="volume">Volumes</button></li>
        <!-- TODO: uncomment after bug #1564427 is fixed
        <li><button class="btn-link" id="all">All reports</button> (download can take a lot of time, please be patient)</li>
        -->
      </ul>
    </div>
  </div>

</div>

</body>
</html>
@ -1,7 +0,0 @@
-r requirements.txt
hacking==0.9.2
mock==1.0.1
nose==1.3.4
nose2==0.4.7
tox==1.8.0
unittest2==0.5.1
@ -1,42 +0,0 @@
[tox]
minversion = 1.6
skipsdist = True
envlist = py27,pep8

[testenv]
usedevelop = True
install_command = pip install {packages}
setenv = VIRTUAL_ENV={envdir}
deps = -r{toxinidir}/test-requirements.txt
commands =
    nosetests {posargs:fuel_analytics/test}

[tox:jenkins]
downloadcache = ~/cache/pip

[testenv:pep8]
deps = hacking==0.7
usedevelop = False
commands =
    flake8 {posargs:fuel_analytics}

[testenv:cover]
setenv = NOSE_WITH_COVERAGE=1

[testenv:venv]
deps = -r{toxinidir}/test-requirements.txt
commands = {posargs:}

[testenv:devenv]
envdir = devenv
usedevelop = True

[flake8]
ignore = H234,H302,H802
exclude = .venv,.git,.tox,dist,doc,*lib/python*,*egg,build,tools,__init__.py,docs
show-pep8 = True
show-source = True
count = True

[hacking]
import_exceptions = testtools.matchers
@ -1,7 +0,0 @@
uwsgi:
    # for production app use analytics.api.app as module
    module: fuel_analytics.api.app_test
    callable: app
    http: 0.0.0.0:5000
    # Uncomment the following line to use external config
    # env: ANALYTICS_SETTINGS=/path/to/external/config.py
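The `module`/`callable` pair above tells uwsgi which Python module to import and which object in it to serve as the WSGI application. A minimal stdlib-only sketch of such a callable — the real `app` here was a Flask application, which implements this same WSGI interface:

```python
from wsgiref.util import setup_testing_defaults

# Stand-in for the "module: fuel_analytics.api.app_test" /
# "callable: app" pair: uwsgi imports the module and calls the
# object named by ``callable`` for every incoming request.
def app(environ, start_response):
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'ok']

# Exercise the callable the same way a WSGI server would:
# build a request environ, collect the status, read the body.
environ = {}
setup_testing_defaults(environ)
statuses = []
body = app(environ, lambda status, headers: statuses.append(status))
print(statuses[0], body)
```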
@ -1,4 +0,0 @@
include *.txt
recursive-include collector/api/schemas *.json
graft collector/api/db/migrations
include collector/test/logs
@ -1,13 +0,0 @@
# Copyright 2014 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
@ -1,13 +0,0 @@
# Copyright 2014 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
@ -1,107 +0,0 @@
# Copyright 2014 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import os

from flask import Flask
from flask import jsonify
from flask import make_response
import flask_jsonschema
import flask_sqlalchemy
from sqlalchemy.exc import IntegrityError

from collector.api.config import index_filtering_rules


app = Flask(__name__)


# Registering flask extensions
app.config['JSONSCHEMA_DIR'] = os.path.join(app.root_path, 'schemas')
flask_jsonschema.JsonSchema(app)

# Rebuild the packages-based keys in FILTERING_RULES:
# sorted tuples built from the packages lists are used as keys.
index_filtering_rules(app)

db = flask_sqlalchemy.SQLAlchemy(app)


# Registering blueprints
from collector.api.resources.action_logs import bp as action_logs_bp
from collector.api.resources.installation_structure import \
    bp as installation_structure_bp
from collector.api.resources.oswl import bp as oswl_stats_bp
from collector.api.resources.ping import bp as ping_bp

app.register_blueprint(installation_structure_bp,
                       url_prefix='/api/v1/installation_structure')
app.register_blueprint(action_logs_bp, url_prefix='/api/v1/action_logs')
app.register_blueprint(ping_bp, url_prefix='/api/v1/ping')
app.register_blueprint(oswl_stats_bp, url_prefix='/api/v1/oswl_stats')


# Registering error handlers
@app.errorhandler(400)
def bad_request(error):
    app.logger.error("Bad request: {}".format(error))
    return make_response(jsonify({'status': 'error',
                                  'message': '{}'.format(error)}), 400)


@app.errorhandler(IntegrityError)
def integrity_error(error):
    app.logger.error("Bad request: {}".format(error))
    return make_response(jsonify({'status': 'error',
                                  'message': '{}'.format(error)}), 400)


@app.errorhandler(404)
def not_found(error):
    app.logger.error("Not found: {}".format(error))
    return make_response(jsonify({'status': 'error',
                                  'message': '{}'.format(error)}), 404)


@app.errorhandler(405)
def not_allowed(error):
    app.logger.error("Method not allowed: {}".format(error))
    return make_response(jsonify({'status': 'error',
                                  'message': '{}'.format(error)}), 405)


@app.errorhandler(flask_jsonschema.ValidationError)
def validation_error(error):
    app.logger.error("Validation error: {}".format(error))
    return make_response(jsonify({'status': 'error',
                                  'message': '{}'.format(error)}), 400)


@app.errorhandler(500)
def server_error(error):
    app.logger.error("Server error: {}".format(error))
    error_name = error.__class__.__name__
    return make_response(
        jsonify(
            {
                'status': 'error',
                'message': '{0}: {1}'.format(error_name, error)
            }
        ),
        500
    )


@app.teardown_appcontext
def shutdown_session(exception=None):
    db.session.remove()
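The deleted handlers all follow one pattern: log the error, then return a uniform `{'status': 'error', 'message': ...}` JSON envelope with the matching HTTP status code. A self-contained sketch of that pattern, exercised with Flask's test client (requires Flask installed; the blueprint, JSON-schema, and database wiring are omitted):

```python
from flask import Flask, jsonify, make_response

app = Flask(__name__)


@app.errorhandler(404)
def not_found(error):
    # Same shape as the retired collector API: log, then wrap the
    # error in a JSON envelope with the matching status code.
    app.logger.error("Not found: {}".format(error))
    return make_response(jsonify({'status': 'error',
                                  'message': '{}'.format(error)}), 404)


# Hitting an unknown route triggers the handler instead of the
# default HTML error page.
with app.test_client() as client:
    resp = client.get('/no-such-route')
    print(resp.status_code, resp.get_json()['status'])
```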