Retire os-loganalyze
This project has been unmaintained for a while, and its CI jobs do not currently pass. Zuul v3's dashboard has largely replaced the need for this service, since it renders the logs for us instead. If there is a need for this project elsewhere it can be resurrected and maintained by others, but for now let's retire it and move on.

Depends-On: https://review.opendev.org/753398
Change-Id: I476b14c7cd9c49270f67897206ec7ad90643703b
parent d80e6a55e0
commit 88434e0ecb
@@ -1,7 +0,0 @@
-[run]
-branch = True
-source = os_loganalyze
-omit = os_loganalyze/tests/*,os_loganalyze/openstack/*
-
-[report]
-ignore_errors = True
@@ -1,53 +0,0 @@
-*.py[cod]
-
-# C extensions
-*.so
-
-# Packages
-*.egg
-*.egg-info
-dist
-build
-eggs
-parts
-bin
-var
-sdist
-develop-eggs
-.installed.cfg
-lib
-lib64
-
-# Installer logs
-pip-log.txt
-
-# Unit test / coverage reports
-.coverage
-.tox
-nosetests.xml
-.testrepository
-
-# Translations
-*.mo
-
-# Mr Developer
-.mr.developer.cfg
-.project
-.pydevproject
-
-# Complexity
-output/*.html
-output/*/index.html
-
-# Sphinx
-doc/build
-
-# pbr generates these
-AUTHORS
-ChangeLog
-
-# Editors
-*~
-.*.swp
-*.swo
-*.swn
.mailmap
@@ -1,3 +0,0 @@
-# Format is:
-# <preferred e-mail> <other e-mail 1>
-# <preferred e-mail> <other e-mail 2>
@@ -1,7 +0,0 @@
-[DEFAULT]
-test_command=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \
-             OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \
-             OS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-60} \
-             ${PYTHON:-python} -m subunit.run discover -t ./ . $LISTOPT $IDOPTION
-test_id_option=--load-list $IDFILE
-test_list_option=--list
.zuul.yaml
@@ -1,13 +0,0 @@
----
-- project:
-    check:
-      jobs:
-        - openstack-tox-pep8
-        - openstack-tox-py27
-    gate:
-      jobs:
-        - openstack-tox-pep8
-        - openstack-tox-py27
-    experimental:
-      jobs:
-        - legacy-dsvm-os-loganalyze
@@ -1,13 +0,0 @@
-If you would like to contribute to the development of OpenStack,
-you must follow the steps in this page:
-
-   http://docs.openstack.org/infra/manual/developers.html
-
-If you already have a good understanding of how the system works and your
-OpenStack accounts are set up, you can skip to the development workflow section
-of this documentation to learn how changes to OpenStack should be submitted for
-review via the Gerrit tool:
-
-   http://docs.openstack.org/infra/manual/developers.html#development-workflow
-
-Pull requests submitted through GitHub will be ignored.
LICENSE
@@ -1,175 +0,0 @@
-
-                                 Apache License
-                           Version 2.0, January 2004
-                        http://www.apache.org/licenses/
-
-   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
-   1. Definitions.
-
-      "License" shall mean the terms and conditions for use, reproduction,
-      and distribution as defined by Sections 1 through 9 of this document.
-
-      "Licensor" shall mean the copyright owner or entity authorized by
-      the copyright owner that is granting the License.
-
-      "Legal Entity" shall mean the union of the acting entity and all
-      other entities that control, are controlled by, or are under common
-      control with that entity. For the purposes of this definition,
-      "control" means (i) the power, direct or indirect, to cause the
-      direction or management of such entity, whether by contract or
-      otherwise, or (ii) ownership of fifty percent (50%) or more of the
-      outstanding shares, or (iii) beneficial ownership of such entity.
-
-      "You" (or "Your") shall mean an individual or Legal Entity
-      exercising permissions granted by this License.
-
-      "Source" form shall mean the preferred form for making modifications,
-      including but not limited to software source code, documentation
-      source, and configuration files.
-
-      "Object" form shall mean any form resulting from mechanical
-      transformation or translation of a Source form, including but
-      not limited to compiled object code, generated documentation,
-      and conversions to other media types.
-
-      "Work" shall mean the work of authorship, whether in Source or
-      Object form, made available under the License, as indicated by a
-      copyright notice that is included in or attached to the work
-      (an example is provided in the Appendix below).
-
-      "Derivative Works" shall mean any work, whether in Source or Object
-      form, that is based on (or derived from) the Work and for which the
-      editorial revisions, annotations, elaborations, or other modifications
-      represent, as a whole, an original work of authorship. For the purposes
-      of this License, Derivative Works shall not include works that remain
-      separable from, or merely link (or bind by name) to the interfaces of,
-      the Work and Derivative Works thereof.
-
-      "Contribution" shall mean any work of authorship, including
-      the original version of the Work and any modifications or additions
-      to that Work or Derivative Works thereof, that is intentionally
-      submitted to Licensor for inclusion in the Work by the copyright owner
-      or by an individual or Legal Entity authorized to submit on behalf of
-      the copyright owner. For the purposes of this definition, "submitted"
-      means any form of electronic, verbal, or written communication sent
-      to the Licensor or its representatives, including but not limited to
-      communication on electronic mailing lists, source code control systems,
-      and issue tracking systems that are managed by, or on behalf of, the
-      Licensor for the purpose of discussing and improving the Work, but
-      excluding communication that is conspicuously marked or otherwise
-      designated in writing by the copyright owner as "Not a Contribution."
-
-      "Contributor" shall mean Licensor and any individual or Legal Entity
-      on behalf of whom a Contribution has been received by Licensor and
-      subsequently incorporated within the Work.
-
-   2. Grant of Copyright License. Subject to the terms and conditions of
-      this License, each Contributor hereby grants to You a perpetual,
-      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-      copyright license to reproduce, prepare Derivative Works of,
-      publicly display, publicly perform, sublicense, and distribute the
-      Work and such Derivative Works in Source or Object form.
-
-   3. Grant of Patent License. Subject to the terms and conditions of
-      this License, each Contributor hereby grants to You a perpetual,
-      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-      (except as stated in this section) patent license to make, have made,
-      use, offer to sell, sell, import, and otherwise transfer the Work,
-      where such license applies only to those patent claims licensable
-      by such Contributor that are necessarily infringed by their
-      Contribution(s) alone or by combination of their Contribution(s)
-      with the Work to which such Contribution(s) was submitted. If You
-      institute patent litigation against any entity (including a
-      cross-claim or counterclaim in a lawsuit) alleging that the Work
-      or a Contribution incorporated within the Work constitutes direct
-      or contributory patent infringement, then any patent licenses
-      granted to You under this License for that Work shall terminate
-      as of the date such litigation is filed.
-
-   4. Redistribution. You may reproduce and distribute copies of the
-      Work or Derivative Works thereof in any medium, with or without
-      modifications, and in Source or Object form, provided that You
-      meet the following conditions:
-
-      (a) You must give any other recipients of the Work or
-          Derivative Works a copy of this License; and
-
-      (b) You must cause any modified files to carry prominent notices
-          stating that You changed the files; and
-
-      (c) You must retain, in the Source form of any Derivative Works
-          that You distribute, all copyright, patent, trademark, and
-          attribution notices from the Source form of the Work,
-          excluding those notices that do not pertain to any part of
-          the Derivative Works; and
-
-      (d) If the Work includes a "NOTICE" text file as part of its
-          distribution, then any Derivative Works that You distribute must
-          include a readable copy of the attribution notices contained
-          within such NOTICE file, excluding those notices that do not
-          pertain to any part of the Derivative Works, in at least one
-          of the following places: within a NOTICE text file distributed
-          as part of the Derivative Works; within the Source form or
-          documentation, if provided along with the Derivative Works; or,
-          within a display generated by the Derivative Works, if and
-          wherever such third-party notices normally appear. The contents
-          of the NOTICE file are for informational purposes only and
-          do not modify the License. You may add Your own attribution
-          notices within Derivative Works that You distribute, alongside
-          or as an addendum to the NOTICE text from the Work, provided
-          that such additional attribution notices cannot be construed
-          as modifying the License.
-
-      You may add Your own copyright statement to Your modifications and
-      may provide additional or different license terms and conditions
-      for use, reproduction, or distribution of Your modifications, or
-      for any such Derivative Works as a whole, provided Your use,
-      reproduction, and distribution of the Work otherwise complies with
-      the conditions stated in this License.
-
-   5. Submission of Contributions. Unless You explicitly state otherwise,
-      any Contribution intentionally submitted for inclusion in the Work
-      by You to the Licensor shall be under the terms and conditions of
-      this License, without any additional terms or conditions.
-      Notwithstanding the above, nothing herein shall supersede or modify
-      the terms of any separate license agreement you may have executed
-      with Licensor regarding such Contributions.
-
-   6. Trademarks. This License does not grant permission to use the trade
-      names, trademarks, service marks, or product names of the Licensor,
-      except as required for reasonable and customary use in describing the
-      origin of the Work and reproducing the content of the NOTICE file.
-
-   7. Disclaimer of Warranty. Unless required by applicable law or
-      agreed to in writing, Licensor provides the Work (and each
-      Contributor provides its Contributions) on an "AS IS" BASIS,
-      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
-      implied, including, without limitation, any warranties or conditions
-      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
-      PARTICULAR PURPOSE. You are solely responsible for determining the
-      appropriateness of using or redistributing the Work and assume any
-      risks associated with Your exercise of permissions under this License.
-
-   8. Limitation of Liability. In no event and under no legal theory,
-      whether in tort (including negligence), contract, or otherwise,
-      unless required by applicable law (such as deliberate and grossly
-      negligent acts) or agreed to in writing, shall any Contributor be
-      liable to You for damages, including any direct, indirect, special,
-      incidental, or consequential damages of any character arising as a
-      result of this License or out of the use or inability to use the
-      Work (including but not limited to damages for loss of goodwill,
-      work stoppage, computer failure or malfunction, or any and all
-      other commercial damages or losses), even if such Contributor
-      has been advised of the possibility of such damages.
-
-   9. Accepting Warranty or Additional Liability. While redistributing
-      the Work or Derivative Works thereof, You may choose to offer,
-      and charge a fee for, acceptance of support, warranty, indemnity,
-      or other liability obligations and/or rights consistent with this
-      License. However, in accepting such obligations, You may act only
-      on Your own behalf and on Your sole responsibility, not on behalf
-      of any other Contributor, and only if You agree to indemnify,
-      defend, and hold each Contributor harmless for any liability
-      incurred by, or claims asserted against, such Contributor by reason
-      of your accepting any such warranty or additional liability.
@@ -1,6 +0,0 @@
-include AUTHORS
-include ChangeLog
-exclude .gitignore
-exclude .gitreview
-
-global-exclude *.pyc
README.rst
@@ -1,54 +1,10 @@
-===============================
-os_loganalyze
-===============================
+This project is no longer maintained.
 
-OpenStack tools for gate log analysis
+The contents of this repository are still available in the Git
+source code management system. To see the contents of this
+repository before it reached its end of life, please check out the
+previous commit with "git checkout HEAD^1".
 
-os_loganalyze is designed as a lightweight wsgi filter for openstack
-logs, making it easier to interact with them on OpenStack's
-logs.openstack.org repository. This includes colorizing the logs based
-on log level severity, having bookmarkable links to timestamps in the
-logs for easy reference, and being able to filter by log level.
-
-This is implemented as a low level wsgi application which returns a
-generator so that it can act like a pipeline. Some of our logs are 35
-MB uncompressed, so if we used a more advanced framework that required
-we load the entire data stream into memory, the user response would be
-very poor. As a pipeline and generator the delay added by this script
-to the user grabbing the logs is largely not noticeable (< 1s).
-
-* Free software: Apache 2.0 license
-
-Features
---------
-* Supports text/html or text/plain dynamically based on content
-  negotiation
-* html highlighting based on severity
-* filtering based on severity using the level=XXXX parameter (works in
-  either text/html or text/plain responses
-* linking and highlighting of lines based on timestamp
-* control of max number of lines that will be returned using the
-  limit=XXXX parameter
-* Provides a script named htmlify_server.py that serves htmlified logs
-  over HTTP. To view devstack logs: set
-  SCREEN_LOGDIR=$DEST/logs/screen and LOG_COLOR=false in localrc
-  before running stack.sh, run htmlify_server.py, and point your
-  browser at http://devstack-ip:8000/
-
-Todo
-------------
-Next steps, roughly in order
-
-* provide links to logstash for request streams (link well know
-  request ids to logstash queries for them)
-
-Hacking
--------
-If you are working on making changes one of the easiest ways to do
-this is to run the server stack locally to see how your changes look
-on same data included for the tests.
-
-This can be done with ``tox -e run``, which will use the script
-designed for devstack locally pointed at the sample data. A url where
-you can browse the resultant content will be provided on the command
-line.
+For any further questions, please email
+openstack-discuss@lists.openstack.org or join #openstack-infra on
+Freenode.
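The removed README above describes os_loganalyze as a low-level WSGI application that returns a generator, so multi-megabyte logs stream through a pipeline instead of being loaded into memory. A minimal sketch of that generator-pipeline pattern (illustrative only; `fake_log`, `highlight`, and `app` are invented names, not os_loganalyze's actual code):

```python
def fake_log(n):
    """Stand-in for a multi-megabyte log file read lazily, line by line."""
    for i in range(n):
        yield "2014-01-01 00:00:0%d INFO line %d\n" % (i % 10, i)


def highlight(lines):
    """One pipeline stage: transform each line as it streams through."""
    for line in lines:
        yield "<span>%s</span>" % line.rstrip("\n")


def app(environ, start_response):
    """A WSGI app returning a generator, so the server can stream the
    response with constant memory use regardless of log size."""
    start_response("200 OK", [("Content-Type", "text/html")])
    return highlight(fake_log(5))


if __name__ == "__main__":
    body = list(app({}, lambda status, headers: None))
    print(len(body))
```

Because each stage is a generator consuming the previous one, adding a filter (such as the severity filter shown later in this diff) is just another wrapper in the chain.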
|
@ -1,5 +0,0 @@
|
|||
# Apache config for htmlify screen logs
|
||||
RewriteEngine On
|
||||
# rewrite all txt.gz files to map to our internal htmlify wsgi app
|
||||
RewriteRule ^/(.*\.txt\.gz)$ /htmlify/$1 [QSA,L,PT]
|
||||
WSGIScriptAlias /htmlify /usr/local/lib/python2.7/dist-packages/os_loganalyze/wsgi.py
|
|
@@ -1,43 +0,0 @@
-Listen %PORT%
-<VirtualHost *:%PORT%>
-    DocumentRoot %OS_LOGANALYZE_APACHE_DOCUMENTROOT%
-
-    # use Apache to compress the results afterwards, to save on the wire
-    # it's approx 18x savings of wire traffic to compress. We need to
-    # compress by content types that htmlify can produce
-    AddOutputFilterByType DEFLATE text/plain text/html
-
-    <FilesMatch \.html\.gz$>
-        ForceType text/html
-        AddDefaultCharset UTF-8
-        AddEncoding x-gzip gz
-    </FilesMatch>
-    <Directory %OS_LOGANALYZE_DIR%>
-        Allow from all
-        Satisfy Any
-    </Directory>
-
-    RewriteEngine On
-    # rewrite txt.gz & console.html[.gz] files to map to our internal htmlify
-    # wsgi app
-    # PT, Pass-through: to come back around and get picked up by the
-    #                   WSGIScriptAlias
-    # NS, No-subrequest: on coming back through, mod-autoindex may have added
-    #                    index.html which would match the !-f condition. We
-    #                    therefore ensure the rewrite doesn't trigger by
-    #                    disallowing subrequests.
-    RewriteRule ^/(.*\.txt\.gz)$ /htmlify/$1 [QSA,L,PT,NS]
-    RewriteRule ^/(.*console\.html(\.gz)?)$ /htmlify/$1 [QSA,L,PT,NS]
-
-    # Check if the request exists as a file, directory or symbolic link
-    # If not, write the request to htmlify to see if we can fetch from swift
-    RewriteCond %{DOCUMENT_ROOT}%{REQUEST_FILENAME} !-f
-    RewriteCond %{DOCUMENT_ROOT}%{REQUEST_FILENAME} !-d
-    RewriteCond %{DOCUMENT_ROOT}%{REQUEST_FILENAME} !-l
-    RewriteCond %{REQUEST_FILENAME} !^/icon
-    RewriteRule ^/(.*)$ /htmlify/$1 [QSA,L,PT,NS]
-
-    WSGIScriptAlias /htmlify %OS_LOGANALYZE_DIR%/os_loganalyze/wsgi.py
-    LogLevel warn
-    ServerSignature Off
-</VirtualHost>
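The RewriteRule patterns in the removed template above decide which requests are routed to the htmlify WSGI app. As an illustration only (checked here with Python's `re` module rather than Apache itself, and using invented example paths), these are the requests the two explicit rules would capture:

```python
import re

# The two explicit RewriteRule patterns from the template above,
# expressed as Python regexes (this subset of syntax is shared with
# Apache's regex dialect).
TXT_GZ = re.compile(r'^/(.*\.txt\.gz)$')
CONSOLE = re.compile(r'^/(.*console\.html(\.gz)?)$')

# Example paths below are invented for illustration.
assert TXT_GZ.match('/job/logs/screen-n-api.txt.gz')
assert CONSOLE.match('/job/console.html')
assert CONSOLE.match('/job/console.html.gz')
assert TXT_GZ.match('/job/index.html') is None
print('all rewrite-pattern checks passed')
```

The third, catch-all rule then forwards anything that is not an existing file, directory, or symlink, which is how requests fall through to the swift-backed fetch path.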
|
@ -1,85 +0,0 @@
|
|||
# Copyright (c) 2015 Rackspace Australia
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
# check for service enabled
|
||||
if is_service_enabled os_loganalyze; then
|
||||
|
||||
if [[ "$1" == "stack" && "$2" == "pre-install" ]]; then
|
||||
# Set up system services
|
||||
echo_summary "Configuring system services os_loganalyze"
|
||||
install_apache_wsgi
|
||||
if is_ubuntu; then
|
||||
# rewrite isn't enabled by default, enable it
|
||||
sudo a2enmod rewrite
|
||||
elif is_fedora; then
|
||||
# rewrite is enabled by default, noop
|
||||
echo "rewrite mod already enabled"
|
||||
elif is_suse; then
|
||||
# WSGI isn't enabled by default, enable it
|
||||
sudo a2enmod rewrite
|
||||
else
|
||||
exit_distro_not_supported "apache mod-rewrite installation"
|
||||
fi
|
||||
|
||||
elif [[ "$1" == "stack" && "$2" == "install" ]]; then
|
||||
# Perform installation of service source
|
||||
echo_summary "Installing os_loganalyze"
|
||||
setup_install $OS_LOGANALYZE_DIR
|
||||
sudo mkdir -p $OS_LOGANALYZE_APACHE_DOCUMENTROOT
|
||||
sudo chown www-data:www-data $OS_LOGANALYZE_APACHE_DOCUMENTROOT
|
||||
|
||||
elif [[ "$1" == "stack" && "$2" == "post-config" ]]; then
|
||||
# Configure after the other layer 1 and 2 services have been configured
|
||||
echo_summary "Configuring os_loganalyze"
|
||||
|
||||
sudo cp $OS_LOGANALYZE_APACHE_TEMPLATE $(apache_site_config_for os_loganalyze)
|
||||
sudo sed -e "
|
||||
s/%PORT%/8080/g;
|
||||
s/%OS_LOGANALYZE_DIR%/${OS_LOGANALYZE_DIR//\//\\\/}/g;
|
||||
s/%OS_LOGANALYZE_APACHE_DOCUMENTROOT%/${OS_LOGANALYZE_APACHE_DOCUMENTROOT//\//\\\/}/g;
|
||||
" -i $(apache_site_config_for os_loganalyze)
|
||||
|
||||
enable_apache_site os_loganalyze
|
||||
restart_apache_server
|
||||
|
||||
elif [[ "$1" == "stack" && "$2" == "extra" ]]; then
|
||||
# Initialize and start the os_loganalyze service
|
||||
echo_summary "Initializing os_loganalyze"
|
||||
fi
|
||||
|
||||
if [[ "$1" == "unstack" ]]; then
|
||||
# Shut down os_loganalyze services
|
||||
# no-op
|
||||
stop_apache_server
|
||||
fi
|
||||
|
||||
if [[ "$1" == "clean" ]]; then
|
||||
# Remove state and transient data
|
||||
# Remember clean.sh first calls unstack.sh
|
||||
# no-op
|
||||
disable_apache_site os_loganalyze
|
||||
if is_ubuntu; then
|
||||
# rewrite isn't enabled by default, disable it agin
|
||||
sudo a2dismod rewrite
|
||||
elif is_fedora; then
|
||||
# rewrite is enabled by default, noop
|
||||
echo "rewrite mod enabled by default"
|
||||
elif is_suse; then
|
||||
# rewrite isn't enabled by default, disable it agin
|
||||
sudo a2dismod rewrite
|
||||
else
|
||||
exit_distro_not_supported "apache mod-rewrite installation"
|
||||
fi
|
||||
fi
|
||||
fi
|
|
@@ -1,5 +0,0 @@
-OS_LOGANALYZE_DIR=$DEST/os-loganalyze
-OS_LOGANALYZE_APACHE_TEMPLATE=$OS_LOGANALYZE_DIR/devstack/os-loganalyze.template
-OS_LOGANALYZE_APACHE_DOCUMENTROOT=/var/www/logs
-
-enable_service os_loganalyze
@@ -1,75 +0,0 @@
-# -*- coding: utf-8 -*-
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
-# implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import os
-import sys
-
-sys.path.insert(0, os.path.abspath('../..'))
-# -- General configuration ----------------------------------------------------
-
-# Add any Sphinx extension module names here, as strings. They can be
-# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
-extensions = [
-    'sphinx.ext.autodoc',
-    'sphinx.ext.intersphinx',
-    'oslo.sphinx'
-]
-
-# autodoc generation is a bit aggressive and a nuisance when doing heavy
-# text edit cycles.
-# execute "export SPHINX_DEBUG=1" in your terminal to disable
-
-# The suffix of source filenames.
-source_suffix = '.rst'
-
-# The master toctree document.
-master_doc = 'index'
-
-# General information about the project.
-project = u'replace with the name for the git repo'
-copyright = u'2013, OpenStack Foundation'
-
-# If true, '()' will be appended to :func: etc. cross-reference text.
-add_function_parentheses = True
-
-# If true, the current module name will be prepended to all description
-# unit titles (such as .. function::).
-add_module_names = True
-
-# The name of the Pygments (syntax highlighting) style to use.
-pygments_style = 'sphinx'
-
-# -- Options for HTML output --------------------------------------------------
-
-# The theme to use for HTML and HTML Help pages. Major themes that come with
-# Sphinx are currently 'default' and 'sphinxdoc'.
-# html_theme_path = ["."]
-# html_theme = '_theme'
-# html_static_path = ['static']
-
-# Output file base name for HTML help builder.
-htmlhelp_basename = '%sdoc' % project
-
-# Grouping the document tree into LaTeX files. List of tuples
-# (source start file, target name, title, author, documentclass
-# [howto/manual]).
-latex_documents = [
-    ('index',
-     '%s.tex' % project,
-     u'%s Documentation' % project,
-     u'OpenStack Foundation', 'manual'),
-]
-
-# Example configuration for intersphinx: refer to the Python standard library.
-intersphinx_mapping = {'http://docs.python.org/': None}
@@ -1 +0,0 @@
-.. include:: ../../CONTRIBUTING.rst
@@ -1,19 +0,0 @@
-Welcome to os-loganalyze's documentation!
-=========================================
-
-Contents:
-
-.. toctree::
-   :maxdepth: 2
-
-   readme
-   installation
-   usage
-   contributing
-
-Indices and tables
-==================
-
-* :ref:`genindex`
-* :ref:`modindex`
-* :ref:`search`
@@ -1,12 +0,0 @@
-============
-Installation
-============
-
-At the command line::
-
-    $ pip install replace with the name for the git repo
-
-Or, if you have virtualenvwrapper installed::
-
-    $ mkvirtualenv replace with the name for the git repo
-    $ pip install replace with the name for the git repo
@@ -1 +0,0 @@
-.. include:: ../README.rst
@@ -1,7 +0,0 @@
-========
-Usage
-========
-
-To use replace with the name for the git repo in a project::
-
-    import os_loganalyze
@@ -1,5 +0,0 @@
-[general]
-filter = SevFilter
-view = HTMLView
-file_conditions = /etc/os-loganalyze/file_conditions.yaml
-generate_folder_index = true
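The removed config above selects `SevFilter` as the filter class. Its threshold logic can be illustrated with a toy sketch (the `SEVS` table is copied from the os_loganalyze/filters.py hunk later in this diff; `skip_by_sev` here is a standalone function mirroring the method of the same name, not the project's actual code):

```python
# SEVS mapping as it appears in os_loganalyze/filters.py in this diff.
SEVS = {'NONE': 0, 'DEBUG': 1, 'INFO': 2, 'AUDIT': 3, 'TRACE': 4,
        'WARNING': 5, 'ERROR': 6, 'CRITICAL': 7}


def skip_by_sev(sev, minsev):
    # Skip a line when its severity ranks below the requested minimum,
    # mirroring SevFilter.skip_by_sev; unknown severities rank as 0.
    return SEVS.get(sev, 0) < SEVS.get(minsev, 0)


print(skip_by_sev('DEBUG', 'WARNING'))  # True: DEBUG (1) < WARNING (5)
print(skip_by_sev('ERROR', 'WARNING'))  # False: ERROR (6) >= WARNING (5)
```

This is what the `level=XXXX` request parameter described in the README feeds into: the minimum severity below which lines are dropped.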
@@ -1,7 +0,0 @@
-[DEFAULT]
-
-# The list of modules to copy from oslo-incubator.git
-module=install_venv_common
-
-# The base module to hold the copy of openstack.common
-base=os_loganalyze
@@ -1,21 +0,0 @@
-#!/usr/bin/env python
-#
-# Copyright (c) 2013 IBM Corp.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
-# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
-# License for the specific language governing permissions and limitations
-# under the License.
-
-import os_loganalyze.wsgi
-
-
-def main():
-    os_loganalyze.wsgi.htmlify_stdin()
@@ -1,19 +0,0 @@
-#!/usr/bin/env python
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
-# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
-# License for the specific language governing permissions and limitations
-# under the License.
-
-import os_loganalyze.server
-
-
-def main():
-    os_loganalyze.server.main()
@@ -1,221 +0,0 @@
-#!/usr/bin/env python
-#
-# Copyright (c) 2013 IBM Corp.
-# Copyright (c) 2014 Hewlett-Packard Development Company, L.P.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
-# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
-# License for the specific language governing permissions and limitations
-# under the License.
-
-import os.path
-import re
-
-import os_loganalyze.generator as generator
-import os_loganalyze.util as util
-
-# which logs support severity
-# This uses re.match so you must match the left hand side.
-SUPPORTS_SEV = re.compile(
-    r'((screen-)?(n-|c-|g-|h-|ir-|ironic-|m-|o-|df-|placement-api|'
-    r'q-|neutron-|'  # support both lib/neutron and lib/neutron-legacy logs
-    r'ceil|key|sah|des|tr|sl)'  # openstack logs
-    r'|(devstack\@)'  # systemd logs
-    # other things we understand
-    r'|(keystone|tempest)\.txt|syslog)')
-
-SYSLOGDATE = '\w+\s+\d+\s+\d{2}:\d{2}:\d{2}((\.|\,)\d{3,6})?'
-DATEFMT = '\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}((\.|\,)\d{3,6})?'
-STATUSFMT = '(DEBUG|INFO|WARNING|ERROR|TRACE|AUDIT|CRITICAL)'
-
-OSLO_LOGMATCH = '^(?P<date>%s)(?P<line>(?P<pid> \d+)? (?P<status>%s).*)' % \
-    (DATEFMT, STATUSFMT)
-SYSLOG_MATCH = ('^(?P<date>%s)(?P<line> (?P<host>[\w\-]+) '
-                '(?P<service>[^\[\s]+):.*)' %
-                (SYSLOGDATE))
-SYSTEMD_MATCH = (
-    '^(?P<date>%s)(?P<line> (?P<host>\S+) \S+\[\d+\]\: (?P<status>%s)?.*)' %
-    (SYSLOGDATE, STATUSFMT))
-CONSOLE_MATCH = '^(?P<date>%s)(?P<line>.*)' % DATEFMT
-
-OSLORE = re.compile(OSLO_LOGMATCH)
-SYSLOGRE = re.compile(SYSLOG_MATCH)
-CONSOLERE = re.compile(CONSOLE_MATCH)
-SYSTEMDRE = re.compile(SYSTEMD_MATCH)
-
-SEVS = {
-    'NONE': 0,
-    'DEBUG': 1,
-    'INFO': 2,
-    'AUDIT': 3,
-    'TRACE': 4,
-    'WARNING': 5,
-    'ERROR': 6,
-    'CRITICAL': 7,
-}
-
-
-class LogLine(object):
-    status = "NONE"
-    line = ""
-    date = ""
-    pid = ""
-    service = ""
-
-    def __init__(self, line, old_sev="NONE"):
-        self._parse(line, old_sev)
-
-    def _syslog_status(self, service):
-        if service in ('tgtd', 'proxy-server'):
-            return 'DEBUG'
-        else:
-            return 'INFO'
-
-    def safe_date(self):
-        return '_' + re.sub('[\s\:\.\,]', '_', self.date)
-
-    def _parse(self, line, old_sev):
-        m = OSLORE.match(line)
-        if m:
-            self.status = m.group('status')
-            self.line = m.group('line')
-            self.date = m.group('date')
-            self.pid = m.group('pid')
-            return
-        m = SYSTEMDRE.match(line)
-        if m:
-            self.status = m.group('status') or "NONE"
-            self.line = m.group('line')
-            self.date = m.group('date')
-            self.host = m.group('host')
-            return
-        m = CONSOLERE.match(line)
-        if m:
-            self.date = m.group('date')
-            self.status = old_sev
-            self.line = m.group('line')
-            return
-        m = SYSLOGRE.match(line)
-        if m:
-            self.service = m.group('service')
-            self.line = m.group('line')
-            self.date = m.group('date')
-            self.status = self._syslog_status(self.service)
-            return
-
-        self.status = old_sev
-        self.line = line.rstrip()
-
-
-class SevFilter(object):
-
-    def __init__(self, file_generator, minsev="NONE", limit=None):
-        self.minsev = minsev
-        self.file_generator = file_generator
-        # To avoid matching strings in the log dir path we only consider the
-        # filename itself for severity support.
-        filename = os.path.basename(file_generator.logname)
-        self.supports_sev = \
-            SUPPORTS_SEV.match(filename) is not None
-        self.limit = limit
-        self.strip_control = False
-
-    def strip(self, line):
-        return re.sub('\x1b\[(([03]\d)|\;)+m', '', line)
-
-    def __iter__(self):
-        old_sev = "NONE"
-        lineno = 1
-        for line in self.file_generator:
-            # bail early for limits
-            if self.limit and lineno > int(self.limit):
-                raise StopIteration()
-            # strip control chars in case the console is ascii colored
-            if self.strip_control:
-                line = self.strip(line)
-
-            logline = LogLine(line, old_sev)
-
-            # Some log lines come without severity. Treat those as
-            # belonging to the previous non NONE severity so that we
-            # get those log lines associated to the proper level.
-            if logline.status != "NONE":
-                old_sev = logline.status
-            else:
-                logline.status = old_sev
-
-            if self.supports_sev and self.skip_by_sev(logline.status):
-                continue
-
-            lineno += 1
-            yield logline
-
-    def skip_by_sev(self, sev):
-        """should we skip this line?
-
-        If the line severity is less than our minimum severity,
-        yes we should.
-        """
-        minsev = self.minsev
-        return SEVS.get(sev, 0) < SEVS.get(minsev, 0)
-
-
-class Line(object):
-    date = ''
-
-    def __init__(self, line):
-        self.line = line
-
-
-class NoFilter(object):
-    supports_sev = False
||||
def __init__(self, file_generator):
|
||||
self.file_generator = file_generator
|
||||
|
||||
def __iter__(self):
|
||||
for line in self.file_generator:
|
||||
l = Line(line)
|
||||
l.status = "NONE"
|
||||
yield l
|
||||
|
||||
|
||||
def get_filter_generator(file_generator, environ, root_path, config):
|
||||
"""Return the filter to use as per the config."""
|
||||
|
||||
# Check if the generator is an index page. If so, we don't want to apply
|
||||
# any filters
|
||||
if isinstance(file_generator, generator.IndexIterableBuffer):
|
||||
return NoFilter(file_generator)
|
||||
|
||||
# Check file specific conditions first
|
||||
filter_selected = util.get_file_conditions('filter', file_generator,
|
||||
environ, root_path, config)
|
||||
|
||||
# Otherwise use the defaults in the config
|
||||
if not filter_selected:
|
||||
if config.has_section('general'):
|
||||
if config.has_option('general', 'filter'):
|
||||
filter_selected = config.get('general', 'filter')
|
||||
|
||||
minsev = util.parse_param(environ, 'level', default="NONE")
|
||||
limit = util.parse_param(environ, 'limit')
|
||||
|
||||
if filter_selected:
|
||||
if filter_selected.lower() in ['sevfilter', 'sev']:
|
||||
return SevFilter(file_generator, minsev, limit)
|
||||
elif filter_selected.lower() in ['nofilter', 'no']:
|
||||
return NoFilter(file_generator)
|
||||
|
||||
# Otherwise guess
|
||||
if util.use_passthrough_view(file_generator.file_headers):
|
||||
return NoFilter(file_generator)
|
||||
|
||||
return SevFilter(file_generator, minsev, limit)
|
|
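The severity comparison at the heart of `SevFilter.skip_by_sev` can be exercised on its own. The following is a minimal standalone sketch (not part of the original tree) using the same `SEVS` table:

```python
# Severities map to integers; a line is skipped when it falls below the
# requested floor. Unknown severities map to 0 ("NONE"), so they are kept.
SEVS = {
    'NONE': 0, 'DEBUG': 1, 'INFO': 2, 'AUDIT': 3,
    'TRACE': 4, 'WARNING': 5, 'ERROR': 6, 'CRITICAL': 7,
}


def skip_by_sev(sev, minsev):
    """Return True if a line of severity `sev` is below `minsev`."""
    return SEVS.get(sev, 0) < SEVS.get(minsev, 0)


print(skip_by_sev('DEBUG', 'INFO'))   # True: DEBUG is below the INFO floor
print(skip_by_sev('ERROR', 'INFO'))   # False: ERROR passes
print(skip_by_sev('BOGUS', 'NONE'))   # False: unknown severities are kept
```

Because the lookup defaults to 0, a log line whose severity the parser could not recognise is never silently dropped, only lines explicitly below the floor are.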
@ -1,176 +0,0 @@
#!/usr/bin/env python
#
# Copyright (c) 2013 IBM Corp.
# Copyright (c) 2014 Hewlett-Packard Development Company, L.P.
# Copyright (c) 2014 Rackspace Australia
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import collections
import datetime
import fileinput
import os.path
import re

import jinja2

import os_loganalyze.util as util


class UnsafePath(Exception):
    pass


class NoSuchFile(Exception):
    pass


def does_file_exist(fname):
    """Figure out if we'll be able to read this file.

    Because we are handling the file streams as generators, we actually raise
    an exception too late for us to be able to handle it before apache has
    complete control. This attempts to do the same open outside of the
    generator to trigger the IOError early enough for us to catch it, without
    completely changing the logic flow, as we really want the generator
    pipeline for performance reasons.

    This does open us up to a small chance for a race where the file comes
    or goes between this call and the next, however that is a vanishingly
    small possibility.
    """
    try:
        f = open(fname)
        f.close()
        return True
    except IOError:
        return False


def log_name(environ):
    path = environ['PATH_INFO']
    if path[0] == '/':
        path = path[1:]
    match = re.search('htmlify/(.*)', path)
    if match:
        raw = match.groups(1)[0]
        return raw

    return path


def safe_path(root, log_name):
    """Pull out a safe path from a url.

    Basically we need to ensure that the final computed path
    remains under the root path. If not, we return None to indicate
    that we are very sad.
    """
    if log_name is not None:
        newpath = os.path.abspath(os.path.join(root, log_name))
        if newpath.find(root) == 0:
            return newpath

    return None


def sizeof_fmt(num, suffix='B'):
    # From http://stackoverflow.com/questions/1094841/
    # reusable-library-to-get-human-readable-version-of-file-size
    for unit in ['', 'K', 'M', 'G', 'T', 'P', 'E', 'Z']:
        if abs(num) < 1024.0:
            return "%3.1f%s%s" % (num, unit, suffix)
        num /= 1024.0
    return "%.1f%s%s" % (num, 'Y', suffix)


class DiskIterableBuffer(collections.Iterable):
    def __init__(self, logname, logpath, config):
        self.logname = logname
        self.logpath = logpath
        self.resp_headers = {}
        self.obj = fileinput.FileInput(self.logpath,
                                       openhook=fileinput.hook_compressed)
        self.file_headers = {}
        self.file_headers['filename'] = logname
        self.file_headers.update(util.get_headers_for_file(logpath))

    def __iter__(self):
        return self.obj


class IndexIterableBuffer(collections.Iterable):
    def __init__(self, logname, logpath, config):
        self.logname = logname
        self.logpath = logpath
        self.config = config
        self.resp_headers = {}
        self.file_headers = {}
        self.file_headers['Content-type'] = 'text/html'

        # Use sets here to dedup. We can have duplicates
        # if disk and swift based paths have overlap.
        file_set = self.disk_list()
        # file_list is a list of tuples (relpath, name, mtime, size)
        self.file_list = sorted(file_set, key=lambda tup: tup[0])

    def disk_list(self):
        file_set = set()
        if os.path.isdir(self.logpath):
            for f in os.listdir(self.logpath):
                full_path = os.path.join(self.logpath, f)
                stat_info = os.stat(full_path)
                size = sizeof_fmt(stat_info.st_size)
                mtime = datetime.datetime.utcfromtimestamp(
                    stat_info.st_mtime).isoformat()
                if os.path.isdir(full_path):
                    f = f + '/' if f[-1] != '/' else f
                file_set.add((
                    os.path.join('/', self.logname, f),
                    f,
                    mtime,
                    size
                ))
        return file_set

    def __iter__(self):
        env = jinja2.Environment(
            loader=jinja2.PackageLoader('os_loganalyze', 'templates'))
        template = env.get_template('file_index.html')
        gen = template.generate(logname=self.logname,
                                file_list=self.file_list)
        for l in gen:
            yield l.encode("utf-8")


def get_file_generator(environ, root_path, config=None):
    logname = log_name(environ)
    logpath = safe_path(root_path, logname)
    if logpath is None:
        raise UnsafePath()

    file_generator = None
    if does_file_exist(logpath):
        file_generator = DiskIterableBuffer(logname, logpath, config)

    if not file_generator or not file_generator.obj:
        if (config.has_section('general') and
                config.has_option('general', 'generate_folder_index') and
                config.getboolean('general', 'generate_folder_index')):
            index_generator = IndexIterableBuffer(logname, logpath,
                                                  config)
            if len(index_generator.file_list) > 0:
                return index_generator
        raise NoSuchFile()

    return file_generator
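The containment check in `safe_path` above is the security-sensitive piece of this module: it is what prevents `../` traversal out of the served log root. Here is a self-contained sketch of that check (POSIX paths assumed; not part of the original tree):

```python
import os.path


def safe_path(root, log_name):
    # Resolve the joined path and require that it still starts with root;
    # anything that escapes via ".." resolves elsewhere and yields None.
    if log_name is not None:
        newpath = os.path.abspath(os.path.join(root, log_name))
        if newpath.find(root) == 0:
            return newpath
    return None


print(safe_path('/var/log/', 'screen/n-api.txt'))  # /var/log/screen/n-api.txt
print(safe_path('/var/log/', '../../etc/passwd'))  # None
```

Note the prefix test relies on `root` ending with a separator; with a bare `/var/log` a sibling such as `/var/logs` would also pass, which is why callers in this tree construct roots with a trailing `os.sep`.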
@ -1,80 +0,0 @@
#!/usr/bin/env python
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.


""" Run a simple WSGI server to serve htmlified logs from local devstack
machines. Add these lines to localrc before running stack.sh:

SCREEN_LOGDIR=$DEST/logs/screen
LOG_COLOR=false
"""

import argparse
import os
import socket
import sys
import wsgiref.simple_server

from os_loganalyze import wsgi

DEF_PORT = 8000
LOG_PATH = '/opt/stack/logs/screen/'
WSGI_CONFIG = '/etc/os_loganalyze/wsgi.conf'


def parse_args():
    parser = argparse.ArgumentParser(
        description=__doc__,
        formatter_class=argparse.ArgumentDefaultsHelpFormatter)

    parser.add_argument('--port', '-p', type=int, default=DEF_PORT,
                        help='TCP port to listen on')

    parser.add_argument('--logdir', '-l', default=LOG_PATH,
                        help='Path to the log files to be served')

    parser.add_argument('--wsgi-config', '-c', default=WSGI_CONFIG,
                        help="Specify the WSGI configuration file")

    args = parser.parse_args()
    return (args.port, args.logdir, args.wsgi_config)


def top_wsgi_app(environ, start_response):
    return wsgi.application(environ, start_response, root_path=LOG_PATH,
                            wsgi_config=WSGI_CONFIG)


def my_ip():
    return socket.gethostbyname(socket.gethostname())


def main():
    global LOG_PATH, WSGI_CONFIG
    port, LOG_PATH, WSGI_CONFIG = parse_args()

    if not os.path.isdir(LOG_PATH):
        print("%s is not a directory. Quitting..." % LOG_PATH)
        sys.exit(1)

    url = "http://%s:%d/" % (my_ip(), port)
    print("Listening on port %d with %s as root path" % (port, LOG_PATH))
    print("URLs are like: %shtmlify/screen-n-api.log" % url)
    print("Or go to %s for a page of links" % url)

    wsgiref.simple_server.make_server('', port, top_wsgi_app).serve_forever()


if __name__ == '__main__':
    main()
@ -1,14 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
<title>Index of {{ logname }}</title>
</head>
<body>
<h1>Index of {{ logname }}</h1>
<table><tr><th>Name</th><th>Last Modified</th><th>Size</th></tr>
{% for link, title, mtime, size in file_list %}
<tr><td><a href="{{ link }}">{{ title }}</a></td><td>{{ mtime }}</td><td style="text-align: right">{{ size }}</td></tr>
{% endfor %}
</table>
</body>
</html>
@ -1,126 +0,0 @@
# -*- coding: utf-8 -*-

# Copyright 2010-2011 OpenStack Foundation
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import ConfigParser
import os
import os.path
import tempfile
import urllib
from wsgiref import util

import fixtures
import testtools

import os_loganalyze.wsgi as log_wsgi


_TRUE_VALUES = ('true', '1', 'yes')


def samples_path(append_folder='samples'):
    """Create an abs path for our test samples.

    Because the wsgi has a security check that ensures that we don't
    escape our root path, we need to actually create a full abs path
    for the tests, otherwise the sample files aren't findable.
    """
    return os.path.abspath(
        os.path.join(
            os.path.dirname(__file__),
            append_folder)) + os.sep


class TestCase(testtools.TestCase):

    """Test case base class for all unit tests."""

    def setUp(self):
        """Run before each test method to initialize test environment."""

        super(TestCase, self).setUp()
        test_timeout = os.environ.get('OS_TEST_TIMEOUT', 0)
        try:
            test_timeout = int(test_timeout)
        except ValueError:
            # If timeout value is invalid do not set a timeout.
            test_timeout = 0
        if test_timeout > 0:
            self.useFixture(fixtures.Timeout(test_timeout, gentle=True))

        self.useFixture(fixtures.NestedTempfile())
        self.useFixture(fixtures.TempHomeDir())

        if os.environ.get('OS_STDOUT_CAPTURE') in _TRUE_VALUES:
            stdout = self.useFixture(fixtures.StringStream('stdout')).stream
            self.useFixture(fixtures.MonkeyPatch('sys.stdout', stdout))
        if os.environ.get('OS_STDERR_CAPTURE') in _TRUE_VALUES:
            stderr = self.useFixture(fixtures.StringStream('stderr')).stream
            self.useFixture(fixtures.MonkeyPatch('sys.stderr', stderr))

        self.log_fixture = self.useFixture(fixtures.FakeLogger())
        self.samples_directory = 'samples'
        self.wsgi_config_file = samples_path('samples') + 'wsgi.conf'

    def _start_response(self, *args):
        return

    def fake_env(self, **kwargs):
        environ = dict(**kwargs)
        util.setup_testing_defaults(environ)
        return environ

    def _create_wsgi_config_file_for_job(self):
        # We need to create a new config file for each job run to have the
        # opportunity to modify paths to the test samples dir
        config = ConfigParser.ConfigParser()
        config.read(os.path.expanduser(self.wsgi_config_file))

        if config.has_section('general'):
            if config.has_option('general', 'file_conditions'):
                config.set(
                    'general', 'file_conditions',
                    samples_path() + config.get('general', 'file_conditions'))
        fd, filename = tempfile.mkstemp()
        config.write(os.fdopen(fd, 'w'))
        return filename

    def get_generator(self, fname, level=None, html=True,
                      limit=None, source=None, range_bytes=None):
        kwargs = {'PATH_INFO': '/htmlify/%s/%s' % (self.samples_directory,
                                                   fname)}
        qs = {}
        if level:
            qs['level'] = level
        if limit:
            qs['limit'] = limit
        if source:
            qs['source'] = source
        if qs:
            kwargs['QUERY_STRING'] = urllib.urlencode(qs)

        if html:
            kwargs['HTTP_ACCEPT'] = 'text/html'

        if range_bytes:
            kwargs['HTTP_RANGE'] = 'bytes=%s' % range_bytes

        gen = log_wsgi.application(
            self.fake_env(**kwargs),
            self._start_response,
            root_path=samples_path(''),
            wsgi_config=self._create_wsgi_config_file_for_job())

        return iter(gen)
Binary file not shown.
@ -1,10 +0,0 @@
-- Logs begin at Tue 2017-03-28 12:02:58 UTC, end at Tue 2017-03-28 13:16:45 UTC. --
Mar 28 12:20:42.377230 ubuntu-xenial-osic-cloud1-s3500-8127579 systemd[1]: Started Devstack devstack@c-api.service.
Mar 28 12:20:43.570064 ubuntu-xenial-osic-cloud1-s3500-8127579 cinder-api[8526]: WARNING oslo_reports.guru_meditation_report [-] Guru meditation now registers SIGUSR1 and SIGUSR2 by default for backward compatibility. SIGUSR1 will no longer be registered in a future release, so please use SIGUSR2 to generate reports.
Mar 28 12:20:44.172534 ubuntu-xenial-osic-cloud1-s3500-8127579 cinder-api[8526]: DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "singleton_lock" from (pid=8526) lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:212
Mar 28 12:20:44.173011 ubuntu-xenial-osic-cloud1-s3500-8127579 cinder-api[8526]: DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "singleton_lock" from (pid=8526) lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:225
Mar 28 12:20:44.173575 ubuntu-xenial-osic-cloud1-s3500-8127579 cinder-api[8526]: DEBUG oslo.service.wsgi [-] Loading app osapi_volume from /etc/cinder/api-paste.ini from (pid=8526) load_app /usr/local/lib/python2.7/dist-packages/oslo_service/wsgi.py:352
Mar 28 12:20:44.278200 ubuntu-xenial-osic-cloud1-s3500-8127579 cinder-api[8526]: INFO root [-] Generating grammar tables from /usr/lib/python2.7/lib2to3/Grammar.txt
Mar 28 12:20:44.301267 ubuntu-xenial-osic-cloud1-s3500-8127579 cinder-api[8526]: INFO root [-] Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
Mar 28 12:20:44.425018 ubuntu-xenial-osic-cloud1-s3500-8127579 cinder-api[8526]: INFO cinder.api.extensions [-] Initializing extension manager.
Mar 28 12:20:44.426439 ubuntu-xenial-osic-cloud1-s3500-8127579 cinder-api[8526]: DEBUG cinder.api.extensions [-] Loading extension cinder.api.contrib.standard_extensions from (pid=8526) load_extension /opt/stack/new/cinder/cinder/api/extensions.py:198
@ -1,14 +0,0 @@
May 26 20:44:32.276580 ubuntu-xenial-rax-dfw-9013276 cinder-volume[27348]: INFO cinder.volume.manager [req-34715675-437c-4668-8f27-ee0f5f10cabb tempest-VolumesDeleteCascade-157659443 None] Delete snapshot completed successfully.
May 26 20:44:32.283596 ubuntu-xenial-rax-dfw-9013276 cinder-volume[27348]: Traceback (most recent call last):
May 26 20:44:32.283764 ubuntu-xenial-rax-dfw-9013276 cinder-volume[27348]:   File "/usr/local/lib/python2.7/dist-packages/eventlet/hubs/hub.py", line 457, in fire_timers
May 26 20:44:32.283899 ubuntu-xenial-rax-dfw-9013276 cinder-volume[27348]:     timer()
May 26 20:44:32.284031 ubuntu-xenial-rax-dfw-9013276 cinder-volume[27348]:   File "/usr/local/lib/python2.7/dist-packages/eventlet/hubs/timer.py", line 58, in __call__
May 26 20:44:32.284161 ubuntu-xenial-rax-dfw-9013276 cinder-volume[27348]:     cb(*args, **kw)
May 26 20:44:32.284293 ubuntu-xenial-rax-dfw-9013276 cinder-volume[27348]:   File "/usr/local/lib/python2.7/dist-packages/eventlet/semaphore.py", line 147, in _do_acquire
May 26 20:44:32.284429 ubuntu-xenial-rax-dfw-9013276 cinder-volume[27348]:     waiter.switch()
May 26 20:44:32.284561 ubuntu-xenial-rax-dfw-9013276 cinder-volume[27348]: error: cannot switch to a different thread
May 26 20:44:39.756846 ubuntu-xenial-rax-dfw-9013276 cinder-volume[27348]: INFO cinder.volume.manager [req-7f545400-cf49-47ac-a901-133d9bc6a909 tempest-VolumesBackupsAdminTest-1249474671 None] Terminate volume connection completed successfully.
May 26 20:44:39.837651 ubuntu-xenial-rax-dfw-9013276 cinder-volume[27348]: DEBUG oslo_concurrency.processutils [req-7f545400-cf49-47ac-a901-133d9bc6a909 tempest-VolumesBackupsAdminTest-1249474671 None] Running cmd (subprocess): sudo cinder-rootwrap /etc/cinder/rootwrap.conf tgt-admin --show {{(pid=27788) execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:355}}
May 26 20:44:39.848686 ubuntu-xenial-rax-dfw-9013276 sudo[552]: stack : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/local/bin/cinder-rootwrap /etc/cinder/rootwrap.conf tgt-admin --show
May 26 20:44:39.849078 ubuntu-xenial-rax-dfw-9013276 sudo[552]: pam_unix(sudo:session): session opened for user root by (uid=0)
May 26 20:44:40.091434 ubuntu-xenial-rax-dfw-9013276 sudo[552]: pam_unix(sudo:session): session closed for user root
Binary file not shown.
@ -1,10 +0,0 @@
conditions:
  - filename_pattern: ^.*\.txt\.gz$
    filter: SevFilter
    view: HTMLView
  - filename_pattern: ^.*\.txt?$
    filter: SevFilter
    view: TextView
  - filename_pattern: ^.*$
    filter: NoFilter
    view: PassthroughView
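The file_conditions rules above are consulted in order, first match wins. A hedged sketch of how such a rule table can be evaluated (the real lookup lives in `os_loganalyze.util.get_file_conditions`, whose implementation is not shown in this diff; the helper below is illustrative only):

```python
import re

# Hypothetical first-match evaluation of the conditions above; the keys
# mirror the YAML, the function itself is an assumption for illustration.
CONDITIONS = [
    {'filename_pattern': r'^.*\.txt\.gz$', 'filter': 'SevFilter',
     'view': 'HTMLView'},
    {'filename_pattern': r'^.*\.txt?$', 'filter': 'SevFilter',
     'view': 'TextView'},
    {'filename_pattern': r'^.*$', 'filter': 'NoFilter',
     'view': 'PassthroughView'},
]


def match_condition(fname):
    for cond in CONDITIONS:
        if re.match(cond['filename_pattern'], fname):
            return cond['filter'], cond['view']


print(match_condition('n-api.txt.gz'))  # ('SevFilter', 'HTMLView')
print(match_condition('image.png'))     # ('NoFilter', 'PassthroughView')
```

Ordering matters here: the catch-all `^.*$` rule must come last or it would shadow the more specific text-log patterns.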
Binary file not shown.
Before Width: | Height: | Size: 3.6 KiB |
@ -1,2 +0,0 @@
<hTML>
</html>
@ -1,9 +0,0 @@



<!doctype html PUBLIC "-//W3C//DTD html 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html>

Should detect HTML even though there is leading whitespace.

</html>
File diff suppressed because one or more lines are too long
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@ -1,2 +0,0 @@
2013-09-27 18:22:35.392 testing 123
2013-09-27 18:22:36.123 second line
File diff suppressed because it is too large
@ -1,2 +0,0 @@
[general]
# Don't override the filter or view default detection
@ -1,2 +0,0 @@
[general]
file_conditions = file_conditions.yaml
@ -1,2 +0,0 @@
[general]
generate_folder_index = true
@ -1,3 +0,0 @@
[general]
filter = nofilter
view = passthrough
@ -1,234 +0,0 @@
#!/usr/bin/env python
#
# Copyright (c) 2013 IBM Corp.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Test the ability to convert files into wsgi generators
"""


from os_loganalyze import filter as flt
from os_loganalyze.tests import base


class TestSupportsSevRegex(base.TestCase):
    def test_matching(self):
        def yes(fname):
            self.assertIsNotNone(flt.SUPPORTS_SEV.match(fname),
                                 "%s should have matched" % fname)

        def no(fname):
            self.assertIsNone(flt.SUPPORTS_SEV.match(fname),
                              "%s should not have matched" % fname)

        yes("n-api.txt.gz")
        yes("c-vol.txt.gz")
        yes("tempest.txt")
        yes("devstack@c-api.service.log.txt")
        # this specific bug was hit previously
        no("check/gate-horizon-python27/1dba20d/console.html")
        no("check/gate-tempest-dsvm-trove/c5950fc/console.html")
        no("check/neutron-tempest-plugin-dvr-multinode-scenario"
           "/42fb158/job-output.txt.gz")
        # NOTE(sdague): if we ever get edge conditions in the future,
        # please add checks in here.


class TestFilters(base.TestCase):

    def test_consolidated_filters(self):
        gen = self.get_generator('screen-q-svc.txt.gz', level='DEBUG')
        # we don't need the header, we just don't want to deal with it
        header = gen.next()
        self.assertIn("Display level: ", header)

        # first line is INFO
        line = gen.next()
        self.assertIn("class='INFO", line)
        self.assertIn("href='#_2013-09-27_18_22_11_248'", line)

        # second line is an ERROR
        line = gen.next()
        self.assertIn("class='ERROR", line)
        self.assertIn("href='#_2013-09-27_18_22_11_249'", line)

        # third is a DEBUG
        line = gen.next()
        self.assertIn("class='DEBUG", line)
        self.assertIn("href='#_2013-09-27_18_22_11_249'", line)

        # fourth is a CRITICAL
        line = gen.next()
        self.assertIn("class='CRITICAL", line)
        self.assertIn("href='#_2013-09-27_18_22_11_249'", line)

    def test_systemd_filters(self):
        gen = self.get_generator('devstack@c-api.service.log.txt')
        # dump the header
        gen.next()
        # dump the systemd line
        gen.next()

        # first line
        line = gen.next()
        self.assertIn("NONE", line)
        self.assertIn("href='#_Mar_28_12_20_42_377230'", line)

        # second line
        line = gen.next()
        self.assertIn("WARNING", line)
        self.assertIn("href='#_Mar_28_12_20_43_570064", line)
        # third line
        line = gen.next()
        self.assertIn("DEBUG", line)
        self.assertIn("href='#_Mar_28_12_20_44_172534'", line)

    def test_no_sev_matches_last_sev(self):
        gen = self.get_generator('devstack@c-vol.service.log.txt')
        # dump the header
        gen.next()

        # first line
        line = gen.next()
        self.assertIn("INFO", line)
        self.assertIn("href='#_May_26_20_44_32_276580'", line)

        # second line
        line = gen.next()
        self.assertIn("INFO", line)
        self.assertIn("Traceback", line)

        for x in range(9):
            # Skip ahead to first DEBUG line.
            line = gen.next()

        # First DEBUG line
        self.assertIn("DEBUG", line)
        self.assertIn("href='#_May_26_20_44_39_837651'", line)

        # Second DEBUG line
        self.assertIn("DEBUG", line)
        self.assertIn("cinder-rootwrap", line)

    def test_devstack_filters(self):
        gen = self.get_generator('devstacklog.txt.gz')
        # dump the header
        gen.next()

        # first line
        line = gen.next()
        self.assertIn("href='#_2014-03-17_16_11_24_173'", line)

    def test_devstack_filters_nodrop(self):
        gen = self.get_generator('devstacklog.txt.gz', level='INFO')

        header = gen.next()
        self.assertNotIn("Display level: ", header)

        # we shouldn't be dropping anything with the first line
        line = gen.next()
        self.assertIn("href='#_2014-03-17_16_11_24_173'", line)

    def test_html_file_filters(self):
        # do we avoid double escaping html files
        gen = self.get_generator('console.html.gz')

        header = gen.next()
        self.assertNotIn("Display level: ", header)

        # we shouldn't be dropping anything with the first line
        line = gen.next()
        self.assertIn(
            "<span class='NONE _2013-09-27_18_07_11_860'>"
            "<a name='_2013-09-27_18_07_11_860' class='date' "
            "href='#_2013-09-27_18_07_11_860'>2013-09-27 18:07:11.860</a> | "
            "Started by user <a href='https://jenkins02.openstack.org/user"
            "/null' class='model-link'>anonymous</a>\n</span>",
            line)

        line = gen.next()
        self.assertIn("<a name='_2013-09-27_18_07_11_884' "
                      "class='date' href='#_2013-09-27_18_07_11_884'>", line)

        line = gen.next()
        self.assertIn("<a name='_2013-09-27_18_09_18_784' "
                      "class='date' href='#_2013-09-27_18_09_18_784'>", line)

        # We've snuck some high-precision microsecond timestamps as
        # produced by zuul-runner in the end of this file; make sure
        # they are recognised.
        found = False
        for line in gen:
            look_for = "<span class='NONE _2013-09-27_18_47_33_342999'>" \
                       "<a name='_2013-09-27_18_47_33_342999' class='date' " \
                       "href='#_2013-09-27_18_47_33_342999'>" \
                       "2013-09-27 18:47:33.342999</a>"
            if line.startswith(look_for):
                found = True

        self.assertTrue(found)

    def test_html_escape(self):
        gen = self.get_generator('screen-n-api-2.txt.gz')
        # throw away header
        gen.next()

        line = gen.next()
        self.assertIn(
            "Calling method <bound method Versions.index of "
            "<nova.api.openstack.compute.versions.Versions object at "
            "0x371dad0>> _process_stack",
            line)

    def test_syslog_file_filter(self):
        gen = self.get_generator('syslog.txt')
        header = gen.next()
        self.assertIn("Display level: ", header)

        line = gen.next()
        self.assertIn("<span class='INFO", line)
        self.assertIn("object-replicator: Object replication complete.", line)
        line = gen.next()
        self.assertIn("<span class='INFO", line)
        self.assertIn("object-server: Started child 32090", line)
        line = gen.next()
        self.assertIn("<span class='DEBUG", line)
        self.assertIn('proxy-server: Pipeline is "catch_errors', line)

    def test_syslog_file_filter_nodebug(self):
        gen = self.get_generator('syslog.txt', level='INFO')
        header = gen.next()
        self.assertIn("Display level: ", header)

        line = gen.next()
        self.assertIn("<span class='INFO", line)
        self.assertIn("object-replicator: Object replication complete.", line)
        line = gen.next()
        self.assertIn("<span class='INFO", line)
        self.assertIn("object-server: Started child 32090", line)
        line = gen.next()
        self.assertIn("<span class='INFO", line)
        self.assertIn('object-server: SIGTERM received', line)

    def test_limit_filters(self):
        gen = self.get_generator('devstacklog.txt.gz', limit=10)

        lines = 0
        for line in gen:
            lines += 1

        # the lines should actually be 2 + the limit we've asked for
        # given the header and footer
        self.assertEqual(12, lines)
@ -1,28 +0,0 @@
# -*- coding: utf-8 -*-

# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
test_os_loganalyze
----------------------------------

Tests for `os_loganalyze` module.
"""

from os_loganalyze.tests import base


class TestOs_loganalyze(base.TestCase):

    def test_something(self):
        pass
@@ -1,61 +0,0 @@
#!/usr/bin/env python
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Test the view generators
"""

import os_loganalyze.filter as osfilter
import os_loganalyze.generator as osgen
from os_loganalyze.tests import base
import os_loganalyze.view as osview


class TestViews(base.TestCase):
    def get_generator(self, fname):
        # Override base's get_generator because we don't want the full
        # wsgi application. We just need the generator to give to Views.
        root_path = base.samples_path(self.samples_directory)
        kwargs = {'PATH_INFO': '/htmlify/%s' % fname}
        file_generator = osgen.get_file_generator(self.fake_env(**kwargs),
                                                  root_path)
        filter_generator = osfilter.SevFilter(file_generator)
        return filter_generator

    def test_html_detection(self):
        gen = self.get_generator('sample.html')
        html_view = osview.HTMLView(gen)
        i = iter(html_view)
        self.assertFalse(html_view.is_html)
        # Move the generator so that the is_html flag is set
        i.next()
        self.assertTrue(html_view.is_html)

    def test_doctype_html_detection(self):
        gen = self.get_generator('sample_doctype.html')
        html_view = osview.HTMLView(gen)
        i = iter(html_view)
        self.assertFalse(html_view.is_html)
        # Move the generator so that the is_html flag is set
        i.next()
        self.assertTrue(html_view.is_html)

    def test_empty_file_html_view(self):
        gen = self.get_generator('empty.html')
        html_view = osview.HTMLView(gen)
        self.assertFalse(html_view.is_html)
        full_text = ""
        for i in html_view:
            full_text += i
        self.assertEqual("", full_text)
@@ -1,284 +0,0 @@
#!/usr/bin/env python
#
# Copyright (c) 2013 IBM Corp.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Test the ability to convert files into wsgi generators
"""

import collections
import re
import types

import testtools

from os_loganalyze.tests import base
import os_loganalyze.wsgi as log_wsgi


SEVS = {
    'NONE': 0,
    'DEBUG': 1,
    'INFO': 2,
    'AUDIT': 3,
    'TRACE': 4,
    'WARNING': 5,
    'ERROR': 6,
    'CRITICAL': 7,
}

SEVS_SEQ = ['NONE', 'DEBUG', 'INFO', 'AUDIT', 'TRACE', 'WARNING', 'ERROR',
            'CRITICAL']

ISO8601RE = r'\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\d(.\d+)?'
TIME_REGEX = r'\d\d\d\d-\d\d-\d\d\s\d\d:\d\d:\d\d(.\d+)?'


# add up all the counts from a generator
def count_types(gen):
    counts = collections.defaultdict(lambda: 0)

    for line in gen:
        counted = False

        # is it even a proper log message?
        if re.match(TIME_REGEX, line):
            # skip NONE since it's an implicit level
            for key in SEVS_SEQ[1:]:
                if ' %s ' % key in line:
                    counts[key] += 1
                    counted = True
                    break
        if not counted:
            counts['NONE'] += 1
    return counts


# count up all the lines at all levels
def compute_total(level, counts):
    total = 0
    for l in SEVS_SEQ[SEVS[level]:]:
        # so that we don't need to know all the levels
        if counts.get(l):
            total = total + counts[l]
    return total


class TestWsgiDisk(base.TestCase):
    """Test loading files from samples on disk."""

    # counts for known files for testing
    files = {
        'screen-c-api.txt.gz': {
            'DEBUG': 2804,
            'INFO': 486,
            'AUDIT': 249,
            'TRACE': 0,
            'WARNING': 50,
            'ERROR': 0,
        },
        'screen-n-api.txt.gz': {
            'DEBUG': 11886,
            'INFO': 2721,
            'AUDIT': 271,
            'TRACE': 0,
            'WARNING': 6,
            'ERROR': 5
        },
        'screen-q-svc.txt.gz': {
            'DEBUG': 43583,
            'INFO': 262,
            'AUDIT': 0,
            'TRACE': 589,
            'WARNING': 48,
            'ERROR': 72,
            'CRITICAL': 1,
        },
        'screen-neutron-l3.txt.gz': {
            'DEBUG': 25212,
            'INFO': 85,
            'AUDIT': 0,
            'TRACE': 0,
            'WARNING': 122,
            'ERROR': 19,
        },
    }

    def test_pass_through_all(self):
        for fname in self.files:
            gen = self.get_generator(fname, html=False)
            self.assertEqual(
                compute_total('DEBUG', count_types(gen)),
                compute_total('DEBUG', self.files[fname]))

    def test_pass_through_at_levels(self):
        for fname in self.files:
            for level in self.files[fname]:
                gen = self.get_generator(fname, level=level, html=False)

                counts = count_types(gen)
                print(fname, counts)

                self.assertEqual(
                    compute_total(level, counts),
                    compute_total(level, self.files[fname]))

    def test_invalid_file(self):
        gen = log_wsgi.application(
            self.fake_env(PATH_INFO='../'), self._start_response)
        self.assertEqual(gen, ['Invalid file url'])

    def test_file_not_found(self):
        gen = log_wsgi.application(
            self.fake_env(PATH_INFO='/htmlify/foo.txt'),
            self._start_response)
        self.assertEqual(gen, ['File Not Found'])

    def test_plain_text(self):
        gen = self.get_generator('screen-c-api.txt.gz', html=False)
        self.assertEqual(type(gen), types.GeneratorType)

        first = gen.next()
        self.assertIn(
            '+ ln -sf /opt/stack/new/screen-logs/screen-c-api.2013-09-27-1815',
            first)

    def test_html_gen(self):
        gen = self.get_generator('screen-c-api.txt.gz')
        first = gen.next()
        self.assertIn('<html>', first)

    def test_plain_non_compressed(self):
        gen = self.get_generator('screen-c-api.txt', html=False)
        self.assertEqual(type(gen), types.GeneratorType)

        first = gen.next()
        self.assertIn(
            '+ ln -sf /opt/stack/new/screen-logs/screen-c-api.2013-09-27-1815',
            first)

    def test_passthrough_filter(self):
        # Test the passthrough filter returns an image stream
        gen = self.get_generator('openstack_logo.png')
        first = gen.next()
        self.assertNotIn('html', first)
        with open(base.samples_path('samples') + 'openstack_logo.png') as f:
            self.assertEqual(first, f.readline())

    def test_config_no_filter(self):
        self.wsgi_config_file = (base.samples_path('samples') +
                                 'wsgi_plain.conf')
        # Try to limit the filter to 10 lines, but we should get the full
        # amount.
        gen = self.get_generator('devstacklog.txt.gz', limit=10)

        lines = 0
        for line in gen:
            lines += 1

        # the lines should actually be 2 + the limit we've asked for
        # given the header and footer, but we expect to get the full file
        self.assertNotEqual(12, lines)

    def test_config_passthrough_view(self):
        self.wsgi_config_file = (base.samples_path('samples') +
                                 'wsgi_plain.conf')
        # Check there is no HTML on a file that should otherwise have it
        gen = self.get_generator('devstacklog.txt.gz')

        first = gen.next()
        self.assertNotIn('<html>', first)

    def test_file_conditions(self):
        self.wsgi_config_file = (base.samples_path('samples') +
                                 'wsgi_file_conditions.conf')
        # Check we are matching and setting the HTML filter
        gen = self.get_generator('devstacklog.txt.gz')

        first = gen.next()
        self.assertIn('<html>', first)

        # Check for simple.html we don't have HTML but do have date lines
        gen = self.get_generator('simple.txt')

        first = gen.next()
        self.assertIn('2013-09-27 18:22:35.392 testing 123', first)

        # Test images go through the passthrough filter
        gen = self.get_generator('openstack_logo.png')
        first = gen.next()
        self.assertNotIn('html', first)
        with open(base.samples_path('samples') + 'openstack_logo.png') as f:
            self.assertEqual(first, f.readline())

    def test_accept_range(self):
        gen = self.get_generator('screen-c-api.txt.gz', range_bytes='0-5')
        body = gen.next()
        self.assertEqual('<html>', body)

    def test_accept_range_seek(self):
        gen = self.get_generator('screen-c-api.txt.gz', range_bytes='7-12')
        body = gen.next()
        self.assertEqual('<head>', body)

    def test_accept_range_no_start(self):
        gen = self.get_generator('screen-c-api.txt.gz', range_bytes='-8')
        body = gen.next()
        self.assertEqual('</html>\n', body)

    def test_accept_range_no_end(self):
        gen = self.get_generator('screen-c-api.txt.gz', range_bytes='7-')
        body = gen.next()
        self.assertNotIn('<html>', body)
        self.assertIn('<head>', body)

    def test_accept_range_no_start_no_end(self):
        gen = self.get_generator('screen-c-api.txt.gz', range_bytes='-')
        body = gen.next()
        self.assertEqual('Invalid Range', body)

    def test_accept_range_no_hyphen(self):
        gen = self.get_generator('screen-c-api.txt.gz', range_bytes='7')
        body = gen.next()
        self.assertEqual('Invalid Range', body)

    def test_folder_index(self):
        self.wsgi_config_file = (base.samples_path('samples') +
                                 'wsgi_folder_index.conf')
        gen = self.get_generator('')
        full = ''
        for line in gen:
            full += line

        full_lines = full.split('\n')
        self.assertEqual('<!DOCTYPE html>', full_lines[0])
        self.assertIn('samples/</title>', full_lines[3])
        # self.assertEqual("foo", full_lines)
        self.assertThat(
            full_lines[9],
            testtools.matchers.MatchesRegex(
                r' <tr><td><a href="/samples/console.html.gz">'
                r'console.html.gz</a></td><td>' + ISO8601RE + r'</td>'
                r'<td style="text-align: right">277.4KB</td></tr>'),
            "\nFull log: %s" % full_lines)
        self.assertThat(
            full_lines[-5],
            testtools.matchers.MatchesRegex(
                r' <tr><td><a href="/samples/wsgi_plain.conf">'
                r'wsgi_plain.conf</a></td><td>' + ISO8601RE + r'</td>'
                r'<td style="text-align: right">47.0B</td></tr>'),
            "\nFull log: %s" % full_lines)
        self.assertEqual('</html>', full_lines[-1],
                         "\nFull log: %s" % full_lines)
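The severity-counting helpers above (`count_types` and `compute_total`) can be exercised on their own. A minimal Python 3 sketch follows; the reduced severity table and the sample log lines are invented for illustration, and `gen.next()`-style iteration is replaced by plain lists:

```python
import collections
import re

# Reduced severity table and timestamp regex in the spirit of the test module
SEVS = {'NONE': 0, 'DEBUG': 1, 'INFO': 2, 'WARNING': 3, 'ERROR': 4}
SEVS_SEQ = ['NONE', 'DEBUG', 'INFO', 'WARNING', 'ERROR']
TIME_REGEX = r'\d\d\d\d-\d\d-\d\d\s\d\d:\d\d:\d\d(.\d+)?'


def count_types(lines):
    """Tally lines per severity; lines without a timestamp count as NONE."""
    counts = collections.defaultdict(int)
    for line in lines:
        counted = False
        if re.match(TIME_REGEX, line):
            # NONE is implicit, so only scan the explicit levels
            for key in SEVS_SEQ[1:]:
                if ' %s ' % key in line:
                    counts[key] += 1
                    counted = True
                    break
        if not counted:
            counts['NONE'] += 1
    return counts


def compute_total(level, counts):
    """Sum the counts at `level` and every more severe level."""
    return sum(counts.get(l, 0) for l in SEVS_SEQ[SEVS[level]:])


sample = [
    '2013-09-27 18:22:35.392 INFO nova started',
    '2013-09-27 18:22:36.001 ERROR nova boom',
    'not a log line',
]
counts = count_types(sample)
```

With this sample, `counts` holds one INFO, one ERROR, and one NONE line, and `compute_total('INFO', counts)` sums INFO and above to 2 — the same cumulative arithmetic the `test_pass_through_at_levels` test relies on.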
@@ -1,137 +0,0 @@
#!/usr/bin/python
#
# Copyright (c) 2014 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import cgi
import logging
import os
import re
import time

import magic
import yaml


def parse_param(env, name, default=None):
    parameters = cgi.parse_qs(env.get('QUERY_STRING', ''))
    if name in parameters:
        return cgi.escape(parameters[name][0])
    else:
        return default


def get_file_mime(file_path):
    """Get the file mime using libmagic."""

    if not os.path.isfile(file_path):
        return None

    if hasattr(magic, 'from_file'):
        return magic.from_file(file_path, mime=True)
    else:
        # no magic.from_file, we might be using the libmagic bindings
        m = magic.open(magic.MAGIC_MIME)
        m.load()
        return m.file(file_path).split(';')[0]


def get_headers_for_file(file_path):
    """Get some headers for a real (on-disk) file.

    In a format similar to fetching from swift.
    """

    resp = {}
    resp['content-length'] = str(os.stat(file_path).st_size)
    resp['accept-ranges'] = 'bytes'
    resp['last-modified'] = time.strftime(
        "%a, %d %b %Y %H:%M:%S GMT", time.gmtime(os.path.getmtime(file_path)))
    resp['etag'] = ''
    resp['x-trans-id'] = str(time.time())
    resp['date'] = time.strftime("%a, %d %b %Y %H:%M:%S GMT")
    resp['content-type'] = get_file_mime(file_path)
    return resp


def should_be_html(environ):
    """Simple content negotiation.

    If the client supports content negotiation, and asks for text/html,
    we give it to them, unless they also specifically want to override
    by passing ?content-type=text/plain in the query.

    This should be able to handle the case of dumb clients defaulting to
    html, but also let devs override the text format when 35 MB html
    log files kill their browser (as per a nova-api log).
    """
    text_override = False
    accepts_html = ('HTTP_ACCEPT' in environ and
                    'text/html' in environ['HTTP_ACCEPT'])
    parameters = cgi.parse_qs(environ.get('QUERY_STRING', ''))
    if 'content-type' in parameters:
        ct = cgi.escape(parameters['content-type'][0])
        if ct == 'text/plain':
            text_override = True

    return accepts_html and not text_override


def use_passthrough_view(file_headers):
    """Guess if we need to use the passthrough filter."""

    if 'content-type' not in file_headers:
        # For legacy we'll try and format. This shouldn't occur though.
        return False
    else:
        if file_headers['content-type'] in ['text/plain', 'text/html']:
            # We want to format these files
            return False
        if file_headers['content-type'] in ['application/x-gzip',
                                            'application/gzip']:
            # We'll need to guess if we should render the output or offer a
            # download.
            filename = file_headers['filename']
            filename = filename[:-3] if filename[-3:] == '.gz' else filename
            if os.path.splitext(filename)[1] in ['.txt', '.html', '.log']:
                return False
    return True


def load_file_conditions(config):
    if config.has_section('general'):
        if config.has_option('general', 'file_conditions'):
            try:
                with open(config.get('general', 'file_conditions'), 'r') as f:
                    fm = yaml.safe_load(f)
                    return fm.get('conditions', [])
            except Exception:
                logging.warn("Failed to load file conditions")
    return []


def get_file_conditions(item, file_generator, environ, root_path, config):
    """Get the matching item for the given file."""
    # Also take in environ and root_path if in the future we want to match
    # on other conditions

    # We return the first match or None if nothing is found

    conditions = load_file_conditions(config)
    for cond in conditions:
        if 'filename_pattern' in cond and item in cond:
            if re.match(cond['filename_pattern'], file_generator.logname):
                return cond[item]

    return None
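The content negotiation that the `should_be_html` docstring above describes can be sketched without the legacy `cgi` helpers (`cgi.parse_qs` and `cgi.escape` were removed from modern Python). A standalone Python 3 approximation, where the environ dicts are made-up examples rather than real WSGI requests:

```python
from urllib.parse import parse_qs


def should_be_html(environ):
    """Serve HTML only if the client's Accept header includes text/html
    and the request has not overridden it with ?content-type=text/plain."""
    accepts_html = 'text/html' in environ.get('HTTP_ACCEPT', '')
    params = parse_qs(environ.get('QUERY_STRING', ''))
    text_override = params.get('content-type', [''])[0] == 'text/plain'
    return accepts_html and not text_override


# A browser asking for HTML gets HTML
assert should_be_html({'HTTP_ACCEPT': 'text/html,application/xhtml+xml'})
# The same browser forcing plain text via the query string does not
assert not should_be_html({'HTTP_ACCEPT': 'text/html',
                           'QUERY_STRING': 'content-type=text/plain'})
# A client with no Accept header (e.g. curl) falls back to plain text
assert not should_be_html({})
```

This is the behaviour the docstring promises: dumb clients defaulting to HTML still get it, while a developer can tack `?content-type=text/plain` onto the URL to avoid rendering a huge HTML log.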
@@ -1,264 +0,0 @@
# Copyright (c) 2013 IBM Corp.
# Copyright (c) 2014 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import cgi
import collections
import re

import os_loganalyze.generator as generator
import os_loganalyze.util as util

HTML_HEADER = """<html>
<head>
<style>
a {color: #000; text-decoration: none}
a:hover {text-decoration: underline}
.DEBUG, .DEBUG a {color: #888}
.ERROR, .ERROR a {color: #c00; font-weight: bold}
.CRITICAL, .CRITICAL a {color: #c00; background-color: yellow;
    font-weight: bold}
.TRACE, .TRACE a {color: #c60}
.WARNING, .WARNING a {color: #D89100; font-weight: bold}
.INFO, .INFO a {color: #006}
#selector, #selector a {color: #888}
#selector a:hover {color: #c00}
.highlight {
    background-color: rgb(255, 255, 204);
    display: block;
}
pre span span {padding-left: 0}
pre span {
    padding-left: 22em;
    text-indent: -22em;
    white-space: pre-wrap;
    word-break: break-all;
    display: block;
}
</style>
</head>
<body>
"""

HTML_HEADER_SEV = """
<span id='selector'>
Display level: [
<a href='?'>ALL</a> |
<a href='?level=DEBUG'>DEBUG</a> |
<a href='?level=INFO'>INFO</a> |
<a href='?level=AUDIT'>AUDIT</a> |
<a href='?level=TRACE'>TRACE</a> |
<a href='?level=WARNING'>WARNING</a> |
<a href='?level=ERROR'>ERROR</a> |
<a href='?level=CRITICAL'>CRITICAL</a> ]
</span>"""

HTML_HEADER_BODY = "\n<pre>"

HTML_FOOTER = """</pre></body>
<script>
var old_highlight;

function remove_highlight() {
    if (old_highlight) {
        items = document.getElementsByClassName(old_highlight);
        for (var i = 0; i < items.length; i++) {
            items[i].className = items[i].className.replace('highlight','');
        }
    }
}

function update_selector(highlight) {
    var selector = document.getElementById('selector');
    if (selector) {
        var links = selector.getElementsByTagName('a');
        for (var i = 0; i < links.length; i++) {
            links[i].hash = "#" + highlight;
        }
    }
}

function highlight_by_hash(event) {
    var highlight = window.location.hash.substr(1);
    // handle changes to highlighting separate from reload
    if (event) {
        highlight = event.target.hash.substr(1);
    }
    remove_highlight();
    if (highlight) {
        elements = document.getElementsByClassName(highlight);
        for (var i = 0; i < elements.length; i++) {
            elements[i].className += " highlight";
        }
        update_selector(highlight);
        old_highlight = highlight;
    }
}
document.onclick = highlight_by_hash;
highlight_by_hash();
</script>
</html>
"""


DATE_LINE = ("<span class='%s %s'><a name='%s' class='date' href='#%s'>"
             "%s</a>%s\n</span>")
NONDATE_LINE = "<span class='%s'>%s\n</span>"
HTML_RE = re.compile("<(!doctype )?html", re.IGNORECASE)
SKIP_LINES = re.compile("</?pre>")

# pre tags mean we're partial html and shouldn't escape
NO_ESCAPE_START = re.compile("<pre>")
NO_ESCAPE_FINISH = re.compile("</pre>")


class HTMLView(collections.Iterable):
    should_escape = True
    sent_header = False
    is_html = False
    no_escape_count = 0

    def __init__(self, filter_generator):
        self.headers = [('Content-type', 'text/html')]
        self.filter_generator = filter_generator

    def _discover_html(self, line):
        self.is_html = HTML_RE.match(line)
        if self.is_html:
            self.should_escape = False

    def _process_line(self, line):
        if NO_ESCAPE_START.match(line.line):
            # Disable escaping starting at this line
            self.should_escape = False
            # Count the number of times the escape has started in case there
            # are <pre> (or escape) blocks inside of each other
            self.no_escape_count += 1
        elif NO_ESCAPE_FINISH.match(line.line):
            # Check to see if we've exited the escape stack.
            self.no_escape_count -= 1
            if self.no_escape_count == 0:
                # Re-enable escaping starting at this line. ie we're done in
                # the escape free zone (eg outside of the pre tags again)
                self.should_escape = True

        if SKIP_LINES.match(line.line):
            return

        if self.should_escape:
            safeline = cgi.escape(line.line)
        else:
            safeline = line.line

        if line.date:
            safe_date = line.safe_date()
            newline = DATE_LINE % (line.status, safe_date, safe_date,
                                   safe_date, line.date, safeline)
        else:
            newline = NONDATE_LINE % (line.status, safeline)
        return newline

    def __iter__(self):
        igen = (x for x in self.filter_generator)
        # Grab the first non-empty line
        first_line = None
        for i in igen:
            if i.line:
                first_line = i
                break
        if not first_line:
            # The file must be empty
            return
        self._discover_html(first_line.line)

        if not self.is_html:
            header = HTML_HEADER
            if self.filter_generator.supports_sev:
                header += HTML_HEADER_SEV
            header += HTML_HEADER_BODY
            yield header

        first = self._process_line(first_line)
        if first:
            yield first

        for line in igen:
            newline = self._process_line(line)
            if newline:
                yield newline

        if not self.is_html:
            yield HTML_FOOTER


class TextView(collections.Iterable):
    def __init__(self, filter_generator):
        self.headers = [('Content-type', 'text/plain')]
        self.filter_generator = filter_generator

    def __iter__(self):
        for line in self.filter_generator:
            yield line.date + line.line + "\n"


class PassthroughView(collections.Iterable):
    def __init__(self, filter_generator):
        self.headers = []
        self.filter_generator = filter_generator
        for k, v in self.filter_generator.file_generator.file_headers.items():
            self.headers.append((k, v))

    def __iter__(self):
        for line in self.filter_generator:
            yield line.line


def get_view_generator(filter_generator, environ, root_path, config):
    """Return the view to use as per the config."""
    # Check if the generator is an index page. If so, we don't want to apply
    # any additional formatting
    if isinstance(filter_generator.file_generator,
                  generator.IndexIterableBuffer):
        return PassthroughView(filter_generator)

    # Determine if html is supported by the client if yes then supply html
    # otherwise fallback to text.
    supports_html = util.should_be_html(environ)

    # Check file specific conditions first
    view_selected = util.get_file_conditions('view',
                                             filter_generator.file_generator,
                                             environ, root_path, config)

    # Otherwise use the defaults in the config
    if not view_selected:
        if config.has_section('general'):
            if config.has_option('general', 'view'):
                view_selected = config.get('general', 'view')

    if view_selected:
        if view_selected.lower() in ['htmlview', 'html'] and supports_html:
            return HTMLView(filter_generator)
        elif view_selected.lower() in ['textview', 'text']:
            return TextView(filter_generator)
        elif view_selected.lower() in ['passthroughview', 'passthrough']:
            return PassthroughView(filter_generator)

    # Otherwise guess
    if util.use_passthrough_view(filter_generator.file_generator.file_headers):
        return PassthroughView(filter_generator)
    elif supports_html:
        return HTMLView(filter_generator)
    else:
        return TextView(filter_generator)
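`HTMLView._process_line` above toggles escaping with a counter, so that nested `<pre>` blocks only re-enable escaping when the outermost one closes. That counting logic can be checked in isolation; a Python 3 sketch using `html.escape` in place of the old `cgi.escape`, with invented input lines:

```python
import html
import re

NO_ESCAPE_START = re.compile("<pre>")
NO_ESCAPE_FINISH = re.compile("</pre>")
SKIP_LINES = re.compile("</?pre>")


def render(lines):
    """Escape lines except inside <pre>...</pre>, tracking nesting depth."""
    should_escape = True
    depth = 0
    out = []
    for line in lines:
        if NO_ESCAPE_START.match(line):
            should_escape = False
            depth += 1
        elif NO_ESCAPE_FINISH.match(line):
            depth -= 1
            if depth == 0:
                # the outermost escape-free zone has closed
                should_escape = True
        if SKIP_LINES.match(line):
            continue  # the <pre> markers themselves are dropped
        out.append(html.escape(line) if should_escape else line)
    return out


rendered = render(['a < b', '<pre>', '<b>raw</b>', '</pre>', 'c < d'])
```

Here `'a < b'` and `'c < d'` come out escaped while `'<b>raw</b>'` passes through untouched, which is exactly why raw HTML embedded in a log between `<pre>` markers survives the view.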
@@ -1,170 +0,0 @@
#!/usr/bin/env python
#
# Copyright (c) 2013 IBM Corp.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.


import ConfigParser
import fileinput
import os.path
import sys

import os_loganalyze.filter as osfilter
import os_loganalyze.generator as osgen
import os_loganalyze.view as osview


def htmlify_stdin():
    out = sys.stdout
    gen = osview.HTMLView(osfilter.NoFilter(fileinput.FileInput()))
    for line in gen:
        out.write(line)


def get_config(wsgi_config):
    config = ConfigParser.ConfigParser()
    config.read(os.path.expanduser(wsgi_config))
    return config


def get_range(environ, start_response, view_generator):
    if '-' not in environ['HTTP_RANGE']:
        status = '400 Bad Request'
        response_headers = [('Content-type', 'text/plain')]
        start_response(status, response_headers)
        yield 'Invalid Range'
        return

    _range = environ['HTTP_RANGE'].split('=')[1].split('-')

    try:
        start = None
        if _range[0] is not '':
            start = int(_range[0])

        end = None
        if _range[1] is not '':
            end = int(_range[1])
    except ValueError:
        status = '400 Bad Request'
        response_headers = [('Content-type', 'text/plain')]
        start_response(status, response_headers)
        yield 'Invalid Range'
        return

    if start is None and end is None:
        status = '400 Bad Request'
        response_headers = [('Content-type', 'text/plain')]
        start_response(status, response_headers)
        yield 'Invalid Range'
        return

    # bytes=-5 means the last 5 bytes
    remainder = None
    if start is None:
        remainder = -end  # this will be a negative index
        start = 0
        end = None

    position = 0
    body = []

    status = '206 Partial Content'
    start_response(status, view_generator.headers)

    stop = False
    for chunk in view_generator:
        new_chunk = None

        if (position + len(chunk)) < start:
            # ignore this chunk
            position += len(chunk)
        elif position < start:
            # start is between position and position + len(chunk)
            # we only want a portion of this chunk
            offset = start - position
            if end and end < (position + len(chunk)):
                # the entire range is a subset of this chunk
                cutoff = end - position + 1
                new_chunk = chunk[offset:cutoff]
                stop = True  # don't break until we have yielded
            else:
                new_chunk = chunk[offset:]
                position += len(chunk[offset:])
        elif end and end <= (position + len(chunk)):
            cutoff = end - position + 1
            new_chunk = chunk[:cutoff]
            stop = True  # don't break until we have yielded
        else:
            new_chunk = chunk
            position += len(chunk)

        if remainder and new_chunk:
            body.append(new_chunk)
        elif new_chunk:
            yield new_chunk

        if stop:
            break

    if remainder:
        _body = ''.join(body)
        if len(_body) > -remainder:  # remainder is negative
            yield _body[remainder:]
        else:
            yield _body


def application(environ, start_response, root_path=None,
                wsgi_config='/etc/os_loganalyze/wsgi.conf'):
    if root_path is None:
        root_path = environ.get('OS_LOGANALYZE_ROOT_PATH',
                                '/srv/static/logs')
    config = get_config(wsgi_config)

    # make root path absolute in case we have a path with local components
    # specified
    root_path = os.path.abspath(root_path)

    status = '200 OK'

    try:
        file_generator = osgen.get_file_generator(environ, root_path, config)
    except osgen.UnsafePath:
        status = '400 Bad Request'
        response_headers = [('Content-type', 'text/plain')]
        start_response(status, response_headers)
        return ['Invalid file url']
    except osgen.NoSuchFile:
        status = "404 Not Found"
        response_headers = [('Content-type', 'text/plain')]
        start_response(status, response_headers)
        return ['File Not Found']

    filter_generator = osfilter.get_filter_generator(file_generator, environ,
                                                     root_path, config)
    view_generator = osview.get_view_generator(filter_generator, environ,
                                               root_path, config)

    if 'HTTP_RANGE' in environ:
        return get_range(environ, start_response, view_generator)

    else:
        start_response(status, view_generator.headers)
        return view_generator


# for development purposes, makes it easy to test the filter output
if __name__ == "__main__":
    htmlify_stdin()
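`get_range` above carves an HTTP byte range (inclusive of both ends) out of a stream of chunks by tracking its byte position. The slicing arithmetic can be checked in isolation with a simplified helper; this is a sketch of the forward-range case only (no `bytes=-N` suffix handling), and the chunk list is invented sample data:

```python
def slice_range(chunks, start, end=None):
    """Yield the bytes [start, end] (inclusive) from an iterable of
    string chunks, mirroring the positional bookkeeping in get_range."""
    position = 0
    for chunk in chunks:
        chunk_end = position + len(chunk)
        if chunk_end <= start:
            # chunk lies entirely before the requested range
            position = chunk_end
            continue
        lo = max(start - position, 0)  # offset of the range into this chunk
        if end is not None and end < chunk_end:
            # the range finishes inside this chunk
            yield chunk[lo:end - position + 1]
            return
        yield chunk[lo:]
        position = chunk_end


body = ''.join(slice_range(['<html>', '\n', '<head>'], 7, 12))
```

With an HTML view that emits `'<html>'`, `'\n'`, `'<head>'`, asking for bytes 7-12 skips the first seven bytes and returns `'<head>'` — the same result `test_accept_range_seek` expects from the full application.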
@@ -1,10 +0,0 @@
# The order of packages is significant, because pip processes them in the order
# of appearance. Changing the order has an impact on the overall integration
# process, which may cause wedges in the gate later.
pbr>=1.6
Babel>=1.3
Jinja2>=2.6 # BSD License (3 clause)
PyYAML>=3.1.0
python-magic
python-magic-bin; platform_system=='Darwin'
python-magic-bin; platform_system=='Windows'
setup.cfg
@@ -1,51 +0,0 @@
[metadata]
name = os_loganalyze
summary = OpenStack tools for gate log analysis
description-file =
    README.rst
author = OpenStack
author-email = openstack-dev@lists.openstack.org
home-page = http://www.openstack.org/
classifier =
    Environment :: OpenStack
    Intended Audience :: Information Technology
    Intended Audience :: System Administrators
    License :: OSI Approved :: Apache Software License
    Operating System :: POSIX :: Linux
    Programming Language :: Python
    Programming Language :: Python :: 2
    Programming Language :: Python :: 2.7
    Programming Language :: Python :: 2.6
    Programming Language :: Python :: 3
    Programming Language :: Python :: 3.3

[files]
packages =
    os_loganalyze

[entry_points]
console_scripts =
    htmlify-log.py = os_loganalyze.cmd.htmlify_log:main
    htmlify-server.py = os_loganalyze.cmd.htmlify_server:main

[build_sphinx]
source-dir = doc/source
build-dir = doc/build
all_files = 1

[upload_sphinx]
upload-dir = doc/build/html

[compile_catalog]
directory = os_loganalyze/locale
domain = os-loganalyze

[update_catalog]
domain = os-loganalyze
output_dir = os_loganalyze/locale
input_file = os_loganalyze/locale/os-loganalyze.pot

[extract_messages]
keywords = _ gettext ngettext l_ lazy_gettext
mapping_file = babel.cfg
output_file = os_loganalyze/locale/os-loganalyze.pot
29 setup.py

@@ -1,29 +0,0 @@
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# THIS FILE IS MANAGED BY THE GLOBAL REQUIREMENTS REPO - DO NOT EDIT
import setuptools

# In python < 2.7.4, a lazy loading of package `pbr` will break
# setuptools if some other modules registered functions in `atexit`.
# solution from: http://bugs.python.org/issue15881#msg170215
try:
    import multiprocessing  # noqa
except ImportError:
    pass

setuptools.setup(
    setup_requires=['pbr>=1.8'],
    pbr=True)
@@ -1,15 +0,0 @@
# The order of packages is significant, because pip processes them in the order
# of appearance. Changing the order has an impact on the overall integration
# process, which may cause wedges in the gate later.
hacking>=0.12.0,!=0.13.0,<0.14 # Apache-2.0

coverage>=3.6
discover
fixtures>=1.3.1
mock>=1.2
python-subunit>=0.0.18
sphinx!=1.2.0,!=1.3b1,<1.3,>=1.1.2
oslosphinx>=2.5.0 # Apache-2.0
testrepository>=0.0.18
testscenarios>=0.4
testtools>=1.4.0
35 tox.ini

@@ -1,35 +0,0 @@
[tox]
minversion = 1.6
envlist = py27,pypy,pep8
skipsdist = True

[testenv]
usedevelop = True
install_command = pip install -U {opts} {packages}
setenv =
    VIRTUAL_ENV={envdir}
deps = -r{toxinidir}/requirements.txt
       -r{toxinidir}/test-requirements.txt
commands = python setup.py testr --slowest --testr-args='{posargs}'

[testenv:py27debug]
commands = python -m testtools.run {posargs}

[testenv:pep8]
commands = flake8

[testenv:venv]
commands = {posargs}

[testenv:cover]
commands = python setup.py testr --coverage --testr-args='{posargs}'

[testenv:run]
commands = python ./os_loganalyze/server.py -l {toxinidir}/os_loganalyze/tests/samples/

[flake8]
# H904 Wrap long lines in parentheses instead of a backslash
show-source = True
ignore = E123,E125,H803,H904
builtins = _
exclude=.venv,.git,.tox,dist,doc,*openstack/common*,*lib/python*,*egg,build