Retire stackforge/satori

Monty Taylor 2015-10-17 16:04:48 -04:00
parent 1861fee39b
commit e5d65f03d9
66 changed files with 5 additions and 8065 deletions

.coveragerc

@@ -1,7 +0,0 @@
[run]
branch = True
source = satori
omit = satori/tests/*
[report]
ignore_errors = True

.gitignore vendored

@@ -1,59 +0,0 @@
*.py[cod]
# C extensions
*.so
# Packages
*.egg
*.egg-info
dist
build
eggs
parts
bin
var
sdist
develop-eggs
.installed.cfg
lib
lib64
__pycache__
# Installer logs
pip-log.txt
# Unit test / coverage reports
.coverage
.coverage.*
cover
covhtml
.tox
nosetests.xml
# Translations
*.mo
# Mr Developer
.mr.developer.cfg
.project
.pydevproject
# Rope
*/.ropeproject/*
.ropeproject/*
*.DS_Store
*.log
*.swo
*.swp
*~
.satori-venv
.testrepository
.tox
.venv
venv
AUTHORS
build
ChangeLog
dist
doc/build

.gitreview

@@ -1,4 +0,0 @@
[gerrit]
host=review.openstack.org
port=29418
project=stackforge/satori.git

.testr.conf

@@ -1,8 +0,0 @@
[DEFAULT]
test_command=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \
OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \
OS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-60} \
${PYTHON:-python} -m subunit.run discover -t . ./satori/tests $LISTOPT $IDOPTION
test_id_option=--load-list $IDFILE
test_list_option=--list
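The `${VAR:-default}` expansions in the test command above make each knob overridable from the environment while falling back to a sensible default; a minimal illustration of the shell idiom:

```shell
# Unset variable: the default after ':-' is used.
unset OS_TEST_TIMEOUT
echo "${OS_TEST_TIMEOUT:-60}"    # 60

# Set variable: its value wins over the default.
OS_TEST_TIMEOUT=120
echo "${OS_TEST_TIMEOUT:-60}"    # 120
```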

HACKING.rst

@@ -1,106 +0,0 @@
OpenStack Style Commandments
============================
- Step 1: Read the OpenStack Style Commandments
http://docs.openstack.org/developer/hacking/
- Step 2: Read on
General
-------
- thou shalt not violate causality in our time cone, or else
Docstrings
----------
Docstrings should ONLY use triple-double-quotes (``"""``)
Single-line docstrings should NEVER have extraneous whitespace
between enclosing triple-double-quotes.
Deviation! Sentence fragments do not have punctuation. Specifically, in the
command classes the one-line docstring is also the help string for that
command, and those do not have periods.
"""A one line docstring looks like this"""
Calling Methods
---------------
Deviation! When breaking up method calls due to the 79 char line length limit,
use the alternate 4 space indent. With the first argument on the succeeding
line all arguments will then be vertically aligned. Use the same convention
used with other data structure literals and terminate the method call with
the last argument line ending with a comma and the closing paren on its own
line indented to the starting line level.
    unnecessarily_long_function_name(
        'string one',
        'string two',
        kwarg1=constants.ACTIVE,
        kwarg2=['a', 'b', 'c'],
    )
Text encoding
-------------
Note: this section clearly has not been implemented in this project yet, it is
the intention to do so.
All text within python code should be of type 'unicode'.
WRONG:

    >>> s = 'foo'
    >>> s
    'foo'
    >>> type(s)
    <type 'str'>

RIGHT:

    >>> u = u'foo'
    >>> u
    u'foo'
    >>> type(u)
    <type 'unicode'>
Transitions between internal unicode and external strings should always
be immediately and explicitly encoded or decoded.
All external text that is not explicitly encoded (database storage,
commandline arguments, etc.) should be presumed to be encoded as utf-8.
WRONG:

    mystring = infile.readline()
    myreturnstring = do_some_magic_with(mystring)
    outfile.write(myreturnstring)

RIGHT:

    mystring = infile.readline()
    mytext = mystring.decode('utf-8')
    returntext = do_some_magic_with(mytext)
    returnstring = returntext.encode('utf-8')
    outfile.write(returnstring)
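For reference, the same boundary discipline on Python 3, where external data is ``bytes`` and internal text is ``str`` (a sketch, not project code):

```python
# External data arrives as bytes; decode exactly once at the boundary.
raw = "caf\u00e9".encode("utf-8")
text = raw.decode("utf-8")        # internal representation: unicode text

# ... internal processing operates on `text` here ...

# Encode exactly once on the way back out.
roundtrip = text.encode("utf-8")
assert roundtrip == raw
```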
Python 3.x Compatibility
------------------------
OpenStackClient strives to be Python 3.3 compatible. Common guidelines:
* Convert print statements to functions: print statements should be converted
to an appropriate log or other output mechanism.
* Use six where applicable: x.iteritems is converted to six.iteritems(x)
for example.
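A minimal sketch of both guidelines (stdlib only, since ``six`` is a third-party package): ``print`` as a function, and ``dict.items()`` as the portable spelling that ``six.iteritems(x)`` wraps:

```python
from __future__ import print_function  # no-op on Python 3; enables print() on Python 2

counts = {'a': 1, 'b': 2}

# On Python 2, six.iteritems(counts) is the lazy equivalent of
# counts.iteritems(); on Python 3 it wraps counts.items(). Plain
# .items() works eagerly on both.
pairs = sorted(counts.items())
print(pairs)
```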
Running Tests
-------------
Note: Oh boy, are we behind on writing tests. But they are coming!
The testing system is based on a combination of tox and testr. If you just
want to run the whole suite, run `tox` and all will be fine. However, if
you'd like to dig in a bit more, you might want to learn some things about
testr itself. A basic walkthrough for OpenStack can be found at
http://wiki.openstack.org/testr

LICENSE

@@ -1,202 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "{}"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright {yyyy} {name of copyright owner}
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

MANIFEST.in

@@ -1,5 +0,0 @@
graft satori/formats
include setup.py
include setup.cfg
prune satori/tests
global-exclude *.pyc *.sdx *.log *.db *.swp

README.rst

@@ -1,149 +1,7 @@
This project is no longer maintained.
================================
Satori - Configuration Discovery
================================
The contents of this repository are still available in the Git source code
management system. To see the contents of this repository before it reached
its end of life, please check out the previous commit with
"git checkout HEAD^1".
Satori provides configuration discovery for existing infrastructure. It is
a `related OpenStack project`_.
The charter for the project is to focus narrowly on discovering pre-existing
infrastructure and installed or running software. For example, given a URL and
some credentials, discover which resources (load balancer and servers) the URL
is hosted on and what software is running on those servers.
Configuration discovery output could be used for:
* Configuration analysis (ex. compared against a library of best practices)
* Configuration monitoring (ex. has the configuration changed?)
* Troubleshooting
* Heat Template generation
* Solum Application creation/import
* Creation of Chef recipes/cookbooks, Puppet modules, Ansible playbooks, setup
scripts, etc..
Getting Started
===============
Run discovery on the local system::
    $ pip install satori
    $ satori localhost --system-info=ohai-solo -F json
    # Installs and runs ohai-solo, outputs the data as JSON

Run against a URL with OpenStack credentials::

    $ pip install satori
    $ satori https://www.foo.com
    Address:
        www.foo.com resolves to IPv4 address 192.0.2.24
    Domain: foo.com
    Registrar: TUCOWS, INC.
    Nameservers: NS1.DIGIMEDIA.COM, NS2.DIGIMEDIA.COM
    Expires: 457 days
    Host not found
Deeper discovery is available if the network location (IP or hostname) is
hosted on an OpenStack cloud tenant that Satori can access.
Cloud settings can be passed in on the command line or via `OpenStack tenant environment
variables`_.
Run with OpenStack credentials::
    $ satori 192.0.2.24 --os-username yourname --os-password yadayadayada --os-tenant-name myproject --os-auth-url http://...

Or::

    $ export OS_USERNAME=yourname
    $ export OS_PASSWORD=yadayadayada
    $ export OS_TENANT_NAME=myproject
    $ export OS_AUTH_URL=http://...
    $ satori foo.com
Notice the discovery result now contains a ``Host`` section::
    $ satori 192.0.2.24 --os-username yourname --os-password yadayadayada --os-tenant-name myproject --os-auth-url http://...
    Host:
        192.0.2.24 is hosted on a Nova Instance
        Instance Information:
            URI: https://nova.api.somecloud.com/v2/111222/servers/d9119040-f767-4141-95a4-d4dbf452363a
            Name: sampleserver01.foo.com
            ID: d9119040-f767-4141-95a4-d4dbf452363a
        ip-addresses:
            public:
                ::ffff:404:404
                192.0.2.24
            private:
                10.1.1.156
        System Information:
            Ubuntu 12.04 installed
            Server was rebooted 11 days, 22 hours ago
            /dev/xvda1 is using 9% of its inodes.
        Running Services:
            httpd on 127.0.0.1:8080
            varnishd on 0.0.0.0:80
            sshd on 0.0.0.0:22
        httpd:
            Using 7 of 100 MaxClients
Documentation
=============
Additional documentation is located in the ``doc/`` directory and is hosted at
http://satori.readthedocs.org/.
Start Hacking
=============
We recommend using a virtualenv to install the client. This description
uses the `install virtualenv`_ script to create the virtualenv::

    $ python tools/install_venv.py
    $ source .venv/bin/activate
    $ python setup.py develop

Unit tests can be run simply by running::

    $ tox

    # or, just style checks
    $ tox -e pep8

    # or, just python 2.7 checks
    $ tox -e py27

Checking test coverage::

    # Run tests with coverage
    $ tox -ecover

    # generate the report
    $ coverage html -d covhtml -i

    # open it in a browser
    $ open covhtml/index.html
Links
=====
- `OpenStack Wiki`_
- `Documentation`_
- `Code`_
- `Launchpad Project`_
- `Features`_
- `Issues`_
.. _OpenStack Wiki: https://wiki.openstack.org/Satori
.. _Documentation: http://satori.readthedocs.org/
.. _OpenStack tenant environment variables: http://docs.openstack.org/developer/python-novaclient/shell.html
.. _related OpenStack project: https://wiki.openstack.org/wiki/ProjectTypes
.. _install virtualenv: https://github.com/stackforge/satori/blob/master/tools/install_venv.py
.. _Code: https://github.com/stackforge/satori
.. _Launchpad Project: https://launchpad.net/satori
.. _Features: https://blueprints.launchpad.net/satori
.. _Issues: https://bugs.launchpad.net/satori/

doc/.gitignore vendored

@@ -1,2 +0,0 @@
target/
build/

doc/Makefile

@@ -1,136 +0,0 @@
# Makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
PAPER =
BUILDDIR = build
# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) source
.PHONY: help clean html pdf dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest
help:
	@echo "Please use \`make <target>' where <target> is one of"
	@echo "  html       to make standalone HTML files"
	@echo "  pdf        to make pdf with rst2pdf"
	@echo "  dirhtml    to make HTML files named index.html in directories"
	@echo "  singlehtml to make a single large HTML file"
	@echo "  pickle     to make pickle files"
	@echo "  json       to make JSON files"
	@echo "  htmlhelp   to make HTML files and a HTML help project"
	@echo "  qthelp     to make HTML files and a qthelp project"
	@echo "  devhelp    to make HTML files and a Devhelp project"
	@echo "  epub       to make an epub"
	@echo "  latex      to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
	@echo "  latexpdf   to make LaTeX files and run them through pdflatex"
	@echo "  text       to make text files"
	@echo "  man        to make manual pages"
	@echo "  changes    to make an overview of all changed/added/deprecated items"
	@echo "  linkcheck  to check all external links for integrity"
	@echo "  doctest    to run all doctests embedded in the documentation (if enabled)"

clean:
	-rm -rf $(BUILDDIR)/*

html:
	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."

pdf:
	$(SPHINXBUILD) -b pdf $(ALLSPHINXOPTS) $(BUILDDIR)/pdf
	@echo
	@echo "Build finished. The PDFs are in $(BUILDDIR)/pdf."

dirhtml:
	$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."

singlehtml:
	$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
	@echo
	@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."

pickle:
	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
	@echo
	@echo "Build finished; now you can process the pickle files."

json:
	$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
	@echo
	@echo "Build finished; now you can process the JSON files."

htmlhelp:
	$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
	@echo
	@echo "Build finished; now you can run HTML Help Workshop with the" \
	      ".hhp project file in $(BUILDDIR)/htmlhelp."

qthelp:
	$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
	@echo
	@echo "Build finished; now you can run "qcollectiongenerator" with the" \
	      ".qhcp project file in $(BUILDDIR)/qthelp, like this:"
	@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/NebulaDocs.qhcp"
	@echo "To view the help file:"
	@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/NebulaDocs.qhc"

devhelp:
	$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
	@echo
	@echo "Build finished."
	@echo "To view the help file:"
	@echo "# mkdir -p $$HOME/.local/share/devhelp/NebulaDocs"
	@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/NebulaDocs"
	@echo "# devhelp"

epub:
	$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
	@echo
	@echo "Build finished. The epub file is in $(BUILDDIR)/epub."

latex:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo
	@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
	@echo "Run \`make' in that directory to run these through (pdf)latex" \
	      "(use \`make latexpdf' here to do that automatically)."

latexpdf:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through pdflatex..."
	make -C $(BUILDDIR)/latex all-pdf
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

text:
	$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
	@echo
	@echo "Build finished. The text files are in $(BUILDDIR)/text."

man:
	$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
	@echo
	@echo "Build finished. The manual pages are in $(BUILDDIR)/man."

changes:
	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
	@echo
	@echo "The overview file is in $(BUILDDIR)/changes."

linkcheck:
	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
	@echo
	@echo "Link check complete; look for any errors in the above output " \
	      "or in $(BUILDDIR)/linkcheck/output.txt."

doctest:
	$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
	@echo "Testing of doctests in the sources finished, look at the " \
	      "results in $(BUILDDIR)/doctest/output.txt."

doc/README.rst

@@ -1,55 +0,0 @@
===========================
Building the developer docs
===========================
Dependencies
============

You'll need to install the python *Sphinx* package::

    sudo pip install sphinx

If you are using a virtualenv, you'll need to install it in the
virtualenv.

Get Help
========

Just type make to get help::

    make

It will list available build targets.

Build Doc
=========

To build the man pages::

    make man

To build the developer documentation as HTML::

    make html

Type *make* for more formats.

Test Doc
========

If you modify doc files, you can type::

    make doctest

to check whether the format has problems.

doc/source/conf.py

@@ -1,413 +0,0 @@
# -*- coding: utf-8 -*-
#
# OpenStack Configuration Discovery documentation build configuration file, created
# by sphinx-quickstart on Wed May 16 12:05:58 2012.
#
# This file is execfile()d with the current directory set to its containing
# dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.
import glob
import os
import re
import sys
import pbr.version
BASE_DIR = os.path.dirname(os.path.abspath(__file__))
ROOT = os.path.abspath(os.path.join(BASE_DIR, "..", ".."))
CONTRIB_DIR = os.path.join(ROOT, 'contrib')
PLUGIN_DIRS = glob.glob(os.path.join(CONTRIB_DIR, '*'))
sys.path.insert(0, ROOT)
sys.path.insert(0, BASE_DIR)
sys.path = PLUGIN_DIRS + sys.path
#
# Automatically write module docs
#
def write_autodoc_index():

    def get_contrib_sources():
        module_dirs = glob.glob(os.path.join(CONTRIB_DIR, '*'))
        module_names = map(os.path.basename, module_dirs)
        return dict(
            ('contrib/%s' % module_name,
             {'module': module_name,
              'path': os.path.join(CONTRIB_DIR, module_name)}
             )
            for module_name in module_names)

    def find_autodoc_modules(module_name, sourcedir):
        """Return a list of modules in the SOURCE directory."""
        modlist = []
        os.chdir(os.path.join(sourcedir, module_name))
        print("SEARCHING %s" % sourcedir)
        for root, dirs, files in os.walk("."):
            for filename in files:
                if filename.endswith(".py"):
                    # remove the pieces of the root
                    elements = root.split(os.path.sep)
                    # replace the leading "." with the module name
                    elements[0] = module_name
                    # and get the base module name
                    base, extension = os.path.splitext(filename)
                    if not (base == "__init__"):
                        elements.append(base)
                    result = ".".join(elements)
                    modlist.append(result)
        return modlist

    RSTDIR = os.path.abspath(os.path.join(BASE_DIR, "sourcecode"))
    SRCS = {'satori': {'module': 'satori',
                       'path': ROOT}}
    SRCS.update(get_contrib_sources())

    EXCLUDED_MODULES = ('satori.doc',
                        '.*\.tests')
    CURRENT_SOURCES = {}

    if not(os.path.exists(RSTDIR)):
        os.mkdir(RSTDIR)
    CURRENT_SOURCES[RSTDIR] = ['autoindex.rst', '.gitignore']

    INDEXOUT = open(os.path.join(RSTDIR, "autoindex.rst"), "w")
    INDEXOUT.write("=================\n")
    INDEXOUT.write("Source Code Index\n")
    INDEXOUT.write("=================\n")

    for title, info in SRCS.items():
        path = info['path']
        modulename = info['module']
        sys.stdout.write("Generating source documentation for %s\n" %
                         title)
        INDEXOUT.write("\n%s\n" % title.capitalize())
        INDEXOUT.write("%s\n" % ("=" * len(title),))
        INDEXOUT.write(".. toctree::\n")
        INDEXOUT.write("   :maxdepth: 1\n")
        INDEXOUT.write("\n")

        MOD_DIR = os.path.join(RSTDIR, title)
        CURRENT_SOURCES[MOD_DIR] = []
        if not(os.path.exists(MOD_DIR)):
            os.makedirs(MOD_DIR)
        for module in find_autodoc_modules(modulename, path):
            if any([re.match(exclude, module)
                    for exclude
                    in EXCLUDED_MODULES]):
                print("Excluded module %s." % module)
                continue
            mod_path = os.path.join(path, *module.split("."))
            generated_file = os.path.join(MOD_DIR, "%s.rst" % module)

            INDEXOUT.write("   %s/%s\n" % (title, module))

            # Find the __init__.py module if this is a directory
            if os.path.isdir(mod_path):
                source_file = ".".join((os.path.join(mod_path, "__init__"),
                                        "py",))
            else:
                source_file = ".".join((os.path.join(mod_path), "py"))

            CURRENT_SOURCES[MOD_DIR].append("%s.rst" % module)
            # Only generate a new file if the source has changed or we don't
            # have a doc file to begin with.
            if not os.access(generated_file, os.F_OK) or \
                    os.stat(generated_file).st_mtime < \
                    os.stat(source_file).st_mtime:
                print("Module %s updated, generating new documentation."
                      % module)
                FILEOUT = open(generated_file, "w")
                header = "The :mod:`%s` Module" % module
                FILEOUT.write("%s\n" % ("=" * len(header),))
                FILEOUT.write("%s\n" % header)
                FILEOUT.write("%s\n" % ("=" * len(header),))
                FILEOUT.write(".. automodule:: %s\n" % module)
                FILEOUT.write("  :members:\n")
                FILEOUT.write("  :undoc-members:\n")
                FILEOUT.write("  :show-inheritance:\n")
                FILEOUT.write("  :noindex:\n")
                FILEOUT.close()

    INDEXOUT.close()

    # Delete auto-generated .rst files for sources which no longer exist
    for directory, subdirs, files in list(os.walk(RSTDIR)):
        for old_file in files:
            if old_file not in CURRENT_SOURCES.get(directory, []):
                print("Removing outdated file for %s" % old_file)
                os.remove(os.path.join(directory, old_file))


write_autodoc_index()
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..')))
# -- General configuration ----------------------------------------------------
# If your documentation needs a minimal Sphinx version, state it here.
#needs_sphinx = '1.0'
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = ['sphinx.ext.autodoc',
              'sphinx.ext.ifconfig',
              'sphinx.ext.viewcode',
              'sphinx.ext.todo',
              'sphinx.ext.coverage',
              'sphinx.ext.pngmath',
              'sphinx.ext.viewcode',
              'sphinx.ext.doctest']
todo_include_todos = True
# Add any paths that contain templates here, relative to this directory.
if os.getenv('HUDSON_PUBLISH_DOCS'):
    templates_path = ['_ga', '_templates']
else:
    templates_path = ['_templates']
# The suffix of source filenames.
source_suffix = '.rst'
# The encoding of source files.
#source_encoding = 'utf-8-sig'
# The master toctree document.
master_doc = 'index'
# General information about the project.
project = u'Satori'
copyright = u'2012-2013 OpenStack Foundation'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
version_info = pbr.version.VersionInfo('satori')
#
# The short X.Y version.
version = version_info.version_string()
# The full version, including alpha/beta/rc tags.
release = version_info.release_string()
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#language = None
# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
#today = ''
# Else, today_fmt is used as the format for a strftime call.
#today_fmt = '%B %d, %Y'
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = ['**/#*', '**~', '**/#*#']
# The reST default role (used for this markup: `text`) to use for all
# documents.
#default_role = None
# If true, '()' will be appended to :func: etc. cross-reference text.
#add_function_parentheses = True
# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
#add_module_names = True
# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
#show_authors = False
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
# A list of ignored prefixes for module index sorting.
#modindex_common_prefix = []
primary_domain = 'py'
nitpicky = False
# -- Options for HTML output --------------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
html_theme = 'default'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
html_theme_options = {
    "nosidebar": "false"
}
# Add any paths that contain custom themes here, relative to this directory.
#html_theme_path = []
# The name for this set of Sphinx documents. If None, it defaults to
# "<project> v<release> documentation".
#html_title = None
# A shorter title for the navigation bar. Default is the same as html_title.
#html_short_title = None
# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
#html_logo = None
# The name of an image file (within the static path) to use as favicon of the
# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
#html_favicon = None
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']
# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
# using the given strftime format.
#html_last_updated_fmt = '%b %d, %Y'
git_cmd = "git log --pretty=format:'%ad, commit %h' --date=local -n1"
html_last_updated_fmt = os.popen(git_cmd).read()
# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
#html_use_smartypants = True
# Custom sidebar templates, maps document names to template names.
#html_sidebars = {}
# Additional templates that should be rendered to pages, maps page names to
# template names.
#html_additional_pages = {}
# If false, no module index is generated.
#html_domain_indices = True
# If false, no index is generated.
#html_use_index = True
# If true, the index is split into individual pages for each letter.
#html_split_index = False
# If true, links to the reST sources are added to the pages.
#html_show_sourcelink = True
# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
#html_show_sphinx = True
# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
#html_show_copyright = True
# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it. The value of this option must be the
# base URL from which the finished HTML is served.
#html_use_opensearch = ''
# This is the file name suffix for HTML files (e.g. ".xhtml").
#html_file_suffix = None
# Output file base name for HTML help builder.
htmlhelp_basename = 'Satoridoc'
# -- Options for LaTeX output -------------------------------------------------
latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
#'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
#'preamble': '',
}
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title, author, documentclass [howto/manual])
# .
latex_documents = [
('index', 'Satori.tex',
u'OpenStack Configuration Discovery Documentation',
u'OpenStack', 'manual'),
]
# The name of an image file (relative to this directory) to place at the top of
# the title page.
#latex_logo = None
# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
#latex_use_parts = False
# If true, show page references after internal links.
#latex_show_pagerefs = False
# If true, show URL addresses after external links.
#latex_show_urls = False
# Documents to append as an appendix to all manuals.
#latex_appendices = []
# If false, no module index is generated.
#latex_domain_indices = True
# -- Options for manual page output -------------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
(
'man/satori',
'satori',
u'OpenStack Configuration Discovery',
[u'OpenStack contributors'],
1,
),
]
# If true, show URL addresses after external links.
#man_show_urls = False
# -- Options for Texinfo output -----------------------------------------------
# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
('index', 'OpenStackConfigurationDiscovery',
u'OpenStack Configuration Discovery Documentation',
u'OpenStack', 'OpenStackConfigurationDiscovery',
'OpenStack Configuration Discovery.',
'Miscellaneous'),
]
# Documents to append as an appendix to all manuals.
#texinfo_appendices = []
# If false, no module index is generated.
#texinfo_domain_indices = True
# How to display URL addresses: 'footnote', 'no', or 'inline'.
#texinfo_show_urls = 'footnote'
# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {'http://docs.python.org/': None}

View File

@ -1,17 +0,0 @@
============
Contributing
============
Satori's code is hosted on `GitHub`_. Our development process follows the
`OpenStack Gerrit`_ workflow, which differs from the workflow of most projects
on GitHub.
If you find a problem, please `file a bug`_. Feature additions and design
discussions are taking place in `blueprints`_. `Reviewing code`_ is an easy way
to start contributing.
.. _OpenStack Gerrit: http://docs.openstack.org/infra/manual/developers.html#development-workflow
.. _GitHub: https://github.com/stackforge/satori
.. _file a bug: https://bugs.launchpad.net/satori
.. _blueprints: https://blueprints.launchpad.net/satori
.. _Reviewing code: https://review.openstack.org/#/q/status:open+project:stackforge/satori,n,z

View File

@ -1,94 +0,0 @@
=================================
OpenStack Configuration Discovery
=================================
Satori is a configuration discovery tool for OpenStack and OpenStack tenant hosted applications.
.. toctree::
:maxdepth: 1
contributing
releases
terminology
schema
satori
Get Satori
------------
Satori is distributed as a Python package. The pip command will install the
latest version.
::
$ pip install satori
If you want to install from the latest source code, these commands will fetch
the code and install it.
::
$ git clone https://github.com/stackforge/satori.git
$ cd satori
$ pip install -r requirements.txt
$ sudo python setup.py install
Use Satori
-----------
::
$ satori www.foo.com
Domain: foo.com
Registered at TUCOWS DOMAINS INC.
Expires in 475 days.
Name servers:
DNS1.STABLETRANSIT.COM
DNS2.STABLETRANSIT.COM
Address:
www.foo.com resolves to IPv4 address 4.4.4.4
Host:
4.4.4.4 (www.foo.com) is hosted on a Nova Instance
Instance Information:
URI: https://nova.api.somecloud.com/v2/111222/servers/d9119040
Name: sampleserver01.foo.com
ID: d9119040-f767-4141-95a4-d4dbf452363a
ip-addresses:
public:
::ffff:404:404
4.4.4.4
private:
10.1.1.156
Listening Services:
0.0.0.0:6082 varnishd
127.0.0.1:8080 apache2
127.0.0.1:3306 mysqld
Talking to:
10.1.1.71 on 27017
Links
=====
- `OpenStack Wiki`_
- `Code`_
- `Launchpad Project`_
- `Features`_
- `Issues`_
.. _OpenStack Wiki: https://wiki.openstack.org/Satori
.. _OpenStack tenant environment variables: http://docs.openstack.org/developer/python-novaclient/shell.html
.. _related OpenStack project: https://wiki.openstack.org/wiki/ProjectTypes
.. _install virtualenv: https://github.com/stackforge/satori/blob/master/tools/install_venv.py
.. _Code: https://github.com/stackforge/satori
.. _Launchpad Project: https://launchpad.net/satori
.. _Features: https://blueprints.launchpad.net/satori
.. _Issues: https://bugs.launchpad.net/satori/
Index
-----
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

View File

@ -1,51 +0,0 @@
====================
:program:`satori`
====================
OpenStack Configuration Discovery
SYNOPSIS
========
:program:`satori`
DESCRIPTION
===========
:program:`satori` provides configuration discovery.
OPTIONS
=======
:program:`satori` has no options yet
AUTHORS
=======
Please refer to the AUTHORS file distributed with Satori.
COPYRIGHT
=========
Copyright 2011-2013 OpenStack Foundation and the authors listed in the AUTHORS file.
LICENSE
=======
http://www.apache.org/licenses/LICENSE-2.0
SEE ALSO
========
The `Satori page <https://wiki.openstack.org/wiki/Satori>`_
in the `OpenStack Wiki <https://wiki.openstack.org/>`_ contains further
documentation.
The individual OpenStack project CLIs, the OpenStack API references.

View File

@ -1,78 +0,0 @@
=================================
OpenStack Control Plane Discovery
=================================
Satori supports :ref:`control plane <terminology_control_plane>` discovery of
resources that belong to an OpenStack tenant. To discover OpenStack specific
information for a resource, provide credentials to Satori for the tenant that
owns the resource.
OpenStack Credentials
=====================
OpenStack credentials can be provided on the command line or injected into
shell environment variables. Satori reuses the `OpenStack Nova conventions`_ for
environment variables since many Satori users also use the `nova`_ program.
Use the export command to store the credentials in the shell environment:
::
$ export OS_USERNAME=yourname
$ export OS_PASSWORD=yadayadayada
$ export OS_TENANT_NAME=myproject
$ export OS_AUTH_URL=http://...
$ satori foo.com
Alternatively, the credentials can be passed on the command line:
::
$ satori foo.com \
--os-username yourname \
--os-password yadayadayada \
--os-tenant-name myproject \
--os-auth-url http://...
Discovered Host
===============
If the domain name or IP address provided belongs to the authenticated
tenant, the OpenStack resource data (Server ID, IPs, etc) will be
returned. In this example, the OpenStack credentials were provided via
environment variables. The "Host" section is only available because the
control plane discovery was possible using the OpenStack credentials.
::
$ satori www.foo.com
Domain: foo.com
Registered at TUCOWS DOMAINS INC.
Expires in 475 days.
Name servers:
DNS1.STABLETRANSIT.COM
DNS2.STABLETRANSIT.COM
Address:
www.foo.com resolves to IPv4 address 192.0.2.10
Host:
192.0.2.10 (www.foo.com) is hosted on a Nova Instance
Instance Information:
URI: https://nova.api.somecloud.com/v2/111222/servers/d9119040-f767-4141-95a4-d4dbf452363a
Name: sampleserver01.foo.com
ID: d9119040-f767-4141-95a4-d4dbf452363a
ip-addresses:
public:
::ffff:404:404
192.0.2.10
private:
10.1.1.156
Listening Services:
0.0.0.0:6082 varnishd
127.0.0.1:8080 apache2
127.0.0.1:3306 mysqld
Talking to:
10.1.1.71 on 27017
.. _nova: https://github.com/openstack/python-novaclient
.. _OpenStack Nova conventions: https://github.com/openstack/python-novaclient/blob/master/README.rst#id1

View File

@ -1,25 +0,0 @@
=============
Release Notes
=============
0.1.4 (20 Mar 2014)
===================
* Data plane discovery (logs on to machines)
* Localhost discovery
* SSH module
* Templated output
* Bug fixes
0.1.3 (18 Feb 2014)
===================
* Bug fixes
* DNS added among other things
0.1.0 (28 Jan 2014)
===================
* Project setup

View File

@ -1,33 +0,0 @@
======
Schema
======
The following list of fields describes the data returned from Satori.
Target
======
Target contains the address supplied to run the discovery.
Found
=====
All data items discovered are returned under the found key. Keys to resources
discovered are also added under found, but the actual resources are stored
under the resources key.
Resources
=========
All resources (servers, load balancers, DNS domains, etc...) are stored under
the resources key.
Each resource contains the following keys:
* **key**: a globally unique identifier for the resource (could be a URI)
* **id**: the id in the system that hosts the resource
* **type**: the resource type using Heat or Heat-like resource types
* **data**: any additional fields for that resource
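The keys above can be illustrated with a hypothetical discovery result. This is a sketch for illustration only; every identifier and value below is invented, and only the four documented resource keys are taken from the schema:

```python
# Hypothetical Satori discovery result illustrating the schema above.
# All names, URIs, and IDs are invented for illustration.
result = {
    "target": "http://www.foo.com",          # address supplied to discovery
    "found": {
        "ip-address": "192.0.2.10",
        "host-key": "nova-instance-1",       # key into "resources" below
    },
    "resources": {
        "nova-instance-1": {
            # the four keys every resource carries:
            "key": "https://nova.api.somecloud.com/v2/111222/servers/d9119040",
            "id": "d9119040-f767-4141-95a4-d4dbf452363a",
            "type": "OS::Nova::Server",      # Heat-like resource type
            "data": {"name": "sampleserver01.foo.com"},
        },
    },
}

# Every resource entry carries the documented keys.
for resource in result["resources"].values():
    assert set(resource) >= {"key", "id", "type", "data"}
```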

View File

@ -1 +0,0 @@
*.rst

View File

@ -1,32 +0,0 @@
=============
Terminology
=============
Opinions
========
Opinions are being discussed at https://wiki.openstack.org/wiki/Satori/OpinionsProposal.
.. _terminology_control_plane:
Control Plane Discovery
=======================
Control plane discovery is the process of making API calls to management
systems like OpenStack or IT asset management systems. An external management
system can show relationships between resources that can further improve
the discovery process. For example, a data plane discovery of a single server
will reveal that a server has a storage device attached to it. Control plane
discovery using an OpenStack plugin can reveal the details of the Cinder
volume.
Satori can load plugins that enable these systems to be queried.
Data Plane Discovery
====================
Data plane discovery is the process of connecting to a resource and using
native tools to extract information. For example, it can provide information
about the user list, installed software and processes that are running.
Satori can load plugins that enable data plane discovery.

View File

@ -1,15 +0,0 @@
[Messages Control]
# W0511: TODOs in code comments are fine.
# W0142: *args and **kwargs are fine.
disable-msg=W0511,W0142
# Don't require docstrings on tests.
no-docstring-rgx=((__.*__)|([tT]est.*)|setUp|tearDown)$
[Design]
min-public-methods=0
max-args=6
[Master]
#We try to keep contrib files unmodified
ignore=satori/contrib

View File

@ -1,8 +0,0 @@
iso8601>=0.1.5
Jinja2>=2.7.1
paramiko>=1.13.0 # py33 support
pbr>=0.5.21,<1.0
python-novaclient>=2.15.0 # py33 support
pythonwhois>=2.1.0 # py33 support
six>=1.8.0
tldextract>=1.2

View File

@ -1,11 +0,0 @@
impacket>=0.9.11
ipaddress>=1.0.6 # in stdlib as of python3.3
iso8601>=0.1.5
Jinja2>=2.7.1 # bug resolve @2.7.1
paramiko>=1.12.0 # ecdsa added
pbr>=0.5.21,<1.0
python-novaclient>=2.6.0.1 # breaks before
pythonwhois>=2.4.3
six>=1.8.0 # six.moves.shlex introduced
tldextract>=1.2
argparse

View File

@ -1,49 +0,0 @@
#!/bin/bash
function usage {
echo "Usage: $0 [OPTION]..."
echo "Run satori's test suite(s)"
echo ""
echo " -p, --pep8 Just run pep8"
echo " -h, --help Print this usage message"
echo ""
echo "This script is deprecated and currently retained for compatibility."
echo 'You can run the full test suite for multiple environments by running "tox".'
echo 'You can run tests for only python 2.7 by running "tox -e py27", or run only'
echo 'the pep8 tests with "tox -e pep8".'
exit
}
command -v tox > /dev/null 2>&1
if [ $? -ne 0 ]; then
echo 'This script requires "tox" to run.'
echo 'You can install it with "pip install tox".'
exit 1;
fi
just_pep8=0
function process_option {
case "$1" in
-h|--help) usage;;
-p|--pep8) let just_pep8=1;;
esac
}
for arg in "$@"; do
process_option $arg
done
if [ $just_pep8 -eq 1 ]; then
exec tox -e pep8
fi
tox -e py27 $toxargs 2>&1 | tee run_tests.err.log || exit 1
tox_exit_code=${PIPESTATUS[0]}
if [ 0$tox_exit_code -ne 0 ]; then
exit $tox_exit_code
fi
if [ -z "$toxargs" ]; then
tox -e pep8
fi

View File

@ -1,41 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Satori main module."""
__all__ = ('__version__',)
try:
import eventlet
eventlet.monkey_patch()
except ImportError:
pass
import pbr.version
from satori import shell
version_info = pbr.version.VersionInfo('satori')
try:
__version__ = version_info.version_string()
except AttributeError:
__version__ = None
def discover(address=None):
"""Temporary to demo python API.
TODO(zns): make it real
"""
shell.main(argv=[address])
return {'address': address, 'other info': '...'}

View File

@ -1,257 +0,0 @@
# Copyright 2012-2013 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Shell classes for executing commands on a system.
Execute commands over ssh or using the python subprocess module.
"""
import logging
import shlex
from satori.common import popen
from satori import errors
from satori import smb
from satori import ssh
from satori import utils
LOG = logging.getLogger(__name__)
class ShellMixin(object):
"""Handle platform detection and define execute command."""
def execute(self, command, **kwargs):
"""Execute a (shell) command on the target.
:param command: Shell command to be executed
:param with_exit_code: Include the exit_code in the return body.
:param cwd: The child's current directory will be changed
to `cwd` before it is executed. Note that this
directory is not considered when searching the
executable, so you can't specify the program's
path relative to this argument
:returns: a dict with stdout, stderr, and
(optionally) the exit_code of the call
See SSH.remote_execute(), SMB.remote_execute(), and
LocalShell.execute() for client-specific keyword arguments.
"""
pass
@property
def platform_info(self):
"""Provide distro, version, architecture."""
pass
def is_debian(self):
"""Indicate whether the system is Debian based.
Uses the platform_info property.
"""
if not self.platform_info['dist']:
raise errors.UndeterminedPlatform(
'Unable to determine whether the system is Debian based.')
return self.platform_info['dist'].lower() in ['debian', 'ubuntu']
def is_fedora(self):
"""Indicate whether the system in Fedora based.
Uses the platform_info property.
"""
if not self.platform_info['dist']:
raise errors.UndeterminedPlatform(
'Unable to determine whether the system is Fedora based.')
return (self.platform_info['dist'].lower() in
['redhat', 'centos', 'fedora', 'el'])
def is_osx(self):
"""Indicate whether the system is Apple OSX based.
Uses the platform_info property.
"""
if not self.platform_info['dist']:
raise errors.UndeterminedPlatform(
'Unable to determine whether the system is OS X based.')
return (self.platform_info['dist'].lower() in
['darwin', 'macosx'])
def is_windows(self):
"""Indicate whether the system is Windows based.
Uses the platform_info property.
"""
if hasattr(self, '_client'):
if isinstance(self._client, smb.SMBClient):
return True
if not self.platform_info['dist']:
raise errors.UndeterminedPlatform(
'Unable to determine whether the system is Windows based.')
return self.platform_info['dist'].startswith('win')
class LocalShell(ShellMixin):
"""Execute shell commands on local machine."""
def __init__(self, user=None, password=None, interactive=False):
"""An interface for executing shell commands locally.
:param user: The user to execute the command as.
Defaults to the current user.
:param password: The password for `user`
:param interactive: If true, prompt for password if missing.
"""
self.user = user
self.password = password
self.interactive = interactive
# properties
self._platform_info = None
@property
def platform_info(self):
"""Return distro, version, and system architecture."""
if not self._platform_info:
self._platform_info = utils.get_platform_info()
return self._platform_info
def execute(self, command, **kwargs):
"""Execute a command (containing no shell operators) locally.
:param command: Shell command to be executed.
:param with_exit_code: Include the exit_code in the return body.
Default is False.
:param cwd: The child's current directory will be changed
to `cwd` before it is executed. Note that this
directory is not considered when searching the
executable, so you can't specify the program's
path relative to this argument
:returns: A dict with stdout, stderr, and
(optionally) the exit code.
"""
cwd = kwargs.get('cwd')
with_exit_code = kwargs.get('with_exit_code')
spipe = popen.PIPE
cmd = shlex.split(command)
LOG.debug("Executing `%s` on local machine", command)
result = popen.popen(
cmd, stdout=spipe, stderr=spipe, cwd=cwd)
out, err = result.communicate()
resultdict = {
'stdout': out.strip(),
'stderr': err.strip(),
}
if with_exit_code:
resultdict.update({'exit_code': result.returncode})
return resultdict
class RemoteShell(ShellMixin):
"""Execute shell commands on a remote machine over ssh."""
def __init__(self, address, password=None, username=None,
private_key=None, key_filename=None, port=None,
timeout=None, gateway=None, options=None, interactive=False,
protocol='ssh', root_password=None, **kwargs):
"""An interface for executing shell commands on remote machines.
:param str address: The ip address or host name of the server
to connect to
:param str password: A password to use for authentication
or for unlocking a private key
:param username: The username to authenticate as
:param private_key: Private SSH Key string to use
(instead of using a filename)
:param root_password: root user password to be used if username is
not root. This will use username and password
to login and then 'su' to root using
root_password
:param key_filename: a private key filename (path)
:param port: tcp/ip port to use (defaults to 22)
:param float timeout: an optional timeout (in seconds) for the
TCP connection
:param socket gateway: an existing SSH instance to use
for proxying
:param dict options: A dictionary used to set ssh options
(when proxying).
e.g. for `ssh -o StrictHostKeyChecking=no`,
you would provide
(.., options={'StrictHostKeyChecking': 'no'})
Conversion of booleans is also supported,
(.., options={'StrictHostKeyChecking': False})
is equivalent.
:keyword interactive: If true, prompt for password if missing.
"""
if kwargs:
LOG.warning("Satori RemoteClient received unrecognized "
"keyword arguments: %s", kwargs.keys())
if protocol == 'smb':
self._client = smb.connect(address, password=password,
username=username,
port=port, timeout=timeout,
gateway=gateway)
else:
self._client = ssh.connect(address, password=password,
username=username,
private_key=private_key,
key_filename=key_filename,
port=port, timeout=timeout,
gateway=gateway,
options=options,
interactive=interactive,
root_password=root_password)
self.host = self._client.host
self.port = self._client.port
@property
def platform_info(self):
"""Return distro, version, architecture."""
return self._client.platform_info
def __del__(self):
"""Destructor which should close the connection."""
self.close()
def __enter__(self):
"""Context manager establish connection."""
self.connect()
return self
def __exit__(self, *exc_info):
"""Context manager close connection."""
self.close()
def connect(self):
"""Connect to the remote host."""
return self._client.connect()
def test_connection(self):
"""Test the connection to the remote host."""
return self._client.test_connection()
def execute(self, command, **kwargs):
"""Execute given command over ssh."""
return self._client.remote_execute(command, **kwargs)
def close(self):
"""Close the connection to the remote host."""
return self._client.close()

View File

@ -1 +0,0 @@
"""Module to hold all shared code used inside Satori."""

View File

@ -1,140 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Logging Module.
This module handles logging to the standard python logging subsystem and to the
console.
"""
from __future__ import absolute_import
import logging
import logging.config
import os
import sys
LOG = logging.getLogger(__name__)
class DebugFormatter(logging.Formatter):
"""Log formatter.
Outputs any 'data' values passed in the 'extra' parameter if provided.
**Example**:
.. code-block:: python
LOG.debug("My message", extra={'data': locals()})
"""
def format(self, record):
"""Print out any 'extra' data provided in logs."""
if hasattr(record, 'data'):
return "%s. DEBUG DATA=%s" % (logging.Formatter.format(self,
record), record.__dict__['data'])
return logging.Formatter.format(self, record)
def init_logging(config, default_config=None):
"""Configure logging based on log config file.
Turn on console logging if no logging files found
:param config: object with configuration namespace (ex. argparse parser)
:keyword default_config: path to a python logging configuration file
"""
if config.get('logconfig') and os.path.isfile(config.get('logconfig')):
logging.config.fileConfig(config['logconfig'],
disable_existing_loggers=False)
elif default_config and os.path.isfile(default_config):
logging.config.fileConfig(default_config,
disable_existing_loggers=False)
else:
init_console_logging(config)
def find_console_handler(logger):
"""Return a stream handler, if it exists."""
for handler in logger.handlers:
if (isinstance(handler, logging.StreamHandler) and
handler.stream == sys.stderr):
return handler
def log_level(config):
"""Get debug settings from configuration.
--debug: turn on additional debug code/inspection (implies
logging.DEBUG)
--verbose: turn up logging output (logging.DEBUG)
--quiet: turn down logging output (logging.WARNING)
default is logging.INFO
:param config: object with configuration namespace (ex. argparse parser)
"""
if config.get('debug') is True:
return logging.DEBUG
elif config.get('verbose') is True:
return logging.DEBUG
elif config.get('quiet') is True:
return logging.WARNING
else:
return logging.INFO
def get_debug_formatter(config):
"""Get debug formatter based on configuration.
:param config: configuration namespace (ex. argparser)
--debug: log line numbers and file data also
--verbose: standard debug
--quiet: turn down logging output (logging.WARNING)
default is logging.INFO
:param config: object with configuration namespace (ex. argparse parser)
"""
if config.get('debug') is True:
return DebugFormatter('%(pathname)s:%(lineno)d: %(levelname)-8s '
'%(message)s')
elif config.get('verbose') is True:
return logging.Formatter(
'%(name)-30s: %(levelname)-8s %(message)s')
elif config.get('quiet') is True:
return logging.Formatter('%(message)s')
else:
return logging.Formatter('%(message)s')
def init_console_logging(config):
"""Enable logging to the console.
:param config: object with configuration namespace (ex. argparse parser)
"""
# define a Handler which writes messages to the sys.stderr
console = find_console_handler(logging.getLogger())
if not console:
console = logging.StreamHandler()
logging_level = log_level(config)
console.setLevel(logging_level)
# set a format which is simpler for console use
formatter = get_debug_formatter(config)
# tell the handler to use this format
console.setFormatter(formatter)
# add the handler to the root logger
logging.getLogger().addHandler(console)
logging.getLogger().setLevel(logging_level)
global LOG # pylint: disable=W0603
LOG = logging.getLogger(__name__) # reset

View File

@ -1,23 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Popen wrapper to allow custom patching of subprocess.Popen."""
import subprocess
PIPE = subprocess.PIPE
STDOUT = subprocess.STDOUT
def popen(*args, **kwargs):
"""Wrap Popen to allow for higher level patching if necessary."""
return subprocess.Popen(*args, **kwargs)

View File

@ -1,128 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Templating module."""
from __future__ import absolute_import
import json
import logging
import jinja2
from jinja2 import sandbox
import six
CODE_CACHE = {}
LOG = logging.getLogger(__name__)
if six.PY3:
StandardError = Exception
class TemplateException(Exception):
"""Error applying template."""
class CompilerCache(jinja2.BytecodeCache):
"""Cache for compiled template code.
This is leveraged when satori is used from within a long-running process,
like a server. It does not speed things up for command-line use.
"""
def load_bytecode(self, bucket):
"""Load compiled code from cache."""
if bucket.key in CODE_CACHE:
bucket.bytecode_from_string(CODE_CACHE[bucket.key])
def dump_bytecode(self, bucket):
"""Write compiled code into cache."""
CODE_CACHE[bucket.key] = bucket.bytecode_to_string()
def do_prepend(value, param='/'):
"""Prepend a string if the passed in string exists.
Example:
The template '{{ root|prepend('/')}}/path';
Called with root undefined renders:
/path
Called with root defined as 'root' renders:
/root/path
"""
if value:
return '%s%s' % (param, value)
else:
return ''
def preserve_linefeeds(value):
"""Escape linefeeds.
To make templates work with both YAML and JSON, escape linefeeds instead of
allowing Jinja to render them.
"""
return value.replace("\n", "\\n").replace("\r", "")
def get_jinja_environment(template, extra_globals=None, **env_vars):
"""Return a sandboxed jinja environment."""
template_map = {'template': template}
env = sandbox.ImmutableSandboxedEnvironment(
loader=jinja2.DictLoader(template_map),
bytecode_cache=CompilerCache(), **env_vars)
env.filters['prepend'] = do_prepend
env.filters['preserve'] = preserve_linefeeds
env.globals['json'] = json
if extra_globals:
env.globals.update(extra_globals)
return env
def parse(template, extra_globals=None, env_vars=None, **kwargs):
"""Parse template.
:param template: the template contents as a string
:param extra_globals: additional globals to include
:param kwargs: extra arguments are passed to the renderer
"""
if env_vars is None:
env_vars = {}
env = get_jinja_environment(template, extra_globals=extra_globals,
**env_vars)
minimum_kwargs = {
'data': {},
}
minimum_kwargs.update(kwargs)
try:
template = env.get_template('template')
except jinja2.TemplateSyntaxError as exc:
LOG.error(exc, exc_info=True)
error_message = "Template had a syntax error: %s" % exc
raise TemplateException(error_message)
try:
result = template.render(**minimum_kwargs)
# TODO(zns): exceptions in Jinja template sometimes missing traceback
except jinja2.TemplateError as exc:
LOG.error(exc, exc_info=True)
error_message = "Template had an error: %s" % exc
raise TemplateException(error_message)
except StandardError as exc:
LOG.error(exc, exc_info=True)
error_message = "Template rendering failed: %s" % exc
raise TemplateException(error_message)
return result

View File

@ -1,557 +0,0 @@
#!/usr/bin/python
# Copyright (c) 2003-2012 CORE Security Technologies
#
# This software is provided under under a slightly modified version
# of the Apache Software License. See the accompanying LICENSE file
# for more information.
#
# $Id: psexec.py 712 2012-09-06 04:26:22Z bethus@gmail.com $
#
# PSEXEC like functionality example using
# RemComSvc (https://github.com/kavika13/RemCom)
#
# Author:
# beto (bethus@gmail.com)
#
# Reference for:
# DCE/RPC and SMB.
""".
OK
"""
import cmd
import os
import re
import sys
from impacket.dcerpc import dcerpc
from impacket.dcerpc import transport
from impacket.examples import remcomsvc
from impacket import smbconnection
from impacket import structure as im_structure
from impacket import version
from satori import serviceinstall
import argparse
import random
import string
try:
import eventlet
threading = eventlet.patcher.original('threading')
time = eventlet.patcher.original('time')
except ImportError:
import threading
import time
class RemComMessage(im_structure.Structure):
"""."""
structure = (
('Command', '4096s=""'),
('WorkingDir', '260s=""'),
('Priority', '<L=0x20'),
('ProcessID', '<L=0x01'),
('Machine', '260s=""'),
('NoWait', '<L=0'),
)
class RemComResponse(im_structure.Structure):
"""."""
structure = (
('ErrorCode', '<L=0'),
('ReturnCode', '<L=0'),
)
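The `impacket` `Structure` strings above describe fixed-size wire formats; `RemComResponse`, for instance, is just two little-endian unsigned 32-bit integers. A stdlib sketch of that layout (illustrative only, not part of satori):

```python
import struct


# RemComResponse on the wire: ErrorCode and ReturnCode, each '<L'
# (little-endian unsigned 32-bit integer), 8 bytes total.
def pack_response(error_code, return_code):
    return struct.pack('<LL', error_code, return_code)


def unpack_response(data):
    error_code, return_code = struct.unpack('<LL', data[:8])
    return {'ErrorCode': error_code, 'ReturnCode': return_code}
```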
RemComSTDOUT = "RemCom_stdout"
RemComSTDIN = "RemCom_stdin"
RemComSTDERR = "RemCom_stderr"
lock = threading.Lock()
class PSEXEC:
"""."""
KNOWN_PROTOCOLS = {
'139/SMB': (r'ncacn_np:%s[\pipe\svcctl]', 139),
'445/SMB': (r'ncacn_np:%s[\pipe\svcctl]', 445),
}
def __init__(self, command, path, exeFile, protocols=None,
username='', password='', domain='', hashes=None):
"""."""
if not protocols:
protocols = PSEXEC.KNOWN_PROTOCOLS.keys()
self.__username = username
self.__password = password
self.__protocols = [protocols]
self.__command = command
self.__path = path
self.__domain = domain
self.__lmhash = ''
self.__nthash = ''
self.__exeFile = exeFile
if hashes is not None:
self.__lmhash, self.__nthash = hashes.split(':')
def run(self, addr):
"""."""
for protocol in self.__protocols:
protodef = PSEXEC.KNOWN_PROTOCOLS[protocol]
port = protodef[1]
print("Trying protocol %s...\n" % protocol)
stringbinding = protodef[0] % addr
rpctransport = transport.DCERPCTransportFactory(stringbinding)
rpctransport.set_dport(port)
if hasattr(rpctransport, 'preferred_dialect'):
rpctransport.preferred_dialect(smbconnection.SMB_DIALECT)
if hasattr(rpctransport, 'set_credentials'):
# This method exists only for selected protocol sequences.
rpctransport.set_credentials(self.__username, self.__password,
self.__domain, self.__lmhash,
self.__nthash)
self.doStuff(rpctransport)
def openPipe(self, s, tid, pipe, accessMask):
"""."""
pipeReady = False
tries = 50
while pipeReady is False and tries > 0:
try:
s.waitNamedPipe(tid, pipe)
pipeReady = True
except Exception:
tries -= 1
time.sleep(2)
pass
if tries == 0:
print('[!] Pipe not ready, aborting')
raise
fid = s.openFile(tid, pipe, accessMask, creationOption=0x40,
fileAttributes=0x80)
return fid
def doStuff(self, rpctransport):
"""."""
dce = dcerpc.DCERPC_v5(rpctransport)
try:
dce.connect()
except Exception as e:
print(e)
sys.exit(1)
global dialect
dialect = rpctransport.get_smb_connection().getDialect()
try:
unInstalled = False
s = rpctransport.get_smb_connection()
# We don't wanna deal with timeouts from now on.
s.setTimeout(100000)
svcName = "RackspaceSystemDiscovery"
executableName = "RackspaceSystemDiscovery.exe"
if self.__exeFile is None:
svc = remcomsvc.RemComSvc()
installService = serviceinstall.ServiceInstall(s, svc,
svcName,
executableName)
else:
try:
f = open(self.__exeFile)
except Exception as e:
print(e)
sys.exit(1)
installService = serviceinstall.ServiceInstall(s, f,
svcName,
executableName)
installService.install()
if self.__exeFile is not None:
f.close()
tid = s.connectTree('IPC$')
fid_main = self.openPipe(s, tid, '\RemCom_communicaton', 0x12019f)
packet = RemComMessage()
pid = os.getpid()
packet['Machine'] = ''.join([random.choice(string.letters)
for i in range(4)])
if self.__path is not None:
packet['WorkingDir'] = self.__path
packet['Command'] = self.__command
packet['ProcessID'] = pid
s.writeNamedPipe(tid, fid_main, str(packet))
# Here we'll store the command we type so we don't print it back ;)
# ( I know.. globals are nasty :P )
global LastDataSent
LastDataSent = ''
retCode = None
# Create the pipes threads
stdin_pipe = RemoteStdInPipe(rpctransport,
'\%s%s%d' % (RemComSTDIN,
packet['Machine'],
packet['ProcessID']),
smbconnection.smb.FILE_WRITE_DATA |
smbconnection.smb.FILE_APPEND_DATA,
installService.getShare())
stdin_pipe.start()
stdout_pipe = RemoteStdOutPipe(rpctransport,
'\%s%s%d' % (RemComSTDOUT,
packet['Machine'],
packet['ProcessID']),
smbconnection.smb.FILE_READ_DATA)
stdout_pipe.start()
stderr_pipe = RemoteStdErrPipe(rpctransport,
'\%s%s%d' % (RemComSTDERR,
packet['Machine'],
packet['ProcessID']),
smbconnection.smb.FILE_READ_DATA)
stderr_pipe.start()
# And we stay here till the end
ans = s.readNamedPipe(tid, fid_main, 8)
if len(ans):
retCode = RemComResponse(ans)
print("[*] Process %s finished with ErrorCode: %d, "
"ReturnCode: %d" % (self.__command, retCode['ErrorCode'],
retCode['ReturnCode']))
installService.uninstall()
unInstalled = True
sys.exit(retCode['ReturnCode'])
except Exception:
if unInstalled is False:
installService.uninstall()
sys.stdout.flush()
if retCode:
sys.exit(retCode['ReturnCode'])
else:
sys.exit(1)
class Pipes(threading.Thread):
"""."""
def __init__(self, transport, pipe, permissions, share=None):
"""."""
threading.Thread.__init__(self)
self.server = 0
self.transport = transport
self.credentials = transport.get_credentials()
self.tid = 0
self.fid = 0
self.share = share
self.port = transport.get_dport()
self.pipe = pipe
self.permissions = permissions
self.daemon = True
def connectPipe(self):
"""."""
try:
lock.acquire()
global dialect
remoteHost = self.transport.get_smb_connection().getRemoteHost()
# self.server = SMBConnection('*SMBSERVER',
# self.transport.get_smb_connection().getRemoteHost(),
# sess_port = self.port, preferredDialect = SMB_DIALECT)
self.server = smbconnection.SMBConnection('*SMBSERVER', remoteHost,
sess_port=self.port,
preferredDialect=dialect) # noqa
user, passwd, domain, lm, nt = self.credentials
self.server.login(user, passwd, domain, lm, nt)
lock.release()
self.tid = self.server.connectTree('IPC$')
self.server.waitNamedPipe(self.tid, self.pipe)
self.fid = self.server.openFile(self.tid, self.pipe,
self.permissions,
creationOption=0x40,
fileAttributes=0x80)
self.server.setTimeout(1000000)
except Exception:
message = ("[!] Something went wrong connecting the pipes(%s), "
"try again")
print(message % self.__class__)
class RemoteStdOutPipe(Pipes):
"""."""
def __init__(self, transport, pipe, permisssions):
"""."""
Pipes.__init__(self, transport, pipe, permisssions)
def run(self):
"""."""
self.connectPipe()
while True:
try:
ans = self.server.readFile(self.tid, self.fid, 0, 1024)
except Exception:
pass
else:
try:
global LastDataSent
if ans != LastDataSent: # noqa
sys.stdout.write(ans)
sys.stdout.flush()
else:
# Don't echo what I sent, and clear it up
LastDataSent = ''
# Just in case this got out of sync, i'm cleaning it
# up if there are more than 10 chars,
# it will give false positives tho.. we should find a
# better way to handle this.
if len(LastDataSent) > 10:
LastDataSent = ''
except Exception:
pass
class RemoteStdErrPipe(Pipes):
"""."""
def __init__(self, transport, pipe, permisssions):
"""."""
Pipes.__init__(self, transport, pipe, permisssions)
def run(self):
"""."""
self.connectPipe()
while True:
try:
ans = self.server.readFile(self.tid, self.fid, 0, 1024)
except Exception:
pass
else:
try:
sys.stderr.write(str(ans))
sys.stderr.flush()
except Exception:
pass
class RemoteShell(cmd.Cmd):
"""."""
def __init__(self, server, port, credentials, tid, fid, share):
"""."""
cmd.Cmd.__init__(self, False)
self.prompt = '\x08'
self.server = server
self.transferClient = None
self.tid = tid
self.fid = fid
self.credentials = credentials
self.share = share
self.port = port
self.intro = '[!] Press help for extra shell commands'
def connect_transferClient(self):
"""."""
# self.transferClient = SMBConnection('*SMBSERVER',
# self.server.getRemoteHost(), sess_port = self.port,
# preferredDialect = SMB_DIALECT)
self.transferClient = smbconnection.SMBConnection('*SMBSERVER',
self.server.getRemoteHost(),
sess_port=self.port,
preferredDialect=dialect) # noqa
user, passwd, domain, lm, nt = self.credentials
self.transferClient.login(user, passwd, domain, lm, nt)
def do_help(self, line):
"""."""
print("""
lcd {path} - changes the current local directory to {path}
exit - terminates the server process (and this session)
put {src_file, dst_path} - uploads a local file to the dst_path RELATIVE to
the connected share (%s)
get {file} - downloads pathname RELATIVE to the connected
share (%s) to the current local dir
! {cmd} - executes a local shell cmd
""" % (self.share, self.share))
self.send_data('\r\n', False)
def do_shell(self, s):
"""."""
os.system(s)
self.send_data('\r\n')
def do_get(self, src_path):
"""."""
try:
if self.transferClient is None:
self.connect_transferClient()
import ntpath
filename = ntpath.basename(src_path)
fh = open(filename, 'wb')
print("[*] Downloading %s\%s" % (self.share, src_path))
self.transferClient.getFile(self.share, src_path, fh.write)
fh.close()
except Exception as e:
print(e)
pass
self.send_data('\r\n')
def do_put(self, s):
"""."""
try:
if self.transferClient is None:
self.connect_transferClient()
params = s.split(' ')
if len(params) > 1:
src_path = params[0]
dst_path = params[1]
elif len(params) == 1:
src_path = params[0]
dst_path = '/'
src_file = os.path.basename(src_path)
fh = open(src_path, 'rb')
f = dst_path + '/' + src_file
pathname = string.replace(f, '/', '\\')
print("[*] Uploading %s to %s\\%s" % (src_file, self.share,
dst_path))
self.transferClient.putFile(self.share, pathname, fh.read)
fh.close()
except Exception as e:
print(e)
pass
self.send_data('\r\n')
def do_lcd(self, s):
"""."""
if s == '':
print(os.getcwd())
else:
os.chdir(s)
self.send_data('\r\n')
def emptyline(self):
"""."""
self.send_data('\r\n')
return
def do_EOF(self, line):
"""."""
self.server.logoff()
def default(self, line):
"""."""
self.send_data(line+'\r\n')
def send_data(self, data, hideOutput=True):
"""."""
if hideOutput is True:
global LastDataSent
LastDataSent = data
else:
LastDataSent = ''
self.server.writeFile(self.tid, self.fid, data)
class RemoteStdInPipe(Pipes):
"""RemoteStdInPipe class.
Used to connect to RemComSTDIN named pipe on remote system
"""
def __init__(self, transport, pipe, permisssions, share=None):
"""Constructor."""
Pipes.__init__(self, transport, pipe, permisssions, share)
def run(self):
"""."""
self.connectPipe()
self.shell = RemoteShell(self.server, self.port, self.credentials,
self.tid, self.fid, self.share)
self.shell.cmdloop()
# Process command-line arguments.
if __name__ == '__main__':
print(version.BANNER)
parser = argparse.ArgumentParser()
parser.add_argument('target', action='store',
help='[domain/][username[:password]@]<address>')
parser.add_argument('command', action='store',
help='command to execute at the target (w/o path)')
parser.add_argument('-path', action='store',
help='path of the command to execute')
parser.add_argument(
'-file', action='store',
help="alternative RemCom binary (be sure it doesn't require CRT)")
parser.add_argument(
'-port', action='store',
help='alternative port to use, this will copy settings from 445/SMB')
parser.add_argument('protocol', choices=PSEXEC.KNOWN_PROTOCOLS.keys(),
nargs='?', default='445/SMB',
help='transport protocol (default 445/SMB)')
group = parser.add_argument_group('authentication')
group.add_argument('-hashes', action="store", metavar="LMHASH:NTHASH",
help='NTLM hashes, format is LMHASH:NTHASH')
if len(sys.argv) == 1:
parser.print_help()
sys.exit(1)
options = parser.parse_args()
domain, username, password, address = re.compile(
'(?:(?:([^/@:]*)/)?([^@:]*)(?::([^.]*))?@)?(.*)'
).match(options.target).groups('')
if domain is None:
domain = ''
if options.port:
options.protocol = "%s/SMB" % options.port
executer = PSEXEC(options.command, options.path, options.file,
options.protocol, username, password, domain,
options.hashes)
if options.protocol not in PSEXEC.KNOWN_PROTOCOLS:
connection_string = 'ncacn_np:%s[\\pipe\\svcctl]'
PSEXEC.KNOWN_PROTOCOLS[options.protocol] = (connection_string,
options.port)
executer.run(address)
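The `target` argument is split by the regular expression above into domain, username, password, and address. Exercised standalone (stdlib only), the pattern behaves like this:

```python
import re

# Same pattern used to parse options.target above.
TARGET_RE = re.compile('(?:(?:([^/@:]*)/)?([^@:]*)(?::([^.]*))?@)?(.*)')


def parse_target(target):
    """Split '[domain/][username[:password]@]<address>' into its parts."""
    domain, username, password, address = TARGET_RE.match(target).groups('')
    return domain, username, password, address
```

Note that the password group `[^.]*` cannot contain dots, a quirk of the original pattern.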


@ -1,141 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Discovery Module.
TODO(zns): testing, refactoring, etc... just using this to demonstrate
functionality
Example usage:
from satori import discovery
discovery.run("foo.com")
"""
from __future__ import print_function
import sys
import traceback
import ipaddress
import novaclient.v1_1
from pythonwhois import shared
import six
from satori import dns
from satori import utils
def run(target, config=None, interactive=False):
"""Run discovery and return results."""
if config is None:
config = {}
found = {}
resources = {}
errors = {}
results = {
'target': target,
'created': utils.get_time_string(),
'found': found,
'resources': resources,
}
if utils.is_valid_ip_address(target):
ip_address = target
else:
hostname = dns.parse_target_hostname(target)
found['hostname'] = hostname
ip_address = six.text_type(dns.resolve_hostname(hostname))
# TODO(sam): Use ipaddress.ip_address.is_global
# .is_private
# .is_unspecified
# .is_multicast
# To determine address "type"
if not ipaddress.ip_address(ip_address).is_loopback:
try:
domain_info = dns.domain_info(hostname)
resource_type = 'OS::DNS::Domain'
identifier = '%s:%s' % (resource_type, hostname)
resources[identifier] = {
'type': resource_type,
'key': identifier,
}
found['domain-key'] = identifier
resources[identifier]['data'] = domain_info
if 'registered' in domain_info:
found['registered-domain'] = domain_info['registered']
except shared.WhoisException as exc:
results['domain'] = str(exc)
found['ip-address'] = ip_address
host, host_errors = discover_host(ip_address, config,
interactive=interactive)
if host_errors:
errors.update(host_errors)
key = host.get('key') or ip_address
resources[key] = host
found['host-key'] = key
results['updated'] = utils.get_time_string()
return results, errors
def discover_host(address, config, interactive=False):
"""Discover host by IP address."""
host = {}
errors = {}
if config.get('username'):
server = find_nova_host(address, config)
if server:
host['type'] = 'OS::Nova::Instance'
data = {}
host['data'] = data
data['uri'] = [l['href'] for l in server.links
if l['rel'] == 'self'][0]
data['name'] = server.name
data['id'] = server.id
data['addresses'] = server.addresses
host['key'] = data['uri']
if config.get('system_info'):
module_name = config['system_info'].replace("-", "_")
if '.' not in module_name:
module_name = 'satori.sysinfo.%s' % module_name
system_info_module = utils.import_object(module_name)
try:
result = system_info_module.get_systeminfo(
address, config, interactive=interactive)
host.setdefault('data', {})
host['data']['system_info'] = result
except Exception as exc:
exc_traceback = sys.exc_info()[2]
errors['system_info'] = {
'type': "ERROR",
'message': str(exc),
'exception': exc,
'traceback': traceback.format_tb(exc_traceback),
}
return host, errors
def find_nova_host(address, config):
"""See if a nova instance has the supplied address."""
nova = novaclient.v1_1.client.Client(config['username'],
config['password'],
config['tenant_id'],
config['authurl'],
region_name=config['region'],
service_type="compute")
for server in nova.servers.list():
for network_addresses in six.itervalues(server.addresses):
for ip_address in network_addresses:
if ip_address['addr'] == address:
return server
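The TODO in `run()` suggests classifying addresses with the `ipaddress` module's predicates; a short sketch of how that "type" determination could look (Python 3 stdlib shown):

```python
import ipaddress


def address_type(ip):
    """Classify an IP address string, as suggested by the TODO in run()."""
    addr = ipaddress.ip_address(ip)
    if addr.is_loopback:
        return 'loopback'
    if addr.is_multicast:
        return 'multicast'
    if addr.is_unspecified:
        return 'unspecified'
    if addr.is_private:
        return 'private'
    if addr.is_global:
        return 'global'
    return 'other'
```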


@ -1,113 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Satori DNS Discovery."""
import datetime
import logging
import socket
import pythonwhois
from six.moves.urllib import parse as urlparse
import tldextract
from satori import errors
from satori import utils
LOG = logging.getLogger(__name__)
def parse_target_hostname(target):
"""Get IP address or FQDN of a target which could be a URL or address."""
if not target:
raise errors.SatoriInvalidNetloc("Target must be supplied.")
try:
parsed = urlparse.urlparse(target)
except AttributeError as err:
error = "Target `%s` is unparseable. Error: %s" % (target, err)
LOG.exception(error)
raise errors.SatoriInvalidNetloc(error)
# Domain names and IP are in netloc when parsed with a protocol
# they will be in path if parsed without a protocol
return parsed.netloc or parsed.path
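The `parsed.netloc or parsed.path` fallback relies on how `urlparse` treats scheme-less input; a quick stdlib check (Python 3 `urllib.parse` shown, equivalent to the `six.moves` import above):

```python
from urllib.parse import urlparse

# With a scheme, the host lands in netloc; without one, it lands in path.
with_scheme = urlparse('http://example.com/page')
without_scheme = urlparse('example.com')

host = with_scheme.netloc or with_scheme.path        # host from netloc
bare = without_scheme.netloc or without_scheme.path  # host from path
```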
def resolve_hostname(hostname):
"""Get IP address of hostname."""
try:
address = socket.gethostbyname(hostname)
except socket.gaierror:
error = "`%s` is an invalid domain." % hostname
raise errors.SatoriInvalidDomain(error)
return address
def get_registered_domain(hostname):
"""Get the root DNS domain of an FQDN."""
return tldextract.extract(hostname).registered_domain
def ip_info(ip_address):
"""Get as much information as possible for a given ip address."""
if not utils.is_valid_ip_address(ip_address):
error = "`%s` is an invalid IP address." % ip_address
raise errors.SatoriInvalidIP(error)
result = pythonwhois.get_whois(ip_address)
return {
'whois': result['raw']
}
def domain_info(domain):
"""Get as much information as possible for a given domain name."""
registered_domain = get_registered_domain(domain)
if utils.is_valid_ip_address(domain) or registered_domain == '':
error = "`%s` is an invalid domain." % domain
raise errors.SatoriInvalidDomain(error)
result = pythonwhois.get_whois(registered_domain)
registrar = []
if 'registrar' in result and len(result['registrar']) > 0:
registrar = result['registrar'][0]
nameservers = result.get('nameservers', [])
days_until_expires = None
expires = None
if 'expiration_date' in result:
if (isinstance(result['expiration_date'], list)
and len(result['expiration_date']) > 0):
expires = result['expiration_date'][0]
if isinstance(expires, datetime.datetime):
days_until_expires = (expires - datetime.datetime.now()).days
expires = utils.get_time_string(time_obj=expires)
else:
days_until_expires = (utils.parse_time_string(expires) -
datetime.datetime.now()).days
return {
'name': registered_domain,
'whois': result['raw'],
'registrar': registrar,
'nameservers': nameservers,
'days_until_expires': days_until_expires,
'expiration_date': expires,
}
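The expiry arithmetic above reduces to a `datetime` subtraction; a small standalone sketch (the 'YYYY-MM-DD' string format is a hypothetical example — real whois data varies):

```python
import datetime


def days_until(expiration, now=None):
    """Days from `now` until an expiration datetime or 'YYYY-MM-DD' string."""
    if now is None:
        now = datetime.datetime.now()
    if not isinstance(expiration, datetime.datetime):
        # Assumed format for illustration; whois dates are not this uniform.
        expiration = datetime.datetime.strptime(expiration, '%Y-%m-%d')
    return (expiration - now).days
```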
def netloc_info(netloc):
"""Return info for a netloc, which may be an IP or a domain name."""
if utils.is_valid_ip_address(netloc):
return ip_info(netloc)
else:
return domain_info(netloc)


@ -1,116 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Satori Discovery Errors."""
class SatoriException(Exception):
"""Parent class for Satori exceptions.
Accepts a string error message that describes the error.
"""
class UndeterminedPlatform(SatoriException):
"""The target system's platform could not be determined."""
class SatoriInvalidNetloc(SatoriException):
"""Netloc that cannot be parsed by `urlparse`."""
class SatoriInvalidDomain(SatoriException):
"""Invalid Domain provided."""
class SatoriInvalidIP(SatoriException):
"""Invalid IP provided."""
class SatoriShellException(SatoriException):
"""Invalid shell parameters."""
class SatoriAuthenticationException(SatoriException):
"""Invalid login credentials."""
class SatoriSMBAuthenticationException(SatoriAuthenticationException):
"""Invalid login credentials for use over SMB to server."""
class SatoriSMBLockoutException(SatoriSMBAuthenticationException):
"""Too many invalid logon attempts, the user has been locked out."""
class SatoriSMBFileSharingException(SatoriException):
"""Incompatible shared access flags for a file on the Windows system."""
class GetPTYRetryFailure(SatoriException):
"""Tried to re-run command with get_pty to no avail."""
class DiscoveryException(SatoriException):
"""Discovery exception with custom message."""
class SatoriDuplicateCommandException(SatoriException):
"""The command cannot be run because it was already found to be running."""
class UnsupportedPlatform(DiscoveryException):
"""Unsupported operating system or distro."""
class SystemInfoCommandMissing(DiscoveryException):
"""Command that provides system information is missing."""
class SystemInfoCommandOld(DiscoveryException):
"""Command that provides system information is outdated."""
class SystemInfoNotJson(DiscoveryException):
"""Command did not produce valid JSON."""
class SystemInfoMissingJson(DiscoveryException):
"""Command did not produce stdout containing JSON."""
class SystemInfoInvalid(DiscoveryException):
"""Command did not produce valid JSON or XML."""
class SystemInfoCommandInstallFailed(DiscoveryException):
"""Failed to install package that provides system information."""


@ -1,23 +0,0 @@
{#
This is a jinja template used to customize the output of satori. You can add
as many of these as you'd like to your setup. You can reference them using the
--format (or -F) argument. satori takes the format you asked for and appends
".jinja" to it to look up the file from this directory.
You have some global variables available to you:
- target: that's the address or URL supplied at the command line
- data: all the discovery data
For example, to use this template:
$ satori openstack.org -F custo
Hi! localhost
Happy customizing :-)
#}
Hi! {{ target }}


@ -1,54 +0,0 @@
{% set found = data['found'] | default({}) %}
{% set resources = data['resources'] | default({'n/a': {}}) %}
{% set address = found['ip-address'] %}
{% set hostkey = found['host-key'] | default('n/a') %}
{% set domainkey = found['domain-key'] | default('n/a') %}
{% set server = resources[hostkey] | default(False) %}
{% set domain = resources[domainkey] | default(False) %}
{% if found['ip-address'] != target %}Address:
{{ target }} resolves to IPv4 address {{ found['ip-address'] }}
{%- endif %}
{% if domain %}Domain: {{ domain['data'].name }}
Registrar: {{ domain['data'].registrar }}
{% if domain['data'].nameservers %}
Nameservers: {% for nameserver in domain['data'].nameservers %}{{nameserver}}{% if not loop.last %}, {% endif %}{% endfor %}
{% endif %}
{% if domain['data'].days_until_expires %}
Expires: {{ domain['data'].days_until_expires }} days
{% endif %}
{%- endif %}
{% if server and server.type == 'OS::Nova::Instance' %}
Host:
{{ found['ip-address'] }} ({{ target }}) is hosted on a Nova instance
{% if 'data' in server %} Instance Information:
URI: {{ server['data'].uri | default('n/a') }}
Name: {{ server['data'].name | default('n/a') }}
ID: {{ server['data'].id | default('n/a') }}
{% if 'addresses' in server['data'] %} ip-addresses:
{% for name, addresses in server['data'].addresses.items() %}
{{ name }}:
{% for address in addresses %}
{{ address.addr }}
{% endfor %}
{% endfor %}{% endif %}{% endif %}
{% elif found['ip-address'] %}
Host:
ip-address: {{ found['ip-address'] }}
{% else %}Host not found
{% endif %}
{% if server and 'data' in server and server['data'].system_info %}
{% if 'remote_services' in server['data'].system_info %}
Listening Services:
{% for remote in server['data'].system_info.remote_services | sort %}
{{ remote.ip }}:{{ remote.port }} {{ remote.process }}
{% endfor %}{% endif %}
{% if 'connections' in server['data'].system_info %}
Talking to:
{% for connection in server['data'].system_info.connections | dictsort %}
{{ connection[0] }}{% if connection[1] %} on {% for port in connection[1] %}{{ port }}{% if not loop.last %}, {% endif %}{% endfor %}{% endif %}
{% endfor %}{% endif %}
{% endif %}


@ -1,303 +0,0 @@
# Copyright (c) 2003-2012 CORE Security Technologies
#
# This software is provided under a slightly modified version
# of the Apache Software License. See the accompanying LICENSE file
# for more information.
#
# $Id: serviceinstall.py 1141 2014-02-12 16:39:51Z bethus@gmail.com $
#
# Service Install Helper library used by psexec and smbrelayx
# You provide an already established connection and an exefile
# (or class that mimics a file class) and this will install and
# execute the service, and then uninstall it (install(), uninstall()).
# It tries to take care as much as possible to leave everything clean.
#
# Author:
# Alberto Solino (bethus@gmail.com)
#
"""This module has been copied from impacket.examples.serviceinstall.
It exposes a class that can be used to install services on Windows devices
"""
import random
import string
from impacket.dcerpc import dcerpc
from impacket.dcerpc import srvsvc
from impacket.dcerpc import svcctl
from impacket.dcerpc import transport
from impacket import smb
from impacket import smb3
from impacket import smbconnection
class ServiceInstall():
"""Class to manage Services on a remote windows server.
This class is slightly improved from the example in the impacket package
in a way that it allows to specify a service and executable name during
instantiation rather than using a random name by default
"""
def __init__(self, SMBObject, exeFile, serviceName=None,
binaryServiceName=None):
"""Contructor of the class.
:param SMBObject: existing SMBObject
:param exeFile: file handle or class that mimics a file
class, this will be used to create the
service
:param serviceName: name of the service to be created, will be
random if not set
:param binaryServiceName: name of the uploaded file, will be random if
not set
"""
print("In constructor now!!!")
self._rpctransport = 0
if not serviceName:
self.__service_name = ''.join(
[random.choice(string.letters) for i in range(4)])
else:
self.__service_name = serviceName
if not binaryServiceName:
self.__binary_service_name = ''.join(
[random.choice(string.letters) for i in range(8)]) + '.exe'
else:
self.__binary_service_name = binaryServiceName
self.__exeFile = exeFile
# We might receive two different types of objects, always end up
# with a SMBConnection one
if isinstance(SMBObject, smb.SMB) or isinstance(SMBObject, smb3.SMB3):
self.connection = smbconnection.SMBConnection(
existingConnection=SMBObject)
else:
self.connection = SMBObject
self.share = ''
def getShare(self):
"""Return the writable share that has been used to upload the file."""
return self.share
def getShares(self):
"""Return a list of shares on the remote windows server."""
# Set up a DCE SMBTransport with the connection already in place
print("[*] Requesting shares on %s....." % (
self.connection.getRemoteHost()))
try:
self._rpctransport = transport.SMBTransport(
'', '', filename=r'\srvsvc', smb_connection=self.connection)
self._dce = dcerpc.DCERPC_v5(self._rpctransport)
self._dce.connect()
self._dce.bind(srvsvc.MSRPC_UUID_SRVSVC)
srv_svc = srvsvc.DCERPCSrvSvc(self._dce)
resp = srv_svc.get_share_enum_1(self._rpctransport.get_dip())
return resp
except Exception:
print("[!] Error requesting shares on %s, aborting....." % (
self.connection.getRemoteHost()))
raise
def createService(self, handle, share, path):
"""Install Service on the remote server.
This method will connect to the SVCManager on the remote server and
install the service as specified in the constructor.
"""
print("[*] Creating service %s on %s....." % (
self.__service_name, self.connection.getRemoteHost()))
# First we try to open the service in case it exists.
# If it does, we remove it.
try:
resp = self.rpcsvc.OpenServiceW(
handle, self.__service_name.encode('utf-16le'))
except Exception as e:
if e.get_error_code() == svcctl.ERROR_SERVICE_DOES_NOT_EXISTS:
# We're good, pass the exception
pass
else:
raise
else:
# It exists, remove it
self.rpcsvc.DeleteService(resp['ContextHandle'])
self.rpcsvc.CloseServiceHandle(resp['ContextHandle'])
# Create the service
command = '%s\\%s' % (path, self.__binary_service_name)
try:
resp = self.rpcsvc.CreateServiceW(
handle, self.__service_name.encode('utf-16le'),
self.__service_name.encode('utf-16le'),
command.encode('utf-16le'))
except Exception:
print("[!] Error creating service %s on %s" % (
self.__service_name, self.connection.getRemoteHost()))
raise
else:
return resp['ContextHandle']
def openSvcManager(self):
"""Connect to the SVCManager on the remote host."""
print("[*] Opening SVCManager on %s...."
"." % self.connection.getRemoteHost())
# Set up a DCE SMBTransport with the connection already in place
self._rpctransport = transport.SMBTransport(
'', '', filename=r'\svcctl', smb_connection=self.connection)
self._dce = dcerpc.DCERPC_v5(self._rpctransport)
self._dce.connect()
self._dce.bind(svcctl.MSRPC_UUID_SVCCTL)
self.rpcsvc = svcctl.DCERPCSvcCtl(self._dce)
try:
resp = self.rpcsvc.OpenSCManagerW()
except Exception:
print("[!] Error opening SVCManager on %s...."
"." % self.connection.getRemoteHost())
raise Exception('Unable to open SVCManager')
else:
return resp['ContextHandle']
def copy_file(self, src, tree, dst):
"""Copy file to remote SMB share."""
print("[*] Uploading file %s" % dst)
if isinstance(src, str):
# We have a filename
fh = open(src, 'rb')
else:
# We have a class instance, it must have a read method
fh = src
f = dst
pathname = string.replace(f, '/', '\\')
try:
self.connection.putFile(tree, pathname, fh.read)
except Exception:
print("[!] Error uploading file %s, aborting....." % dst)
raise
fh.close()
def findWritableShare(self, shares):
"""Retrieve a list of writable shares on the remote host."""
# Check we can write a file on the shares, stop in the first one
for i in shares:
if (i['Type'] == smb.SHARED_DISK or
i['Type'] == smb.SHARED_DISK_HIDDEN):
share = i['NetName'].decode('utf-16le')[:-1]
try:
self.connection.createDirectory(share, 'BETO')
except Exception:
# Can't create, pass
print("[!] share '%s' is not writable." % share)
pass
else:
print('[*] Found writable share %s' % share)
self.connection.deleteDirectory(share, 'BETO')
return str(share)
return None
def install(self): # noqa
"""Install the service on the remote host."""
if self.connection.isGuestSession():
print("[!] Authenticated as Guest. Aborting")
self.connection.logoff()
del(self.connection)
else:
fileCopied = False
serviceCreated = False
# Do the stuff here
try:
# Let's get the shares
shares = self.getShares()
self.share = self.findWritableShare(shares)
self.copy_file(self.__exeFile,
self.share,
self.__binary_service_name)
fileCopied = True
svcManager = self.openSvcManager()
if svcManager != 0:
serverName = self.connection.getServerName()
if serverName != '':
path = '\\\\%s\\%s' % (serverName, self.share)
else:
path = '\\\\127.0.0.1\\' + self.share
service = self.createService(svcManager, self.share, path)
serviceCreated = True
if service != 0:
# Start service
print('[*] Starting service %s....'
'.' % self.__service_name)
try:
self.rpcsvc.StartServiceW(service)
except Exception:
pass
self.rpcsvc.CloseServiceHandle(service)
self.rpcsvc.CloseServiceHandle(svcManager)
return True
except Exception as e:
print("[!] Error performing the installation, cleaning up: "
"%s" % e)
try:
self.rpcsvc.StopService(service)
except Exception:
pass
if fileCopied is True:
try:
self.connection.deleteFile(self.share,
self.__binary_service_name)
except Exception:
pass
if serviceCreated is True:
try:
self.rpcsvc.DeleteService(service)
except Exception:
pass
return False
def uninstall(self):
"""Uninstall service from remote host and delete file from share."""
fileCopied = True
serviceCreated = True
# Do the stuff here
try:
# Let's get the shares
svcManager = self.openSvcManager()
if svcManager != 0:
resp = self.rpcsvc.OpenServiceA(svcManager,
self.__service_name)
service = resp['ContextHandle']
                print('[*] Stopping service %s.....' % self.__service_name)
try:
self.rpcsvc.StopService(service)
except Exception:
pass
print('[*] Removing service %s.....' % self.__service_name)
self.rpcsvc.DeleteService(service)
self.rpcsvc.CloseServiceHandle(service)
self.rpcsvc.CloseServiceHandle(svcManager)
print('[*] Removing file %s.....' % self.__binary_service_name)
self.connection.deleteFile(self.share, self.__binary_service_name)
except Exception:
print("[!] Error performing the uninstallation, cleaning up")
try:
self.rpcsvc.StopService(service)
except Exception:
pass
if fileCopied is True:
try:
self.connection.deleteFile(self.share,
self.__binary_service_name)
                except Exception:
                    # Retry the delete once before giving up
                    try:
                        self.connection.deleteFile(self.share,
                                                   self.__binary_service_name)
                    except Exception:
                        pass
if serviceCreated is True:
try:
self.rpcsvc.DeleteService(service)
except Exception:
pass


@ -1,285 +0,0 @@
# Copyright 2012-2013 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Command-line interface to Configuration Discovery.
Accept a network location, run through the discovery process and report the
findings back to the user.
"""
from __future__ import print_function
import argparse
import json
import logging
import os
import sys
from satori.common import logging as common_logging
from satori.common import templating
from satori import discovery
from satori import errors
LOG = logging.getLogger(__name__)
def netloc_parser(data):
"""Parse the netloc parameter.
:returns: username, url.
"""
if data and '@' in data:
first_at = data.index('@')
return (data[0:first_at] or None), data[first_at + 1:] or None
else:
return None, data or None
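The split-at-first-`@` behaviour of `netloc_parser` can be sketched standalone (same logic as above, with illustrative inputs):

```python
def netloc_parser(data):
    """Split 'user@address' at the first '@'; empty parts become None."""
    if data and '@' in data:
        first_at = data.index('@')
        return (data[0:first_at] or None), data[first_at + 1:] or None
    return None, data or None

assert netloc_parser('root@web01') == ('root', 'web01')
assert netloc_parser('sub.domain.com') == (None, 'sub.domain.com')
assert netloc_parser('@web01') == (None, 'web01')  # empty username -> None
```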
def parse_args(argv):
"""Parse the command line arguments."""
parser = argparse.ArgumentParser(description='Configuration discovery.')
parser.add_argument(
'netloc',
help="Network location as a URL, address, or ssh-style user@address. "
"E.g. https://domain.com, sub.domain.com, 4.3.2.1, or root@web01. "
             "Supplying a username before an @ without the `--system-info` "
             "argument will default `--system-info` to 'ohai-solo'."
)
#
# Openstack Client Settings
#
openstack_group = parser.add_argument_group(
'OpenStack Settings',
"Cloud credentials, settings and endpoints. If a network location is "
"found to be hosted on the tenant additional information is provided."
)
openstack_group.add_argument(
'--os-username',
dest='username',
default=os.environ.get('OS_USERNAME'),
help="OpenStack Auth username. Defaults to env[OS_USERNAME]."
)
openstack_group.add_argument(
'--os-password',
dest='password',
default=os.environ.get('OS_PASSWORD'),
help="OpenStack Auth password. Defaults to env[OS_PASSWORD]."
)
openstack_group.add_argument(
'--os-region-name',
dest='region',
default=os.environ.get('OS_REGION_NAME'),
help="OpenStack region. Defaults to env[OS_REGION_NAME]."
)
openstack_group.add_argument(
'--os-auth-url',
dest='authurl',
default=os.environ.get('OS_AUTH_URL'),
help="OpenStack Auth endpoint. Defaults to env[OS_AUTH_URL]."
)
openstack_group.add_argument(
'--os-compute-api-version',
dest='compute_api_version',
default=os.environ.get('OS_COMPUTE_API_VERSION', '1.1'),
help="OpenStack Compute API version. Defaults to "
"env[OS_COMPUTE_API_VERSION] or 1.1."
)
# Tenant name or ID can be supplied
tenant_group = openstack_group.add_mutually_exclusive_group()
tenant_group.add_argument(
'--os-tenant-name',
dest='tenant_name',
default=os.environ.get('OS_TENANT_NAME'),
help="OpenStack Auth tenant name. Defaults to env[OS_TENANT_NAME]."
)
tenant_group.add_argument(
'--os-tenant-id',
dest='tenant_id',
default=os.environ.get('OS_TENANT_ID'),
help="OpenStack Auth tenant ID. Defaults to env[OS_TENANT_ID]."
)
#
# Plugins
#
parser.add_argument(
'--system-info',
help="Mechanism to use on a Nova resource to obtain system "
             "information. E.g. ohai, facts, facter."
)
#
# Output formatting and logging
#
parser.add_argument(
'--format', '-F',
dest='format',
default='text',
help="Format for output (json or text)."
)
parser.add_argument(
"--logconfig",
help="Optional logging configuration file."
)
parser.add_argument(
"-d", "--debug",
action="store_true",
help="turn on additional debugging inspection and "
"output including full HTTP requests and responses. "
"Log output includes source file path and line "
"numbers."
)
parser.add_argument(
"-v", "--verbose",
action="store_true",
help="turn up logging to DEBUG (default is INFO)."
)
parser.add_argument(
"-q", "--quiet",
action="store_true",
help="turn down logging to WARN (default is INFO)."
)
#
# SSH options
#
ssh_group = parser.add_argument_group(
'ssh-like Settings',
'To be used to access hosts.'
)
    # ssh.py actually handles the defaults. We're documenting it here so that
# the command-line help string is informative, but the default is set in
# ssh.py (by calling paramiko's load_system_host_keys).
ssh_group.add_argument(
"-i", "--host-key-path",
type=argparse.FileType('r'),
help="Selects a file from which the identity (private key) for public "
             "key authentication is read. The defaults are ~/.ssh/id_dsa, "
"~/.ssh/id_ecdsa and ~/.ssh/id_rsa. Supplying this without the "
"`--system-info` argument will default `--system-info` to 'ohai-solo'."
)
ssh_group.add_argument(
"-o",
metavar="ssh_options",
help="Mirrors the ssh -o option. See ssh_config(5)."
)
config = parser.parse_args(argv)
if config.host_key_path:
config.host_key = config.host_key_path.read()
else:
config.host_key = None
# argparse lacks a method to say "if this option is set, require these too"
required_to_access_cloud = [
config.username,
config.password,
config.authurl,
config.region,
config.tenant_name or config.tenant_id,
]
if any(required_to_access_cloud) and not all(required_to_access_cloud):
raise errors.SatoriShellException(
"To connect to an OpenStack cloud you must supply a username, "
"password, authentication endpoint, region and tenant. Either "
"provide all of these settings or none of them."
)
username, url = netloc_parser(config.netloc)
config.netloc = url
if (config.host_key or config.username) and not config.system_info:
config.system_info = 'ohai-solo'
if username:
config.host_username = username
else:
config.host_username = 'root'
return vars(config)
def main(argv=None):
"""Discover an existing configuration for a network location."""
config = parse_args(argv)
common_logging.init_logging(config)
if not (config['format'] == 'json' or
check_format(config['format'] or "text")):
        sys.exit("Output format template (%s) not found or not accessible. "
                 "Try specifying raw JSON format using `--format json`" %
                 get_template_path(config['format']))
try:
results, errors = discovery.run(config['netloc'], config,
interactive=True)
print(format_output(config['netloc'], results,
template_name=config['format']))
if errors:
sys.stderr.write(format_errors(errors, config))
except Exception as exc: # pylint: disable=W0703
if config['debug']:
LOG.exception(exc)
return str(exc)
sys.exit(0)
def get_template_path(name):
"""Get template path from name."""
root_dir = os.path.dirname(__file__)
return os.path.join(root_dir, "formats", "%s.jinja" % name)
def check_format(name):
"""Verify that we have the requested format template."""
template_path = get_template_path(name)
return os.path.exists(template_path)
def get_template(name):
"""Get template text from templates directory by name."""
root_dir = os.path.dirname(__file__)
template_path = os.path.join(root_dir, "formats", "%s.jinja" % name)
with open(template_path, 'r') as handle:
template = handle.read()
return template
def format_output(discovered_target, results, template_name="text"):
"""Format results in CLI format."""
if template_name == 'json':
        return json.dumps(results, indent=2)
else:
template = get_template(template_name)
env_vars = dict(lstrip_blocks=True, trim_blocks=True)
return templating.parse(template, target=discovered_target,
data=results, env_vars=env_vars).strip('\n')
def format_errors(errors, config):
"""Format errors for output to console."""
if config['debug']:
return str(errors)
else:
formatted = {}
for key, error in errors.items():
formatted[key] = error['message']
return str(formatted)
if __name__ == "__main__":
sys.exit(main())


@ -1,347 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
# pylint: disable=W0703
"""Windows remote client module implemented using psexec.py."""
try:
import eventlet
eventlet.monkey_patch()
from eventlet.green import time
except ImportError:
import time
import ast
import base64
import logging
import os
import re
import shlex
import tempfile
from satori.common import popen
from satori import errors
from satori import ssh
from satori import tunnel
LOG = logging.getLogger(__name__)
def connect(*args, **kwargs):
"""Connect to a remote device using psexec.py."""
try:
return SMBClient.get_client(*args, **kwargs)
except Exception as exc:
LOG.error("ERROR: pse.py failed to connect: %s", str(exc))
def _posh_encode(command):
"""Encode a powershell command to base64.
    This uses utf-16 encoding and disregards the first two bytes (the BOM).
    :param command: command to encode
"""
return base64.b64encode(command.encode('utf-16')[2:])
class SubprocessError(Exception):
"""Custom Exception.
This will be raised when the subprocess running psexec.py has exited.
"""
pass
class SMBClient(object): # pylint: disable=R0902
"""Connects to devices over SMB/psexec to execute commands."""
_prompt_pattern = re.compile(r'^[a-zA-Z]:\\.*>$', re.MULTILINE)
# pylint: disable=R0913
def __init__(self, host, password=None, username="Administrator",
port=445, timeout=10, gateway=None, **kwargs):
"""Create an instance of the PSE class.
:param str host: The ip address or host name of the server
to connect to
:param str password: A password to use for authentication
:param str username: The username to authenticate as (defaults to
Administrator)
:param int port: tcp/ip port to use (defaults to 445)
:param float timeout: an optional timeout (in seconds) for the
TCP connection
:param gateway: instance of satori.ssh.SSH to be used to set up
an SSH tunnel (equivalent to ssh -L)
"""
self.password = password
self.host = host
self.port = port or 445
self.username = username or 'Administrator'
self.timeout = timeout
self._connected = False
self._platform_info = None
self._process = None
self._orig_host = None
self._orig_port = None
self.ssh_tunnel = None
self._substituted_command = None
# creating temp file to talk to _process with
self._file_write = tempfile.NamedTemporaryFile()
self._file_read = open(self._file_write.name, 'r')
self._command = ("nice python %s/contrib/psexec.py -port %s %s:%s@%s "
"'c:\\Windows\\sysnative\\cmd'")
self._output = ''
self.gateway = gateway
if gateway:
if not isinstance(self.gateway, ssh.SSH):
                raise TypeError("'gateway' must be a satori.ssh.SSH instance "
                                "(instances of this type are returned by "
                                "satori.ssh.connect()).")
if kwargs:
LOG.debug("DEBUG: Following arguments passed into PSE constructor "
"not used: %s", kwargs.keys())
def __del__(self):
"""Destructor of the PSE class."""
try:
self.close()
except ValueError:
pass
@classmethod
def get_client(cls, *args, **kwargs):
"""Return a pse client object from this module."""
return cls(*args, **kwargs)
@property
def platform_info(self):
"""Return Windows edition, version and architecture.
requires Powershell version 3
"""
if not self._platform_info:
command = ('Get-WmiObject Win32_OperatingSystem |'
' select @{n="dist";e={$_.Caption.Trim()}},'
'@{n="version";e={$_.Version}},@{n="arch";'
'e={$_.OSArchitecture}} | '
' ConvertTo-Json -Compress')
stdout = self.remote_execute(command, retry=3)
self._platform_info = ast.literal_eval(stdout)
return self._platform_info
def create_tunnel(self):
"""Create an ssh tunnel via gateway.
This will tunnel a local ephemeral port to the host's port.
This will preserve the original host and port
"""
self.ssh_tunnel = tunnel.Tunnel(self.host, self.port, self.gateway)
self._orig_host = self.host
self._orig_port = self.port
self.host, self.port = self.ssh_tunnel.address
self.ssh_tunnel.serve_forever(async=True)
def shutdown_tunnel(self):
"""Terminate the ssh tunnel. Restores original host and port."""
self.ssh_tunnel.shutdown()
self.host = self._orig_host
self.port = self._orig_port
def test_connection(self):
"""Connect to a Windows server and disconnect again.
Make sure the returncode is 0, otherwise return False
"""
self.connect()
self.close()
self._get_output()
if self._output.find('ErrorCode: 0, ReturnCode: 0') > -1:
return True
else:
return False
def connect(self):
"""Attempt a connection using psexec.py.
This will create a subprocess.Popen() instance and communicate with it
via _file_read/_file_write and _process.stdin
"""
try:
if self._connected and self._process:
if self._process.poll() is None:
return
else:
self._process.wait()
if self.gateway:
self.shutdown_tunnel()
if self.gateway:
self.create_tunnel()
self._substituted_command = self._command % (
os.path.dirname(__file__),
self.port,
self.username,
self.password,
self.host)
self._process = popen.popen(
shlex.split(self._substituted_command),
stdout=self._file_write,
stderr=popen.STDOUT,
stdin=popen.PIPE,
close_fds=True,
universal_newlines=True,
bufsize=-1)
output = ''
while not self._prompt_pattern.findall(output):
output += self._get_output()
self._connected = True
except Exception:
LOG.error("Failed to connect to host %s over smb",
self.host, exc_info=True)
self.close()
raise
def close(self):
"""Close the psexec connection by sending 'exit' to the subprocess.
This will cleanly exit psexec (i.e. stop and uninstall the service and
delete the files)
This method will be called when an instance of this class is about to
being destroyed. It will try to close the connection (which will clean
up on the remote server) and catch the exception that is raised when
the connection has already been closed.
"""
try:
self._process.communicate('exit')
except Exception as exc:
LOG.warning("ERROR: Failed to close %s: %s", self, str(exc))
del exc
try:
if self.gateway:
self.shutdown_tunnel()
self.gateway.close()
except Exception as exc:
LOG.warning("ERROR: Failed to close gateway %s: %s", self.gateway,
str(exc))
del exc
finally:
try:
self._process.kill()
except OSError:
LOG.exception("Tried killing psexec subprocess.")
def remote_execute(self, command, powershell=True, retry=0, **kwargs):
"""Execute a command on a remote host.
:param command: Command to be executed
:param powershell: If True, command will be interpreted as Powershell
command and therefore converted to base64 and
                           prepended with 'powershell -EncodedCommand'.
:param int retry: Number of retries when SubprocessError is thrown
by _get_output before giving up
"""
self.connect()
if powershell:
command = ('powershell -EncodedCommand %s' %
_posh_encode(command))
LOG.info("Executing command: %s", command)
self._process.stdin.write('%s\n' % command)
self._process.stdin.flush()
try:
output = self._get_output()
LOG.debug("Stdout produced: %s", output)
output = "\n".join(output.splitlines()[:-1]).strip()
return output
except Exception:
LOG.error("Error while reading output from command %s on %s",
command, self.host, exc_info=True)
if not retry:
raise
else:
return self.remote_execute(command, powershell=powershell,
retry=retry - 1)
def _handle_output(self, output):
"""Check for process termination, exit code, or error messages.
        If the exit code is available and is zero, return True. This routine
will raise an exception in the case of a non-zero exit code.
"""
if self._process.poll() is not None:
if self._process.returncode == 0:
return True
if "The attempted logon is invalid" in output:
msg = [k for k in output.splitlines() if k][-1].strip()
raise errors.SatoriSMBAuthenticationException(msg)
elif "The user account has been automatically locked" in output:
msg = [k for k in output.splitlines() if k][-1].strip()
raise errors.SatoriSMBLockoutException(msg)
elif "cannot be opened because the share access flags" in output:
# A file cannot be opened because the share
# access flags are incompatible
msg = [k for k in output.splitlines() if k][-1].strip()
raise errors.SatoriSMBFileSharingException(msg)
else:
raise SubprocessError("subprocess with pid: %s has "
"terminated unexpectedly with "
"return code: %s | %s"
% (self._process.pid,
self._process.poll(), output))
def _get_output(self, prompt_expected=True, wait=500):
"""Retrieve output from _process.
This method will wait until output is started to be received and then
wait until no further output is received within a defined period
:param prompt_expected: only return when regular expression defined
in _prompt_pattern is matched
:param wait: Time in milliseconds to wait in each of the
two loops that wait for (more) output.
"""
tmp_out = ''
while tmp_out == '':
self._file_read.seek(0, 1)
tmp_out += self._file_read.read()
# leave loop if underlying process has a return code
# obviously meaning that it has terminated
if self._handle_output(tmp_out):
break
time.sleep(float(wait) / 1000)
else:
LOG.debug("Loop 1 - stdout read: %s", tmp_out)
stdout = tmp_out
while (tmp_out != '' or
(not self._prompt_pattern.findall(stdout) and
prompt_expected)):
self._file_read.seek(0, 1)
tmp_out = self._file_read.read()
stdout += tmp_out
# leave loop if underlying process has a return code
# obviously meaning that it has terminated
if self._handle_output(tmp_out):
break
time.sleep(float(wait) / 1000)
else:
LOG.debug("Loop 2 - stdout read: %s", tmp_out)
self._output += stdout
stdout = stdout.replace('\r', '').replace('\x08', '')
return stdout


@ -1,512 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""SSH Module for connecting to and automating remote commands.
Supports proxying through an ssh tunnel ('gateway' keyword argument.)
To control the behavior of the SSH client, use the specific connect_with_*
calls. The .connect() call behaves like the ssh command and attempts a number
of connection methods, including using the current user's ssh keys.
If interactive is set to true, the module will also prompt for a password if no
other connection methods succeeded.
Note that test_connection() calls connect(). To test a connection and control
the authentication methods used, just call connect_with_* and catch any
exceptions instead of using test_connection().
"""
import ast
import getpass
import logging
import os
import re
import time
import paramiko
import six
from satori import errors
from satori import utils
LOG = logging.getLogger(__name__)
MIN_PASSWORD_PROMPT_LEN = 8
MAX_PASSWORD_PROMPT_LEN = 64
TEMPFILE_PREFIX = ".satori.tmp.key."
TTY_REQUIRED = [
"you must have a tty to run sudo",
"is not a tty",
"no tty present",
"must be run from a terminal",
]
def shellquote(s):
r"""Quote a string for use on a command line.
This wraps the string in single-quotes and converts any existing
single-quotes to r"'\''". Here the first single-quote ends the
previous quoting, the escaped single-quote becomes a literal
single-quote, and the last single-quote quotes the next part of
the string.
"""
return "'%s'" % s.replace("'", r"'\''")
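The quoting trick described in the docstring can be exercised standalone (same one-liner as above):

```python
def shellquote(s):
    # End the open quote, emit an escaped literal quote, reopen: '\''
    return "'%s'" % s.replace("'", r"'\''")

# "it's" becomes 'it'\''s': the shell joins 'it', a literal ', and 's'
# back into a single word.
assert shellquote("it's") == r"'it'\''s'"
assert shellquote("plain") == "'plain'"
```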
def make_pkey(private_key):
"""Return a paramiko.pkey.PKey from private key string."""
key_classes = [paramiko.rsakey.RSAKey,
paramiko.dsskey.DSSKey,
paramiko.ecdsakey.ECDSAKey, ]
keyfile = six.StringIO(private_key)
for cls in key_classes:
keyfile.seek(0)
try:
pkey = cls.from_private_key(keyfile)
except paramiko.SSHException:
continue
else:
keytype = cls
LOG.info("Valid SSH Key provided (%s)", keytype.__name__)
return pkey
    raise paramiko.SSHException("Not a valid private key")
def connect(*args, **kwargs):
"""Connect to a remote device over SSH."""
try:
return SSH.get_client(*args, **kwargs)
except TypeError as exc:
msg = "got an unexpected"
if msg in str(exc):
message = "%s " + str(exc)[str(exc).index(msg):]
raise exc.__class__(message % "connect()")
raise
class AcceptMissingHostKey(paramiko.client.MissingHostKeyPolicy):
"""Allow connections to hosts whose fingerprints are not on record."""
# pylint: disable=R0903
def missing_host_key(self, client, hostname, key):
"""Add missing host key."""
# pylint: disable=W0212
client._host_keys.add(hostname, key.get_name(), key)
class SSH(paramiko.SSHClient): # pylint: disable=R0902
"""Connects to devices via SSH to execute commands."""
# pylint: disable=R0913
def __init__(self, host, password=None, username="root", private_key=None,
root_password=None, key_filename=None, port=22,
timeout=20, gateway=None, options=None, interactive=False):
"""Create an instance of the SSH class.
:param str host: The ip address or host name of the server
to connect to
:param str password: A password to use for authentication
or for unlocking a private key
:param username: The username to authenticate as
:param root_password: root user password to be used if 'username'
                              is not root. This will use 'username' and
                              'password' to log in and then 'su' to root
                              using root_password
:param private_key: Private SSH Key string to use
(instead of using a filename)
:param key_filename: a private key filename (path)
:param port: tcp/ip port to use (defaults to 22)
:param float timeout: an optional timeout (in seconds) for the
TCP connection
:param socket gateway: an existing SSH instance to use
for proxying
:param dict options: A dictionary used to set ssh options
(when proxying).
e.g. for `ssh -o StrictHostKeyChecking=no`,
you would provide
(.., options={'StrictHostKeyChecking': 'no'})
Conversion of booleans is also supported,
(.., options={'StrictHostKeyChecking': False})
is equivalent.
:keyword interactive: If true, prompt for password if missing.
"""
self.password = password
self.host = host
self.username = username or 'root'
self.root_password = root_password
self.private_key = private_key
self.key_filename = key_filename
self.port = port or 22
self.timeout = timeout
self._platform_info = None
self.options = options or {}
self.gateway = gateway
self.sock = None
self.interactive = interactive
self.escalation_command = 'sudo -i %s'
if self.root_password:
self.escalation_command = "su -c '%s'"
if self.gateway:
if not isinstance(self.gateway, SSH):
raise TypeError("'gateway' must be a satori.ssh.SSH instance. "
"( instances of this type are returned by "
"satori.ssh.connect() )")
super(SSH, self).__init__()
def __del__(self):
"""Destructor to close the connection."""
self.close()
@classmethod
def get_client(cls, *args, **kwargs):
"""Return an ssh client object from this module."""
return cls(*args, **kwargs)
@property
def platform_info(self):
"""Return distro, version, architecture.
Requires >= Python 2.4 on remote system.
"""
if not self._platform_info:
platform_command = "import platform,sys\n"
platform_command += utils.get_source_definition(
utils.get_platform_info)
platform_command += ("\nsys.stdout.write(str("
"get_platform_info()))\n")
command = 'echo %s | python' % shellquote(platform_command)
output = self.remote_execute(command)
stdout = re.split('\n|\r\n', output['stdout'])[-1].strip()
if stdout:
try:
plat = ast.literal_eval(stdout)
except SyntaxError as exc:
plat = {'dist': 'unknown'}
LOG.warning("Error parsing response from host '%s': %s",
self.host, output, exc_info=exc)
else:
plat = {'dist': 'unknown'}
LOG.warning("Blank response from host '%s': %s",
self.host, output)
self._platform_info = plat
return self._platform_info
def connect_with_host_keys(self):
"""Try connecting with locally available keys (ex. ~/.ssh/id_rsa)."""
LOG.debug("Trying to connect with local host keys")
return self._connect(look_for_keys=True, allow_agent=False)
def connect_with_password(self):
"""Try connecting with password."""
LOG.debug("Trying to connect with password")
if self.interactive and not self.password:
LOG.debug("Prompting for password (interactive=%s)",
self.interactive)
try:
self.password = getpass.getpass("Enter password for %s:" %
self.username)
except KeyboardInterrupt:
LOG.debug("User cancelled at password prompt")
if not self.password:
raise paramiko.PasswordRequiredException("Password not provided")
return self._connect(
password=self.password,
look_for_keys=False,
allow_agent=False)
def connect_with_key_file(self):
"""Try connecting with key file."""
LOG.debug("Trying to connect with key file")
if not self.key_filename:
raise paramiko.AuthenticationException("No key file supplied")
return self._connect(
key_filename=os.path.expanduser(self.key_filename),
look_for_keys=False,
allow_agent=False)
def connect_with_key(self):
"""Try connecting with key string."""
LOG.debug("Trying to connect with private key string")
if not self.private_key:
raise paramiko.AuthenticationException("No key supplied")
pkey = make_pkey(self.private_key)
return self._connect(
pkey=pkey,
look_for_keys=False,
allow_agent=False)
def _connect(self, **kwargs):
"""Set up client and connect to target."""
self.load_system_host_keys()
if self.options.get('StrictHostKeyChecking') in (False, "no"):
self.set_missing_host_key_policy(AcceptMissingHostKey())
if self.gateway:
# lazy load
if not self.gateway.get_transport():
self.gateway.connect()
self.sock = self.gateway.get_transport().open_channel(
'direct-tcpip', (self.host, self.port), ('', 0))
return super(SSH, self).connect(
self.host,
timeout=kwargs.pop('timeout', self.timeout),
port=kwargs.pop('port', self.port),
username=kwargs.pop('username', self.username),
pkey=kwargs.pop('pkey', None),
sock=kwargs.pop('sock', self.sock),
**kwargs)
def connect(self): # pylint: disable=W0221
"""Attempt an SSH connection through paramiko.SSHClient.connect.
The order for authentication attempts is:
- private_key
- key_filename
- any key discoverable in ~/.ssh/
- username/password (will prompt if the password is not supplied and
interactive is true)
"""
# idempotency
if self.get_transport():
if self.get_transport().is_active():
return
if self.private_key:
try:
return self.connect_with_key()
except paramiko.SSHException:
pass # try next method
if self.key_filename:
try:
return self.connect_with_key_file()
except paramiko.SSHException:
pass # try next method
try:
return self.connect_with_host_keys()
except paramiko.SSHException:
pass # try next method
try:
return self.connect_with_password()
except paramiko.BadHostKeyException as exc:
msg = (
"ssh://%s@%s:%d failed: %s. You might have a bad key "
"entry on your server, but this is a security issue and "
"won't be handled automatically. To fix this you can remove "
            "the host entry for this host from the ~/.ssh/known_hosts file")
LOG.info(msg, self.username, self.host, self.port, exc)
raise exc
except Exception as exc:
LOG.info('ssh://%s@%s:%d failed. %s',
self.username, self.host, self.port, exc)
raise exc
def test_connection(self):
"""Connect to an ssh server and verify that it responds.
The order for authentication attempts is:
(1) private_key
(2) key_filename
(3) any key discoverable in ~/.ssh/
(4) username/password
"""
LOG.debug("Checking for a response from ssh://%s@%s:%d.",
self.username, self.host, self.port)
try:
self.connect()
LOG.debug("ssh://%s@%s:%d is up.",
self.username, self.host, self.port)
return True
except Exception as exc:
LOG.info("ssh://%s@%s:%d failed. %s",
self.username, self.host, self.port, exc)
return False
finally:
self.close()
def close(self):
"""Close the connection to the remote host.
If an ssh tunnel is being used, close that first.
"""
if self.gateway:
self.gateway.close()
return super(SSH, self).close()
def _handle_tty_required(self, results, get_pty):
"""Determine whether the result implies a tty request."""
if any(m in str(k) for m in TTY_REQUIRED for k in results.values()):
LOG.info('%s requires TTY for sudo/su. Using TTY mode.',
self.host)
if get_pty is True: # if this is *already* True
raise errors.GetPTYRetryFailure(
"Running command with get_pty=True FAILED: %s@%s:%d"
% (self.username, self.host, self.port))
else:
return True
return False
def _handle_password_prompt(self, stdin, stdout, su_auth=False):
"""Determine whether the remote host is prompting for a password.
Respond to the prompt through stdin if applicable.
"""
if not stdout.channel.closed:
buflen = len(stdout.channel.in_buffer)
# min and max determined from max username length
# and a set of encountered linux password prompts
if MIN_PASSWORD_PROMPT_LEN < buflen < MAX_PASSWORD_PROMPT_LEN:
prompt = stdout.channel.recv(buflen)
if all(m in prompt.lower()
for m in ['password', ':']):
                    LOG.warning("%s@%s encountered a password prompt of "
                                "length [%s] {%s}",
self.username, self.host, buflen, prompt)
if su_auth:
LOG.warning("Escalating using 'su -'.")
stdin.write("%s\n" % self.root_password)
else:
stdin.write("%s\n" % self.password)
stdin.flush()
return True
else:
LOG.warning("Nearly a False-Positive on "
"password prompt detection. [%s] {%s}",
buflen, prompt)
stdout.channel.send(prompt)
return False
def _command_is_already_running(self, command):
"""Check to see if the command is already running using ps & grep."""
# check plain 'command' w/o prefix or escalation
check_cmd = 'ps -ef |grep -v grep|grep -c "%s"' % command
result = self.remote_execute(check_cmd, keepalive=True,
allow_many=True)
if result['stdout'] != '0':
return True
else:
LOG.debug("Remote command %s IS NOT already running. "
"Continuing with remote_execute.", command)
def remote_execute(self, command, with_exit_code=False, # noqa
get_pty=False, cwd=None, keepalive=True,
escalate=False, allow_many=True, **kw):
"""Execute an ssh command on a remote host.
        Tries cert auth first and falls back
        to password auth if a password is provided.
:param command: Shell command to be executed by this function.
:param with_exit_code: Include the exit_code in the return body.
:param cwd: The child's current directory will be changed
to `cwd` before it is executed. Note that this
directory is not considered when searching the
executable, so you can't specify the program's
path relative to this argument
:param get_pty: Request a pseudo-terminal from the server.
:param allow_many: If False, do not run command if it is already
found running on remote client.
:returns: a dict with stdin, stdout,
and (optionally) the exit code of the call.
"""
if escalate and self.username != 'root':
run_command = self.escalation_command % command
else:
run_command = command
if cwd:
prefix = "cd %s && " % cwd
run_command = prefix + run_command
        # short-circuit: _command_is_already_running won't be called
        # if allow_many is True
if not allow_many and self._command_is_already_running(command):
raise errors.SatoriDuplicateCommandException(
"Remote command %s is already running and allow_many was "
"set to False. Aborting remote_execute." % command)
try:
self.connect()
results = None
chan = self.get_transport().open_session()
su_auth = False
if 'su -' in run_command:
su_auth = True
get_pty = True
if get_pty:
chan.get_pty()
stdin = chan.makefile('wb')
stdout = chan.makefile('rb')
stderr = chan.makefile_stderr('rb')
LOG.debug("Executing '%s' on ssh://%s@%s:%s.",
run_command, self.username, self.host, self.port)
chan.exec_command(run_command)
LOG.debug('ssh://%s@%s:%d responded.', self.username, self.host,
self.port)
time.sleep(.25)
self._handle_password_prompt(stdin, stdout, su_auth=su_auth)
results = {
'stdout': stdout.read().strip(),
'stderr': stderr.read()
}
LOG.debug("STDOUT from ssh://%s@%s:%d: %.5000s ...",
self.username, self.host, self.port,
results['stdout'])
LOG.debug("STDERR from ssh://%s@%s:%d: %.5000s ...",
self.username, self.host, self.port,
results['stderr'])
exit_code = chan.recv_exit_status()
if with_exit_code:
results.update({'exit_code': exit_code})
if not keepalive:
chan.close()
if self._handle_tty_required(results, get_pty):
return self.remote_execute(
command, with_exit_code=with_exit_code, get_pty=True,
cwd=cwd, keepalive=keepalive, escalate=escalate,
allow_many=allow_many)
return results
except Exception as exc:
LOG.info("ssh://%s@%s:%d failed. | %s", self.username, self.host,
self.port, exc)
raise
finally:
if not keepalive:
self.close()
# Share SSH.__init__'s docstring
connect.__doc__ = SSH.__init__.__doc__
try:
SSH.__dict__['get_client'].__doc__ = SSH.__dict__['__init__'].__doc__
except AttributeError:
SSH.get_client.__func__.__doc__ = SSH.__init__.__doc__
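The try/except above shares `SSH.__init__`'s docstring with `get_client` on both Python 2 and 3: assignment through the class `__dict__` entry is attempted first, and the bound method's underlying `__func__` is used when that entry's `__doc__` is read-only. A minimal self-contained sketch of the same pattern (the `Base`/`Child` names are illustrative, not from satori):

```python
# Sketch of the docstring-sharing fallback used above: try assigning
# __doc__ on the raw __dict__ entry, and fall back to the method's
# underlying function object if that assignment is rejected.
class Base(object):
    def __init__(self):
        """Shared docstring."""

class Child(Base):
    def get_client(self):
        pass

try:
    Child.__dict__['get_client'].__doc__ = Base.__init__.__doc__
except AttributeError:
    Child.get_client.__func__.__doc__ = Base.__init__.__doc__

print(Child.get_client.__doc__)  # Shared docstring.
```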


@ -1 +0,0 @@
"""Modules for Data Plane Discovery."""


@ -1,6 +0,0 @@
"""."""
def get_systeminfo(resource, config, interactive=False):
"""."""
return {'facter': 'is better'}


@ -1,6 +0,0 @@
"""."""
def get_systeminfo(resource, config, interactive=False):
"""."""
return {'ohai': 'there!'}


@ -1,201 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
# pylint: disable=W0622
"""Ohai Solo Data Plane Discovery Module."""
import json
import logging
import ipaddress as ipaddress_module
import six
from satori import bash
from satori import errors
from satori import utils
LOG = logging.getLogger(__name__)
def get_systeminfo(ipaddress, config, interactive=False):
"""Run data plane discovery using this module against a host.
:param ipaddress: address to the host to discover.
:param config: arguments and configuration supplied to satori.
:keyword interactive: whether to prompt the user for information.
"""
if (ipaddress in utils.get_local_ips() or
ipaddress_module.ip_address(six.text_type(ipaddress)).is_loopback):
client = bash.LocalShell()
client.host = "localhost"
client.port = 0
perform_install(client)
return system_info(client)
else:
with bash.RemoteShell(
ipaddress, username=config['host_username'],
private_key=config['host_key'],
interactive=interactive) as client:
perform_install(client)
return system_info(client)
def system_info(client, with_install=False, install_dir=None):
"""Run ohai-solo on a remote system and gather the output.
:param client: :class:`ssh.SSH` instance
:param with_install: if True, install ohai-solo before running it.
:param install_dir: string containing the directory to install to.
:returns: dict -- system information from ohai-solo
:raises: SystemInfoCommandMissing, SystemInfoCommandOld, SystemInfoNotJson
SystemInfoMissingJson
SystemInfoCommandMissing if `ohai` is not installed.
SystemInfoCommandOld if `ohai` is not the latest.
SystemInfoNotJson if `ohai` does not return valid JSON.
SystemInfoMissingJson if `ohai` does not return any JSON.
"""
if with_install:
perform_install(client, install_dir=install_dir)
if client.is_windows():
raise errors.UnsupportedPlatform(
"ohai-solo is a linux-only sytem info provider. "
"Target platform was %s", client.platform_info['dist'])
ohai_solo_prefix = (install_dir or '/opt')
ohai_solo_command = six.moves.shlex_quote("%s/ohai-solo/bin/ohai-solo"
% ohai_solo_prefix)
command = ("unset GEM_CACHE GEM_HOME GEM_PATH && "
"sudo %s" % ohai_solo_command)
output = client.execute(command, escalate=True, allow_many=False)
not_found_msgs = ["command not found", "Could not find ohai"]
if any(m in k for m in not_found_msgs
for k in list(output.values()) if isinstance(k,
six.string_types)):
LOG.warning("SystemInfoCommandMissing on host: [%s]", client.host)
raise errors.SystemInfoCommandMissing("ohai-solo missing on %s" %
client.host)
# use string formatting to handle unicode
unicode_output = "%s" % output['stdout']
try:
results = json.loads(unicode_output)
except ValueError as exc:
try:
clean_output = get_json(unicode_output)
results = json.loads(clean_output)
except ValueError as exc:
raise errors.SystemInfoNotJson(exc)
return results
def perform_install(client, install_dir=None):
"""Install ohai-solo on remote system.
:param client: :class:`ssh.SSH` instance
:param install_dir: string containing the directory to install to.
"""
LOG.info("Installing (or updating) ohai-solo on device %s at %s:%d",
client.host, client.host, client.port)
# Check if it is a Windows box, but fail safely to Linux
is_windows = False
try:
is_windows = client.is_windows()
except Exception:
pass
if is_windows:
raise errors.UnsupportedPlatform(
"ohai-solo is a linux-only sytem info provider. "
"Target platform was %s", client.platform_info['dist'])
else:
# Download to host
command = ("wget -N http://readonly.configdiscovery.rackspace.com"
"/install.sh")
output = client.execute(command, cwd='/tmp', escalate=True,
allow_many=False)
LOG.debug("Downloaded ohai-solo | %s", output['stdout'])
# Run install
command = "bash install.sh"
if install_dir:
command = "%s -t -i %s" % (command,
six.moves.shlex_quote(install_dir))
install_output = client.execute(command, cwd='/tmp',
with_exit_code=True,
escalate=True, allow_many=False)
LOG.debug("Ran ohai-solo install script. | %s.",
install_output['stdout'])
# Be a good citizen and clean up your tmp data
command = "rm install.sh"
client.execute(command, cwd='/tmp', escalate=True, allow_many=False)
# Process install command output
if install_output['exit_code'] != 0:
raise errors.SystemInfoCommandInstallFailed(
install_output['stderr'][:256])
else:
return install_output
def remove_remote(client, install_dir=None):
"""Remove ohai-solo from specifc remote system.
:param install_dir string containing directory ohai-solo was installed in
Currently supports:
- ubuntu [10.x, 12.x]
- debian [6.x, 7.x]
- redhat [5.x, 6.x]
- centos [5.x, 6.x]
"""
if client.is_windows():
raise errors.UnsupportedPlatform(
"ohai-solo is a linux-only sytem info provider. "
"Target platform was %s", client.platform_info['dist'])
else:
platform_info = client.platform_info
if install_dir is not None:
install_dir = six.moves.shlex_quote("%s/ohai-solo/" % install_dir)
remove = 'rm -rf %s' % install_dir
elif client.is_debian():
remove = "dpkg --purge ohai-solo"
elif client.is_fedora():
remove = "yum -y erase ohai-solo"
else:
raise errors.UnsupportedPlatform("Unknown distro: %s" %
platform_info['dist'])
command = "%s" % remove
output = client.execute(command, cwd='/tmp', escalate=True)
return output
def get_json(data):
"""Find the JSON string in data and return a string.
:param data: string that may contain a JSON document.
:returns: string -- JSON string stripped of non-JSON data
:raises: SystemInfoMissingJson
SystemInfoMissingJson if `ohai` does not return any JSON.
"""
try:
first = data.index('{')
last = data.rindex('}')
return data[first:last + 1]
except ValueError as exc:
context = {"ValueError": "%s" % exc}
raise errors.SystemInfoMissingJson(context)
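The brace-slicing approach in `get_json()` above can be exercised on its own. The standalone sketch below (with a made-up sample string) shows how leading banner text and trailing noise are stripped before `json.loads`, assuming the payload's braces are the outermost braces in the output:

```python
import json

def extract_json(data):
    """Slice from the first '{' to the last '}' (same approach as get_json)."""
    first = data.index('{')
    last = data.rindex('}')
    return data[first:last + 1]

# Illustrative output: a warning banner before the JSON, shell noise after.
noisy = 'WARNING: locale not set\n{"platform": "ubuntu"}\nexit 0'
clean = extract_json(noisy)
print(json.loads(clean)["platform"])  # ubuntu
```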


@ -1,299 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
# pylint: disable=W0622
"""PoSh-Ohai Data Plane Discovery Module."""
import json
import logging
import xml.etree.ElementTree as ET
import ipaddress as ipaddress_module
import six
from satori import bash
from satori import errors
from satori import utils
LOG = logging.getLogger(__name__)
def get_systeminfo(ipaddress, config, interactive=False):
"""Run data plane discovery using this module against a host.
:param ipaddress: address to the host to discover.
:param config: arguments and configuration supplied to satori.
:keyword interactive: whether to prompt the user for information.
"""
if (ipaddress in utils.get_local_ips() or
ipaddress_module.ip_address(six.text_type(ipaddress)).is_loopback):
client = bash.LocalShell()
client.host = "localhost"
client.port = 0
perform_install(client)
return system_info(client)
else:
with bash.RemoteShell(
ipaddress, username=config['host_username'],
private_key=config['host_key'],
interactive=interactive) as client:
perform_install(client)
return system_info(client)
def system_info(client, with_install=False, install_dir=None):
"""Run Posh-Ohai on a remote system and gather the output.
:param client: :class:`smb.SMB` instance
:param with_install: if True, install PoSh-Ohai before running it.
:param install_dir: for compatibility; ignored.
:returns: dict -- system information from PoSh-Ohai
:raises: SystemInfoCommandMissing, SystemInfoCommandOld, SystemInfoInvalid
SystemInfoCommandMissing if `posh-ohai` is not installed.
SystemInfoCommandOld if `posh-ohai` is not the latest.
SystemInfoInvalid if `posh-ohai` does not return valid JSON or XML.
"""
if with_install:
perform_install(client)
if client.is_windows():
powershell_command = ('Import-Module -Name Posh-Ohai;'
'Get-ComputerConfiguration')
output = client.execute(powershell_command)
unicode_output = "%s" % output
load_clean_json = lambda output: json.loads(get_json(output))
last_err = None
for loader in json.loads, parse_xml, load_clean_json:
try:
return loader(unicode_output)
except ValueError as err:
last_err = err
raise errors.SystemInfoInvalid(last_err)
else:
raise errors.PlatformNotSupported(
"PoSh-Ohai is a Windows-only sytem info provider. "
"Target platform was %s", client.platform_info['dist'])
def perform_install(client, install_dir=None):
"""Install PoSh-Ohai on remote system.
:param install_dir: for compatibility; ignored.
"""
LOG.info("Installing (or updating) PoSh-Ohai on device %s at %s:%d",
client.host, client.host, client.port)
# Check if it is a Windows box, but fail safely to Linux
is_windows = False
try:
is_windows = client.is_windows()
except Exception:
pass
if is_windows:
powershell_command = ('[scriptblock]::Create((New-Object -TypeName '
'System.Net.WebClient).DownloadString('
'"http://readonly.configdiscovery.rackspace.com'
'/deploy.ps1")).Invoke()')
# check output to ensure that installation was successful
# if not, raise SystemInfoCommandInstallFailed
output = client.execute(powershell_command)
return output
else:
raise errors.PlatformNotSupported(
"PoSh-Ohai is a Windows-only sytem info provider. "
"Target platform was %s", client.platform_info['dist'])
def remove_remote(client, install_dir=None):
"""Remove PoSh-Ohai from specifc remote system.
:param install_dir -- for compatibility. Ignored.
Currently supports:
- ubuntu [10.x, 12.x]
- debian [6.x, 7.x]
- redhat [5.x, 6.x]
- centos [5.x, 6.x]
"""
if client.is_windows():
powershell_command = ('Remove-Item -Path (Join-Path -Path '
'$($env:PSModulePath.Split(";") '
'| Where-Object { $_.StartsWith('
'$env:SystemRoot)}) -ChildPath '
'"PoSh-Ohai") -Recurse -Force -ErrorAction '
'SilentlyContinue')
output = client.execute(powershell_command)
return output
else:
raise errors.PlatformNotSupported(
"PoSh-Ohai is a Windows-only sytem info provider. "
"Target platform was %s", client.platform_info['dist'])
def get_json(data):
"""Find the JSON string in data and return a string.
:param data: string that may contain a JSON document.
:returns: string -- JSON string stripped of non-JSON data
"""
first = data.index('{')
last = data.rindex('}')
return data[first:last + 1]
def parse_text(elem):
"""Parse text from an element.
>>> parse_text(ET.XML('<Property>Hello World</Property>'))
'Hello World'
>>> parse_text(ET.XML('<Property>True </Property>'))
True
>>> parse_text(ET.XML('<Property>123</Property>'))
123
>>> print(parse_text(ET.XML('<Property />')))
None
"""
if elem.text is None:
return None
try:
return int(elem.text)
except ValueError:
pass
text = elem.text.strip()
if text == 'True':
return True
if text == 'False':
return False
return elem.text
def parse_list(elem):
"""Parse list of properties.
>>> parse_list(ET.XML('<Property />'))
[]
>>> xml = '''<Property>
... <Property>Hello</Property>
... <Property>World</Property>
... </Property>'''
>>> parse_list(ET.XML(xml))
['Hello', 'World']
"""
return [parse_elem(c) for c in elem]
def parse_attrib_dict(elem):
"""Parse list of properties.
>>> parse_attrib_dict(ET.XML('<Property />'))
{}
>>> xml = '''<Property>
... <Property Name="verb">Hello</Property>
... <Property Name="noun">World</Property>
... </Property>'''
>>> d = parse_attrib_dict(ET.XML(xml))
>>> sorted(d.items())
[('noun', 'World'), ('verb', 'Hello')]
"""
keys = [c.get('Name') for c in elem]
values = [parse_elem(c) for c in elem]
return dict(zip(keys, values))
def parse_key_value_dict(elem):
"""Parse list of properties.
>>> parse_key_value_dict(ET.XML('<Property />'))
{}
>>> xml = '''<Property>
... <Property Name="Key">verb</Property>
... <Property Name="Value">Hello</Property>
... <Property Name="Key">noun</Property>
... <Property Name="Value">World</Property>
... </Property>'''
>>> d = parse_key_value_dict(ET.XML(xml))
>>> sorted(d.items())
[('noun', 'World'), ('verb', 'Hello')]
"""
keys = [c.text for c in elem[::2]]
values = [parse_elem(c) for c in elem[1::2]]
return dict(zip(keys, values))
def parse_elem(elem):
"""Determine element type and dispatch to other parse functions."""
if len(elem) == 0:
return parse_text(elem)
if not elem[0].attrib:
return parse_list(elem)
if elem[0].get('Name') == 'Key':
return parse_key_value_dict(elem)
return parse_attrib_dict(elem)
def parse_xml(ohai_output):
r"""Parse XML Posh-Ohai output.
>>> output = '''\
... <?xml version="1.0"?>
... <Objects>
... <Object>
... <Property Name="Key">platform_family</Property>
... <Property Name="Value">Windows</Property>
... <Property Name="Key">logonhistory</Property>
... <Property Name="Value">
... <Property Name="Key">0x6dd0359</Property>
... <Property Name="Value">
... <Property Name="Key">user</Property>
... <Property Name="Value">WIN2008R2\\Administrator</Property>
... <Property Name="Key">logontype</Property>
... <Property Name="Value">10</Property>
... </Property>
... </Property>
... <Property Name="Key">loggedon_users</Property>
... <Property Name="Value">
... <Property>
... <Property Name="Session">995</Property>
... <Property Name="User">WIN2008R2\IUSR</Property>
... <Property Name="Type">Service</Property>
... </Property>
... <Property>
... <Property Name="Session">999</Property>
... <Property Name="User">WIN2008R2\SYSTEM</Property>
... <Property Name="Type">Local System</Property>
... </Property>
... </Property>
... </Object>
... </Objects>'''
>>> import pprint
>>> pprint.pprint(parse_xml(output))
{'loggedon_users': [{'Session': 995,
'Type': 'Service',
'User': 'WIN2008R2\\IUSR'},
{'Session': 999,
'Type': 'Local System',
'User': 'WIN2008R2\\SYSTEM'}],
'logonhistory': {'0x6dd0359': {'logontype': 10,
'user': 'WIN2008R2\\Administrator'}},
'platform_family': 'Windows'}
"""
try:
root = ET.XML(ohai_output)
except ET.ParseError as err:
raise ValueError(err)
try:
properties = root[0]
except IndexError:
raise ValueError('XML had unexpected structure')
return parse_elem(properties)
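`system_info()` above tries `json.loads`, `parse_xml`, and a cleaned-JSON loader in turn, surfacing the last error only after every parser fails. The generic shape of that fallback chain, sketched standalone (`parse_any` and the sample input are illustrative, not satori APIs):

```python
import json

def parse_any(text, loaders):
    """Try each loader in order; raise the last ValueError if none succeed."""
    last_err = None
    for loader in loaders:
        try:
            return loader(text)
        except ValueError as err:
            last_err = err
    raise ValueError(last_err)

# The first loader that parses the text wins; here plain JSON succeeds.
print(parse_any('{"platform_family": "Windows"}', [json.loads]))
```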


@ -1,208 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
# pylint: disable=C0111, C0103, W0212, R0904
"""Satori SSH Module Tests."""
import collections
import unittest
import mock
from satori import bash
from satori import errors
from satori.tests import utils
class TestBashModule(utils.TestCase):
def setUp(self):
super(TestBashModule, self).setUp()
testrun = collections.namedtuple(
"TestCmd", ["command", "stdout", "returncode"])
self.testrun = testrun(
command="echo hello", stdout="hello\n", returncode=0)
self.resultdict = {'stdout': self.testrun.stdout.strip(),
'stderr': ''}
class TestLocalShell(TestBashModule):
def setUp(self):
super(TestLocalShell, self).setUp()
popen_patcher = mock.patch.object(bash.popen, 'popen')
self.mock_popen = popen_patcher.start()
mock_result = mock.MagicMock()
mock_result.returncode = self.testrun.returncode
self.mock_popen.return_value = mock_result
mock_result.communicate.return_value = (self.testrun.stdout, '')
self.localshell = bash.LocalShell()
self.addCleanup(popen_patcher.stop)
def test_execute(self):
self.localshell.execute(self.testrun.command)
self.mock_popen.assert_called_once_with(
self.testrun.command.split(), cwd=None, stderr=-1, stdout=-1)
def test_execute_resultdict(self):
resultdict = self.localshell.execute(self.testrun.command)
self.assertEqual(self.resultdict, resultdict)
def test_execute_with_exit_code_resultdict(self):
resultdict = self.localshell.execute(
self.testrun.command, with_exit_code=True)
self.resultdict.update({'exit_code': self.testrun.returncode})
self.assertEqual(self.resultdict, resultdict)
@mock.patch.object(bash.utils, 'get_platform_info')
class TestLocalPlatformInfo(utils.TestCase):
def test_is_debian(self, mock_gpi):
mock_gpi.return_value = {'dist': 'Debian'}
self.assertIsInstance(bash.LocalShell().is_debian(), bool)
def test_is_fedora(self, mock_gpi):
mock_gpi.return_value = {'dist': 'Fedora'}
self.assertIsInstance(bash.LocalShell().is_fedora(), bool)
def test_is_osx(self, mock_gpi):
mock_gpi.return_value = {'dist': 'Darwin'}
self.assertIsInstance(bash.LocalShell().is_osx(), bool)
def test_is_windows(self, mock_gpi):
mock_gpi.return_value = {'dist': 'Windows'}
self.assertIsInstance(bash.LocalShell().is_windows(), bool)
class TestLocalPlatformInfoUndetermined(TestLocalShell):
def setUp(self):
blanks = {'dist': '', 'arch': '', 'version': ''}
pinfo_patcher = mock.patch.object(
bash.LocalShell, 'platform_info', new_callable=mock.PropertyMock)
self.mock_platform_info = pinfo_patcher.start()
self.mock_platform_info.return_value = blanks
super(TestLocalPlatformInfoUndetermined, self).setUp()
self.addCleanup(pinfo_patcher.stop)
def test_is_debian(self):
self.assertRaises(errors.UndeterminedPlatform,
self.localshell.is_debian)
def test_is_fedora(self):
self.assertRaises(errors.UndeterminedPlatform,
self.localshell.is_fedora)
def test_is_osx(self):
self.assertRaises(errors.UndeterminedPlatform,
self.localshell.is_osx)
def test_is_windows(self):
self.assertRaises(errors.UndeterminedPlatform,
self.localshell.is_windows)
class TestRemoteShell(TestBashModule):
def setUp(self):
super(TestRemoteShell, self).setUp()
execute_patcher = mock.patch.object(bash.ssh.SSH, 'remote_execute')
self.mock_execute = execute_patcher.start()
self.mock_execute.return_value = self.resultdict
self.remoteshell = bash.RemoteShell('192.168.2.10')
self.addCleanup(execute_patcher.stop)
def test_execute(self):
self.remoteshell.execute(self.testrun.command)
self.mock_execute.assert_called_once_with(
self.testrun.command)
def test_execute_resultdict(self):
resultdict = self.remoteshell.execute(self.testrun.command)
self.assertEqual(self.resultdict, resultdict)
def test_execute_with_exit_code_resultdict(self):
resultdict = self.remoteshell.execute(
self.testrun.command, with_exit_code=True)
self.resultdict.update({'exit_code': self.testrun.returncode})
self.assertEqual(self.resultdict, resultdict)
class TestRemoteShellInit(unittest.TestCase):
def initpatch(self, ssh_instance, *args, **kwargs):
ssh_instance.host = self.host
ssh_instance.port = self.port
self._instance = ssh_instance
def setUp(self):
self.host = "192.168.2.10"
self.port = 23
@mock.patch.object(bash.ssh.SSH, '__init__', return_value=None,
autospec=True)
def test_init_contains_kwargs(self, mock_init):
mock_init.side_effect = self.initpatch
allkwargs = {
'password': 'pass',
'username': 'user',
'private_key': 'pkey',
'key_filename': 'pkeyfile',
'port': self.port,
'timeout': 100,
'gateway': 'g',
'options': {'StrictHostKeyChecking': False},
'interactive': True,
'root_password': 'sudopass',
}
self.remoteshell = bash.RemoteShell(self.host, **allkwargs)
mock_init.assert_called_once_with(self._instance, self.host, **allkwargs)
class TestContextManager(utils.TestCase):
def setUp(self):
super(TestContextManager, self).setUp()
connect_patcher = mock.patch.object(bash.RemoteShell, 'connect')
close_patcher = mock.patch.object(bash.RemoteShell, 'close')
self.mock_connect = connect_patcher.start()
self.mock_close = close_patcher.start()
self.addCleanup(connect_patcher.stop)
self.addCleanup(close_patcher.stop)
def test_context_manager(self):
with bash.RemoteShell('192.168.2.10') as client:
pass
self.assertEqual(self.mock_connect.call_count, 1)
# >=1 because __del__ (in most python implementations)
# calls close()
self.assertTrue(self.mock_close.call_count >= 1)
class TestIsDistro(TestRemoteShell):
def setUp(self):
super(TestIsDistro, self).setUp()
self.platformdict = self.resultdict.copy()
self.platformdict['stdout'] = str(bash.LocalShell().platform_info)
def test_remote_platform_info(self):
self.mock_execute.return_value = self.platformdict
result = self.remoteshell.platform_info
self.assertIsInstance(result, dict)
self.assertTrue(all(k in result
for k in ('arch', 'dist', 'version')))
self.assertTrue(self.mock_execute.called)
if __name__ == "__main__":
unittest.main()


@ -1,58 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Satori Logging Module Tests."""
import logging as stdlib_logging
import unittest
import mock
from satori.common import logging
from satori.tests import utils
class TestLoggingSetup(utils.TestCase):
"""Logging Setup tests."""
def test_logging_default_info(self):
config = {}
with mock.patch.dict(config, {'logconfig': None}):
logging.init_logging(config)
self.assertEqual(stdlib_logging.getLogger().level,
stdlib_logging.INFO)
def test_logging_debug_flag(self):
config = {}
with mock.patch.dict(config, {'logconfig': None, 'debug': True}):
logging.init_logging(config)
self.assertEqual(stdlib_logging.getLogger().level,
stdlib_logging.DEBUG)
def test_logging_verbose_flag(self):
config = {}
with mock.patch.dict(config, {'logconfig': None, 'verbose': True}):
logging.init_logging(config)
self.assertEqual(stdlib_logging.getLogger().level,
stdlib_logging.DEBUG)
def test_logging_quiet_flag(self):
config = {}
with mock.patch.dict(config, {'logconfig': None, 'quiet': True}):
logging.init_logging(config)
self.assertEqual(stdlib_logging.getLogger().level,
stdlib_logging.WARN)
if __name__ == "__main__":
unittest.main()


@ -1,79 +0,0 @@
# pylint: disable=C0103,R0904
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Unit Tests for Templating module."""
import unittest
from satori.common import templating
def fail_fixture():
"""Used to simulate a template error."""
raise AttributeError("Boom!")
class TestTemplating(unittest.TestCase):
"""Test Templating Module."""
def test_prepend_function(self):
"""preserve returns escaped linefeeds."""
result = templating.parse("{{ root|prepend('/')}}/path", root="etc")
self.assertEqual(result, '/etc/path')
def test_prepend_function_blank(self):
"""preserve returns escaped linefeeds."""
result = templating.parse("{{ root|prepend('/')}}/path")
self.assertEqual(result, '/path')
def test_preserve_linefeed_escaping(self):
"""preserve returns escaped linefeeds."""
result = templating.parse('{{ "A\nB" | preserve }}')
self.assertEqual(result, 'A\\nB')
def test_template_extra_globals(self):
"""Globals are available in template."""
result = templating.parse("{{ foo }}", foo="bar")
self.assertEqual(result, 'bar')
def test_template_syntax_error(self):
"""jinja.TemplateSyntaxError is caught."""
self.assertRaises(templating.TemplateException, templating.parse,
"{{ not closed")
def test_template_undefined_error(self):
"""jinja.UndefinedError is caught."""
self.assertRaises(templating.TemplateException, templating.parse,
"{{ unknown() }}")
def test_template_exception(self):
"""Exception in global is caught."""
self.assertRaises(templating.TemplateException, templating.parse,
"{{ boom() }}", boom=fail_fixture)
def test_extra_globals(self):
"""Validates globals are set."""
env = templating.get_jinja_environment("", {'foo': 1})
self.assertTrue('foo' in env.globals)
self.assertEqual(env.globals['foo'], 1)
def test_json_included(self):
"""json library available to template."""
result = templating.parse("{{ json.dumps({'data': 1}) }}")
self.assertEqual(result, '{"data": 1}')
if __name__ == '__main__':
unittest.main()


@ -1,294 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Satori DNS Discovery."""
import socket
import unittest
from freezegun import freeze_time
import mock
import pythonwhois
import six
from satori import dns
from satori import errors
from satori.tests import utils
class TestDNS(utils.TestCase):
def setUp(self):
self.ip = "4.3.2.1"
self.domain = "domain.com"
self.mysocket = socket
self.mysocket.gethostbyname = mock.MagicMock(name='gethostbyname')
self.WHOIS = ["""
The data in Fake Company WHOIS database is provided
by Fake Company for information purposes only. By submitting
WHOIS query, you agree that you will use this data only for lawful
purpose. In addition, you agree not to:
(a) use the data to allow, enable, or otherwise support marketing
activities, regardless of the medium. Such media include but are
not limited to e-mail, telephone, facsimile, postal mail, SMS, and
wireless alerts; or
(b) use the data to enable high volume, electronic processes
that send queries or data to the systems of any Registry Operator or
ICANN-Accredited registrar, except as necessary to register
domain names or modify existing registrations.
(c) sell or redistribute the data except insofar as it has been
incorporated into a value-added product that does not permit
the extraction of a portion of the data from the value-added
product or service for use by other parties.
Fake Company reserves the right to modify these terms at any time.
Fake Company cannot guarantee the accuracy of the data provided.
By accessing and using Fake Company WHOIS service, you agree to
these terms.
NOTE: FAILURE TO LOCATE A RECORD IN THE WHOIS DATABASE IS NOT
INDICATIVE OF THE AVAILABILITY OF A DOMAIN NAME.
Domain Name: mytestdomain.com
Registry Domain ID:
Registrar WHOIS Server: whois.fakecompany.com
Registrar URL: http://www.fakecompany.com
Updated Date: 2013-08-15T05:02:28Z
Creation Date: 2010-11-01T23:57:06Z
Registrar Registration Expiration Date: 2020-01-01T00:00:00Z
Registrar: Fake Company, Inc
Registrar IANA ID: 106
Registrar Abuse Contact Email: abuse@fakecompany.com
Registrar Abuse Contact Phone: +44.2070159370
Reseller:
Domain Status: ACTIVE
Registry Registrant ID:
Registrant Name: Host Master
Registrant Organization: Rackspace US, Inc.
Registrant Street: 5000 Walzem Road
Registrant City: San Antonio,
Registrant State/Province: Texas
Registrant Postal Code: 78218
Registrant Country: US
Registrant Phone:
Registrant Phone Ext:
Registrant Fax:
Registrant Fax Ext:
Registrant Email:
Registry Admin ID:
Admin Name: Host Master
Admin Organization: Rackspace US, Inc.
Admin Street: 5000 Walzem Road
Admin City: San Antonio,
Admin State/Province: Texas
Admin Postal Code: 78218
Admin Country: US
Admin Phone: +1.2103124712
Admin Phone Ext:
Admin Fax:
Admin Fax Ext:
Admin Email: domains@rackspace.com
Registry Tech ID:
Tech Name: Domain Administrator
Tech Organization: NetNames Hostmaster
Tech Street: 3rd Floor Prospero House
Tech Street: 241 Borough High Street
Tech City: Borough
Tech State/Province: London
Tech Postal Code: SE1 1GA
Tech Country: GB
Tech Phone: +44.2070159370
Tech Phone Ext:
Tech Fax: +44.2070159375
Tech Fax Ext:
Tech Email: corporate-services@netnames.com
Name Server: ns1.domain.com
Name Server: ns2.domain.com
DNSSEC:
URL of the ICANN WHOIS Data Problem System: http://wdprs.fake.net/
>>> Last update of WHOIS database: 2014-02-18T03:39:52 UTC <<<
"""]
patcher = mock.patch.object(pythonwhois.net, 'get_whois_raw')
self.mock_get_whois_raw = patcher.start()
self.mock_get_whois_raw.return_value = self.WHOIS, [None]
self.addCleanup(patcher.stop)
super(TestDNS, self).setUp()
def test_resolve_domain_name_returns_ip(self):
self.mysocket.gethostbyname.return_value = self.ip
self.assertEqual(self.ip, dns.resolve_hostname(self.domain))
def test_resolve_ip_returns_ip(self):
self.mysocket.gethostbyname.return_value = self.ip
self.assertEqual(self.ip, dns.resolve_hostname(self.ip))
def test_resolve_int_raises_invalid_netloc_error(self):
self.assertRaises(
errors.SatoriInvalidNetloc,
dns.parse_target_hostname,
100)
def test_resolve_none_raises_invalid_netloc_error(self):
self.assertRaises(
errors.SatoriInvalidNetloc,
dns.parse_target_hostname,
None)
def test_registered_domain_subdomain_removed(self):
self.assertEqual(
self.domain,
dns.get_registered_domain("www." + self.domain)
)
def test_registered_domain_path_removed(self):
self.assertEqual(
self.domain,
dns.get_registered_domain("www." + self.domain + "/path")
)
def test_domain_info_returns_nameservers_from_whois(self):
data = dns.domain_info(self.domain)
self.assertEqual(
['ns1.domain.com', 'ns2.domain.com'],
data['nameservers']
)
def test_domain_info_returns_nameservers_as_list(self):
data = dns.domain_info(self.domain)
self.assertIsInstance(
data['nameservers'],
list
)
def test_domain_info_returns_registrar_from_whois(self):
data = dns.domain_info(self.domain)
self.assertEqual(
'Fake Company, Inc',
data['registrar']
)
def test_domain_info_returns_no_registrar_from_whois(self):
small_whois = ["""
Domain : example.io
Status : Live
Expiry : 2014-11-06
NS 1 : dns1.example.com
NS 2 : dns2.example.com
"""]
self.mock_get_whois_raw.return_value = small_whois, [None]
data = dns.domain_info(self.domain)
self.assertEqual(
[],
data['registrar']
)
def test_domain_info_returns_domain_name_from_parameter(self):
data = dns.domain_info(self.domain)
self.assertEqual(
self.domain,
data['name']
)
def test_domain_info_returns_slimmed_down_domain_name(self):
data = dns.domain_info("s1.www." + self.domain)
self.assertEqual(
self.domain,
data['name']
)
@freeze_time("2019-01-01")
def test_domain_info_returns_365_day_expiration(self):
data = dns.domain_info(self.domain)
self.assertEqual(
365,
data['days_until_expires']
)
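The `@freeze_time("2019-01-01")` decorator (from the freezegun library) pins the clock so a fixed expiry date yields a stable day count. The underlying arithmetic, sketched with "today" injected explicitly so no clock freezing is needed:

```python
import datetime

def days_until_expires(expiration_date, today=None):
    # 'today' is injectable for deterministic tests; a real implementation
    # would default to the current date, as satori presumably does.
    today = today or datetime.date.today()
    expires = datetime.datetime.strptime(expiration_date, "%Y-%m-%d").date()
    return (expires - today).days

# With the clock pinned to 2019-01-01, a 2020-01-01 expiry is 365 days out.
```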
def test_domain_info_returns_none_for_days_until_expires(self):
small_whois = ["""
Domain : example.io
Status : Live
NS 1 : dns1.example.com
NS 2 : dns2.example.com
"""]
self.mock_get_whois_raw.return_value = small_whois, [None]
data = dns.domain_info(self.domain)
self.assertEqual(
data['days_until_expires'],
None
)
def test_domain_info_returns_array_of_strings_whois_data(self):
data = dns.domain_info(self.domain)
self.assertIsInstance(data['whois'][0], str)
def test_domain_info_returns_string_date_for_expiry(self):
small_whois = ["""
Domain : example.io
Status : Live
Expiry : 2014-11-06
NS 1 : dns1.example.com
NS 2 : dns2.example.com
"""]
self.mock_get_whois_raw.return_value = small_whois, [None]
data = dns.domain_info(self.domain)
self.assertIsInstance(data['expiration_date'], six.string_types)
def test_domain_info_returns_string_for_expiration_date_string(self):
data = dns.domain_info(self.domain)
self.assertIsInstance(data['expiration_date'], six.string_types)
def test_domain_info_returns_none_for_missing_expiration_date(self):
small_whois = ["""
Domain : example.io
Status : Live
NS 1 : dns1.example.com
NS 2 : dns2.example.com
"""]
self.mock_get_whois_raw.return_value = small_whois, [None]
data = dns.domain_info(self.domain)
self.assertIsNone(data['expiration_date'])
def test_domain_info_raises_invalid_domain_error(self):
ip_whois = ["""
Home net HOME-NET-192-168 (NET-192-0-0-0-1)
Home Inc. HOME-NET-192-168-0 (NET-192-168-0-0-1)
"""]
self.mock_get_whois_raw.return_value = ip_whois, [None]
self.assertRaises(
errors.SatoriInvalidDomain,
dns.domain_info,
"192.168.0.1"
)
def test_ip_info_raises_invalid_ip_error(self):
self.assertRaises(
errors.SatoriInvalidIP,
dns.ip_info,
"example.com"
)
def test_ip_info_raises_invalid_ip_error_bad_ip(self):
self.assertRaises(
errors.SatoriInvalidIP,
dns.ip_info,
"1.2.3"
)
if __name__ == "__main__":
unittest.main()


@ -1,188 +0,0 @@
# pylint: disable=C0103,R0904
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Tests for Format Templates."""
import unittest
from satori.common import templating
from satori import shell
class TestTextTemplate(unittest.TestCase):
"""Test Text Template."""
def setUp(self):
self.template = shell.get_template('text')
def test_no_data(self):
"""Handles response with no host."""
env_vars = dict(lstrip_blocks=True, trim_blocks=True)
result = templating.parse(self.template, data={}, env_vars=env_vars)
self.assertEqual(result.strip('\n'), 'Host not found')
def test_target_is_ip(self):
"""Handles response when host is just the supplied address."""
env_vars = dict(lstrip_blocks=True, trim_blocks=True)
result = templating.parse(self.template, target='127.0.0.1',
data={'found': {'ip-address': '127.0.0.1'}},
env_vars=env_vars)
self.assertEqual(result.strip('\n'),
'Host:\n ip-address: 127.0.0.1')
def test_host_not_server(self):
"""Handles response when host is not a nova instance."""
env_vars = dict(lstrip_blocks=True, trim_blocks=True)
result = templating.parse(self.template, target='localhost',
data={'found': {'ip-address': '127.0.0.1'}},
env_vars=env_vars)
self.assertEqual(result.strip('\n'),
'Address:\n localhost resolves to IPv4 address '
'127.0.0.1\nHost:\n ip-address: 127.0.0.1')
def test_host_is_nova_instance(self):
"""Handles response when host is a nova instance."""
data = {
'found': {
'ip-address': '10.1.1.45',
'hostname': 'x',
'host-key': 'https://servers/path'
},
'target': 'instance.nova.local',
'resources': {
'https://servers/path': {
'type': 'OS::Nova::Instance',
'data': {
'uri': 'https://servers/path',
'id': '1000B',
'name': 'x',
'addresses': {
'public': [{'type': 'ipv4', 'addr': '10.1.1.45'}]
},
'system_info': {
'connections': {
'192.168.2.100': [],
'192.168.2.101': [433],
'192.168.2.102': [8080, 8081]
},
'remote_services': [
{
'ip': '0.0.0.0',
'process': 'nginx',
'port': 80
}
]
}
}
}
}
}
env_vars = dict(lstrip_blocks=True, trim_blocks=True)
result = templating.parse(self.template,
target='instance.nova.local',
data=data, env_vars=env_vars)
expected = """\
Address:
instance.nova.local resolves to IPv4 address 10.1.1.45
Host:
10.1.1.45 (instance.nova.local) is hosted on a Nova instance
Instance Information:
URI: https://servers/path
Name: x
ID: 1000B
ip-addresses:
public:
10.1.1.45
Listening Services:
0.0.0.0:80 nginx
Talking to:
192.168.2.100
192.168.2.101 on 433
192.168.2.102 on 8080, 8081"""
self.assertEqual(result.strip('\n'), expected)
def test_host_has_no_data(self):
"""Handles response when host is a nova instance."""
data = {
'found': {
'ip-address': '10.1.1.45',
'hostname': 'x',
'host-key': 'https://servers/path'
},
'target': 'instance.nova.local',
'resources': {
'https://servers/path': {
'type': 'OS::Nova::Instance'
}
}
}
env_vars = dict(lstrip_blocks=True, trim_blocks=True)
result = templating.parse(self.template,
target='instance.nova.local',
data=data, env_vars=env_vars)
expected = """\
Address:
instance.nova.local resolves to IPv4 address 10.1.1.45
Host:
10.1.1.45 (instance.nova.local) is hosted on a Nova instance"""
self.assertEqual(result.strip('\n'), expected)
def test_host_data_missing_items(self):
"""Handles response when host is a nova instance."""
data = {
'found': {
'ip-address': '10.1.1.45',
'hostname': 'x',
'host-key': 'https://servers/path'
},
'target': 'instance.nova.local',
'resources': {
'https://servers/path': {
'type': 'OS::Nova::Instance',
'data': {
'id': '1000B',
'system_info': {
'remote_services': [
{
'ip': '0.0.0.0',
'process': 'nginx',
'port': 80
}
]
}
}
}
}
}
env_vars = dict(lstrip_blocks=True, trim_blocks=True)
result = templating.parse(self.template,
target='instance.nova.local',
data=data, env_vars=env_vars)
expected = """\
Address:
instance.nova.local resolves to IPv4 address 10.1.1.45
Host:
10.1.1.45 (instance.nova.local) is hosted on a Nova instance
Instance Information:
URI: n/a
Name: n/a
ID: 1000B
Listening Services:
0.0.0.0:80 nginx"""
self.assertEqual(result.strip('\n'), expected)
if __name__ == '__main__':
unittest.main()


@ -1,159 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Unit Tests for Shell module."""
import copy
import sys
import unittest
import fixtures
import mock
import six
from satori import errors
from satori import shell
from satori.tests import utils
if six.PY2:
BUILTINS = "__builtin__"
else:
BUILTINS = "builtins"
class TestTemplating(utils.TestCase):
"""Test Templating Code."""
@mock.patch('%s.open' % BUILTINS)
def test_get_template(self, mock_open):
"""Verify that get_template looks for the right template."""
manager = mock_open.return_value.__enter__.return_value
manager.read.return_value = 'some data'
result = shell.get_template("foo")
self.assertEqual(result, "some data")
call_ = mock_open.call_args_list[0]
args, _ = call_
path, modifier = args
self.assertTrue(path.endswith("/foo.jinja"))
self.assertEqual(modifier, 'r')
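Patching the builtin `open` (via the version-dependent `BUILTINS` name above) lets a test intercept file reads without touching disk. A standalone sketch of the same pattern, with `read_template` as a hypothetical loader shaped like `shell.get_template`:

```python
import sys
import unittest
from unittest import mock

BUILTINS = "__builtin__" if sys.version_info[0] == 2 else "builtins"

def read_template(name):
    # Hypothetical loader mirroring the open-then-read shape under test.
    with open("/templates/%s.jinja" % name, "r") as handle:
        return handle.read()

class OpenMockTest(unittest.TestCase):
    @mock.patch("%s.open" % BUILTINS)
    def test_read(self, mock_open):
        # open() is used as a context manager, so configure read() on the
        # object returned by __enter__.
        manager = mock_open.return_value.__enter__.return_value
        manager.read.return_value = "some data"
        self.assertEqual(read_template("foo"), "some data")
        path, mode = mock_open.call_args[0]
        self.assertTrue(path.endswith("/foo.jinja"))
        self.assertEqual(mode, "r")
```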
@mock.patch.object(shell, 'get_template')
def test_output_results(self, mock_template):
"""Verify that output formatter parses supllied template."""
mock_template.return_value = 'Output: {{ data.foo }}'
result = shell.format_output("127.0.0.1", {'foo': 1})
self.assertEqual(result, "Output: 1")
FAKE_ENV = {
'OS_USERNAME': 'username',
'OS_PASSWORD': 'password',
'OS_TENANT_NAME': 'tenant_name',
'OS_AUTH_URL': 'http://no.where'
}
FAKE_ENV2 = {
'OS_USERNAME': 'username',
'OS_PASSWORD': 'password',
'OS_TENANT_ID': 'tenant_id',
'OS_AUTH_URL': 'http://no.where'
}
class TestArgParsing(utils.TestCase):
"""Test Argument Parsing."""
def setUp(self):
super(TestArgParsing, self).setUp()
patcher = mock.patch.object(shell, "os")
self.mock_os = patcher.start()
self.mock_os.environ = {}
self.addCleanup(patcher.stop)
def make_env(self, exclude=None, fake_env=FAKE_ENV):
"""Create a patched os.environ.
Borrowed from python-novaclient/novaclient/tests/test_shell.py.
"""
env = dict((k, v) for k, v in fake_env.items() if k != exclude)
self.useFixture(fixtures.MonkeyPatch('os.environ', env))
def run_shell(self, argstr, exitcodes=(0,)):
"""Simulate a user shell.
Borrowed from python-novaclient/novaclient/tests/test_shell.py.
"""
orig = sys.stdout
orig_stderr = sys.stderr
try:
sys.stdout = six.StringIO()
sys.stderr = six.StringIO()
shell.main(argstr.split())
except SystemExit:
exc_type, exc_value, exc_traceback = sys.exc_info()
self.assertIn(exc_value.code, exitcodes)
finally:
stdout = sys.stdout.getvalue()
sys.stdout.close()
sys.stdout = orig
stderr = sys.stderr.getvalue()
sys.stderr.close()
sys.stderr = orig_stderr
return (stdout, stderr)
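The swap-capture-restore dance in `run_shell` above is a general technique for asserting on printed output. A stdlib-only sketch of the same idea for stdout alone:

```python
import sys
from io import StringIO

def capture_stdout(func, *args):
    # Swap sys.stdout for an in-memory buffer, restoring it afterwards,
    # as run_shell above does for both stdout and stderr.
    orig = sys.stdout
    sys.stdout = StringIO()
    try:
        func(*args)
        return sys.stdout.getvalue()
    finally:
        sys.stdout.close()
        sys.stdout = orig
```

The `finally` block guarantees the real stream comes back even when the function under test raises (e.g. via `SystemExit`).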
def test_missing_openstack_field_raises_argument_exception(self):
"""Verify that all 'required' OpenStack fields are needed.
Iterate over the list of fields, remove one and verify that
an exception is raised.
"""
fields = [
'--os-username=bob',
'--os-password=secret',
'--os-auth-url=http://domain.com/v1/auth',
'--os-region-name=hawaii',
'--os-tenant-name=bobs-better-burger',
]
for i in range(len(fields)):
fields_copy = copy.copy(fields)
fields_copy.pop(i)
fields_copy.append('domain.com')
self.assertRaises(
errors.SatoriShellException,
self.run_shell,
' '.join(fields_copy),
exitcodes=[0, 2]
)
def test_netloc_parser(self):
self.assertEqual(shell.netloc_parser("localhost"),
(None, 'localhost'))
def test_netloc_parser_both(self):
self.assertEqual(shell.netloc_parser("name@address"),
('name', 'address'))
def test_netloc_parser_edge(self):
self.assertEqual(shell.netloc_parser("@address"),
(None, 'address'))
self.assertEqual(shell.netloc_parser("root@"),
('root', None))
self.assertEqual(shell.netloc_parser(""),
(None, None))
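The edge cases above pin down the contract for `shell.netloc_parser`: empty halves become `None`. A minimal sketch consistent with those expectations (not the actual satori implementation):

```python
def netloc_parser(netloc):
    # Split "user@host" into (user, host); empty parts become None,
    # matching the test expectations above.
    if netloc and "@" in netloc:
        user, _, host = netloc.partition("@")
        return (user or None, host or None)
    return (None, netloc or None)
```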
if __name__ == '__main__':
unittest.main()


@ -1,666 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
# pylint: disable=C0111, C0103, W0212, R0904
"""Satori SSH Module Tests."""
import os
import unittest
import mock
import paramiko
from satori import errors
from satori import ssh
from satori.tests import utils
class TestTTYRequired(utils.TestCase):
"""Test response to tty demand."""
def setUp(self):
super(TestTTYRequired, self).setUp()
self.client = ssh.SSH('123.456.789.0', password='test_password')
self.stdout = mock.MagicMock()
self.stdin = mock.MagicMock()
def test_valid_demand(self):
"""Ensure that anticipated requests for tty's return True."""
for substring in ssh.TTY_REQUIRED:
results = {'stdout': "xyz" + substring + "zyx"}
self.assertTrue(self.client._handle_tty_required(results, False))
def test_normal_response(self):
"""Ensure standard response returns False."""
examples = ["hello", "#75-Ubuntu SMP Tue Jun 18 17:59:38 UTC 2013",
("fatal: Not a git repository "
"(or any of the parent directories): .git")]
for substring in examples:
results = {'stderr': '', 'stdout': substring}
self.assertFalse(self.client._handle_tty_required(results, False))
def test_no_recurse(self):
"""Avoid infinte loop by raising GetPTYRetryFailure.
When retrying with get_pty in response to one of TTY_REQUIRED
"""
for substring in ssh.TTY_REQUIRED:
results = {'stdout': substring}
self.assertRaises(errors.GetPTYRetryFailure,
self.client._handle_tty_required,
results, True)
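The three tests above outline the tty-handling contract: return True on first sight of a tty demand, pass through normal output, and raise rather than retry forever. A sketch of that logic under those assumptions (the marker strings and names here are hypothetical, not satori's actual `TTY_REQUIRED` values):

```python
# Hypothetical marker list; ssh.TTY_REQUIRED serves the same role.
TTY_REQUIRED = ["you must have a tty", "is not a tty", "sudo: sorry"]

class GetPTYRetryFailure(Exception):
    """Raised when a get_pty retry has already been attempted."""

def handle_tty_required(results, get_pty_already_tried):
    # True means "retry with a pty"; raising breaks the would-be loop
    # when a pty retry already happened.
    output = results.get("stdout", "") + results.get("stderr", "")
    if any(marker in output for marker in TTY_REQUIRED):
        if get_pty_already_tried:
            raise GetPTYRetryFailure("get_pty retry failed")
        return True
    return False
```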
class TestConnectHelper(utils.TestCase):
def test_connect_helper(self):
self.assertIsInstance(ssh.connect("123.456.789.0"), ssh.SSH)
def test_throws_typeerror_well(self):
self.assertRaises(TypeError, ssh.connect,
("123.456.789.0",), invalidkey="bad")
def test_throws_typeerror_well_with_message(self):
try:
ssh.connect("123.456.789.0", invalidkey="bad")
except TypeError as exc:
self.assertEqual("connect() got an unexpected keyword "
"argument 'invalidkey'", str(exc))
def test_throws_error_no_host(self):
self.assertRaises(TypeError, ssh.connect)
class TestPasswordPrompt(utils.TestCase):
def setUp(self):
super(TestPasswordPrompt, self).setUp()
ssh.LOG = mock.MagicMock()
self.client = ssh.SSH('123.456.789.0', password='test_password')
self.stdout = mock.MagicMock()
self.stdin = mock.MagicMock()
def test_channel_closed(self):
"""If the channel is closed, there's no prompt."""
self.stdout.channel.closed = True
self.assertFalse(
self.client._handle_password_prompt(self.stdin, self.stdout))
def test_password_prompt_buflen_too_short(self):
"""Stdout chan buflen is too short to be a password prompt."""
self.stdout.channel.closed = False
self.stdout.channel.in_buffer = "a" * (ssh.MIN_PASSWORD_PROMPT_LEN - 1)
self.assertFalse(
self.client._handle_password_prompt(self.stdin, self.stdout))
def test_password_prompt_buflen_too_long(self):
"""Stdout chan buflen is too long to be a password prompt."""
self.stdout.channel.closed = False
self.stdout.channel.in_buffer = "a" * (ssh.MAX_PASSWORD_PROMPT_LEN + 1)
self.assertFalse(
self.client._handle_password_prompt(self.stdin, self.stdout))
def test_common_password_prompt(self):
"""Ensure that a couple commonly seen prompts have success."""
self.stdout.channel.closed = False
self.stdout.channel.in_buffer = "[sudo] password for user:"
self.stdout.channel.recv.return_value = self.stdout.channel.in_buffer
self.assertTrue(
self.client._handle_password_prompt(self.stdin, self.stdout))
self.stdout.channel.in_buffer = "Password:"
self.stdout.channel.recv.return_value = self.stdout.channel.in_buffer
self.assertTrue(
self.client._handle_password_prompt(self.stdin, self.stdout))
def test_password_prompt_other_prompt(self):
"""Pass buflen check, fail on substring check."""
self.stdout.channel.closed = False
self.stdout.channel.in_buffer = "Welcome to <hostname>:"
self.stdout.channel.recv.return_value = self.stdout.channel.in_buffer
self.assertFalse(
self.client._handle_password_prompt(self.stdin, self.stdout))
def test_logging_encountered_prompt(self):
self.stdout.channel.closed = False
self.stdout.channel.in_buffer = "[sudo] password for user:"
self.stdout.channel.recv.return_value = self.stdout.channel.in_buffer
self.client._handle_password_prompt(self.stdin, self.stdout)
ssh.LOG.warning.assert_called_with(
'%s@%s encountered prompt! of length [%s] {%s}', "root",
'123.456.789.0', 25, '[sudo] password for user:')
def test_logging_nearly_false_positive(self):
"""Assert that a close-call on a false-positive logs a warning."""
other_prompt = "Welcome to <hostname>:"
self.stdout.channel.closed = False
self.stdout.channel.in_buffer = other_prompt
self.stdout.channel.recv.return_value = self.stdout.channel.in_buffer
self.client._handle_password_prompt(self.stdin, self.stdout)
ssh.LOG.warning.assert_called_with(
'Nearly a False-Positive on password prompt detection. [%s] {%s}',
22, other_prompt)
def test_password_given_to_prompt(self):
self.stdout.channel.closed = False
self.stdout.channel.in_buffer = "[sudo] password for user:"
self.stdout.channel.recv.return_value = self.stdout.channel.in_buffer
self.client._handle_password_prompt(self.stdin, self.stdout)
self.stdin.write.assert_called_with(self.client.password + '\n')
def test_password_given_returns_true(self):
self.stdout.channel.closed = False
self.stdout.channel.in_buffer = "[sudo] password for user:"
self.stdout.channel.recv.return_value = self.stdout.channel.in_buffer
self.assertTrue(
self.client._handle_password_prompt(self.stdin, self.stdout))
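The prompt tests above exercise a two-stage heuristic: a cheap buffer-length gate, then a substring check. A sketch of that shape, with hypothetical bounds (satori's ssh module defines the real `MIN_PASSWORD_PROMPT_LEN`/`MAX_PASSWORD_PROMPT_LEN` constants):

```python
MIN_PASSWORD_PROMPT_LEN = 8   # hypothetical lower bound
MAX_PASSWORD_PROMPT_LEN = 64  # hypothetical upper bound

def looks_like_password_prompt(channel_closed, buffer_text):
    # Closed channel: nothing to prompt. Length gate first, then the
    # substring check, mirroring the cases tested above.
    if channel_closed:
        return False
    if not MIN_PASSWORD_PROMPT_LEN <= len(buffer_text) <= MAX_PASSWORD_PROMPT_LEN:
        return False
    return "password" in buffer_text.lower()
```

The length gate exists because checking every chunk of output for "password" would be slow and false-positive-prone; real prompts are short.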
class SSHTestBase(utils.TestCase):
"""Base class with a set of test ssh keys and paramiko.SSHClient mock."""
def setUp(self):
super(SSHTestBase, self).setUp()
self.invalidkey = """
-----BEGIN RSA PRIVATE KEY-----
MJK7hkKYHUNJKDHNF)980BN456bjnkl_0Hj08,l$IRJSDLKjhkl/jFJVSLx2doRZ
-----END RSA PRIVATE KEY-----
"""
self.ecdsakey = """
-----BEGIN EC PRIVATE KEY-----
MHcCAQEEIIiZdMfDf+lScOkujN1+zAKDJ9PQRquCVZoXfS+6hDlToAoGCCqGSM49
AwEHoUQDQgAE/qUj+vxnhIrkTR/ayYx9ZC/9JanJGyXkOe3Oe6WT/FJ9vBbfThTF
U9+i43I3TONq+nWbhFKBj8XR4NKReaYeBw==
-----END EC PRIVATE KEY-----
"""
self.rsakey = """
-----BEGIN RSA PRIVATE KEY-----
MIIEowIBAAKCAQEAwbbaH5m0yLIVAi1i4aJ5uKprPM93x6b/KkH5N4QmZoXGOFId
v0G64Sanz1VZkCWXiyivgkT6/y0+M0Ok8UK24UO6YNBSFGKboan/OMNETTIqXzmV
liVYkQTf2zrBPWofjeDnzMndy7AD5iylJ6cNAksFM+sLt0MQcOeCmbOX8E6+AGZr
JLj8orJgGJKU9jN5tnMlgtDP9BVrrbi7wX0kqb42OMtM6AuMUBDtAM2QSpTJa0JL
mFOLfe6PYOLdQaJsnaoV+Wu4eBdY91h8COmhOKZv5VMYalOSDQnsKgngDW9iOoFs
Uou7W8Wk3FXusbDwAvakWKmQtDF8SIgMLqygTwIDAQABAoIBAQCe5FkuKmm7ZTcO
PiQpZ5fn/QFRM+vP/A64nrzI6MCGv5vDfreftU6Qd6CV1DBOqEcRgiHT/LjUrkui
yQ12R36yb1dlKfrpdaiqhkIuURypJUjUKuuj6KYo7ZKgxCTVN0MCoUQBGmOvO4U3
O8+MIt3sz5RI7bcCbyQBOCRL5p/uH3soWoG+6u2W17M4otLT0xJGX5eU0AoCYfOi
Vd9Ot3j687k6KtZajy2hZIccuGNRwFeKSIAN9U7FEy4fgxkIMrc/wqArKmZLNui1
SkVP3UHlbGVAI5ZDLzdcyxXPRWz1FBtJYiITtQCVKTv5LFCxFjlIWML2qJMB2GTW
0+t1WhEhAoGBAOFdh14qn0i5v7DztpkS665vQ9F8n7RN0b17yK5wNmfhd4gYK/ym
hCPUk0+JfPNQuhhhzoDXWICiCHRqNVT0ZzkyY0E2aTYLYxbeKkiCOccqJXxtxiI+
6KneRMV3mKaJXJLz8G0YepB2Qhv4JkNsR1yiA5EqIs0Cr9Jafg9tHQsrAoGBANwL
5lYjNHu51WVdjv2Db4oV26fRqAloc0//CBCl9IESM7m9A7zPTboMMijSqEuz3qXJ
Fd5++B/b1Rkt4EunJNcE+XRJ9cI7MKE1kYKz6oiSN4X4eHQSDmlpS9DBcAEjTJ8r
c+5DsPMSkz6qMxbG+FZB1SvVflFZe9dO8Ba7oR1tAoGAa+97keIf/5jW8k0HOzEQ
p66qcH6bjqNmvLW4W7Nqmz4lHY1WI98skmyRURqsOWyEdIEDgjmhLZptKjRj7phP
h9lWKDmDEltJzf4BilC0k2rgIUQCDQzMKe9GSL0K41gOemNS1y1OJjo9V1/2E3yc
gQUnaDMiD8Ylpz2n+oNr0ZkCgYBqDK4g+2yS6JgI91MvqQW7lhc7xRZoGlfgyPe5
FlJFVmFpdcf0WjCKptARzpzfhzuZyNTqW2T37bnBHdQIgfCGVFZpDjAMQPyJ5UhQ
pqc01Ms/nOVogz9A3Ed2v5NcaQfHemiv/x2ruFsQi3R92LzczXOQYZ80U50Uwm2B
d0IJ7QKBgD39jFiz7U4XEK/knRWUBUNq8QSGF5UuzO404z/+6J2KlFeNiDe+aH0c
cdi+/PhkDkMXfW6eQdvgFYs277uss4M+4F8fWb2KVvPTuZXmTf6qntFoZNuL1oIv
kn+fI2noF0ET7ktofoPEeD2/ya0B9/XecUqDJcVofoVO2pxMn12A
-----END RSA PRIVATE KEY-----
"""
self.dsakey = """
-----BEGIN DSA PRIVATE KEY-----
MIIBuwIBAAKBgQC+WvLRuPNDPVfZwKYqJYuD6XXjrUU4KIdLWmRO9qOtq0UR1kOQ
/4rhjgb2TyujW6RzPnqPc9eUv84Z3gKawAdZv5/vKbp6tpMn86Y42r0Ohy63DEgM
XyBfWxbZm0RBmLy3bCUefMOBngnODIhrTt2o+ip5ve5JMctDvjkWBVnZiQIVAMlh
6gd7IC68FwynC4f/p8+zpx9pAoGARjTQeKxBBDDfxySYDN0maXHMR21RF/gklecO
x6sH1MEDtOupQk0/uIPvolH0Jh+PK+NAv0GBZ96PDrF5z0S6MyQ5eHWGtwW4NFqk
ZGHTriy+8qc4OhtyS3dpXQu40Ad2o1ap1v806RwM8iw1OfBa94h/vreedO0ij2Fe
7aKEci4CgYAITw+ySCskHakn1GTG952MKxlMo7Mx++dYnCoFxsMwXFlwIrpzyhhC
Qk11sEgcAOZ2HiRVhwaz4BivNV5iuwUeIeKJc12W4+FU+Lh533hFOcSAYbBr1Crl
e+YpaOHRjLel0Nb5Cil4qEQaWQDmWvQb958IQQgzC9NhnR7NRNkfrgIVAKfMMZKz
57plimt3W9YoDAATyr6i
-----END DSA PRIVATE KEY-----
"""
patcher = mock.patch.object(paramiko.SSHClient, "connect")
self.mock_connect = patcher.start()
self.addCleanup(patcher.stop)
patcher = mock.patch.object(paramiko.SSHClient,
"load_system_host_keys")
self.mock_load = patcher.start()
self.addCleanup(patcher.stop)
class TestSSHKeyConversion(SSHTestBase):
"""Test ssh key conversion reoutines."""
def test_invalid_key_raises_sshexception(self):
self.assertRaises(
paramiko.SSHException, ssh.make_pkey, self.invalidkey)
def test_valid_ecdsa_returns_pkey_obj(self):
self.assertIsInstance(ssh.make_pkey(self.ecdsakey), paramiko.PKey)
def test_valid_rsa_returns_pkey_obj(self):
self.assertIsInstance(ssh.make_pkey(self.rsakey), paramiko.PKey)
def test_valid_ds_returns_pkey_obj(self):
self.assertIsInstance(ssh.make_pkey(self.dsakey), paramiko.PKey)
@mock.patch.object(ssh, 'LOG')
def test_valid_ecdsa_logs_key_class(self, mock_LOG):
ssh.make_pkey(self.ecdsakey)
mock_LOG.info.assert_called_with(
'Valid SSH Key provided (%s)', 'ECDSAKey')
@mock.patch.object(ssh, 'LOG')
def test_valid_rsa_logs_key_class(self, mock_LOG):
ssh.make_pkey(self.rsakey)
mock_LOG.info.assert_called_with(
'Valid SSH Key provided (%s)', 'RSAKey')
@mock.patch.object(ssh, 'LOG')
def test_valid_dsa_logs_key_class(self, mock_LOG):
ssh.make_pkey(self.dsakey)
mock_LOG.info.assert_called_with(
'Valid SSH Key provided (%s)', 'DSSKey')
class TestSSHLocalKeys(SSHTestBase):
def setUp(self):
super(TestSSHLocalKeys, self).setUp()
self.host = '123.456.789.0'
self.client = ssh.SSH(self.host, username='test-user')
@mock.patch.object(ssh, 'LOG')
def test_connect_host_keys(self, mock_LOG):
self.client.connect_with_host_keys()
self.mock_connect.assert_called_once_with(
'123.456.789.0', username='test-user', pkey=None, timeout=20,
sock=None, port=22, look_for_keys=True, allow_agent=False)
mock_LOG.debug.assert_called_with(
"Trying to connect with local host keys")
class TestSSHPassword(SSHTestBase):
def setUp(self):
super(TestSSHPassword, self).setUp()
self.host = '123.456.789.0'
self.client = ssh.SSH(self.host, username='test-user')
@mock.patch.object(ssh, 'LOG')
def test_connect_with_password(self, mock_LOG):
self.client.password = "pxwd"
self.client.connect_with_password()
self.mock_connect.assert_called_once_with(
'123.456.789.0', username='test-user', pkey=None, timeout=20,
sock=None, port=22, allow_agent=False, look_for_keys=False,
password="pxwd")
mock_LOG.debug.assert_called_with("Trying to connect with password")
def test_connect_with_no_password(self):
self.client.password = None
self.assertRaises(paramiko.PasswordRequiredException,
self.client.connect_with_password)
@mock.patch.object(ssh, 'getpass')
def test_connect_with_no_password_interactive(self, mock_getpass):
self.client.password = None
self.client.interactive = True
mock_getpass.getpass.return_value = "in-pxwd"
self.client.connect_with_password()
self.mock_connect.assert_called_once_with(
'123.456.789.0', username='test-user', pkey=None, timeout=20,
sock=None, port=22, allow_agent=False, look_for_keys=False,
password="in-pxwd")
@mock.patch.object(ssh, 'LOG')
@mock.patch.object(ssh, 'getpass')
def test_connect_with_interactive_cancel(self, mock_getpass, mock_LOG):
self.client.password = None
self.client.interactive = True
mock_getpass.getpass.side_effect = KeyboardInterrupt()
self.assertRaises(paramiko.PasswordRequiredException,
self.client.connect_with_password)
mock_LOG.debug.assert_any_call("User cancelled at password prompt")
mock_LOG.debug.assert_any_call("Prompting for password "
"(interactive=%s)", True)
class TestSSHKeyFile(SSHTestBase):
def setUp(self):
super(TestSSHKeyFile, self).setUp()
self.host = '123.456.789.0'
self.client = ssh.SSH(self.host, username='test-user')
@mock.patch.object(ssh, 'LOG')
def test_key_filename(self, mock_LOG):
self.client.key_filename = "~/not/a/real/path"
expanded_path = os.path.expanduser(self.client.key_filename)
self.client.connect_with_key_file()
self.mock_connect.assert_called_once_with(
'123.456.789.0', username='test-user', pkey=None, timeout=20,
look_for_keys=False, allow_agent=False, sock=None, port=22,
key_filename=expanded_path)
mock_LOG.debug.assert_any_call("Trying to connect with key file")
def test_bad_key_filename(self):
self.client.key_filename = None
self.assertRaises(paramiko.AuthenticationException,
self.client.connect_with_key_file)
class TestSSHKeyString(SSHTestBase):
def setUp(self):
super(TestSSHKeyString, self).setUp()
self.host = '123.456.789.0'
self.client = ssh.SSH(self.host, username='test-user')
def test_connect_invalid_private_key_string(self):
self.client.private_key = self.invalidkey
self.assertRaises(paramiko.SSHException, self.client.connect_with_key)
def test_connect_valid_private_key_string(self):
validkeys = [self.rsakey, self.dsakey, self.ecdsakey]
for key in validkeys:
self.client.private_key = key
self.client.connect_with_key()
pkey_kwarg_value = (paramiko.SSHClient.
connect.call_args[1]['pkey'])
self.assertIsInstance(pkey_kwarg_value, paramiko.PKey)
self.mock_connect.assert_called_with(
'123.456.789.0', username='test-user', allow_agent=False,
look_for_keys=False, sock=None, port=22, timeout=20,
pkey=pkey_kwarg_value)
def test_connect_no_private_key_string(self):
self.client.private_key = None
self.assertRaises(paramiko.AuthenticationException,
self.client.connect_with_key)
class TestSSHPrivateConnect(SSHTestBase):
"""Test _connect call."""
def setUp(self):
super(TestSSHPrivateConnect, self).setUp()
self.host = '123.456.789.0'
self.client = ssh.SSH(self.host, username='test-user')
@mock.patch.object(ssh, 'LOG')
def test_connect_no_auth_attrs(self, mock_LOG):
"""Test connect call without auth attributes."""
self.client._connect()
self.mock_connect.assert_called_once_with(
'123.456.789.0', username='test-user', pkey=None, sock=None,
port=22, timeout=20)
@mock.patch.object(ssh, 'LOG')
def test_connect_with_password(self, mock_LOG):
self.client._connect(password='test-password')
self.mock_connect.assert_called_once_with(
'123.456.789.0', username='test-user', timeout=20, pkey=None,
password='test-password', sock=None, port=22)
@mock.patch.object(ssh, 'LOG')
def test_use_password_on_exc_negative(self, mock_LOG):
"""Do this without self.password. """
self.mock_connect.side_effect = (
paramiko.PasswordRequiredException)
self.assertRaises(paramiko.PasswordRequiredException,
self.client._connect)
@mock.patch.object(ssh, 'LOG')
def test_default_user_is_root(self, mock_LOG):
self.client = ssh.SSH('123.456.789.0')
self.client._connect()
default = self.mock_connect.call_args[1]['username']
self.assertEqual(default, 'root')
@mock.patch.object(ssh, 'LOG')
def test_missing_host_key_policy(self, mock_LOG):
client = ssh.connect(
"123.456.789.0", options={'StrictHostKeyChecking': 'no'})
client._connect()
self.assertIsInstance(client._policy, ssh.AcceptMissingHostKey)
@mock.patch.object(ssh, 'LOG')
def test_adds_missing_host_key(self, mock_LOG):
client = ssh.connect(
"123.456.789.0", options={'StrictHostKeyChecking': 'no'})
client._connect()
pkey = ssh.make_pkey(self.rsakey)
client._policy.missing_host_key(
client,
"123.456.789.0",
pkey)
expected = {'123.456.789.0': {
'ssh-rsa': pkey}}
self.assertEqual(expected, client._host_keys)
class TestSSHConnect(SSHTestBase):
def setUp(self):
super(TestSSHConnect, self).setUp()
self.host = '123.456.789.0'
self.client = ssh.SSH(self.host, username='test-user')
@mock.patch.object(ssh, 'LOG')
def test_logging_when_badhostkey(self, mock_LOG):
"""Test when raising BadHostKeyException."""
self.client.password = "foo"
self.client.private_key = self.rsakey
exc = paramiko.BadHostKeyException(None, None, None)
self.mock_connect.side_effect = exc
try:
self.client.connect()
except paramiko.BadHostKeyException:
pass
mock_LOG.info.assert_called_with(
"ssh://%s@%s:%d failed: %s. "
"You might have a bad key entry on your server, "
"but this is a security issue and won't be handled "
"automatically. To fix this you can remove the "
"host entry for this host from the /.ssh/known_hosts file",
'test-user', '123.456.789.0', 22, exc)
@mock.patch.object(ssh, 'LOG')
def test_logging_when_reraising_other_exc(self, mock_LOG):
self.client.password = "foo"
exc = paramiko.SSHException()
self.mock_connect.side_effect = exc
self.assertRaises(paramiko.SSHException, self.client.connect)
mock_LOG.info.assert_any_call(
'ssh://%s@%s:%d failed. %s',
'test-user', '123.456.789.0', 22, exc)
def test_reraising_bad_host_key_exc(self):
self.client.password = "foo"
self.client.private_key = self.rsakey
exc = paramiko.BadHostKeyException(None, None, None)
self.mock_connect.side_effect = exc
self.assertRaises(paramiko.BadHostKeyException,
self.client.connect)
@mock.patch.object(ssh, 'LOG')
def test_logging_use_password_on_exc_positive(self, mock_LOG):
self.client.password = 'test-password'
self.mock_connect.side_effect = paramiko.PasswordRequiredException
self.assertRaises(paramiko.PasswordRequiredException,
self.client.connect)
mock_LOG.debug.assert_any_call('Trying to connect with password')
def test_connect_with_key(self):
self.client.key_filename = '/some/path'
self.client.connect()
self.mock_connect.assert_called_with(
'123.456.789.0', username='test-user', pkey=None,
allow_agent=False, key_filename='/some/path', sock=None,
look_for_keys=False, timeout=20, port=22)
class TestTestConnection(SSHTestBase):
def setUp(self):
super(TestTestConnection, self).setUp()
self.host = '123.456.789.0'
self.client = ssh.SSH(self.host, username='test-user')
def test_test_connection(self):
self.assertTrue(self.client.test_connection())
@mock.patch.object(ssh.SSH, "connect_with_host_keys")
def test_test_connection_fail_invalid_key(self, mock_keys):
mock_keys.side_effect = Exception()
self.client.private_key = self.invalidkey
self.assertFalse(self.client.test_connection())
def test_test_connection_valid_key(self):
self.client.private_key = self.dsakey
self.assertTrue(self.client.test_connection())
def test_test_connection_fail_other(self):
self.mock_connect.side_effect = Exception
self.assertFalse(self.client.test_connection())
@mock.patch.object(ssh, 'LOG')
def test_test_connection_logging(self, mock_LOG):
self.client.test_connection()
mock_LOG.debug.assert_any_call(
"Trying to connect with local host keys")
mock_LOG.debug.assert_any_call(
'ssh://%s@%s:%d is up.', 'test-user', self.host, 22)
class TestRemoteExecute(SSHTestBase):
def setUp(self):
super(TestRemoteExecute, self).setUp()
self.proxy_patcher = mock.patch.object(paramiko, "ProxyCommand")
self.proxy_patcher.start()
self.client = ssh.SSH('123.456.789.0', username='client-user')
self.client._handle_password_prompt = mock.Mock(return_value=False)
self.mock_chan = mock.MagicMock()
mock_transport = mock.MagicMock()
mock_transport.open_session.return_value = self.mock_chan
self.client.get_transport = mock.MagicMock(
return_value=mock_transport)
self.mock_chan.exec_command = mock.MagicMock()
self.mock_chan.makefile.side_effect = self.mkfile
self.mock_chan.makefile_stderr.side_effect = (
lambda x: self.mkfile(x, err=True))
self.example_command = 'echo hello'
self.example_output = 'hello'
def tearDown(self):
self.proxy_patcher.stop()
super(TestRemoteExecute, self).tearDown()
def mkfile(self, arg, err=False, stdoutput=None):
if arg == 'rb' and not err:
stdout = mock.MagicMock()
stdout.read.return_value = stdoutput or self.example_output
stdout.read.return_value += "\n"
return stdout
if arg == 'wb' and not err:
stdin = mock.MagicMock()
stdin.read.return_value = ''
return stdin
if err is True:
stderr = mock.MagicMock()
stderr.read.return_value = ''
return stderr
def test_remote_execute_proper_primitive(self):
self.client._handle_tty_required = mock.Mock(return_value=False)
commands = ['echo hello', 'uname -a', 'rev ~/.bash*']
for cmd in commands:
self.client.remote_execute(cmd)
self.mock_chan.exec_command.assert_called_with(cmd)
def test_remote_execute_no_exit_code(self):
self.client._handle_tty_required = mock.Mock(return_value=False)
self.mock_chan.recv_exit_status.return_value = 0
actual_output = self.client.remote_execute(self.example_command)
expected_output = {'stdout': self.example_output,
'stderr': ''}
self.assertEqual(expected_output, actual_output)
def test_remote_execute_with_exit_code(self):
self.client._handle_tty_required = mock.Mock(return_value=False)
self.mock_chan.recv_exit_status.return_value = 0
actual_output = self.client.remote_execute(
self.example_command, with_exit_code=True)
expected_output = {'stdout': self.example_output,
'stderr': '',
'exit_code': 0}
self.assertEqual(expected_output, actual_output)
def test_remote_execute_tty_required(self):
for i, substring in enumerate(ssh.TTY_REQUIRED):
self.mock_chan.makefile.side_effect = lambda x: self.mkfile(
x, stdoutput="xyz" + substring + "zyx")
self.assertRaises(
errors.GetPTYRetryFailure,
self.client.remote_execute,
'sudo echo_hello')
self.assertEqual(i + 1, self.mock_chan.get_pty.call_count)
def test_get_platform_info(self):
platinfo = ['Ubuntu', '12.04', 'precise', 'x86_64']
fields = ['dist', 'version', 'remove', 'arch']
expected_result = dict(zip(fields, [v.lower() for v in platinfo]))
expected_result.pop('remove')
self.mock_chan.makefile.side_effect = lambda x: self.mkfile(
x, stdoutput=str(expected_result))
self.assertEqual(expected_result, self.client.platform_info)
class TestProxy(SSHTestBase):
"""self.client in this class is instantiated with a proxy."""
def setUp(self):
super(TestProxy, self).setUp()
self.gateway = ssh.SSH('gateway.address', username='gateway-user')
def tearDown(self):
super(TestProxy, self).tearDown()
def test_test_connection_fail_other(self):
self.client = ssh.SSH(
'123.456.789.0', username='client-user', gateway=self.gateway)
self.mock_connect.side_effect = Exception
self.assertFalse(self.client.test_connection())
def test_connect_with_proxy_no_host_raises(self):
gateway = {'this': 'is not a gateway'}
self.assertRaises(
TypeError,
ssh.SSH, ('123.456.789.0',),
username='client-user', gateway=gateway)
if __name__ == "__main__":
unittest.main()


@@ -1,261 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Test Ohai-Solo Plugin."""
import unittest
import mock
from satori import errors
from satori.sysinfo import ohai_solo
from satori.tests import utils
class TestOhaiSolo(utils.TestCase):
@mock.patch.object(ohai_solo, 'bash')
@mock.patch.object(ohai_solo, 'system_info')
@mock.patch.object(ohai_solo, 'perform_install')
def test_connect_and_run(self, mock_install, mock_sysinfo, mock_bash):
address = "192.0.2.2"
config = {
'host_key': 'foo',
'host_username': 'bar',
}
mock_sysinfo.return_value = {}
result = ohai_solo.get_systeminfo(address, config)
self.assertTrue(result is mock_sysinfo.return_value)
mock_install.assert_called_once_with(
mock_bash.RemoteShell().__enter__.return_value)
mock_bash.RemoteShell.assert_any_call(
address, username="bar",
private_key="foo",
interactive=False)
mock_sysinfo.assert_called_with(
mock_bash.RemoteShell().__enter__.return_value)
class TestOhaiInstall(utils.TestCase):
def setUp(self):
super(TestOhaiInstall, self).setUp()
self.mock_remotesshclient = mock.MagicMock()
self.mock_remotesshclient.is_windows.return_value = False
def test_perform_install_fedora(self):
response = {'exit_code': 0, 'stdout': 'installed remote'}
self.mock_remotesshclient.execute.return_value = response
result = ohai_solo.perform_install(self.mock_remotesshclient)
self.assertEqual(result, response)
self.assertEqual(self.mock_remotesshclient.execute.call_count, 3)
self.mock_remotesshclient.execute.assert_has_calls([
mock.call('wget -N http://readonly.configdiscovery.rackspace.com/install.sh', cwd='/tmp',
escalate=True, allow_many=False),
mock.call('bash install.sh', cwd='/tmp', with_exit_code=True,
escalate=True, allow_many=False),
mock.call('rm install.sh', cwd='/tmp', escalate=True,
allow_many=False)])
def test_perform_install_with_install_dir(self):
response = {'exit_code': 0, 'stdout': 'installed remote'}
self.mock_remotesshclient.execute.return_value = response
result = ohai_solo.perform_install(self.mock_remotesshclient,
install_dir='/home/bob')
self.assertEqual(result, response)
self.assertEqual(self.mock_remotesshclient.execute.call_count, 3)
self.mock_remotesshclient.execute.assert_has_calls([
mock.call('wget -N http://readonly.configdiscovery.'
'rackspace.com/install.sh', cwd='/tmp',
escalate=True, allow_many=False),
mock.call('bash install.sh -t -i /home/bob', cwd='/tmp',
with_exit_code=True, escalate=True, allow_many=False),
mock.call('rm install.sh', cwd='/tmp', escalate=True,
allow_many=False)])
def test_perform_install_with_install_dir_and_spaces(self):
response = {'exit_code': 0, 'stdout': 'installed remote'}
self.mock_remotesshclient.execute.return_value = response
result = ohai_solo.perform_install(self.mock_remotesshclient,
install_dir='/srv/a diff * dir')
self.assertEqual(result, response)
self.assertEqual(self.mock_remotesshclient.execute.call_count, 3)
self.mock_remotesshclient.execute.assert_has_calls([
mock.call('wget -N http://readonly.configdiscovery.'
'rackspace.com/install.sh', cwd='/tmp',
escalate=True, allow_many=False),
mock.call('bash install.sh -t -i \'/srv/a diff * dir\'',
cwd='/tmp', with_exit_code=True, escalate=True,
allow_many=False),
mock.call('rm install.sh', cwd='/tmp', escalate=True,
allow_many=False)])
def test_install_linux_remote_failed(self):
response = {'exit_code': 1, 'stdout': "", "stderr": "FAIL"}
self.mock_remotesshclient.execute.return_value = response
self.assertRaises(errors.SystemInfoCommandInstallFailed,
ohai_solo.perform_install, self.mock_remotesshclient)
class TestOhaiRemove(utils.TestCase):
def setUp(self):
super(TestOhaiRemove, self).setUp()
self.mock_remotesshclient = mock.MagicMock()
self.mock_remotesshclient.is_windows.return_value = False
def test_remove_remote_fedora(self):
self.mock_remotesshclient.is_debian.return_value = False
self.mock_remotesshclient.is_fedora.return_value = True
response = {'exit_code': 0, 'foo': 'bar'}
self.mock_remotesshclient.execute.return_value = response
result = ohai_solo.remove_remote(self.mock_remotesshclient)
self.assertEqual(result, response)
self.mock_remotesshclient.execute.assert_called_once_with(
'yum -y erase ohai-solo', cwd='/tmp', escalate=True)
def test_remove_remote_debian(self):
self.mock_remotesshclient.is_debian.return_value = True
self.mock_remotesshclient.is_fedora.return_value = False
response = {'exit_code': 0, 'foo': 'bar'}
self.mock_remotesshclient.execute.return_value = response
result = ohai_solo.remove_remote(self.mock_remotesshclient)
self.assertEqual(result, response)
self.mock_remotesshclient.execute.assert_called_once_with(
'dpkg --purge ohai-solo', cwd='/tmp', escalate=True)
def test_remove_remote_unsupported(self):
self.mock_remotesshclient.is_debian.return_value = False
self.mock_remotesshclient.is_fedora.return_value = False
self.assertRaises(errors.UnsupportedPlatform,
ohai_solo.remove_remote, self.mock_remotesshclient)
def test_remove_remote_with_install_dir(self):
self.mock_remotesshclient.is_debian.return_value = True
self.mock_remotesshclient.is_fedora.return_value = False
response = {'exit_code': 0, 'foo': 'bar'}
self.mock_remotesshclient.execute.return_value = response
result = ohai_solo.remove_remote(self.mock_remotesshclient,
install_dir='/home/srv')
self.assertEqual(result, response)
self.mock_remotesshclient.execute.assert_called_once_with(
'rm -rf /home/srv/ohai-solo/', cwd='/tmp', escalate=True)
def test_remove_remote_with_install_dir_and_spaces(self):
self.mock_remotesshclient.is_debian.return_value = True
self.mock_remotesshclient.is_fedora.return_value = False
response = {'exit_code': 0, 'foo': 'bar'}
self.mock_remotesshclient.execute.return_value = response
result = ohai_solo.remove_remote(self.mock_remotesshclient,
install_dir='/srv/a dir')
self.assertEqual(result, response)
self.mock_remotesshclient.execute.assert_called_once_with(
'rm -rf \'/srv/a dir/ohai-solo/\'', cwd='/tmp', escalate=True)
class TestSystemInfo(utils.TestCase):
def setUp(self):
super(TestSystemInfo, self).setUp()
self.mock_remotesshclient = mock.MagicMock()
self.mock_remotesshclient.is_windows.return_value = False
def test_system_info(self):
self.mock_remotesshclient.execute.return_value = {
'exit_code': 0,
'stdout': "{}",
'stderr': ""
}
ohai_solo.system_info(self.mock_remotesshclient)
self.mock_remotesshclient.execute.assert_called_with(
"unset GEM_CACHE GEM_HOME GEM_PATH && "
"sudo /opt/ohai-solo/bin/ohai-solo",
escalate=True, allow_many=False)
def test_system_info_with_install_dir(self):
self.mock_remotesshclient.execute.return_value = {
'exit_code': 0,
'stdout': "{}",
'stderr': ""
}
ohai_solo.system_info(self.mock_remotesshclient,
install_dir='/home/user')
self.mock_remotesshclient.execute.assert_called_with(
"unset GEM_CACHE GEM_HOME GEM_PATH && "
"sudo /home/user/ohai-solo/bin/ohai-solo",
escalate=True, allow_many=False)
def test_system_info_with_install_dir_with_spaces(self):
self.mock_remotesshclient.execute.return_value = {
'exit_code': 0,
'stdout': "{}",
'stderr': ""
}
ohai_solo.system_info(self.mock_remotesshclient,
install_dir='/sys/omg * " lol/')
self.mock_remotesshclient.execute.assert_called_with(
"unset GEM_CACHE GEM_HOME GEM_PATH && "
'sudo \'/sys/omg * " lol//ohai-solo/bin/ohai-solo\'',
escalate=True, allow_many=False)
def test_system_info_with_motd(self):
self.mock_remotesshclient.execute.return_value = {
'exit_code': 0,
'stdout': "Hello world\n {}",
'stderr': ""
}
ohai_solo.system_info(self.mock_remotesshclient)
self.mock_remotesshclient.execute.assert_called_with(
"unset GEM_CACHE GEM_HOME GEM_PATH && "
"sudo /opt/ohai-solo/bin/ohai-solo",
escalate=True, allow_many=False)
def test_system_info_bad_json(self):
self.mock_remotesshclient.execute.return_value = {
'exit_code': 0,
'stdout': "{Not JSON!}",
'stderr': ""
}
self.assertRaises(errors.SystemInfoNotJson, ohai_solo.system_info,
self.mock_remotesshclient)
def test_system_info_missing_json(self):
self.mock_remotesshclient.execute.return_value = {
'exit_code': 0,
'stdout': "No JSON!",
'stderr': ""
}
self.assertRaises(errors.SystemInfoMissingJson, ohai_solo.system_info,
self.mock_remotesshclient)
def test_system_info_command_not_found(self):
self.mock_remotesshclient.execute.return_value = {
'exit_code': 1,
'stdout': "",
'stderr': "ohai-solo command not found"
}
self.assertRaises(errors.SystemInfoCommandMissing,
ohai_solo.system_info, self.mock_remotesshclient)
def test_system_info_could_not_find(self):
self.mock_remotesshclient.execute.return_value = {
'exit_code': 1,
'stdout': "",
'stderr': "Could not find ohai-solo."
}
self.assertRaises(errors.SystemInfoCommandMissing,
ohai_solo.system_info, self.mock_remotesshclient)
if __name__ == "__main__":
unittest.main()


@@ -1,85 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Test PoSh-Ohai Plugin."""
import doctest
import unittest
import mock
from satori import errors
from satori.sysinfo import posh_ohai
from satori.tests import utils
def load_tests(loader, tests, ignore):
"""Include doctests as unit tests."""
tests.addTests(doctest.DocTestSuite(posh_ohai))
return tests
class TestSystemInfo(utils.TestCase):
def setUp(self):
super(TestSystemInfo, self).setUp()
self.client = mock.MagicMock()
self.client.is_windows.return_value = True
def test_system_info(self):
self.client.execute.return_value = "{}"
posh_ohai.system_info(self.client)
self.client.execute.assert_called_with("Import-Module -Name Posh-Ohai;"
"Get-ComputerConfiguration")
def test_system_info_json(self):
self.client.execute.return_value = '{"foo": 123}'
self.assertEqual(posh_ohai.system_info(self.client), {'foo': 123})
def test_system_info_json_with_motd(self):
self.client.execute.return_value = "Hello world\n {}"
self.assertEqual(posh_ohai.system_info(self.client), {})
def test_system_info_xml(self):
valid_xml = '''<Objects>
<Object>
<Property Name="Key">platform_family</Property>
<Property Name="Value">Windows</Property>
</Object>
</Objects>'''
self.client.execute.return_value = valid_xml
self.assertEqual(posh_ohai.system_info(self.client),
{'platform_family': 'Windows'})
def test_system_info_bad_json(self):
self.client.execute.return_value = "{Not JSON!}"
self.assertRaises(errors.SystemInfoInvalid,
posh_ohai.system_info, self.client)
def test_system_info_bad_xml(self):
self.client.execute.return_value = "<foo><bar>"
self.assertRaises(errors.SystemInfoInvalid,
posh_ohai.system_info, self.client)
def test_system_info_bad_xml_structure(self):
self.client.execute.return_value = "<foo>bad structure</foo>"
self.assertRaises(errors.SystemInfoInvalid,
posh_ohai.system_info, self.client)
def test_system_info_invalid(self):
self.client.execute.return_value = "No JSON and not XML!"
self.assertRaises(errors.SystemInfoInvalid,
posh_ohai.system_info, self.client)
if __name__ == "__main__":
unittest.main()


@@ -1,131 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Tests for utils module."""
import datetime
import time
import unittest
import mock
from satori import utils
class SomeTZ(datetime.tzinfo):
"""A random timezone."""
def utcoffset(self, dt):
return datetime.timedelta(minutes=45)
def tzname(self, dt):
return "STZ"
def dst(self, dt):
return datetime.timedelta(0)
class TestTimeUtils(unittest.TestCase):
"""Test time formatting functions."""
def test_get_formatted_time_string(self):
some_time = time.gmtime(0)
with mock.patch.object(utils.time, 'gmtime') as mock_gmt:
mock_gmt.return_value = some_time
result = utils.get_time_string()
self.assertEqual(result, "1970-01-01 00:00:00 +0000")
def test_get_formatted_time_string_time_struct(self):
result = utils.get_time_string(time_obj=time.gmtime(0))
self.assertEqual(result, "1970-01-01 00:00:00 +0000")
def test_get_formatted_time_string_datetime(self):
result = utils.get_time_string(
time_obj=datetime.datetime(1970, 2, 1, 1, 2, 3, 0))
self.assertEqual(result, "1970-02-01 01:02:03 +0000")
def test_get_formatted_time_string_datetime_tz(self):
result = utils.get_time_string(
time_obj=datetime.datetime(1970, 2, 1, 1, 2, 3, 0, SomeTZ()))
self.assertEqual(result, "1970-02-01 01:47:03 +0000")
def test_parse_time_string(self):
result = utils.parse_time_string("1970-02-01 01:02:03 +0000")
self.assertEqual(result, datetime.datetime(1970, 2, 1, 1, 2, 3, 0))
def test_parse_time_string_with_tz(self):
result = utils.parse_time_string("1970-02-01 01:02:03 +1000")
self.assertEqual(result, datetime.datetime(1970, 2, 1, 11, 2, 3, 0))
class TestGetSource(unittest.TestCase):
def setUp(self):
self.function_signature = "def get_my_source_oneline_docstring(self):"
self.function_oneline_docstring = '"""A beautiful docstring."""'
self.function_multiline_docstring = ('"""A beautiful docstring.\n\n'
'Is a terrible thing to '
'waste.\n"""')
self.function_body = ['the_problem = "not the problem"',
'return the_problem']
def get_my_source_oneline_docstring(self):
"""A beautiful docstring."""
the_problem = "not the problem"
return the_problem
def get_my_source_multiline_docstring(self):
"""A beautiful docstring.
Is a terrible thing to waste.
"""
the_problem = "not the problem"
return the_problem
def test_get_source(self):
nab = utils.get_source_body(self.get_my_source_oneline_docstring)
self.assertEqual("\n".join(self.function_body), nab)
def test_get_source_with_docstring(self):
nab = utils.get_source_body(self.get_my_source_oneline_docstring,
with_docstring=True)
copy = self.function_oneline_docstring + "\n" + "\n".join(
self.function_body)
self.assertEqual(copy, nab)
def test_get_source_with_multiline_docstring(self):
nab = utils.get_source_body(self.get_my_source_multiline_docstring,
with_docstring=True)
copy = (self.function_multiline_docstring + "\n" + "\n".join(
self.function_body))
self.assertEqual(copy, nab)
def test_get_definition(self):
nab = utils.get_source_definition(
self.get_my_source_oneline_docstring)
copy = "%s\n \n %s" % (self.function_signature,
"\n ".join(self.function_body))
self.assertEqual(copy, nab)
def test_get_definition_with_docstring(self):
nab = utils.get_source_definition(
self.get_my_source_oneline_docstring, with_docstring=True)
copy = "%s\n %s\n %s" % (self.function_signature,
self.function_oneline_docstring,
"\n ".join(self.function_body))
self.assertEqual(copy, nab)
if __name__ == '__main__':
unittest.main()


@@ -1,60 +0,0 @@
# Copyright 2012-2013 OpenStack Foundation
# Copyright 2013 Nebula Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
import os
import sys
import fixtures
import testtools
class TestCase(testtools.TestCase):
def setUp(self):
testtools.TestCase.setUp(self)
if (os.environ.get("OS_STDOUT_CAPTURE") == "True" or
os.environ.get("OS_STDOUT_CAPTURE") == "1"):
stdout = self.useFixture(fixtures.StringStream("stdout")).stream
self.useFixture(fixtures.MonkeyPatch("sys.stdout", stdout))
if (os.environ.get("OS_STDERR_CAPTURE") == "True" or
os.environ.get("OS_STDERR_CAPTURE") == "1"):
stderr = self.useFixture(fixtures.StringStream("stderr")).stream
self.useFixture(fixtures.MonkeyPatch("sys.stderr", stderr))
# Python 2.6 lacks assertDictEqual (and a helpful assertIsInstance), so provide them
if tuple(sys.version_info)[0:2] < (2, 7):
def assertIsInstance(self, obj, cls, msg=None):
"""Same as self.assertTrue(isinstance(obj, cls)), with a nicer
default message
"""
if not isinstance(obj, cls):
standardMsg = '%s is not an instance of %r' % (obj, cls)
self.fail(self._formatMessage(msg, standardMsg))
def assertDictEqual(self, d1, d2, msg=None):
# Simple version taken from 2.7
self.assertIsInstance(d1, dict,
'First argument is not a dictionary')
self.assertIsInstance(d2, dict,
'Second argument is not a dictionary')
if d1 != d2:
if msg:
self.fail(msg)
else:
standardMsg = '%r != %r' % (d1, d2)
self.fail(standardMsg)


@@ -1,167 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""SSH tunneling module.
Set up a forward tunnel across an SSH server, using paramiko. A local port
(given with -p) is forwarded across an SSH session to an address:port from
the SSH server. This is similar to the openssh -L option.
"""
try:
import eventlet
eventlet.monkey_patch()
from eventlet.green import threading
from eventlet.green import time
except ImportError:
import threading
import time
import logging
import select
import socket
try:
import SocketServer
except ImportError:
import socketserver as SocketServer
import paramiko
LOG = logging.getLogger(__name__)
class TunnelServer(SocketServer.ThreadingTCPServer):
"""Serve on a local ephemeral port.
Clients will connect to that port/server.
"""
daemon_threads = True
allow_reuse_address = True
class TunnelHandler(SocketServer.BaseRequestHandler):
"""Handle forwarding of packets."""
def handle(self):
"""Do all the work required to service a request.
The request is available as self.request, the client address as
self.client_address, and the server instance as self.server, in
case it needs to access per-server information.
This implementation will forward packets.
"""
try:
chan = self.ssh_transport.open_channel('direct-tcpip',
self.target_address,
self.request.getpeername())
except Exception as exc:
LOG.error('Incoming request to %s:%s failed',
self.target_address[0],
self.target_address[1],
exc_info=exc)
return
if chan is None:
LOG.error('Incoming request to %s:%s was rejected '
'by the SSH server.',
self.target_address[0],
self.target_address[1])
return
while True:
r, w, x = select.select([self.request, chan], [], [])
if self.request in r:
data = self.request.recv(1024)
if len(data) == 0:
break
chan.send(data)
if chan in r:
data = chan.recv(1024)
if len(data) == 0:
break
self.request.send(data)
try:
peername = None
peername = str(self.request.getpeername())
except socket.error as exc:
LOG.warning("Couldn't fetch peername.", exc_info=exc)
chan.close()
self.request.close()
LOG.info("Tunnel closed from '%s'", peername or 'unnamed peer')
class Tunnel(object): # pylint: disable=R0902
"""Create a TCP server which will use TunnelHandler."""
def __init__(self, target_host, target_port,
sshclient, tunnel_host='localhost',
tunnel_port=0):
"""Constructor."""
if not isinstance(sshclient, paramiko.SSHClient):
raise TypeError("'sshclient' must be an instance of "
"paramiko.SSHClient.")
self.target_host = target_host
self.target_port = target_port
self.target_address = (target_host, target_port)
self.address = (tunnel_host, tunnel_port)
self._tunnel = None
self._tunnel_thread = None
self.sshclient = sshclient
self._ssh_transport = self.get_sshclient_transport(
self.sshclient)
TunnelHandler.target_address = self.target_address
TunnelHandler.ssh_transport = self._ssh_transport
self._tunnel = TunnelServer(self.address, TunnelHandler)
# reset attribute to the port it has actually been set to
self.address = self._tunnel.server_address
tunnel_host, self.tunnel_port = self.address
def get_sshclient_transport(self, sshclient):
"""Get the sshclient's transport.
Connect the sshclient, that has been passed in and return its
transport.
"""
sshclient.connect()
return sshclient.get_transport()
def serve_forever(self, async_=True):
"""Serve the tunnel forever.
If async_ is True, this will be done in a background thread.
('async' is a reserved word as of Python 3.7, hence the underscore.)
"""
if not async_:
self._tunnel.serve_forever()
else:
self._tunnel_thread = threading.Thread(
target=self._tunnel.serve_forever)
self._tunnel_thread.start()
# cooperative yield
time.sleep(0)
def shutdown(self):
"""Stop serving the tunnel.
Also close the socket.
"""
self._tunnel.shutdown()
self._tunnel.socket.close()
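The heart of `TunnelHandler.handle` is a `select`-based pump that shuttles bytes between the client socket and the SSH channel until either side closes. The same loop can be exercised without paramiko or a network; this is a minimal sketch using plain local socket pairs in place of the request/channel objects, not satori's actual tunnel:

```python
import select
import socket
import threading

def forward(sock_a, sock_b):
    """Pump bytes between two sockets until either side closes,
    mirroring the select loop in TunnelHandler.handle."""
    while True:
        readable, _, _ = select.select([sock_a, sock_b], [], [])
        if sock_a in readable:
            data = sock_a.recv(1024)
            if not data:
                break
            sock_b.send(data)
        if sock_b in readable:
            data = sock_b.recv(1024)
            if not data:
                break
            sock_a.send(data)
    sock_a.close()
    sock_b.close()

# Wire two socket pairs together: client <-> forwarder <-> target.
client, client_end = socket.socketpair()
target_end, target = socket.socketpair()
pump = threading.Thread(target=forward, args=(client_end, target_end))
pump.start()
client.send(b"ping")
assert target.recv(1024) == b"ping"
target.send(b"pong")
assert client.recv(1024) == b"pong"
client.close()
pump.join()
```

In the real handler, `sock_b` is a paramiko `direct-tcpip` channel rather than a socket, but both expose `recv`/`send` and are selectable, which is what makes the shared loop work.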


@@ -1,221 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""General utilities.
- Class and module import/export
- Time utilities (we standardize on UTC)
"""
import datetime
import inspect
import logging
import platform
import socket
import sys
import time
import iso8601
LOG = logging.getLogger(__name__)
STRING_FORMAT = "%Y-%m-%d %H:%M:%S +0000"
def import_class(import_str):
"""Return a class from a string including module and class."""
mod_str, _, class_str = import_str.rpartition('.')
try:
__import__(mod_str)
return getattr(sys.modules[mod_str], class_str)
except (ImportError, ValueError, AttributeError) as exc:
LOG.debug('Inner Exception: %s', exc)
raise
def import_object(import_str, *args, **kw):
"""Return an object including a module or module and class."""
try:
__import__(import_str)
return sys.modules[import_str]
except ImportError:
cls = import_class(import_str)
return cls(*args, **kw)
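The dotted-path resolution used above can be seen in isolation; this is a stdlib-only sketch of the same `rpartition` + `__import__` + `getattr` technique, not a call into satori itself:

```python
import sys

def import_class(import_str):
    """Resolve a 'module.submodule.ClassName' string to the class object."""
    mod_str, _, class_str = import_str.rpartition('.')
    __import__(mod_str)
    return getattr(sys.modules[mod_str], class_str)

cls = import_class("collections.OrderedDict")
assert cls.__name__ == "OrderedDict"
od = cls(a=1)
assert od["a"] == 1
```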
def get_time_string(time_obj=None):
"""The canonical time string format (in UTC).
:param time_obj: an optional datetime.datetime or timestruct (defaults to
gm_time)
Note: Changing this function will change all times that this project uses
in the returned data.
"""
if isinstance(time_obj, datetime.datetime):
if time_obj.tzinfo:
offset = time_obj.tzinfo.utcoffset(time_obj)
utc_dt = time_obj + offset
return datetime.datetime.strftime(utc_dt, STRING_FORMAT)
return datetime.datetime.strftime(time_obj, STRING_FORMAT)
elif isinstance(time_obj, time.struct_time):
return time.strftime(STRING_FORMAT, time_obj)
elif time_obj is not None:
raise TypeError("get_time_string takes only a time_struct, None, or a "
"datetime. It was given a %s" % type(time_obj))
return time.strftime(STRING_FORMAT, time.gmtime())
def parse_time_string(time_string):
"""Return naive datetime object from string in standard time format."""
parsed = time_string.replace(" +", "+").replace(" -", "-")
dt_with_tz = iso8601.parse_date(parsed)
offset = dt_with_tz.tzinfo.utcoffset(dt_with_tz)
result = dt_with_tz + offset
return result.replace(tzinfo=None)
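For naive datetimes the canonical format round-trips with the standard library alone; this sketch mirrors the `get_time_string`/`parse_time_string` pair without the iso8601 dependency and is illustrative only:

```python
import datetime

STRING_FORMAT = "%Y-%m-%d %H:%M:%S +0000"

dt = datetime.datetime(1970, 2, 1, 1, 2, 3)
canonical = dt.strftime(STRING_FORMAT)
assert canonical == "1970-02-01 01:02:03 +0000"

# strptime treats the trailing "+0000" as literal text, so this
# yields a naive datetime already expressed in UTC.
parsed = datetime.datetime.strptime(canonical, STRING_FORMAT)
assert parsed == dt
```

The iso8601 library is only needed when the incoming string may carry a non-zero offset, as in `parse_time_string`.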
def is_valid_ipv4_address(address):
"""Check if the address supplied is a valid IPv4 address."""
try:
socket.inet_pton(socket.AF_INET, address)
except AttributeError: # no inet_pton here, sorry
try:
socket.inet_aton(address)
except socket.error:
return False
return address.count('.') == 3
except socket.error: # not a valid address
return False
return True
def is_valid_ipv6_address(address):
"""Check if the address supplied is a valid IPv6 address."""
try:
socket.inet_pton(socket.AF_INET6, address)
except socket.error: # not a valid address
return False
return True
def is_valid_ip_address(address):
"""Check if the address supplied is a valid IP address."""
return is_valid_ipv4_address(address) or is_valid_ipv6_address(address)
def get_local_ips():
"""Return local ipaddress(es)."""
# pylint: disable=W0703
list1 = []
list2 = []
defaults = ["127.0.0.1", r"fe80::1%lo0"]
hostname = None
try:
hostname = socket.gethostname()
except Exception as exc:
LOG.debug("Error in gethostname: %s", exc)
try:
_, _, addresses = socket.gethostbyname_ex(hostname)
list1 = [ip for ip in addresses]
except Exception as exc:
LOG.debug("Error in gethostbyname_ex: %s", exc)
try:
list2 = [info[4][0] for info in socket.getaddrinfo(hostname, None)]
except Exception as exc:
LOG.debug("Error in getaddrinfo: %s", exc)
return list(set(list1 + list2 + defaults))
def get_platform_info():
"""Return a dictionary with distro, version, and system architecture.
Requires >= Python 2.4 (2004)
Supports most Linux distros, Mac OSX, and Windows.
Example return value on Mac OSX:
{'arch': '64bit', 'version': '10.8.5', 'dist': 'darwin'}
"""
pin = list(platform.dist() + (platform.machine(),))
pinfodict = {'dist': pin[0], 'version': pin[1], 'arch': pin[3]}
if not pinfodict['dist'] or not pinfodict['version']:
pinfodict['dist'] = sys.platform.lower()
pinfodict['arch'] = platform.architecture()[0]
if 'darwin' in pinfodict['dist']:
pinfodict['version'] = platform.mac_ver()[0]
elif pinfodict['dist'].startswith('win'):
pinfodict['version'] = str(platform.platform())
return pinfodict
def get_source_definition(function, with_docstring=False):
"""Get the entire body of a function, including the signature line.
:param with_docstring: Include docstring in return value.
Default is False. Supports docstrings in
triple double-quotes or triple single-quotes.
"""
thedoc = inspect.getdoc(function)
definition = inspect.cleandoc(
inspect.getsource(function))
if thedoc and not with_docstring:
definition = definition.replace(thedoc, '')
doublequotes = definition.find('"""')
doublequotes = float("inf") if doublequotes == -1 else doublequotes
singlequotes = definition.find("'''")
singlequotes = float("inf") if singlequotes == -1 else singlequotes
if doublequotes != singlequotes:
triplet = '"""' if doublequotes < singlequotes else "'''"
definition = definition.replace(triplet, '', 2)
while definition.find('\n\n\n') != -1:
definition = definition.replace('\n\n\n', '\n\n')
definition_copy = []
for line in definition.split('\n'):
# pylint: disable=W0141
if not any(map(line.strip().startswith, ("@", "def"))):
line = " " * 4 + line
definition_copy.append(line)
return "\n".join(definition_copy).strip()
def get_source_body(function, with_docstring=False):
"""Get the body of a function (i.e. no definition line, unindented).
:param with_docstring: Include docstring in return value.
Default is False.
"""
lines = get_source_definition(
function, with_docstring=with_docstring).split('\n')
# Find body - skip decorators and definition
start = 0
for number, line in enumerate(lines):
# pylint: disable=W0141
if any(map(line.strip().startswith, ("@", "def"))):
start = number + 1
lines = lines[start:]
# Unindent body
indent = len(lines[0]) - len(lines[0].lstrip())
for index, line in enumerate(lines):
lines[index] = line[indent:]
return '\n'.join(lines).strip()
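The inspect-based extraction these two helpers perform can be seen in miniature; this is a simplified sketch of the same `getsource`/`cleandoc` technique, not satori's full docstring and decorator handling:

```python
import inspect

def sample():
    """Docstring."""
    value = 40 + 2
    return value

# getsource returns the raw text; cleandoc normalizes indentation.
source = inspect.cleandoc(inspect.getsource(sample))
lines = source.split('\n')
# Drop the "def" line, then unindent whatever indent remains.
body = lines[1:]
indent = len(body[0]) - len(body[0].lstrip())
body = [line[indent:] for line in body]
assert body[0] == '"""Docstring."""'
assert body[-1] == "return value"
```

Note that `inspect.getsource` needs the defining module's source file on disk, which is why these utilities only work on functions imported from real files.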


@@ -1,43 +0,0 @@
[metadata]
name = satori
summary = OpenStack Configuration Discovery
description-file =
README.rst
author = OpenStack
author-email = openstack-dev@lists.openstack.org
home-page = http://wiki.openstack.org/Satori
classifier =
Environment :: OpenStack
Intended Audience :: Information Technology
Intended Audience :: System Administrators
License :: OSI Approved :: Apache Software License
Operating System :: POSIX :: Linux
Programming Language :: Python
Programming Language :: Python :: 2
Programming Language :: Python :: 2.7
Programming Language :: Python :: 2.6
Programming Language :: Python :: 3
Programming Language :: Python :: 3.3
Programming Language :: Python :: Implementation
Programming Language :: Python :: Implementation :: CPython
Programming Language :: Python :: Implementation :: PyPy
[files]
packages =
satori
[entry_points]
console_scripts =
satori = satori.shell:main
[build_sphinx]
source-dir = doc/source
build-dir = doc/build
all_files = 1
[upload_sphinx]
upload-dir = doc/build/html
[wheel]
universal = 1


@@ -1,22 +0,0 @@
#!/usr/bin/env python
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# THIS FILE IS MANAGED BY THE GLOBAL REQUIREMENTS REPO - DO NOT EDIT
import setuptools

setuptools.setup(
    setup_requires=['pbr'],
    pbr=True)

View File

@ -1,15 +0,0 @@
# Includes fixes for handling egg-linked libraries
# and detection of stdlibs in virtualenvs
-e git://github.com/samstav/hacking.git@satori#egg=hacking
coverage>=3.6
discover
flake8_docstrings>=0.2.0 # patched for py33
fixtures>=0.3.14
freezegun
mock>=1.0
pep8>=1.5.7,<1.6
pep257>=0.3.2 # patched for py33
sphinx>=1.2.2
testrepository>=0.0.17
testtools>=0.9.32

View File

@ -1,66 +0,0 @@
# Copyright 2013 OpenStack, LLC.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""
Installation script for satori's development virtualenv
"""
import os
import sys

import install_venv_common as install_venv


def print_help():
    help_text = """
satori development environment setup is complete.

satori development uses virtualenv to track and manage Python
dependencies while in development and testing.

To activate the satori virtualenv for the extent of your current
shell session you can run:

$ source .venv/bin/activate

Or, if you prefer, you can run commands in the virtualenv on a case by case
basis by running:

$ tools/with_venv.sh <your command>

Also, make test will automatically use the virtualenv.
"""
    print(help_text)


def main(argv):
    root = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
    venv = os.path.join(root, ".venv")
    pip_requires = os.path.join(root, "requirements.txt")
    test_requires = os.path.join(root, "test-requirements.txt")
    py_version = "python%s.%s" % (sys.version_info[0], sys.version_info[1])
    project = "satori"
    install = install_venv.InstallVenv(root, venv, pip_requires, test_requires,
                                       py_version, project)
    options = install.parse_args(argv)
    install.check_python_version()
    install.check_dependencies()
    install.create_virtualenv(no_site_packages=options.no_site_packages)
    install.install_dependencies()
    print_help()


if __name__ == "__main__":
    main(sys.argv)
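The double `dirname()` in `main()` climbs two levels, from `tools/install_venv.py` to the repository root (the real code resolves `__file__` with `realpath()` first). A quick sketch with a hypothetical path:

```python
import os.path

# Hypothetical path: the script lives in <root>/tools/, so two
# dirname() calls yield the repository root.
script = '/repo/tools/install_venv.py'
root = os.path.dirname(os.path.dirname(script))
print(root)  # /repo
```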

View File

@ -1,174 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright 2013 OpenStack Foundation
# Copyright 2013 IBM Corp.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Provides methods needed by installation script for OpenStack development
virtual environments.

Since this script is used to bootstrap a virtualenv from the system's Python
environment, it should be kept strictly compatible with Python 2.6.

Synced in from openstack-common
"""

from __future__ import print_function

import optparse
import os
import subprocess
import sys


class InstallVenv(object):

    def __init__(self, root, venv, requirements,
                 test_requirements, py_version,
                 project):
        self.root = root
        self.venv = venv
        self.requirements = requirements
        self.test_requirements = test_requirements
        self.py_version = py_version
        self.project = project

    def die(self, message, *args):
        print(message % args, file=sys.stderr)
        sys.exit(1)

    def check_python_version(self):
        if sys.version_info < (2, 6):
            self.die("Need Python Version >= 2.6")

    def run_command_with_code(self, cmd, redirect_output=True,
                              check_exit_code=True):
        """Runs a command in an out-of-process shell.

        Returns the output of that command. Working directory is self.root.
        """
        if redirect_output:
            stdout = subprocess.PIPE
        else:
            stdout = None

        proc = subprocess.Popen(cmd, cwd=self.root, stdout=stdout)
        output = proc.communicate()[0]
        if check_exit_code and proc.returncode != 0:
            self.die('Command "%s" failed.\n%s', ' '.join(cmd), output)
        return (output, proc.returncode)

    def run_command(self, cmd, redirect_output=True, check_exit_code=True):
        return self.run_command_with_code(cmd, redirect_output,
                                          check_exit_code)[0]

    def get_distro(self):
        if (os.path.exists('/etc/fedora-release') or
                os.path.exists('/etc/redhat-release')):
            return Fedora(
                self.root, self.venv, self.requirements,
                self.test_requirements, self.py_version, self.project)
        else:
            return Distro(
                self.root, self.venv, self.requirements,
                self.test_requirements, self.py_version, self.project)

    def check_dependencies(self):
        self.get_distro().install_virtualenv()

    def create_virtualenv(self, no_site_packages=True):
        """Creates the virtual environment and installs PIP.

        Creates the virtual environment and installs PIP only into the
        virtual environment.
        """
        if not os.path.isdir(self.venv):
            print('Creating venv...', end=' ')
            if no_site_packages:
                self.run_command(['virtualenv', '-q', '--no-site-packages',
                                  self.venv])
            else:
                self.run_command(['virtualenv', '-q', self.venv])
            print('done.')
        else:
            print("venv already exists...")
            pass

    def pip_install(self, *args):
        self.run_command(['tools/with_venv.sh',
                          'pip', 'install', '--upgrade'] + list(args),
                         redirect_output=False)

    def install_dependencies(self):
        print('Installing dependencies with pip (this can take a while)...')

        # First things first, make sure our venv has the latest pip and
        # setuptools and pbr
        self.pip_install('pip>=1.4')
        self.pip_install('setuptools')
        self.pip_install('pbr')

        self.pip_install('-r', self.requirements, '-r', self.test_requirements)

    def parse_args(self, argv):
        """Parses command-line arguments."""
        parser = optparse.OptionParser()
        parser.add_option('-n', '--no-site-packages',
                          action='store_true',
                          help="Do not inherit packages from global Python "
                               "install")
        return parser.parse_args(argv[1:])[0]


class Distro(InstallVenv):

    def check_cmd(self, cmd):
        return bool(self.run_command(['which', cmd],
                                     check_exit_code=False).strip())

    def install_virtualenv(self):
        if self.check_cmd('virtualenv'):
            return

        if self.check_cmd('easy_install'):
            print('Installing virtualenv via easy_install...', end=' ')
            if self.run_command(['easy_install', 'virtualenv']):
                print('Succeeded')
                return
            else:
                print('Failed')

        self.die('ERROR: virtualenv not found.\n\n%s development'
                 ' requires virtualenv, please install it using your'
                 ' favorite package management tool' % self.project)


class Fedora(Distro):
    """This covers all Fedora-based distributions.

    Includes: Fedora, RHEL, CentOS, Scientific Linux
    """

    def check_pkg(self, pkg):
        return self.run_command_with_code(['rpm', '-q', pkg],
                                          check_exit_code=False)[1] == 0

    def install_virtualenv(self):
        if self.check_cmd('virtualenv'):
            return

        if not self.check_pkg('python-virtualenv'):
            self.die("Please install 'python-virtualenv'.")

        super(Fedora, self).install_virtualenv()
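`run_command_with_code` boils down to the standard Popen capture-and-check pattern; a minimal self-contained sketch (using `sys.executable` so it runs anywhere):

```python
import subprocess
import sys

# Run a child process, capture its stdout, and inspect the return code,
# mirroring run_command_with_code above (minus the cwd and die() logic).
proc = subprocess.Popen([sys.executable, '-c', 'print("hi")'],
                        stdout=subprocess.PIPE)
output = proc.communicate()[0]
print(output.decode().strip())  # hi
print(proc.returncode)          # 0
```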

View File

@ -1,4 +0,0 @@
#!/bin/bash
TOOLS=`dirname $0`
VENV=$TOOLS/../.venv
source $VENV/bin/activate && "$@"
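The wrapper hands its arguments on to the activated shell via `$@`; quoting that expansion as `"$@"` is what preserves arguments containing spaces, since unquoted `$@` re-splits them on whitespace. A quick illustration:

```shell
# "$@" expands to each original argument as its own word;
# unquoted $@ would re-split "one two" into two words.
demo() { printf '%s\n' "$@"; }
demo "one two" three
```

This prints `one two` and `three` on separate lines (two words, not three).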

52
tox.ini
View File

@ -1,52 +0,0 @@
[tox]
minversion = 1.6
envlist = py26,py27,py33,py34,pep8,pypy
skipsdist = True

[testenv]
usedevelop = True
install_command = pip install -U {opts} {packages}
setenv =
    VIRTUAL_ENV={envdir}
deps = -r{toxinidir}/requirements.txt
       -r{toxinidir}/test-requirements.txt
commands = python setup.py testr --testr-args='{posargs}'

[testenv:py26]
setenv =
    VIRTUAL_ENV={envdir}
    CFLAGS=-Qunused-arguments
    CPPFLAGS=-Qunused-arguments

[testenv:py33]
deps = -r{toxinidir}/requirements-py3.txt
       -r{toxinidir}/test-requirements.txt

[testenv:py34]
deps = -r{toxinidir}/requirements-py3.txt
       -r{toxinidir}/test-requirements.txt

[testenv:pep8]
commands = flake8

[testenv:venv]
commands = {posargs}

[testenv:cover]
commands = python setup.py test --coverage --testr-args='^(?!.*test.*coverage).*$'

[testenv:docs]
deps =
    -r{toxinidir}/requirements.txt
    -r{toxinidir}/test-requirements.txt
    sphinxcontrib-httpdomain
commands = python setup.py build_sphinx

[tox:jenkins]
downloadcache = ~/cache/pip

[flake8]
ignore = H102
show-source = True
exclude = .venv,.git,.tox,dist,doc,*openstack/common*,*lib/python*,*egg,build,tools,*satori/contrib*,*.ropeproject,*satori/tests*,setup.py
max-complexity = 16
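The testr filter in `[testenv:cover]` uses a negative lookahead, presumably to exclude the coverage-related tests themselves from the coverage run; how that regex behaves:

```python
import re

# '^(?!.*test.*coverage).*$' matches any test id that does NOT contain
# "test" followed later in the string by "coverage".
pattern = re.compile(r'^(?!.*test.*coverage).*$')
print(bool(pattern.match('satori.tests.test_shell')))     # True
print(bool(pattern.match('satori.tests.test_coverage')))  # False
```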