Initial add
10
CHANGES
Normal file
@@ -0,0 +1,10 @@
0.4.2 (2012-03-27)
------------------
- Add default attribute mappings

0.4.1 (2012-03-18)
------------------
- Auto-sign authentication and logout requests according to the config options.
- Add backwards compatibility with ElementTree in Python < 2.7.
- Fix minor bugs in the tests.
- Support one more nameid format.
27
INSTALL
Normal file
@@ -0,0 +1,27 @@
You need repoze.who to get the examples working; it can be installed through
easy_install:

easy_install "repoze.who==1.0.16"

!! 2.0 or newer is missing the form plugin, which is used in some instances.

Or install it from the PyPI site if you prefer to do it that way.
Likewise for pyasn1.

You should get the latest version, which right now is 1.0.18.

You also need xmlsec, which you can find here:

http://www.aleksey.com/xmlsec/

You may also need:

mako
memcached
python-memcache

Apart from that, a normal

python setup.py install

will install the package.
6
MANIFEST.in
Normal file
@@ -0,0 +1,6 @@
include INSTALL
include README
include TODO
recursive-include tests *
recursive-include example *
recursive-include doc *
57
README
Normal file
@@ -0,0 +1,57 @@
README for PySAML2
==================

Dependencies
------------
PySAML2 should be compatible with any Python >= 2.6, but not 3.X yet.
To be able to sign/verify and encrypt/decrypt you need xmlsec1.
The repoze stuff works best together with repoze.who.

* http://www.aleksey.com/xmlsec/
* http://static.repoze.org/whodocs/

Install
-------
You need repoze.who to get the examples working; it can be installed through
easy_install:

easy_install repoze.who

Or install it from the PyPI site if you prefer to do it that way.
You should get the latest version, which right now is 1.0.18.

You also need xmlsec, which you can find here:

http://www.aleksey.com/xmlsec/

Apart from that, a normal

python setup.py install

will install the package.

Documentation
-------------
Look in the doc/ subdirectory.

Comments, support, bug reports
------------------------------

Project page:

https://code.launchpad.net/~roland-hedberg/pysaml2/main

Use the Pysaml2@uma.es mailing list. Since we do not have a
publicly available bug tracker yet, bug reports should be emailed
there too.

You can subscribe to this mailing list at
http://delfos.sci.uma.es/mailman/listinfo/pysaml2

Archives are available at
http://delfos.sci.uma.es/mailman/private/pysaml2/

Contributors
------------
* Roland Hedberg: main author / maintainer
* Lorenzo Gil Sanchez: Django integration
15
README.rst
Normal file
@@ -0,0 +1,15 @@

*************************
PySAML2 - SAML2 in Python
*************************

:Author: Roland Hedberg
:Version: 0.3

PySAML2 is a pure Python implementation of a SAML2 service provider and, to
some extent, also of an identity provider. Originally written to work in a WSGI
environment, there are extensions that allow you to use it with other
frameworks.
3
TODO
Normal file
@@ -0,0 +1,3 @@
1. Write documentation.
2. Write unit tests for signature-related utility methods.
3. Complete the saml2 message class.
88
doc/Makefile
Normal file
@@ -0,0 +1,88 @@
|
||||
# Makefile for Sphinx documentation
|
||||
#
|
||||
|
||||
# You can set these variables from the command line.
|
||||
SPHINXOPTS =
|
||||
SPHINXBUILD = sphinx-build
|
||||
PAPER =
|
||||
|
||||
# Internal variables.
|
||||
PAPEROPT_a4 = -D latex_paper_size=a4
|
||||
PAPEROPT_letter = -D latex_paper_size=letter
|
||||
ALLSPHINXOPTS = -d _build/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
|
||||
|
||||
.PHONY: help clean html dirhtml pickle json htmlhelp qthelp latex changes linkcheck doctest
|
||||
|
||||
help:
|
||||
@echo "Please use \`make <target>' where <target> is one of"
|
||||
@echo " html to make standalone HTML files"
|
||||
@echo " dirhtml to make HTML files named index.html in directories"
|
||||
@echo " pickle to make pickle files"
|
||||
@echo " json to make JSON files"
|
||||
@echo " htmlhelp to make HTML files and a HTML help project"
|
||||
@echo " qthelp to make HTML files and a qthelp project"
|
||||
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
|
||||
@echo " changes to make an overview of all changed/added/deprecated items"
|
||||
@echo " linkcheck to check all external links for integrity"
|
||||
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
|
||||
|
||||
clean:
|
||||
-rm -rf _build/*
|
||||
|
||||
html:
|
||||
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) _build/html
|
||||
@echo
|
||||
@echo "Build finished. The HTML pages are in _build/html."
|
||||
|
||||
dirhtml:
|
||||
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) _build/dirhtml
|
||||
@echo
|
||||
@echo "Build finished. The HTML pages are in _build/dirhtml."
|
||||
|
||||
pickle:
|
||||
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) _build/pickle
|
||||
@echo
|
||||
@echo "Build finished; now you can process the pickle files."
|
||||
|
||||
json:
|
||||
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) _build/json
|
||||
@echo
|
||||
@echo "Build finished; now you can process the JSON files."
|
||||
|
||||
htmlhelp:
|
||||
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) _build/htmlhelp
|
||||
@echo
|
||||
@echo "Build finished; now you can run HTML Help Workshop with the" \
|
||||
".hhp project file in _build/htmlhelp."
|
||||
|
||||
qthelp:
|
||||
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) _build/qthelp
|
||||
@echo
|
||||
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
|
||||
".qhcp project file in _build/qthelp, like this:"
|
||||
@echo "# qcollectiongenerator _build/qthelp/pysaml2.qhcp"
|
||||
@echo "To view the help file:"
|
||||
@echo "# assistant -collectionFile _build/qthelp/pysaml2.qhc"
|
||||
|
||||
latex:
|
||||
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) _build/latex
|
||||
@echo
|
||||
@echo "Build finished; the LaTeX files are in _build/latex."
|
||||
@echo "Run \`make all-pdf' or \`make all-ps' in that directory to" \
|
||||
"run these through (pdf)latex."
|
||||
|
||||
changes:
|
||||
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) _build/changes
|
||||
@echo
|
||||
@echo "The overview file is in _build/changes."
|
||||
|
||||
linkcheck:
|
||||
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) _build/linkcheck
|
||||
@echo
|
||||
@echo "Link check complete; look for any errors in the above output " \
|
||||
"or in _build/linkcheck/output.txt."
|
||||
|
||||
doctest:
|
||||
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) _build/doctest
|
||||
@echo "Testing of doctests in the sources finished, look at the " \
|
||||
"results in _build/doctest/output.txt."
|
||||
18
doc/client.rst
Normal file
@@ -0,0 +1,18 @@
|
||||
.. _client:
|
||||
|
||||
***********************************************
|
||||
Classes representing Service Provider instances
|
||||
***********************************************
|
||||
|
||||
:Author: Roland Hedberg
|
||||
:Version: |version|
|
||||
|
||||
.. module:: Client
|
||||
:synopsis: Classes representing Service Provider instances.
|
||||
|
||||
Module
|
||||
==========
|
||||
|
||||
.. automodule:: saml2.client
|
||||
:members:
|
||||
|
||||
194
doc/conf.py
Normal file
@@ -0,0 +1,194 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# pysaml2 documentation build configuration file, created by
|
||||
# sphinx-quickstart on Mon Aug 24 08:13:41 2009.
|
||||
#
|
||||
# This file is execfile()d with the current directory set to its containing dir.
|
||||
#
|
||||
# Note that not all possible configuration values are present in this
|
||||
# autogenerated file.
|
||||
#
|
||||
# All configuration values have a default; values that are commented out
|
||||
# serve to show the default.
|
||||
|
||||
import sys, os
|
||||
|
||||
# If extensions (or modules to document with autodoc) are in another directory,
|
||||
# add these directories to sys.path here. If the directory is relative to the
|
||||
# documentation root, use os.path.abspath to make it absolute, like shown here.
|
||||
#sys.path.append(os.path.abspath('.'))
|
||||
|
||||
# -- General configuration -----------------------------------------------------
|
||||
|
||||
# Add any Sphinx extension module names here, as strings. They can be extensions
|
||||
# coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
|
||||
extensions = ['sphinx.ext.autodoc', 'sphinx.ext.doctest', 'sphinx.ext.coverage']
|
||||
|
||||
# Add any paths that contain templates here, relative to this directory.
|
||||
templates_path = ['_templates']
|
||||
|
||||
# The suffix of source filenames.
|
||||
source_suffix = '.rst'
|
||||
|
||||
# The encoding of source files.
|
||||
#source_encoding = 'utf-8'
|
||||
|
||||
# The master toctree document.
|
||||
master_doc = 'index'
|
||||
|
||||
# General information about the project.
|
||||
project = u'pysaml2'
|
||||
copyright = u'2010-2011, Roland Hedberg'
|
||||
|
||||
# The version info for the project you're documenting, acts as replacement for
|
||||
# |version| and |release|, also used in various other places throughout the
|
||||
# built documents.
|
||||
#
|
||||
# The short X.Y version.
|
||||
version = '0.4'
|
||||
# The full version, including alpha/beta/rc tags.
|
||||
release = '0.4.2'
|
||||
|
||||
# The language for content autogenerated by Sphinx. Refer to documentation
|
||||
# for a list of supported languages.
|
||||
#language = None
|
||||
|
||||
# There are two options for replacing |today|: either, you set today to some
|
||||
# non-false value, then it is used:
|
||||
#today = ''
|
||||
# Else, today_fmt is used as the format for a strftime call.
|
||||
#today_fmt = '%B %d, %Y'
|
||||
|
||||
# List of documents that shouldn't be included in the build.
|
||||
#unused_docs = []
|
||||
|
||||
# List of directories, relative to source directory, that shouldn't be searched
|
||||
# for source files.
|
||||
exclude_trees = ['_build']
|
||||
|
||||
# The reST default role (used for this markup: `text`) to use for all documents.
|
||||
#default_role = None
|
||||
|
||||
# If true, '()' will be appended to :func: etc. cross-reference text.
|
||||
#add_function_parentheses = True
|
||||
|
||||
# If true, the current module name will be prepended to all description
|
||||
# unit titles (such as .. function::).
|
||||
#add_module_names = True
|
||||
|
||||
# If true, sectionauthor and moduleauthor directives will be shown in the
|
||||
# output. They are ignored by default.
|
||||
#show_authors = False
|
||||
|
||||
# The name of the Pygments (syntax highlighting) style to use.
|
||||
pygments_style = 'sphinx'
|
||||
|
||||
# A list of ignored prefixes for module index sorting.
|
||||
#modindex_common_prefix = []
|
||||
|
||||
|
||||
# -- Options for HTML output ---------------------------------------------------
|
||||
|
||||
# The theme to use for HTML and HTML Help pages. Major themes that come with
|
||||
# Sphinx are currently 'default' and 'sphinxdoc'.
|
||||
html_theme = 'default'
|
||||
|
||||
# Theme options are theme-specific and customize the look and feel of a theme
|
||||
# further. For a list of options available for each theme, see the
|
||||
# documentation.
|
||||
#html_theme_options = {}
|
||||
|
||||
# Add any paths that contain custom themes here, relative to this directory.
|
||||
#html_theme_path = []
|
||||
|
||||
# The name for this set of Sphinx documents. If None, it defaults to
|
||||
# "<project> v<release> documentation".
|
||||
#html_title = None
|
||||
|
||||
# A shorter title for the navigation bar. Default is the same as html_title.
|
||||
#html_short_title = None
|
||||
|
||||
# The name of an image file (relative to this directory) to place at the top
|
||||
# of the sidebar.
|
||||
#html_logo = None
|
||||
|
||||
# The name of an image file (within the static path) to use as favicon of the
|
||||
# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
|
||||
# pixels large.
|
||||
#html_favicon = None
|
||||
|
||||
# Add any paths that contain custom static files (such as style sheets) here,
|
||||
# relative to this directory. They are copied after the builtin static files,
|
||||
# so a file named "default.css" will overwrite the builtin "default.css".
|
||||
html_static_path = ['_static']
|
||||
|
||||
# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
|
||||
# using the given strftime format.
|
||||
#html_last_updated_fmt = '%b %d, %Y'
|
||||
|
||||
# If true, SmartyPants will be used to convert quotes and dashes to
|
||||
# typographically correct entities.
|
||||
#html_use_smartypants = True
|
||||
|
||||
# Custom sidebar templates, maps document names to template names.
|
||||
#html_sidebars = {}
|
||||
|
||||
# Additional templates that should be rendered to pages, maps page names to
|
||||
# template names.
|
||||
#html_additional_pages = {}
|
||||
|
||||
# If false, no module index is generated.
|
||||
#html_use_modindex = True
|
||||
|
||||
# If false, no index is generated.
|
||||
#html_use_index = True
|
||||
|
||||
# If true, the index is split into individual pages for each letter.
|
||||
#html_split_index = False
|
||||
|
||||
# If true, links to the reST sources are added to the pages.
|
||||
#html_show_sourcelink = True
|
||||
|
||||
# If true, an OpenSearch description file will be output, and all pages will
|
||||
# contain a <link> tag referring to it. The value of this option must be the
|
||||
# base URL from which the finished HTML is served.
|
||||
#html_use_opensearch = ''
|
||||
|
||||
# If nonempty, this is the file name suffix for HTML files (e.g. ".xhtml").
|
||||
#html_file_suffix = ''
|
||||
|
||||
# Output file base name for HTML help builder.
|
||||
htmlhelp_basename = 'pysaml2doc'
|
||||
|
||||
|
||||
# -- Options for LaTeX output --------------------------------------------------
|
||||
|
||||
# The paper size ('letter' or 'a4').
|
||||
#latex_paper_size = 'letter'
|
||||
|
||||
# The font size ('10pt', '11pt' or '12pt').
|
||||
#latex_font_size = '10pt'
|
||||
|
||||
# Grouping the document tree into LaTeX files. List of tuples
|
||||
# (source start file, target name, title, author, documentclass [howto/manual]).
|
||||
latex_documents = [
|
||||
('index', 'pysaml2.tex', u'pysaml2 Documentation',
|
||||
u'Roland Hedberg', 'manual'),
|
||||
]
|
||||
|
||||
# The name of an image file (relative to this directory) to place at the top of
|
||||
# the title page.
|
||||
#latex_logo = None
|
||||
|
||||
# For "manual" documents, if this is true, then toplevel headings are parts,
|
||||
# not chapters.
|
||||
#latex_use_parts = False
|
||||
|
||||
# Additional stuff for the LaTeX preamble.
|
||||
#latex_preamble = ''
|
||||
|
||||
# Documents to append as an appendix to all manuals.
|
||||
#latex_appendices = []
|
||||
|
||||
# If false, no module index is generated.
|
||||
#latex_use_modindex = True
|
||||
4
doc/examples/idp.rst
Normal file
@@ -0,0 +1,4 @@
.. _example_idp:

An extremely simple example of a SAML2 identity provider
==========================================================
23
doc/examples/index.rst
Normal file
@@ -0,0 +1,23 @@
|
||||
.. _example_index:
|
||||
|
||||
These are examples of the usage of pySAML2!
|
||||
===========================================
|
||||
|
||||
:Release: |version|
|
||||
:Date: |today|
|
||||
|
||||
Contents:
|
||||
|
||||
.. toctree::
|
||||
:maxdepth: 1
|
||||
|
||||
sp
|
||||
idp
|
||||
|
||||
Indices and tables
|
||||
==================
|
||||
|
||||
* :ref:`genindex`
|
||||
* :ref:`modindex`
|
||||
* :ref:`search`
|
||||
|
||||
193
doc/examples/sp.rst
Normal file
@@ -0,0 +1,193 @@
|
||||
.. _example_sp:
|
||||
|
||||
An extremely simple example of a SAML2 service provider
=========================================================
|
||||
|
||||
How it works
|
||||
------------
|
||||
|
||||
An SP deals with authentication and possibly attribute aggregation.
Both of these functions can be seen as parts of the normal repoze.who
setup, namely the Challenger, Identifier and MetadataProvider parts.
|
||||
|
||||
The norm for repoze.who Identifier and MetadataProvider plugins is that
they place information in the WSGI environment.
|
||||
identity information in environ["repoze.who.identity"].
|
||||
This is a dictionary with keys like 'login', and 'repoze.who.userid'.
|
||||
|
||||
The SP follows this pattern and places the information gathered from
|
||||
the IdP that handled the authentication and possible extra information
|
||||
received from attribute authorities in the above mentioned dictionary under
|
||||
the key 'user'.
|
||||
|
||||
So in environ["repoze.who.identity"] you will find a dictionary with
attributes and values; the attribute names used depend on what's returned
from the IdP/AA. If there exist both a name and a friendly name, for
instance, the friendly name is used as the key. A small sketch of reading
this information from within the application follows below.
|
||||
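
A minimal sketch of how an application can read this, assuming a plain WSGI
app sitting behind the s2repoze middleware (the handling shown is
illustrative, not part of the distribution)::

    def application(environ, start_response):
        # the repoze.who plugins put identity data into the WSGI environ
        identity = environ.get("repoze.who.identity", {})
        # the SP plugin stores the attributes received from the IdP/AA
        # under the key "user", as described above
        user = identity.get("user", {})
        body = "\n".join("%s: %s" % (key, val) for key, val in user.items())
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [body]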
|
||||
Setup
|
||||
-----
|
||||
|
||||
If you look in the example/sp directory of the distribution you will see
|
||||
the necessary files:
|
||||
|
||||
application.py
|
||||
which is the web application. In this case it will just print the
|
||||
information provided by the IdP in a table.
|
||||
|
||||
sp_conf.py
|
||||
The SPs configuration
|
||||
|
||||
who.ini
|
||||
The repoze.who configuration file
|
||||
|
||||
And then there are two files with certificates, mykey.pem with the private
|
||||
certificate and mycert.pem with the public part.
|
||||
|
||||
I'll go through these step by step.
|
||||
|
||||
The application
|
||||
---------------
|
||||
|
||||
Built to use wsgiref's simple_server, which is fine for testing but
|
||||
not for production.
|
||||
|
||||
|
||||
SP configuration
|
||||
----------------
|
||||
|
||||
The configuration is written as described in :ref:`howto_config`. Among other
things this means that the syntax can easily be checked, as sketched below.
|
||||
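A minimal sketch of such a check, assuming the configuration module is called
sp_conf.py and sits in the current working directory::

    # hypothetical sanity check; sp_conf.py is assumed to define CONFIG
    import sp_conf

    assert isinstance(sp_conf.CONFIG, dict)
    for key in ("entityid", "service", "xmlsec_binary"):
        assert key in sp_conf.CONFIG, "missing directive: %s" % key
    print("sp_conf.py looks structurally sound")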
|
||||
You can see the whole file in example/sp/sp_conf.py, here I will go through
|
||||
it line by line::
|
||||
|
||||
"service": ["sp"],
|
||||
|
||||
Tells the software what type of services it is supposed to
supply. It is used to check for the
|
||||
completeness of the configuration and also when constructing metadata from
|
||||
the configuration. More about that later. Allowed values are: "sp"
|
||||
(service provider), "idp" (identity provider) and "aa" (attribute authority).
|
||||
::
|
||||
|
||||
"entityid" : "urn:mace:example.com:saml:sp",
|
||||
"service_url" : "http://example.com:8087/",
|
||||
|
||||
The ID of the entity and the URL on which it is listening.::
|
||||
|
||||
"idp_url" : "https://example.com/saml2/idp/SSOService.php",
|
||||
|
||||
Since this is a very simple SP it only needs to know about one IdP, therefore there
|
||||
is really no need for a metadata file or a WAYF-function or anything like that.
|
||||
It needs the URL of the IdP and that's all.::
|
||||
|
||||
"my_name" : "My first SP",
|
||||
|
||||
This is just for informational purposes; not really needed, but nice to have::
|
||||
|
||||
"debug" : 1,
|
||||
|
||||
Well, at this point in time you'd really like to have as much information
|
||||
as possible as to what's going on, right ? ::
|
||||
|
||||
"key_file" : "./mykey.pem",
|
||||
"cert_file" : "./mycert.pem",
|
||||
|
||||
The necessary certificates.::
|
||||
|
||||
"xmlsec_binary" : "/opt/local/bin/xmlsec1",
|
||||
|
||||
Right now the software is built to use xmlsec binaries and not the python
|
||||
xmlsec package. There are reasons for this but I won't go into them here.::
|
||||
|
||||
"organization": {
|
||||
"name": "Example Co",
|
||||
#display_name
|
||||
"url":"http://www.example.com/",
|
||||
},
|
||||
|
||||
Information about the organization that is behind this SP, only used when
|
||||
building metadata. ::
|
||||
|
||||
"contact": [{
|
||||
"given_name":"John",
|
||||
"sur_name": "Smith",
|
||||
"email_address": "john.smith@example.com",
|
||||
#contact_type
|
||||
#company
|
||||
#telephone_number
|
||||
}]
|
||||
|
||||
Another piece of information that only matters if you build and distribute
|
||||
metadata.
|
||||
|
||||
So, now to that part. In order to allow the IdP to talk to you, you may have
to provide the party running the IdP with a metadata file.
If you have an SP configuration file similar to the one I've walked you
through here, but with your information, you can create the metadata file
by running the make_metadata script found in the tools directory.
|
||||
|
||||
Change directory to where you have the configuration file and do ::
|
||||
|
||||
make_metadata.py sp_conf.py > metadata.xml
|
||||
|
||||
|
||||
|
||||
Repoze configuration
|
||||
--------------------
|
||||
|
||||
I'm not going through the INI file format here. You should read
|
||||
`Middleware Responsibilities <http://static.repoze.org/whodocs/narr.html>`_
|
||||
to get a good introduction to the concept.
|
||||
|
||||
The configuration of the pysaml2 part of the application's middleware starts
with the special module configuration, namely::
|
||||
|
||||
[plugin:saml2auth]
|
||||
use = s2repoze.plugins.sp:make_plugin
|
||||
saml_conf = sp_conf.py
|
||||
rememberer_name = auth_tkt
|
||||
debug = 1
|
||||
path_logout = .*/logout.*
|
||||
|
||||
Which contains a specification ("use") of which function in which module
|
||||
should be used to initialize the part. After that comes the name of the
|
||||
file ("saml_conf") that contains the PySaml2 configuration. The third line
|
||||
("rememberer_name") points at the plugin that should be used to
|
||||
remember the user information.
|
||||
|
||||
After this, the plugin is referenced in a couple of places::
|
||||
|
||||
[identifiers]
|
||||
plugins =
|
||||
saml2auth
|
||||
auth_tkt
|
||||
|
||||
[authenticators]
|
||||
plugins = saml2auth
|
||||
|
||||
[challengers]
|
||||
plugins = saml2auth
|
||||
|
||||
[mdproviders]
|
||||
plugins = saml2auth
|
||||
|
||||
Which means that the plugin is used in all phases.
|
||||
|
||||
The application
|
||||
---------------
|
||||
|
||||
Is, as said before, extremely simple. The only thing that is connected to
the PySaml2 configuration is at the bottom, namely where the server runs.
You have to ascertain that this coincides with what is specified in the
PySaml2 configuration. Apart from that there really is nothing in
application.py that demands that you use PySaml2 as middleware. If you
|
||||
switched to using the LDAP or CAS plugins nothing would change in the
|
||||
application. In the application configuration yes! But not in the application.
|
||||
And that is really how it should be done.
|
||||
|
||||
There is one assumption and that is that the middleware plugin that gathers
|
||||
information about the user places the extra information as the value of the
|
||||
"user" property in the dictionary found under the key "repoze.who.identity"
|
||||
in the environment.
|
||||
616
doc/howto/config.rst
Normal file
@@ -0,0 +1,616 @@
|
||||
.. _howto_config:
|
||||
|
||||
Configuration of pySAML2 entities
|
||||
=================================
|
||||
|
||||
Whether you plan to run a pySAML2 Service Provider, Identity provider or an
|
||||
attribute authority you have to configure it. The format of the configuration
|
||||
file is the same regardless of which type of service you plan to run.
What differs is some of the directives.
Below you will find a list of all the directives used, in alphabetical order.
|
||||
The configuration is written as a python module which contains a named
|
||||
dictionary ("CONFIG") that contains the configuration directives.
|
||||
|
||||
The basic structure of the configuration file is therefore like this::
|
||||
|
||||
from saml2 import BINDING_HTTP_REDIRECT
|
||||
|
||||
CONFIG = {
|
||||
"entityid" : "http://saml.example.com:saml/idp.xml",
|
||||
"name" : "Rolands IdP",
|
||||
"service": {
|
||||
"idp": {
|
||||
"endpoints" : {
|
||||
"single_sign_on_service" : [
|
||||
("http://saml.example.com:saml:8088/sso",
|
||||
BINDING_HTTP_REDIRECT)],
|
||||
"single_logout_service": [
|
||||
("http://saml.example.com:saml:8088/slo",
|
||||
BINDING_HTTP_REDIRECT)]
|
||||
},
|
||||
...
|
||||
}
|
||||
},
|
||||
"key_file" : "my.key",
|
||||
"cert_file" : "ca.pem",
|
||||
"xmlsec_binary" : "/usr/local/bin/xmlsec1",
|
||||
"metadata": {
|
||||
"local": ["edugain.xml"],
|
||||
},
|
||||
"attribute_map_dir" : "attributemaps",
|
||||
...
|
||||
}
|
||||
|
||||
.. note:: You can build the metadata file for your services directly from the
|
||||
configuration. The make_metadata.py script in the pySAML2 tools directory
|
||||
will do that for you.
|
||||
|
||||
Configuration directives
|
||||
::::::::::::::::::::::::
|
||||
|
||||
.. contents::
|
||||
:local:
|
||||
:backlinks: entry
|
||||
|
||||
General directives
|
||||
------------------
|
||||
|
||||
attribute_map_dir
|
||||
^^^^^^^^^^^^^^^^^
|
||||
|
||||
Format::
|
||||
|
||||
"attribute_map_dir": "attribute-maps"
|
||||
|
||||
Points to a directory which has the attribute maps in Python modules.
|
||||
A typical map file looks like this::
|
||||
|
||||
MAP = {
|
||||
"identifier": "urn:oasis:names:tc:SAML:2.0:attrname-format:basic",
|
||||
"fro": {
|
||||
'urn:mace:dir:attribute-def:aRecord': 'aRecord',
|
||||
'urn:mace:dir:attribute-def:aliasedEntryName': 'aliasedEntryName',
|
||||
'urn:mace:dir:attribute-def:aliasedObjectName': 'aliasedObjectName',
|
||||
'urn:mace:dir:attribute-def:associatedDomain': 'associatedDomain',
|
||||
'urn:mace:dir:attribute-def:associatedName': 'associatedName',
|
||||
...
|
||||
},
|
||||
"to": {
|
||||
'aRecord': 'urn:mace:dir:attribute-def:aRecord',
|
||||
'aliasedEntryName': 'urn:mace:dir:attribute-def:aliasedEntryName',
|
||||
'aliasedObjectName': 'urn:mace:dir:attribute-def:aliasedObjectName',
|
||||
'associatedDomain': 'urn:mace:dir:attribute-def:associatedDomain',
|
||||
'associatedName': 'urn:mace:dir:attribute-def:associatedName',
|
||||
...
|
||||
}
|
||||
}
|
||||
|
||||
The attribute map module contains a MAP dictionary with three items. The
|
||||
`identifier` item is the name-format you expect to support.
|
||||
The *to* and *fro* sub-dictionaries then contain the mapping between the names.
|
||||
|
||||
As you see the format is again a python dictionary where the key is the
|
||||
name to convert from and the value is the name to convert to.
|
||||
|
||||
Since *to* in most cases is the inverse of *fro*, the
software allows you to specify only one of them, and it will
automatically create the other, as sketched below.
|
||||
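
The inversion itself is nothing magical; a rough illustration (not the
pysaml2 internals) could look like this::

    # given a map module that only defines "fro", derive "to" by inversion
    MAP = {
        "identifier": "urn:oasis:names:tc:SAML:2.0:attrname-format:basic",
        "fro": {
            "urn:mace:dir:attribute-def:cn": "cn",
            "urn:mace:dir:attribute-def:mail": "mail",
        },
    }

    if "to" not in MAP:
        MAP["to"] = dict((local, urn) for urn, local in MAP["fro"].items())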
|
||||
cert_file
|
||||
^^^^^^^^^
|
||||
|
||||
Format::
|
||||
|
||||
cert_file: "cert.pem"
|
||||
|
||||
This is the public part of the service private/public key pair.
|
||||
*cert_file* must be a PEM formatted certificate chain file.
|
||||
|
||||
contact_person
|
||||
^^^^^^^^^^^^^^
|
||||
|
||||
This is only used by *make_metadata.py* when it constructs the metadata for
|
||||
the service described by the configuration file.
|
||||
This is where you describe who can be contacted if questions arise
|
||||
about the service or if support is needed. The possible types are according to
|
||||
the standard **technical**, **support**, **administrative**, **billing**
|
||||
and **other**.::
|
||||
|
||||
contact_person: [{
|
||||
"givenname": "Derek",
|
||||
"surname": "Jeter",
|
||||
"company": "Example Co.",
|
||||
"mail": ["jeter@example.com"],
|
||||
"type": "technical",
|
||||
},{
|
||||
"givenname": "Joe",
|
||||
"surname": "Girardi",
|
||||
"company": "Example Co.",
|
||||
"mail": "girardi@example.com",
|
||||
"type": "administrative",
|
||||
}]
|
||||
|
||||
debug
|
||||
^^^^^
|
||||
|
||||
Format::
|
||||
|
||||
debug: 1
|
||||
|
||||
Whether debug information should be sent to the log file.
|
||||
|
||||
entityid
|
||||
^^^^^^^^
|
||||
|
||||
Format::
|
||||
|
||||
entityid: "http://saml.example.com/sp"
|
||||
|
||||
The globally unique identifier of the entity.
|
||||
|
||||
.. note:: There is a recommendation that the entityid should point to a real
|
||||
webpage where the metadata for the entity can be found.
|
||||
|
||||
key_file
|
||||
^^^^^^^^
|
||||
|
||||
Format::
|
||||
|
||||
key_file: "key.pem"
|
||||
|
||||
*key_file* is the name of a PEM formatted file that contains the private key
|
||||
of the service. This is presently used both to encrypt/sign assertions and as
|
||||
client key in a HTTPS session.
|
||||
|
||||
metadata
|
||||
^^^^^^^^
|
||||
|
||||
Contains a list of places where metadata can be found. This can be either
|
||||
a file accessible on the server the service runs on or somewhere on the net.::
|
||||
|
||||
"metadata" : {
|
||||
"local": [
|
||||
"metadata.xml", "vo_metadata.xml"
|
||||
],
|
||||
"remote": [
|
||||
{
|
||||
"url":"https://kalmar2.org/simplesaml/module.php/aggregator/?id=kalmarcentral2&set=saml2",
|
||||
"cert":"kalmar2.cert"
|
||||
}],
|
||||
},
|
||||
|
||||
The above configuration means that the service should read two local
|
||||
metadata files and on top of that load one from the net. To verify the
|
||||
authenticity of the file downloaded from the net the local copy of the
|
||||
public key should be used.
|
||||
This public key must be acquired by some out-of-band method.
|
||||
|
||||
organization
|
||||
^^^^^^^^^^^^
|
||||
|
||||
Only used by *make_metadata.py*.
|
||||
Where you describe the organization responsible for the service.::
|
||||
|
||||
"organization": {
|
||||
"name": [("Example Company","en"), ("Exempel AB","se")],
|
||||
"display_name": ["Exempel AB"],
|
||||
"url": [("http://example.com","en"),("http://exempel.se","se")],
|
||||
}
|
||||
|
||||
.. note:: You can specify the language of the name, or the language used on
|
||||
the webpage, by entering a tuple, instead of a simple string,
|
||||
where the second part is the language code. If you don't specify a
|
||||
language the default is "en" (English).
|
||||
|
||||
service
|
||||
^^^^^^^
|
||||
|
||||
Which services the server will provide, those are combinations of "idp","sp"
|
||||
and "aa".
|
||||
So if a server is a Service Provider (SP) then the configuration
|
||||
could look something like this::
|
||||
|
||||
"service": {
|
||||
"sp":{
|
||||
"name" : "Rolands SP",
|
||||
"endpoints":{
|
||||
"assertion_consumer_service": ["http://localhost:8087/"],
|
||||
"single_logout_service" : [("http://localhost:8087/slo",
|
||||
'urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect')],
|
||||
},
|
||||
"required_attributes": ["surname", "givenname", "edupersonaffiliation"],
|
||||
"optional_attributes": ["title"],
|
||||
"idp": {
|
||||
"urn:mace:umu.se:saml:roland:idp": None,
|
||||
},
|
||||
}
|
||||
},
|
||||
|
||||
There are two options common to all services: 'name' and 'endpoints'.
|
||||
The remaining options are specific to one or the other of the service types.
|
||||
Which one is specified along side the name of the option
|
||||
|
||||
timeslack
|
||||
^^^^^^^^^
|
||||
|
||||
If your computer and another computer that you are communicating with are not
in sync regarding the computer clock, then here you can state how big a
difference you are prepared to accept, as sketched below.

.. note:: This will indiscriminately affect all time comparisons.
   Hence your server may accept a statement that in fact is too old.
|
||||
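
A tiny illustration of the idea (not the actual pysaml2 check)::

    import time

    def not_expired(not_on_or_after, timeslack=0):
        # accept statements up to `timeslack` seconds past their
        # NotOnOrAfter instant, to absorb clock differences
        return time.time() < not_on_or_after + timeslack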
|
||||
xmlsec_binary
|
||||
^^^^^^^^^^^^^
|
||||
|
||||
Presently xmlsec1 binaries are used for all the signing and encryption stuff.
|
||||
This option defines where the binary is situated.
|
||||
|
||||
Example::
|
||||
|
||||
"xmlsec_binary": "/usr/local/bin/xmlsec1",
|
||||
|
||||
valid_for
|
||||
^^^^^^^^^
|
||||
|
||||
How many *hours* this configuration is expected to be accurate.::
|
||||
|
||||
"valid_for": 24
|
||||
|
||||
This of course is only used by *make_metadata.py*.
|
||||
The server will not stop working when this amount of time has elapsed :-).
|
||||
|
||||
Specific directives
|
||||
-------------------
|
||||
|
||||
Directives that are specific to a certain type of service.
|
||||
|
||||
idp/aa
|
||||
^^^^^^
|
||||
|
||||
Directives that are specific to an IdP or AA service instance
|
||||
|
||||
policy
|
||||
""""""
|
||||
|
||||
If the server is an IdP and/or an AA then there might be reasons to do things
|
||||
differently depending on who is asking; this is where that is specified.
|
||||
The keys are 'default' and SP entity identifiers, default is used whenever
|
||||
there is no entry for a specific SP. The reasoning is also that if there is
|
||||
no default and only SP entity identifiers as keys, then the server will only
|
||||
accept connections from the specified SPs.
|
||||
An example might be::
|
||||
|
||||
"service": {
|
||||
"idp": {
|
||||
"policy": {
|
||||
"default": {
|
||||
"lifetime": {"minutes":15},
|
||||
"attribute_restrictions": None, # means all I have
|
||||
"name_form": "urn:oasis:names:tc:SAML:2.0:attrname-format:uri"
|
||||
},
|
||||
"urn:mace:example.com:saml:roland:sp": {
|
||||
"lifetime": {"minutes": 5},
|
||||
"attribute_restrictions":{
|
||||
"givenName": None,
|
||||
"surName": None,
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
*lifetime*
|
||||
is the maximum amount of time before the information should be
|
||||
regarded as stale. In an Assertion this is represented in the NotOnOrAfter
|
||||
attribute.
|
||||
*attribute_restrictions*
|
||||
By default there are no restrictions as to which attributes should be
returned. Instead, all the attributes and values that are gathered by the
database backends will be returned if nothing else is stated.
|
||||
In the example above the SP with the entity identifier
|
||||
"urn:mace:umu.se:saml:roland:sp"
|
||||
has an attribute restriction: only the attributes
|
||||
'givenName' and 'surName' are to be returned. There are no limitations as to
what values of these attributes can be returned.
|
||||
*name_form*
|
||||
Which name-form that should be used when sending assertions.
|
||||
|
||||
If restrictions on values are deemed necessary those are represented by
|
||||
regular expressions.::
|
||||
|
||||
"service": {
|
||||
"aa": {
|
||||
"policy": {
|
||||
"urn:mace:umu.se:saml:roland:sp": {
|
||||
"lifetime": {"minutes": 5},
|
||||
"attribute_restrictions":{
|
||||
"mail": [".*\.umu\.se$"],
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
Here only mail addresses that end with ".umu.se" will be returned, as the sketch below illustrates.
|
||||
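
A small illustration of how such a regular expression restriction behaves
(purely illustrative, not the pysaml2 code)::

    import re

    restriction = re.compile(r".*\.umu\.se$")
    values = ["roland@adm.umu.se", "roland@example.com"]
    allowed = [val for val in values if restriction.match(val)]
    # allowed == ["roland@adm.umu.se"]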
|
||||
sp
|
||||
^^
|
||||
|
||||
Directives specific to SP instances
|
||||
|
||||
authn_requests_signed
|
||||
"""""""""""""""""""""
|
||||
|
||||
Indicates if the Authentication Requests sent by this SP should be signed
|
||||
by default. This can be overridden by application code for a specific call.
|
||||
|
||||
This sets the AuthnRequestsSigned attribute of the SPSSODescriptor node
of the metadata, so the IdP will know this SP's preference.
|
||||
|
||||
Valid values are "true" or "false". Default value is "false".
|
||||
|
||||
Example::
|
||||
|
||||
"service": {
|
||||
"sp": {
|
||||
"authn_assertions_signed": "true",
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
idp
|
||||
"""
|
||||
|
||||
Defines the set of IdPs that this SP is allowed to use. If not all the IdPs in
|
||||
the metadata are allowed, then the value is expected to be a list of entity
|
||||
identifiers for the allowed IdPs.
|
||||
A typical configuration, when the allowed set of IdPs are limited, would look
|
||||
something like this::
|
||||
|
||||
"service": {
|
||||
"sp": {
|
||||
"idp": ["urn:mace:umu.se:saml:roland:idp"],
|
||||
}
|
||||
}
|
||||
|
||||
In this case the SP has only one IdP it can use.
|
||||
|
||||
If all IdPs present in the loaded metadata are allowed, this directive must be left out.
|
||||
|
||||
optional_attributes
|
||||
"""""""""""""""""""
|
||||
|
||||
Attributes that this SP would like to receive from IdPs.
|
||||
|
||||
Example::
|
||||
|
||||
"service": {
|
||||
"sp": {
|
||||
"optional_attributes": ["title"],
|
||||
}
|
||||
}
|
||||
|
||||
Since the attribute names used here are the user friendly ones an attribute map
|
||||
must exist, so that the server can use the full name when communicating
|
||||
with other servers.
|
||||
|
||||
required_attributes
|
||||
"""""""""""""""""""
|
||||
|
||||
Attributes that this SP demands to receive from IdPs.
|
||||
|
||||
Example::
|
||||
|
||||
"service": {
|
||||
"sp": {
|
||||
"required_attributes": ["surname", "givenName", "mail"],
|
||||
}
|
||||
}
|
||||
|
||||
Again as for *optional_attributes* the names given are expected to be
|
||||
the user friendly names.
|
||||
|
||||
want_assertions_signed
|
||||
""""""""""""""""""""""
|
||||
|
||||
Indicates if this SP wants the IdP to send the assertions signed. This
|
||||
sets the WantAssertionsSigned attribute of the SPSSODescriptor node
of the metadata, so the IdP will know this SP's preference.
|
||||
|
||||
Valid values are "true" or "false". Default value is "true".
|
||||
|
||||
Example::
|
||||
|
||||
"service": {
|
||||
"sp": {
|
||||
"want_assertions_signed": "true",
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
idp/aa/sp
|
||||
^^^^^^^^^
|
||||
|
||||
If the configuration covers two or three different service types
(for instance if one server is actually acting as both an IdP and an SP), then
in some cases you may want the directives below to differ between the services.
|
||||
|
||||
endpoints
|
||||
"""""""""
|
||||
|
||||
Where the endpoints for the services provided are.
|
||||
This directive has as value a dictionary with one of the following keys:
|
||||
|
||||
* artifact_resolution_service (aa, idp and sp)
|
||||
* assertion_consumer_service (sp)
|
||||
* assertion_id_request_service (aa, idp)
|
||||
* attribute_service (aa)
|
||||
* manage_name_id_service (aa, idp)
|
||||
* name_id_mapping_service (idp)
|
||||
* single_logout_service (aa, idp, sp)
|
||||
* single_sign_on_service (idp)
|
||||
|
||||
The values per service is a list of tuples containing endpoint and binding
|
||||
type.
|
||||
|
||||
Example::
|
||||
|
||||
"service":
|
||||
"idp": {
|
||||
"endpoints" : {
|
||||
"single_sign_on_service" : [
|
||||
("http://localhost:8088/sso", BINDING_HTTP_REDIRECT)],
|
||||
"single_logout_service": [
|
||||
("http://localhost:8088/slo", BINDING_HTTP_REDIRECT)]
|
||||
},
|
||||
},
|
||||
},
|
||||
|
||||
logout_requests_signed
|
||||
""""""""""""""""""""""
|
||||
|
||||
Indicates if this entity will sign the Logout Requests originated from it.
|
||||
|
||||
This can be overridden by application code for a specific call.
|
||||
|
||||
Valid values are "true" or "false". Default value is "false"
|
||||
|
||||
Example::
|
||||
|
||||
"service": {
|
||||
"sp": {
|
||||
"logout_requests_signed": "true",
|
||||
}
|
||||
}
|
||||
|
||||
subject_data
|
||||
""""""""""""
|
||||
|
||||
The name of a database where the map between a local identifier and
|
||||
a distributed identifier is kept. By default this is a shelve database.
|
||||
So if you just specify a name, then a shelve database with that name
is created. On the other hand, if you specify a tuple, then the first
element in the tuple specifies which type of database you want to use
|
||||
and the second element is the address of the database.
|
||||
|
||||
Example::
|
||||
|
||||
"subject_data": "./idp.subject.db",
|
||||
|
||||
or if you want to use for instance memcache::
|
||||
|
||||
"subject_data": ("memcached", "localhost:12121"),
|
||||
|
||||
*shelve* and *memcached* are the only database types that are presently
|
||||
supported.
|
||||
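
As a rough illustration of what the shelve variant amounts to (this is not
the pysaml2 implementation, just the concept)::

    import shelve

    # "subject_data": "./idp.subject.db" keeps a local-id -> name-id map
    db = shelve.open("./idp.subject.db")
    db["local-user-id"] = "distributed-name-id"   # hypothetical identifiers
    db.close()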
|
||||
|
||||
virtual_organization
|
||||
""""""""""""""""""""
|
||||
|
||||
Gives information about common identifiers for virtual_organizations::
|
||||
|
||||
"virtual_organization" : {
|
||||
"urn:mace:example.com:it:tek":{
|
||||
"nameid_format" : "urn:oid:1.3.6.1.4.1.1466.115.121.1.15-NameID",
|
||||
"common_identifier": "umuselin",
|
||||
}
|
||||
},
|
||||
|
||||
Keys in this dictionary are the identifiers for the virtual organizations.
|
||||
The arguments per organization are 'nameid_format' and 'common_identifier'.
|
||||
Useful if all the IdPs and AAs that are involved in a virtual organization
|
||||
have common attribute values for users that are part of the VO.
|
||||
|
||||
Complete example
|
||||
----------------
|
||||
|
||||
We start with a simple but fairly complete Service provider configuration::
|
||||
|
||||
from saml2 import BINDING_HTTP_REDIRECT
|
||||
|
||||
CONFIG = {
|
||||
"entityid" : "http://example.com/sp/metadata.xml",
|
||||
"service": {
|
||||
"sp":{
|
||||
"name" : "Example SP",
|
||||
"endpoints":{
|
||||
"assertion_consumer_service": ["http://example.com/sp"],
|
||||
"single_logout_service" : [("http://example.com/sp/slo",
|
||||
BINDING_HTTP_REDIRECT)],
|
||||
},
|
||||
}
|
||||
},
|
||||
"key_file" : "./mykey.pem",
|
||||
"cert_file" : "./mycert.pem",
|
||||
"xmlsec_binary" : "/usr/local/bin/xmlsec1",
|
||||
"attribute_map_dir": "./attributemaps",
|
||||
"metadata": {
|
||||
"local": ["idp.xml"]
|
||||
},
|
||||
"organization": {
|
||||
"display_name":["Example identities"]
|
||||
},
|
||||
"contact_person": [{
|
||||
"givenname": "Roland",
|
||||
"surname": "Hedberg",
|
||||
"phone": "+46 90510",
|
||||
"mail": "roland@example.com",
|
||||
"type": "technical",
|
||||
}]
|
||||
}
|
||||
|
||||
This is the typical setup for an SP.
A metadata file to load is *always* needed, but it can of course
contain anything from one to many entity descriptions.
|
||||
|
||||
------
|
||||
|
||||
A slightly more complex configuration::
|
||||
|
||||
from saml2 import BINDING_HTTP_REDIRECT
|
||||
|
||||
CONFIG = {
|
||||
"entityid" : "http://sp.example.com/metadata.xml",
|
||||
"service": {
|
||||
"sp":{
|
||||
"name" : "Example SP",
|
||||
"endpoints":{
|
||||
"assertion_consumer_service": ["http://sp.example.com/"],
|
||||
"single_logout_service" : [("http://sp.example.com/slo",
|
||||
BINDING_HTTP_REDIRECT)],
|
||||
},
|
||||
"subject_data": ("memcached", "localhost:12121"),
|
||||
"virtual_organization" : {
|
||||
"urn:mace:example.com:it:tek":{
|
||||
"nameid_format" : "urn:oid:1.3.6.1.4.1.1466.115.121.1.15-NameID",
|
||||
"common_identifier": "eduPersonPrincipalName",
|
||||
}
|
||||
},
|
||||
}
|
||||
},
|
||||
"key_file" : "./mykey.pem",
|
||||
"cert_file" : "./mycert.pem",
|
||||
"xmlsec_binary" : "/usr/local/bin/xmlsec1",
|
||||
"metadata" : {
|
||||
"local": ["example.xml"],
|
||||
"remote": [{
|
||||
"url":"https://kalmar2.org/simplesaml/module.php/aggregator/?id=kalmarcentral2&set=saml2",
|
||||
"cert":"kalmar2.pem"}]
|
||||
},
|
||||
"attribute_maps" : "attributemaps",
|
||||
"organization": {
|
||||
"display_name":["Example identities"]
|
||||
},
|
||||
"contact_person": [{
|
||||
"givenname": "Roland",
|
||||
"surname": "Hedberg",
|
||||
"phone": "+46 90510",
|
||||
"mail": "roland@example.com",
|
||||
"type": "technical",
|
||||
}]
|
||||
}
|
||||
|
||||
Uses metadata files, both local and remote, and will talk to whatever
|
||||
IdP appears in any of the metadata files.
|
||||
42
doc/howto/index.rst
Normal file
@@ -0,0 +1,42 @@
.. _howto:

How to use PySAML2
==================

:Release: |release|
:Date: |today|

Before you can use PySAML2, you'll need to get it installed.
If you have not done that yet, read :ref:`install`.

Well, now you have it installed and you want to do something.

And I'm sorry to tell you this, but there isn't really a lot you can do with
this code on its own.

Sure, you can send an AuthenticationRequest to an IdentityProvider or an
AttributeQuery to an AttributeAuthority, but in order to get what they
return you have to sit behind a web server. Well, that is not really true,
since the AttributeQuery would be over SOAP and you would get the result over
the connection you have to the AttributeAuthority.

But anyway, you may get my point. This is middleware stuff!

PySAML2 is built to fit into a
`WSGI <http://www.python.org/dev/peps/pep-0333/>`_ application,

but it can be used in a non-WSGI environment too.

So you will find descriptions of both cases here.

The configuration is the same regardless of whether you are using PySAML2 in a
WSGI or a non-WSGI environment.

.. toctree::
   :maxdepth: 1

   config
   wsgi/index
   nonwsgi/index
30
doc/index.rst
Normal file
@@ -0,0 +1,30 @@
|
||||
.. _index:
|
||||
|
||||
.. pysaml2 documentation master file, created by
|
||||
sphinx-quickstart on Mon Aug 24 08:13:41 2009.
|
||||
You can adapt this file completely to your liking, but it should at least
|
||||
contain the root `toctree` directive.
|
||||
|
||||
Welcome to the documentation of pysaml2!
|
||||
========================================
|
||||
|
||||
:Release: |release|
|
||||
:Date: |today|
|
||||
|
||||
Contents:
|
||||
|
||||
.. toctree::
|
||||
:maxdepth: 1
|
||||
|
||||
install
|
||||
howto/index
|
||||
saml2
|
||||
examples/index
|
||||
|
||||
Indices and tables
|
||||
==================
|
||||
|
||||
* :ref:`genindex`
|
||||
* :ref:`modindex`
|
||||
* :ref:`search`
|
||||
|
||||
54
doc/install.rst
Normal file
@@ -0,0 +1,54 @@
.. _install:

Quick install guide
===================

Before you can use PySAML2, you'll need to get it installed. This guide
will walk you through a simple, minimal installation.

Install PySAML2
---------------

For all this to work you need to have Python installed.
The development has been done using 2.6.
There is no 3.X version yet.

Prerequisites
^^^^^^^^^^^^^

You have to have ElementTree, which is either part of your Python distribution
if it's recent enough, or, if your Python is too old, you have to install it,
for instance by getting it from the Python Package Index using
easy_install.

You also need xmlsec, which you can download from http://www.aleksey.com/xmlsec/

If you're on OS X you can get xmlsec installed from MacPorts or Fink.

Depending on how you are going to use PySAML2 you might also need:

* Mako
* pyASN1
* repoze.who (make sure you get 1.0.16 and not 2.0)
* decorator
* python-memcache
* memcached

Quick build instructions
^^^^^^^^^^^^^^^^^^^^^^^^

Once you have installed all the necessary prerequisites, a simple::

    python setup.py install

will install the basic code.

After this you ought to be able to run the tests without a hitch.
The tests are based on the pypy test environment, so::

    cd tests
    py.test

is what you should use. If you don't have py.test, get it; it's part of pypy!
It's really good!
112
doc/make.bat
Normal file
@@ -0,0 +1,112 @@
|
||||
@ECHO OFF
|
||||
|
||||
REM Command file for Sphinx documentation
|
||||
|
||||
set SPHINXBUILD=sphinx-build
|
||||
set ALLSPHINXOPTS=-d _build/doctrees %SPHINXOPTS% .
|
||||
if NOT "%PAPER%" == "" (
|
||||
set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS%
|
||||
)
|
||||
|
||||
if "%1" == "" goto help
|
||||
|
||||
if "%1" == "help" (
|
||||
:help
|
||||
echo.Please use `make ^<target^>` where ^<target^> is one of
|
||||
echo. html to make standalone HTML files
|
||||
echo. dirhtml to make HTML files named index.html in directories
|
||||
echo. pickle to make pickle files
|
||||
echo. json to make JSON files
|
||||
echo. htmlhelp to make HTML files and a HTML help project
|
||||
echo. qthelp to make HTML files and a qthelp project
|
||||
echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter
|
||||
echo. changes to make an overview over all changed/added/deprecated items
|
||||
echo. linkcheck to check all external links for integrity
|
||||
echo. doctest to run all doctests embedded in the documentation if enabled
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "clean" (
|
||||
for /d %%i in (_build\*) do rmdir /q /s %%i
|
||||
del /q /s _build\*
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "html" (
|
||||
%SPHINXBUILD% -b html %ALLSPHINXOPTS% _build/html
|
||||
echo.
|
||||
echo.Build finished. The HTML pages are in _build/html.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "dirhtml" (
|
||||
%SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% _build/dirhtml
|
||||
echo.
|
||||
echo.Build finished. The HTML pages are in _build/dirhtml.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "pickle" (
|
||||
%SPHINXBUILD% -b pickle %ALLSPHINXOPTS% _build/pickle
|
||||
echo.
|
||||
echo.Build finished; now you can process the pickle files.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "json" (
|
||||
%SPHINXBUILD% -b json %ALLSPHINXOPTS% _build/json
|
||||
echo.
|
||||
echo.Build finished; now you can process the JSON files.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "htmlhelp" (
|
||||
%SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% _build/htmlhelp
|
||||
echo.
|
||||
echo.Build finished; now you can run HTML Help Workshop with the ^
|
||||
.hhp project file in _build/htmlhelp.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "qthelp" (
|
||||
%SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% _build/qthelp
|
||||
echo.
|
||||
echo.Build finished; now you can run "qcollectiongenerator" with the ^
|
||||
.qhcp project file in _build/qthelp, like this:
|
||||
echo.^> qcollectiongenerator _build\qthelp\pysaml2.qhcp
|
||||
echo.To view the help file:
|
||||
echo.^> assistant -collectionFile _build\qthelp\pysaml2.ghc
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "latex" (
|
||||
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% _build/latex
|
||||
echo.
|
||||
echo.Build finished; the LaTeX files are in _build/latex.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "changes" (
|
||||
%SPHINXBUILD% -b changes %ALLSPHINXOPTS% _build/changes
|
||||
echo.
|
||||
echo.The overview file is in _build/changes.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "linkcheck" (
|
||||
%SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% _build/linkcheck
|
||||
echo.
|
||||
echo.Link check complete; look for any errors in the above output ^
|
||||
or in _build/linkcheck/output.txt.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "doctest" (
|
||||
%SPHINXBUILD% -b doctest %ALLSPHINXOPTS% _build/doctest
|
||||
echo.
|
||||
echo.Testing of doctests in the sources finished, look at the ^
|
||||
results in _build/doctest/output.txt.
|
||||
goto end
|
||||
)
|
||||
|
||||
:end
|
||||
18
doc/metadata.rst
Normal file
@@ -0,0 +1,18 @@
|
||||
.. _metadata:
|
||||
|
||||
***************************************************
|
||||
Base classes representing Saml2.0 MetaData elements
|
||||
***************************************************
|
||||
|
||||
:Author: Roland Hedberg
|
||||
:Version: |version|
|
||||
|
||||
.. module:: MetaData
|
||||
:synopsis: Base classes representing Saml2.0 metadata elements.
|
||||
|
||||
Module
|
||||
==========
|
||||
|
||||
.. automodule:: saml2.metadata
|
||||
:members:
|
||||
|
||||
18
doc/saml.rst
Normal file
@@ -0,0 +1,18 @@
|
||||
.. _saml:
|
||||
|
||||
******************************************
|
||||
Base classes representing Saml2.0 elements
|
||||
******************************************
|
||||
|
||||
:Author: Roland Hedberg
|
||||
:Version: |version|
|
||||
|
||||
.. module:: SAML2
|
||||
:synopsis: Base classes representing Saml2.0 elements.
|
||||
|
||||
Module
|
||||
==========
|
||||
|
||||
.. automodule:: saml2.saml
|
||||
:members:
|
||||
|
||||
29
doc/saml2.rst
Normal file
@@ -0,0 +1,29 @@
|
||||
.. _base:
|
||||
|
||||
****************************************
|
||||
Base classes representing basic elements
|
||||
****************************************
|
||||
|
||||
:Author: Roland Hedberg
|
||||
:Version: |version|
|
||||
|
||||
.. module:: Base
|
||||
:synopsis: Base classes.
|
||||
|
||||
.. toctree::
|
||||
:maxdepth: 2
|
||||
|
||||
saml
|
||||
samlp
|
||||
metadata
|
||||
xmldsig
|
||||
xmlenc
|
||||
client
|
||||
server
|
||||
|
||||
Module
|
||||
==========
|
||||
|
||||
.. automodule:: saml2
|
||||
:members:
|
||||
|
||||
18
doc/samlp.rst
Normal file
@@ -0,0 +1,18 @@
|
||||
.. _samlp:
|
||||
|
||||
***************************************************
|
||||
Base classes representing Saml2.0 protocol elements
|
||||
***************************************************
|
||||
|
||||
:Author: Roland Hedberg
|
||||
:Version: |version|
|
||||
|
||||
.. module:: SAMLP
|
||||
:synopsis: Base classes representing Saml2.0 protocol elements.
|
||||
|
||||
Module
|
||||
==========
|
||||
|
||||
.. automodule:: saml2.samlp
|
||||
:members:
|
||||
|
||||
19
doc/server.rst
Normal file
@@ -0,0 +1,19 @@
|
||||
.. _server:
|
||||
|
||||
***********************************************************************
|
||||
Classes representing Identity Provider or Attribute Authority instances
|
||||
***********************************************************************
|
||||
|
||||
:Author: Roland Hedberg
|
||||
:Version: |version|
|
||||
|
||||
.. module:: IdPAA
|
||||
:synopsis: Classes representing Identity Provider or Attribute
|
||||
Authority instances.
|
||||
|
||||
Module
|
||||
======
|
||||
|
||||
.. automodule:: saml2.server
|
||||
:members:
|
||||
|
||||
18
doc/xmldsig.rst
Normal file
@@ -0,0 +1,18 @@
|
||||
.. _xmldsig:
|
||||
|
||||
*************************************
|
||||
Classes representing xmldsig elements
|
||||
*************************************
|
||||
|
||||
:Author: Roland Hedberg
|
||||
:Version: |version|
|
||||
|
||||
.. module:: XmlDsig
|
||||
:synopsis: Classes representing xmldsig elements.
|
||||
|
||||
Module
|
||||
==========
|
||||
|
||||
.. automodule:: xmldsig
|
||||
:members:
|
||||
|
||||
22
doc/xmlenc.rst
Normal file
@@ -0,0 +1,22 @@
|
||||
.. _xmlenc:
|
||||
|
||||
*************************************
|
||||
Classes representing xmlenc elements
|
||||
*************************************
|
||||
|
||||
#:mod: 'XmlEnc' -- xmlenc
|
||||
|
||||
=====================================================
|
||||
|
||||
:Author: Roland Hedberg
|
||||
:Version: |version|
|
||||
|
||||
.. module:: XmlEnc
|
||||
:synopsis: Classes representing xmlenc elements.
|
||||
|
||||
Module
|
||||
==========
|
||||
|
||||
.. automodule:: xmlenc
|
||||
:members:
|
||||
|
||||
26
example/README
Normal file
@@ -0,0 +1,26 @@
This is a very simple setup, just to check that all your gear is in order.

The setup consists of one IdP and one SP.
The IdP authenticates users by using an htpasswd plugin and gets the identity
information from the ini plugin.

All this is in the idp/who.ini configuration file; the file used for
authentication is idp/passwd and the ini file is idp/idp_user.ini.

The passwords in passwd, in clear text:

roland:one
ozzie:two
derek:three
ryan:four
ischiro:five

The SP doesn't do anything but show you the information that the IdP sent.

To make it easy, for me :-), both the IdP and the SP use the same keys.

To run the setup do

./run.sh

and then use your favourite web browser to look at "http://localhost:8087/whoami"
326
example/idp/attributemaps/basic.py
Normal file
326
example/idp/attributemaps/basic.py
Normal file
@@ -0,0 +1,326 @@
|
||||
|
||||
MAP = {
|
||||
"identifier": "urn:oasis:names:tc:SAML:2.0:attrname-format:basic",
|
||||
"fro": {
|
||||
'urn:mace:dir:attribute-def:aRecord': 'aRecord',
|
||||
'urn:mace:dir:attribute-def:aliasedEntryName': 'aliasedEntryName',
|
||||
'urn:mace:dir:attribute-def:aliasedObjectName': 'aliasedObjectName',
|
||||
'urn:mace:dir:attribute-def:associatedDomain': 'associatedDomain',
|
||||
'urn:mace:dir:attribute-def:associatedName': 'associatedName',
|
||||
'urn:mace:dir:attribute-def:audio': 'audio',
|
||||
'urn:mace:dir:attribute-def:authorityRevocationList': 'authorityRevocationList',
|
||||
'urn:mace:dir:attribute-def:buildingName': 'buildingName',
|
||||
'urn:mace:dir:attribute-def:businessCategory': 'businessCategory',
|
||||
'urn:mace:dir:attribute-def:c': 'c',
|
||||
'urn:mace:dir:attribute-def:cACertificate': 'cACertificate',
|
||||
'urn:mace:dir:attribute-def:cNAMERecord': 'cNAMERecord',
|
||||
'urn:mace:dir:attribute-def:carLicense': 'carLicense',
|
||||
'urn:mace:dir:attribute-def:certificateRevocationList': 'certificateRevocationList',
|
||||
'urn:mace:dir:attribute-def:cn': 'cn',
|
||||
'urn:mace:dir:attribute-def:co': 'co',
|
||||
'urn:mace:dir:attribute-def:commonName': 'commonName',
|
||||
'urn:mace:dir:attribute-def:countryName': 'countryName',
|
||||
'urn:mace:dir:attribute-def:crossCertificatePair': 'crossCertificatePair',
|
||||
'urn:mace:dir:attribute-def:dITRedirect': 'dITRedirect',
|
||||
'urn:mace:dir:attribute-def:dSAQuality': 'dSAQuality',
|
||||
'urn:mace:dir:attribute-def:dc': 'dc',
|
||||
'urn:mace:dir:attribute-def:deltaRevocationList': 'deltaRevocationList',
|
||||
'urn:mace:dir:attribute-def:departmentNumber': 'departmentNumber',
|
||||
'urn:mace:dir:attribute-def:description': 'description',
|
||||
'urn:mace:dir:attribute-def:destinationIndicator': 'destinationIndicator',
|
||||
'urn:mace:dir:attribute-def:displayName': 'displayName',
|
||||
'urn:mace:dir:attribute-def:distinguishedName': 'distinguishedName',
|
||||
'urn:mace:dir:attribute-def:dmdName': 'dmdName',
|
||||
'urn:mace:dir:attribute-def:dnQualifier': 'dnQualifier',
|
||||
'urn:mace:dir:attribute-def:documentAuthor': 'documentAuthor',
|
||||
'urn:mace:dir:attribute-def:documentIdentifier': 'documentIdentifier',
|
||||
'urn:mace:dir:attribute-def:documentLocation': 'documentLocation',
|
||||
'urn:mace:dir:attribute-def:documentPublisher': 'documentPublisher',
|
||||
'urn:mace:dir:attribute-def:documentTitle': 'documentTitle',
|
||||
'urn:mace:dir:attribute-def:documentVersion': 'documentVersion',
|
||||
'urn:mace:dir:attribute-def:domainComponent': 'domainComponent',
|
||||
'urn:mace:dir:attribute-def:drink': 'drink',
|
||||
'urn:mace:dir:attribute-def:eduOrgHomePageURI': 'eduOrgHomePageURI',
|
||||
'urn:mace:dir:attribute-def:eduOrgIdentityAuthNPolicyURI': 'eduOrgIdentityAuthNPolicyURI',
|
||||
'urn:mace:dir:attribute-def:eduOrgLegalName': 'eduOrgLegalName',
|
||||
'urn:mace:dir:attribute-def:eduOrgSuperiorURI': 'eduOrgSuperiorURI',
|
||||
'urn:mace:dir:attribute-def:eduOrgWhitePagesURI': 'eduOrgWhitePagesURI',
|
||||
'urn:mace:dir:attribute-def:eduPersonAffiliation': 'eduPersonAffiliation',
|
||||
'urn:mace:dir:attribute-def:eduPersonEntitlement': 'eduPersonEntitlement',
|
||||
'urn:mace:dir:attribute-def:eduPersonNickname': 'eduPersonNickname',
|
||||
'urn:mace:dir:attribute-def:eduPersonOrgDN': 'eduPersonOrgDN',
|
||||
'urn:mace:dir:attribute-def:eduPersonOrgUnitDN': 'eduPersonOrgUnitDN',
|
||||
'urn:mace:dir:attribute-def:eduPersonPrimaryAffiliation': 'eduPersonPrimaryAffiliation',
|
||||
'urn:mace:dir:attribute-def:eduPersonPrimaryOrgUnitDN': 'eduPersonPrimaryOrgUnitDN',
|
||||
'urn:mace:dir:attribute-def:eduPersonPrincipalName': 'eduPersonPrincipalName',
|
||||
'urn:mace:dir:attribute-def:eduPersonScopedAffiliation': 'eduPersonScopedAffiliation',
|
||||
'urn:mace:dir:attribute-def:eduPersonTargetedID': 'eduPersonTargetedID',
|
||||
'urn:mace:dir:attribute-def:email': 'email',
|
||||
'urn:mace:dir:attribute-def:emailAddress': 'emailAddress',
|
||||
'urn:mace:dir:attribute-def:employeeNumber': 'employeeNumber',
|
||||
'urn:mace:dir:attribute-def:employeeType': 'employeeType',
|
||||
'urn:mace:dir:attribute-def:enhancedSearchGuide': 'enhancedSearchGuide',
|
||||
'urn:mace:dir:attribute-def:facsimileTelephoneNumber': 'facsimileTelephoneNumber',
|
||||
'urn:mace:dir:attribute-def:favouriteDrink': 'favouriteDrink',
|
||||
'urn:mace:dir:attribute-def:fax': 'fax',
|
||||
'urn:mace:dir:attribute-def:federationFeideSchemaVersion': 'federationFeideSchemaVersion',
|
||||
'urn:mace:dir:attribute-def:friendlyCountryName': 'friendlyCountryName',
|
||||
'urn:mace:dir:attribute-def:generationQualifier': 'generationQualifier',
|
||||
'urn:mace:dir:attribute-def:givenName': 'givenName',
|
||||
'urn:mace:dir:attribute-def:gn': 'gn',
|
||||
'urn:mace:dir:attribute-def:homePhone': 'homePhone',
|
||||
'urn:mace:dir:attribute-def:homePostalAddress': 'homePostalAddress',
|
||||
'urn:mace:dir:attribute-def:homeTelephoneNumber': 'homeTelephoneNumber',
|
||||
'urn:mace:dir:attribute-def:host': 'host',
|
||||
'urn:mace:dir:attribute-def:houseIdentifier': 'houseIdentifier',
|
||||
'urn:mace:dir:attribute-def:info': 'info',
|
||||
'urn:mace:dir:attribute-def:initials': 'initials',
|
||||
'urn:mace:dir:attribute-def:internationaliSDNNumber': 'internationaliSDNNumber',
|
||||
'urn:mace:dir:attribute-def:janetMailbox': 'janetMailbox',
|
||||
'urn:mace:dir:attribute-def:jpegPhoto': 'jpegPhoto',
|
||||
'urn:mace:dir:attribute-def:knowledgeInformation': 'knowledgeInformation',
|
||||
'urn:mace:dir:attribute-def:l': 'l',
|
||||
'urn:mace:dir:attribute-def:labeledURI': 'labeledURI',
|
||||
'urn:mace:dir:attribute-def:localityName': 'localityName',
|
||||
'urn:mace:dir:attribute-def:mDRecord': 'mDRecord',
|
||||
'urn:mace:dir:attribute-def:mXRecord': 'mXRecord',
|
||||
'urn:mace:dir:attribute-def:mail': 'mail',
|
||||
'urn:mace:dir:attribute-def:mailPreferenceOption': 'mailPreferenceOption',
|
||||
'urn:mace:dir:attribute-def:manager': 'manager',
|
||||
'urn:mace:dir:attribute-def:member': 'member',
|
||||
'urn:mace:dir:attribute-def:mobile': 'mobile',
|
||||
'urn:mace:dir:attribute-def:mobileTelephoneNumber': 'mobileTelephoneNumber',
|
||||
'urn:mace:dir:attribute-def:nSRecord': 'nSRecord',
|
||||
'urn:mace:dir:attribute-def:name': 'name',
|
||||
'urn:mace:dir:attribute-def:norEduOrgAcronym': 'norEduOrgAcronym',
|
||||
'urn:mace:dir:attribute-def:norEduOrgNIN': 'norEduOrgNIN',
|
||||
'urn:mace:dir:attribute-def:norEduOrgSchemaVersion': 'norEduOrgSchemaVersion',
|
||||
'urn:mace:dir:attribute-def:norEduOrgUniqueIdentifier': 'norEduOrgUniqueIdentifier',
|
||||
'urn:mace:dir:attribute-def:norEduOrgUniqueNumber': 'norEduOrgUniqueNumber',
|
||||
'urn:mace:dir:attribute-def:norEduOrgUnitUniqueIdentifier': 'norEduOrgUnitUniqueIdentifier',
|
||||
'urn:mace:dir:attribute-def:norEduOrgUnitUniqueNumber': 'norEduOrgUnitUniqueNumber',
|
||||
'urn:mace:dir:attribute-def:norEduPersonBirthDate': 'norEduPersonBirthDate',
|
||||
'urn:mace:dir:attribute-def:norEduPersonLIN': 'norEduPersonLIN',
|
||||
'urn:mace:dir:attribute-def:norEduPersonNIN': 'norEduPersonNIN',
|
||||
'urn:mace:dir:attribute-def:o': 'o',
|
||||
'urn:mace:dir:attribute-def:objectClass': 'objectClass',
|
||||
'urn:mace:dir:attribute-def:organizationName': 'organizationName',
|
||||
'urn:mace:dir:attribute-def:organizationalStatus': 'organizationalStatus',
|
||||
'urn:mace:dir:attribute-def:organizationalUnitName': 'organizationalUnitName',
|
||||
'urn:mace:dir:attribute-def:otherMailbox': 'otherMailbox',
|
||||
'urn:mace:dir:attribute-def:ou': 'ou',
|
||||
'urn:mace:dir:attribute-def:owner': 'owner',
|
||||
'urn:mace:dir:attribute-def:pager': 'pager',
|
||||
'urn:mace:dir:attribute-def:pagerTelephoneNumber': 'pagerTelephoneNumber',
|
||||
'urn:mace:dir:attribute-def:personalSignature': 'personalSignature',
|
||||
'urn:mace:dir:attribute-def:personalTitle': 'personalTitle',
|
||||
'urn:mace:dir:attribute-def:photo': 'photo',
|
||||
'urn:mace:dir:attribute-def:physicalDeliveryOfficeName': 'physicalDeliveryOfficeName',
|
||||
'urn:mace:dir:attribute-def:pkcs9email': 'pkcs9email',
|
||||
'urn:mace:dir:attribute-def:postOfficeBox': 'postOfficeBox',
|
||||
'urn:mace:dir:attribute-def:postalAddress': 'postalAddress',
|
||||
'urn:mace:dir:attribute-def:postalCode': 'postalCode',
|
||||
'urn:mace:dir:attribute-def:preferredDeliveryMethod': 'preferredDeliveryMethod',
|
||||
'urn:mace:dir:attribute-def:preferredLanguage': 'preferredLanguage',
|
||||
'urn:mace:dir:attribute-def:presentationAddress': 'presentationAddress',
|
||||
'urn:mace:dir:attribute-def:protocolInformation': 'protocolInformation',
|
||||
'urn:mace:dir:attribute-def:pseudonym': 'pseudonym',
|
||||
'urn:mace:dir:attribute-def:registeredAddress': 'registeredAddress',
|
||||
'urn:mace:dir:attribute-def:rfc822Mailbox': 'rfc822Mailbox',
|
||||
'urn:mace:dir:attribute-def:roleOccupant': 'roleOccupant',
|
||||
'urn:mace:dir:attribute-def:roomNumber': 'roomNumber',
|
||||
'urn:mace:dir:attribute-def:sOARecord': 'sOARecord',
|
||||
'urn:mace:dir:attribute-def:searchGuide': 'searchGuide',
|
||||
'urn:mace:dir:attribute-def:secretary': 'secretary',
|
||||
'urn:mace:dir:attribute-def:seeAlso': 'seeAlso',
|
||||
'urn:mace:dir:attribute-def:serialNumber': 'serialNumber',
|
||||
'urn:mace:dir:attribute-def:singleLevelQuality': 'singleLevelQuality',
|
||||
'urn:mace:dir:attribute-def:sn': 'sn',
|
||||
'urn:mace:dir:attribute-def:st': 'st',
|
||||
'urn:mace:dir:attribute-def:stateOrProvinceName': 'stateOrProvinceName',
|
||||
'urn:mace:dir:attribute-def:street': 'street',
|
||||
'urn:mace:dir:attribute-def:streetAddress': 'streetAddress',
|
||||
'urn:mace:dir:attribute-def:subtreeMaximumQuality': 'subtreeMaximumQuality',
|
||||
'urn:mace:dir:attribute-def:subtreeMinimumQuality': 'subtreeMinimumQuality',
|
||||
'urn:mace:dir:attribute-def:supportedAlgorithms': 'supportedAlgorithms',
|
||||
'urn:mace:dir:attribute-def:supportedApplicationContext': 'supportedApplicationContext',
|
||||
'urn:mace:dir:attribute-def:surname': 'surname',
|
||||
'urn:mace:dir:attribute-def:telephoneNumber': 'telephoneNumber',
|
||||
'urn:mace:dir:attribute-def:teletexTerminalIdentifier': 'teletexTerminalIdentifier',
|
||||
'urn:mace:dir:attribute-def:telexNumber': 'telexNumber',
|
||||
'urn:mace:dir:attribute-def:textEncodedORAddress': 'textEncodedORAddress',
|
||||
'urn:mace:dir:attribute-def:title': 'title',
|
||||
'urn:mace:dir:attribute-def:uid': 'uid',
|
||||
'urn:mace:dir:attribute-def:uniqueIdentifier': 'uniqueIdentifier',
|
||||
'urn:mace:dir:attribute-def:uniqueMember': 'uniqueMember',
|
||||
'urn:mace:dir:attribute-def:userCertificate': 'userCertificate',
|
||||
'urn:mace:dir:attribute-def:userClass': 'userClass',
|
||||
'urn:mace:dir:attribute-def:userPKCS12': 'userPKCS12',
|
||||
'urn:mace:dir:attribute-def:userPassword': 'userPassword',
|
||||
'urn:mace:dir:attribute-def:userSMIMECertificate': 'userSMIMECertificate',
|
||||
'urn:mace:dir:attribute-def:userid': 'userid',
|
||||
'urn:mace:dir:attribute-def:x121Address': 'x121Address',
|
||||
'urn:mace:dir:attribute-def:x500UniqueIdentifier': 'x500UniqueIdentifier',
|
||||
},
|
||||
"to": {
|
||||
'aRecord': 'urn:mace:dir:attribute-def:aRecord',
|
||||
'aliasedEntryName': 'urn:mace:dir:attribute-def:aliasedEntryName',
|
||||
'aliasedObjectName': 'urn:mace:dir:attribute-def:aliasedObjectName',
|
||||
'associatedDomain': 'urn:mace:dir:attribute-def:associatedDomain',
|
||||
'associatedName': 'urn:mace:dir:attribute-def:associatedName',
|
||||
'audio': 'urn:mace:dir:attribute-def:audio',
|
||||
'authorityRevocationList': 'urn:mace:dir:attribute-def:authorityRevocationList',
|
||||
'buildingName': 'urn:mace:dir:attribute-def:buildingName',
|
||||
'businessCategory': 'urn:mace:dir:attribute-def:businessCategory',
|
||||
'c': 'urn:mace:dir:attribute-def:c',
|
||||
'cACertificate': 'urn:mace:dir:attribute-def:cACertificate',
|
||||
'cNAMERecord': 'urn:mace:dir:attribute-def:cNAMERecord',
|
||||
'carLicense': 'urn:mace:dir:attribute-def:carLicense',
|
||||
'certificateRevocationList': 'urn:mace:dir:attribute-def:certificateRevocationList',
|
||||
'cn': 'urn:mace:dir:attribute-def:cn',
|
||||
'co': 'urn:mace:dir:attribute-def:co',
|
||||
'commonName': 'urn:mace:dir:attribute-def:commonName',
|
||||
'countryName': 'urn:mace:dir:attribute-def:countryName',
|
||||
'crossCertificatePair': 'urn:mace:dir:attribute-def:crossCertificatePair',
|
||||
'dITRedirect': 'urn:mace:dir:attribute-def:dITRedirect',
|
||||
'dSAQuality': 'urn:mace:dir:attribute-def:dSAQuality',
|
||||
'dc': 'urn:mace:dir:attribute-def:dc',
|
||||
'deltaRevocationList': 'urn:mace:dir:attribute-def:deltaRevocationList',
|
||||
'departmentNumber': 'urn:mace:dir:attribute-def:departmentNumber',
|
||||
'description': 'urn:mace:dir:attribute-def:description',
|
||||
'destinationIndicator': 'urn:mace:dir:attribute-def:destinationIndicator',
|
||||
'displayName': 'urn:mace:dir:attribute-def:displayName',
|
||||
'distinguishedName': 'urn:mace:dir:attribute-def:distinguishedName',
|
||||
'dmdName': 'urn:mace:dir:attribute-def:dmdName',
|
||||
'dnQualifier': 'urn:mace:dir:attribute-def:dnQualifier',
|
||||
'documentAuthor': 'urn:mace:dir:attribute-def:documentAuthor',
|
||||
'documentIdentifier': 'urn:mace:dir:attribute-def:documentIdentifier',
|
||||
'documentLocation': 'urn:mace:dir:attribute-def:documentLocation',
|
||||
'documentPublisher': 'urn:mace:dir:attribute-def:documentPublisher',
|
||||
'documentTitle': 'urn:mace:dir:attribute-def:documentTitle',
|
||||
'documentVersion': 'urn:mace:dir:attribute-def:documentVersion',
|
||||
'domainComponent': 'urn:mace:dir:attribute-def:domainComponent',
|
||||
'drink': 'urn:mace:dir:attribute-def:drink',
|
||||
'eduOrgHomePageURI': 'urn:mace:dir:attribute-def:eduOrgHomePageURI',
|
||||
'eduOrgIdentityAuthNPolicyURI': 'urn:mace:dir:attribute-def:eduOrgIdentityAuthNPolicyURI',
|
||||
'eduOrgLegalName': 'urn:mace:dir:attribute-def:eduOrgLegalName',
|
||||
'eduOrgSuperiorURI': 'urn:mace:dir:attribute-def:eduOrgSuperiorURI',
|
||||
'eduOrgWhitePagesURI': 'urn:mace:dir:attribute-def:eduOrgWhitePagesURI',
|
||||
'eduPersonAffiliation': 'urn:mace:dir:attribute-def:eduPersonAffiliation',
|
||||
'eduPersonEntitlement': 'urn:mace:dir:attribute-def:eduPersonEntitlement',
|
||||
'eduPersonNickname': 'urn:mace:dir:attribute-def:eduPersonNickname',
|
||||
'eduPersonOrgDN': 'urn:mace:dir:attribute-def:eduPersonOrgDN',
|
||||
'eduPersonOrgUnitDN': 'urn:mace:dir:attribute-def:eduPersonOrgUnitDN',
|
||||
'eduPersonPrimaryAffiliation': 'urn:mace:dir:attribute-def:eduPersonPrimaryAffiliation',
|
||||
'eduPersonPrimaryOrgUnitDN': 'urn:mace:dir:attribute-def:eduPersonPrimaryOrgUnitDN',
|
||||
'eduPersonPrincipalName': 'urn:mace:dir:attribute-def:eduPersonPrincipalName',
|
||||
'eduPersonScopedAffiliation': 'urn:mace:dir:attribute-def:eduPersonScopedAffiliation',
|
||||
'eduPersonTargetedID': 'urn:mace:dir:attribute-def:eduPersonTargetedID',
|
||||
'email': 'urn:mace:dir:attribute-def:email',
|
||||
'emailAddress': 'urn:mace:dir:attribute-def:emailAddress',
|
||||
'employeeNumber': 'urn:mace:dir:attribute-def:employeeNumber',
|
||||
'employeeType': 'urn:mace:dir:attribute-def:employeeType',
|
||||
'enhancedSearchGuide': 'urn:mace:dir:attribute-def:enhancedSearchGuide',
|
||||
'facsimileTelephoneNumber': 'urn:mace:dir:attribute-def:facsimileTelephoneNumber',
|
||||
'favouriteDrink': 'urn:mace:dir:attribute-def:favouriteDrink',
|
||||
'fax': 'urn:mace:dir:attribute-def:fax',
|
||||
'federationFeideSchemaVersion': 'urn:mace:dir:attribute-def:federationFeideSchemaVersion',
|
||||
'friendlyCountryName': 'urn:mace:dir:attribute-def:friendlyCountryName',
|
||||
'generationQualifier': 'urn:mace:dir:attribute-def:generationQualifier',
|
||||
'givenName': 'urn:mace:dir:attribute-def:givenName',
|
||||
'gn': 'urn:mace:dir:attribute-def:gn',
|
||||
'homePhone': 'urn:mace:dir:attribute-def:homePhone',
|
||||
'homePostalAddress': 'urn:mace:dir:attribute-def:homePostalAddress',
|
||||
'homeTelephoneNumber': 'urn:mace:dir:attribute-def:homeTelephoneNumber',
|
||||
'host': 'urn:mace:dir:attribute-def:host',
|
||||
'houseIdentifier': 'urn:mace:dir:attribute-def:houseIdentifier',
|
||||
'info': 'urn:mace:dir:attribute-def:info',
|
||||
'initials': 'urn:mace:dir:attribute-def:initials',
|
||||
'internationaliSDNNumber': 'urn:mace:dir:attribute-def:internationaliSDNNumber',
|
||||
'janetMailbox': 'urn:mace:dir:attribute-def:janetMailbox',
|
||||
'jpegPhoto': 'urn:mace:dir:attribute-def:jpegPhoto',
|
||||
'knowledgeInformation': 'urn:mace:dir:attribute-def:knowledgeInformation',
|
||||
'l': 'urn:mace:dir:attribute-def:l',
|
||||
'labeledURI': 'urn:mace:dir:attribute-def:labeledURI',
|
||||
'localityName': 'urn:mace:dir:attribute-def:localityName',
|
||||
'mDRecord': 'urn:mace:dir:attribute-def:mDRecord',
|
||||
'mXRecord': 'urn:mace:dir:attribute-def:mXRecord',
|
||||
'mail': 'urn:mace:dir:attribute-def:mail',
|
||||
'mailPreferenceOption': 'urn:mace:dir:attribute-def:mailPreferenceOption',
|
||||
'manager': 'urn:mace:dir:attribute-def:manager',
|
||||
'member': 'urn:mace:dir:attribute-def:member',
|
||||
'mobile': 'urn:mace:dir:attribute-def:mobile',
|
||||
'mobileTelephoneNumber': 'urn:mace:dir:attribute-def:mobileTelephoneNumber',
|
||||
'nSRecord': 'urn:mace:dir:attribute-def:nSRecord',
|
||||
'name': 'urn:mace:dir:attribute-def:name',
|
||||
'norEduOrgAcronym': 'urn:mace:dir:attribute-def:norEduOrgAcronym',
|
||||
'norEduOrgNIN': 'urn:mace:dir:attribute-def:norEduOrgNIN',
|
||||
'norEduOrgSchemaVersion': 'urn:mace:dir:attribute-def:norEduOrgSchemaVersion',
|
||||
'norEduOrgUniqueIdentifier': 'urn:mace:dir:attribute-def:norEduOrgUniqueIdentifier',
|
||||
'norEduOrgUniqueNumber': 'urn:mace:dir:attribute-def:norEduOrgUniqueNumber',
|
||||
'norEduOrgUnitUniqueIdentifier': 'urn:mace:dir:attribute-def:norEduOrgUnitUniqueIdentifier',
|
||||
'norEduOrgUnitUniqueNumber': 'urn:mace:dir:attribute-def:norEduOrgUnitUniqueNumber',
|
||||
'norEduPersonBirthDate': 'urn:mace:dir:attribute-def:norEduPersonBirthDate',
|
||||
'norEduPersonLIN': 'urn:mace:dir:attribute-def:norEduPersonLIN',
|
||||
'norEduPersonNIN': 'urn:mace:dir:attribute-def:norEduPersonNIN',
|
||||
'o': 'urn:mace:dir:attribute-def:o',
|
||||
'objectClass': 'urn:mace:dir:attribute-def:objectClass',
|
||||
'organizationName': 'urn:mace:dir:attribute-def:organizationName',
|
||||
'organizationalStatus': 'urn:mace:dir:attribute-def:organizationalStatus',
|
||||
'organizationalUnitName': 'urn:mace:dir:attribute-def:organizationalUnitName',
|
||||
'otherMailbox': 'urn:mace:dir:attribute-def:otherMailbox',
|
||||
'ou': 'urn:mace:dir:attribute-def:ou',
|
||||
'owner': 'urn:mace:dir:attribute-def:owner',
|
||||
'pager': 'urn:mace:dir:attribute-def:pager',
|
||||
'pagerTelephoneNumber': 'urn:mace:dir:attribute-def:pagerTelephoneNumber',
|
||||
'personalSignature': 'urn:mace:dir:attribute-def:personalSignature',
|
||||
'personalTitle': 'urn:mace:dir:attribute-def:personalTitle',
|
||||
'photo': 'urn:mace:dir:attribute-def:photo',
|
||||
'physicalDeliveryOfficeName': 'urn:mace:dir:attribute-def:physicalDeliveryOfficeName',
|
||||
'pkcs9email': 'urn:mace:dir:attribute-def:pkcs9email',
|
||||
'postOfficeBox': 'urn:mace:dir:attribute-def:postOfficeBox',
|
||||
'postalAddress': 'urn:mace:dir:attribute-def:postalAddress',
|
||||
'postalCode': 'urn:mace:dir:attribute-def:postalCode',
|
||||
'preferredDeliveryMethod': 'urn:mace:dir:attribute-def:preferredDeliveryMethod',
|
||||
'preferredLanguage': 'urn:mace:dir:attribute-def:preferredLanguage',
|
||||
'presentationAddress': 'urn:mace:dir:attribute-def:presentationAddress',
|
||||
'protocolInformation': 'urn:mace:dir:attribute-def:protocolInformation',
|
||||
'pseudonym': 'urn:mace:dir:attribute-def:pseudonym',
|
||||
'registeredAddress': 'urn:mace:dir:attribute-def:registeredAddress',
|
||||
'rfc822Mailbox': 'urn:mace:dir:attribute-def:rfc822Mailbox',
|
||||
'roleOccupant': 'urn:mace:dir:attribute-def:roleOccupant',
|
||||
'roomNumber': 'urn:mace:dir:attribute-def:roomNumber',
|
||||
'sOARecord': 'urn:mace:dir:attribute-def:sOARecord',
|
||||
'searchGuide': 'urn:mace:dir:attribute-def:searchGuide',
|
||||
'secretary': 'urn:mace:dir:attribute-def:secretary',
|
||||
'seeAlso': 'urn:mace:dir:attribute-def:seeAlso',
|
||||
'serialNumber': 'urn:mace:dir:attribute-def:serialNumber',
|
||||
'singleLevelQuality': 'urn:mace:dir:attribute-def:singleLevelQuality',
|
||||
'sn': 'urn:mace:dir:attribute-def:sn',
|
||||
'st': 'urn:mace:dir:attribute-def:st',
|
||||
'stateOrProvinceName': 'urn:mace:dir:attribute-def:stateOrProvinceName',
|
||||
'street': 'urn:mace:dir:attribute-def:street',
|
||||
'streetAddress': 'urn:mace:dir:attribute-def:streetAddress',
|
||||
'subtreeMaximumQuality': 'urn:mace:dir:attribute-def:subtreeMaximumQuality',
|
||||
'subtreeMinimumQuality': 'urn:mace:dir:attribute-def:subtreeMinimumQuality',
|
||||
'supportedAlgorithms': 'urn:mace:dir:attribute-def:supportedAlgorithms',
|
||||
'supportedApplicationContext': 'urn:mace:dir:attribute-def:supportedApplicationContext',
|
||||
'surname': 'urn:mace:dir:attribute-def:surname',
|
||||
'telephoneNumber': 'urn:mace:dir:attribute-def:telephoneNumber',
|
||||
'teletexTerminalIdentifier': 'urn:mace:dir:attribute-def:teletexTerminalIdentifier',
|
||||
'telexNumber': 'urn:mace:dir:attribute-def:telexNumber',
|
||||
'textEncodedORAddress': 'urn:mace:dir:attribute-def:textEncodedORAddress',
|
||||
'title': 'urn:mace:dir:attribute-def:title',
|
||||
'uid': 'urn:mace:dir:attribute-def:uid',
|
||||
'uniqueIdentifier': 'urn:mace:dir:attribute-def:uniqueIdentifier',
|
||||
'uniqueMember': 'urn:mace:dir:attribute-def:uniqueMember',
|
||||
'userCertificate': 'urn:mace:dir:attribute-def:userCertificate',
|
||||
'userClass': 'urn:mace:dir:attribute-def:userClass',
|
||||
'userPKCS12': 'urn:mace:dir:attribute-def:userPKCS12',
|
||||
'userPassword': 'urn:mace:dir:attribute-def:userPassword',
|
||||
'userSMIMECertificate': 'urn:mace:dir:attribute-def:userSMIMECertificate',
|
||||
'userid': 'urn:mace:dir:attribute-def:userid',
|
||||
'x121Address': 'urn:mace:dir:attribute-def:x121Address',
|
||||
'x500UniqueIdentifier': 'urn:mace:dir:attribute-def:x500UniqueIdentifier',
|
||||
}
|
||||
}
|
||||
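# Illustrative usage, not part of the shipped map: "fro" turns the
# urn:mace:dir names received in an assertion into local LDAP-style names,
# and "to" goes the other way; unknown names are passed through unchanged.
if __name__ == "__main__":
    def to_local(name):
        return MAP["fro"].get(name, name)

    def to_saml(name):
        return MAP["to"].get(name, name)

    print to_local('urn:mace:dir:attribute-def:givenName')  # givenName
    print to_saml('sn')  # urn:mace:dir:attribute-def:sn
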
199
example/idp/attributemaps/saml_uri.py
Normal file
199
example/idp/attributemaps/saml_uri.py
Normal file
@@ -0,0 +1,199 @@
|
||||
__author__ = 'rolandh'
|
||||
|
||||
EDUPERSON_OID = "urn:oid:1.3.6.1.4.1.5923.1.1.1."
|
||||
X500ATTR_OID = "urn:oid:2.5.4."
|
||||
NOREDUPERSON_OID = "urn:oid:1.3.6.1.4.1.2428.90.1."
|
||||
NETSCAPE_LDAP = "urn:oid:2.16.840.1.113730.3.1."
|
||||
UCL_DIR_PILOT = 'urn:oid:0.9.2342.19200300.100.1.'
|
||||
PKCS_9 = "urn:oid:1.2.840.113549.1.9.1."
|
||||
UMICH = "urn:oid:1.3.6.1.4.1.250.1.57."
|
||||
|
||||
MAP = {
|
||||
"identifier": "urn:oasis:names:tc:SAML:2.0:attrname-format:uri",
|
||||
"fro": {
|
||||
EDUPERSON_OID+'2': 'eduPersonNickname',
|
||||
EDUPERSON_OID+'9': 'eduPersonScopedAffiliation',
|
||||
EDUPERSON_OID+'11': 'eduPersonAssurance',
|
||||
EDUPERSON_OID+'10': 'eduPersonTargetedID',
|
||||
EDUPERSON_OID+'4': 'eduPersonOrgUnitDN',
|
||||
NOREDUPERSON_OID+'6': 'norEduOrgAcronym',
|
||||
NOREDUPERSON_OID+'7': 'norEduOrgUniqueIdentifier',
|
||||
NOREDUPERSON_OID+'4': 'norEduPersonLIN',
|
||||
EDUPERSON_OID+'1': 'eduPersonAffiliation',
|
||||
NOREDUPERSON_OID+'2': 'norEduOrgUnitUniqueNumber',
|
||||
NETSCAPE_LDAP+'40': 'userSMIMECertificate',
|
||||
NOREDUPERSON_OID+'1': 'norEduOrgUniqueNumber',
|
||||
NETSCAPE_LDAP+'241': 'displayName',
|
||||
UCL_DIR_PILOT+'37': 'associatedDomain',
|
||||
EDUPERSON_OID+'6': 'eduPersonPrincipalName',
|
||||
NOREDUPERSON_OID+'8': 'norEduOrgUnitUniqueIdentifier',
|
||||
NOREDUPERSON_OID+'9': 'federationFeideSchemaVersion',
|
||||
X500ATTR_OID+'53': 'deltaRevocationList',
|
||||
X500ATTR_OID+'52': 'supportedAlgorithms',
|
||||
X500ATTR_OID+'51': 'houseIdentifier',
|
||||
X500ATTR_OID+'50': 'uniqueMember',
|
||||
X500ATTR_OID+'19': 'physicalDeliveryOfficeName',
|
||||
X500ATTR_OID+'18': 'postOfficeBox',
|
||||
X500ATTR_OID+'17': 'postalCode',
|
||||
X500ATTR_OID+'16': 'postalAddress',
|
||||
X500ATTR_OID+'15': 'businessCategory',
|
||||
X500ATTR_OID+'14': 'searchGuide',
|
||||
EDUPERSON_OID+'5': 'eduPersonPrimaryAffiliation',
|
||||
X500ATTR_OID+'12': 'title',
|
||||
X500ATTR_OID+'11': 'ou',
|
||||
X500ATTR_OID+'10': 'o',
|
||||
X500ATTR_OID+'37': 'cACertificate',
|
||||
X500ATTR_OID+'36': 'userCertificate',
|
||||
X500ATTR_OID+'31': 'member',
|
||||
X500ATTR_OID+'30': 'supportedApplicationContext',
|
||||
X500ATTR_OID+'33': 'roleOccupant',
|
||||
X500ATTR_OID+'32': 'owner',
|
||||
NETSCAPE_LDAP+'1': 'carLicense',
|
||||
PKCS_9+'1': 'email',
|
||||
NETSCAPE_LDAP+'3': 'employeeNumber',
|
||||
NETSCAPE_LDAP+'2': 'departmentNumber',
|
||||
X500ATTR_OID+'39': 'certificateRevocationList',
|
||||
X500ATTR_OID+'38': 'authorityRevocationList',
|
||||
NETSCAPE_LDAP+'216': 'userPKCS12',
|
||||
EDUPERSON_OID+'8': 'eduPersonPrimaryOrgUnitDN',
|
||||
X500ATTR_OID+'9': 'street',
|
||||
X500ATTR_OID+'8': 'st',
|
||||
NETSCAPE_LDAP+'39': 'preferredLanguage',
|
||||
EDUPERSON_OID+'7': 'eduPersonEntitlement',
|
||||
X500ATTR_OID+'2': 'knowledgeInformation',
|
||||
X500ATTR_OID+'7': 'l',
|
||||
X500ATTR_OID+'6': 'c',
|
||||
X500ATTR_OID+'5': 'serialNumber',
|
||||
X500ATTR_OID+'4': 'sn',
|
||||
UCL_DIR_PILOT+'60': 'jpegPhoto',
|
||||
X500ATTR_OID+'65': 'pseudonym',
|
||||
NOREDUPERSON_OID+'5': 'norEduPersonNIN',
|
||||
UCL_DIR_PILOT+'3': 'mail',
|
||||
UCL_DIR_PILOT+'25': 'dc',
|
||||
X500ATTR_OID+'40': 'crossCertificatePair',
|
||||
X500ATTR_OID+'42': 'givenName',
|
||||
X500ATTR_OID+'43': 'initials',
|
||||
X500ATTR_OID+'44': 'generationQualifier',
|
||||
X500ATTR_OID+'45': 'x500UniqueIdentifier',
|
||||
X500ATTR_OID+'46': 'dnQualifier',
|
||||
X500ATTR_OID+'47': 'enhancedSearchGuide',
|
||||
X500ATTR_OID+'48': 'protocolInformation',
|
||||
X500ATTR_OID+'54': 'dmdName',
|
||||
NETSCAPE_LDAP+'4': 'employeeType',
|
||||
X500ATTR_OID+'22': 'teletexTerminalIdentifier',
|
||||
X500ATTR_OID+'23': 'facsimileTelephoneNumber',
|
||||
X500ATTR_OID+'20': 'telephoneNumber',
|
||||
X500ATTR_OID+'21': 'telexNumber',
|
||||
X500ATTR_OID+'26': 'registeredAddress',
|
||||
X500ATTR_OID+'27': 'destinationIndicator',
|
||||
X500ATTR_OID+'24': 'x121Address',
|
||||
X500ATTR_OID+'25': 'internationaliSDNNumber',
|
||||
X500ATTR_OID+'28': 'preferredDeliveryMethod',
|
||||
X500ATTR_OID+'29': 'presentationAddress',
|
||||
EDUPERSON_OID+'3': 'eduPersonOrgDN',
|
||||
NOREDUPERSON_OID+'3': 'norEduPersonBirthDate',
|
||||
UMICH+'57': 'labeledURI',
|
||||
UCL_DIR_PILOT+'1': 'uid',
|
||||
},
|
||||
"to": {
|
||||
'roleOccupant': X500ATTR_OID+'33',
|
||||
'gn': X500ATTR_OID+'42',
|
||||
'norEduPersonNIN': NOREDUPERSON_OID+'5',
|
||||
'title': X500ATTR_OID+'12',
|
||||
'facsimileTelephoneNumber': X500ATTR_OID+'23',
|
||||
'mail': UCL_DIR_PILOT+'3',
|
||||
'postOfficeBox': X500ATTR_OID+'18',
|
||||
'fax': X500ATTR_OID+'23',
|
||||
'telephoneNumber': X500ATTR_OID+'20',
|
||||
'norEduPersonBirthDate': NOREDUPERSON_OID+'3',
|
||||
'rfc822Mailbox': UCL_DIR_PILOT+'3',
|
||||
'dc': UCL_DIR_PILOT+'25',
|
||||
'countryName': X500ATTR_OID+'6',
|
||||
'emailAddress': PKCS_9+'1',
|
||||
'employeeNumber': NETSCAPE_LDAP+'3',
|
||||
'organizationName': X500ATTR_OID+'10',
|
||||
'eduPersonAssurance': EDUPERSON_OID+'11',
|
||||
'norEduOrgAcronym': NOREDUPERSON_OID+'6',
|
||||
'registeredAddress': X500ATTR_OID+'26',
|
||||
'physicalDeliveryOfficeName': X500ATTR_OID+'19',
|
||||
'associatedDomain': UCL_DIR_PILOT+'37',
|
||||
'l': X500ATTR_OID+'7',
|
||||
'stateOrProvinceName': X500ATTR_OID+'8',
|
||||
'federationFeideSchemaVersion': NOREDUPERSON_OID+'9',
|
||||
'pkcs9email': PKCS_9+'1',
|
||||
'givenName': X500ATTR_OID+'42',
|
||||
'givenname': X500ATTR_OID+'42',
|
||||
'x500UniqueIdentifier': X500ATTR_OID+'45',
|
||||
'eduPersonNickname': EDUPERSON_OID+'2',
|
||||
'houseIdentifier': X500ATTR_OID+'51',
|
||||
'street': X500ATTR_OID+'9',
|
||||
'supportedAlgorithms': X500ATTR_OID+'52',
|
||||
'preferredLanguage': NETSCAPE_LDAP+'39',
|
||||
'postalAddress': X500ATTR_OID+'16',
|
||||
'email': PKCS_9+'1',
|
||||
'norEduOrgUnitUniqueIdentifier': NOREDUPERSON_OID+'8',
|
||||
'eduPersonPrimaryOrgUnitDN': EDUPERSON_OID+'8',
|
||||
'c': X500ATTR_OID+'6',
|
||||
'teletexTerminalIdentifier': X500ATTR_OID+'22',
|
||||
'o': X500ATTR_OID+'10',
|
||||
'cACertificate': X500ATTR_OID+'37',
|
||||
'telexNumber': X500ATTR_OID+'21',
|
||||
'ou': X500ATTR_OID+'11',
|
||||
'initials': X500ATTR_OID+'43',
|
||||
'eduPersonOrgUnitDN': EDUPERSON_OID+'4',
|
||||
'deltaRevocationList': X500ATTR_OID+'53',
|
||||
'norEduPersonLIN': NOREDUPERSON_OID+'4',
|
||||
'supportedApplicationContext': X500ATTR_OID+'30',
|
||||
'eduPersonEntitlement': EDUPERSON_OID+'7',
|
||||
'generationQualifier': X500ATTR_OID+'44',
|
||||
'eduPersonAffiliation': EDUPERSON_OID+'1',
|
||||
'eduPersonPrincipalName': EDUPERSON_OID+'6',
|
||||
'edupersonprincipalname': EDUPERSON_OID+'6',
|
||||
'localityName': X500ATTR_OID+'7',
|
||||
'owner': X500ATTR_OID+'32',
|
||||
'norEduOrgUnitUniqueNumber': NOREDUPERSON_OID+'2',
|
||||
'searchGuide': X500ATTR_OID+'14',
|
||||
'certificateRevocationList': X500ATTR_OID+'39',
|
||||
'organizationalUnitName': X500ATTR_OID+'11',
|
||||
'userCertificate': X500ATTR_OID+'36',
|
||||
'preferredDeliveryMethod': X500ATTR_OID+'28',
|
||||
'internationaliSDNNumber': X500ATTR_OID+'25',
|
||||
'uniqueMember': X500ATTR_OID+'50',
|
||||
'departmentNumber': NETSCAPE_LDAP+'2',
|
||||
'enhancedSearchGuide': X500ATTR_OID+'47',
|
||||
'userPKCS12': NETSCAPE_LDAP+'216',
|
||||
'eduPersonTargetedID': EDUPERSON_OID+'10',
|
||||
'norEduOrgUniqueNumber': NOREDUPERSON_OID+'1',
|
||||
'x121Address': X500ATTR_OID+'24',
|
||||
'destinationIndicator': X500ATTR_OID+'27',
|
||||
'eduPersonPrimaryAffiliation': EDUPERSON_OID+'5',
|
||||
'surname': X500ATTR_OID+'4',
|
||||
'jpegPhoto': UCL_DIR_PILOT+'60',
|
||||
'eduPersonScopedAffiliation': EDUPERSON_OID+'9',
|
||||
'edupersonscopedaffiliation': EDUPERSON_OID+'9',
|
||||
'protocolInformation': X500ATTR_OID+'48',
|
||||
'knowledgeInformation': X500ATTR_OID+'2',
|
||||
'employeeType': NETSCAPE_LDAP+'4',
|
||||
'userSMIMECertificate': NETSCAPE_LDAP+'40',
|
||||
'member': X500ATTR_OID+'31',
|
||||
'streetAddress': X500ATTR_OID+'9',
|
||||
'dmdName': X500ATTR_OID+'54',
|
||||
'postalCode': X500ATTR_OID+'17',
|
||||
'pseudonym': X500ATTR_OID+'65',
|
||||
'dnQualifier': X500ATTR_OID+'46',
|
||||
'crossCertificatePair': X500ATTR_OID+'40',
|
||||
'eduPersonOrgDN': EDUPERSON_OID+'3',
|
||||
'authorityRevocationList': X500ATTR_OID+'38',
|
||||
'displayName': NETSCAPE_LDAP+'241',
|
||||
'businessCategory': X500ATTR_OID+'15',
|
||||
'serialNumber': X500ATTR_OID+'5',
|
||||
'norEduOrgUniqueIdentifier': NOREDUPERSON_OID+'7',
|
||||
'st': X500ATTR_OID+'8',
|
||||
'carLicense': NETSCAPE_LDAP+'1',
|
||||
'presentationAddress': X500ATTR_OID+'29',
|
||||
'sn': X500ATTR_OID+'4',
|
||||
'domainComponent': UCL_DIR_PILOT+'25',
|
||||
'labeledURI': UMICH+'57',
|
||||
'uid': UCL_DIR_PILOT+'1'
|
||||
}
|
||||
}
|
||||
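# Quick self-check, illustrative only: list the local names in "to" that are
# synonyms of a different canonical name in "fro" (e.g. 'fax' and
# 'facsimileTelephoneNumber' both map to X500ATTR_OID+'23'); case-only
# aliases such as 'givenname' are skipped by the lower-case comparison.
if __name__ == "__main__":
    for local, uri in MAP["to"].items():
        canonical = MAP["fro"].get(uri)
        if canonical is not None and canonical.lower() != local.lower():
            print "%s is an alias of %s (%s)" % (local, canonical, uri)
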
190
example/idp/attributemaps/shibboleth_uri.py
Normal file
190
example/idp/attributemaps/shibboleth_uri.py
Normal file
@@ -0,0 +1,190 @@
|
||||
EDUPERSON_OID = "urn:oid:1.3.6.1.4.1.5923.1.1.1."
|
||||
X500ATTR = "urn:oid:2.5.4."
|
||||
NOREDUPERSON_OID = "urn:oid:1.3.6.1.4.1.2428.90.1."
|
||||
NETSCAPE_LDAP = "urn:oid:2.16.840.1.113730.3.1."
|
||||
UCL_DIR_PILOT = "urn:oid:0.9.2342.19200300.100.1."
|
||||
PKCS_9 = "urn:oid:1.2.840.113549.1.9."
|
||||
UMICH = "urn:oid:1.3.6.1.4.1.250.1.57."
|
||||
|
||||
MAP = {
|
||||
"identifier": "urn:mace:shibboleth:1.0:attributeNamespace:uri",
|
||||
"fro": {
|
||||
EDUPERSON_OID+'2': 'eduPersonNickname',
|
||||
EDUPERSON_OID+'9': 'eduPersonScopedAffiliation',
|
||||
EDUPERSON_OID+'11': 'eduPersonAssurance',
|
||||
EDUPERSON_OID+'10': 'eduPersonTargetedID',
|
||||
EDUPERSON_OID+'4': 'eduPersonOrgUnitDN',
|
||||
NOREDUPERSON_OID+'6': 'norEduOrgAcronym',
|
||||
NOREDUPERSON_OID+'7': 'norEduOrgUniqueIdentifier',
|
||||
NOREDUPERSON_OID+'4': 'norEduPersonLIN',
|
||||
EDUPERSON_OID+'1': 'eduPersonAffiliation',
|
||||
NOREDUPERSON_OID+'2': 'norEduOrgUnitUniqueNumber',
|
||||
NETSCAPE_LDAP+'40': 'userSMIMECertificate',
|
||||
NOREDUPERSON_OID+'1': 'norEduOrgUniqueNumber',
|
||||
NETSCAPE_LDAP+'241': 'displayName',
|
||||
UCL_DIR_PILOT+'37': 'associatedDomain',
|
||||
EDUPERSON_OID+'6': 'eduPersonPrincipalName',
|
||||
NOREDUPERSON_OID+'8': 'norEduOrgUnitUniqueIdentifier',
|
||||
NOREDUPERSON_OID+'9': 'federationFeideSchemaVersion',
|
||||
X500ATTR+'53': 'deltaRevocationList',
|
||||
X500ATTR+'52': 'supportedAlgorithms',
|
||||
X500ATTR+'51': 'houseIdentifier',
|
||||
X500ATTR+'50': 'uniqueMember',
|
||||
X500ATTR+'19': 'physicalDeliveryOfficeName',
|
||||
X500ATTR+'18': 'postOfficeBox',
|
||||
X500ATTR+'17': 'postalCode',
|
||||
X500ATTR+'16': 'postalAddress',
|
||||
X500ATTR+'15': 'businessCategory',
|
||||
X500ATTR+'14': 'searchGuide',
|
||||
EDUPERSON_OID+'5': 'eduPersonPrimaryAffiliation',
|
||||
X500ATTR+'12': 'title',
|
||||
X500ATTR+'11': 'ou',
|
||||
X500ATTR+'10': 'o',
|
||||
X500ATTR+'37': 'cACertificate',
|
||||
X500ATTR+'36': 'userCertificate',
|
||||
X500ATTR+'31': 'member',
|
||||
X500ATTR+'30': 'supportedApplicationContext',
|
||||
X500ATTR+'33': 'roleOccupant',
|
||||
X500ATTR+'32': 'owner',
|
||||
NETSCAPE_LDAP+'1': 'carLicense',
|
||||
PKCS_9+'1': 'email',
|
||||
NETSCAPE_LDAP+'3': 'employeeNumber',
|
||||
NETSCAPE_LDAP+'2': 'departmentNumber',
|
||||
X500ATTR+'39': 'certificateRevocationList',
|
||||
X500ATTR+'38': 'authorityRevocationList',
|
||||
NETSCAPE_LDAP+'216': 'userPKCS12',
|
||||
EDUPERSON_OID+'8': 'eduPersonPrimaryOrgUnitDN',
|
||||
X500ATTR+'9': 'street',
|
||||
X500ATTR+'8': 'st',
|
||||
NETSCAPE_LDAP+'39': 'preferredLanguage',
|
||||
EDUPERSON_OID+'7': 'eduPersonEntitlement',
|
||||
X500ATTR+'2': 'knowledgeInformation',
|
||||
X500ATTR+'7': 'l',
|
||||
X500ATTR+'6': 'c',
|
||||
X500ATTR+'5': 'serialNumber',
|
||||
X500ATTR+'4': 'sn',
|
||||
UCL_DIR_PILOT+'60': 'jpegPhoto',
|
||||
X500ATTR+'65': 'pseudonym',
|
||||
NOREDUPERSON_OID+'5': 'norEduPersonNIN',
|
||||
UCL_DIR_PILOT+'3': 'mail',
|
||||
UCL_DIR_PILOT+'25': 'dc',
|
||||
X500ATTR+'40': 'crossCertificatePair',
|
||||
X500ATTR+'42': 'givenName',
|
||||
X500ATTR+'43': 'initials',
|
||||
X500ATTR+'44': 'generationQualifier',
|
||||
X500ATTR+'45': 'x500UniqueIdentifier',
|
||||
X500ATTR+'46': 'dnQualifier',
|
||||
X500ATTR+'47': 'enhancedSearchGuide',
|
||||
X500ATTR+'48': 'protocolInformation',
|
||||
X500ATTR+'54': 'dmdName',
|
||||
NETSCAPE_LDAP+'4': 'employeeType',
|
||||
X500ATTR+'22': 'teletexTerminalIdentifier',
|
||||
X500ATTR+'23': 'facsimileTelephoneNumber',
|
||||
X500ATTR+'20': 'telephoneNumber',
|
||||
X500ATTR+'21': 'telexNumber',
|
||||
X500ATTR+'26': 'registeredAddress',
|
||||
X500ATTR+'27': 'destinationIndicator',
|
||||
X500ATTR+'24': 'x121Address',
|
||||
X500ATTR+'25': 'internationaliSDNNumber',
|
||||
X500ATTR+'28': 'preferredDeliveryMethod',
|
||||
X500ATTR+'29': 'presentationAddress',
|
||||
EDUPERSON_OID+'3': 'eduPersonOrgDN',
|
||||
NOREDUPERSON_OID+'3': 'norEduPersonBirthDate',
|
||||
},
|
||||
"to":{
|
||||
'roleOccupant': X500ATTR+'33',
|
||||
'gn': X500ATTR+'42',
|
||||
'norEduPersonNIN': NOREDUPERSON_OID+'5',
|
||||
'title': X500ATTR+'12',
|
||||
'facsimileTelephoneNumber': X500ATTR+'23',
|
||||
'mail': UCL_DIR_PILOT+'3',
|
||||
'postOfficeBox': X500ATTR+'18',
|
||||
'fax': X500ATTR+'23',
|
||||
'telephoneNumber': X500ATTR+'20',
|
||||
'norEduPersonBirthDate': NOREDUPERSON_OID+'3',
|
||||
'rfc822Mailbox': UCL_DIR_PILOT+'3',
|
||||
'dc': UCL_DIR_PILOT+'25',
|
||||
'countryName': X500ATTR+'6',
|
||||
'emailAddress': PKCS_9+'1',
|
||||
'employeeNumber': NETSCAPE_LDAP+'3',
|
||||
'organizationName': X500ATTR+'10',
|
||||
'eduPersonAssurance': EDUPERSON_OID+'11',
|
||||
'norEduOrgAcronym': NOREDUPERSON_OID+'6',
|
||||
'registeredAddress': X500ATTR+'26',
|
||||
'physicalDeliveryOfficeName': X500ATTR+'19',
|
||||
'associatedDomain': UCL_DIR_PILOT+'37',
|
||||
'l': X500ATTR+'7',
|
||||
'stateOrProvinceName': X500ATTR+'8',
|
||||
'federationFeideSchemaVersion': NOREDUPERSON_OID+'9',
|
||||
'pkcs9email': PKCS_9+'1',
|
||||
'givenName': X500ATTR+'42',
|
||||
'x500UniqueIdentifier': X500ATTR+'45',
|
||||
'eduPersonNickname': EDUPERSON_OID+'2',
|
||||
'houseIdentifier': X500ATTR+'51',
|
||||
'street': X500ATTR+'9',
|
||||
'supportedAlgorithms': X500ATTR+'52',
|
||||
'preferredLanguage': NETSCAPE_LDAP+'39',
|
||||
'postalAddress': X500ATTR+'16',
|
||||
'email': PKCS_9+'1',
|
||||
'norEduOrgUnitUniqueIdentifier': NOREDUPERSON_OID+'8',
|
||||
'eduPersonPrimaryOrgUnitDN': EDUPERSON_OID+'8',
|
||||
'c': X500ATTR+'6',
|
||||
'teletexTerminalIdentifier': X500ATTR+'22',
|
||||
'o': X500ATTR+'10',
|
||||
'cACertificate': X500ATTR+'37',
|
||||
'telexNumber': X500ATTR+'21',
|
||||
'ou': X500ATTR+'11',
|
||||
'initials': X500ATTR+'43',
|
||||
'eduPersonOrgUnitDN': EDUPERSON_OID+'4',
|
||||
'deltaRevocationList': X500ATTR+'53',
|
||||
'norEduPersonLIN': NOREDUPERSON_OID+'4',
|
||||
'supportedApplicationContext': X500ATTR+'30',
|
||||
'eduPersonEntitlement': EDUPERSON_OID+'7',
|
||||
'generationQualifier': X500ATTR+'44',
|
||||
'eduPersonAffiliation': EDUPERSON_OID+'1',
|
||||
'eduPersonPrincipalName': EDUPERSON_OID+'6',
|
||||
'localityName': X500ATTR+'7',
|
||||
'owner': X500ATTR+'32',
|
||||
'norEduOrgUnitUniqueNumber': NOREDUPERSON_OID+'2',
|
||||
'searchGuide': X500ATTR+'14',
|
||||
'certificateRevocationList': X500ATTR+'39',
|
||||
'organizationalUnitName': X500ATTR+'11',
|
||||
'userCertificate': X500ATTR+'36',
|
||||
'preferredDeliveryMethod': X500ATTR+'28',
|
||||
'internationaliSDNNumber': X500ATTR+'25',
|
||||
'uniqueMember': X500ATTR+'50',
|
||||
'departmentNumber': NETSCAPE_LDAP+'2',
|
||||
'enhancedSearchGuide': X500ATTR+'47',
|
||||
'userPKCS12': NETSCAPE_LDAP+'216',
|
||||
'eduPersonTargetedID': EDUPERSON_OID+'10',
|
||||
'norEduOrgUniqueNumber': NOREDUPERSON_OID+'1',
|
||||
'x121Address': X500ATTR+'24',
|
||||
'destinationIndicator': X500ATTR+'27',
|
||||
'eduPersonPrimaryAffiliation': EDUPERSON_OID+'5',
|
||||
'surname': X500ATTR+'4',
|
||||
'jpegPhoto': UCL_DIR_PILOT+'60',
|
||||
'eduPersonScopedAffiliation': EDUPERSON_OID+'9',
|
||||
'protocolInformation': X500ATTR+'48',
|
||||
'knowledgeInformation': X500ATTR+'2',
|
||||
'employeeType': NETSCAPE_LDAP+'4',
|
||||
'userSMIMECertificate': NETSCAPE_LDAP+'40',
|
||||
'member': X500ATTR+'31',
|
||||
'streetAddress': X500ATTR+'9',
|
||||
'dmdName': X500ATTR+'54',
|
||||
'postalCode': X500ATTR+'17',
|
||||
'pseudonym': X500ATTR+'65',
|
||||
'dnQualifier': X500ATTR+'46',
|
||||
'crossCertificatePair': X500ATTR+'40',
|
||||
'eduPersonOrgDN': EDUPERSON_OID+'3',
|
||||
'authorityRevocationList': X500ATTR+'38',
|
||||
'displayName': NETSCAPE_LDAP+'241',
|
||||
'businessCategory': X500ATTR+'15',
|
||||
'serialNumber': X500ATTR+'5',
|
||||
'norEduOrgUniqueIdentifier': NOREDUPERSON_OID+'7',
|
||||
'st': X500ATTR+'8',
|
||||
'carLicense': NETSCAPE_LDAP+'1',
|
||||
'presentationAddress': X500ATTR+'29',
|
||||
'sn': X500ATTR+'4',
|
||||
'domainComponent': UCL_DIR_PILOT+'25',
|
||||
}
|
||||
}
|
||||
270
example/idp/idp.py
Executable file
270
example/idp/idp.py
Executable file
@@ -0,0 +1,270 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
import re
|
||||
import base64
|
||||
from cgi import parse_qs
|
||||
from saml2 import server
|
||||
from saml2 import BINDING_HTTP_REDIRECT, BINDING_HTTP_POST
|
||||
from saml2 import time_util
|
||||
from Cookie import SimpleCookie
|
||||
|
||||
def _expiration(timeout, format=None):
|
||||
if timeout == "now":
|
||||
return time_util.instant(format)
|
||||
else:
|
||||
# validity time should match lifetime of assertions
|
||||
return time_util.in_a_while(minutes=timeout, format=format)
|
||||
|
||||
# -----------------------------------------------------------------------------
|
||||
def dict_to_table(ava, lev=0, width=1):
|
||||
txt = ['<table border=%s bordercolor="black">\n' % width]
|
||||
for prop, valarr in ava.items():
|
||||
txt.append("<tr>\n")
|
||||
if isinstance(valarr, basestring):
|
||||
txt.append("<th>%s</th>\n" % str(prop))
|
||||
try:
|
||||
txt.append("<td>%s</td>\n" % valarr.encode("utf8"))
|
||||
except AttributeError:
|
||||
txt.append("<td>%s</td>\n" % valarr)
|
||||
elif isinstance(valarr, list):
|
||||
index = 0
|
||||
num = len(valarr)
|
||||
for val in valarr:
|
||||
if not index:
|
||||
txt.append("<th rowspan=%d>%s</td>\n" % (len(valarr), prop))
|
||||
else:
|
||||
txt.append("<tr>\n")
|
||||
if isinstance(val, dict):
|
||||
txt.append("<td>\n")
|
||||
txt.extend(dict_to_table(val, lev+1, width-1))
|
||||
txt.append("</td>\n")
|
||||
else:
|
||||
try:
|
||||
txt.append("<td>%s</td>\n" % val.encode("utf8"))
|
||||
except AttributeError:
|
||||
txt.append("<td>%s</td>\n" % val)
|
||||
if num > 1:
|
||||
txt.append("</tr>\n")
|
||||
num -= 1
|
||||
index += 1
|
||||
elif isinstance(valarr, dict):
|
||||
txt.append("<th>%s</th>\n" % prop)
|
||||
txt.append("<td>\n")
|
||||
txt.extend(dict_to_table(valarr, lev+1, width-1))
|
||||
txt.append("</td>\n")
|
||||
txt.append("</tr>\n")
|
||||
txt.append('</table>\n')
|
||||
return txt
|
||||
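# Example (illustrative): dict_to_table({"uid": ["rohe0002"]}) returns a list
# of HTML fragments; "".join(...) over that list gives the finished <table>
# markup, and whoami() below returns the list directly as the WSGI response
# body.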
|
||||
REPOZE_ID_EQUIVALENT = "uid"
|
||||
FORM_SPEC = """<form name="myform" method="post" action="%s">
|
||||
<input type="hidden" name="SAMLResponse" value="%s" />
|
||||
<input type="hidden" name="RelayState" value="%s" />
|
||||
</form>"""
|
||||
|
||||
def sso(environ, start_response, user, logger):
|
||||
""" Supposted to return a POST """
|
||||
#edict = dict_to_table(environ)
|
||||
#if logger: logger.info("Environ keys: %s" % environ.keys())
|
||||
logger.info("--- In SSO ---")
|
||||
query = None
|
||||
if "QUERY_STRING" in environ:
|
||||
if logger:
|
||||
logger.info("Query string: %s" % environ["QUERY_STRING"])
|
||||
query = parse_qs(environ["QUERY_STRING"])
|
||||
elif "s2repoze.qinfo" in environ:
|
||||
query = environ["s2repoze.qinfo"]
|
||||
|
||||
if not query:
|
||||
start_response('401 Unauthorized', [('Content-Type', 'text/plain')])
|
||||
return ['Unknown user']
|
||||
|
||||
# base 64 encoded request
|
||||
req_info = IDP.parse_authn_request(query["SAMLRequest"][0])
|
||||
logger.info("parsed OK")
|
||||
logger.info("%s" % req_info)
|
||||
|
||||
identity = dict(environ["repoze.who.identity"]["user"])
|
||||
logger.info("Identity: %s" % (identity,))
|
||||
userid = environ["repoze.who.identity"]['repoze.who.userid']
|
||||
if REPOZE_ID_EQUIVALENT:
|
||||
identity[REPOZE_ID_EQUIVALENT] = userid
|
||||
try:
|
||||
authn_resp = IDP.authn_response(identity,
|
||||
req_info["id"],
|
||||
req_info["consumer_url"],
|
||||
req_info["sp_entity_id"],
|
||||
req_info["request"].name_id_policy,
|
||||
userid)
|
||||
except Exception, excp:
|
||||
if logger: logger.error("Exception: %s" % (excp,))
|
||||
raise
|
||||
|
||||
if logger: logger.info("AuthNResponse: %s" % authn_resp)
|
||||
|
||||
response = ["<head>",
|
||||
"<title>SAML 2.0 POST</title>",
|
||||
"</head><body>",
|
||||
FORM_SPEC % (req_info["consumer_url"],
|
||||
base64.b64encode("".join(authn_resp)), "/"),
|
||||
"""<script type="text/javascript" language="JavaScript">""",
|
||||
" document.myform.submit();",
|
||||
"""</script>""",
|
||||
"</body>"]
|
||||
|
||||
start_response('200 OK', [('Content-Type', 'text/html')])
|
||||
return response
|
||||
|
||||
def whoami(environ, start_response, user, logger):
|
||||
start_response('200 OK', [('Content-Type', 'text/html')])
|
||||
identity = environ["repoze.who.identity"].copy()
|
||||
for prop in ["login", "password"]:
|
||||
try:
|
||||
del identity[prop]
|
||||
except KeyError:
|
||||
continue
|
||||
response = dict_to_table(identity)
|
||||
return response[:]
|
||||
|
||||
def not_found(environ, start_response, logger):
|
||||
"""Called if no URL matches."""
|
||||
start_response('404 NOT FOUND', [('Content-Type', 'text/plain')])
|
||||
return ['Not Found']
|
||||
|
||||
def not_authn(environ, start_response, logger):
|
||||
if "QUERY_STRING" in environ:
|
||||
query = parse_qs(environ["QUERY_STRING"])
|
||||
if logger: logger.info("query: %s" % query)
|
||||
start_response('401 Unauthorized', [('Content-Type', 'text/plain')])
|
||||
return ['Unknown user']
|
||||
|
||||
def slo(environ, start_response, user, logger):
|
||||
""" Expects a HTTP-redirect logout request """
|
||||
|
||||
query = None
|
||||
if "QUERY_STRING" in environ:
|
||||
if logger: logger.info("Query string: %s" % environ["QUERY_STRING"])
|
||||
query = parse_qs(environ["QUERY_STRING"])
|
||||
|
||||
if not query:
|
||||
start_response('401 Unauthorized', [('Content-Type', 'text/plain')])
|
||||
return ['Unknown user']
|
||||
|
||||
try:
|
||||
req_info = IDP.parse_logout_request(query["SAMLRequest"][0],
|
||||
BINDING_HTTP_REDIRECT)
|
||||
logger.info("LOGOUT request parsed OK")
|
||||
logger.info("REQ_INFO: %s" % req_info.message)
|
||||
except KeyError, exc:
|
||||
if logger: logger.info("logout request error: %s" % (exc,))
|
||||
# return error reply
|
||||
|
||||
# look for the subject
|
||||
subject = req_info.subject_id()
|
||||
subject = subject.text.strip()
|
||||
sp_entity_id = req_info.message.issuer.text.strip()
|
||||
logger.info("Logout subject: %s" % (subject,))
|
||||
logger.info("local identifier: %s" % IDP.ident.local_name(sp_entity_id,
|
||||
subject))
|
||||
# remove the authentication
|
||||
|
||||
status = None
|
||||
|
||||
# Either HTTP-Post or HTTP-redirect is possible
|
||||
bindings = [BINDING_HTTP_POST, BINDING_HTTP_REDIRECT]
|
||||
(resp, headers, message) = IDP.logout_response(req_info.message, bindings)
|
||||
#headers.append(session.cookie(expire="now"))
|
||||
logger.info("Response code: %s" % (resp,))
|
||||
logger.info("Header: %s" % (headers,))
|
||||
delco = delete_cookie(environ, "pysaml2idp")
|
||||
if delco:
|
||||
headers.append(delco)
|
||||
start_response(resp, headers)
|
||||
return message
|
||||
|
||||
def delete_cookie(environ, name):
|
||||
kaka = environ.get("HTTP_COOKIE", '')
|
||||
if kaka:
|
||||
cookie_obj = SimpleCookie(kaka)
|
||||
morsel = cookie_obj.get(name, None)
|
||||
cookie = SimpleCookie()
|
||||
cookie[name] = morsel
|
||||
cookie[name]["expires"] = \
|
||||
_expiration("now", "%a, %d-%b-%Y %H:%M:%S CET")
|
||||
return tuple(cookie.output().split(": ", 1))
|
||||
return None
|
||||
|
||||
# ----------------------------------------------------------------------------
|
||||
|
||||
# map urls to functions
|
||||
URLS = [
|
||||
(r'whoami$', whoami),
|
||||
(r'whoami/(.*)$', whoami),
|
||||
(r'sso$', sso),
|
||||
(r'sso/(.*)$', sso),
|
||||
(r'logout$', slo),
|
||||
(r'logout/(.*)$', slo),
|
||||
]
|
||||
|
||||
# ----------------------------------------------------------------------------
|
||||
|
||||
def application(environ, start_response):
|
||||
"""
|
||||
The main WSGI application. Dispatch the current request to
|
||||
the functions from above and store the regular expression
|
||||
captures in the WSGI environment as `myapp.url_args` so that
|
||||
the functions from above can access the url placeholders.
|
||||
|
||||
If nothing matches call the `not_found` function.
|
||||
|
||||
:param environ: The HTTP application environment
|
||||
:param start_response: The application to run when the handling of the
|
||||
request is done
|
||||
:return: The response as a list of lines
|
||||
"""
|
||||
user = environ.get("REMOTE_USER", "")
|
||||
kaka = environ.get("HTTP_COOKIE", '')
|
||||
if not user:
|
||||
user = environ.get("repoze.who.identity", "")
|
||||
|
||||
path = environ.get('PATH_INFO', '').lstrip('/')
|
||||
logger = environ.get('repoze.who.logger')
|
||||
if logger: logger.info("<application> PATH: %s" % path)
|
||||
if logger: logger.info("Cookie: %s" % (kaka,))
|
||||
for regex, callback in URLS:
|
||||
if user:
|
||||
match = re.search(regex, path)
|
||||
if match is not None:
|
||||
try:
|
||||
environ['myapp.url_args'] = match.groups()[0]
|
||||
except IndexError:
|
||||
environ['myapp.url_args'] = path
|
||||
if logger: logger.info("callback: %s" % (callback,))
|
||||
return callback(environ, start_response, user, logger)
|
||||
else:
|
||||
if logger: logger.info("-- No USER --")
|
||||
return not_authn(environ, start_response, logger)
|
||||
return not_found(environ, start_response, logger)
|
||||
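# Dispatch example (illustrative): a request for /whoami/extra matches the
# r'whoami/(.*)$' pattern above, so environ['myapp.url_args'] is set to
# 'extra' before whoami() is called; a bare /whoami matches r'whoami$', which
# has no capture group, so the IndexError branch runs and url_args falls back
# to the full path.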
|
||||
# ----------------------------------------------------------------------------
|
||||
|
||||
from repoze.who.config import make_middleware_with_config
|
||||
|
||||
APP_WITH_AUTH = make_middleware_with_config(application, {"here":"."},
|
||||
'./who.ini', log_file="app.log")
|
||||
|
||||
# ----------------------------------------------------------------------------
|
||||
|
||||
if __name__ == '__main__':
|
||||
import sys
|
||||
from wsgiref.simple_server import make_server
|
||||
import logging
|
||||
LOG_FILENAME = "./idp.log"
|
||||
PORT = 8088
|
||||
|
||||
logging.basicConfig(filename=LOG_FILENAME, level=logging.DEBUG)
|
||||
|
||||
IDP = server.Server(sys.argv[1], log=logging, debug=1)
|
||||
SRV = make_server('localhost', PORT, APP_WITH_AUTH)
|
||||
print "IdP listening on port: %s" % PORT
|
||||
SRV.serve_forever()
|
||||
45
example/idp/idp_conf.py
Normal file
45
example/idp/idp_conf.py
Normal file
@@ -0,0 +1,45 @@
from saml2 import BINDING_HTTP_REDIRECT
from saml2.saml import NAME_FORMAT_URI

BASE = "http://localhost:8088/"

CONFIG = {
    "entityid": "urn:mace:umu.se:saml:roland:idp",
    "description": "My IDP",
    "service": {
        "idp": {
            "name": "Rolands IdP",
            "endpoints": {
                "single_sign_on_service": [BASE + "sso"],
                "single_logout_service": [(BASE + "logout",
                                           BINDING_HTTP_REDIRECT)],
            },
            "policy": {
                "default": {
                    "lifetime": {"minutes": 15},
                    "attribute_restrictions": None,  # means all I have
                    "name_form": NAME_FORMAT_URI
                },
                "urn:mace:umu.se:saml:roland:sp": {
                    "lifetime": {"minutes": 5},
                }
            },
            # This database holds the map between a subject's local
            # identifier and the identifier returned to an SP
            "subject_data": "./idp.subject.db",
        }
    },
    "debug": 1,
    "key_file": "pki/mykey.pem",
    "cert_file": "pki/mycert.pem",
    "metadata": {
        "local": ["../sp/sp.xml"],
    },
    "organization": {
        "display_name": "Rolands Identiteter",
        "name": "Rolands Identiteter",
        "url": "http://www.example.com",
    },
    #"xmlsec_binary": "/usr/local/bin/xmlsec1",
    "attribute_map_dir": "./attributemaps",
}

25
example/idp/idp_user.ini
Normal file
25
example/idp/idp_user.ini
Normal file
@@ -0,0 +1,25 @@
|
||||
[roland]
|
||||
surname=Hedberg
|
||||
givenName=Roland
|
||||
eduPersonAffiliation=staff
|
||||
uid=rohe0002
|
||||
|
||||
[ozzie]
|
||||
surname=Guillen
|
||||
givenName=Ozzie
|
||||
eduPersonAffiliation=affiliate
|
||||
|
||||
[derek]
|
||||
surname=Jeter
|
||||
givenName=Derek
|
||||
eduPersonAffiliation=affiliate
|
||||
|
||||
[ichiro]
|
||||
surname=Suzuki
|
||||
givenName=Ischiro
|
||||
eduPersonAffiliation=affiliate
|
||||
|
||||
[ryan]
|
||||
surname=Howard
|
||||
givenName=Ryan
|
||||
eduPersonAffiliation=affiliate
|
||||
5
example/idp/passwd
Normal file
5
example/idp/passwd
Normal file
@@ -0,0 +1,5 @@
|
||||
roland:0Gwsj0fYeNAIk
|
||||
ozzie:wT390u9XwBFaU
|
||||
derek:efNb53YcncbRI
|
||||
ryan:YlIhvZ6Rdt6fA
|
||||
ischiro:wgMhJvmkQgMGs
|
||||
18
example/idp/pki/mycert.pem
Normal file
18
example/idp/pki/mycert.pem
Normal file
@@ -0,0 +1,18 @@
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIIC8jCCAlugAwIBAgIJAJHg2V5J31I8MA0GCSqGSIb3DQEBBQUAMFoxCzAJBgNV
|
||||
BAYTAlNFMQ0wCwYDVQQHEwRVbWVhMRgwFgYDVQQKEw9VbWVhIFVuaXZlcnNpdHkx
|
||||
EDAOBgNVBAsTB0lUIFVuaXQxEDAOBgNVBAMTB1Rlc3QgU1AwHhcNMDkxMDI2MTMz
|
||||
MTE1WhcNMTAxMDI2MTMzMTE1WjBaMQswCQYDVQQGEwJTRTENMAsGA1UEBxMEVW1l
|
||||
YTEYMBYGA1UEChMPVW1lYSBVbml2ZXJzaXR5MRAwDgYDVQQLEwdJVCBVbml0MRAw
|
||||
DgYDVQQDEwdUZXN0IFNQMIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDkJWP7
|
||||
bwOxtH+E15VTaulNzVQ/0cSbM5G7abqeqSNSs0l0veHr6/ROgW96ZeQ57fzVy2MC
|
||||
FiQRw2fzBs0n7leEmDJyVVtBTavYlhAVXDNa3stgvh43qCfLx+clUlOvtnsoMiiR
|
||||
mo7qf0BoPKTj7c0uLKpDpEbAHQT4OF1HRYVxMwIDAQABo4G/MIG8MB0GA1UdDgQW
|
||||
BBQ7RgbMJFDGRBu9o3tDQDuSoBy7JjCBjAYDVR0jBIGEMIGBgBQ7RgbMJFDGRBu9
|
||||
o3tDQDuSoBy7JqFepFwwWjELMAkGA1UEBhMCU0UxDTALBgNVBAcTBFVtZWExGDAW
|
||||
BgNVBAoTD1VtZWEgVW5pdmVyc2l0eTEQMA4GA1UECxMHSVQgVW5pdDEQMA4GA1UE
|
||||
AxMHVGVzdCBTUIIJAJHg2V5J31I8MAwGA1UdEwQFMAMBAf8wDQYJKoZIhvcNAQEF
|
||||
BQADgYEAMuRwwXRnsiyWzmRikpwinnhTmbooKm5TINPE7A7gSQ710RxioQePPhZO
|
||||
zkM27NnHTrCe2rBVg0EGz7QTd1JIwLPvgoj4VTi/fSha/tXrYUaqc9AqU1kWI4WN
|
||||
+vffBGQ09mo+6CffuFTZYeOhzP/2stAPwCTU4kxEoiy0KpZMANI=
|
||||
-----END CERTIFICATE-----
|
||||
15
example/idp/pki/mykey.pem
Normal file
15
example/idp/pki/mykey.pem
Normal file
@@ -0,0 +1,15 @@
|
||||
-----BEGIN RSA PRIVATE KEY-----
|
||||
MIICXAIBAAKBgQDkJWP7bwOxtH+E15VTaulNzVQ/0cSbM5G7abqeqSNSs0l0veHr
|
||||
6/ROgW96ZeQ57fzVy2MCFiQRw2fzBs0n7leEmDJyVVtBTavYlhAVXDNa3stgvh43
|
||||
qCfLx+clUlOvtnsoMiiRmo7qf0BoPKTj7c0uLKpDpEbAHQT4OF1HRYVxMwIDAQAB
|
||||
AoGAbx9rKH91DCw/ZEPhHsVXJ6cYHxGcMoAWvnMMC9WUN+bNo4gNL205DLfsxXA1
|
||||
jqXFXZj3+38vSFumGPA6IvXrN+Wyp3+Lz3QGc4K5OdHeBtYlxa6EsrxPgvuxYDUB
|
||||
vx3xdWPMjy06G/ML+pR9XHnRaPNubXQX3UxGBuLjwNXVmyECQQD2/D84tYoCGWoq
|
||||
5FhUBxFUy2nnOLKYC/GGxBTX62iLfMQ3fbQcdg2pJsB5rrniyZf7UL+9FOsAO9k1
|
||||
8DO7G12DAkEA7Hkdg1KEw4ZfjnnjEa+KqpyLTLRQ91uTVW6kzR+4zY719iUJ/PXE
|
||||
PxJqm1ot7mJd1LW+bWtjLpxs7jYH19V+kQJBAIEpn2JnxdmdMuFlcy/WVmDy09pg
|
||||
0z0imdexeXkFmjHAONkQOv3bWv+HzYaVMo8AgCOksfEPHGqN4eUMTfFeuUMCQF+5
|
||||
E1JSd/2yCkJhYqKJHae8oMLXByNqRXTCyiFioutK4JPYIHfugJdLfC4QziD+Xp85
|
||||
RrGCU+7NUWcIJhqfiJECQAIgUAzfzhdj5AyICaFPaOQ+N8FVMLcTyqeTXP0sIlFk
|
||||
JStVibemTRCbxdXXM7OVipz1oW3PBVEO3t/VyjiaGGg=
|
||||
-----END RSA PRIVATE KEY-----
|
||||
57
example/idp/who.ini
Normal file
57
example/idp/who.ini
Normal file
@@ -0,0 +1,57 @@
|
||||
[plugin:form]
|
||||
# identification and challenge
|
||||
use = s2repoze.plugins.formswithhidden:make_plugin
|
||||
login_form_qs = __do_login
|
||||
rememberer_name = auth_tkt
|
||||
#form = %(here)s/login_form.html
|
||||
|
||||
[plugin:auth_tkt]
|
||||
# identification
|
||||
use = repoze.who.plugins.auth_tkt:make_plugin
|
||||
secret = cassiopeja
|
||||
cookie_name = pysaml2idp
|
||||
secure = False
|
||||
include_ip = True
|
||||
timeout=3600
|
||||
reissue_time = 3000
|
||||
|
||||
[plugin:basicauth]
|
||||
# identification and challenge
|
||||
use = repoze.who.plugins.basicauth:make_plugin
|
||||
realm = 'sample'
|
||||
|
||||
[plugin:htpasswd]
|
||||
# authentication
|
||||
use = repoze.who.plugins.htpasswd:make_plugin
|
||||
filename = %(here)s/passwd
|
||||
check_fn = repoze.who.plugins.htpasswd:crypt_check
|
||||
|
||||
[plugin:ini]
|
||||
use = s2repoze.plugins.ini:make_plugin
|
||||
ini_file = %(here)s/idp_user.ini
|
||||
|
||||
[general]
|
||||
request_classifier = repoze.who.classifiers:default_request_classifier
|
||||
challenge_decider = repoze.who.classifiers:default_challenge_decider
|
||||
remote_user_key = REMOTE_USER
|
||||
|
||||
[identifiers]
|
||||
# plugin_name;classifier_name:.. or just plugin_name (good for any)
|
||||
plugins =
|
||||
form;browser
|
||||
auth_tkt
|
||||
basicauth
|
||||
|
||||
[authenticators]
|
||||
# plugin_name;classifier_name.. or just plugin_name (good for any)
|
||||
plugins =
|
||||
htpasswd
|
||||
|
||||
[challengers]
|
||||
# plugin_name;classifier_name:.. or just plugin_name (good for any)
|
||||
plugins =
|
||||
form;browser
|
||||
basicauth
|
||||
|
||||
[mdproviders]
|
||||
plugins = ini
|
||||
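The check_fn named in the [plugin:htpasswd] section above compares crypt(3)
hashes from the passwd file; a hand-rolled equivalent (illustrative only, the
check() helper is not part of repoze.who) using the clear-text passwords
listed in example/README:

    import crypt

    def check(cleartext, stored_hash):
        # the stored hash doubles as the salt for crypt(3)
        return crypt.crypt(cleartext, stored_hash) == stored_hash

    print check("one", "0Gwsj0fYeNAIk")  # roland's entry; True if the files match
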
19
example/run.sh
Executable file
19
example/run.sh
Executable file
@@ -0,0 +1,19 @@
|
||||
#!/bin/sh
|
||||
|
||||
# Created by Roland Hedberg on 3/25/10.
|
||||
# Copyright 2010 Umeå Universitet. All rights reserved.
|
||||
|
||||
cd sp
|
||||
../../tools/make_metadata.py sp_conf > sp.xml
|
||||
|
||||
cd ../idp
|
||||
../../tools/make_metadata.py idp_conf > idp.xml
|
||||
|
||||
cd ../sp
|
||||
./sp.py sp_conf &
|
||||
|
||||
cd ../idp
|
||||
./idp.py idp_conf &
|
||||
|
||||
cd ..
|
||||
|
||||
326
example/sp/attributemaps/basic.py
Normal file
326
example/sp/attributemaps/basic.py
Normal file
@@ -0,0 +1,326 @@
|
||||
|
||||
MAP = {
|
||||
"identifier": "urn:oasis:names:tc:SAML:2.0:attrname-format:basic",
|
||||
"fro": {
|
||||
'urn:mace:dir:attribute-def:aRecord': 'aRecord',
|
||||
'urn:mace:dir:attribute-def:aliasedEntryName': 'aliasedEntryName',
|
||||
'urn:mace:dir:attribute-def:aliasedObjectName': 'aliasedObjectName',
|
||||
'urn:mace:dir:attribute-def:associatedDomain': 'associatedDomain',
|
||||
'urn:mace:dir:attribute-def:associatedName': 'associatedName',
|
||||
'urn:mace:dir:attribute-def:audio': 'audio',
|
||||
'urn:mace:dir:attribute-def:authorityRevocationList': 'authorityRevocationList',
|
||||
'urn:mace:dir:attribute-def:buildingName': 'buildingName',
|
||||
'urn:mace:dir:attribute-def:businessCategory': 'businessCategory',
|
||||
'urn:mace:dir:attribute-def:c': 'c',
|
||||
'urn:mace:dir:attribute-def:cACertificate': 'cACertificate',
|
||||
'urn:mace:dir:attribute-def:cNAMERecord': 'cNAMERecord',
|
||||
'urn:mace:dir:attribute-def:carLicense': 'carLicense',
|
||||
'urn:mace:dir:attribute-def:certificateRevocationList': 'certificateRevocationList',
|
||||
'urn:mace:dir:attribute-def:cn': 'cn',
|
||||
'urn:mace:dir:attribute-def:co': 'co',
|
||||
'urn:mace:dir:attribute-def:commonName': 'commonName',
|
||||
'urn:mace:dir:attribute-def:countryName': 'countryName',
|
||||
'urn:mace:dir:attribute-def:crossCertificatePair': 'crossCertificatePair',
|
||||
'urn:mace:dir:attribute-def:dITRedirect': 'dITRedirect',
|
||||
'urn:mace:dir:attribute-def:dSAQuality': 'dSAQuality',
|
||||
'urn:mace:dir:attribute-def:dc': 'dc',
|
||||
'urn:mace:dir:attribute-def:deltaRevocationList': 'deltaRevocationList',
|
||||
'urn:mace:dir:attribute-def:departmentNumber': 'departmentNumber',
|
||||
'urn:mace:dir:attribute-def:description': 'description',
|
||||
'urn:mace:dir:attribute-def:destinationIndicator': 'destinationIndicator',
|
||||
'urn:mace:dir:attribute-def:displayName': 'displayName',
|
||||
'urn:mace:dir:attribute-def:distinguishedName': 'distinguishedName',
|
||||
'urn:mace:dir:attribute-def:dmdName': 'dmdName',
|
||||
'urn:mace:dir:attribute-def:dnQualifier': 'dnQualifier',
|
||||
'urn:mace:dir:attribute-def:documentAuthor': 'documentAuthor',
|
||||
'urn:mace:dir:attribute-def:documentIdentifier': 'documentIdentifier',
|
||||
'urn:mace:dir:attribute-def:documentLocation': 'documentLocation',
|
||||
'urn:mace:dir:attribute-def:documentPublisher': 'documentPublisher',
|
||||
'urn:mace:dir:attribute-def:documentTitle': 'documentTitle',
|
||||
'urn:mace:dir:attribute-def:documentVersion': 'documentVersion',
|
||||
'urn:mace:dir:attribute-def:domainComponent': 'domainComponent',
|
||||
'urn:mace:dir:attribute-def:drink': 'drink',
|
||||
'urn:mace:dir:attribute-def:eduOrgHomePageURI': 'eduOrgHomePageURI',
|
||||
'urn:mace:dir:attribute-def:eduOrgIdentityAuthNPolicyURI': 'eduOrgIdentityAuthNPolicyURI',
|
||||
'urn:mace:dir:attribute-def:eduOrgLegalName': 'eduOrgLegalName',
|
||||
'urn:mace:dir:attribute-def:eduOrgSuperiorURI': 'eduOrgSuperiorURI',
|
||||
'urn:mace:dir:attribute-def:eduOrgWhitePagesURI': 'eduOrgWhitePagesURI',
|
||||
'urn:mace:dir:attribute-def:eduPersonAffiliation': 'eduPersonAffiliation',
|
||||
'urn:mace:dir:attribute-def:eduPersonEntitlement': 'eduPersonEntitlement',
|
||||
'urn:mace:dir:attribute-def:eduPersonNickname': 'eduPersonNickname',
|
||||
'urn:mace:dir:attribute-def:eduPersonOrgDN': 'eduPersonOrgDN',
|
||||
'urn:mace:dir:attribute-def:eduPersonOrgUnitDN': 'eduPersonOrgUnitDN',
|
||||
'urn:mace:dir:attribute-def:eduPersonPrimaryAffiliation': 'eduPersonPrimaryAffiliation',
|
||||
'urn:mace:dir:attribute-def:eduPersonPrimaryOrgUnitDN': 'eduPersonPrimaryOrgUnitDN',
|
||||
'urn:mace:dir:attribute-def:eduPersonPrincipalName': 'eduPersonPrincipalName',
|
||||
'urn:mace:dir:attribute-def:eduPersonScopedAffiliation': 'eduPersonScopedAffiliation',
|
||||
'urn:mace:dir:attribute-def:eduPersonTargetedID': 'eduPersonTargetedID',
|
||||
'urn:mace:dir:attribute-def:email': 'email',
|
||||
'urn:mace:dir:attribute-def:emailAddress': 'emailAddress',
|
||||
'urn:mace:dir:attribute-def:employeeNumber': 'employeeNumber',
|
||||
'urn:mace:dir:attribute-def:employeeType': 'employeeType',
|
||||
'urn:mace:dir:attribute-def:enhancedSearchGuide': 'enhancedSearchGuide',
|
||||
'urn:mace:dir:attribute-def:facsimileTelephoneNumber': 'facsimileTelephoneNumber',
|
||||
'urn:mace:dir:attribute-def:favouriteDrink': 'favouriteDrink',
|
||||
'urn:mace:dir:attribute-def:fax': 'fax',
|
||||
'urn:mace:dir:attribute-def:federationFeideSchemaVersion': 'federationFeideSchemaVersion',
|
||||
'urn:mace:dir:attribute-def:friendlyCountryName': 'friendlyCountryName',
|
||||
'urn:mace:dir:attribute-def:generationQualifier': 'generationQualifier',
|
||||
'urn:mace:dir:attribute-def:givenName': 'givenName',
|
||||
'urn:mace:dir:attribute-def:gn': 'gn',
|
||||
'urn:mace:dir:attribute-def:homePhone': 'homePhone',
|
||||
'urn:mace:dir:attribute-def:homePostalAddress': 'homePostalAddress',
|
||||
'urn:mace:dir:attribute-def:homeTelephoneNumber': 'homeTelephoneNumber',
|
||||
'urn:mace:dir:attribute-def:host': 'host',
|
||||
'urn:mace:dir:attribute-def:houseIdentifier': 'houseIdentifier',
|
||||
'urn:mace:dir:attribute-def:info': 'info',
|
||||
'urn:mace:dir:attribute-def:initials': 'initials',
|
||||
'urn:mace:dir:attribute-def:internationaliSDNNumber': 'internationaliSDNNumber',
|
||||
'urn:mace:dir:attribute-def:janetMailbox': 'janetMailbox',
|
||||
'urn:mace:dir:attribute-def:jpegPhoto': 'jpegPhoto',
|
||||
'urn:mace:dir:attribute-def:knowledgeInformation': 'knowledgeInformation',
|
||||
'urn:mace:dir:attribute-def:l': 'l',
|
||||
'urn:mace:dir:attribute-def:labeledURI': 'labeledURI',
|
||||
'urn:mace:dir:attribute-def:localityName': 'localityName',
|
||||
'urn:mace:dir:attribute-def:mDRecord': 'mDRecord',
|
||||
'urn:mace:dir:attribute-def:mXRecord': 'mXRecord',
|
||||
'urn:mace:dir:attribute-def:mail': 'mail',
|
||||
'urn:mace:dir:attribute-def:mailPreferenceOption': 'mailPreferenceOption',
|
||||
'urn:mace:dir:attribute-def:manager': 'manager',
|
||||
'urn:mace:dir:attribute-def:member': 'member',
|
||||
'urn:mace:dir:attribute-def:mobile': 'mobile',
|
||||
'urn:mace:dir:attribute-def:mobileTelephoneNumber': 'mobileTelephoneNumber',
|
||||
'urn:mace:dir:attribute-def:nSRecord': 'nSRecord',
|
||||
'urn:mace:dir:attribute-def:name': 'name',
|
||||
'urn:mace:dir:attribute-def:norEduOrgAcronym': 'norEduOrgAcronym',
|
||||
'urn:mace:dir:attribute-def:norEduOrgNIN': 'norEduOrgNIN',
|
||||
'urn:mace:dir:attribute-def:norEduOrgSchemaVersion': 'norEduOrgSchemaVersion',
|
||||
'urn:mace:dir:attribute-def:norEduOrgUniqueIdentifier': 'norEduOrgUniqueIdentifier',
|
||||
'urn:mace:dir:attribute-def:norEduOrgUniqueNumber': 'norEduOrgUniqueNumber',
|
||||
'urn:mace:dir:attribute-def:norEduOrgUnitUniqueIdentifier': 'norEduOrgUnitUniqueIdentifier',
|
||||
'urn:mace:dir:attribute-def:norEduOrgUnitUniqueNumber': 'norEduOrgUnitUniqueNumber',
|
||||
'urn:mace:dir:attribute-def:norEduPersonBirthDate': 'norEduPersonBirthDate',
|
||||
'urn:mace:dir:attribute-def:norEduPersonLIN': 'norEduPersonLIN',
|
||||
'urn:mace:dir:attribute-def:norEduPersonNIN': 'norEduPersonNIN',
|
||||
'urn:mace:dir:attribute-def:o': 'o',
|
||||
'urn:mace:dir:attribute-def:objectClass': 'objectClass',
|
||||
'urn:mace:dir:attribute-def:organizationName': 'organizationName',
|
||||
'urn:mace:dir:attribute-def:organizationalStatus': 'organizationalStatus',
|
||||
'urn:mace:dir:attribute-def:organizationalUnitName': 'organizationalUnitName',
|
||||
'urn:mace:dir:attribute-def:otherMailbox': 'otherMailbox',
|
||||
'urn:mace:dir:attribute-def:ou': 'ou',
|
||||
'urn:mace:dir:attribute-def:owner': 'owner',
|
||||
'urn:mace:dir:attribute-def:pager': 'pager',
|
||||
'urn:mace:dir:attribute-def:pagerTelephoneNumber': 'pagerTelephoneNumber',
|
||||
'urn:mace:dir:attribute-def:personalSignature': 'personalSignature',
|
||||
'urn:mace:dir:attribute-def:personalTitle': 'personalTitle',
|
||||
'urn:mace:dir:attribute-def:photo': 'photo',
|
||||
'urn:mace:dir:attribute-def:physicalDeliveryOfficeName': 'physicalDeliveryOfficeName',
|
||||
'urn:mace:dir:attribute-def:pkcs9email': 'pkcs9email',
|
||||
'urn:mace:dir:attribute-def:postOfficeBox': 'postOfficeBox',
|
||||
'urn:mace:dir:attribute-def:postalAddress': 'postalAddress',
|
||||
'urn:mace:dir:attribute-def:postalCode': 'postalCode',
|
||||
'urn:mace:dir:attribute-def:preferredDeliveryMethod': 'preferredDeliveryMethod',
|
||||
'urn:mace:dir:attribute-def:preferredLanguage': 'preferredLanguage',
|
||||
'urn:mace:dir:attribute-def:presentationAddress': 'presentationAddress',
|
||||
'urn:mace:dir:attribute-def:protocolInformation': 'protocolInformation',
|
||||
'urn:mace:dir:attribute-def:pseudonym': 'pseudonym',
|
||||
'urn:mace:dir:attribute-def:registeredAddress': 'registeredAddress',
|
||||
'urn:mace:dir:attribute-def:rfc822Mailbox': 'rfc822Mailbox',
|
||||
'urn:mace:dir:attribute-def:roleOccupant': 'roleOccupant',
|
||||
'urn:mace:dir:attribute-def:roomNumber': 'roomNumber',
|
||||
'urn:mace:dir:attribute-def:sOARecord': 'sOARecord',
|
||||
'urn:mace:dir:attribute-def:searchGuide': 'searchGuide',
|
||||
'urn:mace:dir:attribute-def:secretary': 'secretary',
|
||||
'urn:mace:dir:attribute-def:seeAlso': 'seeAlso',
|
||||
'urn:mace:dir:attribute-def:serialNumber': 'serialNumber',
|
||||
'urn:mace:dir:attribute-def:singleLevelQuality': 'singleLevelQuality',
|
||||
'urn:mace:dir:attribute-def:sn': 'sn',
|
||||
'urn:mace:dir:attribute-def:st': 'st',
|
||||
'urn:mace:dir:attribute-def:stateOrProvinceName': 'stateOrProvinceName',
|
||||
'urn:mace:dir:attribute-def:street': 'street',
|
||||
'urn:mace:dir:attribute-def:streetAddress': 'streetAddress',
|
||||
'urn:mace:dir:attribute-def:subtreeMaximumQuality': 'subtreeMaximumQuality',
|
||||
'urn:mace:dir:attribute-def:subtreeMinimumQuality': 'subtreeMinimumQuality',
|
||||
'urn:mace:dir:attribute-def:supportedAlgorithms': 'supportedAlgorithms',
|
||||
'urn:mace:dir:attribute-def:supportedApplicationContext': 'supportedApplicationContext',
|
||||
'urn:mace:dir:attribute-def:surname': 'surname',
|
||||
'urn:mace:dir:attribute-def:telephoneNumber': 'telephoneNumber',
|
||||
'urn:mace:dir:attribute-def:teletexTerminalIdentifier': 'teletexTerminalIdentifier',
|
||||
'urn:mace:dir:attribute-def:telexNumber': 'telexNumber',
|
||||
'urn:mace:dir:attribute-def:textEncodedORAddress': 'textEncodedORAddress',
|
||||
'urn:mace:dir:attribute-def:title': 'title',
|
||||
'urn:mace:dir:attribute-def:uid': 'uid',
|
||||
'urn:mace:dir:attribute-def:uniqueIdentifier': 'uniqueIdentifier',
|
||||
'urn:mace:dir:attribute-def:uniqueMember': 'uniqueMember',
|
||||
'urn:mace:dir:attribute-def:userCertificate': 'userCertificate',
|
||||
'urn:mace:dir:attribute-def:userClass': 'userClass',
|
||||
'urn:mace:dir:attribute-def:userPKCS12': 'userPKCS12',
|
||||
'urn:mace:dir:attribute-def:userPassword': 'userPassword',
|
||||
'urn:mace:dir:attribute-def:userSMIMECertificate': 'userSMIMECertificate',
|
||||
'urn:mace:dir:attribute-def:userid': 'userid',
|
||||
'urn:mace:dir:attribute-def:x121Address': 'x121Address',
|
||||
'urn:mace:dir:attribute-def:x500UniqueIdentifier': 'x500UniqueIdentifier',
|
||||
},
|
||||
"to": {
|
||||
'aRecord': 'urn:mace:dir:attribute-def:aRecord',
|
||||
'aliasedEntryName': 'urn:mace:dir:attribute-def:aliasedEntryName',
|
||||
'aliasedObjectName': 'urn:mace:dir:attribute-def:aliasedObjectName',
|
||||
'associatedDomain': 'urn:mace:dir:attribute-def:associatedDomain',
|
||||
'associatedName': 'urn:mace:dir:attribute-def:associatedName',
|
||||
'audio': 'urn:mace:dir:attribute-def:audio',
|
||||
'authorityRevocationList': 'urn:mace:dir:attribute-def:authorityRevocationList',
|
||||
'buildingName': 'urn:mace:dir:attribute-def:buildingName',
|
||||
'businessCategory': 'urn:mace:dir:attribute-def:businessCategory',
|
||||
'c': 'urn:mace:dir:attribute-def:c',
|
||||
'cACertificate': 'urn:mace:dir:attribute-def:cACertificate',
|
||||
'cNAMERecord': 'urn:mace:dir:attribute-def:cNAMERecord',
|
||||
'carLicense': 'urn:mace:dir:attribute-def:carLicense',
|
||||
'certificateRevocationList': 'urn:mace:dir:attribute-def:certificateRevocationList',
|
||||
'cn': 'urn:mace:dir:attribute-def:cn',
|
||||
'co': 'urn:mace:dir:attribute-def:co',
|
||||
'commonName': 'urn:mace:dir:attribute-def:commonName',
|
||||
'countryName': 'urn:mace:dir:attribute-def:countryName',
|
||||
'crossCertificatePair': 'urn:mace:dir:attribute-def:crossCertificatePair',
|
||||
'dITRedirect': 'urn:mace:dir:attribute-def:dITRedirect',
|
||||
'dSAQuality': 'urn:mace:dir:attribute-def:dSAQuality',
|
||||
'dc': 'urn:mace:dir:attribute-def:dc',
|
||||
'deltaRevocationList': 'urn:mace:dir:attribute-def:deltaRevocationList',
|
||||
'departmentNumber': 'urn:mace:dir:attribute-def:departmentNumber',
|
||||
'description': 'urn:mace:dir:attribute-def:description',
|
||||
'destinationIndicator': 'urn:mace:dir:attribute-def:destinationIndicator',
|
||||
'displayName': 'urn:mace:dir:attribute-def:displayName',
|
||||
'distinguishedName': 'urn:mace:dir:attribute-def:distinguishedName',
|
||||
'dmdName': 'urn:mace:dir:attribute-def:dmdName',
|
||||
'dnQualifier': 'urn:mace:dir:attribute-def:dnQualifier',
|
||||
'documentAuthor': 'urn:mace:dir:attribute-def:documentAuthor',
|
||||
'documentIdentifier': 'urn:mace:dir:attribute-def:documentIdentifier',
|
||||
'documentLocation': 'urn:mace:dir:attribute-def:documentLocation',
|
||||
'documentPublisher': 'urn:mace:dir:attribute-def:documentPublisher',
|
||||
'documentTitle': 'urn:mace:dir:attribute-def:documentTitle',
|
||||
'documentVersion': 'urn:mace:dir:attribute-def:documentVersion',
|
||||
'domainComponent': 'urn:mace:dir:attribute-def:domainComponent',
|
||||
'drink': 'urn:mace:dir:attribute-def:drink',
|
||||
'eduOrgHomePageURI': 'urn:mace:dir:attribute-def:eduOrgHomePageURI',
|
||||
'eduOrgIdentityAuthNPolicyURI': 'urn:mace:dir:attribute-def:eduOrgIdentityAuthNPolicyURI',
|
||||
'eduOrgLegalName': 'urn:mace:dir:attribute-def:eduOrgLegalName',
|
||||
'eduOrgSuperiorURI': 'urn:mace:dir:attribute-def:eduOrgSuperiorURI',
|
||||
'eduOrgWhitePagesURI': 'urn:mace:dir:attribute-def:eduOrgWhitePagesURI',
|
||||
'eduPersonAffiliation': 'urn:mace:dir:attribute-def:eduPersonAffiliation',
|
||||
'eduPersonEntitlement': 'urn:mace:dir:attribute-def:eduPersonEntitlement',
|
||||
'eduPersonNickname': 'urn:mace:dir:attribute-def:eduPersonNickname',
|
||||
'eduPersonOrgDN': 'urn:mace:dir:attribute-def:eduPersonOrgDN',
|
||||
'eduPersonOrgUnitDN': 'urn:mace:dir:attribute-def:eduPersonOrgUnitDN',
|
||||
'eduPersonPrimaryAffiliation': 'urn:mace:dir:attribute-def:eduPersonPrimaryAffiliation',
|
||||
'eduPersonPrimaryOrgUnitDN': 'urn:mace:dir:attribute-def:eduPersonPrimaryOrgUnitDN',
|
||||
'eduPersonPrincipalName': 'urn:mace:dir:attribute-def:eduPersonPrincipalName',
|
||||
'eduPersonScopedAffiliation': 'urn:mace:dir:attribute-def:eduPersonScopedAffiliation',
|
||||
'eduPersonTargetedID': 'urn:mace:dir:attribute-def:eduPersonTargetedID',
|
||||
'email': 'urn:mace:dir:attribute-def:email',
|
||||
'emailAddress': 'urn:mace:dir:attribute-def:emailAddress',
|
||||
'employeeNumber': 'urn:mace:dir:attribute-def:employeeNumber',
|
||||
'employeeType': 'urn:mace:dir:attribute-def:employeeType',
|
||||
'enhancedSearchGuide': 'urn:mace:dir:attribute-def:enhancedSearchGuide',
|
||||
'facsimileTelephoneNumber': 'urn:mace:dir:attribute-def:facsimileTelephoneNumber',
|
||||
'favouriteDrink': 'urn:mace:dir:attribute-def:favouriteDrink',
|
||||
'fax': 'urn:mace:dir:attribute-def:fax',
|
||||
'federationFeideSchemaVersion': 'urn:mace:dir:attribute-def:federationFeideSchemaVersion',
|
||||
'friendlyCountryName': 'urn:mace:dir:attribute-def:friendlyCountryName',
|
||||
'generationQualifier': 'urn:mace:dir:attribute-def:generationQualifier',
|
||||
'givenName': 'urn:mace:dir:attribute-def:givenName',
|
||||
'gn': 'urn:mace:dir:attribute-def:gn',
|
||||
'homePhone': 'urn:mace:dir:attribute-def:homePhone',
|
||||
'homePostalAddress': 'urn:mace:dir:attribute-def:homePostalAddress',
|
||||
'homeTelephoneNumber': 'urn:mace:dir:attribute-def:homeTelephoneNumber',
|
||||
'host': 'urn:mace:dir:attribute-def:host',
|
||||
'houseIdentifier': 'urn:mace:dir:attribute-def:houseIdentifier',
|
||||
'info': 'urn:mace:dir:attribute-def:info',
|
||||
'initials': 'urn:mace:dir:attribute-def:initials',
|
||||
'internationaliSDNNumber': 'urn:mace:dir:attribute-def:internationaliSDNNumber',
|
||||
'janetMailbox': 'urn:mace:dir:attribute-def:janetMailbox',
|
||||
'jpegPhoto': 'urn:mace:dir:attribute-def:jpegPhoto',
|
||||
'knowledgeInformation': 'urn:mace:dir:attribute-def:knowledgeInformation',
|
||||
'l': 'urn:mace:dir:attribute-def:l',
|
||||
'labeledURI': 'urn:mace:dir:attribute-def:labeledURI',
|
||||
'localityName': 'urn:mace:dir:attribute-def:localityName',
|
||||
'mDRecord': 'urn:mace:dir:attribute-def:mDRecord',
|
||||
'mXRecord': 'urn:mace:dir:attribute-def:mXRecord',
|
||||
'mail': 'urn:mace:dir:attribute-def:mail',
|
||||
'mailPreferenceOption': 'urn:mace:dir:attribute-def:mailPreferenceOption',
|
||||
'manager': 'urn:mace:dir:attribute-def:manager',
|
||||
'member': 'urn:mace:dir:attribute-def:member',
|
||||
'mobile': 'urn:mace:dir:attribute-def:mobile',
|
||||
'mobileTelephoneNumber': 'urn:mace:dir:attribute-def:mobileTelephoneNumber',
|
||||
'nSRecord': 'urn:mace:dir:attribute-def:nSRecord',
|
||||
'name': 'urn:mace:dir:attribute-def:name',
|
||||
'norEduOrgAcronym': 'urn:mace:dir:attribute-def:norEduOrgAcronym',
|
||||
'norEduOrgNIN': 'urn:mace:dir:attribute-def:norEduOrgNIN',
|
||||
'norEduOrgSchemaVersion': 'urn:mace:dir:attribute-def:norEduOrgSchemaVersion',
|
||||
'norEduOrgUniqueIdentifier': 'urn:mace:dir:attribute-def:norEduOrgUniqueIdentifier',
|
||||
'norEduOrgUniqueNumber': 'urn:mace:dir:attribute-def:norEduOrgUniqueNumber',
|
||||
'norEduOrgUnitUniqueIdentifier': 'urn:mace:dir:attribute-def:norEduOrgUnitUniqueIdentifier',
|
||||
'norEduOrgUnitUniqueNumber': 'urn:mace:dir:attribute-def:norEduOrgUnitUniqueNumber',
|
||||
'norEduPersonBirthDate': 'urn:mace:dir:attribute-def:norEduPersonBirthDate',
|
||||
'norEduPersonLIN': 'urn:mace:dir:attribute-def:norEduPersonLIN',
|
||||
'norEduPersonNIN': 'urn:mace:dir:attribute-def:norEduPersonNIN',
|
||||
'o': 'urn:mace:dir:attribute-def:o',
|
||||
'objectClass': 'urn:mace:dir:attribute-def:objectClass',
|
||||
'organizationName': 'urn:mace:dir:attribute-def:organizationName',
|
||||
'organizationalStatus': 'urn:mace:dir:attribute-def:organizationalStatus',
|
||||
'organizationalUnitName': 'urn:mace:dir:attribute-def:organizationalUnitName',
|
||||
'otherMailbox': 'urn:mace:dir:attribute-def:otherMailbox',
|
||||
'ou': 'urn:mace:dir:attribute-def:ou',
|
||||
'owner': 'urn:mace:dir:attribute-def:owner',
|
||||
'pager': 'urn:mace:dir:attribute-def:pager',
|
||||
'pagerTelephoneNumber': 'urn:mace:dir:attribute-def:pagerTelephoneNumber',
|
||||
'personalSignature': 'urn:mace:dir:attribute-def:personalSignature',
|
||||
'personalTitle': 'urn:mace:dir:attribute-def:personalTitle',
|
||||
'photo': 'urn:mace:dir:attribute-def:photo',
|
||||
'physicalDeliveryOfficeName': 'urn:mace:dir:attribute-def:physicalDeliveryOfficeName',
|
||||
'pkcs9email': 'urn:mace:dir:attribute-def:pkcs9email',
|
||||
'postOfficeBox': 'urn:mace:dir:attribute-def:postOfficeBox',
|
||||
'postalAddress': 'urn:mace:dir:attribute-def:postalAddress',
|
||||
'postalCode': 'urn:mace:dir:attribute-def:postalCode',
|
||||
'preferredDeliveryMethod': 'urn:mace:dir:attribute-def:preferredDeliveryMethod',
|
||||
'preferredLanguage': 'urn:mace:dir:attribute-def:preferredLanguage',
|
||||
'presentationAddress': 'urn:mace:dir:attribute-def:presentationAddress',
|
||||
'protocolInformation': 'urn:mace:dir:attribute-def:protocolInformation',
|
||||
'pseudonym': 'urn:mace:dir:attribute-def:pseudonym',
|
||||
'registeredAddress': 'urn:mace:dir:attribute-def:registeredAddress',
|
||||
'rfc822Mailbox': 'urn:mace:dir:attribute-def:rfc822Mailbox',
|
||||
'roleOccupant': 'urn:mace:dir:attribute-def:roleOccupant',
|
||||
'roomNumber': 'urn:mace:dir:attribute-def:roomNumber',
|
||||
'sOARecord': 'urn:mace:dir:attribute-def:sOARecord',
|
||||
'searchGuide': 'urn:mace:dir:attribute-def:searchGuide',
|
||||
'secretary': 'urn:mace:dir:attribute-def:secretary',
|
||||
'seeAlso': 'urn:mace:dir:attribute-def:seeAlso',
|
||||
'serialNumber': 'urn:mace:dir:attribute-def:serialNumber',
|
||||
'singleLevelQuality': 'urn:mace:dir:attribute-def:singleLevelQuality',
|
||||
'sn': 'urn:mace:dir:attribute-def:sn',
|
||||
'st': 'urn:mace:dir:attribute-def:st',
|
||||
'stateOrProvinceName': 'urn:mace:dir:attribute-def:stateOrProvinceName',
|
||||
'street': 'urn:mace:dir:attribute-def:street',
|
||||
'streetAddress': 'urn:mace:dir:attribute-def:streetAddress',
|
||||
'subtreeMaximumQuality': 'urn:mace:dir:attribute-def:subtreeMaximumQuality',
|
||||
'subtreeMinimumQuality': 'urn:mace:dir:attribute-def:subtreeMinimumQuality',
|
||||
'supportedAlgorithms': 'urn:mace:dir:attribute-def:supportedAlgorithms',
|
||||
'supportedApplicationContext': 'urn:mace:dir:attribute-def:supportedApplicationContext',
|
||||
'surname': 'urn:mace:dir:attribute-def:surname',
|
||||
'telephoneNumber': 'urn:mace:dir:attribute-def:telephoneNumber',
|
||||
'teletexTerminalIdentifier': 'urn:mace:dir:attribute-def:teletexTerminalIdentifier',
|
||||
'telexNumber': 'urn:mace:dir:attribute-def:telexNumber',
|
||||
'textEncodedORAddress': 'urn:mace:dir:attribute-def:textEncodedORAddress',
|
||||
'title': 'urn:mace:dir:attribute-def:title',
|
||||
'uid': 'urn:mace:dir:attribute-def:uid',
|
||||
'uniqueIdentifier': 'urn:mace:dir:attribute-def:uniqueIdentifier',
|
||||
'uniqueMember': 'urn:mace:dir:attribute-def:uniqueMember',
|
||||
'userCertificate': 'urn:mace:dir:attribute-def:userCertificate',
|
||||
'userClass': 'urn:mace:dir:attribute-def:userClass',
|
||||
'userPKCS12': 'urn:mace:dir:attribute-def:userPKCS12',
|
||||
'userPassword': 'urn:mace:dir:attribute-def:userPassword',
|
||||
'userSMIMECertificate': 'urn:mace:dir:attribute-def:userSMIMECertificate',
|
||||
'userid': 'urn:mace:dir:attribute-def:userid',
|
||||
'x121Address': 'urn:mace:dir:attribute-def:x121Address',
|
||||
'x500UniqueIdentifier': 'urn:mace:dir:attribute-def:x500UniqueIdentifier',
|
||||
}
|
||||
}
|
||||
199
example/sp/attributemaps/saml_uri.py
Normal file
199
example/sp/attributemaps/saml_uri.py
Normal file
@@ -0,0 +1,199 @@
|
||||
__author__ = 'rolandh'
|
||||
|
||||
EDUPERSON_OID = "urn:oid:1.3.6.1.4.1.5923.1.1.1."
|
||||
X500ATTR_OID = "urn:oid:2.5.4."
|
||||
NOREDUPERSON_OID = "urn:oid:1.3.6.1.4.1.2428.90.1."
|
||||
NETSCAPE_LDAP = "urn:oid:2.16.840.1.113730.3.1."
|
||||
UCL_DIR_PILOT = 'urn:oid:0.9.2342.19200300.100.1.'
|
||||
PKCS_9 = "urn:oid:1.2.840.113549.1.9.1."
|
||||
UMICH = "urn:oid:1.3.6.1.4.1.250.1.57."
|
||||
|
||||
MAP = {
|
||||
"identifier": "urn:oasis:names:tc:SAML:2.0:attrname-format:uri",
|
||||
"fro": {
|
||||
EDUPERSON_OID+'2': 'eduPersonNickname',
|
||||
EDUPERSON_OID+'9': 'eduPersonScopedAffiliation',
|
||||
EDUPERSON_OID+'11': 'eduPersonAssurance',
|
||||
EDUPERSON_OID+'10': 'eduPersonTargetedID',
|
||||
EDUPERSON_OID+'4': 'eduPersonOrgUnitDN',
|
||||
NOREDUPERSON_OID+'6': 'norEduOrgAcronym',
|
||||
NOREDUPERSON_OID+'7': 'norEduOrgUniqueIdentifier',
|
||||
NOREDUPERSON_OID+'4': 'norEduPersonLIN',
|
||||
EDUPERSON_OID+'1': 'eduPersonAffiliation',
|
||||
NOREDUPERSON_OID+'2': 'norEduOrgUnitUniqueNumber',
|
||||
NETSCAPE_LDAP+'40': 'userSMIMECertificate',
|
||||
NOREDUPERSON_OID+'1': 'norEduOrgUniqueNumber',
|
||||
NETSCAPE_LDAP+'241': 'displayName',
|
||||
UCL_DIR_PILOT+'37': 'associatedDomain',
|
||||
EDUPERSON_OID+'6': 'eduPersonPrincipalName',
|
||||
NOREDUPERSON_OID+'8': 'norEduOrgUnitUniqueIdentifier',
|
||||
NOREDUPERSON_OID+'9': 'federationFeideSchemaVersion',
|
||||
X500ATTR_OID+'53': 'deltaRevocationList',
|
||||
X500ATTR_OID+'52': 'supportedAlgorithms',
|
||||
X500ATTR_OID+'51': 'houseIdentifier',
|
||||
X500ATTR_OID+'50': 'uniqueMember',
|
||||
X500ATTR_OID+'19': 'physicalDeliveryOfficeName',
|
||||
X500ATTR_OID+'18': 'postOfficeBox',
|
||||
X500ATTR_OID+'17': 'postalCode',
|
||||
X500ATTR_OID+'16': 'postalAddress',
|
||||
X500ATTR_OID+'15': 'businessCategory',
|
||||
X500ATTR_OID+'14': 'searchGuide',
|
||||
EDUPERSON_OID+'5': 'eduPersonPrimaryAffiliation',
|
||||
X500ATTR_OID+'12': 'title',
|
||||
X500ATTR_OID+'11': 'ou',
|
||||
X500ATTR_OID+'10': 'o',
|
||||
X500ATTR_OID+'37': 'cACertificate',
|
||||
X500ATTR_OID+'36': 'userCertificate',
|
||||
X500ATTR_OID+'31': 'member',
|
||||
X500ATTR_OID+'30': 'supportedApplicationContext',
|
||||
X500ATTR_OID+'33': 'roleOccupant',
|
||||
X500ATTR_OID+'32': 'owner',
|
||||
NETSCAPE_LDAP+'1': 'carLicense',
|
||||
PKCS_9+'1': 'email',
|
||||
NETSCAPE_LDAP+'3': 'employeeNumber',
|
||||
NETSCAPE_LDAP+'2': 'departmentNumber',
|
||||
X500ATTR_OID+'39': 'certificateRevocationList',
|
||||
X500ATTR_OID+'38': 'authorityRevocationList',
|
||||
NETSCAPE_LDAP+'216': 'userPKCS12',
|
||||
EDUPERSON_OID+'8': 'eduPersonPrimaryOrgUnitDN',
|
||||
X500ATTR_OID+'9': 'street',
|
||||
X500ATTR_OID+'8': 'st',
|
||||
NETSCAPE_LDAP+'39': 'preferredLanguage',
|
||||
EDUPERSON_OID+'7': 'eduPersonEntitlement',
|
||||
X500ATTR_OID+'2': 'knowledgeInformation',
|
||||
X500ATTR_OID+'7': 'l',
|
||||
X500ATTR_OID+'6': 'c',
|
||||
X500ATTR_OID+'5': 'serialNumber',
|
||||
X500ATTR_OID+'4': 'sn',
|
||||
UCL_DIR_PILOT+'60': 'jpegPhoto',
|
||||
X500ATTR_OID+'65': 'pseudonym',
|
||||
NOREDUPERSON_OID+'5': 'norEduPersonNIN',
|
||||
UCL_DIR_PILOT+'3': 'mail',
|
||||
UCL_DIR_PILOT+'25': 'dc',
|
||||
X500ATTR_OID+'40': 'crossCertificatePair',
|
||||
X500ATTR_OID+'42': 'givenName',
|
||||
X500ATTR_OID+'43': 'initials',
|
||||
X500ATTR_OID+'44': 'generationQualifier',
|
||||
X500ATTR_OID+'45': 'x500UniqueIdentifier',
|
||||
X500ATTR_OID+'46': 'dnQualifier',
|
||||
X500ATTR_OID+'47': 'enhancedSearchGuide',
|
||||
X500ATTR_OID+'48': 'protocolInformation',
|
||||
X500ATTR_OID+'54': 'dmdName',
|
||||
NETSCAPE_LDAP+'4': 'employeeType',
|
||||
X500ATTR_OID+'22': 'teletexTerminalIdentifier',
|
||||
X500ATTR_OID+'23': 'facsimileTelephoneNumber',
|
||||
X500ATTR_OID+'20': 'telephoneNumber',
|
||||
X500ATTR_OID+'21': 'telexNumber',
|
||||
X500ATTR_OID+'26': 'registeredAddress',
|
||||
X500ATTR_OID+'27': 'destinationIndicator',
|
||||
X500ATTR_OID+'24': 'x121Address',
|
||||
X500ATTR_OID+'25': 'internationaliSDNNumber',
|
||||
X500ATTR_OID+'28': 'preferredDeliveryMethod',
|
||||
X500ATTR_OID+'29': 'presentationAddress',
|
||||
EDUPERSON_OID+'3': 'eduPersonOrgDN',
|
||||
NOREDUPERSON_OID+'3': 'norEduPersonBirthDate',
|
||||
UMICH+'57': 'labeledURI',
|
||||
UCL_DIR_PILOT+'1': 'uid',
|
||||
},
|
||||
"to": {
|
||||
'roleOccupant': X500ATTR_OID+'33',
|
||||
'gn': X500ATTR_OID+'42',
|
||||
'norEduPersonNIN': NOREDUPERSON_OID+'5',
|
||||
'title': X500ATTR_OID+'12',
|
||||
'facsimileTelephoneNumber': X500ATTR_OID+'23',
|
||||
'mail': UCL_DIR_PILOT+'3',
|
||||
'postOfficeBox': X500ATTR_OID+'18',
|
||||
'fax': X500ATTR_OID+'23',
|
||||
'telephoneNumber': X500ATTR_OID+'20',
|
||||
'norEduPersonBirthDate': NOREDUPERSON_OID+'3',
|
||||
'rfc822Mailbox': UCL_DIR_PILOT+'3',
|
||||
'dc': UCL_DIR_PILOT+'25',
|
||||
'countryName': X500ATTR_OID+'6',
|
||||
'emailAddress': PKCS_9+'1',
|
||||
'employeeNumber': NETSCAPE_LDAP+'3',
|
||||
'organizationName': X500ATTR_OID+'10',
|
||||
'eduPersonAssurance': EDUPERSON_OID+'11',
|
||||
'norEduOrgAcronym': NOREDUPERSON_OID+'6',
|
||||
'registeredAddress': X500ATTR_OID+'26',
|
||||
'physicalDeliveryOfficeName': X500ATTR_OID+'19',
|
||||
'associatedDomain': UCL_DIR_PILOT+'37',
|
||||
'l': X500ATTR_OID+'7',
|
||||
'stateOrProvinceName': X500ATTR_OID+'8',
|
||||
'federationFeideSchemaVersion': NOREDUPERSON_OID+'9',
|
||||
'pkcs9email': PKCS_9+'1',
|
||||
'givenName': X500ATTR_OID+'42',
|
||||
'givenname': X500ATTR_OID+'42',
|
||||
'x500UniqueIdentifier': X500ATTR_OID+'45',
|
||||
'eduPersonNickname': EDUPERSON_OID+'2',
|
||||
'houseIdentifier': X500ATTR_OID+'51',
|
||||
'street': X500ATTR_OID+'9',
|
||||
'supportedAlgorithms': X500ATTR_OID+'52',
|
||||
'preferredLanguage': NETSCAPE_LDAP+'39',
|
||||
'postalAddress': X500ATTR_OID+'16',
|
||||
'email': PKCS_9+'1',
|
||||
'norEduOrgUnitUniqueIdentifier': NOREDUPERSON_OID+'8',
|
||||
'eduPersonPrimaryOrgUnitDN': EDUPERSON_OID+'8',
|
||||
'c': X500ATTR_OID+'6',
|
||||
'teletexTerminalIdentifier': X500ATTR_OID+'22',
|
||||
'o': X500ATTR_OID+'10',
|
||||
'cACertificate': X500ATTR_OID+'37',
|
||||
'telexNumber': X500ATTR_OID+'21',
|
||||
'ou': X500ATTR_OID+'11',
|
||||
'initials': X500ATTR_OID+'43',
|
||||
'eduPersonOrgUnitDN': EDUPERSON_OID+'4',
|
||||
'deltaRevocationList': X500ATTR_OID+'53',
|
||||
'norEduPersonLIN': NOREDUPERSON_OID+'4',
|
||||
'supportedApplicationContext': X500ATTR_OID+'30',
|
||||
'eduPersonEntitlement': EDUPERSON_OID+'7',
|
||||
'generationQualifier': X500ATTR_OID+'44',
|
||||
'eduPersonAffiliation': EDUPERSON_OID+'1',
|
||||
'eduPersonPrincipalName': EDUPERSON_OID+'6',
|
||||
'edupersonprincipalname': EDUPERSON_OID+'6',
|
||||
'localityName': X500ATTR_OID+'7',
|
||||
'owner': X500ATTR_OID+'32',
|
||||
'norEduOrgUnitUniqueNumber': NOREDUPERSON_OID+'2',
|
||||
'searchGuide': X500ATTR_OID+'14',
|
||||
'certificateRevocationList': X500ATTR_OID+'39',
|
||||
'organizationalUnitName': X500ATTR_OID+'11',
|
||||
'userCertificate': X500ATTR_OID+'36',
|
||||
'preferredDeliveryMethod': X500ATTR_OID+'28',
|
||||
'internationaliSDNNumber': X500ATTR_OID+'25',
|
||||
'uniqueMember': X500ATTR_OID+'50',
|
||||
'departmentNumber': NETSCAPE_LDAP+'2',
|
||||
'enhancedSearchGuide': X500ATTR_OID+'47',
|
||||
'userPKCS12': NETSCAPE_LDAP+'216',
|
||||
'eduPersonTargetedID': EDUPERSON_OID+'10',
|
||||
'norEduOrgUniqueNumber': NOREDUPERSON_OID+'1',
|
||||
'x121Address': X500ATTR_OID+'24',
|
||||
'destinationIndicator': X500ATTR_OID+'27',
|
||||
'eduPersonPrimaryAffiliation': EDUPERSON_OID+'5',
|
||||
'surname': X500ATTR_OID+'4',
|
||||
'jpegPhoto': UCL_DIR_PILOT+'60',
|
||||
'eduPersonScopedAffiliation': EDUPERSON_OID+'9',
|
||||
'edupersonscopedaffiliation': EDUPERSON_OID+'9',
|
||||
'protocolInformation': X500ATTR_OID+'48',
|
||||
'knowledgeInformation': X500ATTR_OID+'2',
|
||||
'employeeType': NETSCAPE_LDAP+'4',
|
||||
'userSMIMECertificate': NETSCAPE_LDAP+'40',
|
||||
'member': X500ATTR_OID+'31',
|
||||
'streetAddress': X500ATTR_OID+'9',
|
||||
'dmdName': X500ATTR_OID+'54',
|
||||
'postalCode': X500ATTR_OID+'17',
|
||||
'pseudonym': X500ATTR_OID+'65',
|
||||
'dnQualifier': X500ATTR_OID+'46',
|
||||
'crossCertificatePair': X500ATTR_OID+'40',
|
||||
'eduPersonOrgDN': EDUPERSON_OID+'3',
|
||||
'authorityRevocationList': X500ATTR_OID+'38',
|
||||
'displayName': NETSCAPE_LDAP+'241',
|
||||
'businessCategory': X500ATTR_OID+'15',
|
||||
'serialNumber': X500ATTR_OID+'5',
|
||||
'norEduOrgUniqueIdentifier': NOREDUPERSON_OID+'7',
|
||||
'st': X500ATTR_OID+'8',
|
||||
'carLicense': NETSCAPE_LDAP+'1',
|
||||
'presentationAddress': X500ATTR_OID+'29',
|
||||
'sn': X500ATTR_OID+'4',
|
||||
'domainComponent': UCL_DIR_PILOT+'25',
|
||||
'labeledURI': UMICH+'57',
|
||||
'uid': UCL_DIR_PILOT+'1'
|
||||
}
|
||||
}
|
||||
190
example/sp/attributemaps/shibboleth_uri.py
Normal file
190
example/sp/attributemaps/shibboleth_uri.py
Normal file
@@ -0,0 +1,190 @@
|
||||
EDUPERSON_OID = "urn:oid:1.3.6.1.4.1.5923.1.1.1."
|
||||
X500ATTR = "urn:oid:2.5.4."
|
||||
NOREDUPERSON_OID = "urn:oid:1.3.6.1.4.1.2428.90.1."
|
||||
NETSCAPE_LDAP = "urn:oid:2.16.840.1.113730.3.1."
|
||||
UCL_DIR_PILOT = "urn:oid:0.9.2342.19200300.100.1."
|
||||
PKCS_9 = "urn:oid:1.2.840.113549.1.9."
|
||||
UMICH = "urn:oid:1.3.6.1.4.1.250.1.57."
|
||||
|
||||
MAP = {
|
||||
"identifier": "urn:mace:shibboleth:1.0:attributeNamespace:uri",
|
||||
"fro": {
|
||||
EDUPERSON_OID+'2': 'eduPersonNickname',
|
||||
EDUPERSON_OID+'9': 'eduPersonScopedAffiliation',
|
||||
EDUPERSON_OID+'11': 'eduPersonAssurance',
|
||||
EDUPERSON_OID+'10': 'eduPersonTargetedID',
|
||||
EDUPERSON_OID+'4': 'eduPersonOrgUnitDN',
|
||||
NOREDUPERSON_OID+'6': 'norEduOrgAcronym',
|
||||
NOREDUPERSON_OID+'7': 'norEduOrgUniqueIdentifier',
|
||||
NOREDUPERSON_OID+'4': 'norEduPersonLIN',
|
||||
EDUPERSON_OID+'1': 'eduPersonAffiliation',
|
||||
NOREDUPERSON_OID+'2': 'norEduOrgUnitUniqueNumber',
|
||||
NETSCAPE_LDAP+'40': 'userSMIMECertificate',
|
||||
NOREDUPERSON_OID+'1': 'norEduOrgUniqueNumber',
|
||||
NETSCAPE_LDAP+'241': 'displayName',
|
||||
UCL_DIR_PILOT+'37': 'associatedDomain',
|
||||
EDUPERSON_OID+'6': 'eduPersonPrincipalName',
|
||||
NOREDUPERSON_OID+'8': 'norEduOrgUnitUniqueIdentifier',
|
||||
NOREDUPERSON_OID+'9': 'federationFeideSchemaVersion',
|
||||
X500ATTR+'53': 'deltaRevocationList',
|
||||
X500ATTR+'52': 'supportedAlgorithms',
|
||||
X500ATTR+'51': 'houseIdentifier',
|
||||
X500ATTR+'50': 'uniqueMember',
|
||||
X500ATTR+'19': 'physicalDeliveryOfficeName',
|
||||
X500ATTR+'18': 'postOfficeBox',
|
||||
X500ATTR+'17': 'postalCode',
|
||||
X500ATTR+'16': 'postalAddress',
|
||||
X500ATTR+'15': 'businessCategory',
|
||||
X500ATTR+'14': 'searchGuide',
|
||||
EDUPERSON_OID+'5': 'eduPersonPrimaryAffiliation',
|
||||
X500ATTR+'12': 'title',
|
||||
X500ATTR+'11': 'ou',
|
||||
X500ATTR+'10': 'o',
|
||||
X500ATTR+'37': 'cACertificate',
|
||||
X500ATTR+'36': 'userCertificate',
|
||||
X500ATTR+'31': 'member',
|
||||
X500ATTR+'30': 'supportedApplicationContext',
|
||||
X500ATTR+'33': 'roleOccupant',
|
||||
X500ATTR+'32': 'owner',
|
||||
NETSCAPE_LDAP+'1': 'carLicense',
|
||||
PKCS_9+'1': 'email',
|
||||
NETSCAPE_LDAP+'3': 'employeeNumber',
|
||||
NETSCAPE_LDAP+'2': 'departmentNumber',
|
||||
X500ATTR+'39': 'certificateRevocationList',
|
||||
X500ATTR+'38': 'authorityRevocationList',
|
||||
NETSCAPE_LDAP+'216': 'userPKCS12',
|
||||
EDUPERSON_OID+'8': 'eduPersonPrimaryOrgUnitDN',
|
||||
X500ATTR+'9': 'street',
|
||||
X500ATTR+'8': 'st',
|
||||
NETSCAPE_LDAP+'39': 'preferredLanguage',
|
||||
EDUPERSON_OID+'7': 'eduPersonEntitlement',
|
||||
X500ATTR+'2': 'knowledgeInformation',
|
||||
X500ATTR+'7': 'l',
|
||||
X500ATTR+'6': 'c',
|
||||
X500ATTR+'5': 'serialNumber',
|
||||
X500ATTR+'4': 'sn',
|
||||
UCL_DIR_PILOT+'60': 'jpegPhoto',
|
||||
X500ATTR+'65': 'pseudonym',
|
||||
NOREDUPERSON_OID+'5': 'norEduPersonNIN',
|
||||
UCL_DIR_PILOT+'3': 'mail',
|
||||
UCL_DIR_PILOT+'25': 'dc',
|
||||
X500ATTR+'40': 'crossCertificatePair',
|
||||
X500ATTR+'42': 'givenName',
|
||||
X500ATTR+'43': 'initials',
|
||||
X500ATTR+'44': 'generationQualifier',
|
||||
X500ATTR+'45': 'x500UniqueIdentifier',
|
||||
X500ATTR+'46': 'dnQualifier',
|
||||
X500ATTR+'47': 'enhancedSearchGuide',
|
||||
X500ATTR+'48': 'protocolInformation',
|
||||
X500ATTR+'54': 'dmdName',
|
||||
NETSCAPE_LDAP+'4': 'employeeType',
|
||||
X500ATTR+'22': 'teletexTerminalIdentifier',
|
||||
X500ATTR+'23': 'facsimileTelephoneNumber',
|
||||
X500ATTR+'20': 'telephoneNumber',
|
||||
X500ATTR+'21': 'telexNumber',
|
||||
X500ATTR+'26': 'registeredAddress',
|
||||
X500ATTR+'27': 'destinationIndicator',
|
||||
X500ATTR+'24': 'x121Address',
|
||||
X500ATTR+'25': 'internationaliSDNNumber',
|
||||
X500ATTR+'28': 'preferredDeliveryMethod',
|
||||
X500ATTR+'29': 'presentationAddress',
|
||||
EDUPERSON_OID+'3': 'eduPersonOrgDN',
|
||||
NOREDUPERSON_OID+'3': 'norEduPersonBirthDate',
|
||||
},
|
||||
"to":{
|
||||
'roleOccupant': X500ATTR+'33',
|
||||
'gn': X500ATTR+'42',
|
||||
'norEduPersonNIN': NOREDUPERSON_OID+'5',
|
||||
'title': X500ATTR+'12',
|
||||
'facsimileTelephoneNumber': X500ATTR+'23',
|
||||
'mail': UCL_DIR_PILOT+'3',
|
||||
'postOfficeBox': X500ATTR+'18',
|
||||
'fax': X500ATTR+'23',
|
||||
'telephoneNumber': X500ATTR+'20',
|
||||
'norEduPersonBirthDate': NOREDUPERSON_OID+'3',
|
||||
'rfc822Mailbox': UCL_DIR_PILOT+'3',
|
||||
'dc': UCL_DIR_PILOT+'25',
|
||||
'countryName': X500ATTR+'6',
|
||||
'emailAddress': PKCS_9+'1',
|
||||
'employeeNumber': NETSCAPE_LDAP+'3',
|
||||
'organizationName': X500ATTR+'10',
|
||||
'eduPersonAssurance': EDUPERSON_OID+'11',
|
||||
'norEduOrgAcronym': NOREDUPERSON_OID+'6',
|
||||
'registeredAddress': X500ATTR+'26',
|
||||
'physicalDeliveryOfficeName': X500ATTR+'19',
|
||||
'associatedDomain': UCL_DIR_PILOT+'37',
|
||||
'l': X500ATTR+'7',
|
||||
'stateOrProvinceName': X500ATTR+'8',
|
||||
'federationFeideSchemaVersion': NOREDUPERSON_OID+'9',
|
||||
'pkcs9email': PKCS_9+'1',
|
||||
'givenName': X500ATTR+'42',
|
||||
'x500UniqueIdentifier': X500ATTR+'45',
|
||||
'eduPersonNickname': EDUPERSON_OID+'2',
|
||||
'houseIdentifier': X500ATTR+'51',
|
||||
'street': X500ATTR+'9',
|
||||
'supportedAlgorithms': X500ATTR+'52',
|
||||
'preferredLanguage': NETSCAPE_LDAP+'39',
|
||||
'postalAddress': X500ATTR+'16',
|
||||
'email': PKCS_9+'1',
|
||||
'norEduOrgUnitUniqueIdentifier': NOREDUPERSON_OID+'8',
|
||||
'eduPersonPrimaryOrgUnitDN': EDUPERSON_OID+'8',
|
||||
'c': X500ATTR+'6',
|
||||
'teletexTerminalIdentifier': X500ATTR+'22',
|
||||
'o': X500ATTR+'10',
|
||||
'cACertificate': X500ATTR+'37',
|
||||
'telexNumber': X500ATTR+'21',
|
||||
'ou': X500ATTR+'11',
|
||||
'initials': X500ATTR+'43',
|
||||
'eduPersonOrgUnitDN': EDUPERSON_OID+'4',
|
||||
'deltaRevocationList': X500ATTR+'53',
|
||||
'norEduPersonLIN': NOREDUPERSON_OID+'4',
|
||||
'supportedApplicationContext': X500ATTR+'30',
|
||||
'eduPersonEntitlement': EDUPERSON_OID+'7',
|
||||
'generationQualifier': X500ATTR+'44',
|
||||
'eduPersonAffiliation': EDUPERSON_OID+'1',
|
||||
'eduPersonPrincipalName': EDUPERSON_OID+'6',
|
||||
'localityName': X500ATTR+'7',
|
||||
'owner': X500ATTR+'32',
|
||||
'norEduOrgUnitUniqueNumber': NOREDUPERSON_OID+'2',
|
||||
'searchGuide': X500ATTR+'14',
|
||||
'certificateRevocationList': X500ATTR+'39',
|
||||
'organizationalUnitName': X500ATTR+'11',
|
||||
'userCertificate': X500ATTR+'36',
|
||||
'preferredDeliveryMethod': X500ATTR+'28',
|
||||
'internationaliSDNNumber': X500ATTR+'25',
|
||||
'uniqueMember': X500ATTR+'50',
|
||||
'departmentNumber': NETSCAPE_LDAP+'2',
|
||||
'enhancedSearchGuide': X500ATTR+'47',
|
||||
'userPKCS12': NETSCAPE_LDAP+'216',
|
||||
'eduPersonTargetedID': EDUPERSON_OID+'10',
|
||||
'norEduOrgUniqueNumber': NOREDUPERSON_OID+'1',
|
||||
'x121Address': X500ATTR+'24',
|
||||
'destinationIndicator': X500ATTR+'27',
|
||||
'eduPersonPrimaryAffiliation': EDUPERSON_OID+'5',
|
||||
'surname': X500ATTR+'4',
|
||||
'jpegPhoto': UCL_DIR_PILOT+'60',
|
||||
'eduPersonScopedAffiliation': EDUPERSON_OID+'9',
|
||||
'protocolInformation': X500ATTR+'48',
|
||||
'knowledgeInformation': X500ATTR+'2',
|
||||
'employeeType': NETSCAPE_LDAP+'4',
|
||||
'userSMIMECertificate': NETSCAPE_LDAP+'40',
|
||||
'member': X500ATTR+'31',
|
||||
'streetAddress': X500ATTR+'9',
|
||||
'dmdName': X500ATTR+'54',
|
||||
'postalCode': X500ATTR+'17',
|
||||
'pseudonym': X500ATTR+'65',
|
||||
'dnQualifier': X500ATTR+'46',
|
||||
'crossCertificatePair': X500ATTR+'40',
|
||||
'eduPersonOrgDN': EDUPERSON_OID+'3',
|
||||
'authorityRevocationList': X500ATTR+'38',
|
||||
'displayName': NETSCAPE_LDAP+'241',
|
||||
'businessCategory': X500ATTR+'15',
|
||||
'serialNumber': X500ATTR+'5',
|
||||
'norEduOrgUniqueIdentifier': NOREDUPERSON_OID+'7',
|
||||
'st': X500ATTR+'8',
|
||||
'carLicense': NETSCAPE_LDAP+'1',
|
||||
'presentationAddress': X500ATTR+'29',
|
||||
'sn': X500ATTR+'4',
|
||||
'domainComponent': UCL_DIR_PILOT+'25',
|
||||
}
|
||||
}
|
||||
18
example/sp/pki/mycert.pem
Normal file
18
example/sp/pki/mycert.pem
Normal file
@@ -0,0 +1,18 @@
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIIC8jCCAlugAwIBAgIJAJHg2V5J31I8MA0GCSqGSIb3DQEBBQUAMFoxCzAJBgNV
|
||||
BAYTAlNFMQ0wCwYDVQQHEwRVbWVhMRgwFgYDVQQKEw9VbWVhIFVuaXZlcnNpdHkx
|
||||
EDAOBgNVBAsTB0lUIFVuaXQxEDAOBgNVBAMTB1Rlc3QgU1AwHhcNMDkxMDI2MTMz
|
||||
MTE1WhcNMTAxMDI2MTMzMTE1WjBaMQswCQYDVQQGEwJTRTENMAsGA1UEBxMEVW1l
|
||||
YTEYMBYGA1UEChMPVW1lYSBVbml2ZXJzaXR5MRAwDgYDVQQLEwdJVCBVbml0MRAw
|
||||
DgYDVQQDEwdUZXN0IFNQMIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDkJWP7
|
||||
bwOxtH+E15VTaulNzVQ/0cSbM5G7abqeqSNSs0l0veHr6/ROgW96ZeQ57fzVy2MC
|
||||
FiQRw2fzBs0n7leEmDJyVVtBTavYlhAVXDNa3stgvh43qCfLx+clUlOvtnsoMiiR
|
||||
mo7qf0BoPKTj7c0uLKpDpEbAHQT4OF1HRYVxMwIDAQABo4G/MIG8MB0GA1UdDgQW
|
||||
BBQ7RgbMJFDGRBu9o3tDQDuSoBy7JjCBjAYDVR0jBIGEMIGBgBQ7RgbMJFDGRBu9
|
||||
o3tDQDuSoBy7JqFepFwwWjELMAkGA1UEBhMCU0UxDTALBgNVBAcTBFVtZWExGDAW
|
||||
BgNVBAoTD1VtZWEgVW5pdmVyc2l0eTEQMA4GA1UECxMHSVQgVW5pdDEQMA4GA1UE
|
||||
AxMHVGVzdCBTUIIJAJHg2V5J31I8MAwGA1UdEwQFMAMBAf8wDQYJKoZIhvcNAQEF
|
||||
BQADgYEAMuRwwXRnsiyWzmRikpwinnhTmbooKm5TINPE7A7gSQ710RxioQePPhZO
|
||||
zkM27NnHTrCe2rBVg0EGz7QTd1JIwLPvgoj4VTi/fSha/tXrYUaqc9AqU1kWI4WN
|
||||
+vffBGQ09mo+6CffuFTZYeOhzP/2stAPwCTU4kxEoiy0KpZMANI=
|
||||
-----END CERTIFICATE-----
|
||||
15
example/sp/pki/mykey.pem
Normal file
15
example/sp/pki/mykey.pem
Normal file
@@ -0,0 +1,15 @@
|
||||
-----BEGIN RSA PRIVATE KEY-----
|
||||
MIICXAIBAAKBgQDkJWP7bwOxtH+E15VTaulNzVQ/0cSbM5G7abqeqSNSs0l0veHr
|
||||
6/ROgW96ZeQ57fzVy2MCFiQRw2fzBs0n7leEmDJyVVtBTavYlhAVXDNa3stgvh43
|
||||
qCfLx+clUlOvtnsoMiiRmo7qf0BoPKTj7c0uLKpDpEbAHQT4OF1HRYVxMwIDAQAB
|
||||
AoGAbx9rKH91DCw/ZEPhHsVXJ6cYHxGcMoAWvnMMC9WUN+bNo4gNL205DLfsxXA1
|
||||
jqXFXZj3+38vSFumGPA6IvXrN+Wyp3+Lz3QGc4K5OdHeBtYlxa6EsrxPgvuxYDUB
|
||||
vx3xdWPMjy06G/ML+pR9XHnRaPNubXQX3UxGBuLjwNXVmyECQQD2/D84tYoCGWoq
|
||||
5FhUBxFUy2nnOLKYC/GGxBTX62iLfMQ3fbQcdg2pJsB5rrniyZf7UL+9FOsAO9k1
|
||||
8DO7G12DAkEA7Hkdg1KEw4ZfjnnjEa+KqpyLTLRQ91uTVW6kzR+4zY719iUJ/PXE
|
||||
PxJqm1ot7mJd1LW+bWtjLpxs7jYH19V+kQJBAIEpn2JnxdmdMuFlcy/WVmDy09pg
|
||||
0z0imdexeXkFmjHAONkQOv3bWv+HzYaVMo8AgCOksfEPHGqN4eUMTfFeuUMCQF+5
|
||||
E1JSd/2yCkJhYqKJHae8oMLXByNqRXTCyiFioutK4JPYIHfugJdLfC4QziD+Xp85
|
||||
RrGCU+7NUWcIJhqfiJECQAIgUAzfzhdj5AyICaFPaOQ+N8FVMLcTyqeTXP0sIlFk
|
||||
JStVibemTRCbxdXXM7OVipz1oW3PBVEO3t/VyjiaGGg=
|
||||
-----END RSA PRIVATE KEY-----
|
||||
191
example/sp/sp.py
Executable file
191
example/sp/sp.py
Executable file
@@ -0,0 +1,191 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
import re
|
||||
from cgi import parse_qs
|
||||
from saml2 import BINDING_HTTP_REDIRECT
|
||||
|
||||
# -----------------------------------------------------------------------------
|
||||
def dict_to_table(ava, lev=0, width=1):
|
||||
txt = ['<table border=%s bordercolor="black">\n' % width]
|
||||
for prop, valarr in ava.items():
|
||||
txt.append("<tr>\n")
|
||||
if isinstance(valarr, basestring):
|
||||
txt.append("<th>%s</th>\n" % str(prop))
|
||||
try:
|
||||
txt.append("<td>%s</td>\n" % valarr.encode("utf8"))
|
||||
except AttributeError:
|
||||
txt.append("<td>%s</td>\n" % valarr)
|
||||
elif isinstance(valarr, list):
|
||||
i = 0
|
||||
n = len(valarr)
|
||||
for val in valarr:
|
||||
if not i:
|
||||
txt.append("<th rowspan=%d>%s</td>\n" % (len(valarr),prop))
|
||||
else:
|
||||
txt.append("<tr>\n")
|
||||
if isinstance(val, dict):
|
||||
txt.append("<td>\n")
|
||||
txt.extend(dict_to_table(val, lev+1, width-1))
|
||||
txt.append("</td>\n")
|
||||
else:
|
||||
try:
|
||||
txt.append("<td>%s</td>\n" % val.encode("utf8"))
|
||||
except AttributeError:
|
||||
txt.append("<td>%s</td>\n" % val)
|
||||
if n > 1:
|
||||
txt.append("</tr>\n")
|
||||
n -= 1
|
||||
i += 1
|
||||
elif isinstance(valarr, dict):
|
||||
txt.append("<th>%s</th>\n" % prop)
|
||||
txt.append("<td>\n")
|
||||
txt.extend(dict_to_table(valarr, lev+1, width-1))
|
||||
txt.append("</td>\n")
|
||||
txt.append("</tr>\n")
|
||||
txt.append('</table>\n')
|
||||
return txt
|
||||
|
||||
|
||||
#noinspection PyUnusedLocal
|
||||
def whoami(environ, start_response, user, logger):
|
||||
identity = environ["repoze.who.identity"]["user"]
|
||||
if not identity:
|
||||
return not_authn(environ, start_response)
|
||||
response = ["<h2>Your identity are supposed to be</h2>"]
|
||||
response.extend(dict_to_table(identity))
|
||||
response.extend("<a href='logout'>Logout</a>")
|
||||
start_response('200 OK', [('Content-Type', 'text/html')])
|
||||
return response[:]
|
||||
|
||||
#noinspection PyUnusedLocal
|
||||
def not_found(environ, start_response):
|
||||
"""Called if no URL matches."""
|
||||
start_response('404 NOT FOUND', [('Content-Type', 'text/plain')])
|
||||
return ['Not Found']
|
||||
|
||||
#noinspection PyUnusedLocal
|
||||
def not_authn(environ, start_response):
|
||||
start_response('401 Unauthorized', [('Content-Type', 'text/plain')])
|
||||
return ['Unknown user']
|
||||
|
||||
#noinspection PyUnusedLocal
|
||||
def slo(environ, start_response, user, logger):
|
||||
# so here I might get either a LogoutResponse or a LogoutRequest
|
||||
client = environ['repoze.who.plugins']["saml2auth"]
|
||||
sids = None
|
||||
if "QUERY_STRING" in environ:
|
||||
query = parse_qs(environ["QUERY_STRING"])
|
||||
if logger:
|
||||
logger.info("query: %s" % query)
|
||||
try:
|
||||
(sids, code, head, message) = client.saml_client.logout_response(
|
||||
query["SAMLResponse"][0],
|
||||
log=logger,
|
||||
binding=BINDING_HTTP_REDIRECT)
|
||||
logger.info("LOGOUT reponse parsed OK")
|
||||
except KeyError:
|
||||
# return error reply
|
||||
pass
|
||||
|
||||
if not sids:
|
||||
start_response("302 Found", [("Location", "/done")])
|
||||
return ["Successfull Logout"]
|
||||
|
||||
#noinspection PyUnusedLocal
|
||||
def logout(environ, start_response, user, logger):
|
||||
client = environ['repoze.who.plugins']["saml2auth"]
|
||||
subject_id = environ["repoze.who.identity"]['repoze.who.userid']
|
||||
logger.info("[logout] subject_id: '%s'" % (subject_id,))
|
||||
target = "/done"
|
||||
# What if more than one
|
||||
tmp = client.saml_client.global_logout(subject_id, log=logger,
|
||||
return_to=target)
|
||||
logger.info("[logout] global_logout > %s" % (tmp,))
|
||||
(session_id, code, header, result) = tmp
|
||||
|
||||
if session_id:
|
||||
start_response(code, header)
|
||||
return result
|
||||
else: # All was done using SOAP
|
||||
if result:
|
||||
start_response("302 Found", [("Location", target)])
|
||||
return ["Successfull Logout"]
|
||||
else:
|
||||
start_response("500 Internal Server Error")
|
||||
return ["Failed to logout from identity services"]
|
||||
|
||||
#noinspection PyUnusedLocal
|
||||
def done(environ, start_response, user, logger):
|
||||
# remove cookie and stored info
|
||||
logger.info("[done] environ: %s" % environ)
|
||||
subject_id = environ["repoze.who.identity"]['repoze.who.userid']
|
||||
client = environ['repoze.who.plugins']["saml2auth"]
|
||||
logger.info("[logout done] remaining subjects: %s" % (
|
||||
client.saml_client.users.subjects(),))
|
||||
|
||||
start_response('200 OK', [('Content-Type', 'text/html')])
|
||||
return ["<h3>You are now logged out from this service</h3>"]
|
||||
|
||||
# ----------------------------------------------------------------------------
|
||||
|
||||
# map urls to functions
|
||||
urls = [
|
||||
(r'whoami$', whoami),
|
||||
(r'logout$', logout),
|
||||
(r'done$', done),
|
||||
(r'slo$', slo),
|
||||
(r'^$', whoami),
|
||||
]
|
||||
|
||||
# ----------------------------------------------------------------------------
|
||||
|
||||
def application(environ, start_response):
|
||||
"""
|
||||
The main WSGI application. Dispatch the current request to
|
||||
the functions from above and store the regular expression
|
||||
captures in the WSGI environment as `myapp.url_args` so that
|
||||
the functions from above can access the url placeholders.
|
||||
|
||||
If nothing matches call the `not_found` function.
|
||||
|
||||
:param environ: The HTTP application environment
|
||||
:param start_response: The application to run when the handling of the
|
||||
request is done
|
||||
:return: The response as a list of lines
|
||||
"""
|
||||
user = environ.get("REMOTE_USER", "")
|
||||
if not user:
|
||||
user = environ.get("repoze.who.identity", "")
|
||||
|
||||
path = environ.get('PATH_INFO', '').lstrip('/')
|
||||
logger = environ.get('repoze.who.logger')
|
||||
if logger:
|
||||
logger.info( "<application> PATH: %s" % path)
|
||||
for regex, callback in urls:
|
||||
if user:
|
||||
match = re.search(regex, path)
|
||||
if match is not None:
|
||||
try:
|
||||
environ['myapp.url_args'] = match.groups()[0]
|
||||
except IndexError:
|
||||
environ['myapp.url_args'] = path
|
||||
return callback(environ, start_response, user, logger)
|
||||
else:
|
||||
return not_authn(environ, start_response)
|
||||
return not_found(environ, start_response)
|
||||
|
||||
# ----------------------------------------------------------------------------
|
||||
|
||||
from repoze.who.config import make_middleware_with_config
|
||||
|
||||
app_with_auth = make_middleware_with_config(application, {"here":"."},
|
||||
'./who.ini', log_file="sp.log")
|
||||
|
||||
# ----------------------------------------------------------------------------
|
||||
PORT = 8087
|
||||
|
||||
if __name__ == '__main__':
|
||||
from wsgiref.simple_server import make_server
|
||||
srv = make_server('localhost', PORT, app_with_auth)
|
||||
print "SP listening on port: %s" % PORT
|
||||
srv.serve_forever()
|
||||
45
example/sp/sp_conf.py
Normal file
45
example/sp/sp_conf.py
Normal file
@@ -0,0 +1,45 @@
|
||||
from saml2 import BINDING_HTTP_REDIRECT
|
||||
from saml2.saml import NAME_FORMAT_URI
|
||||
|
||||
BASE= "http://localhost:8087/"
|
||||
|
||||
CONFIG = {
|
||||
"entityid" : "urn:mace:umu.se:saml:roland:sp",
|
||||
"description": "My SP",
|
||||
"service": {
|
||||
"sp":{
|
||||
"name" : "Rolands SP",
|
||||
"endpoints":{
|
||||
"assertion_consumer_service": [BASE],
|
||||
"single_logout_service" : [(BASE+"slo",
|
||||
BINDING_HTTP_REDIRECT)],
|
||||
},
|
||||
"required_attributes": ["surname", "givenname",
|
||||
"edupersonaffiliation"],
|
||||
"optional_attributes": ["title"],
|
||||
"idp": [ "urn:mace:umu.se:saml:roland:idp"],
|
||||
}
|
||||
},
|
||||
"debug" : 1,
|
||||
"key_file" : "pki/mykey.pem",
|
||||
"cert_file" : "pki/mycert.pem",
|
||||
"attribute_map_dir" : "./attributemaps",
|
||||
"metadata" : {
|
||||
"local": ["../idp/idp.xml"],
|
||||
},
|
||||
# -- below used by make_metadata --
|
||||
"organization": {
|
||||
"name": "Exempel AB",
|
||||
"display_name": [("Exempel AB","se"),("Example Co.","en")],
|
||||
"url":"http://www.example.com/roland",
|
||||
},
|
||||
"contact_person": [{
|
||||
"given_name":"John",
|
||||
"sur_name": "Smith",
|
||||
"email_address": ["john.smith@example.com"],
|
||||
"contact_type": "technical",
|
||||
},
|
||||
],
|
||||
#"xmlsec_binary":"/usr/local/bin/xmlsec1",
|
||||
"name_form": NAME_FORMAT_URI
|
||||
}
|
||||
42
example/sp/who.ini
Normal file
42
example/sp/who.ini
Normal file
@@ -0,0 +1,42 @@
|
||||
[plugin:auth_tkt]
|
||||
# identification
|
||||
use = repoze.who.plugins.auth_tkt:make_plugin
|
||||
secret = kasamark
|
||||
cookie_name = pysaml2
|
||||
secure = False
|
||||
include_ip = True
|
||||
timeout = 3600
|
||||
reissue_time = 3000
|
||||
|
||||
# IDENTIFIER
|
||||
# @param :
|
||||
# - rememberer_name : name of the plugin for remembering (delegate)
|
||||
[plugin:saml2auth]
|
||||
use = s2repoze.plugins.sp:make_plugin
|
||||
saml_conf = sp_conf
|
||||
rememberer_name = auth_tkt
|
||||
debug = 1
|
||||
sid_store = outstanding
|
||||
identity_cache = identities
|
||||
|
||||
[general]
|
||||
request_classifier = s2repoze.plugins.challenge_decider:my_request_classifier
|
||||
challenge_decider = repoze.who.classifiers:default_challenge_decider
|
||||
remote_user_key = REMOTE_USER
|
||||
|
||||
[identifiers]
|
||||
# plugin_name;classifier_name:.. or just plugin_name (good for any)
|
||||
plugins =
|
||||
saml2auth
|
||||
auth_tkt
|
||||
|
||||
[authenticators]
|
||||
# plugin_name;classifier_name.. or just plugin_name (good for any)
|
||||
plugins = saml2auth
|
||||
|
||||
[challengers]
|
||||
# plugin_name;classifier_name:.. or just plugin_name (good for any)
|
||||
plugins = saml2auth
|
||||
|
||||
[mdproviders]
|
||||
plugins = saml2auth
|
||||
69
release-howto.rst
Normal file
69
release-howto.rst
Normal file
@@ -0,0 +1,69 @@
|
||||
Releasing software
|
||||
-------------------
|
||||
|
||||
When releasing a new version, the following steps should be taken:
|
||||
|
||||
1. Make sure all automated tests pass.
|
||||
|
||||
2. Fill in the release date in ``CHANGES``. Make sure the changelog is
|
||||
complete. Commit this change.
|
||||
|
||||
3. Make sure the package metadata in ``setup.py`` is up-to-date. You can
|
||||
verify the information by re-generating the egg info::
|
||||
|
||||
python setup.py egg_info
|
||||
|
||||
and inspecting ``src/pysaml2.egg-info/PKG-INFO``. You should also make sure
|
||||
that the long description renders as valid reStructuredText. You can
|
||||
do this by using the ``rst2html.py`` utility from docutils_::
|
||||
|
||||
python setup.py --long-description | rst2html > test.html
|
||||
|
||||
If this will produce warning or errors, PyPI will be unable to render
|
||||
the long description nicely. It will treat it as plain text instead.
|
||||
|
||||
4. Update the version in the setup.py file and the doc conf.py file. Commit
|
||||
these changes.
|
||||
|
||||
5. Create a release tag::
|
||||
|
||||
bzr tag X.Y.Z
|
||||
|
||||
6. Push these changes to Launchpad::
|
||||
|
||||
bzr push
|
||||
|
||||
7. Create a source distribution and upload it to PyPI using the following
|
||||
command::
|
||||
|
||||
python setup.py register sdist upload
|
||||
|
||||
8. Upload the documentation to PyPI. First you need to generate the html
|
||||
version of the documentation::
|
||||
|
||||
cd doc
|
||||
make clean
|
||||
make html
|
||||
cd _build/html
|
||||
zip -r pysaml2-docs.zip *
|
||||
|
||||
now go to http://pypi.python.org/pypi?%3Aaction=pkg_edit&name=pysaml2 and
|
||||
submit the pysaml2-docs.zip file in the form at the bottom of that page.
|
||||
|
||||
9. Create a new release at Launchpad. If no milestone was created for this
|
||||
release in the past, create it now at https://launchpad.net/pysaml2/main
|
||||
Then create a release for that milestone. You can copy the section of
|
||||
the CHANGES file that matches this release in the appropiate field of
|
||||
the Launchpad form. Finally, add a download file for that release.
|
||||
|
||||
10. Send an email to the pysaml2 list announcing this release
|
||||
|
||||
|
||||
**Important:** Once released to PyPI or any other public download location,
|
||||
a released egg may *never* be removed, even if it has proven to be a faulty
|
||||
release ("brown bag release"). In such a case it should simply be superseded
|
||||
immediately by a new, improved release.
|
||||
|
||||
.. _docutils: http://docutils.sourceforge.net/
|
||||
|
||||
This document is based on http://svn.zope.org/*checkout*/Sandbox/philikon/foundation/releasing-software.txt
|
||||
2105
runtests.py
Normal file
2105
runtests.py
Normal file
File diff suppressed because it is too large
Load Diff
93
setup.py
Executable file
93
setup.py
Executable file
@@ -0,0 +1,93 @@
|
||||
#!/usr/bin/env python
|
||||
#
|
||||
# Copyright (C) 2007 SIOS Technology, Inc.
|
||||
# Copyright (C) 2011 Umea Universitet, Sweden
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
#
|
||||
#
|
||||
import sys
|
||||
|
||||
from distutils.core import Command
|
||||
from setuptools import setup
|
||||
|
||||
|
||||
class PyTest(Command):
|
||||
user_options = []
|
||||
def initialize_options(self):
|
||||
pass
|
||||
def finalize_options(self):
|
||||
pass
|
||||
def run(self):
|
||||
import sys, subprocess
|
||||
errno = subprocess.call([sys.executable, 'runtests.py'])
|
||||
raise SystemExit(errno)
|
||||
|
||||
|
||||
install_requires=[
|
||||
# core dependencies
|
||||
'decorator',
|
||||
'httplib2',
|
||||
'paste',
|
||||
'zope.interface',
|
||||
'repoze.who == 1.0.18'
|
||||
]
|
||||
|
||||
# only for Python 2.6
|
||||
if sys.version_info < (2,7):
|
||||
install_requires.append('importlib')
|
||||
|
||||
setup(
|
||||
name='pysaml2',
|
||||
version='0.4.2',
|
||||
description='Python implementation of SAML Version 2 to for instance be used in a WSGI environment',
|
||||
# long_description = read("README"),
|
||||
author='Roland Hedberg',
|
||||
author_email='roland.hedberg@adm.umu.se',
|
||||
license='Apache 2.0',
|
||||
url='https://code.launchpad.net/~roland-hedberg/pysaml2/main',
|
||||
|
||||
packages=['saml2', 'xmldsig', 'xmlenc', 's2repoze', 's2repoze.plugins',
|
||||
"saml2/profile", "saml2/schema", "saml2/extension",
|
||||
"saml2/attributemaps"],
|
||||
|
||||
package_dir = {'':'src'},
|
||||
package_data={'': ['xml/*.xml']},
|
||||
|
||||
classifiers = ["Development Status :: 4 - Beta",
|
||||
"License :: OSI Approved :: Apache Software License",
|
||||
"Topic :: Software Development :: Libraries :: Python Modules"],
|
||||
|
||||
scripts=["tools/parse_xsd2.py", "tools/make_metadata.py"],
|
||||
|
||||
tests_require=[
|
||||
'pyasn1',
|
||||
'pymongo',
|
||||
'python-memcached',
|
||||
'pytest',
|
||||
#'pytest-coverage',
|
||||
],
|
||||
|
||||
install_requires=install_requires,
|
||||
|
||||
extras_require={
|
||||
'cjson': ['python-cjson'],
|
||||
'pyasn1': ['pyasn1'],
|
||||
'pymongo': ['pymongo'],
|
||||
'python-memcached': ['python-memcached']
|
||||
},
|
||||
|
||||
zip_safe=False,
|
||||
|
||||
cmdclass = {'test': PyTest},
|
||||
)
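# --- Illustrative usage sketch (not part of the original setup.py) ---
# With this file in place the package is installed and the test suite run as:
#
#   python setup.py install
#   python setup.py test    # the PyTest command above shells out to runtests.py
#
# The 'test' alias works because of the cmdclass mapping a few lines up.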
|
||||
3
src/s2repoze/__init__.py
Normal file
3
src/s2repoze/__init__.py
Normal file
@@ -0,0 +1,3 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Created by Roland Hedberg
|
||||
# Copyright (c) 2009 Umeå Universitet. All rights reserved.
|
||||
3
src/s2repoze/plugins/__init__.py
Normal file
3
src/s2repoze/plugins/__init__.py
Normal file
@@ -0,0 +1,3 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Created by Roland Hedberg
|
||||
# Copyright (c) 2009 Umeå Universitet. All rights reserved.
|
||||
100
src/s2repoze/plugins/challenge_decider.py
Normal file
100
src/s2repoze/plugins/challenge_decider.py
Normal file
@@ -0,0 +1,100 @@
|
||||
from paste.request import construct_url
|
||||
import zope.interface
|
||||
from repoze.who.interfaces import IRequestClassifier
|
||||
|
||||
from paste.httpheaders import REQUEST_METHOD
|
||||
from paste.httpheaders import CONTENT_TYPE
|
||||
from paste.httpheaders import USER_AGENT
|
||||
|
||||
import re
|
||||
|
||||
_DAV_METHODS = (
|
||||
'OPTIONS',
|
||||
'PROPFIND',
|
||||
'PROPPATCH',
|
||||
'MKCOL',
|
||||
'LOCK',
|
||||
'UNLOCK',
|
||||
'TRACE',
|
||||
'DELETE',
|
||||
'COPY',
|
||||
'MOVE'
|
||||
)
|
||||
|
||||
_DAV_USERAGENTS = (
|
||||
'Microsoft Data Access Internet Publishing Provider',
|
||||
'WebDrive',
|
||||
'Zope External Editor',
|
||||
'WebDAVFS',
|
||||
'Goliath',
|
||||
'neon',
|
||||
'davlib',
|
||||
'wsAPI',
|
||||
'Microsoft-WebDAV'
|
||||
)
|
||||
|
||||
def my_request_classifier(environ):
|
||||
""" Returns one of the classifiers 'dav', 'xmlpost', or 'browser',
|
||||
depending on the imperative logic below"""
|
||||
request_method = REQUEST_METHOD(environ)
|
||||
if request_method in _DAV_METHODS:
|
||||
return 'dav'
|
||||
useragent = USER_AGENT(environ)
|
||||
if useragent:
|
||||
for agent in _DAV_USERAGENTS:
|
||||
if useragent.find(agent) != -1:
|
||||
return 'dav'
|
||||
if request_method == 'POST':
|
||||
if CONTENT_TYPE(environ) == 'text/xml':
|
||||
return 'xmlpost'
|
||||
elif CONTENT_TYPE(environ) == "application/soap+xml":
|
||||
return 'soap'
|
||||
return 'browser'
|
||||
|
||||
zope.interface.directlyProvides(my_request_classifier, IRequestClassifier)
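# Illustrative sketch (not part of the original module): what the classifier
# yields for a couple of minimal, made-up WSGI environ dicts.
#
#   my_request_classifier({'REQUEST_METHOD': 'GET'})
#   # -> 'browser'
#   my_request_classifier({'REQUEST_METHOD': 'POST',
#                          'CONTENT_TYPE': 'application/soap+xml'})
#   # -> 'soap'
#   my_request_classifier({'REQUEST_METHOD': 'PROPFIND'})
#   # -> 'dav'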
|
||||
|
||||
class MyChallengeDecider:
|
||||
def __init__(self, path_login=""):
|
||||
self.path_login = path_login
|
||||
def __call__(self, environ, status, _headers):
|
||||
if status.startswith('401 '):
|
||||
return True
|
||||
else:
|
||||
# logout: we need to "forget" the user => require a specific challenge
|
||||
if environ.has_key('rwpc.logout'):
|
||||
return True
|
||||
|
||||
# If the user is already authenticated, whatever happens (except logout),
|
||||
# don't make a challenge
|
||||
if environ.has_key('repoze.who.identity'):
|
||||
return False
|
||||
|
||||
uri = environ.get('REQUEST_URI', None)
|
||||
if uri is None:
|
||||
uri = construct_url(environ)
|
||||
|
||||
# require a challenge for login
|
||||
for regex in self.path_login:
|
||||
if regex.match(uri) is not None:
|
||||
return True
|
||||
|
||||
return False
|
||||
|
||||
|
||||
|
||||
def make_plugin(path_login = None):
|
||||
if path_login is None:
|
||||
raise ValueError(
|
||||
'must include path_login in configuration')
|
||||
|
||||
# make regexp out of string passed via the config file
|
||||
list_login = []
|
||||
for arg in path_login.splitlines():
|
||||
carg = arg.lstrip()
|
||||
if carg != '':
|
||||
list_login.append(re.compile(carg))
|
||||
|
||||
plugin = MyChallengeDecider(list_login)
|
||||
|
||||
return plugin
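# Illustrative sketch (not part of the original module): path_login arrives as
# a newline-separated string, typically from a repoze.who .ini file; the paths
# below are made up.
#
#   decider = make_plugin("/login\n/sso/.*")
#   # decider.path_login == [re.compile('/login'), re.compile('/sso/.*')]
#   decider({}, '401 Unauthorized', [])   # -> True (a 401 always challenges)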
|
||||
|
||||
77
src/s2repoze/plugins/entitlement.py
Normal file
77
src/s2repoze/plugins/entitlement.py
Normal file
@@ -0,0 +1,77 @@
|
||||
#!/usr/bin/env python
|
||||
import shelve
|
||||
|
||||
from zope.interface import implements
|
||||
|
||||
#from repoze.who.interfaces import IChallenger, IIdentifier, IAuthenticator
|
||||
from repoze.who.interfaces import IMetadataProvider
|
||||
|
||||
class EntitlementMetadataProvider(object):
|
||||
|
||||
implements(IMetadataProvider)
|
||||
|
||||
def __init__(self, filename, key_attribute):
|
||||
# Means I have to do explicit syncs on writes, but also
|
||||
# that it's faster on reads since it will cache data
|
||||
self._store = shelve.open(filename, writeback=True)
|
||||
self.key_attribute = key_attribute
|
||||
|
||||
def keys(self):
|
||||
return self._store.keys()
|
||||
|
||||
def get(self, user, attribute):
|
||||
return self._store[user][attribute]
|
||||
|
||||
def set(self, user, attribute, value):
|
||||
if user not in self._store:
|
||||
self._store[user] = {}
|
||||
|
||||
self._store[user][attribute] = value
|
||||
self._store.sync()
|
||||
|
||||
def part_of(self, user, virtualorg):
|
||||
if virtualorg in self._store[user]["entitlement"]:
|
||||
return True
|
||||
else:
|
||||
return False
|
||||
|
||||
def get_entitlement(self, user, virtualorg):
|
||||
try:
|
||||
return self._store[user]["entitlement"][virtualorg]
|
||||
except KeyError:
|
||||
return []
|
||||
|
||||
def store_entitlement(self, user, virtualorg, entitlement=None):
|
||||
if user not in self._store:
|
||||
self._store[user] = {"entitlement":{}}
|
||||
elif "entitlement" not in self._store[user]:
|
||||
self._store[user]["entitlement"] = {}
|
||||
|
||||
if entitlement is None:
|
||||
entitlement = []
|
||||
self._store[user]["entitlement"][virtualorg] = entitlement
|
||||
self._store.sync()
|
||||
|
||||
def add_metadata(self, environ, identity):
|
||||
#logger = environ.get('repoze.who.logger','')
|
||||
try:
|
||||
user = self._store[identity.get('repoze.who.userid')]
|
||||
except KeyError:
|
||||
return
|
||||
|
||||
try:
|
||||
vorg = environ["myapp.vo"]
|
||||
try:
|
||||
ents = user["entitlement"][vorg]
|
||||
identity["user"] = {
|
||||
"entitlement": ["%s:%s" % (vorg,e) for e in ents]}
|
||||
except KeyError:
|
||||
pass
|
||||
except KeyError:
|
||||
res = []
|
||||
for vorg, ents in user["entitlement"].items():
|
||||
res.extend(["%s:%s" % (vorg, e) for e in ents])
|
||||
identity["user"] = res
|
||||
|
||||
def make_plugin(filename, key_attribute=""):
|
||||
return EntitlementMetadataProvider(filename, key_attribute)
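# Illustrative usage sketch (not part of the original module); the shelve file
# name, user and virtual organisation are made up.
#
#   emp = make_plugin("/tmp/entitlements.db")
#   emp.store_entitlement("roland", "example-vo", ["member"])
#   emp.get_entitlement("roland", "example-vo")   # -> ["member"]
#   emp.part_of("roland", "example-vo")           # -> True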
|
||||
163
src/s2repoze/plugins/formswithhidden.py
Normal file
163
src/s2repoze/plugins/formswithhidden.py
Normal file
@@ -0,0 +1,163 @@
|
||||
import urllib
|
||||
|
||||
from paste.httpheaders import CONTENT_LENGTH
|
||||
from paste.httpheaders import CONTENT_TYPE
|
||||
from paste.httpheaders import LOCATION
|
||||
from paste.httpexceptions import HTTPFound
|
||||
|
||||
from paste.request import parse_dict_querystring
|
||||
from paste.request import parse_formvars
|
||||
from paste.request import construct_url
|
||||
|
||||
from zope.interface import implements
|
||||
|
||||
from repoze.who.interfaces import IChallenger
|
||||
from repoze.who.interfaces import IIdentifier
|
||||
from repoze.who.plugins.form import FormPlugin
|
||||
|
||||
_DEFAULT_FORM = """
|
||||
<html>
|
||||
<head>
|
||||
<title>Demo Organization Log In</title>
|
||||
</head>
|
||||
<body>
|
||||
<div>
|
||||
<b>Demo Organization Log In</b>
|
||||
</div>
|
||||
<br/>
|
||||
<form method="POST" action="?__do_login=true">
|
||||
<table width="350" border="0" cellspacing="0" cellpadding="1">
|
||||
<tr>
|
||||
<td bgcolor="#999999">
|
||||
<table width="350" border="0"
|
||||
cellpadding="3" cellspacing="0" bgcolor="#e6e6e6">
|
||||
<tr>
|
||||
<td colspan="2">
|
||||
</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td width="85">
|
||||
<font color="#CC3300" >
|
||||
<strong>
|
||||
Användarnamn/Username:
|
||||
</strong>
|
||||
</font>
|
||||
</td>
|
||||
<td width="295">
|
||||
<input type="text" name="login">
|
||||
</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td width="85">
|
||||
<font color="#CC3300">
|
||||
<strong>
|
||||
Lösenord/Password:
|
||||
</strong>
|
||||
</font>
|
||||
</td>
|
||||
<td width="295">
|
||||
<input type="password" name="password">
|
||||
</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td colspan="2">
|
||||
<input name="submit" type="submit" value="Logga in">
|
||||
</td>
|
||||
</tr>
|
||||
</table>
|
||||
</td>
|
||||
</tr>
|
||||
</table>
|
||||
%s
|
||||
</form>
|
||||
<pre>
|
||||
</pre>
|
||||
</body>
|
||||
</html>
|
||||
"""
|
||||
|
||||
HIDDEN_PRE_LINE = """<input type=hidden name="%s" value="%s">"""
|
||||
|
||||
class FormHiddenPlugin(FormPlugin):
|
||||
|
||||
implements(IChallenger, IIdentifier)
|
||||
|
||||
# IIdentifier
|
||||
def identify(self, environ):
|
||||
logger = environ.get('repoze.who.logger','')
|
||||
logger and logger.info("formplugin identify")
|
||||
#logger and logger.info("environ keys: %s" % environ.keys())
|
||||
query = parse_dict_querystring(environ)
|
||||
# If the extractor finds a special query string on any request,
|
||||
# it will attempt to find the values in the input body.
|
||||
if query.get(self.login_form_qs):
|
||||
form = parse_formvars(environ)
|
||||
from StringIO import StringIO
|
||||
# we need to replace wsgi.input because we've read it
|
||||
# this smells funny
|
||||
environ['wsgi.input'] = StringIO()
|
||||
form.update(query)
|
||||
qinfo = {}
|
||||
for key, val in form.items():
|
||||
if key.startswith("_") and key.endswith("_"):
|
||||
qinfo[key[1:-1]] = val
|
||||
if qinfo:
|
||||
environ["s2repoze.qinfo"] = qinfo
|
||||
try:
|
||||
login = form['login']
|
||||
password = form['password']
|
||||
except KeyError:
|
||||
return None
|
||||
del query[self.login_form_qs]
|
||||
query.update(qinfo)
|
||||
environ['QUERY_STRING'] = urllib.urlencode(query)
|
||||
environ['repoze.who.application'] = HTTPFound(
|
||||
construct_url(environ))
|
||||
credentials = {'login':login, 'password':password}
|
||||
max_age = form.get('max_age', None)
|
||||
if max_age is not None:
|
||||
credentials['max_age'] = max_age
|
||||
return credentials
|
||||
|
||||
return None
|
||||
|
||||
# IChallenger
|
||||
def challenge(self, environ, status, app_headers, forget_headers):
|
||||
logger = environ.get('repoze.who.logger','')
|
||||
logger and logger.info("formplugin challenge")
|
||||
if app_headers:
|
||||
location = LOCATION(app_headers)
|
||||
if location:
|
||||
headers = list(app_headers) + list(forget_headers)
|
||||
return HTTPFound(headers = headers)
|
||||
|
||||
query = parse_dict_querystring(environ)
|
||||
hidden = []
|
||||
for key, val in query.items():
|
||||
hidden.append(HIDDEN_PRE_LINE % ("_%s_" % key, val))
|
||||
|
||||
logger and logger.info("hidden: %s" % (hidden,))
|
||||
form = self.formbody or _DEFAULT_FORM
|
||||
form = form % "\n".join(hidden)
|
||||
|
||||
if self.formcallable is not None:
|
||||
form = self.formcallable(environ)
|
||||
def auth_form(environ, start_response):
|
||||
content_length = CONTENT_LENGTH.tuples(str(len(form)))
|
||||
content_type = CONTENT_TYPE.tuples('text/html')
|
||||
headers = content_length + content_type + forget_headers
|
||||
start_response('200 OK', headers)
|
||||
return [form]
|
||||
|
||||
return auth_form
|
||||
|
||||
|
||||
def make_plugin(login_form_qs='__do_login', rememberer_name=None, form=None):
|
||||
if rememberer_name is None:
|
||||
raise ValueError(
|
||||
'must include rememberer key (name of another IIdentifier plugin)')
|
||||
if form is not None:
|
||||
form = open(form).read()
|
||||
plugin = FormHiddenPlugin(login_form_qs, rememberer_name, form)
|
||||
return plugin
|
||||
|
||||
35
src/s2repoze/plugins/ini.py
Normal file
35
src/s2repoze/plugins/ini.py
Normal file
@@ -0,0 +1,35 @@
|
||||
import ConfigParser
|
||||
|
||||
from zope.interface import implements
|
||||
|
||||
#from repoze.who.interfaces import IChallenger, IIdentifier, IAuthenticator
|
||||
from repoze.who.interfaces import IMetadataProvider
|
||||
|
||||
class INIMetadataProvider(object):
|
||||
|
||||
implements(IMetadataProvider)
|
||||
|
||||
def __init__(self, ini_file, key_attribute):
|
||||
|
||||
self.users = ConfigParser.ConfigParser()
|
||||
self.users.readfp(open(ini_file))
|
||||
self.key_attribute = key_attribute
|
||||
|
||||
def add_metadata(self, _environ, identity):
|
||||
#logger = environ.get('repoze.who.logger','')
|
||||
|
||||
key = identity.get('repoze.who.userid')
|
||||
try:
|
||||
if self.key_attribute:
|
||||
for sec in self.users.sections():
|
||||
if self.users.has_option(sec, self.key_attribute):
|
||||
if key in self.users.get(sec, self.key_attribute):
|
||||
identity["user"] = dict(self.users.items(sec))
|
||||
break
|
||||
else:
|
||||
identity["user"] = dict(self.users.items(key))
|
||||
except ValueError:
|
||||
pass
|
||||
|
||||
def make_plugin(ini_file, key_attribute=""):
|
||||
return INIMetadataProvider(ini_file, key_attribute)
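# Illustrative sketch (not part of the original module): with an INI file such
# as the (made-up) one below, add_metadata copies the user's section onto the
# identity.
#
#   # users.ini:
#   #   [roland]
#   #   mail = roland@example.com
#   #   surname = Hedberg
#
#   imp = make_plugin("users.ini")
#   identity = {"repoze.who.userid": "roland"}
#   imp.add_metadata({}, identity)
#   # identity["user"] == {"mail": "roland@example.com", "surname": "Hedberg"}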
|
||||
569
src/s2repoze/plugins/sp.py
Normal file
569
src/s2repoze/plugins/sp.py
Normal file
@@ -0,0 +1,569 @@
|
||||
# Copyright (C) 2009 Umea University
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
"""
|
||||
A plugin that allows you to use SAML2 SSO as authentication
|
||||
and SAML2 attribute aggregations as metadata collector in your
|
||||
WSGI application.
|
||||
|
||||
"""
|
||||
import cgi
|
||||
import sys
|
||||
import platform
|
||||
import shelve
|
||||
import traceback
|
||||
from urlparse import parse_qs
|
||||
|
||||
from paste.httpexceptions import HTTPSeeOther
|
||||
from paste.httpexceptions import HTTPNotImplemented
|
||||
from paste.httpexceptions import HTTPInternalServerError
|
||||
from paste.request import parse_dict_querystring
|
||||
from paste.request import construct_url
|
||||
from zope.interface import implements
|
||||
|
||||
from repoze.who.interfaces import IChallenger, IIdentifier, IAuthenticator
|
||||
from repoze.who.interfaces import IMetadataProvider
|
||||
from repoze.who.plugins.form import FormPluginBase
|
||||
|
||||
from saml2 import ecp
|
||||
|
||||
from saml2.client import Saml2Client
|
||||
from saml2.s_utils import sid
|
||||
from saml2.config import config_factory
|
||||
from saml2.profile import paos
|
||||
|
||||
#from saml2.population import Population
|
||||
#from saml2.attribute_resolver import AttributeResolver
|
||||
|
||||
PAOS_HEADER_INFO = 'ver="%s";"%s"' % (paos.NAMESPACE, ecp.SERVICE)
|
||||
|
||||
def construct_came_from(environ):
|
||||
""" The URL that the user used when the process where interupted
|
||||
for single-sign-on processing. """
|
||||
|
||||
came_from = environ.get("PATH_INFO")
|
||||
qstr = environ.get("QUERY_STRING","")
|
||||
if qstr:
|
||||
came_from += '?' + qstr
|
||||
return came_from
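# Illustrative sketch (not part of the original module): for a made-up request
# to /protected?next=1 the function returns the relative URL to come back to
# once single sign-on has completed.
#
#   construct_came_from({"PATH_INFO": "/protected", "QUERY_STRING": "next=1"})
#   # -> "/protected?next=1"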
|
||||
|
||||
# FormPluginBase defines the methods remember and forget
|
||||
def cgi_field_storage_to_dict(field_storage):
|
||||
"""Get a plain dictionary, rather than the '.value' system used by the
|
||||
cgi module."""
|
||||
|
||||
params = {}
|
||||
for key in field_storage.keys():
|
||||
try:
|
||||
params[ key ] = field_storage[ key ].value
|
||||
except AttributeError:
|
||||
if isinstance(field_storage[ key ], basestring):
|
||||
params[key] = field_storage[key]
|
||||
|
||||
return params
|
||||
|
||||
def get_body(environ, log=None):
|
||||
body = ""
|
||||
|
||||
length = int(environ["CONTENT_LENGTH"])
|
||||
try:
|
||||
body = environ["wsgi.input"].read(length)
|
||||
except Exception, excp:
|
||||
if log:
|
||||
log.info("Exception while reading post: %s" % (excp,))
|
||||
raise
|
||||
|
||||
# restore what I might have upset
|
||||
from StringIO import StringIO
|
||||
environ['wsgi.input'] = StringIO(body)
|
||||
environ['s2repoze.body'] = body
|
||||
|
||||
return body
|
||||
|
||||
def exception_trace(tag, exc, log):
|
||||
message = traceback.format_exception(*sys.exc_info())
|
||||
log.error("[%s] ExcList: %s" % (tag, "".join(message),))
|
||||
log.error("[%s] Exception: %s" % (tag, exc))
|
||||
|
||||
class ECP_response(object):
|
||||
code = 200
|
||||
title = 'OK'
|
||||
|
||||
def __init__(self, content):
|
||||
self.content = content
|
||||
|
||||
#noinspection PyUnusedLocal
|
||||
def __call__(self, environ, start_response):
|
||||
start_response('%s %s' % (self.code, self.title),
|
||||
[('Content-Type', "text/xml")])
|
||||
return [self.content]
|
||||
|
||||
class SAML2Plugin(FormPluginBase):
|
||||
|
||||
implements(IChallenger, IIdentifier, IAuthenticator, IMetadataProvider)
|
||||
|
||||
def __init__(self, rememberer_name, config, saml_client,
|
||||
wayf, cache, debug, sid_store=None, discovery=""):
|
||||
FormPluginBase.__init__(self)
|
||||
|
||||
self.rememberer_name = rememberer_name
|
||||
self.debug = debug
|
||||
self.wayf = wayf
|
||||
self.saml_client = saml_client
|
||||
self.discovery = discovery
|
||||
self.conf = config
|
||||
self.log = None
|
||||
self.cache = cache
|
||||
|
||||
try:
|
||||
self.metadata = self.conf.metadata
|
||||
except KeyError:
|
||||
self.metadata = None
|
||||
if sid_store:
|
||||
self.outstanding_queries = shelve.open(sid_store, writeback=True)
|
||||
else:
|
||||
self.outstanding_queries = {}
|
||||
self.iam = platform.node()
|
||||
|
||||
def _get_post(self, environ):
|
||||
"""
|
||||
Get the posted information
|
||||
|
||||
:param environ: A dictionary with environment variables
|
||||
"""
|
||||
|
||||
post = {}
|
||||
|
||||
post_env = environ.copy()
|
||||
post_env['QUERY_STRING'] = ''
|
||||
|
||||
_ = get_body(environ, self.log)
|
||||
|
||||
try:
|
||||
post = cgi.FieldStorage(
|
||||
fp=environ['wsgi.input'],
|
||||
environ=post_env,
|
||||
keep_blank_values=True
|
||||
)
|
||||
except Exception, excp:
|
||||
if self.debug and self.log:
|
||||
self.log.info("Exception (II): %s" % (excp,))
|
||||
raise
|
||||
|
||||
if self.debug and self.log:
|
||||
self.log.info('identify post: %s' % (post,))
|
||||
|
||||
return post
|
||||
|
||||
def _wayf_redirect(self, came_from):
|
||||
sid_ = sid()
|
||||
self.outstanding_queries[sid_] = came_from
|
||||
self.log.info("Redirect to WAYF function: %s" % self.wayf)
|
||||
return -1, HTTPSeeOther(headers = [('Location',
|
||||
"%s?%s" % (self.wayf, sid_))])
|
||||
|
||||
#noinspection PyUnusedLocal
|
||||
def _pick_idp(self, environ, came_from):
|
||||
"""
|
||||
If more than one IdP is available and none has been selected, I have to do WAYF or
|
||||
disco
|
||||
"""
|
||||
|
||||
# check headers to see if it's an ECP request
|
||||
# headers = {
|
||||
# 'Accept' : 'text/html; application/vnd.paos+xml',
|
||||
# 'PAOS' : 'ver="%s";"%s"' % (paos.NAMESPACE, SERVICE)
|
||||
# }
|
||||
|
||||
self.log.info("[_pick_idp] %s" % environ)
|
||||
if "HTTP_PAOS" in environ:
|
||||
if environ["HTTP_PAOS"] == PAOS_HEADER_INFO:
|
||||
if 'application/vnd.paos+xml' in environ["HTTP_ACCEPT"]:
|
||||
# Where should I redirect the user to
|
||||
# entityid -> the IdP to use
|
||||
# relay_state -> when back from authentication
|
||||
|
||||
self.log.info("- ECP client detected -")
|
||||
|
||||
_relay_state = construct_came_from(environ)
|
||||
_entityid = self.saml_client.config.ecp_endpoint(
|
||||
environ["REMOTE_ADDR"])
|
||||
if not _entityid:
|
||||
return -1, HTTPInternalServerError(
|
||||
detail="No IdP to talk to"
|
||||
)
|
||||
self.log.info("IdP to talk to: %s" % _entityid)
|
||||
return ecp.ecp_auth_request(self.saml_client, _entityid,
|
||||
_relay_state, log=self.log)
|
||||
else:
|
||||
return -1, HTTPInternalServerError(
|
||||
detail='Faulty Accept header')
|
||||
else:
|
||||
return -1, HTTPInternalServerError(
|
||||
detail='unknown ECP version')
|
||||
|
||||
|
||||
idps = self.conf.idps()
|
||||
|
||||
if self.log:
|
||||
self.log.info("IdP URL: %s" % idps)
|
||||
|
||||
if len( idps ) == 1:
|
||||
# idps is a dictionary
|
||||
idp_entity_id = idps.keys()[0]
|
||||
elif not len(idps):
|
||||
return -1, HTTPInternalServerError(detail='Misconfiguration')
|
||||
else:
|
||||
idp_entity_id = ""
|
||||
if self.log:
|
||||
self.log.info("ENVIRON: %s" % environ)
|
||||
query = environ.get('s2repoze.body','')
|
||||
if not query:
|
||||
query = environ.get("QUERY_STRING","")
|
||||
|
||||
if self.log:
|
||||
self.log.info("<_pick_idp> query: %s" % query)
|
||||
|
||||
if self.wayf:
|
||||
if query:
|
||||
try:
|
||||
wayf_selected = dict(parse_qs(query))["wayf_selected"][0]
|
||||
except KeyError:
|
||||
return self._wayf_redirect(came_from)
|
||||
idp_entity_id = wayf_selected
|
||||
else:
|
||||
return self._wayf_redirect(came_from)
|
||||
elif self.discovery:
|
||||
if query:
|
||||
idp_entity_id = self.saml_client.get_idp_from_discovery_service(
|
||||
query=environ.get("QUERY_STRING"))
|
||||
else:
|
||||
sid_ = sid()
|
||||
self.outstanding_queries[sid_] = came_from
|
||||
self.log.info("Redirect to Discovery Service function")
|
||||
loc = self.saml_client.request_to_discovery_service(
|
||||
self.discovery)
|
||||
return -1, HTTPSeeOther(headers = [('Location',loc)])
|
||||
else:
|
||||
return -1, HTTPNotImplemented(detail='No WAYF or DJ present!')
|
||||
|
||||
self.log.info("Choosen IdP: '%s'" % idp_entity_id)
|
||||
return 0, idp_entity_id
|
||||
|
||||
#### IChallenger ####
|
||||
#noinspection PyUnusedLocal
|
||||
def challenge(self, environ, _status, _app_headers, _forget_headers):
|
||||
|
||||
# this challenge consists in logging out
|
||||
if environ.has_key('rwpc.logout'):
|
||||
# ignore right now?
|
||||
pass
|
||||
|
||||
self.log = environ.get('repoze.who.logger','')
|
||||
self.saml_client.log = self.log
|
||||
|
||||
# Which page was accessed to get here
|
||||
came_from = construct_came_from(environ)
|
||||
environ["myapp.came_from"] = came_from
|
||||
if self.debug and self.log:
|
||||
self.log.info("[sp.challenge] RelayState >> %s" % came_from)
|
||||
|
||||
# Am I part of a virtual organization ?
|
||||
try:
|
||||
vorg_name = environ["myapp.vo"]
|
||||
except KeyError:
|
||||
try:
|
||||
vorg_name = self.saml_client.vorg.vorg_name
|
||||
except AttributeError:
|
||||
vorg_name = ""
|
||||
|
||||
if self.log:
|
||||
self.log.info("[sp.challenge] VO: %s" % vorg_name)
|
||||
|
||||
# If more than one idp and if none is selected, I have to do wayf
|
||||
(done, response) = self._pick_idp(environ, came_from)
|
||||
# Three cases: -1 something went wrong or Discovery service used
|
||||
# 0 I've got an IdP to send a request to
|
||||
# >0 ECP in progress
|
||||
if self.log:
|
||||
self.log.debug("_idp_pick returned: %s" % done)
|
||||
if done == -1:
|
||||
return response
|
||||
elif done > 0:
|
||||
self.outstanding_queries[done] = came_from
|
||||
return ECP_response(response)
|
||||
else:
|
||||
idp_url = response
|
||||
if self.log:
|
||||
self.log.info("[sp.challenge] idp_url: %s" % idp_url)
|
||||
# Do the AuthnRequest
|
||||
|
||||
(sid_, result) = self.saml_client.authenticate(idp_url,
|
||||
relay_state=came_from,
|
||||
log=self.log,
|
||||
vorg=vorg_name)
|
||||
|
||||
# remember the request
|
||||
self.outstanding_queries[sid_] = came_from
|
||||
|
||||
if isinstance(result, tuple):
|
||||
if self.debug and self.log:
|
||||
self.log.info('redirect to: %s' % result[1])
|
||||
return HTTPSeeOther(headers=[result])
|
||||
else :
|
||||
return HTTPInternalServerError(detail='Incorrect returned data')
|
||||
|
||||
def _construct_identity(self, session_info):
|
||||
identity = {
|
||||
"login": session_info["name_id"],
|
||||
"password": "",
|
||||
'repoze.who.userid': session_info["name_id"],
|
||||
"user": session_info["ava"],
|
||||
}
|
||||
if self.debug and self.log:
|
||||
self.log.info("Identity: %s" % identity)
|
||||
|
||||
return identity
|
||||
|
||||
def _eval_authn_response(self, environ, post):
|
||||
if self.log:
|
||||
self.log.info("Got AuthN response, checking..")
|
||||
self.log.info("Outstanding: %s" % (self.outstanding_queries,))
|
||||
|
||||
try:
|
||||
# Evaluate the response, returns an AuthnResponse instance
|
||||
try:
|
||||
authresp = self.saml_client.response(post,
|
||||
self.outstanding_queries,
|
||||
self.log)
|
||||
except Exception, excp:
|
||||
if self.log:
|
||||
self.log.error("Exception: %s" % (excp,))
|
||||
raise
|
||||
|
||||
session_info = authresp.session_info()
|
||||
except TypeError, excp:
|
||||
if self.log:
|
||||
self.log.error("Exception: %s" % (excp,))
|
||||
return None
|
||||
|
||||
if session_info["came_from"]:
|
||||
if self.debug and self.log:
|
||||
self.log.info("came_from << %s" % session_info["came_from"])
|
||||
try:
|
||||
path, query = session_info["came_from"].split('?')
|
||||
environ["PATH_INFO"] = path
|
||||
environ["QUERY_STRING"] = query
|
||||
except ValueError:
|
||||
environ["PATH_INFO"] = session_info["came_from"]
|
||||
|
||||
if self.log:
|
||||
self.log.info("Session_info: %s" % session_info)
|
||||
return session_info
|
||||
|
||||
def do_ecp_response(self, body, environ):
|
||||
response, _relay_state = ecp.handle_ecp_authn_response(self.saml_client,
|
||||
body)
|
||||
|
||||
environ["s2repoze.relay_state"] = _relay_state.text
|
||||
session_info = response.session_info()
|
||||
if self.log:
|
||||
self.log.info("Session_info: %s" % session_info)
|
||||
|
||||
return session_info
|
||||
|
||||
#### IIdentifier ####
|
||||
def identify(self, environ):
|
||||
"""
|
||||
Tries to do the identification
|
||||
"""
|
||||
self.log = environ.get('repoze.who.logger', '')
|
||||
self.saml_client.log = self.log
|
||||
|
||||
if "CONTENT_LENGTH" not in environ or not environ["CONTENT_LENGTH"]:
|
||||
if self.debug and self.log:
|
||||
self.log.info('[identify] get or empty post')
|
||||
return {}
|
||||
|
||||
# if self.log:
|
||||
# self.log.info("ENVIRON: %s" % environ)
|
||||
# self.log.info("self: %s" % (self.__dict__,))
|
||||
|
||||
uri = environ.get('REQUEST_URI', construct_url(environ))
|
||||
|
||||
if self.debug:
|
||||
#if self.log: self.log.info("environ.keys(): %s" % environ.keys())
|
||||
#if self.log: self.log.info("Environment: %s" % environ)
|
||||
if self.log:
|
||||
self.log.info('[sp.identify] uri: %s' % (uri,))
|
||||
|
||||
query = parse_dict_querystring(environ)
|
||||
if self.debug and self.log:
|
||||
self.log.info('[sp.identify] query: %s' % (query,))
|
||||
|
||||
post = self._get_post(environ)
|
||||
|
||||
if self.debug and self.log:
|
||||
try:
|
||||
self.log.info('[sp.identify] post keys: %s' % (post.keys(),))
|
||||
except (TypeError, IndexError):
|
||||
pass
|
||||
|
||||
try:
|
||||
if not post.has_key("SAMLResponse"):
|
||||
self.log.info("[sp.identify] --- NOT SAMLResponse ---")
|
||||
# Not for me, put the post back where next in line can
|
||||
# find it
|
||||
environ["post.fieldstorage"] = post
|
||||
return {}
|
||||
else:
|
||||
self.log.info("[sp.identify] --- SAMLResponse ---")
|
||||
# check for SAML2 authN response
|
||||
#if self.debug:
|
||||
try:
|
||||
session_info = self._eval_authn_response(environ,
|
||||
cgi_field_storage_to_dict(post))
|
||||
except Exception:
|
||||
return None
|
||||
except TypeError, exc:
|
||||
# might be an ECP (=SOAP) response
|
||||
body = environ.get('s2repoze.body', None)
|
||||
if body:
|
||||
# might be an ECP response
|
||||
try:
|
||||
session_info = self.do_ecp_response(body, environ)
|
||||
except Exception:
|
||||
environ["post.fieldstorage"] = post
|
||||
return {}
|
||||
else:
|
||||
exception_trace("sp.identity", exc, self.log)
|
||||
environ["post.fieldstorage"] = post
|
||||
return {}
|
||||
|
||||
if session_info:
|
||||
environ["s2repoze.sessioninfo"] = session_info
|
||||
name_id = session_info["name_id"]
|
||||
# construct and return the identity
|
||||
identity = {
|
||||
"login": name_id,
|
||||
"password": "",
|
||||
'repoze.who.userid': name_id,
|
||||
"user": self.saml_client.users.get_identity(name_id)[0],
|
||||
}
|
||||
self.log.info("[sp.identify] IDENTITY: %s" % (identity,))
|
||||
return identity
|
||||
else:
|
||||
return None
|
||||
|
||||
|
||||
# IMetadataProvider
|
||||
def add_metadata(self, environ, identity):
|
||||
""" Add information to the knowledge I have about the user """
|
||||
subject_id = identity['repoze.who.userid']
|
||||
|
||||
self.log = environ.get('repoze.who.logger','')
|
||||
self.saml_client.log = self.log
|
||||
|
||||
if self.debug and self.log:
|
||||
self.log.info(
|
||||
"[add_metadata] for %s" % subject_id)
|
||||
try:
|
||||
self.log.info(
|
||||
"Issuers: %s" % self.saml_client.users.sources(subject_id))
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
if "user" not in identity:
|
||||
identity["user"] = {}
|
||||
try:
|
||||
(ava, _) = self.saml_client.users.get_identity(subject_id)
|
||||
#now = time.gmtime()
|
||||
if self.debug and self.log:
|
||||
self.log.info("[add_metadata] adds: %s" % ava)
|
||||
identity["user"].update(ava)
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
if "pysaml2_vo_expanded" not in identity:
|
||||
# is this a Virtual Organization situation
|
||||
if self.saml_client.vorg:
|
||||
try:
|
||||
if self.saml_client.vorg.do_aggregation(subject_id,
|
||||
log=self.log):
|
||||
# Get the extended identity
|
||||
identity["user"] = self.saml_client.users.get_identity(
|
||||
subject_id)[0]
|
||||
# Only do this once, mark that the identity has been
|
||||
# expanded
|
||||
identity["pysaml2_vo_expanded"] = 1
|
||||
except KeyError:
|
||||
if self.log:
|
||||
self.log.error("Failed to do attribute aggregation, "
|
||||
"missing common attribute")
|
||||
if self.debug and self.log:
|
||||
self.log.info("[add_metadata] returns: %s" % (dict(identity),))
|
||||
|
||||
if not identity["user"]:
|
||||
# remove cookie and demand re-authentication
|
||||
pass
|
||||
|
||||
# @return
|
||||
# used 2 times : one to get the ticket, the other to validate it
|
||||
def _service_url(self, environ, qstr=None):
|
||||
if qstr is not None:
|
||||
url = construct_url(environ, querystring = qstr)
|
||||
else:
|
||||
url = construct_url(environ)
|
||||
return url
|
||||
|
||||
#### IAuthenticatorPlugin ####
|
||||
#noinspection PyUnusedLocal
|
||||
def authenticate(self, environ, identity=None):
|
||||
if identity:
|
||||
return identity.get('login', None)
|
||||
else:
|
||||
return None
|
||||
|
||||
|
||||
def make_plugin(rememberer_name=None, # plugin for remember
|
||||
cache= "", # cache
|
||||
# Which virtual organization to support
|
||||
virtual_organization="",
|
||||
saml_conf="",
|
||||
wayf="",
|
||||
debug=0,
|
||||
sid_store="",
|
||||
identity_cache="",
|
||||
discovery=""
|
||||
):
|
||||
|
||||
if saml_conf == "":
|
||||
raise ValueError(
|
||||
'must include saml_conf in configuration')
|
||||
|
||||
if rememberer_name is None:
|
||||
raise ValueError(
|
||||
'must include rememberer_name in configuration')
|
||||
|
||||
conf = config_factory("sp", saml_conf)
|
||||
|
||||
scl = Saml2Client(config=conf, identity_cache=identity_cache,
|
||||
virtual_organization=virtual_organization)
|
||||
|
||||
plugin = SAML2Plugin(rememberer_name, conf, scl, wayf, cache, debug,
|
||||
sid_store, discovery)
|
||||
return plugin
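# Illustrative sketch (not part of the original module): how the plugin is
# typically assembled; every name below is a placeholder.
#
#   sp = make_plugin(rememberer_name="auth_tkt",
#                    saml_conf="sp_conf",       # handed to config_factory("sp", ...)
#                    wayf="https://wayf.example.org/",
#                    sid_store="outstanding.db")
#   # repoze.who then uses sp as challenger, identifier, authenticator and
#   # metadata provider, as declared by implements(...) above.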
|
||||
|
||||
# came_from = re.sub(r'ticket=[^&]*&?', '', came_from)
|
||||
|
||||
866
src/saml2/__init__.py
Normal file
866
src/saml2/__init__.py
Normal file
@@ -0,0 +1,866 @@
|
||||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# Copyright (C) 2006 Google Inc.
|
||||
# Copyright (C) 2009 Umeå University
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
"""Contains base classes representing SAML elements.
|
||||
|
||||
This code was originally written by Jeffrey Scudder for
|
||||
representing SAML elements. Takashi Matsuo added some code and
|
||||
changed parts of it. Roland Hedberg rewrote the whole thing from the bottom up so
|
||||
barely anything but the original structures remained.
|
||||
|
||||
Module objective: provide data classes for SAML constructs. These
|
||||
classes hide the XML-ness of SAML and provide a set of native Python
|
||||
classes to interact with.
|
||||
|
||||
Conversions to and from XML should only be necessary when the SAML classes
|
||||
"touch the wire" and are sent over HTTP. For this reason this module
|
||||
provides methods and functions to convert SAML classes to and from strings.
|
||||
"""
|
||||
|
||||
# try:
|
||||
# # lxml: best performance for XML processing
|
||||
# import lxml.etree as ET
|
||||
# except ImportError:
|
||||
# try:
|
||||
# # Python 2.5+: batteries included
|
||||
# import xml.etree.cElementTree as ET
|
||||
# except ImportError:
|
||||
# try:
|
||||
# # Python <2.5: standalone ElementTree install
|
||||
# import elementtree.cElementTree as ET
|
||||
# except ImportError:
|
||||
# raise ImportError, "lxml or ElementTree are not installed, "\
|
||||
# +"see http://codespeak.net/lxml "\
|
||||
# +"or http://effbot.org/zone/element-index.htm"
|
||||
|
||||
import logging
|
||||
|
||||
try:
|
||||
from xml.etree import cElementTree as ElementTree
|
||||
if ElementTree.VERSION < '1.3.0':
|
||||
# cElementTree has no support for register_namespace
|
||||
# nor _namespace_map, thus we sacrifice performance
|
||||
# for correctness
|
||||
from xml.etree import ElementTree
|
||||
except ImportError:
|
||||
try:
|
||||
import cElementTree as ElementTree
|
||||
except ImportError:
|
||||
from elementtree import ElementTree
|
||||
|
||||
root_logger = logging.getLogger("pySAML2")
|
||||
root_logger.level = logging.NOTSET
|
||||
|
||||
NAMESPACE = 'urn:oasis:names:tc:SAML:2.0:assertion'
|
||||
#TEMPLATE = '{urn:oasis:names:tc:SAML:2.0:assertion}%s'
|
||||
#XSI_NAMESPACE = 'http://www.w3.org/2001/XMLSchema-instance'
|
||||
|
||||
NAMEID_FORMAT_EMAILADDRESS = (
|
||||
"urn:oasis:names:tc:SAML:2.0:nameid-format:emailAddress")
|
||||
|
||||
# These are defined in saml2.saml
|
||||
#NAME_FORMAT_UNSPECIFIED = (
|
||||
# "urn:oasis:names:tc:SAML:2.0:attrname-format:unspecified")
|
||||
#NAME_FORMAT_URI = "urn:oasis:names:tc:SAML:2.0:attrname-format:uri"
|
||||
#NAME_FORMAT_BASIC = "urn:oasis:names:tc:SAML:2.0:attrname-format:basic"
|
||||
|
||||
SUBJECT_CONFIRMATION_METHOD_BEARER = "urn:oasis:names:tc:SAML:2.0:cm:bearer"
|
||||
|
||||
DECISION_TYPE_PERMIT = "Permit"
|
||||
DECISION_TYPE_DENY = "Deny"
|
||||
DECISION_TYPE_INDETERMINATE = "Indeterminate"
|
||||
|
||||
VERSION = "2.0"
|
||||
|
||||
BINDING_SOAP = 'urn:oasis:names:tc:SAML:2.0:bindings:SOAP'
|
||||
BINDING_PAOS = 'urn:oasis:names:tc:SAML:2.0:bindings:PAOS'
|
||||
BINDING_HTTP_REDIRECT = 'urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect'
|
||||
BINDING_HTTP_POST = 'urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST'
|
||||
BINDING_HTTP_ARTIFACT = 'urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Artifact'
|
||||
BINDING_URI = 'urn:oasis:names:tc:SAML:2.0:bindings:URI'
|
||||
|
||||
def class_name(instance):
|
||||
return "%s:%s" % (instance.c_namespace, instance.c_tag)
|
||||
|
||||
def create_class_from_xml_string(target_class, xml_string):
|
||||
"""Creates an instance of the target class from a string.
|
||||
|
||||
:param target_class: The class which will be instantiated and populated
|
||||
with the contents of the XML. This class must have a c_tag and a
|
||||
c_namespace class variable.
|
||||
:param xml_string: A string which contains valid XML. The root element
|
||||
of the XML string should match the tag and namespace of the desired
|
||||
class.
|
||||
|
||||
:return: An instance of the target class with members assigned according to
|
||||
the contents of the XML - or None if the root XML tag and namespace did
|
||||
not match those of the target class.
|
||||
"""
|
||||
tree = ElementTree.fromstring(xml_string)
|
||||
return create_class_from_element_tree(target_class, tree)
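# Illustrative sketch (not part of the original module), assuming the NameID
# class defined in saml2.saml:
#
#   from saml2.saml import NameID
#   txt = ('<ns0:NameID xmlns:ns0="urn:oasis:names:tc:SAML:2.0:assertion" '
#          'Format="urn:oasis:names:tc:SAML:2.0:nameid-format:transient"'
#          '>abcd</ns0:NameID>')
#   nid = create_class_from_xml_string(NameID, txt)
#   # nid.text == "abcd", nid.format == the transient name-id format URI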
|
||||
|
||||
|
||||
def create_class_from_element_tree(target_class, tree, namespace=None,
|
||||
tag=None):
|
||||
"""Instantiates the class and populates members according to the tree.
|
||||
|
||||
Note: Only use this function with classes that have c_namespace and c_tag
|
||||
class members.
|
||||
|
||||
:param target_class: The class which will be instantiated and populated
|
||||
with the contents of the XML.
|
||||
:param tree: An element tree whose contents will be converted into
|
||||
members of the new target_class instance.
|
||||
:param namespace: The namespace which the XML tree's root node must
|
||||
match. If omitted, the namespace defaults to the c_namespace of the
|
||||
target class.
|
||||
:param tag: The tag which the XML tree's root node must match. If
|
||||
omitted, the tag defaults to the c_tag class member of the target
|
||||
class.
|
||||
|
||||
:return: An instance of the target class - or None if the tag and namespace
|
||||
of the XML tree's root node did not match the desired namespace and tag.
|
||||
"""
|
||||
if namespace is None:
|
||||
namespace = target_class.c_namespace
|
||||
if tag is None:
|
||||
tag = target_class.c_tag
|
||||
if tree.tag == '{%s}%s' % (namespace, tag):
|
||||
target = target_class()
|
||||
target.harvest_element_tree(tree)
|
||||
return target
|
||||
else:
|
||||
return None
|
||||
|
||||
class Error(Exception):
|
||||
"""Exception class thrown by this module."""
|
||||
pass
|
||||
|
||||
class ExtensionElement(object):
|
||||
"""XML which is not part of the SAML specification,
|
||||
these are called extension elements. If a class's parser
|
||||
encounters an unexpected XML construct, it is translated into an
|
||||
ExtensionElement instance. ExtensionElement is designed to fully
|
||||
capture the information in the XML. Child nodes in an XML
|
||||
extension are turned into ExtensionElements as well.
|
||||
"""
|
||||
|
||||
def __init__(self, tag, namespace=None, attributes=None,
|
||||
children=None, text=None):
|
||||
"""Constructor for ExtensionElement
|
||||
|
||||
:param namespace: The XML namespace for this element.
|
||||
:param tag: The tag (without the namespace qualifier) for
|
||||
this element. To reconstruct the full qualified name of the
|
||||
element, combine this tag with the namespace.
|
||||
:param attributes: The attribute value string pairs for the XML
|
||||
attributes of this element.
|
||||
:param children: list (optional) A list of ExtensionElements which
|
||||
represent the XML child nodes of this element.
|
||||
"""
|
||||
|
||||
self.namespace = namespace
|
||||
self.tag = tag
|
||||
self.attributes = attributes or {}
|
||||
self.children = children or []
|
||||
self.text = text
|
||||
|
||||
def to_string(self):
|
||||
""" Serialize the object into a XML string """
|
||||
element_tree = self.transfer_to_element_tree()
|
||||
return ElementTree.tostring(element_tree, encoding="UTF-8")
|
||||
|
||||
def transfer_to_element_tree(self):
|
||||
if self.tag is None:
|
||||
return None
|
||||
|
||||
element_tree = ElementTree.Element('')
|
||||
|
||||
if self.namespace is not None:
|
||||
element_tree.tag = '{%s}%s' % (self.namespace, self.tag)
|
||||
else:
|
||||
element_tree.tag = self.tag
|
||||
|
||||
for key, value in self.attributes.iteritems():
|
||||
element_tree.attrib[key] = value
|
||||
|
||||
for child in self.children:
|
||||
child.become_child_element_of(element_tree)
|
||||
|
||||
element_tree.text = self.text
|
||||
|
||||
return element_tree
|
||||
|
||||
def become_child_element_of(self, element_tree):
|
||||
"""Converts this object into an etree element and adds it as a child
|
||||
node in an etree element.
|
||||
|
||||
Adds self to the ElementTree. This method is required to avoid verbose
|
||||
XML which constantly redefines the namespace.
|
||||
|
||||
:param element_tree: ElementTree._Element The element to which this
|
||||
object's XML will be added.
|
||||
"""
|
||||
new_element = self.transfer_to_element_tree()
|
||||
element_tree.append(new_element)
|
||||
|
||||
def find_children(self, tag=None, namespace=None):
|
||||
"""Searches child nodes for objects with the desired tag/namespace.
|
||||
|
||||
Returns a list of extension elements within this object whose tag
|
||||
and/or namespace match those passed in. To find all children in
|
||||
a particular namespace, specify the namespace but not the tag name.
|
||||
If you specify only the tag, the result list may contain extension
|
||||
elements in multiple namespaces.
|
||||
|
||||
:param tag: str (optional) The desired tag
|
||||
:param namespace: str (optional) The desired namespace
|
||||
|
||||
:return: A list of elements whose tag and/or namespace match the
|
||||
parameters values
|
||||
"""
|
||||
|
||||
results = []
|
||||
|
||||
if tag and namespace:
|
||||
for element in self.children:
|
||||
if element.tag == tag and element.namespace == namespace:
|
||||
results.append(element)
|
||||
elif tag and not namespace:
|
||||
for element in self.children:
|
||||
if element.tag == tag:
|
||||
results.append(element)
|
||||
elif namespace and not tag:
|
||||
for element in self.children:
|
||||
if element.namespace == namespace:
|
||||
results.append(element)
|
||||
else:
|
||||
for element in self.children:
|
||||
results.append(element)
|
||||
|
||||
return results
|
||||
|
||||
def loadd(self, ava):
|
||||
""" expects a special set of keys """
|
||||
|
||||
if "attributes" in ava:
|
||||
for key, val in ava["attributes"].items():
|
||||
self.attributes[key] = val
|
||||
|
||||
try:
|
||||
self.tag = ava["tag"]
|
||||
except KeyError:
|
||||
if not self.tag:
|
||||
raise KeyError("ExtensionElement must have a tag")
|
||||
|
||||
try:
|
||||
self.namespace = ava["namespace"]
|
||||
except KeyError:
|
||||
if not self.namespace:
|
||||
raise KeyError("ExtensionElement must belong to a namespace")
|
||||
|
||||
try:
|
||||
self.text = ava["text"]
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
if "children" in ava:
|
||||
for item in ava["children"]:
|
||||
self.children.append(ExtensionElement(item["tag"]).loadd(item))
|
||||
|
||||
return self
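# Illustrative sketch (not part of the original module): the dictionary layout
# loadd() expects; namespace and values are made up.
#
#   spec = {"tag": "Foo",
#           "namespace": "urn:example:ns",
#           "attributes": {"Name": "bar"},
#           "text": "some text"}
#   ee = ExtensionElement("Foo").loadd(spec)
#   # ee.to_string() yields XML along the lines of
#   # <ns0:Foo xmlns:ns0="urn:example:ns" Name="bar">some text</ns0:Foo>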
|
||||
|
||||
def extension_element_from_string(xml_string):
|
||||
element_tree = ElementTree.fromstring(xml_string)
|
||||
return _extension_element_from_element_tree(element_tree)
|
||||
|
||||
|
||||
def _extension_element_from_element_tree(element_tree):
|
||||
elementc_tag = element_tree.tag
|
||||
if '}' in elementc_tag:
|
||||
namespace = elementc_tag[1:elementc_tag.index('}')]
|
||||
tag = elementc_tag[elementc_tag.index('}')+1:]
|
||||
else:
|
||||
namespace = None
|
||||
tag = elementc_tag
|
||||
extension = ExtensionElement(namespace=namespace, tag=tag)
|
||||
for key, value in element_tree.attrib.iteritems():
|
||||
extension.attributes[key] = value
|
||||
for child in element_tree:
|
||||
extension.children.append(_extension_element_from_element_tree(child))
|
||||
extension.text = element_tree.text
|
||||
return extension
|
||||
|
||||
|
||||
class ExtensionContainer(object):
|
||||
|
||||
c_tag = ""
|
||||
c_namespace = ""
|
||||
|
||||
def __init__(self, text=None, extension_elements=None,
|
||||
extension_attributes=None):
|
||||
|
||||
self.text = text
|
||||
self.extension_elements = extension_elements or []
|
||||
self.extension_attributes = extension_attributes or {}
|
||||
|
||||
# Three methods to create an object from an ElementTree
|
||||
def harvest_element_tree(self, tree):
|
||||
# Fill in the instance members from the contents of the XML tree.
|
||||
for child in tree:
|
||||
self._convert_element_tree_to_member(child)
|
||||
for attribute, value in tree.attrib.iteritems():
|
||||
self._convert_element_attribute_to_member(attribute, value)
|
||||
self.text = tree.text
|
||||
|
||||
def _convert_element_tree_to_member(self, child_tree):
|
||||
self.extension_elements.append(_extension_element_from_element_tree(
|
||||
child_tree))
|
||||
|
||||
def _convert_element_attribute_to_member(self, attribute, value):
|
||||
self.extension_attributes[attribute] = value
|
||||
|
||||
# One method to create an ElementTree from an object
|
||||
def _add_members_to_element_tree(self, tree):
|
||||
for child in self.extension_elements:
|
||||
child.become_child_element_of(tree)
|
||||
for attribute, value in self.extension_attributes.iteritems():
|
||||
tree.attrib[attribute] = value
|
||||
tree.text = self.text
|
||||
|
||||
def find_extensions(self, tag=None, namespace=None):
|
||||
"""Searches extension elements for child nodes with the desired name.
|
||||
|
||||
Returns a list of extension elements within this object whose tag
|
||||
and/or namespace match those passed in. To find all extensions in
|
||||
a particular namespace, specify the namespace but not the tag name.
|
||||
If you specify only the tag, the result list may contain extension
|
||||
elements in multiple namespaces.
|
||||
|
||||
:param tag: str (optional) The desired tag
|
||||
:param namespace: str (optional) The desired namespace
|
||||
|
||||
:Return: A list of elements whose tag and/or namespace match the
|
||||
parameters values
|
||||
"""
|
||||
|
||||
results = []
|
||||
|
||||
if tag and namespace:
|
||||
for element in self.extension_elements:
|
||||
if element.tag == tag and element.namespace == namespace:
|
||||
results.append(element)
|
||||
elif tag and not namespace:
|
||||
for element in self.extension_elements:
|
||||
if element.tag == tag:
|
||||
results.append(element)
|
||||
elif namespace and not tag:
|
||||
for element in self.extension_elements:
|
||||
if element.namespace == namespace:
|
||||
results.append(element)
|
||||
else:
|
||||
for element in self.extension_elements:
|
||||
results.append(element)
|
||||
|
||||
return results
|
||||
|
||||
def extensions_as_elements(self, tag, schema):
|
||||
""" Return extensions that has the given tag and belongs to the
|
||||
given schema as native elements of that schema.
|
||||
|
||||
:param tag: The tag of the element
|
||||
:param schema: Which schema the element should originate from
|
||||
:return: a list of native elements
|
||||
"""
|
||||
result = []
|
||||
for ext in self.find_extensions(tag, schema.NAMESPACE):
|
||||
ets = schema.ELEMENT_FROM_STRING[tag]
|
||||
result.append(ets(ext.to_string()))
|
||||
return result
|
||||
|
||||
def add_extension_elements(self, items):
|
||||
for item in items:
|
||||
self.extension_elements.append(element_to_extension_element(item))
|
||||
|
||||
def add_extension_element(self, item):
|
||||
self.extension_elements.append(element_to_extension_element(item))
|
||||
|
||||
|
||||
def add_extension_attribute(self, name, value):
|
||||
self.extension_attributes[name] = value
|
||||
|
||||
|
||||
|
||||
def make_vals(val, klass, klass_inst=None, prop=None, part=False,
|
||||
base64encode=False):
|
||||
"""
|
||||
Creates a class instance with a specified value; the created
|
||||
instance may be used as the value of a property on another class instance.
|
||||
|
||||
:param val: The value
|
||||
:param klass: The value class
|
||||
:param klass_inst: The class instance which has a property to which
|
||||
the value this function creates is assigned.
|
||||
:param prop: The property which the value should be assigned to.
|
||||
:param part: If the value is one of a possible list of values it should be
|
||||
handled slightly differently than if it is not.
|
||||
:return: Value class instance
|
||||
"""
|
||||
cinst = None
|
||||
|
||||
#print "make_vals(%s, %s)" % (val, klass)
|
||||
|
||||
if isinstance(val, dict):
|
||||
cinst = klass().loadd(val, base64encode=base64encode)
|
||||
else:
|
||||
try:
|
||||
cinst = klass().set_text(val)
|
||||
except ValueError:
|
||||
if not part:
|
||||
cis = [make_vals(sval, klass, klass_inst, prop, True,
|
||||
base64encode) for sval in val]
|
||||
setattr(klass_inst, prop, cis)
|
||||
else:
|
||||
raise
|
||||
|
||||
if part:
|
||||
return cinst
|
||||
else:
|
||||
if cinst:
|
||||
cis = [cinst]
|
||||
setattr(klass_inst, prop, cis)
|
||||
|
||||
def make_instance(klass, spec, base64encode=False):
|
||||
"""
|
||||
Constructs a class instance containing the specified information
|
||||
|
||||
:param klass: The class
|
||||
:param spec: Information to be placed in the instance (a dictionary)
|
||||
:return: The instance
|
||||
"""
|
||||
|
||||
return klass().loadd(spec, base64encode)
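# Illustrative sketch (not part of the original module), assuming the Attribute
# and AttributeValue classes defined in saml2.saml:
#
#   from saml2.saml import Attribute
#   spec = {"name": "urn:oid:2.5.4.42",
#           "friendly_name": "givenName",
#           "attribute_value": [{"text": "Roland"}]}
#   attr = make_instance(Attribute, spec)
#   # attr.friendly_name == "givenName"
#   # attr.attribute_value[0].text == "Roland"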
|
||||
|
||||
class SamlBase(ExtensionContainer):
|
||||
"""A foundation class on which SAML classes are built. It
|
||||
handles the parsing of attributes and children which are common to all
|
||||
SAML classes. By default, the SamlBase class translates all XML child
|
||||
nodes into ExtensionElements.
|
||||
"""
|
||||
|
||||
c_children = {}
|
||||
c_attributes = {}
|
||||
c_attribute_type = {}
|
||||
#c_attribute_use = {}
|
||||
#c_attribute_required = {}
|
||||
c_child_order = []
|
||||
c_cardinality = {}
|
||||
|
||||
def _get_all_c_children_with_order(self):
|
||||
if len(self.c_child_order) > 0:
|
||||
for child in self.c_child_order:
|
||||
yield child
|
||||
else:
|
||||
for _, values in self.__class__.c_children.iteritems():
|
||||
yield values[0]
|
||||
|
||||
def _convert_element_tree_to_member(self, child_tree):
|
||||
# Find the element's tag in this class's list of child members
|
||||
if self.__class__.c_children.has_key(child_tree.tag):
|
||||
member_name = self.__class__.c_children[child_tree.tag][0]
|
||||
member_class = self.__class__.c_children[child_tree.tag][1]
|
||||
# If the class member is supposed to contain a list, make sure the
|
||||
# matching member is set to a list, then append the new member
|
||||
# instance to the list.
|
||||
if isinstance(member_class, list):
|
||||
if getattr(self, member_name) is None:
|
||||
setattr(self, member_name, [])
|
||||
getattr(self, member_name).append(
|
||||
create_class_from_element_tree(
|
||||
member_class[0], child_tree))
|
||||
else:
|
||||
setattr(self, member_name,
|
||||
create_class_from_element_tree(member_class,
|
||||
child_tree))
|
||||
else:
|
||||
ExtensionContainer._convert_element_tree_to_member(self,
|
||||
child_tree)
|
||||
|
||||
def _convert_element_attribute_to_member(self, attribute, value):
|
||||
# Find the attribute in this class's list of attributes.
|
||||
if self.__class__.c_attributes.has_key(attribute):
|
||||
# Find the member of this class which corresponds to the XML
|
||||
# attribute(lookup in current_class.c_attributes) and set this
|
||||
# member to the desired value (using self.__dict__).
|
||||
setattr(self, self.__class__.c_attributes[attribute][0], value)
|
||||
else:
|
||||
# If it doesn't appear in the attribute list it's an extension
|
||||
ExtensionContainer._convert_element_attribute_to_member(self,
|
||||
attribute, value)
|
||||
|
||||
# Three methods to create an ElementTree from an object
|
||||
def _add_members_to_element_tree(self, tree):
|
||||
# Convert the members of this class which are XML child nodes.
|
||||
# This uses the class's c_children dictionary to find the members which
|
||||
# should become XML child nodes.
|
||||
for member_name in self._get_all_c_children_with_order():
|
||||
member = getattr(self, member_name)
|
||||
if member is None:
|
||||
pass
|
||||
elif isinstance(member, list):
|
||||
for instance in member:
|
||||
instance.become_child_element_of(tree)
|
||||
else:
|
||||
member.become_child_element_of(tree)
|
||||
# Convert the members of this class which are XML attributes.
|
||||
for xml_attribute, attribute_info in \
|
||||
self.__class__.c_attributes.iteritems():
|
||||
(member_name, member_type, required) = attribute_info
|
||||
member = getattr(self, member_name)
|
||||
if member is not None:
|
||||
tree.attrib[xml_attribute] = member
|
||||
# Lastly, call the ExtensionContainers's _add_members_to_element_tree
|
||||
# to convert any extension attributes.
|
||||
ExtensionContainer._add_members_to_element_tree(self, tree)
|
||||
|
||||
|
||||
def become_child_element_of(self, tree):
|
||||
"""
|
||||
Note: Only for use with classes that have a c_tag and c_namespace class
|
||||
member. It is in SamlBase so that it can be inherited but it should
|
||||
not be called on instances of SamlBase.
|
||||
|
||||
:param tree: The tree to which this instance should be a child
|
||||
"""
|
||||
new_child = self._to_element_tree()
|
||||
tree.append(new_child)
|
||||
|
||||
def _to_element_tree(self):
|
||||
"""
|
||||
|
||||
Note, this method is designed to be used only with classes that have a
|
||||
c_tag and c_namespace. It is placed in SamlBase for inheritance but
|
||||
should not be called on instances of this class.
|
||||
|
||||
"""
|
||||
new_tree = ElementTree.Element('{%s}%s' % (self.__class__.c_namespace,
|
||||
self.__class__.c_tag))
|
||||
self._add_members_to_element_tree(new_tree)
|
||||
return new_tree
|
||||
|
||||
def to_string(self, nspair=None):
|
||||
"""Converts the Saml object to a string containing XML."""
|
||||
if nspair:
|
||||
for prefix, uri in nspair.items():
|
||||
try:
|
||||
ElementTree.register_namespace(prefix, uri)
|
||||
except AttributeError:
|
||||
# Backwards compatibility with ET < 1.3
|
||||
ElementTree._namespace_map[uri] = prefix
|
||||
|
||||
return ElementTree.tostring(self._to_element_tree(), encoding="UTF-8")
|
||||
|
||||
def __str__(self):
|
||||
return self.to_string()
|
||||
|
||||
# def _init_attribute(self, extension_attribute_id,
|
||||
# extension_attribute_name, value=None):
|
||||
#
|
||||
# self.c_attributes[extension_attribute_id] = (extension_attribute_name,
|
||||
# None, False)
|
||||
# if value:
|
||||
# self.__dict__[extension_attribute_name] = value
|
||||
|
||||
def keyswv(self):
|
||||
""" Return the keys of attributes or children that has values
|
||||
|
||||
:return: list of keys
|
||||
"""
|
||||
return [key for key, val in self.__dict__.items() if val]
|
||||
|
||||
def keys(self):
|
||||
""" Return all the keys that represent possible attributes and
|
||||
children.
|
||||
|
||||
:return: list of keys
|
||||
"""
|
||||
keys = ['text']
|
||||
keys.extend([n for (n, t, r) in self.c_attributes.values()])
|
||||
keys.extend([v[0] for v in self.c_children.values()])
|
||||
return keys
|
||||
|
||||
def children_with_values(self):
|
||||
""" Returns all children that has values
|
||||
|
||||
:return: Possibly empty list of children.
|
||||
"""
|
||||
childs = []
|
||||
for attribute in self._get_all_c_children_with_order():
|
||||
member = getattr(self, attribute)
|
||||
if member is None or member == []:
|
||||
pass
|
||||
elif isinstance(member, list):
|
||||
for instance in member:
|
||||
childs.append(instance)
|
||||
else:
|
||||
childs.append(member)
|
||||
return childs
|
||||
|
||||
#noinspection PyUnusedLocal
|
||||
def set_text(self, val, base64encode=False):
|
||||
""" Sets the text property of this instance.
|
||||
|
||||
:param val: The value of the text property
|
||||
:param base64encode: Whether the value should be base64encoded
|
||||
:return: The instance
|
||||
"""
|
||||
|
||||
#print "set_text: %s" % (val,)
|
||||
if isinstance(val, bool):
|
||||
if val:
|
||||
setattr(self, "text", "true")
|
||||
else:
|
||||
setattr(self, "text", "false")
|
||||
elif isinstance(val, int):
|
||||
setattr(self, "text", "%d" % val)
|
||||
elif isinstance(val, basestring):
|
||||
setattr(self, "text", val)
|
||||
elif val is None:
|
||||
pass
|
||||
else:
|
||||
raise ValueError("Can't set text from a value of type '%s'" % (type(val),))
|
||||
|
||||
return self
|
||||
|
||||
def loadd(self, ava, base64encode=False):
|
||||
"""
|
||||
Sets attributes, children, extension elements and extension
|
||||
attributes of this element instance depending on what is in
|
||||
the given dictionary. If there are already values on properties
|
||||
those will be overwritten. If the keys in the dictionary does
|
||||
not correspond to known attributes/children/.. they are ignored.
|
||||
|
||||
:param ava: The dictionary
|
||||
:param base64encode: Whether the values on attributes or texts on
|
||||
children should be base64encoded.
|
||||
:return: The instance
|
||||
"""
|
||||
|
||||
for prop, _typ, _req in self.c_attributes.values():
|
||||
#print "# %s" % (prop)
|
||||
if prop in ava:
|
||||
if isinstance(ava[prop], bool):
|
||||
setattr(self, prop, "%s" % ava[prop])
|
||||
elif isinstance(ava[prop], int):
|
||||
setattr(self, prop, "%d" % ava[prop])
|
||||
else:
|
||||
setattr(self, prop, ava[prop])
|
||||
|
||||
if "text" in ava:
|
||||
self.set_text(ava["text"], base64encode)
|
||||
|
||||
for prop, klassdef in self.c_children.values():
|
||||
#print "## %s, %s" % (prop, klassdef)
|
||||
if prop in ava:
|
||||
#print "### %s" % ava[prop]
|
||||
# means there can be a list of values
|
||||
if isinstance(klassdef, list):
|
||||
make_vals(ava[prop], klassdef[0], self, prop,
|
||||
base64encode=base64encode)
|
||||
else:
|
||||
cis = make_vals(ava[prop], klassdef, self, prop, True,
|
||||
base64encode)
|
||||
setattr(self, prop, cis)
|
||||
|
||||
if "extension_elements" in ava:
|
||||
for item in ava["extension_elements"]:
|
||||
self.extension_elements.append(ExtensionElement(
|
||||
item["tag"]).loadd(item))
|
||||
|
||||
if "extension_attributes" in ava:
|
||||
for key, val in ava["extension_attributes"].items():
|
||||
self.extension_attributes[key] = val
|
||||
|
||||
return self
|
||||
|
||||
# def complete(self):
|
||||
# for prop, _typ, req in self.c_attributes.values():
|
||||
# if req and not getattr(self, prop):
|
||||
# return False
|
||||
#
|
||||
# for prop, klassdef in self.c_children.values():
|
||||
# try:
|
||||
# restriction = self.c_cardinality[prop]
|
||||
# val = getattr(self, prop)
|
||||
# if val is None:
|
||||
# num = 0
|
||||
# elif isinstance(val, list):
|
||||
# num = len(val)
|
||||
# else:
|
||||
# num = 1
|
||||
#
|
||||
# try:
|
||||
# minimum = restriction["min"]
|
||||
# except KeyError:
|
||||
# minimum = 1
|
||||
# if num < minimum:
|
||||
# return False
|
||||
# try:
|
||||
# maximum = restriction["max"]
|
||||
# except KeyError:
|
||||
# maximum = 1
|
||||
# # what if max == 0 ??
|
||||
# if maximum == "unbounded":
|
||||
# continue
|
||||
# elif num > maximum:
|
||||
# return False
|
||||
# except KeyError:
|
||||
# # default cardinality: min=max=1
|
||||
# if not getattr(self, prop):
|
||||
# return False
|
||||
#
|
||||
# return True
|
||||
|
||||
def child_class(self, child):
|
||||
""" Return the class a child element should be an instance of
|
||||
|
||||
:param child: The name of the child element
|
||||
:return: The class
|
||||
"""
|
||||
for prop, klassdef in self.c_children.values():
|
||||
if child == prop:
|
||||
if isinstance(klassdef, list):
|
||||
return klassdef[0]
|
||||
else:
|
||||
return klassdef
|
||||
return None
|
||||
|
||||
def child_cardinality(self, child):
|
||||
""" Return the cardinality of a child element
|
||||
|
||||
:param child: The name of the child element
|
||||
:return: The cardinality as a 2-tuple (min, max).
|
||||
The max value is either a number or the string "unbounded".
|
||||
The min value is always a number.
|
||||
"""
|
||||
for prop, klassdef in self.c_children.values():
|
||||
if child == prop:
|
||||
if isinstance(klassdef, list):
|
||||
try:
|
||||
min = self.c_cardinality["min"]
|
||||
except KeyError:
|
||||
min = 1
|
||||
try:
|
||||
max = self.c_cardinality["max"]
|
||||
except KeyError:
|
||||
max = "unbounded"
|
||||
|
||||
return min, max
|
||||
else:
|
||||
return 1, 1
|
||||
return None
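# A minimal usage sketch of the SamlBase API above, assuming `element` is an
# instance of any generated SamlBase subclass; the nspair mapping is an
# example value chosen for illustration.
def _example_samlbase_usage(element):
    # fill the element from a plain dictionary, then inspect and serialize it
    element.loadd({"text": "example value"})
    populated = element.keyswv()    # keys of attributes/children with values
    xml = element.to_string(
        nspair={"saml": "urn:oasis:names:tc:SAML:2.0:assertion"})
    return populated, xml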
|
||||
|
||||
|
||||
|
||||
def element_to_extension_element(element):
|
||||
"""
|
||||
Convert an element into an extension element
|
||||
|
||||
:param element: The element instance
|
||||
:return: An extension element instance
|
||||
"""
|
||||
|
||||
exel = ExtensionElement(element.c_tag, element.c_namespace,
|
||||
text=element.text)
|
||||
|
||||
exel.attributes.update(element.extension_attributes)
|
||||
exel.children.extend(element.extension_elements)
|
||||
|
||||
for xml_attribute, (member_name, typ, req) in element.c_attributes.iteritems():
|
||||
member_value = getattr(element, member_name)
|
||||
if member_value is not None:
|
||||
exel.attributes[xml_attribute] = member_value
|
||||
|
||||
exel.children.extend([element_to_extension_element(c) \
|
||||
for c in element.children_with_values()])
|
||||
|
||||
return exel
|
||||
|
||||
def extension_element_to_element(extension_element, translation_functions,
|
||||
namespace=None):
|
||||
""" Convert an extension element to a normal element.
|
||||
In order to do this you need to have an idea of what type of
|
||||
element it is. Or rather which module it belongs to.
|
||||
|
||||
:param extension_element: The extension element
|
||||
:param translation_functions: A dictionary with class identifiers
as keys and string-to-element translation functions as values
|
||||
:param namespace: The namespace of the translation functions.
|
||||
:return: An element instance or None
|
||||
"""
|
||||
|
||||
try:
|
||||
element_namespace = extension_element.namespace
|
||||
except AttributeError:
|
||||
element_namespace = extension_element.c_namespace
|
||||
if element_namespace == namespace:
|
||||
try:
|
||||
try:
|
||||
ets = translation_functions[extension_element.tag]
|
||||
except AttributeError:
|
||||
ets = translation_functions[extension_element.c_tag]
|
||||
return ets(extension_element.to_string())
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
return None
|
||||
|
||||
def extension_elements_to_elements(extension_elements, schemas):
|
||||
""" Create a list of elements each one matching one of the
|
||||
given extension elements. This is of course dependent on the access
|
||||
to schemas that describe the extension elements.
|
||||
|
||||
:param extension_elements: The list of extension elements
|
||||
:param schemas: Imported Python modules that represent the different
|
||||
known schemas used for the extension elements
|
||||
:return: A list of elements, representing the set of extension elements
|
||||
that was possible to match against a Class in the given schemas.
|
||||
The elements returned are the native representation of the elements
|
||||
according to the schemas.
|
||||
"""
|
||||
res = []
|
||||
for extension_element in extension_elements:
|
||||
for schema in schemas:
|
||||
inst = extension_element_to_element(extension_element,
|
||||
schema.ELEMENT_FROM_STRING,
|
||||
schema.NAMESPACE)
|
||||
if inst:
|
||||
res.append(inst)
|
||||
break
|
||||
|
||||
return res
|
||||
|
||||
def extension_elements_as_dict(extension_elements, onts):
|
||||
ees_ = extension_elements_to_elements(extension_elements, onts)
|
||||
res = {}
|
||||
for elem in ees_:
|
||||
try:
|
||||
res[elem.c_tag].append(elem)
|
||||
except KeyError:
|
||||
res[elem.c_tag] = [elem]
|
||||
return res
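# A sketch of the typical round trip for unknown content, using the helpers
# above: wrap an element as an ExtensionElement and later map extension
# elements back to typed instances.  `schema_modules` is assumed to be a list
# of imported modules exposing ELEMENT_FROM_STRING and NAMESPACE.
def _example_extension_roundtrip(element, schema_modules):
    exel = element_to_extension_element(element)
    return extension_elements_to_elements([exel], schema_modules)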
|
||||
505
src/saml2/assertion.py
Normal file
@@ -0,0 +1,505 @@
|
||||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# Copyright (C) 2010-2011 Umeå University
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
import re
|
||||
import sys
|
||||
import xmlenc
|
||||
|
||||
from saml2 import saml
|
||||
|
||||
from saml2.time_util import instant, in_a_while
|
||||
from saml2.attribute_converter import from_local
|
||||
|
||||
from saml2.s_utils import sid, MissingValue
|
||||
from saml2.s_utils import factory
|
||||
from saml2.s_utils import assertion_factory
|
||||
|
||||
|
||||
def _filter_values(vals, vlist=None, must=False):
|
||||
""" Removes values from *vals* that does not appear in vlist
|
||||
|
||||
:param vals: The values that are to be filtered
|
||||
:param vlist: The allowed values, either a list or a single string
:param must: Whether at least one of the allowed values must be present
|
||||
:return: The set of values after filtering
|
||||
"""
|
||||
|
||||
if not vlist: # No value specified equals any value
|
||||
return vals
|
||||
|
||||
if isinstance(vlist, basestring):
|
||||
vlist = [vlist]
|
||||
|
||||
res = []
|
||||
|
||||
for val in vlist:
|
||||
if val in vals:
|
||||
res.append(val)
|
||||
|
||||
if must:
|
||||
if res:
|
||||
return res
|
||||
else:
|
||||
raise MissingValue("Required attribute value missing")
|
||||
else:
|
||||
return res
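# A tiny sketch of the filtering semantics: only values that appear in the
# allowed list survive; with must=True an empty result raises MissingValue
# instead of being returned.  The sample values are made up.
def _example_filter_values():
    return _filter_values(["student", "staff"], ["staff", "member"])  # ["staff"]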
|
||||
|
||||
|
||||
def filter_on_attributes(ava, required=None, optional=None):
|
||||
""" Filter
|
||||
|
||||
:param ava: An attribute value assertion as a dictionary
|
||||
:param required: list of RequestedAttribute instances defined to be
|
||||
required
|
||||
:param optional: list of RequestedAttribute instances defined to be
|
||||
optional
|
||||
:return: The modified attribute value assertion
|
||||
"""
|
||||
res = {}
|
||||
|
||||
if required is None:
|
||||
required = []
|
||||
|
||||
for attr in required:
|
||||
if attr.friendly_name in ava:
|
||||
values = [av.text for av in attr.attribute_value]
|
||||
res[attr.friendly_name] = _filter_values(ava[attr.friendly_name], values, True)
|
||||
elif attr.name in ava:
|
||||
values = [av.text for av in attr.attribute_value]
|
||||
res[attr.name] = _filter_values(ava[attr.name], values, True)
|
||||
else:
|
||||
print >> sys.stderr, ava.keys()
|
||||
raise MissingValue("Required attribute missing: '%s'" % (attr.friendly_name,))
|
||||
|
||||
if optional is None:
|
||||
optional = []
|
||||
|
||||
for attr in optional:
|
||||
if attr.friendly_name in ava:
|
||||
values = [av.text for av in attr.attribute_value]
|
||||
try:
|
||||
res[attr.friendly_name].extend(_filter_values(ava[attr.friendly_name], values))
|
||||
except KeyError:
|
||||
res[attr.friendly_name] = _filter_values(ava[attr.friendly_name], values)
|
||||
elif attr.name in ava:
|
||||
values = [av.text for av in attr.attribute_value]
|
||||
try:
|
||||
res[attr.name].extend(_filter_values(ava[attr.name], values))
|
||||
except KeyError:
|
||||
res[attr.name] = _filter_values(ava[attr.name], values)
|
||||
|
||||
return res
|
||||
|
||||
def filter_on_demands(ava, required=None, optional=None):
|
||||
""" Never return more than is needed. Filters out everything
|
||||
the server is prepared to return but the receiver doesn't ask for
|
||||
|
||||
:param ava: Attribute value assertion as a dictionary
|
||||
:param required: Required attributes
|
||||
:param optional: Optional attributes
|
||||
:return: The possibly reduced assertion
|
||||
"""
|
||||
|
||||
# Is all what's required there:
|
||||
if required is None:
|
||||
required = {}
|
||||
|
||||
for attr, vals in required.items():
|
||||
if attr in ava:
|
||||
if vals:
|
||||
for val in vals:
|
||||
if val not in ava[attr]:
|
||||
raise MissingValue(
|
||||
"Required attribute value missing: %s,%s" % (attr,
|
||||
val))
|
||||
else:
|
||||
raise MissingValue("Required attribute missing: %s" % (attr,))
|
||||
|
||||
if optional is None:
|
||||
optional = {}
|
||||
|
||||
# OK, so I can imagine releasing values that are not absolutely necessary
|
||||
# but not attributes
|
||||
for attr, vals in ava.items():
|
||||
if attr not in required and attr not in optional:
|
||||
del ava[attr]
|
||||
|
||||
return ava
|
||||
|
||||
def filter_on_wire_representation(ava, acs, required=None, optional=None):
|
||||
"""
|
||||
:param ava: A dictionary with attributes and values
|
||||
:param required: A list of saml.Attributes
|
||||
:param optional: A list of saml.Attributes
|
||||
:return: Dictionary of expected/wanted attributes and values
|
||||
"""
|
||||
acsdic = dict([(ac.name_format, ac) for ac in acs])
|
||||
|
||||
if required is None:
|
||||
required = []
|
||||
if optional is None:
|
||||
optional = []
|
||||
|
||||
res = {}
|
||||
for attr, val in ava.items():
|
||||
done = False
|
||||
for req in required:
|
||||
try:
|
||||
_name = acsdic[req.name_format]._to[attr]
|
||||
if _name == req.name:
|
||||
res[attr] = val
|
||||
done = True
|
||||
except KeyError:
|
||||
pass
|
||||
if done:
|
||||
continue
|
||||
for opt in optional:
|
||||
try:
|
||||
_name = acsdic[opt.name_format]._to[attr]
|
||||
if _name == opt.name:
|
||||
res[attr] = val
|
||||
break
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
return res
|
||||
|
||||
def filter_attribute_value_assertions(ava, attribute_restrictions=None):
|
||||
""" Will weed out attribute values and values according to the
|
||||
rules defined in the attribute restrictions. If filtering results in
|
||||
an attribute without values, then the attribute is removed from the
|
||||
assertion.
|
||||
|
||||
:param ava: The incoming attribute value assertion (dictionary)
|
||||
:param attribute_restrictions: The rules that govern which attributes
|
||||
and values that are allowed. (dictionary)
|
||||
:return: The modified attribute value assertion
|
||||
"""
|
||||
if not attribute_restrictions:
|
||||
return ava
|
||||
|
||||
for attr, vals in ava.items():
|
||||
if attr in attribute_restrictions:
|
||||
if attribute_restrictions[attr]:
|
||||
rvals = []
|
||||
for restr in attribute_restrictions[attr]:
|
||||
for val in vals:
|
||||
if restr.match(val):
|
||||
rvals.append(val)
|
||||
|
||||
if rvals:
|
||||
ava[attr] = list(set(rvals))
|
||||
else:
|
||||
del ava[attr]
|
||||
else:
|
||||
del ava[attr]
|
||||
return ava
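# A sketch of how attribute_restrictions is shaped: attribute names map to
# lists of compiled regular expressions, a value is kept when one expression
# matches it, and a restriction of None leaves that attribute's values
# untouched.  The attribute values and address below are made up.
def _example_filter_ava():
    ava = {"eduPersonAffiliation": ["student", "staff"],
           "mail": ["someone@example.com"]}
    restrictions = {"eduPersonAffiliation": [re.compile("^staff$")],
                    "mail": None}
    return filter_attribute_value_assertions(ava, restrictions)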
|
||||
|
||||
class Policy(object):
|
||||
""" handles restrictions on assertions """
|
||||
|
||||
def __init__(self, restrictions=None):
|
||||
if restrictions:
|
||||
self.compile(restrictions)
|
||||
else:
|
||||
self._restrictions = None
|
||||
|
||||
def compile(self, restrictions):
|
||||
""" This is only for IdPs or AAs, and it's about limiting what
|
||||
is returned to the SP.
|
||||
In the configuration file, restrictions on which values that
|
||||
can be returned are specified with the help of regular expressions.
|
||||
This function goes through and pre-compiles the regular expressions.
|
||||
|
||||
:param restrictions: The restrictions specification as a dictionary
:return: The restrictions with the string specifications replaced by
compiled regular expressions.
|
||||
"""
|
||||
|
||||
self._restrictions = restrictions.copy()
|
||||
|
||||
for _, spec in self._restrictions.items():
|
||||
if spec is None:
|
||||
continue
|
||||
|
||||
try:
|
||||
restr = spec["attribute_restrictions"]
|
||||
except KeyError:
|
||||
continue
|
||||
|
||||
if restr is None:
|
||||
continue
|
||||
|
||||
for key, values in restr.items():
|
||||
if not values:
|
||||
spec["attribute_restrictions"][key] = None
|
||||
continue
|
||||
|
||||
spec["attribute_restrictions"][key] = \
|
||||
[re.compile(value) for value in values]
|
||||
|
||||
return self._restrictions
|
||||
|
||||
def get_nameid_format(self, sp_entity_id):
|
||||
""" Get the NameIDFormat to used for the entity id
|
||||
:param: The SP entity ID
|
||||
:retur: The format
|
||||
"""
|
||||
try:
|
||||
form = self._restrictions[sp_entity_id]["nameid_format"]
|
||||
except KeyError:
|
||||
try:
|
||||
form = self._restrictions["default"]["nameid_format"]
|
||||
except KeyError:
|
||||
form = saml.NAMEID_FORMAT_TRANSIENT
|
||||
|
||||
return form
|
||||
|
||||
def get_name_form(self, sp_entity_id):
|
||||
""" Get the NameFormat to used for the entity id
|
||||
:param: The SP entity ID
|
||||
:retur: The format
|
||||
"""
|
||||
form = ""
|
||||
|
||||
try:
|
||||
form = self._restrictions[sp_entity_id]["name_form"]
|
||||
except TypeError:
|
||||
pass
|
||||
except KeyError:
|
||||
try:
|
||||
form = self._restrictions["default"]["name_form"]
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
return form
|
||||
|
||||
def get_lifetime(self, sp_entity_id):
|
||||
""" The lifetime of the assertion
|
||||
:param sp_entity_id: The SP entity ID
|
||||
:return: The lifetime as a dictionary
|
||||
"""
|
||||
# default is an hour
|
||||
spec = {"hours":1}
|
||||
if not self._restrictions:
|
||||
return spec
|
||||
|
||||
try:
|
||||
spec = self._restrictions[sp_entity_id]["lifetime"]
|
||||
except KeyError:
|
||||
try:
|
||||
spec = self._restrictions["default"]["lifetime"]
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
return spec
|
||||
|
||||
def get_attribute_restriction(self, sp_entity_id):
|
||||
""" Return the attribute restriction for SP that want the information
|
||||
|
||||
:param sp_entity_id: The SP entity ID
|
||||
:return: The restrictions
|
||||
"""
|
||||
|
||||
if not self._restrictions:
|
||||
return None
|
||||
|
||||
try:
|
||||
try:
|
||||
restrictions = self._restrictions[sp_entity_id][
|
||||
"attribute_restrictions"]
|
||||
except KeyError:
|
||||
try:
|
||||
restrictions = self._restrictions["default"][
|
||||
"attribute_restrictions"]
|
||||
except KeyError:
|
||||
restrictions = None
|
||||
except KeyError:
|
||||
restrictions = None
|
||||
|
||||
return restrictions
|
||||
|
||||
def not_on_or_after(self, sp_entity_id):
|
||||
""" When the assertion stops being valid, should not be
|
||||
used after this time.
|
||||
|
||||
:param sp_entity_id: The SP entity ID
|
||||
:return: String representation of the time
|
||||
"""
|
||||
|
||||
return in_a_while(**self.get_lifetime(sp_entity_id))
|
||||
|
||||
def filter(self, ava, sp_entity_id, required=None, optional=None):
|
||||
""" What attribute and attribute values returns depends on what
|
||||
the SP has said it wants in the request or in the metadata file and
|
||||
what the IdP/AA wants to release. An assumption is that what the SP
|
||||
asks for overrides whatever is in the metadata. But of course the
|
||||
IdP never releases anything it doesn't want to.
|
||||
|
||||
:param ava: The information about the subject as a dictionary
|
||||
:param sp_entity_id: The entity ID of the SP
|
||||
:param required: Attributes that the SP requires in the assertion
|
||||
:param optional: Attributes that the SP regards as optional
|
||||
:return: A possibly modified AVA
|
||||
"""
|
||||
|
||||
|
||||
ava = filter_attribute_value_assertions(ava,
|
||||
self.get_attribute_restriction(sp_entity_id))
|
||||
|
||||
if required or optional:
|
||||
ava = filter_on_attributes(ava, required, optional)
|
||||
|
||||
return ava
|
||||
|
||||
def restrict(self, ava, sp_entity_id, metadata=None):
|
||||
""" Identity attribute names are expected to be expressed in
|
||||
the local lingo (== friendlyName)
|
||||
|
||||
:return: A filtered ava according to the IdPs/AAs rules and
|
||||
the list of required/optional attributes according to the SP.
|
||||
If the requirements can't be met an exception is raised.
|
||||
"""
|
||||
if metadata:
|
||||
(required, optional) = metadata.attribute_consumer(sp_entity_id)
|
||||
#(required, optional) = metadata.wants(sp_entity_id)
|
||||
else:
|
||||
required = optional = None
|
||||
|
||||
return self.filter(ava, sp_entity_id, required, optional)
|
||||
|
||||
def conditions(self, sp_entity_id):
|
||||
""" Return a saml.Condition instance
|
||||
|
||||
:param sp_entity_id: The SP entity ID
|
||||
:return: A saml.Conditions instance
|
||||
"""
|
||||
return factory( saml.Conditions,
|
||||
not_before=instant(),
|
||||
# How long might depend on who's getting it
|
||||
not_on_or_after=self.not_on_or_after(sp_entity_id),
|
||||
audience_restriction=[factory( saml.AudienceRestriction,
|
||||
audience=factory(saml.Audience,
|
||||
text=sp_entity_id))])
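# A sketch of a restrictions specification of the shape Policy expects: keyed
# on SP entity id or "default", using the keys consumed by the methods above
# (lifetime, nameid_format, name_form, attribute_restrictions).  The entity id
# below is made up for the example.
def _example_policy():
    return Policy({
        "default": {
            "lifetime": {"minutes": 15},
            "nameid_format": saml.NAMEID_FORMAT_TRANSIENT,
            "attribute_restrictions": None,  # None: no attribute filtering
        },
        "urn:mace:example.com:saml:sp": {
            "lifetime": {"hours": 1},
            "attribute_restrictions": {"givenName": ["^R"]},
        },
    })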
|
||||
|
||||
class Assertion(dict):
|
||||
""" Handles assertions about subjects """
|
||||
|
||||
def __init__(self, dic=None):
|
||||
dict.__init__(self, dic)
|
||||
|
||||
def _authn_context_decl_ref(self, authn_class):
|
||||
# authn_class: saml.AUTHN_PASSWORD
|
||||
return factory(saml.AuthnContext,
|
||||
authn_context_decl_ref=factory(
|
||||
saml.AuthnContextDeclRef, text=authn_class))
|
||||
|
||||
def _authn_context_class_ref(self, authn_class, authn_auth=None):
|
||||
# authn_class: saml.AUTHN_PASSWORD
|
||||
cntx_class = factory(saml.AuthnContextClassRef, text=authn_class)
|
||||
if authn_auth:
|
||||
return factory(saml.AuthnContext,
|
||||
authn_context_class_ref=cntx_class,
|
||||
authenticating_authority=factory(
|
||||
saml.AuthenticatingAuthority,
|
||||
text=authn_auth))
|
||||
else:
|
||||
return factory(saml.AuthnContext,
|
||||
authn_context_class_ref=cntx_class)
|
||||
|
||||
def _authn_statement(self, authn_class=None, authn_auth=None,
|
||||
authn_decl=None):
|
||||
if authn_class:
|
||||
return factory(saml.AuthnStatement,
|
||||
authn_instant=instant(),
|
||||
session_index=sid(),
|
||||
authn_context=self._authn_context_class_ref(
|
||||
authn_class, authn_auth))
|
||||
elif authn_decl:
|
||||
return factory(saml.AuthnStatement,
|
||||
authn_instant=instant(),
|
||||
session_index=sid(),
|
||||
authn_context=self._authn_context_decl_ref(authn_decl))
|
||||
else:
|
||||
return factory(saml.AuthnStatement,
|
||||
authn_instant=instant(),
|
||||
session_index=sid())
|
||||
|
||||
def construct(self, sp_entity_id, in_response_to, consumer_url,
|
||||
name_id, attrconvs, policy, issuer, authn_class=None,
|
||||
authn_auth=None, authn_decl=None, encrypt=None,
|
||||
sec_context=None):
|
||||
""" Construct the Assertion
|
||||
|
||||
:param sp_entity_id: The entityid of the SP
|
||||
:param in_response_to: The identifier of the message that this message
is a response to
|
||||
:param consumer_url: The intended consumer of the assertion
|
||||
:param name_id: A NameID instance
|
||||
:param attrconvs: AttributeConverters
|
||||
:param policy: The policy that should be adhered to when replying
|
||||
:param issuer: Who is issuing the statement
|
||||
:param authn_class: The authentication class
|
||||
:param authn_auth: The authentication instance
|
||||
:param encrypt: Whether to encrypt parts or all of the Assertion
|
||||
:param sec_context: The security context used when encrypting
|
||||
:return: An Assertion instance
|
||||
"""
|
||||
attr_statement = saml.AttributeStatement(attribute=from_local(
|
||||
attrconvs, self,
|
||||
policy.get_name_form(sp_entity_id)))
|
||||
|
||||
if encrypt == "attributes":
|
||||
for attr in attr_statement.attribute:
|
||||
enc = sec_context.encrypt(text="%s" % attr)
|
||||
|
||||
encd = xmlenc.encrypted_data_from_string(enc)
|
||||
encattr = saml.EncryptedAttribute(encrypted_data=encd)
|
||||
attr_statement.encrypted_attribute.append(encattr)
|
||||
|
||||
attr_statement.attribute = []
|
||||
|
||||
# start using now and for some time
|
||||
conds = policy.conditions(sp_entity_id)
|
||||
|
||||
return assertion_factory(
|
||||
issuer=issuer,
|
||||
attribute_statement = attr_statement,
|
||||
authn_statement = self._authn_statement(authn_class, authn_auth,
|
||||
authn_decl),
|
||||
conditions = conds,
|
||||
subject=factory( saml.Subject,
|
||||
name_id=name_id,
|
||||
subject_confirmation=factory( saml.SubjectConfirmation,
|
||||
method=saml.SUBJECT_CONFIRMATION_METHOD_BEARER,
|
||||
subject_confirmation_data=factory(
|
||||
saml.SubjectConfirmationData,
|
||||
in_response_to=in_response_to,
|
||||
recipient=consumer_url,
|
||||
not_on_or_after=policy.not_on_or_after(
|
||||
sp_entity_id)))),
|
||||
)
|
||||
|
||||
def apply_policy(self, sp_entity_id, policy, metadata=None):
|
||||
""" Apply policy to the assertion I'm representing
|
||||
|
||||
:param sp_entity_id: The SP entity ID
|
||||
:param policy: The policy
|
||||
:param metadata: Metadata to use
|
||||
:return: The resulting AVA after the policy is applied
|
||||
"""
|
||||
return policy.restrict(self, sp_entity_id, metadata)
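# A minimal sketch of how Assertion and Policy fit together: an Assertion is
# just a dictionary of attribute names and values, and apply_policy() trims it
# according to the Policy rules.  The entity id is made up for the example.
def _example_apply_policy():
    policy = Policy({"default": {"lifetime": {"minutes": 5}}})
    ava = Assertion({"givenName": ["Roland"], "sn": ["Hedberg"]})
    return ava.apply_policy("urn:mace:example.com:saml:sp", policy)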
|
||||
346
src/saml2/attribute_converter.py
Normal file
@@ -0,0 +1,346 @@
|
||||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# Copyright (C) 2010-2011 Umeå University
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
import os
|
||||
import sys
|
||||
from importlib import import_module
|
||||
|
||||
from saml2.s_utils import factory, do_ava
|
||||
from saml2 import saml
|
||||
from saml2.saml import NAME_FORMAT_URI
|
||||
|
||||
class UnknownNameFormat(Exception):
|
||||
pass
|
||||
|
||||
def load_maps(dirspec):
|
||||
""" load the attribute maps
|
||||
|
||||
:param dirspec: a directory specification
|
||||
:return: a dictionary with the identifier of the map as key and the
|
||||
map as value. The map itself is a dictionary with two keys:
|
||||
"to" and "fro". The values for those keys are the actual mapping.
|
||||
"""
|
||||
map = {}
|
||||
if dirspec not in sys.path:
|
||||
sys.path.insert(0, dirspec)
|
||||
|
||||
for fil in os.listdir(dirspec):
|
||||
if fil.endswith(".py"):
|
||||
mod = import_module(fil[:-3])
|
||||
for key, item in mod.__dict__.items():
|
||||
if key.startswith("__"):
|
||||
continue
|
||||
if isinstance(item, dict) and "to" in item and "fro" in item:
|
||||
map[item["identifier"]] = item
|
||||
|
||||
return map
|
||||
|
||||
def ac_factory(path=""):
|
||||
"""Attribute Converter factory
|
||||
|
||||
:param path: The path to a directory where the attribute maps are expected
|
||||
to reside.
|
||||
:return: A list of AttributeConverter instances
|
||||
"""
|
||||
acs = []
|
||||
|
||||
if path:
|
||||
if path not in sys.path:
|
||||
sys.path.insert(0, path)
|
||||
|
||||
for fil in os.listdir(path):
|
||||
if fil.endswith(".py"):
|
||||
mod = import_module(fil[:-3])
|
||||
for key, item in mod.__dict__.items():
|
||||
if key.startswith("__"):
|
||||
continue
|
||||
if isinstance(item, dict) and "to" in item and "fro" in item:
|
||||
atco = AttributeConverter(item["identifier"])
|
||||
atco.from_dict(item)
|
||||
acs.append(atco)
|
||||
else:
|
||||
for map in ["basic", "saml_uri", "shibboleth_uri"]:
|
||||
mod = import_module(".%s" % map, "saml2.attributemaps")
|
||||
for key, item in mod.__dict__.items():
|
||||
if key.startswith("__"):
|
||||
continue
|
||||
if isinstance(item, dict) and "to" in item and "fro" in item:
|
||||
atco = AttributeConverter(item["identifier"])
|
||||
atco.from_dict(item)
|
||||
acs.append(atco)
|
||||
|
||||
return acs
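# A sketch of the attribute-map structure ac_factory() loads: "identifier"
# names the attribute name-format, while "fro"/"to" translate between wire
# names and local names.  The single sn/OID pair is just a sample entry.
def _example_converter_from_map():
    example_map = {
        "identifier": "urn:oasis:names:tc:SAML:2.0:attrname-format:uri",
        "fro": {"urn:oid:2.5.4.4": "sn"},
        "to": {"sn": "urn:oid:2.5.4.4"},
    }
    atco = AttributeConverter(example_map["identifier"])
    atco.from_dict(example_map)
    return atco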
|
||||
|
||||
def ac_factory_II(path):
|
||||
return ac_factory(path)
|
||||
|
||||
#def ac_factory_old(path):
|
||||
# acs = []
|
||||
#
|
||||
# for dir_name, directories, files in os.walk(path):
|
||||
# for d in list(directories):
|
||||
# if d.startswith('.'):
|
||||
# directories.remove(d)
|
||||
#
|
||||
# if files:
|
||||
# atco = AttributeConverter(os.path.basename(dir_name))
|
||||
# for name in files:
|
||||
# fname = os.path.join(dir_name, name)
|
||||
# if name.endswith(".py"):
|
||||
# name = name[:-3]
|
||||
# atco.set(name, fname)
|
||||
# atco.adjust()
|
||||
# acs.append(atco)
|
||||
# return acs
|
||||
|
||||
def ava_fro(acs, statement):
|
||||
""" Translates attributes according to their name_formats into the local
|
||||
names.
|
||||
|
||||
:param acs: AttributeConverter instances
|
||||
:param statement: A SAML statement
|
||||
:return: A dictionary with attribute names replaced with local names.
|
||||
"""
|
||||
if not statement:
|
||||
return {}
|
||||
|
||||
acsdic = dict([(ac.name_format, ac) for ac in acs])
|
||||
acsdic[None] = acsdic[NAME_FORMAT_URI]
|
||||
return dict([acsdic[a.name_format].ava_from(a) for a in statement])
|
||||
|
||||
def to_local(acs, statement):
|
||||
""" Replaces the attribute names in a attribute value assertion with the
|
||||
equivalent name from a local name format.
|
||||
|
||||
"""
|
||||
if not acs:
|
||||
acs = [AttributeConverter()]
|
||||
|
||||
ava = []
|
||||
for aconv in acs:
|
||||
try:
|
||||
ava = aconv.fro(statement)
|
||||
break
|
||||
except UnknownNameFormat:
|
||||
pass
|
||||
return ava
|
||||
|
||||
def from_local(acs, ava, name_format):
|
||||
for aconv in acs:
|
||||
#print ac.format, name_format
|
||||
if aconv.name_format == name_format:
|
||||
#print "Found a name_form converter"
|
||||
return aconv.to_(ava)
|
||||
|
||||
return None
|
||||
|
||||
def from_local_name(acs, attr, name_format):
|
||||
"""
|
||||
:param acs: List of AttributeConverter instances
|
||||
:param attr: attribute name as string
|
||||
:param name_format: Which name-format it should be translated to
|
||||
:return: An Attribute instance
|
||||
"""
|
||||
for aconv in acs:
|
||||
#print ac.format, name_format
|
||||
if aconv.name_format == name_format:
|
||||
#print "Found a name_form converter"
|
||||
return aconv.to_format(attr)
|
||||
return attr
|
||||
|
||||
def to_local_name(acs, attr):
|
||||
"""
|
||||
:param acs: List of AttributeConverter instances
|
||||
:param attr: an Attribute instance
|
||||
:return: The local attribute name
|
||||
"""
|
||||
for aconv in acs:
|
||||
lattr = aconv.from_format(attr)
|
||||
if lattr:
|
||||
return lattr
|
||||
|
||||
return attr.friendly_name
|
||||
|
||||
class AttributeConverter(object):
|
||||
""" Converts from an attribute statement to a key,value dictionary and
|
||||
vice-versa """
|
||||
|
||||
def __init__(self, name_format=""):
|
||||
self.name_format = name_format
|
||||
self._to = None
|
||||
self._fro = None
|
||||
|
||||
# def set(self, name, filename):
|
||||
# if name == "to":
|
||||
# self.set_to(filename)
|
||||
# elif name == "fro":
|
||||
# self.set_fro(filename)
|
||||
# # else ignore
|
||||
#
|
||||
# def set_fro(self, filename):
|
||||
# self._fro = eval(open(filename).read())
|
||||
#
|
||||
# def set_to(self, filename):
|
||||
# self._to = eval(open(filename).read())
|
||||
#
|
||||
def adjust(self):
|
||||
""" If one of the transformations is not defined it is expected to
|
||||
be the mirror image of the other.
|
||||
"""
|
||||
|
||||
if self._fro is None and self._to is not None:
|
||||
self._fro = dict([(value, key) for key, value in self._to.items()])
|
||||
if self._to is None and self._fro is not None:
|
||||
self._to = dict([(value, key) for key, value in self._fro.items()])
|
||||
|
||||
def from_dict(self, mapdict):
|
||||
""" Import the attribute map from a dictionary
|
||||
|
||||
:param mapdict: The dictionary
|
||||
"""
|
||||
|
||||
self.name_format = mapdict["identifier"]
|
||||
try:
|
||||
self._fro = mapdict["fro"]
|
||||
except KeyError:
|
||||
pass
|
||||
try:
|
||||
self._to = mapdict["to"]
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
if self._fro is None and self._to is None:
|
||||
raise Exception("Missing specifications")
|
||||
|
||||
if self._fro is None or self._to is None:
|
||||
self.adjust()
|
||||
|
||||
|
||||
def fail_safe_fro(self, statement):
|
||||
""" In case there is not formats defined """
|
||||
result = {}
|
||||
for attribute in statement.attribute:
|
||||
try:
|
||||
name = attribute.friendly_name.strip()
|
||||
except AttributeError:
|
||||
name = attribute.name.strip()
|
||||
|
||||
result[name] = []
|
||||
for value in attribute.attribute_value:
|
||||
if not value.text:
|
||||
result[name].append('')
|
||||
else:
|
||||
result[name].append(value.text.strip())
|
||||
return result
|
||||
|
||||
def ava_from(self, attribute):
|
||||
try:
|
||||
attr = self._fro[attribute.name.strip()]
|
||||
except (AttributeError, KeyError):
|
||||
try:
|
||||
attr = attribute.friendly_name.strip()
|
||||
except AttributeError:
|
||||
attr = attribute.name.strip()
|
||||
|
||||
val = []
|
||||
for value in attribute.attribute_value:
|
||||
if not value.text:
|
||||
val.append('')
|
||||
else:
|
||||
val.append(value.text.strip())
|
||||
|
||||
return attr, val
|
||||
|
||||
def fro(self, statement):
|
||||
""" Get the attributes and the attribute values
|
||||
|
||||
:param statement: The AttributeStatement.
|
||||
:return: A dictionary containing attributes and values
|
||||
"""
|
||||
|
||||
if not self.name_format:
|
||||
return self.fail_safe_fro(statement)
|
||||
|
||||
result = {}
|
||||
for attribute in statement.attribute:
|
||||
if attribute.name_format and self.name_format and \
|
||||
attribute.name_format != self.name_format:
|
||||
raise UnknownNameFormat
|
||||
|
||||
(key, val) = self.ava_from(attribute)
|
||||
result[key] = val
|
||||
|
||||
if not result:
|
||||
return self.fail_safe_fro(statement)
|
||||
else:
|
||||
return result
|
||||
|
||||
def to_format(self, attr):
|
||||
""" Creates an Attribute instance with name, name_format and
|
||||
friendly_name
|
||||
|
||||
:param attr: The local name of the attribute
|
||||
:return: An Attribute instance
|
||||
"""
|
||||
try:
|
||||
return factory(saml.Attribute,
|
||||
name=self._to[attr],
|
||||
name_format=self.name_format,
|
||||
friendly_name=attr)
|
||||
except KeyError:
|
||||
return factory(saml.Attribute, name=attr)
|
||||
|
||||
def from_format(self, attr):
|
||||
""" Find out the local name of an attribute
|
||||
|
||||
:param attr: An saml.Attribute instance
|
||||
:return: The local attribute name or "" if no mapping could be made
|
||||
"""
|
||||
if attr.name_format:
|
||||
if self.name_format == attr.name_format:
|
||||
try:
|
||||
return self._fro[attr.name]
|
||||
except KeyError:
|
||||
pass
|
||||
else: #don't know the name format so try all I have
|
||||
try:
|
||||
return self._fro[attr.name]
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
return ""
|
||||
|
||||
def to_(self, attrvals):
|
||||
""" Create a list of Attribute instances.
|
||||
|
||||
:param attrvals: A dictionary of attributes and values
|
||||
:return: A list of Attribute instances
|
||||
"""
|
||||
attributes = []
|
||||
for key, value in attrvals.items():
|
||||
try:
|
||||
attributes.append(factory(saml.Attribute,
|
||||
name=self._to[key],
|
||||
name_format=self.name_format,
|
||||
friendly_name=key,
|
||||
attribute_value=do_ava(value)))
|
||||
except KeyError:
|
||||
attributes.append(factory(saml.Attribute,
|
||||
name=key,
|
||||
attribute_value=do_ava(value)))
|
||||
|
||||
return attributes
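# A sketch of turning a local attribute dictionary into saml.Attribute
# instances with the converters built by ac_factory(); the attribute values
# are made up for the example.
def _example_from_local():
    acs = ac_factory()
    ava = {"sn": ["Hedberg"], "givenName": ["Roland"]}
    return from_local(acs, ava, NAME_FORMAT_URI)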
|
||||
70
src/saml2/attribute_resolver.py
Normal file
@@ -0,0 +1,70 @@
|
||||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# Copyright (C) 2009-2011 Umeå University
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
"""
|
||||
Contains classes and functions that a SAML2.0 Service Provider (SP) may use
|
||||
to do attribute aggregation.
|
||||
"""
|
||||
import saml2
|
||||
|
||||
DEFAULT_BINDING = saml2.BINDING_SOAP
|
||||
|
||||
class AttributeResolver(object):
|
||||
|
||||
def __init__(self, metadata=None, config=None, saml2client=None):
|
||||
self.metadata = metadata
|
||||
|
||||
if saml2client:
|
||||
self.saml2client = saml2client
|
||||
self.metadata = saml2client.config.metadata
|
||||
else:
|
||||
self.saml2client = saml2.client.Saml2Client(config)
|
||||
|
||||
def extend(self, subject_id, issuer, vo_members, name_id_format=None,
|
||||
sp_name_qualifier=None, log=None, real_id=None):
|
||||
"""
|
||||
:param subject_id: The identifier by which the subject is known
among all the participants of the VO
:param issuer: Who I am, the entity posing the query
:param vo_members: The entity IDs of the IdPs that I'm going to ask
for extra attributes
:param name_id_format: Used to make the IdPs aware of what's going
on here
|
||||
:param log: Where to log exciting information
|
||||
:return: A dictionary with all the collected information about the
|
||||
subject
|
||||
"""
|
||||
result = []
|
||||
for member in vo_members:
|
||||
for ass in self.metadata.attribute_services(member):
|
||||
for attr_serv in ass.attribute_service:
|
||||
if log:
|
||||
log.info(
|
||||
"Send attribute request to %s" % attr_serv.location)
|
||||
if attr_serv.binding != saml2.BINDING_SOAP:
|
||||
continue
|
||||
# attribute query assumes SOAP binding
|
||||
session_info = self.saml2client.attribute_query(
|
||||
subject_id,
|
||||
attr_serv.location,
|
||||
issuer_id=issuer,
|
||||
sp_name_qualifier=sp_name_qualifier,
|
||||
nameid_format=name_id_format,
|
||||
log=log, real_id=real_id)
|
||||
if session_info:
|
||||
result.append(session_info)
|
||||
return result
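# A sketch of attribute aggregation with the resolver above, assuming
# `saml2client` is an already-configured saml2.client.Saml2Client; the entity
# ids and subject id are made up for illustration.
def _example_attribute_aggregation(saml2client):
    resolver = AttributeResolver(saml2client=saml2client)
    return resolver.extend("some-subject-id",
                           "https://proxy.example.com/sp",
                           ["https://idp.example.org/idp"])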
|
||||
1
src/saml2/attributemaps/__init__.py
Normal file
@@ -0,0 +1 @@
|
||||
__author__ = 'rohe0002'
|
||||
326
src/saml2/attributemaps/basic.py
Normal file
@@ -0,0 +1,326 @@
|
||||
|
||||
MAP = {
|
||||
"identifier": "urn:oasis:names:tc:SAML:2.0:attrname-format:basic",
|
||||
"fro": {
|
||||
'urn:mace:dir:attribute-def:aRecord': 'aRecord',
|
||||
'urn:mace:dir:attribute-def:aliasedEntryName': 'aliasedEntryName',
|
||||
'urn:mace:dir:attribute-def:aliasedObjectName': 'aliasedObjectName',
|
||||
'urn:mace:dir:attribute-def:associatedDomain': 'associatedDomain',
|
||||
'urn:mace:dir:attribute-def:associatedName': 'associatedName',
|
||||
'urn:mace:dir:attribute-def:audio': 'audio',
|
||||
'urn:mace:dir:attribute-def:authorityRevocationList': 'authorityRevocationList',
|
||||
'urn:mace:dir:attribute-def:buildingName': 'buildingName',
|
||||
'urn:mace:dir:attribute-def:businessCategory': 'businessCategory',
|
||||
'urn:mace:dir:attribute-def:c': 'c',
|
||||
'urn:mace:dir:attribute-def:cACertificate': 'cACertificate',
|
||||
'urn:mace:dir:attribute-def:cNAMERecord': 'cNAMERecord',
|
||||
'urn:mace:dir:attribute-def:carLicense': 'carLicense',
|
||||
'urn:mace:dir:attribute-def:certificateRevocationList': 'certificateRevocationList',
|
||||
'urn:mace:dir:attribute-def:cn': 'cn',
|
||||
'urn:mace:dir:attribute-def:co': 'co',
|
||||
'urn:mace:dir:attribute-def:commonName': 'commonName',
|
||||
'urn:mace:dir:attribute-def:countryName': 'countryName',
|
||||
'urn:mace:dir:attribute-def:crossCertificatePair': 'crossCertificatePair',
|
||||
'urn:mace:dir:attribute-def:dITRedirect': 'dITRedirect',
|
||||
'urn:mace:dir:attribute-def:dSAQuality': 'dSAQuality',
|
||||
'urn:mace:dir:attribute-def:dc': 'dc',
|
||||
'urn:mace:dir:attribute-def:deltaRevocationList': 'deltaRevocationList',
|
||||
'urn:mace:dir:attribute-def:departmentNumber': 'departmentNumber',
|
||||
'urn:mace:dir:attribute-def:description': 'description',
|
||||
'urn:mace:dir:attribute-def:destinationIndicator': 'destinationIndicator',
|
||||
'urn:mace:dir:attribute-def:displayName': 'displayName',
|
||||
'urn:mace:dir:attribute-def:distinguishedName': 'distinguishedName',
|
||||
'urn:mace:dir:attribute-def:dmdName': 'dmdName',
|
||||
'urn:mace:dir:attribute-def:dnQualifier': 'dnQualifier',
|
||||
'urn:mace:dir:attribute-def:documentAuthor': 'documentAuthor',
|
||||
'urn:mace:dir:attribute-def:documentIdentifier': 'documentIdentifier',
|
||||
'urn:mace:dir:attribute-def:documentLocation': 'documentLocation',
|
||||
'urn:mace:dir:attribute-def:documentPublisher': 'documentPublisher',
|
||||
'urn:mace:dir:attribute-def:documentTitle': 'documentTitle',
|
||||
'urn:mace:dir:attribute-def:documentVersion': 'documentVersion',
|
||||
'urn:mace:dir:attribute-def:domainComponent': 'domainComponent',
|
||||
'urn:mace:dir:attribute-def:drink': 'drink',
|
||||
'urn:mace:dir:attribute-def:eduOrgHomePageURI': 'eduOrgHomePageURI',
|
||||
'urn:mace:dir:attribute-def:eduOrgIdentityAuthNPolicyURI': 'eduOrgIdentityAuthNPolicyURI',
|
||||
'urn:mace:dir:attribute-def:eduOrgLegalName': 'eduOrgLegalName',
|
||||
'urn:mace:dir:attribute-def:eduOrgSuperiorURI': 'eduOrgSuperiorURI',
|
||||
'urn:mace:dir:attribute-def:eduOrgWhitePagesURI': 'eduOrgWhitePagesURI',
|
||||
'urn:mace:dir:attribute-def:eduPersonAffiliation': 'eduPersonAffiliation',
|
||||
'urn:mace:dir:attribute-def:eduPersonEntitlement': 'eduPersonEntitlement',
|
||||
'urn:mace:dir:attribute-def:eduPersonNickname': 'eduPersonNickname',
|
||||
'urn:mace:dir:attribute-def:eduPersonOrgDN': 'eduPersonOrgDN',
|
||||
'urn:mace:dir:attribute-def:eduPersonOrgUnitDN': 'eduPersonOrgUnitDN',
|
||||
'urn:mace:dir:attribute-def:eduPersonPrimaryAffiliation': 'eduPersonPrimaryAffiliation',
|
||||
'urn:mace:dir:attribute-def:eduPersonPrimaryOrgUnitDN': 'eduPersonPrimaryOrgUnitDN',
|
||||
'urn:mace:dir:attribute-def:eduPersonPrincipalName': 'eduPersonPrincipalName',
|
||||
'urn:mace:dir:attribute-def:eduPersonScopedAffiliation': 'eduPersonScopedAffiliation',
|
||||
'urn:mace:dir:attribute-def:eduPersonTargetedID': 'eduPersonTargetedID',
|
||||
'urn:mace:dir:attribute-def:email': 'email',
|
||||
'urn:mace:dir:attribute-def:emailAddress': 'emailAddress',
|
||||
'urn:mace:dir:attribute-def:employeeNumber': 'employeeNumber',
|
||||
'urn:mace:dir:attribute-def:employeeType': 'employeeType',
|
||||
'urn:mace:dir:attribute-def:enhancedSearchGuide': 'enhancedSearchGuide',
|
||||
'urn:mace:dir:attribute-def:facsimileTelephoneNumber': 'facsimileTelephoneNumber',
|
||||
'urn:mace:dir:attribute-def:favouriteDrink': 'favouriteDrink',
|
||||
'urn:mace:dir:attribute-def:fax': 'fax',
|
||||
'urn:mace:dir:attribute-def:federationFeideSchemaVersion': 'federationFeideSchemaVersion',
|
||||
'urn:mace:dir:attribute-def:friendlyCountryName': 'friendlyCountryName',
|
||||
'urn:mace:dir:attribute-def:generationQualifier': 'generationQualifier',
|
||||
'urn:mace:dir:attribute-def:givenName': 'givenName',
|
||||
'urn:mace:dir:attribute-def:gn': 'gn',
|
||||
'urn:mace:dir:attribute-def:homePhone': 'homePhone',
|
||||
'urn:mace:dir:attribute-def:homePostalAddress': 'homePostalAddress',
|
||||
'urn:mace:dir:attribute-def:homeTelephoneNumber': 'homeTelephoneNumber',
|
||||
'urn:mace:dir:attribute-def:host': 'host',
|
||||
'urn:mace:dir:attribute-def:houseIdentifier': 'houseIdentifier',
|
||||
'urn:mace:dir:attribute-def:info': 'info',
|
||||
'urn:mace:dir:attribute-def:initials': 'initials',
|
||||
'urn:mace:dir:attribute-def:internationaliSDNNumber': 'internationaliSDNNumber',
|
||||
'urn:mace:dir:attribute-def:janetMailbox': 'janetMailbox',
|
||||
'urn:mace:dir:attribute-def:jpegPhoto': 'jpegPhoto',
|
||||
'urn:mace:dir:attribute-def:knowledgeInformation': 'knowledgeInformation',
|
||||
'urn:mace:dir:attribute-def:l': 'l',
|
||||
'urn:mace:dir:attribute-def:labeledURI': 'labeledURI',
|
||||
'urn:mace:dir:attribute-def:localityName': 'localityName',
|
||||
'urn:mace:dir:attribute-def:mDRecord': 'mDRecord',
|
||||
'urn:mace:dir:attribute-def:mXRecord': 'mXRecord',
|
||||
'urn:mace:dir:attribute-def:mail': 'mail',
|
||||
'urn:mace:dir:attribute-def:mailPreferenceOption': 'mailPreferenceOption',
|
||||
'urn:mace:dir:attribute-def:manager': 'manager',
|
||||
'urn:mace:dir:attribute-def:member': 'member',
|
||||
'urn:mace:dir:attribute-def:mobile': 'mobile',
|
||||
'urn:mace:dir:attribute-def:mobileTelephoneNumber': 'mobileTelephoneNumber',
|
||||
'urn:mace:dir:attribute-def:nSRecord': 'nSRecord',
|
||||
'urn:mace:dir:attribute-def:name': 'name',
|
||||
'urn:mace:dir:attribute-def:norEduOrgAcronym': 'norEduOrgAcronym',
|
||||
'urn:mace:dir:attribute-def:norEduOrgNIN': 'norEduOrgNIN',
|
||||
'urn:mace:dir:attribute-def:norEduOrgSchemaVersion': 'norEduOrgSchemaVersion',
|
||||
'urn:mace:dir:attribute-def:norEduOrgUniqueIdentifier': 'norEduOrgUniqueIdentifier',
|
||||
'urn:mace:dir:attribute-def:norEduOrgUniqueNumber': 'norEduOrgUniqueNumber',
|
||||
'urn:mace:dir:attribute-def:norEduOrgUnitUniqueIdentifier': 'norEduOrgUnitUniqueIdentifier',
|
||||
'urn:mace:dir:attribute-def:norEduOrgUnitUniqueNumber': 'norEduOrgUnitUniqueNumber',
|
||||
'urn:mace:dir:attribute-def:norEduPersonBirthDate': 'norEduPersonBirthDate',
|
||||
'urn:mace:dir:attribute-def:norEduPersonLIN': 'norEduPersonLIN',
|
||||
'urn:mace:dir:attribute-def:norEduPersonNIN': 'norEduPersonNIN',
|
||||
'urn:mace:dir:attribute-def:o': 'o',
|
||||
'urn:mace:dir:attribute-def:objectClass': 'objectClass',
|
||||
'urn:mace:dir:attribute-def:organizationName': 'organizationName',
|
||||
'urn:mace:dir:attribute-def:organizationalStatus': 'organizationalStatus',
|
||||
'urn:mace:dir:attribute-def:organizationalUnitName': 'organizationalUnitName',
|
||||
'urn:mace:dir:attribute-def:otherMailbox': 'otherMailbox',
|
||||
'urn:mace:dir:attribute-def:ou': 'ou',
|
||||
'urn:mace:dir:attribute-def:owner': 'owner',
|
||||
'urn:mace:dir:attribute-def:pager': 'pager',
|
||||
'urn:mace:dir:attribute-def:pagerTelephoneNumber': 'pagerTelephoneNumber',
|
||||
'urn:mace:dir:attribute-def:personalSignature': 'personalSignature',
|
||||
'urn:mace:dir:attribute-def:personalTitle': 'personalTitle',
|
||||
'urn:mace:dir:attribute-def:photo': 'photo',
|
||||
'urn:mace:dir:attribute-def:physicalDeliveryOfficeName': 'physicalDeliveryOfficeName',
|
||||
'urn:mace:dir:attribute-def:pkcs9email': 'pkcs9email',
|
||||
'urn:mace:dir:attribute-def:postOfficeBox': 'postOfficeBox',
|
||||
'urn:mace:dir:attribute-def:postalAddress': 'postalAddress',
|
||||
'urn:mace:dir:attribute-def:postalCode': 'postalCode',
|
||||
'urn:mace:dir:attribute-def:preferredDeliveryMethod': 'preferredDeliveryMethod',
|
||||
'urn:mace:dir:attribute-def:preferredLanguage': 'preferredLanguage',
|
||||
'urn:mace:dir:attribute-def:presentationAddress': 'presentationAddress',
|
||||
'urn:mace:dir:attribute-def:protocolInformation': 'protocolInformation',
|
||||
'urn:mace:dir:attribute-def:pseudonym': 'pseudonym',
|
||||
'urn:mace:dir:attribute-def:registeredAddress': 'registeredAddress',
|
||||
'urn:mace:dir:attribute-def:rfc822Mailbox': 'rfc822Mailbox',
|
||||
'urn:mace:dir:attribute-def:roleOccupant': 'roleOccupant',
|
||||
'urn:mace:dir:attribute-def:roomNumber': 'roomNumber',
|
||||
'urn:mace:dir:attribute-def:sOARecord': 'sOARecord',
|
||||
'urn:mace:dir:attribute-def:searchGuide': 'searchGuide',
|
||||
'urn:mace:dir:attribute-def:secretary': 'secretary',
|
||||
'urn:mace:dir:attribute-def:seeAlso': 'seeAlso',
|
||||
'urn:mace:dir:attribute-def:serialNumber': 'serialNumber',
|
||||
'urn:mace:dir:attribute-def:singleLevelQuality': 'singleLevelQuality',
|
||||
'urn:mace:dir:attribute-def:sn': 'sn',
|
||||
'urn:mace:dir:attribute-def:st': 'st',
|
||||
'urn:mace:dir:attribute-def:stateOrProvinceName': 'stateOrProvinceName',
|
||||
'urn:mace:dir:attribute-def:street': 'street',
|
||||
'urn:mace:dir:attribute-def:streetAddress': 'streetAddress',
|
||||
'urn:mace:dir:attribute-def:subtreeMaximumQuality': 'subtreeMaximumQuality',
|
||||
'urn:mace:dir:attribute-def:subtreeMinimumQuality': 'subtreeMinimumQuality',
|
||||
'urn:mace:dir:attribute-def:supportedAlgorithms': 'supportedAlgorithms',
|
||||
'urn:mace:dir:attribute-def:supportedApplicationContext': 'supportedApplicationContext',
|
||||
'urn:mace:dir:attribute-def:surname': 'surname',
|
||||
'urn:mace:dir:attribute-def:telephoneNumber': 'telephoneNumber',
|
||||
'urn:mace:dir:attribute-def:teletexTerminalIdentifier': 'teletexTerminalIdentifier',
|
||||
'urn:mace:dir:attribute-def:telexNumber': 'telexNumber',
|
||||
'urn:mace:dir:attribute-def:textEncodedORAddress': 'textEncodedORAddress',
|
||||
'urn:mace:dir:attribute-def:title': 'title',
|
||||
'urn:mace:dir:attribute-def:uid': 'uid',
|
||||
'urn:mace:dir:attribute-def:uniqueIdentifier': 'uniqueIdentifier',
|
||||
'urn:mace:dir:attribute-def:uniqueMember': 'uniqueMember',
|
||||
'urn:mace:dir:attribute-def:userCertificate': 'userCertificate',
|
||||
'urn:mace:dir:attribute-def:userClass': 'userClass',
|
||||
'urn:mace:dir:attribute-def:userPKCS12': 'userPKCS12',
|
||||
'urn:mace:dir:attribute-def:userPassword': 'userPassword',
|
||||
'urn:mace:dir:attribute-def:userSMIMECertificate': 'userSMIMECertificate',
|
||||
'urn:mace:dir:attribute-def:userid': 'userid',
|
||||
'urn:mace:dir:attribute-def:x121Address': 'x121Address',
|
||||
'urn:mace:dir:attribute-def:x500UniqueIdentifier': 'x500UniqueIdentifier',
|
||||
},
|
||||
"to": {
|
||||
'aRecord': 'urn:mace:dir:attribute-def:aRecord',
|
||||
'aliasedEntryName': 'urn:mace:dir:attribute-def:aliasedEntryName',
|
||||
'aliasedObjectName': 'urn:mace:dir:attribute-def:aliasedObjectName',
|
||||
'associatedDomain': 'urn:mace:dir:attribute-def:associatedDomain',
|
||||
'associatedName': 'urn:mace:dir:attribute-def:associatedName',
|
||||
'audio': 'urn:mace:dir:attribute-def:audio',
|
||||
'authorityRevocationList': 'urn:mace:dir:attribute-def:authorityRevocationList',
|
||||
'buildingName': 'urn:mace:dir:attribute-def:buildingName',
|
||||
'businessCategory': 'urn:mace:dir:attribute-def:businessCategory',
|
||||
'c': 'urn:mace:dir:attribute-def:c',
|
||||
'cACertificate': 'urn:mace:dir:attribute-def:cACertificate',
|
||||
'cNAMERecord': 'urn:mace:dir:attribute-def:cNAMERecord',
|
||||
'carLicense': 'urn:mace:dir:attribute-def:carLicense',
|
||||
'certificateRevocationList': 'urn:mace:dir:attribute-def:certificateRevocationList',
|
||||
'cn': 'urn:mace:dir:attribute-def:cn',
|
||||
'co': 'urn:mace:dir:attribute-def:co',
|
||||
'commonName': 'urn:mace:dir:attribute-def:commonName',
|
||||
'countryName': 'urn:mace:dir:attribute-def:countryName',
|
||||
'crossCertificatePair': 'urn:mace:dir:attribute-def:crossCertificatePair',
|
||||
'dITRedirect': 'urn:mace:dir:attribute-def:dITRedirect',
|
||||
'dSAQuality': 'urn:mace:dir:attribute-def:dSAQuality',
|
||||
'dc': 'urn:mace:dir:attribute-def:dc',
|
||||
'deltaRevocationList': 'urn:mace:dir:attribute-def:deltaRevocationList',
|
||||
'departmentNumber': 'urn:mace:dir:attribute-def:departmentNumber',
|
||||
'description': 'urn:mace:dir:attribute-def:description',
|
||||
'destinationIndicator': 'urn:mace:dir:attribute-def:destinationIndicator',
|
||||
'displayName': 'urn:mace:dir:attribute-def:displayName',
|
||||
'distinguishedName': 'urn:mace:dir:attribute-def:distinguishedName',
|
||||
'dmdName': 'urn:mace:dir:attribute-def:dmdName',
|
||||
'dnQualifier': 'urn:mace:dir:attribute-def:dnQualifier',
|
||||
'documentAuthor': 'urn:mace:dir:attribute-def:documentAuthor',
|
||||
'documentIdentifier': 'urn:mace:dir:attribute-def:documentIdentifier',
|
||||
'documentLocation': 'urn:mace:dir:attribute-def:documentLocation',
|
||||
'documentPublisher': 'urn:mace:dir:attribute-def:documentPublisher',
|
||||
'documentTitle': 'urn:mace:dir:attribute-def:documentTitle',
|
||||
'documentVersion': 'urn:mace:dir:attribute-def:documentVersion',
|
||||
'domainComponent': 'urn:mace:dir:attribute-def:domainComponent',
|
||||
'drink': 'urn:mace:dir:attribute-def:drink',
|
||||
'eduOrgHomePageURI': 'urn:mace:dir:attribute-def:eduOrgHomePageURI',
|
||||
'eduOrgIdentityAuthNPolicyURI': 'urn:mace:dir:attribute-def:eduOrgIdentityAuthNPolicyURI',
|
||||
'eduOrgLegalName': 'urn:mace:dir:attribute-def:eduOrgLegalName',
|
||||
'eduOrgSuperiorURI': 'urn:mace:dir:attribute-def:eduOrgSuperiorURI',
|
||||
'eduOrgWhitePagesURI': 'urn:mace:dir:attribute-def:eduOrgWhitePagesURI',
|
||||
'eduPersonAffiliation': 'urn:mace:dir:attribute-def:eduPersonAffiliation',
|
||||
'eduPersonEntitlement': 'urn:mace:dir:attribute-def:eduPersonEntitlement',
|
||||
'eduPersonNickname': 'urn:mace:dir:attribute-def:eduPersonNickname',
|
||||
'eduPersonOrgDN': 'urn:mace:dir:attribute-def:eduPersonOrgDN',
|
||||
'eduPersonOrgUnitDN': 'urn:mace:dir:attribute-def:eduPersonOrgUnitDN',
|
||||
'eduPersonPrimaryAffiliation': 'urn:mace:dir:attribute-def:eduPersonPrimaryAffiliation',
|
||||
'eduPersonPrimaryOrgUnitDN': 'urn:mace:dir:attribute-def:eduPersonPrimaryOrgUnitDN',
|
||||
'eduPersonPrincipalName': 'urn:mace:dir:attribute-def:eduPersonPrincipalName',
|
||||
'eduPersonScopedAffiliation': 'urn:mace:dir:attribute-def:eduPersonScopedAffiliation',
|
||||
'eduPersonTargetedID': 'urn:mace:dir:attribute-def:eduPersonTargetedID',
|
||||
'email': 'urn:mace:dir:attribute-def:email',
|
||||
'emailAddress': 'urn:mace:dir:attribute-def:emailAddress',
|
||||
'employeeNumber': 'urn:mace:dir:attribute-def:employeeNumber',
|
||||
'employeeType': 'urn:mace:dir:attribute-def:employeeType',
|
||||
'enhancedSearchGuide': 'urn:mace:dir:attribute-def:enhancedSearchGuide',
|
||||
'facsimileTelephoneNumber': 'urn:mace:dir:attribute-def:facsimileTelephoneNumber',
|
||||
'favouriteDrink': 'urn:mace:dir:attribute-def:favouriteDrink',
|
||||
'fax': 'urn:mace:dir:attribute-def:fax',
|
||||
'federationFeideSchemaVersion': 'urn:mace:dir:attribute-def:federationFeideSchemaVersion',
|
||||
'friendlyCountryName': 'urn:mace:dir:attribute-def:friendlyCountryName',
|
||||
'generationQualifier': 'urn:mace:dir:attribute-def:generationQualifier',
|
||||
'givenName': 'urn:mace:dir:attribute-def:givenName',
|
||||
'gn': 'urn:mace:dir:attribute-def:gn',
|
||||
'homePhone': 'urn:mace:dir:attribute-def:homePhone',
|
||||
'homePostalAddress': 'urn:mace:dir:attribute-def:homePostalAddress',
|
||||
'homeTelephoneNumber': 'urn:mace:dir:attribute-def:homeTelephoneNumber',
|
||||
'host': 'urn:mace:dir:attribute-def:host',
|
||||
'houseIdentifier': 'urn:mace:dir:attribute-def:houseIdentifier',
|
||||
'info': 'urn:mace:dir:attribute-def:info',
|
||||
'initials': 'urn:mace:dir:attribute-def:initials',
|
||||
'internationaliSDNNumber': 'urn:mace:dir:attribute-def:internationaliSDNNumber',
|
||||
'janetMailbox': 'urn:mace:dir:attribute-def:janetMailbox',
|
||||
'jpegPhoto': 'urn:mace:dir:attribute-def:jpegPhoto',
|
||||
'knowledgeInformation': 'urn:mace:dir:attribute-def:knowledgeInformation',
|
||||
'l': 'urn:mace:dir:attribute-def:l',
|
||||
'labeledURI': 'urn:mace:dir:attribute-def:labeledURI',
|
||||
'localityName': 'urn:mace:dir:attribute-def:localityName',
|
||||
'mDRecord': 'urn:mace:dir:attribute-def:mDRecord',
|
||||
'mXRecord': 'urn:mace:dir:attribute-def:mXRecord',
|
||||
'mail': 'urn:mace:dir:attribute-def:mail',
|
||||
'mailPreferenceOption': 'urn:mace:dir:attribute-def:mailPreferenceOption',
|
||||
'manager': 'urn:mace:dir:attribute-def:manager',
|
||||
'member': 'urn:mace:dir:attribute-def:member',
|
||||
'mobile': 'urn:mace:dir:attribute-def:mobile',
|
||||
'mobileTelephoneNumber': 'urn:mace:dir:attribute-def:mobileTelephoneNumber',
|
||||
'nSRecord': 'urn:mace:dir:attribute-def:nSRecord',
|
||||
'name': 'urn:mace:dir:attribute-def:name',
|
||||
'norEduOrgAcronym': 'urn:mace:dir:attribute-def:norEduOrgAcronym',
|
||||
'norEduOrgNIN': 'urn:mace:dir:attribute-def:norEduOrgNIN',
|
||||
'norEduOrgSchemaVersion': 'urn:mace:dir:attribute-def:norEduOrgSchemaVersion',
|
||||
'norEduOrgUniqueIdentifier': 'urn:mace:dir:attribute-def:norEduOrgUniqueIdentifier',
|
||||
'norEduOrgUniqueNumber': 'urn:mace:dir:attribute-def:norEduOrgUniqueNumber',
|
||||
'norEduOrgUnitUniqueIdentifier': 'urn:mace:dir:attribute-def:norEduOrgUnitUniqueIdentifier',
|
||||
'norEduOrgUnitUniqueNumber': 'urn:mace:dir:attribute-def:norEduOrgUnitUniqueNumber',
|
||||
'norEduPersonBirthDate': 'urn:mace:dir:attribute-def:norEduPersonBirthDate',
|
||||
'norEduPersonLIN': 'urn:mace:dir:attribute-def:norEduPersonLIN',
|
||||
'norEduPersonNIN': 'urn:mace:dir:attribute-def:norEduPersonNIN',
|
||||
'o': 'urn:mace:dir:attribute-def:o',
|
||||
'objectClass': 'urn:mace:dir:attribute-def:objectClass',
|
||||
'organizationName': 'urn:mace:dir:attribute-def:organizationName',
|
||||
'organizationalStatus': 'urn:mace:dir:attribute-def:organizationalStatus',
|
||||
'organizationalUnitName': 'urn:mace:dir:attribute-def:organizationalUnitName',
|
||||
'otherMailbox': 'urn:mace:dir:attribute-def:otherMailbox',
|
||||
'ou': 'urn:mace:dir:attribute-def:ou',
|
||||
'owner': 'urn:mace:dir:attribute-def:owner',
|
||||
'pager': 'urn:mace:dir:attribute-def:pager',
|
||||
'pagerTelephoneNumber': 'urn:mace:dir:attribute-def:pagerTelephoneNumber',
|
||||
'personalSignature': 'urn:mace:dir:attribute-def:personalSignature',
|
||||
'personalTitle': 'urn:mace:dir:attribute-def:personalTitle',
|
||||
'photo': 'urn:mace:dir:attribute-def:photo',
|
||||
'physicalDeliveryOfficeName': 'urn:mace:dir:attribute-def:physicalDeliveryOfficeName',
|
||||
'pkcs9email': 'urn:mace:dir:attribute-def:pkcs9email',
|
||||
'postOfficeBox': 'urn:mace:dir:attribute-def:postOfficeBox',
|
||||
'postalAddress': 'urn:mace:dir:attribute-def:postalAddress',
|
||||
'postalCode': 'urn:mace:dir:attribute-def:postalCode',
|
||||
'preferredDeliveryMethod': 'urn:mace:dir:attribute-def:preferredDeliveryMethod',
|
||||
'preferredLanguage': 'urn:mace:dir:attribute-def:preferredLanguage',
|
||||
'presentationAddress': 'urn:mace:dir:attribute-def:presentationAddress',
|
||||
'protocolInformation': 'urn:mace:dir:attribute-def:protocolInformation',
|
||||
'pseudonym': 'urn:mace:dir:attribute-def:pseudonym',
|
||||
'registeredAddress': 'urn:mace:dir:attribute-def:registeredAddress',
|
||||
'rfc822Mailbox': 'urn:mace:dir:attribute-def:rfc822Mailbox',
|
||||
'roleOccupant': 'urn:mace:dir:attribute-def:roleOccupant',
|
||||
'roomNumber': 'urn:mace:dir:attribute-def:roomNumber',
|
||||
'sOARecord': 'urn:mace:dir:attribute-def:sOARecord',
|
||||
'searchGuide': 'urn:mace:dir:attribute-def:searchGuide',
|
||||
'secretary': 'urn:mace:dir:attribute-def:secretary',
|
||||
'seeAlso': 'urn:mace:dir:attribute-def:seeAlso',
|
||||
'serialNumber': 'urn:mace:dir:attribute-def:serialNumber',
|
||||
'singleLevelQuality': 'urn:mace:dir:attribute-def:singleLevelQuality',
|
||||
'sn': 'urn:mace:dir:attribute-def:sn',
|
||||
'st': 'urn:mace:dir:attribute-def:st',
|
||||
'stateOrProvinceName': 'urn:mace:dir:attribute-def:stateOrProvinceName',
|
||||
'street': 'urn:mace:dir:attribute-def:street',
|
||||
'streetAddress': 'urn:mace:dir:attribute-def:streetAddress',
|
||||
'subtreeMaximumQuality': 'urn:mace:dir:attribute-def:subtreeMaximumQuality',
|
||||
'subtreeMinimumQuality': 'urn:mace:dir:attribute-def:subtreeMinimumQuality',
|
||||
'supportedAlgorithms': 'urn:mace:dir:attribute-def:supportedAlgorithms',
|
||||
'supportedApplicationContext': 'urn:mace:dir:attribute-def:supportedApplicationContext',
|
||||
'surname': 'urn:mace:dir:attribute-def:surname',
|
||||
'telephoneNumber': 'urn:mace:dir:attribute-def:telephoneNumber',
|
||||
'teletexTerminalIdentifier': 'urn:mace:dir:attribute-def:teletexTerminalIdentifier',
|
||||
'telexNumber': 'urn:mace:dir:attribute-def:telexNumber',
|
||||
'textEncodedORAddress': 'urn:mace:dir:attribute-def:textEncodedORAddress',
|
||||
'title': 'urn:mace:dir:attribute-def:title',
|
||||
'uid': 'urn:mace:dir:attribute-def:uid',
|
||||
'uniqueIdentifier': 'urn:mace:dir:attribute-def:uniqueIdentifier',
|
||||
'uniqueMember': 'urn:mace:dir:attribute-def:uniqueMember',
|
||||
'userCertificate': 'urn:mace:dir:attribute-def:userCertificate',
|
||||
'userClass': 'urn:mace:dir:attribute-def:userClass',
|
||||
'userPKCS12': 'urn:mace:dir:attribute-def:userPKCS12',
|
||||
'userPassword': 'urn:mace:dir:attribute-def:userPassword',
|
||||
'userSMIMECertificate': 'urn:mace:dir:attribute-def:userSMIMECertificate',
|
||||
'userid': 'urn:mace:dir:attribute-def:userid',
|
||||
'x121Address': 'urn:mace:dir:attribute-def:x121Address',
|
||||
'x500UniqueIdentifier': 'urn:mace:dir:attribute-def:x500UniqueIdentifier',
|
||||
}
|
||||
}
|
||||
199
src/saml2/attributemaps/saml_uri.py
Normal file
@@ -0,0 +1,199 @@
|
||||
__author__ = 'rolandh'
|
||||
|
||||
EDUPERSON_OID = "urn:oid:1.3.6.1.4.1.5923.1.1.1."
|
||||
X500ATTR_OID = "urn:oid:2.5.4."
|
||||
NOREDUPERSON_OID = "urn:oid:1.3.6.1.4.1.2428.90.1."
|
||||
NETSCAPE_LDAP = "urn:oid:2.16.840.1.113730.3.1."
|
||||
UCL_DIR_PILOT = 'urn:oid:0.9.2342.19200300.100.1.'
|
||||
PKCS_9 = "urn:oid:1.2.840.113549.1.9.1."
|
||||
UMICH = "urn:oid:1.3.6.1.4.1.250.1.57."
|
||||
|
||||
MAP = {
|
||||
"identifier": "urn:oasis:names:tc:SAML:2.0:attrname-format:uri",
|
||||
"fro": {
|
||||
EDUPERSON_OID+'2': 'eduPersonNickname',
|
||||
EDUPERSON_OID+'9': 'eduPersonScopedAffiliation',
|
||||
EDUPERSON_OID+'11': 'eduPersonAssurance',
|
||||
EDUPERSON_OID+'10': 'eduPersonTargetedID',
|
||||
EDUPERSON_OID+'4': 'eduPersonOrgUnitDN',
|
||||
NOREDUPERSON_OID+'6': 'norEduOrgAcronym',
|
||||
NOREDUPERSON_OID+'7': 'norEduOrgUniqueIdentifier',
|
||||
NOREDUPERSON_OID+'4': 'norEduPersonLIN',
|
||||
EDUPERSON_OID+'1': 'eduPersonAffiliation',
|
||||
NOREDUPERSON_OID+'2': 'norEduOrgUnitUniqueNumber',
|
||||
NETSCAPE_LDAP+'40': 'userSMIMECertificate',
|
||||
NOREDUPERSON_OID+'1': 'norEduOrgUniqueNumber',
|
||||
NETSCAPE_LDAP+'241': 'displayName',
|
||||
UCL_DIR_PILOT+'37': 'associatedDomain',
|
||||
EDUPERSON_OID+'6': 'eduPersonPrincipalName',
|
||||
NOREDUPERSON_OID+'8': 'norEduOrgUnitUniqueIdentifier',
|
||||
NOREDUPERSON_OID+'9': 'federationFeideSchemaVersion',
|
||||
X500ATTR_OID+'53': 'deltaRevocationList',
|
||||
X500ATTR_OID+'52': 'supportedAlgorithms',
|
||||
X500ATTR_OID+'51': 'houseIdentifier',
|
||||
X500ATTR_OID+'50': 'uniqueMember',
|
||||
X500ATTR_OID+'19': 'physicalDeliveryOfficeName',
|
||||
X500ATTR_OID+'18': 'postOfficeBox',
|
||||
X500ATTR_OID+'17': 'postalCode',
|
||||
X500ATTR_OID+'16': 'postalAddress',
|
||||
X500ATTR_OID+'15': 'businessCategory',
|
||||
X500ATTR_OID+'14': 'searchGuide',
|
||||
EDUPERSON_OID+'5': 'eduPersonPrimaryAffiliation',
|
||||
X500ATTR_OID+'12': 'title',
|
||||
X500ATTR_OID+'11': 'ou',
|
||||
X500ATTR_OID+'10': 'o',
|
||||
X500ATTR_OID+'37': 'cACertificate',
|
||||
X500ATTR_OID+'36': 'userCertificate',
|
||||
X500ATTR_OID+'31': 'member',
|
||||
X500ATTR_OID+'30': 'supportedApplicationContext',
|
||||
X500ATTR_OID+'33': 'roleOccupant',
|
||||
X500ATTR_OID+'32': 'owner',
|
||||
NETSCAPE_LDAP+'1': 'carLicense',
|
||||
PKCS_9+'1': 'email',
|
||||
NETSCAPE_LDAP+'3': 'employeeNumber',
|
||||
NETSCAPE_LDAP+'2': 'departmentNumber',
|
||||
X500ATTR_OID+'39': 'certificateRevocationList',
|
||||
X500ATTR_OID+'38': 'authorityRevocationList',
|
||||
NETSCAPE_LDAP+'216': 'userPKCS12',
|
||||
EDUPERSON_OID+'8': 'eduPersonPrimaryOrgUnitDN',
|
||||
X500ATTR_OID+'9': 'street',
|
||||
X500ATTR_OID+'8': 'st',
|
||||
NETSCAPE_LDAP+'39': 'preferredLanguage',
|
||||
EDUPERSON_OID+'7': 'eduPersonEntitlement',
|
||||
X500ATTR_OID+'2': 'knowledgeInformation',
|
||||
X500ATTR_OID+'7': 'l',
|
||||
X500ATTR_OID+'6': 'c',
|
||||
X500ATTR_OID+'5': 'serialNumber',
|
||||
X500ATTR_OID+'4': 'sn',
|
||||
UCL_DIR_PILOT+'60': 'jpegPhoto',
|
||||
X500ATTR_OID+'65': 'pseudonym',
|
||||
NOREDUPERSON_OID+'5': 'norEduPersonNIN',
|
||||
UCL_DIR_PILOT+'3': 'mail',
|
||||
UCL_DIR_PILOT+'25': 'dc',
|
||||
X500ATTR_OID+'40': 'crossCertificatePair',
|
||||
X500ATTR_OID+'42': 'givenName',
|
||||
X500ATTR_OID+'43': 'initials',
|
||||
X500ATTR_OID+'44': 'generationQualifier',
|
||||
X500ATTR_OID+'45': 'x500UniqueIdentifier',
|
||||
X500ATTR_OID+'46': 'dnQualifier',
|
||||
X500ATTR_OID+'47': 'enhancedSearchGuide',
|
||||
X500ATTR_OID+'48': 'protocolInformation',
|
||||
X500ATTR_OID+'54': 'dmdName',
|
||||
NETSCAPE_LDAP+'4': 'employeeType',
|
||||
X500ATTR_OID+'22': 'teletexTerminalIdentifier',
|
||||
X500ATTR_OID+'23': 'facsimileTelephoneNumber',
|
||||
X500ATTR_OID+'20': 'telephoneNumber',
|
||||
X500ATTR_OID+'21': 'telexNumber',
|
||||
X500ATTR_OID+'26': 'registeredAddress',
|
||||
X500ATTR_OID+'27': 'destinationIndicator',
|
||||
X500ATTR_OID+'24': 'x121Address',
|
||||
X500ATTR_OID+'25': 'internationaliSDNNumber',
|
||||
X500ATTR_OID+'28': 'preferredDeliveryMethod',
|
||||
X500ATTR_OID+'29': 'presentationAddress',
|
||||
EDUPERSON_OID+'3': 'eduPersonOrgDN',
|
||||
NOREDUPERSON_OID+'3': 'norEduPersonBirthDate',
|
||||
UMICH+'57': 'labeledURI',
|
||||
UCL_DIR_PILOT+'1': 'uid',
|
||||
},
|
||||
"to": {
|
||||
'roleOccupant': X500ATTR_OID+'33',
|
||||
'gn': X500ATTR_OID+'42',
|
||||
'norEduPersonNIN': NOREDUPERSON_OID+'5',
|
||||
'title': X500ATTR_OID+'12',
|
||||
'facsimileTelephoneNumber': X500ATTR_OID+'23',
|
||||
'mail': UCL_DIR_PILOT+'3',
|
||||
'postOfficeBox': X500ATTR_OID+'18',
|
||||
'fax': X500ATTR_OID+'23',
|
||||
'telephoneNumber': X500ATTR_OID+'20',
|
||||
'norEduPersonBirthDate': NOREDUPERSON_OID+'3',
|
||||
'rfc822Mailbox': UCL_DIR_PILOT+'3',
|
||||
'dc': UCL_DIR_PILOT+'25',
|
||||
'countryName': X500ATTR_OID+'6',
|
||||
'emailAddress': PKCS_9+'1',
|
||||
'employeeNumber': NETSCAPE_LDAP+'3',
|
||||
'organizationName': X500ATTR_OID+'10',
|
||||
'eduPersonAssurance': EDUPERSON_OID+'11',
|
||||
'norEduOrgAcronym': NOREDUPERSON_OID+'6',
|
||||
'registeredAddress': X500ATTR_OID+'26',
|
||||
'physicalDeliveryOfficeName': X500ATTR_OID+'19',
|
||||
'associatedDomain': UCL_DIR_PILOT+'37',
|
||||
'l': X500ATTR_OID+'7',
|
||||
'stateOrProvinceName': X500ATTR_OID+'8',
|
||||
'federationFeideSchemaVersion': NOREDUPERSON_OID+'9',
|
||||
'pkcs9email': PKCS_9+'1',
|
||||
'givenName': X500ATTR_OID+'42',
|
||||
'givenname': X500ATTR_OID+'42',
|
||||
'x500UniqueIdentifier': X500ATTR_OID+'45',
|
||||
'eduPersonNickname': EDUPERSON_OID+'2',
|
||||
'houseIdentifier': X500ATTR_OID+'51',
|
||||
'street': X500ATTR_OID+'9',
|
||||
'supportedAlgorithms': X500ATTR_OID+'52',
|
||||
'preferredLanguage': NETSCAPE_LDAP+'39',
|
||||
'postalAddress': X500ATTR_OID+'16',
|
||||
'email': PKCS_9+'1',
|
||||
'norEduOrgUnitUniqueIdentifier': NOREDUPERSON_OID+'8',
|
||||
'eduPersonPrimaryOrgUnitDN': EDUPERSON_OID+'8',
|
||||
'c': X500ATTR_OID+'6',
|
||||
'teletexTerminalIdentifier': X500ATTR_OID+'22',
|
||||
'o': X500ATTR_OID+'10',
|
||||
'cACertificate': X500ATTR_OID+'37',
|
||||
'telexNumber': X500ATTR_OID+'21',
|
||||
'ou': X500ATTR_OID+'11',
|
||||
'initials': X500ATTR_OID+'43',
|
||||
'eduPersonOrgUnitDN': EDUPERSON_OID+'4',
|
||||
'deltaRevocationList': X500ATTR_OID+'53',
|
||||
'norEduPersonLIN': NOREDUPERSON_OID+'4',
|
||||
'supportedApplicationContext': X500ATTR_OID+'30',
|
||||
'eduPersonEntitlement': EDUPERSON_OID+'7',
|
||||
'generationQualifier': X500ATTR_OID+'44',
|
||||
'eduPersonAffiliation': EDUPERSON_OID+'1',
|
||||
'eduPersonPrincipalName': EDUPERSON_OID+'6',
|
||||
'edupersonprincipalname': EDUPERSON_OID+'6',
|
||||
'localityName': X500ATTR_OID+'7',
|
||||
'owner': X500ATTR_OID+'32',
|
||||
'norEduOrgUnitUniqueNumber': NOREDUPERSON_OID+'2',
|
||||
'searchGuide': X500ATTR_OID+'14',
|
||||
'certificateRevocationList': X500ATTR_OID+'39',
|
||||
'organizationalUnitName': X500ATTR_OID+'11',
|
||||
'userCertificate': X500ATTR_OID+'36',
|
||||
'preferredDeliveryMethod': X500ATTR_OID+'28',
|
||||
'internationaliSDNNumber': X500ATTR_OID+'25',
|
||||
'uniqueMember': X500ATTR_OID+'50',
|
||||
'departmentNumber': NETSCAPE_LDAP+'2',
|
||||
'enhancedSearchGuide': X500ATTR_OID+'47',
|
||||
'userPKCS12': NETSCAPE_LDAP+'216',
|
||||
'eduPersonTargetedID': EDUPERSON_OID+'10',
|
||||
'norEduOrgUniqueNumber': NOREDUPERSON_OID+'1',
|
||||
'x121Address': X500ATTR_OID+'24',
|
||||
'destinationIndicator': X500ATTR_OID+'27',
|
||||
'eduPersonPrimaryAffiliation': EDUPERSON_OID+'5',
|
||||
'surname': X500ATTR_OID+'4',
|
||||
'jpegPhoto': UCL_DIR_PILOT+'60',
|
||||
'eduPersonScopedAffiliation': EDUPERSON_OID+'9',
|
||||
'edupersonscopedaffiliation': EDUPERSON_OID+'9',
|
||||
'protocolInformation': X500ATTR_OID+'48',
|
||||
'knowledgeInformation': X500ATTR_OID+'2',
|
||||
'employeeType': NETSCAPE_LDAP+'4',
|
||||
'userSMIMECertificate': NETSCAPE_LDAP+'40',
|
||||
'member': X500ATTR_OID+'31',
|
||||
'streetAddress': X500ATTR_OID+'9',
|
||||
'dmdName': X500ATTR_OID+'54',
|
||||
'postalCode': X500ATTR_OID+'17',
|
||||
'pseudonym': X500ATTR_OID+'65',
|
||||
'dnQualifier': X500ATTR_OID+'46',
|
||||
'crossCertificatePair': X500ATTR_OID+'40',
|
||||
'eduPersonOrgDN': EDUPERSON_OID+'3',
|
||||
'authorityRevocationList': X500ATTR_OID+'38',
|
||||
'displayName': NETSCAPE_LDAP+'241',
|
||||
'businessCategory': X500ATTR_OID+'15',
|
||||
'serialNumber': X500ATTR_OID+'5',
|
||||
'norEduOrgUniqueIdentifier': NOREDUPERSON_OID+'7',
|
||||
'st': X500ATTR_OID+'8',
|
||||
'carLicense': NETSCAPE_LDAP+'1',
|
||||
'presentationAddress': X500ATTR_OID+'29',
|
||||
'sn': X500ATTR_OID+'4',
|
||||
'domainComponent': UCL_DIR_PILOT+'25',
|
||||
'labeledURI': UMICH+'57',
|
||||
'uid': UCL_DIR_PILOT+'1'
|
||||
}
|
||||
}
|
||||
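The "fro" and "to" tables above are plain dictionaries, so translating between OID-style SAML attribute names and friendly names is a straight lookup in either direction. A minimal sketch of that use, assuming the package is importable; the helper names and the fall-back-to-input behaviour are illustrative, not part of this commit:

    from saml2.attributemaps.saml_uri import MAP

    def to_friendly(name):
        # OID-style SAML attribute name -> friendly name (or the input itself)
        return MAP["fro"].get(name, name)

    def to_saml(name):
        # friendly name -> OID-style SAML attribute name
        return MAP["to"].get(name, name)

    assert to_friendly("urn:oid:2.5.4.42") == "givenName"
    assert to_saml("givenName") == "urn:oid:2.5.4.42"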
190
src/saml2/attributemaps/shibboleth_uri.py
Normal file
@@ -0,0 +1,190 @@
|
||||
EDUPERSON_OID = "urn:oid:1.3.6.1.4.1.5923.1.1.1."
|
||||
X500ATTR = "urn:oid:2.5.4."
|
||||
NOREDUPERSON_OID = "urn:oid:1.3.6.1.4.1.2428.90.1."
|
||||
NETSCAPE_LDAP = "urn:oid:2.16.840.1.113730.3.1."
|
||||
UCL_DIR_PILOT = "urn:oid:0.9.2342.19200300.100.1."
|
||||
PKCS_9 = "urn:oid:1.2.840.113549.1.9."
|
||||
UMICH = "urn:oid:1.3.6.1.4.1.250.1.57."
|
||||
|
||||
MAP = {
|
||||
"identifier": "urn:mace:shibboleth:1.0:attributeNamespace:uri",
|
||||
"fro": {
|
||||
EDUPERSON_OID+'2': 'eduPersonNickname',
|
||||
EDUPERSON_OID+'9': 'eduPersonScopedAffiliation',
|
||||
EDUPERSON_OID+'11': 'eduPersonAssurance',
|
||||
EDUPERSON_OID+'10': 'eduPersonTargetedID',
|
||||
EDUPERSON_OID+'4': 'eduPersonOrgUnitDN',
|
||||
NOREDUPERSON_OID+'6': 'norEduOrgAcronym',
|
||||
NOREDUPERSON_OID+'7': 'norEduOrgUniqueIdentifier',
|
||||
NOREDUPERSON_OID+'4': 'norEduPersonLIN',
|
||||
EDUPERSON_OID+'1': 'eduPersonAffiliation',
|
||||
NOREDUPERSON_OID+'2': 'norEduOrgUnitUniqueNumber',
|
||||
NETSCAPE_LDAP+'40': 'userSMIMECertificate',
|
||||
NOREDUPERSON_OID+'1': 'norEduOrgUniqueNumber',
|
||||
NETSCAPE_LDAP+'241': 'displayName',
|
||||
UCL_DIR_PILOT+'37': 'associatedDomain',
|
||||
EDUPERSON_OID+'6': 'eduPersonPrincipalName',
|
||||
NOREDUPERSON_OID+'8': 'norEduOrgUnitUniqueIdentifier',
|
||||
NOREDUPERSON_OID+'9': 'federationFeideSchemaVersion',
|
||||
X500ATTR+'53': 'deltaRevocationList',
|
||||
X500ATTR+'52': 'supportedAlgorithms',
|
||||
X500ATTR+'51': 'houseIdentifier',
|
||||
X500ATTR+'50': 'uniqueMember',
|
||||
X500ATTR+'19': 'physicalDeliveryOfficeName',
|
||||
X500ATTR+'18': 'postOfficeBox',
|
||||
X500ATTR+'17': 'postalCode',
|
||||
X500ATTR+'16': 'postalAddress',
|
||||
X500ATTR+'15': 'businessCategory',
|
||||
X500ATTR+'14': 'searchGuide',
|
||||
EDUPERSON_OID+'5': 'eduPersonPrimaryAffiliation',
|
||||
X500ATTR+'12': 'title',
|
||||
X500ATTR+'11': 'ou',
|
||||
X500ATTR+'10': 'o',
|
||||
X500ATTR+'37': 'cACertificate',
|
||||
X500ATTR+'36': 'userCertificate',
|
||||
X500ATTR+'31': 'member',
|
||||
X500ATTR+'30': 'supportedApplicationContext',
|
||||
X500ATTR+'33': 'roleOccupant',
|
||||
X500ATTR+'32': 'owner',
|
||||
NETSCAPE_LDAP+'1': 'carLicense',
|
||||
PKCS_9+'1': 'email',
|
||||
NETSCAPE_LDAP+'3': 'employeeNumber',
|
||||
NETSCAPE_LDAP+'2': 'departmentNumber',
|
||||
X500ATTR+'39': 'certificateRevocationList',
|
||||
X500ATTR+'38': 'authorityRevocationList',
|
||||
NETSCAPE_LDAP+'216': 'userPKCS12',
|
||||
EDUPERSON_OID+'8': 'eduPersonPrimaryOrgUnitDN',
|
||||
X500ATTR+'9': 'street',
|
||||
X500ATTR+'8': 'st',
|
||||
NETSCAPE_LDAP+'39': 'preferredLanguage',
|
||||
EDUPERSON_OID+'7': 'eduPersonEntitlement',
|
||||
X500ATTR+'2': 'knowledgeInformation',
|
||||
X500ATTR+'7': 'l',
|
||||
X500ATTR+'6': 'c',
|
||||
X500ATTR+'5': 'serialNumber',
|
||||
X500ATTR+'4': 'sn',
|
||||
UCL_DIR_PILOT+'60': 'jpegPhoto',
|
||||
X500ATTR+'65': 'pseudonym',
|
||||
NOREDUPERSON_OID+'5': 'norEduPersonNIN',
|
||||
UCL_DIR_PILOT+'3': 'mail',
|
||||
UCL_DIR_PILOT+'25': 'dc',
|
||||
X500ATTR+'40': 'crossCertificatePair',
|
||||
X500ATTR+'42': 'givenName',
|
||||
X500ATTR+'43': 'initials',
|
||||
X500ATTR+'44': 'generationQualifier',
|
||||
X500ATTR+'45': 'x500UniqueIdentifier',
|
||||
X500ATTR+'46': 'dnQualifier',
|
||||
X500ATTR+'47': 'enhancedSearchGuide',
|
||||
X500ATTR+'48': 'protocolInformation',
|
||||
X500ATTR+'54': 'dmdName',
|
||||
NETSCAPE_LDAP+'4': 'employeeType',
|
||||
X500ATTR+'22': 'teletexTerminalIdentifier',
|
||||
X500ATTR+'23': 'facsimileTelephoneNumber',
|
||||
X500ATTR+'20': 'telephoneNumber',
|
||||
X500ATTR+'21': 'telexNumber',
|
||||
X500ATTR+'26': 'registeredAddress',
|
||||
X500ATTR+'27': 'destinationIndicator',
|
||||
X500ATTR+'24': 'x121Address',
|
||||
X500ATTR+'25': 'internationaliSDNNumber',
|
||||
X500ATTR+'28': 'preferredDeliveryMethod',
|
||||
X500ATTR+'29': 'presentationAddress',
|
||||
EDUPERSON_OID+'3': 'eduPersonOrgDN',
|
||||
NOREDUPERSON_OID+'3': 'norEduPersonBirthDate',
|
||||
},
|
||||
"to":{
|
||||
'roleOccupant': X500ATTR+'33',
|
||||
'gn': X500ATTR+'42',
|
||||
'norEduPersonNIN': NOREDUPERSON_OID+'5',
|
||||
'title': X500ATTR+'12',
|
||||
'facsimileTelephoneNumber': X500ATTR+'23',
|
||||
'mail': UCL_DIR_PILOT+'3',
|
||||
'postOfficeBox': X500ATTR+'18',
|
||||
'fax': X500ATTR+'23',
|
||||
'telephoneNumber': X500ATTR+'20',
|
||||
'norEduPersonBirthDate': NOREDUPERSON_OID+'3',
|
||||
'rfc822Mailbox': UCL_DIR_PILOT+'3',
|
||||
'dc': UCL_DIR_PILOT+'25',
|
||||
'countryName': X500ATTR+'6',
|
||||
'emailAddress': PKCS_9+'1',
|
||||
'employeeNumber': NETSCAPE_LDAP+'3',
|
||||
'organizationName': X500ATTR+'10',
|
||||
'eduPersonAssurance': EDUPERSON_OID+'11',
|
||||
'norEduOrgAcronym': NOREDUPERSON_OID+'6',
|
||||
'registeredAddress': X500ATTR+'26',
|
||||
'physicalDeliveryOfficeName': X500ATTR+'19',
|
||||
'associatedDomain': UCL_DIR_PILOT+'37',
|
||||
'l': X500ATTR+'7',
|
||||
'stateOrProvinceName': X500ATTR+'8',
|
||||
'federationFeideSchemaVersion': NOREDUPERSON_OID+'9',
|
||||
'pkcs9email': PKCS_9+'1',
|
||||
'givenName': X500ATTR+'42',
|
||||
'x500UniqueIdentifier': X500ATTR+'45',
|
||||
'eduPersonNickname': EDUPERSON_OID+'2',
|
||||
'houseIdentifier': X500ATTR+'51',
|
||||
'street': X500ATTR+'9',
|
||||
'supportedAlgorithms': X500ATTR+'52',
|
||||
'preferredLanguage': NETSCAPE_LDAP+'39',
|
||||
'postalAddress': X500ATTR+'16',
|
||||
'email': PKCS_9+'1',
|
||||
'norEduOrgUnitUniqueIdentifier': NOREDUPERSON_OID+'8',
|
||||
'eduPersonPrimaryOrgUnitDN': EDUPERSON_OID+'8',
|
||||
'c': X500ATTR+'6',
|
||||
'teletexTerminalIdentifier': X500ATTR+'22',
|
||||
'o': X500ATTR+'10',
|
||||
'cACertificate': X500ATTR+'37',
|
||||
'telexNumber': X500ATTR+'21',
|
||||
'ou': X500ATTR+'11',
|
||||
'initials': X500ATTR+'43',
|
||||
'eduPersonOrgUnitDN': EDUPERSON_OID+'4',
|
||||
'deltaRevocationList': X500ATTR+'53',
|
||||
'norEduPersonLIN': NOREDUPERSON_OID+'4',
|
||||
'supportedApplicationContext': X500ATTR+'30',
|
||||
'eduPersonEntitlement': EDUPERSON_OID+'7',
|
||||
'generationQualifier': X500ATTR+'44',
|
||||
'eduPersonAffiliation': EDUPERSON_OID+'1',
|
||||
'eduPersonPrincipalName': EDUPERSON_OID+'6',
|
||||
'localityName': X500ATTR+'7',
|
||||
'owner': X500ATTR+'32',
|
||||
'norEduOrgUnitUniqueNumber': NOREDUPERSON_OID+'2',
|
||||
'searchGuide': X500ATTR+'14',
|
||||
'certificateRevocationList': X500ATTR+'39',
|
||||
'organizationalUnitName': X500ATTR+'11',
|
||||
'userCertificate': X500ATTR+'36',
|
||||
'preferredDeliveryMethod': X500ATTR+'28',
|
||||
'internationaliSDNNumber': X500ATTR+'25',
|
||||
'uniqueMember': X500ATTR+'50',
|
||||
'departmentNumber': NETSCAPE_LDAP+'2',
|
||||
'enhancedSearchGuide': X500ATTR+'47',
|
||||
'userPKCS12': NETSCAPE_LDAP+'216',
|
||||
'eduPersonTargetedID': EDUPERSON_OID+'10',
|
||||
'norEduOrgUniqueNumber': NOREDUPERSON_OID+'1',
|
||||
'x121Address': X500ATTR+'24',
|
||||
'destinationIndicator': X500ATTR+'27',
|
||||
'eduPersonPrimaryAffiliation': EDUPERSON_OID+'5',
|
||||
'surname': X500ATTR+'4',
|
||||
'jpegPhoto': UCL_DIR_PILOT+'60',
|
||||
'eduPersonScopedAffiliation': EDUPERSON_OID+'9',
|
||||
'protocolInformation': X500ATTR+'48',
|
||||
'knowledgeInformation': X500ATTR+'2',
|
||||
'employeeType': NETSCAPE_LDAP+'4',
|
||||
'userSMIMECertificate': NETSCAPE_LDAP+'40',
|
||||
'member': X500ATTR+'31',
|
||||
'streetAddress': X500ATTR+'9',
|
||||
'dmdName': X500ATTR+'54',
|
||||
'postalCode': X500ATTR+'17',
|
||||
'pseudonym': X500ATTR+'65',
|
||||
'dnQualifier': X500ATTR+'46',
|
||||
'crossCertificatePair': X500ATTR+'40',
|
||||
'eduPersonOrgDN': EDUPERSON_OID+'3',
|
||||
'authorityRevocationList': X500ATTR+'38',
|
||||
'displayName': NETSCAPE_LDAP+'241',
|
||||
'businessCategory': X500ATTR+'15',
|
||||
'serialNumber': X500ATTR+'5',
|
||||
'norEduOrgUniqueIdentifier': NOREDUPERSON_OID+'7',
|
||||
'st': X500ATTR+'8',
|
||||
'carLicense': NETSCAPE_LDAP+'1',
|
||||
'presentationAddress': X500ATTR+'29',
|
||||
'sn': X500ATTR+'4',
|
||||
'domainComponent': UCL_DIR_PILOT+'25',
|
||||
}
|
||||
}
|
||||
252
src/saml2/binding.py
Normal file
@@ -0,0 +1,252 @@
|
||||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# Copyright (C) 2010-2011 Umeå University
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
"""Contains classes and functions that are necessary to implement
|
||||
different bindings.
|
||||
|
||||
Bindings normally consist of three parts:
|
||||
- rules about what to send
|
||||
- how to package the information
|
||||
- which protocol to use
|
||||
"""
|
||||
import saml2
|
||||
import base64
|
||||
import urllib
|
||||
from saml2.s_utils import deflate_and_base64_encode
|
||||
from saml2.soap import SOAPClient, HTTPClient
|
||||
|
||||
try:
|
||||
from xml.etree import cElementTree as ElementTree
|
||||
except ImportError:
|
||||
try:
|
||||
import cElementTree as ElementTree
|
||||
except ImportError:
|
||||
from elementtree import ElementTree
|
||||
|
||||
NAMESPACE = "http://schemas.xmlsoap.org/soap/envelope/"
|
||||
FORM_SPEC = """<form method="post" action="%s">
|
||||
<input type="hidden" name="%s" value="%s" />
|
||||
<input type="hidden" name="RelayState" value="%s" />
|
||||
<input type="submit" value="Submit" />
|
||||
</form>"""
|
||||
|
||||
def http_post_message(message, location, relay_state="", typ="SAMLRequest"):
|
||||
"""The HTTP POST binding defines a mechanism by which SAML protocol
|
||||
messages may be transmitted within the base64-encoded content of an
|
||||
HTML form control.
|
||||
|
||||
:param message: The message
|
||||
:param location: Where the form should be posted to
|
||||
:param relay_state: for preserving and conveying state information
|
||||
:return: A tuple containing header information and a HTML message.
|
||||
"""
|
||||
response = ["<head>", """<title>SAML 2.0 POST</title>""", "</head><body>"]
|
||||
|
||||
if not isinstance(message, basestring):
|
||||
message = "%s" % (message,)
|
||||
|
||||
response.append(FORM_SPEC % (location, typ, base64.b64encode(message),
|
||||
relay_state))
|
||||
|
||||
response.append("""<script type="text/javascript">""")
|
||||
response.append(" window.onload = function ()")
|
||||
response.append(" { document.forms[0].submit(); ")
|
||||
response.append("""</script>""")
|
||||
response.append("</body>")
|
||||
|
||||
return [("Content-type", "text/html")], response
|
||||
|
||||
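A hedged usage sketch of http_post_message, not part of the commit itself: the returned header list plus HTML fragments can be handed to any web framework's response. The destination URL, message string and relay state below are made up.

    from saml2.binding import http_post_message

    headers, body = http_post_message("<samlp:AuthnRequest ID='abc'/>",
                                      "https://idp.example.org/sso/post",
                                      relay_state="/protected")
    # headers == [('Content-type', 'text/html')]
    # body is a list of HTML fragments containing a self-submitting POST form
    print "".join(body)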
def http_redirect_message(message, location, relay_state="",
|
||||
typ="SAMLRequest"):
|
||||
"""The HTTP Redirect binding defines a mechanism by which SAML protocol
|
||||
messages can be transmitted within URL parameters.
|
||||
Messages are encoded for use with this binding using a URL encoding
|
||||
technique, and transmitted using the HTTP GET method.
|
||||
|
||||
The DEFLATE Encoding is used in this function.
|
||||
|
||||
:param message: The message
|
||||
:param location: Where the message should be posted to
|
||||
:param relay_state: for preserving and conveying state information
|
||||
:return: A tuple containing header information and a HTML message.
|
||||
"""
|
||||
|
||||
if not isinstance(message, basestring):
|
||||
message = "%s" % (message,)
|
||||
|
||||
args = {typ: deflate_and_base64_encode(message)}
|
||||
|
||||
if relay_state:
|
||||
args["RelayState"] = relay_state
|
||||
|
||||
login_url = "?".join([location, urllib.urlencode(args)])
|
||||
headers = [('Location', login_url)]
|
||||
body = [""]
|
||||
|
||||
return headers, body
|
||||
|
||||
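For the redirect binding the interesting part is the Location header; a sketch under the same assumptions as above (made-up message and URL):

    from saml2.binding import http_redirect_message

    headers, body = http_redirect_message("<samlp:AuthnRequest ID='abc'/>",
                                          "https://idp.example.org/sso/redirect")
    # headers[0] is ('Location', '...?SAMLRequest=<deflated, base64, urlencoded>')
    # body is [""]; the redirect itself carries no payload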
def make_soap_enveloped_saml_thingy(thingy, header_parts=None):
|
||||
""" Returns a soap envelope containing a SAML request
|
||||
as a text string.
|
||||
|
||||
:param thingy: The SAML thingy
|
||||
:return: The SOAP envelope as a string
|
||||
"""
|
||||
envelope = ElementTree.Element('')
|
||||
envelope.tag = '{%s}Envelope' % NAMESPACE
|
||||
|
||||
if header_parts:
|
||||
header = ElementTree.Element('')
|
||||
header.tag = '{%s}Header' % NAMESPACE
|
||||
envelope.append(header)
|
||||
for part in header_parts:
|
||||
part.become_child_element_of(header)
|
||||
|
||||
body = ElementTree.Element('')
|
||||
body.tag = '{%s}Body' % NAMESPACE
|
||||
envelope.append(body)
|
||||
|
||||
thingy.become_child_element_of(body)
|
||||
|
||||
return ElementTree.tostring(envelope, encoding="UTF-8")
|
||||
|
||||
def http_soap_message(message):
|
||||
return ([("Content-type", "application/soap+xml")],
|
||||
make_soap_enveloped_saml_thingy(message))
|
||||
|
||||
def http_paos(message, extra=None):
|
||||
return ([("Content-type", "application/soap+xml")],
|
||||
make_soap_enveloped_saml_thingy(message, extra))
|
||||
|
||||
def parse_soap_enveloped_saml(text, body_class, header_class=None):
|
||||
"""Parses a SOAP enveloped SAML thing and returns header parts and body
|
||||
|
||||
:param text: The SOAP object as XML
|
||||
:return: header parts and body as saml.samlbase instances
|
||||
"""
|
||||
envelope = ElementTree.fromstring(text)
|
||||
assert envelope.tag == '{%s}Envelope' % NAMESPACE
|
||||
|
||||
#print len(envelope)
|
||||
body = None
|
||||
header = {}
|
||||
for part in envelope:
|
||||
#print ">",part.tag
|
||||
if part.tag == '{%s}Body' % NAMESPACE:
|
||||
for sub in part:
|
||||
try:
|
||||
body = saml2.create_class_from_element_tree(body_class, sub)
|
||||
except Exception:
|
||||
raise Exception(
|
||||
"Wrong body type (%s) in SOAP envelope" % sub.tag)
|
||||
elif part.tag == '{%s}Header' % NAMESPACE:
|
||||
if not header_class:
|
||||
raise Exception("Header where I didn't expect one")
|
||||
#print "--- HEADER ---"
|
||||
for sub in part:
|
||||
#print ">>",sub.tag
|
||||
for klass in header_class:
|
||||
#print "?{%s}%s" % (klass.c_namespace,klass.c_tag)
|
||||
if sub.tag == "{%s}%s" % (klass.c_namespace, klass.c_tag):
|
||||
header[sub.tag] = \
|
||||
saml2.create_class_from_element_tree(klass, sub)
|
||||
break
|
||||
|
||||
return body, header
|
||||
|
||||
# -----------------------------------------------------------------------------
|
||||
# def send_using_http_get(request, destination, key_file=None, cert_file=None,
|
||||
# log=None):
|
||||
#
|
||||
#
|
||||
# http = HTTPClient(destination, key_file, cert_file, log)
|
||||
# if log: log.info("HTTP client initiated")
|
||||
#
|
||||
# try:
|
||||
# response = http.get()
|
||||
# except Exception, exc:
|
||||
# if log: log.info("HTTPClient exception: %s" % (exc,))
|
||||
# return None
|
||||
#
|
||||
# if log: log.info("HTTP request sent and got response: %s" % response)
|
||||
#
|
||||
# return response
|
||||
|
||||
def send_using_http_post(request, destination, relay_state, key_file=None,
|
||||
cert_file=None, log=None, ca_certs=""):
|
||||
|
||||
http = HTTPClient(destination, key_file, cert_file, log, ca_certs)
|
||||
if log:
|
||||
log.info("HTTP client initiated")
|
||||
|
||||
if not isinstance(request, basestring):
|
||||
request = "%s" % (request,)
|
||||
|
||||
(headers, message) = http_post_message(request, destination, relay_state)
|
||||
try:
|
||||
response = http.post(message, headers)
|
||||
except Exception, exc:
|
||||
if log:
|
||||
log.info("HTTPClient exception: %s" % (exc,))
|
||||
return None
|
||||
|
||||
if log:
|
||||
log.info("HTTP request sent and got response: %s" % response)
|
||||
|
||||
return response
|
||||
|
||||
def send_using_soap(message, destination, key_file=None, cert_file=None,
|
||||
log=None, ca_certs=""):
|
||||
"""
|
||||
Actual construction of the SOAP message is done by the SOAPClient
|
||||
|
||||
:param message: The SAML message to send
|
||||
:param destination: Where to send the message
|
||||
:param key_file: If HTTPS this is the client certificate
|
||||
:param cert_file: If HTTPS this is a certificate file
|
||||
:param log: A log function to use for logging
|
||||
:param ca_certs: CA certificates to use when verifying server certificates
|
||||
:return: The response gotten from the other side interpreted by the
|
||||
SOAPClient
|
||||
"""
|
||||
soapclient = SOAPClient(destination, key_file, cert_file, log, ca_certs)
|
||||
if log:
|
||||
log.info("SOAP client initiated")
|
||||
try:
|
||||
response = soapclient.send(message)
|
||||
except Exception, exc:
|
||||
if log:
|
||||
log.info("SoapClient exception: %s" % (exc,))
|
||||
return None
|
||||
|
||||
if log:
|
||||
log.info("SOAP request sent and got response: %s" % response)
|
||||
|
||||
return response
|
||||
|
||||
# -----------------------------------------------------------------------------
|
||||
|
||||
PACKING = {
|
||||
saml2.BINDING_HTTP_REDIRECT: http_redirect_message,
|
||||
saml2.BINDING_HTTP_POST: http_post_message,
|
||||
}
|
||||
|
||||
def packager( identifier ):
|
||||
try:
|
||||
return PACKING[identifier]
|
||||
except KeyError:
|
||||
raise Exception("Unkown binding type: %s" % identifier)
|
||||
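packager() is the dispatch point between the two front-channel bindings. An illustrative call, with a made-up response message and endpoint:

    from saml2 import BINDING_HTTP_POST
    from saml2.binding import packager

    pack = packager(BINDING_HTTP_POST)          # resolves to http_post_message
    headers, body = pack("<samlp:Response/>", "https://sp.example.org/acs",
                         relay_state="xyz", typ="SAMLResponse")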
151
src/saml2/cache.py
Normal file
@@ -0,0 +1,151 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
import shelve
|
||||
from saml2 import time_util
|
||||
|
||||
# The assumption is that any subject may consist of data
|
||||
# gathered from several different sources, all with their own
|
||||
# timeout time.
|
||||
|
||||
class ToOld(Exception):
|
||||
pass
|
||||
|
||||
class CacheError(Exception):
|
||||
pass
|
||||
|
||||
class Cache(object):
|
||||
def __init__(self, filename=None):
|
||||
if filename:
|
||||
self._db = shelve.open(filename, writeback=True)
|
||||
self._sync = True
|
||||
else:
|
||||
self._db = {}
|
||||
self._sync = False
|
||||
|
||||
def delete(self, subject_id):
|
||||
del self._db[subject_id]
|
||||
|
||||
if self._sync:
|
||||
self._db.sync()
|
||||
|
||||
def get_identity(self, subject_id, entities=None,
|
||||
check_not_on_or_after=True):
|
||||
""" Get all the identity information that has been received and
|
||||
is still valid about the subject.
|
||||
|
||||
:param subject_id: The identifier of the subject
|
||||
:param entities: The identifiers of the entities whose assertions are
|
||||
interesting. If the list is empty all entities are interesting.
|
||||
:return: A 2-tuple consisting of the identity information (a
|
||||
dictionary of attributes and values) and the list of entities
|
||||
whose information has timed out.
|
||||
"""
|
||||
if not entities:
|
||||
try:
|
||||
entities = self._db[subject_id].keys()
|
||||
except KeyError:
|
||||
return {}, []
|
||||
|
||||
res = {}
|
||||
oldees = []
|
||||
for entity_id in entities:
|
||||
try:
|
||||
info = self.get(subject_id, entity_id, check_not_on_or_after)
|
||||
except ToOld:
|
||||
oldees.append(entity_id)
|
||||
continue
|
||||
|
||||
if not info:
|
||||
oldees.append(entity_id)
|
||||
continue
|
||||
|
||||
for key, vals in info["ava"].items():
|
||||
try:
|
||||
tmp = set(res[key]).union(set(vals))
|
||||
res[key] = list(tmp)
|
||||
except KeyError:
|
||||
res[key] = vals
|
||||
return res, oldees
|
||||
|
||||
def get(self, subject_id, entity_id, check_not_on_or_after=True):
|
||||
""" Get session information about a subject gotten from a
|
||||
specified IdP/AA.
|
||||
|
||||
:param subject_id: The identifier of the subject
|
||||
:param entity_id: The identifier of the entity_id
|
||||
:param check_not_on_or_after: if True it will check if this
|
||||
subject is still valid or if it is too old. Otherwise it
|
||||
will not check this. True by default.
|
||||
:return: The session information
|
||||
"""
|
||||
(timestamp, info) = self._db[subject_id][entity_id]
|
||||
if check_not_on_or_after and time_util.after(timestamp):
|
||||
raise ToOld("past %s" % timestamp)
|
||||
|
||||
return info or None
|
||||
|
||||
def set(self, subject_id, entity_id, info, not_on_or_after=0):
|
||||
""" Stores session information in the cache. Assumes that the subject_id
|
||||
is unique within the context of the Service Provider.
|
||||
|
||||
:param subject_id: The subject identifier
|
||||
:param entity_id: The identifier of the entity_id/receiver of an
|
||||
assertion
|
||||
:param info: The session info, the assertion is part of this
|
||||
:param not_on_or_after: A time after which the assertion is not valid.
|
||||
"""
|
||||
if subject_id not in self._db:
|
||||
self._db[subject_id] = {}
|
||||
|
||||
self._db[subject_id][entity_id] = (not_on_or_after, info)
|
||||
if self._sync:
|
||||
self._db.sync()
|
||||
|
||||
def reset(self, subject_id, entity_id):
|
||||
""" Scrap the assertions received from a IdP or an AA about a special
|
||||
subject.
|
||||
|
||||
:param subject_id: The subject's identifier
|
||||
:param entity_id: The identifier of the entity_id of the assertion
|
||||
:return:
|
||||
"""
|
||||
self.set(subject_id, entity_id, {}, 0)
|
||||
|
||||
def entities(self, subject_id):
|
||||
""" Returns all the entities of assertions for a subject, disregarding
|
||||
whether the assertion still is valid or not.
|
||||
|
||||
:param subject_id: The identifier of the subject
|
||||
:return: A possibly empty list of entity identifiers
|
||||
"""
|
||||
return self._db[subject_id].keys()
|
||||
|
||||
def receivers(self, subject_id):
|
||||
""" Another name for entities() just to make it more logic in the IdP
|
||||
scenario """
|
||||
return self.entities(subject_id)
|
||||
|
||||
def active(self, subject_id, entity_id):
|
||||
""" Returns the status of assertions from a specific entity_id.
|
||||
|
||||
:param subject_id: The ID of the subject
|
||||
:param entity_id: The entity ID of the entity_id of the assertion
|
||||
:return: True or False depending on whether the assertion is still
|
||||
valid or not.
|
||||
"""
|
||||
try:
|
||||
(timestamp, info) = self._db[subject_id][entity_id]
|
||||
except KeyError:
|
||||
return False
|
||||
|
||||
if not info:
|
||||
return False
|
||||
else:
|
||||
return time_util.not_on_or_after(timestamp)
|
||||
|
||||
def subjects(self):
|
||||
""" Return identifiers for all the subjects that are in the cache.
|
||||
|
||||
:return: list of subject identifiers
|
||||
"""
|
||||
return self._db.keys()
|
||||
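A short sketch of how the cache is meant to be used; the subject, entity and attribute values are made up, and in_a_while is assumed to come from saml2.time_util and simply produce a timestamp one hour ahead:

    from saml2.cache import Cache
    from saml2.time_util import in_a_while

    cache = Cache()                      # in-memory; pass a filename to persist
    cache.set("subject-1", "https://idp.example.org",
              {"ava": {"givenName": ["Roland"], "mail": ["rh@example.org"]}},
              not_on_or_after=in_a_while(hours=1))

    identity, too_old = cache.get_identity("subject-1")
    # identity -> {'givenName': ['Roland'], 'mail': ['rh@example.org']}
    # too_old  -> [] as long as the assertion has not expired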
1133
src/saml2/client.py
Normal file
File diff suppressed because it is too large
472
src/saml2/config.py
Normal file
@@ -0,0 +1,472 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
__author__ = 'rolandh'
|
||||
|
||||
import sys
|
||||
import os
|
||||
import re
|
||||
import logging
|
||||
import logging.handlers
|
||||
|
||||
from importlib import import_module
|
||||
|
||||
from saml2 import BINDING_SOAP, BINDING_HTTP_REDIRECT
|
||||
from saml2 import metadata
|
||||
from saml2 import root_logger
|
||||
|
||||
from saml2.attribute_converter import ac_factory
|
||||
from saml2.assertion import Policy
|
||||
from saml2.sigver import get_xmlsec_binary
|
||||
|
||||
COMMON_ARGS = ["entityid", "xmlsec_binary", "debug", "key_file", "cert_file",
|
||||
"secret", "accepted_time_diff", "name", "ca_certs",
|
||||
"description",
|
||||
"organization",
|
||||
"contact_person",
|
||||
"name_form",
|
||||
"virtual_organization",
|
||||
"logger",
|
||||
"only_use_keys_in_metadata",
|
||||
"logout_requests_signed",
|
||||
]
|
||||
|
||||
SP_ARGS = [
|
||||
"required_attributes",
|
||||
"optional_attributes",
|
||||
"idp",
|
||||
"aa",
|
||||
"subject_data",
|
||||
"want_assertions_signed",
|
||||
"authn_requests_signed",
|
||||
"name_form",
|
||||
"endpoints",
|
||||
"ui_info",
|
||||
"discovery_response",
|
||||
"allow_unsolicited",
|
||||
"ecp"
|
||||
]
|
||||
|
||||
AA_IDP_ARGS = ["want_authn_requests_signed",
|
||||
"provided_attributes",
|
||||
"subject_data",
|
||||
"sp",
|
||||
"scope",
|
||||
"endpoints",
|
||||
"metadata",
|
||||
"ui_info"]
|
||||
|
||||
PDP_ARGS = ["endpoints", "name_form"]
|
||||
|
||||
COMPLEX_ARGS = ["attribute_converters", "metadata", "policy"]
|
||||
ALL = COMMON_ARGS + SP_ARGS + AA_IDP_ARGS + PDP_ARGS + COMPLEX_ARGS
|
||||
|
||||
|
||||
SPEC = {
|
||||
"": COMMON_ARGS + COMPLEX_ARGS,
|
||||
"sp": COMMON_ARGS + COMPLEX_ARGS + SP_ARGS,
|
||||
"idp": COMMON_ARGS + COMPLEX_ARGS + AA_IDP_ARGS,
|
||||
"aa": COMMON_ARGS + COMPLEX_ARGS + AA_IDP_ARGS,
|
||||
"pdp": COMMON_ARGS + COMPLEX_ARGS + PDP_ARGS,
|
||||
}
|
||||
|
||||
# --------------- Logging stuff ---------------
|
||||
|
||||
LOG_LEVEL = {'debug': logging.DEBUG,
|
||||
'info': logging.INFO,
|
||||
'warning': logging.WARNING,
|
||||
'error': logging.ERROR,
|
||||
'critical': logging.CRITICAL}
|
||||
|
||||
LOG_HANDLER = {
|
||||
"rotating": logging.handlers.RotatingFileHandler,
|
||||
"syslog": logging.handlers.SysLogHandler,
|
||||
"timerotate": logging.handlers.TimedRotatingFileHandler,
|
||||
}
|
||||
|
||||
LOG_FORMAT = "%(asctime)s %(name)s: %(levelname)s %(message)s"
|
||||
#LOG_FORMAT = "%(asctime)s %(name)s: %(levelname)s [%(sid)s][%(func)s] %
|
||||
# (message)s"
|
||||
|
||||
class ConfigurationError(Exception):
|
||||
pass
|
||||
|
||||
# -----------------------------------------------------------------
|
||||
|
||||
class Config(object):
|
||||
def_context = ""
|
||||
|
||||
def __init__(self):
|
||||
self._attr = {"": {}, "sp": {}, "idp": {}, "aa": {}, "pdp": {}}
|
||||
self.context = ""
|
||||
|
||||
def serves(self):
|
||||
return [t for t in ["sp", "idp", "aa", "pdp"] if self._attr[t]]
|
||||
|
||||
def copy_into(self, typ=""):
|
||||
if typ == "sp":
|
||||
copy = SPConfig()
|
||||
elif typ in ["idp", "aa"]:
|
||||
copy = IdPConfig()
|
||||
else:
|
||||
copy = Config()
|
||||
copy.context = typ
|
||||
copy._attr = self._attr.copy()
|
||||
return copy
|
||||
|
||||
def __getattribute__(self, item):
|
||||
if item == "context":
|
||||
return object.__getattribute__(self, item)
|
||||
|
||||
_context = self.context
|
||||
if item in ALL:
|
||||
try:
|
||||
return self._attr[_context][item]
|
||||
except KeyError:
|
||||
if _context:
|
||||
try:
|
||||
return self._attr[""][item]
|
||||
except KeyError:
|
||||
pass
|
||||
return None
|
||||
else:
|
||||
return object.__getattribute__(self, item)
|
||||
|
||||
def setattr(self, context, attr, val):
|
||||
self._attr[context][attr] = val
|
||||
|
||||
def load_special(self, cnf, typ, metadata_construction=False):
|
||||
for arg in SPEC[typ]:
|
||||
try:
|
||||
self._attr[typ][arg] = cnf[arg]
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
self.context = typ
|
||||
self.load_complex(cnf, typ, metadata_construction=metadata_construction)
|
||||
self.context = self.def_context
|
||||
|
||||
def load_complex(self, cnf, typ="", metadata_construction=False):
|
||||
_attr_typ = self._attr[typ]
|
||||
try:
|
||||
_attr_typ["policy"] = Policy(cnf["policy"])
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
try:
|
||||
try:
|
||||
acs = ac_factory(cnf["attribute_map_dir"])
|
||||
except KeyError:
|
||||
acs = ac_factory()
|
||||
|
||||
if not acs:
|
||||
raise Exception("No attribute converters, something is wrong!")
|
||||
try:
|
||||
_attr_typ["attribute_converters"].extend(acs)
|
||||
except KeyError:
|
||||
_attr_typ["attribute_converters"] = acs
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
if not metadata_construction:
|
||||
try:
|
||||
_attr_typ["metadata"] = self.load_metadata(cnf["metadata"])
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
def load(self, cnf, metadata_construction=False):
|
||||
""" The base load method, loads the configuration
|
||||
|
||||
:param cnf: The configuration as a dictionary
|
||||
:param metadata_construction: Is this only to be able to construct
|
||||
metadata. If so some things can be left out.
|
||||
:return: The Configuration instance
|
||||
"""
|
||||
for arg in COMMON_ARGS:
|
||||
try:
|
||||
self._attr[""][arg] = cnf[arg]
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
if "service" in cnf:
|
||||
for typ in ["aa", "idp", "sp", "pdp"]:
|
||||
try:
|
||||
self.load_special(cnf["service"][typ], typ,
|
||||
metadata_construction=metadata_construction)
|
||||
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
if not metadata_construction:
|
||||
if "xmlsec_binary" not in self._attr[""]:
|
||||
self._attr[""]["xmlsec_binary"] = get_xmlsec_binary()
|
||||
# verify that xmlsec is where it's supposed to be
|
||||
if not os.access(self._attr[""]["xmlsec_binary"], os.F_OK):
|
||||
raise Exception("xmlsec binary not in '%s' !" % (
|
||||
self._attr[""]["xmlsec_binary"]))
|
||||
|
||||
self.load_complex(cnf, metadata_construction=metadata_construction)
|
||||
self.context = self.def_context
|
||||
|
||||
return self
|
||||
|
||||
def load_file(self, config_file, metadata_construction=False):
|
||||
if sys.path[0] != ".":
|
||||
sys.path.insert(0, ".")
|
||||
|
||||
if config_file.endswith(".py"):
|
||||
config_file = config_file[:-3]
|
||||
|
||||
mod = import_module(config_file)
|
||||
#return self.load(eval(open(config_file).read()))
|
||||
return self.load(mod.CONFIG, metadata_construction)
|
||||
|
||||
def load_metadata(self, metadata_conf):
|
||||
""" Loads metadata into an internal structure """
|
||||
|
||||
xmlsec_binary = self.xmlsec_binary
|
||||
acs = self.attribute_converters
|
||||
|
||||
if xmlsec_binary is None:
|
||||
raise Exception("Missing xmlsec1 specification")
|
||||
if acs is None:
|
||||
raise Exception("Missing attribute converter specification")
|
||||
|
||||
metad = metadata.MetaData(xmlsec_binary, acs)
|
||||
if "local" in metadata_conf:
|
||||
for mdfile in metadata_conf["local"]:
|
||||
metad.import_metadata(open(mdfile).read(), mdfile)
|
||||
if "remote" in metadata_conf:
|
||||
for spec in metadata_conf["remote"]:
|
||||
try:
|
||||
cert = spec["cert"]
|
||||
except KeyError:
|
||||
cert = None
|
||||
metad.import_external_metadata(spec["url"], cert)
|
||||
return metad
|
||||
|
||||
def endpoint(self, service, binding=None):
|
||||
""" Goes through the list of endpoint specifications for the
|
||||
given type of service and returns the endpoints that match
|
||||
the given binding. If no binding is given any endpoint for that
|
||||
service will be returned.
|
||||
|
||||
:param service: The service the endpoint should support
|
||||
:param binding: The expected binding
|
||||
:return: All the endpoints that match the given restrictions
|
||||
"""
|
||||
spec = []
|
||||
unspec = []
|
||||
for endpspec in self.endpoints[service]:
|
||||
try:
|
||||
endp, bind = endpspec
|
||||
if binding is None or bind == binding:
|
||||
spec.append(endp)
|
||||
except ValueError:
|
||||
unspec.append(endpspec)
|
||||
|
||||
if spec:
|
||||
return spec
|
||||
else:
|
||||
return unspec
|
||||
|
||||
def log_handler(self):
|
||||
try:
|
||||
_logconf = self.logger
|
||||
except KeyError:
|
||||
return None
|
||||
|
||||
handler = None
|
||||
for htyp in LOG_HANDLER:
|
||||
if htyp in _logconf:
|
||||
if htyp == "syslog":
|
||||
args = _logconf[htyp]
|
||||
if "socktype" in args:
|
||||
import socket
|
||||
if args["socktype"] == "dgram":
|
||||
args["socktype"] = socket.SOCK_DGRAM
|
||||
elif args["socktype"] == "stream":
|
||||
args["socktype"] = socket.SOCK_STREAM
|
||||
else:
|
||||
raise Exception("Unknown socktype!")
|
||||
try:
|
||||
handler = LOG_HANDLER[htyp](**args)
|
||||
except TypeError: # difference between 2.6 and 2.7
|
||||
del args["socktype"]
|
||||
handler = LOG_HANDLER[htyp](**args)
|
||||
else:
|
||||
handler = LOG_HANDLER[htyp](**_logconf[htyp])
|
||||
break
|
||||
|
||||
if handler is None:
|
||||
# default is a rotating file logger
|
||||
handler = LOG_HANDLER["rotating"]()
|
||||
|
||||
if "format" in _logconf:
|
||||
formatter = logging.Formatter(_logconf["format"])
|
||||
else:
|
||||
formatter = logging.Formatter(LOG_FORMAT)
|
||||
|
||||
handler.setFormatter(formatter)
|
||||
return handler
|
||||
|
||||
def setup_logger(self):
|
||||
try:
|
||||
_logconf = self.logger
|
||||
except KeyError:
|
||||
return None
|
||||
|
||||
if root_logger.level != logging.NOTSET: # Someone got there before me
|
||||
return root_logger
|
||||
|
||||
if _logconf is None:
|
||||
return None
|
||||
|
||||
try:
|
||||
root_logger.setLevel(LOG_LEVEL[_logconf["loglevel"].lower()])
|
||||
except KeyError: # reasonable default
|
||||
root_logger.setLevel(logging.WARNING)
|
||||
|
||||
root_logger.addHandler(self.log_handler())
|
||||
root_logger.info("Logging started")
|
||||
return root_logger
|
||||
|
||||
def keys(self):
|
||||
keys = []
|
||||
|
||||
for dir in ["", "sp", "idp", "aa"]:
|
||||
keys.extend(self._attr[dir].keys())
|
||||
|
||||
return list(set(keys))
|
||||
|
||||
def __contains__(self, item):
|
||||
for dir in ["", "sp", "idp", "aa"]:
|
||||
if item in self._attr[dir]:
|
||||
return True
|
||||
return False
|
||||
|
||||
class SPConfig(Config):
|
||||
def_context = "sp"
|
||||
|
||||
def __init__(self):
|
||||
Config.__init__(self)
|
||||
|
||||
def single_logout_services(self, entity_id, binding=BINDING_SOAP):
|
||||
""" returns a list of endpoints to use for sending logout requests to
|
||||
|
||||
:param entity_id: The entity ID of the service
|
||||
:param binding: The preferred binding (which for logout by default is
|
||||
the SOAP binding)
|
||||
:return: list of endpoints
|
||||
"""
|
||||
return self.metadata.single_logout_services(entity_id, "idp",
|
||||
binding=binding)
|
||||
|
||||
def single_sign_on_services(self, entity_id,
|
||||
binding=BINDING_HTTP_REDIRECT):
|
||||
""" returns a list of endpoints to use for sending login requests to
|
||||
|
||||
:param entity_id: The entity ID of the service
|
||||
:param binding: The preferred binding
|
||||
:return: list of endpoints
|
||||
"""
|
||||
return self.metadata.single_sign_on_services(entity_id,
|
||||
binding=binding)
|
||||
|
||||
def attribute_services(self, entity_id, binding=BINDING_SOAP):
|
||||
""" returns a list of endpoints to use for attribute requests to
|
||||
|
||||
:param entity_id: The entity ID of the service
|
||||
:param binding: The preferred binding (which for attribute requests by default is
|
||||
the SOAP binding)
|
||||
:return: list of endpoints
|
||||
"""
|
||||
|
||||
res = []
|
||||
if self.aa is None or entity_id in self.aa:
|
||||
for aad in self.metadata.attribute_authority(entity_id):
|
||||
for attrserv in aad.attribute_service:
|
||||
if attrserv.binding == binding:
|
||||
res.append(attrserv)
|
||||
|
||||
return res
|
||||
|
||||
def idps(self, langpref=None):
|
||||
""" Returns a dictionary of usefull IdPs, the keys being the
|
||||
entity ID of the service and the names of the services as values
|
||||
|
||||
:param langpref: The preferred languages of the name, the first match
|
||||
is used.
|
||||
:return: Dictionary
|
||||
"""
|
||||
if langpref is None:
|
||||
langpref = ["en"]
|
||||
|
||||
if self.idp:
|
||||
return dict([(e, nd[0]) for (e,
|
||||
nd) in self.metadata.idps(langpref).items() if e in self.idp])
|
||||
else:
|
||||
return self.metadata.idps()
|
||||
|
||||
def vo_conf(self, vo_name):
|
||||
try:
|
||||
return self.virtual_organization[vo_name]
|
||||
except KeyError:
|
||||
return None
|
||||
|
||||
def ecp_endpoint(self, ipaddress):
|
||||
"""
|
||||
Returns the entity ID of the IdP which the ECP client should talk to
|
||||
|
||||
:param ipaddress: The IP address of the user client
|
||||
:return: IdP entity ID or None
|
||||
"""
|
||||
if "ecp" in self._attr["sp"]:
|
||||
for key, eid in self._attr["sp"]["ecp"].items():
|
||||
if re.match(key, ipaddress):
|
||||
return eid
|
||||
|
||||
return None
|
||||
|
||||
class IdPConfig(Config):
|
||||
def_context = "idp"
|
||||
|
||||
def __init__(self):
|
||||
Config.__init__(self)
|
||||
|
||||
def single_logout_services(self, entity_id, binding=BINDING_SOAP):
|
||||
""" returns a list of endpoints to use for sending logout requests to
|
||||
|
||||
:param entity_id: The entity ID of the service
|
||||
:param binding: The preferred binding (which for logout by default is
|
||||
the SOAP binding)
|
||||
:return: list of endpoints
|
||||
"""
|
||||
|
||||
return self.metadata.single_logout_services(entity_id, "sp",
|
||||
binding=binding)
|
||||
|
||||
def assertion_consumer_services(self, entity_id, binding):
|
||||
typ = "assertion_consumer_service"
|
||||
if self.sp is None or entity_id in self.sp:
|
||||
acs = self.metadata.sp_services(entity_id, typ, binding=binding)
|
||||
if acs:
|
||||
return [s[binding] for s in acs]
|
||||
|
||||
return []
|
||||
|
||||
def authz_services(self, entity_id, binding=BINDING_SOAP):
|
||||
return self.metadata.authz_services(entity_id, "pdp",
|
||||
binding=binding)
|
||||
|
||||
def config_factory(typ, file):
|
||||
if typ == "sp":
|
||||
conf = SPConfig().load_file(file)
|
||||
conf.context = typ
|
||||
elif typ in ["aa", "idp", "pdp"]:
|
||||
conf = IdPConfig().load_file(file)
|
||||
conf.context = typ
|
||||
else:
|
||||
conf = Config().load_file(file)
|
||||
conf.context = typ
|
||||
return conf
|
||||
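A stripped-down SP configuration sketch for the Config classes above. It is not a complete, working setup: metadata_construction=True is used precisely so that no metadata, keys or xmlsec binary are required, and the entity ID, URLs and xmlsec path are made up.

    from saml2 import BINDING_HTTP_POST
    from saml2.config import SPConfig

    CONFIG = {
        "entityid": "https://sp.example.org/sp.xml",
        "xmlsec_binary": "/usr/bin/xmlsec1",
        "service": {
            "sp": {
                "endpoints": {
                    "assertion_consumer_service": [
                        ("https://sp.example.org/acs", BINDING_HTTP_POST),
                    ],
                },
            },
        },
    }

    conf = SPConfig().load(CONFIG, metadata_construction=True)
    print conf.endpoint("assertion_consumer_service", BINDING_HTTP_POST)
    # ['https://sp.example.org/acs']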
95
src/saml2/country_codes.py
Normal file
@@ -0,0 +1,95 @@
|
||||
#!/usr/bin/env python
|
||||
# This Python file uses the following encoding: utf-8
|
||||
# ISO 3166-1 country names and codes from http://opencountrycodes.appspot.com/python
|
||||
|
||||
COUNTRIES = (
|
||||
("AF", "Afghanistan"),("AX", "Aland Islands"),("AL", "Albania"),
|
||||
("DZ", "Algeria"),("AS", "American Samoa"),("AD", "Andorra"),
|
||||
("AO", "Angola"),("AI", "Anguilla"),("AQ", "Antarctica"),
|
||||
("AG", "Antigua and Barbuda"),("AR", "Argentina"),("AM", "Armenia"),
|
||||
("AW", "Aruba"),("AU", "Australia"),("AT", "Austria"),
|
||||
("AZ", "Azerbaijan"),("BS", "Bahamas"),("BH", "Bahrain"),
|
||||
("BD", "Bangladesh"),("BB", "Barbados"),("BY", "Belarus"),("BE", "Belgium"),
|
||||
("BZ", "Belize"),("BJ", "Benin"),("BM", "Bermuda"),("BT", "Bhutan"),
|
||||
("BO", "Bolivia, Plurinational State of"),
|
||||
("BQ", "Bonaire, Sint Eustatius and Saba"),("BA", "Bosnia and Herzegovina"),
|
||||
("BW", "Botswana"),("BV", "Bouvet Island"),("BR", "Brazil"),
|
||||
("IO", "British Indian Ocean Territory"),("BN", "Brunei Darussalam"),
|
||||
("BG", "Bulgaria"),("BF", "Burkina Faso"),("BI", "Burundi"),
|
||||
("KH", "Cambodia"),("CM", "Cameroon"),("CA", "Canada"),("CV", "Cape Verde"),
|
||||
("KY", "Cayman Islands"),("CF", "Central African Republic"),("TD", "Chad"),
|
||||
("CL", "Chile"),("CN", "China"),("CX", "Christmas Island"),
|
||||
("CC", "Cocos (Keeling) Islands"),("CO", "Colombia"),("KM", "Comoros"),
|
||||
("CG", "Congo"),("CD", "Congo, The Democratic Republic of the"),
|
||||
("CK", "Cook Islands"),("CR", "Costa Rica"),("CI", "Cote D'ivoire"),
|
||||
("HR", "Croatia"),("CU", "Cuba"),("CW", "Curacao"),("CY", "Cyprus"),
|
||||
("CZ", "Czech Republic"),("DK", "Denmark"),("DJ", "Djibouti"),
|
||||
("DM", "Dominica"),("DO", "Dominican Republic"),("EC", "Ecuador"),
|
||||
("EG", "Egypt"),("SV", "El Salvador"),("GQ", "Equatorial Guinea"),
|
||||
("ER", "Eritrea"),("EE", "Estonia"),("ET", "Ethiopia"),
|
||||
("FK", "Falkland Islands (Malvinas)"),("FO", "Faroe Islands"),
|
||||
("FJ", "Fiji"),("FI", "Finland"),("FR", "France"),("GF", "French Guiana"),
|
||||
("PF", "French Polynesia"),("TF", "French Southern Territories"),
|
||||
("GA", "Gabon"),("GM", "Gambia"),("GE", "Georgia"),("DE", "Germany"),
|
||||
("GH", "Ghana"),("GI", "Gibraltar"),("GR", "Greece"),("GL", "Greenland"),
|
||||
("GD", "Grenada"),("GP", "Guadeloupe"),("GU", "Guam"),("GT", "Guatemala"),
|
||||
("GG", "Guernsey"),("GN", "Guinea"),("GW", "Guinea-Bissau"),("GY", "Guyana"),
|
||||
("HT", "Haiti"),("HM", "Heard Island and McDonald Islands"),
|
||||
("VA", "Holy See (Vatican City State)"),("HN", "Honduras"),
|
||||
("HK", "Hong Kong"),("HU", "Hungary"),("IS", "Iceland"),("IN", "India"),
|
||||
("ID", "Indonesia"),("IR", "Iran, Islamic Republic of"),("IQ", "Iraq"),
|
||||
("IE", "Ireland"),("IM", "Isle of Man"),("IL", "Israel"),("IT", "Italy"),
|
||||
("JM", "Jamaica"),("JP", "Japan"),("JE", "Jersey"),("JO", "Jordan"),
|
||||
("KZ", "Kazakhstan"),("KE", "Kenya"),("KI", "Kiribati"),
|
||||
("KP", "Korea, Democratic People's Republic of"),
|
||||
("KR", "Korea, Republic of"),("KW", "Kuwait"),("KG", "Kyrgyzstan"),
|
||||
("LA", "Lao People's Democratic Republic"),("LV", "Latvia"),
|
||||
("LB", "Lebanon"),("LS", "Lesotho"),("LR", "Liberia"),
|
||||
("LY", "Libyan Arab Jamahiriya"),("LI", "Liechtenstein"),
|
||||
("LT", "Lithuania"),("LU", "Luxembourg"),("MO", "Macao"),
|
||||
("MK", "Macedonia, The Former Yugoslav Republic of"),("MG", "Madagascar"),
|
||||
("MW", "Malawi"),("MY", "Malaysia"),("MV", "Maldives"),("ML", "Mali"),
|
||||
("MT", "Malta"),("MH", "Marshall Islands"),("MQ", "Martinique"),
|
||||
("MR", "Mauritania"),("MU", "Mauritius"),("YT", "Mayotte"),("MX", "Mexico"),
|
||||
("FM", "Micronesia, Federated States of"),("MD", "Moldova, Republic of"),
|
||||
("MC", "Monaco"),("MN", "Mongolia"),("ME", "Montenegro"),
|
||||
("MS", "Montserrat"),("MA", "Morocco"),("MZ", "Mozambique"),
|
||||
("MM", "Myanmar"),("NA", "Namibia"),("NR", "Nauru"),("NP", "Nepal"),
|
||||
("NL", "Netherlands"),("NC", "New Caledonia"),("NZ", "New Zealand"),
|
||||
("NI", "Nicaragua"),("NE", "Niger"),("NG", "Nigeria"),("NU", "Niue"),
|
||||
("NF", "Norfolk Island"),("MP", "Northern Mariana Islands"),
|
||||
("NO", "Norway"),("OM", "Oman"),("PK", "Pakistan"),("PW", "Palau"),
|
||||
("PS", "Palestinian Territory, Occupied"),("PA", "Panama"),
|
||||
("PG", "Papua New Guinea"),("PY", "Paraguay"),("PE", "Peru"),
|
||||
("PH", "Philippines"),("PN", "Pitcairn"),("PL", "Poland"),
|
||||
("PT", "Portugal"),("PR", "Puerto Rico"),("QA", "Qatar"),("RE", "Reunion"),
|
||||
("RO", "Romania"),("RU", "Russian Federation"),("RW", "Rwanda"),
|
||||
("BL", "Saint Barthelemy"),
|
||||
("SH", "Saint Helena, Ascension and Tristan Da Cunha"),
|
||||
("KN", "Saint Kitts and Nevis"),("LC", "Saint Lucia"),
|
||||
("MF", "Saint Martin (French Part)"),("PM", "Saint Pierre and Miquelon"),
|
||||
("VC", "Saint Vincent and the Grenadines"),("WS", "Samoa"),
|
||||
("SM", "San Marino"),("ST", "Sao Tome and Principe"),("SA", "Saudi Arabia"),
|
||||
("SN", "Senegal"),("RS", "Serbia"),("SC", "Seychelles"),
|
||||
("SL", "Sierra Leone"),("SG", "Singapore"),
|
||||
("SX", "Sint Maarten (Dutch Part)"),("SK", "Slovakia"),("SI", "Slovenia"),
|
||||
("SB", "Solomon Islands"),("SO", "Somalia"),("ZA", "South Africa"),
|
||||
("GS", "South Georgia and the South Sandwich Islands"),("ES", "Spain"),
|
||||
("LK", "Sri Lanka"),("SD", "Sudan"),("SR", "Suriname"),
|
||||
("SJ", "Svalbard and Jan Mayen"),("SZ", "Swaziland"),("SE", "Sweden"),
|
||||
("CH", "Switzerland"),("SY", "Syrian Arab Republic"),
|
||||
("TW", "Taiwan, Province of China"),("TJ", "Tajikistan"),
|
||||
("TZ", "Tanzania, United Republic of"),("TH", "Thailand"),
|
||||
("TL", "Timor-Leste"),("TG", "Togo"),("TK", "Tokelau"),("TO", "Tonga"),
|
||||
("TT", "Trinidad and Tobago"),("TN", "Tunisia"),("TR", "Turkey"),
|
||||
("TM", "Turkmenistan"),("TC", "Turks and Caicos Islands"),("TV", "Tuvalu"),
|
||||
("UG", "Uganda"),("UA", "Ukraine"),("AE", "United Arab Emirates"),
|
||||
("GB", "United Kingdom"),("US", "United States"),
|
||||
("UM", "United States Minor Outlying Islands"),("UY", "Uruguay"),
|
||||
("UZ", "Uzbekistan"),("VU", "Vanuatu"),
|
||||
("VE", "Venezuela, Bolivarian Republic of"),("VN", "Viet Nam"),
|
||||
("VG", "Virgin Islands, British"),("VI", "Virgin Islands, U.S."),
|
||||
("WF", "Wallis and Futuna"),("EH", "Western Sahara"),("YE", "Yemen"),
|
||||
("ZM", "Zambia"),("ZW", "Zimbabwe"),)
|
||||
|
||||
D_COUNTRIES = dict(COUNTRIES)
|
||||
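The table doubles as a simple lookup dictionary:

    from saml2.country_codes import COUNTRIES, D_COUNTRIES

    print D_COUNTRIES["SE"]     # 'Sweden'
    print len(COUNTRIES)        # number of ISO 3166-1 entries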
213
src/saml2/ecp.py
Normal file
@@ -0,0 +1,213 @@
|
||||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# Copyright (C) 2010-2011 Umeå University
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
"""
|
||||
Contains classes used in the SAML ECP profile
|
||||
"""
|
||||
|
||||
from saml2 import element_to_extension_element
|
||||
from saml2 import samlp
|
||||
from saml2 import soap
|
||||
from saml2 import BINDING_SOAP, BINDING_PAOS
|
||||
|
||||
from saml2.profile import paos
|
||||
from saml2.profile import ecp
|
||||
|
||||
#from saml2.client import Saml2Client
|
||||
from saml2.server import Server
|
||||
|
||||
from saml2.schema import soapenv
|
||||
from saml2.s_utils import sid
|
||||
|
||||
from saml2.response import authn_response
|
||||
|
||||
SERVICE = "urn:oasis:names:tc:SAML:2.0:profiles:SSO:ecp"
|
||||
|
||||
def ecp_capable(headers):
|
||||
if "application/vnd.paos+xml" in headers["Accept"]:
|
||||
if "PAOS" in headers:
|
||||
if 'ver="%s";"%s"' % (paos.NAMESPACE,
|
||||
SERVICE) in headers["PAOS"]:
|
||||
return True
|
||||
|
||||
return False
|
||||
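# Illustrative sketch (not part of the original module): the headers a
# PAOS-capable user agent is expected to send, and what ecp_capable() makes
# of them. The values mirror PAOS_HEADER_INFO in src/saml2/ecp_client.py.
def _example_ecp_capable():
    headers = {
        "Accept": "text/html; application/vnd.paos+xml",
        "PAOS": 'ver="%s";"%s"' % (paos.NAMESPACE, SERVICE),
    }
    return ecp_capable(headers)  # -> True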
|
||||
ACTOR = "http://schemas.xmlsoap.org/soap/actor/next"
|
||||
|
||||
#noinspection PyUnusedLocal
|
||||
def ecp_auth_request(cls, entityid=None, relay_state="",
|
||||
log=None, sign=False):
|
||||
""" Makes an authentication request.
|
||||
|
||||
:param entityid: The entity ID of the IdP to send the request to
|
||||
:param relay_state: To where the user should be returned after
|
||||
successful log in.
|
||||
:param log: Where to write log messages
|
||||
:param sign: Whether the request should be signed or not.
|
||||
:return: AuthnRequest response
|
||||
"""
|
||||
|
||||
eelist = []
|
||||
|
||||
# ----------------------------------------
|
||||
# <paos:Request>
|
||||
# ----------------------------------------
|
||||
my_url = cls.service_url(BINDING_PAOS)
|
||||
|
||||
# must_understand and actor according to the standard
|
||||
#
|
||||
paos_request = paos.Request(must_understand="1", actor=ACTOR,
|
||||
response_consumer_url=my_url,
|
||||
service = SERVICE)
|
||||
|
||||
eelist.append(element_to_extension_element(paos_request))
|
||||
|
||||
# ----------------------------------------
|
||||
# <ecp:Request>
|
||||
# ----------------------------------------
|
||||
|
||||
# idp = samlp.IDPEntry(
|
||||
# provider_id = "https://idp.example.org/entity",
|
||||
# name = "Example identity provider",
|
||||
# loc = "https://idp.example.org/saml2/sso",
|
||||
# )
|
||||
#
|
||||
# idp_list = samlp.IDPList(idp_entry= [idp])
|
||||
#
|
||||
# ecp_request = ecp.Request(actor = ACTOR, must_understand = "1",
|
||||
# provider_name = "Example Service Provider",
|
||||
# issuer=saml.Issuer(text="https://sp.example.org/entity"),
|
||||
# idp_list = idp_list)
|
||||
#
|
||||
# eelist.append(element_to_extension_element(ecp_request))
|
||||
|
||||
# ----------------------------------------
|
||||
# <ecp:RelayState>
|
||||
# ----------------------------------------
|
||||
|
||||
relay_state = ecp.RelayState(actor=ACTOR, must_understand="1",
|
||||
text=relay_state)
|
||||
|
||||
eelist.append(element_to_extension_element(relay_state))
|
||||
|
||||
header = soapenv.Header()
|
||||
header.extension_elements = eelist
|
||||
|
||||
# ----------------------------------------
|
||||
# <samlp:AuthnRequest>
|
||||
# ----------------------------------------
|
||||
|
||||
if log:
|
||||
log.info("entityid: %s, binding: %s" % (entityid, BINDING_SOAP))
|
||||
|
||||
location = cls._sso_location(entityid, binding=BINDING_SOAP)
|
||||
session_id = sid()
|
||||
authn_req = cls.authn(location, session_id, log=log,
|
||||
binding=BINDING_PAOS,
|
||||
service_url_binding=BINDING_PAOS)
|
||||
|
||||
body = soapenv.Body()
|
||||
body.extension_elements = [element_to_extension_element(authn_req)]
|
||||
|
||||
# ----------------------------------------
|
||||
# The SOAP envelope
|
||||
# ----------------------------------------
|
||||
|
||||
soap_envelope = soapenv.Envelope(header=header, body=body)
|
||||
|
||||
return session_id, "%s" % soap_envelope
|
||||
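# Illustrative sketch (not part of the original module): how an SP that has
# detected a PAOS-capable client would use ecp_auth_request(). 'sp' is
# assumed to be a configured Saml2Client instance; the entity ID and relay
# state URL are hypothetical.
def _example_ecp_auth_request(sp):
    session_id, soap_envelope = ecp_auth_request(
        sp,
        entityid="https://idp.example.org/entity",
        relay_state="https://sp.example.org/protected")
    # The envelope goes back to the client with the application/vnd.paos+xml
    # content type; session_id is kept so the response can be matched later.
    return session_id, soap_envelope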
|
||||
|
||||
def handle_ecp_authn_response(cls, soap_message, outstanding=None):
|
||||
rdict = soap.class_instances_from_soap_enveloped_saml_thingies(
|
||||
soap_message,
|
||||
[paos, ecp,
|
||||
samlp])
|
||||
|
||||
_relay_state = None
|
||||
for item in rdict["header"]:
|
||||
if item.c_tag == "RelayState" and \
|
||||
item.c_namespace == ecp.NAMESPACE:
|
||||
_relay_state = item
|
||||
|
||||
response = authn_response(cls.config, cls.service_url(),
|
||||
outstanding, log=cls.logger,
|
||||
debug=cls.debug,
|
||||
allow_unsolicited=True)
|
||||
|
||||
response.loads("%s" % rdict["body"], False, soap_message)
|
||||
response.verify()
|
||||
cls.users.add_information_about_person(response.session_info())
|
||||
|
||||
return response, _relay_state
|
||||
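# Illustrative sketch (not part of the original module): the SP side of the
# final ECP phase, where the client POSTs the IdP's answer back. 'sp' is
# assumed to be a Saml2Client-like instance and 'outstanding' the dict of
# session ids issued by ecp_auth_request().
def _example_handle_ecp_response(sp, soap_message, outstanding):
    response, relay_state = handle_ecp_authn_response(sp, soap_message,
                                                      outstanding)
    # The helper has already stored response.session_info() in sp.users;
    # relay_state tells the SP where to send the user next.
    return response, relay_state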
|
||||
|
||||
def ecp_response(target_url, response):
|
||||
|
||||
# ----------------------------------------
|
||||
# <ecp:Response
|
||||
# ----------------------------------------
|
||||
|
||||
ecp_response = ecp.Response(assertion_consumer_service_url=target_url)
|
||||
header = soapenv.Header()
|
||||
header.extension_elements = [element_to_extension_element(ecp_response)]
|
||||
|
||||
# ----------------------------------------
|
||||
# <samlp:Response
|
||||
# ----------------------------------------
|
||||
|
||||
body = soapenv.Body()
|
||||
body.extension_elements = [element_to_extension_element(response)]
|
||||
|
||||
soap_envelope = soapenv.Envelope(header=header, body=body)
|
||||
|
||||
return "%s" % soap_envelope
|
||||
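# Illustrative sketch (not part of the original module): an IdP wrapping an
# already constructed samlp:Response for the PAOS leg of the conversation.
# The AssertionConsumerService URL is hypothetical.
def _example_ecp_response(saml_response):
    return ecp_response("https://sp.example.org/paos", saml_response)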
|
||||
class ECPServer(Server):
|
||||
""" This deals with what the IdP has to do
|
||||
|
||||
TODO: Still tentative
|
||||
"""
|
||||
def __init__(self, config_file="", config=None, _cache="",
|
||||
log=None, debug=0):
|
||||
Server.__init__(self, config_file, config, _cache, log, debug)
|
||||
|
||||
def parse_ecp_authn_query(self):
|
||||
pass
|
||||
|
||||
def ecp_response(self):
|
||||
|
||||
# ----------------------------------------
|
||||
# <ecp:Response
|
||||
# ----------------------------------------
|
||||
target_url = ""
|
||||
|
||||
ecp_response = ecp.Response(assertion_consumer_service_url=target_url)
|
||||
header = soapenv.Header()
|
||||
header.extension_elements = [element_to_extension_element(ecp_response)]
|
||||
|
||||
# ----------------------------------------
|
||||
# <samlp:Response
|
||||
# ----------------------------------------
|
||||
|
||||
response = samlp.Response()
|
||||
body = soapenv.Body()
|
||||
body.extension_elements = [element_to_extension_element(response)]
|
||||
|
||||
soap_envelope = soapenv.Envelope(header=header, body=body)
|
||||
|
||||
return "%s" % soap_envelope
|
||||
332
src/saml2/ecp_client.py
Normal file
332
src/saml2/ecp_client.py
Normal file
@@ -0,0 +1,332 @@
|
||||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# Copyright (C) 2010-2011 Umeå University
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
"""
|
||||
Contains a class that can be used to handle the complete ECP conversation on
behalf of other Python programs.
|
||||
"""
|
||||
|
||||
import cookielib
|
||||
import sys
|
||||
|
||||
from saml2 import soap
|
||||
from saml2 import samlp
|
||||
from saml2 import BINDING_PAOS
|
||||
from saml2 import BINDING_SOAP
|
||||
from saml2 import class_name
|
||||
|
||||
from saml2.profile import paos
|
||||
from saml2.profile import ecp
|
||||
|
||||
from saml2.metadata import MetaData
|
||||
|
||||
SERVICE = "urn:oasis:names:tc:SAML:2.0:profiles:SSO:ecp"
|
||||
PAOS_HEADER_INFO = 'ver="%s";"%s"' % (paos.NAMESPACE, SERVICE)
|
||||
|
||||
class Client(object):
|
||||
def __init__(self, user, passwd, sp="", idp=None, metadata_file=None,
|
||||
xmlsec_binary=None, verbose=0, ca_certs="",
|
||||
disable_ssl_certificate_validation=True, logger=None,
|
||||
debug=False):
|
||||
"""
|
||||
:param user: user name
|
||||
:param passwd: user password
|
||||
:param sp: The SP URL
|
||||
:param idp: The IdP PAOS endpoint
|
||||
:param metadata_file: Where the metadata file is if used
|
||||
:param xmlsec_binary: Where the xmlsec1 binary can be found
|
||||
:param verbose: Chatty or not
|
||||
:param ca_certs: Path of a file containing root CA certificates
|
||||
for SSL server certificate validation.
|
||||
:param disable_ssl_certificate_validation: If
|
||||
disable_ssl_certificate_validation is true, SSL cert validation
|
||||
will not be performed.
|
||||
:param logger: Somewhere to write logs to
|
||||
:param debug: Whether debug output is needed
|
||||
"""
|
||||
self._idp = idp
|
||||
self._sp = sp
|
||||
self.user = user
|
||||
self.passwd = passwd
|
||||
self.log = logger
|
||||
self.debug = debug
|
||||
self._verbose = verbose
|
||||
|
||||
if metadata_file:
|
||||
self._metadata = MetaData()
|
||||
self._metadata.import_metadata(open(metadata_file).read(),
|
||||
xmlsec_binary)
|
||||
self._debug_info("Loaded metadata from '%s'" % metadata_file)
|
||||
else:
|
||||
self._metadata = None
|
||||
|
||||
self.cookie_handler = None
|
||||
|
||||
self.done_ecp = False
|
||||
self.cookie_jar = cookielib.LWPCookieJar()
|
||||
self.http = soap.HTTPClient(self._sp, cookiejar=self.cookie_jar,
|
||||
ca_certs=ca_certs,
|
||||
disable_ssl_certificate_validation=disable_ssl_certificate_validation)
|
||||
|
||||
def _debug_info(self, text):
|
||||
if self.debug:
|
||||
if self.log:
|
||||
self.log.debug(text)
|
||||
|
||||
if self._verbose:
|
||||
print >> sys.stderr, text
|
||||
|
||||
def find_idp_endpoint(self, idp_entity_id):
|
||||
if self._idp:
|
||||
return self._idp
|
||||
|
||||
if idp_entity_id and not self._metadata:
|
||||
raise Exception(
|
||||
"Can't handle IdP entity ID if I don't have metadata")
|
||||
|
||||
if idp_entity_id:
|
||||
for binding in [BINDING_PAOS, BINDING_SOAP]:
|
||||
ssos = self._metadata.single_sign_on_services(idp_entity_id,
|
||||
binding=binding)
|
||||
if ssos:
|
||||
self._idp = ssos[0]
|
||||
if self.debug:
|
||||
self.log.debug("IdP endpoint: '%s'" % self._idp)
|
||||
return self._idp
|
||||
|
||||
raise Exception("No suitable endpoint found for entity id '%s'" % (
|
||||
idp_entity_id,))
|
||||
else:
|
||||
raise Exception("No entity ID -> no endpoint")
|
||||
|
||||
def phase2(self, authn_request, rc_url, idp_entity_id, headers=None,
|
||||
idp_endpoint=None, sign=False, sec=""):
|
||||
"""
|
||||
Doing the second phase of the ECP conversation
|
||||
|
||||
:param authn_request: The AuthenticationRequest
|
||||
:param rc_url: The assertion consumer service url
|
||||
:param idp_entity_id: The EntityID of the IdP
|
||||
:param headers: Possible extra headers
|
||||
:param idp_endpoint: Where to send it all
|
||||
:param sign: If the message should be signed
|
||||
:param sec: security context
|
||||
:return: The response from the IdP
|
||||
"""
|
||||
idp_request = soap.make_soap_enveloped_saml_thingy(authn_request)
|
||||
if sign:
|
||||
_signed = sec.sign_statement_using_xmlsec(idp_request,
|
||||
class_name(authn_request),
|
||||
nodeid=authn_request.id)
|
||||
idp_request = _signed
|
||||
|
||||
if not idp_endpoint:
|
||||
idp_endpoint = self.find_idp_endpoint(idp_entity_id)
|
||||
|
||||
if self.user and self.passwd:
|
||||
self.http.add_credentials(self.user, self.passwd)
|
||||
|
||||
self._debug_info("[P2] Sending request: %s" % idp_request)
|
||||
|
||||
# POST the request to the IdP
|
||||
response = self.http.post(idp_request, headers=headers,
|
||||
path=idp_endpoint)
|
||||
|
||||
self._debug_info("[P2] Got IdP response: %s" % response)
|
||||
|
||||
if response is None or response is False:
|
||||
raise Exception(
|
||||
"Request to IdP failed (%s): %s" % (self.http.response.status,
|
||||
self.http.error_description))
|
||||
|
||||
# SAMLP response in a SOAP envelope body, ecp response in headers
|
||||
respdict = soap.class_instances_from_soap_enveloped_saml_thingies(
|
||||
response, [paos, ecp, samlp])
|
||||
|
||||
if respdict is None:
|
||||
raise Exception("Unexpected reply from the IdP")
|
||||
|
||||
self._debug_info("[P2] IdP response dict: %s" % respdict)
|
||||
|
||||
idp_response = respdict["body"]
|
||||
assert idp_response.c_tag == "Response"
|
||||
|
||||
self._debug_info("[P2] IdP AUTHN response: %s" % idp_response)
|
||||
|
||||
_ecp_response = None
|
||||
for item in respdict["header"]:
|
||||
if item.c_tag == "Response" and\
|
||||
item.c_namespace == ecp.NAMESPACE:
|
||||
_ecp_response = item
|
||||
|
||||
_acs_url = _ecp_response.assertion_consumer_service_url
|
||||
if rc_url != _acs_url:
|
||||
error = ("response_consumer_url '%s' does not match" % rc_url,
|
||||
"assertion_consumer_service_url '%s" % _acs_url)
|
||||
# Send an error message to the SP
|
||||
fault_text = soap.soap_fault(error)
|
||||
_ = self.http.post(fault_text, path=rc_url)
|
||||
# Raise an exception so the user knows something went wrong
|
||||
raise Exception(error)
|
||||
|
||||
return idp_response
|
||||
|
||||
#noinspection PyUnusedLocal
|
||||
def ecp_conversation(self, respdict, idp_entity_id=None):
|
||||
""" """
|
||||
|
||||
if respdict is None:
|
||||
raise Exception("Unexpected reply from the SP")
|
||||
|
||||
self._debug_info("[P1] SP response dict: %s" % respdict)
|
||||
|
||||
# AuthnRequest in the body or not
|
||||
authn_request = respdict["body"]
|
||||
assert authn_request.c_tag == "AuthnRequest"
|
||||
|
||||
# ecp.RelayState among headers
|
||||
_relay_state = None
|
||||
_paos_request = None
|
||||
for item in respdict["header"]:
|
||||
if item.c_tag == "RelayState" and\
|
||||
item.c_namespace == ecp.NAMESPACE:
|
||||
_relay_state = item
|
||||
if item.c_tag == "Request" and\
|
||||
item.c_namespace == paos.NAMESPACE:
|
||||
_paos_request = item
|
||||
|
||||
_rc_url = _paos_request.response_consumer_url
|
||||
|
||||
# **********************
|
||||
# Phase 2 - talk to the IdP
|
||||
# **********************
|
||||
|
||||
idp_response = self.phase2(authn_request, _rc_url, idp_entity_id)
|
||||
|
||||
# **********************************
|
||||
# Phase 3 - back to the SP
|
||||
# **********************************
|
||||
|
||||
sp_response = soap.make_soap_enveloped_saml_thingy(idp_response,
|
||||
[_relay_state])
|
||||
|
||||
self._debug_info("[P3] Post to SP: %s" % sp_response)
|
||||
|
||||
headers = {'Content-Type': 'application/vnd.paos+xml', }
|
||||
|
||||
# POST the package from the IdP to the SP
|
||||
response = self.http.post(sp_response, headers, _rc_url)
|
||||
|
||||
if not response:
|
||||
if self.http.response.status == 302:
|
||||
# ignore where the SP is redirecting us to and go for the
|
||||
# url I started off with.
|
||||
pass
|
||||
else:
|
||||
print self.http.error_description
|
||||
raise Exception(
|
||||
"Error POSTing package to SP: %s" % self.http.response.reason)
|
||||
|
||||
self._debug_info("[P3] IdP response: %s" % response)
|
||||
|
||||
self.done_ecp = True
|
||||
if self.debug:
|
||||
self.log.debug("Done ECP")
|
||||
|
||||
return None
|
||||
|
||||
|
||||
def operation(self, idp_entity_id, op, **opargs):
|
||||
if "path" not in opargs:
|
||||
opargs["path"] = self._sp
|
||||
|
||||
# ********************************************
|
||||
# Phase 1 - First conversation with the SP
|
||||
# ********************************************
|
||||
# headers needed to indicate to the SP that I'm ECP enabled
|
||||
|
||||
if "headers" in opargs and opargs["headers"]:
|
||||
opargs["headers"]["PAOS"] = PAOS_HEADER_INFO
|
||||
if "Accept" in opargs["headers"]:
|
||||
opargs["headers"]["Accept"] += ";application/vnd.paos+xml"
|
||||
elif "accept" in opargs["headers"]:
|
||||
opargs["headers"]["Accept"] = opargs["headers"]["accept"]
|
||||
opargs["headers"]["Accept"] += ";application/vnd.paos+xml"
|
||||
del opargs["headers"]["accept"]
|
||||
else:
|
||||
opargs["headers"] = {
|
||||
'Accept': 'text/html; application/vnd.paos+xml',
|
||||
'PAOS': PAOS_HEADER_INFO
|
||||
}
|
||||
|
||||
# request target from SP
|
||||
# can remove the PAOS header now
|
||||
# try:
|
||||
# del opargs["headers"]["PAOS"]
|
||||
# except KeyError:
|
||||
# pass
|
||||
|
||||
response = op(**opargs)
|
||||
self._debug_info("[Op] SP response: %s" % response)
|
||||
|
||||
if not response:
|
||||
raise Exception(
|
||||
"Request to SP failed: %s" % self.http.error_description)
|
||||
|
||||
# The response might be an AuthnRequest instance in a SOAP envelope body.
# If so, it is the start of the ECP conversation.
# Two SOAP header blocks are expected: paos:Request and ecp:Request;
# an ecp:RelayState SOAP header block may also be present.
# If channel binding was part of the PAOS header, any number of
# <cb:ChannelBindings> header blocks may also be present.
# If the 'holder-of-key' option is used, one or more
# <ecp:SubjectConfirmation> header blocks may also be present.
|
||||
try:
|
||||
respdict = soap.class_instances_from_soap_enveloped_saml_thingies(
|
||||
response,
|
||||
[paos, ecp,
|
||||
samlp])
|
||||
self.ecp_conversation(respdict, idp_entity_id)
|
||||
# should by now be authenticated so this should go smoothly
|
||||
response = op(**opargs)
|
||||
except (soap.XmlParseError, AssertionError, KeyError):
|
||||
pass
|
||||
|
||||
#print "RESP",response, self.http.response
|
||||
|
||||
if not response:
|
||||
if self.http.response.status != 404:
|
||||
raise Exception("Error performing operation: %s" % (
|
||||
self.http.error_description,))
|
||||
|
||||
return response
|
||||
|
||||
def delete(self, path=None, idp_entity_id=None):
|
||||
return self.operation(idp_entity_id, self.http.delete, path=path)
|
||||
|
||||
def get(self, path=None, idp_entity_id=None, headers=None):
|
||||
return self.operation(idp_entity_id, self.http.get, path=path,
|
||||
headers=headers)
|
||||
|
||||
def post(self, path=None, data="", idp_entity_id=None, headers=None):
|
||||
return self.operation(idp_entity_id, self.http.post, data=data,
|
||||
path=path, headers=headers)
|
||||
|
||||
def put(self, path=None, data="", idp_entity_id=None, headers=None):
|
||||
return self.operation(idp_entity_id, self.http.put, data=data,
|
||||
path=path, headers=headers)
|
||||
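# Illustrative sketch (not part of the original module): typical use of the
# ECP client to fetch a protected page. The credentials, URLs, metadata file
# and xmlsec path are hypothetical; the IdP endpoint is looked up in the
# metadata via the given entity ID.
def _example_fetch_protected_page():
    client = Client("user", "secret",
                    sp="https://sp.example.org/",
                    metadata_file="metadata.xml",
                    xmlsec_binary="/usr/bin/xmlsec1")
    # operation() (called by get/post/put/delete) runs the three ECP phases
    # transparently the first time the SP answers with a PAOS AuthnRequest.
    return client.get(path="https://sp.example.org/protected",
                      idp_entity_id="https://idp.example.org/entity")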
|
||||
3
src/saml2/extension/__init__.py
Normal file
3
src/saml2/extension/__init__.py
Normal file
@@ -0,0 +1,3 @@
|
||||
# metadata extensions mainly
|
||||
__author__ = 'rolandh'
|
||||
__all__ = ["dri", "mdrpi", "mdui", "shibmd", "idpdisc"]
|
||||
338
src/saml2/extension/dri.py
Normal file
338
src/saml2/extension/dri.py
Normal file
@@ -0,0 +1,338 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
#
|
||||
# Generated Mon Oct 25 16:19:28 2010 by parse_xsd.py version 0.4.
|
||||
#
|
||||
|
||||
import saml2
|
||||
from saml2 import SamlBase
|
||||
|
||||
from saml2 import md
|
||||
|
||||
NAMESPACE = 'urn:oasis:names:tc:SAML:2.0:metadata:dri'
|
||||
|
||||
class CreationInstant(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:metadata:dri:CreationInstant element """
|
||||
|
||||
c_tag = 'CreationInstant'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'datetime'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def creation_instant_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(CreationInstant, xml_string)
|
||||
|
||||
|
||||
class SerialNumber(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:metadata:dri:SerialNumber element """
|
||||
|
||||
c_tag = 'SerialNumber'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'string'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def serial_number_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(SerialNumber, xml_string)
|
||||
|
||||
|
||||
class UsagePolicy(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:metadata:dri:UsagePolicy element """
|
||||
|
||||
c_tag = 'UsagePolicy'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'anyURI'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def usage_policy_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(UsagePolicy, xml_string)
|
||||
|
||||
|
||||
class PublisherType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:metadata:dri:PublisherType element """
|
||||
|
||||
c_tag = 'PublisherType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_attributes['PublisherID'] = ('publisher_id', 'md:entityIDType', True)
|
||||
c_attributes['CreationInstant'] = ('creation_instant', 'datetime', False)
|
||||
c_attributes['SerialNumber'] = ('serial_number', 'string', False)
|
||||
|
||||
def __init__(self,
|
||||
publisher_id=None,
|
||||
creation_instant=None,
|
||||
serial_number=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.publisher_id=publisher_id
|
||||
self.creation_instant=creation_instant
|
||||
self.serial_number=serial_number
|
||||
|
||||
def publisher_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(PublisherType_, xml_string)
|
||||
|
||||
|
||||
class RegistrationAuthority(md.EntityIDType_):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:metadata:dri:RegistrationAuthority element """
|
||||
|
||||
c_tag = 'RegistrationAuthority'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = md.EntityIDType_.c_children.copy()
|
||||
c_attributes = md.EntityIDType_.c_attributes.copy()
|
||||
c_child_order = md.EntityIDType_.c_child_order[:]
|
||||
c_cardinality = md.EntityIDType_.c_cardinality.copy()
|
||||
|
||||
def registration_authority_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(RegistrationAuthority, xml_string)
|
||||
|
||||
|
||||
class RegistrationInstant(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:metadata:dri:RegistrationInstant element """
|
||||
|
||||
c_tag = 'RegistrationInstant'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'datetime'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def registration_instant_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(RegistrationInstant, xml_string)
|
||||
|
||||
|
||||
class RegistrationPolicy(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:metadata:dri:RegistrationPolicy element """
|
||||
|
||||
c_tag = 'RegistrationPolicy'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'anyURI'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def registration_policy_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(RegistrationPolicy, xml_string)
|
||||
|
||||
|
||||
class Publisher(PublisherType_):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:metadata:dri:Publisher element """
|
||||
|
||||
c_tag = 'Publisher'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = PublisherType_.c_children.copy()
|
||||
c_attributes = PublisherType_.c_attributes.copy()
|
||||
c_child_order = PublisherType_.c_child_order[:]
|
||||
c_cardinality = PublisherType_.c_cardinality.copy()
|
||||
|
||||
def publisher_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Publisher, xml_string)
|
||||
|
||||
|
||||
class RegistrationInfoType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:metadata:dri:RegistrationInfoType element """
|
||||
|
||||
c_tag = 'RegistrationInfoType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_children['{urn:oasis:names:tc:SAML:2.0:metadata:dri}RegistrationAuthority'] = ('registration_authority', RegistrationAuthority)
|
||||
c_children['{urn:oasis:names:tc:SAML:2.0:metadata:dri}RegistrationInstant'] = ('registration_instant', RegistrationInstant)
|
||||
c_children['{urn:oasis:names:tc:SAML:2.0:metadata:dri}RegistrationPolicy'] = ('registration_policy', RegistrationPolicy)
|
||||
c_cardinality['registration_policy'] = {"min":0, "max":1}
|
||||
c_child_order.extend(['registration_authority', 'registration_instant', 'registration_policy'])
|
||||
|
||||
def __init__(self,
|
||||
registration_authority=None,
|
||||
registration_instant=None,
|
||||
registration_policy=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.registration_authority=registration_authority
|
||||
self.registration_instant=registration_instant
|
||||
self.registration_policy=registration_policy
|
||||
|
||||
def registration_info_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(RegistrationInfoType_, xml_string)
|
||||
|
||||
|
||||
class PublishersType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:metadata:dri:PublishersType element """
|
||||
|
||||
c_tag = 'PublishersType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_children['{urn:oasis:names:tc:SAML:2.0:metadata:dri}Publisher'] = ('publisher', [Publisher])
|
||||
c_cardinality['publisher'] = {"min":0}
|
||||
c_child_order.extend(['publisher'])
|
||||
|
||||
def __init__(self,
|
||||
publisher=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.publisher=publisher or []
|
||||
|
||||
def publishers_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(PublishersType_, xml_string)
|
||||
|
||||
|
||||
class RegistrationInfo(RegistrationInfoType_):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:metadata:dri:RegistrationInfo element """
|
||||
|
||||
c_tag = 'RegistrationInfo'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = RegistrationInfoType_.c_children.copy()
|
||||
c_attributes = RegistrationInfoType_.c_attributes.copy()
|
||||
c_child_order = RegistrationInfoType_.c_child_order[:]
|
||||
c_cardinality = RegistrationInfoType_.c_cardinality.copy()
|
||||
|
||||
def registration_info_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(RegistrationInfo, xml_string)
|
||||
|
||||
|
||||
class Publishers(PublishersType_):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:metadata:dri:Publishers element """
|
||||
|
||||
c_tag = 'Publishers'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = PublishersType_.c_children.copy()
|
||||
c_attributes = PublishersType_.c_attributes.copy()
|
||||
c_child_order = PublishersType_.c_child_order[:]
|
||||
c_cardinality = PublishersType_.c_cardinality.copy()
|
||||
|
||||
def publishers_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Publishers, xml_string)
|
||||
|
||||
|
||||
class DocumentInfoType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:metadata:dri:DocumentInfoType element """
|
||||
|
||||
c_tag = 'DocumentInfoType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_children['{urn:oasis:names:tc:SAML:2.0:metadata:dri}CreationInstant'] = ('creation_instant', CreationInstant)
|
||||
c_cardinality['creation_instant'] = {"min":0, "max":1}
|
||||
c_children['{urn:oasis:names:tc:SAML:2.0:metadata:dri}SerialNumber'] = ('serial_number', SerialNumber)
|
||||
c_cardinality['serial_number'] = {"min":0, "max":1}
|
||||
c_children['{urn:oasis:names:tc:SAML:2.0:metadata:dri}UsagePolicy'] = ('usage_policy', UsagePolicy)
|
||||
c_cardinality['usage_policy'] = {"min":0, "max":1}
|
||||
c_children['{urn:oasis:names:tc:SAML:2.0:metadata:dri}Publishers'] = ('publishers', Publishers)
|
||||
c_cardinality['publishers'] = {"min":0, "max":1}
|
||||
c_child_order.extend(['creation_instant', 'serial_number', 'usage_policy', 'publishers'])
|
||||
|
||||
def __init__(self,
|
||||
creation_instant=None,
|
||||
serial_number=None,
|
||||
usage_policy=None,
|
||||
publishers=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.creation_instant=creation_instant
|
||||
self.serial_number=serial_number
|
||||
self.usage_policy=usage_policy
|
||||
self.publishers=publishers
|
||||
|
||||
def document_info_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(DocumentInfoType_, xml_string)
|
||||
|
||||
|
||||
class DocumentInfo(DocumentInfoType_):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:metadata:dri:DocumentInfo element """
|
||||
|
||||
c_tag = 'DocumentInfo'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = DocumentInfoType_.c_children.copy()
|
||||
c_attributes = DocumentInfoType_.c_attributes.copy()
|
||||
c_child_order = DocumentInfoType_.c_child_order[:]
|
||||
c_cardinality = DocumentInfoType_.c_cardinality.copy()
|
||||
|
||||
def document_info_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(DocumentInfo, xml_string)
|
||||
|
||||
|
||||
ELEMENT_FROM_STRING = {
|
||||
DocumentInfo.c_tag: document_info_from_string,
|
||||
DocumentInfoType_.c_tag: document_info_type__from_string,
|
||||
CreationInstant.c_tag: creation_instant_from_string,
|
||||
SerialNumber.c_tag: serial_number_from_string,
|
||||
UsagePolicy.c_tag: usage_policy_from_string,
|
||||
Publishers.c_tag: publishers_from_string,
|
||||
PublishersType_.c_tag: publishers_type__from_string,
|
||||
Publisher.c_tag: publisher_from_string,
|
||||
PublisherType_.c_tag: publisher_type__from_string,
|
||||
RegistrationInfo.c_tag: registration_info_from_string,
|
||||
RegistrationInfoType_.c_tag: registration_info_type__from_string,
|
||||
RegistrationAuthority.c_tag: registration_authority_from_string,
|
||||
RegistrationInstant.c_tag: registration_instant_from_string,
|
||||
RegistrationPolicy.c_tag: registration_policy_from_string,
|
||||
}
|
||||
|
||||
ELEMENT_BY_TAG = {
|
||||
'DocumentInfo': DocumentInfo,
|
||||
'DocumentInfoType': DocumentInfoType_,
|
||||
'CreationInstant': CreationInstant,
|
||||
'SerialNumber': SerialNumber,
|
||||
'UsagePolicy': UsagePolicy,
|
||||
'Publishers': Publishers,
|
||||
'PublishersType': PublishersType_,
|
||||
'Publisher': Publisher,
|
||||
'PublisherType': PublisherType_,
|
||||
'RegistrationInfo': RegistrationInfo,
|
||||
'RegistrationInfoType': RegistrationInfoType_,
|
||||
'RegistrationAuthority': RegistrationAuthority,
|
||||
'RegistrationInstant': RegistrationInstant,
|
||||
'RegistrationPolicy': RegistrationPolicy,
|
||||
}
|
||||
|
||||
|
||||
def factory(tag, **kwargs):
|
||||
return ELEMENT_BY_TAG[tag](**kwargs)
|
||||
|
||||
37
src/saml2/extension/idpdisc.py
Normal file
37
src/saml2/extension/idpdisc.py
Normal file
@@ -0,0 +1,37 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
#
|
||||
# Generated Thu Jun 23 09:01:47 2011 by parse_xsd.py version 0.4.
|
||||
#
|
||||
|
||||
import saml2
|
||||
from saml2 import md
|
||||
|
||||
NAMESPACE = 'urn:oasis:names:tc:SAML:profiles:SSO:idp-discovery-protocol'
|
||||
|
||||
class DiscoveryResponse(md.IndexedEndpointType_):
|
||||
"""The urn:oasis:names:tc:SAML:profiles:SSO:idp-discovery-protocol:DiscoveryResponse element """
|
||||
|
||||
c_tag = 'DiscoveryResponse'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = md.IndexedEndpointType_.c_children.copy()
|
||||
c_attributes = md.IndexedEndpointType_.c_attributes.copy()
|
||||
c_child_order = md.IndexedEndpointType_.c_child_order[:]
|
||||
c_cardinality = md.IndexedEndpointType_.c_cardinality.copy()
|
||||
|
||||
def discovery_response_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(DiscoveryResponse, xml_string)
|
||||
|
||||
|
||||
ELEMENT_FROM_STRING = {
|
||||
DiscoveryResponse.c_tag: discovery_response_from_string,
|
||||
}
|
||||
|
||||
ELEMENT_BY_TAG = {
|
||||
'DiscoveryResponse': DiscoveryResponse,
|
||||
}
|
||||
|
||||
|
||||
def factory(tag, **kwargs):
|
||||
return ELEMENT_BY_TAG[tag](**kwargs)
|
||||
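# Illustrative sketch (not part of the original module): building a
# DiscoveryResponse extension element for SP metadata. The keyword arguments
# are assumed to come from md.IndexedEndpointType_ (binding, location,
# index); the location is hypothetical.
def _example_discovery_response():
    return factory("DiscoveryResponse",
                   binding=NAMESPACE,
                   location="https://sp.example.org/disco",
                   index="1")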
|
||||
75
src/saml2/extension/mdattr.py
Normal file
75
src/saml2/extension/mdattr.py
Normal file
@@ -0,0 +1,75 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
#
|
||||
# Generated Mon May 2 14:23:34 2011 by parse_xsd.py version 0.4.
|
||||
#
|
||||
|
||||
import saml2
|
||||
from saml2 import SamlBase
|
||||
|
||||
from saml2 import saml
|
||||
|
||||
NAMESPACE = 'urn:oasis:names:tc:SAML:metadata:attribute'
|
||||
|
||||
class EntityAttributesType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:attribute:EntityAttributesType element """
|
||||
|
||||
c_tag = 'EntityAttributesType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Attribute'] = ('attribute', [saml.Attribute])
|
||||
c_cardinality['attribute'] = {"min":0}
|
||||
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Assertion'] = ('assertion', [saml.Assertion])
|
||||
c_cardinality['assertion'] = {"min":0}
|
||||
c_child_order.extend(['attribute', 'assertion'])
|
||||
|
||||
def __init__(self,
|
||||
attribute=None,
|
||||
assertion=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.attribute=attribute or []
|
||||
self.assertion=assertion or []
|
||||
|
||||
def entity_attributes_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(EntityAttributesType_, xml_string)
|
||||
|
||||
|
||||
class EntityAttributes(EntityAttributesType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:attribute:EntityAttributes element """
|
||||
|
||||
c_tag = 'EntityAttributes'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = EntityAttributesType_.c_children.copy()
|
||||
c_attributes = EntityAttributesType_.c_attributes.copy()
|
||||
c_child_order = EntityAttributesType_.c_child_order[:]
|
||||
c_cardinality = EntityAttributesType_.c_cardinality.copy()
|
||||
|
||||
def entity_attributes_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(EntityAttributes, xml_string)
|
||||
|
||||
|
||||
ELEMENT_FROM_STRING = {
|
||||
EntityAttributes.c_tag: entity_attributes_from_string,
|
||||
EntityAttributesType_.c_tag: entity_attributes_type__from_string,
|
||||
}
|
||||
|
||||
ELEMENT_BY_TAG = {
|
||||
'EntityAttributes': EntityAttributes,
|
||||
'EntityAttributesType': EntityAttributesType_,
|
||||
}
|
||||
|
||||
|
||||
def factory(tag, **kwargs):
|
||||
return ELEMENT_BY_TAG[tag](**kwargs)
|
||||
|
||||
266
src/saml2/extension/mdrpi.py
Normal file
266
src/saml2/extension/mdrpi.py
Normal file
@@ -0,0 +1,266 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
#
|
||||
# Generated Mon Jun 27 09:54:22 2011 by parse_xsd.py version 0.4.
|
||||
#
|
||||
|
||||
import saml2
|
||||
from saml2 import SamlBase
|
||||
|
||||
from saml2 import md
|
||||
|
||||
NAMESPACE = 'urn:oasis:names:tc:SAML:metadata:rpi'
|
||||
|
||||
class RegistrationPolicy(md.LocalizedURIType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:rpi:RegistrationPolicy element """
|
||||
|
||||
c_tag = 'RegistrationPolicy'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = md.LocalizedURIType_.c_children.copy()
|
||||
c_attributes = md.LocalizedURIType_.c_attributes.copy()
|
||||
c_child_order = md.LocalizedURIType_.c_child_order[:]
|
||||
c_cardinality = md.LocalizedURIType_.c_cardinality.copy()
|
||||
|
||||
def registration_policy_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(RegistrationPolicy, xml_string)
|
||||
|
||||
|
||||
class UsagePolicy(md.LocalizedURIType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:rpi:UsagePolicy element """
|
||||
|
||||
c_tag = 'UsagePolicy'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = md.LocalizedURIType_.c_children.copy()
|
||||
c_attributes = md.LocalizedURIType_.c_attributes.copy()
|
||||
c_child_order = md.LocalizedURIType_.c_child_order[:]
|
||||
c_cardinality = md.LocalizedURIType_.c_cardinality.copy()
|
||||
|
||||
def usage_policy_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(UsagePolicy, xml_string)
|
||||
|
||||
|
||||
class PublicationType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:rpi:PublicationType element """
|
||||
|
||||
c_tag = 'PublicationType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_attributes['publisher'] = ('publisher', 'string', True)
|
||||
c_attributes['creationInstant'] = ('creation_instant', 'dateTime', False)
|
||||
c_attributes['publicationId'] = ('publication_id', 'string', False)
|
||||
|
||||
def __init__(self,
|
||||
publisher=None,
|
||||
creation_instant=None,
|
||||
publication_id=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.publisher=publisher
|
||||
self.creation_instant=creation_instant
|
||||
self.publication_id=publication_id
|
||||
|
||||
def publication_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(PublicationType_, xml_string)
|
||||
|
||||
|
||||
class RegistrationInfoType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:rpi:RegistrationInfoType element """
|
||||
|
||||
c_tag = 'RegistrationInfoType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:rpi}RegistrationPolicy'] = ('registration_policy', [RegistrationPolicy])
|
||||
c_cardinality['registration_policy'] = {"min":0}
|
||||
c_attributes['registrationAuthority'] = ('registration_authority', 'string', True)
|
||||
c_attributes['registrationInstant'] = ('registration_instant', 'dateTime', False)
|
||||
c_child_order.extend(['registration_policy'])
|
||||
|
||||
def __init__(self,
|
||||
registration_policy=None,
|
||||
registration_authority=None,
|
||||
registration_instant=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.registration_policy=registration_policy or []
|
||||
self.registration_authority=registration_authority
|
||||
self.registration_instant=registration_instant
|
||||
|
||||
def registration_info_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(RegistrationInfoType_, xml_string)
|
||||
|
||||
|
||||
class PublicationInfoType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:rpi:PublicationInfoType element """
|
||||
|
||||
c_tag = 'PublicationInfoType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:rpi}UsagePolicy'] = ('usage_policy', [UsagePolicy])
|
||||
c_cardinality['usage_policy'] = {"min":0}
|
||||
c_attributes['publisher'] = ('publisher', 'string', True)
|
||||
c_attributes['creationInstant'] = ('creation_instant', 'dateTime', False)
|
||||
c_attributes['publicationId'] = ('publication_id', 'string', False)
|
||||
c_child_order.extend(['usage_policy'])
|
||||
|
||||
def __init__(self,
|
||||
usage_policy=None,
|
||||
publisher=None,
|
||||
creation_instant=None,
|
||||
publication_id=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.usage_policy=usage_policy or []
|
||||
self.publisher=publisher
|
||||
self.creation_instant=creation_instant
|
||||
self.publication_id=publication_id
|
||||
|
||||
def publication_info_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(PublicationInfoType_, xml_string)
|
||||
|
||||
|
||||
class Publication(PublicationType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:rpi:Publication element """
|
||||
|
||||
c_tag = 'Publication'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = PublicationType_.c_children.copy()
|
||||
c_attributes = PublicationType_.c_attributes.copy()
|
||||
c_child_order = PublicationType_.c_child_order[:]
|
||||
c_cardinality = PublicationType_.c_cardinality.copy()
|
||||
|
||||
def publication_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Publication, xml_string)
|
||||
|
||||
|
||||
class RegistrationInfo(RegistrationInfoType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:rpi:RegistrationInfo element """
|
||||
|
||||
c_tag = 'RegistrationInfo'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = RegistrationInfoType_.c_children.copy()
|
||||
c_attributes = RegistrationInfoType_.c_attributes.copy()
|
||||
c_child_order = RegistrationInfoType_.c_child_order[:]
|
||||
c_cardinality = RegistrationInfoType_.c_cardinality.copy()
|
||||
|
||||
def registration_info_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(RegistrationInfo, xml_string)
|
||||
|
||||
|
||||
class PublicationInfo(PublicationInfoType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:rpi:PublicationInfo element """
|
||||
|
||||
c_tag = 'PublicationInfo'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = PublicationInfoType_.c_children.copy()
|
||||
c_attributes = PublicationInfoType_.c_attributes.copy()
|
||||
c_child_order = PublicationInfoType_.c_child_order[:]
|
||||
c_cardinality = PublicationInfoType_.c_cardinality.copy()
|
||||
|
||||
def publication_info_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(PublicationInfo, xml_string)
|
||||
|
||||
|
||||
class PublicationPathType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:rpi:PublicationPathType element """
|
||||
|
||||
c_tag = 'PublicationPathType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:rpi}Publication'] = ('publication', [Publication])
|
||||
c_cardinality['publication'] = {"min":0}
|
||||
c_child_order.extend(['publication'])
|
||||
|
||||
def __init__(self,
|
||||
publication=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.publication=publication or []
|
||||
|
||||
def publication_path_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(PublicationPathType_, xml_string)
|
||||
|
||||
|
||||
class PublicationPath(PublicationPathType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:rpi:PublicationPath element """
|
||||
|
||||
c_tag = 'PublicationPath'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = PublicationPathType_.c_children.copy()
|
||||
c_attributes = PublicationPathType_.c_attributes.copy()
|
||||
c_child_order = PublicationPathType_.c_child_order[:]
|
||||
c_cardinality = PublicationPathType_.c_cardinality.copy()
|
||||
|
||||
def publication_path_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(PublicationPath, xml_string)
|
||||
|
||||
|
||||
ELEMENT_FROM_STRING = {
|
||||
RegistrationInfo.c_tag: registration_info_from_string,
|
||||
RegistrationInfoType_.c_tag: registration_info_type__from_string,
|
||||
RegistrationPolicy.c_tag: registration_policy_from_string,
|
||||
PublicationInfo.c_tag: publication_info_from_string,
|
||||
PublicationInfoType_.c_tag: publication_info_type__from_string,
|
||||
UsagePolicy.c_tag: usage_policy_from_string,
|
||||
PublicationPath.c_tag: publication_path_from_string,
|
||||
PublicationPathType_.c_tag: publication_path_type__from_string,
|
||||
Publication.c_tag: publication_from_string,
|
||||
PublicationType_.c_tag: publication_type__from_string,
|
||||
}
|
||||
|
||||
ELEMENT_BY_TAG = {
|
||||
'RegistrationInfo': RegistrationInfo,
|
||||
'RegistrationInfoType': RegistrationInfoType_,
|
||||
'RegistrationPolicy': RegistrationPolicy,
|
||||
'PublicationInfo': PublicationInfo,
|
||||
'PublicationInfoType': PublicationInfoType_,
|
||||
'UsagePolicy': UsagePolicy,
|
||||
'PublicationPath': PublicationPath,
|
||||
'PublicationPathType': PublicationPathType_,
|
||||
'Publication': Publication,
|
||||
'PublicationType': PublicationType_,
|
||||
}
|
||||
|
||||
|
||||
def factory(tag, **kwargs):
|
||||
return ELEMENT_BY_TAG[tag](**kwargs)
|
||||
|
||||
378
src/saml2/extension/mdui.py
Normal file
378
src/saml2/extension/mdui.py
Normal file
@@ -0,0 +1,378 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
#
|
||||
# Generated Mon May 2 14:23:33 2011 by parse_xsd.py version 0.4.
|
||||
#
|
||||
|
||||
import saml2
|
||||
from saml2 import SamlBase
|
||||
|
||||
from saml2 import md
|
||||
|
||||
NAMESPACE = 'urn:oasis:names:tc:SAML:metadata:ui'
|
||||
|
||||
class DisplayName(md.LocalizedNameType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:DisplayName element """
|
||||
|
||||
c_tag = 'DisplayName'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = md.LocalizedNameType_.c_children.copy()
|
||||
c_attributes = md.LocalizedNameType_.c_attributes.copy()
|
||||
c_child_order = md.LocalizedNameType_.c_child_order[:]
|
||||
c_cardinality = md.LocalizedNameType_.c_cardinality.copy()
|
||||
|
||||
def display_name_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(DisplayName, xml_string)
|
||||
|
||||
|
||||
class Description(md.LocalizedNameType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:Description element """
|
||||
|
||||
c_tag = 'Description'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = md.LocalizedNameType_.c_children.copy()
|
||||
c_attributes = md.LocalizedNameType_.c_attributes.copy()
|
||||
c_child_order = md.LocalizedNameType_.c_child_order[:]
|
||||
c_cardinality = md.LocalizedNameType_.c_cardinality.copy()
|
||||
|
||||
def description_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Description, xml_string)
|
||||
|
||||
|
||||
class InformationURL(md.LocalizedURIType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:InformationURL element """
|
||||
|
||||
c_tag = 'InformationURL'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = md.LocalizedURIType_.c_children.copy()
|
||||
c_attributes = md.LocalizedURIType_.c_attributes.copy()
|
||||
c_child_order = md.LocalizedURIType_.c_child_order[:]
|
||||
c_cardinality = md.LocalizedURIType_.c_cardinality.copy()
|
||||
|
||||
def information_url_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(InformationURL, xml_string)
|
||||
|
||||
|
||||
class PrivacyStatementURL(md.LocalizedURIType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:PrivacyStatementURL element """
|
||||
|
||||
c_tag = 'PrivacyStatementURL'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = md.LocalizedURIType_.c_children.copy()
|
||||
c_attributes = md.LocalizedURIType_.c_attributes.copy()
|
||||
c_child_order = md.LocalizedURIType_.c_child_order[:]
|
||||
c_cardinality = md.LocalizedURIType_.c_cardinality.copy()
|
||||
|
||||
def privacy_statement_url_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(PrivacyStatementURL, xml_string)
|
||||
|
||||
|
||||
class ListOfStrings_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:listOfStrings element """
|
||||
|
||||
c_tag = 'listOfStrings'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'member': 'string', 'base': 'list'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def list_of_strings__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(ListOfStrings_, xml_string)
|
||||
|
||||
|
||||
class KeywordsType_(ListOfStrings_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:KeywordsType element """
|
||||
|
||||
c_tag = 'KeywordsType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = ListOfStrings_.c_children.copy()
|
||||
c_attributes = ListOfStrings_.c_attributes.copy()
|
||||
c_child_order = ListOfStrings_.c_child_order[:]
|
||||
c_cardinality = ListOfStrings_.c_cardinality.copy()
|
||||
c_attributes['{http://www.w3.org/XML/1998/namespace}lang'] = ('lang', 'mdui:listOfStrings', True)
|
||||
|
||||
def __init__(self,
|
||||
lang=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
ListOfStrings_.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.lang=lang
|
||||
|
||||
def keywords_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(KeywordsType_, xml_string)
|
||||
|
||||
|
||||
class LogoType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:LogoType element """
|
||||
|
||||
c_tag = 'LogoType'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'anyURI'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_attributes['height'] = ('height', 'positiveInteger', True)
|
||||
c_attributes['width'] = ('width', 'positiveInteger', True)
|
||||
c_attributes['{http://www.w3.org/XML/1998/namespace}lang'] = ('lang', 'anyURI', False)
|
||||
|
||||
def __init__(self,
|
||||
height=None,
|
||||
width=None,
|
||||
lang=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.height=height
|
||||
self.width=width
|
||||
self.lang=lang
|
||||
|
||||
def logo_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(LogoType_, xml_string)
|
||||
|
||||
|
||||
class IPHint(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:IPHint element """
|
||||
|
||||
c_tag = 'IPHint'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'string'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def ip_hint_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(IPHint, xml_string)
|
||||
|
||||
|
||||
class DomainHint(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:DomainHint element """
|
||||
|
||||
c_tag = 'DomainHint'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'string'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def domain_hint_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(DomainHint, xml_string)
|
||||
|
||||
|
||||
class GeolocationHint(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:GeolocationHint element """
|
||||
|
||||
c_tag = 'GeolocationHint'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'anyURI'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def geolocation_hint_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(GeolocationHint, xml_string)
|
||||
|
||||
|
||||
class Keywords(KeywordsType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:Keywords element """
|
||||
|
||||
c_tag = 'Keywords'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = KeywordsType_.c_children.copy()
|
||||
c_attributes = KeywordsType_.c_attributes.copy()
|
||||
c_child_order = KeywordsType_.c_child_order[:]
|
||||
c_cardinality = KeywordsType_.c_cardinality.copy()
|
||||
|
||||
def keywords_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Keywords, xml_string)
|
||||
|
||||
|
||||
class Logo(LogoType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:Logo element """
|
||||
|
||||
c_tag = 'Logo'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = LogoType_.c_children.copy()
|
||||
c_attributes = LogoType_.c_attributes.copy()
|
||||
c_child_order = LogoType_.c_child_order[:]
|
||||
c_cardinality = LogoType_.c_cardinality.copy()
|
||||
|
||||
def logo_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Logo, xml_string)
|
||||
|
||||
|
||||
class DiscoHintsType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:DiscoHintsType element """
|
||||
|
||||
c_tag = 'DiscoHintsType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:ui}IPHint'] = ('ip_hint', [IPHint])
|
||||
c_cardinality['ip_hint'] = {"min":0}
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:ui}DomainHint'] = ('domain_hint', [DomainHint])
|
||||
c_cardinality['domain_hint'] = {"min":0}
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:ui}GeolocationHint'] = ('geolocation_hint', [GeolocationHint])
|
||||
c_cardinality['geolocation_hint'] = {"min":0}
|
||||
c_child_order.extend(['ip_hint', 'domain_hint', 'geolocation_hint'])
|
||||
|
||||
def __init__(self,
|
||||
ip_hint=None,
|
||||
domain_hint=None,
|
||||
geolocation_hint=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.ip_hint=ip_hint or []
|
||||
self.domain_hint=domain_hint or []
|
||||
self.geolocation_hint=geolocation_hint or []
|
||||
|
||||
def disco_hints_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(DiscoHintsType_, xml_string)
|
||||
|
||||
|
||||
class UIInfoType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:UIInfoType element """
|
||||
|
||||
c_tag = 'UIInfoType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:ui}DisplayName'] = ('display_name', [DisplayName])
|
||||
c_cardinality['display_name'] = {"min":0}
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:ui}Description'] = ('description', [Description])
|
||||
c_cardinality['description'] = {"min":0}
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:ui}Keywords'] = ('keywords', [Keywords])
|
||||
c_cardinality['keywords'] = {"min":0}
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:ui}Logo'] = ('logo', [Logo])
|
||||
c_cardinality['logo'] = {"min":0}
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:ui}InformationURL'] = ('information_url', [InformationURL])
|
||||
c_cardinality['information_url'] = {"min":0}
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:ui}PrivacyStatementURL'] = ('privacy_statement_url', [PrivacyStatementURL])
|
||||
c_cardinality['privacy_statement_url'] = {"min":0}
|
||||
c_child_order.extend(['display_name', 'description', 'keywords', 'logo', 'information_url', 'privacy_statement_url'])
|
||||
|
||||
def __init__(self,
|
||||
display_name=None,
|
||||
description=None,
|
||||
keywords=None,
|
||||
logo=None,
|
||||
information_url=None,
|
||||
privacy_statement_url=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.display_name=display_name or []
|
||||
self.description=description or []
|
||||
self.keywords=keywords or []
|
||||
self.logo=logo or []
|
||||
self.information_url=information_url or []
|
||||
self.privacy_statement_url=privacy_statement_url or []
|
||||
|
||||
def ui_info_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(UIInfoType_, xml_string)
|
||||
|
||||
|
||||
class DiscoHints(DiscoHintsType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:DiscoHints element """
|
||||
|
||||
c_tag = 'DiscoHints'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = DiscoHintsType_.c_children.copy()
|
||||
c_attributes = DiscoHintsType_.c_attributes.copy()
|
||||
c_child_order = DiscoHintsType_.c_child_order[:]
|
||||
c_cardinality = DiscoHintsType_.c_cardinality.copy()
|
||||
|
||||
def disco_hints_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(DiscoHints, xml_string)
|
||||
|
||||
|
||||
class UIInfo(UIInfoType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:UIInfo element """
|
||||
|
||||
c_tag = 'UIInfo'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = UIInfoType_.c_children.copy()
|
||||
c_attributes = UIInfoType_.c_attributes.copy()
|
||||
c_child_order = UIInfoType_.c_child_order[:]
|
||||
c_cardinality = UIInfoType_.c_cardinality.copy()
|
||||
|
||||
def ui_info_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(UIInfo, xml_string)
|
||||
|
||||
|
||||
ELEMENT_FROM_STRING = {
|
||||
UIInfo.c_tag: ui_info_from_string,
|
||||
UIInfoType_.c_tag: ui_info_type__from_string,
|
||||
DisplayName.c_tag: display_name_from_string,
|
||||
Description.c_tag: description_from_string,
|
||||
InformationURL.c_tag: information_url_from_string,
|
||||
PrivacyStatementURL.c_tag: privacy_statement_url_from_string,
|
||||
Keywords.c_tag: keywords_from_string,
|
||||
KeywordsType_.c_tag: keywords_type__from_string,
|
||||
ListOfStrings_.c_tag: list_of_strings__from_string,
|
||||
Logo.c_tag: logo_from_string,
|
||||
LogoType_.c_tag: logo_type__from_string,
|
||||
DiscoHints.c_tag: disco_hints_from_string,
|
||||
DiscoHintsType_.c_tag: disco_hints_type__from_string,
|
||||
IPHint.c_tag: ip_hint_from_string,
|
||||
DomainHint.c_tag: domain_hint_from_string,
|
||||
GeolocationHint.c_tag: geolocation_hint_from_string,
|
||||
}
|
||||
|
||||
ELEMENT_BY_TAG = {
|
||||
'UIInfo': UIInfo,
|
||||
'UIInfoType': UIInfoType_,
|
||||
'DisplayName': DisplayName,
|
||||
'Description': Description,
|
||||
'InformationURL': InformationURL,
|
||||
'PrivacyStatementURL': PrivacyStatementURL,
|
||||
'Keywords': Keywords,
|
||||
'KeywordsType': KeywordsType_,
|
||||
'listOfStrings': ListOfStrings_,
|
||||
'Logo': Logo,
|
||||
'LogoType': LogoType_,
|
||||
'DiscoHints': DiscoHints,
|
||||
'DiscoHintsType': DiscoHintsType_,
|
||||
'IPHint': IPHint,
|
||||
'DomainHint': DomainHint,
|
||||
'GeolocationHint': GeolocationHint,
|
||||
}
|
||||
|
||||
|
||||
def factory(tag, **kwargs):
|
||||
return ELEMENT_BY_TAG[tag](**kwargs)
|
||||
|
||||
89
src/saml2/extension/shibmd.py
Normal file
89
src/saml2/extension/shibmd.py
Normal file
@@ -0,0 +1,89 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
#
|
||||
# Generated Sun Mar 20 18:06:44 2011 by parse_xsd.py version 0.4.
|
||||
#
|
||||
|
||||
import saml2
|
||||
from saml2 import SamlBase
|
||||
|
||||
import xmldsig as ds
|
||||
|
||||
NAMESPACE = 'urn:mace:shibboleth:metadata:1.0'
|
||||
|
||||
class Scope(SamlBase):
|
||||
"""The urn:mace:shibboleth:metadata:1.0:Scope element """
|
||||
|
||||
c_tag = 'Scope'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'string'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_attributes['regexp'] = ('regexp', 'boolean', False)
|
||||
|
||||
def __init__(self,
|
||||
regexp='false',
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.regexp=regexp
|
||||
|
||||
def scope_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Scope, xml_string)
|
||||
|
||||
|
||||
class KeyAuthority(SamlBase):
|
||||
"""The urn:mace:shibboleth:metadata:1.0:KeyAuthority element """
|
||||
|
||||
c_tag = 'KeyAuthority'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_children['{http://www.w3.org/2000/09/xmldsig#}KeyInfo'] = ('key_info', [ds.KeyInfo])
|
||||
c_cardinality['key_info'] = {"min":1}
|
||||
c_attributes['VerifyDepth'] = ('verify_depth', 'unsignedByte', False)
|
||||
c_child_order.extend(['key_info'])
|
||||
|
||||
def __init__(self,
|
||||
key_info=None,
|
||||
verify_depth='1',
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.key_info=key_info or []
|
||||
self.verify_depth=verify_depth
|
||||
|
||||
def key_authority_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(KeyAuthority, xml_string)
|
||||
|
||||
|
||||
ELEMENT_FROM_STRING = {
|
||||
Scope.c_tag: scope_from_string,
|
||||
KeyAuthority.c_tag: key_authority_from_string,
|
||||
}
|
||||
|
||||
ELEMENT_BY_TAG = {
|
||||
'Scope': Scope,
|
||||
'KeyAuthority': KeyAuthority,
|
||||
}
|
||||
|
||||
|
||||
def factory(tag, **kwargs):
|
||||
return ELEMENT_BY_TAG[tag](**kwargs)
|
||||
|
||||
311
src/saml2/extension/ui.py
Normal file
311
src/saml2/extension/ui.py
Normal file
@@ -0,0 +1,311 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
#
|
||||
# Generated Mon Oct 25 16:17:51 2010 by parse_xsd.py version 0.4.
|
||||
#
|
||||
|
||||
import saml2
|
||||
from saml2 import SamlBase
|
||||
|
||||
from saml2 import md
|
||||
|
||||
NAMESPACE = 'urn:oasis:names:tc:SAML:metadata:ui'
|
||||
|
||||
class DisplayName(md.LocalizedNameType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:DisplayName element """
|
||||
|
||||
c_tag = 'DisplayName'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = md.LocalizedNameType_.c_children.copy()
|
||||
c_attributes = md.LocalizedNameType_.c_attributes.copy()
|
||||
c_child_order = md.LocalizedNameType_.c_child_order[:]
|
||||
c_cardinality = md.LocalizedNameType_.c_cardinality.copy()
|
||||
|
||||
def display_name_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(DisplayName, xml_string)
|
||||
|
||||
|
||||
class Description(md.LocalizedNameType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:Description element """
|
||||
|
||||
c_tag = 'Description'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = md.LocalizedNameType_.c_children.copy()
|
||||
c_attributes = md.LocalizedNameType_.c_attributes.copy()
|
||||
c_child_order = md.LocalizedNameType_.c_child_order[:]
|
||||
c_cardinality = md.LocalizedNameType_.c_cardinality.copy()
|
||||
|
||||
def description_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Description, xml_string)
|
||||
|
||||
|
||||
class InformationURL(md.LocalizedURIType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:InformationURL element """
|
||||
|
||||
c_tag = 'InformationURL'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = md.LocalizedURIType_.c_children.copy()
|
||||
c_attributes = md.LocalizedURIType_.c_attributes.copy()
|
||||
c_child_order = md.LocalizedURIType_.c_child_order[:]
|
||||
c_cardinality = md.LocalizedURIType_.c_cardinality.copy()
|
||||
|
||||
def information_url_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(InformationURL, xml_string)
|
||||
|
||||
|
||||
class PrivacyStatementURL(md.LocalizedURIType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:PrivacyStatementURL element """
|
||||
|
||||
c_tag = 'PrivacyStatementURL'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = md.LocalizedURIType_.c_children.copy()
|
||||
c_attributes = md.LocalizedURIType_.c_attributes.copy()
|
||||
c_child_order = md.LocalizedURIType_.c_child_order[:]
|
||||
c_cardinality = md.LocalizedURIType_.c_cardinality.copy()
|
||||
|
||||
def privacy_statement_url_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(PrivacyStatementURL, xml_string)
|
||||
|
||||
|
||||
class LogoType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:LogoType element """
|
||||
|
||||
c_tag = 'LogoType'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'anyURI'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_attributes['height'] = ('height', 'positiveInteger', True)
|
||||
c_attributes['width'] = ('width', 'positiveInteger', True)
|
||||
c_attributes['{http://www.w3.org/XML/1998/namespace}lang'] = ('lang', 'anyURI', False)
|
||||
|
||||
def __init__(self,
|
||||
height=None,
|
||||
width=None,
|
||||
lang=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.height=height
|
||||
self.width=width
|
||||
self.lang=lang
|
||||
|
||||
def logo_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(LogoType_, xml_string)
|
||||
|
||||
|
||||
class IPHint(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:IPHint element """
|
||||
|
||||
c_tag = 'IPHint'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'string'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def ip_hint_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(IPHint, xml_string)
|
||||
|
||||
|
||||
class DomainHint(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:DomainHint element """
|
||||
|
||||
c_tag = 'DomainHint'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'string'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def domain_hint_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(DomainHint, xml_string)
|
||||
|
||||
|
||||
class GeolocationHint(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:GeolocationHint element """
|
||||
|
||||
c_tag = 'GeolocationHint'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'anyURI'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def geolocation_hint_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(GeolocationHint, xml_string)
|
||||
|
||||
|
||||
class Logo(LogoType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:Logo element """
|
||||
|
||||
c_tag = 'Logo'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = LogoType_.c_children.copy()
|
||||
c_attributes = LogoType_.c_attributes.copy()
|
||||
c_child_order = LogoType_.c_child_order[:]
|
||||
c_cardinality = LogoType_.c_cardinality.copy()
|
||||
|
||||
def logo_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Logo, xml_string)
|
||||
|
||||
|
||||
class DiscoHintsType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:DiscoHintsType element """
|
||||
|
||||
c_tag = 'DiscoHintsType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:ui}IPHint'] = ('ip_hint', [IPHint])
|
||||
c_cardinality['ip_hint'] = {"min":0}
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:ui}DomainHint'] = ('domain_hint', [DomainHint])
|
||||
c_cardinality['domain_hint'] = {"min":0}
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:ui}GeolocationHint'] = ('geolocation_hint', [GeolocationHint])
|
||||
c_cardinality['geolocation_hint'] = {"min":0}
|
||||
c_child_order.extend(['ip_hint', 'domain_hint', 'geolocation_hint'])
|
||||
|
||||
def __init__(self,
|
||||
ip_hint=None,
|
||||
domain_hint=None,
|
||||
geolocation_hint=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.ip_hint=ip_hint or []
|
||||
self.domain_hint=domain_hint or []
|
||||
self.geolocation_hint=geolocation_hint or []
|
||||
|
||||
def disco_hints_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(DiscoHintsType_, xml_string)
|
||||
|
||||
|
||||
class UIInfoType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:UIInfoType element """
|
||||
|
||||
c_tag = 'UIInfoType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:ui}DisplayName'] = ('display_name', [DisplayName])
|
||||
c_cardinality['display_name'] = {"min":0}
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:ui}Description'] = ('description', [Description])
|
||||
c_cardinality['description'] = {"min":0}
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:ui}Logo'] = ('logo', [Logo])
|
||||
c_cardinality['logo'] = {"min":0}
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:ui}InformationURL'] = ('information_url', [InformationURL])
|
||||
c_cardinality['information_url'] = {"min":0}
|
||||
c_children['{urn:oasis:names:tc:SAML:metadata:ui}PrivacyStatementURL'] = ('privacy_statement_url', [PrivacyStatementURL])
|
||||
c_cardinality['privacy_statement_url'] = {"min":0}
|
||||
c_child_order.extend(['display_name', 'description', 'logo', 'information_url', 'privacy_statement_url'])
|
||||
|
||||
def __init__(self,
|
||||
display_name=None,
|
||||
description=None,
|
||||
logo=None,
|
||||
information_url=None,
|
||||
privacy_statement_url=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.display_name=display_name or []
|
||||
self.description=description or []
|
||||
self.logo=logo or []
|
||||
self.information_url=information_url or []
|
||||
self.privacy_statement_url=privacy_statement_url or []
|
||||
|
||||
def ui_info_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(UIInfoType_, xml_string)
|
||||
|
||||
|
||||
class DiscoHints(DiscoHintsType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:DiscoHints element """
|
||||
|
||||
c_tag = 'DiscoHints'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = DiscoHintsType_.c_children.copy()
|
||||
c_attributes = DiscoHintsType_.c_attributes.copy()
|
||||
c_child_order = DiscoHintsType_.c_child_order[:]
|
||||
c_cardinality = DiscoHintsType_.c_cardinality.copy()
|
||||
|
||||
def disco_hints_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(DiscoHints, xml_string)
|
||||
|
||||
|
||||
class UIInfo(UIInfoType_):
|
||||
"""The urn:oasis:names:tc:SAML:metadata:ui:UIInfo element """
|
||||
|
||||
c_tag = 'UIInfo'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = UIInfoType_.c_children.copy()
|
||||
c_attributes = UIInfoType_.c_attributes.copy()
|
||||
c_child_order = UIInfoType_.c_child_order[:]
|
||||
c_cardinality = UIInfoType_.c_cardinality.copy()
|
||||
|
||||
def ui_info_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(UIInfo, xml_string)
|
||||
|
||||
|
||||
ELEMENT_FROM_STRING = {
|
||||
UIInfo.c_tag: ui_info_from_string,
|
||||
UIInfoType_.c_tag: ui_info_type__from_string,
|
||||
DisplayName.c_tag: display_name_from_string,
|
||||
Description.c_tag: description_from_string,
|
||||
InformationURL.c_tag: information_url_from_string,
|
||||
PrivacyStatementURL.c_tag: privacy_statement_url_from_string,
|
||||
Logo.c_tag: logo_from_string,
|
||||
LogoType_.c_tag: logo_type__from_string,
|
||||
DiscoHints.c_tag: disco_hints_from_string,
|
||||
DiscoHintsType_.c_tag: disco_hints_type__from_string,
|
||||
IPHint.c_tag: ip_hint_from_string,
|
||||
DomainHint.c_tag: domain_hint_from_string,
|
||||
GeolocationHint.c_tag: geolocation_hint_from_string,
|
||||
}
|
||||
|
||||
ELEMENT_BY_TAG = {
|
||||
'UIInfo': UIInfo,
|
||||
'UIInfoType': UIInfoType_,
|
||||
'DisplayName': DisplayName,
|
||||
'Description': Description,
|
||||
'InformationURL': InformationURL,
|
||||
'PrivacyStatementURL': PrivacyStatementURL,
|
||||
'Logo': Logo,
|
||||
'LogoType': LogoType_,
|
||||
'DiscoHints': DiscoHints,
|
||||
'DiscoHintsType': DiscoHintsType_,
|
||||
'IPHint': IPHint,
|
||||
'DomainHint': DomainHint,
|
||||
'GeolocationHint': GeolocationHint,
|
||||
}
|
||||
|
||||
|
||||
def factory(tag, **kwargs):
|
||||
return ELEMENT_BY_TAG[tag](**kwargs)
|
||||
|
||||
149
src/saml2/httplib2cookie.py
Normal file
149
src/saml2/httplib2cookie.py
Normal file
@@ -0,0 +1,149 @@
|
||||
# ========================================================================
|
||||
# Copyright (c) 2007, Metaweb Technologies, Inc.
|
||||
# All rights reserved.
|
||||
#
|
||||
# Redistribution and use in source and binary forms, with or without
|
||||
# modification, are permitted provided that the following conditions
|
||||
# are met:
|
||||
# * Redistributions of source code must retain the above copyright
|
||||
# notice, this list of conditions and the following disclaimer.
|
||||
# * Redistributions in binary form must reproduce the above
|
||||
# copyright notice, this list of conditions and the following
|
||||
# disclaimer in the documentation and/or other materials provided
|
||||
# with the distribution.
|
||||
#
|
||||
# THIS SOFTWARE IS PROVIDED BY METAWEB TECHNOLOGIES AND CONTRIBUTORS
|
||||
# ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
|
||||
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
|
||||
# FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL METAWEB
|
||||
# TECHNOLOGIES OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
|
||||
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
|
||||
# BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
|
||||
# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
|
||||
# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
|
||||
# LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN
|
||||
# ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
|
||||
# POSSIBILITY OF SUCH DAMAGE.
|
||||
# ========================================================================
|
||||
|
||||
#
|
||||
#
|
||||
# httplib2cookie.py allows you to use python's standard
|
||||
# CookieJar class with httplib2.
|
||||
#
|
||||
#
|
||||
|
||||
import re
|
||||
import cookielib
|
||||
from httplib2 import Http
|
||||
|
||||
import urllib
|
||||
import urllib2
|
||||
|
||||
class DummyRequest(object):
|
||||
"""Simulated urllib2.Request object for httplib2
|
||||
|
||||
implements only what's necessary for cookielib.CookieJar to work
|
||||
"""
|
||||
def __init__(self, url, headers=None):
|
||||
self.url = url
|
||||
self.headers = headers
|
||||
self.origin_req_host = urllib2.request_host(self)
|
||||
self.type, r = urllib.splittype(url)
|
||||
self.host, r = urllib.splithost(r)
|
||||
if self.host:
|
||||
self.host = urllib.unquote(self.host)
|
||||
|
||||
def get_full_url(self):
|
||||
return self.url
|
||||
|
||||
def get_origin_req_host(self):
|
||||
# TODO to match urllib2 this should be different for redirects
|
||||
return self.origin_req_host
|
||||
|
||||
def get_type(self):
|
||||
return self.type
|
||||
|
||||
def get_host(self):
|
||||
return self.host
|
||||
|
||||
def get_header(self, key, default=None):
|
||||
return self.headers.get(key.lower(), default)
|
||||
|
||||
def has_header(self, key):
|
||||
return key in self.headers
|
||||
|
||||
def add_unredirected_header(self, key, val):
|
||||
# TODO this header should not be sent on redirect
|
||||
self.headers[key.lower()] = val
|
||||
|
||||
def is_unverifiable(self):
|
||||
# TODO to match urllib2, this should be set to True when the
|
||||
# request is the result of a redirect
|
||||
return False
|
||||
|
||||
|
||||
class DummyResponse(object):
|
||||
"""Simulated urllib2.Request object for httplib2
|
||||
|
||||
implements only what's necessary for cookielib.CookieJar to work
|
||||
"""
|
||||
def __init__(self, response):
|
||||
self.response = response
|
||||
|
||||
def info(self):
|
||||
return DummyMessage(self.response)
|
||||
|
||||
|
||||
class DummyMessage(object):
|
||||
"""Simulated mimetools.Message object for httplib2
|
||||
|
||||
implements only what's necessary for cookielib.CookieJar to work
|
||||
"""
|
||||
def __init__(self, response):
|
||||
self.response = response
|
||||
|
||||
def getheaders(self, k):
|
||||
k = k.lower()
|
||||
v = self.response.get(k.lower(), None)
|
||||
if k not in self.response:
|
||||
return []
|
||||
#return self.response[k].split(re.compile(',\\s*'))
|
||||
|
||||
# httplib2 joins multiple values for the same header
|
||||
# using ','. but the netscape cookie format uses ','
|
||||
# as part of the expires= date format. so we have
|
||||
# to split carefully here - header.split(',') won't do it.
|
||||
HEADERVAL= re.compile(r'\s*(([^,]|(,\s*\d))+)')
|
||||
return [h[0] for h in HEADERVAL.findall(self.response[k])]
|
||||
|
||||
class CookiefulHttp(Http):
|
||||
"""Subclass of httplib2.Http that keeps cookie state
|
||||
|
||||
constructor takes an optional cookiejar=cookielib.CookieJar
|
||||
|
||||
currently this does not handle redirects completely correctly:
|
||||
if the server redirects to a different host the original
|
||||
cookies will still be sent to that host.
|
||||
"""
|
||||
def __init__(self, cookiejar=None, **kws):
|
||||
# note that httplib2.Http is not a new-style-class
|
||||
Http.__init__(self, **kws)
|
||||
if cookiejar is None:
|
||||
cookiejar = cookielib.CookieJar()
|
||||
self.cookiejar = cookiejar
|
||||
|
||||
def crequest(self, uri, **kws):
|
||||
""" crequest so it's not messing up the 'real' request method
|
||||
"""
|
||||
headers = kws.pop('headers', None)
|
||||
req = DummyRequest(uri, headers)
|
||||
self.cookiejar.add_cookie_header(req)
|
||||
headers = req.headers
|
||||
|
||||
(r, body) = Http.request(self, uri, headers=headers, **kws)
|
||||
|
||||
resp = DummyResponse(r)
|
||||
self.cookiejar.extract_cookies(resp, req)
|
||||
|
||||
return r, body
|
||||
128
src/saml2/httputil.py
Normal file
128
src/saml2/httputil.py
Normal file
@@ -0,0 +1,128 @@
|
||||
__author__ = 'rohe0002'
|
||||
|
||||
import cgi
|
||||
from urllib import quote
|
||||
|
||||
class Response(object):
|
||||
_template = None
|
||||
_status = '200 OK'
|
||||
_content_type = 'text/html'
|
||||
_mako_template = None
|
||||
_mako_lookup = None
|
||||
|
||||
def __init__(self, message=None, **kwargs):
|
||||
self.status = kwargs.get('status', self._status)
|
||||
self.response = kwargs.get('response', self._response)
|
||||
self.template = kwargs.get('template', self._template)
|
||||
self.mako_template = kwargs.get('mako_template', self._mako_template)
|
||||
self.mako_lookup = kwargs.get('template_lookup', self._mako_lookup)
|
||||
|
||||
self.message = message
|
||||
|
||||
self.headers = kwargs.get('headers', [])
|
||||
_content_type = kwargs.get('content', self._content_type)
|
||||
self.headers.append(('Content-type', _content_type))
|
||||
|
||||
def __call__(self, environ, start_response, **kwargs):
|
||||
start_response(self.status, self.headers)
|
||||
return self.response(self.message or geturl(environ), **kwargs)
|
||||
|
||||
def _response(self, message="", **argv):
|
||||
if self.template:
|
||||
return [self.template % message]
|
||||
elif self.mako_lookup and self.mako_template:
|
||||
argv["message"] = message
|
||||
mte = self.mako_lookup.get_template(self.mako_template)
|
||||
return [mte.render(**argv)]
|
||||
else:
|
||||
return [message]
|
||||
|
||||
class Created(Response):
|
||||
_status = "201 Created"
|
||||
|
||||
class Redirect(Response):
|
||||
_template = '<html>\n<head><title>Redirecting to %s</title></head>\n' \
|
||||
'<body>\nYou are being redirected to <a href="%s">%s</a>\n' \
|
||||
'</body>\n</html>'
|
||||
_status = '302 Found'
|
||||
|
||||
def __call__(self, environ, start_response):
|
||||
location = self.message
|
||||
self.headers.append(('location', location))
|
||||
start_response(self.status, self.headers)
|
||||
return self.response((location, location, location))
|
||||
|
||||
class SeeOther(Response):
|
||||
_template = '<html>\n<head><title>Redirecting to %s</title></head>\n' \
|
||||
'<body>\nYou are being redirected to <a href="%s">%s</a>\n' \
|
||||
'</body>\n</html>'
|
||||
_status = '303 See Other'
|
||||
|
||||
def __call__(self, environ, start_response):
|
||||
location = self.message
|
||||
self.headers.append(('location', location))
|
||||
start_response(self.status, self.headers)
|
||||
return self.response((location, location, location))
|
||||
|
||||
class Forbidden(Response):
|
||||
_status = '403 Forbidden'
|
||||
_template = "<html>Not allowed to mess with: '%s'</html>"
|
||||
|
||||
class BadRequest(Response):
|
||||
_status = "400 Bad Request"
|
||||
_template = "<html>%s</html>"
|
||||
|
||||
class Unauthorized(Response):
|
||||
_status = "401 Unauthorized"
|
||||
_template = "<html>%s</html>"
|
||||
|
||||
class NotFound(Response):
|
||||
_status = '404 NOT FOUND'
|
||||
|
||||
class NotAcceptable(Response):
|
||||
_status = '406 Not Acceptable'
|
||||
|
||||
class ServiceError(Response):
|
||||
_status = '500 Internal Service Error'
|
||||
|
||||
def extract(environ, empty=False, err=False):
|
||||
"""Extracts strings in form data and returns a dict.
|
||||
|
||||
:param environ: WSGI environ
|
||||
:param empty: Stops on empty fields (default: Fault)
|
||||
:param err: Stops on errors in fields (default: Fault)
|
||||
"""
|
||||
formdata = cgi.parse(environ['wsgi.input'], environ, empty, err)
|
||||
# Remove single entries from lists
|
||||
for key, value in formdata.iteritems():
|
||||
if len(value) == 1:
|
||||
formdata[key] = value[0]
|
||||
return formdata
|
||||
|
||||
def geturl(environ, query=True, path=True):
|
||||
"""Rebuilds a request URL (from PEP 333).
|
||||
|
||||
:param query: Is QUERY_STRING included in URI (default: True)
|
||||
:param path: Is path included in URI (default: True)
|
||||
"""
|
||||
url = [environ['wsgi.url_scheme'] + '://']
|
||||
if environ.get('HTTP_HOST'):
|
||||
url.append(environ['HTTP_HOST'])
|
||||
else:
|
||||
url.append(environ['SERVER_NAME'])
|
||||
if environ['wsgi.url_scheme'] == 'https':
|
||||
if environ['SERVER_PORT'] != '443':
|
||||
url.append(':' + environ['SERVER_PORT'])
|
||||
else:
|
||||
if environ['SERVER_PORT'] != '80':
|
||||
url.append(':' + environ['SERVER_PORT'])
|
||||
if path:
|
||||
url.append(getpath(environ))
|
||||
if query and environ.get('QUERY_STRING'):
|
||||
url.append('?' + environ['QUERY_STRING'])
|
||||
return ''.join(url)
|
||||
|
||||
def getpath(environ):
|
||||
"""Builds a path."""
|
||||
return ''.join([quote(environ.get('SCRIPT_NAME', '')),
|
||||
quote(environ.get('PATH_INFO', ''))])
|
||||
200
src/saml2/mcache.py
Normal file
200
src/saml2/mcache.py
Normal file
@@ -0,0 +1,200 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
import memcache
|
||||
from saml2 import time_util
|
||||
from saml2.cache import ToOld, CacheError
|
||||
|
||||
# The assumption is that any subject may consist of data
|
||||
# gathered from several different sources, all with their own
|
||||
# timeout time.
|
||||
|
||||
|
||||
def _key(prefix, name):
|
||||
return "%s_%s" % (prefix, name)
|
||||
|
||||
class Cache(object):
|
||||
def __init__(self, servers, debug=0):
|
||||
self._cache = memcache.Client(servers, debug)
|
||||
|
||||
def delete(self, subject_id):
|
||||
entities = self.entities(subject_id)
|
||||
if entities:
|
||||
for entity_id in entities:
|
||||
if not self._cache.delete(_key(subject_id, entity_id)):
|
||||
raise CacheError("Delete failed")
|
||||
|
||||
if not self._cache.delete(subject_id):
|
||||
raise CacheError("Delete failed")
|
||||
|
||||
subjects = self._cache.get("subjects")
|
||||
if subjects and subject_id in subjects:
|
||||
subjects.remove(subject_id)
|
||||
if not self._cache.set("subjects", subjects):
|
||||
raise CacheError("Set operation failed")
|
||||
|
||||
def get_identity(self, subject_id, entities=None):
|
||||
""" Get all the identity information that has been received and
|
||||
are still valid about the subject.
|
||||
|
||||
:param subject_id: The identifier of the subject
|
||||
:param entities: The identifiers of the entities whoes assertions are
|
||||
interesting. If the list is empty all entities are interesting.
|
||||
:return: A 2-tuple consisting of the identity information (a
|
||||
dictionary of attributes and values) and the list of entities
|
||||
whoes information has timed out.
|
||||
"""
|
||||
if not entities:
|
||||
entities = self.entities(subject_id)
|
||||
if not entities:
|
||||
return {}, []
|
||||
|
||||
res = {}
|
||||
oldees = []
|
||||
for (entity_id, item) in self._cache.get_multi(entities,
|
||||
subject_id+'_').items():
|
||||
try:
|
||||
info = self.get_info(item)
|
||||
except ToOld:
|
||||
oldees.append(entity_id)
|
||||
continue
|
||||
for key, vals in info["ava"].items():
|
||||
try:
|
||||
tmp = set(res[key]).union(set(vals))
|
||||
res[key] = list(tmp)
|
||||
except KeyError:
|
||||
res[key] = vals
|
||||
return res, oldees
|
||||
|
||||
def get_info(self, item, check_not_on_or_after=True):
|
||||
""" Get session information about a subject gotten from a
|
||||
specified IdP/AA.
|
||||
|
||||
:param item: Information stored
|
||||
:return: The session information as a dictionary
|
||||
"""
|
||||
try:
|
||||
(timestamp, info) = item
|
||||
except ValueError:
|
||||
raise ToOld()
|
||||
|
||||
if check_not_on_or_after and not time_util.not_on_or_after(timestamp):
|
||||
raise ToOld()
|
||||
|
||||
return info or None
|
||||
|
||||
def get(self, subject_id, entity_id, check_not_on_or_after=True):
|
||||
res = self._cache.get(_key(subject_id, entity_id))
|
||||
if not res:
|
||||
return {}
|
||||
else:
|
||||
return self.get_info(res)
|
||||
|
||||
def set(self, subject_id, entity_id, info, timestamp=0):
|
||||
""" Stores session information in the cache. Assumes that the subject_id
|
||||
is unique within the context of the Service Provider.
|
||||
|
||||
:param subject_id: The subject identifier
|
||||
:param entity_id: The identifier of the entity_id/receiver of an
|
||||
assertion
|
||||
:param info: The session info, the assertion is part of this
|
||||
:param timestamp: A time after which the assertion is not valid.
|
||||
"""
|
||||
entities = self._cache.get(subject_id)
|
||||
if not entities:
|
||||
entities = []
|
||||
subjects = self._cache.get("subjects")
|
||||
if not subjects:
|
||||
subjects = []
|
||||
if subject_id not in subjects:
|
||||
subjects.append(subject_id)
|
||||
if not self._cache.set("subjects", subjects):
|
||||
raise CacheError("set failed")
|
||||
|
||||
if entity_id not in entities:
|
||||
entities.append(entity_id)
|
||||
if not self._cache.set(subject_id, entities):
|
||||
raise CacheError("set failed")
|
||||
|
||||
# Should use memcache's expire
|
||||
if not self._cache.set(_key(subject_id, entity_id), (timestamp, info)):
|
||||
raise CacheError("set failed")
|
||||
|
||||
def reset(self, subject_id, entity_id):
|
||||
""" Scrap the assertions received from a IdP or an AA about a special
|
||||
subject.
|
||||
|
||||
:param subject_id: The subjects identifier
|
||||
:param entity_id: The identifier of the entity_id of the assertion
|
||||
:return:
|
||||
"""
|
||||
if not self._cache.set(_key(subject_id, entity_id), {}, 0):
|
||||
raise CacheError("reset failed")
|
||||
|
||||
def entities(self, subject_id):
|
||||
""" Returns all the entities of assertions for a subject, disregarding
|
||||
whether the assertion still is valid or not.
|
||||
|
||||
:param subject_id: The identifier of the subject
|
||||
:return: A possibly empty list of entity identifiers
|
||||
"""
|
||||
res = self._cache.get(subject_id)
|
||||
if not res:
|
||||
raise KeyError("No such subject")
|
||||
else:
|
||||
return res
|
||||
|
||||
def receivers(self, subject_id):
|
||||
""" Another name for entities() just to make it more logic in the IdP
|
||||
scenario """
|
||||
return self.entities(subject_id)
|
||||
|
||||
def active(self, subject_id, entity_id):
|
||||
""" Returns the status of assertions from a specific entity_id.
|
||||
|
||||
:param subject_id: The ID of the subject
|
||||
:param entity_id: The entity ID of the entity_id of the assertion
|
||||
:return: True or False depending on if the assertion is still
|
||||
valid or not.
|
||||
"""
|
||||
try:
|
||||
(timestamp, info) = self._cache.get(_key(subject_id, entity_id))
|
||||
except ValueError:
|
||||
return False
|
||||
except TypeError:
|
||||
return False
|
||||
|
||||
# if not info:
|
||||
# return False
|
||||
|
||||
try:
|
||||
return time_util.not_on_or_after(timestamp)
|
||||
except ToOld:
|
||||
return False
|
||||
|
||||
def subjects(self):
|
||||
""" Return identifiers for all the subjects that are in the cache.
|
||||
|
||||
:return: list of subject identifiers
|
||||
"""
|
||||
return self._cache.get("subjects")
|
||||
|
||||
def update(self, subject_id, entity_id, ava):
|
||||
res = self._cache.get(_key(subject_id, entity_id))
|
||||
if res is None:
|
||||
raise KeyError("No such subject")
|
||||
else:
|
||||
info = self.get_info(res)
|
||||
if info:
|
||||
info.update(ava)
|
||||
self.set(subject_id, entity_id, info, res[0])
|
||||
|
||||
def valid_to(self, subject_id, entity_id, newtime):
|
||||
try:
|
||||
(timestamp, info) = self._cache.get(_key(subject_id, entity_id))
|
||||
except ValueError:
|
||||
return False
|
||||
except TypeError:
|
||||
info = {}
|
||||
|
||||
if not self._cache.set(_key(subject_id, entity_id), (newtime, info)):
|
||||
raise CacheError("valid_to failed")
|
||||
1807
src/saml2/md.py
Normal file
1807
src/saml2/md.py
Normal file
File diff suppressed because it is too large
Load Diff
200
src/saml2/mdbcache.py
Normal file
200
src/saml2/mdbcache.py
Normal file
@@ -0,0 +1,200 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
__author__ = 'rolandh'
|
||||
|
||||
from pymongo import Connection
|
||||
#import cjson
|
||||
import time
|
||||
from datetime import datetime
|
||||
|
||||
from saml2 import time_util
|
||||
from saml2.cache import ToOld
|
||||
from saml2.time_util import TIME_FORMAT
|
||||
|
||||
class Cache(object):
|
||||
def __init__(self, server=None, debug=0, db=None):
|
||||
if server:
|
||||
connection = Connection(server)
|
||||
else:
|
||||
connection = Connection()
|
||||
|
||||
if db:
|
||||
self._db = connection[db]
|
||||
else:
|
||||
self._db = connection.pysaml2
|
||||
|
||||
self._cache = self._db.collection
|
||||
self.debug = debug
|
||||
|
||||
def delete(self, subject_id):
|
||||
self._cache.remove({"subject_id": subject_id})
|
||||
|
||||
def get_identity(self, subject_id, entities=None,
|
||||
check_not_on_or_after=True):
|
||||
""" Get all the identity information that has been received and
|
||||
are still valid about the subject.
|
||||
|
||||
:param subject_id: The identifier of the subject
|
||||
:param entities: The identifiers of the entities whoes assertions are
|
||||
interesting. If the list is empty all entities are interesting.
|
||||
:return: A 2-tuple consisting of the identity information (a
|
||||
dictionary of attributes and values) and the list of entities
|
||||
whoes information has timed out.
|
||||
"""
|
||||
res = {}
|
||||
oldees = []
|
||||
if not entities:
|
||||
for item in self._cache.find({"subject_id": subject_id}):
|
||||
try:
|
||||
info = self._get_info(item, check_not_on_or_after)
|
||||
except ToOld:
|
||||
oldees.append(item["entity_id"])
|
||||
continue
|
||||
|
||||
for key, vals in info["ava"].items():
|
||||
try:
|
||||
tmp = set(res[key]).union(set(vals))
|
||||
res[key] = list(tmp)
|
||||
except KeyError:
|
||||
res[key] = vals
|
||||
else:
|
||||
for entity_id in entities:
|
||||
try:
|
||||
info = self.get(subject_id, entity_id, check_not_on_or_after)
|
||||
except ToOld:
|
||||
oldees.append(entity_id)
|
||||
continue
|
||||
|
||||
for key, vals in info["ava"].items():
|
||||
try:
|
||||
tmp = set(res[key]).union(set(vals))
|
||||
res[key] = list(tmp)
|
||||
except KeyError:
|
||||
res[key] = vals
|
||||
|
||||
return res, oldees
|
||||
|
||||
def _get_info(self, item, check_not_on_or_after=True):
|
||||
""" Get session information about a subject gotten from a
|
||||
specified IdP/AA.
|
||||
|
||||
:param item: Information stored
|
||||
:return: The session information as a dictionary
|
||||
"""
|
||||
timestamp = item["timestamp"]
|
||||
|
||||
if check_not_on_or_after and not time_util.not_on_or_after(timestamp):
|
||||
raise ToOld()
|
||||
|
||||
try:
|
||||
return item["info"]
|
||||
except KeyError:
|
||||
return None
|
||||
|
||||
def get(self, subject_id, entity_id, check_not_on_or_after=True):
|
||||
res = self._cache.find_one({"subject_id": subject_id,
|
||||
"entity_id": entity_id})
|
||||
if not res:
|
||||
return {}
|
||||
else:
|
||||
return self._get_info(res, check_not_on_or_after)
|
||||
|
||||
def set(self, subject_id, entity_id, info, timestamp=0):
|
||||
""" Stores session information in the cache. Assumes that the subject_id
|
||||
is unique within the context of the Service Provider.
|
||||
|
||||
:param subject_id: The subject identifier
|
||||
:param entity_id: The identifier of the entity_id/receiver of an
|
||||
assertion
|
||||
:param info: The session info, the assertion is part of this
|
||||
:param timestamp: A time after which the assertion is not valid.
|
||||
"""
|
||||
|
||||
if isinstance(timestamp, datetime) or isinstance(timestamp,
|
||||
time.struct_time):
|
||||
timestamp = time.strftime(TIME_FORMAT, timestamp)
|
||||
|
||||
doc = {"subject_id": subject_id,
|
||||
"entity_id": entity_id,
|
||||
"info": info,
|
||||
"timestamp": timestamp}
|
||||
|
||||
_ = self._cache.insert(doc)
|
||||
|
||||
|
||||
def reset(self, subject_id, entity_id):
|
||||
""" Scrap the assertions received from a IdP or an AA about a special
|
||||
subject.
|
||||
|
||||
:param subject_id: The subjects identifier
|
||||
:param entity_id: The identifier of the entity_id of the assertion
|
||||
:return:
|
||||
"""
|
||||
self._cache.update({"subject_id":subject_id,
|
||||
"entity_id":entity_id},
|
||||
{"$set": {"info":{}, "timestamp": 0}})
|
||||
|
||||
def entities(self, subject_id):
|
||||
""" Returns all the entities of assertions for a subject, disregarding
|
||||
whether the assertion still is valid or not.
|
||||
|
||||
:param subject_id: The identifier of the subject
|
||||
:return: A possibly empty list of entity identifiers
|
||||
"""
|
||||
try:
|
||||
return [i["entity_id"] for i in self._cache.find({"subject_id":
|
||||
subject_id})]
|
||||
except ValueError:
|
||||
return []
|
||||
|
||||
|
||||
def receivers(self, subject_id):
|
||||
""" Another name for entities() just to make it more logic in the IdP
|
||||
scenario """
|
||||
return self.entities(subject_id)
|
||||
|
||||
def active(self, subject_id, entity_id):
|
||||
""" Returns the status of assertions from a specific entity_id.
|
||||
|
||||
:param subject_id: The ID of the subject
|
||||
:param entity_id: The entity ID of the entity_id of the assertion
|
||||
:return: True or False depending on if the assertion is still
|
||||
valid or not.
|
||||
"""
|
||||
|
||||
item = self._cache.find_one({"subject_id":subject_id,
|
||||
"entity_id":entity_id})
|
||||
try:
|
||||
return time_util.not_on_or_after(item["timestamp"])
|
||||
except ToOld:
|
||||
return False
|
||||
|
||||
def subjects(self):
|
||||
""" Return identifiers for all the subjects that are in the cache.
|
||||
|
||||
:return: list of subject identifiers
|
||||
"""
|
||||
|
||||
subj = [i["subject_id"] for i in self._cache.find()]
|
||||
|
||||
return list(set(subj))
|
||||
|
||||
def update(self, subject_id, entity_id, ava):
|
||||
""" """
|
||||
item = self._cache.find_one({"subject_id":subject_id,
|
||||
"entity_id":entity_id})
|
||||
info = item["info"]
|
||||
info["ava"].update(ava)
|
||||
self._cache.update({"subject_id":subject_id,
|
||||
"entity_id":entity_id},
|
||||
{"$set": {"info":info}})
|
||||
|
||||
|
||||
def valid_to(self, subject_id, entity_id, newtime):
|
||||
""" """
|
||||
self._cache.update({"subject_id":subject_id,
|
||||
"entity_id":entity_id},
|
||||
{"$set": {"timestamp": newtime}})
|
||||
|
||||
def clear(self):
|
||||
self._cache.remove()
|
||||
1362
src/saml2/metadata.py
Normal file
1362
src/saml2/metadata.py
Normal file
File diff suppressed because it is too large
Load Diff
57
src/saml2/population.py
Normal file
57
src/saml2/population.py
Normal file
@@ -0,0 +1,57 @@
|
||||
|
||||
from saml2.cache import Cache
|
||||
|
||||
class Population(object):
|
||||
def __init__(self, cache=None):
|
||||
if cache:
|
||||
if isinstance(cache, basestring):
|
||||
self.cache = Cache(cache)
|
||||
else:
|
||||
self.cache = cache
|
||||
else:
|
||||
self.cache = Cache()
|
||||
|
||||
def add_information_about_person(self, session_info):
|
||||
"""If there already are information from this source in the cache
|
||||
this function will overwrite that information"""
|
||||
|
||||
subject_id = session_info["name_id"]
|
||||
issuer = session_info["issuer"]
|
||||
del session_info["issuer"]
|
||||
self.cache.set(subject_id, issuer, session_info,
|
||||
session_info["not_on_or_after"])
|
||||
return subject_id
|
||||
|
||||
def stale_sources_for_person(self, subject_id, sources=None):
|
||||
if not sources: # assume that all the members has be asked
|
||||
# once before, hence they are represented in the cache
|
||||
sources = self.cache.entities(subject_id)
|
||||
sources = [m for m in sources \
|
||||
if not self.cache.active(subject_id, m)]
|
||||
return sources
|
||||
|
||||
def issuers_of_info(self, subject_id):
|
||||
return self.cache.entities(subject_id)
|
||||
|
||||
def get_identity(self, subject_id, entities=None, check_not_on_or_after=True):
|
||||
return self.cache.get_identity(subject_id, entities, check_not_on_or_after)
|
||||
|
||||
def get_info_from(self, subject_id, entity_id):
|
||||
return self.cache.get(subject_id, entity_id)
|
||||
|
||||
def subjects(self):
|
||||
"""Returns the name id's for all the persons in the cache"""
|
||||
return self.cache.subjects()
|
||||
|
||||
def remove_person(self, subject_id):
|
||||
self.cache.delete(subject_id)
|
||||
|
||||
def get_entityid(self, subject_id, source_id, check_not_on_or_after=True):
|
||||
try:
|
||||
return self.cache.get(subject_id, source_id,
|
||||
check_not_on_or_after)["name_id"]
|
||||
except (KeyError, ValueError):
|
||||
return ""
|
||||
|
||||
def sources(self, subject_id):
|
||||
return self.cache.entities(subject_id)
|
||||
3
src/saml2/profile/__init__.py
Normal file
3
src/saml2/profile/__init__.py
Normal file
@@ -0,0 +1,3 @@
|
||||
#profile schema descriptions
|
||||
__author__ = 'rolandh'
|
||||
|
||||
190
src/saml2/profile/ecp.py
Normal file
190
src/saml2/profile/ecp.py
Normal file
@@ -0,0 +1,190 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
#
|
||||
# Generated Fri May 27 23:08:21 2011 by parse_xsd.py version 0.4.
|
||||
#
|
||||
|
||||
import saml2
|
||||
from saml2 import SamlBase
|
||||
|
||||
from saml2 import saml
|
||||
from saml2 import samlp
|
||||
#import soapenv as S
|
||||
|
||||
NAMESPACE = 'urn:oasis:names:tc:SAML:2.0:profiles:SSO:ecp'
|
||||
|
||||
class RequestType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:profiles:SSO:ecp:RequestType element """
|
||||
|
||||
c_tag = 'RequestType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_children['{urn:oasis:names:tc:SAML:2.0:assertion}Issuer'] = ('issuer', saml.Issuer)
|
||||
c_children['{urn:oasis:names:tc:SAML:2.0:protocol}IDPList'] = ('idp_list', samlp.IDPList)
|
||||
c_cardinality['idp_list'] = {"min":0, "max":1}
|
||||
c_attributes['{http://schemas.xmlsoap.org/soap/envelope/}mustUnderstand'] = ('must_understand', 'None', True)
|
||||
c_attributes['{http://schemas.xmlsoap.org/soap/envelope/}actor'] = ('actor', 'None', True)
|
||||
c_attributes['ProviderName'] = ('provider_name', 'string', False)
|
||||
c_attributes['IsPassive'] = ('is_passive', 'boolean', False)
|
||||
c_child_order.extend(['issuer', 'idp_list'])
|
||||
|
||||
def __init__(self,
|
||||
issuer=None,
|
||||
idp_list=None,
|
||||
must_understand=None,
|
||||
actor=None,
|
||||
provider_name=None,
|
||||
is_passive=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.issuer=issuer
|
||||
self.idp_list=idp_list
|
||||
self.must_understand=must_understand
|
||||
self.actor=actor
|
||||
self.provider_name=provider_name
|
||||
self.is_passive=is_passive
|
||||
|
||||
def request_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(RequestType_, xml_string)
|
||||
|
||||
|
||||
class ResponseType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:profiles:SSO:ecp:ResponseType element """
|
||||
|
||||
c_tag = 'ResponseType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_attributes['{http://schemas.xmlsoap.org/soap/envelope/}mustUnderstand'] = ('must_understand', 'None', True)
|
||||
c_attributes['{http://schemas.xmlsoap.org/soap/envelope/}actor'] = ('actor', 'None', True)
|
||||
c_attributes['AssertionConsumerServiceURL'] = ('assertion_consumer_service_url', 'anyURI', True)
|
||||
|
||||
def __init__(self,
|
||||
must_understand=None,
|
||||
actor=None,
|
||||
assertion_consumer_service_url=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.must_understand=must_understand
|
||||
self.actor=actor
|
||||
self.assertion_consumer_service_url=assertion_consumer_service_url
|
||||
|
||||
def response_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(ResponseType_, xml_string)
|
||||
|
||||
|
||||
class RelayStateType_(SamlBase):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:profiles:SSO:ecp:RelayStateType element """
|
||||
|
||||
c_tag = 'RelayStateType'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'string'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_attributes['{http://schemas.xmlsoap.org/soap/envelope/}mustUnderstand'] = ('must_understand', 'string', True)
|
||||
c_attributes['{http://schemas.xmlsoap.org/soap/envelope/}actor'] = ('actor', 'string', True)
|
||||
|
||||
def __init__(self,
|
||||
must_understand=None,
|
||||
actor=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.must_understand=must_understand
|
||||
self.actor=actor
|
||||
|
||||
def relay_state_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(RelayStateType_, xml_string)
|
||||
|
||||
|
||||
class Request(RequestType_):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:profiles:SSO:ecp:Request element """
|
||||
|
||||
c_tag = 'Request'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = RequestType_.c_children.copy()
|
||||
c_attributes = RequestType_.c_attributes.copy()
|
||||
c_child_order = RequestType_.c_child_order[:]
|
||||
c_cardinality = RequestType_.c_cardinality.copy()
|
||||
|
||||
def request_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Request, xml_string)
|
||||
|
||||
|
||||
class Response(ResponseType_):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:profiles:SSO:ecp:Response element """
|
||||
|
||||
c_tag = 'Response'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = ResponseType_.c_children.copy()
|
||||
c_attributes = ResponseType_.c_attributes.copy()
|
||||
c_child_order = ResponseType_.c_child_order[:]
|
||||
c_cardinality = ResponseType_.c_cardinality.copy()
|
||||
|
||||
def response_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Response, xml_string)
|
||||
|
||||
|
||||
class RelayState(RelayStateType_):
|
||||
"""The urn:oasis:names:tc:SAML:2.0:profiles:SSO:ecp:RelayState element """
|
||||
|
||||
c_tag = 'RelayState'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = RelayStateType_.c_children.copy()
|
||||
c_attributes = RelayStateType_.c_attributes.copy()
|
||||
c_child_order = RelayStateType_.c_child_order[:]
|
||||
c_cardinality = RelayStateType_.c_cardinality.copy()
|
||||
|
||||
def relay_state_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(RelayState, xml_string)
|
||||
|
||||
|
||||
ELEMENT_FROM_STRING = {
|
||||
Request.c_tag: request_from_string,
|
||||
RequestType_.c_tag: request_type__from_string,
|
||||
Response.c_tag: response_from_string,
|
||||
ResponseType_.c_tag: response_type__from_string,
|
||||
RelayState.c_tag: relay_state_from_string,
|
||||
RelayStateType_.c_tag: relay_state_type__from_string,
|
||||
}
|
||||
|
||||
ELEMENT_BY_TAG = {
|
||||
'Request': Request,
|
||||
'RequestType': RequestType_,
|
||||
'Response': Response,
|
||||
'ResponseType': ResponseType_,
|
||||
'RelayState': RelayState,
|
||||
'RelayStateType': RelayStateType_,
|
||||
}
|
||||
|
||||
|
||||
def factory(tag, **kwargs):
|
||||
return ELEMENT_BY_TAG[tag](**kwargs)
|
||||
|
||||
133
src/saml2/profile/paos.py
Normal file
133
src/saml2/profile/paos.py
Normal file
@@ -0,0 +1,133 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
#
|
||||
# Generated Fri May 27 17:30:44 2011 by parse_xsd.py version 0.4.
|
||||
#
|
||||
|
||||
import saml2
|
||||
from saml2 import SamlBase
|
||||
|
||||
#import soapenv as S
|
||||
|
||||
NAMESPACE = 'urn:liberty:paos:2003-08'
|
||||
|
||||
class RequestType_(SamlBase):
|
||||
"""The urn:liberty:paos:2003-08:RequestType element """
|
||||
|
||||
c_tag = 'RequestType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_attributes['responseConsumerURL'] = ('response_consumer_url', 'anyURI', True)
|
||||
c_attributes['service'] = ('service', 'anyURI', True)
|
||||
c_attributes['messageID'] = ('message_id', 'None', False)
|
||||
c_attributes['{http://schemas.xmlsoap.org/soap/envelope/}mustUnderstand'] = ('must_understand', 'None', True)
|
||||
c_attributes['{http://schemas.xmlsoap.org/soap/envelope/}actor'] = ('actor', 'None', True)
|
||||
|
||||
def __init__(self,
|
||||
response_consumer_url=None,
|
||||
service=None,
|
||||
message_id=None,
|
||||
must_understand=None,
|
||||
actor=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.response_consumer_url=response_consumer_url
|
||||
self.service=service
|
||||
self.message_id=message_id
|
||||
self.must_understand=must_understand
|
||||
self.actor=actor
|
||||
|
||||
def request_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(RequestType_, xml_string)
|
||||
|
||||
|
||||
class ResponseType_(SamlBase):
|
||||
"""The urn:liberty:paos:2003-08:ResponseType element """
|
||||
|
||||
c_tag = 'ResponseType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_attributes['refToMessageID'] = ('ref_to_message_id', 'None', False)
|
||||
c_attributes['{http://schemas.xmlsoap.org/soap/envelope/}mustUnderstand'] = ('must_understand', 'None', True)
|
||||
c_attributes['{http://schemas.xmlsoap.org/soap/envelope/}actor'] = ('actor', 'None', True)
|
||||
|
||||
def __init__(self,
|
||||
ref_to_message_id=None,
|
||||
must_understand=None,
|
||||
actor=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.ref_to_message_id=ref_to_message_id
|
||||
self.must_understand=must_understand
|
||||
self.actor=actor
|
||||
|
||||
def response_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(ResponseType_, xml_string)
|
||||
|
||||
|
||||
class Request(RequestType_):
|
||||
"""The urn:liberty:paos:2003-08:Request element """
|
||||
|
||||
c_tag = 'Request'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = RequestType_.c_children.copy()
|
||||
c_attributes = RequestType_.c_attributes.copy()
|
||||
c_child_order = RequestType_.c_child_order[:]
|
||||
c_cardinality = RequestType_.c_cardinality.copy()
|
||||
|
||||
def request_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Request, xml_string)
|
||||
|
||||
|
||||
class Response(ResponseType_):
|
||||
"""The urn:liberty:paos:2003-08:Response element """
|
||||
|
||||
c_tag = 'Response'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = ResponseType_.c_children.copy()
|
||||
c_attributes = ResponseType_.c_attributes.copy()
|
||||
c_child_order = ResponseType_.c_child_order[:]
|
||||
c_cardinality = ResponseType_.c_cardinality.copy()
|
||||
|
||||
def response_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Response, xml_string)
|
||||
|
||||
|
||||
ELEMENT_FROM_STRING = {
|
||||
Request.c_tag: request_from_string,
|
||||
RequestType_.c_tag: request_type__from_string,
|
||||
Response.c_tag: response_from_string,
|
||||
ResponseType_.c_tag: response_type__from_string,
|
||||
}
|
||||
|
||||
ELEMENT_BY_TAG = {
|
||||
'Request': Request,
|
||||
'RequestType': RequestType_,
|
||||
'Response': Response,
|
||||
'ResponseType': ResponseType_,
|
||||
}
|
||||
|
||||
|
||||
def factory(tag, **kwargs):
|
||||
return ELEMENT_BY_TAG[tag](**kwargs)
|
||||
|
||||
189
src/saml2/request.py
Normal file
189
src/saml2/request.py
Normal file
@@ -0,0 +1,189 @@
|
||||
import sys
|
||||
|
||||
from attribute_converter import to_local
|
||||
from saml2 import time_util
|
||||
from saml2 import s_utils
|
||||
from saml2.s_utils import OtherError
|
||||
|
||||
from saml2.validate import valid_instance
|
||||
from saml2.validate import NotValid
|
||||
from saml2.response import IncorrectlySigned
|
||||
|
||||
def _dummy(_arg):
|
||||
return None
|
||||
|
||||
class Request(object):
|
||||
def __init__(self, sec_context, receiver_addrs, log=None, timeslack=0,
|
||||
debug=0):
|
||||
self.sec = sec_context
|
||||
self.receiver_addrs = receiver_addrs
|
||||
self.timeslack = timeslack
|
||||
self.log = log
|
||||
self.debug = debug
|
||||
if self.debug and not self.log:
|
||||
self.debug = 0
|
||||
|
||||
self.xmlstr = ""
|
||||
self.name_id = ""
|
||||
self.message = None
|
||||
self.not_on_or_after = 0
|
||||
|
||||
self.signature_check = _dummy # has to be set !!!
|
||||
|
||||
def _clear(self):
|
||||
self.xmlstr = ""
|
||||
self.name_id = ""
|
||||
self.message = None
|
||||
self.not_on_or_after = 0
|
||||
|
||||
def _loads(self, xmldata, decode=True):
|
||||
if decode:
|
||||
if self.debug:
|
||||
self.log.debug("Expected to decode and inflate xml data")
|
||||
decoded_xml = s_utils.decode_base64_and_inflate(xmldata)
|
||||
else:
|
||||
decoded_xml = xmldata
|
||||
|
||||
# own copy
|
||||
self.xmlstr = decoded_xml[:]
|
||||
if self.debug:
|
||||
self.log.info("xmlstr: %s" % (self.xmlstr,))
|
||||
try:
|
||||
self.message = self.signature_check(decoded_xml)
|
||||
except TypeError:
|
||||
raise
|
||||
except Exception, excp:
|
||||
if self.log:
|
||||
self.log.info("EXCEPTION: %s", excp)
|
||||
|
||||
if not self.message:
|
||||
if self.log:
|
||||
self.log.error("Response was not correctly signed")
|
||||
self.log.info(decoded_xml)
|
||||
raise IncorrectlySigned()
|
||||
|
||||
if self.debug:
|
||||
self.log.info("request: %s" % (self.message,))
|
||||
|
||||
try:
|
||||
valid_instance(self.message)
|
||||
except NotValid, exc:
|
||||
if self.log:
|
||||
self.log.error("Not valid request: %s" % exc.args[0])
|
||||
else:
|
||||
print >> sys.stderr, "Not valid request: %s" % exc.args[0]
|
||||
raise
|
||||
|
||||
return self
|
||||
|
||||
def issue_instant_ok(self):
|
||||
""" Check that the request was issued at a reasonable time """
|
||||
upper = time_util.shift_time(time_util.time_in_a_while(days=1),
|
||||
self.timeslack).timetuple()
|
||||
lower = time_util.shift_time(time_util.time_a_while_ago(days=1),
|
||||
-self.timeslack).timetuple()
|
||||
# print "issue_instant: %s" % self.message.issue_instant
|
||||
# print "%s < x < %s" % (lower, upper)
|
||||
issued_at = time_util.str_to_time(self.message.issue_instant)
|
||||
return issued_at > lower and issued_at < upper
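    # Illustrative note (added explanation, not part of the original module):
    # with timeslack=0 this accepts an IssueInstant roughly within one day on
    # either side of "now"; a non-zero timeslack widens that window by the
    # configured number of seconds at both ends.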
|
||||
|
||||
def _verify(self):
|
||||
assert self.message.version == "2.0"
|
||||
if self.message.destination and \
|
||||
self.message.destination not in self.receiver_addrs:
|
||||
if self.log:
|
||||
self.log.error("%s != %s" % (self.message.destination,
|
||||
self.receiver_addrs))
|
||||
else:
|
||||
print >> sys.stderr, "%s != %s" % (self.message.destination,
|
||||
self.receiver_addrs)
|
||||
raise OtherError("Not destined for me!")
|
||||
|
||||
assert self.issue_instant_ok()
|
||||
return self
|
||||
|
||||
def loads(self, xmldata, decode=True):
|
||||
return self._loads(xmldata, decode)
|
||||
|
||||
def verify(self):
|
||||
try:
|
||||
return self._verify()
|
||||
except AssertionError:
|
||||
return None
|
||||
|
||||
def subject_id(self):
|
||||
""" The name of the subject can be in either of
|
||||
BaseID, NameID or EncryptedID
|
||||
|
||||
:return: The identifier if there is one
|
||||
"""
|
||||
|
||||
if "subject" in self.message.keys():
|
||||
_subj = self.message.subject
|
||||
if "base_id" in _subj.keys() and _subj.base_id:
|
||||
return _subj.base_id
|
||||
elif _subj.name_id:
|
||||
return _subj.name_id
|
||||
else:
|
||||
if "base_id" in self.message.keys() and self.message.base_id:
|
||||
return self.message.base_id
|
||||
elif self.message.name_id:
|
||||
return self.message.name_id
|
||||
else: # EncryptedID
|
||||
pass
|
||||
|
||||
def sender(self):
|
||||
return self.message.issuer.text()
|
||||
|
||||
class LogoutRequest(Request):
|
||||
def __init__(self, sec_context, receiver_addrs, log=None, timeslack=0,
|
||||
debug=0):
|
||||
Request.__init__(self, sec_context, receiver_addrs, log, timeslack,
|
||||
debug)
|
||||
self.signature_check = self.sec.correctly_signed_logout_request
|
||||
|
||||
|
||||
class AttributeQuery(Request):
|
||||
def __init__(self, sec_context, receiver_addrs, log=None, timeslack=0,
|
||||
debug=0):
|
||||
Request.__init__(self, sec_context, receiver_addrs, log, timeslack,
|
||||
debug)
|
||||
self.signature_check = self.sec.correctly_signed_attribute_query
|
||||
|
||||
def attribute(self):
|
||||
""" Which attributes that are sought for """
|
||||
|
||||
return []
|
||||
|
||||
|
||||
class AuthnRequest(Request):
|
||||
def __init__(self, sec_context, attribute_converters, receiver_addrs,
|
||||
log=None, timeslack=0, debug=0):
|
||||
Request.__init__(self, sec_context, receiver_addrs, log, timeslack,
|
||||
debug)
|
||||
self.attribute_converters = attribute_converters
|
||||
self.signature_check = self.sec.correctly_signed_authn_request
|
||||
|
||||
|
||||
def attributes(self):
|
||||
return to_local(self.attribute_converters, self.message)
|
||||
|
||||
|
||||
class AuthzRequest(Request):
|
||||
def __init__(self, sec_context, receiver_addrs, log=None, timeslack=0,
|
||||
debug=0):
|
||||
Request.__init__(self, sec_context, receiver_addrs, log, timeslack,
|
||||
debug)
|
||||
self.signature_check = self.sec.correctly_signed_logout_request
|
||||
|
||||
def action(self):
|
||||
""" Which action authorization is requested for """
|
||||
pass
|
||||
|
||||
def evidence(self):
|
||||
""" The evidence on which the decision is based """
|
||||
pass
|
||||
|
||||
def resource(self):
|
||||
""" On which resource the action is expected to occur """
|
||||
pass
|
||||
709
src/saml2/response.py
Normal file
709
src/saml2/response.py
Normal file
@@ -0,0 +1,709 @@
|
||||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# Copyright (C) 2010-2011 Umeå University
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
import calendar
|
||||
import base64
|
||||
import sys
|
||||
|
||||
from saml2 import samlp
|
||||
from saml2 import saml
|
||||
from saml2 import extension_element_to_element
|
||||
from saml2 import time_util
|
||||
|
||||
from saml2.saml import attribute_from_string
|
||||
from saml2.saml import encrypted_attribute_from_string
|
||||
from saml2.sigver import security_context
|
||||
from saml2.sigver import SignatureError
|
||||
from saml2.sigver import signed
|
||||
from saml2.attribute_converter import to_local
|
||||
from saml2.time_util import str_to_time
|
||||
|
||||
from saml2.validate import validate_on_or_after
|
||||
from saml2.validate import validate_before
|
||||
from saml2.validate import valid_instance
|
||||
from saml2.validate import valid_address
|
||||
from saml2.validate import NotValid
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
class IncorrectlySigned(Exception):
|
||||
pass
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
def _dummy(_):
|
||||
return None
|
||||
|
||||
def for_me(condition, myself):
|
||||
# Am I among the intended audiences
|
||||
for restriction in condition.audience_restriction:
|
||||
for audience in restriction.audience:
|
||||
if audience.text.strip() == myself:
|
||||
return True
|
||||
else:
|
||||
#print "Not for me: %s != %s" % (audience.text.strip(), myself)
|
||||
pass
|
||||
|
||||
return False
|
||||
|
||||
def authn_response(conf, return_addr, outstanding_queries=None,
|
||||
log=None, timeslack=0, debug=0, asynchop=True,
|
||||
allow_unsolicited=False):
|
||||
sec = security_context(conf)
|
||||
if not timeslack:
|
||||
try:
|
||||
timeslack = int(conf.accepted_time_diff)
|
||||
except TypeError:
|
||||
timeslack = 0
|
||||
|
||||
return AuthnResponse(sec, conf.attribute_converters, conf.entityid,
|
||||
return_addr, outstanding_queries, log, timeslack,
|
||||
debug, asynchop=asynchop,
|
||||
allow_unsolicited=allow_unsolicited)
|
||||
|
||||
# comes in over SOAP so synchronous
|
||||
def attribute_response(conf, return_addr, log=None, timeslack=0, debug=0,
|
||||
asynchop=False, test=False):
|
||||
sec = security_context(conf)
|
||||
if not timeslack:
|
||||
try:
|
||||
timeslack = int(conf.accepted_time_diff)
|
||||
except TypeError:
|
||||
timeslack = 0
|
||||
|
||||
return AttributeResponse(sec, conf.attribute_converters, conf.entityid,
|
||||
return_addr, log, timeslack, debug,
|
||||
asynchop=asynchop, test=test)
|
||||
|
||||
class StatusResponse(object):
|
||||
def __init__(self, sec_context, return_addr=None, log=None, timeslack=0,
|
||||
debug=0, request_id=0):
|
||||
self.sec = sec_context
|
||||
self.return_addr = return_addr
|
||||
|
||||
self.timeslack = timeslack
|
||||
self.request_id = request_id
|
||||
self.log = log
|
||||
self.debug = debug
|
||||
if self.debug and not self.log:
|
||||
self.debug = 0
|
||||
|
||||
self.xmlstr = ""
|
||||
self.name_id = ""
|
||||
self.response = None
|
||||
self.not_on_or_after = 0
|
||||
self.in_response_to = None
|
||||
self.signature_check = self.sec.correctly_signed_response
|
||||
self.not_signed = False
|
||||
|
||||
def _clear(self):
|
||||
self.xmlstr = ""
|
||||
self.name_id = ""
|
||||
self.response = None
|
||||
self.not_on_or_after = 0
|
||||
|
||||
def _postamble(self):
|
||||
if not self.response:
|
||||
if self.log:
|
||||
self.log.error("Response was not correctly signed")
|
||||
if self.xmlstr:
|
||||
self.log.info(self.xmlstr)
|
||||
raise IncorrectlySigned()
|
||||
|
||||
if self.debug:
|
||||
self.log.info("response: %s" % (self.response,))
|
||||
|
||||
try:
|
||||
valid_instance(self.response)
|
||||
except NotValid, exc:
|
||||
if self.log:
|
||||
self.log.error("Not valid response: %s" % exc.args[0])
|
||||
else:
|
||||
print >> sys.stderr, "Not valid response: %s" % exc.args[0]
|
||||
|
||||
self._clear()
|
||||
return self
|
||||
|
||||
self.in_response_to = self.response.in_response_to
|
||||
return self
|
||||
|
||||
def load_instance(self, instance):
|
||||
if signed(instance):
|
||||
# This will check signature on Assertion which is the default
|
||||
try:
|
||||
self.response = self.sec.check_signature(instance)
|
||||
except SignatureError: # The response as a whole might be signed or not
|
||||
self.response = self.sec.check_signature(instance,
|
||||
samlp.NAMESPACE+":Response")
|
||||
else:
|
||||
self.not_signed = True
|
||||
self.response = instance
|
||||
|
||||
return self._postamble()
|
||||
|
||||
def _loads(self, xmldata, decode=True, origxml=None):
|
||||
if decode:
|
||||
decoded_xml = base64.b64decode(xmldata)
|
||||
else:
|
||||
decoded_xml = xmldata
|
||||
|
||||
# own copy
|
||||
self.xmlstr = decoded_xml[:]
|
||||
if self.debug:
|
||||
self.log.info("xmlstr: %s" % (self.xmlstr,))
|
||||
# fil = open("response.xml", "w")
|
||||
# fil.write(self.xmlstr)
|
||||
# fil.close()
|
||||
|
||||
try:
|
||||
self.response = self.signature_check(decoded_xml, origdoc=origxml)
|
||||
except TypeError:
|
||||
raise
|
||||
except SignatureError:
|
||||
raise
|
||||
except Exception, excp:
|
||||
if self.log:
|
||||
self.log.info("EXCEPTION: %s", excp)
|
||||
|
||||
#print "<", self.response
|
||||
|
||||
return self._postamble()
|
||||
|
||||
def status_ok(self):
|
||||
if self.response.status:
|
||||
status = self.response.status
|
||||
if self.log:
|
||||
self.log.info("status: %s" % (status,))
|
||||
if status.status_code.value != samlp.STATUS_SUCCESS:
|
||||
if self.log:
|
||||
self.log.info("Not successful operation: %s" % status)
|
||||
raise Exception(
|
||||
"Not successful according to: %s" % \
|
||||
status.status_code.value)
|
||||
return True
|
||||
|
||||
def issue_instant_ok(self):
|
||||
""" Check that the response was issued at a reasonable time """
|
||||
upper = time_util.shift_time(time_util.time_in_a_while(days=1),
|
||||
self.timeslack).timetuple()
|
||||
lower = time_util.shift_time(time_util.time_a_while_ago(days=1),
|
||||
-self.timeslack).timetuple()
|
||||
# print "issue_instant: %s" % self.response.issue_instant
|
||||
# print "%s < x < %s" % (lower, upper)
|
||||
issued_at = str_to_time(self.response.issue_instant)
|
||||
return issued_at > lower and issued_at < upper
|
||||
|
||||
def _verify(self):
|
||||
if self.request_id and self.in_response_to and \
|
||||
self.in_response_to != self.request_id:
|
||||
if self.log:
|
||||
self.log.error("Not the id I expected: %s != %s" % (
|
||||
self.in_response_to,
|
||||
self.request_id))
|
||||
return None
|
||||
|
||||
assert self.response.version == "2.0"
|
||||
if self.response.destination and \
|
||||
self.response.destination != self.return_addr:
|
||||
if self.log:
|
||||
self.log.error("%s != %s" % (self.response.destination,
|
||||
self.return_addr))
|
||||
return None
|
||||
|
||||
assert self.issue_instant_ok()
|
||||
assert self.status_ok()
|
||||
return self
|
||||
|
||||
def loads(self, xmldata, decode=True, origxml=None):
|
||||
return self._loads(xmldata, decode, origxml)
|
||||
|
||||
def verify(self):
|
||||
try:
|
||||
return self._verify()
|
||||
except AssertionError, exc:
|
||||
self.log.error("Assertion error: %s" % exc)
|
||||
return None
|
||||
|
||||
def update(self, mold):
|
||||
self.xmlstr = mold.xmlstr
|
||||
self.in_response_to = mold.in_response_to
|
||||
self.response = mold.response
|
||||
|
||||
def issuer(self):
|
||||
return self.response.issuer.text.strip()
|
||||
|
||||
class LogoutResponse(StatusResponse):
|
||||
def __init__(self, sec_context, return_addr=None, log=None, timeslack=0,
|
||||
debug=0):
|
||||
StatusResponse.__init__(self, sec_context, return_addr, log, timeslack,
|
||||
debug)
|
||||
self.signature_check = self.sec.correctly_signed_logout_response
|
||||
|
||||
#class AttributeResponse(StatusResponse):
|
||||
# def __init__(self, sec_context, attribute_converters, entity_id,
|
||||
# return_addr=None, log=None, timeslack=0, debug=0):
|
||||
# StatusResponse.__init__(self, sec_context, return_addr, log, timeslack,
|
||||
# debug)
|
||||
# self.entity_id = entity_id
|
||||
# self.attribute_converters = attribute_converters
|
||||
# self.assertion = None
|
||||
#
|
||||
# def get_identity(self):
|
||||
# # The assertion can contain zero or one attributeStatements
|
||||
# if not self.assertion.attribute_statement:
|
||||
# self.log.error("Missing Attribute Statement")
|
||||
# ava = {}
|
||||
# else:
|
||||
# assert len(self.assertion.attribute_statement) == 1
|
||||
#
|
||||
# if self.debug:
|
||||
# self.log.info("Attribute Statement: %s" % (
|
||||
# self.assertion.attribute_statement[0],))
|
||||
# for aconv in self.attribute_converters:
|
||||
# self.log.info(
|
||||
# "Converts name format: %s" % (aconv.name_format,))
|
||||
#
|
||||
# ava = to_local(self.attribute_converters,
|
||||
# self.assertion.attribute_statement[0])
|
||||
# return ava
|
||||
#
|
||||
# def session_info(self):
|
||||
# """ Returns a predefined set of information gleened from the
|
||||
# response.
|
||||
# :returns: Dictionary with information
|
||||
# """
|
||||
# if self.session_not_on_or_after > 0:
|
||||
# nooa = self.session_not_on_or_after
|
||||
# else:
|
||||
# nooa = self.not_on_or_after
|
||||
#
|
||||
# return { "ava": self.ava, "name_id": self.name_id,
|
||||
# "came_from": self.came_from, "issuer": self.issuer(),
|
||||
# "not_on_or_after": nooa,
|
||||
# "authn_info": self.authn_info() }
|
||||
|
||||
class AuthnResponse(StatusResponse):
|
||||
""" This is where all the profile compliance is checked.
|
||||
This one does saml2int compliance. """
|
||||
|
||||
def __init__(self, sec_context, attribute_converters, entity_id,
|
||||
return_addr=None, outstanding_queries=None, log=None,
|
||||
timeslack=0, debug=0, asynchop=True,
|
||||
allow_unsolicited=False, test=False):
|
||||
|
||||
StatusResponse.__init__(self, sec_context, return_addr, log,
|
||||
timeslack, debug)
|
||||
self.entity_id = entity_id
|
||||
self.attribute_converters = attribute_converters
|
||||
if outstanding_queries:
|
||||
self.outstanding_queries = outstanding_queries
|
||||
else:
|
||||
self.outstanding_queries = {}
|
||||
self.context = "AuthnReq"
|
||||
self.came_from = ""
|
||||
self.ava = None
|
||||
self.assertion = None
|
||||
self.session_not_on_or_after = 0
|
||||
self.asynchop = asynchop
|
||||
self.allow_unsolicited = allow_unsolicited
|
||||
self.test = test
|
||||
|
||||
def loads(self, xmldata, decode=True, origxml=None):
|
||||
self._loads(xmldata, decode, origxml)
|
||||
|
||||
if self.asynchop:
|
||||
if self.in_response_to in self.outstanding_queries:
|
||||
self.came_from = self.outstanding_queries[self.in_response_to]
|
||||
del self.outstanding_queries[self.in_response_to]
|
||||
elif self.allow_unsolicited:
|
||||
pass
|
||||
else:
|
||||
if self.log:
|
||||
self.log("Unsolicited response")
|
||||
raise Exception("Unsolicited response")
|
||||
|
||||
return self
|
||||
|
||||
def clear(self):
|
||||
self._clear()
|
||||
self.came_from = ""
|
||||
self.ava = None
|
||||
self.assertion = None
|
||||
|
||||
def authn_statement_ok(self, optional=False):
|
||||
try:
|
||||
# the assertion MUST contain one AuthNStatement
|
||||
assert len(self.assertion.authn_statement) == 1
|
||||
except AssertionError:
|
||||
if optional:
|
||||
return True
|
||||
else:
|
||||
raise
|
||||
|
||||
authn_statement = self.assertion.authn_statement[0]
|
||||
if authn_statement.session_not_on_or_after:
|
||||
if validate_on_or_after(authn_statement.session_not_on_or_after,
|
||||
self.timeslack):
|
||||
self.session_not_on_or_after = calendar.timegm(
|
||||
time_util.str_to_time(authn_statement.session_not_on_or_after))
|
||||
else:
|
||||
return False
|
||||
return True
|
||||
# check authn_statement.session_index
|
||||
|
||||
def condition_ok(self, lax=False):
|
||||
# The Identity Provider MUST include a <saml:Conditions> element
|
||||
#print "Conditions",assertion.conditions
|
||||
if self.test:
|
||||
lax = True
|
||||
assert self.assertion.conditions
|
||||
condition = self.assertion.conditions
|
||||
if self.debug and self.log:
|
||||
self.log.info("condition: %s" % condition)
|
||||
|
||||
try:
|
||||
self.not_on_or_after = validate_on_or_after(
|
||||
condition.not_on_or_after,
|
||||
self.timeslack)
|
||||
validate_before(condition.not_before, self.timeslack)
|
||||
except Exception, excp:
|
||||
if self.log:
|
||||
self.log.error("Exception on condition: %s" % (excp,))
|
||||
if not lax:
|
||||
raise
|
||||
else:
|
||||
self.not_on_or_after = 0
|
||||
|
||||
if not for_me(condition, self.entity_id):
|
||||
if not lax:
|
||||
#print condition
|
||||
#print self.entity_id
|
||||
raise Exception("Not for me!!!")
|
||||
|
||||
return True
|
||||
|
||||
def decrypt_attributes(self, attribute_statement):
|
||||
"""
|
||||
Decrypts possible encrypted attributes and adds the decrypts to the
|
||||
list of attributes.
|
||||
|
||||
:param attribute_statement: A SAML.AttributeStatement which might
|
||||
contain both encrypted attributes and attributes.
|
||||
"""
|
||||
# _node_name = [
|
||||
# "urn:oasis:names:tc:SAML:2.0:assertion:EncryptedData",
|
||||
# "urn:oasis:names:tc:SAML:2.0:assertion:EncryptedAttribute"]
|
||||
|
||||
for encattr in attribute_statement.encrypted_attribute:
|
||||
if not encattr.encrypted_key:
|
||||
_decr = self.sec.decrypt(encattr.encrypted_data)
|
||||
_attr = attribute_from_string(_decr)
|
||||
attribute_statement.attribute.append(_attr)
|
||||
else:
|
||||
_decr = self.sec.decrypt(encattr)
|
||||
enc_attr = encrypted_attribute_from_string(_decr)
|
||||
attrlist = enc_attr.extensions_as_elements("Attribute", saml)
|
||||
attribute_statement.attribute.extend(attrlist)
|
||||
|
||||
def get_identity(self):
|
||||
""" The assertion can contain zero or one attributeStatements
|
||||
|
||||
"""
|
||||
if not self.assertion.attribute_statement:
|
||||
if self.log:
|
||||
self.log.error("Missing Attribute Statement")
|
||||
ava = {}
|
||||
else:
|
||||
assert len(self.assertion.attribute_statement) == 1
|
||||
_attr_statem = self.assertion.attribute_statement[0]
|
||||
|
||||
if self.debug and self.log:
|
||||
self.log.info("Attribute Statement: %s" % (_attr_statem,))
|
||||
for aconv in self.attribute_converters:
|
||||
self.log.info(
|
||||
"Converts name format: %s" % (aconv.name_format,))
|
||||
|
||||
self.decrypt_attributes(_attr_statem)
|
||||
ava = to_local(self.attribute_converters, _attr_statem)
|
||||
return ava
|
||||
|
||||
def get_subject(self):
|
||||
""" The assertion must contain a Subject
|
||||
"""
|
||||
assert self.assertion.subject
|
||||
subject = self.assertion.subject
|
||||
subjconf = []
|
||||
for subject_confirmation in subject.subject_confirmation:
|
||||
data = subject_confirmation.subject_confirmation_data
|
||||
if not data:
|
||||
# I don't know where this belongs so I ignore it
|
||||
continue
|
||||
|
||||
if data.address:
|
||||
if not valid_address(data.address):
|
||||
# ignore this subject_confirmation
|
||||
continue
|
||||
|
||||
# These two will raise an exception if untrue
|
||||
validate_on_or_after(data.not_on_or_after, self.timeslack)
|
||||
validate_before(data.not_before, self.timeslack)
|
||||
|
||||
# not_before must be < not_on_or_after
|
||||
if not time_util.later_than(data.not_on_or_after, data.not_before):
|
||||
continue
|
||||
|
||||
if self.asynchop and not self.came_from:
|
||||
if data.in_response_to in self.outstanding_queries:
|
||||
self.came_from = self.outstanding_queries[
|
||||
data.in_response_to]
|
||||
del self.outstanding_queries[data.in_response_to]
|
||||
elif self.allow_unsolicited:
|
||||
pass
|
||||
else:
|
||||
# This is where I don't allow unsolicited responses
|
||||
# Either in_response_to == None or has a value I don't
|
||||
# recognize
|
||||
if self.debug and self.log:
|
||||
self.log.info(
|
||||
"in response to: '%s'" % data.in_response_to)
|
||||
self.log.info("outstanding queries: %s" % \
|
||||
self.outstanding_queries.keys())
|
||||
raise Exception(
|
||||
"Combination of session id and requestURI I don't recall")
|
||||
|
||||
subjconf.append(subject_confirmation)
|
||||
|
||||
if not subjconf:
|
||||
raise Exception("No valid subject confirmation")
|
||||
|
||||
subject.subject_confirmation = subjconf
|
||||
|
||||
# The subject must contain a name_id
|
||||
assert subject.name_id
|
||||
self.name_id = subject.name_id.text.strip()
|
||||
return self.name_id
|
||||
|
||||
def _assertion(self, assertion):
|
||||
self.assertion = assertion
|
||||
|
||||
if self.debug and self.log:
|
||||
self.log.info("assertion context: %s" % (self.context,))
|
||||
self.log.info("assertion keys: %s" % (assertion.keyswv()))
|
||||
self.log.info("outstanding_queries: %s" % (
|
||||
self.outstanding_queries,))
|
||||
|
||||
#if self.context == "AuthnReq" or self.context == "AttrQuery":
|
||||
if self.context == "AuthnReq":
|
||||
self.authn_statement_ok()
|
||||
# elif self.context == "AttrQuery":
|
||||
# self.authn_statement_ok(True)
|
||||
|
||||
if not self.condition_ok():
|
||||
return None
|
||||
|
||||
if self.debug and self.log:
|
||||
self.log.info("--- Getting Identity ---")
|
||||
|
||||
if self.context == "AuthnReq" or self.context == "AttrQuery":
|
||||
self.ava = self.get_identity()
|
||||
|
||||
if self.debug and self.log:
|
||||
self.log.info("--- AVA: %s" % (self.ava,))
|
||||
|
||||
try:
|
||||
self.get_subject()
|
||||
if self.asynchop:
|
||||
if self.allow_unsolicited:
|
||||
pass
|
||||
elif not self.came_from:
|
||||
return False
|
||||
return True
|
||||
except Exception, exc:
|
||||
self.log.error("Exception: %s" % exc)
|
||||
return False
|
||||
|
||||
def _encrypted_assertion(self, xmlstr):
|
||||
if xmlstr.encrypted_data:
|
||||
assertion_str = self.sec.decrypt(xmlstr.encrypted_data)
|
||||
assertion = saml.assertion_from_string(assertion_str)
|
||||
else:
|
||||
decrypt_xml = self.sec.decrypt(xmlstr)
|
||||
|
||||
if self.debug and self.log:
|
||||
self.log.info("Decryption successfull")
|
||||
|
||||
self.response = samlp.response_from_string(decrypt_xml)
|
||||
if self.debug and self.log:
|
||||
self.log.info("Parsed decrypted assertion successfull")
|
||||
|
||||
enc = self.response.encrypted_assertion[0].extension_elements[0]
|
||||
assertion = extension_element_to_element(enc,
|
||||
saml.ELEMENT_FROM_STRING,
|
||||
namespace=saml.NAMESPACE)
|
||||
|
||||
if self.debug and self.log:
|
||||
self.log.info("Decrypted Assertion: %s" % assertion)
|
||||
return self._assertion(assertion)
|
||||
|
||||
def parse_assertion(self):
|
||||
try:
|
||||
assert len(self.response.assertion) == 1 or \
|
||||
len(self.response.encrypted_assertion) == 1
|
||||
except AssertionError:
|
||||
raise Exception("No assertion part")
|
||||
|
||||
if self.response.assertion:
|
||||
if self.debug and self.log:
|
||||
self.log.info("***Unencrypted response***")
|
||||
return self._assertion(self.response.assertion[0])
|
||||
else:
|
||||
if self.debug and self.log:
|
||||
self.log.info("***Encrypted response***")
|
||||
return self._encrypted_assertion(
|
||||
self.response.encrypted_assertion[0])
|
||||
|
||||
|
||||
def verify(self):
|
||||
""" Verify that the assertion is syntactically correct and
|
||||
the signature is correct if present."""
|
||||
|
||||
try:
|
||||
self._verify()
|
||||
except AssertionError:
|
||||
return None
|
||||
|
||||
if self.parse_assertion():
|
||||
return self
|
||||
else:
|
||||
self.log.error("Could not parse the assertion")
|
||||
return None
|
||||
|
||||
def session_id(self):
|
||||
""" Returns the SessionID of the response """
|
||||
return self.response.in_response_to
|
||||
|
||||
def id(self):
|
||||
""" Return the ID of the response """
|
||||
return self.response.id
|
||||
|
||||
def authn_info(self):
|
||||
res = []
|
||||
for astat in self.assertion.authn_statement:
|
||||
context = astat.authn_context
|
||||
if context:
|
||||
aclass = context.authn_context_class_ref.text
|
||||
try:
|
||||
authn_auth = [
|
||||
a.text for a in context.authenticating_authority]
|
||||
except AttributeError:
|
||||
authn_auth = []
|
||||
res.append((aclass, authn_auth))
|
||||
return res
|
||||
|
||||
def authz_decision_info(self):
|
||||
res = {"permit":[], "deny": [], "indeterminate":[] }
|
||||
for adstat in self.assertion.authz_decision_statement:
|
||||
# one of 'Permit', 'Deny', 'Indeterminate'
|
||||
res[adstat.decision.text.lower()] = adstat
|
||||
return res
|
||||
|
||||
def session_info(self):
|
||||
""" Returns a predefined set of information gleened from the
|
||||
response.
|
||||
:returns: Dictionary with information
|
||||
"""
|
||||
if self.session_not_on_or_after > 0:
|
||||
nooa = self.session_not_on_or_after
|
||||
else:
|
||||
nooa = self.not_on_or_after
|
||||
|
||||
if self.context == "AuthzQuery":
|
||||
return {"name_id": self.name_id,
|
||||
"came_from": self.came_from, "issuer": self.issuer(),
|
||||
"not_on_or_after": nooa,
|
||||
"authz_decision_info": self.authz_decision_info() }
|
||||
else:
|
||||
return { "ava": self.ava, "name_id": self.name_id,
|
||||
"came_from": self.came_from, "issuer": self.issuer(),
|
||||
"not_on_or_after": nooa,
|
||||
"authn_info": self.authn_info() }
|
||||
|
||||
def __str__(self):
|
||||
return "%s" % self.xmlstr
|
||||
|
||||
class AttributeResponse(AuthnResponse):
|
||||
def __init__(self, sec_context, attribute_converters, entity_id,
|
||||
return_addr=None, log=None, timeslack=0, debug=0,
|
||||
asynchop=False, test=False):
|
||||
|
||||
AuthnResponse.__init__(self, sec_context, attribute_converters,
|
||||
entity_id, return_addr, log=log,
|
||||
timeslack=timeslack, debug=debug,
|
||||
asynchop=asynchop, test=test)
|
||||
self.entity_id = entity_id
|
||||
self.attribute_converters = attribute_converters
|
||||
self.assertion = None
|
||||
self.context = "AttrQuery"
|
||||
|
||||
class AuthzResponse(AuthnResponse):
|
||||
""" A successful response will be in the form of assertions containing
|
||||
authorization decision statements."""
|
||||
def __init__(self, sec_context, attribute_converters, entity_id,
|
||||
return_addr=None, log=None, timeslack=0, debug=0,
|
||||
asynchop=False):
|
||||
AuthnResponse.__init__(self, sec_context, attribute_converters,
|
||||
entity_id, return_addr, log=log,
|
||||
timeslack=timeslack, debug=debug,
|
||||
asynchop=asynchop)
|
||||
self.entity_id = entity_id
|
||||
self.attribute_converters = attribute_converters
|
||||
self.assertion = None
|
||||
self.context = "AuthzQuery"
|
||||
|
||||
def response_factory(xmlstr, conf, return_addr=None,
|
||||
outstanding_queries=None, log=None,
|
||||
timeslack=0, debug=0, decode=True, request_id=0,
|
||||
origxml=None, asynchop=True, allow_unsolicited=False):
|
||||
sec_context = security_context(conf)
|
||||
if not timeslack:
|
||||
try:
|
||||
timeslack = int(conf.accepted_time_diff)
|
||||
except TypeError:
|
||||
timeslack = 0
|
||||
|
||||
attribute_converters = conf.attribute_converters
|
||||
entity_id = conf.entityid
|
||||
|
||||
response = StatusResponse(sec_context, return_addr, log, timeslack,
|
||||
debug, request_id)
|
||||
try:
|
||||
response.loads(xmlstr, decode, origxml)
|
||||
if response.response.assertion or response.response.encrypted_assertion:
|
||||
authnresp = AuthnResponse(sec_context, attribute_converters,
|
||||
entity_id, return_addr, outstanding_queries, log,
|
||||
timeslack, debug, asynchop, allow_unsolicited)
|
||||
authnresp.update(response)
|
||||
return authnresp
|
||||
except TypeError:
|
||||
response.signature_check = sec_context.correctly_signed_logout_response
|
||||
response.loads(xmlstr, decode, origxml)
|
||||
logoutresp = LogoutResponse(sec_context, return_addr, log,
|
||||
timeslack, debug)
|
||||
logoutresp.update(response)
|
||||
return logoutresp
|
||||
|
||||
return response
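# Illustrative note (added explanation, not part of the original module):
# response_factory() first tries to parse the message as a regular signed
# <Response>; if it carries an assertion (plain or encrypted) it is wrapped in
# an AuthnResponse, if parsing raises TypeError it is re-parsed as a
# LogoutResponse, and anything else is returned as the bare StatusResponse.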
|
||||
306
src/saml2/s_utils.py
Normal file
306
src/saml2/s_utils.py
Normal file
@@ -0,0 +1,306 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
import time
|
||||
import base64
|
||||
import sys
|
||||
import hmac
|
||||
|
||||
# from python 2.5
|
||||
if sys.version_info >= (2,5):
|
||||
import hashlib
|
||||
else: # before python 2.5
|
||||
import sha
|
||||
|
||||
from saml2 import saml
|
||||
from saml2 import samlp
|
||||
from saml2 import VERSION
|
||||
from saml2.time_util import instant
|
||||
|
||||
try:
|
||||
from hashlib import md5
|
||||
except ImportError:
|
||||
from md5 import md5
|
||||
import zlib
|
||||
|
||||
class VersionMismatch(Exception):
|
||||
pass
|
||||
|
||||
class UnknownPrincipal(Exception):
|
||||
pass
|
||||
|
||||
class UnsupportedBinding(Exception):
|
||||
pass
|
||||
|
||||
class OtherError(Exception):
|
||||
pass
|
||||
|
||||
class MissingValue(Exception):
|
||||
pass
|
||||
|
||||
|
||||
EXCEPTION2STATUS = {
|
||||
VersionMismatch: samlp.STATUS_VERSION_MISMATCH,
|
||||
UnknownPrincipal: samlp.STATUS_UNKNOWN_PRINCIPAL,
|
||||
UnsupportedBinding: samlp.STATUS_UNSUPPORTED_BINDING,
|
||||
OtherError: samlp.STATUS_UNKNOWN_PRINCIPAL,
|
||||
MissingValue: samlp.STATUS_REQUEST_UNSUPPORTED,
|
||||
# Undefined
|
||||
Exception: samlp.STATUS_AUTHN_FAILED,
|
||||
}
|
||||
|
||||
GENERIC_DOMAINS = "aero", "asia", "biz", "cat", "com", "coop", \
|
||||
"edu", "gov", "info", "int", "jobs", "mil", "mobi", "museum", \
|
||||
"name", "net", "org", "pro", "tel", "travel"
|
||||
|
||||
def valid_email(emailaddress, domains = GENERIC_DOMAINS):
|
||||
"""Checks for a syntactically valid email address."""
|
||||
|
||||
# Email address must be at least 6 characters in total.
|
||||
# Assuming no one has addresses of the type a@com
|
||||
if len(emailaddress) < 6:
|
||||
return False # Address too short.
|
||||
|
||||
# Split up email address into parts.
|
||||
try:
|
||||
localpart, domainname = emailaddress.rsplit('@', 1)
|
||||
host, toplevel = domainname.rsplit('.', 1)
|
||||
except ValueError:
|
||||
return False # Address does not have enough parts.
|
||||
|
||||
# Check for Country code or Generic Domain.
|
||||
if len(toplevel) != 2 and toplevel not in domains:
|
||||
return False # Not a domain name.
|
||||
|
||||
for i in '-_.%+.':
|
||||
localpart = localpart.replace(i, "")
|
||||
for i in '-_.':
|
||||
host = host.replace(i, "")
|
||||
|
||||
if localpart.isalnum() and host.isalnum():
|
||||
return True # Email address is fine.
|
||||
else:
|
||||
return False # Email address has funny characters.
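# Illustrative sketch (added example, not part of the original module):
#   valid_email("user@example.com")  # -> True
#   valid_email("a@com")             # -> False, address is too short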
|
||||
|
||||
def decode_base64_and_inflate( string ):
|
||||
""" base64 decodes and then inflates according to RFC1951
|
||||
|
||||
:param string: a deflated and encoded string
|
||||
:return: the string after decoding and inflating
|
||||
"""
|
||||
|
||||
return zlib.decompress( base64.b64decode( string ) , -15)
|
||||
|
||||
def deflate_and_base64_encode( string_val ):
|
||||
"""
|
||||
Deflates and then base64 encodes a string
|
||||
|
||||
:param string_val: The string to deflate and encode
|
||||
:return: The deflated and encoded string
|
||||
"""
|
||||
return base64.b64encode( zlib.compress( string_val )[2:-4] )
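# Illustrative sketch (added example, not part of the original module): the two
# helpers above are inverses of each other, so a deflate/base64 round trip
# should return the original string unchanged.
def _example_deflate_roundtrip():
    original = "<samlp:AuthnRequest/>"
    encoded = deflate_and_base64_encode(original)
    assert decode_base64_and_inflate(encoded) == original
    return encoded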
|
||||
|
||||
def sid(seed=""):
|
||||
"""The hash of the server time + seed makes an unique SID for each session.
|
||||
|
||||
:param seed: A seed string
|
||||
:return: The hex version of the digest, prefixed by 'id-' to make it
|
||||
compliant with the NCName specification
|
||||
"""
|
||||
ident = md5()
|
||||
ident.update(repr(time.time()))
|
||||
if seed:
|
||||
ident.update(seed)
|
||||
return "id-"+ident.hexdigest()
|
||||
|
||||
def parse_attribute_map(filenames):
|
||||
"""
|
||||
Expects files where each line is composed of the OID of the attribute,
|
||||
exactly one space, a user-friendly name of the attribute and then
|
||||
the type specification of the name.
|
||||
|
||||
:param filenames: List of filenames of map files.
|
||||
:return: A 2-tuple of dictionaries: the first maps (name, name_format) to the
|
||||
friendly name, the second maps the friendly name back to (name, name_format).
|
||||
"""
|
||||
forward = {}
|
||||
backward = {}
|
||||
for filename in filenames:
|
||||
for line in open(filename).readlines():
|
||||
(name, friendly_name, name_format) = line.strip().split()
|
||||
forward[(name, name_format)] = friendly_name
|
||||
backward[friendly_name] = (name, name_format)
|
||||
|
||||
return forward, backward
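# Illustrative sketch (added example, not part of the original module): each
# line of a map file reads "<name> <friendly_name> <name_format>", e.g. a
# hypothetical file "attribute.map" containing:
#
#   urn:oid:2.5.4.4 sn urn:oasis:names:tc:SAML:2.0:attrname-format:uri
#
# forward, backward = parse_attribute_map(["attribute.map"])
# forward[("urn:oid:2.5.4.4",
#          "urn:oasis:names:tc:SAML:2.0:attrname-format:uri")]   # -> "sn"
# backward["sn"]   # -> ("urn:oid:2.5.4.4", "...attrname-format:uri")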
|
||||
|
||||
def identity_attribute(form, attribute, forward_map=None):
|
||||
if form == "friendly":
|
||||
if attribute.friendly_name:
|
||||
return attribute.friendly_name
|
||||
elif forward_map:
|
||||
try:
|
||||
return forward_map[(attribute.name, attribute.name_format)]
|
||||
except KeyError:
|
||||
return attribute.name
|
||||
# default is name
|
||||
return attribute.name
|
||||
|
||||
#----------------------------------------------------------------------------
|
||||
|
||||
def error_status_factory(info):
|
||||
if isinstance(info, Exception):
|
||||
try:
|
||||
exc_val = EXCEPTION2STATUS[info.__class__]
|
||||
except KeyError:
|
||||
exc_val = samlp.STATUS_AUTHN_FAILED
|
||||
msg = info.args[0]
|
||||
status = samlp.Status(
|
||||
status_message=samlp.StatusMessage(text=msg),
|
||||
status_code=samlp.StatusCode(
|
||||
value=samlp.STATUS_RESPONDER,
|
||||
status_code=samlp.StatusCode(
|
||||
value=exc_val)
|
||||
),
|
||||
)
|
||||
else:
|
||||
(errcode, text) = info
|
||||
status = samlp.Status(
|
||||
status_message=samlp.StatusMessage(text=text),
|
||||
status_code=samlp.StatusCode(
|
||||
value=samlp.STATUS_RESPONDER,
|
||||
status_code=samlp.StatusCode(value=errcode)
|
||||
),
|
||||
)
|
||||
|
||||
return status
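# Illustrative sketch (added example, not part of the original module): the
# argument is either an exception mapped through EXCEPTION2STATUS or an
# (error_code, text) tuple; the second-level StatusCode ends up nested under
# samlp.STATUS_RESPONDER.
def _example_error_status():
    return error_status_factory(MissingValue("required attribute missing"))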
|
||||
|
||||
def success_status_factory():
|
||||
return samlp.Status(status_code=samlp.StatusCode(
|
||||
value=samlp.STATUS_SUCCESS))
|
||||
|
||||
def status_message_factory(message, code, fro=samlp.STATUS_RESPONDER):
|
||||
return samlp.Status(
|
||||
status_message=samlp.StatusMessage(text=message),
|
||||
status_code=samlp.StatusCode(
|
||||
value=fro,
|
||||
status_code=samlp.StatusCode(value=code)))
|
||||
|
||||
def assertion_factory(**kwargs):
|
||||
assertion = saml.Assertion(version=VERSION, id=sid(),
|
||||
issue_instant=instant())
|
||||
for key, val in kwargs.items():
|
||||
setattr(assertion, key, val)
|
||||
return assertion
|
||||
|
||||
def _attrval(val, typ=""):
|
||||
if isinstance(val, list) or isinstance(val, set):
|
||||
attrval = [saml.AttributeValue(text=v) for v in val]
|
||||
elif val is None:
|
||||
attrval = None
|
||||
else:
|
||||
attrval = [saml.AttributeValue(text=val)]
|
||||
|
||||
if typ:
|
||||
for ava in attrval:
|
||||
ava.set_type(typ)
|
||||
|
||||
return attrval
|
||||
|
||||
# --- attribute profiles -----
|
||||
|
||||
# xmlns:xs="http://www.w3.org/2001/XMLSchema"
|
||||
# xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
|
||||
|
||||
def do_ava(val, typ=""):
|
||||
if isinstance(val, basestring):
|
||||
ava = saml.AttributeValue()
|
||||
ava.set_text(val)
|
||||
attrval = [ava]
|
||||
elif isinstance(val, list):
|
||||
attrval = [do_ava(v)[0] for v in val]
|
||||
elif val or val == False:
|
||||
ava = saml.AttributeValue()
|
||||
ava.set_text(val)
|
||||
attrval = [ava]
|
||||
elif val is None:
|
||||
attrval = None
|
||||
else:
|
||||
raise OtherError("strange value type on: %s" % val)
|
||||
|
||||
if typ:
|
||||
for ava in attrval:
|
||||
ava.set_type(typ)
|
||||
|
||||
return attrval
|
||||
|
||||
def do_attribute(val, typ, key):
|
||||
attr = saml.Attribute()
|
||||
attrval = do_ava(val, typ)
|
||||
if attrval:
|
||||
attr.attribute_value = attrval
|
||||
|
||||
if isinstance(key, basestring):
|
||||
attr.name = key
|
||||
elif isinstance(key, tuple): # 3-tuple or 2-tuple
|
||||
try:
|
||||
(name, nformat, friendly) = key
|
||||
except ValueError:
|
||||
(name, nformat) = key
|
||||
friendly = ""
|
||||
if name:
|
||||
attr.name = name
|
||||
if nformat:
|
||||
attr.name_format = nformat
|
||||
if friendly:
|
||||
attr.friendly_name = friendly
|
||||
return attr
|
||||
|
||||
def do_attributes(identity):
|
||||
attrs = []
|
||||
if not identity:
|
||||
return attrs
|
||||
for key, spec in identity.items():
|
||||
try:
|
||||
val, typ = spec
|
||||
except ValueError:
|
||||
val = spec
|
||||
typ = ""
|
||||
except TypeError:
|
||||
val = ""
|
||||
typ = ""
|
||||
|
||||
attr = do_attribute(val, typ, key)
|
||||
attrs.append(attr)
|
||||
return attrs
|
||||
|
||||
def do_attribute_statement(identity):
|
||||
"""
|
||||
:param identity: A dictionary with friendly names as keys
|
||||
:return: A saml.AttributeStatement built from the identity dictionary
|
||||
"""
|
||||
return saml.AttributeStatement(attribute=do_attributes(identity))
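# Illustrative sketch (added example, not part of the original module): the
# identity dict maps a key (a plain name or a (name, name_format[, friendly])
# tuple) to either a bare value or a (value, type) pair.
def _example_attribute_statement():
    identity = {
        "surName": "Andersson",                        # plain key, plain value
        "mail": (["user@example.com"], "xs:string"),   # (value, type) pair
    }
    return do_attribute_statement(identity)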
|
||||
|
||||
def factory(klass, **kwargs):
|
||||
instance = klass()
|
||||
for key, val in kwargs.items():
|
||||
setattr(instance, key, val)
|
||||
return instance
|
||||
|
||||
def signature(secret, parts):
|
||||
"""Generates a signature.
|
||||
"""
|
||||
if sys.version_info >= (2, 5):
|
||||
csum = hmac.new(secret, digestmod=hashlib.sha1)
|
||||
else:
|
||||
csum = hmac.new(secret, digestmod=sha)
|
||||
|
||||
for part in parts:
|
||||
csum.update(part)
|
||||
|
||||
return csum.hexdigest()
|
||||
|
||||
def verify_signature(secret, parts):
|
||||
""" Checks that the signature is correct """
|
||||
if signature(secret, parts[:-1]) == parts[-1]:
|
||||
return True
|
||||
else:
|
||||
return False
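# Illustrative sketch (added example, not part of the original module): the
# signer appends the hex digest as the last element, and verify_signature()
# recomputes it over all preceding parts.
def _example_signed_parts(secret, parts):
    signed = list(parts) + [signature(secret, parts)]
    assert verify_signature(secret, signed)
    return signed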
|
||||
1585
src/saml2/saml.py
Normal file
1585
src/saml2/saml.py
Normal file
File diff suppressed because it is too large
1726
src/saml2/samlp.py
Normal file
1726
src/saml2/samlp.py
Normal file
File diff suppressed because it is too large
2
src/saml2/schema/__init__.py
Normal file
2
src/saml2/schema/__init__.py
Normal file
@@ -0,0 +1,2 @@
|
||||
__author__ = 'rolandh'
|
||||
|
||||
511
src/saml2/schema/soap.py
Normal file
511
src/saml2/schema/soap.py
Normal file
@@ -0,0 +1,511 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
#
|
||||
# Generated Fri May 27 17:23:42 2011 by parse_xsd.py version 0.4.
|
||||
#
|
||||
|
||||
import saml2
|
||||
from saml2 import SamlBase
|
||||
|
||||
from saml2.schema import wsdl
|
||||
|
||||
NAMESPACE = 'http://schemas.xmlsoap.org/wsdl/soap/'
|
||||
|
||||
class EncodingStyle_(SamlBase):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:encodingStyle element """
|
||||
|
||||
c_tag = 'encodingStyle'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def encoding_style__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(EncodingStyle_, xml_string)
|
||||
|
||||
|
||||
class TStyleChoice_(SamlBase):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:tStyleChoice element """
|
||||
|
||||
c_tag = 'tStyleChoice'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'xs:string', 'enumeration': ['rpc', 'document']}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def t_style_choice__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TStyleChoice_, xml_string)
|
||||
|
||||
|
||||
class TOperation_(wsdl.TExtensibilityElement_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:tOperation element """
|
||||
|
||||
c_tag = 'tOperation'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = wsdl.TExtensibilityElement_.c_children.copy()
|
||||
c_attributes = wsdl.TExtensibilityElement_.c_attributes.copy()
|
||||
c_child_order = wsdl.TExtensibilityElement_.c_child_order[:]
|
||||
c_cardinality = wsdl.TExtensibilityElement_.c_cardinality.copy()
|
||||
c_attributes['soapAction'] = ('soap_action', 'anyURI', False)
|
||||
c_attributes['style'] = ('style', TStyleChoice_, False)
|
||||
|
||||
def __init__(self,
|
||||
soap_action=None,
|
||||
style=None,
|
||||
required=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
wsdl.TExtensibilityElement_.__init__(self,
|
||||
required=required,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.soap_action=soap_action
|
||||
self.style=style
|
||||
|
||||
def t_operation__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TOperation_, xml_string)
|
||||
|
||||
|
||||
class UseChoice_(SamlBase):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:useChoice element """
|
||||
|
||||
c_tag = 'useChoice'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'xs:string', 'enumeration': ['literal', 'encoded']}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def use_choice__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(UseChoice_, xml_string)
|
||||
|
||||
|
||||
class TFaultRes_(SamlBase):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:tFaultRes element """
|
||||
|
||||
c_tag = 'tFaultRes'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_attributes['{http://schemas.xmlsoap.org/wsdl/}required'] = ('required', 'None', False)
|
||||
c_attributes['parts'] = ('parts', 'NMTOKENS', False)
|
||||
c_attributes['encodingStyle'] = ('encoding_style', EncodingStyle_, False)
|
||||
c_attributes['use'] = ('use', UseChoice_, False)
|
||||
c_attributes['namespace'] = ('namespace', 'anyURI', False)
|
||||
|
||||
def __init__(self,
|
||||
required=None,
|
||||
parts=None,
|
||||
encoding_style=None,
|
||||
use=None,
|
||||
namespace=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.required=required
|
||||
self.parts=parts
|
||||
self.encoding_style=encoding_style
|
||||
self.use=use
|
||||
self.namespace=namespace
|
||||
|
||||
|
||||
class TFault_(TFaultRes_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:tFault element """
|
||||
|
||||
c_tag = 'tFault'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TFaultRes_.c_children.copy()
|
||||
c_attributes = TFaultRes_.c_attributes.copy()
|
||||
c_child_order = TFaultRes_.c_child_order[:]
|
||||
c_cardinality = TFaultRes_.c_cardinality.copy()
|
||||
c_attributes['name'] = ('name', 'NCName', True)
|
||||
|
||||
def __init__(self,
|
||||
name=None,
|
||||
required=None,
|
||||
parts=None,
|
||||
encoding_style=None,
|
||||
use=None,
|
||||
namespace=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
TFaultRes_.__init__(self,
|
||||
required=required,
|
||||
parts=parts,
|
||||
encoding_style=encoding_style,
|
||||
use=use,
|
||||
namespace=namespace,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.name=name
|
||||
|
||||
def t_fault__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TFault_, xml_string)
|
||||
|
||||
|
||||
class THeaderFault_(SamlBase):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:tHeaderFault element """
|
||||
|
||||
c_tag = 'tHeaderFault'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_attributes['message'] = ('message', 'QName', True)
|
||||
c_attributes['part'] = ('part', 'NMTOKEN', True)
|
||||
c_attributes['use'] = ('use', UseChoice_, True)
|
||||
c_attributes['encodingStyle'] = ('encoding_style', EncodingStyle_, False)
|
||||
c_attributes['namespace'] = ('namespace', 'anyURI', False)
|
||||
|
||||
def __init__(self,
|
||||
message=None,
|
||||
part=None,
|
||||
use=None,
|
||||
encoding_style=None,
|
||||
namespace=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.message=message
|
||||
self.part=part
|
||||
self.use=use
|
||||
self.encoding_style=encoding_style
|
||||
self.namespace=namespace
|
||||
|
||||
def t_header_fault__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(THeaderFault_, xml_string)
|
||||
|
||||
|
||||
class TAddress_(wsdl.TExtensibilityElement_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:tAddress element """
|
||||
|
||||
c_tag = 'tAddress'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = wsdl.TExtensibilityElement_.c_children.copy()
|
||||
c_attributes = wsdl.TExtensibilityElement_.c_attributes.copy()
|
||||
c_child_order = wsdl.TExtensibilityElement_.c_child_order[:]
|
||||
c_cardinality = wsdl.TExtensibilityElement_.c_cardinality.copy()
|
||||
c_attributes['location'] = ('location', 'anyURI', True)
|
||||
|
||||
def __init__(self,
|
||||
location=None,
|
||||
required=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
wsdl.TExtensibilityElement_.__init__(self,
|
||||
required=required,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.location=location
|
||||
|
||||
def t_address__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TAddress_, xml_string)
|
||||
|
||||
|
||||
class TBinding_(wsdl.TExtensibilityElement_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:tBinding element """
|
||||
|
||||
c_tag = 'tBinding'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = wsdl.TExtensibilityElement_.c_children.copy()
|
||||
c_attributes = wsdl.TExtensibilityElement_.c_attributes.copy()
|
||||
c_child_order = wsdl.TExtensibilityElement_.c_child_order[:]
|
||||
c_cardinality = wsdl.TExtensibilityElement_.c_cardinality.copy()
|
||||
c_attributes['transport'] = ('transport', 'anyURI', True)
|
||||
c_attributes['style'] = ('style', TStyleChoice_, False)
|
||||
|
||||
def __init__(self,
|
||||
transport=None,
|
||||
style=None,
|
||||
required=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
wsdl.TExtensibilityElement_.__init__(self,
|
||||
required=required,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.transport=transport
|
||||
self.style=style
|
||||
|
||||
def t_binding__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TBinding_, xml_string)
|
||||
|
||||
|
||||
class Operation(TOperation_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:operation element """
|
||||
|
||||
c_tag = 'operation'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TOperation_.c_children.copy()
|
||||
c_attributes = TOperation_.c_attributes.copy()
|
||||
c_child_order = TOperation_.c_child_order[:]
|
||||
c_cardinality = TOperation_.c_cardinality.copy()
|
||||
|
||||
def operation_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Operation, xml_string)
|
||||
|
||||
|
||||
class TBody_(wsdl.TExtensibilityElement_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:tBody element """
|
||||
|
||||
c_tag = 'tBody'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = wsdl.TExtensibilityElement_.c_children.copy()
|
||||
c_attributes = wsdl.TExtensibilityElement_.c_attributes.copy()
|
||||
c_child_order = wsdl.TExtensibilityElement_.c_child_order[:]
|
||||
c_cardinality = wsdl.TExtensibilityElement_.c_cardinality.copy()
|
||||
c_attributes['parts'] = ('parts', 'NMTOKENS', False)
|
||||
c_attributes['encodingStyle'] = ('encoding_style', EncodingStyle_, False)
|
||||
c_attributes['use'] = ('use', UseChoice_, False)
|
||||
c_attributes['namespace'] = ('namespace', 'anyURI', False)
|
||||
|
||||
def __init__(self,
|
||||
parts=None,
|
||||
encoding_style=None,
|
||||
use=None,
|
||||
namespace=None,
|
||||
required=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
wsdl.TExtensibilityElement_.__init__(self,
|
||||
required=required,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.parts=parts
|
||||
self.encoding_style=encoding_style
|
||||
self.use=use
|
||||
self.namespace=namespace
|
||||
|
||||
def t_body__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TBody_, xml_string)
|
||||
|
||||
|
||||
class Fault(TFault_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:fault element """
|
||||
|
||||
c_tag = 'fault'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TFault_.c_children.copy()
|
||||
c_attributes = TFault_.c_attributes.copy()
|
||||
c_child_order = TFault_.c_child_order[:]
|
||||
c_cardinality = TFault_.c_cardinality.copy()
|
||||
|
||||
def fault_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Fault, xml_string)
|
||||
|
||||
|
||||
class Headerfault(THeaderFault_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:headerfault element """
|
||||
|
||||
c_tag = 'headerfault'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = THeaderFault_.c_children.copy()
|
||||
c_attributes = THeaderFault_.c_attributes.copy()
|
||||
c_child_order = THeaderFault_.c_child_order[:]
|
||||
c_cardinality = THeaderFault_.c_cardinality.copy()
|
||||
|
||||
def headerfault_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Headerfault, xml_string)
|
||||
|
||||
|
||||
class Address(TAddress_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:address element """
|
||||
|
||||
c_tag = 'address'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TAddress_.c_children.copy()
|
||||
c_attributes = TAddress_.c_attributes.copy()
|
||||
c_child_order = TAddress_.c_child_order[:]
|
||||
c_cardinality = TAddress_.c_cardinality.copy()
|
||||
|
||||
def address_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Address, xml_string)
|
||||
|
||||
|
||||
class Binding(TBinding_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:binding element """
|
||||
|
||||
c_tag = 'binding'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TBinding_.c_children.copy()
|
||||
c_attributes = TBinding_.c_attributes.copy()
|
||||
c_child_order = TBinding_.c_child_order[:]
|
||||
c_cardinality = TBinding_.c_cardinality.copy()
|
||||
|
||||
def binding_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Binding, xml_string)
|
||||
|
||||
|
||||
class Body(TBody_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:body element """
|
||||
|
||||
c_tag = 'body'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TBody_.c_children.copy()
|
||||
c_attributes = TBody_.c_attributes.copy()
|
||||
c_child_order = TBody_.c_child_order[:]
|
||||
c_cardinality = TBody_.c_cardinality.copy()
|
||||
|
||||
def body_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Body, xml_string)
|
||||
|
||||
|
||||
class THeader_(wsdl.TExtensibilityElement_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:tHeader element """
|
||||
|
||||
c_tag = 'tHeader'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = wsdl.TExtensibilityElement_.c_children.copy()
|
||||
c_attributes = wsdl.TExtensibilityElement_.c_attributes.copy()
|
||||
c_child_order = wsdl.TExtensibilityElement_.c_child_order[:]
|
||||
c_cardinality = wsdl.TExtensibilityElement_.c_cardinality.copy()
|
||||
c_children['{http://schemas.xmlsoap.org/wsdl/soap/}headerfault'] = ('headerfault', [Headerfault])
|
||||
c_cardinality['headerfault'] = {"min":0}
|
||||
c_attributes['message'] = ('message', 'QName', True)
|
||||
c_attributes['part'] = ('part', 'NMTOKEN', True)
|
||||
c_attributes['use'] = ('use', UseChoice_, True)
|
||||
c_attributes['encodingStyle'] = ('encoding_style', EncodingStyle_, False)
|
||||
c_attributes['namespace'] = ('namespace', 'anyURI', False)
|
||||
c_child_order.extend(['headerfault'])
|
||||
|
||||
def __init__(self,
|
||||
headerfault=None,
|
||||
message=None,
|
||||
part=None,
|
||||
use=None,
|
||||
encoding_style=None,
|
||||
namespace=None,
|
||||
required=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
wsdl.TExtensibilityElement_.__init__(self,
|
||||
required=required,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.headerfault=headerfault or []
|
||||
self.message=message
|
||||
self.part=part
|
||||
self.use=use
|
||||
self.encoding_style=encoding_style
|
||||
self.namespace=namespace
|
||||
|
||||
def t_header__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(THeader_, xml_string)
|
||||
|
||||
|
||||
class Header(THeader_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/soap/:header element """
|
||||
|
||||
c_tag = 'header'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = THeader_.c_children.copy()
|
||||
c_attributes = THeader_.c_attributes.copy()
|
||||
c_child_order = THeader_.c_child_order[:]
|
||||
c_cardinality = THeader_.c_cardinality.copy()
|
||||
|
||||
def header_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Header, xml_string)
|
||||
|
||||
|
||||
AG_tBodyAttributes = [
|
||||
('encodingStyle', EncodingStyle_, False),
|
||||
('use', UseChoice_, False),
|
||||
('namespace', 'anyURI', False),
|
||||
]
|
||||
|
||||
AG_tHeaderAttributes = [
|
||||
('message', 'QName', True),
|
||||
('part', 'NMTOKEN', True),
|
||||
('use', UseChoice_, True),
|
||||
('encodingStyle', EncodingStyle_, False),
|
||||
('namespace', 'anyURI', False),
|
||||
]
|
||||
|
||||
ELEMENT_FROM_STRING = {
|
||||
EncodingStyle_.c_tag: encoding_style__from_string,
|
||||
Binding.c_tag: binding_from_string,
|
||||
TBinding_.c_tag: t_binding__from_string,
|
||||
TStyleChoice_.c_tag: t_style_choice__from_string,
|
||||
Operation.c_tag: operation_from_string,
|
||||
TOperation_.c_tag: t_operation__from_string,
|
||||
Body.c_tag: body_from_string,
|
||||
TBody_.c_tag: t_body__from_string,
|
||||
UseChoice_.c_tag: use_choice__from_string,
|
||||
Fault.c_tag: fault_from_string,
|
||||
TFault_.c_tag: t_fault__from_string,
|
||||
Header.c_tag: header_from_string,
|
||||
THeader_.c_tag: t_header__from_string,
|
||||
Headerfault.c_tag: headerfault_from_string,
|
||||
THeaderFault_.c_tag: t_header_fault__from_string,
|
||||
Address.c_tag: address_from_string,
|
||||
TAddress_.c_tag: t_address__from_string,
|
||||
}
|
||||
|
||||
ELEMENT_BY_TAG = {
|
||||
'encodingStyle': EncodingStyle_,
|
||||
'binding': Binding,
|
||||
'tBinding': TBinding_,
|
||||
'tStyleChoice': TStyleChoice_,
|
||||
'operation': Operation,
|
||||
'tOperation': TOperation_,
|
||||
'body': Body,
|
||||
'tBody': TBody_,
|
||||
'useChoice': UseChoice_,
|
||||
'fault': Fault,
|
||||
'tFault': TFault_,
|
||||
'header': Header,
|
||||
'tHeader': THeader_,
|
||||
'headerfault': Headerfault,
|
||||
'tHeaderFault': THeaderFault_,
|
||||
'address': Address,
|
||||
'tAddress': TAddress_,
|
||||
'tFaultRes': TFaultRes_,
|
||||
}


def factory(tag, **kwargs):
    return ELEMENT_BY_TAG[tag](**kwargs)
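
# Illustrative usage (not part of the generated module): factory() looks up
# the class registered for a tag and instantiates it with keyword arguments.
# The 'location' keyword is an assumption based on the WSDL SOAP binding
# schema for the address element.
#
#     addr = factory('address', location='https://idp.example.org/soap')
#     again = address_from_string(addr.to_string())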
293
src/saml2/schema/soapenv.py
Normal file
293
src/saml2/schema/soapenv.py
Normal file
@@ -0,0 +1,293 @@
#!/usr/bin/env python

#
# Generated Fri May 27 17:26:51 2011 by parse_xsd.py version 0.4.
#

import saml2
from saml2 import SamlBase

NAMESPACE = 'http://schemas.xmlsoap.org/soap/envelope/'
|
||||
|
||||
class Header_(SamlBase):
|
||||
"""The http://schemas.xmlsoap.org/soap/envelope/:Header element """
|
||||
|
||||
c_tag = 'Header'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def header__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Header_, xml_string)
|
||||
|
||||
|
||||
class Body_(SamlBase):
|
||||
"""The http://schemas.xmlsoap.org/soap/envelope/:Body element """
|
||||
|
||||
c_tag = 'Body'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def body__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Body_, xml_string)
|
||||
|
||||
|
||||
class EncodingStyle_(SamlBase):
|
||||
"""The http://schemas.xmlsoap.org/soap/envelope/:encodingStyle element """
|
||||
|
||||
c_tag = 'encodingStyle'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def encoding_style__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(EncodingStyle_, xml_string)
|
||||
|
||||
|
||||
class Fault_faultcode(SamlBase):
|
||||
|
||||
c_tag = 'faultcode'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'QName'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def fault_faultcode_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Fault_faultcode, xml_string)
|
||||
|
||||
|
||||
class Fault_faultstring(SamlBase):
|
||||
|
||||
c_tag = 'faultstring'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'string'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def fault_faultstring_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Fault_faultstring, xml_string)
|
||||
|
||||
|
||||
class Fault_faultactor(SamlBase):
|
||||
|
||||
c_tag = 'faultactor'
|
||||
c_namespace = NAMESPACE
|
||||
c_value_type = {'base': 'anyURI'}
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def fault_faultactor_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Fault_faultactor, xml_string)
|
||||
|
||||
|
||||
class Detail_(SamlBase):
|
||||
"""The http://schemas.xmlsoap.org/soap/envelope/:detail element """
|
||||
|
||||
c_tag = 'detail'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def detail__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Detail_, xml_string)
|
||||
|
||||
class Envelope_(SamlBase):
    """The http://schemas.xmlsoap.org/soap/envelope/:Envelope element """

    c_tag = 'Envelope'
    c_namespace = NAMESPACE
    c_children = SamlBase.c_children.copy()
    c_attributes = SamlBase.c_attributes.copy()
    c_child_order = SamlBase.c_child_order[:]
    c_cardinality = SamlBase.c_cardinality.copy()
    c_children['{http://schemas.xmlsoap.org/soap/envelope/}Header'] = ('header', Header_)
    c_cardinality['header'] = {"min":0, "max":1}
    c_children['{http://schemas.xmlsoap.org/soap/envelope/}Body'] = ('body', Body_)
    c_child_order.extend(['header', 'body'])

    def __init__(self,
                 header=None,
                 body=None,
                 text=None,
                 extension_elements=None,
                 extension_attributes=None,
                 ):
        SamlBase.__init__(self,
                          text=text,
                          extension_elements=extension_elements,
                          extension_attributes=extension_attributes,
                          )
        self.header=header
        self.body=body


def envelope__from_string(xml_string):
    return saml2.create_class_from_xml_string(Envelope_, xml_string)
|
||||
|
||||
|
||||
class Header(Header_):
|
||||
"""The http://schemas.xmlsoap.org/soap/envelope/:Header element """
|
||||
|
||||
c_tag = 'Header'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = Header_.c_children.copy()
|
||||
c_attributes = Header_.c_attributes.copy()
|
||||
c_child_order = Header_.c_child_order[:]
|
||||
c_cardinality = Header_.c_cardinality.copy()
|
||||
|
||||
def header_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Header, xml_string)
|
||||
|
||||
|
||||
class Body(Body_):
|
||||
"""The http://schemas.xmlsoap.org/soap/envelope/:Body element """
|
||||
|
||||
c_tag = 'Body'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = Body_.c_children.copy()
|
||||
c_attributes = Body_.c_attributes.copy()
|
||||
c_child_order = Body_.c_child_order[:]
|
||||
c_cardinality = Body_.c_cardinality.copy()
|
||||
|
||||
def body_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Body, xml_string)
|
||||
|
||||
|
||||
class Fault_detail(Detail_):
|
||||
|
||||
c_tag = 'detail'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = Detail_.c_children.copy()
|
||||
c_attributes = Detail_.c_attributes.copy()
|
||||
c_child_order = Detail_.c_child_order[:]
|
||||
c_cardinality = Detail_.c_cardinality.copy()
|
||||
|
||||
def fault_detail_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Fault_detail, xml_string)
|
||||
|
||||
|
||||
class Fault_(SamlBase):
|
||||
"""The http://schemas.xmlsoap.org/soap/envelope/:Fault element """
|
||||
|
||||
c_tag = 'Fault'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_children['{http://schemas.xmlsoap.org/soap/envelope/}faultcode'] = ('faultcode', Fault_faultcode)
|
||||
c_children['{http://schemas.xmlsoap.org/soap/envelope/}faultstring'] = ('faultstring', Fault_faultstring)
|
||||
c_children['{http://schemas.xmlsoap.org/soap/envelope/}faultactor'] = ('faultactor', Fault_faultactor)
|
||||
c_cardinality['faultactor'] = {"min":0, "max":1}
|
||||
c_children['{http://schemas.xmlsoap.org/soap/envelope/}detail'] = ('detail', Fault_detail)
|
||||
c_cardinality['detail'] = {"min":0, "max":1}
|
||||
c_child_order.extend(['faultcode', 'faultstring', 'faultactor', 'detail'])
|
||||
|
||||
def __init__(self,
|
||||
faultcode=None,
|
||||
faultstring=None,
|
||||
faultactor=None,
|
||||
detail=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.faultcode=faultcode
|
||||
self.faultstring=faultstring
|
||||
self.faultactor=faultactor
|
||||
self.detail=detail
|
||||
|
||||
def fault__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Fault_, xml_string)
|
||||
|
||||
|
||||
class Envelope(Envelope_):
|
||||
"""The http://schemas.xmlsoap.org/soap/envelope/:Envelope element """
|
||||
|
||||
c_tag = 'Envelope'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = Envelope_.c_children.copy()
|
||||
c_attributes = Envelope_.c_attributes.copy()
|
||||
c_child_order = Envelope_.c_child_order[:]
|
||||
c_cardinality = Envelope_.c_cardinality.copy()
|
||||
|
||||
def envelope_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Envelope, xml_string)
|
||||
|
||||
|
||||
class Fault(Fault_):
|
||||
"""The http://schemas.xmlsoap.org/soap/envelope/:Fault element """
|
||||
|
||||
c_tag = 'Fault'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = Fault_.c_children.copy()
|
||||
c_attributes = Fault_.c_attributes.copy()
|
||||
c_child_order = Fault_.c_child_order[:]
|
||||
c_cardinality = Fault_.c_cardinality.copy()
|
||||
|
||||
def fault_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Fault, xml_string)
|
||||
|
||||
|
||||
#..................
# []
AG_encodingStyle = [
    ('encodingStyle', '', False),
]

ELEMENT_FROM_STRING = {
    Envelope.c_tag: envelope_from_string,
    Envelope_.c_tag: envelope__from_string,
    Header.c_tag: header_from_string,
    Header_.c_tag: header__from_string,
    Body.c_tag: body_from_string,
    Body_.c_tag: body__from_string,
    EncodingStyle_.c_tag: encoding_style__from_string,
    Fault.c_tag: fault_from_string,
    Fault_.c_tag: fault__from_string,
    Detail_.c_tag: detail__from_string,
    Fault_faultcode.c_tag: fault_faultcode_from_string,
    Fault_faultstring.c_tag: fault_faultstring_from_string,
    Fault_faultactor.c_tag: fault_faultactor_from_string,
}
ELEMENT_BY_TAG = {
    'Envelope': Envelope,
    'Envelope': Envelope_,
    'Header': Header,
    'Header': Header_,
    'Body': Body,
    'Body': Body_,
    'encodingStyle': EncodingStyle_,
    'Fault': Fault,
    'Fault': Fault_,
    'detail': Detail_,
    'faultcode': Fault_faultcode,
    'faultstring': Fault_faultstring,
    'faultactor': Fault_faultactor,
}


def factory(tag, **kwargs):
    return ELEMENT_BY_TAG[tag](**kwargs)
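
# Illustrative sketch (not part of the generated module): wrap a payload in a
# SOAP envelope and parse it back; to_string() is assumed to come from
# SamlBase.
#
#     env = Envelope(body=Body(text='payload'))
#     parsed = envelope_from_string(env.to_string())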
903
src/saml2/schema/wsdl.py
Normal file
903
src/saml2/schema/wsdl.py
Normal file
@@ -0,0 +1,903 @@
#!!!! 'NoneType' object has no attribute 'py_class'
#!!!! 'NoneType' object has no attribute 'py_class'
#!/usr/bin/env python

#
# Generated Fri May 27 17:23:24 2011 by parse_xsd.py version 0.4.
#

import saml2
from saml2 import SamlBase


NAMESPACE = 'http://schemas.xmlsoap.org/wsdl/'
|
||||
|
||||
class TDocumentation_(SamlBase):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tDocumentation element """
|
||||
|
||||
c_tag = 'tDocumentation'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
|
||||
def t_documentation__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TDocumentation_, xml_string)
|
||||
|
||||
|
||||
class TDocumented_documentation(TDocumentation_):
|
||||
|
||||
c_tag = 'documentation'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TDocumentation_.c_children.copy()
|
||||
c_attributes = TDocumentation_.c_attributes.copy()
|
||||
c_child_order = TDocumentation_.c_child_order[:]
|
||||
c_cardinality = TDocumentation_.c_cardinality.copy()
|
||||
|
||||
def t_documented_documentation_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TDocumented_documentation, xml_string)
|
||||
|
||||
|
||||
class TDocumented_(SamlBase):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tDocumented element """
|
||||
|
||||
c_tag = 'tDocumented'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_children['{http://schemas.xmlsoap.org/wsdl/}documentation'] = ('documentation', TDocumented_documentation)
|
||||
c_cardinality['documentation'] = {"min":0, "max":1}
|
||||
c_child_order.extend(['documentation'])
|
||||
|
||||
def __init__(self,
|
||||
documentation=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.documentation=documentation
|
||||
|
||||
def t_documented__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TDocumented_, xml_string)
|
||||
|
||||
|
||||
class TExtensibleAttributesDocumented_(TDocumented_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tExtensibleAttributesDocumented element """
|
||||
|
||||
c_tag = 'tExtensibleAttributesDocumented'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TDocumented_.c_children.copy()
|
||||
c_attributes = TDocumented_.c_attributes.copy()
|
||||
c_child_order = TDocumented_.c_child_order[:]
|
||||
c_cardinality = TDocumented_.c_cardinality.copy()
|
||||
|
||||
|
||||
class TExtensibleDocumented_(TDocumented_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tExtensibleDocumented element """
|
||||
|
||||
c_tag = 'tExtensibleDocumented'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TDocumented_.c_children.copy()
|
||||
c_attributes = TDocumented_.c_attributes.copy()
|
||||
c_child_order = TDocumented_.c_child_order[:]
|
||||
c_cardinality = TDocumented_.c_cardinality.copy()
|
||||
|
||||
|
||||
class TImport_(TExtensibleAttributesDocumented_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tImport element """
|
||||
|
||||
c_tag = 'tImport'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TExtensibleAttributesDocumented_.c_children.copy()
|
||||
c_attributes = TExtensibleAttributesDocumented_.c_attributes.copy()
|
||||
c_child_order = TExtensibleAttributesDocumented_.c_child_order[:]
|
||||
c_cardinality = TExtensibleAttributesDocumented_.c_cardinality.copy()
|
||||
c_attributes['namespace'] = ('namespace', 'anyURI', True)
|
||||
c_attributes['location'] = ('location', 'anyURI', True)
|
||||
|
||||
def __init__(self,
|
||||
namespace=None,
|
||||
location=None,
|
||||
documentation=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
TExtensibleAttributesDocumented_.__init__(self,
|
||||
documentation=documentation,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.namespace=namespace
|
||||
self.location=location
|
||||
|
||||
def t_import__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TImport_, xml_string)
|
||||
|
||||
|
||||
class TTypes_(TExtensibleDocumented_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tTypes element """
|
||||
|
||||
c_tag = 'tTypes'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TExtensibleDocumented_.c_children.copy()
|
||||
c_attributes = TExtensibleDocumented_.c_attributes.copy()
|
||||
c_child_order = TExtensibleDocumented_.c_child_order[:]
|
||||
c_cardinality = TExtensibleDocumented_.c_cardinality.copy()
|
||||
|
||||
def t_types__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TTypes_, xml_string)
|
||||
|
||||
|
||||
class TPart_(TExtensibleAttributesDocumented_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tPart element """
|
||||
|
||||
c_tag = 'tPart'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TExtensibleAttributesDocumented_.c_children.copy()
|
||||
c_attributes = TExtensibleAttributesDocumented_.c_attributes.copy()
|
||||
c_child_order = TExtensibleAttributesDocumented_.c_child_order[:]
|
||||
c_cardinality = TExtensibleAttributesDocumented_.c_cardinality.copy()
|
||||
c_attributes['name'] = ('name', 'NCName', True)
|
||||
c_attributes['element'] = ('element', 'QName', False)
|
||||
c_attributes['type'] = ('type', 'QName', False)
|
||||
|
||||
def __init__(self,
|
||||
name=None,
|
||||
element=None,
|
||||
type=None,
|
||||
documentation=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
TExtensibleAttributesDocumented_.__init__(self,
|
||||
documentation=documentation,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.name=name
|
||||
self.element=element
|
||||
self.type=type
|
||||
|
||||
def t_part__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TPart_, xml_string)
|
||||
|
||||
|
||||
class TOperation_(TExtensibleDocumented_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tOperation element """
|
||||
|
||||
c_tag = 'tOperation'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TExtensibleDocumented_.c_children.copy()
|
||||
c_attributes = TExtensibleDocumented_.c_attributes.copy()
|
||||
c_child_order = TExtensibleDocumented_.c_child_order[:]
|
||||
c_cardinality = TExtensibleDocumented_.c_cardinality.copy()
|
||||
c_attributes['name'] = ('name', 'NCName', True)
|
||||
c_attributes['parameterOrder'] = ('parameter_order', 'NMTOKENS', False)
|
||||
|
||||
def __init__(self,
|
||||
name=None,
|
||||
parameter_order=None,
|
||||
documentation=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
TExtensibleDocumented_.__init__(self,
|
||||
documentation=documentation,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.name=name
|
||||
self.parameter_order=parameter_order
|
||||
|
||||
def t_operation__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TOperation_, xml_string)
|
||||
|
||||
|
||||
class TParam_(TExtensibleAttributesDocumented_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tParam element """
|
||||
|
||||
c_tag = 'tParam'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TExtensibleAttributesDocumented_.c_children.copy()
|
||||
c_attributes = TExtensibleAttributesDocumented_.c_attributes.copy()
|
||||
c_child_order = TExtensibleAttributesDocumented_.c_child_order[:]
|
||||
c_cardinality = TExtensibleAttributesDocumented_.c_cardinality.copy()
|
||||
c_attributes['name'] = ('name', 'NCName', False)
|
||||
c_attributes['message'] = ('message', 'QName', True)
|
||||
|
||||
def __init__(self,
|
||||
name=None,
|
||||
message=None,
|
||||
documentation=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
TExtensibleAttributesDocumented_.__init__(self,
|
||||
documentation=documentation,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.name=name
|
||||
self.message=message
|
||||
|
||||
def t_param__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TParam_, xml_string)
|
||||
|
||||
|
||||
class TFault_(TExtensibleAttributesDocumented_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tFault element """
|
||||
|
||||
c_tag = 'tFault'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TExtensibleAttributesDocumented_.c_children.copy()
|
||||
c_attributes = TExtensibleAttributesDocumented_.c_attributes.copy()
|
||||
c_child_order = TExtensibleAttributesDocumented_.c_child_order[:]
|
||||
c_cardinality = TExtensibleAttributesDocumented_.c_cardinality.copy()
|
||||
c_attributes['name'] = ('name', 'NCName', True)
|
||||
c_attributes['message'] = ('message', 'QName', True)
|
||||
|
||||
def __init__(self,
|
||||
name=None,
|
||||
message=None,
|
||||
documentation=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
TExtensibleAttributesDocumented_.__init__(self,
|
||||
documentation=documentation,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.name=name
|
||||
self.message=message
|
||||
|
||||
def t_fault__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TFault_, xml_string)
|
||||
|
||||
|
||||
class TBindingOperationMessage_(TExtensibleDocumented_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tBindingOperationMessage element """
|
||||
|
||||
c_tag = 'tBindingOperationMessage'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TExtensibleDocumented_.c_children.copy()
|
||||
c_attributes = TExtensibleDocumented_.c_attributes.copy()
|
||||
c_child_order = TExtensibleDocumented_.c_child_order[:]
|
||||
c_cardinality = TExtensibleDocumented_.c_cardinality.copy()
|
||||
c_attributes['name'] = ('name', 'NCName', False)
|
||||
|
||||
def __init__(self,
|
||||
name=None,
|
||||
documentation=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
TExtensibleDocumented_.__init__(self,
|
||||
documentation=documentation,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.name=name
|
||||
|
||||
def t_binding_operation_message__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TBindingOperationMessage_, xml_string)
|
||||
|
||||
|
||||
class TBindingOperationFault_(TExtensibleDocumented_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tBindingOperationFault element """
|
||||
|
||||
c_tag = 'tBindingOperationFault'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TExtensibleDocumented_.c_children.copy()
|
||||
c_attributes = TExtensibleDocumented_.c_attributes.copy()
|
||||
c_child_order = TExtensibleDocumented_.c_child_order[:]
|
||||
c_cardinality = TExtensibleDocumented_.c_cardinality.copy()
|
||||
c_attributes['name'] = ('name', 'NCName', True)
|
||||
|
||||
def __init__(self,
|
||||
name=None,
|
||||
documentation=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
TExtensibleDocumented_.__init__(self,
|
||||
documentation=documentation,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.name=name
|
||||
|
||||
def t_binding_operation_fault__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TBindingOperationFault_, xml_string)
|
||||
|
||||
|
||||
class TBindingOperation_input(TBindingOperationMessage_):
|
||||
|
||||
c_tag = 'input'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TBindingOperationMessage_.c_children.copy()
|
||||
c_attributes = TBindingOperationMessage_.c_attributes.copy()
|
||||
c_child_order = TBindingOperationMessage_.c_child_order[:]
|
||||
c_cardinality = TBindingOperationMessage_.c_cardinality.copy()
|
||||
|
||||
def t_binding_operation_input_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TBindingOperation_input, xml_string)
|
||||
|
||||
|
||||
class TBindingOperation_output(TBindingOperationMessage_):
|
||||
|
||||
c_tag = 'output'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TBindingOperationMessage_.c_children.copy()
|
||||
c_attributes = TBindingOperationMessage_.c_attributes.copy()
|
||||
c_child_order = TBindingOperationMessage_.c_child_order[:]
|
||||
c_cardinality = TBindingOperationMessage_.c_cardinality.copy()
|
||||
|
||||
def t_binding_operation_output_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TBindingOperation_output, xml_string)
|
||||
|
||||
|
||||
class TBindingOperation_fault(TBindingOperationFault_):
|
||||
|
||||
c_tag = 'fault'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TBindingOperationFault_.c_children.copy()
|
||||
c_attributes = TBindingOperationFault_.c_attributes.copy()
|
||||
c_child_order = TBindingOperationFault_.c_child_order[:]
|
||||
c_cardinality = TBindingOperationFault_.c_cardinality.copy()
|
||||
|
||||
def t_binding_operation_fault_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TBindingOperation_fault, xml_string)
|
||||
|
||||
|
||||
class TBindingOperation_(TExtensibleDocumented_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tBindingOperation element """
|
||||
|
||||
c_tag = 'tBindingOperation'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TExtensibleDocumented_.c_children.copy()
|
||||
c_attributes = TExtensibleDocumented_.c_attributes.copy()
|
||||
c_child_order = TExtensibleDocumented_.c_child_order[:]
|
||||
c_cardinality = TExtensibleDocumented_.c_cardinality.copy()
|
||||
c_children['{http://schemas.xmlsoap.org/wsdl/}input'] = ('input', TBindingOperation_input)
|
||||
c_cardinality['input'] = {"min":0, "max":1}
|
||||
c_children['{http://schemas.xmlsoap.org/wsdl/}output'] = ('output', TBindingOperation_output)
|
||||
c_cardinality['output'] = {"min":0, "max":1}
|
||||
c_children['{http://schemas.xmlsoap.org/wsdl/}fault'] = ('fault', [TBindingOperation_fault])
|
||||
c_cardinality['fault'] = {"min":0}
|
||||
c_attributes['name'] = ('name', 'NCName', True)
|
||||
c_child_order.extend(['input', 'output', 'fault'])
|
||||
|
||||
def __init__(self,
|
||||
input=None,
|
||||
output=None,
|
||||
fault=None,
|
||||
name=None,
|
||||
documentation=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
TExtensibleDocumented_.__init__(self,
|
||||
documentation=documentation,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.input=input
|
||||
self.output=output
|
||||
self.fault=fault or []
|
||||
self.name=name
|
||||
|
||||
def t_binding_operation__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TBindingOperation_, xml_string)
|
||||
|
||||
|
||||
class TPort_(TExtensibleDocumented_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tPort element """
|
||||
|
||||
c_tag = 'tPort'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TExtensibleDocumented_.c_children.copy()
|
||||
c_attributes = TExtensibleDocumented_.c_attributes.copy()
|
||||
c_child_order = TExtensibleDocumented_.c_child_order[:]
|
||||
c_cardinality = TExtensibleDocumented_.c_cardinality.copy()
|
||||
c_attributes['name'] = ('name', 'NCName', True)
|
||||
c_attributes['binding'] = ('binding', 'QName', True)
|
||||
|
||||
def __init__(self,
|
||||
name=None,
|
||||
binding=None,
|
||||
documentation=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
TExtensibleDocumented_.__init__(self,
|
||||
documentation=documentation,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.name=name
|
||||
self.binding=binding
|
||||
|
||||
def t_port__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TPort_, xml_string)
|
||||
|
||||
|
||||
class TExtensibilityElement_(SamlBase):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tExtensibilityElement element """
|
||||
|
||||
c_tag = 'tExtensibilityElement'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = SamlBase.c_children.copy()
|
||||
c_attributes = SamlBase.c_attributes.copy()
|
||||
c_child_order = SamlBase.c_child_order[:]
|
||||
c_cardinality = SamlBase.c_cardinality.copy()
|
||||
c_attributes['required'] = ('required', 'None', False)
|
||||
|
||||
def __init__(self,
|
||||
required=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
SamlBase.__init__(self,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.required=required
|
||||
|
||||
|
||||
class Import(TImport_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:import element """
|
||||
|
||||
c_tag = 'import'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TImport_.c_children.copy()
|
||||
c_attributes = TImport_.c_attributes.copy()
|
||||
c_child_order = TImport_.c_child_order[:]
|
||||
c_cardinality = TImport_.c_cardinality.copy()
|
||||
|
||||
def import_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Import, xml_string)
|
||||
|
||||
|
||||
class Types(TTypes_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:types element """
|
||||
|
||||
c_tag = 'types'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TTypes_.c_children.copy()
|
||||
c_attributes = TTypes_.c_attributes.copy()
|
||||
c_child_order = TTypes_.c_child_order[:]
|
||||
c_cardinality = TTypes_.c_cardinality.copy()
|
||||
|
||||
def types_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Types, xml_string)
|
||||
|
||||
|
||||
class TMessage_part(TPart_):
|
||||
|
||||
c_tag = 'part'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TPart_.c_children.copy()
|
||||
c_attributes = TPart_.c_attributes.copy()
|
||||
c_child_order = TPart_.c_child_order[:]
|
||||
c_cardinality = TPart_.c_cardinality.copy()
|
||||
|
||||
def t_message_part_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TMessage_part, xml_string)
|
||||
|
||||
|
||||
class TMessage_(TExtensibleDocumented_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tMessage element """
|
||||
|
||||
c_tag = 'tMessage'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TExtensibleDocumented_.c_children.copy()
|
||||
c_attributes = TExtensibleDocumented_.c_attributes.copy()
|
||||
c_child_order = TExtensibleDocumented_.c_child_order[:]
|
||||
c_cardinality = TExtensibleDocumented_.c_cardinality.copy()
|
||||
c_children['{http://schemas.xmlsoap.org/wsdl/}part'] = ('part', [TMessage_part])
|
||||
c_cardinality['part'] = {"min":0}
|
||||
c_attributes['name'] = ('name', 'NCName', True)
|
||||
c_child_order.extend(['part'])
|
||||
|
||||
def __init__(self,
|
||||
part=None,
|
||||
name=None,
|
||||
documentation=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
TExtensibleDocumented_.__init__(self,
|
||||
documentation=documentation,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.part=part or []
|
||||
self.name=name
|
||||
|
||||
def t_message__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TMessage_, xml_string)
|
||||
|
||||
|
||||
class TPortType_operation(TOperation_):
|
||||
|
||||
c_tag = 'operation'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TOperation_.c_children.copy()
|
||||
c_attributes = TOperation_.c_attributes.copy()
|
||||
c_child_order = TOperation_.c_child_order[:]
|
||||
c_cardinality = TOperation_.c_cardinality.copy()
|
||||
|
||||
def t_port_type_operation_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TPortType_operation, xml_string)
|
||||
|
||||
|
||||
class TPortType_(TExtensibleAttributesDocumented_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tPortType element """
|
||||
|
||||
c_tag = 'tPortType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TExtensibleAttributesDocumented_.c_children.copy()
|
||||
c_attributes = TExtensibleAttributesDocumented_.c_attributes.copy()
|
||||
c_child_order = TExtensibleAttributesDocumented_.c_child_order[:]
|
||||
c_cardinality = TExtensibleAttributesDocumented_.c_cardinality.copy()
|
||||
c_children['{http://schemas.xmlsoap.org/wsdl/}operation'] = ('operation', [TPortType_operation])
|
||||
c_cardinality['operation'] = {"min":0}
|
||||
c_attributes['name'] = ('name', 'NCName', True)
|
||||
c_child_order.extend(['operation'])
|
||||
|
||||
def __init__(self,
|
||||
operation=None,
|
||||
name=None,
|
||||
documentation=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
TExtensibleAttributesDocumented_.__init__(self,
|
||||
documentation=documentation,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.operation=operation or []
|
||||
self.name=name
|
||||
|
||||
def t_port_type__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TPortType_, xml_string)
|
||||
|
||||
|
||||
class TBinding_operation(TBindingOperation_):
|
||||
|
||||
c_tag = 'operation'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TBindingOperation_.c_children.copy()
|
||||
c_attributes = TBindingOperation_.c_attributes.copy()
|
||||
c_child_order = TBindingOperation_.c_child_order[:]
|
||||
c_cardinality = TBindingOperation_.c_cardinality.copy()
|
||||
|
||||
def t_binding_operation_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TBinding_operation, xml_string)
|
||||
|
||||
|
||||
class TBinding_(TExtensibleDocumented_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tBinding element """
|
||||
|
||||
c_tag = 'tBinding'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TExtensibleDocumented_.c_children.copy()
|
||||
c_attributes = TExtensibleDocumented_.c_attributes.copy()
|
||||
c_child_order = TExtensibleDocumented_.c_child_order[:]
|
||||
c_cardinality = TExtensibleDocumented_.c_cardinality.copy()
|
||||
c_children['{http://schemas.xmlsoap.org/wsdl/}operation'] = ('operation', [TBinding_operation])
|
||||
c_cardinality['operation'] = {"min":0}
|
||||
c_attributes['name'] = ('name', 'NCName', True)
|
||||
c_attributes['type'] = ('type', 'QName', True)
|
||||
c_child_order.extend(['operation'])
|
||||
|
||||
def __init__(self,
|
||||
operation=None,
|
||||
name=None,
|
||||
type=None,
|
||||
documentation=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
TExtensibleDocumented_.__init__(self,
|
||||
documentation=documentation,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.operation=operation or []
|
||||
self.name=name
|
||||
self.type=type
|
||||
|
||||
def t_binding__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TBinding_, xml_string)
|
||||
|
||||
|
||||
class TService_port(TPort_):
|
||||
|
||||
c_tag = 'port'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TPort_.c_children.copy()
|
||||
c_attributes = TPort_.c_attributes.copy()
|
||||
c_child_order = TPort_.c_child_order[:]
|
||||
c_cardinality = TPort_.c_cardinality.copy()
|
||||
|
||||
def t_service_port_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TService_port, xml_string)
|
||||
|
||||
|
||||
class TService_(TExtensibleDocumented_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tService element """
|
||||
|
||||
c_tag = 'tService'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TExtensibleDocumented_.c_children.copy()
|
||||
c_attributes = TExtensibleDocumented_.c_attributes.copy()
|
||||
c_child_order = TExtensibleDocumented_.c_child_order[:]
|
||||
c_cardinality = TExtensibleDocumented_.c_cardinality.copy()
|
||||
c_children['{http://schemas.xmlsoap.org/wsdl/}port'] = ('port', [TService_port])
|
||||
c_cardinality['port'] = {"min":0}
|
||||
c_attributes['name'] = ('name', 'NCName', True)
|
||||
c_child_order.extend(['port'])
|
||||
|
||||
def __init__(self,
|
||||
port=None,
|
||||
name=None,
|
||||
documentation=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
TExtensibleDocumented_.__init__(self,
|
||||
documentation=documentation,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.port=port or []
|
||||
self.name=name
|
||||
|
||||
def t_service__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TService_, xml_string)
|
||||
|
||||
|
||||
class Message(TMessage_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:message element """
|
||||
|
||||
c_tag = 'message'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TMessage_.c_children.copy()
|
||||
c_attributes = TMessage_.c_attributes.copy()
|
||||
c_child_order = TMessage_.c_child_order[:]
|
||||
c_cardinality = TMessage_.c_cardinality.copy()
|
||||
|
||||
def message_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Message, xml_string)
|
||||
|
||||
|
||||
class PortType(TPortType_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:portType element """
|
||||
|
||||
c_tag = 'portType'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TPortType_.c_children.copy()
|
||||
c_attributes = TPortType_.c_attributes.copy()
|
||||
c_child_order = TPortType_.c_child_order[:]
|
||||
c_cardinality = TPortType_.c_cardinality.copy()
|
||||
|
||||
def port_type_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(PortType, xml_string)
|
||||
|
||||
|
||||
class Binding(TBinding_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:binding element """
|
||||
|
||||
c_tag = 'binding'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TBinding_.c_children.copy()
|
||||
c_attributes = TBinding_.c_attributes.copy()
|
||||
c_child_order = TBinding_.c_child_order[:]
|
||||
c_cardinality = TBinding_.c_cardinality.copy()
|
||||
|
||||
def binding_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Binding, xml_string)
|
||||
|
||||
|
||||
class Service(TService_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:service element """
|
||||
|
||||
c_tag = 'service'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TService_.c_children.copy()
|
||||
c_attributes = TService_.c_attributes.copy()
|
||||
c_child_order = TService_.c_child_order[:]
|
||||
c_cardinality = TService_.c_cardinality.copy()
|
||||
|
||||
def service_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Service, xml_string)
|
||||
|
||||
|
||||
class TDefinitions_(TExtensibleDocumented_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:tDefinitions element """
|
||||
|
||||
c_tag = 'tDefinitions'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TExtensibleDocumented_.c_children.copy()
|
||||
c_attributes = TExtensibleDocumented_.c_attributes.copy()
|
||||
c_child_order = TExtensibleDocumented_.c_child_order[:]
|
||||
c_cardinality = TExtensibleDocumented_.c_cardinality.copy()
|
||||
c_children['{http://schemas.xmlsoap.org/wsdl/}import'] = ('import', Import)
|
||||
c_cardinality['import'] = {"min":0, "max":1}
|
||||
c_children['{http://schemas.xmlsoap.org/wsdl/}types'] = ('types', Types)
|
||||
c_cardinality['types'] = {"min":0, "max":1}
|
||||
c_children['{http://schemas.xmlsoap.org/wsdl/}message'] = ('message', Message)
|
||||
c_cardinality['message'] = {"min":0, "max":1}
|
||||
c_children['{http://schemas.xmlsoap.org/wsdl/}portType'] = ('port_type', PortType)
|
||||
c_cardinality['port_type'] = {"min":0, "max":1}
|
||||
c_children['{http://schemas.xmlsoap.org/wsdl/}binding'] = ('binding', Binding)
|
||||
c_cardinality['binding'] = {"min":0, "max":1}
|
||||
c_children['{http://schemas.xmlsoap.org/wsdl/}service'] = ('service', Service)
|
||||
c_cardinality['service'] = {"min":0, "max":1}
|
||||
c_attributes['targetNamespace'] = ('target_namespace', 'anyURI', False)
|
||||
c_attributes['name'] = ('name', 'NCName', False)
|
||||
c_child_order.extend(['import', 'types', 'message', 'port_type', 'binding', 'service'])
|
||||
|
||||
def __init__(self,
|
||||
import_=None,
|
||||
types=None,
|
||||
message=None,
|
||||
port_type=None,
|
||||
binding=None,
|
||||
service=None,
|
||||
target_namespace=None,
|
||||
name=None,
|
||||
documentation=None,
|
||||
text=None,
|
||||
extension_elements=None,
|
||||
extension_attributes=None,
|
||||
):
|
||||
TExtensibleDocumented_.__init__(self,
|
||||
documentation=documentation,
|
||||
text=text,
|
||||
extension_elements=extension_elements,
|
||||
extension_attributes=extension_attributes,
|
||||
)
|
||||
self.import_=import_
|
||||
self.types=types
|
||||
self.message=message
|
||||
self.port_type=port_type
|
||||
self.binding=binding
|
||||
self.service=service
|
||||
self.target_namespace=target_namespace
|
||||
self.name=name
|
||||
|
||||
def t_definitions__from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(TDefinitions_, xml_string)
|
||||
|
||||
|
||||
class Definitions(TDefinitions_):
|
||||
"""The http://schemas.xmlsoap.org/wsdl/:definitions element """
|
||||
|
||||
c_tag = 'definitions'
|
||||
c_namespace = NAMESPACE
|
||||
c_children = TDefinitions_.c_children.copy()
|
||||
c_attributes = TDefinitions_.c_attributes.copy()
|
||||
c_child_order = TDefinitions_.c_child_order[:]
|
||||
c_cardinality = TDefinitions_.c_cardinality.copy()
|
||||
|
||||
def definitions_from_string(xml_string):
|
||||
return saml2.create_class_from_xml_string(Definitions, xml_string)
|
||||
|
||||
|
||||
#..................
|
||||
# []
|
||||
ELEMENT_FROM_STRING = {
|
||||
TDocumentation_.c_tag: t_documentation__from_string,
|
||||
TDocumented_.c_tag: t_documented__from_string,
|
||||
Definitions.c_tag: definitions_from_string,
|
||||
TDefinitions_.c_tag: t_definitions__from_string,
|
||||
TImport_.c_tag: t_import__from_string,
|
||||
TTypes_.c_tag: t_types__from_string,
|
||||
TMessage_.c_tag: t_message__from_string,
|
||||
TPart_.c_tag: t_part__from_string,
|
||||
TPortType_.c_tag: t_port_type__from_string,
|
||||
TOperation_.c_tag: t_operation__from_string,
|
||||
TParam_.c_tag: t_param__from_string,
|
||||
TFault_.c_tag: t_fault__from_string,
|
||||
TBinding_.c_tag: t_binding__from_string,
|
||||
TBindingOperationMessage_.c_tag: t_binding_operation_message__from_string,
|
||||
TBindingOperationFault_.c_tag: t_binding_operation_fault__from_string,
|
||||
TBindingOperation_.c_tag: t_binding_operation__from_string,
|
||||
TService_.c_tag: t_service__from_string,
|
||||
TPort_.c_tag: t_port__from_string,
|
||||
TDocumented_documentation.c_tag: t_documented_documentation_from_string,
|
||||
TBindingOperation_input.c_tag: t_binding_operation_input_from_string,
|
||||
TBindingOperation_output.c_tag: t_binding_operation_output_from_string,
|
||||
TBindingOperation_fault.c_tag: t_binding_operation_fault_from_string,
|
||||
Import.c_tag: import_from_string,
|
||||
Types.c_tag: types_from_string,
|
||||
TMessage_part.c_tag: t_message_part_from_string,
|
||||
TPortType_operation.c_tag: t_port_type_operation_from_string,
|
||||
TService_port.c_tag: t_service_port_from_string,
|
||||
Message.c_tag: message_from_string,
|
||||
PortType.c_tag: port_type_from_string,
|
||||
Binding.c_tag: binding_from_string,
|
||||
Service.c_tag: service_from_string,
|
||||
}
|
||||
|
||||
ELEMENT_BY_TAG = {
|
||||
'tDocumentation': TDocumentation_,
|
||||
'tDocumented': TDocumented_,
|
||||
'definitions': Definitions,
|
||||
'tDefinitions': TDefinitions_,
|
||||
'tImport': TImport_,
|
||||
'tTypes': TTypes_,
|
||||
'tMessage': TMessage_,
|
||||
'tPart': TPart_,
|
||||
'tPortType': TPortType_,
|
||||
'tOperation': TOperation_,
|
||||
'tParam': TParam_,
|
||||
'tFault': TFault_,
|
||||
'tBinding': TBinding_,
|
||||
'tBindingOperationMessage': TBindingOperationMessage_,
|
||||
'tBindingOperationFault': TBindingOperationFault_,
|
||||
'tBindingOperation': TBindingOperation_,
|
||||
'tService': TService_,
|
||||
'tPort': TPort_,
|
||||
'documentation': TDocumented_documentation,
|
||||
'input': TBindingOperation_input,
|
||||
'output': TBindingOperation_output,
|
||||
'fault': TBindingOperation_fault,
|
||||
'import': Import,
|
||||
'types': Types,
|
||||
'part': TMessage_part,
|
||||
'operation': TPortType_operation,
|
||||
'port': TService_port,
|
||||
'message': Message,
|
||||
'portType': PortType,
|
||||
'binding': Binding,
|
||||
'service': Service,
|
||||
'tExtensibleAttributesDocumented': TExtensibleAttributesDocumented_,
|
||||
'tExtensibleDocumented': TExtensibleDocumented_,
|
||||
'tExtensibilityElement': TExtensibilityElement_,
|
||||
}


def factory(tag, **kwargs):
    return ELEMENT_BY_TAG[tag](**kwargs)
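
# Illustrative sketch (not part of the generated module): parse a minimal WSDL
# document with the classes above; the XML snippet below is a made-up example.
#
#     wsdl_xml = ('<definitions xmlns="http://schemas.xmlsoap.org/wsdl/" '
#                 'name="Example" targetNamespace="urn:example:wsdl"/>')
#     defs = definitions_from_string(wsdl_xml)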
829
src/saml2/server.py
Normal file
829
src/saml2/server.py
Normal file
@@ -0,0 +1,829 @@
|
||||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# Copyright (C) 2009-2011 Umeå University
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
"""Contains classes and functions that a SAML2.0 Identity provider (IdP)
|
||||
or attribute authority (AA) may use to conclude its tasks.
|
||||
"""
|
||||
|
||||
import shelve
|
||||
import sys
|
||||
import memcache
|
||||
|
||||
from saml2 import saml
|
||||
from saml2 import class_name
|
||||
from saml2 import soap
|
||||
from saml2 import BINDING_HTTP_REDIRECT
|
||||
from saml2 import BINDING_SOAP
|
||||
from saml2 import BINDING_PAOS
|
||||
|
||||
from saml2.request import AuthnRequest
|
||||
from saml2.request import AttributeQuery
|
||||
from saml2.request import LogoutRequest
|
||||
|
||||
from saml2.s_utils import sid
|
||||
from saml2.s_utils import MissingValue
|
||||
from saml2.s_utils import success_status_factory
|
||||
from saml2.s_utils import OtherError
|
||||
from saml2.s_utils import UnknownPrincipal
|
||||
from saml2.s_utils import UnsupportedBinding
|
||||
from saml2.s_utils import error_status_factory
|
||||
|
||||
from saml2.time_util import instant
|
||||
|
||||
from saml2.binding import http_soap_message
|
||||
from saml2.binding import http_redirect_message
|
||||
from saml2.binding import http_post_message
|
||||
|
||||
from saml2.sigver import security_context
|
||||
from saml2.sigver import signed_instance_factory
|
||||
from saml2.sigver import pre_signature_part
|
||||
from saml2.sigver import response_factory, logoutresponse_factory
|
||||
|
||||
from saml2.config import config_factory
|
||||
|
||||
from saml2.assertion import Assertion, Policy
|
||||
|
||||
class UnknownVO(Exception):
|
||||
pass
|
||||
|
||||
class Identifier(object):
|
||||
""" A class that handles identifiers of objects """
|
||||
def __init__(self, db, voconf=None, debug=0, log=None):
|
||||
if isinstance(db, basestring):
|
||||
self.map = shelve.open(db, writeback=True)
|
||||
else:
|
||||
self.map = db
|
||||
self.voconf = voconf
|
||||
self.debug = debug
|
||||
self.log = log
|
||||

    def _store(self, typ, entity_id, local, remote):
        self.map["|".join([typ, entity_id, "f", local])] = remote
        self.map["|".join([typ, entity_id, "b", remote])] = local

    def _get_remote(self, typ, entity_id, local):
        return self.map["|".join([typ, entity_id, "f", local])]

    def _get_local(self, typ, entity_id, remote):
        return self.map["|".join([typ, entity_id, "b", remote])]
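
    # Example of the key scheme above (illustrative values): calling
    # _store("persistent", "https://sp.example.org", "user1", "abc123") sets
    #     "persistent|https://sp.example.org|f|user1"  -> "abc123"
    #     "persistent|https://sp.example.org|b|abc123" -> "user1"
    # so the local/remote mapping can be resolved in both directions.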
|
||||
|
||||
def persistent(self, entity_id, subject_id):
|
||||
""" Keeps the link between a permanent identifier and a
|
||||
temporary/pseudo-temporary identifier for a subject
|
||||
|
||||
The store supports look-up both ways: from a permanent local
|
||||
identifier to an identifier used when talking to an SP and from an
|
||||
identifier given back by an SP to the local permanent.
|
||||
|
||||
:param entity_id: SP entity ID or VO entity ID
|
||||
:param subject_id: The local permanent identifier of the subject
|
||||
:return: An arbitrary identifier for the subject unique to the
|
||||
service/group of services/VO with a given entity_id
|
||||
"""
|
||||
try:
|
||||
return self._get_remote("persistent", entity_id, subject_id)
|
||||
except KeyError:
|
||||
temp_id = "xyz"
|
||||
while True:
|
||||
temp_id = sid()
|
||||
try:
|
||||
self._get_local("persistent", entity_id, temp_id)
|
||||
except KeyError:
|
||||
break
|
||||
self._store("persistent", entity_id, subject_id, temp_id)
|
||||
self.map.sync()
|
||||
|
||||
return temp_id
|
||||
|
||||
def _get_vo_identifier(self, sp_name_qualifier, userid, identity):
|
||||
try:
|
||||
vo_conf = self.voconf[sp_name_qualifier]
|
||||
if "common_identifier" in vo_conf:
|
||||
try:
|
||||
subj_id = identity[vo_conf["common_identifier"]]
|
||||
except KeyError:
|
||||
raise MissingValue("Common identifier")
|
||||
else:
|
||||
return self.persistent_nameid(sp_name_qualifier, userid)
|
||||
except (KeyError, TypeError):
|
||||
raise UnknownVO("%s" % sp_name_qualifier)
|
||||
|
||||
try:
|
||||
nameid_format = vo_conf["nameid_format"]
|
||||
except KeyError:
|
||||
nameid_format = saml.NAMEID_FORMAT_PERSISTENT
|
||||
|
||||
return saml.NameID(format=nameid_format,
|
||||
sp_name_qualifier=sp_name_qualifier,
|
||||
text=subj_id)
|
||||
|
||||
def persistent_nameid(self, sp_name_qualifier, userid):
|
||||
""" Get or create a persistent identifier for this object to be used
|
||||
when communicating with servers using a specific SPNameQualifier
|
||||
|
||||
:param sp_name_qualifier: An identifier for a 'context'
|
||||
:param userid: The local permanent identifier of the object
|
||||
:return: A persistent random identifier.
|
||||
"""
|
||||
subj_id = self.persistent(sp_name_qualifier, userid)
|
||||
return saml.NameID(format=saml.NAMEID_FORMAT_PERSISTENT,
|
||||
sp_name_qualifier=sp_name_qualifier,
|
||||
text=subj_id)
|
||||
|
||||
def transient_nameid(self, sp_entity_id, userid):
|
||||
""" Returns a random one-time identifier. One-time means it is
|
||||
kept around as long as the session is active.
|
||||
|
||||
:param sp_entity_id: A qualifier to bind the created identifier to
|
||||
:param userid: The local persistent identifier for the subject.
|
||||
:return: The created identifier.
|
||||
"""
|
||||
temp_id = sid()
|
||||
while True:
|
||||
try:
|
||||
_ = self._get_local("transient", sp_entity_id, temp_id)
|
||||
temp_id = sid()
|
||||
except KeyError:
|
||||
break
|
||||
self._store("transient", sp_entity_id, userid, temp_id)
|
||||
self.map.sync()
|
||||
|
||||
return saml.NameID(format=saml.NAMEID_FORMAT_TRANSIENT,
|
||||
sp_name_qualifier=sp_entity_id,
|
||||
text=temp_id)
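# Illustrative note (hypothetical values): for one subject and one SP,
# persistent_nameid("https://sp.example.org", "user1") keeps returning the
# same NameID text across sessions, while transient_nameid() above allocates
# a fresh random value each call and records it in self.map so it can be
# mapped back to the local id for as long as the session lasts.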
|
||||
|
||||
def email_nameid(self, sp_name_qualifier, userid):
|
||||
return saml.NameID(format=saml.NAMEID_FORMAT_EMAILADDRESS,
|
||||
sp_name_qualifier=sp_name_qualifier,
|
||||
text=userid)
|
||||
|
||||
def construct_nameid(self, local_policy, userid, sp_entity_id,
|
||||
identity=None, name_id_policy=None, sp_nid=None):
|
||||
""" Returns a name_id for the object. How the name_id is
|
||||
constructed depends on the context.
|
||||
|
||||
:param local_policy: The policy the server is configured to follow
|
||||
:param userid: The local permanent identifier of the object
|
||||
:param sp_entity_id: The 'user' of the name_id
|
||||
:param identity: Attribute/value pairs describing the object
|
||||
:param name_id_policy: The policy the server on the other side wants
|
||||
us to follow.
|
||||
:param sp_nid: Name ID Formats from the SPs metadata
|
||||
:return: NameID instance precursor
|
||||
"""
|
||||
if name_id_policy and name_id_policy.sp_name_qualifier:
|
||||
try:
|
||||
return self._get_vo_identifier(name_id_policy.sp_name_qualifier,
|
||||
userid, identity)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
if sp_nid:
|
||||
nameid_format = sp_nid[0]
|
||||
else:
|
||||
nameid_format = local_policy.get_nameid_format(sp_entity_id)
|
||||
|
||||
if nameid_format == saml.NAMEID_FORMAT_PERSISTENT:
|
||||
return self.persistent_nameid(sp_entity_id, userid)
|
||||
elif nameid_format == saml.NAMEID_FORMAT_TRANSIENT:
|
||||
return self.transient_nameid(sp_entity_id, userid)
|
||||
elif nameid_format == saml.NAMEID_FORMAT_EMAILADDRESS:
|
||||
return self.email_nameid(sp_entity_id, userid)
|
||||
|
||||
def local_name(self, entity_id, remote_id):
|
||||
""" Get the local persistent name that has the specified remote ID.
|
||||
|
||||
:param entity_id: The identifier of the entity that got the remote id
|
||||
:param remote_id: The identifier that was exported
|
||||
:return: Local identifier
|
||||
"""
|
||||
try:
|
||||
return self._get_local("persistent", entity_id, remote_id)
|
||||
except KeyError:
|
||||
try:
|
||||
return self._get_local("transient", entity_id, remote_id)
|
||||
except KeyError:
|
||||
return None
|
||||
|
||||
class Server(object):
|
||||
""" A class that does things that IdPs or AAs do """
|
||||
def __init__(self, config_file="", config=None, _cache="",
|
||||
log=None, debug=0, stype="idp"):
|
||||
|
||||
self.log = log
|
||||
self.debug = debug
|
||||
self.ident = None
|
||||
if config_file:
|
||||
self.load_config(config_file, stype)
|
||||
elif config:
|
||||
self.conf = config
|
||||
else:
|
||||
raise Exception("Missing configuration")
|
||||
|
||||
if self.log is None:
|
||||
self.log = self.conf.setup_logger()
|
||||
|
||||
self.metadata = self.conf.metadata
|
||||
self.sec = security_context(self.conf, log)
|
||||
self._cache = _cache
|
||||
|
||||
# if cache:
|
||||
# if isinstance(cache, basestring):
|
||||
# self.cache = Cache(cache)
|
||||
# else:
|
||||
# self.cache = cache
|
||||
# else:
|
||||
# self.cache = Cache()
|
||||
|
||||
def load_config(self, config_file, stype="idp"):
|
||||
""" Load the server configuration
|
||||
|
||||
:param config_file: The name of the configuration file
|
||||
:param stype: The type of Server ("idp"/"aa")
|
||||
"""
|
||||
self.conf = config_factory(stype, config_file)
|
||||
if stype == "aa":
|
||||
return
|
||||
|
||||
try:
|
||||
# subject information is stored in a database
|
||||
# default database is a shelve database which is OK in some setups
|
||||
dbspec = self.conf.subject_data
|
||||
idb = None
|
||||
if isinstance(dbspec, basestring):
|
||||
idb = shelve.open(dbspec, writeback=True)
|
||||
else: # database spec is a 2-tuple (type, address)
|
||||
print >> sys.stderr, "DBSPEC: %s" % dbspec
|
||||
(typ, addr) = dbspec
|
||||
if typ == "shelve":
|
||||
idb = shelve.open(addr, writeback=True)
|
||||
elif typ == "memcached":
|
||||
idb = memcache.Client(addr)
|
||||
elif typ == "dict": # in-memory dictionary
|
||||
idb = addr
|
||||
|
||||
if idb is not None:
|
||||
self.ident = Identifier(idb, self.conf.virtual_organization,
|
||||
self.debug, self.log)
|
||||
else:
|
||||
raise Exception("Couldn't open identity database: %s" %
|
||||
(dbspec,))
|
||||
except AttributeError:
|
||||
self.ident = None
|
||||
|
||||
def issuer(self, entityid=None):
|
||||
""" Return an Issuer precursor """
|
||||
if entityid:
|
||||
return saml.Issuer(text=entityid,
|
||||
format=saml.NAMEID_FORMAT_ENTITY)
|
||||
else:
|
||||
return saml.Issuer(text=self.conf.entityid,
|
||||
format=saml.NAMEID_FORMAT_ENTITY)
|
||||
|
||||
def parse_authn_request(self, enc_request, binding=BINDING_HTTP_REDIRECT):
|
||||
"""Parse a Authentication Request
|
||||
|
||||
:param enc_request: The request in its transport format
|
||||
:param binding: Which binding that was used to transport the message
|
||||
to this entity.
|
||||
:return: A dictionary with keys:
|
||||
consumer_url - as gotten from the SPs entity_id and the metadata
|
||||
id - the id of the request
|
||||
sp_entity_id - the entity id of the SP
|
||||
request - The verified request
|
||||
"""
|
||||
|
||||
response = {}
|
||||
if self.log:
|
||||
_log_info = self.log.info
|
||||
else:
|
||||
_log_info = None
|
||||
|
||||
# The addresses I should receive messages like this on
|
||||
receiver_addresses = self.conf.endpoint("single_sign_on_service",
|
||||
binding)
|
||||
if self.debug and self.log:
|
||||
_log_info("receiver addresses: %s" % receiver_addresses)
|
||||
_log_info("Binding: %s" % binding)
|
||||
|
||||
|
||||
try:
|
||||
timeslack = self.conf.accepted_time_diff
|
||||
if not timeslack:
|
||||
timeslack = 0
|
||||
except AttributeError:
|
||||
timeslack = 0
|
||||
|
||||
authn_request = AuthnRequest(self.sec,
|
||||
self.conf.attribute_converters,
|
||||
receiver_addresses, log=self.log,
|
||||
timeslack=timeslack)
|
||||
|
||||
if binding == BINDING_SOAP or binding == BINDING_PAOS:
|
||||
# no base64 decoding or unzipping needed
|
||||
authn_request.debug=True
|
||||
_log_info("Don't decode")
|
||||
authn_request = authn_request.loads(enc_request, decode=False)
|
||||
else:
|
||||
authn_request = authn_request.loads(enc_request)
|
||||
|
||||
if self.debug and self.log:
|
||||
_log_info("Loaded authn_request")
|
||||
|
||||
if authn_request:
|
||||
authn_request = authn_request.verify()
|
||||
|
||||
if self.debug and self.log:
|
||||
_log_info("Verified authn_request")
|
||||
|
||||
if not authn_request:
|
||||
return None
|
||||
|
||||
response["id"] = authn_request.message.id # put in in_reply_to
|
||||
|
||||
sp_entity_id = authn_request.message.issuer.text
|
||||
# try to find return address in metadata
|
||||
try:
|
||||
# What's the binding ? ProtocolBinding
|
||||
_binding = authn_request.message.protocol_binding
|
||||
consumer_url = self.metadata.consumer_url(sp_entity_id,
|
||||
binding=_binding)
|
||||
except KeyError:
|
||||
if self.log:
|
||||
_log_info("Failed to find consumer URL for %s" % sp_entity_id)
|
||||
_log_info("entities: %s" % self.metadata.entity.keys())
|
||||
raise UnknownPrincipal(sp_entity_id)
|
||||
|
||||
if not consumer_url: # what to do ?
|
||||
if self.log:
|
||||
_log_info("Couldn't find a consumer URL binding=%s" % _binding)
|
||||
raise UnsupportedBinding(sp_entity_id)
|
||||
|
||||
response["sp_entity_id"] = sp_entity_id
|
||||
|
||||
if authn_request.message.assertion_consumer_service_url:
|
||||
return_destination = \
|
||||
authn_request.message.assertion_consumer_service_url
|
||||
|
||||
if consumer_url != return_destination:
|
||||
# serious error on someone's behalf
|
||||
if self.log:
|
||||
_log_info("%s != %s" % (consumer_url, return_destination))
|
||||
else:
|
||||
print >> sys.stderr, \
|
||||
"%s != %s" % (consumer_url, return_destination)
|
||||
raise OtherError("ConsumerURL and return destination mismatch")
|
||||
|
||||
response["consumer_url"] = consumer_url
|
||||
response["request"] = authn_request.message
|
||||
|
||||
return response
|
||||
|
||||
def wants(self, sp_entity_id):
|
||||
""" Returns what attributes the SP requiers and which are optional
|
||||
if any such demands are registered in the Metadata.
|
||||
|
||||
:param sp_entity_id: The entity id of the SP
|
||||
:return: 2-tuple, list of required and list of optional attributes
|
||||
"""
|
||||
return self.metadata.requests(sp_entity_id)
|
||||
|
||||
def parse_attribute_query(self, xml_string, decode=True):
|
||||
""" Parse an attribute query
|
||||
|
||||
:param xml_string: The Attribute Query as an XML string
|
||||
:param decode: Whether the XML string is base64 encoded and deflate compressed
|
||||
:return: 3-Tuple containing:
|
||||
subject - identifier of the subject
|
||||
attribute - which attributes the requestor wants back
|
||||
query - the whole query
|
||||
"""
|
||||
receiver_addresses = self.conf.endpoint("attribute_service")
|
||||
attribute_query = AttributeQuery( self.sec, receiver_addresses)
|
||||
|
||||
attribute_query = attribute_query.loads(xml_string, decode=decode)
|
||||
attribute_query = attribute_query.verify()
|
||||
|
||||
self.log.info("KEYS: %s" % attribute_query.message.keys())
|
||||
# The subject is described in a saml.Subject instance
|
||||
subject = attribute_query.subject_id()
|
||||
attribute = attribute_query.attribute()
|
||||
|
||||
return subject, attribute, attribute_query.message
|
||||
|
||||
# ------------------------------------------------------------------------
|
||||
|
||||
def _response(self, in_response_to, consumer_url=None, sp_entity_id=None,
|
||||
identity=None, name_id=None, status=None, sign=False,
|
||||
policy=Policy(), authn=None, authn_decl=None, issuer=None):
|
||||
""" Create a Response that adhers to the ??? profile.
|
||||
|
||||
:param in_response_to: The session identifier of the request
|
||||
:param consumer_url: The URL which should receive the response
|
||||
:param sp_entity_id: The entity identifier of the SP
|
||||
:param identity: A dictionary with attributes and values that are
|
||||
expected to be the basis for the assertion in the response.
|
||||
:param name_id: The identifier of the subject
|
||||
:param status: The status of the response
|
||||
:param sign: Whether the assertion should be signed or not
|
||||
:param policy: The attribute release policy for this instance
|
||||
:param authn: A 2-tuple denoting the authn class and the authn
|
||||
authority
|
||||
:param authn_decl:
|
||||
:param issuer: The issuer of the response
|
||||
:return: A Response instance
|
||||
"""
|
||||
|
||||
to_sign = []
|
||||
|
||||
if not status:
|
||||
status = success_status_factory()
|
||||
|
||||
_issuer = self.issuer(issuer)
|
||||
|
||||
response = response_factory(
|
||||
issuer=_issuer,
|
||||
in_response_to = in_response_to,
|
||||
status = status,
|
||||
)
|
||||
|
||||
if consumer_url:
|
||||
response.destination = consumer_url
|
||||
|
||||
if identity:
|
||||
ast = Assertion(identity)
|
||||
try:
|
||||
ast.apply_policy(sp_entity_id, policy, self.metadata)
|
||||
except MissingValue, exc:
|
||||
return self.error_response(in_response_to, consumer_url,
|
||||
sp_entity_id, exc, name_id)
|
||||
|
||||
if authn: # expected to be a 2-tuple class+authority
|
||||
(authn_class, authn_authn) = authn
|
||||
assertion = ast.construct(sp_entity_id, in_response_to,
|
||||
consumer_url, name_id,
|
||||
self.conf.attribute_converters,
|
||||
policy, issuer=_issuer,
|
||||
authn_class=authn_class,
|
||||
authn_auth=authn_authn)
|
||||
elif authn_decl:
|
||||
assertion = ast.construct(sp_entity_id, in_response_to,
|
||||
consumer_url, name_id,
|
||||
self.conf.attribute_converters,
|
||||
policy, issuer=_issuer,
|
||||
authn_decl=authn_decl)
|
||||
else:
|
||||
assertion = ast.construct(sp_entity_id, in_response_to,
|
||||
consumer_url, name_id,
|
||||
self.conf.attribute_converters,
|
||||
policy, issuer=_issuer)
|
||||
|
||||
if sign:
|
||||
assertion.signature = pre_signature_part(assertion.id,
|
||||
self.sec.my_cert, 1)
|
||||
# Just the assertion or the response and the assertion ?
|
||||
to_sign = [(class_name(assertion), assertion.id)]
|
||||
|
||||
# Store which assertion has been sent to which SP about which
|
||||
# subject.
|
||||
|
||||
# self.cache.set(assertion.subject.name_id.text,
|
||||
# sp_entity_id, {"ava": identity, "authn": authn},
|
||||
# assertion.conditions.not_on_or_after)
|
||||
|
||||
response.assertion = assertion
|
||||
|
||||
return signed_instance_factory(response, self.sec, to_sign)
|
||||
|
||||
# ------------------------------------------------------------------------
|
||||
|
||||
def do_response(self, in_response_to, consumer_url,
|
||||
sp_entity_id, identity=None, name_id=None,
|
||||
status=None, sign=False, authn=None, authn_decl=None,
|
||||
issuer=None):
|
||||
""" Create a response. A layer of indirection.
|
||||
|
||||
:param in_response_to: The session identifier of the request
|
||||
:param consumer_url: The URL which should receive the response
|
||||
:param sp_entity_id: The entity identifier of the SP
|
||||
:param identity: A dictionary with attributes and values that are
|
||||
expected to be the basis for the assertion in the response.
|
||||
:param name_id: The identifier of the subject
|
||||
:param status: The status of the response
|
||||
:param sign: Whether the assertion should be signed or not
|
||||
:param authn: A 2-tuple denoting the authn class and the authn
|
||||
authority.
|
||||
:param authn_decl:
|
||||
:param issuer: The issuer of the response
|
||||
:return: A Response instance.
|
||||
"""
|
||||
|
||||
policy = self.conf.policy
|
||||
|
||||
return self._response(in_response_to, consumer_url,
|
||||
sp_entity_id, identity, name_id,
|
||||
status, sign, policy, authn, authn_decl, issuer)
|
||||
|
||||
# ------------------------------------------------------------------------
|
||||
|
||||
def error_response(self, in_response_to, destination, spid, info,
|
||||
name_id=None, sign=False, issuer=None):
|
||||
""" Create a error response.
|
||||
|
||||
:param in_response_to: The identifier of the message this is a response
|
||||
to.
|
||||
:param destination: The intended recipient of this message
|
||||
:param spid: The entity ID of the SP that will get this.
|
||||
:param info: Either an Exception instance or a 2-tuple consisting of
|
||||
error code and descriptive text
|
||||
:param name_id:
|
||||
:param sign: Whether the message should be signed or not
|
||||
:param issuer: The issuer of the response
|
||||
:return: A Response instance
|
||||
"""
|
||||
status = error_status_factory(info)
|
||||
|
||||
return self._response(
|
||||
in_response_to, # in_response_to
|
||||
destination, # consumer_url
|
||||
spid, # sp_entity_id
|
||||
name_id=name_id,
|
||||
status=status,
|
||||
sign=sign,
|
||||
issuer=issuer
|
||||
)
|
||||
|
||||
# ------------------------------------------------------------------------
|
||||
#noinspection PyUnusedLocal
|
||||
def do_aa_response(self, in_response_to, consumer_url, sp_entity_id,
|
||||
identity=None, userid="", name_id=None, status=None,
|
||||
sign=False, _name_id_policy=None, issuer=None):
|
||||
""" Create an attribute assertion response.
|
||||
|
||||
:param in_response_to: The session identifier of the request
|
||||
:param consumer_url: The URL which should receive the response
|
||||
:param sp_entity_id: The entity identifier of the SP
|
||||
:param identity: A dictionary with attributes and values that are
|
||||
expected to be the basis for the assertion in the response.
|
||||
:param userid: An identifier of the user
|
||||
:param name_id: The identifier of the subject
|
||||
:param status: The status of the response
|
||||
:param sign: Whether the assertion should be signed or not
|
||||
:param _name_id_policy: Policy for NameID creation.
|
||||
:param issuer: The issuer of the response
|
||||
:return: A Response instance.
|
||||
"""
|
||||
# name_id = self.ident.construct_nameid(self.conf.policy, userid,
|
||||
# sp_entity_id, identity)
|
||||
|
||||
return self._response(in_response_to, consumer_url,
|
||||
sp_entity_id, identity, name_id,
|
||||
status, sign, policy=self.conf.policy, issuer=issuer)
|
||||
|
||||
# ------------------------------------------------------------------------
|
||||
|
||||
def authn_response(self, identity, in_response_to, destination,
|
||||
sp_entity_id, name_id_policy, userid, sign=False,
|
||||
authn=None, sign_response=False, authn_decl=None,
|
||||
issuer=None, instance=False):
|
||||
""" Constructs an AuthenticationResponse
|
||||
|
||||
:param identity: Information about a user
|
||||
:param in_response_to: The identifier of the authentication request
|
||||
this response is an answer to.
|
||||
:param destination: Where the response should be sent
|
||||
:param sp_entity_id: The entity identifier of the Service Provider
|
||||
:param name_id_policy: ...
|
||||
:param userid: The subject identifier
|
||||
:param sign: Whether the assertion should be signed or not. This is
|
||||
different from signing the response as such.
|
||||
:param authn: Information about the authentication
|
||||
:param sign_response: The response can be signed separately from the
|
||||
assertions.
|
||||
:param authn_decl:
|
||||
:param issuer: Issuer of the response
|
||||
:param instance: Whether to return the instance or a string
|
||||
representation
|
||||
:return: An XML string representing an authentication response
|
||||
"""
|
||||
|
||||
name_id = None
|
||||
try:
|
||||
nid_formats = []
|
||||
for _sp in self.metadata.entity[sp_entity_id]["sp_sso"]:
|
||||
nid_formats.extend([n.text for n in _sp.name_id_format])
|
||||
|
||||
policy = self.conf.policy
|
||||
name_id = self.ident.construct_nameid(policy, userid, sp_entity_id,
|
||||
identity, name_id_policy,
|
||||
nid_formats)
|
||||
except IOError, exc:
|
||||
response = self.error_response(in_response_to, destination,
|
||||
sp_entity_id, exc, name_id)
|
||||
return ("%s" % response).split("\n")
|
||||
|
||||
try:
|
||||
response = self.do_response(
|
||||
in_response_to, # in_response_to
|
||||
destination, # consumer_url
|
||||
sp_entity_id, # sp_entity_id
|
||||
identity, # identity as dictionary
|
||||
name_id,
|
||||
sign=sign, # If the assertion should be signed
|
||||
authn=authn, # Information about the
|
||||
# authentication
|
||||
authn_decl=authn_decl,
|
||||
issuer=issuer
|
||||
)
|
||||
except MissingValue, exc:
|
||||
response = self.error_response(in_response_to, destination,
|
||||
sp_entity_id, exc, name_id)
|
||||
|
||||
|
||||
if sign_response:
|
||||
try:
|
||||
response.signature = pre_signature_part(response.id,
|
||||
self.sec.my_cert, 2)
|
||||
|
||||
return self.sec.sign_statement_using_xmlsec(response,
|
||||
class_name(response),
|
||||
nodeid=response.id)
|
||||
except Exception, exc:
|
||||
response = self.error_response(in_response_to, destination,
|
||||
sp_entity_id, exc, name_id)
|
||||
if instance:
|
||||
return response
|
||||
else:
|
||||
return ("%s" % response).split("\n")
|
||||
else:
|
||||
if instance:
|
||||
return response
|
||||
else:
|
||||
return ("%s" % response).split("\n")
|
||||
|
||||
def parse_logout_request(self, text, binding=BINDING_SOAP):
|
||||
"""Parse a Logout Request
|
||||
|
||||
:param text: The request in its transport format. If the binding is
|
||||
HTTP-Redirect or HTTP-POST, the text *must* be the value of the
|
||||
SAMLRequest attribute.
|
||||
:return: A validated LogoutRequest instance or None if validation
|
||||
failed.
|
||||
"""
|
||||
|
||||
try:
|
||||
slo = self.conf.endpoint("single_logout_service", binding)
|
||||
except IndexError:
|
||||
if self.log:
|
||||
self.log.info("enpoints: %s" % (self.conf.endpoints,))
|
||||
self.log.info("binding wanted: %s" % (binding,))
|
||||
raise
|
||||
|
||||
if not slo:
|
||||
raise Exception("No single_logout_server for that binding")
|
||||
|
||||
if self.log:
|
||||
self.log.info("Endpoint: %s" % slo)
|
||||
req = LogoutRequest(self.sec, slo)
|
||||
if binding == BINDING_SOAP:
|
||||
lreq = soap.parse_soap_enveloped_saml_logout_request(text)
|
||||
try:
|
||||
req = req.loads(lreq, False) # Got it over SOAP so no base64+zip
|
||||
except Exception:
|
||||
return None
|
||||
else:
|
||||
try:
|
||||
req = req.loads(text)
|
||||
except Exception, exc:
|
||||
self.log.error("%s" % (exc,))
|
||||
return None
|
||||
|
||||
req = req.verify()
|
||||
|
||||
if not req: # Not a valid request
|
||||
# return an error message with the status code element set to
|
||||
# urn:oasis:names:tc:SAML:2.0:status:Requester
|
||||
return None
|
||||
else:
|
||||
return req
|
||||
|
||||
|
||||
def logout_response(self, request, bindings, status=None,
|
||||
sign=False, issuer=None):
|
||||
""" Create a LogoutResponse. What is returned depends on which binding
|
||||
is used.
|
||||
|
||||
:param request: The request this is a response to
|
||||
:param bindings: Which bindings can be used to send the response
|
||||
:param status: The return status of the response operation
|
||||
:param issuer: The issuer of the message
|
||||
:return: A 3-tuple consisting of HTTP return code, HTTP headers and
|
||||
possibly a message.
|
||||
"""
|
||||
sp_entity_id = request.issuer.text.strip()
|
||||
|
||||
binding = None
|
||||
destinations = []
|
||||
for binding in bindings:
|
||||
destinations = self.conf.single_logout_services(sp_entity_id,
|
||||
binding)
|
||||
if destinations:
|
||||
break
|
||||
|
||||
|
||||
if not destinations:
|
||||
if self.log:
|
||||
self.log.error("Not way to return a response !!!")
|
||||
return ("412 Precondition Failed",
|
||||
[("Content-type", "text/html")],
|
||||
["No return way defined"])
|
||||
|
||||
# Pick the first
|
||||
destination = destinations[0]
|
||||
|
||||
if self.log:
|
||||
self.log.info("Logout Destination: %s, binding: %s" % (destination,
|
||||
binding))
|
||||
if not status:
|
||||
status = success_status_factory()
|
||||
|
||||
mid = sid()
|
||||
rcode = "200 OK"
|
||||
|
||||
# the response and its packaging differ depending on the binding
|
||||
|
||||
if binding == BINDING_SOAP:
|
||||
response = logoutresponse_factory(
|
||||
sign=sign,
|
||||
id = mid,
|
||||
in_response_to = request.id,
|
||||
status = status,
|
||||
)
|
||||
if sign:
|
||||
to_sign = [(class_name(response), mid)]
|
||||
response = signed_instance_factory(response, self.sec, to_sign)
|
||||
|
||||
(headers, message) = http_soap_message(response)
|
||||
else:
|
||||
_issuer = self.issuer(issuer)
|
||||
response = logoutresponse_factory(
|
||||
sign=sign,
|
||||
id = mid,
|
||||
in_response_to = request.id,
|
||||
status = status,
|
||||
issuer = _issuer,
|
||||
destination = destination,
|
||||
sp_entity_id = sp_entity_id,
|
||||
instant=instant(),
|
||||
)
|
||||
if sign:
|
||||
to_sign = [(class_name(response), mid)]
|
||||
response = signed_instance_factory(response, self.sec, to_sign)
|
||||
|
||||
if self.log:
|
||||
self.log.info("Response: %s" % (response,))
|
||||
if binding == BINDING_HTTP_REDIRECT:
|
||||
(headers, message) = http_redirect_message(response,
|
||||
destination,
|
||||
typ="SAMLResponse")
|
||||
rcode = "302 Found"
|
||||
else:
|
||||
(headers, message) = http_post_message(response, destination,
|
||||
typ="SAMLResponse")
|
||||
|
||||
return rcode, headers, message
|
||||
|
||||
def parse_authz_decision_query(self, xml_string):
|
||||
""" Parse an attribute query
|
||||
|
||||
:param xml_string: The Authz decision Query as an XML string
|
||||
:return: 3-Tuple containing:
|
||||
subject - identifier of the subject
|
||||
attribute - which attributes the requestor wants back
|
||||
query - the whole query
|
||||
"""
|
||||
receiver_addresses = self.conf.endpoint("attribute_service")
|
||||
attribute_query = AttributeQuery( self.sec, receiver_addresses)
|
||||
|
||||
attribute_query = attribute_query.loads(xml_string)
|
||||
attribute_query = attribute_query.verify()
|
||||
|
||||
# Subject name is a BaseID,NameID or EncryptedID instance
|
||||
subject = attribute_query.subject_id()
|
||||
attribute = attribute_query.attribute()
|
||||
|
||||
return subject, attribute, attribute_query.message
|
||||
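# ----------------------------------------------------------------------------
# Illustrative sketch (editor's addition, not part of the original module):
# the rough shape of an IdP web endpoint built on the Server class above. The
# config file name, user id, attribute values and authn tuple are hypothetical,
# and a real endpoint would of course authenticate the user and render the
# redirect/POST itself.

def _example_sso_endpoint(saml_request):
    """Editor's sketch: handle one SAMLRequest received on the SSO endpoint."""
    idp = Server("idp_conf")  # hypothetical pysaml2 configuration file/module

    # Decode, verify and look up where the answer should go.
    req_info = idp.parse_authn_request(saml_request, BINDING_HTTP_REDIRECT)
    if req_info is None:
        return None

    identity = {"surName": "Example", "mail": "user@example.org"}

    # Build (and possibly sign) the Response; without instance=True a list of
    # text lines is returned, as described in the method's docstring.
    return idp.authn_response(
        identity,
        req_info["id"],              # becomes InResponseTo
        req_info["consumer_url"],    # destination taken from the SP metadata
        req_info["sp_entity_id"],
        req_info["request"].name_id_policy,
        "user@example.org",          # local user id
        authn=("urn:oasis:names:tc:SAML:2.0:ac:classes:Password",
               idp.conf.entityid))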
1029
src/saml2/sigver.py
Normal file
1029
src/saml2/sigver.py
Normal file
File diff suppressed because it is too large
336
src/saml2/soap.py
Normal file
336
src/saml2/soap.py
Normal file
@@ -0,0 +1,336 @@
|
||||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# Copyright (C) 2009-2011 Umeå University
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
"""
|
||||
Support for the client part of the SAML 2.0 SOAP binding.
|
||||
"""
|
||||
|
||||
from httplib2 import Http
|
||||
|
||||
from saml2 import httplib2cookie
|
||||
from saml2 import create_class_from_element_tree
|
||||
from saml2.samlp import NAMESPACE as SAMLP_NAMESPACE
|
||||
#from saml2 import element_to_extension_element
|
||||
from saml2.schema import soapenv
|
||||
from saml2 import class_name
|
||||
|
||||
try:
|
||||
from xml.etree import cElementTree as ElementTree
|
||||
except ImportError:
|
||||
try:
|
||||
import cElementTree as ElementTree
|
||||
except ImportError:
|
||||
#noinspection PyUnresolvedReferences
|
||||
from elementtree import ElementTree
|
||||
|
||||
class XmlParseError(Exception):
|
||||
pass
|
||||
|
||||
|
||||
#NAMESPACE = "http://schemas.xmlsoap.org/soap/envelope/"
|
||||
|
||||
def parse_soap_enveloped_saml_response(text):
|
||||
tags = ['{%s}Response' % SAMLP_NAMESPACE,
|
||||
'{%s}LogoutResponse' % SAMLP_NAMESPACE]
|
||||
return parse_soap_enveloped_saml_thingy(text, tags)
|
||||
|
||||
def parse_soap_enveloped_saml_attribute_query(text):
|
||||
expected_tag = '{%s}AttributeQuery' % SAMLP_NAMESPACE
|
||||
return parse_soap_enveloped_saml_thingy(text, [expected_tag])
|
||||
|
||||
def parse_soap_enveloped_saml_logout_request(text):
|
||||
expected_tag = '{%s}LogoutRequest' % SAMLP_NAMESPACE
|
||||
return parse_soap_enveloped_saml_thingy(text, [expected_tag])
|
||||
|
||||
def parse_soap_enveloped_saml_authentication_request(text):
|
||||
expected_tag = '{%s}AuthenticationRequest' % SAMLP_NAMESPACE
|
||||
return parse_soap_enveloped_saml_thingy(text, [expected_tag])
|
||||
|
||||
#def parse_soap_enveloped_saml_logout_response(text):
|
||||
# expected_tag = '{%s}LogoutResponse' % SAMLP_NAMESPACE
|
||||
# return parse_soap_enveloped_saml_thingy(text, [expected_tag])
|
||||
|
||||
def parse_soap_enveloped_saml_thingy(text, expected_tags):
|
||||
"""Parses a SOAP enveloped SAML thing and returns the thing as
|
||||
a string.
|
||||
|
||||
:param text: The SOAP object as XML
|
||||
:param expected_tags: The tags the SAML thingy is expected to have.
|
||||
:return: SAML thingy as a string
|
||||
"""
|
||||
envelope = ElementTree.fromstring(text)
|
||||
# if True:
|
||||
# fil = open("soap.xml", "w")
|
||||
# fil.write(text)
|
||||
# fil.close()
|
||||
|
||||
assert envelope.tag == '{%s}Envelope' % soapenv.NAMESPACE
|
||||
|
||||
assert len(envelope) >= 1
|
||||
body = None
|
||||
for part in envelope:
|
||||
if part.tag == '{%s}Body' % soapenv.NAMESPACE:
|
||||
assert len(part) == 1
|
||||
body = part
|
||||
break
|
||||
|
||||
if body is None:
|
||||
return ""
|
||||
|
||||
saml_part = body[0]
|
||||
if saml_part.tag in expected_tags:
|
||||
return ElementTree.tostring(saml_part, encoding="UTF-8")
|
||||
else:
|
||||
return ""
|
||||
|
||||
import re
|
||||
|
||||
NS_AND_TAG = re.compile("\{([^}]+)\}(.*)")
|
||||
|
||||
def class_instances_from_soap_enveloped_saml_thingies(text, modules):
|
||||
"""Parses a SOAP enveloped header and body SAML thing and returns the
|
||||
returns them as class instances in a dictionary.
|
||||
|
||||
:param text: The SOAP object as XML
|
||||
:param modules: modules representing xsd schemas
|
||||
:return: SAML thingy as a class instance
|
||||
"""
|
||||
try:
|
||||
envelope = ElementTree.fromstring(text)
|
||||
except Exception, exc:
|
||||
raise XmlParseError("%s" % exc)
|
||||
|
||||
assert envelope.tag == '{%s}Envelope' % soapenv.NAMESPACE
|
||||
assert len(envelope) >= 1
|
||||
env = {"header":[], "body":None}
|
||||
|
||||
for part in envelope:
|
||||
if part.tag == '{%s}Body' % soapenv.NAMESPACE:
|
||||
assert len(part) == 1
|
||||
m = NS_AND_TAG.match(part[0].tag)
|
||||
ns,tag = m.groups()
|
||||
for module in modules:
|
||||
if module.NAMESPACE == ns:
|
||||
try:
|
||||
target = module.ELEMENT_BY_TAG[tag]
|
||||
env["body"] = create_class_from_element_tree(target,
|
||||
part[0])
|
||||
except KeyError:
|
||||
continue
|
||||
elif part.tag == "{%s}Header" % soapenv.NAMESPACE:
|
||||
for item in part:
|
||||
m = NS_AND_TAG.match(item.tag)
|
||||
ns,tag = m.groups()
|
||||
for module in modules:
|
||||
if module.NAMESPACE == ns:
|
||||
try:
|
||||
target = module.ELEMENT_BY_TAG[tag]
|
||||
env["header"].append(create_class_from_element_tree(
|
||||
target,
|
||||
item))
|
||||
except KeyError:
|
||||
continue
|
||||
|
||||
return env
|
||||
|
||||
def make_soap_enveloped_saml_thingy(thingy, headers=None):
|
||||
""" Returns a soap envelope containing a SAML request
|
||||
as a text string.
|
||||
|
||||
:param thingy: The SAML thingy
|
||||
:return: The SOAP envelope as a string
|
||||
"""
|
||||
soap_envelope = soapenv.Envelope()
|
||||
|
||||
if headers:
|
||||
_header = soapenv.Header()
|
||||
_header.add_extension_elements(headers)
|
||||
soap_envelope.header = _header
|
||||
|
||||
soap_envelope.body = soapenv.Body()
|
||||
soap_envelope.body.add_extension_element(thingy)
|
||||
|
||||
return "%s" % soap_envelope
|
||||
|
||||
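# ----------------------------------------------------------------------------
# Illustrative sketch (editor's addition, not part of the original module):
# wrapping a SAML message in a SOAP envelope and getting it back out again.
# It assumes saml2.samlp provides a LogoutRequest class, as elsewhere in this
# package; the id, timestamp and issuer values are made up.

def _example_soap_roundtrip():
    """Editor's sketch: LogoutRequest -> SOAP envelope -> LogoutRequest XML."""
    from saml2 import samlp, saml

    request = samlp.LogoutRequest(
        id="id-12345",
        version="2.0",
        issue_instant="2012-03-27T09:00:00Z",
        issuer=saml.Issuer(text="https://idp.example.org/idp"))

    # Put the request inside <Envelope><Body>...</Body></Envelope>
    envelope = make_soap_enveloped_saml_thingy(request)

    # ... and extract it again, as the receiving side does when the message
    # arrives over the SOAP binding. The result is the LogoutRequest as a
    # plain XML string (or "" if the body held something unexpected).
    return parse_soap_enveloped_saml_logout_request(envelope)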
def soap_fault(message=None, actor=None, code=None, detail=None):
|
||||
""" Create a SOAP Fault message
|
||||
|
||||
:param message: Human readable error message
|
||||
:param actor: Who discovered the error
|
||||
:param code: Error code
|
||||
:param detail: More specific error message
|
||||
:return: A SOAP Fault message as a string
|
||||
"""
|
||||
_string = _actor = _code = _detail = None
|
||||
|
||||
if message:
|
||||
_string = soapenv.Fault_faultstring(text=message)
|
||||
if actor:
|
||||
_actor = soapenv.Fault_faultactor(text=actor)
|
||||
if code:
|
||||
_code = soapenv.Fault_faultcode(text=code)
|
||||
if detail:
|
||||
_detail = soapenv.Fault_detail(text=detail)
|
||||
|
||||
fault = soapenv.Fault(
|
||||
faultcode=_code,
|
||||
faultstring=_string,
|
||||
faultactor=_actor,
|
||||
detail=_detail,
|
||||
)
|
||||
|
||||
return "%s" % fault
|
||||
|
||||
class HTTPClient(object):
|
||||
""" For sending a message to a HTTP server using POST or GET """
|
||||
def __init__(self, path, keyfile=None, certfile=None, log=None,
|
||||
cookiejar=None, ca_certs="",
|
||||
disable_ssl_certificate_validation=True):
|
||||
self.path = path
|
||||
if cookiejar is not None:
|
||||
self.cj = True
|
||||
self.server = httplib2cookie.CookiefulHttp(cookiejar,
|
||||
ca_certs=ca_certs,
|
||||
disable_ssl_certificate_validation=disable_ssl_certificate_validation)
|
||||
else:
|
||||
self.cj = False
|
||||
self.server = Http(ca_certs=ca_certs,
|
||||
disable_ssl_certificate_validation=disable_ssl_certificate_validation)
|
||||
self.log = log
|
||||
self.response = None
|
||||
|
||||
if keyfile:
|
||||
self.server.add_certificate(keyfile, certfile, "")
|
||||
|
||||
def post(self, data, headers=None, path=None):
|
||||
if headers is None:
|
||||
headers = {}
|
||||
if path is None:
|
||||
path = self.path
|
||||
|
||||
if self.cj:
|
||||
(response, content) = self.server.crequest(path, method="POST",
|
||||
body=data,
|
||||
headers=headers)
|
||||
else:
|
||||
(response, content) = self.server.request(path, method="POST",
|
||||
body=data,
|
||||
headers=headers)
|
||||
|
||||
if response.status == 200 or response.status == 201:
|
||||
return content
|
||||
# elif response.status == 302: # redirect
|
||||
# return self.post(data, headers, response["location"])
|
||||
else:
|
||||
self.response = response
|
||||
self.error_description = content
|
||||
return False
|
||||
|
||||
def get(self, headers=None, path=None):
|
||||
if path is None:
|
||||
path = self.path
|
||||
|
||||
if headers is None:
|
||||
headers = {"content-type": "text/html"}
|
||||
|
||||
(response, content) = self.server.crequest(path, method="GET",
|
||||
headers=headers)
|
||||
if response.status == 200 or response.status == 201:
|
||||
return content
|
||||
# elif response.status == 302: # redirect
|
||||
# return self.get(headers, response["location"])
|
||||
else:
|
||||
self.response = response
|
||||
self.error_description = content
|
||||
return None
|
||||
|
||||
def put(self, data, headers=None, path=None):
|
||||
if headers is None:
|
||||
headers = {}
|
||||
if path is None:
|
||||
path = self.path
|
||||
|
||||
(response, content) = self.server.crequest(path, method="PUT",
|
||||
body=data,
|
||||
headers=headers)
|
||||
if response.status == 200 or response.status == 201:
|
||||
return content
|
||||
else:
|
||||
self.response = response
|
||||
self.error_description = content
|
||||
return False
|
||||
|
||||
def delete(self, headers=None, path=None):
|
||||
if headers is None:
|
||||
headers = {}
|
||||
if path is None:
|
||||
path = self.path
|
||||
|
||||
(response, content) = self.server.crequest(path, method="DELETE",
|
||||
headers=headers)
|
||||
if response.status == 200 or response.status == 201:
|
||||
return content
|
||||
else:
|
||||
self.response = response
|
||||
self.error_description = content
|
||||
return False
|
||||
|
||||
|
||||
def add_credentials(self, name, passwd):
|
||||
self.server.add_credentials(name, passwd)
|
||||
|
||||
def clear_credentials(self):
|
||||
self.server.clear_credentials()
|
||||
|
||||
|
||||
class SOAPClient(object):
|
||||
|
||||
def __init__(self, server_url, keyfile=None, certfile=None, log=None,
|
||||
cookiejar=None, ca_certs="",
|
||||
disable_ssl_certificate_validation=True):
|
||||
self.server = HTTPClient(server_url, keyfile, certfile, log,
|
||||
cookiejar, ca_certs=ca_certs,
|
||||
disable_ssl_certificate_validation=disable_ssl_certificate_validation)
|
||||
self.log = log
|
||||
self.response = None
|
||||
|
||||
def send(self, request, path=None, headers=None, sign=None, sec=None):
|
||||
if headers is None:
|
||||
headers = {"content-type": "application/soap+xml"}
|
||||
else:
|
||||
headers.update({"content-type": "application/soap+xml"})
|
||||
|
||||
soap_message = make_soap_enveloped_saml_thingy(request)
|
||||
if sign:
|
||||
_signed = sec.sign_statement_using_xmlsec(soap_message,
|
||||
class_name(request),
|
||||
nodeid=request.id)
|
||||
soap_message = _signed
|
||||
|
||||
_response = self.server.post(soap_message, headers, path=path)
|
||||
|
||||
self.response = _response
|
||||
if _response:
|
||||
if self.log:
|
||||
self.log.info("SOAP response: %s" % _response)
|
||||
return parse_soap_enveloped_saml_response(_response)
|
||||
else:
|
||||
return False
|
||||
|
||||
def add_credentials(self, name, passwd):
|
||||
self.server.add_credentials(name, passwd)
|
||||
|
||||
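# ----------------------------------------------------------------------------
# Illustrative sketch (editor's addition, not part of the original module):
# how a caller might use SOAPClient to send a SAML request over the SOAP
# binding. The endpoint URL and key/cert file names are made up; `request` is
# any SAML message class instance with an `id` attribute, and `sec_context` a
# saml2.sigver security context when signing is wanted.

def _example_send_over_soap(request, sec_context=None):
    """Editor's sketch: POST a SAML request to a SOAP endpoint."""
    client = SOAPClient("https://idp.example.org/slo/soap",   # hypothetical URL
                        keyfile="pki/mykey.pem", certfile="pki/mycert.pem")

    # send() wraps the request in a SOAP envelope, optionally signs it, POSTs
    # it and unwraps the SAML part of the answer (or returns False on error).
    return client.send(request, sign=bool(sec_context), sec=sec_context)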
292
src/saml2/time_util.py
Normal file
292
src/saml2/time_util.py
Normal file
@@ -0,0 +1,292 @@
|
||||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# Copyright (C) 2009-2011 Umeå University
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
"""
|
||||
Implements some useful functions for dealing with the validity of
|
||||
different types of information.
|
||||
"""
|
||||
|
||||
import calendar
|
||||
import re
|
||||
import time
|
||||
import sys
|
||||
|
||||
from datetime import timedelta
|
||||
from datetime import datetime
|
||||
|
||||
TIME_FORMAT = "%Y-%m-%dT%H:%M:%SZ"
|
||||
TIME_FORMAT_WITH_FRAGMENT = re.compile(
|
||||
"^(\d{4,4}-\d{2,2}-\d{2,2}T\d{2,2}:\d{2,2}:\d{2,2})\.\d*Z$")
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# I'm sure this is implemented somewhere else, but I can't find it now, so I
|
||||
# made an attempt.
|
||||
#Implemented according to
|
||||
#http://www.w3.org/TR/2001/REC-xmlschema-2-20010502/
|
||||
#adding-durations-to-dateTimes
|
||||
|
||||
def f_quotient(arg0, arg1, arg2=0):
|
||||
if arg2:
|
||||
return int((arg0-arg1)/(arg2-arg1))
|
||||
elif not arg0:
|
||||
return 0
|
||||
else:
|
||||
return int(arg0/arg1)
|
||||
|
||||
def modulo(arg0, arg1, arg2=0):
|
||||
if arg2:
|
||||
return ((arg0 - arg1) % (arg2 - arg1)) + arg1
|
||||
else:
|
||||
return arg0 % arg1
|
||||
|
||||
|
||||
def maximum_day_in_month_for(year, month):
|
||||
return calendar.monthrange(year, month)[1]
|
||||
|
||||
|
||||
D_FORMAT = [
|
||||
("Y", "tm_year"),
|
||||
("M", "tm_mon"),
|
||||
("D", "tm_mday"),
|
||||
("T", None),
|
||||
("H", "tm_hour"),
|
||||
("M", "tm_min"),
|
||||
("S", "tm_sec")
|
||||
]
|
||||
|
||||
def parse_duration(duration):
|
||||
# (-)PnYnMnDTnHnMnS
|
||||
index = 0
|
||||
if duration[0] == '-':
|
||||
sign = '-'
|
||||
index += 1
|
||||
else:
|
||||
sign = '+'
|
||||
assert duration[index] == "P"
|
||||
index += 1
|
||||
|
||||
dic = dict([(typ, 0) for (code, typ) in D_FORMAT])
|
||||
|
||||
for code, typ in D_FORMAT:
|
||||
#print duration[index:], code
|
||||
if duration[index] == '-':
|
||||
raise Exception("Negation not allowed on individual items")
|
||||
if code == "T":
|
||||
if duration[index] == "T":
|
||||
index += 1
|
||||
if index == len(duration):
|
||||
raise Exception("Not allowed to end with 'T'")
|
||||
else:
|
||||
raise Exception("Missing T")
|
||||
else:
|
||||
try:
|
||||
mod = duration[index:].index(code)
|
||||
try:
|
||||
dic[typ] = int(duration[index:index+mod])
|
||||
except ValueError:
|
||||
if code == "S":
|
||||
try:
|
||||
dic[typ] = float(duration[index:index+mod])
|
||||
except ValueError:
|
||||
raise Exception("Not a float")
|
||||
else:
|
||||
raise Exception(
|
||||
"Fractions not allow on anything byt seconds")
|
||||
index = mod+index+1
|
||||
except ValueError:
|
||||
dic[typ] = 0
|
||||
|
||||
if index == len(duration):
|
||||
break
|
||||
|
||||
return sign, dic
|
||||
|
||||
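# ----------------------------------------------------------------------------
# Illustrative sketch (editor's addition, not part of the original module):
# what parse_duration() hands back for a full xsd:duration value. Note that
# the keys mirror the time.struct_time attribute names listed in D_FORMAT.

def _example_parse_duration():
    """Editor's sketch: pick apart 'P1Y2M3DT10H30M'."""
    sign, parts = parse_duration("P1Y2M3DT10H30M")
    # sign == '+'
    # parts["tm_year"] == 1, parts["tm_mon"] == 2, parts["tm_mday"] == 3,
    # parts["tm_hour"] == 10, parts["tm_min"] == 30, parts["tm_sec"] == 0
    # (there is also a None key, coming from the "T" separator in D_FORMAT)
    return sign, parts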
def add_duration(tid, duration):
|
||||
|
||||
(sign, dur) = parse_duration(duration)
|
||||
|
||||
if sign == '+':
|
||||
#Months
|
||||
temp = tid.tm_mon + dur["tm_mon"]
|
||||
month = modulo(temp, 1, 13)
|
||||
carry = f_quotient(temp, 1, 13)
|
||||
#Years
|
||||
year = tid.tm_year + dur["tm_year"] + carry
|
||||
# seconds
|
||||
temp = tid.tm_sec + dur["tm_sec"]
|
||||
secs = modulo(temp, 60)
|
||||
carry = f_quotient(temp, 60)
|
||||
# minutes
|
||||
temp = tid.tm_min + dur["tm_min"] + carry
|
||||
minutes = modulo(temp, 60)
|
||||
carry = f_quotient(temp, 60)
|
||||
# hours
|
||||
temp = tid.tm_hour + dur["tm_hour"] + carry
|
||||
hour = modulo(temp, 24)  # hours wrap at 24, not 60
|
||||
carry = f_quotient(temp, 24)
|
||||
# days
|
||||
if dur["tm_mday"] > maximum_day_in_month_for(year, month):
|
||||
temp_days = maximum_day_in_month_for(year, month)
|
||||
elif dur["tm_mday"] < 1:
|
||||
temp_days = 1
|
||||
else:
|
||||
temp_days = dur["tm_mday"]
|
||||
days = temp_days + tid.tm_mday + carry
|
||||
while True:
|
||||
if days < 1:
|
||||
pass  # cannot happen for positive durations; days is always >= 1 here
|
||||
elif days > maximum_day_in_month_for(year, month):
|
||||
days = days - maximum_day_in_month_for(year, month)
|
||||
carry = 1
|
||||
else:
|
||||
break
|
||||
temp = month + carry
|
||||
month = modulo(temp, 1, 13)
|
||||
year = year + f_quotient(temp, 1, 13)
|
||||
|
||||
return time.localtime(time.mktime((year, month, days, hour, minutes,
|
||||
secs, 0, 0, -1)))
|
||||
else:
|
||||
pass  # negative durations are not handled
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
def time_in_a_while(days=0, seconds=0, microseconds=0, milliseconds=0,
|
||||
minutes=0, hours=0, weeks=0):
|
||||
"""
|
||||
format of timedelta:
|
||||
timedelta([days[, seconds[, microseconds[, milliseconds[,
|
||||
minutes[, hours[, weeks]]]]]]])
|
||||
"""
|
||||
delta = timedelta(days, seconds, microseconds, milliseconds,
|
||||
minutes, hours, weeks)
|
||||
return datetime.utcnow() + delta
|
||||
|
||||
|
||||
def time_a_while_ago(days=0, seconds=0, microseconds=0, milliseconds=0,
|
||||
minutes=0, hours=0, weeks=0):
|
||||
"""
|
||||
format of timedelta:
|
||||
timedelta([days[, seconds[, microseconds[, milliseconds[,
|
||||
minutes[, hours[, weeks]]]]]]])
|
||||
"""
|
||||
delta = timedelta(days, seconds, microseconds, milliseconds,
|
||||
minutes, hours, weeks)
|
||||
return datetime.utcnow() - delta
|
||||
|
||||
|
||||
def in_a_while(days=0, seconds=0, microseconds=0, milliseconds=0,
|
||||
minutes=0, hours=0, weeks=0, format=TIME_FORMAT):
|
||||
"""
|
||||
format of timedelta:
|
||||
timedelta([days[, seconds[, microseconds[, milliseconds[,
|
||||
minutes[, hours[, weeks]]]]]]])
|
||||
"""
|
||||
if format is None:
|
||||
format = TIME_FORMAT
|
||||
|
||||
return time_in_a_while(days, seconds, microseconds, milliseconds,
|
||||
minutes, hours, weeks).strftime(format)
|
||||
|
||||
|
||||
def a_while_ago(days=0, seconds=0, microseconds=0, milliseconds=0,
|
||||
minutes=0, hours=0, weeks=0, format=TIME_FORMAT):
|
||||
return time_a_while_ago(days, seconds, microseconds, milliseconds,
|
||||
minutes, hours, weeks).strftime(format)
|
||||
|
||||
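# ----------------------------------------------------------------------------
# Illustrative sketch (editor's addition, not part of the original module):
# the helpers above are what an IdP typically uses to fill in the
# NotBefore/NotOnOrAfter condition attributes of an assertion.

def _example_assertion_window():
    """Editor's sketch: a five minute validity window around 'now'."""
    not_before = a_while_ago(seconds=30)      # allow for a little clock skew
    not_on_or_after = in_a_while(minutes=5)   # assertion expires in 5 minutes
    # Both are strings of the form "2012-03-27T09:00:00Z" (TIME_FORMAT).
    return not_before, not_on_or_after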
# ---------------------------------------------------------------------------
|
||||
|
||||
def shift_time(dtime, shift):
|
||||
""" Adds/deletes an integer amount of seconds from a datetime specification
|
||||
|
||||
:param dtime: The datetime specification
|
||||
:param shift: The wanted time shift (+/-)
|
||||
:return: A shifted datetime specification
|
||||
"""
|
||||
return dtime + timedelta(seconds=shift)
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
def str_to_time(timestr):
|
||||
if not timestr:
|
||||
return 0
|
||||
try:
|
||||
then = time.strptime(timestr, TIME_FORMAT)
|
||||
except Exception: # assume it's a format problem
|
||||
try:
|
||||
elem = TIME_FORMAT_WITH_FRAGMENT.match(timestr)
|
||||
except Exception, exc:
|
||||
print >> sys.stderr, "Exception: %s on %s" % (exc, timestr)
|
||||
raise
|
||||
then = time.strptime(elem.groups()[0]+"Z", TIME_FORMAT)
|
||||
|
||||
return time.gmtime(calendar.timegm(then))
|
||||
|
||||
|
||||
def instant(format=TIME_FORMAT):
|
||||
return time.strftime(format, time.gmtime())
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
def utc_now():
|
||||
return calendar.timegm(time.gmtime())
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
def before(point):
|
||||
""" True if point datetime specification is before now """
|
||||
if not point:
|
||||
return True
|
||||
|
||||
if isinstance(point, basestring):
|
||||
point = str_to_time(point)
|
||||
elif isinstance(point, int):
|
||||
point = time.gmtime(point)
|
||||
|
||||
return time.gmtime() < point
|
||||
|
||||
|
||||
def after(point):
|
||||
""" True if point datetime specification is equal or after now """
|
||||
if not point:
|
||||
return True
|
||||
else:
|
||||
return not before(point)
|
||||
|
||||
|
||||
not_before = after
|
||||
|
||||
# 'not_on_or_after' is just an obscure name for 'before'
|
||||
not_on_or_after = before
|
||||
|
||||
# a point is valid if it is now or sometime in the future, in other words,
|
||||
# if it is not before now
|
||||
valid = before
|
||||
|
||||
|
||||
def later_than(then, that):
|
||||
""" True if then is later or equal to that """
|
||||
if isinstance(then, basestring):
|
||||
then = str_to_time(then)
|
||||
elif isinstance(then, int):
|
||||
then = time.gmtime(then)
|
||||
|
||||
if isinstance(that, basestring):
|
||||
that = str_to_time(that)
|
||||
elif isinstance(that, int):
|
||||
that = time.gmtime(that)
|
||||
|
||||
return then >= that
|
||||
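# ----------------------------------------------------------------------------
# Illustrative sketch (editor's addition, not part of the original module):
# how the aliases above read at a call site when checking the conditions of a
# received assertion. The attribute values are made-up examples.

def _example_check_conditions():
    """Editor's sketch: evaluate NotBefore / NotOnOrAfter conditions."""
    nb = "2012-03-27T08:55:00Z"      # conditions.not_before
    noa = "2012-03-27T09:05:00Z"     # conditions.not_on_or_after
    # The assertion is acceptable if the NotBefore time has been reached and
    # the NotOnOrAfter time has not.
    return not_before(nb) and not_on_or_after(noa)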
Some files were not shown because too many files have changed in this diff