shipyard/alembic/env.py
Bryan Strassner 38e58cfd30 Add Action API
This change introduces a large section of the API for the next major
version of Shipyard - the Action API. By interfacing with Airflow,
Shipyard will invoke workflows and allow for controlling and querying
status of those workflows. Foundationally, this patchset introduces
a lot of framework code for other APIs, including error handling
to a common output format, database interaction for persistence of
action information, and use of oslo_config for configuration support.

Add GET all actions primary code - db connection not yet impl
Update base classes to have more structure
Add POST actions framework
Add GET action by id
Add GET of validations and steps
Add control api
Add unit tests of action api methods
Re-Removed duplicate deps from test reqs
Add routes for API
Removed a lot of code better handled by falcon directly
Cleaned up error flows: handlers and defaults
Refactored existing airflow tests to match standard output format
Updated json validation to be more specific
Added basic start for alembic
Added alembic upgrade at startup (sketched below)
Added table creation definitions
Added base revision for alembic upgrade
Bug fixes - DB queries, airflow comm, logic issues, logging issues
Bug fixes - date formats and alignment of keys between systems
Exclusions to bandit / tox.ini
Resolved merge conflicts with integration of auth
Update to use oslo config and PBR
Update the context middleware to check uuid in a less contentious way
Removed routes and resources for regions endpoint - not used
Add auth policies for action api
Restructure exceptions into a consistent class hierarchy with a common handler
Add generation of config and policy examples
Update tests to init configs
Update database configs to not use env. vars
Removed examples directory, it was no longer accurate
Addressed/removed several TODOs - left some behind as well
Aligned input to DAGs with action: header
Retrieved all sub-steps for DAGs
Expanded step information
Refactored auth handling for better logging
Rename create_actions policy to create_action
Removed some templated file comments in the alembic-generated env.py
Updated inconsistent exception parameters
Updated to use ULID instead of UUID for action IDs
Added action control audit code per review suggestion
Fixed the correlation date between DAGs/actions by more string parsing

Change-Id: I2f9ea5250923f45456aa86826e344fc055bba762
2017-09-22 21:47:13 -05:00
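
The "Added alembic upgrade at startup" item above implies that Shipyard runs its migrations programmatically when the service boots rather than requiring an operator to run the alembic CLI. Below is a minimal sketch of what that invocation could look like, assuming a hypothetical upgrade_db() helper and an alembic.ini shipped with the package; only the configure_logger attribute is taken from this change (it matches the check in env.py below), the rest is illustrative:

from alembic import command
from alembic.config import Config


def upgrade_db(alembic_ini_path):
    """Bring the Shipyard database to the latest alembic revision."""
    alembic_cfg = Config(alembic_ini_path)
    # Hypothetical: skip fileConfig() in env.py because the service has
    # already configured its own loggers before migrations run.
    alembic_cfg.attributes['configure_logger'] = False
    command.upgrade(alembic_cfg, 'head')

command.upgrade() loads env.py, which in online mode builds an engine from get_url() and applies any pending revisions.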

82 lines
2.0 KiB
Python

from __future__ import with_statement

import os
from logging.config import fileConfig

from alembic import context
from oslo_config import cfg
from sqlalchemy import create_engine, pool

# this is the shipyard config object
CONF = cfg.CONF

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.attributes.get('configure_logger', True):
    fileConfig(config.config_file_name)

target_metadata = None


def get_url():
    """
    Returns the url to use instead of using the alembic configuration
    file
    """
    return CONF.base.postgresql_db


def run_migrations_offline():
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well. By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.
    """
    url = get_url()
    # Default code: url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url, target_metadata=target_metadata, literal_binds=True)

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online():
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.
    """
    connectable = create_engine(get_url())
    # Default/generated code:
    # connectable = engine_from_config(
    #     config.get_section(config.config_ini_section),
    #     prefix='sqlalchemy.',
    #     poolclass=pool.NullPool)

    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=target_metadata
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
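
get_url() above relies on a [base] option group with a postgresql_db option having been registered on the global oslo.config object before the migration runs. Only the group and option names are taken from this file; the registration below is an assumption about how Shipyard's configuration module might declare it, and the default URL is purely illustrative:

from oslo_config import cfg

CONF = cfg.CONF

# Hypothetical option registration; the real Shipyard config module may
# declare this differently. Only the group/option names come from env.py.
base_opts = [
    cfg.StrOpt(
        'postgresql_db',
        default='postgresql+psycopg2://shipyard:password@localhost:5432/shipyard',
        help='Connection URL for the Shipyard database used by alembic.'),
]
CONF.register_opts(base_opts, group='base')

With options registered this way, CONF.base.postgresql_db resolves to the connection string that get_url() hands to create_engine() in run_migrations_online().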