db: Integrate alembic

This looks more complicated than it is. Essentially we have to deal
with two possible configurations (a short sketch of the combined flow
follows the list):

- For existing deployments, the DB sync operation should apply any
  outstanding sqlalchemy-migrate-based migrations, dummy apply the
  initial alembic migration, and then apply any additional alembic-based
  migrations requested (or any available, if no version is specified).

- For new deployments, the DB sync operation should apply the initial
  alembic migration and any additional alembic-based migrations
  requested (or any available, if no version is specified). No
  sqlalchemy-migrate-based migrations will ever be applied.
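
Expressed as code, the combined flow boils down to the following
condensed sketch of the new db_sync() in nova/db/migration.py (the full
version is in the diff below; 'sync_sketch' is just an illustrative
name):

    from nova.db import migration


    def sync_sketch(version=None, database='main', context=None):
        engine = migration._get_engine(database, context=context)
        repository = migration._find_migrate_repo(database)  # legacy repo
        config = migration._find_alembic_conf(database)      # alembic.ini

        # point alembic at the engine we're actually using
        config.set_main_option('sqlalchemy.url', str(engine.url))

        # existing deployment: apply any outstanding sqlalchemy-migrate
        # migrations, then stamp the initial alembic revision so alembic
        # knows its starting point; this runs at most once
        if (migration._is_database_under_migrate_control(engine, repository)
                and not migration._is_database_under_alembic_control(engine)):
            migration._init_alembic_on_legacy_database(
                engine, database, repository, config)

        # new (or already converted) deployment: alembic handles
        # everything, up to 'version' if one was given, else up to 'head'
        migration._upgrade_alembic(engine, config, version)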

While we continue to allow users to request a specific database
migration version to upgrade to, we *do not* allow them to request a
sqlalchemy-migrate-based migration version. There's no good reason to
allow this - the deployment won't run with an out-of-date DB schema
(something that's also true of the alembic migrations, fwiw) - and we
want to get people off of sqlalchemy-migrate as fast as possible. A
change in a future release can remove the sqlalchemy-migrate-based
migrations once we're sure that all deployments will have upgraded to a
release including all of the sqlalchemy-migrate-based migrations (i.e.
Wallaby).
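
The check that enforces this is tiny. Wrapped as a standalone helper
purely for illustration (the helper name is made up; the body mirrors
the guard added to db_sync() in the diff below):

    from nova import exception


    def _reject_legacy_version(version):
        # sqlalchemy-migrate versions were plain integers, while alembic
        # revisions are hex strings, so an all-digit version must be a
        # legacy request and can be rejected outright
        if version is not None and version.isdigit():
            raise exception.Invalid(
                'You requested an sqlalchemy-migrate database version; '
                'this is no longer supported'
            )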

Tests are modified to validate the sanity of these operations. They're
mostly trivial changes, but we do need to do some funky things to ensure
that (a) we don't use the logger configuration from 'alembic.ini', which
would mess with our existing logger configuration, and (b) we re-use
connection objects where necessary so that we can run tests against
in-memory databases, where a different connection would actually mean a
different database. We also can't rely on 'WalkVersionsMixin' from
oslo.db, since that only supports sqlalchemy-migrate [1], so we must
re-invent the wheel here somewhat.

[1] https://github.com/openstack/oslo.db/blob/10.0.0/oslo_db/sqlalchemy/test_migrations.py#L42-L44
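
Both of the tricks above hinge on alembic's config.attributes dict,
which the modified env.py files now honour. Roughly (a condensed sketch
of what the test and migration changes below wire up; the ini path here
is illustrative):

    from alembic import command as alembic_api
    from alembic import config as alembic_config
    from oslo_db.sqlalchemy import enginefacade

    config = alembic_config.Config('nova/db/main/alembic.ini')

    # (a) tell env.py to skip fileConfig(), so alembic doesn't clobber
    # the logger configuration already set up by the test framework
    config.attributes['configure_logger'] = False

    # (b) hand env.py an existing connection so every statement runs
    # against the same in-memory SQLite database
    engine = enginefacade.writer.get_engine()
    with engine.begin() as connection:
        config.attributes['connection'] = connection
        alembic_api.upgrade(config, 'head')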

Change-Id: I850af601f81bd5d2ecc029682ae10d3a07c936ce
Signed-off-by: Stephen Finucane <stephenfin@redhat.com>
commit 905c9723e9 (parent 307adc5e08)
Author: Stephen Finucane
Date:   2021-06-17 12:59:11 +01:00

12 changed files with 595 additions and 330 deletions


@@ -208,6 +208,16 @@ This command should be run **after** :program:`nova-manage api_db sync`.
* - 1
- Failed to access cell0.
.. versionchanged:: 20.0.0 (Train)
Removed support for the legacy ``--version <version>`` argument.
.. versionchanged:: 24.0.0 (Xena)
Migrated versioning engine to alembic. The optional ``VERSION`` argument is
now expected to be an alembic-based version. sqlalchemy-migrate-based
versions will be rejected.
db archive_deleted_rows
-----------------------
@@ -497,6 +507,12 @@ This command should be run before ``nova-manage db sync``.
Removed support for the legacy ``--version <version>`` argument.
.. versionchanged:: 24.0.0 (Xena)
Migrated versioning engine to alembic. The optional ``VERSION`` argument is
now expected to be an alembic-based version. sqlalchemy-migrate-based
versions will be rejected.
.. _man-page-cells-v2:
Cells v2 Commands


@@ -2,7 +2,7 @@
[alembic]
# path to migration scripts
script_location = nova/db/api/migrations
script_location = %(here)s/migrations
# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s


@@ -21,9 +21,10 @@ from alembic import context
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# Interpret the config file for Python logging unless we're told not to.
# This line sets up loggers basically.
fileConfig(config.config_file_name)
if config.attributes.get('configure_logger', True):
fileConfig(config.config_file_name)
# add your model's MetaData object here
# for 'autogenerate' support
@@ -58,12 +59,25 @@ def run_migrations_online():
In this scenario we need to create an Engine and associate a connection
with the context.
This is modified from the default based on the below, since we want to
share an engine when unit testing so in-memory database testing actually
works.
https://alembic.sqlalchemy.org/en/latest/cookbook.html#connection-sharing
"""
connectable = engine_from_config(
config.get_section(config.config_ini_section),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
connectable = config.attributes.get('connection', None)
if connectable is None:
# only create Engine if we don't have a Connection from the outside
connectable = engine_from_config(
config.get_section(config.config_ini_section),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
# when connectable is already a Connection object, calling connect() gives
# us a *branched connection*.
with connectable.connect() as connection:
context.configure(


@@ -327,6 +327,8 @@ class ResourceProvider(API_BASE):
# parent. If parent_provider_id == NULL then root_provider_id == id
parent_provider_id = sa.Column(
sa.Integer, sa.ForeignKey('resource_providers.id'), nullable=True)
# TODO(stephenfin): Drop these from the db at some point
# columns_to_drop = ['can_host']
# TODO(stephenfin): Remove this as it's now unused post-placement split


@@ -2,7 +2,7 @@
[alembic]
# path to migration scripts
script_location = nova/db/main/migrations
script_location = %(here)s/migrations
# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s


@@ -4517,9 +4517,14 @@ def archive_deleted_rows(context=None, max_rows=None, before=None,
rows_archived = 0
# skip the special sqlalchemy-migrate migrate_version table and any
# shadow tables
if (tablename == 'migrate_version' or
tablename.startswith(_SHADOW_TABLE_PREFIX)):
# TODO(stephenfin): Drop 'migrate_version' once we remove support for
# the legacy sqlalchemy-migrate migrations
if (
tablename in ('migrate_version', 'alembic_version') or
tablename.startswith(_SHADOW_TABLE_PREFIX)
):
continue
rows_archived, _deleted_instance_uuids, extras = (
_archive_deleted_rows_for_table(
meta, tablename,


@@ -20,9 +20,10 @@ from sqlalchemy import pool
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# Interpret the config file for Python logging unless we're told not to.
# This line sets up loggers basically.
fileConfig(config.config_file_name)
if config.attributes.get('configure_logger', True):
fileConfig(config.config_file_name)
# add your model's MetaData object here
# for 'autogenerate' support
@@ -57,12 +58,25 @@ def run_migrations_online():
In this scenario we need to create an Engine and associate a connection
with the context.
This is modified from the default based on the below, since we want to
share an engine when unit testing so in-memory database testing actually
works.
https://alembic.sqlalchemy.org/en/latest/cookbook.html#connection-sharing
"""
connectable = engine_from_config(
config.get_section(config.config_ini_section),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
connectable = config.attributes.get('connection', None)
if connectable is None:
# only create Engine if we don't have a Connection from the outside
connectable = engine_from_config(
config.get_section(config.config_ini_section),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
# when connectable is already a Connection object, calling connect() gives
# us a *branched connection*.
with connectable.connect() as connection:
context.configure(


@@ -16,26 +16,33 @@
import os
from migrate import exceptions as versioning_exceptions
from migrate.versioning import api as versioning_api
from migrate.versioning.repository import Repository
from alembic import command as alembic_api
from alembic import config as alembic_config
from alembic.runtime import migration as alembic_migration
from migrate import exceptions as migrate_exceptions
from migrate.versioning import api as migrate_api
from migrate.versioning import repository as migrate_repository
from oslo_log import log as logging
import sqlalchemy
from nova.db.api import api as api_db_api
from nova.db.main import api as main_db_api
from nova import exception
from nova.i18n import _
INIT_VERSION = {}
INIT_VERSION['main'] = 401
INIT_VERSION['api'] = 66
_REPOSITORY = {}
MIGRATE_INIT_VERSION = {
'main': 401,
'api': 66,
}
ALEMBIC_INIT_VERSION = {
'main': '8f2f1571d55b',
'api': 'd67eeaabee36',
}
_MIGRATE_REPO = {}
_ALEMBIC_CONF = {}
LOG = logging.getLogger(__name__)
def get_engine(database='main', context=None):
def _get_engine(database='main', context=None):
if database == 'main':
return main_db_api.get_engine(context=context)
@@ -43,97 +50,131 @@ def get_engine(database='main', context=None):
return api_db_api.get_engine()
def find_migrate_repo(database='main'):
def _find_migrate_repo(database='main'):
"""Get the path for the migrate repository."""
global _REPOSITORY
rel_path = os.path.join(database, 'legacy_migrations')
path = os.path.join(os.path.abspath(os.path.dirname(__file__)), rel_path)
assert os.path.exists(path)
if _REPOSITORY.get(database) is None:
_REPOSITORY[database] = Repository(path)
return _REPOSITORY[database]
global _MIGRATE_REPO
path = os.path.join(
os.path.abspath(os.path.dirname(__file__)),
database, 'legacy_migrations')
if _MIGRATE_REPO.get(database) is None:
_MIGRATE_REPO[database] = migrate_repository.Repository(path)
return _MIGRATE_REPO[database]
def _find_alembic_conf(database='main'):
"""Get the path for the alembic repository."""
global _ALEMBIC_CONF
path = os.path.join(
os.path.abspath(os.path.dirname(__file__)),
database, 'alembic.ini')
if _ALEMBIC_CONF.get(database) is None:
config = alembic_config.Config(path)
# we don't want to use the logger configuration from the file, which is
# only really intended for the CLI
# https://stackoverflow.com/a/42691781/613428
config.attributes['configure_logger'] = False
_ALEMBIC_CONF[database] = config
return _ALEMBIC_CONF[database]
def _is_database_under_migrate_control(engine, repository):
try:
migrate_api.db_version(engine, repository)
return True
except migrate_exceptions.DatabaseNotControlledError:
return False
def _is_database_under_alembic_control(engine):
with engine.connect() as conn:
context = alembic_migration.MigrationContext.configure(conn)
return bool(context.get_current_revision())
def _init_alembic_on_legacy_database(engine, database, repository, config):
"""Init alembic in an existing environment with sqlalchemy-migrate."""
LOG.info(
'The database is still under sqlalchemy-migrate control; '
'applying any remaining sqlalchemy-migrate-based migrations '
'and fake applying the initial alembic migration'
)
migrate_api.upgrade(engine, repository)
# re-use the connection rather than creating a new one
with engine.begin() as connection:
config.attributes['connection'] = connection
alembic_api.stamp(config, ALEMBIC_INIT_VERSION[database])
def _upgrade_alembic(engine, config, version):
# re-use the connection rather than creating a new one
with engine.begin() as connection:
config.attributes['connection'] = connection
alembic_api.upgrade(config, version or 'head')
def db_sync(version=None, database='main', context=None):
"""Migrate the database to `version` or the most recent version."""
if version is not None:
try:
version = int(version)
except ValueError:
raise exception.NovaException(_("version should be an integer"))
current_version = db_version(database, context=context)
repository = find_migrate_repo(database)
engine = get_engine(database, context=context)
if version is None or version > current_version:
return versioning_api.upgrade(engine, repository, version)
else:
return versioning_api.downgrade(engine, repository, version)
if database not in ('main', 'api'):
raise exception.Invalid('%s is not a valid database' % database)
# if the user requested a specific version, check if it's an integer:
# if so, we're almost certainly in sqlalchemy-migrate land and won't
# support that
if version is not None and version.isdigit():
raise exception.Invalid(
'You requested an sqlalchemy-migrate database version; this is '
'no longer supported'
)
engine = _get_engine(database, context=context)
repository = _find_migrate_repo(database)
config = _find_alembic_conf(database)
# discard the URL encoded in alembic.ini in favour of the URL configured
# for the engine by the database fixtures, casting from
# 'sqlalchemy.engine.url.URL' to str in the process
config.set_main_option('sqlalchemy.url', str(engine.url))
# if we're in a deployment where sqlalchemy-migrate is already present,
# then apply all the updates for that and fake apply the initial alembic
# migration; if we're not then 'upgrade' will take care of everything
# this should be a one-time operation
if (
_is_database_under_migrate_control(engine, repository) and
not _is_database_under_alembic_control(engine)
):
_init_alembic_on_legacy_database(engine, database, repository, config)
# apply anything later
LOG.info('Applying migration(s)')
_upgrade_alembic(engine, config, version)
LOG.info('Migration(s) applied')
def db_version(database='main', context=None):
"""Display the current database version."""
repository = find_migrate_repo(database)
if database not in ('main', 'api'):
raise exception.Invalid('%s is not a valid database' % database)
# NOTE(mdbooth): This is a crude workaround for races in _db_version. The 2
# races we have seen in practise are:
# * versioning_api.db_version() fails because the migrate_version table
# doesn't exist, but meta.tables subsequently contains tables because
# another thread has already started creating the schema. This results in
# the 'Essex' error.
# * db_version_control() fails with pymysql.error.InternalError(1050)
# (Create table failed) because of a race in sqlalchemy-migrate's
# ControlledSchema._create_table_version, which does:
# if not table.exists(): table.create()
# This means that it doesn't raise the advertised
# DatabaseAlreadyControlledError, which we could have handled explicitly.
#
# I believe the correct fix should be:
# * Delete the Essex-handling code as unnecessary complexity which nobody
# should still need.
# * Fix the races in sqlalchemy-migrate such that version_control() always
# raises a well-defined error, and then handle that error here.
#
# Until we do that, though, we should be able to just try again if we
# failed for any reason. In both of the above races, trying again should
# succeed the second time round.
#
# For additional context, see:
# * https://bugzilla.redhat.com/show_bug.cgi?id=1652287
# * https://bugs.launchpad.net/nova/+bug/1804652
try:
return _db_version(repository, database, context)
except Exception:
return _db_version(repository, database, context)
repository = _find_migrate_repo(database)
engine = _get_engine(database, context=context)
migrate_version = None
if _is_database_under_migrate_control(engine, repository):
migrate_version = migrate_api.db_version(engine, repository)
def _db_version(repository, database, context):
engine = get_engine(database, context=context)
try:
return versioning_api.db_version(engine, repository)
except versioning_exceptions.DatabaseNotControlledError as exc:
meta = sqlalchemy.MetaData()
meta.reflect(bind=engine)
tables = meta.tables
if len(tables) == 0:
db_version_control(
INIT_VERSION[database], database, context=context)
return versioning_api.db_version(engine, repository)
else:
LOG.exception(exc)
# Some pre-Essex DB's may not be version controlled.
# Require them to upgrade using Essex first.
raise exception.NovaException(
_("Upgrade DB using Essex release first."))
alembic_version = None
if _is_database_under_alembic_control(engine):
alembic_version = alembic_api.current(engine)
def db_initial_version(database='main'):
"""The starting version for the database."""
return INIT_VERSION[database]
def db_version_control(version=None, database='main', context=None):
repository = find_migrate_repo(database)
engine = get_engine(database, context=context)
versioning_api.version_control(engine, repository, version)
return version
return alembic_version or migrate_version


@@ -28,20 +28,22 @@ For postgres on Ubuntu this can be done with the following commands::
"""
import os
from migrate.versioning import repository
from alembic import command as alembic_api
from alembic import script as alembic_script
from migrate.versioning import api as migrate_api
import mock
from oslo_db.sqlalchemy import enginefacade
from oslo_db.sqlalchemy import test_fixtures
from oslo_db.sqlalchemy import test_migrations
from oslo_log import log as logging
import testtools
from nova.db.api import legacy_migrations
from nova.db.api import models
from nova.db import migration
from nova import test
LOG = logging.getLogger(__name__)
class NovaModelsMigrationsSync(test_migrations.ModelsMigrationsSync):
"""Test that the models match the database after migrations are run."""
@@ -51,7 +53,7 @@ class NovaModelsMigrationsSync(test_migrations.ModelsMigrationsSync):
self.engine = enginefacade.writer.get_engine()
def db_sync(self, engine):
with mock.patch.object(migration, 'get_engine', return_value=engine):
with mock.patch.object(migration, '_get_engine', return_value=engine):
migration.db_sync(database='api')
def get_engine(self):
@@ -145,56 +147,99 @@ class TestModelsSyncPostgreSQL(
FIXTURE = test_fixtures.PostgresqlOpportunisticFixture
class NovaMigrationsWalk(test_migrations.WalkVersionsMixin):
class NovaModelsMigrationsLegacySync(NovaModelsMigrationsSync):
"""Test that the models match the database after old migrations are run."""
def db_sync(self, engine):
# the 'nova.db.migration.db_sync' method will not use the legacy
# sqlalchemy-migrate-based migration flow unless the database is
# already controlled with sqlalchemy-migrate, so we need to manually
# enable version controlling with this tool to test this code path
repository = migration._find_migrate_repo(database='api')
migrate_api.version_control(
engine, repository, migration.MIGRATE_INIT_VERSION['api'])
# now we can apply migrations as expected and the legacy path will be
# followed
super().db_sync(engine)
class TestModelsLegacySyncSQLite(
NovaModelsMigrationsLegacySync,
test_fixtures.OpportunisticDBTestMixin,
testtools.TestCase,
):
pass
class TestModelsLegacySyncMySQL(
NovaModelsMigrationsLegacySync,
test_fixtures.OpportunisticDBTestMixin,
testtools.TestCase,
):
FIXTURE = test_fixtures.MySQLOpportunisticFixture
class TestModelsLegacySyncPostgreSQL(
NovaModelsMigrationsLegacySync,
test_fixtures.OpportunisticDBTestMixin,
testtools.TestCase,
):
FIXTURE = test_fixtures.PostgresqlOpportunisticFixture
class NovaMigrationsWalk(
test_fixtures.OpportunisticDBTestMixin, test.NoDBTestCase,
):
def setUp(self):
super(NovaMigrationsWalk, self).setUp()
super().setUp()
self.engine = enginefacade.writer.get_engine()
self.config = migration._find_alembic_conf('api')
self.init_version = migration.ALEMBIC_INIT_VERSION['api']
@property
def INIT_VERSION(self):
return migration.db_initial_version('api')
def _migrate_up(self, revision):
if revision == self.init_version: # no tests for the initial revision
return
@property
def REPOSITORY(self):
return repository.Repository(
os.path.abspath(os.path.dirname(legacy_migrations.__file__)))
@property
def migration_api(self):
return migration.versioning_api
@property
def migrate_engine(self):
return self.engine
def _skippable_migrations(self):
train_placeholders = list(range(68, 73))
ussuri_placeholders = list(range(73, 78))
victoria_placeholders = list(range(78, 83))
wallaby_placeholders = list(range(83, 88))
special_cases = [
self.INIT_VERSION + 1, # initial change
]
return (
train_placeholders +
ussuri_placeholders +
victoria_placeholders +
wallaby_placeholders +
special_cases
self.assertIsNotNone(
getattr(self, '_check_%s' % revision, None),
(
'API DB Migration %s does not have a test; you must add one'
) % revision,
)
alembic_api.upgrade(self.config, revision)
def migrate_up(self, version, with_data=False):
if with_data:
check = getattr(self, '_check_%03d' % version, None)
if version not in self._skippable_migrations():
self.assertIsNotNone(
check, 'DB Migration %i does not have a test.' % version)
def test_single_base_revision(self):
"""Ensure we only have a single base revision.
super().migrate_up(version, with_data)
There's no good reason for us to have diverging history, so validate
that only one base revision exists. This will prevent simple errors
where people forget to specify the base revision. If this fails for your
change, look for migrations that do not have a 'revises' line in them.
"""
script = alembic_script.ScriptDirectory.from_config(self.config)
self.assertEqual(1, len(script.get_bases()))
def test_single_head_revision(self):
"""Ensure we only have a single head revision.
There's no good reason for us to have diverging history, so validate
that only one head revision exists. This will prevent merge conflicts
adding additional head revision points. If this fails for your change,
look for migrations with the same 'revises' line in them.
"""
script = alembic_script.ScriptDirectory.from_config(self.config)
self.assertEqual(1, len(script.get_heads()))
def test_walk_versions(self):
self.walk_versions(snake_walk=False, downgrade=False)
with self.engine.begin() as connection:
self.config.attributes['connection'] = connection
script = alembic_script.ScriptDirectory.from_config(self.config)
for revision_script in script.walk_revisions():
revision = revision_script.revision
LOG.info('Testing revision %s', revision)
self._migrate_up(revision)
class TestMigrationsWalkSQLite(


@@ -35,21 +35,24 @@ For postgres on Ubuntu this can be done with the following commands::
import glob
import os
from migrate.versioning import repository
from alembic import command as alembic_api
from alembic import script as alembic_script
from migrate.versioning import api as migrate_api
import mock
from oslo_db.sqlalchemy import enginefacade
from oslo_db.sqlalchemy import test_fixtures
from oslo_db.sqlalchemy import test_migrations
from oslo_db.sqlalchemy import utils as oslodbutils
from oslo_log import log as logging
import sqlalchemy
import sqlalchemy.exc
import testtools
from nova.db.main import legacy_migrations
from nova.db.main import models
from nova.db import migration
from nova import test
from nova.tests import fixtures as nova_fixtures
LOG = logging.getLogger(__name__)
class NovaModelsMigrationsSync(test_migrations.ModelsMigrationsSync):
@@ -96,9 +99,8 @@ class NovaModelsMigrationsSync(test_migrations.ModelsMigrationsSync):
self.assertEqual(members, index_columns)
# Implementations for ModelsMigrationsSync
def db_sync(self, engine):
with mock.patch.object(migration, 'get_engine', return_value=engine):
with mock.patch.object(migration, '_get_engine', return_value=engine):
migration.db_sync(database='main')
def get_engine(self):
@@ -160,10 +162,7 @@ class TestModelsSyncMySQL(
FIXTURE = test_fixtures.MySQLOpportunisticFixture
def test_innodb_tables(self):
with mock.patch.object(
migration, 'get_engine', return_value=self.engine,
):
migration.db_sync()
self.db_sync(self.get_engine())
total = self.engine.execute(
"SELECT count(*) "
@@ -191,73 +190,99 @@ class TestModelsSyncPostgreSQL(
FIXTURE = test_fixtures.PostgresqlOpportunisticFixture
class NovaMigrationsWalk(test_migrations.WalkVersionsMixin):
class NovaModelsMigrationsLegacySync(NovaModelsMigrationsSync):
"""Test that the models match the database after old migrations are run."""
def db_sync(self, engine):
# the 'nova.db.migration.db_sync' method will not use the legacy
# sqlalchemy-migrate-based migration flow unless the database is
# already controlled with sqlalchemy-migrate, so we need to manually
# enable version controlling with this tool to test this code path
repository = migration._find_migrate_repo(database='main')
migrate_api.version_control(
engine, repository, migration.MIGRATE_INIT_VERSION['main'])
# now we can apply migrations as expected and the legacy path will be
# followed
super().db_sync(engine)
class TestModelsLegacySyncSQLite(
NovaModelsMigrationsLegacySync,
test_fixtures.OpportunisticDBTestMixin,
testtools.TestCase,
):
pass
class TestModelsLegacySyncMySQL(
NovaModelsMigrationsLegacySync,
test_fixtures.OpportunisticDBTestMixin,
testtools.TestCase,
):
FIXTURE = test_fixtures.MySQLOpportunisticFixture
class TestModelsLegacySyncPostgreSQL(
NovaModelsMigrationsLegacySync,
test_fixtures.OpportunisticDBTestMixin,
testtools.TestCase,
):
FIXTURE = test_fixtures.PostgresqlOpportunisticFixture
class NovaMigrationsWalk(
test_fixtures.OpportunisticDBTestMixin, test.NoDBTestCase,
):
def setUp(self):
super().setUp()
self.engine = enginefacade.writer.get_engine()
self.config = migration._find_alembic_conf('main')
self.init_version = migration.ALEMBIC_INIT_VERSION['main']
@property
def INIT_VERSION(self):
return migration.db_initial_version('main')
def _migrate_up(self, revision):
if revision == self.init_version: # no tests for the initial revision
return
@property
def REPOSITORY(self):
return repository.Repository(
os.path.abspath(os.path.dirname(legacy_migrations.__file__)))
@property
def migration_api(self):
return migration.versioning_api
@property
def migrate_engine(self):
return self.engine
def _skippable_migrations(self):
special = [
self.INIT_VERSION + 1,
]
train_placeholders = list(range(403, 408))
ussuri_placeholders = list(range(408, 413))
victoria_placeholders = list(range(413, 418))
wallaby_placeholders = list(range(418, 423))
return (
special +
train_placeholders +
ussuri_placeholders +
victoria_placeholders +
wallaby_placeholders
self.assertIsNotNone(
getattr(self, '_check_%s' % revision, None),
(
'DB Migration %s does not have a test; you must add one'
) % revision,
)
alembic_api.upgrade(self.config, revision)
def migrate_up(self, version, with_data=False):
if with_data:
check = getattr(self, "_check_%03d" % version, None)
if version not in self._skippable_migrations():
self.assertIsNotNone(
check, 'DB Migration %i does not have a test' % version)
def test_single_base_revision(self):
"""Ensure we only have a single base revision.
# NOTE(danms): This is a list of migrations where we allow dropping
# things. The rules for adding things here are very very specific.
# Chances are you don't meet the critera.
# Reviewers: DO NOT ALLOW THINGS TO BE ADDED HERE
exceptions = [
# The base migration can do whatever it likes
self.INIT_VERSION + 1,
]
# Reviewers: DO NOT ALLOW THINGS TO BE ADDED HERE
There's no good reason for us to have diverging history, so validate
that only one base revision exists. This will prevent simple errors
where people forget to specify the base revision. If this fails for your
change, look for migrations that do not have a 'revises' line in them.
"""
script = alembic_script.ScriptDirectory.from_config(self.config)
self.assertEqual(1, len(script.get_bases()))
if version not in exceptions:
banned = ['Table', 'Column']
else:
banned = None
with nova_fixtures.BannedDBSchemaOperations(banned):
super().migrate_up(version, with_data)
def test_single_head_revision(self):
"""Ensure we only have a single head revision.
There's no good reason for us to have diverging history, so validate
that only one head revision exists. This will prevent merge conflicts
adding additional head revision points. If this fails for your change,
look for migrations with the same 'revises' line in them.
"""
script = alembic_script.ScriptDirectory.from_config(self.config)
self.assertEqual(1, len(script.get_heads()))
def test_walk_versions(self):
self.walk_versions(snake_walk=False, downgrade=False)
with self.engine.begin() as connection:
self.config.attributes['connection'] = connection
script = alembic_script.ScriptDirectory.from_config(self.config)
for revision_script in script.walk_revisions():
revision = revision_script.revision
LOG.info('Testing revision %s', revision)
self._migrate_up(revision)
class TestMigrationsWalkSQLite(


@@ -12,134 +12,170 @@
# License for the specific language governing permissions and limitations
# under the License.
from migrate import exceptions as versioning_exceptions
from migrate.versioning import api as versioning_api
from alembic import command as alembic_api
from alembic.runtime import migration as alembic_migration
from migrate import exceptions as migrate_exceptions
from migrate.versioning import api as migrate_api
import mock
import sqlalchemy
from nova.db.api import api as api_db_api
from nova.db.main import api as main_db_api
from nova.db import migration
from nova import exception
from nova import test
@mock.patch.object(migration, 'db_version', return_value=2)
@mock.patch.object(migration, 'find_migrate_repo', return_value='repo')
@mock.patch.object(versioning_api, 'upgrade')
@mock.patch.object(versioning_api, 'downgrade')
@mock.patch.object(migration, 'get_engine', return_value='engine')
class TestDbSync(test.NoDBTestCase):
class TestDBSync(test.NoDBTestCase):
def test_version_none(self, mock_get_engine, mock_downgrade, mock_upgrade,
mock_find_repo, mock_version):
database = 'fake'
migration.db_sync(database=database)
mock_version.assert_called_once_with(database, context=None)
mock_find_repo.assert_called_once_with(database)
mock_get_engine.assert_called_once_with(database, context=None)
mock_upgrade.assert_called_once_with('engine', 'repo', None)
self.assertFalse(mock_downgrade.called)
def test_db_sync_invalid_database(self):
"""We only have two databases."""
self.assertRaises(
exception.Invalid, migration.db_sync, database='invalid')
def test_downgrade(self, mock_get_engine, mock_downgrade, mock_upgrade,
mock_find_repo, mock_version):
database = 'fake'
migration.db_sync(1, database=database)
mock_version.assert_called_once_with(database, context=None)
mock_find_repo.assert_called_once_with(database)
mock_get_engine.assert_called_once_with(database, context=None)
mock_downgrade.assert_called_once_with('engine', 'repo', 1)
self.assertFalse(mock_upgrade.called)
def test_db_sync_legacy_version(self):
"""We don't allow users to request legacy versions."""
self.assertRaises(
exception.Invalid,
migration.db_sync, '402')
@mock.patch.object(migration, 'find_migrate_repo', return_value='repo')
@mock.patch.object(versioning_api, 'db_version')
@mock.patch.object(migration, 'get_engine')
class TestDbVersion(test.NoDBTestCase):
def test_db_version(self, mock_get_engine, mock_db_version,
mock_find_repo):
database = 'fake'
mock_get_engine.return_value = 'engine'
migration.db_version(database)
mock_find_repo.assert_called_once_with(database)
mock_db_version.assert_called_once_with('engine', 'repo')
def test_not_controlled(
self, mock_get_engine, mock_db_version, mock_find_repo,
@mock.patch.object(migration, '_upgrade_alembic')
@mock.patch.object(migration, '_init_alembic_on_legacy_database')
@mock.patch.object(migration, '_is_database_under_alembic_control')
@mock.patch.object(migration, '_is_database_under_migrate_control')
@mock.patch.object(migration, '_find_alembic_conf')
@mock.patch.object(migration, '_find_migrate_repo')
@mock.patch.object(migration, '_get_engine')
def _test_db_sync(
self, has_migrate, has_alembic, mock_get_engine, mock_find_repo,
mock_find_conf, mock_is_migrate, mock_is_alembic, mock_init,
mock_upgrade,
):
database = 'api'
mock_get_engine.side_effect = ['engine', 'engine', 'engine']
exc = versioning_exceptions.DatabaseNotControlledError()
mock_db_version.side_effect = [exc, '']
metadata = mock.MagicMock()
metadata.tables.return_value = []
with mock.patch.object(sqlalchemy, 'MetaData',
metadata), mock.patch.object(migration,
'db_version_control') as mock_version_control:
migration.db_version(database)
mock_version_control.assert_called_once_with(
migration.INIT_VERSION['api'], database, context=None)
db_version_calls = [mock.call('engine', 'repo')] * 2
self.assertEqual(db_version_calls, mock_db_version.call_args_list)
engine_calls = [mock.call(database, context=None)]
self.assertEqual(engine_calls, mock_get_engine.call_args_list)
mock_is_migrate.return_value = has_migrate
mock_is_alembic.return_value = has_alembic
def test_db_version_init_race(self, mock_get_engine, mock_db_version,
mock_find_repo):
# This test exercises bug 1804652 by causing
# versioning_api.version_contro() to raise an unhandleable error the
# first time it is called.
database = 'api'
mock_get_engine.return_value = 'engine'
exc = versioning_exceptions.DatabaseNotControlledError()
mock_db_version.side_effect = [exc, '']
metadata = mock.MagicMock()
metadata.tables.return_value = []
with mock.patch.object(sqlalchemy, 'MetaData',
metadata), mock.patch.object(migration,
'db_version_control') as mock_version_control:
# db_version_control raises an unhandleable error because we were
# racing to initialise with another process.
mock_version_control.side_effect = test.TestingException
migration.db_version(database)
mock_version_control.assert_called_once_with(
migration.INIT_VERSION['api'], database, context=None)
db_version_calls = [mock.call('engine', 'repo')] * 2
self.assertEqual(db_version_calls, mock_db_version.call_args_list)
engine_calls = [mock.call(database, context=None)] * 2
self.assertEqual(engine_calls, mock_get_engine.call_args_list)
migration.db_sync()
def test_db_version_raise_on_error(self, mock_get_engine, mock_db_version,
mock_find_repo):
# This test asserts that we will still raise a persistent error after
# working around bug 1804652.
database = 'api'
mock_get_engine.return_value = 'engine'
mock_db_version.side_effect = \
versioning_exceptions.DatabaseNotControlledError
metadata = mock.MagicMock()
metadata.tables.return_value = []
with mock.patch.object(sqlalchemy, 'MetaData',
metadata), mock.patch.object(migration,
'db_version_control') as mock_version_control:
# db_version_control raises an unhandleable error because we were
# racing to initialise with another process.
mock_version_control.side_effect = test.TestingException
self.assertRaises(test.TestingException,
migration.db_version, database)
mock_get_engine.assert_called_once_with('main', context=None)
mock_find_repo.assert_called_once_with('main')
mock_find_conf.assert_called_once_with('main')
mock_find_conf.return_value.set_main_option.assert_called_once_with(
'sqlalchemy.url', str(mock_get_engine.return_value.url),
)
mock_is_migrate.assert_called_once_with(
mock_get_engine.return_value, mock_find_repo.return_value)
if has_migrate:
mock_is_alembic.assert_called_once_with(
mock_get_engine.return_value)
else:
mock_is_alembic.assert_not_called()
# we should only attempt the upgrade of the remaining
# sqlalchemy-migrate-based migrations and fake apply of the initial
# alembic migrations if sqlalchemy-migrate is in place but alembic
# hasn't been used yet
if has_migrate and not has_alembic:
mock_init.assert_called_once_with(
mock_get_engine.return_value, 'main',
mock_find_repo.return_value, mock_find_conf.return_value)
else:
mock_init.assert_not_called()
# however, we should always attempt to upgrade the requested migration
# to alembic
mock_upgrade.assert_called_once_with(
mock_get_engine.return_value, mock_find_conf.return_value, None)
def test_db_sync_new_deployment(self):
"""Mimic a new deployment without existing sqlalchemy-migrate cruft."""
has_migrate = False
has_alembic = False
self._test_db_sync(has_migrate, has_alembic)
def test_db_sync_with_existing_migrate_database(self):
"""Mimic a deployment currently managed by sqlalchemy-migrate."""
has_migrate = True
has_alembic = False
self._test_db_sync(has_migrate, has_alembic)
def test_db_sync_with_existing_alembic_database(self):
"""Mimic a deployment that's already switched to alembic."""
has_migrate = True
has_alembic = True
self._test_db_sync(has_migrate, has_alembic)
@mock.patch.object(migration, 'find_migrate_repo', return_value='repo')
@mock.patch.object(migration, 'get_engine', return_value='engine')
@mock.patch.object(versioning_api, 'version_control')
class TestDbVersionControl(test.NoDBTestCase):
@mock.patch.object(alembic_api, 'current')
@mock.patch.object(migrate_api, 'db_version')
@mock.patch.object(migration, '_is_database_under_alembic_control')
@mock.patch.object(migration, '_is_database_under_migrate_control')
@mock.patch.object(migration, '_get_engine')
@mock.patch.object(migration, '_find_migrate_repo')
class TestDBVersion(test.NoDBTestCase):
def test_version_control(self, mock_version_control, mock_get_engine,
mock_find_repo):
database = 'fake'
migration.db_version_control(database=database)
mock_find_repo.assert_called_once_with(database)
mock_version_control.assert_called_once_with('engine', 'repo', None)
def test_db_version_invalid_database(
self, mock_find_repo, mock_get_engine, mock_is_migrate,
mock_is_alembic, mock_migrate_version, mock_alembic_version,
):
"""We only have two databases."""
self.assertRaises(
exception.Invalid, migration.db_version, database='invalid')
def test_db_version_migrate(
self, mock_find_repo, mock_get_engine, mock_is_migrate,
mock_is_alembic, mock_migrate_version, mock_alembic_version,
):
"""Database is controlled by sqlalchemy-migrate."""
mock_is_migrate.return_value = True
mock_is_alembic.return_value = False
ret = migration.db_version('main')
self.assertEqual(mock_migrate_version.return_value, ret)
mock_find_repo.assert_called_once_with('main')
mock_get_engine.assert_called_once_with('main', context=None)
mock_is_migrate.assert_called_once()
mock_is_alembic.assert_called_once()
mock_migrate_version.assert_called_once_with(
mock_get_engine.return_value, mock_find_repo.return_value)
mock_alembic_version.assert_not_called()
def test_db_version_alembic(
self, mock_find_repo, mock_get_engine, mock_is_migrate,
mock_is_alembic, mock_migrate_version, mock_alembic_version,
):
"""Database is controlled by alembic."""
mock_is_migrate.return_value = False
mock_is_alembic.return_value = True
ret = migration.db_version('main')
self.assertEqual(mock_alembic_version.return_value, ret)
mock_find_repo.assert_called_once_with('main')
mock_get_engine.assert_called_once_with('main', context=None)
mock_is_migrate.assert_called_once()
mock_is_alembic.assert_called_once()
mock_migrate_version.assert_not_called()
mock_alembic_version.assert_called_once_with(
mock_get_engine.return_value)
def test_db_version_not_controlled(
self, mock_find_repo, mock_get_engine, mock_is_migrate,
mock_is_alembic, mock_migrate_version, mock_alembic_version,
):
"""Database is not controlled."""
mock_is_migrate.return_value = False
mock_is_alembic.return_value = False
ret = migration.db_version()
self.assertIsNone(ret)
mock_find_repo.assert_called_once_with('main')
mock_get_engine.assert_called_once_with('main', context=None)
mock_is_migrate.assert_called_once()
mock_is_alembic.assert_called_once()
mock_migrate_version.assert_not_called()
mock_alembic_version.assert_not_called()
class TestGetEngine(test.NoDBTestCase):
@@ -148,7 +184,7 @@ class TestGetEngine(test.NoDBTestCase):
with mock.patch.object(
main_db_api, 'get_engine', return_value='engine',
) as mock_get_engine:
engine = migration.get_engine()
engine = migration._get_engine()
self.assertEqual('engine', engine)
mock_get_engine.assert_called_once_with(context=None)
@@ -156,6 +192,48 @@ class TestGetEngine(test.NoDBTestCase):
with mock.patch.object(
api_db_api, 'get_engine', return_value='engine',
) as mock_get_engine:
engine = migration.get_engine('api')
engine = migration._get_engine('api')
self.assertEqual('engine', engine)
mock_get_engine.assert_called_once_with()
class TestDatabaseUnderVersionControl(test.NoDBTestCase):
@mock.patch.object(migrate_api, 'db_version')
def test__is_database_under_migrate_control__true(self, mock_db_version):
ret = migration._is_database_under_migrate_control('engine', 'repo')
self.assertTrue(ret)
mock_db_version.assert_called_once_with('engine', 'repo')
@mock.patch.object(migrate_api, 'db_version')
def test__is_database_under_migrate_control__false(self, mock_db_version):
mock_db_version.side_effect = \
migrate_exceptions.DatabaseNotControlledError()
ret = migration._is_database_under_migrate_control('engine', 'repo')
self.assertFalse(ret)
mock_db_version.assert_called_once_with('engine', 'repo')
@mock.patch.object(alembic_migration.MigrationContext, 'configure')
def test__is_database_under_alembic_control__true(self, mock_configure):
context = mock_configure.return_value
context.get_current_revision.return_value = 'foo'
engine = mock.MagicMock()
ret = migration._is_database_under_alembic_control(engine)
self.assertTrue(ret)
context.get_current_revision.assert_called_once_with()
@mock.patch.object(alembic_migration.MigrationContext, 'configure')
def test__is_database_under_alembic_control__false(self, mock_configure):
context = mock_configure.return_value
context.get_current_revision.return_value = None
engine = mock.MagicMock()
ret = migration._is_database_under_alembic_control(engine)
self.assertFalse(ret)
context.get_current_revision.assert_called_once_with()


@@ -0,0 +1,25 @@
---
upgrade:
- |
The database migration engine has changed from `sqlalchemy-migrate`__ to
`alembic`__. For most deployments, this should have minimal to no impact
and the switch should be mostly transparent. The main user-facing impact is
the change in schema versioning. While sqlalchemy-migrate used a linear,
integer-based versioning scheme, which required placeholder migrations to
allow for potential migration backports, alembic uses a distributed version
control-like scheme where a migration's ancestor is encoded in the file and
branches are possible. The alembic migration files therefore use an
arbitrary UUID-like naming scheme, and the ``nova-manage db sync`` and
``nova-manage api_db sync`` commands now expect such a version when
manually specifying the version that should be applied. For example::
$ nova-manage db sync 8f2f1571d55b
It is no longer possible to specify an sqlalchemy-migrate-based version.
When the ``nova-manage db sync`` and ``nova-manage api_db sync`` commands
are run, all remaining sqlalchemy-migrate-based migrations will be
automatically applied. Attempting to specify an sqlalchemy-migrate-based
version will result in an error.
.. __: https://sqlalchemy-migrate.readthedocs.io/en/latest/
.. __: https://alembic.sqlalchemy.org/en/latest/