More efficient db resources-per-stack count

Testing with devstack/mysql, given a stack with 350 resources,
stack_count_total_resources() now takes ~2.2ms vs ~7.6ms the old way.

SQLAlchemy was doing more work than we needed before, e.g.:
2016-12-03 16:41:58.455 2294 DEBUG sqlalchemy.orm.path_registry [req-b7cdfd64-4306-4145-adb2-dae13caaa9ab - - - - -] set 'memoized_setups' on path 'EntityRegistry((<Mapper at 0x44ddbd0; Resource>,))' to '{}' set /usr/lib64/python2.7/site-packages/sqlalchemy/orm/path_registry.py:63

Change-Id: I7def4093c7719a95e72d49925559bc926f168a01
Crag Wolfe 2016-12-03 15:19:44 -08:00
parent 510a37b601
commit b158dcf78e
1 changed file with 3 additions and 4 deletions


@@ -741,10 +741,9 @@ def stack_get_root_id(context, stack_id):
 def stack_count_total_resources(context, stack_id):
     # count all resources which belong to the root stack
-    results = context.session.query(
-        models.Resource
-    ).filter(models.Resource.root_stack_id == stack_id).count()
-    return results
+    return context.session.query(
+        func.count(models.Resource.id)
+    ).filter_by(root_stack_id=stack_id).scalar()


 def user_creds_create(context):
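A minimal standalone sketch of the difference (using a hypothetical `Resource` model and an in-memory SQLite database, not Heat's actual `models.Resource`): `Query.count()` wraps the full ORM entity query in a `SELECT count(*) FROM (SELECT ...)` subquery, whereas `func.count(...).scalar()` emits a plain aggregate `SELECT count(id) ...` directly.

```python
from sqlalchemy import Column, Integer, create_engine, func
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()


class Resource(Base):  # hypothetical stand-in for models.Resource
    __tablename__ = 'resource'
    id = Column(Integer, primary_key=True)
    root_stack_id = Column(Integer)


engine = create_engine('sqlite://')  # in-memory database
Base.metadata.create_all(engine)

session = Session(engine)
session.add_all(Resource(root_stack_id=1) for _ in range(350))
session.commit()

# Old way: ORM count() runs the entity query as a subquery.
old = session.query(Resource).filter(
    Resource.root_stack_id == 1).count()

# New way: aggregate in the database, fetch the single value.
new = session.query(func.count(Resource.id)).filter_by(
    root_stack_id=1).scalar()

print(old, new)
```

Both queries return 350 here; the win in the commit is that the second form avoids constructing full `Resource` entity columns and the associated ORM bookkeeping visible in the debug log above.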