Introduce DynamicCheckHelper class

This class shall be used to map external metrics from Prometheus, InfluxDB, ...
to well-formed Monasca metrics.

It supports:
* filtering by name (regex)
* renaming of metrics
* renaming of dimensions
* filtering by dimension value
* transforming of dimension values (regex based)

Change-Id: If0a5fdb1126bdfc93a88d5c1c7266d0c152d925f
Jakub Wachowski 2017-03-15 11:17:21 +01:00 committed by Joachim Barheine
parent c4d38ef4c5
commit f87a29a364
5 changed files with 773 additions and 1 deletion


@@ -207,6 +207,97 @@ The instances section is a list of instances that this check will be run against
It is best practice to include a name for each instance as the monasca-setup program uses this to avoid duplicating instances.
#### DynamicCheckHelper class
The `DynamicCheckHelper` can be used by check plugins to map data from existing monitoring endpoints to Monasca metrics.
Features
* Adjust metric names to Monasca naming conventions
* Map metric names and dimension keys
* Provide metric type information
* Map metadata to dimensions
* Transform names and values using regular expressions
* Filter values using regular expressions on attributes
To support all these capabilities, an element 'mapping' needs to be added to the instance configuration. A default mapping can be supplied for convenience.
Filtering and renaming of input measurements is performed through regular expressions that can be provided for metric-names and dimension values.
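For example, a minimal instance configuration using the helper could look like this (the instance name and the attribute name are illustrative):
```
instances:
  - name: my-endpoint
    mapping:
      gauges: [ 'FilesystemUsage' ]    # report this metric as gauge 'filesystem_usage'
      dimensions:
        component: app                 # map incoming attribute 'app' to dimension 'component'
```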
##### Selecting and Renaming Metrics
Metrics are specified by providing a list of names or regular expressions. For every metric type (gauge, rate, counter) a separate list is provided. If an incoming measurement does not match any of the listed names/regular expressions, it will be silently filtered out.
If match-groups are specified, the group-values are concatenated with '_' (underscore). If no match-group is specified, the name is taken as is. The resulting name is normalized according to Monasca naming standards for metrics. This implies that dots are replaced by underscores and *CamelCase* is transformed into *lower_case*. Special characters are eliminated, too.
Example:
```
a) Simple mapping:
rates: [ 'FilesystemUsage' ] # map rate metric 'FilesystemUsage' to 'filesystem_usage'
b) Mapping with simple regular expression
rates: [ '.*Usage' ] # map metrics ending with 'Usage' to '..._usage'
c) Mapping with regular expression and match-groups
counters: [ '(.*Usage)\.stats\.(total)' ] # map metrics ending with 'Usage.stats.total' to '..._usage_total'
```
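For illustration, applying pattern c) to a concrete (made-up) input metric gives:
```
Input metric:          FilesystemUsage.stats.total
Matching pattern:      counters: [ '(.*Usage)\.stats\.(total)' ]
Concatenated groups:   FilesystemUsage_total
Reported to Monasca:   filesystem_usage_total
```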
##### Mapping Metadata to Dimensions
The filtering and mapping of metadata attributes to dimensions is a little more complex. For each dimension, an entry of the following format is required:
```
component: app
```
This will map attribute `app` to dimension `component`.
Complex mapping statements use regular expressions to filter and/or transform metadata attributes into dimensions.
The following configuration attributes control the process:
* *source\_key*: name of the incoming metadata attribute. Default: the name of the target dimension.
* *regex*: regular expression the incoming metadata attribute _value_ is matched against. It is used both for filtering and for transformation via match-groups. Default: `(.*)` (match anything and copy the value as is).
* *separator*: string used to concatenate the match-groups. Default: `-` (dash).
Example:
```
service:
source_key: kubernetes.namespace
regex: prod-(.*)
```
The regular expression is applied to the incoming dimension value. If it does not match, the measurement is ignored. If the regular expression contains match-groups, it is also used for value transformation: the resulting dimension value is the concatenation of all match-groups (the parts in parentheses), joined with the configured separator. If the regular expression contains no match-groups, e.g. because it is a plain string constant, the attribute acts as a filter only and is not mapped to a dimension. If no *regex* is specified at all, the value is copied as is (after normalization).
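With the example above, transformation and filtering behave as follows (the namespace values are made up):
```
kubernetes.namespace = 'prod-monasca'   ->  dimension service = 'monasca'
kubernetes.namespace = 'dev-monasca'    ->  no match, the measurement is dropped
```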
##### Metric Groups
Both metrics and dimensions can be defined globally or as part of a group.
When a metric is specified in a group, the group name is used as a prefix to the metric name and the group-specific dimension mappings take precedence over the global ones. When several groups or the global mapping refer to the same input metric, the check plugin using the `DynamicCheckHelper` class needs to specify explicitly which group to select for mapping; the sketch after the example below shows how.
Example:
```
instances:
  - name: kubernetes
    mapping:
      dimensions:
        pod_name: io.kubernetes.pod.name     # simple mapping
        pod_basename:
          source_key: label_name
          regex: 'k8s_.*_.*\._(.*)_[0-9a-z\-]*'
      rates:
        - io.*
      groups:
        postgres:
          gauges: [ 'pg_(database_size_gauge_average)', 'pg_(database_size)' ]
          dimensions:
            service: kubernetes_namespace
            database: datname
```
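How a check plugin feeds data into the helper is not shown above, so here is a minimal sketch. The class name, the endpoint data, the labels and the `example` prefix are made up for illustration; it relies only on the `DynamicCheckHelper` constructor and the `push_metric`/`push_metric_dict` methods added by this change:
```
from monasca_agent.collector.checks import AgentCheck
from monasca_agent.collector.checks.utils import DynamicCheckHelper


class ExampleCheck(AgentCheck):
    """Hypothetical check that forwards data from some JSON endpoint."""

    def __init__(self, name, init_config, agent_config, instances=None):
        super(ExampleCheck, self).__init__(name, init_config, agent_config, instances)
        # each instance must carry a 'mapping' element (or a default_mapping must be
        # passed), otherwise the helper raises a CheckException
        self._helper = DynamicCheckHelper(self, prefix='example')

    def check(self, instance):
        # response as it might be returned by the monitored endpoint (made up)
        data = {'server': {'requests': 12}, 'FilesystemUsage': 0.42}
        # map a whole dictionary, recursing one level deep; labels become dimensions
        self._helper.push_metric_dict(instance, data, labels={'app': 'demo'}, max_depth=1)
        # push a single value, selecting the metric group explicitly
        self._helper.push_metric(instance, 'pg_database_size', 1024.0, group='postgres')
```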
##### Plugin Documentation
Your plugin should include an example `yaml` configuration file to be placed in `/etc/monasca/agent/conf.d` which has the name of the plugin YAML file plus the extension '.example', so the example configuration file for the process plugin would be at `/etc/monasca/agent/conf.d/process.yaml.example`. This file should include a set of example init_config and instances clauses that demonstrate how the plugin can be configured.


@@ -3,7 +3,9 @@
import base64
import logging
import math
from numbers import Number
import os
import re
import requests
from monasca_agent.common import exceptions
@@ -172,3 +174,574 @@ class KubernetesConnector(object):
format(request_url, result.status_code, result.text)
raise exceptions.KubernetesAPIConnectionError(exception_message)
return result.json() if as_json else result
class DynamicCheckHelper(object):
"""Supplements existing check class with reusable functionality to transform third-party metrics into Monasca ones
in a configurable way
"""
COUNTERS_KEY = 'counters'
RATES_KEY = 'rates'
GAUGES_KEY = 'gauges'
CONFIG_SECTIONS = [GAUGES_KEY, RATES_KEY, COUNTERS_KEY]
GAUGE = 0
RATE = 1
COUNTER = 2
SKIP = 3
METRIC_TYPES = [GAUGE, RATE, COUNTER]
DEFAULT_GROUP = ""
class MetricSpec(object):
"""Describes how to filter and map input metrics to Monasca metrics
"""
def __init__(self, metric_type, metric_name):
"""Construct a metric-specification
:param metric_type: one of GAUGE, RATE, COUNTER, SKIP
:param metric_name: normalized name of the metric as reported to Monasca
"""
self.metric_type = metric_type
self.metric_name = metric_name
@staticmethod
def _normalize_dim_value(value):
"""Normalize an input value
* Replace \\x?? values with _
* Replace illegal characters
- according to ANTLR grammar: ( '}' | '{' | '&' | '|' | '>' | '<' | '=' | ',' | ')' | '(' | ' ' | '"' )
- according to Python API validation: "<>={}(),\"\\\\|;&"
* Truncate to 255 chars
:param value: input value
:return: valid dimension value
"""
return re.sub(r'[|\\;,&=\']', '-', re.sub(r'[(}>]', ']', re.sub(r'[({<]', '[', value.replace(r'\x2d', '-').
replace(r'\x7e', '~'))))[:255]
class DimMapping(object):
"""Describes how to transform dictionary like metadata attached to a metric into Monasca dimensions
"""
def __init__(self, dimension, regex='(.*)', separator=None):
"""C'tor
:param dimension to be mapped to
:param regex: regular expression used to extract value from source value
:param separator: used to concatenate match-groups
"""
self.dimension = dimension
self.regex = regex
self.separator = separator
self.cregex = re.compile(regex) if regex != '(.*)' else None
def map_value(self, source_value):
"""Transform source value into target dimension value
:param source_value: label value to transform
:return: transformed dimension value or None if the regular expression did not match. An empty
result (caused by the regex having no match-groups) indicates that the label is used for filtering
but not mapped to a dimension.
"""
if self.cregex:
match_groups = self.cregex.match(source_value)
if match_groups:
return DynamicCheckHelper._normalize_dim_value(self.separator.join(match_groups.groups()))
else:
return None
else:
return DynamicCheckHelper._normalize_dim_value(source_value)
@staticmethod
def _build_dimension_map(config):
"""Builds dimension mappings for the given configuration element
:param config: 'mappings' element of config
:return: dictionary mapping source labels to applicable DimMapping objects
"""
result = {}
for dim, spec in config.get('dimensions', {}).iteritems():
if isinstance(spec, dict):
label = spec.get('source_key', dim)
sepa = spec.get('separator', '-')
regex = spec.get('regex', '(.*)')
else:
label = spec
regex = '(.*)'
sepa = None
# note: source keys can be mapped to multiple dimensions
arr = result.get(label, [])
mapping = DynamicCheckHelper.DimMapping(dimension=dim, regex=regex, separator=sepa)
arr.append(mapping)
result[label] = arr
return result
def __init__(self, check, prefix=None, default_mapping=None):
"""C'tor
:param check: Target check instance
"""
self._check = check
self._prefix = prefix
self._groups = {}
self._metric_map = {}
self._dimension_map = {}
self._metric_cache = {}
self._grp_metric_map = {}
self._grp_dimension_map = {}
self._grp_metric_cache = {}
self._metric_to_group = {}
for inst in self._check.instances:
iname = inst['name']
mappings = inst.get('mapping', default_mapping)
if mappings:
# build global name filter and rate/gauge assignment
self._metric_map[iname] = mappings
self._metric_cache[iname] = {}
# build global dimension map
self._dimension_map[iname] = DynamicCheckHelper._build_dimension_map(mappings)
# check if groups are used
groups = mappings.get('groups')
self._metric_to_group[iname] = {}
self._groups[iname] = []
if groups:
self._groups[iname] = groups.keys()
self._grp_metric_map[iname] = {}
self._grp_metric_cache[iname] = {}
self._grp_dimension_map[iname] = {}
for grp, gspec in groups.iteritems():
self._grp_metric_map[iname][grp] = gspec
self._grp_metric_cache[iname][grp] = {}
self._grp_dimension_map[iname][grp] = DynamicCheckHelper._build_dimension_map(gspec)
# add the global mappings as pseudo group, so that it is considered when searching for metrics
self._groups[iname].append(DynamicCheckHelper.DEFAULT_GROUP)
self._grp_metric_map[iname][DynamicCheckHelper.DEFAULT_GROUP] = self._metric_map[iname]
self._grp_metric_cache[iname][DynamicCheckHelper.DEFAULT_GROUP] = self._metric_cache[iname]
self._grp_dimension_map[iname][DynamicCheckHelper.DEFAULT_GROUP] = self._dimension_map[iname]
else:
raise exceptions.CheckException('instance %s is not supported: no element "mapping" found!' % iname)
def _get_group(self, instance, metric):
"""Search the group for a metric. Can be used only when metric names unambiguous across groups.
:param metric: input metric
:return: group name or None (if no group matches)
"""
iname = instance['name']
group = self._metric_to_group[iname].get(metric)
if group is None:
for g in self._groups[iname]:
spec = self._fetch_metric_spec(instance, metric, g)
if spec and spec.metric_type != DynamicCheckHelper.SKIP:
self._metric_to_group[iname][metric] = g
return g
return group
def _fetch_metric_spec(self, instance, metric, group=None):
"""Checks whether a metric is enabled by the instance configuration
:param instance: instance containing the check configuration
:param metric: metric as reported from metric data source (before mapping)
:param group: optional metric group, will be used as dot-separated prefix
"""
instance_name = instance['name']
# filter and classify the metric
if group is not None:
metric_cache = self._grp_metric_cache[instance_name].get(group, {})
metric_map = self._grp_metric_map[instance_name].get(group, {})
return DynamicCheckHelper._lookup_metric(metric, metric_cache, metric_map)
else:
metric_cache = self._metric_cache[instance_name]
metric_map = self._metric_map[instance_name]
return DynamicCheckHelper._lookup_metric(metric, metric_cache, metric_map)
def is_enabled_metric(self, instance, metric, group=None):
return self._fetch_metric_spec(instance, metric, group).metric_type != DynamicCheckHelper.SKIP
def push_metric_dict(self, instance, metric_dict, labels=None, group=None, timestamp=None, fixed_dimensions=None,
default_dimensions=None, max_depth=0, curr_depth=0, prefix='', index=-1):
"""This will extract metrics and dimensions from a dictionary.
The following mappings are applied:
Simple recursive composition of metric names:
Input:
{
'server': {
'requests': 12
}
}
Configuration:
mapping:
counters:
- server_requests
Output:
server_requests=12
Mapping of textual values to dimensions to distinguish array elements. Make sure that these attributes
are sufficient to distinguish the array elements. If not, use the built-in 'index' dimension.
Input:
{
'server': [
{
'role': 'master',
'node_name': 'server0',
'requests': 1500
},
{
'role': 'slave',
'node_name': 'server1',
'requests': 1000
},
{
'role': 'slave',
'node_name': 'server2',
'requests': 500
}
]
}
Configuration:
mapping:
dimensions:
server_role: role
node_name: node_name
rates:
- requests
Output:
server_requests{server_role=master, node_name=server0} = 1500.0
server_requests{server_role=slave, node_name=server1} = 1000.0
server_requests{server_role=slave, node_name=server2} = 500.0
Distinguish array elements where no textual attributes are available or no mapping has been configured for them.
In that case an 'index' dimension will be attached to the metric which has to be mapped properly.
Input:
{
'server': [
{
'requests': 1500
},
{
'requests': 1000
},
{
'requests': 500
}
]
}
Configuration:
mapping:
dimensions:
server_no: index # index is a predefined label
counters:
- server_requests
Result:
server_requests{server_no=0} = 1500.0
server_requests{server_no=1} = 1000.0
server_requests{server_no=2} = 500.0
:param instance: Instance to submit to
:param metric_dict: input data as dictionary
:param labels: labels to be mapped to dimensions
:param group: group to use for mapping labels and prefixing
:param timestamp: timestamp to report for the measurement
:param fixed_dimensions: dimensions which are always added with fixed values
:param default_dimensions: dimensions to be added, can be overwritten by actual data in metric_dict
:param max_depth: max. depth to recurse
:param curr_depth: depth of recursion
:param prefix: prefix to prepend to any metric
:param index: current index when traversing through a list
"""
# when traversing through an array, each element must be distinguished with dimensions
# therefore additional dimensions need to be calculated from the siblings of the actual number valued fields
if default_dimensions is None:
default_dimensions = {}
if fixed_dimensions is None:
fixed_dimensions = {}
if labels is None:
labels = {}
if index != -1:
ext_labels = self.extract_dist_labels(instance['name'], group, metric_dict, labels.copy(), index)
if not ext_labels:
log.debug(
"skipping array due to lack of mapped dimensions for group %s "
"(at least 'index' should be supported)",
group if group else '<root>')
return
else:
ext_labels = labels.copy()
for element, child in metric_dict.iteritems():
# if child is a dictionary, then recurse
if isinstance(child, dict) and curr_depth < max_depth:
self.push_metric_dict(instance, child, ext_labels, group, timestamp, fixed_dimensions,
default_dimensions, max_depth, curr_depth + 1, prefix + element + '_')
# if child is a number, assume that it is a metric (it will be filtered out by the rate/gauge names)
elif isinstance(child, Number):
self.push_metric(instance, prefix + element, float(child), ext_labels, group, timestamp,
fixed_dimensions,
default_dimensions)
# if it is a list, then each array needs to be added. Additional dimensions must be found in order to
# distinguish the measurements.
elif isinstance(child, list):
for i, child_element in enumerate(child):
if isinstance(child_element, dict):
if curr_depth < max_depth:
self.push_metric_dict(instance, child_element, ext_labels, group, timestamp,
fixed_dimensions, default_dimensions, max_depth, curr_depth + 1,
prefix + element + '_', index=i)
elif isinstance(child_element, Number):
if len(self._get_mappings(instance['name'], group, 'index')) > 0:
idx_labels = ext_labels.copy()
idx_labels['index'] = str(i)
self.push_metric(instance, prefix + element, float(child_element), idx_labels, group,
timestamp, fixed_dimensions, default_dimensions)
else:
log.debug("skipping array due to lack of mapped 'index' dimensions for group %s",
group if group else '<root>')
else:
log.debug('nested arrays are not supported for configurable extraction of element %s', element)
def extract_dist_labels(self, instance_name, group, metric_dict, labels, index):
"""Extract additional distinguishing labels from metric dictionary. All top-level attributes which are
strings and for which a dimension mapping is available will be transformed into dimensions.
:param instance_name: instance to be used
:param group: metric group or None for root/unspecified group
:param metric_dict: input dictionary containing the metric at the top-level
:param labels: labels dictionary to extend with the additionally found labels
:param index: index value to be used as fallback if no labels can be derived from string-valued attributes
or the derived labels are not mapped in the config.
:return: Extended labels, already including the 'labels' passed into this method
"""
ext_labels = None
# collect additional dimensions first from non-metrics
for element, child in metric_dict.iteritems():
if isinstance(child, str) and len(self._get_mappings(instance_name, group, element)) > 0:
if not ext_labels:
ext_labels = labels.copy()
ext_labels[element] = child
# if no additional labels supplied just take the index (if it is mapped)
if not ext_labels and len(self._get_mappings(instance_name, group, 'index')) > 0:
if not ext_labels:
ext_labels = labels.copy()
ext_labels['index'] = str(index)
return ext_labels
def push_metric(self, instance, metric, value, labels=None, group=None, timestamp=None, fixed_dimensions=None,
default_dimensions=None):
"""Pushes a meter using the configured mapping information to determine metric_type and map the name and dimensions
:param instance: instance containing the check configuration
:param value: metric value (float)
:param metric: metric as reported from metric data source (before mapping)
:param labels: labels/tags as reported from the metric data source (before mapping)
:param timestamp: optional timestamp to handle rates properly
:param group: specify the metric group, otherwise it will be determined from the metric name
:param fixed_dimensions:
:param default_dimensions:
"""
# determine group automatically if not specified
if fixed_dimensions is None:
fixed_dimensions = {}
if labels is None:
labels = {}
if default_dimensions is None:
default_dimensions = {}
if group is None:
group = self._get_group(instance, metric)
metric_entry = self._fetch_metric_spec(instance, metric, group)
if metric_entry.metric_type == DynamicCheckHelper.SKIP:
return False
if self._prefix:
metric_prefix = self._prefix + '.'
else:
metric_prefix = ''
if group:
metric_prefix += group + '.'
# determine the metric name
metric_name = metric_prefix + metric_entry.metric_name
# determine the target dimensions
dims = self._map_dimensions(instance['name'], labels, group, default_dimensions)
if dims is None:
# regex for at least one dimension filtered the metric out
return True
# apply fixed default dimensions
if fixed_dimensions:
dims.update(fixed_dimensions)
log.debug('push %s %s = %s {%s}', metric_entry.metric_type, metric_entry.metric_name, value, dims)
if metric_entry.metric_type == DynamicCheckHelper.RATE:
self._check.rate(metric_name, float(value), dimensions=dims)
elif metric_entry.metric_type == DynamicCheckHelper.GAUGE:
self._check.gauge(metric_name, float(value), timestamp=timestamp, dimensions=dims)
elif metric_entry.metric_type == DynamicCheckHelper.COUNTER:
self._check.increment(metric_name, float(value), dimensions=dims)
return True
def get_mapped_metrics(self, instance):
"""Returns input metric names or regex for which a mapping has been defined
:param instance: instance to consider
:return: array of metrics
"""
metric_list = []
iname = instance['name']
# collect level-0 metrics
metric_map = self._metric_map[iname]
metric_list.extend(metric_map.get(DynamicCheckHelper.GAUGES_KEY, []))
metric_list.extend(metric_map.get(DynamicCheckHelper.RATES_KEY, []))
metric_list.extend(metric_map.get(DynamicCheckHelper.COUNTERS_KEY, []))
# collect group specific metrics
grp_metric_map = self._grp_metric_map.get(iname, {})
for gname, gmmap in grp_metric_map.iteritems():
metric_list.extend(gmmap.get(DynamicCheckHelper.GAUGES_KEY, []))
metric_list.extend(gmmap.get(DynamicCheckHelper.RATES_KEY, []))
metric_list.extend(gmmap.get(DynamicCheckHelper.COUNTERS_KEY, []))
return metric_list
def _map_dimensions(self, instance_name, labels, group, default_dimensions):
"""Transforms labels attached to input metrics into Monasca dimensions
:param default_dimensions:
:param group:
:param instance_name:
:param labels:
:return: mapped dimensions or None if the dimensions filter did not match and the metric needs to be filtered
"""
dims = default_dimensions.copy()
# map all specified dimension keys
for labelname, labelvalue in labels.iteritems():
mapping_arr = self._get_mappings(instance_name, group, labelname)
target_dim = None
for map_spec in mapping_arr:
try:
# map the dimension name
target_dim = map_spec.dimension
# apply the mapping function to the value
if target_dim not in dims: # do not overwrite
mapped_value = map_spec.map_value(labelvalue)
if mapped_value is None:
# None means: filter it out based on dimension value
return None
elif mapped_value != '':
dims[target_dim] = mapped_value
# else the dimension will not map
except (IndexError, AttributeError): # probably the regex was faulty
log.exception(
'dimension %s value could not be mapped from %s: regex for mapped dimension %s '
'does not match %s',
target_dim, labelvalue, labelname, map_spec.regex)
return None
return dims
def _get_mappings(self, instance_name, group, labelname):
# obtain mappings
# check group-specific ones first
if group:
mapping_arr = self._grp_dimension_map[instance_name].get(group, {}).get(labelname, [])
else:
mapping_arr = []
# fall-back to global ones
mapping_arr.extend(self._dimension_map[instance_name].get(labelname, []))
return mapping_arr
@staticmethod
def _create_metric_spec(metric, metric_type, metric_map):
"""Get or create MetricSpec if metric is in list for metric_type
:param metric: incoming metric name
:param metric_type: GAUGE, RATE, COUNTER
:param metric_map: dictionary with mapping configuration for a metric group or the entire instance
:return: new MetricSpec entry or None if metric is not listed as metric_type
"""
re_list = metric_map.get(DynamicCheckHelper.CONFIG_SECTIONS[metric_type], [])
for rx in re_list:
match_groups = re.match(rx, metric)
if match_groups:
metric_entry = DynamicCheckHelper.MetricSpec(metric_type=metric_type,
metric_name=DynamicCheckHelper._normalize_metricname(
metric,
match_groups))
return metric_entry
return None
@staticmethod
def _lookup_metric(metric, metric_cache, metric_map):
"""Search cache for a MetricSpec and create if missing
:param metric: input metric name
:param metric_cache: cache to use
:param metric_map: mapping config element to consider
:return: MetricSpec for the output metric
"""
i = DynamicCheckHelper.GAUGE
metric_entry = metric_cache.get(metric, None)
while not metric_entry and i < len(DynamicCheckHelper.METRIC_TYPES):
metric_entry = DynamicCheckHelper._create_metric_spec(metric, DynamicCheckHelper.METRIC_TYPES[i],
metric_map)
i += 1
if not metric_entry:
# fall-through
metric_entry = DynamicCheckHelper.MetricSpec(metric_type=DynamicCheckHelper.SKIP,
metric_name=DynamicCheckHelper._normalize_metricname(metric))
metric_cache[metric] = metric_entry
return metric_entry
@staticmethod
def _normalize_metricname(metric, match_groups=None):
# map metric name first
if match_groups and match_groups.lastindex > 0:
metric = '_'.join(match_groups.groups())
metric = re.sub('(?!^)([A-Z]+)', r'_\1', metric.replace('.', '_')).replace('__', '_').lower()
metric = re.sub(r"[,+*\-/()\[\]{}]", "_", metric)
# Eliminate multiple _
metric = re.sub(r"__+", "_", metric)
# Don't start/end with _
metric = re.sub(r"^_", "", metric)
metric = re.sub(r"_$", "", metric)
# Drop ._ and _.
metric = re.sub(r"\._", ".", metric)
metric = re.sub(r"_\.", ".", metric)
return metric


@@ -21,4 +21,4 @@ redis>=2.10.0 # MIT
six>=1.9.0 # MIT
supervisor>=3.1.3,<3.4
stevedore>=1.17.1 # Apache-2.0
tornado>=4.3
tornado>=4.3


@@ -5,3 +5,4 @@ hacking>=0.12.0,!=0.13.0,<0.14 # Apache-2.0
flake8<2.6.0,>=2.5.4 # MIT
nose # LGPL
mock>=2.0 # BSD
prometheus_client

tests/test_checks_utils.py (new file, 107 lines)

@@ -0,0 +1,107 @@
import os
import time
import unittest
import monasca_agent.common.config as configuration
from monasca_agent.collector.checks import AgentCheck
from monasca_agent.collector.checks.utils import DynamicCheckHelper
base_config = configuration.Config(os.path.join(os.path.dirname(__file__),
'test-agent.yaml'))
class TestDynamicCheckHelper(unittest.TestCase):
def setUp(self):
agent_config = base_config.get_config(sections='Main')
self._instances = [{'name': 'test',
'mapping': {
'gauges': ['stats.(MessagesAvg)'],
'counters': ['MessagesTotal'],
'dimensions': {
'index': 'index',
'simple_dimension': 'simple_label',
'complex_dimension': {
'source_key': 'complex_label',
'regex': 'k8s_([._\-a-zA-Z0-9]*)_postfix'
},
'complex_dimension_rest': {
'source_key': 'complex_label',
'regex': 'k8s_([._\-a-zA-Z0-9]*_postfix)'
}
},
'groups': {
'testgroup': {
'dimensions': {
'user': 'user'
},
'rates': ['.*\.Responses.*', '(sec_auth_.*).stats',
'(io_service_bytes)_stats_Total']
}
# dimensions should be inherited from above
}}}]
self.check = AgentCheck("DynCheckHelper-Teset", {}, agent_config, self._instances) # TODO mock check
self.helper = DynamicCheckHelper(self.check, 'dynhelper')
def run_check(self):
self.check.run()
metric_dict = {"sec": {"auth": [{"user": "me", "total.stats": 10}, {"user": "you", "total.stats": 15}]},
"io_service_bytes": {"stats": {"Total": 10}}}
self.helper.push_metric_dict(self._instances[0], metric_dict, group="testgroup",
labels={'simple_label': 'simple_label_test',
'complex_label': 'k8s_monasca-api-a8109321_postfix'}, max_depth=3)
self.helper.push_metric(self._instances[0], metric='req.ResponsesOk', value=10.0,
group="testgroup",
labels={'simple_label': 'simple_label_test',
'complex_label': 'k8s_monasca-api-a8109321_postfix'})
self.helper.push_metric(self._instances[0], metric='stats.MessagesAvg', value=5.0,
labels={'simple_label': 'simple_label_test',
'complex_label': 'k8s_monasca-api-a8109321_postfix'})
self.helper.push_metric(self._instances[0], metric='MessagesTotal', value=1)
time.sleep(1)
self.helper.push_metric_dict(self._instances[0], metric_dict, group="testgroup",
labels={'simple_label': 'simple_label_test',
'complex_label': 'k8s_monasca-api-a8109321_postfix'}, max_depth=3)
self.helper.push_metric(self._instances[0], metric='req.ResponsesOk', value=15.0,
group="testgroup",
labels={'simple_label': 'simple_label_test',
'complex_label': 'k8s_monasca-api-a8109321_postfix'})
self.helper.push_metric(self._instances[0], metric='MessagesTotal', value=100)
metrics = self.check.get_metrics()
return metrics
def testMeasurements(self):
metrics = self.run_check()
for m in metrics:
print "metric: {0}, dimensions: {1}".format(m['measurement']['name'], repr(m['measurement']['dimensions']))
metric1 = sorted(filter(lambda m: m['measurement']['name'] == 'dynhelper.messages_avg', metrics))
metric2 = sorted(filter(lambda m: m['measurement']['name'] == 'dynhelper.messages_total', metrics))
metric3 = sorted(filter(lambda m: m['measurement']['name'] == 'dynhelper.testgroup.req_responses_ok', metrics))
metric4 = sorted(filter(lambda m: m['measurement']['name'] == 'dynhelper.testgroup.sec_auth_total', metrics))
self.assertTrue(len(metric1) > 0,
'gauge dynhelper.messages_avg missing in metric list {0}'.format(repr(metrics)))
self.assertEquals(metric1[0]['measurement']['dimensions'],
{'simple_dimension': 'simple_label_test', 'complex_dimension': 'monasca-api-a8109321',
'complex_dimension_rest': 'monasca-api-a8109321_postfix',
'hostname': metric1[0]['measurement']['dimensions'].get('hostname')})
self.assertTrue(len(metric2) > 0,
'rate dynhelper.messages_total missing in metric list {0}'.format(repr(metrics)))
self.assertEquals(metric2[0]['measurement']['dimensions'],
{'hostname': metric2[0]['measurement']['dimensions'].get('hostname')})
self.assertTrue(len(metric3) > 0,
'rate dynhelper.testgroup.req_responses_ok missing in metric list {0}'.format(repr(metrics)))
self.assertEquals(metric3[0]['measurement']['dimensions'],
{'simple_dimension': 'simple_label_test', 'complex_dimension': 'monasca-api-a8109321',
'complex_dimension_rest': 'monasca-api-a8109321_postfix',
'hostname': metric3[0]['measurement']['dimensions'].get('hostname')})
self.assertTrue(len(metric4) == 2,
'rate dynhelper.testgroup.sec_auth_total missing in metric list {0}'.format(repr(metrics)))
self.assertEquals(metric4[0]['measurement']['dimensions'],
{'simple_dimension': 'simple_label_test', 'complex_dimension': 'monasca-api-a8109321',
'complex_dimension_rest': 'monasca-api-a8109321_postfix',
'user': 'me', 'hostname': metric4[0]['measurement']['dimensions'].get('hostname')})
self.assertEquals(metric4[1]['measurement']['dimensions'],
{'simple_dimension': 'simple_label_test', 'complex_dimension': 'monasca-api-a8109321',
'complex_dimension_rest': 'monasca-api-a8109321_postfix',
'user': 'you', 'hostname': metric4[1]['measurement']['dimensions'].get('hostname')})