RETIRED. An Ansible role for aggregating logs from different nodes.

collect-logs
============

An Ansible role for aggregating logs from different nodes.

Requirements
------------

This role gathers logs and debug information from a target system and collates them in a designated directory, artcl_collect_dir, on the localhost.

Additionally, the role will convert templated bash scripts, created and used by TripleO-Quickstart during deployment, into rST files. These rST files are combined with static rST files and fed into Sphinx to create user-friendly post-build documentation specific to the original deployment.

Finally, the role optionally handles uploading these logs to an rsync server or to OpenStack Swift object storage. Logs from Swift can be exposed with os-loganalyze.

Role Variables
--------------

  • artcl_collect_list A list of files and directories to gather from the target. Directories are collected recursively and need to end with a "/" to get collected. Should be specified as a YAML list, e.g.::

        artcl_collect_list:
            - /etc/nova/
            - /home/stack/*.log
            - /var/log/
  • artcl_collect_list_append A list of files and directories to be appended to the default list. This is useful for users who want to keep the original list and just add more relevant paths.
  • artcl_exclude_list A list of files and directories to exclude from collecting. This list is passed to rsync as an exclude filter and it takes precedence over the collection list. For details see the “FILTER RULES” topic in the rsync man page.
  • artcl_collect_dir A local directory where the logs should be gathered, without a trailing slash.
  • artcl_gzip_only: false/true When true, gathered files are gzipped one by one in artcl_collect_dir; when false, a single tar.gz file will contain all the logs.
  • collect_log_types A list of the types of logs to collect, such as openstack logs, network logs, system logs, etc. Acceptable values are system, monitoring, network, openstack and container.
  • artcl_gen_docs: false/true If true, the role will use build artifacts and Sphinx to produce user-friendly documentation (default: false).
  • artcl_docs_source_dir A local directory that serves as the Sphinx source directory.
  • artcl_docs_build_dir A local directory that serves as the Sphinx build output directory.
  • artcl_create_docs_payload A dictionary of lists that directs what documentation to construct and how.
    • included_deployment_scripts List of templated bash scripts to be converted to rST files.
    • included_static_docs List of static rST files that will be included in the output documentation.
    • table_of_contents List that defines the order in which rST files will be laid out in the output documentation.
  • artcl_verify_sphinx_build false/true If true, verify that items defined in artcl_create_docs_payload.table_of_contents exist in the Sphinx-generated index.html (default: false).
An example artcl_create_docs_payload::

    artcl_create_docs_payload:
      included_deployment_scripts:
        - undercloud-install
        - undercloud-post-install
      included_static_docs:
        - env-setup-virt
      table_of_contents:
        - env-setup-virt
        - undercloud-install
        - undercloud-post-install
  • artcl_publish: true/false If true, the role will attempt to rsync logs to the target specified by artcl_rsync_url. Uses the BUILD_URL and BUILD_TAG variables from the environment (set during a Jenkins job run) and requires the following variables to be set.
  • artcl_txt_rename: false/true Rename text-based files to end in .txt.gz so that upstream log servers display them in the browser instead of offering them for download.
  • artcl_publish_timeout The maximum time, in seconds, the role may spend uploading the logs; the default is 1800 (30 minutes).
  • artcl_use_rsync: false/true Use rsync to upload the logs.
  • artcl_rsync_use_daemon: false/true Use the rsync daemon instead of ssh to connect.
  • artcl_rsync_url The rsync target for uploading the logs. The localhost needs passwordless authentication to the target, or the PROVISIONER_KEY variable must be specified in the environment.
  • artcl_use_swift: false/true Use Swift object storage to publish the logs.
  • artcl_swift_auth_url The OpenStack auth URL for Swift.
  • artcl_swift_username The OpenStack username for Swift.
  • artcl_swift_password The password for the Swift user.
  • artcl_swift_tenant_name The OpenStack tenant name for Swift.
  • artcl_swift_container The name of the Swift container to use; the default is logs.
  • artcl_swift_delete_after The number of seconds after which Swift will remove the uploaded objects; the default is 2678400 seconds (31 days).
  • artcl_artifact_url An HTTP URL at which the uploaded logs will be accessible after upload.
  • artcl_collect_sosreport true/false If true, create and collect a sosreport for each host.
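
To illustrate how these variables fit together, a playbook invoking the role might set a few of them as follows (the values shown are illustrative, not the role's defaults)::

    ---
    - name: Gather logs
      hosts: all:!localhost
      vars:
        artcl_collect_dir: /tmp/collected_logs
        artcl_collect_list_append:
          - /var/log/extra/
        artcl_gzip_only: true
        artcl_publish: false
      roles:
        - collect-logs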

Logs parsing
------------

The "sova" module parses logs for known patterns and returns the messages that were found. Patterns are tagged by issue type, such as "infra" and "code". The patterns are located in the file sova-patterns.yml in the vars/ directory.

  • config - patterns loaded from the config file
  • files - a mapping whose keys match sections in the patterns config and whose values are the log files to parse
  • result - path to the file where the parsing result is written
  • result_file_dir - directory in which to write a file with the matched patterns in its name

Example usage of the "sova" module::

    ---
    - name: Run sova task
      sova:
        config: "{{ pattern_config }}"
        files:
          console: "{{ ansible_user_dir }}/workspace/logs/quickstart_install.log"
          errors: "/var/log/errors.txt"
          "ironic-conductor": "/var/log/containers/ironic/ironic-conductor.log"
          syslog: "/var/log/journal.txt"
          logstash: "/var/log/extra/logstash.txt"
        result: "{{ ansible_user_dir }}/workspace/logs/failures_file"
        result_file_dir: "{{ ansible_user_dir }}/workspace/logs"

Example Role Playbook
---------------------

::

    ---
    - name: Gather logs
      hosts: all:!localhost
      roles:
        - collect-logs
**Note:** The tasks that collect data from the nodes are executed with ignore_errors. For example:
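
A minimal sketch of the pattern (the task below is illustrative, not one of the role's actual tasks)::

    - name: Collect logs from the node
      command: tar --ignore-failed-read -czf /tmp/collected.tar.gz /var/log
      ignore_errors: true

Because collection is best-effort, a failure in any one of these tasks does not abort the rest of the log gathering.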

Templated Bash to rST Conversion Notes
--------------------------------------

Templated bash scripts used during deployment are converted to rST files during the create-docs portion of the role's call. Shell scripts are fed into an awk script and output as reStructuredText. The awk script has several simple rules:

  1. Only lines between ### ---start_docs and ### ---stop_docs will be parsed.
  2. Lines containing # nodoc will be excluded.
  3. Lines containing ## :: indicate that subsequent lines should be formatted as code blocks.
  4. Other lines beginning with ## <anything else> will have the prepended ## removed. This is how and where general rST formatting is added.
  5. All other lines, including shell comments, will be indented by four spaces.
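
Applied to a short, hypothetical annotated script, the rules above behave as follows::

    ### ---start_docs
    ## Install the undercloud
    ## =======================
    ## ::
    openstack undercloud install
    echo "finished" # nodoc
    ### ---stop_docs

Only the lines between the start and stop markers are parsed (rule 1); the echo line is dropped because of # nodoc (rule 2); ## :: turns what follows into a code block (rule 3); the heading lines lose their leading ## and become plain rST (rule 4); and the command itself is indented four spaces as code (rule 5).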

Enabling sosreport Collection
-----------------------------

sosreport is a unified tool for collecting system logs and other debug information. To enable creation of sosreport(s) with this role, create a custom config (you can use centosci-logs.yml as a template) and ensure that artcl_collect_sosreport: true is set.
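
A custom config built from centosci-logs.yml would then contain, among whatever other settings you need::

    artcl_collect_sosreport: true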

License
-------

Apache 2.0

Author Information
------------------

RDO-CI Team