Debugging guidelines for validations-libs

The project documentation now contains a short guide to
issue reporting and preliminary diagnosis.

A short checklist is provided to help with data collection
before a report is submitted.

Signed-off-by: Jiri Podivin <jpodivin@redhat.com>
Change-Id: I0f1b28d69bb9d197ecc1ad63222bb74cb2c97c93
Jiri Podivin 2021-06-21 13:58:30 +02:00
parent e4b5dc3ea7
commit 0999476e6d
3 changed files with 69 additions and 1 deletions


@ -6,6 +6,8 @@ Contributing to validations-libs
.. include:: ../../CONTRIBUTING.rst
.. _communication:
Communication
-------------
* IRC channel ``#validation-framework`` at `Libera`_ (For all subject-matters)

doc/source/debugging.rst Normal file

@ -0,0 +1,65 @@
.. _debugging:
=============================
Debugging and issue reporting
=============================
The Validation Framework (VF) checks the behavior and state of complex systems.
It is unfortunate, but unavoidable, that sometimes things go wrong.
When that happens, the following simple guidelines can help.
Data collection
---------------
No matter what happened, it will not get fixed unless someone knows
what actually occurred. It is therefore paramount that the data collected
about the issue are both accurate and comprehensive.
It is especially important to collect information about the VF itself.
If you aren't sure what to look for, consider the recommendations listed below.
The first four points deserve special attention, as they constitute the bare minimum
needed to begin diagnosing the issue.
* Note the system configuration. Information about VF packages,
  the validation target, and user permissions deserves special attention,
  as does information about the Python interpreter used.
* Record all output of the commands run, up to and including the point
  when the issue appeared.
* Check the presence and permissions of the '/var/log/validations' directory.

  * The user running validations must have both read and write permissions.

* Check the presence and permissions of the '/usr/share/ansible/validation-playbooks' directory.

  * The directory has to be accessible to the user running validations.
* Rerun the command with the same parameters and note the output.
* Rerun the command with the `--debug` and `-vvv` parameters, while keeping the others unchanged.
  Note the output.
* Rerun the command in its most basic form, with no arguments except the bare minimum.
  Note the output.
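Parts of the checklist above can be scripted. The following is a minimal sketch, assuming the default VF paths and an RPM-based system; `check_dir` is an ad-hoc helper written for this example, not part of the framework.

```shell
# Hypothetical pre-report collection sketch; the paths are the VF
# defaults mentioned above and may differ on your system.

check_dir() {
    # Print a one-word status for the directory given as $1;
    # pass "rw" as $2 when write access is also required.
    if [ ! -d "$1" ]; then echo "missing"
    elif [ ! -r "$1" ]; then echo "unreadable"
    elif [ "$2" = "rw" ] && [ ! -w "$1" ]; then echo "unwritable"
    else echo "ok"
    fi
}

# System configuration (the rpm query assumes an RPM-based distro).
command -v rpm >/dev/null && rpm -qa | grep -i validation || true
command -v python3 >/dev/null && python3 --version || echo "python3 not found"
id                   # current user and group membership

# Directory presence and permissions.
check_dir /var/log/validations rw                    # needs read *and* write
check_dir /usr/share/ansible/validation-playbooks    # needs read access
```

Attach the script's output, together with the recorded command output, to the report.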
Additional information:
* Attempt to run the given validation playbook, usually found in the '/usr/share/ansible/validation-playbooks'
  directory, with the `ansible-playbook` binary. The behavior might be unusual even if the validation is working
  properly, because many mechanisms employed by the VF have no direct counterpart in `ansible-playbook`.
  The information collected this way may therefore be less relevant.
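For illustration, such a direct invocation might look like the following; the playbook name `check-ram.yaml` is only a stand-in for whichever validation is under investigation.

```shell
# Assumed direct run of a single validation playbook; the playbook
# name is illustrative, only the directory is a VF default.
playbook_dir="/usr/share/ansible/validation-playbooks"
cmd="ansible-playbook -vvv ${playbook_dir}/check-ram.yaml"
echo "Would run: $cmd"    # inspect the command before running it by hand
```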
Report submission
-----------------
We welcome all who come to our :ref:`IRC channel <communication>`.
Nevertheless, when it comes to more complex issues, it still pays to write a bug report.
If you encounter problems with an upstream build of our code, post the problem,
along with all the information you have diligently collected, to `Launchpad`_.
.. _Launchpad: https://bugs.launchpad.net/tripleo/+bugs?field.tag=validations


@ -14,9 +14,10 @@ Contents
:maxdepth: 2
readme
cli
contributing
testing
cli
debugging
reference/index
Indices and tables