Copy from Volvo Cars local project

We don't transfer the git history, since it may contain proprietary data that
we cannot include in an open-source version.

Change-Id: I9586124c1720db69a76b9390e208e9f0ba3b86d4
Henrik Wahlqvist 2024-05-28 13:05:22 +02:00
parent 05acd5e750
commit 65c1d746a7
407 changed files with 212313 additions and 3 deletions

29
.gitignore vendored Normal file

@@ -0,0 +1,29 @@
# Ignore python files.
*.pyc
.tox
.cache
.pytest_cache
.coverage
.venv
# Ignore IDE files.
.idea
.vscode
# Ignore pytest-generated files.
**/output/
# Ignore generated signal check reports.
**/Reports
# Ignore setuptools distribution folder.
/dist/
/build/
# Ignore python egg metadata, regenerated from source files by setuptools.
/*.egg-info
/.eggs/
# Ignore files generated by pbr
AUTHORS
ChangeLog

128
CODE_OF_CONDUCT.md Normal file

@@ -0,0 +1,128 @@
# Contributor Covenant Code of Conduct
## Our Pledge
We as members, contributors, and leaders pledge to make participation in our
community a harassment-free experience for everyone, regardless of age, body
size, visible or invisible disability, ethnicity, sex characteristics, gender
identity and expression, level of experience, education, socio-economic status,
nationality, personal appearance, race, religion, or sexual identity
and orientation.
We pledge to act and interact in ways that contribute to an open, welcoming,
diverse, inclusive, and healthy community.
## Our Standards
Examples of behavior that contributes to a positive environment for our
community include:
* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes,
and learning from the experience
* Focusing on what is best not just for us as individuals, but for the
overall community
Examples of unacceptable behavior include:
* The use of sexualized language or imagery, and sexual attention or
advances of any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email
address, without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Enforcement Responsibilities
Community leaders are responsible for clarifying and enforcing our standards of
acceptable behavior and will take appropriate and fair corrective action in
response to any behavior that they deem inappropriate, threatening, offensive,
or harmful.
Community leaders have the right and responsibility to remove, edit, or reject
comments, commits, code, and other contributions that are
not aligned to this Code of Conduct, and will communicate reasons for moderation
decisions when appropriate.
## Scope
This Code of Conduct applies within all community spaces, and also applies when
an individual is officially representing the community in public spaces.
Examples of representing our community include using an official e-mail address,
posting via an official social media account, or acting as an appointed
representative at an online or offline event.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported to the community leaders responsible for enforcement at
<henrik.wahlqvist@volvocars.com>.
All complaints will be reviewed and investigated promptly and fairly.
All community leaders are obligated to respect the privacy and security of the
reporter of any incident.
## Enforcement Guidelines
Community leaders will follow these Community Impact Guidelines in determining
the consequences for any action they deem in violation of this Code of Conduct:
### 1. Correction
**Community Impact**: Use of inappropriate language or other behavior deemed
unprofessional or unwelcome in the community.
**Consequence**: A private, written warning from community leaders, providing
clarity around the nature of the violation and an explanation of why the
behavior was inappropriate. A public apology may be requested.
### 2. Warning
**Community Impact**: A violation through a single incident or series
of actions.
**Consequence**: A warning with consequences for continued behavior. No
interaction with the people involved, including unsolicited interaction with
those enforcing the Code of Conduct, for a specified period of time. This
includes avoiding interactions in community spaces as well as external channels
like social media. Violating these terms may lead to a temporary or
permanent ban.
### 3. Temporary Ban
**Community Impact**: A serious violation of community standards, including
sustained inappropriate behavior.
**Consequence**: A temporary ban from any sort of interaction or public
communication with the community for a specified period of time. No public or
private interaction with the people involved, including unsolicited interaction
with those enforcing the Code of Conduct, is allowed during this period.
Violating these terms may lead to a permanent ban.
### 4. Permanent Ban
**Community Impact**: Demonstrating a pattern of violation of community
standards, including sustained inappropriate behavior, harassment of an
individual, or aggression toward or disparagement of classes of individuals.
**Consequence**: A permanent ban from any sort of public interaction within
the community.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage],
version 2.0, available at
<https://www.contributor-covenant.org/version/2/0/code_of_conduct.html>.
Community Impact Guidelines were inspired by [Mozilla's code of conduct
enforcement ladder](https://github.com/mozilla/diversity).
[homepage]: https://www.contributor-covenant.org
For answers to common questions about this code of conduct, see the FAQ at
<https://www.contributor-covenant.org/faq>. Translations are available at
<https://www.contributor-covenant.org/translations>.

55
CONTRIBUTING.md Normal file

@@ -0,0 +1,55 @@
# Contributing to Our Project
We love your input! We want to make contributing to this project as easy and transparent as possible, whether it's:
- Reporting a bug
- Discussing the current state of the code
- Submitting a fix
- Proposing new features
- Becoming a maintainer
## We Develop with Gerrit
We use Gerrit for code review. You can propose changes by submitting a patch to Gerrit.
## We Use [Coding Conventions](#)
To ensure consistency throughout the source code, please follow these coding conventions.
## Report bugs using Storyboard
We use [Storyboard](https://storyboard.openstack.org/#!/project/volvocars/powertrain-build) issues to track public bugs. Report a bug by opening a new issue.
## Write bug reports with detail, background, and sample code
A bug report with an isolated way to reproduce the problem is ideal. Here's an example:
> Short and descriptive example bug report title
>
> A summary of the issue and the environment in which it occurs. If suitable, include the steps required to reproduce the bug.
>
> 1. This is the first step
> 2. This is the second step
> 3. Further steps, etc.
>
> `<a link to the reduced test case>`
>
> Any other information you want to share that is relevant to the issue being reported.
## Use a Consistent Coding Style
We are using a specific coding style. Please follow it to make the codebase easier to read and maintain.
## License
By contributing, you agree that your contributions will be licensed under the project's existing license.
## References
Include any references or links that contributors may find useful, such as:
- Gerrit documentation
- Coding style guidelines
- Other reference material
Thank you for contributing!

202
LICENSE Normal file

@@ -0,0 +1,202 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright 2024 Volvo Car Corporation
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

21
MANIFEST.in Normal file

@@ -0,0 +1,21 @@
include pybuild/templates/*.html
include pybuild/static_code/DIDApiTL/*
include pybuild/static_code/DIDApiEC/*
include pybuild/static_code/bosch_headers/*
include pybuild/static_code/csp_headers/*
include pybuild/static_code/denso_headers/*
include pybuild/static_code/hi_headers/*
include pybuild/static_code/BoschCoreAPIEC/*
include pybuild/static_code/BoschCoreAPITL/*
include pybuild/static_code/CSPCoreAPITL/*
include pybuild/static_code/CSPCoreAPIEC/*
include pybuild/static_code/DensoCoreAPIEC/*
include pybuild/static_code/DensoCoreAPITL/*
include test-requirements.txt
include pybuild/matlab_scripts/CodeGen/*
include pybuild/matlab_scripts/helperFunctions/*
include pybuild/matlab_scripts/*
include requirements.txt
prune pybuild/test
prune pybuild/interface/test
prune pybuild/lib/test

3122
NOTICE Normal file

File diff suppressed because it is too large.

47
README.md Normal file

@@ -0,0 +1,47 @@
# PyBuild
A Continuous Integration (CI) build system, testing all configurations where a TargetLink model is used.
## General Information about Pybuild
- PyBuild is fast.
- More parallelization of jobs in the CI system makes it faster.
- Code generation is moved to the developer's PC.
- Code generation is done once for all projects using pre-processor directives.
- C code reviews are now possible in Gerrit.
- PyBuild adds signal consistency checks.
- Unit tests of the build system are introduced.
- Its quality is assured.
- PyBuild creates new variable classes with unique code decorations.
- Post-processing of C code is not necessary.
- ASIL-classed variables get declared at the source.
- Memory can be optimized at compilation by short-addressing different variable classes.
- The same models can be used with more than two different suppliers, for instance SPA2's Core System Platform (CSP).
- PyBuild fixes incorrect handling of NVM variables.
## Project Structure
- `docs/`: This directory holds all the extra documentation about the project.
- `playbooks/`: Directory where we keep Ansible playbooks that are executed in the jobs we use in this project.
- `pybuild/`: Main directory of the project. All the application source code is kept here. It is divided into different Python modules:
- `interface/`
- `lib/`
- `zone_controller/`
Also, we keep `static_code/` and `templates/` directories with useful `.c`, `.h`, and `.html` files.
- `tests/`: Directory where we keep the unit tests for our application source code. The tests are structured in a similar way to what we have inside the `pybuild/` directory. Tests for the `interface`, `lib`, and `zone_controller` modules are split into `tests/interface/`, `tests/lib/`, and `tests/zone_controller/`, respectively. Other tests are kept inside the `tests/pybuild/` directory.
- `zuul.d/`: Directory where we keep our Zuul jobs.
## How to use Pybuild
See [Powertrain Build](./docs/powertrain_build.md) and the documentation index in [docs/index.md](./docs/index.md) for usage instructions.
## Contributing
We would love to see you contribute to this project. No matter if it is fixing a bug, adding some tests, improving documentation, or implementing new features. See our [contribution guidelines](./CONTRIBUTING.md) so you can have a better understanding of the whole process.
## Code of Conduct
We are trying to create a healthy community that thrives on the desire to improve, learn, and share knowledge. See our [code of conduct guidelines](./CODE_OF_CONDUCT.md) to check our behavioral rules on this project.

20
docs/build_cmd.md Normal file

@@ -0,0 +1,20 @@
# Run the build command
`pybuild/build.py` is the starting point for generating all the files needed to build a complete SW using the supplier make environment. The script takes one positional argument: a [project_config](project_config.md) file.
This script acts as a command line wrapper for [build](../../pybuild/build.py).
```none
usage: build.py [-h] [-cd] [-d] [-na] [-if] [-V] proj_config
positional arguments:
proj_config the project root configuration file
optional arguments:
-h, --help show this help message and exit
-cd, --core_dummy generate core dummy code to enable integration with old supplier code
-d, --debug activate the debug log
-na, --no_abort do not abort due to errors
-if, --interface generate interface report
-V, --version show application version and exit
```
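For orientation, the options above map directly onto an argparse setup. Below is a minimal, hypothetical reconstruction of the argument parsing based solely on the usage text; the names and defaults are assumptions, not the actual `build.py` implementation.
```python
import argparse


def parse_args():
    """Hypothetical sketch of build.py's CLI, reconstructed from the usage text."""
    parser = argparse.ArgumentParser(prog='build.py')
    parser.add_argument('proj_config', help='the project root configuration file')
    parser.add_argument('-cd', '--core_dummy', action='store_true',
                        help='generate core dummy code to enable integration '
                             'with old supplier code')
    parser.add_argument('-d', '--debug', action='store_true',
                        help='activate the debug log')
    parser.add_argument('-na', '--no_abort', action='store_true',
                        help='do not abort due to errors')
    parser.add_argument('-if', '--interface', action='store_true',
                        help='generate interface report')
    parser.add_argument('-V', '--version', action='version',
                        version='%(prog)s (version placeholder)')
    return parser.parse_args()
```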

84
docs/design_standards.md Normal file

@@ -0,0 +1,84 @@
# Design standard
## Imports
Allowed imports in production code are:
- Built-in packages
- Third party packages
- Libraries
If you need a function from another executable script, refactor it into a library and import it from both places.
Allowed imports in test code are:
- Built-in packages
- Third party packages
- Test libraries
- Functions from the code that is being tested
Use the full path when importing.
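For example, a small sketch of the intended style (the module name is illustrative, not an actual powertrain_build module):
```python
# Good: import using the full package path (module name is illustrative).
from pybuild.lib import helper_functions

# Bad: a relative import hides where the code lives.
# from .lib import helper_functions
```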
## Style guides
- The source code should follow [PEP-8](https://www.python.org/dev/peps/pep-0008/).
- Exceptions to these rules are defined in the `<module>/tox.ini` and `<module>/.pylintrc` files
- Docstring documentation shall be done according to [PEP257 style guide](https://www.python.org/dev/peps/pep-0257/).
## Style guide verification
Verify that the code follows the PEP-8 standard using [flake8](https://pypi.org/project/flake8/) by running `flake8 <module>`, and [pylint](https://www.pylint.org/) by running `pylint <module>`.
The ignored folders should have their own configurations, to keep some kind of standard. For example, the documentation folder:
- Run `flake8 <module>/doc` from the repo root folder. The rules are defined in `<module>/doc/tox.ini`.
- Run `pylint <module>.doc --rcfile=<module>/doc/.pylintrc` from the project root folder. The rules are defined in `<module>/doc/.pylintrc`.
Plugins that should be used with flake8:
- flake8-docstrings
## Comments
Comments inside the code (not docstrings) should explain WHY something is implemented the way it is, not WHAT the implementation does. An exception to this rule is regular expressions, which may be documented (but shall always be tested).
Instead of commenting on what the code does, write relevant tests that show this.
Commented and unused code shall be removed.
## Executable scripts
All python scripts that are executed from Jenkins or as stand-alone scripts should be located in the `<module>` directory in the project root folder.
The executable scripts shall use argparse if any argument is needed for the execution of the script.
## Libraries
Common code should be located in the `<module>/lib` folder
If a library is only used by other libraries, it is allowed to put the files in a `<module>/lib/<library>` folder.
If a library requires arguments, they shall not be set as required in argparse. Instead, the library shall check that they are present when needed. This makes it easier to import the library from scripts that only sometimes use it.
The library folders should all have an `__init__.py` containing a docstring.
Placement of functions:
- If a function from an executable script is needed elsewhere, refactor it into a library and import it from both places.
- Only functions used by the library itself, functions that use the library, or functions used by more than one script should be in a library.
- Do not move a function to a library because you think it will be used by other scripts later. Wait until you can refactor both scripts to use the function.
## Tests
All tests shall be located in the `<module>/test` folder.
Unit tests shall be named `test_<submodule>.py` and function tests shall be named `func_<submodule>.py`
Example: `lib/gerrit.py` shall have a `<module>/test/test_gerrit.py`
Additional files needed by the tests shall be located in a separate folder: `test/<submodule>/<files>`
The test shall test WHAT the module is doing, not HOW. That means that you shall test as many public functions as possible, but not explicitly test private functions (`def __<function>`). Instead of testing the private functions, test the public functions in such a way that the private functions are tested as well. This makes it much easier to refactor how the code is implemented while keeping the tests for what it does unmodified.
Some duplication of code is acceptable in the test code, as it should focus more on DAMP (Descriptive And Meaningful Phrases) than DRY (Don't Repeat Yourself). While we want both, DRY sometimes makes the test code hard to read, and then DAMP should be prioritized in the test code.
Mocking in unit tests shall be limited to external systems (such as Gerrit) and other processes (such as Matlab). Mocking in function tests shall be limited to external systems (such as Gerrit).
The code shall be unit tested using the Python package unittest. The unit tests are run by `pytest.exe **/test_*.py` in the project root folder, and the function tests by `pytest.exe **/func_*.py`. All tests can be run by `python -m unittest discover -s `.
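To make the conventions concrete, here is a minimal, self-contained sketch of a unit test following the naming and placement rules above. The function under test is a hypothetical stand-in for a public function in `lib/gerrit.py`, not real powertrain_build code.
```python
import re
import unittest


def get_change_id(commit_message):
    """Stand-in for a hypothetical public function in lib/gerrit.py."""
    match = re.search(r'Change-Id: (I[0-9a-f]+)', commit_message)
    return match.group(1) if match else None


class TestGerrit(unittest.TestCase):
    """Would live in <module>/test/test_gerrit.py, testing WHAT, not HOW."""

    def test_get_change_id(self):
        # Only the public function is tested; private helpers are covered indirectly.
        self.assertEqual(get_change_id('Change-Id: I9586124c'), 'I9586124c')
        self.assertIsNone(get_change_id('no id here'))


if __name__ == '__main__':
    unittest.main()
```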

BIN
docs/images/SignalFlow.PNG Normal file

BIN
docs/images/TL-DD-MACRO.jpg Normal file

(Further binary image files under docs/images/ omitted.)
@@ -0,0 +1 @@
<mxfile host="Electron" modified="2022-09-29T16:34:41.319Z" agent="5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) draw.io/20.2.3 Chrome/102.0.5005.167 Electron/19.0.11 Safari/537.36" etag="uTEIQzkqlNnG4BsbmwIM" version="20.2.3" type="device"><diagram id="pMotE5WckEuNcqNOJ_rL" name="Page-1">5ZlbU6MwFMc/DY8yBEovjxbr7oM67nS3uvviREgha0qYkLbUT78JhGuqrhe0M4WHkj/JIZzfyUlCDcdbZd8YTKJLGiBi2FaQGc6ZYYtjZIsfqewKZTJ2CyFkOCgkUAtz/IiUaCl1jQOUtipySgnHSVv0aRwjn7c0yBjdtqstKWk/NYEh0oS5D4mu3uCAR4U6dq1a/45wGJVPBpa6cw/9h5DRdayeZ9iOdSbP4vYKlrZU/TSCAd02JGdmOB6jlBdXq8xDRPq2dFvR7vyJu1W/GYr5/zTY/PyVPZCbpf34g18t0B92kYATZWUDyVr5I8tMX3WY70on5a+JpCHLcKbbCHM0T6Av725FWAgt4isiSkBcLmnMz+EKExkRHl0zjJgwd4XEy09TzugD8iihLDftnHvyVM2aen5IHRPS0K38ELrqOWIcZU+6BFSOFgGM6ApxthNVVIOTURmFKngHQ1Xe1qEwUFLUiIKSKVTBF1amawDiQjF4BQ97H4/oaHgMRwfGw9F47HZHND7cCXgbD9ATj8E+HsczPlz3wHgMNR4aCxQHp3KeFqWYxkKcBjCNcjigDULq15BzxOJcsS2nIlDOzcI70zbdZ/GhDPNbZV5e/5aNTFeVzjJlIy/sVOH1yIt3RoG20uiAFX4R/fPRSxOAHgANvu4evqXGEIEcb9rd2AddPeGaYtHBOr4GTjv/uk4nsRb9V82ai42uJXfSyeSjjiUOWYi4ZimPwurF3x6YIy0wF/4s4wvIek3f2uD3POs9MfW+dDHQplOgpYvxZ06nk5fTRT8sVOb5Ggwju4NhoGdtd1/aBqAvEEBf+Ffjo8/p9LDGh9NNUl89PoC+/r9aXN4J76x9nh5R5nKsyYtkAJh8IprScAONmLoQWwpvm3666RFNa730/gVpfn4Up2F7BFUjqsXpU3fQemYz/6Y0Pk4+1W7gGT77FpT94dEznHlqXxwnHTA+NDr69w6NjL4fa5BII5jIeqsslJ+LzSWhWz+CjJswjikXmxMa38k2kOBQ7vIIWoqXmhJ4j8g1TbGsIGRWvOw0oXmSnW2ES1Oj3iZWQF+7AmSqE0KZfOH05gzNkdtiLxYapu1q+O2Bjl9MjqbbVwToX1jm6yQhwq0n6bbHYXpYH1q6H4Zde9/I/JihKYr1vwDF/rf+q8WZ/QM=</diagram></mxfile>

@@ -0,0 +1 @@
<mxfile host="Electron" modified="2022-09-29T16:33:39.081Z" agent="5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) draw.io/20.2.3 Chrome/102.0.5005.167 Electron/19.0.11 Safari/537.36" etag="TIeIR2BoZ7Wu229wwxmF" version="20.2.3" type="device"><diagram id="pMotE5WckEuNcqNOJ_rL" name="Page-1">5ZlNc5swEIZ/jY9hEBh/HGPitIckk9ap0/aSUUAGNQJ5hGzj/vpKIDAgktRJcDxjczB6ASHts9qVRM92o/QLg8vwmvqI9CzTT3v2Rc8Sv6El/qSyzZXxyMmFgGE/l8BOmOG/SImmUlfYR0ntRk4p4XhZFz0ax8jjNQ0yRjf12xaU1N+6hAHShJkHia7eY5+HuTpyzJ3+FeEgLN4MTHXlEXpPAaOrWL2vZ9nmhTzyyxEs6lL3JyH06aYi2dOe7TJKeX4WpS4i0raF2fLnLp+5WraboZj/zwPrux/pE7lfWH+/8Zs5+s2uluBM1bKGZKXskaaGpxrMt4WRsm4iWZHZsyebEHM0W0JPXt0ItxBayCMiSkCcLmjML2GEifQIl64YRkxUd4NE5ycJZ/QJuZRQllVtX7ryUI9V9ewndUxIRTezn9BVyxHjKH3WJKA0tHBgRCPE2Vbcoh44GxZeqJy3P1Dlzc4V+koKK15QMIXK+YKy6h0AcaIY7MHDauMRngyPwfDIeNgaj+32hMaHMwZv4wE64tFv43E648NxjozHQOOhsUCxfy7ztCjFNBbixIdJmMEBdRBSv4WcIxZnimXaJYEiNwvrTOp0X8SHUsx/qurl+S/5kOGo0kWq6sgKW1XYH3neZ+RrM40GWGEX0T4PvZYAdAeo8HVa+BYaQwRyvK43ow26esMtxaKBO//q2/X469iNwJq3Xz1WnWw0a3LGjUg+bNTEIQsQ12rKvLDs+Nsdc6g55tybpnwOWafhWxv8rmu+x6feFy76WjoFWrgYHTKdjl8PF92wUJHnczAMrQaGvh61nbawDUBXIIqKKyRmG1eDITrN6xbXLFjmtxZzQ4IDGco9YSkBxp5IO2Kx9jpXFyLs+/JdrZz3CPPy4kw1G3RMswy2RYwEjkZz0AKzGUo/jqW+iCtjXZdTo+OKdXYz4Xx2rAP6Wu5mfv0grLPyeHJCWcg2x6+SAWB8SDT67OD73bRDIMe1imjmI6dlFTE+5FCx9GyEZcZYCGMbXrLukExtLfJ+TNnxUeNmUMdURrjauDkoJz3TGH8SGp8mn3Kl/QKftsVad3j0jGOcW1enSQeMjo2OvpeokdH3OqpT7xAu5X1RGshPMcaC0I0XQsYNGMeUi4U/jR+syrSboIWcvRP4iMgtTbC8Qcgs7+xkSbMgO10Lkya93RZMCXTf1RVTjRDK+BOnG/bAGDo19mLiZ1j6LN3q6/jFZMVwuvIAfffyau6e0HZy83OLY3U28RDF3ae1fFNp9/3Snv4D</diagram></mxfile>

(Further binary image files under docs/images/ omitted.)

BIN
docs/images/ts_change.PNG Normal file

9
docs/index.md Normal file

@@ -0,0 +1,9 @@
# Powertrain Build Documentation
- [Powertrain Build](./powertrain_build.md)
- [Powertrain Build Introduction](./powertrain_build_introduction.md)
- [Powertrain Build Architecture](./powertrain_build_architecture.md)
- [Powertrain Build Deployment](./powertrain_build_deployment.md)
- [Signal Interface Tool](./signal_interface_tool.md)
- [Signaling In Powertrain Build](./signaling_in_powertrain_build.md)
- [Signal Inconsistency Check](./signal_inconsistency_check.md)

92
docs/intro.md Normal file

@@ -0,0 +1,92 @@
# Introduction to PCC Build System's documentation
## Target
- Improved build speed
- Easier integration of code from other sources than TargetLink
- Improved quality
- Design documentation
- Unit tests
- Better code structure
- Better target code (no postprocessing, all code explicitly assigned to memory areas)
## Overview
The big conceptual change is to move the variability from the code generation stage to the compilation stage.
![Old build system](./images/old_build_system.PNG)
The new system builds superset c-code, which can be configured via preprocessor directives.
![New build system](./images/new_build_system.PNG)
As the code generation step takes a significant part of the build time, distributing this step to the time of commit, instead of when building the SW, will reduce build time. Only generating one set of c-code, instead of one per variant, will further reduce build time.
The unit c-code also has a unit metadata file (in JSON format) which the build system uses. This metadata file acts as a unit code generator abstraction, i.e. it contains all information needed by the build system and is independent of how the unit code was created. This makes integration of handwritten c-code much easier.
## Summary of changes
This section describes the changes made to the build system (data dictionary, scripts, configuration files, etc.), but not the changes needed in the model files. Those changes are described in [model_changes](model_changes.md).
1. No TargetLink classes use pragma directives. These are replaced with header files. The header files differ per supplier and include the supplier-specific header files. When the project is set up in the version control system, the correct headers for the supplier are included with externals, i.e. the configuration is done in the version control system. This means that no post-processing of c-files is necessary, which reduces build time.
2. Data dictionary changes
- Pragma directives replaced by includes (see above)
- Removal of all datatype sizes in the classes (i.e. CVC_DISP replaces CVC_DISP8, CVC_DISP16 and CVC_DISP32). TargetLink 3.3 groups variable declarations by size without different classes.
- New NVM-ram types are added
- Removed old legacy classes which should no longer be used
- Dependability classes are added, and postprocessing of dependability code is removed.
Scripts are provided to update the models with the new classes. The dependability classes are located in variable class groups, to make creating models easier (less clutter in the TargetLink class dialog).
![Variable Classes](./images/DD_class_example.PNG)
3. Data dictionary additions for code switch replacement
- Create a file to be included with the preprocessor defines for the configuration, by creating a new module in DD0/Pool/Modules. The name of the module will be the name of the header-file included in the c-files. I.e. MT_CompilerSwitches in the example below.
- Create a new file specification in the module (by right-clicking "file specifications" and selecting "Create FileSpecification"). Fill in the values below.
![Header File Specification](./images/TL-DD-preprocessor1.jpg)
- Create another file specification in the module (by right-clicking "file specifications" and selecting "Create FileSpecification"). Fill in the values below.
![Source File Specification](./images/TL-DD-preprocessor2.jpg)
- Create a new variable class (by right-clicking "Variable Class" and selecting "Create Variable Class"). Fill in the values below, but add MERGEABLE in the "Optimization" option.
![Mergeable Variable Class](./images/TL-DD-MACRO.jpg)
4. No models are needed for VcExt*, VcDebug*. C-code and A2L-files are generated directly, which saves build time.
5. Variables are declared in the units which write to them, with the exception of variables in the supplier interface. I.e.
- VcIntVar is removed.
- Classes of unit outputs in the models are changed from CVC_EXT to CVC_DISP. Scripts are provided to modify the models.
6. All units are checked in together with their c-code and a JSON unit-definition file. See [unit_config](unit_config.md) for the structure of that file. This leads to code generator independence and vastly improved build time, as the most time-consuming step is distributed.
7. New project configuration file, see [project_config](project_config.md)
- Config separated into a global config file and one project-specific file.
- More flexibility for different file structures (config change rather than code change)
8. Interface checks are introduced to guarantee that interfaces are consistent before the build. This increases quality and reduces build time.
9. Improved maintainability by
- A more modular design.
- Documentation of the build system.
- Unit tests for the build system.
10. A new temp-model is generated to move the TL-outport to the subsystem where the signal is created. This is needed to be able to remove the unused outports from the generated c-code, and at the same time have the possibility to create a test harness. Is there a way to avoid this without using the TL-DD or making temp models?
## Limitations in TL3.3
When using TargetLink 3.3 and preprocessor directives, there are some limitations of what can be used in subsystems under these directives.
1. Enabled subsystems cannot use the reset states option.
2. The outports of enabled subsystems cannot use the reset when disabled option.
![Enable and Port reset options examples](./images/reset_port_SL_example.PNG)
After discussions with dSPACE, they are considering removing the above-mentioned limitations in TL 4.4 or later. This needs to be confirmed with dSPACE.
Another issue is that the TargetLink code generation optimizer treats preprocessor ifs like any other control flow (e.g. an ordinary if statement). Enabling the control-flow code generation optimization (the MOVEABLE optimization in the class) can lead to the removal of code, calibration data or measurement variables when using preprocessor ifs.
See [model_changes](model_changes.md) for workarounds for these limitations.

71
docs/model_changes.md Normal file

@@ -0,0 +1,71 @@
# Necessary Model Changes
This section describes the necessary model changes that have to be made for the new build environment to work. Most of these changes will be automated by model conversion scripts.
## Replacement of the VCC Codeswitch blocks
1. Add a constant of the type you defined above (`IFDEF_MACRO_EXT_DEFINE`) and a TargetLink pre-processor IF block
![Pre-processor IF example](./images/add-preproc-block.jpg)
2. The pre-processor IF block is found here:
![Vcc_Lib](./images/TL-preproc-lib.jpg)
3. Add a Simulink action port to the subsystem
![Action port](./images/change_to_actionport.jpg)
4. Connect the TargetLink pre-processor IF block to the action port of the subsystem
![Action port connection](./images/connect_to_subsystem.jpg)
## Use of the VCC Codeswitch blocks as constants
TargetLink does not accept using the same pre-processor define both for pre-processor ifs and as a constant. The workaround is to create a new define, which is set to the original define in the `LocalDefs.h` file created by the code-generation scripts.
![Pre-processor constant example](./images/DuplicatedDefines.PNG)
`VcAesAir_OPortMvd_LocalDefs.h`:
```c
#include "VcCodeSwDefines.h"
#define Vc_Aes_B_CodeGenDsl_CN (Vc_Aes_B_CodeGenDsl)
```
## Modification of ts datastore blocks
The datastore block for sampling time, *ts*, must be modified so that different time constants can be used when building the superset c-code for different projects.
Change the class of the datastore block to *"EXTERN_GLOBAL_ALIAS"* and fill in the name of the constant defining the sampling time in the *Name* field.
The build scripts generate a header file which defines the correct time constant value.
![ts datastore block](./images/ts_change.PNG)
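A minimal sketch of how such a header could be generated; the constant name and value are hypothetical, and only the header file name `VcUnitTsDefines.h` is taken from the base configuration (see [project_config](project_config.md)).
```python
def write_ts_header(path, ts_values):
    """Illustrative only: write #defines for the sampling-time constants.

    ts_values maps the constant names entered in the datastore blocks'
    Name fields to the project's sampling times in seconds.
    """
    with open(path, 'w', encoding='utf-8') as header:
        header.write('#ifndef VCUNITTSDEFINES_H\n#define VCUNITTSDEFINES_H\n\n')
        for name, value in sorted(ts_values.items()):
            header.write(f'#define {name} ({value}F)\n')
        header.write('\n#endif\n')


# Hypothetical constant name for a 10 ms raster.
write_ts_header('VcUnitTsDefines.h', {'ts_VcAesTx': 0.01})
```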
## Declaration of output variables in unit
The `VcIntVar.c` file is removed; the unit producing a signal should declare it. This avoids interface integration issues and saves build time. **NB: Model change required**
- The TL class for outports needs to be changed to `CVC_DISP` in all models.
- A script is provided which modifies all outports to `CVC_DISP`.
## Addition of new dependability TL-classes
Dependability classes are added, and postprocessing of dependability code is removed. Scripts are provided which automatically change the classes of a model.
## Enabled subsystem outports
Subsystems with any subsystem parent, which have a TL-preprocessor switch, cannot have a "reset-port" state. See below for workarounds.
- The scripts converting the models will detect if the reset states option is used, and will not use a preprocessor-if for that subsystem. This will lead to increased RAM/ROM usage. It is recommended to refactor the model (split into several models), if this is a possibility.
- All outports with the reset-when-inactive option enabled will have that option disabled by the conversion script. This will be shown in the log file. The designer needs to consider whether this change is acceptable; if not, the following design workaround is recommended.
![Reset port workaround](./images/reset_port_workaround.PNG)
## Other workarounds
The workaround for the optimization problem is to remove the MOVEABLE optimization when the new build system is introduced. This will bring us back to the same behavior as we had when running TL 2.1.5.
This will make it easier to understand how the target code works, as it will behave as in Simulink. Furthermore, it will be easier to predict the maximum CPU load. The downside is that it will cost 0.1% extra CPU load, as of the last measurement.

203
docs/powertrain_build.md Normal file

@@ -0,0 +1,203 @@
# powertrain_build
---------------------
[TOC]
## Requirements
powertrain_build is supported on Python versions 3.6 through 3.10.
## Basic Usage
Code generation is done from git bash and the generated code is committed.
The TargetLink GUI is currently lean and all projects are initialized in Matlab
through running:
```bash
Projects/Init_PyBuild.m
```
This works the same way for Embedded coder projects, for example:
```bash
actuation-arbitration-manager-simulink-logic/Projects/Init_PyBuild.m
```
**NOTE:** Examples in upcoming chapters use Python version 3.6.
### Code generate a model
In git bash:
```bash
py -3.6 -m pytools.pybuild_wrapper --codegen --models Models/ICEAES/VcAesTx/VcAesTx.mdl
```
#### Set Matlab 2017 as an Environment Variable
Add a new user variable (click 'New...'):
```bash
MatInstl2017 "C:\Program Files\MATLAB\R2017b\bin\matlab.exe"
```
if you want to run Python from the command line, or
```bash
MatInstl2017 "/c/Program\ Files/MATLAB/R2017b/bin/matlab.exe"
```
if you want to generate code from bash.
After adding a new environment variable, you need to restart the git
bash/command window.
See the picture below for details.
![MatlabEnvVar](MatlabEnvVar.JPG)
#### Code generate with Embedded Coder and Matlab 2019b
```bash
py -3.6 -m pytools.pybuild_wrapper --codegen --matlab-bin "C:\MATLAB_2019_b\bin\matlab.exe" --models Models/STEER/VcSteer/VcSteer.mdl
```
### Update a TargetLink model to pybuild
In git bash:
```bash
py -3.6 -m pytools.pybuild_wrapper --update --models Models/ICEAES/VcAesTx/VcAesTx.mdl
```
### Update and Code generate a model
In git bash:
```bash
py -3.6 -m pytools.pybuild_wrapper --update --codegen --models Models/ICEAES/VcAesTx/VcAesTx.mdl
```
### Code generation and build
To code generate and build a complete project *ABC_123*:
```bash
py -3.6 -m pytools.pybuild_wrapper --codegen --build ABC_123
```
### Build a project
You can use the wrapper:
```bash
py -3.6 -m pytools.pybuild_wrapper --build ABC_123
```
### Detailed build options
```bash
py -3.6 -m pytools.pybuild_wrapper --help
```
The powertrain_build wrapper has many options; we'll explain them in detail here:
`--update` This option uses Matlab scripts to migrate models from the old build
system to powertrain_build. Once powertrain_build is officially in use, all source code should
already have been converted.
`--codegen` Runs TargetLink to generate C source code from the Matlab models.
This should be done before changes are submitted for review. If the generated
code is missing, the build system will reject your changes.
`--build` Reads configuration files and sets up preprocessor flags.
`--models=Models/SSP/MODEL/MODEL.mdl` Allows selective building and code
generation, useful for testing individual changes. Multiple model paths can
be entered, separated by commas.
`--dry-run` Dry run mode. No actual changes are made; can be used to test the
configuration.
`--matlab-bin MATLAB_BIN` Path to the Matlab binary to use. Defaults to
`C:\MATLABR2017b_x64\bin\matlab.exe`. If you have Matlab installed somewhere
else, you can use:
```bash
py -3.6 -m pytools.pybuild_wrapper --codegen --models Models/ICEAES/VcAesTx/VcAesTx.mdl --matlab-bin="C:\Program Files\MATLAB\R2017b\bin\matlab.exe"
```
NOTE: Building a project (--build) does not work if a model requires a
preprocessor directive that does not exist in any configuration file.
### What to commit
When using powertrain_build, we need to commit:
- The model file and, if needed, the m-file
- All updated files in `Models/SSPXXX/VcXxxYyy/pybuild_cfg`
- Files like `config_VcXxxYyy.json`
- All updated files in `Models/SSPXXX/VcXxxYyy/pybuild_src`
- Files like `tl_defines_XxxYyy.h`, `VcXxxYyy.h`, `VcXxxYyy.c`, `VcXxxYyy.a2l`,
`VcXxxYyy_OPortMvd_LocalDefs.h`
- Files in `tests` if needed
- Configuration files, e.g. `ConfigDocuments/SPM_Codeswitch_Setup.csv`,
see [pre processor directives](./PreProcessorDirectives.md).
```txt
gitrepo/
├── ConfigDocuments/
│ ├── .
│ ├── .
│ ├── .
│ ├── SPM_Codeswitch_Setup.csv
│ ├── SPM_Codeswitch_Setup_ICE.csv
│ ├── SPM_Codeswitch_Setup_PVC.csv
│ ├── .
│ ├── .
│ ├── .
├── Models/
│ └── PVCTM/
│ └── VcPvcDemo/
│ ├── pybuild_cfg/
│ │ ├── config_VcPvcDemo.json
│ │ └── VcPvcDemo.yaml
│ ├── pybuild_src/
│ │ ├── tl_defines_PvcDemo.h
│ │ ├── VcPvcDemo.a2l
│ │ ├── VcPvcDemo.c
│ │ ├── VcPvcDemo.h
│ │ └── VcPvcDemo_OPortMvd_LocalDefs.h
│ ├── tests/
│ │ ├── _cumulated_code_coverage_
│ │ │ ├── ctcpost_merge_options.txt
│ │ │ └── experiment.spec
│ │ ├── VcPvcDemo_UnitTests
│ │ │ ├── 00_test_setup
│ │ │ │ ├── dataset.DCM
│ │ │ │ ├── sut_config.txt
│ │ │ │ ├── sut_interface.py
│ │ │ │ └── twTest.sil
│ │ │ ├── 01_stimulus
│ │ │ │ └── U_VcPvcDemo_ExplorativeStimulus.py
│ │ │ └── 02_watchers
│ │ │ └── U_VcPvcDemo_watcher.py
│ │ ├── ctc_env.bat
│ │ └── project.testweaver
│ ├── VcPvcDemo.mdl
│ ├── VcPvcDemo_par
```
### Summary of signals in powertrain_build
[Signal Summary](./signaling_in_powertrain_build.md)
### Signal Interface Tool
[Signal Interface Tool](./signal_interface_tool.md)
### Signal Interface Inconsistency
[Signal inconsistency in check](./signal_inconsistency_check.md)

162
docs/powertrain_build_architecture.md Normal file

@@ -0,0 +1,162 @@
# powertrain_build General Code Introduction
[TOC]
<!--:powertrain_build:-->
## Why is powertrain_build used?
The software is built using TargetLink, which generates c-code from Simulink
models. Some models are used in multiple projects while others serve only one project.
To make things easier to understand, let's say we want to produce software
for ABC_123 (a fake car model) which handles the diesel engine.
## Software generation before powertrain_build
1) Gather all models used in ABC_123.
2) Some models are used in several projects; therefore, we need to process those
models and remove all non-ABC_123 parts. To save memory and avoid using the
wrong software, Matlab scripts are responsible for cutting out the unused parts of
the model.
3) Generate c-code using TargetLink.
4) Compile.
![powertrain_buildLegacy](./images/powertrain_build_legacy.png)
This process had to be repeated every time we wanted a software build, and with up to 11
projects to maintain, it could take a while.
## Software generation after powertrain_build
powertrain_build uses preprocessor blocks in the models, which means that the generated
code has `#ifdef SOME_VAR` inside. This enables switching between projects using
variables, which means we can generate the code for each model once and store it in `./Models/`.
1) Gather all c-/h-files used in ABC_123.
2) Set variables according to the codeswitch documents.
3) Compile.
![powertrain_build](./images/powertrain_build.png)
## Summary
Before powertrain_build, we had to generate code for each model (after modifying it
through Matlab scripts) for every project and then compile. Now the latest
version of the code is already in the repo; we just need to gather files, set
variables and compile.
## How to start with powertrain_build
To get familiar with powertrain_build, build.py is a good starting point since it ties together
all modules. Four important modules can also help you understand powertrain_build better:
build_proj_cfg, unit_cfg, feature_cfg and signal_interface.
## build_proj_cfg
This module is used to read `./Projects/<project_name>/Config/ProjectCfg.json`
and `./ConfigDocuments/BaseConfig.json`, which provide the locations of the
configuration files. The location of BaseConfig is also stored in ProjectCfg.
build_proj_cfg also provides methods for gathering information for other modules.
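A hedged sketch of that behavior: reading the project config, then overlaying it on the base config it points to. The shallow merge and path handling are simplifications, not the real build_proj_cfg API (see also [project_config](./project_config.md), which states that the project file can override keys in the base file).
```python
import json
import os


def load_project_config(project_cfg_path):
    """Illustrative: load ProjectCfg.json merged on top of its BaseConfig."""
    with open(project_cfg_path, encoding='utf-8') as fh:
        project_cfg = json.load(fh)
    # BaseConfig is given relative to the project config file.
    base_path = os.path.join(os.path.dirname(project_cfg_path),
                             project_cfg['BaseConfig'])
    with open(base_path, encoding='utf-8') as fh:
        config = json.load(fh)
    # Project-specific keys override base keys (shallow merge for brevity).
    config.update(project_cfg)
    return config
```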
## feature_cfg
The feature_cfg module is also called the codeswitch module. It reads from
`./ConfigDocuments/SPM_Codeswitch_Setup*.csv` and provides methods for
retrieving the currently used configurations of a unit. The first
row of SPM_Codeswitch_Setup*.csv lists all projects and the first column
lists codeswitch names. The values under each project state whether the
corresponding codeswitch is active in that project.
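Given that layout (projects along the first row, codeswitch names down the first column), a reader could look like the following minimal sketch. The cell layout is taken from the description above, but this is not the real feature_cfg implementation.
```python
import csv


def read_codeswitches(csv_path, project):
    """Illustrative: map each codeswitch name to its value for one project."""
    with open(csv_path, newline='', encoding='utf-8') as fh:
        rows = list(csv.reader(fh))
    project_column = rows[0].index(project)  # first row lists all projects
    # The first column of every following row holds the codeswitch name.
    return {row[0]: row[project_column] for row in rows[1:] if row}
```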
## unit_cfg
The model list of a project is in `./Projects/<project_name>/local_cof/raster.json`.
unit_cfg then reads the unit definitions for the models in the model list
(`/Models/<model_name>/pybuild_cfg/<model_name>.json`). This module also
provides methods to retrieve any unit definition, such as the inports and
outports of a unit, and to list all existing units.
## signal_interface
This module gets the supplier interface of the project from
`./Projects/<project_name>/Config/ActiveInterfaces/`. It provides methods
for checking signal-interface consistency between the supplier and the VCC SW.
It also reads inport/outport information from all units in the raster and
checks internal inport/outport consistency between the VCC SW units.
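Conceptually, the internal check reduces to set operations over produced and consumed signal names. A sketch under assumed data structures, not the actual signal_interface code:
```python
def check_internal_consistency(units):
    """Illustrative: units maps unit name -> {'inports': [...], 'outports': [...]}."""
    produced = {sig for unit in units.values() for sig in unit['outports']}
    consumed = {sig for unit in units.values() for sig in unit['inports']}
    missing = consumed - produced  # consumed by a unit but never produced
    unused = produced - consumed   # produced but never consumed
    return missing, unused
```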
## Why powertrain_build, what are the advantages?
A Continuous Integration (CI) Build System needs to test all configurations
where a TargetLink model is used.
- It's faster
- There is more parallelization of jobs in the CI system, so it's faster.
- Code generation is moved to the developer's PC
- Code generation is done once for all projects using pre-processor directives.
- C code review is now possible in Gerrit!
- The total build time of a software release is 15 minutes for all 9 projects
- powertrain_build adds signal consistency checks.
- Unit tests of the build system are introduced
- So it's quality assured
- It's easy to learn for new employees.
- powertrain_build creates new variable classes with unique code decorations
- This means that post-processing C code is not necessary.
- This means that ASIL-classed variables get declared at the source.
- This means that we can optimize memory at compilation, and save memory in GEP3 by short-addressing different variable classes.
- This means we have control of all build steps
- This enables us to use the same models with more than 2 different suppliers, for instance SPA2's Core System Platform.
- powertrain_build thus fixes incorrect handling of NVM variables.
### Legacy, how it used to be
The old system was scripted in Matlab.
The code was not refactored, but built
upon for many years. Hence, the complexity of the code was high.
As seen in the figure below, the models used code switches to generate different code for different projects.
The legacy build process is illustrated in the figure below:
![powertrain_buildLEgacy](./images/powertrain_build_legacy.png)
### What is it
- Made in Python.
- Instead of Code Switches in the TargetLink models, there are C Pre Processor
directives. In this way, we will have super set C code representing the super
set Simulink models.
- Unit tests of build scripts as well as html documentation.
- Signal consistency checks through creating JSON files of the code
![powertrain_build](./images/powertrain_build.png)
### How powertrain_build is used in different targets
#### vcu/hp
![powertrain_build-hp](./images/powertrain_build-hp-white.jpg)
#### vcu/hi
![powertrain_build-hi](./images/powertrain_build-hi-white.jpg)
#### ECM
![powertrain_build-ECM](./images/powertrain_build-ECM-white.jpg)
### TargetLink Limitations
- dSPACE delivers removal of the limitation in TargetLink release 2019B (first said
to be delivered in 2018B). These options give code generation errors:
- Enabled subsystems which reset their states when enabled.
- Outports which are reset when disabling a subsystem, when used in subsystems under a pre-processor block
- The solution to the TargetLink limitation is refactoring of the models
### powertrain_build dependencies
A Python module dependency visualization from pydeps
([https://github.com/thebjorn/pydeps](https://github.com/thebjorn/pydeps)) looks like this:
![powertrain_buildDependencies](./images/powertrain_buildDependencies.svg)
#### How powertrain_build generates a SPA2 service
![generate service](./images/powertrain_build_gen_service_black.jpg)

76
docs/powertrain_build_deployment.md Normal file

@@ -0,0 +1,76 @@
# powertrain_build Deployment
[TOC]
<!--:powertrain_build:-->
## Repositories
### powertrain_build Repository
The powertrain_build git repository can be found
[here](https://opendev.org/volvocars/powertrain-build).
The powertrain_build LTS artifactory repository can be found
[here (PLACEHOLDER)](https://artifactory-link).
## Deployment
After changes (important commits, JIRA stories etc.) have been made to powertrain_build,
a new version must be deployed.
## Versioning
powertrain_build uses semantic versioning, _MAJOR.MINOR.PATCH_. The version
is changed by setting an annotated tag with the version (only) at the commit
that should be the released commit.
### Development versioning
If distribution of a development version is needed, set a tag
"dev/\<explanatory-name\>". Scripts will update the patch part and add .devN,
where N is the number of commits since last proper sematic versioning tag.
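The actual .devN versions come from pbr (see [Additional notes](#additional-notes)); purely as an illustration of the rule, such a version could be derived from `git describe` output like this:
```python
import re
import subprocess


def dev_version():
    """Illustrative only -- pbr implements the real versioning logic."""
    # Example 'git describe' output: '1.2.3-5-gabc1234',
    # i.e. 5 commits since the semantic version tag 1.2.3.
    described = subprocess.run(
        ['git', 'describe', '--tags', '--match', '[0-9]*'],
        capture_output=True, text=True, check=True).stdout.strip()
    match = re.fullmatch(r'(\d+)\.(\d+)\.(\d+)-(\d+)-g[0-9a-f]+', described)
    if match is None:
        return described  # exactly on a release tag
    major, minor, patch, commits = match.groups()
    # Update the patch part and append .devN, as described above.
    return f'{major}.{minor}.{int(patch) + 1}.dev{commits}'
```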
## Instructions
1. Upload the change to Gerrit, have it reviewed, verified and merged.
1. Retrieve the merged commit from Gerrit and ensure it is checked out.
1. Create an annotated tag on the commit.\
`git tag -a -m'<annotation text>' <version>`
1. For a development version:\
`git tag -a -m'<annotation text>' dev/<explanatory>`
1. Push the tag to Gerrit:\
`git push origin <tag-name>`
1. Steps after merge can also be done by setting a tag in Gerrit GUI
1. Zuul will now:
1. Run verification steps.
1. Check that the version complies with PEP440 and semantic versioning.
1. Check that there is no package with this version on artifactory already.
1. Upload the package to artifactory.
1. Modify the _requirements.txt_ file in any repo that requires these
updates.
## Additional notes
If powertrain_build becomes dependent on a new package, add the dependency to
_\<package\>/requirements.txt_, or _\<package\>/test-requirements.txt_ if the
dependency is needed only for testing the package, not for using it.
powertrain_build uses [pbr](https://docs.openstack.org/pbr/latest/) to create
package metadata. Please read and use the features of pbr when updating resource
files or non-python scripts.
## Manual deployment
1. The python package _setuptools_ is required to deploy powertrain_build.
1. Follow the guidelines on the
[LTS artifactory (PLACEHOLDER)](https://artifactory-link)
page about deploying python packages.
1. [LTS artifactory (PLACEHOLDER)](https://artifactory-link) -> Set Me Up
1. `py -3.6 setup.py sdist`, to build the package.
1. `py -3.6 setup.py sdist upload -r local`, to deploy.
1. Deployment in this way may overwrite the package on artifactory
if the user has enough privilege. Be careful not to upload the
same version twice without intending to, as artifactory has no
package upload history and an overwritten package is lost.
1. The bullet about _requirements.txt_ in [Instructions](#instructions) is valid here too.
1. The same [additional notes](#additional-notes) apply to manual deployment.

View File

@ -0,0 +1,32 @@
# Introduction of powertrain_build for new employee
[TOC]
<!--:powertrain_build:-->
## General powertrain_build Information
[powertrain_build introduction](./powertrain_build.md) introduces installation and basic usage of powertrain_build.
## powertrain_build Code Base
The basic code introduction is placed in [powertrain_build General Code Introduction](./powertrain_build_architecture.md).
## powertrain_build Deployment
Information on how to deploy powertrain_build can be found [here](./powertrain_build_deployment.md).
## powertrain_build Development
If you want to develop powertrain_build, you can run it directly from the Git repositories. You probably need a separate virtual environment, as any installed release versions of powertrain_build would interfere:
```shell
python3 -m venv powertrain_build_venv
source ./powertrain_build_venv/bin/activate
```
Once activated, you can execute it:
```shell
PYTHONPATH=<path_to>/pt/pytools:<path_to>/pt/pybuild python -m pytools.pybuild_wrapper build-specific --project-config Projects/CSP/PvcDepDemo/ProjectCfg.json --core-dummy
```

207
docs/project_config.md Normal file
View File

@ -0,0 +1,207 @@
# Project configuration files
This page describes the configuration files needed by the build system. The "entry point" is the project configuration file, given as a command line argument to build.py. This file can contain references to other configuration files and to the software [unit definition files](unit_config.md).
## Project config
The project config file contains project specific configuration settings and references to other configuration files. It is a json file that should be located in the project root. The project configuration file can override keys/values in the base configuration file.
Example ProjectCfg.json:
```json
{
    "ConfigFileVersion": "0.2.1",
    "BaseConfig" : "../../ConfigDocuments/BaseConfig.json",
    "UnitCfgs": "conf.local/rasters.json",
    "ProjectInfo" : {
        "projConfig" : "VED4_GENIII",
        "a2LFileName": "VEA_VED4_SPA.a2l",
        "ecuSupplier" : "Denso",
        "ecuType" : "G2",
        "unitCfgDeliveryDir": "./output/UnitCfgs"
    }
}
```
## Base config
The base config file shares the same structure as the project config file and can be used for non project-specific configuration settings and default values.
Example BaseConfig.json:
```json
{
    "BaseConfigFileVersion": "0.2.1",
    "ProjectInfo" : {
        "didDefFile": "DIDIds_FullRange",
        "srcCodeDstDir": "./output/SourceCode",
        "reportDstDir": "./output/Reports",
        "logDstDir": "./output/logs",
        "configDir": "../../ConfigDocuments",
        "interfaceCfgDir": "./Config/ActiveInterfaces",
        "prjUnitSrcDir": "../../Models/*/Vc*/pybuild_src",
        "prjUnitCfgDir": "../../Models/*/Vc*/pybuild_cfg",
        "prjUnitMdlDir": "../../Models/*/Vc*",
        "prjLocalDefs": "*_LocalDefs.h",
        "prjCodeswitches": "SPM_Codeswitch_Setup*.csv",
        "coreDummyFileName" : "VcCoreDummy",
        "featureHeaderName": "VcCodeSwDefines.h",
        "tsHeaderName": "VcUnitTsDefines.h",
        "useGlobalConst" : "VcConst"
    },
    "NvmConfig": {
        "fileName" : "vcc_nvm_struct",
        "baseNvmStructs" : "conf.local/nvm_structs.json"
    }
}
```
## Units config
The units config file contains information about included software units and scheduling rasters. The software units are executed in the order they are defined within each time raster definition.
```json
{
    "UnitsConfigFileVersion": "0.2.1",
    "Projects": {
        "GEP3_HEP7": {
            "Rasters": {
                "2ms": [],
                "10ms": [
                    "VcScBCoord",
                    "VcScCVehMtn",
                    "VcScAAccPed"
                ],
                "100ms": [
                    "VcTmEdMon",
                    "VcAcCtrl"
                ]
            },
            "SampleTimes": {
                "2ms": "0.002",
                "10ms": "0.010",
                "100ms": "0.100"
            }
        }
    }
}
```
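To make the scheduling semantics concrete, the sketch below reads a units config like the example above and prints the resulting execution order. It is a minimal illustration, not part of powertrain_build; `print_schedule` and the file name `rasters.json` are assumptions.
```python
import json

def print_schedule(units_config_path, project):
    """Print the execution order implied by a units config (illustrative only)."""
    with open(units_config_path, encoding="utf-8") as file_handle:
        config = json.load(file_handle)
    project_config = config["Projects"][project]
    for raster, units in project_config["Rasters"].items():
        period = project_config["SampleTimes"][raster]
        # Units execute in list order within each raster.
        for slot, unit in enumerate(units):
            print(f"{raster} ({period} s), slot {slot}: {unit}")

print_schedule("rasters.json", "GEP3_HEP7")
```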
## Configuration settings
### File versioning
The build system compares the version information in the configuration files with the application version to make sure a consistent configuration is used.
- "ConfigFileVersion": "0.2.1"
- Project configuration file version.
- "BaseConfigFileVersion": "0.2.1"
- Base configuration file version.
- "UnitsConfigFileVersion": "0.2.1"
- Units configuration file version.
## ProjectInfo
### projConfig
The name of the project. This name is used in all the configuration files to identify the project.
### ecuSupplier
ECU supplier name. This is used to choose supplier dependent code generation (possibly in combination with the ECU type), e.g. the core dummy file generation.
### ecuType
ECU type name. This is used to choose supplier dependent code generation (possibly in combination with the ECU supplier), e.g. the core dummy file generation.
### unitCfgDeliveryDir
If this key is defined, the build system will deliver all the unit configuration files into the directory specified.
### didDefFile
The name of the file defining all DIDs of the project.
### srcCodeDstDir
The source code destination directory.
### logDstDir
The log files destination directory.
### configDir
The path to a folder containing all the configuration files of the project. Used to find codeswitches, core-id and DID definition files.
### interfaceCfgDir
The path to a folder with csv-files defining the supplier interface configuration. The files shall be CSV files using ';' as the delimiter.
The following files shall exist in the folder: CAN-Input.csv, CAN-Output.csv, EMS-Input.csv, EMS-Output.csv, LIN-Input.csv, LIN-Output.csv, Private CAN-Input.csv and Private CAN-Output.csv.
### prjUnitSrcDir
A file path where the superset of the source code files is found. This path can/shall use wildcards. E.g. "./Models/SSP*/Beta/Vc*/src" will match all folders under the Models folder which start with SSP, then all folders in Beta starting with Vc, which have a src folder. The build system only includes files from software units referenced by the units config file.
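As an illustration of how such a wildcard path expands, the following minimal sketch uses Python's standard glob module with the example pattern from this section (it is not a powertrain_build API):
```python
import glob

# Expand the wildcard pattern to all matching unit source directories.
# The build system then keeps only the units referenced by the units config.
for src_dir in sorted(glob.glob("./Models/SSP*/Beta/Vc*/src")):
    print(src_dir)
```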
### prjUnitCfgDir
A file path to the unit definition files. Each unit definition file is a json file containing all the relevant metadata for the function, e.g. input parameters, output parameters, calibration labels, local measurement variables, etc. The unit definition file must match the filename pattern "config_*.json".
### coreDummyFileName
Defines the file names of the dummy Core Identifier c code, which is generated by the build environment.
### useGlobalConst
If declared, this module is included in the build. If the string is empty, no module is included.
### NvmConfig
This key configures the NVM area sizes, and the filename of the c-files generated to define the NVM. The NVM is defined by six structs. The reason for using c-structs is to guarantee the order in which the variables are declared in memory. The c-standard does not specify in which order global variables are allocated in memory. However, the standard says that struct members shall be placed in memory in the order they are declared.
```json
{
    "NvmConfig": {
        "fileName" : "vcc_nvm_struct",
        "baseNvmStructs" : "conf.local/nvm_structs.json"
    }
}
```
### baseNvmStructs
This json file holds the order of the NVM signals in the structs; it also holds the area sizes and the allowed signals. We want to preserve the order of the signals, so signals should never be removed from this list. If a signal is not used anymore it should not be removed; instead it should be renamed to a placeholder of the form 'Position_*', e.g. Position_16Bit_195. The signal position will then be filled in by the build script with a signal found in a model that is not found in the json file.
```json
{
    "signals": [
        {
            "x_size": 1,
            "type": "Float32",
            "name": "sVcDtcIsc_Tq_NvmAdpnNC",
            "y_size": 1
        }
    ],
    "name": "NVM_LIST_32_PER",
    "default_datatype": "UInt32",
    "instanceName": "nvm_list_32_per",
    "includeStart": "MemMap_SDA_START.h",
    "includeStop": "MemMap_SDA_STOP.h",
    "size": 190,
    "persistent": true,
    "allowed_datatypes": ["Float32", "UInt32", "Int32"]
}
```
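As a sketch of the placeholder convention described above, a retired 16-bit signal entry could look like the following (the data type and the exact placeholder name are assumptions based on the Position_16Bit_195 example):
```json
{
    "x_size": 1,
    "type": "UInt16",
    "name": "Position_16Bit_195",
    "y_size": 1
}
```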
### fileName
This key defines the name of the c-files generated, which defines the NVM areas.
### SampleTimes
The key "SampleTimes" defines the names of the available time rasters, and the value defines the scheduling time in seconds.
### Rasters
The key "Rasters", defines which units that are scheduled in that raster, and the order of the list defines the order the units are executed within the raster.

View File

@ -0,0 +1,83 @@
# Signal inconsistency check (non-gating)
---------------------------------------------
[TOC]
## Introduction
Signal inconsistency checks will be performed on all changes containing .mdl
files. Reports are created and uploaded to Artifactory, one report for each
project, showing information about the signals relevant for that specific project.
>![project_index](./images/signal_inconsistency_project_index.png)
Inconsistencies that are gating (listed below) will also be displayed
in the Jenkins log.
- Unproduced inports
- Produced but not used outports
- Signals added to \<model>\_Unconsumed_Sig.csv
>![jenkins_log_example](./images/signal_inconsistency_log_example.png)
## Checks
The following checks will be performed on the models in a Gerrit change:
1. Unproduced inports
    - Signals that are configured to be consumed by a model but are not produced
      internally (models) and/or externally (interface list).
1. Produced but not used outports
    - Signals that are configured to be produced by a model but do not have a
      consumer internally (models) and/or externally (interface list).
1. Signals added to \<model>\_Unconsumed_Sig.csv
    - Signals that are defined to **not have a consumer** but **are consumed**
      internally (models) and/or externally (interface list).
## \<model\>\_Unconsumed_Sig.csv
For "Produced but not used inports" that are intended for consumption after
point of merging to master there exist possibility to disable check.
1. Create pybuild_cfg/\<model\>\_Unconsumed_Sig.csv.
1. Add Header "Unconsumed_Signals" and populate with produced signals
(outports only) you like to omit.
>![unconsumed_csv_example](./images/signal_inconsistency_unconsumed_csv_example.png)
## Running checks and creating reports before pushing to gerrit (optional)
Please install PyTools before running the signal inconsistency check locally. See
[powertrain_build and PyTools instruction](./powertrain_build.md)
```bash
py -3.6 -m pytools.signal_inconsistency_check -m VcModel1 -r
```
Multiple models:
```bash
py -3.6 -m pytools.signal_inconsistency_check -m VcModel1 VcModel2 VcModel3 -r
```
Running without report creation:
```bash
py -3.6 -m pytools.signal_inconsistency_check -m VcModel1 VcModel2 VcModel3
```
## Limitations
- The project specific report, e.g. SigCheckAll_\<PROJECT\>.html, will only show
information about signals configured to be used in project "PROJECT".
- The reports do not display information about check 3. This is only
displayed in the build logs.
- The Jenkins/script log does not display information about non-gating
inconsistencies, e.g.
  - "Outports that are generated more than once in the listed
    configuration(s)"
  - "Inports that have different variable definitions than the producing
    outport"
  - "In-/Out-ports that have different variable definitions than in the
    interface definition file."

View File

@ -0,0 +1,37 @@
# Signal Interface Tool
[TOC]
## Introduction
Please install PyTools to enable the commands below, see [powertrain_build and PyTools instruction](./powertrain_build.md).
powertrain_build contains scripts for both signal consistency checks and signal interface information.
Typing the following in git bash lists the available commands:
```bash
py -3.6 -m pytools.pybuild_wrapper --help
```
## Signal Interface report
The Signal Interface tool generates HTML reports. The following example shows how to generate the report:
```bash
py -3.6 -m pytools.pybuild_wrapper --build ABC_123 --interface
```
A project specific report will be available here: `Projects\ABC_123\output\Reports\SigIf.html`.
This report only displays the signals that exist in that project.
## Signal consistency report
The signal consistency report displays, per model:
* **Missing signals**: inports whose signals are not generated in the listed configuration(s).
* **Unused signals**: outports that are generated, but not used in the listed configuration(s).
* **Multiple defined signals**: outports that are generated more than once in the listed configuration(s).
* **Internal signal inconsistencies**: inports that have different variable definitions than the producing outport.
* **External signal inconsistencies**: in-/out-ports that have different variable definitions than in the interface definition file.
After running the generation command above (e.g. for ABC_123), the signal consistency reports are available in `Projects\ABC_123\output\Reports\`.

View File

@ -0,0 +1,102 @@
# Summary of signals in powertrain_build
-----------------------------------
[TOC]
## Where are the variables defined in the code?
### Outsignals from models
```
* <model_name>.c
* VcExtVar*.c - if it is listed in the interface
```
### Insignals to models
```
* <model_name>.c - if another model is producing it
* VcExtVar*.c - if it is in the interface
* VcDummy_spm.c - if neither in interface nor models
```
### Outsignals in the interface list
```
* <model_name>.c - if a model is producing it
* VcDummy.c - if no model is producing it
* VcExtVar*.c - outsignals in the interface list are all defined in this file
```
### Insignals in the interface list
```
* VcExtVar*.c - this goes for both used and unused signals for ecm projects
```
### Signal flow within a project
Signals within a project can be divided into 4 types:
```
* external_outsignals - outsignals in the supplier interface list
* external_insignals - insignals in the supplier interface list
* internal_outsignals - outsignals from models but not in the supplier interface list
* internal_insignals - insignals to the models but not in the supplier interface list
```
As shown in the picture below, if a model takes internal\_outsignals from
another model, the first model is regarded as the consumer and the other
model as the supplier.
![powertrain_buildConsumer&Supplier](supplier-consumer_Model.PNG)
If the consumer model expects more insignals than the supplier model or
the supplier interface can provide, these insignals are marked as
missing signals.
![powertrain_buildMissingSignals](MissingSignals.PNG)
If the supplier model or the interface provides more signals than the
consumers expect, the extra signals are marked as unused signals.
![powertrain_buildUnusedSignals](UnusedSignals.PNG)
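The missing/unused classification can be sketched with simple set logic. This is a conceptual illustration, not powertrain_build code; all signal names below are made up:
```python
# Hypothetical signal name sets; in powertrain_build these come from the
# unit configs and the supplier interface list.
model_outports = {"sVcA_X", "sVcB_Y", "sVcE_V"}
model_inports = {"sVcB_Y", "sVcC_Z", "sVcD_W"}
interface_signals = {"sVcA_X", "sVcC_Z"}

external_outsignals = model_outports & interface_signals
internal_outsignals = model_outports - interface_signals
external_insignals = model_inports & interface_signals
internal_insignals = model_inports - interface_signals

# Insignals that neither a model nor the interface produces are missing.
missing_signals = model_inports - model_outports - interface_signals
# Outsignals that neither a model nor the interface consumes are unused.
unused_signals = model_outports - model_inports - interface_signals

print(missing_signals)  # {'sVcD_W'}
print(unused_signals)   # {'sVcE_V'}
```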
The picture below indicates signal flow within a project.
The external interface list defines the external in-ports and
external out-ports.
The first flow indicates the normal
signal flow within a project: external\_insignals defined in
VcExtVar.c enter model1 and the internal\_outsignals match the
internal\_insignals of the next model; the external\_outsignals then
come from model3 and are defined in model3.c.
The second signal flow indicates the missing signal situation.
When internal\_insignals are missing, a VcDummy\_spm.c file is generated
to define the missing outport variables in the out-port interface of
the previous model.
The last signal flow explains two special cases: (1) unused signals;
(2) external\_outsignals not produced by models. Unused signals will be
ignored by models, and external\_outsignals that are defined in the signal
interface but not produced by models are defined in VcDummy.c instead
of model7.c.
![powertrain_buildSignalFlow](SignalFlow.PNG)
## Compilation Process
```
Compile -> Link -> fail -> Update VcDummy and VcDummy_spm (remove multiple defs, add missing defs)
Compile -> Link -> fail -> Update VcDummy and VcDummy_spm
Compile -> Link -> fail -> Update VcDummy and VcDummy_spm
Compile -> succeed
```
Compiling and linking the SPM on its own works on the first try in gcc;
multiple definitions or missing definitions are not allowed.
The iterations are needed to fix the inconsistencies between the SPM and the EMS.
If we knew what was defined in the EMS, we could generate
VcDummy and VcDummy\_spm on the first try every time.
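A conceptual sketch of this iteration loop is shown below. `compile_and_link` and `regenerate_dummies` are hypothetical stand-ins for the real build steps, which compile/link the code and rewrite VcDummy.c/VcDummy_spm.c respectively:
```python
MAX_ATTEMPTS = 4

def build_until_linked(compile_and_link, regenerate_dummies):
    """Iterate compile/link and dummy regeneration until the link succeeds."""
    for attempt in range(1, MAX_ATTEMPTS + 1):
        link_ok, multiple_defs, missing_defs = compile_and_link()
        if link_ok:
            return attempt
        # Remove multiple definitions and add missing definitions to the
        # VcDummy/VcDummy_spm files, then try again.
        regenerate_dummies(multiple_defs, missing_defs)
    raise RuntimeError(f"Link still failing after {MAX_ATTEMPTS} attempts")
```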

246
docs/todo_list.md Normal file
View File

@ -0,0 +1,246 @@
# ToDo items
This is a temporary list of todo items for the PCC build system pilot.
These should be moved to JIRA at some point in time.
## Target Link
* How to handle Stateflow function classes, to avoid static data not
being allocated to a defined memory area? Right now they are set to STATIC_FNC,
but probably they should be set to default, and the Templates should be
updated in the TL data dictionary
* Reset of subsystems and ports
* State flow objects.
## Matlab
* Change the parsing for configurations, so that the workaround for
enabled subsystems with reset (no TL-preprocessor, only a macro) is
detected as a configuration
* Create a separate class for the ts datastore block. Now it uses
EXTERN_GLOBAL_ALIAS, which is used for all Core-blocks (low prio, as
all models use the ts-block, which means that the header file
VcUnitTsDefines is included in all models anyway)
* Remove the class EXTERN_CAL_ALIAS? Is this class used for anything?
* Check that the new configuration with TL pre-processor block, works in MIL.
Probably the header file MT_Compileswitches is needed for the simulation
for it to work.
* Add the matlab ts variable to the constant in the macro definition
block.
* Change the format of the configs key to python format (which saves time
in the total build)
* NVM-ram definition script - parse stateflow for NVM classes!
* Make a separate Matlab function for parsing stateflow variables
* generateTLUnit
* Remove workaround for a TL-problem in replaceCodeSwitch.m,
where a macro cannot be used as a constant and a TL preprocessor block
remove if that is solved in TL.
## Python
* How will the interface check work with models with different names, but
the same signal names? Test!
* Where to remove VcDebug*?
Consider removing them from the Unitdefinition, and adding them
in the buildsystem if debug is enabled
* Document the format for the LocalDefines.h files
* Consider refactoring Local defines. Make that file a json definition,
and include the result in the global config header file.
* Consider using either format or % for string formatting. This is now
mixed in the code
* The names for the debug switches are not according to the naming convention.
Shall we correct this?
* Consider if we shall separate Bool and the other one-byte data types,
as some CPUs have HW bit addressing (we think Infineon has that).
* Check the matlab scripts *parseModelInfo.m* and *parseCodeSwitches.m* and
find the bug, which generates empty configs. Should be ['all'].
* In VcPpmPsm: yVcEc_B_AutoStartRcfSet, yVcEc_B_CEMNodeAlive,
sVcScIn_v_VehSpdLgtMax, and sVcEc_Te_EngClnt have empty configs.
* **PROBABLE ERROR** Input signals which are unused due to using
goto-blocks without corresponding from-blocks. When opening a model,
this is checked for, but only on the top level subsystem.
Fix this check, so that it runs on all levels of subsystems.
* Add a method for retrieving the core ids from the unit definitions file
(see :doc:`unit_config`)
## Done items
* The UnitConfigs._parse_all_unit_configs method doesn't seem to parse all json-configs
if there are several versions of one unit. E.g. VcAesSupM and VcAesSupM__gen3:
both of these unit-config files seem not to be parsed.
* Only parses units in the units.json config list. This list was not updated.
* Dependability functions (and even Vcc_Lib blocks) use a class CVC_VAR_STAT.
This class is currently not included in the powertrain_build TL-DD. Add this class.
Furthermore, how to handle the ASIL classes for lib-block which set classes in the init-scripts.
* Variables are both in "outputs" and "locals" (example VcAesSupM__gen3, yVcAesSupM_B_SupChrgrErr)
* The parsing of variables for the json file does not remove the "__gen3" suffix for variable names.
* Write email to dSpace regarding:
* Optimization of pre-processor directives. Why are they considered as control flow?
Give example of calibration constant cVcAesVnt_B_UseOldBrkSig,
which is removed even though it is not "MOVEABLE".
* The A2L-file is currently generated by TL => all parameters are included in the A2L.
In the final version, mergeA2l checks the unit definition file (.json) configuration for all labels,
variables, and function definitions, and removes them if they are not included.
* Remove the local defs from the FeatureConfigs.gen_unit_cfg_header_file() (VcCodeSwDefines.h).
* Generation of the Ts macro for all the units.
This is needed to reflect the scheduling frequency of the function in the current project.
* The models need to be updated with another name than the 'ts'.
E.g. ts_[unit_name], and the class should be changed to an external macro.
Add this change to the model conversion script.
* **DONE** - A header file which defines the scheduling defines for all units.
* The above file needs to be included in all files.
* **DONE** - a matlab function updateTsDatastore.m is created.
* OPortMvd.mdl does not seem to have the outports moved.
* LookUnderMasks has to be 'on', as the top.
* Handling of NVM. I.E.
* NVM-Ram blocks includes references to CVC_EXT8/16/32 in the mask init code
-> they need to be replaced by the new blocks.
* Parse the simulink models, and generate the unit definition information for NVM (Matlab-script).
* Generation of structs and defines? Check the Matlab implementation for a specification.
There is no matlab-implementation!
* generateTLUnit
* When generating units in isolation, it is possible to get the same file name
for definitions and tables. E.g. tl_defines_AF.h.
* par.m files are not loaded properly in the init-script.
Update to latest gen3 env will probably solve this problem.
* Add the newScripts folder and subfolders to the matlab path (init script?).
* Run the postprocess_source_files after code generation on all c&h-files.
* Other Matlab issues:
* The new NVM block gives algebraic loopbacks in some implementations,
the block needs to be redesigned to be able to be a drop in replacement for the old block.
* Add VcConst source code to output/SourceCode.
* Generate did defines does not work, no Floats are defined! Fix this!
* Add change models to generate the init function, and not the Restart function.
* No, the code gets messier as the INIT function is reserved for the init of the DISP* classes.
Keep the old way of generating the code.
* Check the debug switches for input and output parameters.
Are both allowed? If so, check that the debug functions are called both at the beginning and
the end of the time raster, in the matlab implementation.
* Matlab has debug_in and debug_out; update the gen debug script and the sched script.
* Missing #include "CVC_DISP_END.h".
* VcVmcEm.h is missing a #include "CVC_DISP_END.h" at line 4423ish.
Is this a TL code generation bug? Rerun codegen for VcVmcEm.
* VcTrsmShfShkMon.h at line 1451.
* VcSpMon.h at line 911.
* VcSpEc.c line ? (after defines section).
* VcDeTfr.c line 605 (after defines section).
* Is the LocalConfig file overwritten after a model update? No, not any more.
* StateFlow objects are not possible to remove with TL-preprocessor blocks.
* Identify State flow models which are under TL preprocessor blocks, and refactor those models,
by moving the stateflow model to a separate simulink model.
* Model or DD-fault : The init function is not in the CVC_CODE section!
(The same behaviour exists in the current build as well).
Updated DD, set property Restart function name to "default".
* replaceNVMBlocks
* split unit json-config file into two files. One for the interface, and one for the rest of the info.
The interface config needs to be loaded for all units to perform the interface check.
The other file only needs to be loaded for the units within the project being built.
## Moved to JIRA
* Add a prefix to subsystem names, to avoid compilation errors when
subsystem names start with numbers after the structure number.
E.g. 1210_12VStartEnable should not be allowed.
* Search for all constant blocks, and identify which ones include a codeswitch.
Replace the identified blocks with macros (use the VccLib block).
* Handling of NVM. I.E.
* Vectors are not supported. Add vector support!
* Memory from earlier codegenerations not implemented. Implement!
* A2L file generation for NVM-ram needs to be **IMPLEMENTED**!
* Local defines:
* Add the model name as a prefix to all local defines, to ensure a separate namespace.
This is needed in the scripts, as all names of the defines all units are aggregated to one dict.
* Dependability models have the wrong classes, so they will use the normal memory areas.
The models need to be updated with the new TL dependability variable classes,
which are defined in the updated DD.
* Update powertrain_build to handle classes with the format "ASIL/CVC_**ASIL*[ABC]".
* The build shall fail at the end, and all errors shall be presented
(avoids running the script many times, and gradually finding the latent errors)
(this could be a requirement).
* Refactor code so that each function stores errors, and returns them at the end.
* Consider moving the reading of config files to the classes that abstract the configs.
* VcCoreDummy.c does not include the supplier header file.
This must be fixed, preferably via a supplier independent file.
* Have included VcSupplierCoreAbstraction.h, it contains some defines, which might make it not compile.
Needs to be tested.
* All functions should return an error log, and continue the build.
Stop the build at the end, and present all errors.
* The debug functions are not scheduled. Add this!
* Make it an option to generate debug functionality (this could be removed for production SW).
* Currently all variables, regardless of the time rasters they are used in,
are set at the beginning of each time raster. This costs some extra CPU performance.
Consider changing this.
* If the legacy excel configuration files are going to be used for more than a few more months,
write a project consistency check that checks that all config files have the same projects in them.
In the test project, projects were missing in the SPM_Codeswitch_Setup.xls
that are defined in the SPMEMSInterfaceRequirements.xls.
* Move the tmp source code generation folder to a separate location outside the models directory
(as it is not possible to delete them until matlab is closed).

255
docs/unit_config.md Normal file
View File

@ -0,0 +1,255 @@
# Unit definition file
This unit definition file contains all the metadata needed for the build system to include the software unit in the build and to create the necessary files (e.g. A2L-files, NVM allocation). It is also used to perform consistency checks in the system.
For TargetLink models, this unit definition file is created by the source generation scripts.
The unit definition file contains nine major keys - version, outports, inports, nvm, core, dids, pre_procs, local_vars and calib_consts. These are described below.
TODO: Consider changing name from unit configuration to unit definition.
Example config_*.json:
```json
{
    "version": "0.2.1",
    "outports": {
        "sVcPemAlc_D_EngLoadReqEl": {
            "handle": "VcPemAlc/VcPemAlc/Subsystem/VcPemAlc/tlop_VcAc_Tq_AcLoad5",
            "name": "sVcPemAlc_D_EngLoadReqEl",
            "configs": ["all"],
            "description": "Request change in electrical loads controlled by CEM",
            "type": "Int16",
            "unit": "W",
            "offset": 0,
            "lsb": 1,
            "min": -32768,
            "max": 32767,
            "class": "CVC_EXT"}},
    "inports": {
        "sVcEc_n_Eng": {
            "handle": "VcPemAlc/VcPemAlc/Subsystem/VcPemAlc/tlip_VcEc_n_Eng",
            "name": "sVcEc_n_Eng",
            "configs": [
                ["all"],
                ["Vc_Pem_Alc_B_CodegenFastCat == 1"],
                ["Vc_Pem_Alc_B_CodegenDpfRegen == 1"]],
            "description": "Engine speed",
            "type": "Float32",
            "unit": "rpm",
            "offset": 0,
            "lsb": 1,
            "min": 0,
            "max": 10000,
            "class": "CVC_EXT"}},
    "calib_consts": {
        "cVcAesAir_B_ThrCtrlStrtWght": {
            "type": "Bool",
            "unit": "g/s",
            "description": "Switch to weight target cylinder flow during hand-over from start throttle",
            "max": "-",
            "min": "-",
            "lsb": "1",
            "offset": "0",
            "class": "CVC_CAL",
            "handle": "VcAesAir/VcAesAir/Subsystem/VcAesAir/VcAesAir/1_AirTarDyn/11_CylTarStrt/B_ThrCtrlStrtWght",
            "configs": ["all"],
            "width": [1]}},
    "local_vars": {
        "rVcAesAir_m_CylTarAct": {
            "type": "Float32",
            "unit": "mg/stk",
            "description": "Target cylinder charge flow for aircharge control",
            "max": "5000",
            "min": "0",
            "lsb": "1",
            "offset": "0",
            "class": "CVC_DISP",
            "handle": "VcAesAir/VcAesAir/Subsystem/VcAesAir/VcAesAir/1_AirTarDyn/11_CylTarStrt/Switch1",
            "configs": ["all"],
            "width": 1}},
    "core": {
        "Events": {
            "VcEvImmoBCM": {
                "API_blk": [
                    {
                        "path": "VcPpmImob/VcPpmImob/Subsystem/VcPpmImob/VcPpmImob/1000_ImobConnectionLayer/1600_Diag/1620_CoreIfNew/Dem_SetEventStatusPF1",
                        "config": [
                            "Vc_NewDiagnosticCoreIF == 1"]},
                    {
                        "path": "VcPpmImob/VcPpmImob/Subsystem/VcPpmImob/VcPpmImob/1000_ImobConnectionLayer/1600_Diag/1620_CoreIfNew/Dem_SetEventStatusPP1",
                        "config": [
                            "Vc_NewDiagnosticCoreIF == 1"]}],
                "blk_name": "NamedConstant1",
                "subsystem": "VcPpmImob/VcPpmImob/Subsystem/VcPpmImob/VcPpmImob/1000_ImobConnectionLayer/1600_Diag/1620_CoreIfNew",
                "API_blk_type": "Dem_SetEventStatus Pre-Passed",
                "description": "",
                "type": "",
                "unit": "",
                "offset": "",
                "lsb": "",
                "min": "",
                "max": "",
                "class": ""}
        },
        "IUMPR": {},
        "FIDs": {},
        "Ranking": {},
        "TstId": {}},
    "dids": {
        "yVcPpmPsm_B_DriveCycleActive": {
            "name": "yVcPpmPsm_B_DriveCycleActive",
            "description": "Driver has entered the driving cycle 1= Active 0 = Not Active",
            "handle": "VcPpmPsm/VcPpmPsm/Subsystem/VcPpmPsm/yVcPsm_B_DriveCycleActive",
            "configs": ["Vc_D_CodegenHev > 0"],
            "type": "Bool",
            "unit": "-",
            "offset": 0,
            "lsb": 1,
            "min": "NaN",
            "max": "NaN",
            "class": "CVC_DISP"}},
    "nvm": { },
    "pre_procs" : [
        "Vc_Aes_TrboM_B_CodeGen2Trbo",
        "Vc_Aes_TrboM_B_CodeGenBstPeak",
        "Vc_Aes_TrboM_B_CodeGenTrbo",
        "Vc_Aes_TrboM_B_CodeGenTrboMode06",
        "Vc_Aes_TrboM_B_CodeGenTrboOverSpd"]
}
```
## Unit definition data
### outports, inports, calib_consts, local_vars and nvm
outports contains all the signals (variables) which the unit produces.
inports contains all the signals used from other units to perform the unit's task.
calib_consts holds the definition of all the calibration constants in the unit.
local_vars holds the definition of the unit internal variables possible to measure.
nvm defines the unit's use of non-volatile memory.
The keys outports, inports and nvm contain the following keys, which define them:
### handle
This is a handle to where the variable/parameter is created (outports) or used (inports & nvm).
For TargetLink this is a string identifying the block in the model,
e.g. "VcPemAlc/VcPemAlc/Subsystem/VcPemAlc/yVcVmcPmm_B_SsActive9".
### name
The name of the variable or parameter.
### configs
Which codeswitches this variable depends on.
For TargetLink this information is parsed from the model structure,
and depends on the use of pre-processor directives.
Can have the following formats;
* a list of lists of config strings
* [[cs1 and cs2] or [cs3 and cs4]].
* list of config strings,
* [cs1 and cs2].
* or a string
* (cs):
E.g. [["Vc_Pem_Alc_B_CodegenFastCat == 1"],["Vc_Pem_Alc_B_CodegenDpfRegen == 1"]]
means that the signal is active in the configuration if the following
configuration expression evaluates to TRUE
(Vc_Pem_Alc_B_CodegenFastCat == 1) OR (Vc_Pem_Alc_B_CodegenDpfRegen == 1)
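A minimal sketch of how such an entry could be evaluated is shown below (outer list = OR, inner list = AND). `config_active` and `codeswitch_values` are made-up names; powertrain_build's real evaluation logic is more complete:
```python
def config_active(configs, codeswitch_values):
    """Evaluate a 'configs' entry: string = one condition, flat list = AND,
    list of lists = OR between AND-groups."""
    def evaluate(expression):
        if expression == "all":
            return True  # "all" means always active
        name, operator, value = expression.split()
        compare = {"==": lambda a, b: a == b,
                   ">": lambda a, b: a > b,
                   "<": lambda a, b: a < b}[operator]
        return compare(codeswitch_values[name], int(value))

    if isinstance(configs, str):
        return evaluate(configs)
    if all(isinstance(item, str) for item in configs):
        return all(evaluate(item) for item in configs)
    return any(all(evaluate(item) for item in group) for group in configs)

switches = {"Vc_Pem_Alc_B_CodegenFastCat": 1, "Vc_Pem_Alc_B_CodegenDpfRegen": 0}
print(config_active([["Vc_Pem_Alc_B_CodegenFastCat == 1"],
                     ["Vc_Pem_Alc_B_CodegenDpfRegen == 1"]], switches))  # True
```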
### description
A string describing the variable/parameter.
### type
The data type of the signal. Valid types are UInt8, UInt16, UInt32, Int8, Int16, Int32, Bool and Float32.
### unit
The name of the unit of the variable/parameter.
### offset
The offset used to convert the variable value from HEX to Physical.
### lsb
The value of one bit (lsb = least significant bit), i.e.
the factor used to convert the variable value from HEX to Physical.
### min
The minimum value of the variable.
### max
The maximum value of the variable.
### class
The storage class of the variable. I.e. which type of memory the variable/parameter is assigned to.
### core
The unit's core IDs have the following types - Events, IUMPR, FIDs, TstId and Ranking (which is not a part of the core, but is included here for simplicity).
TODO: Remove some of the keys for the Core Identifiers; subsystem, type, unit, offset, lsb, min, max and class are not needed for these blocks.
```json
{
    "Events": {
        "NameOfId": {
            "API_blk": [
                {
                    "path": "VcPpmImob/VcPpmImob/Subsystem/VcPpmImob/VcPpmImob/1000_ImobConnectionLayer/1600_Diag/1620_CoreIfNew/Dem_SetEventStatusPF1",
                    "config": ["Vc_NewDiagnosticCoreIF == 1"]},
                {
                    "path": "VcPpmImob/VcPpmImob/Subsystem/VcPpmImob/VcPpmImob/1000_ImobConnectionLayer/1600_Diag/1620_CoreIfNew/Dem_SetEventStatusPP1",
                    "config": ["Vc_NewDiagnosticCoreIF == 1"]}],
            "blk_name": "NamedConstant1",
            "subsystem": "VcPpmImob/VcPpmImob/Subsystem/VcPpmImob/VcPpmImob/1000_ImobConnectionLayer/1600_Diag/1620_CoreIfNew",
            "API_blk_type": "Dem_SetEventStatus Pre-Passed",
            "description": "",
            "type": "",
            "unit": "",
            "offset": "",
            "lsb": "",
            "min": "",
            "max": "",
            "class": ""
        }
    }
}
```
The first key under the ID-type key is the name of the ID. The value of that
key is a dict with the following keys:
### API_blk
The value of this key is a list of dicts. These dicts define the paths to all
the instances where this ID is used in the model, and in which configurations
the ID is active.
### API_blk_type
The value of this key is a string which defines the type of API block that is
used for this ID.
### blk_name
The value of this key is a string which defines the name of the block in Simulink.
### dids
The DIDs defined in the unit.
### pre_procs
Contains a list of strings which define the preprocessor names used in the
unit for configuration.

37
pybuild/__init__.py Normal file
View File

@ -0,0 +1,37 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Main package of the pybuild application."""
from pbr.version import VersionInfo
from pathlib import Path

from pybuild.lib import logger, helper_functions
from pybuild.environmentcheck import check_python_string

LOGGER = logger.create_logger(__file__)
__version__ = VersionInfo('pt-pybuild').release_string()
LOGGER.info('Current pybuild version is %s', __version__)

__config_version__ = '0.2.1'
__required_python_lower__ = '3.6'
__required_python_upper__ = '3.10'

workspace = helper_functions.get_repo_root()
requirement_path = Path(
    workspace, 'Script', 'PyTools', 'requirements.txt'
)
if requirement_path.exists():
    with requirement_path.open("r") as requirement_file:
        expected_package = "pt-pybuild==" + __version__
        for line in requirement_file:
            if expected_package in line:
                LOGGER.info('PyBuild version matched requirements!')
                break
            elif "pt-pybuild==" in line and expected_package not in line:
                LOGGER.warning('PyBuild version does not match requirements!')
                break
else:
    LOGGER.warning('Current repository does not have a requirement file' +
                   ' in expected location: %s', str(requirement_path))

check_python_string(__required_python_lower__, __required_python_upper__)

582
pybuild/a2l.py Normal file
View File

@ -0,0 +1,582 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Module for a2l-file generation."""
import sys
import re
import logging
from string import Template
from pprint import pformat

from pybuild.problem_logger import ProblemLogger
from pybuild.types import a2l_type, a2l_range

LOG = logging.getLogger()


class A2l(ProblemLogger):
    """Class for a2l-file generation."""

    def __init__(self, var_data_dict, prj_cfg):
        """Generate a2l-file from provided data dictionary.

        Args:
            var_data_dict (dict): dict defining all variables and parameters in a2l

        Sample indata structure:

        ::

            {
                "function": "rVcAesSupM",
                "vars": {
                    "rVcAesSupM_p_SupMonrSCTarDly": {
                        "var": {
                            "type": "Float32",
                            "cvc_type": "CVC_DISP"
                        },
                        "a2l_data": {
                            "unit": "kPa",
                            "description": "Low pass filtered supercharger target pressure",
                            "max": "300",
                            "min": "0",
                            "lsb": "1",
                            "offset": "0",
                            "bitmask": None,
                            "x_axis": None,
                            "y_axis": None,
                        },
                        "array": None
                    }
                }
            }
        """
        super().__init__()
        self._var_dd = var_data_dict
        self._prj_cf = prj_cfg
        self._axis_ref = None
        self._axis_data = None
        self._compu_meths = None
        self._rec_layouts = None
        self._fnc_outputs = None
        self._fnc_inputs = None
        self._fnc_locals = None
        self._fnc_char = None
        # generate the a2l string
        self._gen_a2l()

    def gen_a2l(self, filename):
        """Write a2l-data to file.

        Args:
            filename (str): Name of the generated a2l-file.
        """
        with open(filename, 'w', encoding="utf-8") as a2l:
            a2l.write(self._a2lstr)
        self.debug('Generated %s', filename)

    def _gen_a2l(self):
        """Generate an a2l-file based on the supplied data dictionary."""
        self._gen_compu_methods()
        # self.debug('_compu_meths')
        # self.debug(pp.pformat(self._compu_meths))
        self._find_axis_ref()
        # self.debug("_axis_ref")
        # self.debug(pp.pformat(self._axis_ref))
        self._find_axis_data()
        self._gen_record_layouts_data()
        # self.debug("_axis_data")
        # self.debug(pp.pformat(self._axis_data))
        # self.debug("_rec_layouts")
        # self.debug(pp.pformat(self._rec_layouts))
        self._check_axis_ref()
        output_meas = ''
        output_char = ''
        output_axis = ''
        self._fnc_outputs = []
        self._fnc_inputs = []
        self._fnc_locals = []
        self._fnc_char = []
        for var, data in self._var_dd['vars'].items():
            try:
                cvc_type = data['var']['cvc_type']
                if 'CVC_DISP' in cvc_type:
                    output_meas += self._gen_a2l_measurement_blk(var, data)
                    self._fnc_locals.append(var)
                elif 'CVC_CAL' in cvc_type:
                    self._fnc_char.append(var)
                    srch = re.search('_[rcxyXY]$', var)
                    if srch is None:
                        output_char += self._gen_a2l_characteristic_blk(var, data)
                    else:
                        output_axis += self._gen_a2l_axis_pts_blk(var, data)
                elif cvc_type == 'CVC_NVM':
                    output_meas += self._gen_a2l_measurement_blk(var, data)
                    self._fnc_locals.append(var)
                elif cvc_type == 'CVC_IN':
                    self._outputs = None
                    self._inputs = None
                    self._fnc_inputs.append(var)
                elif cvc_type == 'CVC_OUT':
                    self._fnc_outputs.append(var)
            except TypeError:
                self.warning("Warning: %s has no A2l-data", var)
            except Exception as e:
                self.critical("Unexpected error: %s", sys.exc_info()[0])
                raise e
        # generate COMPU_METHS
        output_compu_m = ''
        for k in self._compu_meths:
            output_compu_m += self._gen_a2l_compu_metod_blk(k)
        # generate FUNCTIONS
        output_funcs = self._gen_a2l_function_blk()
        # generate RECORD_LAYOUTS
        output_recl = ''
        for k in self._rec_layouts:
            output_recl += self._gen_a2l_rec_layout_blk(k)
        output = output_char + output_axis + output_meas + \
            output_compu_m + output_funcs + output_recl
        # self.debug(pp.pformat(self._var_dd))
        # self.debug('Output:')
        # self.debug(output)
        self._a2lstr = output

    def _find_axis_data(self):
        """Parse all variables and identify axis points.

        TODO: Change this function to check for names with _r | _c
        suffixes
        """
        self._axis_data = {}
        variable_names = self._var_dd['vars'].keys()
        for name in variable_names:
            x_nm = y_nm = None
            if name + '_x' in variable_names:
                x_nm = name + '_x'
            if name + '_y' in variable_names:
                y_nm = name + '_y'
            if x_nm is not None or y_nm is not None:
                self._axis_data[name] = (x_nm, y_nm)

    def _find_axis_ref(self):
        """Parse all variables and identify which are defined as axis points."""
        self._axis_ref = {}
        for var, data in self._var_dd['vars'].items():
            if data.get('a2l_data') is not None:
                x_axis = data['a2l_data'].get('x_axis')
                y_axis = data['a2l_data'].get('y_axis')
                if x_axis is not None:
                    if x_axis in self._axis_ref:
                        self._axis_ref[x_axis]['used_in'].append((var, 'x'))
                    else:
                        self._axis_ref[x_axis] = {'used_in': [(var, 'x')]}
                if y_axis is not None:
                    if y_axis in self._axis_ref:
                        self._axis_ref[y_axis]['used_in'].append((var, 'y'))
                    else:
                        self._axis_ref[y_axis] = {'used_in': [(var, 'y')]}

    @classmethod
    def _get_a2d_minmax(cls, a2d, ctype=None):
        """Get min max limits from a2l data.

        Gives max limits if min/max limits are undefined.
        """
        typelim = a2l_range(ctype)
        minlim = a2d.get('min')
        if minlim is None or minlim == '-':
            minlim = typelim[0]
        maxlim = a2d.get('max')
        if maxlim is None or maxlim == '-':
            maxlim = typelim[1]
        return minlim, maxlim

    def _check_axis_ref(self):
        """Check that the axis definitions are defined in the code."""
        undef_axis = [ax for ax in self._axis_ref
                      if ax not in self._var_dd['vars']]
        if undef_axis:
            self.warning(f'Undefined axis {pformat(undef_axis)}')

    def _gen_compu_methods(self):
        """Generate COMPU_METHOD data, and add it into the var_data_dict."""
        self._compu_meths = {}
        for var, data in self._var_dd['vars'].items():
            a2d = data.get('a2l_data')
            if a2d is not None:
                lsb = self._calc_lsb(a2d['lsb'])
                offset_str = str(a2d['offset'])
                is_offset_num = bool(re.match('[0-9]', offset_str))
                if is_offset_num:
                    offset = float(offset_str)
                else:
                    offset = 0
                key = (lsb, offset, a2d['unit'])
                self._var_dd['vars'][var]['compu_meth'] = key
                name = self._compu_key_2_name(key)
                if key in self._compu_meths:
                    self._compu_meths[key]['vars'].append(var)
                else:
                    self._compu_meths[key] = {'name': name,
                                              'vars': [var],
                                              'coeffs': self._get_coefs_str(lsb,
                                                                            offset)}

    def _compu_key_2_name(self, key):
        """Generate a COMPU_METHOD name from the keys in the name.

        Args:
            key (tuple): a list with compumethod keys (lsb, offset, unit)
        """
        conversion_list = [(r'[\./]', '_'), ('%', 'percent'),
                           ('-', 'None'), (r'\W', '_')]
        name = f"{self._var_dd['function']}_{key[0]}_{key[1]}_{key[2]}"
        for frm, to_ in conversion_list:
            name = re.sub(frm, to_, name)
        return name

    @staticmethod
    def _array_to_a2l_string(array):
        """Convert c-style array definitions to A2L MATRIX_DIM style."""
        if not isinstance(array, list):
            array = [array]
        dims = [1, 1, 1]
        for i, res in enumerate(array):
            dims[i] = res
        return f"MATRIX_DIM {dims[0]} {dims[1]} {dims[2]}"

    @staticmethod
    def _get_coefs_str(lsb, offset):
        """Calculate the a2l-coeffs from the lsb and offs fields.

        The fields are defined in the a2l_data dictionary.
        """
        return f"COEFFS 0 1 {offset} 0 0 {lsb}"

    @staticmethod
    def _calc_lsb(lsb):
        """Convert 2^-2, style lsbs to numericals."""
        if isinstance(lsb, str):
            if lsb == '-':
                return 1
            shift = re.match(r'(\d+)\^([\-+0-9]+)', lsb)
            if shift is not None:
                lsb_num = pow(int(shift.group(1)), int(shift.group(2)))
            else:
                lsb_num = float(lsb)
            return lsb_num
        return lsb

    def _gen_record_layouts_data(self):
        """Generate record layouts."""
        self._rec_layouts = {}
        for var, data in self._var_dd['vars'].items():
            if data.get('a2l_data') is not None:
                a2l_unit = a2l_type(data['var']['type'])
                # if calibration data has a suffix of _x or _y it is an axis_pts
                srch = re.search('_[xyXY]$', var)
                if srch is not None:
                    name = a2l_unit + "_X_INCR_DIRECT"
                    self._rec_layouts[name] = f"AXIS_PTS_X 1 {a2l_unit} INDEX_INCR DIRECT"
                    data['rec_layout'] = name
                else:
                    name = a2l_type(data['var']['type']) + "_COL_DIRECT"
                    self._rec_layouts[name] = f"FNC_VALUES 1 {a2l_unit} COLUMN_DIR DIRECT"
                    data['rec_layout'] = name

    def _get_inpq_data(self, inp_quant):
        """Get the necessary InputQuantity parameters."""
        if inp_quant is not None:
            if inp_quant in self._var_dd['vars']:
                return inp_quant
        return 'NO_INPUT_QUANTITY'

    # Bosch template
    _meas_tmplt = Template("""
    /begin MEASUREMENT
        $Name /* Name */
        "$LongIdent" /* LongIdentifier */
        $Datatype /* Datatype */
        $Conversion /* Conversion */
        1 /* Resolution */
        0 /* Accuracy */
        $LowerLimit /* LowerLimit */
        $UpperLimit /* UpperLimit */
        $OptionalData
        ECU_ADDRESS 0x00000000
    /end MEASUREMENT
""")

    # Denso template
    _meas_tmplt_nvm = Template("""
    /begin MEASUREMENT
        $Name /* Name */
        "$LongIdent" /* LongIdentifier */
        $Datatype /* Datatype */
        $Conversion /* Conversion */
        1 /* Resolution */
        0 /* Accuracy */
        $LowerLimit /* LowerLimit */
        $UpperLimit /* UpperLimit */
        $OptionalData
    /end MEASUREMENT
""")

    def _gen_a2l_measurement_blk(self, var_name, data):
        """Generate an a2l MEASUREMENT block."""
        opt_data = 'READ_WRITE'
        a2d = data.get('a2l_data')
        if a2d is not None:
            c_type = data['var']['type']
            # if c_type == 'Bool':
            #     opt_data += '\n' + ' ' * 8 + "BIT_MASK 0x1"
            if a2d.get('bitmask') is not None:
                opt_data += '\n' + ' ' * 8 + "BIT_MASK %s" % a2d['bitmask']
            if data.get('array'):
                opt_data += '\n' + ' ' * 8 + \
                    self._array_to_a2l_string(data['array'])
            ecu_supplier, _ = self._prj_cf.get_ecu_info()
            if a2d.get('symbol'):
                if ecu_supplier == 'Denso':
                    opt_data += '\n' + ' ' * 8 + 'SYMBOL_LINK "%s" %s' % (a2d['symbol'], a2d.get('symbol_offset'))
                    LOG.debug('This a2l is for Denso %s', opt_data)
                elif ecu_supplier in ['RB', 'CSP', 'HI', 'ZC']:
                    var_name = a2d['symbol'] + '._' + var_name
                    LOG.debug('This a2l is for %s %s', ecu_supplier, var_name)
            dtype = a2l_type(c_type)
            minlim, maxlim = self._get_a2d_minmax(a2d, c_type)
            conv = self._compu_meths[data['compu_meth']]['name']
            if a2d.get('symbol') and ecu_supplier == 'Denso':
                res = self._meas_tmplt_nvm.substitute(Name=var_name,
                                                      LongIdent=a2d['description'].replace('"', '\\"'),
                                                      Datatype=dtype,
                                                      Conversion=conv,
                                                      LowerLimit=minlim,
                                                      UpperLimit=maxlim,
                                                      OptionalData=opt_data)
            else:
                res = self._meas_tmplt.substitute(Name=var_name,
                                                  LongIdent=a2d['description'].replace('"', '\\"'),
                                                  Datatype=dtype,
                                                  Conversion=conv,
                                                  LowerLimit=minlim,
                                                  UpperLimit=maxlim,
                                                  OptionalData=opt_data)
            return res
        return None

    _char_tmplt = Template("""
    /begin CHARACTERISTIC
        $Name /* Name */
        "$LongIdent" /* LongIdentifier */
        $Type /* Datatype */
        0x00000000 /* address: $Name */
        $Deposit /* Deposit */
        0 /* MaxDiff */
        $Conversion /* Conversion */
        $LowerLimit /* LowerLimit */
        $UpperLimit /* UpperLimit */$OptionalData
    /end CHARACTERISTIC
""")

    def _gen_a2l_characteristic_blk(self, var, data):
        """Generate an a2l CHARACTERISTIC block."""
        opt_data = ''
        a2d = data.get('a2l_data')
        type_ = 'WRONG_TYPE'
        if a2d is not None:
            arr = data.get('array')
            if arr is not None:
                arr_dim = len(arr)
            else:
                arr_dim = 0
            # Check if axis_pts are defined for the axis; if not, make the
            # type a VAL_BLK with matrix dimension, otherwise set
            # a CURVE or MAP type
            # If arr_dim is 0 the CHARACTERISTIC is a value
            if arr_dim == 0:
                type_ = 'VALUE'
            elif arr_dim == 1:
                x_axis_name = var + '_x'
                # Check if axis variable is defined
                if x_axis_name in self._var_dd['vars'].keys():
                    type_ = 'CURVE'
                    opt_data += self._gen_a2l_axis_desc_blk(self._get_inpq_data(a2d.get('x_axis')),
                                                            x_axis_name)
                else:
                    type_ = 'VAL_BLK'
                    opt_data += self._array_to_a2l_string(data['array'])
            elif arr_dim == 2:
                x_axis_name = var + '_x'
                y_axis_name = var + '_y'
                # Check if axis variables are defined
                nbr_def_axis = 0
                if x_axis_name in self._var_dd['vars'].keys():
                    nbr_def_axis += 1
                if y_axis_name in self._var_dd['vars'].keys():
                    nbr_def_axis += 1
                if nbr_def_axis == 2:
                    type_ = 'MAP'
                    inpq_x = self._get_inpq_data(a2d['x_axis'])
                    opt_data += self._gen_a2l_axis_desc_blk(inpq_x, x_axis_name)
                    inpq_y = self._get_inpq_data(a2d['y_axis'])
                    opt_data += self._gen_a2l_axis_desc_blk(inpq_y, y_axis_name)
                elif nbr_def_axis == 0:
                    type_ = 'VAL_BLK'
                    opt_data += self._array_to_a2l_string(data['array'])
                else:
                    self.warning(
                        'MAP %s has only one AXIS_PTS defined, shall be none or two', var)
            minlim, maxlim = self._get_a2d_minmax(a2d)
            res = self._char_tmplt.substitute(Name=var,
                                              LongIdent=a2d['description'].replace('"', '\\"'),
                                              Type=type_,
                                              Deposit=data['rec_layout'],
                                              Conversion=self._compu_meths[data['compu_meth']]['name'],
                                              LowerLimit=minlim,
                                              UpperLimit=maxlim,
                                              OptionalData=opt_data)
            return res
        self.warning("%s has no A2L-data", var)
        return None

    # Types ASCII CURVE MAP VAL_BLK VALUE
    _axis_desc_tmplt = Template("""
    /begin AXIS_DESCR
        COM_AXIS /* Attribute */
        $inp_quant /* InputQuantity */
        $conv /* Conversion */
        $maxaxispts /* MaxAxisPoints */
        $minlim /* LowerLimit */
        $maxlim /* UpperLimit */
        AXIS_PTS_REF $axis_pts_ref
        DEPOSIT ABSOLUTE
    /end AXIS_DESCR""")

    def _gen_a2l_axis_desc_blk(self, inp_quant, axis_pts_ref):
        """Generate an a2l AXIS_DESCR block.

        TODO: Check that the AXIS_PTS_REF blocks are defined
        """
        out = ''
        inp_quant_txt = self._get_inpq_data(inp_quant)
        axis_pts = self._var_dd['vars'][axis_pts_ref]
        conv = self._compu_meths[axis_pts['compu_meth']]['name']
        max_axis_pts = axis_pts['array'][0]
        min_lim, max_lim = self._get_a2d_minmax(axis_pts['a2l_data'])
        out += self._axis_desc_tmplt.substitute(inp_quant=inp_quant_txt,
                                                conv=conv,
                                                maxaxispts=max_axis_pts,
                                                minlim=min_lim,
                                                maxlim=max_lim,
                                                axis_pts_ref=axis_pts_ref)
        return out

    _compu_meth_tmplt = Template("""
    /begin COMPU_METHOD
        $name /* Name */
        "$longident" /* LongIdentifier */
        RAT_FUNC /* ConversionType */
        "$format" /* Format */
        "$unit" /* Unit */
        $coeffs
    /end COMPU_METHOD
""")

    def _gen_a2l_compu_metod_blk(self, key):
        """Generate an a2l COMPU_METHOD block."""
        cmeth = self._compu_meths[key]
        name = self._compu_key_2_name(key)
        out = self._compu_meth_tmplt.substitute(name=name,
                                                longident='',
                                                format='%11.3',
                                                unit=key[2],
                                                coeffs=cmeth['coeffs'])
        return out

    _axis_tmplt = Template("""
    /begin AXIS_PTS
        $name /* Name */
        "$longident" /* LongIdentifier */
        0x00000000
        NO_INPUT_QUANTITY /* InputQuantity */
        $deposit /* Deposit */
        0 /* MaxDiff */
        $convert /* Conversion */
        $max_ax_pts /* MaxAxisPoints */
        $minlim /* LowerLimit */
        $maxlim /* UpperLimit */
        DEPOSIT ABSOLUTE
    /end AXIS_PTS
""")

    def _gen_a2l_axis_pts_blk(self, var, data):
        """Generate an a2l AXIS_PTS block."""
        deposit = data['rec_layout']
        conv = self._compu_meths[data['compu_meth']]['name']
        max_axis_pts = data['array'][0]
        min_lim, max_lim = self._get_a2d_minmax(data['a2l_data'])
        out = self._axis_tmplt.substitute(name=var,
                                          longident=data['a2l_data']['description'],
                                          deposit=deposit,
                                          convert=conv,
                                          max_ax_pts=max_axis_pts,
                                          minlim=min_lim,
                                          maxlim=max_lim)
        return out

    _rec_layout_tmplt = Template("""
    /begin RECORD_LAYOUT
        $name /* Name */
        $string
    /end RECORD_LAYOUT
""")

    def _gen_a2l_rec_layout_blk(self, key):
        """Generate an a2l RECORD_LAYOUT block."""
        string = self._rec_layouts[key]
        out = self._rec_layout_tmplt.substitute(name=key,
                                                string=string)
        return out

    def _gen_a2l_function_blk(self):
        """Generate an a2l FUNCTION block."""
        out = '\n /begin FUNCTION\n'
        out += f' {self._var_dd["function"]} /* Name */\n'
        out += ' "" /* LongIdentifier */\n'
        if self._fnc_char:
            out += ' /begin DEF_CHARACTERISTIC\n'
            for idf in self._fnc_char:
                out += f' {idf} /* Identifier */\n'
            out += ' /end DEF_CHARACTERISTIC\n'
        if self._fnc_inputs:
            out += ' /begin IN_MEASUREMENT\n'
            for idf in self._fnc_inputs:
                out += f' {idf} /* Identifier */\n'
            out += ' /end IN_MEASUREMENT\n'
        if self._fnc_locals:
            out += ' /begin LOC_MEASUREMENT\n'
            for idf in self._fnc_locals:
                out += f' {idf} /* Identifier */\n'
            out += ' /end LOC_MEASUREMENT\n'
        if self._fnc_outputs:
            out += ' /begin OUT_MEASUREMENT\n'
            for idf in self._fnc_outputs:
                out += f' {idf} /* Identifier */\n'
            out += ' /end OUT_MEASUREMENT\n'
        out += ' /end FUNCTION\n'
        return out

650
pybuild/a2l_merge.py Normal file
View File

@ -0,0 +1,650 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Module for merging of a2l-files."""
import json
import os
import re
from string import Template
from pybuild.lib.helper_functions import deep_dict_update
from pybuild.problem_logger import ProblemLogger
from pybuild.a2l_templates import A2lProjectTemplate, A2lSilverTemplate
class A2lMerge(ProblemLogger):
"""Class for merging of a2l-files."""
def __init__(self, prj_cfg, ucfg, a2l_files_unit, a2l_files_gen):
"""Merge a2l-files based on provided project configuration.
Removes symbols not included in the projects unit-config files.
Args:
prj_cfg (obj): Project config.
ucfg (obj): Unit config.
a2l_files_unit (list of str): Files to merge.
a2l_files_gen (list of str): Files to merge.
"""
super().__init__()
self._prj_cfg = prj_cfg
self._unit_cfg = ucfg
self._per_unit_cfg = ucfg.get_per_unit_cfg()
# generate the a2l string
self._blks = {}
self._removed_symbols = []
self.a2l = ""
# ----- Example blocks in a2l (TargetLink) -----
#
# /begin CHARACTERISTIC
# cVc_B_SeriesHev /* Name */
# "Series hybrid" /* LongIdentifier */
# VALUE /* Type */
# 0x00000000 /* address: cVc_B_SeriesHev */
# UBYTE_COL_DIRECT /* Deposit */
# 0 /* MaxDiff */
# Scaling_3 /* Conversion */
# 0 /* LowerLimit */
# 1 /* UpperLimit */
# /end CHARACTERISTIC
#
# Example of Bosch-nvm signal in nvm:
#
# /begin MEASUREMENT
# nvm_list_32._sVcDclVu_D_Markow /* Name */
# "No description given" /* LongIdentifier */
# ULONG /* Datatype */
# VcNvm_1_0_None /* Conversion */
# 1 /* Resolution */
# 0 /* Accuracy */
# 0 /* LowerLimit */
# 4294967295 /* UpperLimit */
# READ_WRITE
# MATRIX_DIM 152 1 1
# ECU_ADDRESS 0x00000000
# /end MEASUREMENT
#
# ----- Example blocks in a2l (Embedded Coder) -----
#
# /begin MEASUREMENT
# /* Name */ sVcAesVe_md_VolmcOffs
# /* Long identifier */ "Volumetric cylinder mass flow offset"
# /* Data type */ FLOAT32_IEEE
# /* Conversion method */ VcAesVe_CM_Float32_g_s
# /* Resolution (Not used) */ 0
# /* Accuracy (Not used) */ 0
# /* Lower limit */ -100.0
# /* Upper limit */ 100.0
# ECU_ADDRESS 0x0000 /* @ECU_Address@sVcAesVe_md_VolmcOffs@ */
# /end MEASUREMENT
#
# /begin CHARACTERISTIC
# /* Name */ cVcAesVe_D_VolmcCmpSel
# /* Long Identifier */ "Select compensation factor characterizing deviation from nominal voleff"
# /* Type */ VALUE
# /* ECU Address */ 0x0000 /* @ECU_Address@cVcAesVe_D_VolmcCmpSel@ */
# /* Record Layout */ Scalar_UBYTE
# /* Maximum Difference */ 0
# /* Conversion Method */ VcAesVe_CM_uint8
# /* Lower Limit */ 1.0
# /* Upper Limit */ 3.0
# /end CHARACTERISTIC
self._block_finder = re.compile(r'(?:\s*\n)*' # Optional blank lines
r'(\s*/begin (\w+)\s*' # begin <something> block
r'\n\s*([\w.]+).*?\n' # label. (Bosch-nvm contains the .)
r'.*?' # block definition
r'/end\s+\2)', # end <something> block. Same something as before
flags=re.M | re.DOTALL)
self._tl_compu_method_parser = re.compile(
r'(?:\s*\n)*(?P<compu_method>'
r'\s*/begin COMPU_METHOD\s*\n'
r'\s*(?P<name>\w*)\s*(/\* Name \*/)?\s*\n' # Name
r'\s*"(?P<ID>.*?)"\s*(/\* LongIdentifier \*/.*?)\s*\n' # Long Identifier
r'\s*(?P<conv_type>[A-Z_]*).*\s*\n' # ConversionType
r'\s*"(?P<disp_format>.*)"\s*(/\* Format \*/)?\s*\n' # Format
r'\s*"(?P<unit>.*?)"\s*(/\* Unit \*/)?\s*\n' # Unit
r'\s*(?P<conversion>.*)\s*\n' # COEFFS
r'(?P<indentation>\s*)/end COMPU_METHOD)', flags=re.M) # No DOTALL, so .* is [^\n]*
# COMPU_METHOD parser that works with files generated by Embedded Coder
self._ec_compu_method_parser = re.compile(
r'(?:\s*\n)*(?P<compu_method>'
r'\s*/begin COMPU_METHOD\s*\n'
r'\s*(/\* Name of CompuMethod\s*\*/)?\s*(?P<name>\w*)\s*\n' # Name
r'\s*(/\* Long identifier\s*\*/)?\s*"(?P<ID>.*?)"\s*\n' # Long Identifier
r'\s*/\* Conversion Type\s*\*/\s*(?P<conv_type>[A-Z_]*)\s*\n' # ConversionType
r'\s*(/\* Format\s*\*/)?\s*"(?P<disp_format>.*?)"\s*\n' # Format
r'\s*(/\* Units\s*\*/)?\s*"(?P<unit>.*?)"\s*\n' # Unit
r'\s*/\* Coefficients\s*\*/\s*(?P<conversion>.*?)\s*\n' # COEFFS
r'(?P<indentation>\s*)/end COMPU_METHOD)', flags=re.M) # No DOTALL, so .* is [^\n]*
self._expr_block_meas_kp_blob_parser = re.compile(
r'\/begin\s+(?P<keyword>\w+)\s+(?P<class>\w+)\s*\n'
r'\s*KP_BLOB\s+(?P<address>0x[0-9a-fA-F]+)\s*\n'
r'\s*\/end\s+\1'
)
self._compu_methods = {}
self._included_compu_methods = []
self._tl_compu_method_template = Template(
'$indentation/begin COMPU_METHOD\n'
'$indentation $name /* Name */\n'
'$indentation "$ID" /* LongIdentifier */\n'
'$indentation $conv_type /* ConversionType */\n'
'$indentation "$disp_format" /* Format */\n'
'$indentation "$unit" /* Unit */\n'
'$indentation $conversion\n'
'$indentation/end COMPU_METHOD'
)
# COMPU_METHOD template that looks similar to COMPU_METHOD generated by Embedded Coder
self._ec_compu_method_template = Template(
'$indentation/begin COMPU_METHOD\n'
'$indentation /* Name of CompuMethod */ $name\n'
'$indentation /* Long identifier */ "$ID"\n'
'$indentation /* Conversion Type */ $conv_type\n'
'$indentation /* Format */ "$disp_format"\n'
'$indentation /* Units */ "$unit"\n'
'$indentation /* Coefficients */ $conversion\n'
'$indentation/end COMPU_METHOD'
)
for filename in a2l_files_unit:
removed_symbols = self._parse_unit(filename)
self._removed_symbols.extend(removed_symbols)
self.debug('Loaded %s', filename)
for filename in a2l_files_gen:
self._parse_gen(filename)
self.debug('Loaded %s', filename)
def _parse_unit(self, filename):
"""Parse the unit a2l-files and apply a filter to only active parameters."""
self.debug('Processing %s', filename)
with open(filename, 'r', encoding="ISO-8859-1") as a2lfp:
a2ld = a2lfp.read()
file_path_parts = os.path.split(filename)
unit = file_path_parts[1].split('.')[0]
base_path = file_path_parts[0]
dcl_match = re.search(r'VcDcl[\w]+Mdl(__[\w]+)', base_path)
if dcl_match is not None and 'Mdl' not in unit:
# Hand coded model names including "__" will lead to this, due to name mismatch of .a2l and .json files.
# E.g. VcDclPtrlMdl__denso:
# 1. config_VcDclPtrlMdl__denso.json vs VcDclPtrlMdl__denso.a2l.
# 1.1. Match: unit in self._per_unit_cfg.
# 2. config_VcDclPtrl__denso.json vs VcDclPtrl.a2l.
# 2.1. No match: unit not in self._per_unit_cfg.
old_unit = unit
unit = unit + dcl_match.group(1)
self.info(
'Found unit %s with .a2l and .json file name mismatch. Using new unit name: %s',
old_unit,
unit
)
if unit in self._per_unit_cfg:
u_conf = self._per_unit_cfg[unit]
code_generator = u_conf['code_generator'] if 'code_generator' in u_conf else 'target_link'
else:
u_conf = {}
code_generator = 'target_link'
if code_generator == 'embedded_coder':
blks = re.findall(r'(?:\s*\n)*(\s*/begin '
r'(?!PROJECT|HEADER|MODULE|MOD_PAR|MOD_COMMON)(\w+)\s*(?:\n\s*)?'
r'(?:/\*\s*[\w ]+\s*\*/\s*)?(\w+)([\[\d+\]]*).*?\n.*?/end\s+\2)',
a2ld, flags=re.M | re.DOTALL)
else:
blks = re.findall(r'(?:\s*\n)*(\s*/begin (?!PROJECT|MODULE)(\w+)[\n\s]*'
r'(\w+(?:\.\w+)?)([\[\d+\]]*).*?\n.*?/end\s+\2)', a2ld,
flags=re.M | re.DOTALL)
compu_method_translators = self._parse_compu_methods(a2ld, unit)
unit_blks = {}
removed_symbols = []
if unit not in self._per_unit_cfg:
# Handcoded a2l without json-files will lead to this.
# Add json files for the handcoded a2l!
# NOTE: Assuming TargetLink
self.debug('%s is not in the units list. Looking for json.', unit)
config_filename = os.path.join(
self._prj_cfg.get_unit_cfg_deliv_dir(),
f'config_{unit}.json')
self.debug('Looking for %s', config_filename)
if os.path.isfile(config_filename):
with open(config_filename, 'r', encoding="utf-8") as config_file:
u_conf = json.load(config_file)
self._handle_config(
code_generator, unit, u_conf, blks, unit_blks, removed_symbols, compu_method_translators
)
else:
self.warning('%s does not have a unit_cfg json, '
'including all a2l-parameters', unit)
for blk_def, type_, label, size in blks:
if type_ == 'COMPU_METHOD':
blk_def, label = self._replace_compu_method(blk_def, label, compu_method_translators)
self.add_block_definition(unit_blks, type_, label, size, blk_def, compu_method_translators)
else:
self._handle_config(
code_generator, unit, u_conf, blks, unit_blks, removed_symbols, compu_method_translators
)
deep_dict_update(self._blks, unit_blks)
return removed_symbols
def _handle_config(self, code_generator, unit, u_conf, blks, unit_blks, removed_symbols, compu_method_translators):
"""Merge all types of ram for the unit."""
ram = u_conf['inports']
ram.update(u_conf['outports'])
ram.update(u_conf['local_vars'])
# TODO: The variables and labels need to be removed from
# the FUNCTION block too
for blk_def, type_, label, size in blks:
remove_excluded_symbol = True
inc = False
if type_ == 'AXIS_PTS':
if label in u_conf['calib_consts']:
inc = True
elif type_ == 'CHARACTERISTIC':
if label in u_conf['calib_consts']:
if label in [axis_label for _, axis_type, axis_label, _ in blks if axis_type == 'AXIS_PTS']:
# AXIS_PTS can be used as CHARACTERISTIC but not the other way around.
# If there are duplicates, use the AXIS_PTS.
self.debug('Will not add the block for CHARACTERISTIC %s, but will keep it as a symbol,'
' since it exists as AXIS_PTS', label)
remove_excluded_symbol = False
inc = False
else:
inc = self._handle_axis_ptr_ref_config(u_conf, blk_def, unit)
elif type_ == 'MEASUREMENT':
if label in ram:
key = label if size is None else label + size
if label in u_conf['outports']:
# This unit is producing the measurement.
inc = True
elif key in unit_blks.get(type_, {}):
# This unit is not producing it, and it has already been added
inc = False
else:
# This unit is not producing it, but it has not been added
# Could be external signal, etc.
inc = True
elif type_ == 'COMPU_METHOD':
inc = True
blk_def, label = self._replace_compu_method(blk_def, label, compu_method_translators)
else:
inc = True
if inc:
self.add_block_definition(unit_blks, type_, label, size, blk_def, compu_method_translators)
else:
if remove_excluded_symbol:
removed_symbols.append(label + size)
self.debug('Did not include A2L-blk %s%s', label, size)
if not self._unit_cfg.check_if_in_unit_cfg(unit, label):
if type_ != 'COMPU_METHOD':
self.warning('A2l block %s not in config json file for %s', label, unit)
if 'FUNCTION' in unit_blks:
unit_blks['FUNCTION'] = self._remove_symbols_from_func_blks(
code_generator, unit_blks['FUNCTION'], removed_symbols
)
if 'GROUP' in unit_blks:
unit_blks['GROUP'] = self._remove_symbols_from_grp_blks(unit_blks['GROUP'], removed_symbols)
def _handle_axis_ptr_ref_config(self, u_conf, blk, unit):
"""Remove blocks referencing undefined blocks."""
ref_re = re.compile(r'\s*AXIS_PTS_REF\s*([\w]*)')
for axis_ptr_ref in ref_re.findall(blk):
if axis_ptr_ref not in u_conf['calib_consts']:
self.debug('Excluding due to %s missing in config', axis_ptr_ref)
return False
if not self._unit_cfg.check_if_in_unit_cfg(unit, axis_ptr_ref):
self.debug('Excluding due to %s not active in config', axis_ptr_ref)
return False
return True
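# Example (hypothetical labels): a CHARACTERISTIC block containing the line
#
#     AXIS_PTS_REF mVcFoo_X
#
# is kept only if 'mVcFoo_X' is both present in u_conf['calib_consts'] and
# active in the unit config; otherwise the referencing block is excluded.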
def add_block_definition(self, unit_blks, type_, label, size, blk_def, compu_method_translators):
"""Add block definition to A2L-file."""
size = '' if size is None else size
blk_def = self._replace_conversions(blk_def, compu_method_translators)
if type_ not in unit_blks:
unit_blks[type_] = {}
unit_blks[type_][label + size] = blk_def
@staticmethod
def _parse_func_blk(code_generator, fnc_blk):
"""Remove the unused symbols from the FUNCTION blocks in the A2L-file.
Parse the FUNCTION block, TL or EC style based on code_generator.
"""
if code_generator == 'target_link':
pattern = r'\s*/begin\s+FUNCTION\s*?\n\s*(\w+).*?\n\s*"(.*?)".*?\n(.*)'
else:
pattern = r'\s*/begin\s+FUNCTION\s*?\n\s*.*\*/\s*(\w+).*?\n\s*.*\*/\s*"(.*?)".*?\n(.*)'
res = re.match(pattern, fnc_blk, flags=re.M | re.DOTALL)
fnc_name = res.group(1)
long_id = res.group(2)
fnc_dict = {
'fnc_name': fnc_name,
'long_id': long_id,
'body': {}
}
fnc_body = res.group(3)
sb_res = re.findall(r'\s*/begin\s+(\w+[\[\d\]]*)\s*\n\s*'
r'(.*?\n)\s*/end \1', fnc_body, flags=re.M | re.DOTALL)
for sb_name, sub_blk in sb_res:
symbols = set(re.findall(r'\s*(\w+(?:\.\w+)?[\[\d\]]*).*?\n', sub_blk, flags=re.M))
fnc_dict['body'][sb_name] = symbols
return fnc_dict
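# Example (hypothetical identifiers): a TL-style block such as
#
#     /begin FUNCTION
#         VcFoo /* Name */
#         "" /* LongIdentifier */
#         /begin DEF_CHARACTERISTIC
#             mVcFoo_Bar /* Identifier */
#         /end DEF_CHARACTERISTIC
#     /end FUNCTION
#
# is parsed into
#
#     {'fnc_name': 'VcFoo', 'long_id': '',
#      'body': {'DEF_CHARACTERISTIC': {'mVcFoo_Bar'}}}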
@staticmethod
def _parse_grp_blk(grp_blk):
"""Remove the unused symbols from the GROUP blocks in the A2L-file."""
# parse the GROUP block
res = re.match(r'\s*/begin\s+GROUP\s*?\n\s*.*\*/\s*(\w+).*?\n\s*.*\*/\s*"(.*?)".*?\n(.*)',
grp_blk, flags=re.M | re.DOTALL)
fnc_name = res.group(1)
long_id = res.group(2)
fnc_dict = {
'fnc_name': fnc_name,
'long_id': long_id,
'body': {}
}
fnc_body = res.group(3)
sb_res = re.findall(r'\s*/begin\s+(\w+[\[\d\]]*)\s*\n\s*'
r'(.*?\n)\s*/end \1', fnc_body, flags=re.M | re.DOTALL)
for sb_name, sub_blk in sb_res:
symbols = set(re.findall(r'\s*(\w+(?:\.\w+)?[\[\d\]]*).*?\n', sub_blk, flags=re.M))
fnc_dict['body'][sb_name] = symbols
return fnc_dict
def _recursive_remove(self, a2l_dict, name):
"""Remove symbols from A2L dict (e.g. group or function)."""
if name in a2l_dict:
blk = a2l_dict[name]
if 'SUB_FUNCTION' in blk:
for sub_fnc in blk['SUB_FUNCTION']:
if self._recursive_remove(a2l_dict, sub_fnc):
blk['SUB_FUNCTION'] = blk['SUB_FUNCTION'] - set([sub_fnc])
elif 'SUB_GROUP' in blk:
for sub_grp in blk['SUB_GROUP']:
if self._recursive_remove(a2l_dict, sub_grp):
blk['SUB_GROUP'] = blk['SUB_GROUP'] - set([sub_grp])
empty = True
for key in blk:
if blk[key]:
empty = False
break
if empty:
a2l_dict.pop(name)
return True
return False
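# Example: with fnc_dict = {'A': {'SUB_FUNCTION': {'B'}},
#                           'B': {'DEF_CHARACTERISTIC': set()}},
# _recursive_remove(fnc_dict, 'A') first prunes the empty 'B', which in turn
# empties 'A', so both entries are popped and True is returned.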
def _remove_symbols_from_func_blks(self, code_generator, fnc_blks, removed_symbols):
"""Remove the unused symbols from function blocks.
If the function block is empty, it too will be removed.
First iteration: remove all symbols that have been removed.
Second iteration: recursively remove all functions without symbols.
"""
fnc_dict = {}
for fnc_name, fnc_blk in fnc_blks.items():
fnc_dict[fnc_name] = {}
u_fnc_bdy = self._parse_func_blk(code_generator, fnc_blk)['body']
sub_blk_types = set(u_fnc_bdy.keys()) - set(['SUB_FUNCTION'])
for type_ in list(sub_blk_types):
fnc_dict[fnc_name][type_] = u_fnc_bdy[type_] - set(removed_symbols)
if 'SUB_FUNCTION' in u_fnc_bdy:
fnc_dict[fnc_name]['SUB_FUNCTION'] = u_fnc_bdy['SUB_FUNCTION']
# second iteration - remove empty FUNCTION blocks
# TODO: Add functionality which parses the function tree structures
# and then runs recursive remove on all tree roots.
for fnc_name in fnc_blks.keys():
self._recursive_remove(fnc_dict, fnc_name)
# generate new function blocks
new_fnc_blks = {}
for fnc_name, fnc_data in fnc_dict.items():
fnc_blk = f' /begin FUNCTION\n {fnc_name}\t/* Name */\n'
fnc_blk += " \"\"\t/* LongIdentifier */\n"
for sub_sec in sorted(fnc_data.keys()):
sub_sec_data = fnc_data[sub_sec]
if sub_sec_data:
fnc_blk += f" /begin {sub_sec}\n"
for param in sorted(sub_sec_data):
fnc_blk += f" {param}\t/* Identifier */\n"
fnc_blk += f" /end {sub_sec}\n"
fnc_blk += " /end FUNCTION"
new_fnc_blks[fnc_name] = fnc_blk
return new_fnc_blks
def _remove_symbols_from_grp_blks(self, grp_blks, removed_symbols):
"""Remove the unused symbols from group blocks.
If the group block is empty, it too will be removed.
First iteration: remove all symbols that have been removed.
Second iteration: recursively remove all groups without symbols.
"""
grp_dict = {}
for grp_name, grp_blk in grp_blks.items():
grp_dict[grp_name] = {}
u_grp_bdy = self._parse_grp_blk(grp_blk)['body']
sub_blk_types = set(u_grp_bdy.keys()) - set(['SUB_GROUP'])
for type_ in list(sub_blk_types):
grp_dict[grp_name][type_] = u_grp_bdy[type_] - set(removed_symbols)
if 'SUB_GROUP' in u_grp_bdy:
grp_dict[grp_name]['SUB_GROUP'] = u_grp_bdy['SUB_GROUP']
# second iteration - remove empty GROUP blocks
# TODO: Add functionality which parses the group tree structures
# and then runs recursive remove on all tree roots.
for grp_name in grp_blks.keys():
self._recursive_remove(grp_dict, grp_name)
# generate new group blocks
new_grp_blks = {}
for grp_name, grp_data in grp_dict.items():
grp_blk = f" /begin GROUP \n /* Name */ {grp_name}\n"
grp_blk += " /* Long identifier */ \"\"\n"
for sub_sec in sorted(grp_data.keys()):
sub_sec_data = grp_data[sub_sec]
if sub_sec_data:
grp_blk += f" /begin {sub_sec}\n"
for param in sorted(sub_sec_data):
grp_blk += f" {param}\n"
grp_blk += f" /end {sub_sec}\n"
grp_blk += " /end GROUP"
new_grp_blks[grp_name] = grp_blk
return new_grp_blks
def _parse_gen(self, filename):
"""Parse the generated a2l-files, without filter."""
self.debug('parsing gen a2l: %s', filename)
with open(filename, 'r', encoding="utf-8") as a2lfp:
a2ld = a2lfp.read()
for blk_def, type_, label in self._block_finder.findall(a2ld):
self._blks.setdefault(type_, {}).setdefault(label, blk_def)
@staticmethod
def _replace_compu_method(blk_def, label, compu_method_translators):
"""Replace the compu method block and label."""
for translator in compu_method_translators:
if translator['old_compu_method'] == blk_def:
return translator['new_compu_method'], translator['new_name']
return blk_def, label
def _store_compu_method(self, ID, conv_type, disp_format, unit, conversion, indentation, u_conf):
"""Stash compu methods that exists in the resulting a2l."""
key = (ID, conv_type, disp_format, unit, conversion)
if key in self._compu_methods:
new_name = self._compu_methods[key]['name']
new_compu_method = self._compu_methods[key]['method']
else:
new_name = 'Scaling_' + str(len(self._compu_methods))
if u_conf.get('code_generator') == 'embedded_coder':
new_compu_method = self._ec_compu_method_template.substitute(
name=new_name,
ID=ID,
conv_type=conv_type,
disp_format=disp_format,
unit=unit,
conversion=conversion,
indentation=indentation
)
else:
new_compu_method = self._tl_compu_method_template.substitute(
name=new_name,
ID=ID,
conv_type=conv_type,
disp_format=disp_format,
unit=unit,
conversion=conversion,
indentation=indentation
)
self._compu_methods.update({key: {'name': new_name,
'method': new_compu_method}})
return new_name, new_compu_method
@staticmethod
def _replace_conversions(blk_def, compu_method_translators):
"""Replace conversion identifiers in a2l block."""
for translator in compu_method_translators:
# The following check is faster than running the regex on the block.
# It DOES give false positives, which is why the regex is used for substitution
# and we do not immediately return after one positive
if translator['old_name'] in blk_def:
blk_def = translator['regex'].sub(translator['replacement'], blk_def)
return blk_def
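# Example: if old_name is 'Scaling_1', the substring check above also fires for
# a block that only contains 'Scaling_10'. The \b-anchored regex then makes no
# substitution, so such a false positive is harmless.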
def _parse_compu_methods(self, blk_def, unit):
"""Replace compu methods to not overwrite any of them."""
compu_method_translators = [] # Translators for one processed a2l file. Needs to be reset between files
u_conf = self._per_unit_cfg.get(unit, {})
if u_conf.get('code_generator') == 'embedded_coder':
for match in self._ec_compu_method_parser.finditer(blk_def):
new_name, new_compu_method = self._store_compu_method(
match['ID'],
match['conv_type'],
match['disp_format'],
match['unit'],
match['conversion'],
match['indentation'],
u_conf
)
compu_method_translators.append(
{
'new_name': new_name,
'old_name': match['name'],
'regex': re.compile(
r'(\s*)' # beginning
r'\s*(/\* Conversion [Mm]ethod\s*\*/\s*)' # optional comment
r'\b{name}\b' # word
r'('
r'\s*\n' # newline
r')'.format(name=match['name']) # end of end-match
),
'replacement': r'\1\2{name}\3'.format(name=new_name),
'old_compu_method': match['compu_method'],
'new_compu_method': new_compu_method
}
)
else:
for match in self._tl_compu_method_parser.finditer(blk_def):
new_name, new_compu_method = self._store_compu_method(
match['ID'],
match['conv_type'],
match['disp_format'],
match['unit'],
match['conversion'],
match['indentation'],
u_conf
)
compu_method_translators.append(
{
'new_name': new_name,
'old_name': match['name'],
'regex': re.compile(
r'(\s*)' # beginning
r'\b{name}\b' # word
r'(' # start of end-match
r'\s*(/\* Conversion \*/)?' # optional comment
r'\s*\n' # newline
r')'.format(name=match['name']) # end of end-match
),
'replacement': r'\1{name}\2'.format(name=new_name),
'old_compu_method': match['compu_method'],
'new_compu_method': new_compu_method
}
)
return compu_method_translators
def _patch_kp_blob(self, block):
"""Return updated measurement block text.
Args:
block (str): A2L text block
Returns:
a2l_text (str): A2L text without KP_BLOB.
"""
ecu_address = '0x00000000'
# Replace from the last match backwards so earlier spans stay valid
# after the string has been rewritten.
for match in reversed(list(self._expr_block_meas_kp_blob_parser.finditer(block))):
    start, end = match.span()
    block = f'{block[:start]}ECU_ADDRESS {ecu_address}{block[end:]}'
return block
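# Example (hypothetical address): with allow_kp_blob disabled, a block such as
#
#     /begin IF_DATA XCP
#         KP_BLOB 0x1234ABCD
#     /end IF_DATA
#
# has that whole span replaced by 'ECU_ADDRESS 0x00000000'.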
def merge(self, f_name, complete_a2l=False, silver_a2l=False):
"""Write merged a2l-file.
Args:
f_name (str): Output filename.
complete_a2l (bool): Wrap the output in a complete A2L project structure.
silver_a2l (bool): Patch the output for use with Silver.
"""
a2l = ''
a2l_config = self._prj_cfg.get_a2l_cfg()
for _, data in self._blks.items():
for _, blk in data.items():
if not a2l_config['allow_kp_blob']:
blk = self._patch_kp_blob(blk)
a2l += blk + '\n\n'
if complete_a2l:
events = []
time_unit_10ms = '0x07'
rasters = self._prj_cfg.get_units_raster_cfg()
for xcp_id, (raster_name, sample_time) in enumerate(rasters['SampleTimes'].items(), 1):
    events.append({
        'time_cycle': '0x%02X' % int(sample_time * 100),
        'time_unit': time_unit_10ms,
        'name': raster_name,
        'channel_id': '0x%04X' % xcp_id
    })
a2l_template = A2lProjectTemplate(
a2l,
a2l_config['asap2_version'],
a2l_config['name'],
events,
a2l_config['ip_address'],
a2l_config['ip_port']
)
a2l = a2l_template.render()
elif silver_a2l:
a2l_template = A2lSilverTemplate(a2l)
a2l = a2l_template.render()
self.a2l = a2l
with open(f_name, 'w', encoding="ISO-8859-1") as ma2l:
ma2l.write(a2l)
self.info('Written the merged A2L-file %s', f_name)
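# Worked example for the event setup in merge() (the raster name 'Vc10ms' is
# hypothetical): a rasters['SampleTimes'] entry ('Vc10ms', 0.01) at enumerate
# index 1 becomes time_cycle '0x01' with time_unit '0x07' (10 ms according to
# the A2ML TIME_UNIT enum), i.e. 1 x 10 ms = 10 ms, and channel_id '0x0001'.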
def get_characteristic_axis_data(self):
"""Get characteristic map axis data from merged a2l-file."""
axis_data = {}
for blk_def, type_, label in self._block_finder.findall(self.a2l):
if type_ == "CHARACTERISTIC":
axes = re.findall('AXIS_PTS_REF (.*)', blk_def)
if label in axis_data:
self.critical("Multiple CHARACTERISTIC for %s in merged a2l.", label)
axis_data[label] = {'axes': axes}
return axis_data
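# Example return value (hypothetical labels):
#
#     {'mVcFoo_T': {'axes': ['mVcFoo_X', 'mVcFoo_Y']}}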

717
pybuild/a2l_templates.py Normal file

@ -0,0 +1,717 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Module for creating a project a2l-file from a template."""
from string import Template
class A2lProjectTemplate:
"""Class for A2l Project Template."""
def __init__(self, spm_a2l, asap2_version, project_name, events, ip_address, ip_port):
self.spm_a2l = spm_a2l
self.asap2_version = asap2_version
self.project_name = project_name
self.events = events
self.ip_address = ip_address
self.ip_port = ip_port
self.event_template = Template(
' /begin EVENT\n'
' "$name" /* EVENT CHANNEL NAME */\n'
' "$short_name" /* EVENT CHANNEL SHORT NAME */\n'
' $channel_id /* EVENT CHANNEL NUMBER */\n'
' DAQ_STIM\n'
' 0xFF\n'
' $time_cycle /* EVENT TIME CYCLE */\n'
' $time_unit /* EVENT TIME UNIT */\n'
' 0x00\n'
' /end EVENT\n'
)
self.template = Template(
'ASAP2_VERSION $asap2_version\n'
'/begin PROJECT $project_name ""\n'
' /begin MODULE XCP ""\n'
' /begin A2ML\n'
' struct Protocol_Layer {\n'
' uint; /* XCP protocol layer version, current 0x100 */\n'
' uint; /* T1 [ms] */\n'
' uint; /* T2 [ms] */\n'
' uint; /* T3 [ms] */\n'
' uint; /* T4 [ms] */\n'
' uint; /* T5 [ms] */\n'
' uint; /* T6 [ms] */\n'
' uint; /* T7 [ms] */\n'
' uchar; /* MAX_CTO */\n'
' uint; /* MAX_DTO */\n'
' enum {\n'
' "BYTE_ORDER_MSB_LAST" = 0,\n'
' "BYTE_ORDER_MSB_FIRST" = 1\n'
' };\n'
' enum {\n'
' "ADDRESS_GRANULARITY_BYTE" = 1,\n'
' "ADDRESS_GRANULARITY_WORD" = 2,\n'
' "ADDRESS_GRANULARITY_DWORD" = 4\n'
' };\n'
' taggedstruct {\n'
' ("OPTIONAL_CMD" enum {\n'
' "GET_COMM_MODE_INFO" = 251,\n'
' "GET_ID" = 250,\n'
' "SET_REQUEST" = 249,\n'
' "GET_SEED" = 248,\n'
' "UNLOCK" = 247,\n'
' "SET_MTA" = 246,\n'
' "UPLOAD" = 245,\n'
' "SHORT_UPLOAD" = 244,\n'
' "BUILD_CHECKSUM" = 243,\n'
' "TRANSPORT_LAYER_CMD" = 242,\n'
' "USER_CMD" = 241,\n'
' "DOWNLOAD" = 240,\n'
' "DOWNLOAD_NEXT" = 239,\n'
' "DOWNLOAD_MAX" = 238,\n'
' "SHORT_DOWNLOAD" = 237,\n'
' "MODIFY_BITS" = 236,\n'
' "SET_CAL_PAGE" = 235,\n'
' "GET_CAL_PAGE" = 234,\n'
' "GET_PAG_PROCESSOR_INFO" = 233,\n'
' "GET_SEGMENT_INFO" = 232,\n'
' "GET_PAGE_INFO" = 231,\n'
' "SET_SEGMENT_MODE" = 230,\n'
' "GET_SEGMENT_MODE" = 229,\n'
' "COPY_CAL_PAGE" = 228,\n'
' "CLEAR_DAQ_LIST" = 227,\n'
' "SET_DAQ_PTR" = 226,\n'
' "WRITE_DAQ" = 225,\n'
' "SET_DAQ_LIST_MODE" = 224,\n'
' "GET_DAQ_LIST_MODE" = 223,\n'
' "START_STOP_DAQ_LIST" = 222,\n'
' "START_STOP_SYNCH" = 221,\n'
' "GET_DAQ_CLOCK" = 220,\n'
' "READ_DAQ" = 219,\n'
' "GET_DAQ_PROCESSOR_INFO" = 218,\n'
' "GET_DAQ_RESOLUTION_INFO" = 217,\n'
' "GET_DAQ_LIST_INFO" = 216,\n'
' "GET_DAQ_EVENT_INFO" = 215,\n'
' "FREE_DAQ" = 214,\n'
' "ALLOC_DAQ" = 213,\n'
' "ALLOC_ODT" = 212,\n'
' "ALLOC_ODT_ENTRY" = 211,\n'
' "PROGRAM_START" = 210,\n'
' "PROGRAM_CLEAR" = 209,\n'
' "PROGRAM" = 208,\n'
' "PROGRAM_RESET" = 207,\n'
' "GET_PGM_PROCESSOR_INFO" = 206,\n'
' "GET_SECTOR_INFO" = 205,\n'
' "PROGRAM_PREPARE" = 204,\n'
' "PROGRAM_FORMAT" = 203,\n'
' "PROGRAM_NEXT" = 202,\n'
' "PROGRAM_MAX" = 201,\n'
' "PROGRAM_VERIFY" = 200\n'
' })*;\n'
' "COMMUNICATION_MODE_SUPPORTED" taggedunion {\n'
' "BLOCK" taggedstruct {\n'
' "SLAVE" ;\n'
' "MASTER" struct {\n'
' uchar; /* MAX_BS */\n'
' uchar; /* MIN_ST */\n'
' };\n'
' };\n'
' "INTERLEAVED" uchar; /* QUEUE_SIZE */\n'
' };\n'
' "SEED_AND_KEY_EXTERNAL_FUNCTION" char[256];/* Name of the Seed&Key function */\n'
' };\n'
' };\n'
'\n'
' struct Daq {\n'
' enum {\n'
' "STATIC" = 0,\n'
' "DYNAMIC" = 1\n'
' };\n'
' uint; /* MAX_DAQ */\n'
' uint; /* MAX_EVENT_CHANNEL */\n'
' uchar; /* MIN_DAQ */\n'
' enum {\n'
' "OPTIMISATION_TYPE_DEFAULT" = 0,\n'
' "OPTIMISATION_TYPE_ODT_TYPE_16" = 1,\n'
' "OPTIMISATION_TYPE_ODT_TYPE_32" = 2,\n'
' "OPTIMISATION_TYPE_ODT_TYPE_64" = 3,\n'
' "OPTIMISATION_TYPE_ODT_TYPE_ALIGNMENT" = 4,\n'
' "OPTIMISATION_TYPE_MAX_ENTRY_SIZE" = 5\n'
' };\n'
' enum {\n'
' "ADDRESS_EXTENSION_FREE" = 0,\n'
' "ADDRESS_EXTENSION_ODT" = 1,\n'
' "ADDRESS_EXTENSION_DAQ" = 3\n'
' };\n'
' enum {\n'
' "IDENTIFICATION_FIELD_TYPE_ABSOLUTE" = 0,\n'
' "IDENTIFICATION_FIELD_TYPE_RELATIVE_BYTE" = 1,\n'
' "IDENTIFICATION_FIELD_TYPE_RELATIVE_WORD" = 2,\n'
' "IDENTIFICATION_FIELD_TYPE_RELATIVE_WORD_ALIGNED" = 3\n'
' };\n'
' enum {\n'
' "GRANULARITY_ODT_ENTRY_SIZE_DAQ_BYTE" = 1,\n'
' "GRANULARITY_ODT_ENTRY_SIZE_DAQ_WORD" = 2,\n'
' "GRANULARITY_ODT_ENTRY_SIZE_DAQ_DWORD" = 4,\n'
' "GRANULARITY_ODT_ENTRY_SIZE_DAQ_DLONG" = 8\n'
' };\n'
' uchar; /* MAX_ODT_ENTRY_SIZE_DAQ */\n'
' enum {\n'
' "NO_OVERLOAD_INDICATION" = 0,\n'
' "OVERLOAD_INDICATION_PID" = 1,\n'
' "OVERLOAD_INDICATION_EVENT" = 2\n'
' };\n'
' taggedstruct {\n'
' "PRESCALER_SUPPORTED" ;\n'
' "RESUME_SUPPORTED" ;\n'
' block "STIM" struct {\n'
' enum {\n'
' "GRANULARITY_ODT_ENTRY_SIZE_STIM_BYTE" = 1,\n'
' "GRANULARITY_ODT_ENTRY_SIZE_STIM_WORD" = 2,\n'
' "GRANULARITY_ODT_ENTRY_SIZE_STIM_DWORD" = 4,\n'
' "GRANULARITY_ODT_ENTRY_SIZE_STIM_DLONG" = 8\n'
' };\n'
' uchar; /* MAX_ODT_ENTRY_SIZE_STIM */\n'
' taggedstruct {\n'
' "BIT_STIM_SUPPORTED" ;\n'
' };\n'
' };\n'
' block "TIMESTAMP_SUPPORTED" struct {\n'
' uint; /* TIMESTAMP_TICKS */\n'
' enum {\n'
' "NO_TIME_STAMP" = 0,\n'
' "SIZE_BYTE" = 1,\n'
' "SIZE_WORD" = 2,\n'
' "SIZE_DWORD" = 4\n'
' };\n'
' enum {\n'
' "UNIT_1NS" = 0,\n'
' "UNIT_10NS" = 1,\n'
' "UNIT_100NS" = 2,\n'
' "UNIT_1US" = 3,\n'
' "UNIT_10US" = 4,\n'
' "UNIT_100US" = 5,\n'
' "UNIT_1MS" = 6,\n'
' "UNIT_10MS" = 7,\n'
' "UNIT_100MS" = 8,\n'
' "UNIT_1S" = 9\n'
' };\n'
' taggedstruct {\n'
' "TIMESTAMP_FIXED" ;\n'
' };\n'
' };\n'
' "PID_OFF_SUPPORTED" ;\n'
' (block "DAQ_LIST" struct {\n'
' uint; /* DAQ_LIST_NUMBER */\n'
' taggedstruct {\n'
' "DAQ_LIST_TYPE" enum {\n'
' "DAQ" = 1,\n'
' "STIM" = 2,\n'
' "DAQ_STIM" = 3\n'
' };\n'
' "MAX_ODT" uchar;\n'
' "MAX_ODT_ENTRIES" uchar;\n'
' "FIRST_PID" uchar;\n'
' "EVENT_FIXED" uint;\n'
' block "PREDEFINED" taggedstruct {\n'
' (block "ODT" struct {\n'
' uchar; /* ODT number */\n'
' taggedstruct {\n'
' ("ODT_ENTRY" struct {\n'
' uchar; /* ODT_ENTRY number */\n'
' ulong; /* address of element */\n'
' uchar; /* address extension of element */\n'
' uchar; /* size of element [AG] */\n'
' uchar; /* BIT_OFFSET */\n'
' })*;\n'
' };\n'
' })*;\n'
' };\n'
' };\n'
' })*;\n'
' (block "EVENT" struct {\n'
' char[101]; /* EVENT_CHANNEL_NAME */\n'
' char[9]; /* EVENT_CHANNEL_SHORT_NAME */\n'
' uint; /* EVENT_CHANNEL_NUMBER */\n'
' enum {\n'
' "DAQ" = 1,\n'
' "STIM" = 2,\n'
' "DAQ_STIM" = 3\n'
' };\n'
' uchar; /* MAX_DAQ_LIST */\n'
' uchar; /* TIME_CYCLE */\n'
' uchar; /* TIME_UNIT */\n'
' uchar; /* PRIORITY */\n'
' })*;\n'
' };\n'
' };\n'
'\n'
' taggedunion Daq_Event {\n'
' "FIXED_EVENT_LIST" taggedstruct {\n'
' ("EVENT" uint)*;\n'
' };\n'
' "VARIABLE" taggedstruct {\n'
' block "AVAILABLE_EVENT_LIST" taggedstruct {\n'
' ("EVENT" uint)*;\n'
' };\n'
' block "DEFAULT_EVENT_LIST" taggedstruct {\n'
' ("EVENT" uint)*;\n'
' };\n'
' };\n'
' };\n'
'\n'
' struct Pag {\n'
' uchar; /* MAX_SEGMENTS */\n'
' taggedstruct {\n'
' "FREEZE_SUPPORTED" ;\n'
' };\n'
' };\n'
'\n'
' struct Pgm {\n'
' enum {\n'
' "PGM_MODE_ABSOLUTE" = 1,\n'
' "PGM_MODE_FUNCTIONAL" = 2,\n'
' "PGM_MODE_ABSOLUTE_AND_FUNCTIONAL" = 3\n'
' };\n'
' uchar; /* MAX_SECTORS */\n'
' uchar; /* MAX_CTO_PGM */\n'
' taggedstruct {\n'
' (block "SECTOR" struct {\n'
' char[101]; /* SECTOR_NAME */\n'
' uchar; /* SECTOR_NUMBER */\n'
' ulong; /* Address */\n'
' ulong; /* Length */\n'
' uchar; /* CLEAR_SEQUENCE_NUMBER */\n'
' uchar; /* PROGRAM_SEQUENCE_NUMBER */\n'
' uchar; /* PROGRAM_METHOD */\n'
' })*;\n'
' "COMMUNICATION_MODE_SUPPORTED" taggedunion {\n'
' "BLOCK" taggedstruct {\n'
' "SLAVE" ;\n'
' "MASTER" struct {\n'
' uchar; /* MAX_BS_PGM */\n'
' uchar; /* MIN_ST_PGM */\n'
' };\n'
' };\n'
' "INTERLEAVED" uchar; /* QUEUE_SIZE_PGM */\n'
' };\n'
' };\n'
' };\n'
'\n'
' struct Segment {\n'
' uchar; /* SEGMENT_NUMBER */\n'
' uchar; /* number of pages */\n'
' uchar; /* ADDRESS_EXTENSION */\n'
' uchar; /* COMPRESSION_METHOD */\n'
' uchar; /* ENCRYPTION_METHOD */\n'
' taggedstruct {\n'
' block "CHECKSUM" struct {\n'
' enum {\n'
' "XCP_ADD_11" = 1,\n'
' "XCP_ADD_12" = 2,\n'
' "XCP_ADD_14" = 3,\n'
' "XCP_ADD_22" = 4,\n'
' "XCP_ADD_24" = 5,\n'
' "XCP_ADD_44" = 6,\n'
' "XCP_CRC_16" = 7,\n'
' "XCP_CRC_16_CITT" = 8,\n'
' "XCP_CRC_32" = 9,\n'
' "XCP_USER_DEFINED" = 255\n'
' };\n'
' taggedstruct {\n'
' "MAX_BLOCK_SIZE" ulong;\n'
' "EXTERNAL_FUNCTION" char[256]; /* Name of the Checksum.DLL */\n'
' };\n'
' };\n'
' (block "PAGE" struct {\n'
' uchar; /* PAGE_NUMBER */\n'
' enum {\n'
' "ECU_ACCESS_NOT_ALLOWED" = 0,\n'
' "ECU_ACCESS_WITHOUT_XCP_ONLY" = 1,\n'
' "ECU_ACCESS_WITH_XCP_ONLY" = 2,\n'
' "ECU_ACCESS_DONT_CARE" = 3\n'
' };\n'
' enum {\n'
' "XCP_READ_ACCESS_NOT_ALLOWED" = 0,\n'
' "XCP_READ_ACCESS_WITHOUT_ECU_ONLY" = 1,\n'
' "XCP_READ_ACCESS_WITH_ECU_ONLY" = 2,\n'
' "XCP_READ_ACCESS_DONT_CARE" = 3\n'
' };\n'
' enum {\n'
' "XCP_WRITE_ACCESS_NOT_ALLOWED" = 0,\n'
' "XCP_WRITE_ACCESS_WITHOUT_ECU_ONLY" = 1,\n'
' "XCP_WRITE_ACCESS_WITH_ECU_ONLY" = 2,\n'
' "XCP_WRITE_ACCESS_DONT_CARE" = 3\n'
' };\n'
' taggedstruct {\n'
' "INIT_SEGMENT" uchar; /* references segment that initialises this page */\n'
' };\n'
' })*;\n'
' (block "ADDRESS_MAPPING" struct {\n'
' ulong; /* source address */\n'
' ulong; /* destination address */\n'
' ulong; /* length */\n'
' })*;\n'
' "PGM_VERIFY" ulong; /* verification value for PGM */\n'
' };\n'
' };\n'
'\n'
' taggedstruct Common_Parameters {\n'
' block "PROTOCOL_LAYER" struct Protocol_Layer;\n'
' block "SEGMENT" struct Segment;\n'
' block "DAQ" struct Daq;\n'
' block "PAG" struct Pag;\n'
' block "PGM" struct Pgm;\n'
' block "DAQ_EVENT" taggedunion Daq_Event;\n'
' };\n'
'\n'
' struct UDP_IP_Parameters {\n'
' uint; /* XCP on UDP_IP version, currently 0x0100 */\n'
' uint; /* PORT */\n'
' taggedunion {\n'
' "HOST_NAME" char[256];\n'
' "ADDRESS" char[15];\n'
' };\n'
' };\n'
'\n'
' block "IF_DATA" taggedunion if_data {\n'
' "XCP" struct {\n'
' taggedstruct Common_Parameters; /* default parameters */\n'
' taggedstruct {\n'
' block "XCP_ON_UDP_IP" struct {\n'
' struct UDP_IP_Parameters; /* specific for UDP_IP */\n'
' taggedstruct Common_Parameters; /* overruling of default */\n'
' };\n'
' };\n'
' };\n'
' };\n'
' /end A2ML\n'
'\n'
' /begin MOD_COMMON ""\n'
' BYTE_ORDER MSB_LAST\n'
' ALIGNMENT_BYTE 1\n'
' ALIGNMENT_WORD 1\n'
' ALIGNMENT_LONG 1\n'
' ALIGNMENT_FLOAT32_IEEE 1\n'
' ALIGNMENT_FLOAT64_IEEE 1\n'
' /end MOD_COMMON\n'
'\n'
' /begin IF_DATA XCP\n'
' /begin PROTOCOL_LAYER\n'
' 0x0100\n'
' 0x03E8\n'
' 0xC8\n'
' 0x00\n'
' 0x00\n'
' 0x00\n'
' 0x00\n'
' 0x00\n'
' 0x60\n'
' 0x12C\n'
' BYTE_ORDER_MSB_LAST\n'
' ADDRESS_GRANULARITY_BYTE\n'
' OPTIONAL_CMD ALLOC_ODT_ENTRY\n'
' OPTIONAL_CMD ALLOC_ODT\n'
' OPTIONAL_CMD ALLOC_DAQ\n'
' OPTIONAL_CMD FREE_DAQ\n'
' OPTIONAL_CMD GET_DAQ_RESOLUTION_INFO\n'
' OPTIONAL_CMD GET_DAQ_PROCESSOR_INFO\n'
' OPTIONAL_CMD START_STOP_SYNCH\n'
' OPTIONAL_CMD GET_DAQ_CLOCK\n'
' OPTIONAL_CMD START_STOP_DAQ_LIST\n'
' OPTIONAL_CMD GET_DAQ_LIST_MODE\n'
' OPTIONAL_CMD SET_DAQ_LIST_MODE\n'
' OPTIONAL_CMD WRITE_DAQ\n'
' OPTIONAL_CMD SET_DAQ_PTR\n'
' OPTIONAL_CMD CLEAR_DAQ_LIST\n'
' OPTIONAL_CMD SHORT_UPLOAD\n'
' OPTIONAL_CMD UPLOAD\n'
' OPTIONAL_CMD SET_MTA\n'
' OPTIONAL_CMD GET_ID\n'
' OPTIONAL_CMD GET_COMM_MODE_INFO\n'
' OPTIONAL_CMD BUILD_CHECKSUM\n'
' OPTIONAL_CMD SET_SEGMENT_MODE\n'
' OPTIONAL_CMD GET_SEGMENT_MODE\n'
' OPTIONAL_CMD SET_REQUEST\n'
' OPTIONAL_CMD GET_SEED\n'
' OPTIONAL_CMD UNLOCK\n'
' OPTIONAL_CMD COPY_CAL_PAGE\n'
' SEED_AND_KEY_EXTERNAL_FUNCTION Seed_Key.dll '
'/* Add name of seed & key dll file here.*/\n'
' /end PROTOCOL_LAYER\n'
'\n'
' /begin DAQ\n'
' DYNAMIC\n'
' 0x00\n'
' 0x06\n'
' 0x00\n'
' OPTIMISATION_TYPE_DEFAULT\n'
' ADDRESS_EXTENSION_FREE\n'
' IDENTIFICATION_FIELD_TYPE_ABSOLUTE\n'
' GRANULARITY_ODT_ENTRY_SIZE_DAQ_BYTE\n'
' 0x60\n'
' NO_OVERLOAD_INDICATION\n'
' /begin TIMESTAMP_SUPPORTED\n'
' 0x01\n'
' SIZE_DWORD\n'
' UNIT_100US\n'
' TIMESTAMP_FIXED\n'
' /end TIMESTAMP_SUPPORTED\n'
'\n'
'$events\n'
' /end DAQ\n'
'\n'
' /begin PAG\n'
' 0x01\n'
' FREEZE_SUPPORTED\n'
' /end PAG\n'
' /begin PGM\n'
' PGM_MODE_ABSOLUTE\n'
' 0x01\n'
' 0x00\n'
' /begin SECTOR\n'
' "Sector"\n'
' 0x00\n'
' 0x22014\n'
' 0x38\n'
' 0x00\n'
' 0x00\n'
' 0x00\n'
' /end SECTOR\n'
' /end PGM\n'
'\n'
' /begin XCP_ON_UDP_IP\n'
' 0x0100\n'
' $ip_port\n'
' ADDRESS "$ip_address"\n'
' /end XCP_ON_UDP_IP\n'
' /end IF_DATA\n'
'\n'
' /begin MOD_PAR ""\n'
' /begin MEMORY_SEGMENT\n'
' caldata "caldata"\n'
' DATA FLASH INTERN 0x00000 0x0000 -1 -1 -1 -1 -1\n'
' /begin IF_DATA XCP\n'
' /begin SEGMENT\n'
' 0x00\n'
' 0x02\n'
' 0x00\n'
' 0x00\n'
' 0x00\n'
' /begin CHECKSUM\n'
' XCP_CRC_16_CITT\n'
' /end CHECKSUM\n'
' /begin PAGE\n'
' 0x00\n'
' ECU_ACCESS_WITH_XCP_ONLY\n'
' XCP_READ_ACCESS_WITH_ECU_ONLY\n'
' XCP_WRITE_ACCESS_NOT_ALLOWED\n'
' INIT_SEGMENT 0x00\n'
' /end PAGE\n'
' /begin PAGE\n'
' 0x01\n'
' ECU_ACCESS_WITH_XCP_ONLY\n'
' XCP_READ_ACCESS_WITH_ECU_ONLY\n'
' XCP_WRITE_ACCESS_WITH_ECU_ONLY\n'
' INIT_SEGMENT 0x00\n'
' /end PAGE\n'
' /end SEGMENT\n'
' /end IF_DATA\n'
' /end MEMORY_SEGMENT\n'
' /begin MEMORY_SEGMENT\n'
' text ".text"\n'
' CODE ROM INTERN 0x26030 0x72c10 -1 -1 -1 -1 -1\n'
' /begin IF_DATA XCP\n'
' /begin SEGMENT\n'
' 0x01\n'
' 0x01\n'
' 0x00\n'
' 0x00\n'
' 0x00\n'
' /begin CHECKSUM\n'
' XCP_CRC_16_CITT\n'
' /end CHECKSUM\n'
' /begin PAGE\n'
' 0x01\n'
' ECU_ACCESS_WITH_XCP_ONLY\n'
' XCP_READ_ACCESS_WITH_ECU_ONLY\n'
' XCP_WRITE_ACCESS_WITH_ECU_ONLY\n'
' INIT_SEGMENT 0x00\n'
' /end PAGE\n'
' /end SEGMENT\n'
' /end IF_DATA\n'
' /end MEMORY_SEGMENT\n'
' /end MOD_PAR\n'
'\n'
' /begin COMPU_METHOD BitSlice.CONVERSION ""\n'
' RAT_FUNC "%6.2f" ""\n'
' COEFFS 0 1 0 0 0 1\n'
' /end COMPU_METHOD\n'
'\n'
' /* SPM declarations start. */\n'
'\n'
'$spm_a2l\n'
'\n'
' /* SPM declarations end. */\n'
'\n'
' /begin RECORD_LAYOUT __UBYTE_Z\n'
' FNC_VALUES 1 UBYTE ROW_DIR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT __UWORD_Z\n'
' FNC_VALUES 1 UWORD ROW_DIR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT __ULONG_Z\n'
' FNC_VALUES 1 ULONG ROW_DIR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT __SBYTE_Z\n'
' FNC_VALUES 1 SBYTE ROW_DIR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT __SWORD_Z\n'
' FNC_VALUES 1 SWORD ROW_DIR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT __SLONG_Z\n'
' FNC_VALUES 1 SLONG ROW_DIR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT __FLOAT32_IEEE_Z\n'
' FNC_VALUES 1 FLOAT32_IEEE ROW_DIR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT __FLOAT64_IEEE_Z\n'
' FNC_VALUES 1 FLOAT64_IEEE ROW_DIR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT __UBYTE_S\n'
' FNC_VALUES 1 UBYTE COLUMN_DIR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT __UWORD_S\n'
' FNC_VALUES 1 UWORD COLUMN_DIR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT __ULONG_S\n'
' FNC_VALUES 1 ULONG COLUMN_DIR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT __SBYTE_S\n'
' FNC_VALUES 1 SBYTE COLUMN_DIR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT __SWORD_S\n'
' FNC_VALUES 1 SWORD COLUMN_DIR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT __SLONG_S\n'
' FNC_VALUES 1 SLONG COLUMN_DIR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT __FLOAT32_IEEE_S\n'
' FNC_VALUES 1 FLOAT32_IEEE COLUMN_DIR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT __FLOAT64_IEEE_S\n'
' FNC_VALUES 1 FLOAT64_IEEE COLUMN_DIR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT SSV__UBYTE_S\n'
' AXIS_PTS_X 1 UBYTE INDEX_INCR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT SSV__UWORD_S\n'
' AXIS_PTS_X 1 UWORD INDEX_INCR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT SSV__ULONG_S\n'
' AXIS_PTS_X 1 ULONG INDEX_INCR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT SSV__SBYTE_S\n'
' AXIS_PTS_X 1 SBYTE INDEX_INCR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT SSV__SWORD_S\n'
' AXIS_PTS_X 1 SWORD INDEX_INCR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT SSV__SLONG_S\n'
' AXIS_PTS_X 1 SLONG INDEX_INCR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT SSV__FLOAT32_IEEE_S\n'
' AXIS_PTS_X 1 FLOAT32_IEEE INDEX_INCR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /begin RECORD_LAYOUT SSV__FLOAT64_IEEE_S\n'
' AXIS_PTS_X 1 FLOAT64_IEEE INDEX_INCR DIRECT\n'
' /end RECORD_LAYOUT\n'
'\n'
' /end MODULE\n'
'/end PROJECT\n'
)
def render(self):
"""Render the complete A2L.
Returns:
a2l (str): The A2L for the project.
"""
event_a2l = ''
for event in self.events:
event_a2l += self.event_template.substitute(
name=event['name'],
short_name=event['name'],
channel_id=event['channel_id'],
time_cycle=event['time_cycle'],
time_unit=event['time_unit']
)
return self.template.substitute(
events=event_a2l,
asap2_version=self.asap2_version,
project_name=self.project_name.replace('-', '_'),
ip_address=self.ip_address,
ip_port=self.ip_port,
spm_a2l=self.spm_a2l
)
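# Usage sketch (hypothetical arguments; in practice these come from the
# project's a2l config):
#
#     template = A2lProjectTemplate(spm_a2l, '1 71', 'my-project', events,
#                                   '169.254.4.10', '30000')
#     a2l_text = template.render()
#
# render() replaces '-' with '_' in the project name, so the PROJECT is
# emitted as 'my_project'.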
class A2lSilverTemplate:
"""Class for A2l Silver Template."""
def __init__(self, base_a2l):
"""Init."""
self.patched_a2l = self.basic_patching(base_a2l)
self.template = Template(
'/* QTronic Header */\n'
'ASAP2_VERSION 1 6\n'
'/begin PROJECT SPM ""\n'
'/begin MODULE SPM ""\n'
'\n'
'$base_a2l'
'\n'
'/end MODULE\n'
'/end PROJECT'
)
@staticmethod
def basic_patching(base_a2l):
"""Perform required basic patching.
Args:
base_a2l (str): Initial A2L content.
Returns:
(str): Patched A2L content.
"""
return "".join([char if ord(char) < 128 else "?" for char in base_a2l])
def render(self):
"""Render the complete A2L.
Returns:
(str): The patched A2L for Silver.
"""
if "ASAP2_VERSION" in self.patched_a2l[:999]:
return self.patched_a2l
return self.template.substitute(base_a2l=self.patched_a2l)
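# Usage sketch: A2lSilverTemplate(base_a2l).render() first replaces every
# non-ASCII character with '?', then wraps the content in a minimal
# PROJECT/MODULE skeleton unless 'ASAP2_VERSION' already appears near the
# top of the file.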

946
pybuild/build.py Normal file

@ -0,0 +1,946 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Python module used for building a Vcc SPM SW release.
This is the entry point to pybuild which includes all other needed modules.
Loads configuration files and sequences the code generation steps.
"""
import glob
import logging
import os
import shutil
import sys
import time
from os.path import join as pjoin
from pathlib import Path
from pybuild import __config_version__, __version__, build_defs
from pybuild.a2l_merge import A2lMerge
from pybuild.build_proj_config import BuildProjConfig
from pybuild.core import Core, HICore
from pybuild.core_dummy import CoreDummy
from pybuild.create_conversion_table import create_conversion_table
from pybuild.dids import DIDs, HIDIDs
from pybuild.dummy import DummyVar
from pybuild.dummy_spm import DummySpm
from pybuild.ext_dbg import ExtDbg
from pybuild.ext_var import ExtVarCsv, ExtVarYaml
from pybuild.feature_configs import FeatureConfigs
from pybuild.lib.helper_functions import get_repo_root, merge_dicts
from pybuild.memory_section import MemorySection
from pybuild.nvm_def import NVMDef
from pybuild.problem_logger import ProblemLogger
from pybuild.replace_compu_tab_ref import replace_tab_verb
from pybuild.sched_funcs import SchedFuncs
from pybuild.signal_if_html_rep import SigIfHtmlReport
from pybuild.signal_incons_html_rep import SigConsHtmlReport
from pybuild.signal_interfaces import CsvSignalInterfaces, YamlSignalInterfaces
from pybuild.unit_configs import CodeGenerators, UnitConfigs
from pybuild.user_defined_types import UserDefinedTypes
from pybuild.zone_controller.calibration import ZoneControllerCalibration
from pybuild.zone_controller.composition_yaml import CompositionYaml
LOG = logging.getLogger()
REPO_ROOT = get_repo_root()
def setup_logging(log_dst_dir, problem_logger, debug=True, quiet=False):
"""Set up the python logger for the build environment.
Three logger streams are set up: one that logs to stdout (info level),
one that logs to the build.log file (info level), and finally one that
logs to build_dbg.log (debug level). The log files are put in the
directory configured in the config file.
Args:
log_dst_dir (str): the path to where the log file should be stored
problem_logger (obj): the ProblemLogger object to initialise
debug (bool): True - if debug log shall be generated
quiet (bool): False - disable logging to stdout
"""
LOG.setLevel(logging.DEBUG)
LOG.handlers = [] # Remove all previous loggers
logging.captureWarnings(True)
# Setup debug build logger
log_file = pjoin(log_dst_dir, "build_dbg.log")
if debug:
dbg = logging.FileHandler(log_file)
dbg.setLevel(logging.DEBUG)
dbg_formatter = logging.Formatter(
"%(asctime)s - %(module)s."
"%(funcName)s [%(lineno)d]"
" - %(levelname)s - %(message)s"
)
dbg.setFormatter(dbg_formatter)
LOG.addHandler(dbg)
# Setup normal build logger
log_file = pjoin(log_dst_dir, "build.log")
nrm = logging.FileHandler(log_file)
nrm.setLevel(logging.INFO)
build_log_frmt = logging.Formatter("%(asctime)s - %(levelname)s" " - %(message)s")
nrm.setFormatter(build_log_frmt)
LOG.addHandler(nrm)
if not quiet:
nrm_strm = logging.StreamHandler(sys.stdout)
nrm_strm.setLevel(logging.INFO)
nrm_strm.setFormatter(build_log_frmt)
LOG.addHandler(nrm_strm)
error_strm = logging.StreamHandler(sys.stderr)
error_strm.setLevel(logging.CRITICAL)
error_strm.setFormatter(build_log_frmt)
LOG.addHandler(error_strm)
problem_logger.init_logger(LOG)
def check_interfaces(build_cfg, signal_if):
"""Check the interfaces.
Checks interfaces in all configurations, and generates a html
report with the result of the checks.
Args:
build_cfg (BuildProjConfig): Build project class holding where
files should be stored
signal_if (SignalInterfaces): class holding signal interface information
"""
LOG.info("******************************************************")
LOG.info("Start check interface inconsistencies")
start_time = time.time()
report_dst_dir = build_cfg.get_reports_dst_dir()
signal_inconsistency_result = signal_if.check_config()
sig_report = SigConsHtmlReport(signal_inconsistency_result)
LOG.info(
"Finished check interface inconsistencies (in %4.2f s)",
time.time() - start_time,
)
start_time = time.time()
LOG.info("******************************************************")
LOG.info("Start generating interface inconsistencies html-report")
sig_report.generate_report_file(pjoin(report_dst_dir, "SigCheck.html"))
LOG.info(
"Finished - generating interface inconsistencies html-report (in %4.2f s)",
time.time() - start_time,
)
def interface_report(build_cfg, unit_cfg, signal_if):
"""Create report of signal interfaces.
Creates a report of interfaces in all configurations, and generates
a html-report with the result of the checks.
Args:
build_cfg (BuildProjConfig): Build project class holding where
files should be stored.
unit_cfg (UnitConfigs): Class holding all unit interfaces.
signal_if (SignalInterfaces): class holding signal interface information.
"""
LOG.info("******************************************************")
LOG.info("Start creating interface report")
start_time = time.time()
report_dst_dir = build_cfg.get_reports_dst_dir()
sig_if = SigIfHtmlReport(build_cfg, unit_cfg, signal_if)
sig_if.generate_report_file(pjoin(report_dst_dir, "SigIf.html"))
LOG.info(
"Finished - create interface report (in %4.2f s)", time.time() - start_time
)
def generate_dummy_spm(build_cfg, unit_cfg, feature_cfg, signal_if, udt):
"""Generate c-files that define unit output signals.
Args:
build_cfg (BuildProjConfig): Build project class holding where files should be stored.
unit_cfg (UnitConfigs): Aggregated unit configs class.
feature_cfg (FeatureConfigs): Used as a library to generate C-macros.
signal_if (SignalInterfaces): Class holding signal interface information.
udt (UserDefinedTypes): Class holding user defined data types.
"""
def _add_undefined_signals_from_unit(data):
"""Add undefined signals from unit.
Includes included configs.
Arguments:
data (dict): Data for the unit
"""
for signal_name, inport_attributes in data.get("inports", {}).items():
if signal_name in defined_ports:
continue
if (
feature_cfg.check_if_active_in_config(inport_attributes["configs"])
or not build_cfg.allow_undefined_unused
):
defined_ports.append(signal_name)
undefined_outports.append(inport_attributes)
for include_unit in data.get("includes", []):
include_data = unit_cfg.get_unit_config(include_unit)
_add_undefined_signals_from_unit(include_data)
LOG.info("******************************************************")
LOG.info("Start generating output vars")
start_time = time.time()
dst_dir = build_cfg.get_src_code_dst_dir()
# Get out ports for all units in project regardless of code switches
unit_vars = unit_cfg.get_per_unit_cfg_total()
undefined_outports = []
defined_ports = []
for data in unit_vars.values():
for outport_name, outport_data in data.get("outports", {}).items():
if not feature_cfg.check_if_active_in_config(outport_data["configs"]):
LOG.debug("Outport %s not active in current project", outport_name)
elif outport_name not in defined_ports:
if not outport_data.get("class").startswith("CVC_EXT"):
defined_ports.append(outport_name)
defined_ports.extend(signal_if.get_externally_defined_ports())
for data in unit_vars.values():
_add_undefined_signals_from_unit(data)
dummy_spm = DummySpm(
undefined_outports, build_cfg, feature_cfg, unit_cfg, udt, "VcDummy_spm"
)
dummy_spm.generate_files(dst_dir)
LOG.info("Finished generating output vars (in %4.2f s)", time.time() - start_time)
def generate_did_files(build_cfg, unit_cfg):
"""Generate DIDAPI definition files.
Args:
build_cfg (BuildProjConfig): Build project class holding where
files should be stored
unit_cfg (UnitConfigs): class holding units definitions,
and which units to include
"""
start_time = time.time()
did_defs_files = pjoin(build_cfg.get_src_code_dst_dir(), "VcDIDDefinition")
car_com_file = build_cfg.get_car_com_dst()
LOG.info("******************************************************")
LOG.info("Start generating %s.c&h", did_defs_files)
dids = DIDs(build_cfg, unit_cfg)
dids.gen_did_def_files(did_defs_files)
LOG.info(
"Finished generating %s.c&h (in %4.2f s)",
did_defs_files,
time.time() - start_time,
)
LOG.info("******************************************************")
LOG.info("Start generating %s", car_com_file)
start_time = time.time()
dids.gen_did_carcom_extract(car_com_file)
LOG.info(
"Finished generating %s (in %4.2f s)", car_com_file, time.time() - start_time
)
LOG.info("******************************************************")
def generate_core_dummy(build_cfg, core, unit_cfg):
"""Generate the Core dummy files.
The core dummy creates RTE dummy functions,
and Id variables for enabling testing of VCC Software. If this dummy is not
included, it is not possible to build the project until the supplier deliver
an updated diagnostic core SW.
Note:
These dummy files shall not be delivered to the supplier!
Args:
build_cfg (BuildProjConfig): Build project class holding where
files should be stored
core (Core): class holding core configuration information
unit_cfg (UnitConfigs): class holding units definitions, and which units to include
"""
core_dummy_fname = build_cfg.get_core_dummy_name()
LOG.info("******************************************************")
LOG.info("Start generating Core Dummy - %s", core_dummy_fname)
start_time = time.time()
core_dummy = CoreDummy(core.get_current_core_config(), unit_cfg)
ecu_supplier = build_cfg.get_ecu_info()[0]
if ecu_supplier == "Denso":
core_dummy.generate_dg2_core_dummy_files(core_dummy_fname)
elif ecu_supplier == "RB":
core_dummy.generate_rb_core_dummy_files(core_dummy_fname)
elif ecu_supplier == "CSP":
core_dummy.generate_csp_core_dummy_files(core_dummy_fname)
else:
msg = f"Could not generate VcCoreDummy, cannot identify the supplier {ecu_supplier}."
LOG.critical(msg)
raise ValueError(msg)
LOG.info("Finished generating Core Dummy (in %4.2f s)", time.time() - start_time)
def generate_ext_var(build_cfg, unit_cfg, signal_if, udt, debug_code=True):
"""Generate two c-files that define the signal interface to the supplier.
The VcExtVar function assigns all variables to the CVC_DISP memory area,
while the variables in ExtVarSafe.c are allocated to the CVC_DISP_ASIL_B memory area.
Note that this function only declares the variables
in the supplier interface, which the supplier writes to. All other
variables shall be declared by the function which writes to the variable.
Note that dummy functionality/variables should be created in the function
to keep the signalling interface consistent!
Args:
build_cfg (BuildProjConfig): Build project class holding where files should be stored.
unit_cfg (UnitConfigs): class holding units definitions, and which units to include.
signal_if (SignalInterfaces): class holding signal interface information.
udt (UserDefinedTypes): Class holding user defined data types.
debug_code (boolean): If true, generate debug code.
"""
LOG.info("******************************************************")
LOG.info("Start generating VcExtVar and VcDebug")
start_time = time.time()
ecu_supplier = build_cfg.get_ecu_info()[0]
asil_level_dep = build_defs.ASIL_D if ecu_supplier == "HI" else build_defs.ASIL_B
asil_level_db = (
build_defs.CVC_ASIL_D if ecu_supplier == "HI" else build_defs.CVC_ASIL_B
)
nrm_dict, dep_dict, sec_dict, dbg_dict = signal_if.get_external_io()
_extract_external_var(
build_cfg, unit_cfg, udt, asil_level_dep, nrm_dict, dep_dict, sec_dict
)
if debug_code:
_extract_debug(build_cfg, unit_cfg, asil_level_db, dep_dict, dbg_dict)
LOG.info(
"Finished generating VcExtVar and VcDebug (in %4.2f s)",
time.time() - start_time,
)
def _extract_external_var(
build_cfg, unit_cfg, udt, asil_level_dep, nrm_dict, dep_dict, sec_dict
):
if build_cfg.has_yaml_interface:
ext_var_nrm = ExtVarYaml(nrm_dict, build_cfg, unit_cfg, udt)
ext_var_dep = ExtVarYaml(dep_dict, build_cfg, unit_cfg, udt, asil_level_dep)
else:
ext_var_nrm = ExtVarCsv(nrm_dict, build_cfg, unit_cfg, udt)
ext_var_dep = ExtVarCsv(dep_dict, build_cfg, unit_cfg, udt, asil_level_dep)
ext_var_sec = ExtVarCsv(sec_dict, build_cfg, unit_cfg, udt, build_defs.SECURE)
ext_var_instances = {
ext_var_nrm: "VcExtVar",
ext_var_dep: "VcExtVarSafe",
}
if not build_cfg.has_yaml_interface:
ext_var_instances[ext_var_sec] = "VcExtVarSecure"
for instance, dir_name in ext_var_instances.items():
ext_var_path = Path(build_cfg.get_src_code_dst_dir(), dir_name)
instance.generate_files(ext_var_path)
def _extract_debug(build_cfg, unit_cfg, asil_level_db, dep_dict, dbg_dict):
dbg_instances = {
ExtDbg(dbg_dict, build_cfg, unit_cfg): ("VcDebug", "VcDebugOutput"),
ExtDbg(dep_dict, build_cfg, unit_cfg, asil_level_db): ("VcDebugSafe", "VcDebugOutputSafe")
}
for instance, dir_names in dbg_instances.items():
instance.gen_dbg_files(
pjoin(build_cfg.get_src_code_dst_dir(), dir_names[0]),
pjoin(build_cfg.get_src_code_dst_dir(), dir_names[1]),
)
def generate_dummy_var(build_cfg, unit_cfg, signal_if, udt):
"""Generate c-file that define the missing signals.
Args:
build_cfg (BuildProjConfig): Build project class holding where files should be stored.
unit_cfg (UnitConfigs) : Aggregated unit configs class.
signal_if (SignalInterfaces): class holding signal interface information.
udt (UserDefinedTypes): Class holding user defined data types.
"""
LOG.info("******************************************************")
LOG.info("Start generating VcDummy")
start_time = time.time()
dst_dir = build_cfg.get_src_code_dst_dir()
nrm_dict, dep_dict, sec_dict, _ = signal_if.get_external_io()
nrm_dict = merge_dicts(nrm_dict, dep_dict, merge_recursively=True)
nrm_dict = merge_dicts(nrm_dict, sec_dict, merge_recursively=True)
res_dict = signal_if.check_config()
dummy = DummyVar(unit_cfg, nrm_dict, res_dict, build_cfg, udt)
dummy.generate_file(pjoin(dst_dir, "VcDummy"))
LOG.info("Finished generating VcDummy (in %4.2f s)", time.time() - start_time)
def generate_nvm_def(build_cfg, unit_cfg, no_nvm_a2l, use_prefix=False):
"""Generate the c&h-files which declares the NVM-ram.
The NVM-ram is declared in a struct per datatype length, in order
to provide a defined order in RAM/FLASH/EEPROM for the tester
communication service. Furthermore, #defines for the variables are
created to minimize the model changes needed to access the memory.
Optionally, also patch the defined functions with the SWC name as prefix.
Args:
build_cfg (BuildProjConfig): Build project class holding where files should be stored.
unit_cfg (UnitConfigs): class holding units definitions, and which units to include.
no_nvm_a2l (bool): Do not generate A2L for NVM structs.
use_prefix (bool): Patch the nvm source file definitions with the SWC name as prefix.
"""
LOG.info("******************************************************")
LOG.info("Start generating NVMDefinitions")
start_time = time.time()
tot_vars_nvm = unit_cfg.get_per_cfg_unit_cfg().get("nvm", {})
nvm_def = NVMDef(build_cfg, unit_cfg, tot_vars_nvm)
nvm_def.generate_nvm_config_files(no_nvm_a2l, use_prefix)
LOG.info(
"Finished generating NVMDefinitions (in %4.2f s)", time.time() - start_time
)
def copy_unit_src_to_src_out(build_cfg):
"""Copy unit source code to delivery folder.
Function to copy all relevant .c, .h and .a2l files to the src
delivery folder (defined in the config file for the project),
from the units that are included in the project.
Args:
build_cfg (BuildProjConfig): Build project class holding where
files should be stored
"""
LOG.info("******************************************************")
LOG.info("Start copying unit source files")
start_time = time.time()
src_dirs = build_cfg.get_unit_src_dirs()
src_dst_dir = build_cfg.get_src_code_dst_dir()
files = []
for src_dir in src_dirs.values():
files.extend(glob.glob(pjoin(src_dir, "*.c")))
files.extend(glob.glob(pjoin(src_dir, "*.cpp")))
files.extend(glob.glob(pjoin(src_dir, "*.h")))
for file_ in files:
shutil.copy2(file_, src_dst_dir)
LOG.debug("copied %s to %s", file_, src_dst_dir)
LOG.info(
"Finished copying unit source files (in %4.2f s)", time.time() - start_time
)
def copy_common_src_to_src_out(build_cfg, patch=False):
"""Copy source code to delivery folder.
Function to copy all relevant .c and .h files to the src
delivery folder (defined in the config file for the project),
from the units that are included in the project.
Optionally, also patch the defined functions with the SWC name as prefix.
Args:
build_cfg (BuildProjConfig): Build project class holding where files should be stored.
patch (bool): Patch the common source file functions with the SWC name as prefix.
"""
LOG.info("******************************************************")
if patch:
LOG.info("Start copying and patching common source files")
else:
LOG.info("Start copying common source files")
start_time = time.time()
src_dir = build_cfg.get_common_src_dir()
src_dst_dir = build_cfg.get_src_code_dst_dir()
included_common_files = build_cfg.get_included_common_files()
files = [
pjoin(src_dir, common_file + ".c") for common_file in included_common_files
]
files.extend(
[pjoin(src_dir, common_file + ".h") for common_file in included_common_files]
)
files_to_copy = filter(os.path.isfile, files)
for file_ in files_to_copy:
if patch:
patch_and_copy_common_src_to_src_out(build_cfg.get_swc_name(), Path(file_), Path(src_dst_dir))
else:
shutil.copy2(file_, src_dst_dir)
LOG.debug("copied %s to %s", file_, src_dst_dir)
LOG.info(
"Finished copying common source files (in %4.2f s)", time.time() - start_time
)
def patch_and_copy_common_src_to_src_out(swc_name, file_path, dest_dir):
"""Copy common source code to delivery folder, patched with SWC prefix.
Args:
swc_name (str): Software component name to use as prefix.
file_path (Path): Path to file to patch and copy.
dest_dir (Path): Destination directory for the patched file.
"""
with file_path.open(mode="r", encoding="utf-8") as file_handle:
content = file_handle.read()
new_function = f"{swc_name}_{file_path.stem}"
new_content = content.replace(f"{file_path.stem}(", f"{new_function}(")
new_content_lines = new_content.splitlines()
if file_path.suffix == ".h":
defines_index = next(
(idx for idx, line in enumerate(new_content_lines) if "DEFINES" in line and "(OPT)" not in line),
None
)
if defines_index is not None:
# Insert at defines_index + 2, +1 to get to the next line and +1 to get after multiline comment
new_content_lines.insert(defines_index + 2, f"#define {file_path.stem} {new_function}")
with Path(dest_dir, file_path.name).open(mode="w", encoding="utf-8") as file_handle:
file_handle.write("\n".join(new_content_lines))
def copy_files_to_include(build_cfg):
"""Copy source code to delivery folder.
Function to copy all extra include files to the src delivery folder
(defined in the config file for the project),
from the units that are included in the project.
Args:
build_cfg (BuildProjConfig): Build project class holding where
files should be stored
"""
LOG.info("******************************************************")
LOG.info("Start copying extra included source files")
start_time = time.time()
include_paths = build_cfg.get_includes_paths()
src_dst_dir = build_cfg.get_src_code_dst_dir()
files = []
for include_path in include_paths:
if os.path.isdir(include_path):
files.extend(
[
Path(include_path, file_name)
for file_name in Path(include_path).iterdir()
if not file_name.is_dir()
]
)
elif os.path.isfile(include_path):
files.append(include_path)
else:
LOG.critical("File or directory %s not found", include_path)
for file_ in files:
shutil.copy2(file_, src_dst_dir)
LOG.debug("copied %s to %s", file_, src_dst_dir)
LOG.info(
"Finished copying extra included files (in %4.2f s)", time.time() - start_time
)
def copy_unit_cfgs_to_output(build_cfg):
"""Copy all relevant unit configuration files to delivery folder.
Function to copy all relevant unit config .json files to the UnitCfg
delivery folder (defined in the config file for the project),
from the units that are included in the project.
Args:
build_cfg (BuildProjConfig): Build project class holding where
files should be stored
"""
cfg_dst_dir = build_cfg.get_unit_cfg_deliv_dir()
if cfg_dst_dir is not None:
LOG.info("******************************************************")
LOG.info("Start copying the unit config files")
start_time = time.time()
cfg_dirs = build_cfg.get_unit_cfg_dirs()
files = []
for cfg_dir in cfg_dirs.values():
files.extend(glob.glob(pjoin(cfg_dir, "*.json")))
for file_ in files:
shutil.copy2(file_, cfg_dst_dir)
LOG.debug("copied %s to %s", file_, cfg_dst_dir)
LOG.info(
"Finished copying the unit config files (in %4.2f s)",
time.time() - start_time,
)
def merge_a2l_files(build_cfg, unit_cfg, complete_a2l=False, silver_a2l=False):
"""Merge a2l-files.
Args:
build_cfg (BuildProjConfig): Build project class holding where
files should be stored
unit_cfg (UnitConfigs): class holding units definitions,
and which units to include
complete_a2l (bool): Generate a complete A2L project around the merged content.
silver_a2l (bool): Patch the merged A2L for Silver.
Returns:
a2l (A2lMerge): A2lMerge class holding the merged a2l data.
"""
LOG.info("******************************************************")
LOG.info("Start merging A2L-files")
start_time = time.time()
src_dirs = build_cfg.get_unit_src_dirs()
src_dst_dir = build_cfg.get_src_code_dst_dir()
a2l_files_unit = []
for src_dir in src_dirs.values():
a2l_files_unit.extend(glob.glob(pjoin(src_dir, "*.a2l")))
a2l_files_gen = glob.glob(pjoin(src_dst_dir, "*.a2l"))
a2l = A2lMerge(build_cfg, unit_cfg, a2l_files_unit, a2l_files_gen)
new_a2l_file = os.path.join(src_dst_dir, build_cfg.get_a2l_name())
# LOG.debug(new_a2l_file)
a2l.merge(new_a2l_file, complete_a2l, silver_a2l)
LOG.info("Finished merging A2L-files (in %4.2f s)", time.time() - start_time)
return a2l
def find_all_project_configs(prj_cfg_file):
"""Find all Project config files."""
prj_root_dir, _ = os.path.split(prj_cfg_file)
prj_root_dir = os.path.abspath(prj_root_dir)
all_cfg_path = os.path.join(prj_root_dir, "..", "*", "ProjectCfg.json")
return glob.glob(all_cfg_path, recursive=True)
def propagate_tag_name(build_cfg, tag_name, problem_logger):
"""Set tag name in relevant files, for release builds.
Args:
build_cfg (BuildProjConfig): Build project class holding where files should be stored.
tag_name (str): git tag name.
problem_logger (object): logger for pybuild.
"""
LOG.info("******************************************************")
LOG.info("Propagating tag name: %s", tag_name)
start_time = time.time()
src_dst_dir = build_cfg.get_src_code_dst_dir()
h_file_path = os.path.join(src_dst_dir, "vcc_sp_version.h")
if not os.path.isfile(h_file_path):
problem_logger.critical("Missing %s", h_file_path)
return
with open(h_file_path, "r+", encoding="utf-8") as file_handle:
contents = file_handle.read()
file_handle.seek(0)
new_contents = contents.replace(
'#define VCC_SOFTWARE_NAME "tagname"',
f'#define VCC_SOFTWARE_NAME "{tag_name}"',
)
file_handle.write(new_contents)
LOG.info("Finished propagating tag name (in %4.2f s)", time.time() - start_time)
def add_args(parser):
"""Add command line arguments for pybuild.
This is useful when pybuild should be run through a command line wrapper function.
Args:
parser (argparse.ArgumentParser): Parser instance to add arguments to.
"""
pybuild_parser = parser.add_argument_group("pybuild arguments")
pybuild_parser.add_argument(
"--project-config", required=True, help="Project root configuration file"
)
pybuild_parser.add_argument(
"--generate-system-info", action="store_true", help="Generate AllSystemInfo.mat"
)
pybuild_parser.add_argument(
"--generate-custom-conv-tab",
default=None,
help="Path to conversion table file. Useful for TargetLink enums in A2L file.",
)
pybuild_parser.add_argument(
"--core-dummy",
action="store_true",
help="Generate core dummy code to enable integration with old supplier code",
)
pybuild_parser.add_argument(
"--debug", action="store_true", help="Activate the debug log"
)
pybuild_parser.add_argument(
"--no-abort", action="store_true", help="Do not abort due to errors"
)
pybuild_parser.add_argument(
"--no-nvm-a2l",
action="store_true",
help="Do not generate a2l file for NVM structs",
)
pybuild_parser.add_argument(
"--complete-a2l", action="store_true", help="Generate A2L with project info"
)
pybuild_parser.add_argument(
"--silver-a2l",
action="store_true",
help="Generate A2L file with Silver patching. Complete A2L argument takes precedence.",
)
pybuild_parser.add_argument(
"--interface", action="store_true", help="Generate interface report"
)
pybuild_parser.add_argument(
"--generate-rte-checkpoint-calls",
action="store_true",
help="Generate RTE function checkpoint calls",
)
pybuild_parser.add_argument(
"--version",
action="version",
version=f"%(prog)s {__version__}",
help="Display program version",
)
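def example_cli_wrapper():
    """Minimal sketch of a command line wrapper, illustrative only.

    Shows how add_args() plugs the pybuild arguments into an external
    parser and how the parsed namespace maps onto build(); the wrapper
    itself is an assumption, not part of the original module.
    """
    import argparse  # Local import: only this sketch needs it.
    parser = argparse.ArgumentParser(prog="pybuild-wrapper")
    add_args(parser)
    args = parser.parse_args()
    return build(
        args.project_config,
        interface=args.interface,
        core_dummy=args.core_dummy,
        no_abort=args.no_abort,
        no_nvm_a2l=args.no_nvm_a2l,
        debug=args.debug,
        generate_system_info=args.generate_system_info,
        generate_custom_conversion_table=args.generate_custom_conv_tab,
        complete_a2l=args.complete_a2l,
        silver_a2l=args.silver_a2l,
        generate_rte_checkpoint_calls=args.generate_rte_checkpoint_calls,
    )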
def build(
project_config,
interface=False,
core_dummy=False,
no_abort=False,
no_nvm_a2l=False,
debug=False,
quiet=False,
generate_system_info=False,
generate_custom_conversion_table=None,
complete_a2l=False,
silver_a2l=False,
generate_rte_checkpoint_calls=False,
):
"""Execute the build.
Args:
project_config (str): Project configuration file.
interface (bool): Generate interface report. Default=False.
core_dummy (bool): Generate core dummy code. Default=False.
no_abort (bool): Do not abort due to errors. Default=False.
no_nvm_a2l (bool): Do not generate A2L for NVM structs. Default=False.
debug (bool): Activate the debug log. Default=False.
quiet (bool): Disable logging to stdout. Default=False.
generate_system_info (bool): Generate AllSystemInfo.mat for DocGen compatibility. Default=False.
generate_custom_conversion_table (str): Path to conversion table file.
Useful for TargetLink enums in A2L file. Default=None.
complete_a2l (bool): Add an a2l header plus additional content such as XCP data.
silver_a2l (bool): Add an a2l header plus additional patching required for Silver.
generate_rte_checkpoint_calls (bool): Generate RTE function checkpoint calls.
"""
try:
problem_logger = ProblemLogger()
tot_start_time = time.time()
project_config_files = find_all_project_configs(project_config)
prj_cfgs = {}
# Note: constructing BuildProjConfig validates each sibling project's
# config file version; the parsed configs are then discarded and
# prj_cfgs is rebuilt below for the project actually being built.
for project_config_file in project_config_files:
conf = BuildProjConfig(os.path.normpath(project_config_file))
prj_cfgs.update({conf.name: conf})
prj_cfgs = {}
build_cfg = BuildProjConfig(os.path.normpath(project_config))
ecu_supplier = build_cfg.get_ecu_info()[0]
prj_cfgs.update({build_cfg.name: build_cfg})
build_cfg.create_build_dirs()
src_dst_dir = build_cfg.get_src_code_dst_dir()
setup_logging(build_cfg.get_log_dst_dir(), problem_logger, debug, quiet)
LOG.info("Starting build")
LOG.info("pybuild version is: %s", __version__)
LOG.info("Project/Model config file version is: %s", __config_version__)
LOG.info("Read SPM code switches")
start_time = time.time()
feature_cfg = FeatureConfigs(build_cfg)
LOG.info(
"Finished reading SPM code switches (in %4.2f s)", time.time() - start_time
)
LOG.info("******************************************************")
LOG.info("Start generating per project unit config data")
start_time = time.time()
unit_cfg = UnitConfigs(build_cfg, feature_cfg)
LOG.info(
"Finished generating per project unit config data (in %4.2f s)",
time.time() - start_time,
)
udt = UserDefinedTypes(build_cfg, unit_cfg)
start_time = time.time()
cnf_header = pjoin(src_dst_dir, build_cfg.get_feature_conf_header_name())
LOG.info("******************************************************")
LOG.info("Generate compiler switches header file %s", cnf_header)
feature_cfg.gen_unit_cfg_header_file(cnf_header)
LOG.info(
"Finished generating compiler switches header file (in %4.2f s)",
time.time() - start_time,
)
if build_cfg.has_yaml_interface:
signal_if = YamlSignalInterfaces(build_cfg, unit_cfg, feature_cfg, udt)
else:
signal_if = CsvSignalInterfaces(build_cfg, unit_cfg)
check_interfaces(build_cfg, signal_if)
if interface:
interface_report(build_cfg, unit_cfg, signal_if)
udt.generate_common_header_files()
generate_ext_var(build_cfg, unit_cfg, signal_if, udt)
if ecu_supplier in ["CSP", "HP", "HI", "ZC"]:
LOG.info("******************************************************")
LOG.info("Skip generating VcDummy file for %s projects", ecu_supplier)
else:
generate_dummy_var(build_cfg, unit_cfg, signal_if, udt)
custom_dummy_spm = build_cfg.get_use_custom_dummy_spm()
if custom_dummy_spm is not None:
LOG.info("******************************************************")
if os.path.isfile(custom_dummy_spm):
LOG.info("Copying custom dummy spm file (%s)", custom_dummy_spm)
shutil.copy2(custom_dummy_spm, build_cfg.get_src_code_dst_dir())
else:
LOG.warning(
"Cannot find desired custom dummy spm file: %s", custom_dummy_spm
)
generate_dummy_spm(build_cfg, unit_cfg, feature_cfg, signal_if, udt)
else:
generate_dummy_spm(build_cfg, unit_cfg, feature_cfg, signal_if, udt)
custom_sources = build_cfg.get_use_custom_sources()
if custom_sources is not None:
LOG.info("******************************************************")
for custom_src in custom_sources:
if os.path.isfile(custom_src):
LOG.info("Copying custom sourcefile (%s)", custom_src)
shutil.copy2(custom_src, build_cfg.get_src_code_dst_dir())
else:
LOG.warning("Cannot find desired custom sourcefile: %s", custom_src)
if ecu_supplier in ["HI"]:
LOG.info("******************************************************")
LOG.info("Generating Core header")
hi_core = HICore(build_cfg, unit_cfg)
hi_core.generate_dtc_files()
LOG.info("******************************************************")
LOG.info("Generating DID files")
dids = HIDIDs(build_cfg, unit_cfg)
dids.generate_did_files()
else:
generate_did_files(build_cfg, unit_cfg)
# generate core dummy files if requested
if core_dummy:
core_dummy_fname = os.path.basename(build_cfg.get_core_dummy_name())
if CodeGenerators.embedded_coder in unit_cfg.code_generators:
LOG.info("******************************************************")
LOG.info("Skip generating %s for EC projects", core_dummy_fname)
elif ecu_supplier in ["HI", "ZC", "CSP"]:
LOG.info("******************************************************")
LOG.info("Skip generating %s for SPA2+ projects", core_dummy_fname)
else:
core = Core(build_cfg, unit_cfg)
generate_core_dummy(build_cfg, core, unit_cfg)
# generate NVM definitions
if ecu_supplier in ["ZC"]:
generate_nvm_def(build_cfg, unit_cfg, no_nvm_a2l, True)
else:
generate_nvm_def(build_cfg, unit_cfg, no_nvm_a2l)
LOG.info("******************************************************")
LOG.info("Start generating the scheduling functions")
start_time = time.time()
gen_schd = SchedFuncs(build_cfg, unit_cfg)
gen_schd.generate_sched_c_fncs(generate_rte_checkpoint_calls)
LOG.info(
"Finished generating the scheduling functions (in %4.2f s)",
time.time() - start_time,
)
LOG.info("******************************************************")
LOG.info("Start generating the ts header file")
start_time = time.time()
gen_schd.generate_ts_defines(pjoin(src_dst_dir, build_cfg.get_ts_header_name()))
LOG.info(
"Finished generating ts header file (in %4.2f s)", time.time() - start_time
)
# Generate AllSystemInfo.mat for DocGen compatibility
if generate_system_info:
from pybuild.gen_allsysteminfo import GenAllSystemInfo
gen_all_system_info = GenAllSystemInfo(signal_if, unit_cfg)
gen_all_system_info.build()
# Check if errors
if not no_abort:
if problem_logger.errors():
nbr_err = problem_logger.get_nbr_problems()
problem_logger.info(
"Aborting build due to errors (# critical:%s, # warnings:%s"
" after %4.2f s.",
nbr_err["critical"],
nbr_err["warning"],
time.time() - tot_start_time,
)
return 1
# Copy files to output folder
copy_unit_src_to_src_out(build_cfg)
if ecu_supplier in ["ZC"]:
copy_common_src_to_src_out(build_cfg, True)
else:
copy_common_src_to_src_out(build_cfg)
copy_unit_cfgs_to_output(build_cfg)
copy_files_to_include(build_cfg)
if ecu_supplier in ["HI", "ZC"]:
memory_section = MemorySection(build_cfg)
memory_section.generate_required_header_files()
# Propagate tag name for release builds
# TAG_NAME is set in release -> release-compile-denso/release-ecmsildll -> pybuild.build
tag_name = os.environ.get("TAG_NAME", "")
if tag_name and ecu_supplier == "Denso":
propagate_tag_name(build_cfg, tag_name, problem_logger)
# Copy header files (subversion is using an external that points to
# the correct set of pragma section header files)
# Make A2L-file
if generate_custom_conversion_table is not None:
ctable_json = Path(generate_custom_conversion_table).resolve()
ctable_a2l = Path(build_cfg.get_src_code_dst_dir(), "custom_tabs.a2l")
create_conversion_table(ctable_json, ctable_a2l)
merged_a2l = merge_a2l_files(build_cfg, unit_cfg, complete_a2l, silver_a2l)
if ecu_supplier in ["ZC"]:
axis_data = merged_a2l.get_characteristic_axis_data()
composition_yaml = CompositionYaml(
build_cfg, signal_if.composition_spec, unit_cfg, axis_data
)
composition_yaml.generate_yaml()
zc_calibration = ZoneControllerCalibration(
build_cfg, composition_yaml.cal_class_info["tl"]
)
zc_calibration.generate_calibration_interface_files()
a2l_file_path = Path(build_cfg.get_src_code_dst_dir(), build_cfg.get_a2l_name())
replace_tab_verb(a2l_file_path)
if problem_logger.errors():
problem_logger.info(
"Critical errors were detected, aborting" " after %4.2f s.",
time.time() - tot_start_time,
)
return 1
LOG.info("Finished build in %4.2f s", time.time() - tot_start_time)
return 0
finally:
logging.shutdown()
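# Illustrative programmatic invocation (path and flags are examples):
#
#     rc = build("Projects/MyPrj/ProjectCfg.json", complete_a2l=True)
#     sys.exit(rc)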

309
pybuild/build_defs.py Normal file
View File

@@ -0,0 +1,309 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Defines header file names for different sections generated c-code."""
CVC_CODE_START = 'CVC_CODE_START.h' # Header file name for CVC code start
CVC_CODE_END = 'CVC_CODE_END.h' # Header file name for CVC code end
CVC_CODE_ASIL_A_START = 'CVC_CODE_ASIL_A_START.h' # Header file name for CVC ASIL A code start
CVC_CODE_ASIL_A_END = 'CVC_CODE_ASIL_A_END.h' # Header file name for CVC ASIL A code end
CVC_CODE_ASIL_B_START = 'CVC_CODE_ASIL_B_START.h' # Header file name for CVC ASIL B code start
CVC_CODE_ASIL_B_END = 'CVC_CODE_ASIL_B_END.h' # Header file name for CVC ASIL B code end
CVC_CODE_ASIL_C_START = 'CVC_CODE_ASIL_C_START.h' # Header file name for CVC ASIL C code start
CVC_CODE_ASIL_C_END = 'CVC_CODE_ASIL_C_END.h' # Header file name for CVC ASIL C code end
CVC_CODE_ASIL_D_START = 'CVC_CODE_ASIL_D_START.h' # Header file name for CVC ASIL D code start
CVC_CODE_ASIL_D_END = 'CVC_CODE_ASIL_D_END.h' # Header file name for CVC ASIL D code end
CVC_DISP_START = 'CVC_DISP_START.h' # Header file name for CVC measurable variables start
CVC_DISP_END = 'CVC_DISP_END.h' # Header file name for CVC measurable variables end
CVC_DISP_ASIL_A_START = 'CVC_DISP_ASIL_A_START.h' # Header file name for CVC ASIL A measurable variables start
CVC_DISP_ASIL_A_END = 'CVC_DISP_ASIL_A_END.h' # Header file name for CVC ASIL A measurable variables end
CVC_DISP_ASIL_B_START = 'CVC_DISP_ASIL_B_START.h' # Header file name for CVC ASIL B measurable variables start
CVC_DISP_ASIL_B_END = 'CVC_DISP_ASIL_B_END.h' # Header file name for CVC ASIL B measurable variables end
CVC_DISP_ASIL_C_START = 'CVC_DISP_ASIL_C_START.h' # Header file name for CVC ASIL C measurable variables start
CVC_DISP_ASIL_C_END = 'CVC_DISP_ASIL_C_END.h' # Header file name for CVC ASIL C measurable variables end
CVC_DISP_ASIL_D_START = 'CVC_DISP_ASIL_D_START.h' # Header file name for CVC ASIL D measurable variables start
CVC_DISP_ASIL_D_END = 'CVC_DISP_ASIL_D_END.h' # Header file name for CVC ASIL D measurable variables end
CVC_DISP_SEC_START = 'CVC_DISP_SEC_START.h' # Header file name for CVC secure measurable variables start
CVC_DISP_SEC_END = 'CVC_DISP_SEC_END.h' # Header file name for CVC secure measurable variables end
CVC_CAL_START = 'CVC_CAL_START.h' # Header file name for CVC calibration data start
CVC_CAL_END = 'CVC_CAL_END.h' # Header file name for CVC calibration data end
CVC_CAL_ASIL_A_START = 'CVC_CAL_ASIL_A_START.h' # Header file name for CVC ASIL A calibration data start
CVC_CAL_ASIL_A_END = 'CVC_CAL_ASIL_A_END.h' # Header file name for CVC ASIL A calibration data end
CVC_CAL_ASIL_B_START = 'CVC_CAL_ASIL_B_START.h' # Header file name for CVC ASIL B calibration data start
CVC_CAL_ASIL_B_END = 'CVC_CAL_ASIL_B_END.h' # Header file name for CVC ASIL B calibration data end
CVC_CAL_ASIL_C_START = 'CVC_CAL_ASIL_C_START.h' # Header file name for CVC ASIL C calibration data start
CVC_CAL_ASIL_C_END = 'CVC_CAL_ASIL_C_END.h' # Header file name for CVC ASIL C calibration data end
CVC_CAL_ASIL_D_START = 'CVC_CAL_ASIL_D_START.h' # Header file name for CVC ASIL D calibration data start
CVC_CAL_ASIL_D_END = 'CVC_CAL_ASIL_D_END.h' # Header file name for CVC ASIL D calibration data end
CVC_CONST_START = 'CVC_CONST_START.h' # Header file name for CVC constant data start
CVC_CONST_END = 'CVC_CONST_END.h' # Header file name for CVC constant data end
CVC_CONST_ASIL_A_START = 'CVC_CONST_ASIL_A_START.h' # Header file name for CVC ASIL A constant data start
CVC_CONST_ASIL_A_END = 'CVC_CONST_ASIL_A_END.h' # Header file name for CVC ASIL A constant data end
CVC_CONST_ASIL_B_START = 'CVC_CONST_ASIL_B_START.h' # Header file name for CVC ASIL B constant data start
CVC_CONST_ASIL_B_END = 'CVC_CONST_ASIL_B_END.h' # Header file name for CVC ASIL B constant data end
CVC_CONST_ASIL_C_START = 'CVC_CONST_ASIL_C_START.h' # Header file name for CVC ASIL C constant data start
CVC_CONST_ASIL_C_END = 'CVC_CONST_ASIL_C_END.h' # Header file name for CVC ASIL C constant data end
CVC_CONST_ASIL_D_START = 'CVC_CONST_ASIL_D_START.h' # Header file name for CVC ASIL D constant data start
CVC_CONST_ASIL_D_END = 'CVC_CONST_ASIL_D_END.h' # Header file name for CVC ASIL D constant data end
CVC_NVM_START = 'CVC_NVM_START.h' # Header file name for CVC Non Volatile Memory start
CVC_NVM_END = 'CVC_NVM_END.h' # Header file name for CVC Non Volatile Memory end
CVC_NVM_P_START = 'CVC_NVM_P_START.h' # Header file name for persistent Non Volatile Memory start
CVC_NVM_P_END = 'CVC_NVM_P_END.h' # Header file name for persistent Non Volatile Memory end
PREDECL_CODE_START = 'PREDECL_CODE_START.h' # Header file name for CVC code start
PREDECL_CODE_END = 'PREDECL_CODE_END.h' # Header file name for CVC code end
PREDECL_CODE_ASIL_A_START = 'PREDECL_CODE_ASIL_A_START.h' # Header file name for CVC ASIL A code start
PREDECL_CODE_ASIL_A_END = 'PREDECL_CODE_ASIL_A_END.h' # Header file name for CVC ASIL A code end
PREDECL_CODE_ASIL_B_START = 'PREDECL_CODE_ASIL_B_START.h' # Header file name for CVC ASIL B code start
PREDECL_CODE_ASIL_B_END = 'PREDECL_CODE_ASIL_B_END.h' # Header file name for CVC ASIL B code end
PREDECL_CODE_ASIL_C_START = 'PREDECL_CODE_ASIL_C_START.h' # Header file name for CVC ASIL C code start
PREDECL_CODE_ASIL_C_END = 'PREDECL_CODE_ASIL_C_END.h' # Header file name for CVC ASIL C code end
PREDECL_CODE_ASIL_D_START = 'PREDECL_CODE_ASIL_D_START.h' # Header file name for CVC ASIL D code start
PREDECL_CODE_ASIL_D_END = 'PREDECL_CODE_ASIL_D_END.h' # Header file name for CVC ASIL D code end
PREDECL_DISP_START = 'PREDECL_DISP_START.h' # Header file name for CVC measurable variables start
PREDECL_DISP_END = 'PREDECL_DISP_END.h' # Header file name for CVC measurable variables end
PREDECL_DISP_ASIL_A_START = 'PREDECL_DISP_ASIL_A_START.h' # Header file name for CVC ASIL A measurable variables start
PREDECL_DISP_ASIL_A_END = 'PREDECL_DISP_ASIL_A_END.h' # Header file name for CVC ASIL A measurable variables end
PREDECL_DISP_ASIL_B_START = 'PREDECL_DISP_ASIL_B_START.h' # Header file name for CVC ASIL B measurable variables start
PREDECL_DISP_ASIL_B_END = 'PREDECL_DISP_ASIL_B_END.h' # Header file name for CVC ASIL B measurable variables end
PREDECL_DISP_ASIL_C_START = 'PREDECL_DISP_ASIL_C_START.h' # Header file name for CVC ASIL C measurable variables start
PREDECL_DISP_ASIL_C_END = 'PREDECL_DISP_ASIL_C_END.h' # Header file name for CVC ASIL C measurable variables end
PREDECL_DISP_ASIL_D_START = 'PREDECL_DISP_ASIL_D_START.h' # Header file name for CVC ASIL D measurable variables start
PREDECL_DISP_ASIL_D_END = 'PREDECL_DISP_ASIL_D_END.h' # Header file name for CVC ASIL D measurable variables end
PREDECL_DISP_SEC_START = 'PREDECL_DISP_SEC_START.h' # Header file name for CVC secure measurable variables start
PREDECL_DISP_SEC_END = 'PREDECL_DISP_SEC_END.h' # Header file name for CVC secure measurable variables end
PREDECL_CAL_START = 'PREDECL_CAL_START.h' # Header file name for CVC calibration data start
PREDECL_CAL_END = 'PREDECL_CAL_END.h' # Header file name for CVC calibration data end
PREDECL_CAL_ASIL_A_START = 'PREDECL_CAL_ASIL_A_START.h' # Header file name for CVC ASIL A calibration data start
PREDECL_CAL_ASIL_A_END = 'PREDECL_CAL_ASIL_A_END.h' # Header file name for CVC ASIL A calibration data end
PREDECL_CAL_ASIL_B_START = 'PREDECL_CAL_ASIL_B_START.h' # Header file name for CVC ASIL B calibration data start
PREDECL_CAL_ASIL_B_END = 'PREDECL_CAL_ASIL_B_END.h' # Header file name for CVC ASIL B calibration data end
PREDECL_CAL_ASIL_C_START = 'PREDECL_CAL_ASIL_C_START.h' # Header file name for CVC ASIL C calibration data start
PREDECL_CAL_ASIL_C_END = 'PREDECL_CAL_ASIL_C_END.h' # Header file name for CVC ASIL C calibration data end
PREDECL_CAL_ASIL_D_START = 'PREDECL_CAL_ASIL_D_START.h' # Header file name for CVC ASIL D calibration data start
PREDECL_CAL_ASIL_D_END = 'PREDECL_CAL_ASIL_D_END.h' # Header file name for CVC ASIL D calibration data end
PREDECL_CAL_EXT_START = 'PREDECL_CAL_EXT_START.h' # Header file name for external CVC calibration data start
PREDECL_CAL_EXT_END = 'PREDECL_CAL_EXT_END.h' # Header file name for external CVC calibration data end
PREDECL_CAL_MERG_START = 'PREDECL_CAL_MERG_START.h' # Header file name for merged CVC calibration data start
PREDECL_CAL_MERG_END = 'PREDECL_CAL_MERG_END.h' # Header file name for merged CVC calibration data end
PREDECL_CONST_START = 'PREDECL_CONST_START.h' # Header file name for CVC constant data start
PREDECL_CONST_END = 'PREDECL_CONST_END.h' # Header file name for CVC constant data end
PREDECL_CONST_ASIL_A_START = 'PREDECL_CONST_ASIL_A_START.h' # Header file name for CVC ASIL A constant data start
PREDECL_CONST_ASIL_A_END = 'PREDECL_CONST_ASIL_A_END.h' # Header file name for CVC ASIL A constant data end
PREDECL_CONST_ASIL_B_START = 'PREDECL_CONST_ASIL_B_START.h' # Header file name for CVC ASIL B constant data start
PREDECL_CONST_ASIL_B_END = 'PREDECL_CONST_ASIL_B_END.h' # Header file name for CVC ASIL B constant data end
PREDECL_CONST_ASIL_C_START = 'PREDECL_CONST_ASIL_C_START.h' # Header file name for CVC ASIL C constant data start
PREDECL_CONST_ASIL_C_END = 'PREDECL_CONST_ASIL_C_END.h' # Header file name for CVC ASIL C constant data end
PREDECL_CONST_ASIL_D_START = 'PREDECL_CONST_ASIL_D_START.h' # Header file name for CVC ASIL D constant data start
PREDECL_CONST_ASIL_D_END = 'PREDECL_CONST_ASIL_D_END.h' # Header file name for CVC ASIL D constant data end
PREDECL_CONST_EXT_START = 'PREDECL_CONST_EXT_START.h' # Header file name for external CVC constant data start
PREDECL_CONST_EXT_END = 'PREDECL_CONST_EXT_END.h' # Header file name for external CVC constant data end
PREDECL_CONST_MERG_START = 'PREDECL_CONST_MERG_START.h' # Header file name for merged CVC constant data start
PREDECL_CONST_MERG_END = 'PREDECL_CONST_MERG_END.h' # Header file name for merged CVC constant data end
PREDECL_NVM_START = 'PREDECL_NVM_START.h' # Header file name for CVC Non Volatile Memory start
PREDECL_NVM_END = 'PREDECL_NVM_END.h' # Header file name for CVC Non Volatile Memory end
PREDECL_NVM_P_START = 'PREDECL_NVM_P_START.h' # Header file name for persistent Non Volatile Memory start
PREDECL_NVM_P_END = 'PREDECL_NVM_P_END.h' # Header file name for persistent Non Volatile Memory end
PREDECL_START = 'PREDECL_START.h'
PREDECL_END = 'PREDECL_END.h'
CVC_CODE_QM = {'START': CVC_CODE_START, 'END': CVC_CODE_END}
CVC_CODE_A = {'START': CVC_CODE_ASIL_A_START, 'END': CVC_CODE_ASIL_A_END}
CVC_CODE_B = {'START': CVC_CODE_ASIL_B_START, 'END': CVC_CODE_ASIL_B_END}
CVC_CODE_C = {'START': CVC_CODE_ASIL_C_START, 'END': CVC_CODE_ASIL_C_END}
CVC_CODE_D = {'START': CVC_CODE_ASIL_D_START, 'END': CVC_CODE_ASIL_D_END}
CVC_DISP_QM = {'START': CVC_DISP_START, 'END': CVC_DISP_END}
CVC_DISP_A = {'START': CVC_DISP_ASIL_A_START, 'END': CVC_DISP_ASIL_A_END}
CVC_DISP_B = {'START': CVC_DISP_ASIL_B_START, 'END': CVC_DISP_ASIL_B_END}
CVC_DISP_C = {'START': CVC_DISP_ASIL_C_START, 'END': CVC_DISP_ASIL_C_END}
CVC_DISP_D = {'START': CVC_DISP_ASIL_D_START, 'END': CVC_DISP_ASIL_D_END}
CVC_DISP_SECURE = {'START': CVC_DISP_SEC_START, 'END': CVC_DISP_SEC_END}
CVC_CAL_QM = {'START': CVC_CAL_START, 'END': CVC_CAL_END}
CVC_CAL_A = {'START': CVC_CAL_ASIL_A_START, 'END': CVC_CAL_ASIL_A_END}
CVC_CAL_B = {'START': CVC_CAL_ASIL_B_START, 'END': CVC_CAL_ASIL_B_END}
CVC_CAL_C = {'START': CVC_CAL_ASIL_C_START, 'END': CVC_CAL_ASIL_C_END}
CVC_CAL_D = {'START': CVC_CAL_ASIL_D_START, 'END': CVC_CAL_ASIL_D_END}
CVC_CONST_QM = {'START': CVC_CONST_START, 'END': CVC_CONST_END}
CVC_CONST_A = {'START': CVC_CONST_ASIL_A_START, 'END': CVC_CONST_ASIL_A_END}
CVC_CONST_B = {'START': CVC_CONST_ASIL_B_START, 'END': CVC_CONST_ASIL_B_END}
CVC_CONST_C = {'START': CVC_CONST_ASIL_C_START, 'END': CVC_CONST_ASIL_C_END}
CVC_CONST_D = {'START': CVC_CONST_ASIL_D_START, 'END': CVC_CONST_ASIL_D_END}
CVC_NVM = {'START': CVC_NVM_START, 'END': CVC_NVM_END}
CVC_NVM_P = {'START': CVC_NVM_P_START, 'END': CVC_NVM_P_END}
PREDECL = {'START': PREDECL_START, 'END': PREDECL_END}
PREDECL_CODE_QM = {'START': PREDECL_CODE_START, 'END': PREDECL_CODE_END}
PREDECL_CODE_A = {'START': PREDECL_CODE_ASIL_A_START, 'END': PREDECL_CODE_ASIL_A_END}
PREDECL_CODE_B = {'START': PREDECL_CODE_ASIL_B_START, 'END': PREDECL_CODE_ASIL_B_END}
PREDECL_CODE_C = {'START': PREDECL_CODE_ASIL_C_START, 'END': PREDECL_CODE_ASIL_C_END}
PREDECL_CODE_D = {'START': PREDECL_CODE_ASIL_D_START, 'END': PREDECL_CODE_ASIL_D_END}
PREDECL_DISP_QM = {'START': PREDECL_DISP_START, 'END': PREDECL_DISP_END}
PREDECL_DISP_A = {'START': PREDECL_DISP_ASIL_A_START, 'END': PREDECL_DISP_ASIL_A_END}
PREDECL_DISP_B = {'START': PREDECL_DISP_ASIL_B_START, 'END': PREDECL_DISP_ASIL_B_END}
PREDECL_DISP_C = {'START': PREDECL_DISP_ASIL_C_START, 'END': PREDECL_DISP_ASIL_C_END}
PREDECL_DISP_D = {'START': PREDECL_DISP_ASIL_D_START, 'END': PREDECL_DISP_ASIL_D_END}
PREDECL_DISP_SECURE = {'START': PREDECL_DISP_SEC_START, 'END': PREDECL_DISP_SEC_END}
PREDECL_CAL_QM = {'START': PREDECL_CAL_START, 'END': PREDECL_CAL_END}
PREDECL_CAL_A = {'START': PREDECL_CAL_ASIL_A_START, 'END': PREDECL_CAL_ASIL_A_END}
PREDECL_CAL_B = {'START': PREDECL_CAL_ASIL_B_START, 'END': PREDECL_CAL_ASIL_B_END}
PREDECL_CAL_C = {'START': PREDECL_CAL_ASIL_C_START, 'END': PREDECL_CAL_ASIL_C_END}
PREDECL_CAL_D = {'START': PREDECL_CAL_ASIL_D_START, 'END': PREDECL_CAL_ASIL_D_END}
PREDECL_CAL_EXT = {'START': PREDECL_CAL_EXT_START, 'END': PREDECL_CAL_EXT_END}
PREDECL_CAL_MERG = {'START': PREDECL_CAL_MERG_START, 'END': PREDECL_CAL_MERG_END}
PREDECL_CONST_QM = {'START': PREDECL_CONST_START, 'END': PREDECL_CONST_END}
PREDECL_CONST_A = {'START': PREDECL_CONST_ASIL_A_START, 'END': PREDECL_CONST_ASIL_A_END}
PREDECL_CONST_B = {'START': PREDECL_CONST_ASIL_B_START, 'END': PREDECL_CONST_ASIL_B_END}
PREDECL_CONST_C = {'START': PREDECL_CONST_ASIL_C_START, 'END': PREDECL_CONST_ASIL_C_END}
PREDECL_CONST_D = {'START': PREDECL_CONST_ASIL_D_START, 'END': PREDECL_CONST_ASIL_D_END}
PREDECL_CONST_EXT = {'START': PREDECL_CONST_EXT_START, 'END': PREDECL_CONST_EXT_END}
PREDECL_CONST_MERG = {'START': PREDECL_CONST_MERG_START, 'END': PREDECL_CONST_MERG_END}
PREDECL_NVM = {'START': PREDECL_NVM_START, 'END': PREDECL_NVM_END}
PREDECL_NVM_P = {'START': PREDECL_NVM_P_START, 'END': PREDECL_NVM_P_END}
CVC_SECURE = {
'CODE': CVC_CODE_QM,
'DISP': CVC_DISP_SECURE,
'CAL': CVC_CAL_QM,
'CONST': CVC_CONST_QM
}
CVC_ASIL_QM = {
'CODE': CVC_CODE_QM,
'DISP': CVC_DISP_QM,
'CAL': CVC_CAL_QM,
'CONST': CVC_CONST_QM
}
CVC_ASIL_A = {
'CODE': CVC_CODE_A,
'DISP': CVC_DISP_A,
'CAL': CVC_CAL_A,
'CONST': CVC_CONST_A
}
CVC_ASIL_B = {
'CODE': CVC_CODE_B,
'DISP': CVC_DISP_B,
'CAL': CVC_CAL_B,
'CONST': CVC_CONST_B
}
CVC_ASIL_C = {
'CODE': CVC_CODE_C,
'DISP': CVC_DISP_C,
'CAL': CVC_CAL_C,
'CONST': CVC_CONST_C
}
CVC_ASIL_D = {
'CODE': CVC_CODE_D,
'DISP': CVC_DISP_D,
'CAL': CVC_CAL_D,
'CONST': CVC_CONST_D
}
PREDECL_SECURE = {
'CODE': PREDECL_CODE_QM,
'DISP': PREDECL_DISP_SECURE,
'CAL': PREDECL_CAL_QM,
'CONST': PREDECL_CONST_QM
}
PREDECL_ASIL_QM = {
'CODE': PREDECL_CODE_QM,
'DISP': PREDECL_DISP_QM,
'CAL': PREDECL_CAL_QM,
'CONST': PREDECL_CONST_QM
}
PREDECL_ASIL_A = {
'CODE': PREDECL_CODE_A,
'DISP': PREDECL_DISP_A,
'CAL': PREDECL_CAL_A,
'CONST': PREDECL_CONST_A
}
PREDECL_ASIL_B = {
'CODE': PREDECL_CODE_B,
'DISP': PREDECL_DISP_B,
'CAL': PREDECL_CAL_B,
'CONST': PREDECL_CONST_B
}
PREDECL_ASIL_C = {
'CODE': PREDECL_CODE_C,
'DISP': PREDECL_DISP_C,
'CAL': PREDECL_CAL_C,
'CONST': PREDECL_CONST_C
}
PREDECL_ASIL_D = {
'CODE': PREDECL_CODE_D,
'DISP': PREDECL_DISP_D,
'CAL': PREDECL_CAL_D,
'CONST': PREDECL_CONST_D
}
PREDECL_EXTRA = {
'NORM': PREDECL,
'CAL_EXT': PREDECL_CAL_EXT,
'CONST_EXT': PREDECL_CONST_EXT,
'CAL_MERG': PREDECL_CAL_MERG,
'CONST_MERG': PREDECL_CONST_MERG
}
NVM = {'CVC': CVC_NVM, 'PREDECL': PREDECL_NVM}
NVM_P = {'CVC': CVC_NVM_P, 'PREDECL': PREDECL_NVM_P}
SECURE = {'CVC': CVC_SECURE, 'PREDECL': PREDECL_SECURE}
ASIL_QM = {'CVC': CVC_ASIL_QM, 'PREDECL': PREDECL_ASIL_QM}
ASIL_A = {'CVC': CVC_ASIL_A, 'PREDECL': PREDECL_ASIL_A}
ASIL_B = {'CVC': CVC_ASIL_B, 'PREDECL': PREDECL_ASIL_B}
ASIL_C = {'CVC': CVC_ASIL_C, 'PREDECL': PREDECL_ASIL_C}
ASIL_D = {'CVC': CVC_ASIL_D, 'PREDECL': PREDECL_ASIL_D}
NVM_LEVEL_MAP = {
'NORMAL': NVM,
'PROTECTED': NVM_P
}
CVC_ASIL_LEVEL_MAP = {
'QM': CVC_ASIL_QM,
'A': CVC_ASIL_A,
'B': CVC_ASIL_B,
'C': CVC_ASIL_C,
'D': CVC_ASIL_D
}
PREDECL_ASIL_LEVEL_MAP = {
'QM': PREDECL_ASIL_QM,
'A': PREDECL_ASIL_A,
'B': PREDECL_ASIL_B,
'C': PREDECL_ASIL_C,
'D': PREDECL_ASIL_D
}
ASIL_LEVEL_MAP = {
'QM': ASIL_QM,
'A': ASIL_A,
'B': ASIL_B,
'C': ASIL_C,
'D': ASIL_D
}
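# Illustrative lookup (all names are defined above): the header pair that
# brackets an ASIL B calibration section is reached as
#
#     ASIL_LEVEL_MAP['B']['CVC']['CAL']
#     # -> {'START': 'CVC_CAL_ASIL_B_START.h', 'END': 'CVC_CAL_ASIL_B_END.h'}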

View File

@@ -0,0 +1,606 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Module used to read project and base configuration files and provides methods for abstraction."""
import glob
import json
import os
import shutil
import pathlib
from pprint import pformat
from pybuild.lib.helper_functions import deep_dict_update
from pybuild.versioncheck import Version
class BuildProjConfig:
"""A class holding build project configurations."""
def __init__(self, prj_config_file):
"""Read project configuration file to internal an representation.
Args:
prj_config_file (str): Project config filename
"""
super().__init__()
self._prj_cfg_file = prj_config_file
prj_root_dir, _ = os.path.split(prj_config_file)
self._prj_root_dir = os.path.abspath(prj_root_dir)
with open(prj_config_file, 'r', encoding="utf-8") as pcfg:
self._prj_cfg = json.load(pcfg)
if not Version.is_compatible(self._prj_cfg.get('ConfigFileVersion')):
raise ValueError('Incompatible project config file version.')
# Load a base config that can be common for several projects;
# the local config overrides the base config
if 'BaseConfig' in self._prj_cfg:
fil_tmp = os.path.join(prj_root_dir, self._prj_cfg['BaseConfig'])
fil_ = os.path.abspath(fil_tmp)
with open(fil_, 'r', encoding="utf-8") as bcfg:
base_cnfg = json.load(bcfg)
deep_dict_update(self._prj_cfg, base_cnfg)
if not Version.is_compatible(self._prj_cfg.get('BaseConfigFileVersion')):
raise ValueError('Incompatible base config file version.')
self.has_yaml_interface = self._prj_cfg['ProjectInfo'].get('yamlInterface', False)
self.device_domains = self._get_device_domains()
self.services_file = self._get_services_file()
self._load_unit_configs()
self._add_global_const_file()
self._all_units = []
self._calc_all_units()
self.name = self._prj_cfg['ProjectInfo']['projConfig']
self.allow_undefined_unused = self._prj_cfg['ProjectInfo'].get('allowUndefinedUnused', True)
self._scheduler_prefix = self._prj_cfg['ProjectInfo'].get('schedulerPrefix', '')
if self._scheduler_prefix:
self._scheduler_prefix = self._scheduler_prefix + '_'
def __repr__(self):
"""Get string representation of object."""
return pformat(self._prj_cfg['ProjectInfo'])
def _get_device_domains(self):
file_name = self._prj_cfg['ProjectInfo'].get('deviceDomains')
if file_name is None:
    return {}
full_path = pathlib.Path(self._prj_root_dir, file_name)
if full_path.is_file():
with open(full_path, 'r', encoding="utf-8") as device_domains:
return json.loads(device_domains.read())
return {}
def _get_services_file(self):
file_name = self._prj_cfg['ProjectInfo'].get('serviceInterfaces', '')
full_path = pathlib.Path(self._prj_root_dir, file_name)
return full_path
@staticmethod
def get_services(services_file):
"""Get the services from the services file.
Args:
services_file (pathlib.Path): The services file.
Returns:
(dict): The services.
"""
if services_file.is_file():
with services_file.open() as services:
return json.loads(services.read())
return {}
def _load_unit_configs(self):
"""Load Unit config json file.
This file contains which units are included in which projects.
"""
if 'UnitCfgs' in self._prj_cfg:
fil_tmp = os.path.join(self._prj_root_dir, self._prj_cfg['UnitCfgs'])
with open(fil_tmp, 'r', encoding="utf-8") as fpr:
tmp_unit_cfg = json.load(fpr)
sample_times = tmp_unit_cfg.pop('SampleTimes')
self._unit_cfg = {
'Rasters': tmp_unit_cfg,
'SampleTimes': sample_times
}
else:
raise ValueError('UnitCfgs is not specified in project config')
def _add_global_const_file(self):
"""Add the global constants definition to the 'not_scheduled' time raster."""
ugc = self.get_use_global_const()
if ugc:
self._unit_cfg['Rasters'].setdefault('NoSched', []).append(ugc)
def create_build_dirs(self):
"""Create the necessary output build dirs if they are missing.
Clear the output build dirs if they exist.
"""
src_outp = self.get_src_code_dst_dir()
if os.path.exists(src_outp):
shutil.rmtree(src_outp)
os.makedirs(src_outp)
log_outp = self.get_log_dst_dir()
if os.path.exists(log_outp):
shutil.rmtree(log_outp)
os.makedirs(log_outp)
rep_outp = self.get_reports_dst_dir()
if os.path.exists(rep_outp):
shutil.rmtree(rep_outp)
os.makedirs(rep_outp)
unit_cfg_outp = self.get_unit_cfg_deliv_dir()
if unit_cfg_outp is not None:
    if os.path.exists(unit_cfg_outp):
        shutil.rmtree(unit_cfg_outp)
    os.makedirs(unit_cfg_outp)
def get_a2l_cfg(self):
""" Get A2L configuration from A2lConfig.
Returns:
config (dict): A2L configuration
"""
a2l_config = self._prj_cfg.get('A2lConfig', {})
return {
'name': a2l_config.get('name', self._prj_cfg["ProjectInfo"]["projConfig"]),
'allow_kp_blob': a2l_config.get('allowKpBlob', True),
'ip_address': a2l_config.get('ipAddress', "169.254.4.10"),
'ip_port': '0x%X' % a2l_config.get('ipPort', 30000),
'asap2_version': a2l_config.get('asap2Version', "1 51")
}
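# Illustrative A2lConfig section of the project config (the shape follows
# the keys read above; all values are examples only):
#
#     "A2lConfig": {
#         "name": "MyProject",
#         "allowKpBlob": true,
#         "ipAddress": "169.254.4.10",
#         "ipPort": 30000,
#         "asap2Version": "1 51"
#     }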
def get_unit_cfg_deliv_dir(self):
"""Get the directory where to put the unit configuration files.
If this key is undefined, or set to None, the unit-configs will
not be copied to the output folder.
Returns:
A path to the unit deliver dir, or None
"""
if 'unitCfgDeliveryDir' in self._prj_cfg['ProjectInfo']:
return os.path.join(self.get_root_dir(),
os.path.normpath(self._prj_cfg['ProjectInfo']
['unitCfgDeliveryDir']))
return None
def get_root_dir(self):
"""Get the root directory of the project.
Returns:
An absolute path to the project root
"""
return self._prj_root_dir
def get_src_code_dst_dir(self):
"""Return the absolute path to the source output folder."""
return os.path.join(self.get_root_dir(),
os.path.normpath(self._prj_cfg['ProjectInfo']
['srcCodeDstDir']))
def get_composition_name(self):
"""Return the composition name."""
name, _ = os.path.splitext(self._prj_cfg['ProjectInfo']['compositionName'])
return name
def get_composition_ending(self):
"""Return the composition ending."""
_, ending = os.path.splitext(self._prj_cfg['ProjectInfo']['compositionName'])
if ending:
return ending
return 'yml'
def get_composition_arxml(self):
"""Return the relative composition arxml path."""
return self._prj_cfg['ProjectInfo']['compositionArxml']
def get_gen_ext_impl_type(self):
"""Return the generate external implementation type."""
return self._prj_cfg['ProjectInfo'].get('generateExternalImplementationType', True)
def get_swc_name(self):
"""Returns the software component name."""
a2lname = f"{self.get_a2l_cfg()['name']}_SC"
return self._prj_cfg['ProjectInfo'].get('softwareComponentName', a2lname)
def get_car_com_dst(self):
"""Return the absolute path to the source output folder."""
return os.path.join(self.get_root_dir(),
os.path.normpath(self._prj_cfg['ProjectInfo']
['didCarCom']))
def get_reports_dst_dir(self):
"""Get the destination dir for build reports.
Returns:
A path to the report files destination directory
"""
return os.path.join(self.get_root_dir(),
os.path.normpath(self._prj_cfg['ProjectInfo']
['reportDstDir']))
def get_all_reports_dst_dir(self):
"""Get the destination dir for build reports.
Returns:
A path to the report files destination directory (for all projects)
"""
return os.path.join(self.get_root_dir(), "..", "..", "Reports")
def get_log_dst_dir(self):
"""Return the absolute path to the log output folder.
Returns:
A path to the log files destination directory
"""
return os.path.join(self.get_root_dir(),
os.path.normpath(self._prj_cfg['ProjectInfo']
['logDstDir']))
def get_core_dummy_name(self):
"""Return the file name of the core dummy file from the config file.
Returns:
A file name for the core dummy files
"""
path = os.path.join(self.get_src_code_dst_dir(),
os.path.normpath(self._prj_cfg['ProjectInfo']
['coreDummyFileName']))
return path
def get_feature_conf_header_name(self):
"""Return the feature configuration header file name.
Returns:
A file name for the feature config header file
"""
return self._prj_cfg['ProjectInfo']['featureHeaderName']
def get_ts_header_name(self):
"""Return the name of the ts header file, defined in the config file.
Returns:
The file name of the file defining all unit raster times
"""
return self._prj_cfg['ProjectInfo']['tsHeaderName']
def get_included_units(self):
"""Return a list of all the included units in the project.
TODO: Consider moving this to the FeatureConfigs class if we start
using our configuration tool for model inclusion and scheduling.
TODO: Consider calculating this on init and storing the result in the
class; this method would then just return the stored list.
"""
units_dict = self._unit_cfg['Rasters']
units = []
for unit in units_dict.values():
units.extend(unit)
return units
def get_included_common_files(self):
"""Return a list of all the included common files in the project.
Returns:
included_common_files ([str]): The names of the common files which are included in the project.
"""
return self._prj_cfg.get('includedCommonFiles', [])
def _calc_all_units(self):
"""Return a list of all the units."""
units = set()
for runits in self._unit_cfg['Rasters'].values():
units = units.union(set(runits))
self._all_units = list(units)
def get_includes_paths(self):
"""Return list of paths to files to include in source directory."""
includes_paths = self._prj_cfg.get('includesPaths', [])
return [os.path.join(self.get_root_dir(), os.path.normpath(path)) for path in includes_paths]
def get_all_units(self):
"""Return a list of all the units."""
return self._all_units
def get_prj_cfg_dir(self):
"""Return the directory containing the project configuration files.
Returns:
An absolute path to the project configuration files
"""
return os.path.join(self._prj_root_dir,
self._prj_cfg['ProjectInfo']['configDir'])
def get_scheduler_prefix(self):
"""Returns a prefix used to distinguish function calls in one project from
similarly named functions in other projects, when linked/compiled together
Returns:
scheduler_prefix (string): prefix for scheduler functions.
"""
return self._scheduler_prefix
def get_local_defs_name(self):
"""Return a string which defines the file name of local defines.
Returns:
A string containing the wildcard file name local defines
"""
return self._prj_cfg['ProjectInfo']['prjLocalDefs']
def get_codeswitches_name(self):
"""Return a string which defines the file name of code switches.
Returns:
A string containing the wildcard file name code switches
"""
return self._prj_cfg['ProjectInfo']['prjCodeswitches']
def get_did_cfg_file_name(self):
"""Return the did definition file name.
Returns:
DID definition file name
"""
return self._prj_cfg['ProjectInfo']['didDefFile']
def get_prj_config(self):
"""Get the project configuration name from the config file.
Returns:
Project config name
"""
return self._prj_cfg['ProjectInfo']["projConfig"]
def get_a2l_name(self):
"""Get the name of the a2l-file, which the build system shall generate."""
return self._prj_cfg['ProjectInfo']['a2LFileName']
def get_ecu_info(self):
"""Return ecuSupplier and ecuType.
Returns:
(ecuSupplier, ecuType)
"""
return (self._prj_cfg['ProjectInfo']['ecuSupplier'],
self._prj_cfg['ProjectInfo'].get('ecuType', ''))
def get_xcp_enabled(self):
"""Return True/False whether XCP is enabled in the project or not.
Returns:
(bool): True/False whether XCP is enabled in the project or not
"""
return self._prj_cfg['ProjectInfo'].get('enableXcp', True)
def get_nvm_defs(self):
"""Return NVM-ram block definitions.
The definitions contain the sizes of the six NVM areas
which are defined in the build system.
Returns:
NvmConfig dict from config file.
"""
return self._prj_cfg['NvmConfig']
def _get_inc_dirs(self, path):
"""Get the dirs with the models defined in the units config file.
Model name somewhere in the path.
"""
all_dirs = glob.glob(path)
inc_units = self.get_included_units()
psep = os.path.sep
out = {}
for dir_ in all_dirs:
folders = dir_.split(psep)
for inc_unit in inc_units:
if inc_unit in folders:
out.update({inc_unit: dir_})
break
return out
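# Illustrative: with path "<root>/Models/*/src" and "VcAbc" among the
# included units, a hit "<root>/Models/VcAbc/src" is returned as
# {"VcAbc": "<root>/Models/VcAbc/src"} (all names are examples only).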
def get_units_raster_cfg(self):
"""Get the units' scheduling raster config.
I.e. which units are included, and in which
rasters they are scheduled, and in which order.
Returns:
A dict in the following format.
::
{
"SampleTimes": {
"NameOfRaster": scheduling time},
"Rasters": {
"NameOfRaster": [
"NameOfFunction",
...],
...}
}
Example::
{
"SampleTimes": {
"Vc10ms": 0.01,
"Vc40ms": 0.04},
"Rasters": {
"Vc10ms": [
"VcPpmImob",
"VcPpmPsm",
"VcPpmRc",
"VcPpmSt",
"VcPemAlc"],
"Vc40ms": [
"VcRegCh"]
}
}
"""
return self._unit_cfg
def get_unit_cfg_dirs(self):
"""Get config dirs which matches the project config parameter prjUnitCfgDir.
Furthermore, they should be included in the unit definition for this project
Returns:
A list with absolute paths to all unit config dirs
included in the project
"""
path = os.path.join(self.get_root_dir(),
os.path.normpath(self._prj_cfg['ProjectInfo']
['prjUnitCfgDir']))
return self._get_inc_dirs(path)
def get_translation_files_dirs(self):
"""Get translation files directories, specified as a path regex in project
config by key prjTranslationDir. If key is not present, will fall back to
prjUnitCfgDir.
Returns:
A dictionary with absolute paths to all translation file dirs included
in the project
"""
if "prjTranslationDir" not in self._prj_cfg['ProjectInfo']:
return self.get_unit_cfg_dirs()
normpath_dir = os.path.normpath(self._prj_cfg['ProjectInfo']['prjTranslationDir'])
path = os.path.join(self.get_root_dir(), normpath_dir)
all_dirs = glob.glob(path)
translation_dirs = {}
for directory in all_dirs:
file = pathlib.Path(directory).stem
translation_dirs[file] = directory
return translation_dirs
def get_common_src_dir(self):
"""Get source dir which matches the project config parameter commonSrcDir.
Returns:
Absolute path to common source dir
"""
return os.path.join(self.get_root_dir(),
os.path.normpath(self._prj_cfg['ProjectInfo']['commonSrcDir']))
def get_unit_src_dirs(self):
"""Get source dirs which matches the project config parameter prjUnitCfgDir.
Furthermore, they should be included in the unit definition for this project
Returns:
A list with absolute paths to all source dirs included in the
project
"""
path = os.path.join(self.get_root_dir(),
os.path.normpath(self._prj_cfg['ProjectInfo']
['prjUnitSrcDir']))
return self._get_inc_dirs(path)
def get_unit_mdl_dirs(self):
"""Get source dirs which matches the project config parameter prjUnitCfgDir.
Furthermore, they should be included in the unit definition for this project
Returns:
A list with absolute paths to all model dirs included in the
project
"""
path = os.path.join(self.get_root_dir(),
os.path.normpath(self._prj_cfg['ProjectInfo']
['prjUnitMdlDir']))
return self._get_inc_dirs(path)
def get_use_global_const(self):
"""Get the name of the global constant module."""
return self._prj_cfg['ProjectInfo']['useGlobalConst']
def get_use_volatile_globals(self):
"""Get if global variables should be defined as volatile or not."""
if 'useVolatileGlobals' in self._prj_cfg['ProjectInfo']:
return self._prj_cfg['ProjectInfo']['useVolatileGlobals']
return False
def get_use_custom_dummy_spm(self):
"""Get path to file defining missing internal variables, if any.
This file will be used instead of generating VcDummy_spm.c,
to make it easier to maintain missing internal signals.
Returns:
customDummySpm (os.path): An absolute path to the custom dummy spm file, if configured.
"""
if 'customDummySpm' in self._prj_cfg['ProjectInfo']:
return os.path.join(
self.get_root_dir(),
os.path.normpath(self._prj_cfg['ProjectInfo']['customDummySpm'])
)
return None
def get_use_custom_sources(self):
"""Get path to files with custom handwritten sourcecode, if any.
Returns:
customSources (list(os.path)): Absolute paths to custom source files, if configured.
"""
if 'customSources' in self._prj_cfg['ProjectInfo']:
normalized_paths = (os.path.normpath(p) for p in self._prj_cfg['ProjectInfo']['customSources'])
return [os.path.join(self.get_root_dir(), p) for p in normalized_paths]
return None
def get_if_cfg_dir(self):
"""Return the directory containing the interface configuration files.
Returns:
An absolute path to the interface configuration files
"""
return os.path.join(self._prj_root_dir,
self._prj_cfg['ProjectInfo']['interfaceCfgDir'])
def get_enum_def_dir(self):
"""Get path to dir containing simulink enumeration definitions, if any.
Returns:
enumDefDir (os.path): An absolute path to the simulink enumerations, if configured.
"""
if 'enumDefDir' in self._prj_cfg['ProjectInfo']:
return os.path.join(
self.get_root_dir(),
os.path.normpath(self._prj_cfg['ProjectInfo']['enumDefDir'])
)
return None
if __name__ == '__main__':
# Function for testing the module
BPC = BuildProjConfig('../../ProjectCfg.json')
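# Illustrative minimal ProjectCfg.json (keys follow the accessors above;
# all values are examples only, and most optional keys are omitted):
#
#     {
#         "ConfigFileVersion": "0.2.1",
#         "ProjectInfo": {
#             "projConfig": "MyPrj",
#             "ecuSupplier": "Denso",
#             "srcCodeDstDir": "output/SourceCode",
#             "logDstDir": "output/Logs",
#             "reportDstDir": "output/Reports",
#             "prjUnitCfgDir": "../../Models/*/pybuild_cfg",
#             "prjUnitSrcDir": "../../Models/*/pybuild_src",
#             "useGlobalConst": ""
#         },
#         "UnitCfgs": "conf.local/rasters.json",
#         "NvmConfig": { ... }
#     }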

563
pybuild/check_interface.py Normal file
View File

@@ -0,0 +1,563 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Python module used for calculating interfaces for CSP"""
import argparse
import os
import re
import sys
from itertools import product
from pathlib import Path
import git
from pybuild.interface.application import Application, Model, get_active_signals
from pybuild.interface.ems import CsvEMS
from pybuild.lib import logger
LOGGER = logger.create_logger("Check interface")
def process_app(config):
"""Get an app specification for the current project
Entrypoint for external scripts.
Args:
config (pathlib.Path): Path to the ProjectCfg.json
Returns:
app (Application): pybuild project
"""
app = Application()
app.parse_definition(config)
return app
def model_app_consistency(model, app_models, app, errors):
"""Compare model signal interface with list of models.
Args:
model (Model): model to compare against application
app_models (list(Model)): list of models to compare with
app (Application): pybuild project
errors (dict): Object for counting errors of different types
"""
for compare_model in app_models:
LOGGER.debug("Comparing %s with %s in %s", model.name, compare_model.name, app.name)
active_model_outsignals = get_active_signals(model.outsignals, app.pybuild["feature_cfg"])
active_model_insignals = get_active_signals(model.insignals, app.pybuild["feature_cfg"])
active_compare_outsignals = get_active_signals(compare_model.outsignals, app.pybuild["feature_cfg"])
active_compare_insignals = get_active_signals(compare_model.insignals, app.pybuild["feature_cfg"])
check_signals(
active_model_insignals,
active_compare_outsignals,
errors,
[app.name, model.name],
[app.name, compare_model.name],
)
check_signals(
active_model_outsignals,
active_compare_insignals,
errors,
[app.name, model.name],
[app.name, compare_model.name],
)
def check_internal_signals(app, model_names=None):
"""Look for all internal signal mismatches.
Args:
app (Application): pybuild project
model_names (list(str)): names of the models to check
Returns:
serious_mismatch (bool): A serious mismatch was found
"""
serious_mismatch = False
LOGGER.debug("Checking internal signals")
LOGGER.debug("Checking against %s", app.signals)
errors = {"type": 0, "range": 0, "unit": 0, "width": 0}
app_models = app.get_models()
for signal in app.signals:
LOGGER.debug(signal.properties)
for model in app_models:
if model_names is not None and model.name not in model_names:
LOGGER.debug("Skipping %s", model.name)
continue
LOGGER.debug("Checking %s in %s", model.name, app.name)
active_insignals = get_active_signals(model.insignals, app.pybuild["feature_cfg"])
insignal_mismatch = check_signals(active_insignals, app.signals, errors, [app.name, model.name], [app.name])
active_outsignals = get_active_signals(model.outsignals, app.pybuild["feature_cfg"])
outsignal_mismatch = check_signals(active_outsignals, app.signals, errors, [app.name, model.name], [app.name])
if insignal_mismatch or outsignal_mismatch:
serious_mismatch = True
model_app_consistency(model, app_models, app, errors)
# Only compare with all models if a mismatch is found
LOGGER.debug("Total errors: %s", errors)
return serious_mismatch
def check_models_generic(all_models, model_names, emses):
"""Check filtered models against all models and external interfaces."""
serious_mismatch = False
for model in all_models:
LOGGER.info("Checking signals attributes for %s", model.name)
if model.name not in model_names:
continue
errors = {"type": 0, "range": 0, "unit": 0, "width": 0}
LOGGER.debug("Checking internal signals for %s", model.name)
for corresponding_model in all_models:
serious_mismatch |= check_signals(
model.insignals, corresponding_model.outsignals, errors, [model.name], [corresponding_model.name]
)
serious_mismatch |= check_signals(
model.outsignals, corresponding_model.insignals, errors, [model.name], [corresponding_model.name]
)
if emses:
LOGGER.debug("Checking external signals for %s", model.name)
for ems in emses:
serious_mismatch |= check_signals(
model.insignals, ems.outsignals, errors, [model.name], [ems.name]
)
serious_mismatch |= check_signals(
model.outsignals, ems.insignals, errors, [model.name], [ems.name]
)
LOGGER.debug("Total errors for %s: %s", model.name, errors)
return serious_mismatch
def get_all_models(model_root):
"""Find, filter and parse all model configurations."""
LOGGER.info("Parsing all models")
prefix = "config_"
suffix = ".json"
models = []
for dirpath, _, filenames in os.walk(model_root):
dirpath = Path(dirpath)
for filename in [f for f in filenames if f.startswith(prefix) and f.endswith(suffix)]:
name = filename[len(prefix): -len(suffix)]
if name == dirpath.parent.stem:
model = Model(None)
model.parse_definition((name, Path(dirpath, filename)))
models.append(model)
return models
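# Illustrative layout matched above (all names are examples): a model
# VcAbc is picked up when its config sits one level below the model
# folder,
#
#     <model_root>/VcAbc/pybuild_cfg/config_VcAbc.json
#
# since "config_VcAbc.json" strips to "VcAbc" and dirpath.parent.stem is
# then also "VcAbc".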
def get_projects(root, project_names):
"""Find, parse and filter all project configurations."""
LOGGER.info("Parsing all projects")
projects = []
for dirpath, _, filenames in os.walk(root):
dirpath = Path(dirpath)
for filename in [f for f in filenames if f == "ProjectCfg.json"]:
config = Path(dirpath, filename)
app = Application()
app_name = app.get_name(config)
if project_names is not None and app_name not in project_names:
if config.parent.stem not in project_names:
LOGGER.info("%s or %s does not match %s", app_name, config.parent.stem, project_names)
continue
app.parse_definition(config)
if app.pybuild["build_cfg"].has_yaml_interface:
LOGGER.warning("Interface checks for yaml-interface projects are not implemtented yet")
LOGGER.info("Adding empty interface for %s", app_name)
projects.append((app, None))
else:
ems = CsvEMS()
ems.parse_definition(config)
projects.append((app, ems))
return projects
def correct_type(left_spec, right_spec):
"""Check if the type is the same in two specifications.
Args:
left_spec (dict): Signal specification
right_spec (dict): Signal specification to compare with
Returns:
matches (bool): left_spec and right_spec have the same type
"""
return left_spec["type"] == right_spec["type"]
def correct_attribute(left_spec, right_spec, attribute, default=None, check_bool=True):
"""Check attributes other than type.
Args:
left_spec (dict): Signal specification
right_spec (dict): Signal specification to compare with
attribute (string): Attribute to check
default (value): Default value for the attribute (default: None)
check_bool (bool): Check signals of type Bool (default: True)
Returns:
matches (bool): left_spec and right_spec have the same value for the attribute
"""
def _format(value):
if isinstance(value, str):
value = value.strip()
if re.fullmatch("[+-]?[0-9]+", value):
value = int(value)
elif re.fullmatch("[+-]?[0-9]+[0-9.,eE+]*", value):
value = float(value.replace(",", "."))
return value
if not check_bool and left_spec["type"] == "Bool":
return True
return _format(left_spec.get(attribute, default)) == _format(right_spec.get(attribute, default))
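# Illustrative normalization performed by _format (values are examples):
# " 42 " -> 42, "1,5" -> 1.5 and "1.5e1" -> 15.0, so numerically equal
# attributes compare equal even when one side is text from a csv and the
# other a number from a json file.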
def found_mismatch(name, left_spec, right_spec, attribute, left_path, right_path):
"""Handle finding a mismatch.
Args:
name (string): Name of signal
left_spec (dict): Spec of signal
right_spec (dict): Signal specification to compare with
attribute (string): Attribute to check
left_path (list(str)): Path for where the left signals' definitions come from
right_path (list(str)): Path for where the right signals' definitions come from
"""
if attribute in ["type", "width"]:
# TODO: Add more properties as serious when the interfaces are more cleaned up
LOGGER.error(
"%s has %ss: %s in %s and %s in %s",
name,
attribute,
left_spec.get(attribute),
left_path,
right_spec.get(attribute),
right_path,
)
return True
LOGGER.info(
"%s has %ss: %s in %s and %s in %s",
name,
attribute,
left_spec.get(attribute),
left_path,
right_spec.get(attribute),
right_path,
)
return False
def check_external_signals(ems, app, model_names=None):
"""Look for external signal mismatches.
Args:
ems (CsvEMS): Parsed signal interface csv files
app (Application): Parsed project config
model_names (list(str)): names of the models to check
Returns:
serious_mismatch (bool): A serious mismatch was found
"""
serious_mismatch = False
LOGGER.debug("Checking insignals")
errors = {"type": 0, "range": 0, "unit": 0, "width": 0}
app_models = app.get_models()
for model in app_models:
if model_names is not None and model.name not in model_names:
LOGGER.debug("Skipping %s in %s", model.name, app.name)
continue
LOGGER.debug("Checking %s in %s", model.name, app.name)
serious_mismatch |= check_signals(
get_active_signals(model.insignals, app.pybuild["feature_cfg"]),
ems.outsignals,
errors,
[app.name, model.name],
[ems.name],
)
serious_mismatch |= check_signals(
get_active_signals(model.outsignals, app.pybuild["feature_cfg"]),
ems.insignals,
errors,
[app.name, model.name],
[ems.name],
)
LOGGER.debug("Total errors: %s", errors)
return serious_mismatch
def check_signals(left_signals, right_signals, errors, left_path=None, right_path=None):
"""Compares insignals from one system with the outsignals of another.
Args:
left_signals (list(Signal)): Insignals of one system such as a model
right_signals (list(Signal)): Outsignals of system to compare with
errors (dict): Object for counting errors of different types
left_path (list(str)): Path for where the left signals' definitions come from
right_path (list(str)): Path for where the right signals' definitions come from
Returns:
serious_mismatch (bool): A serious mismatch was found
"""
left_path = [] if left_path is None else left_path
right_path = [] if right_path is None else right_path
serious_mismatch = False
LOGGER.debug("Checking from %s", left_signals)
LOGGER.debug("Checking against %s", right_signals)
for (left_signal, right_signal) in [
(left, right) for left, right in product(left_signals, right_signals) if left.name == right.name
]:
LOGGER.debug("Comparing %s and %s", left_signal, right_signal)
left_properties = left_signal.properties
right_properties = right_signal.properties
LOGGER.debug("Properties left: %s", left_properties)
LOGGER.debug("Properties right: %s", right_properties)
if not correct_type(left_properties, right_properties):
serious_mismatch |= found_mismatch(
left_signal.name, left_properties, right_properties, "type", left_path, right_path
)
errors["type"] += 1
if not correct_attribute(left_properties, right_properties, "min", check_bool=False):
serious_mismatch |= found_mismatch(
left_signal.name, left_properties, right_properties, "min", left_path, right_path
)
errors["range"] += 1
if not correct_attribute(left_properties, right_properties, "max", check_bool=False):
serious_mismatch |= found_mismatch(
left_signal.name, left_properties, right_properties, "max", left_path, right_path
)
errors["range"] += 1
if not correct_attribute(left_properties, right_properties, "unit", default="", check_bool=False):
serious_mismatch |= found_mismatch(
left_signal.name, left_properties, right_properties, "unit", left_path, right_path
)
errors["unit"] += 1
if not correct_attribute(left_properties, right_properties, "width", default=1):
serious_mismatch |= found_mismatch(
left_signal.name, left_properties, right_properties, "width", left_path, right_path
)
errors["width"] += 1
return serious_mismatch
def parse_args():
"""Parse arguments
Returns:
Namespace: the parsed arguments
"""
parser = argparse.ArgumentParser(
description="""
Checks attributes and existence of signals.
Produced but not consumed signals give warnings.
Consumed but not produced signals give errors.
Attributes checked are: types, ranges, units and widths.
Mismatches in types or widths give errors.
Mismatches in min, max or unit give warnings.
Examples:
py -3.6 -m pybuild.check_interface models_in_projects <Projects> <Models/ModelGroup>\
--projects <ProjectOne> <ProjectTwo>
Checks models in Models/ModelGroup against ProjectOne and ProjectTwo in the folder Projects
py -3.6 -m pybuild.check_interface models <Models> --models <ModelOne> <ModelTwo>
Checks models ModelOne and ModelTwo against all other models in the folder Models
py -3.6 -m pybuild.check_interface projects <Projects> \
--projects ProjectOne ProjectTwo ProjectThree
Checks the interfaces of ProjectOne, ProjectTwo and ProjectThree in the folder Projects
""",
formatter_class=argparse.RawTextHelpFormatter,
)
subparsers = parser.add_subparsers(help="help for subcommand", dest="mode")
# create the parser for the different commands
model = subparsers.add_parser(
"models",
description="""
Check models independently of projects.
All signals are assumed to be active.
Any signal that gives an error is used in a model but is not produced in any model or project
interface.
""",
)
add_model_args(model)
project = subparsers.add_parser(
"projects",
description="""
Check projects as a whole.
It checks all models internally and the SPM against the interface.
""",
)
add_project_args(project)
models_in_projects = subparsers.add_parser(
"models_in_projects",
description="""
Check models specifically for projects.
Codeswitches are used to determine if the signals are produced and consumed in each model.
""",
)
add_project_args(models_in_projects)
add_model_args(models_in_projects)
models_in_projects.add_argument("--properties", help="Check properties such as type", action="store_true")
models_in_projects.add_argument("--existence", help="Check signal existence consistency", action="store_true")
return parser.parse_args()
def add_project_args(parser):
"""Add project arguments to subparser"""
parser.add_argument("project_root", help="Path to start looking for projects", type=Path)
parser.add_argument(
"--projects", help="Name of projects to check. Matches both path and interface name.", nargs="+"
)
def add_model_args(parser):
"""Add model arguments to subparser"""
parser.add_argument("model_root", help="Path to start looking for models", type=Path)
parser.add_argument("--models", help="Name of models to check", nargs="+")
parser.add_argument("--gerrit", action="store_true", help="Deprecated")
parser.add_argument("--git", action="store_true", help="Get models to check from git HEAD")
def get_changed_models():
"""Get changed models in current commit."""
repo = git.Repo()
changed_files_tmp = repo.git.diff("--diff-filter=d", "--name-only", "HEAD~1")
changed_files = changed_files_tmp.splitlines()
changed_models = [m for m in changed_files if m.endswith(".mdl") or m.endswith(".slx")]
return changed_models
def model_path_to_name(model_paths):
"""Extract model names from a list of model paths."""
model_names = []
for model_path in model_paths:
model_name_with_extension = model_path.split("/")[-1]
model_name = model_name_with_extension.split(".")[0]
model_names.append(model_name)
return model_names
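# Illustrative: ["Models/PVC/VcAbc/VcAbc.mdl"] -> ["VcAbc"]; only the
# file name up to its first dot is kept (the path is an example).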
def model_check(args):
"""Entry point for models command."""
serious_mismatch = False
all_models = get_all_models(args.model_root)
if args.models is not None:
model_names = args.models
elif args.git or args.gerrit:
# Still checking args.gerrit due to common-linux-signal_consistency in pt-zuul-jobs
model_paths = get_changed_models()
model_names = model_path_to_name(model_paths)
else:
model_names = [model.name for model in all_models]
serious_mismatch |= check_models_generic(all_models, model_names, [])
return serious_mismatch
def projects_check(args):
"""Entry point for projects command."""
serious_mismatch = False
projects = get_projects(args.project_root, args.projects)
for app, ems in projects:
LOGGER.info("Checking interfaces for %s", app.name)
serious_mismatch |= check_internal_signals(app, None)
if ems is not None:
serious_mismatch |= check_external_signals(ems, app, None)
return serious_mismatch
def models_in_projects_check(args):
"""Entry point for models_in_projects command."""
serious_mismatch = False
projects = get_projects(args.project_root, args.projects)
LOGGER.debug("Checking projects: %s", projects)
if args.properties:
for app, ems in projects:
serious_mismatch |= check_internal_signals(app, args.models)
if ems is not None:
serious_mismatch |= check_external_signals(ems, app, args.models)
if args.existence:
all_models = get_all_models(args.model_root)
model_names = [model.name for model in all_models] if args.models is None else args.models
serious_mismatch |= signal_existence(projects, model_names)
return serious_mismatch
def signal_existence(projects, model_names):
"""Check which signals are consumed and produced in each project."""
serious_mismatch = False
for app, ems in projects:
app_models = app.get_models()
LOGGER.info("Checking %s", app.name)
for project_model in app_models:
if project_model.name not in model_names:
continue
LOGGER.debug("Checking signal existence for %s", project_model.name)
active_insignals = get_active_signals(project_model.insignals, app.pybuild["feature_cfg"])
active_outsignals = get_active_signals(project_model.outsignals, app.pybuild["feature_cfg"])
insignal_matches = {}
outsignal_matches = {}
for check_model in app_models:
signal_match(
active_insignals,
get_active_signals(check_model.outsignals, app.pybuild["feature_cfg"]),
insignal_matches,
)
signal_match(
active_outsignals,
get_active_signals(check_model.insignals, app.pybuild["feature_cfg"]),
outsignal_matches,
)
if ems is not None:
signal_match(active_insignals, ems.outsignals, insignal_matches)
signal_match(active_outsignals, ems.insignals, outsignal_matches)
for missing_signal in [signal for signal, matched in insignal_matches.items() if not matched]:
# serious_mismatch = True # TODO: Activate this code when we want to gate on it.
LOGGER.warning(
"%s is consumed in %s but never produced in %s", missing_signal, project_model.name, app.name
)
for matched_signal in [signal for signal, matched in insignal_matches.items() if matched]:
LOGGER.debug("%s is consumed in %s and produced in %s", matched_signal, project_model.name, app.name)
for missing_signal in [signal for signal, matched in outsignal_matches.items() if not matched]:
LOGGER.info(
"%s is produced in %s but never consumed in %s", missing_signal, project_model.name, app.name
)
for matched_signal in [signal for signal, matched in outsignal_matches.items() if matched]:
LOGGER.debug("%s is produced in %s and consumed in %s", matched_signal, project_model.name, app.name)
return serious_mismatch
def signal_match(signals_to_check, signals_to_check_against, matches):
"""Check for matches in signal names."""
for a_signal in signals_to_check:
matches[a_signal.name] = matches.get(a_signal.name, False)
for b_signal in signals_to_check_against:
if b_signal.name == a_signal.name:
matches[a_signal.name] = True
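A minimal sketch of the matching semantics, using a stand-in Signal type (the real pybuild signal objects only need a .name attribute here):

from collections import namedtuple

Signal = namedtuple("Signal", ["name"])  # stand-in for pybuild signal objects

matches = {}
signal_match([Signal("sVcAbc_Tq")], [Signal("sVcAbc_Tq")], matches)
signal_match([Signal("sVcDef_Spd")], [], matches)
print(matches)  # {'sVcAbc_Tq': True, 'sVcDef_Spd': False}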
def main():
"""Main function for stand alone execution."""
args = parse_args()
serious_errors = False
if args.mode == "models":
serious_errors = model_check(args)
elif args.mode == "projects":
serious_errors = projects_check(args)
elif args.mode == "models_in_projects":
serious_errors = models_in_projects_check(args)
if serious_errors:
LOGGER.error("Serious interface errors found.")
sys.exit(1)
if __name__ == "__main__":
main()

526
pybuild/config.py Normal file
View File

@@ -0,0 +1,526 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Script to update configs based on c-files."""
import argparse
import copy
import glob
import itertools
import json
import operator
import os
import re
from pprint import pformat
from pybuild.lib import logger
LOGGER = logger.create_logger('config')
class ConfigParserCommon:
"""Parser for c and h files."""
def __init__(self):
"""Initialize common properties."""
self.ifs = []
self.def_map = {}
self.configs = {}
self.code_regexes = [(re.compile(r'^\s*#(?P<type>if|ifdef|ifndef) (?P<condition>.*)$'),
self.parse_if),
(re.compile(r'^\s*#else.*$'), self.parse_else),
(re.compile(r'#define (\w*)\s?(.*)?'), self.parse_defines),
(re.compile(r'^\s*#endif.*$'), self.parse_endif)]
def parse_line(self, line):
"""Process each regex.
Arguments:
line (str): line of code
"""
for regex, function in self.code_regexes:
self.process_regex(line, regex, function)
@staticmethod
def process_regex(line, regex, function):
"""Process one regex.
Arguments:
line (str): line of code
regex (object): compiled re object
function (function): function to run if regex matches
"""
match = regex.match(line)
if match:
function(*match.groups())
def parse_file_content(self, file_content):
"""Parse each line in the file.
Arguments:
file_content (list): Contents of a file
"""
for line in file_content:
self.parse_line(line)
def parse_if(self, if_type, condition):
"""Parse an if-preprocessor statement.
Arguments:
if_type (str): preprocessor conditional type (if, ifdef or ifndef)
condition (str): preprocessor condition
"""
self.ifs.append((if_type, condition))
def parse_else(self):
"""Stub for parsing."""
raise NotImplementedError
def parse_defines(self, variable, definition):
"""Stub for parsing."""
raise NotImplementedError
def parse_endif(self):
"""Parse an endif-preprocessor statement.
Arguments:
match (object): match object
"""
if self.ifs:
c_type, condition = self.ifs.pop()
LOGGER.debug('Removing %s %s', c_type, condition)
@staticmethod
def read_file(c_file):
"""Read file.
Arguments:
c_file (str): Full path to a file
"""
file_content = ''
with open(c_file, encoding='latin-1') as file_handle:
for line in file_handle:
file_content += line
out = re.sub(r'/\*.*?\*/', '', file_content, flags=re.S).splitlines()
return out
@staticmethod
def compose_and(conditions):
"""Return and conditions."""
return f"({' && '.join(conditions)})"
@staticmethod
def compose_or(conditions):
"""Return and conditions."""
return f"({' || '.join(conditions)})"
@staticmethod
def sort_u(item):
"""Get a unique list of configs.
Can handle unhashable elements.
Arguments:
item (list): list to get the unique elements of.
"""
return map(
operator.itemgetter(0),
itertools.groupby(sorted(item)))
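A minimal sketch of why sort_u exists: it deduplicates lists of unhashable items, which set() cannot:

# Lists are unhashable, so set() would raise TypeError here.
unique = list(ConfigParserCommon.sort_u([['B'], ['A'], ['A']]))
print(unique)  # [['A'], ['B']]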
class CConfigParser(ConfigParserCommon):
"""Parser for c-files."""
def set_regexes(self, variable):
"""Create regexes to find configs for a single variable.
Arguments:
variable (str): variable
"""
self.code_regexes.append((re.compile(r'.*\b({})\b.*'.format(variable)), self.parse_code))
def parse_defines(self, variable, definition):
"""Parse defines in c-files."""
if definition:
LOGGER.warning('Configuration using %s might be wrong. Set to %s in the c-file', variable, definition)
if variable:
if self.ifs and variable == self.ifs[-1][-1]:
self.ifs.pop()
def parse_else(self):
"""Parse defines in c-files."""
if_type, condition = self.ifs.pop()
self.ifs.append(('else' + if_type, condition))
def parse_code(self, variable):
"""Parse a line with the variable we are looking for.
Arguments:
variable (str): the matched variable name
"""
LOGGER.debug('Found %s with %s', variable, self.ifs)
if variable not in self.configs:
self.configs[variable] = []
# In this case, we add to a list which should be joined by 'and'
self.configs[variable].append(copy.deepcopy(self.ifs))
@staticmethod
def define_config(condition, header_map, ctype):
"""Get a config from the header map.
Arguments:
condition (str): preprocessor condition
header_map (dict): defines from the header files
ctype (str): preprocessor conditional type
"""
if ctype == 'ifdef':
if condition in header_map.keys():
config = header_map[condition]
else:
config = 'ALWAYS_ACTIVE'
LOGGER.error('Define not found: %s %s in %s', ctype, condition, header_map)
if ctype == 'ifndef':
if condition in header_map.keys():
config = '!(' + header_map[condition] + ')'
else:
config = 'NEVER_ACTIVE'
LOGGER.error('Define not found: %s %s in %s', ctype, condition, header_map)
if ctype == 'elseifdef':
if condition in header_map.keys():
config = f'!({header_map[condition]})'
else:
config = 'NEVER_ACTIVE'
LOGGER.error('Define not found: %s %s in %s', ctype, condition, header_map)
if ctype == 'elseifndef':
if condition in header_map.keys():
config = header_map[condition]
else:
config = 'ALWAYS_ACTIVE'
LOGGER.error('Define not found: %s %s in %s', ctype, condition, header_map)
return config
def get_configs(self, variable, header_map):
"""Get configs.
Does not remove redundant configs.
"""
configs = []
if variable not in self.configs:
LOGGER.warning('%s not found. Suspecting an inport that leads to a terminal.', variable)
return '(NEVER_ACTIVE)'
for config in self.configs[variable]:
tmp_config = []
for ctype, condition in config:
if ctype == 'if':
if condition in header_map.keys():
LOGGER.debug('Redefining %s as %s', condition, header_map[condition])
tmp_config.append(header_map[condition])
else:
tmp_config.append(condition)
elif ctype == 'elseif':
if condition in header_map.keys():
LOGGER.debug('Redefining %s as !(%s)', condition, header_map[condition])
tmp_config.append('!(' + header_map[condition] + ')')
else:
LOGGER.debug('Negating %s to !(%s)', condition, condition)
tmp_config.append('!(' + condition + ')')
else:
tmp_config.append(self.define_config(condition, header_map, ctype))
if not tmp_config and config:
LOGGER.warning('Config not found: %s from %s', config, self.configs)
tmp_config.append('ALWAYS_ACTIVE')
LOGGER.info('Current config: %s', tmp_config)
elif not config:
LOGGER.debug('No config, always active')
tmp_config.append('ALWAYS_ACTIVE')
configs.append(self.compose_and(list(self.sort_u(tmp_config))))
return self.compose_or(list(self.sort_u(configs)))
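A minimal end-to-end sketch of the parser above, using a hypothetical variable and codeswitch; each occurrence of the variable records the active #if stack, which get_configs then renders as ||-joined &&-groups:

cparser = CConfigParser()
cparser.set_regexes('rVcAbc_B')  # hypothetical variable name
cparser.parse_file_content([
    '#if Vc_Abc_B == 1',         # hypothetical codeswitch
    'rVcAbc_B = 1;',
    '#endif',
])
print(cparser.get_configs('rVcAbc_B', {}))  # ((Vc_Abc_B == 1))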
class JsonConfigHandler:
"""Handle the json config."""
def __init__(self, cparser, header_map):
"""Initialize handling of one json file.
Arguments:
cparser (obj): c-parser
header_map (dict): defines in the header files
"""
self.cparser = cparser
self.header_map = header_map
def traverse_unit(self, struct, setup=True):
"""Go through a data structure and look for configs to update.
Arguments:
struct (dict): data to go through
setup (bool): Set up the parser object (True) or replace configs (False)
"""
for name, data in struct.items():
if isinstance(data, dict) and name != 'API_blk':
# Core data has the property config, not configs
if data.get('API_blk') is not None or data.get('configs') is not None:
if setup:
self.cparser.set_regexes(name)
else:
data['configs'] = self.cparser.get_configs(name, self.header_map)
else:
self.traverse_unit(data, setup)
def update_config(self, struct, c_code, header_map=None):
"""Update dict.
Arguments:
struct (dict): A configuration dict or subdict
c_code (list): code part of a c-file
header_map (dict): defines in the header files
"""
if header_map is None:
header_map = {}
# Set up regexes:
self.traverse_unit(struct, setup=True)
self.cparser.parse_file_content(c_code)
self.traverse_unit(struct, setup=False)
@staticmethod
def read_config(config_file):
"""Read config file.
Arguments:
config_file (str): Full path to config file
"""
with open(config_file, encoding='latin-1') as unit_json:
unit_config = json.load(unit_json)
return unit_config
@staticmethod
def write_config(config_file, unit_config):
"""Write config file.
Arguments:
config_file (str): Full path to config file
unit_config (dict): Unit config to write to file
"""
with open(config_file, 'w', encoding="utf-8") as unit_json:
unit_json.write(json.dumps(unit_config, indent=2))
class HeaderConfigParser(ConfigParserCommon):
"""Parser for c-files."""
def set_defines(self, defines):
"""Set already defined defines."""
self.def_map = defines
def parse_else(self):
"""Crash if this is found in a header."""
raise NotImplementedError
def parse_defines(self, variable, definition):
"""Parse defines in c-files."""
if self.ifs and self.ifs[-1][0] == 'ifndef' and variable == self.ifs[-1][-1]:
# We have encountered a case of:
#
# #ifndef a
# #define a
# #define b
#
# Then we don't want b to be dependent on a not being defined.
c_type, condition = self.ifs.pop()
LOGGER.debug('Removing now defined %s from ifs: %s %s', variable, c_type, condition)
if definition:
LOGGER.info('Redefining %s as %s', variable, definition)
# Here we ignore the potential #if statements preceding this.
# Have not encountered a case where that matters.
# This structure does not support that logic.
# Potential for bugs.
self.configs[variable] = [definition]
elif self.ifs:
config = self.get_configs(self.ifs, self.def_map)
LOGGER.info('Defining %s as %s', variable, config)
if variable not in self.configs:
self.configs[variable] = []
self.configs[variable].append(copy.deepcopy(config))
self.def_map.update({variable: copy.deepcopy(config)})
@staticmethod
def define_config(condition, header_map, ctype):
"""Get a config from the header map.
Arguments:
condition
header_map
ctype
"""
if ctype == 'ifdef':
if condition in header_map.keys():
LOGGER.debug('returning %s as %s', condition, header_map[condition])
config = header_map[condition]
else:
config = 'ALWAYS_ACTIVE'
LOGGER.warning('Not Implemented Yet: %s %s', ctype, condition)
if ctype == 'ifndef':
if condition in header_map.keys():
LOGGER.debug('returning %s as %s', condition, header_map[condition])
config = '!(' + header_map[condition] + ')'
else:
config = 'ALWAYS_ACTIVE'
LOGGER.warning('Not Implemented Yet: %s %s', ctype, condition)
return config
def process_config(self, inconfigs, header_map):
"""Process configs."""
configs = []
for config in inconfigs:
LOGGER.debug('Current config: %s', config)
if isinstance(config, list):
configs.append(self.process_config(config, header_map))
else:
ctype, condition = config
if ctype == 'if':
if condition in header_map.keys():
configs.append(header_map[condition])
else:
configs.append(condition)
elif ctype in ['ifdef', 'ifndef']:
configs.append(self.define_config(condition, header_map, ctype))
else:
LOGGER.error('Not Implemented: %s', ctype)
if not configs:
configs = ['ALWAYS_ACTIVE']
return list(self.sort_u(configs))
def get_configs(self, configs, header_map):
"""Get configs.
Does not remove redundant configs.
"""
configs = self.process_config(configs, header_map)
if len(configs) > 1:
return self.compose_and(list(self.sort_u(configs)))
return configs[0]
def get_config(self):
"""Get the header map."""
header_map = self.def_map
for header_def, configs in self.configs.items():
header_map[header_def] = self.compose_or(configs)
return header_map
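A minimal sketch of the header parsing flow with hypothetical defines: a define guarded by #ifdef inherits the config of the guarding symbol:

hparser = HeaderConfigParser()
hparser.set_defines({'VcAbc_B': 'Vc_Abc_B == 1'})  # hypothetical known define
hparser.parse_file_content([
    '#ifdef VcAbc_B',
    '#define VcAbc_UseFeature',
    '#endif',
])
print(hparser.get_config()['VcAbc_UseFeature'])  # (Vc_Abc_B == 1)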
class ProcessHandler:
"""Class to collect functions for the process."""
@staticmethod
def parse_args():
"""Parse arguments."""
parser = argparse.ArgumentParser("Parse configs.json and c-files, to update code switch configes")
subparser = parser.add_subparsers(title='Operation mode', dest='mode',
help="Run chosen files on in a number of directories")
dir_parser = subparser.add_parser(
'models',
help="Run for one or multiple models. Script finds files generated from the model(s).")
dir_parser.add_argument('models', nargs='+',
help="Space separated list of model directories")
file_parser = subparser.add_parser('files',
help="Choose specific files. Mainly for manually written configs.")
file_parser.add_argument('c_file',
help="Full path to C-file")
file_parser.add_argument('config_file',
help="Full path to config file")
file_parser.add_argument('--aux_file',
help="Full path to tl_aux file. (Optional) ")
file_parser.add_argument('--local_file',
help="Full path to OPort file. (Optional) ")
args = parser.parse_args()
return args
@staticmethod
def get_files(model_path):
"""Get file paths from model path.
Arguments:
model_path (str): Path to a model (.mdl)
Returns:
local_file (str): Path to model_OPortMvd_LocalDefs.h
aux_file (str): Path to tl_aux_defines_model.h
config_file (str): Path to config_model.json
c_file (str): Path to model.c
"""
model_dir = os.path.dirname(model_path)
LOGGER.info('Processing %s', model_dir)
model_name = os.path.basename(model_dir)
local_file = os.path.join(model_dir, 'pybuild_src', f'{model_name}_OPortMvd_LocalDefs.h')
# aux_file does not contain the whole model-name if it is too long.
aux_file = os.path.join(model_dir, 'pybuild_src', f'tl_aux_defines_{model_name[2:12]}.h')
config_file = os.path.join(model_dir, 'pybuild_cfg', f'config_{model_name}.json')
clean_model_name = model_name.split('__')[0]
c_file = os.path.join(model_dir, 'pybuild_src', f'{clean_model_name}.c')
return local_file, aux_file, c_file, config_file
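A minimal sketch of the path derivation for a hypothetical model (POSIX separators shown):

local, aux, c_file, config = ProcessHandler.get_files(
    'Models/GroupA/VcAbcDef/VcAbcDef.mdl')
print(local)   # Models/GroupA/VcAbcDef/pybuild_src/VcAbcDef_OPortMvd_LocalDefs.h
print(aux)     # Models/GroupA/VcAbcDef/pybuild_src/tl_aux_defines_AbcDef.h
print(c_file)  # Models/GroupA/VcAbcDef/pybuild_src/VcAbcDef.c
print(config)  # Models/GroupA/VcAbcDef/pybuild_cfg/config_VcAbcDef.json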
@staticmethod
def update_config_file(c_file, config_file, header_map):
"""Update one config file.
Arguments:
c_file (str): Full path to c-file
config_file (str): Full path to config.json
header_map (dict): defines in the header files
"""
LOGGER.info('Updating %s based on %s', config_file, c_file)
cparser = CConfigParser()
c_code = cparser.read_file(c_file)
json_handler = JsonConfigHandler(cparser, header_map)
unit_config = json_handler.read_config(config_file)
json_handler.update_config(unit_config, c_code, header_map)
json_handler.write_config(config_file, unit_config)
@staticmethod
def get_header_config(header_file, def_map):
"""Get header config.
Arguments:
header_file (str): Full path to header file
def_map (dict): already known defines
"""
if header_file is None:
LOGGER.info('File not found: %s', header_file)
return def_map
if not os.path.isfile(header_file):
LOGGER.info('File not found: %s', header_file)
model_dir = os.path.dirname(header_file)
for tl_aux_file in glob.glob(os.path.join(model_dir, 'tl_aux*')):
LOGGER.warning('Looking for %s?', tl_aux_file)
return def_map
LOGGER.info('Parsing %s', header_file)
parser = HeaderConfigParser()
header_code = parser.read_file(header_file)
parser.set_defines(def_map)
parser.parse_file_content(header_code)
LOGGER.debug('Header configs: %s', pformat(parser.configs))
return parser.get_config()
@classmethod
def main(cls):
"""Run the main function of the script."""
args = cls.parse_args()
if args.mode == 'files':
LOGGER.info('Using manually supplied files %s', args)
local_defs = cls.get_header_config(args.local_file, {})
aux_defs = cls.get_header_config(args.aux_file, local_defs)
cls.update_config_file(args.c_file, args.config_file, aux_defs)
else:
for model in args.models:
local_file, aux_file, c_file, config_file = cls.get_files(model)
local_defs = cls.get_header_config(local_file, {})
aux_defs = cls.get_header_config(aux_file, local_defs)
if os.path.isfile(c_file) and os.path.isfile(config_file):
cls.update_config_file(c_file, config_file, aux_defs)
if __name__ == "__main__":
ProcessHandler.main()
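For reference, hedged invocation sketches in the style of the check_interface usage examples above (paths are hypothetical):

py -3.6 -m pybuild.config models <Models/ModelGroup/VcAbcDef/VcAbcDef.mdl>
py -3.6 -m pybuild.config files <VcAbcDef.c> <config_VcAbcDef.json> --aux_file <tl_aux_defines_AbcDef.h>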

296
pybuild/core.py Normal file
View File

@@ -0,0 +1,296 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""This module is used to parse the core definition files.
Also provides methods to filter the definitions per project.
"""
import os
import time
from collections import defaultdict
from pathlib import Path
from ruamel.yaml import YAML
from pybuild import build_defs
from pybuild.xlrd_csv import WorkBook
from pybuild.problem_logger import ProblemLogger
class Core(ProblemLogger):
"""A class holding core configuration data."""
__wrk_sheets = {'EventIDs': {'mdl_col': 0,
'id_col': 1,
'desc_col': 2,
'fid_col': 3,
'com_col': 4,
'data_col': 5},
'FunctionIDs': {'mdl_col': 0,
'id_col': 1,
'desc_col': 2,
'com_col': 3,
'data_col': 4},
'IUMPR': {'mdl_col': 0,
'id_col': 1,
'desc_col': 2,
'fid_col': 3,
'com_col': 4,
'data_col': 5},
'Mode$06': {'mdl_col': 0,
'id_col': 1,
'desc_col': 2,
'fid_col': 3,
'UAS_col': 4,
'com_col': 5,
'data_col': 6},
'Ranking': {'mdl_col': 0,
'id_col': 1,
'desc_col': 2,
'fid_col': 3,
'com_col': 4,
'data_col': 5}}
_NoFid = '_NoFid'
def __init__(self, project_cfg, unit_cfgs):
"""Parse the config files to an internal representation.
Args:
project_cfg (BuildProjConfig): Project configuration
unit_cfgs (UnitConfigs): Unit definitions
"""
super().__init__()
self._prj_cfg = project_cfg
self._unit_cfgs = unit_cfgs
self._csv_files = None
self._read_csv_files(self._prj_cfg.get_prj_cfg_dir())
self._parse_core_config()
def _read_csv_files(self, config_path):
"""Metod for reading the core csv confgiuration files."""
self.info('******************************************************')
self.info('Start parsing the core configuration files')
start_time = time.time()
csv_file_names = []
for sheet in self.__wrk_sheets:
fname = os.path.join(config_path,
'CoreIdNameDefinition_' + sheet + '.csv')
self.debug('Core csv: %s', fname)
csv_file_names.append(fname)
self._csv_files = WorkBook(csv_file_names)
self.info('Finished parsing the core configuration files (in %4.2f s)', time.time() - start_time)
def _parse_core_config(self):
"""Parse core IDs for all projects."""
# Parse sheet by sheet
tmp_ids = {}
for sheet_name, cols in self.__wrk_sheets.items():
worksheet = self._csv_files.sheet_by_name(sheet_name)
prj_row = worksheet.row(1)
curr_sheet = tmp_ids[sheet_name] = {}
for prj_col in range(cols['data_col'], len(prj_row)):
prj = prj_row[prj_col].value
curr_id = curr_sheet[prj] = {}
for curr_row in range(2, worksheet.nrows):
row = worksheet.row(curr_row)
val = row[prj_col].value
if val and val.strip() in ('x', 'X'):
func_tmp = row[cols['id_col']].value.strip()
if func_tmp != '':
if sheet_name == 'Mode$06':
uas_val = row[cols['UAS_col']].value
if isinstance(uas_val, float):
uas_val = int(uas_val)
curr_id[func_tmp] = (row[cols['desc_col']].value,
str(uas_val))
else:
curr_id[func_tmp] = row[
cols['desc_col']].value
# Reformat from Sheet->Proj->Id to Proj->Sheet->Id
self._ids = defaultdict(dict)
for s_name, s_data in tmp_ids.items():
for p_name, p_data in s_data.items():
self._ids[p_name][s_name] = p_data
def get_core_ids_proj(self, project):
"""Get the core IDs for a project.
Returns:
dict: Core IDs
"""
core_ids = self._ids[project]
# Check for unused core symbols
for sheet_name, sheet_data in core_ids.items():
for id_ in sheet_data:
if not self._unit_cfgs.check_if_in_per_cfg_unit_cfg('core', id_):
self.debug('Core symbol not used in current project: %s/%s', sheet_name, id_)
# Check for undefined core symbols in unit configs
ucfg = self._unit_cfgs.get_per_cfg_unit_cfg()
for id_ in ucfg.get('core', {}):
for _, core_sheet in core_ids.items():
if id_ in core_sheet:
break
else:
self.warning('Core symbol not defined for current project: %s', id_)
return core_ids
def get_current_core_config(self):
"""Return all the core configuration parameters for the current project.
Returns:
dict: All the core configuration parameters
"""
return self.get_core_ids_proj(self._prj_cfg.get_prj_config())
class HICore(ProblemLogger):
"""A class holding HI core configuration data."""
DTC_DEFINITION_FILE_NAME = 'DTCs.yaml'
FILE_NAME = 'VcCoreSupplierAbstraction'
def __init__(self, project_cfg, unit_cfgs):
"""Parse the config files to an internal representation.
Args:
project_cfg (BuildProjConfig): Project configuration.
unit_cfgs (UnitConfigs): Unit definitions.
"""
super().__init__()
self._prj_cfg = project_cfg
self._unit_cfgs = unit_cfgs
self.diagnostic_trouble_codes = self.get_diagnostic_trouble_codes()
def _get_project_dtcs(self):
"""Return a set with DTCs in the currently included SW-Units.
Returns:
(set): Set of DTCs in the currently included SW-Units.
"""
project_dtcs = set()
unit_cfg = self._unit_cfgs.get_per_unit_cfg()
for unit, data in unit_cfg.items():
event_data = data.get('core', {}).get('Events')
if event_data is None:
self.critical(f'No "core" or "core.Events" key in unit config for {unit}.')
continue
project_dtcs |= set(event_data.keys())
return project_dtcs
def _read_dtc_yaml_file(self):
"""Return a set with DTCs loaded from the project DTC yaml file.
Returns:
(set): Set of DTCs loaded from the DTC yaml file.
"""
diagnostic_trouble_codes = {}
dtc_definition_file_path = Path(self._prj_cfg.get_prj_cfg_dir(), self.DTC_DEFINITION_FILE_NAME)
if dtc_definition_file_path.exists():
with dtc_definition_file_path.open(mode='r', encoding='utf-8') as dtc_fh:
yaml = YAML(typ='safe', pure=True)
diagnostic_trouble_codes = yaml.load(dtc_fh)
else:
self.warning(
'Unable to generate DTC function calls. '
f'Cannot find file: {dtc_definition_file_path.as_posix()}.'
)
return diagnostic_trouble_codes
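A minimal sketch, assuming DTCs.yaml is the flat name-to-id mapping the loader above implies (the DTC name and id are hypothetical):

import io

from ruamel.yaml import YAML

yaml = YAML(typ='safe', pure=True)
dtcs = yaml.load(io.StringIO('VcAbcDef_SomeFault: 1193046\n'))
print(dtcs)  # {'VcAbcDef_SomeFault': 1193046}, later rendered as DTC_0x123456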
def get_diagnostic_trouble_codes(self):
"""Return a set of DTCs appearing in both the project and the project yaml file.
Returns:
(dict): Dict of DTCs, project yaml dict where the keys also appear in the project.
"""
project_dtcs = self._get_project_dtcs()
yaml_dtc_dict = self._read_dtc_yaml_file()
yaml_dtcs = set(yaml_dtc_dict.keys())
dtcs_not_in_project = yaml_dtcs - project_dtcs
dtcs_not_in_yaml = project_dtcs - yaml_dtcs
for key in dtcs_not_in_project:
self.warning(f'Ignoring DTC {key} since it does not appear in any model.')
del yaml_dtc_dict[key]
for key in dtcs_not_in_yaml:
self.warning(f'Ignoring DTC {key} since it does not appear in the project DTC yaml file.')
return yaml_dtc_dict
def get_header_content(self):
"""Get content for the DTC header file.
Returns:
(list(str)): List of lines to write to the DTC header file.
"""
name = self._prj_cfg.get_a2l_cfg()['name']
header_guard = f'{self.FILE_NAME.upper()}_H'
header = [
f'#ifndef {header_guard}\n',
f'#define {header_guard}\n',
'\n',
'/* Core API Supplier Abstraction */\n',
'\n',
'#include "tl_basetypes.h"\n',
f'#include "Rte_{name}.h"\n',
'\n'
]
footer = [f'\n#endif /* {header_guard} */\n']
if not self.diagnostic_trouble_codes:
return header + footer
body = [
'/* enum EventStatus {passed=0, failed=1, prepassed=2, prefailed=3} */\n',
'#define Dem_SetEventStatus(EventName, EventStatus)',
' ',
f'{self.FILE_NAME}_##EventName##_SetEventStatus(EventStatus)\n'
]
body.append(f'\n#include "{build_defs.PREDECL_CODE_ASIL_D_START}"\n')
for dtc_name in self.diagnostic_trouble_codes:
body.append(f'UInt8 {self.FILE_NAME}_{dtc_name}_SetEventStatus(UInt8 EventStatus);\n')
body.append(f'#include "{build_defs.PREDECL_CODE_ASIL_D_END}"\n')
return header + body + footer
def get_source_content(self):
"""Get content for the DTC source file.
Returns:
(list(str)): List of lines to write to the DTC source file.
"""
header = [
f'#include "{self.FILE_NAME}.h"\n',
'\n'
]
if not self.diagnostic_trouble_codes:
return header
body = [f'#include "{build_defs.CVC_CODE_ASIL_D_START}"\n']
for dtc_name, dtc_id in self.diagnostic_trouble_codes.items():
# hex function removes leading 0, below solution zero pads to 6 digits
dtc_hex_str = f"0x{dtc_id:06X}"
body.extend([
f'UInt8 {self.FILE_NAME}_{dtc_name}_SetEventStatus(UInt8 EventStatus)\n',
'{\n',
f' Rte_Call_Event_DTC_{dtc_hex_str}_SetEventStatus(EventStatus);\n',
' return 0;\n',
'}\n',
'\n'
])
body.append(f'#include "{build_defs.CVC_CODE_ASIL_D_END}"\n')
return header + body
def generate_dtc_files(self):
"""Generate required HI Core header files.
Only used for some projects, which do not copy static code."""
file_contents = {
'.h': self.get_header_content(),
'.c': self.get_source_content()
}
src_dst_dir = self._prj_cfg.get_src_code_dst_dir()
for extension, content in file_contents.items():
file_path = Path(src_dst_dir, self.FILE_NAME + extension)
with file_path.open(mode='w', encoding='utf-8') as file_handler:
file_handler.writelines(content)

358
pybuild/core_dummy.py Normal file
View File

@@ -0,0 +1,358 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Module containing classes for generating core API dummy code for Bosch and Denso suppliers.
These files are needed for building test SW before the supplier has
updated the core with all the needed core id definitions.
"""
import os
from pybuild import build_defs
from pybuild.problem_logger import ProblemLogger
from pybuild.types import get_ec_type
from pybuild.unit_configs import CodeGenerators
class CoreDummy(ProblemLogger):
"""A class for core API dummy file generation."""
def __init__(self, core_cfg, unit_cfg):
"""Initialize with core API configuration.
Args:
core_cfg (Core): Configuration object
unit_cfg (UnitConfigs): Unit definitions
"""
super().__init__()
self._core_cfg = core_cfg
self._unit_cfg = unit_cfg
self._uint16_type = self._get_uint_16_type()
self.fh_c = None
self.fh_h = None
def _get_uint_16_type(self):
if CodeGenerators.target_link in self._unit_cfg.code_generators:
return 'UInt16'
return get_ec_type('UInt16')
def _gen_rb_header(self, file_name):
"""Generate the RB start of the c and h files for the dummy core definition files.
Args:
file_name : The RB header file
"""
self.fh_c.write('/* Autogenerated core id dummy file */\n\n')
self.fh_c.write(f'#include "{file_name}"\n\n')
self.fh_c.write(f'#include "{build_defs.CVC_CODE_START}"\n\n')
self.fh_c.write('#ifndef VcEvDummy_ID\n')
self.fh_c.write(' #define VcEvDummy_ID 1\n')
self.fh_c.write('#endif /* VcEvDummy_ID */\n')
self.fh_c.write('#ifndef VcEvDummy_DEBMODE\n')
self.fh_c.write(' #define VcEvDummy_DEBMODE 0\n')
self.fh_c.write('#endif /* VcEvDummy_DEBMODE */\n')
self.fh_c.write('#ifndef VcFiDummy_ID\n')
self.fh_c.write(' #define VcFiDummy_ID 1\n')
self.fh_c.write('#endif /* VcFiDummy_ID */\n')
self.fh_c.write('#ifndef Vc06Dummy_ID\n')
self.fh_c.write(' #define Vc06Dummy_ID 1\n')
self.fh_c.write('#endif /* Vc06Dummy_ID */\n')
self.fh_c.write('#ifndef Vc09Dummy_ID\n')
self.fh_c.write(' #define Vc09Dummy_ID 1\n')
self.fh_c.write('#endif /* Vc09Dummy_ID */\n')
self.fh_c.write('#ifndef VcRvDummy_ID\n')
self.fh_c.write(' #define VcRvDummy_ID 1\n')
self.fh_c.write('#endif /* VcRvDummy_ID */\n\n')
def_name = file_name.replace('.', '_').upper()
self.fh_h.write('/* Autogenerated core id dummy file */\n\n')
self.fh_h.write(f'#ifndef {def_name}\n')
self.fh_h.write(f'#define {def_name}\n')
self.fh_h.write('#include "vcc_rb_core.h"\n')
self.fh_h.write(self._unit_cfg.base_types_headers)
def _gen_rb_event_dummies(self):
"""Generate RB dummy event declarations."""
for id_, comment in self._core_cfg.get('EventIDs', {}).items():
self.fh_h.write(f'#ifndef {id_}\n')
self.fh_h.write(f' /* {comment} */\n')
self.fh_h.write(f' extern const DSM_DFCType DFC_{id_} ;\n')
self.fh_h.write(f' #define {id_} (((DFC_{id_}.debmode) << 12u) | '
f'(DFC_{id_}.id))\n')
self.fh_h.write(f' #define {id_}_Dummy\n')
self.fh_h.write('#endif\n\n')
self.fh_c.write(f'#ifdef {id_}_Dummy\n')
self.fh_c.write(f' /* {comment} */\n')
self.fh_c.write(f' const DSM_DFCType DFC_{id_} = '
'{VcEvDummy_ID, VcEvDummy_DEBMODE};\n')
self.fh_c.write(f' #warning "CoreID: {id_} is not defined in '
'the supplier SW"\n')
self.fh_c.write('#endif\n\n')
def _gen_rb_fid_dummies(self):
"""Generate RB dummy function id declarations."""
for id_, comment in self._core_cfg.get('FunctionIDs', {}).items():
self.fh_h.write(f'#ifndef {id_}\n')
self.fh_h.write(f' /* {comment} */\n')
self.fh_h.write(f' extern const DSM_FIdType FId_{id_} ;\n')
self.fh_h.write(f' #define {id_} (FId_{id_}.id)\n')
self.fh_h.write(f' #define {id_}_Dummy\n')
self.fh_h.write('#endif\n\n')
self.fh_c.write(f'#ifdef {id_}_Dummy\n')
self.fh_c.write(f' /* {comment} */\n')
self.fh_c.write(f' const DSM_FIdType FId_{id_} = {{VcFiDummy_ID}};\n')
self.fh_c.write(f' #warning "CoreID: {id_} is not defined in the supplier SW"\n')
self.fh_c.write('#endif\n\n')
def _gen_rb_iumpr_dummies(self):
"""Generate RB dummy IUMPR declarations."""
for id_, comment in self._core_cfg.get('IUMPR', {}).items():
self.fh_h.write(f'#ifndef {id_}\n')
self.fh_h.write(f' /* {comment} */\n')
self.fh_h.write(f' extern const DSM_FIdType FId_{id_} ;\n')
self.fh_h.write(f' #define {id_} (FId_{id_}.id)\n')
self.fh_h.write(f' #define {id_}_Dummy\n')
self.fh_h.write('#endif\n\n')
self.fh_c.write(f'#ifdef {id_}_Dummy\n')
self.fh_c.write(f' /* {comment} */\n')
self.fh_c.write(f' const DSM_FIdType FId_{id_} = {{Vc09Dummy_ID}};\n')
self.fh_c.write(f' #warning "CoreID: {id_} is not defined in the supplier SW"\n')
self.fh_c.write('#endif\n\n')
def _gen_rb_mode06_dummies(self):
"""Generate RB dummy mode$06 declarations."""
for id_, comment in self._core_cfg.get('Mode$06', {}).items():
self.fh_h.write(f'#ifndef {id_}\n')
self.fh_h.write(f' /* {comment[0]}, UAS {comment[1]} */\n')
self.fh_h.write(f' extern const DSM_DTRType DTR_{id_};\n')
self.fh_h.write(f' #define {id_} (DTR_{id_}.id)\n')
self.fh_h.write(f' #define {id_}_Dummy\n')
self.fh_h.write('#endif\n\n')
self.fh_c.write(f'#ifdef {id_}_Dummy\n')
self.fh_c.write(f' /* {comment[0]}, UAS {comment[1]} */\n')
self.fh_c.write(f' const DSM_DTRType DTR_{id_} = {{Vc06Dummy_ID}};\n')
self.fh_c.write(f'#warning "CoreID: {id_} is not defined in the supplier SW"\n')
self.fh_c.write('#endif\n\n')
def _gen_rb_rnk_dummies(self):
"""Generate RB dummy ranking declarations."""
for id_, comment in self._core_cfg.get('Ranking', {}).items():
self.fh_h.write(f'#ifndef {id_}\n')
self.fh_h.write(f' /* {comment} */\n')
self.fh_h.write(f' extern const {self._uint16_type} DSMAppl_RnkValStorg_RnkId_{id_};\n')
self.fh_h.write(f' #define {id_} DSMAppl_RnkValStorg_RnkId_{id_}\n')
self.fh_h.write(f' #define {id_}_Dummy\n')
self.fh_h.write('#endif\n\n')
self.fh_c.write(f'#ifdef {id_}_Dummy\n')
self.fh_c.write(f' /* {comment} */\n')
self.fh_c.write(f' const {self._uint16_type} DSMAppl_RnkValStorg_RnkId_{id_} = '
'VcRvDummy_ID;\n')
self.fh_c.write(f' #warning "CoreID: {id_} is not defined in the supplier SW"\n')
self.fh_c.write('#endif\n\n')
def _gen_rb_end(self, file_name):
"""Generate RB footer of c and h files for dummy core definition files."""
def_name = file_name.replace('.', '_').upper()
self.fh_h.write(f'#endif /* {def_name} */\n')
self.fh_h.write('/*-------------------------------------'
'---------------------------------------*\\\n')
self.fh_h.write(' END OF FILE\n')
self.fh_h.write('\\*-------------------------------------'
'---------------------------------------*/\n')
self.fh_c.write(f'#include "{build_defs.CVC_CODE_END}"\n')
self.fh_c.write('/*--------------------------------------'
'--------------------------------------*\\\n')
self.fh_c.write(' END OF FILE\n')
self.fh_c.write('\\*--------------------------------------'
'--------------------------------------*/\n')
def generate_rb_core_dummy_files(self, file_name):
"""Generate core API dummy files for Bosch projects."""
cname = file_name + '.c'
hname = file_name + '.h'
with open(cname, 'w', encoding="utf-8") as self.fh_c:
with open(hname, 'w', encoding="utf-8") as self.fh_h:
_, f_name = os.path.split(hname)
self._gen_rb_header(f_name)
self._gen_rb_event_dummies()
self._gen_rb_fid_dummies()
self._gen_rb_iumpr_dummies()
self._gen_rb_mode06_dummies()
self._gen_rb_rnk_dummies()
self._gen_rb_end(f_name)
def _gen_dg2_header(self, file_name):
"""Generate Denso Gen2/3 header of c and h files for dummy core definition files."""
self.fh_c.write('/* Autogenerated core id dummy file */\n\n')
self.fh_c.write(f'#include "{file_name}"\n')
self.fh_c.write(self._unit_cfg.base_types_headers)
self.fh_c.write(f'#include "{build_defs.CVC_CODE_START}"\n\n')
self.fh_c.write('#define Vc06Undef 1\n')
self.fh_c.write('#define Vc09Undef 1\n')
self.fh_c.write('#define VcRvUndef 1\n\n')
self.fh_h.write('/* Autogenerated core id dummy file */\n\n')
def_name = file_name.replace('.', '_').upper()
self.fh_h.write(f'#ifndef {def_name}\n')
self.fh_h.write(f'#define {def_name}\n')
self.fh_h.write('\n#include "VcCoreSupplierAbstraction.h"\n\n')
self.fh_h.write('/* Check if denso make env. has defined the AUTOSAR declarations\n')
self.fh_h.write(' if not, do dummy declaration to be able to build a dummy I/F in\n')
self.fh_h.write(' an old make environment */\n')
self.fh_h.write('#ifndef FUNC\n')
self.fh_h.write(' #define FUNC(rettype, memclass) rettype /* from Compiler.h */\n')
self.fh_h.write(' #define P2VAR(ptrtype, memclass, ptrclass) ptrtype * '
'/* from Compiler.h */\n')
self.fh_h.write(' typedef unsigned char Std_ReturnType; /* from Std_Types.h */\n')
self.fh_h.write(' typedef unsigned char Dem_EventStatusType;\n')
self.fh_h.write(' typedef unsigned char boolean;\n')
self.fh_h.write(' #define AUTOMATIC\n')
self.fh_h.write(' #define RTE_APPL_DATA\n')
self.fh_h.write(' #define RTE_CODE /* from Compiler_cfg.h */\n')
self.fh_h.write('#endif /* FUNC */\n\n')
self.fh_h.write(f'#include "{build_defs.CVC_CODE_START}"\n')
def _gen_dg2_event_dummies(self):
"""Generate the Denso Gen2+ dummy event declarations."""
for id_, comment in self._core_cfg.get('EventIDs', {}).items():
# Compare with core0/app/OBDFW/bdcore/Rte_Diag.h
self.fh_h.write(f'#ifndef Rte_Call_Event_{id_}_SetEventStatus\n')
self.fh_h.write(f' /* {comment} */\n')
self.fh_h.write(' extern FUNC(Std_ReturnType, RTE_CODE) '
f'Rte_Call_Diag_Event_{id_}_SetEventStatus '
'(Dem_EventStatusType EventStatus);\n')
self.fh_h.write('#endif\n\n')
self.fh_c.write(f'#ifndef Rte_Call_Event_{id_}_SetEventStatus\n')
self.fh_c.write(f' /* {comment} */\n')
self.fh_c.write(' FUNC(Std_ReturnType, RTE_CODE) '
f'Rte_Call_Diag_Event_{id_}_SetEventStatus '
'(Dem_EventStatusType EventStatus) { return 0;}\n')
self.fh_c.write('#endif\n\n')
# Generate Dem_SetEventDisabled dummies
self.fh_h.write(f'#ifndef Rte_Call_Event_{id_}_SetEventDisabled\n')
self.fh_h.write(f' /* {comment} */\n')
self.fh_h.write(' extern FUNC(Std_ReturnType, RTE_CODE) '
f'Rte_Call_Diag_Event_{id_}_SetEventDisabled (void);\n')
self.fh_h.write('#endif\n\n')
self.fh_c.write(f'#ifndef Rte_Call_Event_{id_}_SetEventDisabled\n')
self.fh_c.write(f' /* {comment} */\n')
self.fh_c.write(' FUNC(Std_ReturnType, RTE_CODE) '
f'Rte_Call_Diag_Event_{id_}_SetEventDisabled '
'(void) { return 0;}\n')
self.fh_c.write('#endif\n\n')
def _gen_dg2_fid_dummies(self):
"""Generate the Denso Gen2+ dummy function id declarations."""
for id_, comment in self._core_cfg.get('FunctionIDs', {}).items():
self.fh_h.write(f'#ifndef Rte_Call_FI_{id_}_GetFunctionPermission\n')
self.fh_h.write(f' /* {comment} */\n')
self.fh_h.write(' extern FUNC(Std_ReturnType, RTE_CODE) '
f'Rte_Call_Diag_FI_{id_}_GetFunctionPermission '
'(P2VAR(boolean, AUTOMATIC, RTE_APPL_DATA) Permission);\n')
self.fh_h.write('#endif\n\n')
self.fh_c.write(f'#ifndef Rte_Call_FI_{id_}_GetFunctionPermission\n')
self.fh_c.write(f' /* {comment} */\n')
self.fh_c.write(' FUNC(Std_ReturnType, RTE_CODE) '
f'Rte_Call_Diag_FI_{id_}_GetFunctionPermission '
'(P2VAR(boolean, AUTOMATIC, RTE_APPL_DATA) Permission) '
'{ *Permission = 1; return 0;}\n')
self.fh_c.write('#endif\n\n')
def _gen_dg2_iumpr_dummies(self):
"""Generate the Denso Gen2+ dummy IUMPR declarations."""
for id_, comment in self._core_cfg.get('IUMPR', {}).items():
self.fh_h.write(f'#ifndef IUMID_{id_}\n')
self.fh_h.write(f' /* {comment} */\n')
self.fh_h.write(f' #define IUMID_{id_} cIUMID_{id_}\n')
self.fh_h.write(f' #define {id_}_DUMMY\n')
self.fh_h.write(f' extern const {self._uint16_type} cIUMID_{id_};\n')
self.fh_h.write('#endif\n\n')
self.fh_c.write(f'#ifdef {id_}_DUMMY\n')
self.fh_c.write(f' /* {comment} */\n')
self.fh_c.write(f' const {self._uint16_type} cIUMID_{id_} = Vc09Undef;\n')
self.fh_c.write('#endif\n\n')
def _gen_dg2_mode06_dummies(self):
"""Generate the Denso Gen2+ dummy mode$06 declarations."""
for id_, comment in self._core_cfg.get('Mode$06', {}).items():
self.fh_h.write(f'#ifndef TR_{id_}\n')
self.fh_h.write(f' /* {comment[0]}, UAS {comment[1]} */\n')
self.fh_h.write(f' #define TR_{id_} cTR_{id_}\n')
self.fh_h.write(f' #define {id_}_DUMMY\n')
self.fh_h.write(f' extern const {self._uint16_type} cTR_{id_};\n')
self.fh_h.write('#endif\n\n')
self.fh_c.write(f'#ifdef {id_}_DUMMY\n')
self.fh_c.write(f' /* {comment[0]}, UAS {comment[1]} */\n')
self.fh_c.write(f' const {self._uint16_type} cTR_{id_} = Vc06Undef;\n')
self.fh_c.write('#endif\n\n')
def _gen_dg2_rnk_dummies(self):
"""Generate the Denso Gen2+ dummy ranking declarations."""
for id_, comment in self._core_cfg.get('Ranking', {}).items():
self.fh_h.write(f'#ifndef RVID_{id_}\n')
self.fh_h.write(f' /* {comment} */\n')
self.fh_h.write(f' #define RVID_{id_} cRVID_{id_}\n')
self.fh_h.write(f' #define {id_}_DUMMY\n')
self.fh_h.write(f' extern const {self._uint16_type} cRVID_{id_};\n')
self.fh_h.write('#endif\n\n')
self.fh_c.write(f'#ifdef {id_}_DUMMY\n')
self.fh_c.write(f' /* {comment} */\n')
self.fh_c.write(f' const {self._uint16_type} cRVID_{id_} = VcRvUndef;\n')
self.fh_c.write('#endif\n\n')
def _gen_dg2_end(self, file_name):
"""Generate Denso Gen2+ footer of c and h files for dummy core definition files."""
def_name = file_name.replace('.', '_').upper()
self.fh_h.write(f'#include "{build_defs.CVC_CODE_END}"\n')
self.fh_h.write(f'#endif /* {def_name} */\n')
self.fh_h.write('/*------------------------------------------'
'----------------------------------*\\\n')
self.fh_h.write(' END OF FILE\n')
self.fh_h.write('\\*-----------------------------------------'
'-----------------------------------*/\n')
self.fh_c.write(f'#include "{build_defs.CVC_CODE_END}"\n')
self.fh_c.write('/*------------------------------------------'
'----------------------------------*\\\n')
self.fh_c.write(' END OF FILE\n')
self.fh_c.write('\\*-----------------------------------------'
'-----------------------------------*/\n')
def generate_dg2_core_dummy_files(self, file_name):
"""Generate the core API dummy files for Denso gen2+ projects."""
cname = file_name + '.c'
hname = file_name + '.h'
with open(cname, 'w', encoding="utf-8") as self.fh_c:
with open(hname, 'w', encoding="utf-8") as self.fh_h:
_, f_name = os.path.split(hname)
self._gen_dg2_header(f_name)
self._gen_dg2_event_dummies()
self._gen_dg2_fid_dummies()
self._gen_dg2_iumpr_dummies()
self._gen_dg2_mode06_dummies()
self._gen_dg2_rnk_dummies()
self._gen_dg2_end(f_name)
def generate_csp_core_dummy_files(self, file_name):
"""Generate core API dummy files for Bosch projects."""
cname = file_name + '.c'
hname = file_name + '.h'
with open(cname, 'w', encoding="utf-8") as self.fh_c:
with open(hname, 'w', encoding="utf-8") as self.fh_h:
_, f_name = os.path.split(hname)
self._gen_dg2_header(f_name)
self._gen_dg2_event_dummies()
self._gen_dg2_fid_dummies()
self._gen_dg2_iumpr_dummies()
self._gen_dg2_mode06_dummies()
self._gen_dg2_rnk_dummies()
self._gen_dg2_end(f_name)

View File

@@ -0,0 +1,61 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Module to create an a2l file from a conversion table file."""
import argparse
import json
from pathlib import Path
def get_vtab_text(vtab):
"""Convert vtab dict to a2l text."""
vtab_text = (
' /begin COMPU_VTAB\n'
f' CONV_TAB_{vtab["name"]} /* Name */\n'
' "Conversion table" /* LongIdentifier */\n'
' TAB_VERB /* ConversionType */\n'
f' {len(vtab["disp_values"])} /* NumberValuePairs */\n'
)
vtab_text += ''.join(
f' {vtab["start_value"]+i} /* InVal */\n'
f' "{value}" /* OutVal */\n'
for i, value in enumerate(vtab['disp_values'])
)
vtab_text += ' /end COMPU_VTAB\n\n'
return vtab_text
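A minimal sketch with one hypothetical conversion-table entry:

example_vtab = {'name': 'VcAbcSt', 'start_value': 0, 'disp_values': ['Off', 'On']}
print(get_vtab_text(example_vtab))
# Emits a COMPU_VTAB named CONV_TAB_VcAbcSt with the pairs 0 -> "Off", 1 -> "On".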
def create_conversion_table(input_json: Path, output_a2l: Path):
"""Create a2l conversion table for custom units."""
with open(input_json.resolve(), encoding="utf-8") as f_h:
conversion_table = json.load(f_h)
with open(output_a2l.resolve(), 'w', encoding="utf-8") as f_h:
for vtab in conversion_table:
f_h.write(get_vtab_text(vtab))
def parse_args():
"""Parse args."""
parser = argparse.ArgumentParser('Create a2l file from conversion_table.json file')
parser.add_argument('input_file', type=Path)
parser.add_argument('output_file', type=Path)
args = parser.parse_args()
return args
def main():
"""Main."""
args = parse_args()
conversion_table_json = args.input_file
conversion_table_a2l = args.output_file
create_conversion_table(conversion_table_json, conversion_table_a2l)
if __name__ == '__main__':
main()

604
pybuild/dids.py Normal file
View File

@@ -0,0 +1,604 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Module containing classes for DID definitions.
This module is used to parse DID definition files and merge with the unit definitions to find DIDs in a project.
It then generates the DID definition c-files for the supplier DID API.
"""
import csv
import os
from pathlib import Path
from ruamel.yaml import YAML
from pybuild import build_defs
from pybuild.lib.helper_functions import deep_dict_update
from pybuild.problem_logger import ProblemLogger
from pybuild.types import get_ec_type, get_float32_types
from pybuild.unit_configs import CodeGenerators
def get_dids_in_prj(unit_cfgs):
"""Return a dict with DIDs in the currently included SW-Units.
Args:
unit_cfgs (UnitConfigs): Unit definitions.
Returns:
error_message (str): Message in case something went wrong.
dict: a dict with all dids in the project, in the below format:
::
{'DID_VARIABLE_NAME': {
'handle': 'VcRegCh/VcRegCh/Subsystem/VcRegCh/1000_VcRegCh/1600_DID/Gain14',
'configs': ['all'],
'type': 'UInt32',
'unit': '-',
'lsb': 1,
'max': 20,
'min': 0,
'offset': 0,
'description': 'Actual Regen State',
'name': 'DID_VARIABLE_NAME',
'class': 'CVC_DISP'
}
}
"""
dids_prj = {}
error_messages = []
unit_cfg = unit_cfgs.get_per_unit_cfg()
for unit, data in unit_cfg.items():
dids = data.get('dids')
if dids is None:
error_messages.append(f'No "dids" key in unit config for {unit}.')
continue
for name, did in dids.items():
dids_prj[name] = did
return error_messages, dids_prj
class DIDs(ProblemLogger):
"""A class for handling of DID definitions."""
def __init__(self, build_cfg, unit_cfgs):
"""Parse DID definition files referenced by project config.
Args:
build_cfg (BuildProjConfig): Project configuration
unit_cfgs (UnitConfigs): Unit definitions
"""
super().__init__()
self._build_cfg = build_cfg
self._unit_cfgs = unit_cfgs
did_filename = self._build_cfg.get_did_cfg_file_name()
cfg_dir = self._build_cfg.get_prj_cfg_dir()
did_f32_cfg_file = os.path.join(cfg_dir, did_filename + '_Float32.csv')
did_u32_cfg_file = os.path.join(cfg_dir, did_filename + '_UInt32.csv')
self._dids_f32 = self._load_did_config_files(did_f32_cfg_file)
self._dids_u32 = self._load_did_config_files(did_u32_cfg_file)
self.fh_h = None
self.fh_c = None
get_did_error_messages, self._did_dict = get_dids_in_prj(unit_cfgs)
self._did_defs = self.get_did_config()
self._float32_types = get_float32_types()
if get_did_error_messages:
self.critical('\n'.join(get_did_error_messages))
def _load_did_config_files(self, config_file):
"""Load the did config files."""
dids = {}
with open(config_file, mode='r', encoding='utf-8') as did_fh:
csv_did = csv.reader(did_fh, delimiter=';')
did = list(csv_did)
dids['dids'] = {row[0]: int(row[1], 16) for row in did[3:]}
dids['start_did'] = int(did[1][0], 16)
dids['end_did'] = int(did[1][1], 16)
self._check_dids(dids)
return dids
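A minimal sketch of the csv layout the loader above expects (names and ids are hypothetical; rows 0 and 2 are header rows the parser skips):

import csv

did_csv_lines = [
    'DID range;;',           # row 0: skipped
    '0xF400;0xF4FF;',        # row 1: start did ; end did
    'variable;did;',         # row 2: skipped
    'sVcAbcDef_Tq;0xF401;',  # rows 3+: variable name ; DID in hex
]
rows = list(csv.reader(did_csv_lines, delimiter=';'))
print({row[0]: int(row[1], 16) for row in rows[3:]})  # {'sVcAbcDef_Tq': 62465}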
@staticmethod
def _check_dids(dids):
"""Check that all dids are within the start and end values."""
start_did = dids['start_did']
end_did = dids['end_did']
for var, did in dids['dids'].items():
if did < start_did:
raise ValueError(f'{var} has a too low did 0x{did:X} start did is 0x{start_did:X}')
if did > end_did:
raise ValueError(f'{var} has a too high did 0x{did:X} start did is 0x{start_did:X}')
def gen_did_def_files(self, filename):
"""Generate the VcDidDefinitions.c & h files used by the Did-API."""
with open(filename + '.h', 'w', encoding="utf-8") as self.fh_h:
with open(filename + '.c', 'w', encoding="utf-8") as self.fh_c:
dids_f32, dids_u32, errors = self._check_and_reformat_dids()
self._gen_did_def_c_file(dids_f32, dids_u32, errors)
self._gen_did_def_h_file(dids_f32, dids_u32)
return errors
def _check_and_reformat_dids(self):
"""Check that DIDs are defined and create two new dicts."""
dids_f32 = {}
dids_u32 = {}
did_def_f32s = self._did_defs['Float32']['dids']
did_def_u32s = self._did_defs['UInt32']['dids']
errors = []
for sig in sorted(self._did_dict.keys()):
did = self._did_dict[sig]
if did['type'] in self._float32_types:
if sig in did_def_f32s:
dids_f32[did_def_f32s[sig]] = did
else:
msg = f'Did for Float32 signal "{sig}" not defined'
self.critical(msg)
errors.append(msg)
else:
if sig in did_def_u32s:
dids_u32[did_def_u32s[sig]] = did
else:
msg = f'Did for UInt32 signal "{sig}" not defined'
self.critical(msg)
errors.append(msg)
return (dids_f32, dids_u32, errors)
def _get_datatypes(self):
tl_types = ['UInt8', 'Int8', 'UInt16', 'Int16', 'UInt32', 'Int32', 'Float32', 'Bool']
data_types_tl = [f'{tl_type}_' for tl_type in tl_types]
data_types_ec = [f'{get_ec_type(tl_type)}_' for tl_type in tl_types]
if len(self._unit_cfgs.code_generators) > 1:
self.warning('Cannot generate DIDs for more than one generator. '
'Defaulting to TargetLink.')
return ', '.join(data_types_tl)
if CodeGenerators.target_link in self._unit_cfgs.code_generators:
return ', '.join(data_types_tl)
return ', '.join(data_types_ec)
def _get_type(self, tl_type):
if CodeGenerators.target_link in self._unit_cfgs.code_generators:
return tl_type
return get_ec_type(tl_type)
def _gen_did_def_h_file(self, dids_f32, dids_u32):
"""Generate the VcDidDefinitions.h files used by the Did-API."""
_, f_name = os.path.split(self.fh_h.name)
header_def_name = f_name.upper().replace('.', '_')
self.fh_h.write(f'#ifndef {header_def_name}\n')
self.fh_h.write(f'#define {header_def_name}\n\n')
self.fh_h.write(self._unit_cfgs.base_types_headers)
self.fh_h.write(f'enum Datatypes {{{self._get_datatypes()}}};\n\n')
self.fh_h.write(f'#define DID_DATASTRUCT_LEN_FLOAT32 {len(dids_f32)}\n')
self.fh_h.write(f'#define DID_DATASTRUCT_LEN_UINT32 {len(dids_u32)}\n\n')
uint16_type = self._get_type('UInt16')
float32_type = self._get_type('Float32')
self.fh_h.write('struct DID_Mapping_UInt32 {\n\t'
f'{uint16_type} DID;'
'\n\tvoid* data;\n\tenum Datatypes type;\n};\n\n')
self.fh_h.write('struct DID_Mapping_Float32 {\n\t'
f'{uint16_type} DID;'
'\n\t'
f'{float32_type}* data;'
'\n};\n\n')
self.fh_h.write(f'#include "{build_defs.PREDECL_START}"\n')
self.fh_h.write('extern const struct DID_Mapping_UInt32 DID_data_struct_UInt32[];\n')
self.fh_h.write('extern const struct DID_Mapping_Float32 DID_data_struct_Float32[];\n')
self.fh_h.write('/* Floats */\n')
for key in sorted(dids_f32.keys()):
did = dids_f32[key]
self.fh_h.write(f'extern {did["type"]} {did["name"]}; /* Did id: 0x{key:X} */\n')
self.fh_h.write('/* Integers & Bools */\n')
for key in sorted(dids_u32.keys()):
did = dids_u32[key]
self.fh_h.write(f'extern {did["type"]} {did["name"]}; /* Did id: 0x{key:X} */\n')
self.fh_h.write(f'#include "{build_defs.PREDECL_END}"\n')
self.fh_h.write(f'\n#endif /* {header_def_name} */\n')
def _gen_did_def_c_file(self, dids_f32, dids_u32, errors):
"""Generate the VcDidDefinitions.c files used by the Did-API."""
_, filename = os.path.split(self.fh_h.name)
self.fh_c.write(f'#include "{filename}"\n\n')
self.fh_c.write(f'#include "{build_defs.CVC_CODE_START}"\n\n')
self.fh_c.write('/* The table shall be sorted in ascending DID order!\n'
' If not, the search algorithm does not work */\n')
self.fh_c.write('const struct DID_Mapping_Float32 DID_data_struct_Float32[] = {\n')
keys = sorted(dids_f32.keys())
for key in keys:
did = dids_f32[key]
if key == keys[-1]:
delim = ' '
else:
delim = ','
self.fh_c.write('\t{0x%X, &%s}%c /* %s */ \n' %
(key, did['name'], delim, did['handle']))
if not keys:
self.fh_c.write('\t{0x0000, 0L} /* Dummy entry */ \n')
self.fh_c.write('};\n\n')
self.fh_c.write('const struct DID_Mapping_UInt32 DID_data_struct_UInt32[] = {\n')
keys = sorted(dids_u32.keys())
for key in keys:
did = dids_u32[key]
if key == keys[-1]:
delim = ' '
else:
delim = ','
self.fh_c.write('\t{0x%X, &%s, %s_}%c /* %s */ \n' %
(key, did['name'], did['type'], delim, did['handle']))
if not keys:
self.fh_c.write(f'\t{{0x0000, 0L, {self._get_type("UInt32")}_}} /* Dummy entry */ \n')
self.fh_c.write('};\n\n')
if errors:
self.fh_c.write('/* *** DIDs not in the definition file! ****\n')
for error in errors:
self.fh_c.write(f'{error}\n')
self.fh_c.write('*/\n')
self.fh_c.write(f'\n#include "{build_defs.CVC_CODE_END}"\n')
self.fh_c.write('\n/*------------------------------------------------------'
'----------------------*\\\n END OF FILE\n\\*-------------'
'---------------------------------------------------------------*/')
def gen_did_carcom_extract(self, filename):
"""Generate the csv-file used for carcom database import."""
with open(filename, 'w', encoding="utf-8") as carcom_file:
for sig in sorted(self._did_dict.keys()):
did = self._did_dict[sig]
carcom_file.write(self._format_did_csv_line(did))
@staticmethod
def _convert_value(value, value_type, default_value=0):
if value in ['', '-']:
return value_type(default_value)
return value_type(value)
@staticmethod
def _hex_location(value):
return hex(value).upper().lstrip('0X')
def _format_did_csv_line(self, did):
"""Format the line based on the did.
Arguments:
did (dict): DID data
"""
did_line = '{' + '};{'.join(['location',
'description',
'name',
'name',
'bytes',
'offset',
'bits',
'data_type',
'nine',
'ten',
'low',
'high',
'scaling',
'compare',
'unit',
'sixteen',
'service',
'eighteen',
'sessions']) + '}\n'
float_format = '06'
compare = ''
did_bytes = 4
did_offset = 0 # Always use 0. Not sure why.
did_bits = 8 * did_bytes
service = 17
sessions = '22: 01 02 03'
unknown = '' # Fields were empty in old system
did_def_f32s = self._did_defs['Float32']['dids']
did_def_u32s = self._did_defs['UInt32']['dids']
if did['name'] in did_def_f32s:
location = self._hex_location(did_def_f32s[did['name']])
elif did['name'] in did_def_u32s:
location = self._hex_location(did_def_u32s[did['name']])
else:
self.warning('Could not find location for %s', did['name'])
location = unknown
if did['type'] in self._float32_types:
did_type = '4-byte float'
scaling = 'x*1'
else:
did_type = 'Unsigned'
u32_scaling_base = '(x-2147483647){{operator}}{{lsb:{float_format}}} {{sign}} {{offset:{float_format}}}'
u32_scaling = u32_scaling_base.format(float_format=float_format)
offset = self._convert_value(did['offset'], float, 0)
if offset > 0:
sign = '+'
else:
sign = '-'
lsb = self._convert_value(did['lsb'], float, 1)
if lsb > 0:
operator = '*'
else:
operator = '/'
lsb = 1.0/lsb # Why we do this, I do not know.
scaling = u32_scaling.format(operator=operator,
lsb=lsb,
sign=sign,
offset=offset)
return did_line.format(location=location,
name=did['name'],
description=did['description'],
bytes=did_bytes,
offset=did_offset,
bits=did_bits,
data_type=did_type,
nine=unknown,
ten=unknown,
low=did['min'],
high=did['max'],
scaling=scaling,
compare=compare,
unit=did['unit'],
sixteen=unknown,
service=service,
eighteen=unknown,
sessions=sessions)
def get_did_config(self):
"""Return a dict with the defined DIDs for all configs.
Returns:
dict: a dict with the DIDs defined for all configs
"""
# self._checkConfig()
return {'Float32': self._dids_f32, 'UInt32': self._dids_u32}
class HIDIDs(ProblemLogger):
"""A class for handling of HI DID definitions."""
FILE_NAME = 'VcDIDAPI'
def __init__(self, build_cfg, unit_cfgs):
"""Init.
Args:
build_cfg (BuildProjConfig): Project configuration
unit_cfgs (UnitConfigs): Unit definitions
"""
super().__init__()
self._build_cfg = build_cfg
self._unit_cfgs = unit_cfgs
self.did_dict = self._compose_did_data()
def _load_did_config_files(self, config_file):
"""Load the did config files.
Args:
config_file (str): Path to DID configuration file.
Returns:
dids (dict): Parsed DIDs from the configuration file.
"""
dids = {}
config_file_path = Path(config_file)
if config_file_path.exists():
with config_file_path.open(mode='r', encoding='utf-8') as did_fh:
yaml = YAML(typ='safe', pure=True)
dids = self._verify_did_config_dict(yaml.load(did_fh))
else:
self.warning(f'Unable to parse DIDs. Cannot find file: {config_file_path.as_posix()}.')
return dids
def _verify_did_config_dict(self, dids):
"""Verify the structure of the dict from the DID configuration file.
Missing keys will be added but also produce critical errors.
Args:
dids (dict): DIDs parsed from DID configuration file.
Returns:
(dict): Updated DID dict.
"""
optional_keys = {
'nr_of_bytes',
}
expected_keys = {
'id',
'data_type',
'function_type',
}
expected_function_type_keys = {
'read_data',
'read_data_max',
'read_data_min',
'condition_check',
'condition_check_max',
'condition_check_min',
}
for did, did_data in dids.items():
did_keys = set(did_data.keys())
used_optional_keys = did_keys & optional_keys
unknown_keys = did_keys - (expected_keys | optional_keys)
missing_keys = expected_keys - did_keys
for key in used_optional_keys:
self.info(f'Using optional key {key} for DID {did}.')
for key in unknown_keys:
self.warning(f'Ignoring unknown element {key} for DID {did}.')
del did_data[key]
for key in missing_keys:
self.critical(f'DID {did} is missing element {key}.')
did_data[key] = '<missing>'
if did_data['function_type'] not in expected_function_type_keys:
self.critical(f"DID {did} lists unknown function type {did_data['function_type']}")
did_data['function_type'] = '<missing>'
return dids
def _compose_did_data(self):
"""Gather and merge DID data from project simulink models and DID configuration file.
Returns:
did_dict (dict): Dict containing project DID data.
"""
get_did_error_messages, project_dids = get_dids_in_prj(self._unit_cfgs)
if get_did_error_messages:
self.critical('\n'.join(get_did_error_messages))
return {}
did_filename = self._build_cfg.get_did_cfg_file_name()
config_directory = self._build_cfg.get_prj_cfg_dir()
did_config_file = os.path.join(config_directory, did_filename)
dids = self._load_did_config_files(did_config_file)
did_dict = self.verify_dids(project_dids, dids)
for data in did_dict.values():
data['function'] = self.compose_did_function(data)
return did_dict
@staticmethod
def compose_did_function(did_data):
"""Compose DID function calls.
Args:
did_data (dict): Dict describing a DID in the project.
Returns:
function (str): Function to generate for given DID.
"""
did_id = did_data["id"]
data_type = did_data["data_type"]
type_to_function_map = {
'<missing>': f'DID_{did_id}_Missing({data_type} *Data)',
'read_data': f'DID_{did_id}_Runnable_ReadData({data_type} *Data)',
'read_data_max': f'DID_{did_id}_Runnable_MAX_ReadData({data_type} *Data)',
'read_data_min': f'DID_{did_id}_Runnable_MIN_ReadData({data_type} *Data)',
'condition_check': f'DID_{did_id}_Runnable_ConditionCheckRead({data_type} *ErrorCode)',
'condition_check_max': f'DID_{did_id}_Runnable_MAX_ConditionCheckRead({data_type} *ErrorCode)',
'condition_check_min': f'DID_{did_id}_Runnable_MIN_ConditionCheckRead({data_type} *ErrorCode)'
}
return type_to_function_map[did_data['function_type']]
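# Example sketch: for a DID entry such as
#     {'id': '0xF186', 'data_type': 'UInt8', 'function_type': 'read_data'}
# this returns the signature string
#     'DID_0xF186_Runnable_ReadData(UInt8 *Data)'
# (hypothetical id/type values, for illustration only).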
def verify_dids(self, project_dids, dids):
"""Verify the DIDs.
* Model DIDs must be defined in DID configuration file.
* ID numbers can only appear once per function type.
Args:
project_dids (dict): DIDs listed in project/simulink models.
dids (dict): DIDs listed in the DID configuration file.
Returns:
valid_dids (dict): Validated DIDs listed in both DID configuration file as well as project.
"""
valid_dids = {}
did_id_usage = {}
if not project_dids:
for did in dids:
self.warning(f'Ignoring DID {did}, not defined in any model.')
return valid_dids
for name in project_dids:
if name not in dids:
self.warning(f'DID {name} not defined in DID definition file.')
continue
did_id = dids[name]['id']
function_type = dids[name]['function_type']
if did_id in did_id_usage:
if function_type in did_id_usage[did_id]:
self.critical(
f'ID {did_id} is '
f'already used for DID {did_id_usage[did_id][function_type]} of '
f'function type {function_type}.'
)
continue
did_id_usage[did_id][function_type] = name
else:
did_id_usage[did_id] = {function_type: name}
valid_dids[name] = deep_dict_update(dids[name], project_dids[name])
return valid_dids
def get_header_file_content(self):
"""Get content for the DID API header file.
Returns:
(list(str)): List of lines to write to DID API header file.
"""
name = self._build_cfg.get_a2l_cfg()['name']
header_guard = f'{self.FILE_NAME.upper()}_H'
header = [
f'#ifndef {header_guard}\n',
f'#define {header_guard}\n',
'\n',
'#include "tl_basetypes.h"\n',
f'#include "Rte_{name}.h"\n',
'\n'
]
footer = [f'\n#endif /* {header_guard} */\n']
if not self.did_dict:
return header + footer
body = [f'#include "{build_defs.PREDECL_DISP_ASIL_D_START}"\n']
for did_data in self.did_dict.values():
define = did_data["class"].split('/')[-1] # E.g. for ASIL D it is ASIL_D/CVC_DISP_ASIL_D
body.append(f'extern {define} {did_data["type"]} {did_data["name"]};\n')
body.append(f'#include "{build_defs.PREDECL_DISP_ASIL_D_END}"\n')
body.append(f'\n#include "{build_defs.PREDECL_CODE_ASIL_D_START}"\n')
for did_data in self.did_dict.values():
body.append(f'void {did_data["function"]};\n')
body.append(f'#include "{build_defs.PREDECL_CODE_ASIL_D_END}"\n')
return header + body + footer
def get_source_file_content(self):
"""Get content for the DID API source file.
Returns:
(list(str)): List of lines to write to DID API source file.
"""
header = [
f'#include "{self.FILE_NAME}.h"\n',
'\n'
]
if not self.did_dict:
return header
body = [f'#include "{build_defs.CVC_CODE_ASIL_D_START}"\n']
for did, did_data in self.did_dict.items():
size = f'{did_data["nr_of_bytes"]}' if 'nr_of_bytes' in did_data else f'sizeof({did_data["data_type"]})'
if 'ConditionCheckRead' in did_data["function"]:
argument = 'ErrorCode'
else:
argument = 'Data'
body.extend([
f'void {did_data["function"]}\n',
'{\n',
f' memcpy({argument}, &{did}, {size});\n',
'}\n'
])
body.append(f'#include "{build_defs.CVC_CODE_ASIL_D_END}"\n')
return header + body
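# Sketch of one generated source entry, assuming a hypothetical DID named
# 'sVcDemo_D_Did' with id '0xF186', function type 'read_data' and data
# type 'UInt8':
#     void DID_0xF186_Runnable_ReadData(UInt8 *Data)
#     {
#         memcpy(Data, &sVcDemo_D_Did, sizeof(UInt8));
#     }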
def generate_did_files(self):
"""Generate required DID API files.
Only used for some projects, which don't copy static code."""
file_contents = {
'.h': self.get_header_file_content(),
'.c': self.get_source_file_content()
}
src_dst_dir = self._build_cfg.get_src_code_dst_dir()
for extension, content in file_contents.items():
file_path = Path(src_dst_dir, self.FILE_NAME + extension)
with file_path.open(mode='w', encoding='utf-8') as file_handler:
file_handler.writelines(content)
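# Minimal usage sketch (assuming configured BuildProjConfig and UnitConfigs
# instances; the variable names are illustrative):
#     hi_dids = HIDIDs(build_cfg, unit_cfgs)
#     hi_dids.generate_did_files()  # writes VcDIDAPI.h and VcDIDAPI.c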

159
pybuild/dummy.py Normal file

@ -0,0 +1,159 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Module for generation of c- and a2l-file with dummy signal declarations."""
import pybuild.build_defs as bd
from pybuild.types import byte_size_string, get_bitmask
from pybuild.a2l import A2l
from pybuild.problem_logger import ProblemLogger
class DummyVar(ProblemLogger):
"""Generate c- and a2l-files which declares all missing variables in the interfaces.
TODO: Please remove this file! Only used while testing.
"""
def __init__(self, unit_cfg, ext_dict, res_dict, prj_cfg, user_defined_types):
"""Initialize instance of class."""
super().__init__()
self._unit_cfg = unit_cfg
self._unit_vars = unit_cfg.get_per_cfg_unit_cfg()
self._ext_dict = ext_dict
self._res_dict = res_dict
self._ext_vars = {}
self._int_vars = {}
self._prj_cfg = prj_cfg
self._enumerations = user_defined_types.get_enumerations()
self._common_header_files = user_defined_types.common_header_files
def _get_byte_size_string(self, data_type):
"""Get byte size of a data type as string.
Enumeration byte sizes are derived from the underlying data type.
Args:
data_type (str): Data type.
Returns:
byte_size_string(pybuild.types.byte_size_string): Return result of pybuild.types.byte_size_string.
"""
if data_type in self._enumerations:
return byte_size_string(self._enumerations[data_type]['underlying_data_type'])
return byte_size_string(data_type)
def _restruct_input_data(self):
"""Restructure all the input variables per data-type.
This will be used for declaring the variables and generating the
A2L-file
"""
ext_out = {var: data for ioclass, vardict in self._ext_dict.items()
if ioclass.endswith('-Output') for var, data in vardict.items()}
ext_ = {}
for var in self._res_dict['sigs']['ext']['missing']:
self.debug('ext: %s', var)
if var in ext_out:
data = ext_out[var]
self.debug('ext_data: %s', data)
ext_[var] = data
int_ = {}
for unit in self._res_dict['sigs']['int']:
for var in self._res_dict['sigs']['int'][unit]['missing']:
if var not in ext_ and var in self._unit_vars['inports']:
data = self._unit_vars['inports'][var][unit]
int_[var] = data
for var, data in int_.items():
data_type_size = self._get_byte_size_string(data['type'])
self._int_vars.setdefault(data_type_size, {})[var] = data
for var, data in ext_.items():
data_type_size = self._get_byte_size_string(data['type'])
self._ext_vars.setdefault(data_type_size, {})[var] = data
def _a2l_dict(self):
"""Return a dict defining all parameters for a2l-generation."""
res = {
'vars': {},
'function': 'VcDummy'
}
for inp in [self._ext_vars]:
for sizes in inp.values():
for var, data in sizes.items():
if data['type'] in self._enumerations:
data_type = self._enumerations[data['type']]['underlying_data_type']
else:
data_type = data['type']
resv = res['vars']
resv.setdefault(var, {})['a2l_data'] = {
'bitmask': get_bitmask(data_type),
'description': data.get('description', ''),
'lsb': '2^0',
'max': data.get('max'),
'min': data.get('min'),
'offset': '0',
'unit': data['unit'],
'x_axis': None,
'y_axis': None
}
resv[var]['array'] = []
resv[var]['function'] = ['VcEc']
resv[var]['var'] = {
'cvc_type': 'CVC_DISP',
'type': data_type,
'var': var
}
return res
@classmethod
def _generate_var_defs(cls, fh_c, vars, enums, comment):
"""Generate the variable definitions."""
fh_c.write(f'\n{comment}\n\n')
for varsize in sorted(vars.keys(), reverse=True):
fh_c.write(f'/* Variables of size {varsize} bytes */\n\n')
var_defs = vars[varsize]
for var in sorted(var_defs.keys()):
data = var_defs[var]
if data['type'] in enums:
if enums[data['type']]['default_value'] is not None:
init_value = enums[data['type']]['default_value']
else:
cls.warning('Initializing enumeration %s to "zero".', data['type'])
init_value = [k for k, v in enums[data['type']]['members'].items() if v == 0][0]
fh_c.write(f"{data['type']} {var} = {init_value};\n")
else:
fh_c.write(f"{data['type']} {var} = {0};\n")
fh_c.write('\n')
@classmethod
def _generate_var_initialization(cls, fh_c, vars, comment):
"""Generate the variable initializations."""
fh_c.write(f'\n{comment}\n\n')
fh_c.write('\nvoid RESTART_VcDummy(void)\n{\n')
for varsize in sorted(vars.keys(), reverse=True):
var_defs = vars[varsize]
for var in sorted(var_defs.keys()):
fh_c.write(f" {var} = {0};\n")
fh_c.write('}\n')
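# The emitted restart function zeroes every dummy variable, e.g. (with a
# hypothetical variable 'sVcDemo_D_Sig'):
#     void RESTART_VcDummy(void)
#     {
#         sVcDemo_D_Sig = 0;
#     }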
def _generate_c_file(self, filename):
"""Generate the c-file defining all missing input variables."""
general_includes = ''
general_includes += self._unit_cfg.base_types_headers
for common_header_file in self._common_header_files:
general_includes += f'#include "{common_header_file}"\n'
general_includes += '\n'
with open(filename, 'w', encoding="utf-8") as fh_c:
fh_c.write(general_includes)
fh_c.write(f'#include "{bd.CVC_DISP_START}"\n\n')
self._generate_var_defs(fh_c, self._ext_vars, self._enumerations, '/** Missing external signals **/')
fh_c.write(f'\n#include "{bd.CVC_DISP_END}"\n')
self.info('Generated %s', filename)
def generate_file(self, filename):
"""Generate the files for defining all missing input variables."""
self._restruct_input_data()
self._generate_c_file(filename + '.c')
a2l_dict = self._a2l_dict()
A2l(a2l_dict, self._prj_cfg).gen_a2l(filename + '.a2l')

252
pybuild/dummy_spm.py Normal file

@ -0,0 +1,252 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Module containing classes for model out-port interfaces."""
import os
from operator import itemgetter
import math
import pybuild.build_defs as bd
from pybuild import signal
from pybuild.a2l import A2l
from pybuild.problem_logger import ProblemLogger
from pybuild.types import byte_size, get_bitmask
class DummySpm(ProblemLogger):
"""Generate c-files which defines missing outport variables in the model out-port interface.
The models declare all in-ports as 'external' and pybuild will then
generate any missing outports in the correct #if/#endif guards here.
One c-file per outport origin model should be generated.
* Generate c-code from matlab/TL script with #if/#endif guards.
- Pro: Is generated at the same time as other code and placed in model src folder.
- Pro: Generic code. Will be cached together with TL-generated code.
- Con: m-script
* Generate c-code from python with only needed variables. No preprocessor directives.
- Pro: Python
- Pro: Simpler c-file with only used variables.
- Con: Not generic! Not cached?
"""
__asil_level_map = {
'A': (bd.CVC_DISP_ASIL_A_START, bd.CVC_DISP_ASIL_A_END),
'B': (bd.CVC_DISP_ASIL_B_START, bd.CVC_DISP_ASIL_B_END),
'C': (bd.CVC_DISP_ASIL_C_START, bd.CVC_DISP_ASIL_C_END),
'D': (bd.CVC_DISP_ASIL_D_START, bd.CVC_DISP_ASIL_D_END),
'QM': (bd.CVC_DISP_START, bd.CVC_DISP_END)}
def __init__(self, missing_outports, prj_cfg, feature_cfg, unit_cfg, user_defined_types, basename):
"""Constructor.
Args:
missing_outports (list): undefined outports based on unit config variables.
prj_cfg (BuildProjConfig): Build project class holding where files should be stored.
feature_cfg (FeatureConfig): Feature configs from SPM_Codeswitch_Setup.
unit_cfg (UnitConfigs): Class holding all unit interfaces.
user_defined_types (UserDefinedTypes): Class holding user defined data types.
basename (str): the basename of the outvar, used for .c and .a2l creation.
See :doc:`Unit config <unit_config>` for information on the 'outport' dict.
"""
super().__init__()
self._prj_cfg = prj_cfg
self._feature_cfg = feature_cfg
self._unit_cfg = unit_cfg
self._enumerations = user_defined_types.get_enumerations()
self._common_header_files = user_defined_types.common_header_files
self._name = basename
self.use_volatile_globals = prj_cfg.get_use_volatile_globals()
self._missing_outports = self._restruct_input_data(missing_outports)
def _get_byte_size(self, data_type):
"""Get byte size of a data type.
Enumeration byte sizes are derived from the underlying data type.
Args:
data_type (str): Data type.
Returns:
byte_size(pybuild.types.byte_size): Return result of pybuild.types.byte_size.
"""
if data_type in self._enumerations:
return byte_size(self._enumerations[data_type]['underlying_data_type'])
return byte_size(data_type)
def _restruct_input_data(self, outports):
"""Restructure all the input variables per data-type."""
outports.sort(key=itemgetter('name'))
outports.sort(key=lambda var: self._get_byte_size(var['type']), reverse=True)
new_outports = {}
for outport in outports:
integrity_level = outport.get('integrity_level', 'QM')
if integrity_level == 'QM':
outport['cvc_type'] = 'CVC_DISP'
else:
outport['cvc_type'] = f'ASIL_{integrity_level}/CVC_DISP_ASIL_{integrity_level}'
if integrity_level in new_outports:
new_outports[integrity_level].append(outport)
else:
new_outports.update({integrity_level: [outport]})
return new_outports
def _a2l_dict(self, outports):
"""Return a dict defining all parameters for a2l-generation."""
res = {
'vars': {},
'function': 'VcDummy_spm'
}
for outport in [port for sublist in outports.values() for port in sublist]:
var = outport['name']
if outport['type'] in self._enumerations:
data_type = self._enumerations[outport['type']]['underlying_data_type']
else:
data_type = outport['type']
resv = res['vars']
resv.setdefault(var, {})['a2l_data'] = {
'bitmask': get_bitmask(data_type),
'description': outport.get('description', ''),
'lsb': '2^{}'.format(int(math.log2(outport.get('lsb', 1)))
if outport.get('lsb') not in ['-', '']
else 0),
'max': outport.get('max'),
'min': outport.get('min'),
'offset': -(outport.get('offset', 0) if outport.get('offset') not in ['-', ''] else 0),
'unit': outport['unit'],
'x_axis': None,
'y_axis': None}
resv[var]['function'] = ['VcEc']
resv[var]['var'] = {'cvc_type': outport['cvc_type'],
'type': data_type,
'var': var}
resv[var]['array'] = outport.get('width', 1)
res.update({'vars': resv})
return res
def generate_code_files(self, dst_dir):
"""Generate code and header files.
Args:
dst_dir (str): Path to destination directory.
"""
h_file_path = os.path.join(dst_dir, f'{self._name}.h')
self._generate_h_file(h_file_path, self._missing_outports)
c_file_path = os.path.join(dst_dir, f'{self._name}.c')
self._generate_c_file(c_file_path, self._missing_outports)
def generate_a2l_files(self, dst_dir):
"""Generate A2L files.
Args:
dst_dir (str): Path to destination directory.
"""
filename = f"{os.path.join(dst_dir, self._name)}.a2l"
a2l_dict = self._a2l_dict(self._missing_outports)
a2l = A2l(a2l_dict, self._prj_cfg)
a2l.gen_a2l(filename)
def _generate_h_file(self, file_path, outports):
"""Generate header file.
Args:
file_path (str): File path to generate.
"""
file_name = os.path.basename(file_path).split('.')[0]
with open(file_path, 'w', encoding="utf-8") as fh:
fh.write(f'#ifndef {file_name.upper()}_H\n')
fh.write(f'#define {file_name.upper()}_H\n')
fh.write(self._unit_cfg.base_types_headers)
fh.write('#include "VcCodeSwDefines.h"\n')
for common_header_file in self._common_header_files:
fh.write(f'#include "{common_header_file}"\n')
for integrity_level in outports.keys():
disp_start = bd.PREDECL_ASIL_LEVEL_MAP[integrity_level]['DISP']['START']
disp_end = bd.PREDECL_ASIL_LEVEL_MAP[integrity_level]['DISP']['END']
fh.write('\n')
fh.write(f'#include "{disp_start}"\n')
for outport in outports[integrity_level]:
if outport['class'] not in signal.INPORT_CLASSES:
self.warning(f'inport {outport["name"]} class {outport["class"]} is not an inport class')
array = ''
width = outport['width']
if not isinstance(width, list):
width = [width]
if len(width) != 1 or width[0] != 1:
for w in width:
if w > 1:
if not isinstance(w, int):
self.critical(f'{outport["name"]} widths must be integers. Got "{type(w)}"')
array += f'[{w}]'
elif w < 0:
self.critical(f'{outport["name"]} widths can not be negative. Got "{w}"')
if self.use_volatile_globals:
fh.write(f"extern volatile {outport['type']} {outport['name']}{array};\n")
else:
fh.write(f"extern {outport['type']} {outport['name']}{array};\n")
fh.write(f'#include "{disp_end}"\n')
fh.write(f'#endif /* {file_name.upper()}_H */\n')
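# For a hypothetical outport {'name': 'sVcDemo_D_Arr', 'type': 'UInt8',
# 'width': [2, 3], ...} this emits:
#     extern UInt8 sVcDemo_D_Arr[2][3];
# ('extern volatile ...' when use_volatile_globals is set).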
def _generate_c_file(self, file_path, outports):
"""Generate C-file for inports that are missing outports except for supplier ports."""
file_name = os.path.basename(file_path).split('.')[0]
base_header = f'#include "{file_name}.h"\n'
with open(file_path, 'w', encoding="utf-8") as fh_c:
fh_c.write(base_header)
for integrity_level in outports.keys():
disp_start = bd.CVC_ASIL_LEVEL_MAP[integrity_level]['DISP']['START']
disp_end = bd.CVC_ASIL_LEVEL_MAP[integrity_level]['DISP']['END']
fh_c.write('\n')
fh_c.write(f'#include "{disp_start}"\n')
for outport in outports[integrity_level]:
if outport['class'] not in signal.INPORT_CLASSES:
self.warning(f'inport {outport["name"]} class {outport["class"]} is not an inport class')
width = outport['width']
if outport['type'] in self._enumerations:
if self._enumerations[outport['type']]['default_value'] is not None:
init_value = self._enumerations[outport['type']]['default_value']
else:
self.warning('Initializing enumeration %s to "zero".', outport['type'])
init_value = [
k for k, v in self._enumerations[outport['type']]['members'].items() if v == 0
][0]
if width != 1:
self.critical(f'{outport["name"]} enumeration width must be 1. Got "{width}"')
fh_c.write(f"{outport['type']} {outport['name']} = {init_value};\n")
else:
if not isinstance(width, list):
width = [width]
if len(width) == 1 and width[0] == 1:
array = ' = 0'
else:
array = ''
for w in width:
if w > 1:
if not isinstance(w, int):
msg = f'{outport["name"]} widths must be integers. Got "{type(w)}"'
self.critical(msg)
array += f'[{w}]'
elif w < 0:
self.critical(f'{outport["name"]} widths can not be negative. Got "{w}"')
if self.use_volatile_globals:
fh_c.write(f'volatile {outport["type"]} {outport["name"]}{array};\n')
else:
fh_c.write(f'{outport["type"]} {outport["name"]}{array};\n')
fh_c.write(f'#include "{disp_end}"\n')
def generate_files(self, dst_dir):
"""Generate the files for defining all missing input variables."""
self.generate_code_files(dst_dir)
self.generate_a2l_files(dst_dir)


@ -0,0 +1,52 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Environment compatibility check."""
import sys
def check_python_string(python_lower, python_upper=None):
"""Ensure current Python interpreter is a version between lower and upper.
Arguments:
python_lower (str): Required lower bound for Python version.
python_upper (str): Optional upper bound for Python version.
Raises:
RuntimeError: If current Python executable is not compatible with pybuild.
"""
versions = [_split_version(python_lower)]
if python_upper:
versions.append(_split_version(python_upper))
check_python_tuple(*versions)
def check_python_tuple(python_lower, python_upper=None):
"""Ensure current Python interpreter is a version between lower and upper.
Arguments:
python_lower (2-tuple): Required lower bound for Python version.
python_upper (2-tuple): Optional upper bound for Python version.
Raises:
RuntimeError: If current Python executable is not compatible with pybuild.
"""
cur_version = sys.version_info[:2]
if cur_version < python_lower:  # tuple comparison handles e.g. (4, 0) vs (3, 6) correctly
raise RuntimeError(_format_error(f'must be higher than {python_lower}'))
if python_upper and cur_version > python_upper:
raise RuntimeError(_format_error(f'must be lower than {python_upper}'))
def _split_version(version_string):
"""Split a major.minor style string and returns a 2-tuple of integers."""
parts = version_string.split('.')
return (int(parts[0]), int(parts[1]))
def _format_error(message):
"""Return a version error string including current interpreter, version and a custom message."""
return f'Unsupported Python version ({sys.version_info}), {message}. Path: {sys.executable}'
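# Minimal usage sketch (illustrative version bounds):
#     check_python_string('3.6')           # at least Python 3.6
#     check_python_tuple((3, 6), (3, 11))  # between 3.6 and 3.11, inclusive
# Both raise RuntimeError, including the interpreter path, on a mismatch.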

249
pybuild/ext_dbg.py Normal file

@ -0,0 +1,249 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Module containing classes for VCC - Supplier debug interface.
TODO: Check if all IO parameters in SPMEMSInterfaceRequirements.xlsx defined as safety variables
automatically have debug switches.
"""
import os
import re
from pybuild import build_defs
from pybuild.types import byte_size_string, get_bitmask, a2l_range
from pybuild.a2l import A2l
from pybuild.problem_logger import ProblemLogger
class ExtDbg(ProblemLogger):
"""Class for generating c-files.
These declare all debug parameters in the VCC - Supplier interface.
"""
__data_type_size = {'Float32': '4', 'UInt32': '4', 'Int32': '4',
'UInt16': '2', 'Int16': '2',
'UInt8': '1', 'Int8': '1', 'Bool': '1'}
def __init__(self, variable_dict, prj_cfg, unit_cfg, integrity_level=build_defs.CVC_ASIL_QM):
"""Constructor.
Args:
variable_dict (dict): dictionary with signal information
Variable dict shall have the following format and is generated by the
:doc:`CsvSignalInterfaces <signal_interfaces>` class::
{
'CAN-Input': {
'signal1': {
'IOType': 'd',
'description': 'Some description',
'init': 0,
'max': 1,
'min': 0,
'type': 'UInt8',
'unit': '-'
},
'signal2': {
...
}
},
'CAN-Output': {
'signal3': {
...
}
},
'xxx-Input': ...,
'xxx-Output': ...
}
The optional keyword arguments may override which code section
directives to use.
"""
super().__init__()
self.set_integrity_level(integrity_level)
self._var_dict = variable_dict
self.__restructured_data = self._restruct_data()
self._prj_cfg = prj_cfg
self._unit_cfg = unit_cfg
self.use_volatile_globals = prj_cfg.get_use_volatile_globals()
def set_integrity_level(self, integrity_level):
"""Set integrity level of code generation.
Args:
integrity_level (str): integrity level of the unit from 'A' to 'D' or 'QM'
"""
self._disp_start = integrity_level['DISP']['START']
self._disp_end = integrity_level['DISP']['END']
self._cal_start = integrity_level['CAL']['START']
self._cal_end = integrity_level['CAL']['END']
self._code_start = integrity_level['CODE']['START']
self._code_end = integrity_level['CODE']['END']
def _restruct_data(self):
"""Restructure input variables per data-type.
This will be used for declaring the variables and generating the A2L-file.
"""
data = {'outputs': {}, 'inputs': {}}
for inp in self._var_dict.keys():
if re.match(r'.*Output$', inp) is not None:
iotype = 'outputs'
else:
iotype = 'inputs'
for var, var_data in self._var_dict[inp].items():
data_type_size = byte_size_string(var_data['type'])
data[iotype].setdefault(data_type_size, {})[var] = var_data
return data
def _a2l_dict(self, var_dict, function):
"""Generate dict defining parameters for a2l-generation."""
def _range(data):
range_a2l = a2l_range(data['type'])
if data['min'] == '-':
a2l_min = range_a2l[0]
else:
a2l_min = data['min']
if data['max'] == '-':
a2l_max = range_a2l[1]
else:
a2l_max = data['max']
return a2l_min, a2l_max
res = {'vars': {},
'function': function}
for var, data in self._type_order_iterator(var_dict):
resv = res['vars']
a2l_min, a2l_max = _range(data)
a2l_data = {
'bitmask': get_bitmask(data['type']),
'description': data['description'],
'lsb': '2^0',
'max': a2l_max,
'min': a2l_min,
'offset': '0',
'unit': '',
'x_axis': None,
'y_axis': None
}
var_db = f'c{var[1:]}_db'
var_sw = self._var_name_to_dbgsw_name(var)
resv.setdefault(var_db, {})['a2l_data'] = a2l_data
a2l_data = {
'bitmask': get_bitmask(data['type']),
'description': f'debug switch for {var_db} (1=bdsw act)',
'lsb': '2^0',
'max': '1',
'min': '0',
'offset': '0',
'unit': '',
'x_axis': None,
'y_axis': None
}
resv.setdefault(var_sw, {})['a2l_data'] = a2l_data
resv[var_db]['array'] = []
resv[var_sw]['array'] = []
resv[var_db]['function'] = [function]
resv[var_sw]['function'] = [function]
resv[var_db]['var'] = {'cvc_type': 'CVC_CAL',
'type': data['type'],
'var': var}
resv[var_sw]['var'] = {'cvc_type': 'CVC_CAL',
'type': 'Bool',
'var': var}
return res
@staticmethod
def _var_name_to_dbgsw_name(name):
"""Convert a variable name to a debug switch name."""
# The conversion below would generate a correct name for the debug
# switch; however, the current build system generates names like the
# one currently returned:
# return re.sub(r'\w(\w+?)_\w+?_(\w+)', r'c\1_B_\2_sw', name)
return re.sub(r'\w(\w+)', r'c\1_sw', name)
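# Example: for a hypothetical signal 'sVcDemo_D_Sig' this returns
# 'cVcDemo_D_Sig_sw', pairing with the debug value label
# 'cVcDemo_D_Sig_db' created in _a2l_dict() and _gen_dbg_c_file().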
@staticmethod
def _type_order_iterator(var_items):
"""Get iterator over all variables.
In data type size order, and then in alphabetical order.
"""
for _, typ_data in var_items.items():
for var in sorted(typ_data.keys()):
yield (var, typ_data[var])
def _gen_dbg_c_file(self, data, filename):
"""Generate debug c-files.
These define all the debug labels for
the supplier input and output signals.
"""
with open(filename, 'w', encoding="utf-8") as fh_c:
fh_c.write(self._unit_cfg.base_types_headers)
fh_c.write('#define CVC_DISP\n')
# declare external variable references
fh_c.write(f'#include "{self._disp_start}"\n')
for var, var_data in self._type_order_iterator(data):
fh_c.write(f"extern CVC_DISP {var_data['type']} {var};\n")
fh_c.write(f'#include "{self._disp_end}"\n\n')
# define debug calibration constants
fh_c.write(f'#include "{self._cal_start}"\n')
fh_c.write('\n/* Debug values */\n\n')
for var, var_data in self._type_order_iterator(data):
initial_value = var_data['min'] if var_data['min'] != "-" and float(var_data['min']) > 0 else "0"
if self.use_volatile_globals:
fh_c.write(f"volatile {var_data['type']} c{var[1:]}_db = {initial_value};\n")
else:
fh_c.write(f"{var_data['type']} c{var[1:]}_db = {initial_value};\n")
fh_c.write('\n/* Debug switches */\n\n')
for var, var_data in self._type_order_iterator(data):
sw_name = self._var_name_to_dbgsw_name(var)
if self.use_volatile_globals:
fh_c.write(f"volatile Bool {sw_name} = 0;\n")
else:
fh_c.write(f"Bool {sw_name} = 0;\n")
fh_c.write(f'#include "{self._cal_end}"\n\n')
# set the variable to the debug calibration constants
fh_c.write('/***********************/\n')
fh_c.write('/* debug functionality */\n')
fh_c.write('/***********************/\n\n')
_, fname_tmp = os.path.split(filename)
func_name = fname_tmp.split('.')[-2]
fh_c.write(f'#include "{self._code_start}"\n')
fh_c.write(f'void {func_name}(void) {{\n')
for var, var_data in self._type_order_iterator(data):
sw_name = self._var_name_to_dbgsw_name(var)
fh_c.write(f' if ({sw_name}) {{\n')
fh_c.write(f' {var} = c{var[1:]}_db;\n }}\n')
fh_c.write(f'}}\n#include "{self._code_end}"\n\n')
self.info('Generated %s', filename)
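# Sketch of the debug override emitted per signal (hypothetical signal
# 'sVcDemo_D_Sig'):
#     if (cVcDemo_D_Sig_sw) {
#         sVcDemo_D_Sig = cVcDemo_D_Sig_db;
#     }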
def gen_dbg_files(self, fname_in, fname_out):
"""Generate the c-files and A2L-files.
These declares all the supplier interface debug parameters and functions.
"""
# c-files
self._gen_dbg_c_file(self.__restructured_data['inputs'],
fname_in + '.c')
self._gen_dbg_c_file(self.__restructured_data['outputs'],
fname_out + '.c')
# A2L-files
_, fname_tmp = os.path.split(fname_in)
a2l_dict_in = self._a2l_dict(self.__restructured_data['inputs'],
fname_tmp)
_, fname_tmp = os.path.split(fname_out)
a2l_dict_out = self._a2l_dict(self.__restructured_data['outputs'],
fname_tmp)
a2l = A2l(a2l_dict_in, self._prj_cfg)
a2l.gen_a2l(fname_in + '.a2l')
a2l = A2l(a2l_dict_out, self._prj_cfg)
a2l.gen_a2l(fname_out + '.a2l')

296
pybuild/ext_var.py Normal file

@ -0,0 +1,296 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Module containing classes for VCC - Supplier signal interface."""
from pybuild import build_defs
from pybuild.types import byte_size_string, get_bitmask
from pybuild.a2l import A2l
from pybuild.problem_logger import ProblemLogger
class ExtVarBase(ProblemLogger):
"""Generate a2l- and c-files.
These declare all variables in the interface that the
supplier platform writes to.
This is needed due to legacy handling of the interfaces between units.
Note that all variables sent from the VCC SPM to the platform should be declared in
the function that produces the signal!
"""
INPORT_INDEX = 0
OUTPORT_INDEX = 1
__data_type_size = {'Float32': '4', 'UInt32': '4', 'Int32': '4',
'UInt16': '2', 'Int16': '2',
'UInt8': '1', 'Int8': '1', 'Bool': '1'}
def __init__(self, variable_dict, prj_cfg, unit_cfg, user_defined_types, integrity_level=build_defs.ASIL_QM):
"""Constructor.
Args:
variable_dict (dict): dictionary with signal information.
prj_cfg (BuildProjConfig): Build project class holding where files should be stored.
unit_cfg (UnitConfigs): Class holding all unit interfaces.
user_defined_types (UserDefinedTypes): Class holding user defined data types.
integrity_level (str): integrity level of the unit from 'A' to 'D' or 'QM'.
"""
super().__init__()
self.set_integrity_level(integrity_level)
self._var_dict = variable_dict
self._ext_vars = {}
self._prj_cfg = prj_cfg
self._unit_cfg = unit_cfg
self._enumerations = user_defined_types.get_enumerations()
self._common_header_files = user_defined_types.common_header_files
def set_integrity_level(self, integrity_level):
"""Set integrity level of code generation.
Args:
integrity_level (str): integrity level of the unit from 'A' to 'D' or 'QM'
"""
self._disp_start = integrity_level['CVC']['DISP']['START']
self._disp_end = integrity_level['CVC']['DISP']['END']
self._decl_start = integrity_level['PREDECL']['DISP']['START']
self._decl_end = integrity_level['PREDECL']['DISP']['END']
def _get_byte_size_string(self, data_type):
"""Get byte size of a data type as string.
Enumeration byte sizes are derived from the underlying data type.
Args:
data_type (str): Data type.
Returns:
byte_size_string(pybuild.types.byte_size_string): Return result of pybuild.types.byte_size_string.
"""
if data_type in self._enumerations:
return byte_size_string(self._enumerations[data_type]['underlying_data_type'])
return byte_size_string(data_type)
def _get_bitmask(self, data_type):
"""Get bitmask of a data type.
Enumeration bitmasks are derived from the underlying data type.
Args:
data_type (str): Data type.
Returns:
get_bitmask(pybuild.types.get_bitmask): Return result of pybuild.types.get_bitmask.
"""
if data_type in self._enumerations:
return get_bitmask(self._enumerations[data_type]['underlying_data_type'])
return get_bitmask(data_type)
def _restruct_input_data(self):
"""Restructure all the input variables per data-type.
This will be used for declaring the variables and generating the
A2L-file
"""
external_inports = {}
external_outports = {}
for external_port_type in self.EXTERNAL_INPORT_TYPES:
if external_port_type in self._var_dict:
for var, data in self._var_dict[external_port_type].items():
data_type_size = self._get_byte_size_string(data[self.TYPE_NAME])
external_inports.setdefault(data_type_size, {})[var] = data
for external_port_type in self.EXTERNAL_OUTPORT_TYPES:
if external_port_type in self._var_dict:
for var, data in self._var_dict[external_port_type].items():
data_type_size = self._get_byte_size_string(data[self.TYPE_NAME])
external_outports.setdefault(data_type_size, {})[var] = data
self._ext_vars = external_inports, external_outports
def _a2l_dict(self):
"""Return a dict defining all parameters for a2l-generation."""
res = {
'vars': {},
'function': 'VcExtVar'
}
for inp in self.EXTERNAL_INPORT_TYPES:
if inp in self._var_dict:
for var, data in self._var_dict[inp].items():
if data[self.TYPE_NAME] in self._enumerations:
data_type = self._enumerations[data[self.TYPE_NAME]]['underlying_data_type']
else:
data_type = data[self.TYPE_NAME]
resv = res['vars']
resv.setdefault(var, {})['a2l_data'] = self.get_a2l_format(data)
resv[var]['array'] = []
resv[var]['function'] = ['VcEc']
resv[var]['var'] = {
'cvc_type': 'CVC_DISP',
'type': data_type,
'var': var
}
return res
def _generate_c_file(self, path):
"""Generate the c-file defining all the supplier input signals."""
header = path.with_suffix('.h').name
var_set = set()
with path.open('w') as fh_c:
fh_c.write(f'#include "{header}"\n')
fh_c.write(f'#include "{self._disp_start}"\n\n')
for data_type_s, ext_vars in self._ext_vars[self.INPORT_INDEX].items():
fh_c.write(f"/* Variables of size {data_type_s} bytes */\n\n")
for var in sorted(ext_vars.keys()):
data = ext_vars[var]
if var not in var_set:
fh_c.write(f"CVC_DISP {data[self.TYPE_NAME]} {var} = {data['init']};\n")
var_set.add(var)
fh_c.write('\n')
fh_c.write(f'\n#include "{self._disp_end}"\n')
self.info('Generated %s', path.name)
def _generate_h_file(self, path):
"""Generate header file externally declaring interface signals."""
filename = path.stem
guard = f"{filename.upper()}_H"
var_set = set()
with path.open('w') as fh_c:
fh_c.write(f'#ifndef {guard}\n')
fh_c.write(f'#define {guard}\n')
fh_c.write('#define CVC_DISP\n')
fh_c.write(self._unit_cfg.base_types_headers)
for common_header_file in self._common_header_files:
fh_c.write(f'#include "{common_header_file}"\n')
fh_c.write('\n')
fh_c.write(f'#include "{self._decl_start}"\n')
fh_c.write('/* VCC Inports */\n')
for data_type_s, ext_vars in self._ext_vars[self.INPORT_INDEX].items():
fh_c.write(f"/* Variables of size {data_type_s} bytes */\n\n")
for var in sorted(ext_vars.keys()):
if var not in var_set:
data = ext_vars[var]
fh_c.write(f"extern CVC_DISP {data[self.TYPE_NAME]} {var};\n")
var_set.add(var)
fh_c.write('\n')
fh_c.write('/* VCC Outports */\n')
for data_type_s, ext_vars in self._ext_vars[self.OUTPORT_INDEX].items():
fh_c.write(f"/* Variables of size {data_type_s} bytes */\n\n")
for var in sorted(ext_vars.keys()):
if var not in var_set:
data = ext_vars[var]
fh_c.write(f"extern CVC_DISP {data[self.TYPE_NAME]} {var};\n")
var_set.add(var)
fh_c.write('\n')
fh_c.write(f'#include "{self._decl_end}"\n')
fh_c.write('#endif\n')
self.info('Generated %s', path.name)
def generate_files(self, path):
"""Generate the c- and a2l-file for defining all the supplier input variables."""
self._restruct_input_data()
if not self._ext_vars[0] and not self._ext_vars[1]:
self.info(f'Skipping {path.name} as there were no corresponding vars.')
return
self._generate_c_file(path.with_suffix('.c'))
self._generate_h_file(path.with_suffix('.h'))
a2l_dict = self._a2l_dict()
a2l = A2l(a2l_dict, self._prj_cfg)
a2l.gen_a2l(path.with_suffix('.a2l'))
class ExtVarCsv(ExtVarBase):
"""Handles variable dicts from CSV files.
Variable dict shall have the following format and is generated by the
:doc:`CsvSignalInterfaces <signal_interfaces>` class::
{
'CAN-Input': {
'signal1': {
'IOType': 'd',
'description': 'Some description',
'init': 0,
'max': 1,
'min': 0,
'type': 'UInt8',
'unit': '-'
},
'signal2': {
...
}
},
'CAN-Output': {
'signal3': {
...
}
},
'xxx-Input': ...,
'xxx-Output': ...
}
"""
EXTERNAL_INPORT_TYPES = ['EMS-Input', 'CAN-Input', 'Private CAN-Input', 'LIN-Input']
EXTERNAL_OUTPORT_TYPES = ['EMS-Output', 'CAN-Output', 'Private CAN-Output', 'LIN-Output']
TYPE_NAME = 'type'
def get_a2l_format(self, data):
"""Get a2l format.
Args:
data (dict): Data dictionary.
Returns:
dict: A2l format dictionary.
"""
return {
'bitmask': self._get_bitmask(data[self.TYPE_NAME]),
'description': data['description'],
'lsb': '2^0',
'max': data['max'],
'min': data['min'],
'offset': '0',
'unit': data['unit'],
'x_axis': None,
'y_axis': None
}
class ExtVarYaml(ExtVarBase):
"""Handles variable dicts from Yaml files.
Variable dict shall have the following format and is generated by the
:doc:`YamlSignalInterfaces <signal_interfaces>` class::
{
'input': {
'sVcIhfa_D_WhlMotSysFrntLimnIndcn': {},
'sVcIhfa_D_WhlMotSysFrntModSts': {},
'sVcIhfa_I_WhlMotSysFrntIdc': {},
'sVcIhfa_U_UDcDcActHiSide1': {},
'sVcIhfa_U_WhlMotSysFrntUdc': {}
},
'output': {},
'status': {},
}
"""
EXTERNAL_INPORT_TYPES = ['input', 'status']
EXTERNAL_OUTPORT_TYPES = ['output']
TYPE_NAME = 'variable_type'
def get_a2l_format(self, data):
"""Get a2l format.
Args:
data (dict): Data dictionary.
Returns:
dict: A2l format dictionary.
"""
return {
'bitmask': self._get_bitmask(data[self.TYPE_NAME]),
'description': data['description'],
'lsb': '2^0',
'max': data['range']['max'],
'min': data['range']['min'],
'offset': '0',
'unit': data['unit'],
'x_axis': None,
'y_axis': None
}
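# Minimal usage sketch (assuming configured project objects and a variable
# dict in the format documented above; the output basename is illustrative):
#     ext_var = ExtVarCsv(var_dict, prj_cfg, unit_cfg, user_defined_types)
#     ext_var.generate_files(Path('output/VcExtVar'))  # writes .c, .h, .a2l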

301
pybuild/feature_configs.py Normal file

@ -0,0 +1,301 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Feature configuration (codeswitches) module."""
import copy
import glob
import os
import re
from pprint import pformat
from pybuild.lib.helper_functions import deep_dict_update
from pybuild.problem_logger import ProblemLogger
from pybuild.xlrd_csv import WorkBook
class FeatureConfigs(ProblemLogger):
"""Hold feature configurations read from SPM_Codeswitch_Setup*.csv config files.
Provides methods for retrieving the currently
used configurations of a unit.
"""
convs = (('~=', '!='), ('~', ' not '), ('!', ' not '), (r'\&\&', ' and '),
(r'\|\|', ' or '))
def __init__(self, prj_config):
"""Constructor.
Args:
prj_config (BuildProjConfig): configures which units are active in the current project and where
the codeswitches files are located
"""
super().__init__()
self._if_define_dict = {}
self._build_prj_config = prj_config
self._missing_codesw = set()
# Get the config switches configuration
self._set_config(self._parse_all_code_sw_configs())
self._parse_all_local_defs()
self._add_local_defs_to_tot_code_sw()
def __repr__(self):
"""Get string representation of object."""
return pformat(self.__code_sw_cfg.keys())
def _parse_all_code_sw_configs(self):
"""Parse all SPM_Codeswitch_Setup*.csv config files.
Returns:
dict: with the projects as keys, and the values are
another dict with the config-parameter and it's value.
"""
# TODO: Change this when codeswitches are moved to model config
# cfg_paths = self._build_prj_config.get_unit_mdl_dirs('all')
cfg_paths = [self._build_prj_config.get_prj_cfg_dir()]
cfg_fname = self._build_prj_config.get_codeswitches_name()
cfg_files = []
for cfg_path in cfg_paths:
cfg_files.extend(glob.glob(os.path.join(cfg_path, cfg_fname)))
self.debug('cfg_paths: %s', pformat(cfg_paths))
self.debug('cfg_fname: %s', pformat(cfg_fname))
self.debug('cfg_files: %s', pformat(cfg_files))
conf_dict = {}
for file_ in cfg_files:
conf_dict = deep_dict_update(conf_dict, self._parse_code_sw_config(file_))
return conf_dict
def _parse_code_sw_config(self, file_name):
"""Parse the SPM_Codeswitch_Setup.csv config file.
Returns:
dict: with the projects as keys, and the values are
another dict with the config-parameter and it's value.
"""
self.debug('_parse_code_sw_config: %s', file_name)
wbook = WorkBook(file_name)
conf_dict = {'NEVER_ACTIVE': 0,
'ALWAYS_ACTIVE': 1}
# TODO: handle sheet names in a better way!
wsheet = wbook.single_sheet()
prjs = [d.value for d in wsheet.row(0)[2:]]
prj_row = enumerate(prjs, 2)
for col, prj in prj_row:
if prj != self._build_prj_config.get_prj_config():
self.debug('Skipping %s', prj)
continue
for r_nbr in range(1, wsheet.nrows):
row = wsheet.row(r_nbr)
conf_par = row[0].value.strip().replace('.', '_')
val = row[col].value
if not isinstance(val, str):
conf_dict[conf_par] = val
elif val.lower().strip() == 'na' or val.lower().strip() == 'n/a':
conf_dict[conf_par] = 0
else:
self.warning('Unexpected codeswitch value %s = "%s". Ignored!', row[0].value.strip(), val)
return conf_dict
return conf_dict
def _recursive_subs(self, m_def, code_sws):
"""Recursivly replaces macro definitions with values."""
# find and replace all symbols with values
symbols = re.findall(r'(?!(?:and|or|not)\b)(\b[a-zA-Z_]\w+)', m_def)
m_def_subs = m_def
for symbol in symbols:
if symbol in code_sws:
m_def_subs = re.sub(symbol, str(code_sws[symbol]), m_def_subs)
elif symbol in self._if_define_dict:
m_def_subs = re.sub(symbol, str(self._if_define_dict[symbol]), m_def_subs)
m_def_subs = self._recursive_subs(m_def_subs, code_sws)
else:
self.critical('Symbol %s not defined in config switches.', symbol)
return None
return m_def_subs
def _add_local_defs_to_tot_code_sw(self):
"""Add the defines from the LocalDefs.h files to the code switch dict."""
for macro, m_def in self._if_define_dict.items():
tmp_subs = self._recursive_subs(m_def, self.__tot_code_sw)
if tmp_subs is None:
continue
self.__tot_code_sw[macro] = eval(tmp_subs)
def get_preprocessor_macro(self, nested_code_switches):
"""Get the #if macro string for a code switch configuration from a unit config json file.
Args:
nested_code_switches(list()): list of lists of code switches from unitconfig
Returns:
string: A string with an #if macro that defines if the code should be active
'#if (<CS1> && <CS2>) || (<CS3> && <CS4>)'
"""
if_macro_and = []
if not isinstance(nested_code_switches, list):
self.warning("Unitconfig codeswitches should be in a nested list")
nested_code_switches = [nested_code_switches]
if not isinstance(nested_code_switches[0], list):
self.warning("Unitconfig codeswitches should be in a nested list")
nested_code_switches = [nested_code_switches]
for code_switches in nested_code_switches:
if isinstance(code_switches, str):
code_switches = [code_switches]
if_macro_and.append(f"( { ' && '.join(code_switches) } )")
if_macro_string = f"#if {' || '.join(if_macro_and)}" if if_macro_and else ""
all_projects = re.search('all', if_macro_string, re.I)
if all_projects:
return ""
return if_macro_string
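# Example (hypothetical code switches):
#     get_preprocessor_macro([['VcDemoSwA', 'VcDemoSwB'], ['VcDemoSwC']])
# returns '#if ( VcDemoSwA && VcDemoSwB ) || ( VcDemoSwC )'.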
def gen_unit_cfg_header_file(self, file_name):
"""Generate a header file with preprocessor defines needed to configure the SW.
Args:
file_name (str): The file name (with path) of the unit config header file
"""
with open(file_name, 'w', encoding="utf-8") as f_hndl:
_, fname = os.path.split(file_name)
fname = fname.replace('.', '_').upper()
f_hndl.write(f'#ifndef {fname}\n')
f_hndl.write(f'#define {fname}\n\n')
conf_sw = self.__code_sw_cfg
for key_, val in conf_sw.items():
if val == "":
self.warning('Code switch "%s" is missing a defined value', key_)
f_hndl.write(f'#define {key_} {val}\n')
f_hndl.write(f'\n#endif /* {fname} */\n')
def _eval_cfg_expr(self, elem):
"""Convert matlab config expression to python expression.
Uses the tuple self.convs, and evaluates the result using
self.__code_sw_cfg[config]
This function does not handle the complex definitions made outside
the dict.
Args:
elem (str): element string
Returns:
Bool: True if config is active, False if not.
"""
res = re.search('all', elem, re.I)
if res is not None:
return True
# modify all matlab expressions
elem_tmp = elem
# find and replace all symbols with values
symbols = re.findall(r'[a-zA-Z_]\w+', elem_tmp)
code_sw_dict = self.__tot_code_sw
for symbol in symbols:
try:
elem_tmp = re.sub(fr'\b{symbol}\b', str(code_sw_dict[symbol]), elem_tmp)
except KeyError:
if symbol not in self._missing_codesw:
self.critical('Missing %s in CodeSwitch definition', symbol)
self._missing_codesw.add(symbol)
return False
# convert matlab/c to python expressions
for conv in self.convs:
elem_tmp = re.sub(conv[0], conv[1], elem_tmp)
# evaluate and return result
return eval(elem_tmp)
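# Example (hypothetical switches, with Vc_DemoSwA = 1 and Vc_DemoSwB = 0 in
# the code switch dict): 'Vc_DemoSwA && ~Vc_DemoSwB' is substituted to
# '1 && ~0', converted to '1 and not 0' (modulo whitespace) and evaluates
# to True.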
def check_if_active_in_config(self, config_def):
"""Check if a config is active in the current context.
Takes a collection of config strings and checks if this config definition is active
within the current configuration. The structure of the provided string collection
determines how the config definition is evaluated. (logical and/or expressions)
list of list of config strings
[[*cs1* and *cs2*] or [*cs3* and *cs4*]].
list of config strings
[*cs1* and *cs2*]
single config string
*cs1*
Args:
config_def (list): the config definitions as described above
Returns:
Bool: True if active the current configuration
"""
if not config_def:
return True
# format the input to a list of list of strings
if isinstance(config_def, str):
c_def = [[config_def]]
elif isinstance(config_def, list) and isinstance(config_def[0], str):
c_def = [config_def]
else:
c_def = config_def
eval_ = False
for or_elem in c_def:
for and_elem in or_elem:
eval_ = self._eval_cfg_expr(and_elem)
if not eval_:
break
if eval_:
break
return eval_
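# Example (hypothetical switches, Vc_DemoSwA = 1, Vc_DemoSwB = 0):
#     check_if_active_in_config([['Vc_DemoSwA'], ['Vc_DemoSwB']])
# is evaluated as (Vc_DemoSwA) or (Vc_DemoSwB) and returns a truthy result.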
def _conv_mlab_def_to_py(self, matlab_def):
"""Convert matlab syntax to python syntax.
TODO: Move this functionality to the matlab-scripts, which are
run on the local machine.
"""
m_def_tmp = matlab_def
for from_, to_ in self.convs:
m_def_tmp = re.sub(from_, to_, m_def_tmp)
return m_def_tmp
def _parse_local_def(self, file_data):
"""Parse one local define file."""
res = re.findall(r'#if\s+(.*?)(?<!\\)$\s*#define (\w+)\s+#endif',
file_data, flags=re.M | re.DOTALL)
def_wo_if_endif = re.sub(r'#if(.*?)#endif', '', file_data, flags=re.M | re.DOTALL)
one_line_def = re.findall(r'#define\s+(\w+)\s+(.+?)(?:(?://|/\*).*?)?$',
def_wo_if_endif, flags=re.M)
res.extend([(v, d) for (d, v) in one_line_def])
# remove line-continuation '\' characters and the following line break
self._if_define_dict.update({d: self._conv_mlab_def_to_py(re.sub(r'\s*?\\$\s*', ' ', i))
for (i, d) in res})
def _parse_all_local_defs(self):
"""Parse all local define files."""
def_paths = self._build_prj_config.get_unit_src_dirs()
ld_fname = self._build_prj_config.get_local_defs_name()
loc_def_files = []
for def_path in def_paths.values():
loc_def_files.extend(glob.glob(os.path.join(def_path, ld_fname)))
for file_ in loc_def_files:
with open(file_, 'r', encoding="utf-8") as fhndl:
data = fhndl.read()
self._parse_local_def(data)
self.debug('self._if_define_list:\n%s\n', pformat(self._if_define_dict))
def _set_config(self, code_sw):
"""Set config for code switches.
Useful for unit testing.
Args:
code_sw (dict): code switch configuration to set.
"""
# __tot_code_sw
self.__code_sw_cfg = code_sw
self.__tot_code_sw = copy.deepcopy(self.__code_sw_cfg)


@ -0,0 +1,227 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Module to generate AllSystemInfo.mat for compatibility purposes with DocGen."""
import os
from copy import deepcopy
from numpy import array
from scipy.io import savemat
from pybuild.signal_interfaces import SignalInterfaces
from pybuild.unit_configs import UnitConfigs
from pybuild.lib.helper_functions import merge_dicts
from pybuild.problem_logger import ProblemLogger
def _get_signals_by_type(signal_conf, signal_type):
"""Get signals by type ('missing', 'unused', 'inconsistent_defs' or 'multiple_defs').
Args:
signal_conf (dict): Configuration from SignalInterfaces with the following format
{
"missing": {"signal_name" : ["VED4_GENIII", "VEP4_GENIII"]}
"unused": {}
"multiple_defs": {}
"inconsistent_defs": {}
}
Returns:
dict: with the following format
{
"signal_name" : {'VarStatus' : 'Not Used', 'SignalType' : "signal_type"}
}
"""
result = {}
for signal_name in signal_conf[signal_type].keys():
signal = {signal_name: {'VarStatus': 'Not Used', 'SignalType': signal_type}}
result.update(signal)
return result
class GenAllSystemInfo(ProblemLogger):
"""Class to generate AllSystemInfo.mat for compatibility purposes with DocGen."""
_signal_types = ['missing', 'unused']
def __init__(self, signal_if, unit_cfg):
"""Class to generate a matlab struct AllSystemInfo for compatibility with the old document generation.
Args:
signal_if: an initialized SignalInterfaces object to access signal configurations
unit_cfg: an initialized UnitConfigs object to access unit configurations
"""
if not isinstance(signal_if, SignalInterfaces):
raise TypeError
if not isinstance(unit_cfg, UnitConfigs):
raise TypeError
self.signal_if = signal_if
self.unit_cfg = unit_cfg
self._signals_without_source = {}
self._core_ids = {}
self._file_info = {}
self._dids = {}
def __repr__(self):
"""Get string representation of object."""
return f'if: {self.signal_if}\n uc: {self.unit_cfg}'
def build(self):
"""Build AllSystemInfo.mat for docgen compatibility."""
self.info('******************************************************')
self.info('%s - Getting signals without source', __name__)
signals_tmp = self.get_signals_without_source()
for signal_name, signal_data in signals_tmp.items():
if '.' in signal_name:
struct_name = signal_name.split('.')[0]
if struct_name not in self._signals_without_source:
self.info(
f'{__name__} - Found struct signal: {signal_name}, adding struct name only ({struct_name}). '
'Remaining members will be ignored.'
)
self._signals_without_source[struct_name] = signal_data
else:
self._signals_without_source[signal_name] = signal_data
self.info('%s - Getting CoreIDs', __name__)
self._core_ids = self.get_core_ids()
self.info('%s - Generating FileInfo', __name__)
self._file_info = self._gen_dummy_file_info()
self.info('%s - Getting DIDs', __name__)
self._dids = self.get_dids()
output_path = os.path.abspath(os.path.join('Models', 'CodeGenTmp'))
self.info('%s - Creating CodeGenTmp directory', __name__)
self.create_codegen_tmp_dir(output_path)
self.info('%s - Building AllSystemInfo', __name__)
self._build_all_system_info(output_path)
def create_codegen_tmp_dir(self, path):
"""Create CodeGenTmp directory if it does not exist.
Args:
path (str): directory to create.
"""
if not os.path.isdir(path):
os.mkdir(path)
else:
self.info('%s - Directory already exists at %s', __name__, path)
def _build_all_system_info(self, path):
"""Build AllSystemInfo.mat for DocGen compatibility.
Args:
path (str): directory path to place the output file.
"""
absolute_path = os.path.abspath(os.path.join(path, 'AllSystemInfo.mat'))
all_system_info = {
'SystemData': self._signals_without_source,
'CoreIDs': self._core_ids,
'FileInfo': self._file_info,
'DIDs': self._dids
}
all_system_info_scipy = {'AllSystemInfo': all_system_info}
savemat(absolute_path, all_system_info_scipy, long_field_names=True, do_compression=True)
self.info('%s - AllSystemInfo placed at %s', __name__, absolute_path)
def get_core_ids(self):
"""Get core IDs for specified project.
Returns:
dict: core IDs as a dict with the following format
{
"unit name" : {}
}
"""
result = {}
unit_configs = self.unit_cfg.get_per_unit_cfg()
for unit, configs in unit_configs.items():
result[unit] = deepcopy(configs['core'])
for unit, configs in result.items():
for core_id, core_id_dict in configs.items():
for identifier, data_dict in core_id_dict.items():
for tl_field, tl_data in data_dict.items():
# Turn lists into numpy.array, required to get matlab cell arrays instead of char arrays.
# char arrays make DocGen crash.
if isinstance(tl_data, list):
configs[core_id][identifier][tl_field] = array(tl_data, dtype=object)
else:
configs[core_id][identifier][tl_field] = tl_data
return result
def get_dids(self):
"""Get DIDs for specified project.
Returns:
dict: DID as a dict with the following format
{
"unit name" : {}
}
"""
result = {}
unit_configs = self.unit_cfg.get_per_unit_cfg()
for unit, configs in unit_configs.items():
dids_list = []
for signal_values in configs['dids'].values():
signal_values_allsysteminfo_keys = {
'sig_name': signal_values['name'],
'sig_desc': signal_values['description'],
'blk_name': signal_values['handle'],
'data_type': signal_values['type'],
'unit': signal_values['unit'],
'lsb': signal_values['lsb'],
'offset': signal_values['offset']
}
dids_list.append(signal_values_allsysteminfo_keys)
result.update({unit: dids_list})
return result
def _gen_dummy_file_info(self):
"""Generate dummy FileInfo struct for compatibility purposes."""
result = {}
for key in self._core_ids:
result.update({key: ''})
return result
def get_signals_without_source(self):
"""Get missing and unused signals from project configuration.
Returns:
dict: result with the following format
{
"signal_name" : "signal_type"
}
"""
prj_conf = self.signal_if.check_config()
result = self._get_external_signals(prj_conf)
temp_signals = self._get_internal_signals(prj_conf)
result = merge_dicts(result, temp_signals, merge_recursively=True)
return result
def _get_external_signals(self, prj_conf):
signals_conf = prj_conf['sigs']['ext']
return self._get_signals_by_types(signals_conf)
def _get_internal_signals(self, prj_conf):
result = {}
signals_conf = prj_conf['sigs']['int']
for sig_conf in signals_conf.values():
temp_signals = self._get_signals_by_types(sig_conf)
result = merge_dicts(result, temp_signals, merge_recursively=True)
return result
def _get_signals_by_types(self, signal_conf):
"""Get signals by multiple types and merge them into one dictionary, see _get_signals_by_type() for docs."""
result = {}
for signal_type in self._signal_types:
temp_signals = _get_signals_by_type(signal_conf, signal_type)
if temp_signals:
result = merge_dicts(result, temp_signals, merge_recursively=True)
return result

449
pybuild/gen_label_split.py Normal file

@ -0,0 +1,449 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Module for labelsplit files."""
import glob
import re
import json
import sys
from xml.etree import ElementTree
from pathlib import Path
from pybuild.feature_configs import FeatureConfigs
from pybuild.unit_configs import UnitConfigs
from pybuild.signal_interfaces import CsvSignalInterfaces
from pybuild.lib import helper_functions, logger
LOGGER = logger.create_logger(__file__)
class LabelSplit:
""" Provides common LabelSplit functions for multiple repos.
"""
def __init__(self, project, build_cfg, cfg_json, cmt_source_folder):
"""Read project configuration file to internal an representation.
Args:
project (str): Project name.
build_cfg(BuildProjConfig): configures which units are active in the current project and where
the code switch files are located.
cfg_json(Path): Path to label split configuration file.
cmt_source_folder (Path): Path to CMT source folder.
"""
super().__init__()
self.project = project
self.build_cfg = build_cfg
project_a2l_file_path = Path(self.build_cfg.get_src_code_dst_dir(),
self.build_cfg.get_a2l_name())
self.feature_cfg = FeatureConfigs(self.build_cfg)
self.unit_cfg = UnitConfigs(self.build_cfg, self.feature_cfg)
self.csv_if = CsvSignalInterfaces(self.build_cfg, self.unit_cfg)
self.project_a2l_symbols = self.get_project_a2l_symbols(project_a2l_file_path)
self.labelsplit_cfg = self.read_json(cfg_json)
self.cmt_source_folder = cmt_source_folder
@staticmethod
def read_json(cfg_json):
"""Read label split configuration file from given location
If the file does not exist in the given location, the program
exits with an error message.
Args:
cfg_json(Path): Path to label split configuration file
Returns:
labelsplit_cfg (dict): Dict of given file content
"""
labelsplit_cfg = None
if cfg_json.exists():
with cfg_json.open() as json_file:
labelsplit_cfg = json.load(json_file)
return labelsplit_cfg
LOGGER.error('Cannot find label split config file: %s', cfg_json)
sys.exit(1)
@staticmethod
def get_project_a2l_symbols(project_a2l_file_path):
"""Get a list of calibration symbols found in a given project A2L file.
Args:
project_a2l_file_path (Path): Path to project A2L file.
Returns:
symbols_in_a2l (list): List of calibration symbols found in the project A2L file.
"""
symbols_in_a2l = []
with project_a2l_file_path.open() as a2l_fh:
a2l_text = a2l_fh.read()
calibration_blocks = re.findall(r'(?:\s*\n)*(\s*/begin (CHARACTERISTIC|AXIS_PTS)[\n\s]*'
r'(\w+)([\[\d+\]]*).*?\n.*?/end \2)',
a2l_text,
flags=re.M | re.DOTALL)
for blk in calibration_blocks:
symbols_in_a2l.append(blk[2])
return symbols_in_a2l
@staticmethod
def get_sgp_symbols(sgp_file: Path):
"""Get symbols and symbol_groups found in a given _sgp.xml file.
Example output: {sVcExample: [(3, EC_EX_1), (4, EC_EX_2)]}, where the indices are column indices:
1 -> symbol name (therefore not in list of symbol groups).
2 -> diesel group.
3 -> petrol group.
4 -> hybrid group.
5 -> subsystem (therefore not in list of symbol groups).
Args:
sgp_file (Path): Path to an _sgp.xml file.
Returns:
found_sgp_symbols (dict): A symbol to symbol_groups dictionary found in the sgp_file.
"""
tree = ElementTree.parse(sgp_file)
root = tree.getroot()
search_string = '{{urn:schemas-microsoft-com:office:spreadsheet}}{tag}'
label_sheet = root.find(search_string.format(tag='Worksheet'))
table = label_sheet.find(search_string.format(tag='Table'))
rows = table.findall(search_string.format(tag='Row'))
found_sgp_symbols = {}
for row in rows:
symbol = None
column_counter = 1
cells = row.findall(search_string.format(tag='Cell'))
for cell in cells:
data = cell.find(search_string.format(tag='Data'))
if data is not None:
# Sometimes there are spaces in the symbol cell
# Sometimes there is a weird \ufeff character (VcDebug_sgp.xml) in the symbol cell
value = data.text.replace(' ', '').replace('\ufeff', '')
if symbol is None:
symbol = value
found_sgp_symbols[symbol] = []
else:
new_index = search_string.format(tag='Index')
if new_index in cell.attrib:
column_counter = int(cell.attrib[new_index])
found_sgp_symbols[symbol].append((column_counter, value))
column_counter += 1
return found_sgp_symbols
def get_sgp_symbol_group(self, symbol_groups_by_index):
"""Match _sgp.xml file indices (symbol groups) with a given project.
Args:
symbol_groups_by_index (list(tuple)): List of (index, symbol_group) pairs.
Returns:
symbol_group (str): The symbol group corresponding to the given project.
"""
symbol_group = ''
symbol_dict = self.labelsplit_cfg.get("SGP_SYMBOL_GROUPS")
symbol_list = [val for key, val in symbol_dict.items() if key in self.project]
if len(symbol_list) >= 1:
for index, group in symbol_groups_by_index:
if index == symbol_list[0]:
symbol_group = group
else:
LOGGER.error('Cannot match symbol group type for project: %s', self.project)
return symbol_group
def get_interface_symbols_and_groups(self, interface_dict, in_symbol_sgp_dict, out_symbol_sgp_dict):
"""Get a list of (symbol, symbol_group) pairs found in given interface and sgp files.
Args:
interface_dict (dict): interface to symbol map, matching a certain IO type.
in_symbol_sgp_dict (dict): An input symbol to symbol_groups dictionary found in an sgp_file,
to be compared with interface_dict inputs.
out_symbol_sgp_dict (dict): An output symbol to symbol_groups dictionary found in an sgp_file,
to be compared with interface_dict outputs.
Returns:
symbols_and_groups (list(tuple)): List of (symbol, symbol_group) pairs in: interface and sgp file.
"""
symbols_and_groups = []
for interface, symbol_data in interface_dict.items():
for symbol in symbol_data.keys():
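# Derive the debug and switch calibration names from the symbol name,
# e.g. 'sVcExample' -> 'cVcExample_db' and 'cVcExample_sw' (illustrative name).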
debug_name = re.sub(r'\w(\w+)', r'c\1_db', symbol)
switch_name = re.sub(r'\w(\w+)', r'c\1_sw', symbol)
if 'Input' in interface and debug_name in in_symbol_sgp_dict and switch_name in in_symbol_sgp_dict:
debug_symbol_group = self.get_sgp_symbol_group(in_symbol_sgp_dict[debug_name])
switch_symbol_group = self.get_sgp_symbol_group(in_symbol_sgp_dict[switch_name])
symbols_and_groups.extend([(debug_name, debug_symbol_group), (switch_name, switch_symbol_group)])
elif 'Output' in interface and debug_name in out_symbol_sgp_dict and switch_name in out_symbol_sgp_dict:
debug_symbol_group = self.get_sgp_symbol_group(out_symbol_sgp_dict[debug_name])
switch_symbol_group = self.get_sgp_symbol_group(out_symbol_sgp_dict[switch_name])
symbols_and_groups.extend([(debug_name, debug_symbol_group), (switch_name, switch_symbol_group)])
return symbols_and_groups
def get_debug_symbols_and_groups(self):
"""Get a list of (symbol, symbol_group) pairs found in project interface and VcDebug*_sgp.xml files.
Returns:
debug_symbols_and_groups (list(tuple)): List of (symbol, symbol_group) pairs in:
interface and VcDebug*_sgp.xml files.
"""
_unused, dep, _unused_two, debug = self.csv_if.get_io_config()
sgp_file_dict = self.labelsplit_cfg.get("SGP_FILE")
debug_sgp_file = Path(sgp_file_dict.get('cfg_folder'), sgp_file_dict.get('debug'))
debug_output_sgp_file = Path(sgp_file_dict.get('cfg_folder'), sgp_file_dict.get('debug_output'))
dep_sgp_file = Path(sgp_file_dict.get('cfg_folder'), sgp_file_dict.get('dep'))
dep_output_sgp_file = Path(sgp_file_dict.get('cfg_folder'), sgp_file_dict.get('dep_output'))
debug_sgp_symbols = self.get_sgp_symbols(debug_sgp_file)
debug_output_sgp_symbols = self.get_sgp_symbols(debug_output_sgp_file)
dep_sgp_symbols = self.get_sgp_symbols(dep_sgp_file)
dep_output_sgp_symbols = self.get_sgp_symbols(dep_output_sgp_file)
symbols_and_groups_tmp = []
debug_tmp = self.get_interface_symbols_and_groups(debug,
debug_sgp_symbols,
debug_output_sgp_symbols)
dep_tmp = self.get_interface_symbols_and_groups(dep,
dep_sgp_symbols,
dep_output_sgp_symbols)
symbols_and_groups_tmp.extend(debug_tmp)
symbols_and_groups_tmp.extend(dep_tmp)
debug_symbols_and_groups = []
for symbol, symbol_group in symbols_and_groups_tmp:
if symbol_group == '':
LOGGER.info('Debug symbol %s is missing symbol group and will be removed.', symbol)
else:
debug_symbols_and_groups.append((symbol, symbol_group))
return debug_symbols_and_groups
def check_unit_par_file(self, unit):
"""Check <unit>_par.m file for default sgp symbol group.
Args:
unit (str): Current unit/model name.
Returns:
has_sgp_default (bool): Whether the unit is associated with a default sgp value.
default_symbol_group (str): Name of default symbol group.
"""
has_sgp_default = False
default_symbol_group = ''
base_search_string = r'SgpDefault\.{unit}\.[A-Za-z]+\s*=\s*[\'\"]([A-Za-z_]+)[\'\"]'
search_string = base_search_string.format(unit=unit)
non_existent_par_file = Path('non_existent_par_file.m')
found_par_files = glob.glob('Models/*/' + unit + '/' + unit + '_par.m')
if len(found_par_files) > 1:
LOGGER.warning('Found more than one _par.m file, using %s', found_par_files[0])
par_file = Path(found_par_files[0]) if found_par_files else non_existent_par_file
if self.labelsplit_cfg.get("special_unit_prefixes"):
for special_prefix in self.labelsplit_cfg.get("special_unit_prefixes"):
if unit.startswith(special_prefix) and not par_file.is_file():
# Some units require special handling.
if '__' in unit:
parent = unit.replace('__', 'Mdl__')
else:
parent = unit + 'Mdl'
found_par_files = glob.glob('Models/*/' + parent + '/' + parent + '_par.m')
par_file = Path(found_par_files[0]) if found_par_files else Path(non_existent_par_file)
# Default symbol group is based on c-file name
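# e.g. 'VcExampleMdl__denso' -> 'VcExample' (illustrative unit name).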
c_name = re.sub('(Mdl)?(__.*)?', '', unit)
search_string = base_search_string.format(unit=c_name)
if par_file.is_file():
with par_file.open(encoding="latin-1") as par_fh:
par_text = par_fh.read()
sgp_default_match = re.search(search_string, par_text)
if sgp_default_match is not None:
has_sgp_default = True
default_symbol_group = sgp_default_match.group(1)
else:
LOGGER.info('Missing _par file for model: %s', unit)
return has_sgp_default, default_symbol_group
def get_unit_sgp_file(self, unit):
"""Get path to <unit>_sgp.xml file.
Args:
unit (str): Current unit/model name.
Returns:
sgp_file (Path): Path to <unit>_sgp.xml file.
"""
non_existent_sgp_file = Path('non_existent_sgp_file.xml')
found_sgp_files = glob.glob('Models/*/' + unit + '/' + unit + '_sgp.xml')
if len(found_sgp_files) > 1:
LOGGER.warning('Found more than one _sgp.xml file, using %s', found_sgp_files[0])
sgp_file = Path(found_sgp_files[0]) if found_sgp_files else Path(non_existent_sgp_file)
if self.labelsplit_cfg.get("special_unit_prefixes"):
for special_prefix in self.labelsplit_cfg.get("special_unit_prefixes"):
if unit.startswith(special_prefix) and not sgp_file.is_file():
# Some units require special handling.
if '__' in unit:
parent = unit.replace('__', 'Mdl__')
else:
parent = unit + 'Mdl'
found_sgp_files = glob.glob('Models/*/' + parent + '/' + parent + '_sgp.xml')
sgp_file = Path(found_sgp_files[0]) if found_sgp_files else Path(non_existent_sgp_file)
return sgp_file
def get_unit_symbols_and_groups(self, unit, calibration_symbols):
"""Get a list of (symbol, symbol_group) pairs found in A2L, <unit>_sgp/par and config_<unit>.json files.
Args:
unit (str): Current unit/model name.
calibration_symbols (list): All calibration symbols for the unit (from config_<unit>.json).
Returns:
unit_symbols_and_groups (list(tuple)): List of (symbol, symbol_group) pairs in: A2L, _sgp/_par and
config files.
"""
unit_symbols_and_groups = []
has_sgp_default, default_symbol_group = self.check_unit_par_file(unit)
sgp_file = self.get_unit_sgp_file(unit)
if sgp_file.is_file():
found_sgp_symbols = self.get_sgp_symbols(sgp_file)
else:
found_sgp_symbols = {}
LOGGER.info('Missing _sgp file for model: %s', unit)
for symbol in calibration_symbols:
if symbol not in self.project_a2l_symbols:
LOGGER.info('Symbol %s not in project A2L file and will be removed.', symbol)
continue
if symbol in found_sgp_symbols:
symbol_group = self.get_sgp_symbol_group(found_sgp_symbols[symbol])
unit_symbols_and_groups.append((symbol, symbol_group))
elif has_sgp_default:
if symbol.endswith('_sw') or symbol.endswith('_db'):
LOGGER.info('Debug symbol %s not in sgp file and will be removed.', symbol)
else:
unit_symbols_and_groups.append((symbol, default_symbol_group))
else:
LOGGER.info('Symbol %s missing in _sgp file and lacks SgpDefault value.', symbol)
return unit_symbols_and_groups
def get_calibration_constants(self):
"""Get all calibration symbols for each unit in the project.
Returns:
calibration_symbols_per_unit (dict): A unit to symbol list dictionary.
"""
security_variables = self.labelsplit_cfg.get("security_variables")
u_conf_dict = self.unit_cfg.get_per_cfg_unit_cfg()
safe_calibration_symbols = {}
for symbol, symbol_data in u_conf_dict['calib_consts'].items():
if symbol not in security_variables:
safe_calibration_symbols.update({symbol: symbol_data})
calibration_symbols_per_unit = {}
for symbol, symbol_data in safe_calibration_symbols.items():
for unit, unit_data in symbol_data.items():
if 'CVC_CAL' in unit_data['class']:
if unit in calibration_symbols_per_unit:
calibration_symbols_per_unit[unit].append(symbol)
else:
calibration_symbols_per_unit[unit] = [symbol]
return calibration_symbols_per_unit
def get_symbols_and_groups(self):
"""Get a list of (symbol, symbol_group) pairs found in A2L, <unit>_sgp/par and config_<unit>.json files.
Returns:
exit_code (int): 0/1 based on successful collection of symbols and symbol groups.
all_symbols_and_groups (dict): A symbol to symbol_group dictionary for all A2L, _sgp/_par and config files,
"""
exit_code = 0
calibration_symbols_per_unit = self.get_calibration_constants()
debug_symbols = self.get_debug_symbols_and_groups()
all_symbol_and_group_pairs = []
if self.labelsplit_cfg.get("project_symbols"):
special_project_dict = self.labelsplit_cfg.get("project_symbols")
pair_list = [val for key, val in special_project_dict.items() if key in self.project]
if len(pair_list) == 1:
symbol_and_group_pairs_list = pair_list[0].items()
all_symbol_and_group_pairs += symbol_and_group_pairs_list
elif len(pair_list) > 1:
LOGGER.error('Project %s does not follow the naming rule', self.project)
return 1, {}
all_symbol_and_group_pairs.extend(debug_symbols)
for unit, symbols in calibration_symbols_per_unit.items():
if self.labelsplit_cfg.get("special_units"):
special_unit_dict = self.labelsplit_cfg.get("special_units")
if unit in special_unit_dict.keys():
# Some units require special handling.
LOGGER.warning('Found %s, assuming %s is used.', unit, special_unit_dict.get(unit))
labels = self.get_unit_symbols_and_groups(special_unit_dict.get(unit), symbols)
else:
labels = self.get_unit_symbols_and_groups(unit, symbols)
else:
labels = self.get_unit_symbols_and_groups(unit, symbols)
all_symbol_and_group_pairs.extend(labels)
symbol_to_group_dict = {}
for symbol, symbol_group in all_symbol_and_group_pairs:
if symbol in symbol_to_group_dict:
if symbol_to_group_dict[symbol] != symbol_group:
LOGGER.error('Symbol %s multiply defined with different symbol groups.', symbol)
exit_code = 1
else:
symbol_to_group_dict[symbol] = symbol_group
return exit_code, symbol_to_group_dict
def generate_label_split_xml_file(self, symbols_and_groups):
"""Generate a label split file, given a directory plus labels and groups to add.
Args:
symbols_and_groups (dict): A symbol to symbol_group dictionary given a project.
Returns:
exit_code (int): 0/1 based on successful generation of Labelsplit.xls.
"""
errors = []
project_root_dir = self.build_cfg.get_root_dir()
cmt_output_folder = helper_functions.create_dir(Path(project_root_dir, 'output', 'CMT'))
start_file_name = Path(self.cmt_source_folder, 'template_labelsplit_sgp_start.xml_')
row_count_file_name = Path(cmt_output_folder, 'labelsplit_rowcount.xml_')
start_2_file_name = Path(self.cmt_source_folder, 'template_labelsplit_sgp_start_2.xml_')
label_split_rows_filename = Path(cmt_output_folder, 'labelsplit_rows.xml_')
end_file_name = Path(self.cmt_source_folder, 'template_labelsplit_sgp_end.xml_')
files_to_merge = [start_file_name, row_count_file_name, start_2_file_name,
label_split_rows_filename, end_file_name]
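# Labelsplit.xls is a SpreadsheetML (XML) document, assembled by concatenating
# static template fragments with the generated row count and row data.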
with row_count_file_name.open('w', encoding="utf-8") as rc_fh:
rc_fh.write(f'{len(symbols_and_groups) + 1}') # header + data
with label_split_rows_filename.open('w', encoding="utf-8") as lsrf_fh:
for symbol, symbol_group in symbols_and_groups.items():
if symbol_group == '':
errors.append(f'Missing symbol group for symbol: {symbol}')
elif symbol_group == 'VCC_SPM_DEBUG':
LOGGER.info('Ignoring undistributed debug symbol: %s', symbol)
else:
lsrf_fh.write(
' <Row ss:AutoFitHeight="0">\n'
f' <Cell><Data ss:Type="String">{symbol}'
'</Data><NamedCell ss:Name="_FilterDatabase"/></Cell>\n'
f' <Cell ss:Index="5"><Data ss:Type="String">{symbol_group}'
'</Data><NamedCell ss:Name="_FilterDatabase"/></Cell>\n'
' </Row>\n'
)
if errors:
LOGGER.error('\n'.join(errors))
return 1
output_file_name = Path(project_root_dir, 'output', 'CMT', 'Labelsplit.xls')
with output_file_name.open('w', encoding="utf-8") as output_fh:
for file_name in files_to_merge:
with file_name.open(encoding="utf-8") as input_fh:
content = input_fh.read()
output_fh.write(content)
LOGGER.info('Delivery to: %s', str(output_file_name))
return 0

View File

@ -0,0 +1,124 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Script to replace pragmas in hand written c-code."""
import re
import os
import sys
import shutil
REPO_ROOT = os.path.join(os.path.dirname(__file__), '..', '..')
class CodeReplacer:
"""Class to replace code in hand written c-code."""
def replace_line(self, line):
"""Replace line."""
raise NotImplementedError
def replace_file(self, file_name):
"""Go through all lines in the file and replace pragmas."""
tmp_file = file_name + '.tmp'
with open(file_name, 'r', encoding='ascii', errors='ignore') as old_file:
with open(tmp_file, 'w', encoding='ascii', errors='ignore') as new_file:
for line in old_file.readlines():
line = self.replace_line(line)
new_file.write(line)
shutil.move(tmp_file, file_name)
class PragmaReplacer(CodeReplacer):
"""Class to replace pragmas in hand written c-code."""
def __init__(self):
"""Init."""
self.cvc_started = False
self.cvc = None
self.regex = re.compile(r'^\s*#pragma section\s*(CVC(?P<cvc>[a-zA-Z0-9_]*))*\s*$')
self.template = '#include "CVC_{cvc}_{start_or_end}.h"\n'
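# e.g. '#pragma section CVC' starts a block -> '#include "CVC_CODE_START.h"';
# the following bare '#pragma section' ends it -> '#include "CVC_CODE_END.h"'.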
def replace_line(self, line):
"""Replace line (if it has a pragma)."""
match = self.regex.match(line)
if match:
if self.cvc_started:
line = self.template.format(cvc=self.cvc,
start_or_end='END')
self.cvc = None
else:
self.cvc = match.group('cvc') or 'CODE'
line = self.template.format(cvc=self.cvc,
start_or_end='START')
self.cvc_started = not self.cvc_started
return line
class CodeSwitchReplacer(CodeReplacer):
"""Class to replace code switch includes in hand written c-code."""
def __init__(self):
"""Init."""
self.regex = re.compile(r'(.*)SPM_Codeswitch_Setup(_PVC)?(.*)')
self.template = '{}VcCodeSwDefines{}\n'
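# e.g. '#include "SPM_Codeswitch_Setup.h"' -> '#include "VcCodeSwDefines.h"'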
def replace_line(self, line):
"""Replace include code switch file."""
match = self.regex.match(line)
if match:
return self.template.format(match.group(1), match.group(3))
return line
def update_files(files):
"""Replace code in handwritten file."""
for file_path in files:
PragmaReplacer().replace_file(file_path)
CodeSwitchReplacer().replace_file(file_path)
def update_test_files(files):
"""Replace code in handwritten file."""
for file_path in files:
CodeSwitchReplacer().replace_file(file_path)
def get_files_to_update(source_dir_name):
"""Get files to update."""
files = [os.path.join('./Models/SSPCECD/VcTqReq__DIESEL', source_dir_name, 'invTab2_UInt16_func.c'),
os.path.join('./Models/SSPCECD/VcCmbNOx__DIESEL', source_dir_name, 'NNEval15x8_func.c'),
os.path.join('./Models/SSPCECD/VcTqEff__DIESEL', source_dir_name, 'NNEval15x8_func.c')]
for path, _, filenames in os.walk(os.path.join(REPO_ROOT, 'Models', 'SSPDL')):
if path.endswith(os.path.join('Mdl', source_dir_name)) or\
path.endswith(os.path.join('Mdl__denso', source_dir_name)):
for filename in filenames:
if (filename.endswith('.c') and not filename.endswith('Mdl.c')) or \
(filename.endswith('.h') and not filename.endswith('Mdl.h')):
file_path = os.path.relpath(os.path.join(path, filename))
files.append(file_path)
return files
def get_test_files_to_update():
"""Get files to update."""
files = []
for path, _, filenames in os.walk(os.path.join(REPO_ROOT, 'Models')):
for filename in filenames:
if filename.endswith('.py'):
file_path = os.path.relpath(os.path.join(path, filename))
files.append(file_path)
return files
def main():
"""Replace incorrect pragmas in handwritten c-code."""
files_to_update = get_files_to_update('pybuild_src')
update_files(files_to_update)
test_files_to_update = get_test_files_to_update()
update_test_files(test_files_to_update)
return 0
if __name__ == '__main__':
sys.exit(main())

133
pybuild/html_report.py Normal file
View File

@ -0,0 +1,133 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Module for html report generation."""
from string import Template
class HtmlReport:
"""Generate template html report. Extend this class to add content.
TODO: Refactor common parts from derived report classes to this class.
"""
__html_head = """<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<style>
body {
counter-reset: h2;
font-family:"Segoe UI","Segoe",Tahoma,Helvetica,Arial,sans-serif;
}
h1 {text-align: center}
h2 {counter-reset: h3}
h3 {counter-reset: h4}
h4 {counter-reset: h5}
h5 {counter-reset: h6}
h2:before {counter-increment: h2; content: counter(h2) ". "}
h3:before {counter-increment: h3; content: counter(h2) "." counter(h3) ". "}
h4:before {counter-increment: h4; content: counter(h2) "." counter(h3) "." counter(h4) ". "}
h5:before {counter-increment: h5; content: counter(h2) "." counter(h3) "." counter(h4) "." counter(h5) ". "}
h6:before {
counter-increment: h6; content: counter(h2) "." counter(h3) "." counter(h4) "." counter(h5) "." counter(h6) ". "
}
h2.nocount:before, h3.nocount:before, h4.nocount:before, h5.nocount:before, h6.nocount:before {
content: ""; counter-increment: none
}
p {
padding-left:5mm;
}
table {
width:95%;
margin-left:5mm
}
table, th, td {
border: 1px solid black;
border-collapse: collapse;
}
th, td {
padding: 5px;
text-align: left;
}
table tr:nth-child(even) {
background-color: #eee;
}
table tr:nth-child(odd) {
background-color:#fff;
}
table th {
background-color: black;
color: white;
}
ol {
list-style-type: none;
counter-reset: item;
margin: 0;
padding-left:5mm;
}
ol > li {
display: table;
counter-increment: item;
margin-bottom: 0.6em;
}
ol > li:before {
content: counters(item, ".") ". ";
display: table-cell;
padding-right: 0.6em;
}
li ol > li {
margin: 0;
}
li ol > li:before {
content: counters(item, ".") " ";
}
</style>
<title>$title</title>
</head>
<body>
<h1>$title</h1>
"""
__html_end = """</body>
</html>
"""
def __init__(self, title):
"""Initialize report title.
Args:
title (str): The report title
"""
super().__init__()
self._title = title
def _gen_header(self):
"""Generate html header."""
return Template(self.__html_head).substitute(title=self._title)
@staticmethod
def gen_contents():
"""Generate report contents.
Override this method to add report contents.
Contents should start with a <h2> header.
"""
return '<h2>Template report</h2>'
def _gen_end(self):
"""Generate the end of the body and html document."""
return self.__html_end
def generate_report_string(self):
"""Generate a html report as string."""
html = []
html.append(self._gen_header())
html.append(self.gen_contents())
html.append(self._gen_end())
return ''.join(html)
def generate_report_file(self, filename):
"""Generate a html report and save to file."""
with open(filename, 'w', encoding="utf-8") as fhndl:
fhndl.write(self.generate_report_string())
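# A minimal usage sketch (hypothetical subclass; gen_contents is the intended
# extension point):
#
#     class ExampleReport(HtmlReport):
#         @staticmethod
#         def gen_contents():
#             return '<h2>Example</h2><p>Report body.</p>'
#
#     ExampleReport('Example report').generate_report_file('example.html')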

View File

@ -0,0 +1,19 @@
# Folder for abstract interfaces
The idea is to be able to calculate an interface between any two things.
Examples:
* Calculate the interface between a raster and the EMS.
* Calculate the interface between two rasters.
This is done by describing each item as something with insignals and outsignals.
Then we can match the insignals from one item with the outsignals of the other.
When we do this, it is important that the signal names correspond.
The insignal name in one item has to be exactly the same as the outsignal name in the other.
That is why we have the HALA, to translate between internal signal name and hal property name.
We can also combine more than one interface into a domain.
Examples of domains:
* All raster-to-raster interfaces -> the internal domain
* All rasters' communication with one hal -> that hal's domain
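
A minimal sketch of the matching idea (illustrative names, not the real API):

```python
# Each item is just something with insignals and outsignals.
ems = {"insignals": {"sVcExample"}, "outsignals": {"yVcEms"}}
raster = {"insignals": {"yVcEms"}, "outsignals": {"sVcExample"}}

# The interface in each direction is the name-wise match between
# one item's outsignals and the other item's insignals.
raster_to_ems = raster["outsignals"] & ems["insignals"]  # {'sVcExample'}
ems_to_raster = ems["outsignals"] & raster["insignals"]  # {'yVcEms'}
```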

View File

@ -0,0 +1,4 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""pybuild.interface."""

View File

@ -0,0 +1,511 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Python module for abstracting Pybuild applications"""
from pathlib import Path
import json
from pybuild.interface.base import BaseApplication, Signal, MultipleProducersError, Domain, Interface
from pybuild.lib import logger
from pybuild.build_proj_config import BuildProjConfig
from pybuild.feature_configs import FeatureConfigs
from pybuild.unit_configs import UnitConfigs
from pybuild.user_defined_types import UserDefinedTypes
LOGGER = logger.create_logger("application")
def get_raster_to_raster_interfaces(rasters):
"""Generate a list of Interfaces for internal raster-to-raster signals.
Args:
rasters (list): Input rasters (from app.get_rasters())
Returns:
interfaces (list(interfaces)): List of unique raster-to-raster-interfaces.
"""
raster_pairs = []
for current_raster in rasters:
for corresponding_raster in [r for r in rasters if r != current_raster]:
# If we have interface a_b, no need to produce b_a.
if (corresponding_raster, current_raster) not in raster_pairs:
raster_pairs.append((current_raster, corresponding_raster))
return [Interface(raster[0], raster[1]) for raster in raster_pairs]
def get_internal_domain(rasters):
""" Create an internal domain of signals
Loops through all raster<->raster communications and adds them to a domain object
Args:
rasters (list(Raster)): rasters to calculate communication for
Returns:
domain (Domain): signals belonging to the same domain
"""
internal = Domain()
internal.set_name("internal")
for interface in get_raster_to_raster_interfaces(rasters):
internal.add_interface(interface)
return internal
def get_active_signals(signals, feature_cfg):
""" Filter out inactive signals. """
LOGGER.debug('Filtering %s', signals)
return [signal for signal in signals if feature_cfg.check_if_active_in_config(signal.properties['configs'])]
class Application(BaseApplication):
""" Object for holding information about a pybuild project """
def __init__(self):
self.name = None
self.pybuild = {'build_cfg': None,
'feature_cfg': None,
'unit_vars': {}}
self._insignals = None
self._outsignals = None
self._signals = None
self._raster_definitions = []
self._services = None
self._methods = []
self._enumerations = None
self._structs = None
def parse_definition(self, definition):
""" Parse ProjectCfg.json, get code switch values and read config.json files.
Add the information to the object.
Args:
definition (Path): Path to ProjectCfg.json
"""
self.pybuild['build_cfg'] = BuildProjConfig(str(definition))
self.name = self.pybuild['build_cfg'].name
self.pybuild['feature_cfg'] = FeatureConfigs(self.pybuild['build_cfg'])
unit_cfg = UnitConfigs(self.pybuild['build_cfg'], self.pybuild['feature_cfg'])
self.pybuild['unit_vars'] = unit_cfg.get_per_unit_cfg()
self.pybuild['user_defined_types'] = UserDefinedTypes(self.pybuild['build_cfg'], unit_cfg)
def get_domain_names(self):
""" Get domain names. """
return self.pybuild['build_cfg'].device_domains.values()
def get_domain_mapping(self):
""" Get device to signal domain mapping. """
return self.pybuild['build_cfg'].device_domains
def get_methods(self):
""" Get csp methods. """
if self._signals is None:
self._get_signals()
return self._methods
@property
def enumerations(self):
""" Get enumerations defined in the project. """
if self._enumerations is None:
self._enumerations = self.pybuild['user_defined_types'].get_enumerations()
return self._enumerations
@property
def structs(self):
""" Get structs defined in the project. """
if self._structs is None:
self._structs = self.pybuild['user_defined_types'].get_structs()
return self._structs
@property
def services(self):
""" Get interface to service mapping. """
if self._services is None:
services_file = self.get_services_file()
self._services = self.pybuild['build_cfg'].get_services(services_file)
return self._services
def get_service_mapping(self):
""" Get interface to service mapping. """
return self.services
def get_services_file(self):
""" Get path to file specifying interface to service mapping. """
return self.pybuild['build_cfg'].services_file
def get_name(self, definition):
""" Parse ProjectCfg.json and return the specified project name """
if self.name is None:
return BuildProjConfig(str(definition)).name
return self.name
def _get_signals(self):
""" Calculate parse all inport and outports of all models """
self._insignals = set()
self._outsignals = set()
defined_ports = {'inports': set(), 'outports': set()}
for unit, data in self.pybuild['unit_vars'].items():
self.parse_ports(data, defined_ports, self.pybuild['feature_cfg'], unit)
self.parse_csp_methods(data, self.pybuild['feature_cfg'], unit)
def parse_ports(self, port_data, defined_ports, feature_cfg, unit):
""" Parse ports for one model, based on code switch values.
Modifies the defined_ports dict and the object.
Args:
port_data (dict): port data for a model/unit
defined_ports (dict): all known inport and outport names
feature_cfg (FeatureConfigs): pybuild parsed object for code switches
unit (string): Name of model/unit
"""
if self._signals is None:
self._signals = {}
for port_type, outport in {'outports': True, 'inports': False}.items():
for port_name, data in port_data.get(port_type, {}).items():
# Get what signals we are dealing with
if not feature_cfg.check_if_active_in_config(data['configs']):
continue
if port_name not in self._signals:
signal = Signal(port_name, self)
self._signals.update({port_name: signal})
else:
signal = self._signals[port_name]
# Add information about which models are involved while we are reading it
if outport:
try:
signal.set_producer(unit)
except MultipleProducersError as mpe:
LOGGER.debug(mpe.message)
signal.force_producer(unit)
self._outsignals.add(port_name)
else:
signal.consumers = unit
self._insignals.add(port_name)
defined_ports[port_type].add(port_name)
def parse_csp_methods(self, port_data, feature_cfg, unit):
""" Parse csp methods.
Args:
port_data (dict): port data for a model/unit.
feature_cfg (FeatureConfigs): pybuild parsed object for code switches
unit (string): Name of model/unit
"""
if self._signals is None:
self._signals = {}
methods = port_data.get('csp', {}).get('methods', {})
for method_name, data in methods.items():
if feature_cfg.check_if_active_in_config(data['configs']):
method = Method(self, unit)
method.parse_definition((method_name, data))
self._methods.append(method)
def get_signal_properties(self, signal):
""" Get properties for the signal from pybuild definition.
Args:
signal (Signal): Signal object
Returns:
properties (dict): Properties of the signal in pybuild
"""
# Hack: Take the first consumer or producer if any exists
for producer in signal.producer:
return self.pybuild['unit_vars'][producer]['outports'][signal.name]
for consumer in signal.consumers:
return self.pybuild['unit_vars'][consumer]['inports'][signal.name]
return {}
def get_rasters(self):
""" Get rasters parsed from pybuild.
Returns:
rasters (list): rasters parsed from pybuild
"""
if self._signals is None:
self._get_signals()
raster_definition = self.pybuild['build_cfg'].get_units_raster_cfg()
rasters = []
for raster_field, raster_content in raster_definition.items():
if raster_field in ['SampleTimes']:
continue
for name, content in raster_content.items():
if name in ['NoSched']:
continue
raster = Raster(self)
raster.parse_definition((name, content, self._signals))
rasters.append(raster)
return rasters
def get_models(self):
""" Get models and parse their config files.
Returns:
models (list(Model)): config.jsons parsed
"""
rasters = self.get_rasters()
# Since one model can exist in many rasters, find all unique model names first.
cfg_dirs = self.pybuild['build_cfg'].get_unit_cfg_dirs()
model_names = set()
for raster in rasters:
model_names = model_names.union(raster.models)
models = []
for model_name in model_names:
if model_name not in cfg_dirs:
LOGGER.debug("%s is generated code. It does not have a config.", model_name)
continue
model = Model(self)
cfg_dir = cfg_dirs[model_name]
config = Path(cfg_dir, f'config_{model_name}.json')
model.parse_definition((model_name, config))
models.append(model)
return models
def get_translation_files(self):
""" Find all yaml files in translation file dirs.
Returns:
translation_files (list(Path)): translation files
"""
translation_files = []
cfg_dirs = self.pybuild['build_cfg'].get_translation_files_dirs()
for cfg_dir in cfg_dirs.values():
cfg_path = Path(cfg_dir)
translation_files.extend(cfg_path.glob('*.yaml'))
translation_files = list(set(translation_files))
return translation_files
class Raster(BaseApplication):
""" Object for holding information about a raster """
def __init__(self, app):
"""Construct a new Raster object.
Args:
app (pybuild.interface.application.Application): Pybuild project raster is part of
"""
self.app = app
self.name = str()
self._insignals = None
self._outsignals = None
self._available_signals = None
self.models = set()
def parse_definition(self, definition):
""" Parse the definition from pybuild.
Args:
definition (tuple):
name (string): Name of the raster
content (list): Models in the raster
app_signals (dict): All signals in all rasters
"""
self.name = definition[0]
self.models = set(definition[1])
self._available_signals = definition[2]
def _get_signals(self):
""" Add signals from the project to the raster if they are used here
Modifies the object itself.
"""
self._insignals = set()
self._outsignals = set()
self._signals = {}
if self._available_signals is None:
return
for signal in self._available_signals.values():
for consumer in signal.consumers:
if consumer in self.models:
self._signals.update({signal.name: signal})
self._insignals.add(signal.name)
if isinstance(signal.producer, set):
for producer in signal.producer:
if producer in self.models:
self._signals.update({signal.name: signal})
self._outsignals.add(signal.name)
else:
if signal.producer in self.models:
self._signals.update({signal.name: signal})
self._outsignals.add(signal.name)
def get_signal_properties(self, signal):
""" Get properties for the signal from pybuild definition.
Args:
signal (Signal): Signal object
Returns:
properties (dict): Properties of the signal in pybuild
"""
for producer in signal.producer:
if producer in self.app.pybuild['unit_vars']:
return self.app.get_signal_properties(signal)
return {}
class Model(BaseApplication):
""" Object for holding information about a model """
def __init__(self, app):
self.app = app
self.name = str()
self.config = None
self._insignals = None
self._outsignals = None
self._signal_specs = None
def get_signal_properties(self, signal):
""" Get properties for the signal from pybuild definition.
Args:
signal (Signal): Signal object
Returns:
properties (dict): Properties of the signal in pybuild
"""
if self._signal_specs is None:
self._get_signals()
if signal.name in self._signal_specs:
return self._signal_specs[signal.name]
return {}
def _get_signals(self):
""" Add signals from the project to the model if they are used here
Modifies the object itself.
Entrypoint for finding signals from the base class.
"""
self._insignals = set()
self._outsignals = set()
self._signals = {}
self._signal_specs = {}
self._parse_unit_config(self.config)
def _parse_unit_config(self, path):
""" Parse a unit config file.
Broken out of get_signals to be recursive for included configs.
"""
cfg = self._load_json(path)
for signal_spec in cfg['inports'].values():
signal = Signal(signal_spec['name'], self)
self._insignals.add(signal.name)
self._signals.update({signal.name: signal})
self._signal_specs[signal.name] = signal_spec
for signal_spec in cfg['outports'].values():
signal = Signal(signal_spec['name'], self)
self._outsignals.add(signal.name)
self._signals.update({signal.name: signal})
self._signal_specs[signal.name] = signal_spec
for include_cfg in cfg.get('includes', []):
LOGGER.debug('%s includes %s in %s', self.name, include_cfg, path.parent)
include_path = Path(path.parent, f'config_{include_cfg}.json')
self._parse_unit_config(include_path)
@staticmethod
def _load_json(path):
""" Small function that opens and loads a json file.
Exists to be mocked in unit tests.
"""
with open(path, encoding="utf-8") as fhndl:
return json.load(fhndl)
def parse_definition(self, definition):
""" Parse the definition from pybuild.
Args:
definition (tuple):
name (string): Name of the model
configuration (Path): Path to config file
"""
self.name = definition[0]
self.config = definition[1]
self._get_signals()
class Method(BaseApplication):
""" Object for holding information about a csp method call """
def __init__(self, app, unit):
"""Construct a new Method object.
Args:
app (pybuild.interface.application.Application): Pybuild project raster is part of.
unit (str): Model that the method is defined in.
"""
self.app = app
self.unit = unit
self.name = str()
self.namespace = str()
self.adapter = str()
self.description = None
self._signals = {}
self._insignals = set()
self._outsignals = set()
self._primitives = {}
self._properties = {}
def parse_definition(self, definition):
""" Parse the definition from pybuild.
Args:
definition (tuple):
name (string): Name of the model
configuration (dict): Configuration of method
"""
name = definition[0]
configuration = definition[1]
self.name = name
self.adapter = configuration['adapter']
self.namespace = configuration['namespace']
self._primitives[name] = configuration['primitive']
if 'description' in configuration:
self.description = configuration['description']
signals = configuration.get('ports', {})
outsignals = signals.get('out', {})
for signal_name, signal_data in outsignals.items():
signal = self._add_signal(signal_name)
signal.consumers = name
signal.set_producer(name)
self._primitives[signal_name] = signal_data['primitive']
self._properties[signal_name] = signal_data
self._outsignals.add(signal_name)
insignals = signals.get('in', {})
for signal_name, signal_data in insignals.items():
signal = self._add_signal(signal_name)
signal.consumers = name
signal.set_producer(name)
self._insignals.add(signal_name)
self._primitives[signal_name] = signal_data['primitive']
self._properties[signal_name] = signal_data
def _add_signal(self, signal_name):
""" Add a signal used by the method.
Args:
signal_name (str): Name of the signal
"""
if signal_name not in self._signals:
signal = Signal(signal_name, self)
self._signals.update({signal_name: signal})
else:
signal = self._signals[signal_name]
return signal
def get_signal_properties(self, signal):
""" Get properties for the signal from csp method configuration.
Args:
signal (Signal): Signal object
Returns:
properties (dict): Properties of the signal in pybuild
"""
return self._properties[signal.name]
def get_primitive(self, primitive_name):
""" Get primitive.
Args:
primitive_name (str): Name of primitive part
Returns:
primitive (str): Primitive
"""
return self._primitives[primitive_name]

500
pybuild/interface/base.py Normal file
View File

@ -0,0 +1,500 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Python module used for abstracting an application that should be interfacing others."""
from abc import abstractmethod
from ruamel.yaml import YAML
from pybuild.lib import logger
LOGGER = logger.create_logger('base')
def filter_signals(signals, domain):
""" Take a list of signals and remove all domains belonging to a domain
If the signal is part of the domain, it is not part of the resulting list
Arguments:
signals (list(Signal)): signals to filter
domain (Domain): domain that the signals should not be part of
"""
filtered_signals = []
for signal in signals:
if signal.name not in domain.signals:
filtered_signals.append(signal)
return filtered_signals
class MultipleProducersError(Exception):
"""Error when setting a producer and there already exists one"""
def __init__(self, signal, old_producer, new_producer):
"""Set error message
Args:
signal (Signal): Signal object
old_producer (BaseApplication): Producer already registered
new_producer (BaseApplication): Producer attempted to be registered
"""
super().__init__()
self.message = (f"{signal.name}:"
f" Attempting to set producer {new_producer}"
f" when {old_producer} is already set")
class BaseApplication:
"""Base application to build other adapters on"""
name = str()
_signals = None
node = str() # Used to calculate interface
read_strategies = {
"Always",
"OnChanged",
"OnUpdated"
}
def __repr__(self):
"""String representation for logging and debugging
Returns:
repr (string): Name of the application, the number of insignals and outsignals
"""
return f"<{self.name}" \
f" insignals:{len(self.insignals)}" \
f" outsignals:{len(self.outsignals)}>"
def parse_signals(self):
"""API interface to read all signals in any child object"""
self._get_signals()
@property
def insignals(self):
""" Insignals to the raster.
Calculated as all read ports - all written ports
Returns:
signals (list): List of Signal objects.
"""
if self._insignals is None:
self._get_signals()
return [self._signals[port] for port in self._insignals - self._outsignals]
@property
def outsignals(self):
""" All outports.
Since we might consume some of the signals that should also be sent elsewhere,
we do not remove internally consumed signals.
Returns:
signals (list): List of Signal objects.
"""
if self._outsignals is None:
self._get_signals()
return [self._signals[port] for port in self._outsignals]
@property
def signals(self):
"""API interface property in any child object
All cached signals.
If no cache exists, reads all signals and save to cache.
Returns:
signals (list): Signal objects
"""
if self._signals is None:
self.parse_signals()
return self._signals.values()
@abstractmethod
def _get_signals(self):
"""Stub to implement in child object"""
@abstractmethod
def get_signal_properties(self, signal):
"""Stub to implement in child object
Ideally, this should be moved to the signal.
Currently, getting the properties depends on how we read and define the signals.
"""
@abstractmethod
def parse_definition(self, definition):
"""Stub for parsing a defintion after the object has been initialized.
Raises NotImplementedError if called without being implemented.
Args:
definition: Definition of the Application. Type depends on the application.
"""
raise NotImplementedError('This is a stub')
class Signal:
"""Signal object
The signal should behave the same way independently of where we define it.
"""
def __repr__(self):
"""String representation for logging and debugging
Returns:
repr (string): Signal name, its applications, producer and consumers
"""
return (f"<{self.name} in {self.applications}"
f" producer:{self.producer}"
f" consumers:{self.consumers}>")
def __init__(self, name, application):
"""Define base properties of the signal object
The application object is used to read properties of a signal.
TODO: Do this when we define the signal and add properties known in other
systems when we encounter them.
Args:
name (string): Signal name
application (BaseApplication): Application defining the signal
"""
self.name = name
self.applications = {} # Add applications to a dict to prevent duplicates
if application is not None:
self.applications[application.name] = application
self._consumers = set()
self._producer = None
def add_application(self, application):
"""Add an application to find properties from
Args:
application (BaseApplication): Application to read properties from
"""
if application.name in self.applications:
return
self.applications[application.name] = application
@property
def consumers(self):
"""Get all consumers of a signal
Returns:
consumers (set): All consumers of a signal
"""
if isinstance(self._consumers, set):
return self._consumers
return set()
@consumers.setter
def consumers(self, consumers):
"""Set consumers of a signal
If consumers is a list or set, iterate over each consumer
Otherwise, add the consumer to the set of consumers
Args:
consumers (list/set/string): consumer(s) of a signal
"""
if isinstance(consumers, (list, set)):
for consumer in consumers:
self._consumers.add(consumer)
else:
self._consumers.add(consumers)
@property
def producer(self):
"""Get the producer of a signal
Since we have some strange signals with multiple producers,
such as counters for dep, this returns a set.
Returns:
producer (set): Producer(s) of a signal
"""
if isinstance(self._producer, set):
return self._producer
return set()
@producer.setter
def producer(self, producer):
"""Set producer of a signal
Args:
producer (string/set): Name of the producer
"""
if isinstance(producer, set):
self._producer = producer
else:
self._producer = {producer}
def set_producer(self, producer):
"""Set producer of a signal
If there already is a registered producer of the signal,
raise MultipleProducersError
This can be expected and force_producer can be called to override this.
That must be explicit in each instance.
Args:
producer (string): Name of the producer
"""
if isinstance(producer, set):
if self._producer is not None and producer - self._producer:
raise MultipleProducersError(self, self._producer, producer)
self.producer = producer
else:
if self._producer is not None \
and isinstance(producer, str) \
and producer not in self._producer:
raise MultipleProducersError(self, self._producer, producer)
self.producer = {producer}
def force_producer(self, producer):
"""Forcefully update add producers of a signal
This is needed since we have some signals that are written by multiple model
Args:
producers (string): Producer of a signal
application (BaseApplication): Application defining the signal. Optional
"""
self._producer.add(producer)
@property
def properties(self):
"""Properties of a signal
Currently not homogenized.
Therefore we read the properties from the application that defined the signal.
Returns:
properties (dict): properties of a signal
"""
properties = {}
for application in self.applications.values():
LOGGER.debug('Getting properties for %s from %s', self.name, application.name)
application_properties = application.get_signal_properties(self)
LOGGER.debug(application_properties)
for key, value in application_properties.items():
LOGGER.debug('Looking at %s: %s', key, value)
if key in properties and value != properties[key]:
LOGGER.debug('Signal %s already has %s with value %s, ignoring %s from %s',
self.name, key, properties[key], value, application.name)
continue
properties[key] = value
return properties
class Interface:
"""Interface between two objects"""
def __repr__(self):
"""String representation for logging and debugging
Returns:
repr (string): Name of the interface, and the number of signals in each direction
"""
return (f"<{self.name}"
f" a->b:{len(self.get_directional_signals(self.current, self.corresponding))}"
f" b->a:{len(self.get_directional_signals(self.corresponding, self.current))}>")
def debug(self):
"""Debug an interface object to stdout"""
LOGGER.info('name: %s', self.name)
for signal in self.get_directional_signals(self.current, self.corresponding):
LOGGER.info('insignal: %s', signal)
for signal in self.get_directional_signals(self.corresponding, self.current):
LOGGER.info('outsignal: %s', signal)
def __init__(self, current, corresponding):
"""Create the interface object
Args:
current (BaseApplication): Primary object of an interface
corresponding (BaseApplication): Secondary object of an interface
"""
self.name = current.name + '_' + corresponding.name
self.current = current
self.corresponding = corresponding
@staticmethod
def get_directional_signals(producer, consumer):
"""Get signals going from producer to consumer
Args:
producer (BaseApplication): producer of the signals
consumer (BaseApplication): consumer of the signals
Returns:
signals (list): Signals sent from producer and received in consumer
"""
outsignals = {signal.name: signal for signal in producer.outsignals}
signals = []
for signal in consumer.insignals:
if signal.name in outsignals:
signal.set_producer(outsignals[signal.name].producer)
signal.add_application(producer)
signal.add_application(consumer)
signals.append(signal)
return signals
def get_produced_signals(self, producer_name):
"""Get signals going from producer to consumer
This function can be used if you are lacking some objects
Args:
producer_name (string): name of the producer of the signals
Returns:
signals (list): Signals sent from producer and received in consumer
"""
if producer_name == self.current.name:
consumer = self.corresponding
producer = self.current
elif producer_name == self.corresponding.name:
consumer = self.current
producer = self.corresponding
else:
LOGGER.error('%s not in [%s, %s]',
producer_name,
self.current.name,
self.corresponding.name)
return []
return self.get_directional_signals(producer, consumer)
def get_consumed_signals(self, consumer_name):
"""Get signals going from producer to consumer
This function can be used if you are lacking some objects
Args:
consumer_name (string): name of the consumer of the signals
Returns:
signals (list): Signals sent from producer and received in consumer
"""
if consumer_name == self.current.name:
consumer = self.current
producer = self.corresponding
elif consumer_name == self.corresponding.name:
consumer = self.corresponding
producer = self.current
else:
LOGGER.error('%s not in [%s, %s]',
consumer_name,
self.current.name,
self.corresponding.name)
return []
return self.get_directional_signals(producer, consumer)
class Domain:
"""Domain with interacting interfaces"""
def __repr__(self):
"""String representation for logging and debugging
Returns:
repr (string): Name of the domain, and all clients for that domain
"""
return f"<{self.name}: {self.clients}>"
def __init__(self):
"""Initialize the object"""
self.name = ''
self.signals = {}
self.clients = set()
self._clients = {}
def set_name(self, name):
"""Set the name of the domain
Args:
name (string): Name of the domain
"""
self.name = name
def add_interface(self, interface):
"""Add an interface to a domain
Args:
interface (Interface): Interface object
"""
self._process_interface(interface)
def _process_interface(self, interface):
"""Process interface to add signals to the domain
Args:
interface (Interface): Interface object
"""
for signal in interface.get_directional_signals(interface.current, interface.corresponding):
self._process_signal(signal.name, interface.current, interface.corresponding)
for signal in interface.get_directional_signals(interface.corresponding, interface.current):
self._process_signal(signal.name, interface.corresponding, interface.current)
def _process_signal(self, signal_name, producer, consumer):
"""Process signal to add to the domain
Args:
signal_name (string): Name of the signal
producer (BaseApplication): Producer application of the signal
consumer (BaseApplication): Consumer application of the signal
"""
if signal_name not in self.signals:
self.signals[signal_name] = Signal(signal_name, producer)
self.signals[signal_name].consumers = consumer.name
self.signals[signal_name].add_application(producer)
self.signals[signal_name].add_application(consumer)
signal = self.signals[signal_name]
if producer.name not in self.clients:
self.clients.add(producer.name)
self._clients[producer.name] = {'producer': [], 'consumer': []}
self._clients[producer.name]['producer'].append(signal)
if consumer.name not in self.clients:
self.clients.add(consumer.name)
self._clients[consumer.name] = {'producer': [], 'consumer': []}
self._clients[consumer.name]['consumer'].append(signal)
signal.consumers = consumer.name
try:
signal.producer = producer.name
except MultipleProducersError as mpe:
LOGGER.debug(mpe.message)
def create_groups(self):
"""Create groups of signals going from each producer
Returns:
signal_groups (dict): Signal groups
"""
signal_groups = {}
for signal in self.signals.values():
# Producer is always a set, to handle pass-through signals
for producer in signal.producer - set(signal_groups.keys()):
signal_groups[producer] = []
for producer in signal.producer:
signal_groups[producer].append(signal)
return signal_groups
def create_selective_groups(self, a_names, b_names):
"""Create groups for the a_list communicating with the b_names
Returns:
signal_groups (dict): Signal groups
"""
signal_groups = {name: {'consumer': [], 'producer': []} for name in a_names}
for signal in self.signals.values():
for producer in set(signal.producer):
if producer in a_names and signal.consumers & b_names:
signal_groups[producer]['producer'].append(signal)
for consumer in signal.consumers:
if consumer in a_names and set(signal.producer) & b_names:
signal_groups[consumer]['consumer'].append(signal)
return signal_groups
@staticmethod
def to_yaml(spec, output):
"""Writes spec to yaml file
Args:
spec (dict): data for the yaml
output (Path): file to write to
"""
with open(output, 'w', encoding="utf-8") as yaml_file:
yaml = YAML()
yaml.dump(spec, yaml_file)

View File

@ -0,0 +1,490 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Module for CSP API abstraction."""
import enum
from ruamel.yaml import YAML
from abc import abstractmethod
from pybuild.interface.base import BaseApplication, Signal
from pybuild.lib import logger
LOGGER = logger.create_logger("service")
class MissingApi(Exception):
"""Exception to raise when api is missing"""
def __init__(self, api, map_file):
self.message = f"Api {api} missing from {map_file}"
class BadYamlFormat(Exception):
"""Exception to raise when in/out signal is not defined."""
def __init__(self, api, signal_name):
self.message = f"Signal {signal_name} for {api} should be set as insignal or outsignal."
class CspApi(BaseApplication):
"""Abstraction for HAL and SFW"""
position = enum.Enum(
'Position',
names=[
"property_name",
"property_type",
"variable_type",
"offset",
"factor",
"default",
"length",
"min",
"max",
"enum",
"init",
"description",
"unit",
"endpoint",
"api",
"variant",
"strategy",
"debug",
"dependability"
]
)
def __init__(self, base_application, read_strategy='Always'):
"""Create the interface object
Args:
base_application (BaseApplication): Primary object of an interface
Usually a raster, but can be an application or a model too.
read_strategy (str): Default signal read strategy. Defaults to 'Always'.
"""
self.name = ""
self.translations = {}
self.api_side = False
# The api side only matters when generating models for csp.
self.filter = None
self.signal_names = {
"api": {"insignals": set(), "outsignals": set()},
"app": {"insignals": set(), "outsignals": set()},
}
self.base_application = base_application
self.map_file = self.get_map_file()
self.api_map = self.get_map()
self.translations_files = []
self.signal_primitives_list = []
self.default_read_strategy = read_strategy
def get_signal_properties(self, signal):
"""Get signal properties for signal
Calls self.base_application to get signal properties
Args:
signal (Signal): Signal to get properties for
"""
self.base_application.get_signal_properties(signal)
def _get_signals(self):
"""Read signals"""
self.parse_definition(self.translations_files)
@property
def insignals(self):
return self.get_signals(self.hal_name, 'insignals')
@property
def outsignals(self):
return self.get_signals(self.hal_name, 'outsignals')
def get_signals(self, api, signal_type='insignals'):
"""Get signals to and from an api abstraction
self.api_side configures if we look at the api side.
If it is set to False, we look at the application side.
Args:
api (str): Name of the api
signal_type (str): insignals or outsignals
Returns:
signals (list): Signals in the interface
"""
if self.api_side:
signal_names = self.signal_names['api'][signal_type]
else:
signal_names = self.signal_names['app'][signal_type]
signals = []
for signal_name, specs in self.translations.items():
for spec in specs:
signal = None
if api is not None:
if spec['api'] == api:
if self.api_side:
if spec['property'] in signal_names:
signal = Signal(spec['property'], self)
else:
if signal_name in signal_names:
signal = Signal(signal_name, self)
else:
if self.api_side:
if spec['property'] in signal_names:
signal = Signal(spec['property'], self)
else:
if signal_name in signal_names:
signal = Signal(signal_name, self)
if signal is not None:
signals.append(signal)
return signals
def clear_signal_names(self):
"""Clear signal names
Clears defined signal names (but not signal properties).
"""
self.signal_names = {
"api": {"insignals": set(), "outsignals": set()},
"app": {"insignals": set(), "outsignals": set()},
}
def parse_definition(self, definition):
"""Parses all definition files
Args:
definition (list(Path)): Definition files
"""
for translation in definition:
raw = self.read_translation(translation)
self.extract_endpoint_definitions(raw)
@staticmethod
def get_api_name(api_name):
"""Return the api name
Args:
api_name (str): Name of the api
Returns:
(str): Name of the api
"""
return api_name
def verify_api(self, api_name):
"""Verify that the api is in the map
Args:
api_name (str): Name of the api
"""
if api_name not in self.api_map:
raise MissingApi(api_name, self.map_file)
def add_signals(self, signals, signal_type='insignals', properties=None):
"""Add signal names and properties to already set ones
Args:
signals (list(Signals)): Signals to use
signal_type (str): 'insignals' or 'outsignals'
properties (list(str)): signal definition properties, default = []
"""
opposite = {'insignals': 'outsignals', 'outsignals': 'insignals'}
api_type = opposite[signal_type]
properties = [] if properties is None else properties
for signal in signals:
LOGGER.debug("Adding signal: %s", signal)
temp_set = set()
for translation in self.translations.get(signal.name, []):
temp_list = list(translation)
api_name = translation[self.position.api.value]
variant_name = translation[self.position.variant.value]
endpoint = translation[self.position.endpoint.value]
api_signal = translation[self.position.property_name.value]
self.check_signal_property(api_name, variant_name, endpoint,
api_signal, signal_type)
self.signal_names['api'][api_type].add(api_signal)
for enum_property in properties:
LOGGER.debug("Modifying property: %s", enum_property)
value = signal.properties[enum_property["source"]]
if value == "-":
value = enum_property["default"]
temp_list[
self.position[enum_property["destination"]].value
] = value
temp_set.add(tuple(temp_list))
self.translations[signal.name] = temp_set
self.signal_names['app'][signal_type].add(signal.name)
self.check_endpoints()
LOGGER.debug('Registered signal names: %s', self.signal_names)
def check_signal_property(self, api, variant, endpoint, property_name, signal_type):
"""Check if we have only one signal written for the same property.
Args:
api (str): interface name
variant (str): variant value. "properties" or "methods" for service
"hals" for hal
endpoint (str): signal endpoint
property_name (str): signal property
signal_type (str): 'insignals' or 'outsignals'
"""
primitive_value = ""
for value in [api, variant, endpoint, property_name]:
if value:
if primitive_value == "":
primitive_value = value
else:
primitive_value = primitive_value + '.' + value
if primitive_value == "":
raise Exception("The primitive does not contain any value!")
directional_primitive = f"{primitive_value}.{signal_type}"
self.check_property(directional_primitive, signal_type)
def check_property(self, property_spec, signal_type):
"""Check if we have only one signal written for the same property.
Args:
property_spec (str): property specification
signal_type (str): 'insignals' or 'outsignals'
"""
if property_spec in self.signal_primitives_list:
error_msg = (f"You can't write {property_spec} as "
f"{signal_type} since this primitive has been used."
" Run model_yaml_verification to identify exact models.")
raise Exception(error_msg)
self.signal_primitives_list.append(property_spec)
@abstractmethod
def check_endpoints(self):
"""Should be implemented by subclasses."""
@staticmethod
def read_translation(translation_file):
"""Read specification of the format:
service:
interface:
properties:
- endpoint_name:
- signal: name
property: name
- signal: name
property: name
hal:
hal_name:
- primitive_endpoint:
- insignal: name
hal_name:
- struct_endpoint:
- insignal: name1
property: member1
- insignal: name2
property: member2
ecm:
- signal: name
signals:
tvrl:
- signal: name
property: can_name
offset: offset
factor: scaling
Args:
translation_file (Path): file with specs
Returns:
yaml_data (dict): Loaded YAML data as dict, empty if not found
"""
if not translation_file.is_file():
LOGGER.warning("No file found for %s", translation_file)
return {}
with open(translation_file, encoding="utf-8") as translation:
yaml = YAML(typ='safe', pure=True)
raw = yaml.load(translation)
return raw
def parse_api_definitions(self, api_definitions):
"""Parses group definitions.
Args:
api_definitions (dict): endpoints in parsed yaml file.
"""
for api_from_spec, definition in api_definitions.items():
for variant, variant_endpoints in self.extract_definition(definition).items():
for endpoints in variant_endpoints:
for endpoint, signals in endpoints.items():
self.parse_property_definitions({api_from_spec: signals}, endpoint, variant)
def parse_property_definitions(self, endpoint_definitions, endpoint, variant):
"""Parse signal definitions.
Args:
endpoint_definitions (dict): parsed yaml file.
endpoint (str): Name of the endpoint to use
variant (str): Name of the variant; "properties" or "methods" for service,
"hals" for hal
"""
def _get_property_name(specification):
""" Handle cases when there are no propery name.
If there is no property name, the "group" is set to the signal.
This should be used when the property is not a struct in the interface api.
Args:
specification (dict): signal specification
Returns:
property_name (str): name of the potential internal property
"""
property_name = specification.get('property', '')
if not property_name:
return None
return property_name
enumerations = self.base_application.enumerations
for api, specifications in endpoint_definitions.items():
self.verify_api(api)
for specification in specifications:
in_out_signal = [key for key in specification.keys() if 'signal' in key]
base_signal = None
signal_name = None
if "in" in in_out_signal[0]:
for signal in self.base_application.insignals:
if signal.name == specification["insignal"]:
base_signal = signal
signal_name = signal.name
elif "out" in in_out_signal[0]:
for signal in self.base_application.outsignals:
if signal.name == specification["outsignal"]:
base_signal = signal
signal_name = signal.name
else:
raise BadYamlFormat(api, specification[in_out_signal[0]])
if base_signal is None:
continue
base_properties = self.base_application.get_signal_properties(
base_signal
)
if base_properties["type"] in enumerations:
underlying_data_type = enumerations[base_properties['type']]['underlying_data_type']
interface_type = underlying_data_type
if 'init' not in specification:
if enumerations[base_properties['type']]['default_value'] is not None:
init_value = enumerations[base_properties['type']]['default_value']
else:
LOGGER.warning('Initializing enumeration %s to "zero".', base_properties['type'])
init_value = [
k for k, v in enumerations[base_properties['type']]['members'].items() if v == 0
][0]
else:
init_value = specification.get("init", 0)
else:
interface_type = base_properties["type"]
init_value = specification.get("init", 0)
if "out" in in_out_signal[0] and "strategy" in specification:
LOGGER.warning('Cannot set read strategy for outsignal %s, using "Always".', signal_name)
strategy = "Always"
else:
strategy = specification.get("strategy", self.default_read_strategy)
if strategy not in self.read_strategies:
LOGGER.warning('Invalid strategy %s, using "Always" instead.', strategy)
strategy = self.default_read_strategy
if signal_name not in self.translations:
self.translations[signal_name] = set()
self.translations[signal_name].add(
(
"enum_0", # Enum starts at position 1.
_get_property_name(specification),
interface_type,
specification.get("type"),
specification.get("offset"),
specification.get("factor"),
specification.get("default"),
specification.get("length"),
specification.get("min"),
specification.get("max"),
specification.get("enum"),
init_value,
specification.get("description"),
specification.get("unit"),
endpoint,
api,
variant,
strategy,
specification.get("debug", False),
specification.get("dependability", False)
)
)
def spec_to_dict(self, signal_spec, signal_name):
"""Convert signal specification to dict
Args:
signal_spec (dict): signal specification
signal_name (str): signal name
Returns:
(dict): signal specification as dict
"""
return {
'variable': signal_name,
'variable_type': signal_spec[self.position.variable_type.value],
'property': signal_spec[self.position.property_name.value],
'property_type': signal_spec[self.position.property_type.value],
"default": signal_spec[self.position.default.value],
"length": signal_spec[self.position.length.value],
'offset': signal_spec[self.position.offset.value],
'factor': signal_spec[self.position.factor.value],
'range': {
'min': signal_spec[self.position.min.value],
'max': signal_spec[self.position.max.value]
},
'init': signal_spec[self.position.init.value],
'description': signal_spec[self.position.description.value],
'unit': signal_spec[self.position.unit.value],
'endpoint': signal_spec[self.position.endpoint.value],
'api': self.get_api_name(signal_spec[self.position.api.value]),
'variant': signal_spec[self.position.variant.value],
'strategy': signal_spec[self.position.strategy.value],
'debug': signal_spec[self.position.debug.value],
'dependability': signal_spec[self.position.dependability.value]
}
def to_dict(self, client="app"):
"""Method to generate dict to be saved as yaml
Returns:
spec (dict): Signalling specification
"""
spec = {"consumer": [], "producer": []}
direction = {
"app": ["consumer", "producer"],
"api": ["producer", "consumer"]}
for signal_name, signal_spec in self._generator(self.signal_names["app"]["insignals"]):
spec[direction[client][0]].append(
self.spec_to_dict(signal_spec, signal_name)
)
for signal_name, signal_spec in self._generator(self.signal_names["app"]["outsignals"]):
spec[direction[client][1]].append(
self.spec_to_dict(signal_spec, signal_name)
)
return spec
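# Illustrative note: the direction mapping flips producer/consumer depending
# on which side the spec is generated for. For client="app", insignals are
# listed under "consumer"; for client="api", the same insignals are listed
# under "producer" (the api produces what the app consumes).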
def _generator(self, signal_names, unique_names=False):
"""Iterate over signals for allowed services
If unique_names is True, the iterator does not yield the same signal twice
if unique_names is False, it yields each allowed signal spec with the signal name
Args:
signal_names (list): allowed signals
Yields:
name (str): Name of the signal
specification (dict): signal specification for allowed service
"""
for signal_name, specifications in (
(name, spec) for name, spec in sorted(
self.translations.items())
if name in signal_names):
for specification in specifications:
if unique_names:
yield signal_name, specification
break
yield signal_name, specification


@ -0,0 +1,677 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Python module used for reading device proxy arxml:s"""
from ruamel.yaml import YAML
import enum
from pybuild.interface.base import BaseApplication, Signal
from pybuild.lib import logger
LOGGER = logger.create_logger("device_proxy")
class MissingDevice(Exception):
"""Exception to raise when device is missing"""
def __init__(self, dp):
self.message = f"Device proxy {dp} missing from deviceDomains.json"
class BadYamlFormat(Exception):
"""Exception to raise when in/out signal is not defined."""
def __init__(self, message):
self.message = message
class DPAL(BaseApplication):
"""Device Proxy abstraction layer"""
dp_position = enum.Enum(
"Position",
names=[
"domain",
"property",
"variable_type",
"property_interface_type",
"property_manifest_type",
"offset",
"factor",
"default",
"length",
"min",
"max",
"enum",
"init",
"description",
"unit",
"group",
"strategy",
"debug",
"dependability",
"port_name"
],
)
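# Note: enum.Enum assigns values starting at 1 when created with names=, so
# the translation tuples built by this module carry an "enum_0" placeholder
# at index 0 and are read via dp_position, e.g. for a hypothetical tuple spec:
#
#     spec[DPAL.dp_position.domain.value]    # -> domain (index 1)
#     spec[DPAL.dp_position.property.value]  # -> property name (index 2)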
def __repr__(self):
"""String representation of DPAL"""
return (
f"<DPAL {self.name}"
f" app_side insignals: {len(self.signal_names['other']['insignals'])}"
f" app_side outsignals: {len(self.signal_names['other']['outsignals'])}>"
)
def __init__(self, base_application):
"""Create the interface object
Args:
base_application (BaseApplication): Primary object of an interface
Usually a raster, but can be an application or a model too.
"""
self.name = ""
self.dp_translations = {}
# We do not care about domain when looking from a project perspective,
# we only care when generating manifests for csp.
self.domain_filter = None
self.signal_names = {
"dp": {"insignals": set(), "outsignals": set()},
"other": {"insignals": set(), "outsignals": set()},
}
self.e2e_sts_signals = set()
self.base_application = base_application
self.translations_files = []
self.device_domain = base_application.get_domain_mapping()
self.signal_primitives_list = []
def clear_signal_names(self):
"""Clear signal names
Clears defined signal names (but not signal properties).
"""
self.signal_names = {
"dp": {"insignals": set(), "outsignals": set()},
"other": {"insignals": set(), "outsignals": set()},
}
def add_signals(self, signals, signal_type="insignal", properties=[]):
"""Add signal names and properties
Args:
signals (list(Signals)): Signals to use
signal_type (str): 'insignals' or 'outsignals'
properties (list(str)): signal definition properties, default = None
"""
opposite = {"insignals": "outsignals", "outsignals": "insignals"}
dp_type = opposite[signal_type]
for signal in signals:
LOGGER.debug("Adding signal: %s", signal)
temp_set = set()
for translation in self.dp_translations.get(signal.name, []):
temp_list = list(translation)
domain = translation[self.dp_position.domain.value]
group = translation[self.dp_position.group.value]
dp_signal = translation[self.dp_position.property.value]
self.check_signal_property(domain, group, dp_signal, signal_type)
self.signal_names["dp"][dp_type].add(dp_signal)
for enum_property in properties:
LOGGER.debug("Modifying property: %s", enum_property)
value = signal.properties[enum_property["source"]]
if value == "-":
value = enum_property["default"]
temp_list[
self.dp_position[enum_property["destination"]].value
] = value
temp_set.add(tuple(temp_list))
self.dp_translations[signal.name] = temp_set
self.signal_names["other"][signal_type].add(signal.name)
for e2e_sts_signal_name in self.e2e_sts_signals:
if e2e_sts_signal_name not in self.signal_names["other"]["insignals"]:
LOGGER.warning("E2E check signal %s not used in any model.", e2e_sts_signal_name)
self.signal_names["other"][signal_type].add(e2e_sts_signal_name)
self.check_groups()
LOGGER.debug("Registered signal names: %s", self.signal_names)
def check_signal_property(self, domain, group, property_name, signal_type):
"""Check if we have only one signal written for the same property.
Args:
domain (str): signal domain
group (str): signal group
property_name (str): signal property
signal_type (str): 'insignals' or 'outsignals'
"""
primitive_value = ""
for value in [domain, group, property_name]:
if value:
if primitive_value == "":
primitive_value = value
else:
primitive_value = primitive_value + '.' + value
if primitive_value == "":
raise Exception("The primitive does not contain any value!")
directional_primitive = f"{primitive_value}.{signal_type}"
self.check_property(directional_primitive, signal_type)
def check_property(self, property_spec, signal_type):
"""Check if we have only one signal written for the same property.
Args:
property_spec (str): property specification
signal_type (str): 'insignals' or 'outsignals'
"""
if property_spec in self.signal_primitives_list:
error_msg = (f"You can't write {property_spec} as "
f"{signal_type} since this primitive has been used."
" Run model_yaml_verification to identify exact models.")
raise Exception(error_msg)
self.signal_primitives_list.append(property_spec)
def check_groups(self):
"""Check and crash if signal group contains both produces and consumes signals."""
groups = {}
for signal_name, signal_specs in self.dp_translations.items():
if signal_name in self.signal_names["other"]['insignals']:
consumed = True
elif signal_name in self.signal_names["other"]['outsignals']:
consumed = False
else:
continue
for signal_spec in signal_specs:
group = signal_spec[self.dp_position.group.value]
if group is None:
continue
domain = signal_spec[self.dp_position.domain.value]
key = (domain, group)
if key not in groups:
groups[key] = {"consumed": consumed,
"signals": set()}
groups[key]["signals"].add(signal_name)
assert consumed == groups[key]["consumed"], \
f"Signal group {group} for {domain} contains both consumed and produced signals"
@staticmethod
def read_translation(translation_file):
"""Read specification of the format:
service:
interface:
properties:
- endpoint_name:
- signal: name
property: name
- signal: name
property: name
hal:
hal_name:
- primitive_endpoint:
- insignal: name
hal_name:
- struct_endpoint:
- insignal: name1
property: member1
- insignal: name2
property: member2
ecm:
- signal: name
signals:
tvrl:
- signal: name
property: can_name
offset: offset
factor: scaling
Args:
translation_file (Path): file with specs
Returns:
yaml_data (dict): Loaded YAML data as dict, empty if not found
"""
if not translation_file.is_file():
LOGGER.warning("No file found for %s", translation_file)
return {}
with open(translation_file, encoding="utf-8") as translation:
yaml = YAML(typ='safe', pure=True)
raw = yaml.load(translation)
return raw
def parse_group_definitions(self, signal_groups):
"""Parse group definitions.
Args:
signal_groups (dict): parsed yaml file.
"""
for dp_name, group_definitions in signal_groups.items():
for group in group_definitions:
port_name = None
if 'portname' in group:
port_name = group.pop('portname')
for group_name, signals in group.items():
self.parse_signal_definitions({dp_name: signals}, group_name, port_name)
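# Illustrative sketch of a signal_groups entry as consumed above; "portname"
# sits on group level and is popped before the group's signals are parsed.
# Device proxy, port, group and signal names are hypothetical.
#
#     signal_groups:
#       SomeDeviceProxy:
#         - portname: SomePort
#           SomeGroup:
#             - insignal: sVcExample_D_One
#               property: MemberOne
#             - insignal: sVcExample_D_Two
#               property: MemberTwo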
def parse_signal_definitions(self, signals_definition, group=None, port_name=None):
"""Parse signal definitions.
Args:
signals_definition (dict): parsed yaml file.
group (str): Name of signal group, if signal belongs to a group.
port_name (str): Name of signal port, if there is one.
"""
enumerations = self.base_application.enumerations
for dp_name, dp_specification in signals_definition.items():
for specification in dp_specification:
in_out_signal = [key for key in specification.keys() if 'signal' in key]
base_signal = None
signal_name = None
if "in" in in_out_signal[0]:
for signal in self.base_application.insignals:
if signal.name == specification["insignal"]:
base_signal = signal
signal_name = signal.name
elif "out" in in_out_signal[0]:
for signal in self.base_application.outsignals:
if signal.name == specification["outsignal"]:
base_signal = signal
signal_name = signal.name
else:
raise BadYamlFormat(f"in/out signal for {dp_name} is missing.")
if base_signal is None:
continue
base_properties = self.base_application.get_signal_properties(
base_signal
)
if base_properties["type"] in enumerations:
underlying_data_type = enumerations[base_properties['type']]['underlying_data_type']
interface_type = underlying_data_type
manifest_type = underlying_data_type
if 'init' not in specification:
if enumerations[base_properties['type']]['default_value'] is not None:
init_value = enumerations[base_properties['type']]['default_value']
else:
LOGGER.warning('Initializing enumeration %s to "zero".', base_properties['type'])
init_value = [
k for k, v in enumerations[base_properties['type']]['members'].items() if v == 0
][0]
else:
init_value = specification.get("init", 0)
else:
interface_type = base_properties["type"]
manifest_type = base_properties["type"]
init_value = specification.get("init", 0)
if "out" in in_out_signal[0] and "strategy" in specification:
LOGGER.warning('Cannot set read strategy for outsignal %s, using "Always".', signal_name)
strategy = "Always"
else:
strategy = specification.get("strategy", "Always")
if strategy not in self.read_strategies:
LOGGER.warning('Invalid strategy %s, using "Always" instead.', strategy)
strategy = "Always"
if group is not None and specification.get("portname", None) is not None:
raise BadYamlFormat(f"Port name should be on group level not signal level: {dp_name}")
port_name_tmp = port_name if port_name is not None else specification.get("portname", None)
is_safe_signal = specification.get("dependability", False)
if signal_name not in self.dp_translations:
self.dp_translations[signal_name] = set()
domain = self._get_domain(dp_name)
self.dp_translations[signal_name].add(
(
"enum_0", # read from this tuple using the dp_position enum. Enum starts at 1 though.
domain,
specification["property"],
specification.get("type"),
interface_type,
manifest_type,
specification.get("offset"),
specification.get("factor"),
specification.get("default"),
specification.get("length"),
specification.get("min"),
specification.get("max"),
specification.get("enum"),
init_value,
specification.get("description"),
specification.get("unit"),
group,
strategy,
specification.get("debug", False),
is_safe_signal,
port_name_tmp
)
)
ecu_supplier, _unused = self.base_application.pybuild['build_cfg'].get_ecu_info()
if ecu_supplier in ['HI', 'ZC'] and is_safe_signal and group is not None:
e2e_sts_property = f"{group}E2eSts"
e2e_sts_signal_name = f"sVc{domain}_D_{e2e_sts_property}"
if signal_name == e2e_sts_signal_name:
raise BadYamlFormat(f"Don't put E2E status signals ({signal_name}) in yaml interface files.")
if e2e_sts_signal_name not in self.dp_translations:
self.dp_translations[e2e_sts_signal_name] = set()
self.dp_translations[e2e_sts_signal_name].add(
(
"enum_0", # read from this tuple using the dp_position enum. Enum starts at 1 though.
domain,
e2e_sts_property,
"UInt8",
"UInt8",
"UInt8",
0,
1,
None,
None,
0,
255,
None,
255,
f"E2E status code for E2E protected signal (group) {signal_name}.",
None,
group,
strategy,
False,
is_safe_signal,
port_name_tmp
)
)
self.e2e_sts_signals.add(e2e_sts_signal_name)
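# Illustrative note: for a dependability signal group on a HI or ZC ECU, the
# block above synthesizes one status signal per group. With the hypothetical
# domain "Dem" and group "SomeGroup", the derived entry gets the property
# "SomeGroupE2eSts" and the signal name "sVcDem_D_SomeGroupE2eSts", typed
# UInt8 with range 0..255 and init value 255.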
def parse_definition(self, definition):
"""Parses all definition files
Args:
definition (list(Path)): Definition files
"""
for translation in definition:
raw = self.read_translation(translation)
self.parse_signal_definitions(raw.get("signals", {}))
self.parse_group_definitions(raw.get("signal_groups", {}))
def get_signal_properties(self, signal):
"""Get signal properties for signal
Calls self.base_application to get signal properties
Args:
signal (Signal): Signal to get properties for
"""
return self.base_application.get_signal_properties(signal)
def _get_signals(self):
"""Read signals"""
self.parse_definition(self.translations_files)
def _get_domain(self, device_proxy):
"""Get domain for device proxy
Args:
device_proxy (str): Name of device proxy
Returns:
domain (str): Name of the domain
"""
if device_proxy not in self.device_domain:
raise MissingDevice(device_proxy)
return self.device_domain[device_proxy]
def _allow_domain(self, domain):
"""Check if device proxy is in current domain_filter
If there is no filter, the device is seen as part of the filter
Args:
domain (str): Name of the domain
Returns:
filtered (bool): True if device is not filtered away
"""
return self.domain_filter is None or domain in self.domain_filter
def get_signals(self, signal_type="insignals"):
"""Get signals to and from a dp abstraction
If it is set to False, we look at the application side.
Args:
signal_type (str): insignals or outsignals
Returns:
signals (list): Signals in the interface
"""
signal_names = self.signal_names["other"][signal_type]
signals = []
for name in self._allowed_names(signal_names):
signals.append(Signal(name, self))
return signals
@property
def insignals(self):
""" Signals going to the device proxy. """
return self.get_signals("insignals")
@property
def outsignals(self):
""" Signals sent from the device proxy. """
return self.get_signals("outsignals")
def dp_spec_to_dict(self, signal_spec, signal_name):
"""Convert signal specification to dict.
Args:
signal_spec (tuple): Signal specification
signal_name (str): Signal name
Returns:
signal_spec (dict): Signal specification
"""
return {
"variable": signal_name,
"variable_type": signal_spec[self.dp_position.variable_type.value],
"property_type": signal_spec[self.dp_position.property_interface_type.value],
"domain": signal_spec[self.dp_position.domain.value],
"default": signal_spec[self.dp_position.default.value],
"length": signal_spec[self.dp_position.length.value],
"property": signal_spec[self.dp_position.property.value],
"offset": signal_spec[self.dp_position.offset.value],
"factor": signal_spec[self.dp_position.factor.value],
"range": {
"min": signal_spec[self.dp_position.min.value],
"max": signal_spec[self.dp_position.max.value],
},
"init": signal_spec[self.dp_position.init.value],
"description": signal_spec[self.dp_position.description.value],
"unit": signal_spec[self.dp_position.unit.value],
"group": signal_spec[self.dp_position.group.value],
"strategy": signal_spec[self.dp_position.strategy.value],
"debug": signal_spec[self.dp_position.debug.value],
"dependability": signal_spec[self.dp_position.dependability.value],
"port_name": signal_spec[self.dp_position.port_name.value]
}
@classmethod
def dp_spec_for_manifest(cls, signal_spec):
"""Convert signal specification to dict for a signal manifest.
Args:
signal_spec (tuple): Signal specification
Returns:
signal_spec (dict): Signal specification
"""
spec = {
"name": signal_spec[cls.dp_position.property.value],
"type": signal_spec[cls.dp_position.property_manifest_type.value],
}
for key, value in {
"default": cls.dp_position.default.value,
"length": cls.dp_position.length.value,
"enum": cls.dp_position.enum.value,
"description": cls.dp_position.description.value,
"unit": cls.dp_position.unit.value,
}.items():
if signal_spec[value] is not None:
spec[key] = signal_spec[value]
if (
signal_spec[cls.dp_position.min.value] is not None
and signal_spec[cls.dp_position.max.value] is not None
and cls.dp_position.enum.value is not None
):
spec["range"] = {
"min": signal_spec[cls.dp_position.min.value],
"max": signal_spec[cls.dp_position.max.value],
}
return spec
def to_dict(self):
"""Method to generate dict to be saved as yaml
Returns:
spec (dict): Signalling specification
"""
spec = {"consumer": [], "producer": []}
for signal_name, signal_spec in self._allowed_names_and_specifications(
self.signal_names["other"]["insignals"]):
spec['consumer'].append(
self.dp_spec_to_dict(signal_spec, signal_name)
)
for signal_name, signal_spec in self._allowed_names_and_specifications(
self.signal_names["other"]["outsignals"]):
spec['producer'].append(
self.dp_spec_to_dict(signal_spec, signal_name)
)
return spec
def to_manifest(self, client_name):
"""Method to generate dict to be saved as yaml
Args:
client_name (str): Name of the client in signal comm
Returns:
spec (dict): Signal manifest for using a Device proxy
"""
manifest = {"name": client_name}
manifest["consumes"] = self.insignals_dp_manifest(client_name)
manifest["produces"] = self.outsignals_dp_manifest(client_name)
manifest = self.cleanup_dp_manifest(manifest)
if "consumes" not in manifest and "produces" not in manifest:
return None
return {"signal_info": {"version": 0.2, "clients": [manifest]}}
def _generator(self, signal_names, unique_names=False):
"""Iterate over signals for allowed devices
If unique_names is True, the iterator does not yield the same signal twice
if unique_names is False, it yields each allowed signal spec with the signal name
Args:
signal_names (list): allowed signals
Yields:
name (str): Name of the signal
specification (dict): signal specification for allowed device
"""
for signal_name, specifications in (
(name, spec) for name, spec in self.dp_translations.items()
if name in signal_names):
for specification in (
spec for spec in specifications
if self._allow_domain(spec[self.dp_position.domain.value])):
if unique_names:
yield signal_name, specification
break
yield signal_name, specification
def _allowed_names(self, signal_names):
""" Iterate over signal names for allowed devices
Args:
signal_names (list): allowed signals
Yields:
name (str): Signal name
"""
for name, _ in self._generator(signal_names, unique_names=True):
yield name
def _allowed_specifications(self, signal_names):
""" Iterate over signal specifications for allowed devices
Args:
signal_names (list): allowed signals
Yields:
specification (dict): Specification for a signal for an allowed device
"""
for _, spec in self._generator(signal_names, unique_names=False):
yield spec
def _allowed_names_and_specifications(self, signal_names):
""" Iterate over signal specifications for allowed devices
Args:
signal_names (list): allowed signals
Yields:
name (str): Signal name
specification (dict): Specification for the signal for an allowed device
"""
for name, spec in self._generator(signal_names, unique_names=False):
yield name, spec
def insignals_dp_manifest(self, client_name):
""" Create consumes part of manifest for reading signals from device proxies
Args:
client_name (str): Name of the client in signal comm
"""
consumes = [{"name": client_name, "signal_groups": []}]
signal_names = self.signal_names["other"]["insignals"]
consumed_groups = set()
for signal_spec in self._allowed_specifications(signal_names):
group = signal_spec[self.dp_position.group.value]
if group is not None:
consumed_groups.add(group)
else:
consumes[0]["signal_groups"].append(
{"name": signal_spec[self.dp_position.property.value]}
)
for group in consumed_groups:
consumes[0]["signal_groups"].append(
{"name": group}
)
return consumes
def outsignals_dp_manifest(self, client_name):
""" Update manifests for writing signals to device proxies
Args:
client_name (str): Name of the client in signal comm
"""
produces = [{"name": client_name, "signals": [], "signal_groups": []}]
signal_names = self.signal_names["other"]["outsignals"]
group_signals = {}
for signal_spec in self._allowed_specifications(signal_names):
group = signal_spec[self.dp_position.group.value]
if group is not None:
if group not in group_signals:
group_signals[group] = []
group_signals[group].append(
self.dp_spec_for_manifest(signal_spec)
)
else:
produces[0]["signals"].append(
self.dp_spec_for_manifest(signal_spec)
)
for group_name, signals in group_signals.items():
produces[0]["signal_groups"].append(
{"name": group_name,
"signals": list(signals)}
)
return produces
@staticmethod
def cleanup_dp_manifest(manifest):
""" Remove empty device proxies.
Args:
manifest (dict): Device proxy configurations
"""
if not manifest["produces"][0]["signal_groups"]:
del manifest["produces"][0]["signal_groups"]
if not manifest["produces"][0]["signals"]:
del manifest["produces"][0]["signals"]
if list(manifest["produces"][0].keys()) == ["name"]:
del manifest["produces"]
if not manifest["consumes"][0]["signal_groups"]:
del manifest["consumes"]
return manifest

pybuild/interface/ems.py Normal file

@ -0,0 +1,67 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Module for extraction Energy Management System"""
import os
from pybuild.interface.base import BaseApplication, Signal
from pybuild.lib import logger
from pybuild.signal_interfaces import CsvSignalInterfaces
from pybuild.build_proj_config import BuildProjConfig
LOGGER = logger.create_logger(__file__)
class CsvEMS(BaseApplication, CsvSignalInterfaces):
"""Supplier part of the ECM"""
def __init__(self):
self._signals = {}
self.interfaces = {}
self._insignals = None
self._outsignals = None
self.projects = {}
def parse_definition(self, definition):
"""Read the interface files.
Args:
definition (Path): Path to ProjectCfg.json
"""
self.build_cfg = BuildProjConfig(os.path.normpath(str(definition)))
CsvSignalInterfaces.__init__(self, self.build_cfg, [])
self.projects[self.build_cfg.name] = self.build_cfg
self.config_path = self.build_cfg.get_if_cfg_dir()
self.name = self.build_cfg.name # Set name for CsvSignalInterfaces
self._parse_io_cnfg(self.config_path)
self.name = 'Supplier_' + self.build_cfg.name # set name for BaseApplication
def _get_signals(self):
"""Look through interfaces and create Signal objects"""
self._insignals = set()
self._outsignals = set()
for interface_name, data in self.interfaces.items():
for signal_name, signal_data in data.items():
if signal_data['IOType'] == '-':
# Signal is inactive
continue
if signal_name in self._signals:
signal = self._signals[signal_name]
else:
signal = Signal(signal_name, self)
self._signals.update({signal_name: signal})
if 'output' in interface_name.lower():
# outport from ECM. Inport in EMS (from SPM)
self._insignals.add(signal_name)
signal.consumers = interface_name
else:
signal.producers = interface_name
self._outsignals.add(signal_name)
def get_signal_properties(self, signal):
"""Find properties for a signal from interface files"""
for interface_name, interface_data in self.interfaces.items():
if signal.name in interface_data.keys():
properties = interface_data[signal.name]
properties['interface'] = interface_name
return properties
return {}


@ -0,0 +1,98 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Module to export information of global variables from pybuild projects."""
import argparse
import os
import sys
from typing import Dict, Tuple
from ruamel.yaml import YAML
from pybuild.build_proj_config import BuildProjConfig
from pybuild.feature_configs import FeatureConfigs
from pybuild.unit_configs import UnitConfigs
def get_global_variables(project_config_path: str) -> Dict:
"""Get global variables connected to PyBuild project.
Args:
project_config_path (str): Path to ProjectCfg.json file.
Returns:
(Dict): Dict containing project name and its global variables (name, type).
"""
project_name, project_unit_config = _get_project_data(project_config_path)
variable_types = ["outports", "local_vars", "calib_consts", "nvm"]
variables = []
for variable_type in variable_types:
if variable_type not in project_unit_config:
continue
variables_info = [
{"name": variable, "type": _get_variable_type(variable_info)}
for variable, variable_info in project_unit_config[variable_type].items()
]
variables.extend(variables_info)
return {"name": project_name, "variables": variables}
def _get_variable_type(variable_info: Dict) -> str:
"""Get variable type from variable info.
Args:
variable_info (Dict): Dictionary with the variable info.
Returns:
str: Variable type.
"""
unit_name = list(variable_info.keys())[0] # Getting any unit name, since variable type should be the same
return variable_info[unit_name]["type"]
def _get_project_data(project_config_path: str) -> Tuple[str, Dict]:
"""Gets data for a pybuild project.
Args:
project_config_path (str): Path to ProjectCfg.json file.
Returns:
project_name (str): Name of PyBuild project.
project_unit_config (dict): Dict mapping variable types to variables and their data.
"""
build_cfg = BuildProjConfig(os.path.normpath(project_config_path))
feature_cfg = FeatureConfigs(build_cfg)
unit_cfg = UnitConfigs(build_cfg, feature_cfg)
project_unit_config = unit_cfg.get_per_cfg_unit_cfg()
return build_cfg.name, project_unit_config
def _export_yaml(data: Dict, file_path: str) -> None:
"""Exports data from dictionary to a yaml file.
Args:
data (Dict): Dictionary with data.
file_path (str): Path of the file to export data.
"""
with open(file_path, "w", encoding="utf-8") as yaml_file:
yaml = YAML()
yaml.default_flow_style = False
yaml.dump(data, yaml_file)
def _main():
args = _parse_args()
global_variables = get_global_variables(args.project_config)
_export_yaml(global_variables, args.output_file)
def _parse_args():
parser = argparse.ArgumentParser(description="Export global variables.")
parser.add_argument("--project-config", help="Project root configuration file.", required=True)
parser.add_argument("--output-file", help="Output file to export global variables.", required=True)
return parser.parse_args()
if __name__ == "__main__":
sys.exit(_main())
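# Illustrative usage (hypothetical paths; the module path depends on where
# this file is installed):
#
#     python -m <this_module> --project-config Projects/ProjX/ProjectCfg.json \
#         --output-file output/global_variables.yaml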


@ -0,0 +1,125 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Python module used for calculating interfaces for CSP"""
from pathlib import Path
from os import path
from pybuild.interface.hal import HALA
from pybuild.interface.device_proxy import DPAL
from pybuild.interface.service import ServiceFramework
from pybuild.interface import simulink
from pybuild.lib import logger
from pybuild.interface import generation_utils
from pybuild.lib.helper_functions import deep_json_update
LOGGER = logger.create_logger("CSP adapters")
def parse_args():
"""Parse arguments
Returns:
Namespace: the parsed arguments
"""
parser = generation_utils.base_parser()
parser.add_argument(
"--dp-interface",
help="Add dp interface to adapter specification",
action="store_true"
)
parser.add_argument(
"--hal-interface",
help="Add dp interface to adapter specification",
action="store_true"
)
parser.add_argument(
"--service-interface",
help="Add sfw interface to adapter specification",
action="store_true"
)
parser.add_argument(
"output",
help="Output file with interface specifications",
type=Path
)
parser.add_argument(
"--update-config",
help="Update project config file with path to adapter specifications",
action="store_true"
)
return parser.parse_args()
def main():
""" Main function for stand alone execution.
Mostly useful for testing and generation of dummy hal specifications
"""
args = parse_args()
app = generation_utils.process_app(args.config)
adapters(args, app)
def update_project_config(args):
""" Update project config file with relative location to adapter specification file, linux styled path.
Args:
args (Namespace): Arguments from command line
"""
config_dir = Path(args.config).resolve().parents[0]
output_dir = Path(args.output).resolve().parents[0]
rel_dir = Path(path.relpath(output_dir, config_dir))
rel_path = rel_dir / Path(args.output).name
rel_path_linux = rel_path.as_posix()
deep_json_update(
args.config,
{'ProjectInfo': {'adapterSpec': rel_path_linux}}
)
def adapters(args, app):
""" Generate specification for adapter generation.
Args:
args (Namespace): Arguments from command line
app (Application): Application to generate specification for
"""
if args.dp_interface:
dp = DPAL(app)
dp_interface = generation_utils.get_interface(app, dp)
else:
dp_interface = {}
if args.hal_interface:
hala = HALA(app)
hal_interface = generation_utils.get_interface(app, hala)
else:
hal_interface = {}
if args.service_interface:
sfw = ServiceFramework(app)
sfw_interface = generation_utils.get_interface(app, sfw)
method_interface = generation_utils.get_method_interface(app)
else:
sfw_interface = {}
method_interface = {}
interface_data_types = app.pybuild['user_defined_types'].get_interface_data_types()
adapter_spec = simulink.get_interface(
interface_data_types,
dp_interface,
hal_interface,
sfw_interface,
method_interface
)
generation_utils.write_to_file(adapter_spec, args.output, is_yaml=True)
if args.update_config:
update_project_config(args)
if __name__ == "__main__":
main()


@ -0,0 +1,74 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Python module used for calculating interfaces for CSP HI"""
from pathlib import Path
from pybuild.interface import generation_utils
from pybuild.interface.device_proxy import DPAL
from pybuild.lib.helper_functions import recursive_default_dict, to_normal_dict
OP_READ = 'read'
OP_WRITE = 'write'
def generate_hi_interface(args, hi_interface):
"""Generate HI YAML interface file.
Args:
args (Namespace): Arguments from command line.
hi_interface (dict): HI interface dict based on HIApplication and generation_utils.get_interface.
Returns:
result (dict): Aggregated signal information as a dict.
"""
io_translation = {
'consumer': OP_READ,
'producer': OP_WRITE
}
result = recursive_default_dict()
for raster_data in hi_interface.values():
for direction, signals in raster_data.items():
hi_direction = io_translation[direction]
for signal in signals:
domain = signal['domain']
group = signal['group']
name = signal['variable']
property_name = signal['property']
if group is None:
port_name = signal['port_name'] or property_name
result[hi_direction]['signals'][domain][port_name]['data'][property_name] = name
else:
port_name = signal['port_name'] or group
result[hi_direction]['signal_groups'][domain][port_name][group]['data'][property_name] = name
generation_utils.write_to_file(to_normal_dict(result), args.output, is_yaml=True)
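# Illustrative sketch of the aggregated dict written above (domain, port,
# group, property and signal names are hypothetical):
#
#     read:
#       signals:
#         SomeDomain:
#           SomePort:
#             data:
#               SomeProperty: sVcExample_D_Speed
#     write:
#       signal_groups:
#         SomeDomain:
#           SomePort:
#             SomeGroup:
#               data:
#                 SomeProperty: sVcExample_D_Torque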
def parse_args():
"""Parse arguments.
Returns:
Namespace: the parsed arguments.
"""
parser = generation_utils.base_parser()
parser.add_argument(
"output",
help="Output file with interface specifications.",
type=Path
)
return parser.parse_args()
def main():
""" Main function for stand alone execution.
Mostly useful for testing and generation of dummy hal specifications.
"""
args = parse_args()
app = generation_utils.process_app(args.config)
hi_app = DPAL(app)
interface = generation_utils.get_interface(app, hi_app)
generate_hi_interface(args, interface)
if __name__ == "__main__":
main()


@ -0,0 +1,60 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Python module used for calculating interfaces for CSP"""
from pathlib import Path
from pybuild.interface.service import get_service
from pybuild.lib import logger
from pybuild.interface import generation_utils
LOGGER = logger.create_logger("CSP service")
def parse_args():
"""Parse command line arguments
Returns:
Namespace: Arguments from command line
"""
parser = generation_utils.base_parser()
parser.add_argument(
"--client-name",
help="Name of the context object in CSP. Defaults to project name."
)
parser.add_argument(
"output",
help="Output directory for service models",
type=Path
)
return parser.parse_args()
def main():
""" Main function for stand alone execution.
Mostly useful for testing and generation of dummy hal specifications
"""
args = parse_args()
app = generation_utils.process_app(args.config)
client_name = generation_utils.get_client_name(args, app)
service(args, app, client_name)
def service(args, app, client_name):
""" Generate specifications for pt-scheduler wrappers.
Args:
args (Namespace): Arguments from command line
app (Application): Application to generate specifications for
client_name (str): Signal client name
"""
model_internal = get_service(app, client_name, 'internal')
model_external = get_service(app, client_name, 'external')
model_observer = get_service(app, client_name, 'observer')
generation_utils.write_to_file(model_internal, Path(args.output, 'model', 'internal.yaml'), is_yaml=True)
generation_utils.write_to_file(model_external, Path(args.output, 'model', 'external.yaml'), is_yaml=True)
generation_utils.write_to_file(model_observer, Path(args.output, 'model', 'observer.yaml'), is_yaml=True)
if __name__ == "__main__":
main()


@ -0,0 +1,134 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
# -*- coding: utf-8 -*-
"""Python module used for calculating interfaces for CSP"""
from pathlib import Path
from pybuild.interface.hal import HALA, get_hal_list
from pybuild.interface.device_proxy import DPAL
from pybuild.interface.service import ServiceFramework, get_service_list
from pybuild.lib import logger
from pybuild.interface import generation_utils
LOGGER = logger.create_logger("CSP wrappers")
def get_manifest(app, domain, client_name):
"""Get signal manifest for application
The model yaml files are parsed for the models included in the application.
If there are no signals to interact with,
this function returns None to indicate that there should not be a manifest.
Args:
app (Application): Pybuild project
domain (str): Domain that the signals should not be part of
client_name (str): Client name in the signal database
Returns:
spec (dict/None): signal manifest, None if no manifest should be written
"""
rasters = app.get_rasters()
LOGGER.debug("Rasters: %s", rasters)
translation_files = app.get_translation_files()
dpal = DPAL(app)
dpal.domain_filter = [domain]
dpal.parse_definition(translation_files)
dpal.clear_signal_names()
dpal.add_signals(app.insignals, "insignals")
dpal.add_signals(app.outsignals, "outsignals")
return dpal.to_manifest(client_name)
def parse_args():
"""Parse command line arguments
Returns:
Namespace: Arguments from command line
"""
parser = generation_utils.base_parser()
parser.add_argument(
"--client-name",
help="Name of the context object in CSP. Defaults to project name."
)
parser.add_argument(
"--dp-interface",
help="Output file with DP interface specifications",
type=Path
)
parser.add_argument(
"--dp-manifest-dir",
help="Output directory for signal manifests",
type=Path
)
parser.add_argument(
"--hal-interface",
help="Output file with HAL interface specifications",
type=Path,
)
parser.add_argument(
"--service-interface",
help="Output file with service interface specifications",
type=Path
)
return parser.parse_args()
def main():
""" Main function for stand alone execution.
Mostly useful for testing and generation of dummy hal specifications
"""
args = parse_args()
app = generation_utils.process_app(args.config)
client_name = generation_utils.get_client_name(args, app)
wrappers(args, app, client_name)
def wrappers(args, app, client_name):
""" Generate specifications for pt-scheduler wrappers.
Args:
args (Namespace): Arguments from command line
app (Application): Application to generate specifications for
client_name (str): Signal client name
"""
if args.hal_interface:
hala = HALA(app)
interface = generation_utils.get_interface(app, hala)
interface["relocatable_language"] = "C"
generation_utils.write_to_file(interface, args.hal_interface, is_yaml=True)
hals = get_hal_list(app)
cmake = Path(args.hal_interface.parent, 'hal.cmake')
generation_utils.write_to_file(hals, cmake)
if args.dp_interface:
dp = DPAL(app)
interface = generation_utils.get_interface(app, dp)
interface["relocatable_language"] = "C"
generation_utils.write_to_file(interface, args.dp_interface, is_yaml=True)
if args.service_interface:
service = ServiceFramework(app)
interface = generation_utils.get_interface(app, service)
interface["relocatable_language"] = "C"
generation_utils.write_to_file(interface, args.service_interface, is_yaml=True)
proxies = get_service_list(app)
cmake = Path(args.service_interface.parent, 'service-proxies.cmake')
generation_utils.write_to_file(proxies, cmake)
if args.dp_manifest_dir:
if not args.dp_manifest_dir.is_dir():
args.dp_manifest_dir.mkdir()
# Enable changing client name in generation
for domain in app.get_domain_names():
manifest = get_manifest(app, domain, client_name)
if manifest is None:
# Manifests without produced or consumed signals are not allowed
# Therefore get_manifests returned None to tell us to skip this domain
continue
domain_file = Path(
args.dp_manifest_dir,
f"{domain}_{app.name}.yaml")
generation_utils.write_to_file(manifest, domain_file, is_yaml=True)
if __name__ == "__main__":
main()


@ -0,0 +1,144 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Module with generation utils."""
import argparse
from ruamel.yaml import YAML
from pathlib import Path
from pybuild.interface.application import Application, get_internal_domain
from pybuild.interface.base import filter_signals
from pybuild.lib import logger
LOGGER = logger.create_logger("CSP interface generation utils")
def base_parser():
""" Base parser that adds config argument.
Returns:
parser (ArgumentParser): Base parser
"""
parser = argparse.ArgumentParser()
parser.add_argument("config", help="The project configuration file", type=Path)
return parser
def get_client_name(args, app):
""" Get client name for app.
Args:
args (Namespace): Arguments from command line
app (Application): Parsed project configuration
Returns:
name (str): Name of the project
"""
return args.client_name if args.client_name else app.name
def process_app(config):
""" Get an app specification for the current project
Entrypoint for external scripts.
Args:
config (pathlib.Path): Path to the ProjectCfg.json
Returns:
app (Application): pybuild project
"""
app = Application()
app.parse_definition(config)
return app
def get_interface(app, interface_type):
"""Get interface(hal/dp/zc/service) to application
Args:
app (Application): Pybuild project
interface_type (BaseApplication): A type of interface
Returns:
spec (dict): interface specification per raster for the hal/dp/zc/service class
"""
spec = {}
rasters = app.get_rasters()
LOGGER.debug("Rasters: %s", rasters)
translation_files = app.get_translation_files()
interface_type.parse_definition(translation_files)
internal = get_internal_domain(rasters)
properties_from_json = [
{"destination": "min", "source": "min", "default": "-"},
{"destination": "max", "source": "max", "default": "-"},
{"destination": "variable_type", "source": "type"},
{"destination": "offset", "source": "offset", "default": "-"},
{"destination": "factor", "source": "lsb", "default": 1},
{"destination": "description", "source": "description"},
{"destination": "unit", "source": "unit", "default": "-"},
]
# TODO We read all yaml files at once, should we "add_signals" for all rasters at once?
# For example, service.check_endpoints will miss endpoint mismatches that reside in different rasters.
for raster in rasters:
interface_type.name = raster.name
interface_type.clear_signal_names()
interface_type.add_signals(
filter_signals(raster.insignals, internal), "insignals", properties_from_json)
interface_type.add_signals(raster.outsignals, "outsignals", properties_from_json)
LOGGER.debug("Current communication interface: %s", interface_type)
spec[raster.name] = interface_type.to_dict()
return spec
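# Illustrative note: the returned spec is keyed by raster name, each value
# being the interface's to_dict() output, e.g. for a hypothetical raster:
#
#     {"Raster10ms": {"consumer": [...], "producer": [...]}}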
def get_method_interface(app):
""" Get method interface
Args:
app (Application): Application
Returns:
spec (dict): Specification for csp methods
"""
spec = {}
for method in app.get_methods():
method_spec = {}
method_spec['name'] = method.name
method_spec['primitive'] = method.get_primitive(method.name)
method_spec['namespace'] = method.namespace
if method.description:
method_spec['description'] = method.description
inports = {}
outports = {}
for signal in method.signals:
signal_spec = {
'primitive': method.get_primitive(signal.name),
'type': signal.properties['type'],
'variable_type': signal.properties['type'],
'variable': signal.name,
'range': signal.properties['range'],
'init': 0,
'description': '',
'unit': ''
}
if signal in method.outsignals:
outports[signal.name] = signal_spec
else:
inports[signal.name] = signal_spec
ports = {'in': inports, 'out': outports}
method_spec['ports'] = ports
spec[method.name] = method_spec
return spec
def write_to_file(content, output, is_yaml=False):
""" Write to cmake.
Args:
content (str): cmake
output (Path): File to write.
yaml (bool): Dump yaml
"""
output.parent.mkdir(parents=True, exist_ok=True)
with open(output, "w", encoding="utf-8") as file_handler:
if is_yaml:
yaml = YAML()
yaml.dump(content, file_handler)
else:
file_handler.write(content)

pybuild/interface/hal.py Normal file

@ -0,0 +1,194 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Python module used for abstracting Hardware Abstraction Layer specifications"""
import re
from pathlib import Path
from ruamel.yaml import YAML
from pybuild.interface.csp_api import CspApi
from pybuild.lib import logger
LOGGER = logger.create_logger('base')
def get_hal_list(app):
"""Get translated hal name from yaml file
Args:
app (Application): Pybuild project
Returns:
cmake (str): a string contains translated hal name
"""
translation_files = app.get_translation_files()
hala = HALA(app)
hala.parse_definition(translation_files)
hal_translations = hala.get_map()
hals = set()
for definitions in hala.translations.values():
for definition in definitions:
hal_abbreviation = definition[hala.position.api.value]
real_hal_name = get_real_hal_name(hal_abbreviation, hal_translations)
hals.add((hal_abbreviation, real_hal_name))
cmake = ""
for hal_abbreviation, real_hal_name in hals:
lib = re.sub('-', '_', f'hal_{hal_abbreviation}' + '_libhal_' + real_hal_name).upper()
include = re.sub('-', '_', f'hal_{hal_abbreviation}' + '_include_dir').upper()
cmake += f"LIST(APPEND extra_libraries ${{{lib}}})\n"
cmake += f"LIST(APPEND EXTRA_INCLUDE_DIRS ${{{include}}})\n"
return cmake
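# Illustrative sketch: for a hypothetical hal abbreviation "power" that the
# translation map resolves to the real name "power-hal", the emitted lines are
#
#     LIST(APPEND extra_libraries ${HAL_POWER_LIBHAL_POWER_HAL})
#     LIST(APPEND EXTRA_INCLUDE_DIRS ${HAL_POWER_INCLUDE_DIR})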
def strip_hal_name(hal_name):
"""Strip hal name
Args:
hal_name (str): hal name
Returns:
(str): stripped hal name
"""
return hal_name.replace('_hal', '')
def verify_name(hal_name, api_map):
"""Verify hal name
Args:
hal_name (str): hal name
api_map (dict): hal translation map
"""
if strip_hal_name(hal_name) not in api_map:
raise HalTranslationException(
f"{hal_name} does not exist in the hal translation file."
)
def get_real_hal_name(hal_name, translation_map):
"""Get real hal name from translation map file.
Args:
hal_name (str): hal abbreviation
translation_map (dict): hal translation map
Returns:
real_hal_name (str): real name of a hal
"""
verify_name(hal_name, translation_map)
return translation_map.get(strip_hal_name(hal_name))
class UnknownAccessorError(Exception):
"""Error when setting a producer and there already exists one"""
def __init__(self, hal, signal, accessor):
"""Set error message
Args:
hal (HAL): Hal where the problem is
signal (Signal): Signal with the problem
accessor (str): Unknown accessor type
"""
super().__init__()
self.message = f"Accessor of type {accessor} for {signal.name} in {hal.name} is not handled"
class HalTranslationException(Exception):
"""Class for hal translation exceptions"""
class HALA(CspApi):
"""Hardware abstraction layer abstraction"""
def __repr__(self):
"""String representation of HALA"""
return (f"<HALA {self.name}"
f" app_side insignals: {len(self.signal_names['app']['insignals'])}"
f" app_side outsignals: {len(self.signal_names['app']['outsignals'])}>")
def get_map(self):
"""Get hal translation map
Returns:
(dict): hal translation map
"""
path = self.get_map_file()
if path.is_file():
return self._get_hal_translation(path)
return {}
@staticmethod
def get_map_file():
"""Get hal translation map file
Returns:
(Path): hal translation map file
"""
return Path("Projects", "CSP", "hal_list.yaml")
def get_api_name(self, api_name):
real_hal_name = get_real_hal_name(
api_name,
self.api_map
)
return real_hal_name
def verify_api(self, api_name):
verify_name(api_name, self.api_map)
@staticmethod
def _get_hal_translation(path):
"""Get translated hal names
Args:
path (Path): path to the hal translation list.
Returns:
hal_translation_content (dict): translated hal names
"""
hal_translation_content = None
if path and path.is_file():
with path.open("r") as file_handler:
yaml = YAML(typ="safe", pure=True)
hal_translation_content = yaml.load(file_handler)
if not hal_translation_content:
if hal_translation_content is None:
raise HalTranslationException(
"No hal translation file given."
)
if isinstance(hal_translation_content, dict):
raise HalTranslationException(
"Hal translation file are empty."
)
raise HalTranslationException("Bad hal translation format.")
return hal_translation_content
def check_endpoints(self):
pass
def extract_endpoint_definitions(self, raw):
"""Extract endpoint definitions from raw data
Args:
raw (dict): parsed yaml data with endpoint definitions
"""
self.parse_api_definitions(raw.get("hal", {}))
@staticmethod
def extract_definition(definition):
"""Extract definition from hal
Args:
definition (dict): hal definition
Returns:
(dict): hal definition
"""
if isinstance(definition, list):
specifications = {
'hals': definition
}
else:
specifications = {
'properties': definition.get('properties', []),
'methods': definition.get('methods', [])
}
return specifications


@ -0,0 +1,342 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Module for verifying the model yaml files."""
import argparse
import logging
from pathlib import Path
from voluptuous import All, MultipleInvalid, Optional, Required, Schema
from ruamel.yaml import YAML, YAMLError
from pybuild.interface.application import Application
from pybuild.interface.base import BaseApplication
class ModelYmlVerification(BaseApplication):
"""Class for verifying the model yaml files."""
def __init__(self, base_application):
self.base_application = base_application
self.raw = {}
self.app_insignals = self.get_insignals_name()
self.app_outsignals = self.get_outsignals_name()
self.signal_properties = {}
self.model_name = None
self.error_printed = False
def read_translation(self, translation_file):
"""Read specification of the yaml file.
Args:
translation_file (Path): file with specs.
"""
if not translation_file.is_file():
self.raw = {}
return {}
with open(translation_file, encoding="utf-8") as translation:
try:
yaml = YAML(typ='safe', pure=True)
self.raw = yaml.load(translation)
except YAMLError as e:
self.raw = {}
if hasattr(e, 'problem_mark'):
mark = e.problem_mark
self.error("Error while reading model file, verification of this file cannot continue until this "
f"is fixed:\nFile: {translation_file}\nLine: {mark.line + 1}\nColumn: {mark.column + 1}")
else:
self.error("Error while reading model file, verification of this file cannot continue until this "
f"is fixed:\nFile: {translation_file}")
def validate_signal_schema(self, signal_spec, signal_direction, is_hal, is_service):
"""Validate if signal have a correct schema in model yaml file.
Args:
signal_spec (dict): signal specification.
signal_direction (str): insignal or outsignal.
is_hal (Bool): signal coming from hal
is_service (Bool): signal coming from service
"""
if is_hal:
signal_schema = Schema({Required('insignal'): All(str), Optional('property'): All(str)})
elif is_service:
signal_schema = Schema({Required(signal_direction): All(str), Optional('property'): All(str)})
else:
signal_schema = Schema({Required(signal_direction): All(str), Required('property'): All(str)})
try:
signal_schema(signal_spec)
except MultipleInvalid as e:
self.error(f"{e} in {self.model_name}")
def validate_group_schema(self, group_spec):
"""Validate if device proxy signal group and hal endpoint
have correct schema.
Args:
group_spec (dict): dp signal group or hal endpoint.
"""
group_schema = Schema({Required(str): [{Required(str): list}]})
try:
group_schema(group_spec)
except MultipleInvalid as e:
self.error(self.model_name + ' ' + str(e))
def validate_service_schema(self, service_spec):
"""Validate if service schema in model yaml file.
Args:
service_spec (dict): service in model yaml file.
"""
service_schema = Schema(
{
Required(str): {
Optional('properties'): [{Required(str): list}],
Optional('methods'): [{Required(str): list}]
}
}
)
try:
service_schema(service_spec)
except MultipleInvalid as e:
self.error(self.model_name + ' ' + str(e))
def validate_schema(self):
"""Validate interface schema in model yaml file.
Interface could be hal, dp (signal/signal group) and service.
"""
schema = Schema({
Optional('hal'): dict, Optional('signal_groups'): dict,
Optional('service'): dict, Optional('signals'): dict})
try:
schema(self.raw)
except MultipleInvalid as e:
self.error(self.model_name + ' ' + str(e))
def parse_hal_definition(self, hal):
"""Parse hal definition.
Args:
hal (dict): hal specification in model yaml file.
"""
self.parse_group_definitions(hal, is_hal=True)
def parse_service_definitions(self, service):
"""Parse service.
Args:
service (dict): service in model yaml file.
"""
if service:
self.validate_service_schema(service)
for service_name, definition in service.items():
for endpoints in definition.get('properties', []):
for endpoint, signals in endpoints.items():
self.verify_signals({service_name: signals}, is_service=True, endpoint=endpoint)
for endpoints in definition.get('methods', []):
for endpoint, signals in endpoints.items():
self.verify_signals({service_name: signals}, is_service=True, endpoint=endpoint)
def parse_group_definitions(self, signal_groups, is_hal=False):
"""Parse signal groups.
Args:
signal_groups (dict): Hal/dp signal group in yaml file.
is_hal (Bool): hal signal
"""
if signal_groups:
self.validate_group_schema(signal_groups)
for interface, group_definitions in signal_groups.items():
for group in group_definitions:
for group_name, signals in group.items():
self.verify_signals({interface: signals}, is_hal, group=group_name)
def verify_signals(self, signals_definition, is_hal=False, is_service=False, endpoint=None, group=None):
"""verify signal in-signal and out-signal in model yaml file.
Args:
signals_definition (dict): parsed signals in model yaml file.
is_hal (Bool): hal signal.
is_service (Bool): service signal.
endpoint (str): service endpoint.
group (str): hal group.
"""
for interface, specifications in signals_definition.items():
for specification in specifications:
in_out_signal = [key for key in specification.keys() if 'signal' in key]
if not in_out_signal:
self.error(f"signal is not defined for {interface} in {self.model_name}!")
continue
signal_name = specification[in_out_signal[0]]
self.validate_signal_schema(specification, in_out_signal[0], is_hal, is_service)
if 'in' not in in_out_signal[0] and 'out' not in in_out_signal[0]:
self.error(f"in and out signal must be added to signal {specification['signal']}")
elif 'in' in in_out_signal[0] and specification[in_out_signal[0]] not in self.app_insignals:
self.error(
f"{specification['insignal']} is not defined as an insignal in json file")
elif "out" in in_out_signal[0] and specification[in_out_signal[0]] not in self.app_outsignals:
self.error(
f"{specification['outsignal']} is not defined as an outsignal in json file")
else:
if is_service:
if specification.get('property') is None:
self.verify_primitive(f"{interface}.{endpoint}.{signal_name}", in_out_signal[0])
else:
self.verify_primitive(
f"{interface}.{endpoint}.{specification['property']}.{signal_name}",
in_out_signal[0])
elif is_hal:
if specification.get('property') is None:
self.verify_primitive(f"{interface}.{group}.{signal_name}", in_out_signal[0])
else:
self.verify_primitive(f"{interface}.{group}.{specification['property']}.{signal_name}",
in_out_signal[0])
else:
self.verify_primitive(f"{interface}.{specification['property']}.{signal_name}",
in_out_signal[0])
def check_duplicate_signals(self, property_spec, in_out):
"""Check if each signal appears only once for each model.
It is ok for two insignals with the same name to be mapped to the same primitive
if they are in different models.
It is not ok for a outsignal to map to the same interface twice, but it is ok to map to
different interfaces (with same or different property name).
Args:
property_spec (str): property specification.
in_out (str): whether it is an in- or outsignal.
"""
signal_name = property_spec.split('.')[-1]
interface_name = property_spec.split('.')[0]
for model, spec in self.signal_properties.items():
for primitive in spec:
if signal_name in primitive.split('.'):
if property_spec not in primitive:
if "in" in in_out:
self.error(
f"You can't connect a signal {signal_name} in {self.model_name} model to two "
f"different primitives. It's already connected in {model} model")
else:
if interface_name in primitive.split('.'):
self.error(
f"You can't connect a signal {signal_name} in {self.model_name} model "
f"to the same interface ({interface_name}) twice. "
f"It's already connected as {primitive} in model {model}.")
else:
if model == self.model_name:
self.error(f"You can't connect signal {signal_name} in {self.model_name} model twice.")
elif "out" in in_out:
self.error(
f"You can't connect signal {signal_name} in {self.model_name} model to the same "
f"primitive as in another model. It is already defined in {model}")
def check_property(self, property_spec):
"""Check if we have only one signal written for the same property.
Args:
property_spec (str): property specification.
"""
signal_name = property_spec.split('.')[-1]
for model, spec in self.signal_properties.items():
for primitive in spec:
if ('.'.join(property_spec.split('.')[:-1]) == '.'.join(primitive.split('.')[:-1])
and signal_name != primitive.split('.')[-1]):
self.error(
f"You can't connect another signal to the existing property {property_spec} in "
f"{self.model_name} model, because it is already defined in {model} model.")
def verify_primitive(self, property_spec, in_out):
"""Runs the necessary tests.
Args:
property_spec (str): property specification.
in_out (str): whether it is an in- or outsignal.
"""
self.check_duplicate_signals(property_spec, in_out)
self.check_property(property_spec)
self.signal_properties[self.model_name].append(property_spec)
def get_insignals_name(self):
"""Base application in-signals.
"""
app_insignals = []
for signal in self.base_application.insignals:
app_insignals.append(signal.name)
return app_insignals
def get_outsignals_name(self):
"""Base application out-signals.
"""
app_outsignals = []
for signal in self.base_application.outsignals:
app_outsignals.append(signal.name)
return app_outsignals
def parse_definition(self, definition):
"""Parses all definition files
Args:
definition (list(Path)): model yaml files.
"""
for translation in definition:
path = Path(translation)
self.model_name = path.name.replace(".yaml", "")
self.signal_properties[self.model_name] = []
self.read_translation(translation)
self.validate_schema()
self.verify_signals(self.raw.get("signals", {}))
self.parse_group_definitions(self.raw.get("signal_groups", {}))
self.parse_hal_definition(self.raw.get("hal", {}))
self.parse_service_definitions(self.raw.get("service", {}))
def error(self, msg):
""" Prints an error message to the terminal.
Args:
msg (string): The message to be printed.
"""
self.error_printed = True
logging.error(f"{msg}\n")
def print_success_msg(self):
""" Prints a success message if no error messages have been printed.
"""
if not self.error_printed:
print('Yaml verification done without any errors.')
def get_app(project_config):
""" Get an app specification for the current project.
Args:
project_config (pathlib.Path): Path to the ProjectCfg.json.
Returns:
app (Application): pybuild project.
"""
app = Application()
app.parse_definition(project_config)
return app
def parse_args():
"""Parse command line arguments.
Returns:
(Namespace): parsed command line arguments.
"""
parser = argparse.ArgumentParser()
parser.add_argument("config", help="The SPA2 project config file", type=Path)
return parser.parse_args()
def main():
"""Main function for model yaml verification."""
args = parse_args()
app = get_app(args.config)
model_yamls = app.get_translation_files()
model_yaml_ver = ModelYmlVerification(app)
model_yaml_ver.parse_definition(model_yamls)
model_yaml_ver.print_success_msg()
if __name__ == "__main__":
main()
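# Example invocation (script and project paths are hypothetical):
#
#     python model_yaml_verification.py Projects/MyProject/ProjectCfg.json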

View File

@ -0,0 +1,296 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Module for handling the Service abstraction."""
import re
from pybuild.interface.base import filter_signals
from pybuild.interface.csp_api import CspApi
from pybuild.interface.application import get_internal_domain
from pybuild.lib import logger
LOGGER = logger.create_logger("service")
def get_service(app, client_name, interface):
"""Get service implementation specification"""
rasters = app.get_rasters()
LOGGER.debug("Rasters: %s", rasters)
translation_files = app.get_translation_files()
sfw = ServiceFramework(app)
sfw.filter = f"{client_name}_internal"
sfw.name = f"{client_name}_{interface}"
sfw.parse_definition(translation_files)
internal = get_internal_domain(rasters)
properties_from_json = [
{"destination": "min", "source": "min", "default": "-"},
{"destination": "max", "source": "max", "default": "-"},
{"destination": "variable_type", "source": "type"},
{"destination": "offset", "source": "offset", "default": "-"},
{"destination": "factor", "source": "lsb", "default": 1},
{"destination": "description", "source": "description"},
{"destination": "unit", "source": "unit", "default": "-"},
]
for raster in rasters:
external_signals = filter_signals(raster.insignals, internal)
sfw.add_signals(
external_signals,
"insignals",
properties_from_json,
)
sfw.add_signals(raster.outsignals, "outsignals", properties_from_json)
return sfw.to_model(interface)
def get_service_list(app):
"""Get service list from app
Args:
app (Application): Pybuild project
Returns:
(str): a string containing the translated service list
"""
translation_map = app.get_service_mapping()
cmake = ''
for proxy, service in translation_map.items():
lib = re.sub('-', '_', service + '_lib' + proxy + '_service_proxy').upper()
include = re.sub('-', '_', service + '_include_dir').upper()
cmake += f"LIST(APPEND extra_libraries ${{{lib}}})\n"
cmake += f"LIST(APPEND EXTRA_INCLUDE_DIRS ${{{include}}})\n"
return cmake
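# A small sketch (hypothetical mapping) of the CMake snippet built above;
# '-' becomes '_' and the variable names are upper-cased:
#
#     # translation_map == {'signal': 'my-service'} would yield:
#     # LIST(APPEND extra_libraries ${MY_SERVICE_LIBSIGNAL_SERVICE_PROXY})
#     # LIST(APPEND EXTRA_INCLUDE_DIRS ${MY_SERVICE_INCLUDE_DIR})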
class ServiceFramework(CspApi):
"""Service Framework abstraction layer"""
def __repr__(self):
"""String representation of SWFL"""
return (
f"<SWFL {self.name}"
f" app_side insignals: {len(self.signal_names['app']['insignals'])}"
f" app_side outsignals: {len(self.signal_names['app']['outsignals'])}>"
)
def get_map_file(self):
"""Get service translation map file
Returns:
(Path): service translation map file
"""
return self.base_application.get_services_file()
def get_map(self):
"""Get service translation map
Returns:
(dict): service translation map
"""
return self.base_application.get_service_mapping()
def check_endpoints(self):
"""Check and crash if signal endpoint contains both produces and consumes signals."""
endpoints = {}
for signal_name, signal_specs in self.translations.items():
if signal_name in self.signal_names["app"]['insignals']:
consumed = True
elif signal_name in self.signal_names["app"]['outsignals']:
consumed = False
else:
continue
for signal_spec in signal_specs:
endpoint = signal_spec[self.position.endpoint.value]
api = signal_spec[self.position.api.value]
key = (api, endpoint)
if key not in endpoints:
endpoints[key] = {
"consumed": consumed,
"signals": set()
}
endpoints[key]["signals"].add(signal_name)
assert consumed == endpoints[key]["consumed"], \
f"Signal endpoint {endpoint} for {api} contains both consumed and produced signals"
def extract_endpoint_definitions(self, raw):
"""Extract endpoint definitions from yaml file.
Args:
raw (dict): Raw yaml file
Returns:
(dict): Endpoint definitions
"""
self.parse_api_definitions(raw.get("service", {}))
@staticmethod
def extract_definition(definition):
"""Extract definition from yaml file.
Returns the properties and methods for a service.
Args:
definition (dict): Definition from yaml file
Returns:
(dict): Specifications for a service
"""
specifications = {}
specifications['properties'] = definition.get('properties', [])
specifications['methods'] = definition.get('methods', [])
return specifications
def to_model(self, client):
"""Method to generate dict to be saved as yaml
Args:
client (str): Name of the client in signal comm
Returns:
spec (dict): Signal model for using a service
"""
properties, types = self.properties_service_model(client)
descriptions = {
'internal': {
'brief': "Internal interface for associated application.",
'full': "This interface should only be used by the associated application."
},
'external': {
'brief': "External interface.",
'full': "This interface should be used by modules wanting to interact with the associated application."
},
'observer': {
'brief': "Read-only interface.",
'full': "This interface can be used by anyone wanting information from the associated application."
},
}
model = {"name": self.name,
"version": "${SERVICE_VERSION}",
"description": descriptions[client],
"properties": properties,
"types": types}
return model
def properties_service_model(self, client):
"""Generate properties and types for a service
Args:
client (str): Name of the client in signal comm
Returns:
(list): List of properties
(list): List of types
"""
accessors = {}
if client == 'internal':
accessors['insignals'] = 'r-'
accessors['outsignals'] = '-w'
elif client == 'external':
accessors['insignals'] = '-w'
accessors['outsignals'] = 'r-'
else:
accessors['insignals'] = 'r-'
accessors['outsignals'] = 'r-'
properties_in, types_in = self._properties_service_model(
self.signal_names["app"]["insignals"],
accessors['insignals'])
properties_out, types_out = self._properties_service_model(
self.signal_names["app"]["outsignals"],
accessors['outsignals'])
properties = properties_in + properties_out
types = types_in + types_out
return properties, types
def _specifications(self, signal_names):
""" Iterate over signal specifications for allowed services
Args:
signal_names (list): allowed signals
Yields:
specification (dict): Specification for a signal for an allowed service
"""
for _, spec in self._generator(signal_names, unique_names=False):
yield spec
def _properties_service_model(self, signal_names, accessors):
""" Placeholder
"""
properties = []
endpoint_members = {}
endpoint_types = {}
for signal_spec in self._specifications(signal_names):
interface = signal_spec[self.position.api.value]
if self.skip_interface(interface):
continue
endpoint = signal_spec[self.position.endpoint.value]
primitive = signal_spec[self.position.property_name.value]
if endpoint not in endpoint_members and primitive is not None:
endpoint_members[endpoint] = []
if primitive is not None:
if endpoint not in endpoint_types:
endpoint_types[endpoint] = {
'name': endpoint,
'kind': 'struct',
'description': {
'brief': endpoint,
"full": "Generated from project without custom description"
},
'members': []
}
endpoint_members[endpoint].append({
'name': primitive,
'type': primitive,
})
endpoint_types[endpoint]['members'].append({
'name': primitive,
'type': signal_spec[self.position.property_type.value],
})
else:
primitive_type = signal_spec[self.position.property_type.value]
primitive_desc = signal_spec[self.position.description.value]
primitive_unit = signal_spec[self.position.unit.value]
properties.append(
{
'name': endpoint,
'type': primitive_type,
'unit': primitive_unit,
'accessors': accessors,
'description': {
'brief': endpoint,
'full': primitive_desc,
}
}
)
for endpoint_name in sorted(endpoint_members):
properties.append(
{
'name': endpoint_name,
'type': endpoint_name,
'unit': 'struct',
'accessors': accessors,
'description': {
'brief': endpoint_name,
"full": "Generated from project without custom description"},
}
)
return_types = list(endpoint_types.values())
return properties, return_types
def skip_interface(self, interface):
""" Filter services not in list.
Args:
interface (str): interface name.
Returns:
skip (bool): Skip this interface
"""
if self.filter is None:
LOGGER.debug('No interface filter. Allowing everyone.')
return False
if interface in self.filter:
LOGGER.debug('%s is in %s', interface, self.filter)
return False
return True

View File

@ -0,0 +1,249 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Module to handle the Simulink interface."""
from pybuild.lib import logger
LOGGER = logger.create_logger("simulink")
def get_interface(interface_data_types, dp_interface, hal_interface, sfw_interface, method_interface):
""" Get interface combined for dp, hal and sfw
Args:
interface_data_types (dict): User defined interface data types
dp_interface (dict): DP interface
hal_interface (dict): HAL interface
sfw_interface (dict): SFW interface
method_interface (dict): Method interface
Returns:
output ([dict]): Combined interface
"""
output = []
output = add_dp(output, interface_data_types, split_interface(dp_interface))
output = add_api(output, interface_data_types, split_interface(hal_interface))
output = add_api(output, interface_data_types, split_interface(sfw_interface))
output = add_methods(output, interface_data_types, method_interface)
# Cannot have a completely empty adapters file, CSP will break
if not output:
if dp_interface:
output = ensure_raster(output, list(dp_interface.keys())[0] + '_Nothing_Always')
elif hal_interface:
output = ensure_raster(output, list(hal_interface.keys())[0] + '_Nothing_Always')
elif sfw_interface:
output = ensure_raster(output, list(sfw_interface.keys())[0] + '_Nothing_Always')
return output
def split_interface(interface):
"""Takes a raster interface and splits it based on read strategies.
Args:
interface (dict): DP/HAL/SFW raster interface.
Returns:
strategy_split_interface (dict): DP/HAL/SFW interface divided by read strategy.
"""
strategy_split_interface = {}
for raster, raster_data in interface.items():
for port_type, signals in raster_data.items():
for signal_spec in signals:
interface_name = signal_spec.get('api') or signal_spec.get('domain').replace('_', '')
new_raster = '_'.join([raster, interface_name, signal_spec['strategy']])
if new_raster not in strategy_split_interface:
strategy_split_interface[new_raster] = {p: [] for p in raster_data}
strategy_split_interface[new_raster][port_type].append(signal_spec)
return strategy_split_interface
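# A minimal sketch (hypothetical data) of the split done above: the raster
# name is suffixed with the interface name and the read strategy.
#
#     interface = {'Raster10ms': {'consumer': [
#         {'api': 'MyHal', 'strategy': 'Always', 'variable': 'x'},
#     ]}}
#     split_interface(interface)
#     # -> {'Raster10ms_MyHal_Always': {'consumer': [{...}]}}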
def ensure_raster(output, raster):
""" Ensure raster exists in the output
Args:
output ([dict]): Combined interface
raster (str): Name of raster
Returns:
output ([dict]): Combined interface
"""
for adapter in output:
if adapter['name'] == raster:
return output
output.append(
{
"name": raster,
"ports": {
"in": {},
"out": {}
}
}
)
return output
def get_adapter(interface, raster):
""" Get adapter
Args:
interface (dict): Combined interface
raster (str): Name of raster
Returns:
adapter (dict): Adapter for the raster already in the interface
"""
for adapter in interface:
if adapter['name'] == raster:
return adapter
raise KeyError(raster)
def add_dp(output, interface_data_types, interface):
""" Adds the DP interface to the combined interface
Args:
output ([dict]): Combined interface
interface_data_types (dict): User defined interface data types
interface (dict): DP interface
Returns:
output (dict): Combined interface
"""
ports = {
"consumer": "in",
"producer": "out"
}
for raster, raster_data in interface.items():
if not isinstance(raster_data, dict):
LOGGER.debug('Ignoring metadata: %s', raster_data)
continue
output = ensure_raster(output, raster)
adapter = get_adapter(output, raster)
for port_type, signals in raster_data.items():
for signal in signals:
data_type = signal['variable_type']
primitive = ['signals']
primitive.append(signal['domain'])
if signal['group'] is not None:
primitive.append(signal['group'])
primitive.append(signal['property'])
adapter['ports'][ports[port_type]][signal['variable']] = {
'primitive': '.'.join(primitive),
'type': data_type
}
if 'enums' in interface_data_types and data_type in interface_data_types['enums']:
csp_enum_definition = add_csp_enum_def(data_type, interface_data_types['enums'][data_type])
if 'types' not in adapter:
adapter['types'] = []
if csp_enum_definition not in adapter['types']:
adapter['types'].append(csp_enum_definition)
return output
def add_api(output, interface_data_types, interface):
""" Adds the HAL/GenericApi/SFW interface to the combined interface
Args:
output ([dict]): Combined interface
interface_data_types (dict): User defined interface data types
interface (dict): HAL interface
Returns:
output (dict): Combined interface
"""
ports = {
"consumer": "in",
"producer": "out"
}
for raster, raster_data in interface.items():
if not isinstance(raster_data, dict):
LOGGER.debug('Ignoring metadata: %s', raster_data)
continue
output = ensure_raster(output, raster)
adapter = get_adapter(output, raster)
for port_type, signals in raster_data.items():
for signal in signals:
data_type = signal['variable_type']
primitive = [signal['api'].lower()]
primitive.append(signal['endpoint'].lower())
if signal['property'] is not None:
primitive.append(signal['property'].lower())
adapter['ports'][ports[port_type]][signal['variable']] = {
'primitive': '.'.join(primitive),
'type': data_type
}
if 'enums' in interface_data_types and data_type in interface_data_types['enums']:
csp_enum_definition = add_csp_enum_def(data_type, interface_data_types['enums'][data_type])
if 'types' not in adapter:
adapter['types'] = []
if csp_enum_definition not in adapter['types']:
adapter['types'].append(csp_enum_definition)
return output
def add_methods(output, interface_data_types, methods):
""" Adds the CSP method call interfaces to the combined interface.
Args:
output ([dict]): Combined interface
interface_data_types (dict): Dict with enum definitions
methods (dict): Methods used by the application
Returns:
output (dict): Combined interface
"""
if methods == {}:
return output
output = ensure_raster(output, 'csp_methods')
adapter = get_adapter(output, 'csp_methods')
if 'methods' not in adapter:
adapter['methods'] = []
for method_data in methods.values():
method_spec = {}
method_spec['name'] = method_data['name']
method_spec['primitive_method'] = method_data['primitive']
method_spec['namespace'] = method_data['namespace']
if 'description' in method_data:
method_spec['description'] = method_data['description']
for direction in ['in', 'out']:
parameters = method_data['ports'][direction]
if len(parameters) > 0:
method_spec[direction] = []
for param_name, param_data in parameters.items():
param_spec = {}
param_spec['name'] = param_name
param_spec['primitive_name'] = param_data['primitive']
data_type = param_data['type']
param_spec['type'] = data_type
method_spec[direction].append(param_spec)
if data_type in interface_data_types.get('enums', {}):
csp_enum_definition = add_csp_enum_def(data_type, interface_data_types['enums'][data_type])
if 'types' not in adapter:
adapter['types'] = []
if csp_enum_definition not in adapter['types']:
adapter['types'].append(csp_enum_definition)
adapter['methods'].append(method_spec)
return output
def add_csp_enum_def(enum_name, interface_enum_definition):
""" Returns a CSP style enumeration definition, given an interface style enum definition.
Args:
enum_name (str): Name of the enumeration
interface_enum_definition (dict): Enumeration from interface data types
Returns:
csp_enum_definition (dict): CSP style enumeration definition
"""
default = {}
enumerators = []
for enum_member_definition in interface_enum_definition:
if 'default' in enum_member_definition:
default = {enum_member_definition['in']: enum_member_definition['out']}
else:
enumerators.append({enum_member_definition['in']: enum_member_definition['out']})
csp_enum_definition = {
'name': enum_name,
'kind': 'enum',
'enumerators': enumerators,
'default': default
}
return csp_enum_definition
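# A short sketch (hypothetical enum) of the conversion above: the member
# marked 'default' goes into 'default', the rest into 'enumerators'.
#
#     add_csp_enum_def('Gear', [
#         {'in': 'GEAR_NEUTRAL', 'out': 0, 'default': True},
#         {'in': 'GEAR_DRIVE', 'out': 1},
#     ])
#     # -> {'name': 'Gear', 'kind': 'enum',
#     #     'enumerators': [{'GEAR_DRIVE': 1}],
#     #     'default': {'GEAR_NEUTRAL': 0}}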

View File

@ -0,0 +1,173 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Module that handles the update of call sources."""
import argparse
import re
from ruamel.yaml import YAML
from pathlib import Path
def parse_args():
""" Parse arguments
Returns:
Namespace: the parsed arguments
"""
parser = argparse.ArgumentParser()
parser.add_argument("interface", help="Interface specification dict", type=Path)
parser.add_argument("src_dir", help="Path to source file directory", type=Path)
parser.add_argument(
"-p",
"--project-config",
type=Path,
default=None,
help="Path to project config json file",
)
return parser.parse_args()
def main():
""" Main function for stand alone execution."""
args = parse_args()
method_config = read_project_config(args.project_config)
with open(args.interface, encoding="utf-8") as interface_file:
yaml = YAML(typ='safe', pure=True)
adapter_spec = yaml.load(interface_file)
update_call_sources(args.src_dir, adapter_spec, method_config)
def read_project_config(project_config_path):
""" Reads project config file and extract method specific settings if they are present.
Args:
project_config_path (Path): path to the ProjectCfg.json file
Returns:
method_config (dict): dictionary of method related configs.
"""
project_info = {}
if project_config_path is not None:
with project_config_path.open() as config_file:
yaml = YAML(typ='safe', pure=True)
config = yaml.load(config_file)
project_info = config["ProjectInfo"]
method_config = {
"adapter_declarations_change": project_info.get("adapterDeclarations", None)
}
method_config["method_call_wrapper"] = {
"pre": project_info.get("methodPreCallWrapper", ""),
"post": project_info.get("methodPostCallWrapper", "")
}
for key, value in method_config["method_call_wrapper"].items():
if value != "":
method_config["method_call_wrapper"][key] = value + "\n"
return method_config
def is_method_adapter(adapter):
""" Check if adapter has methods in it
Args:
adapter (dict): adapter specification
Returns:
methods_in_adapter (bool): true if adapter contains methods,
false otherwise
"""
methods_in_adapter = "methods" in adapter and len(adapter["methods"]) > 0
return methods_in_adapter
def update_call_sources(src_dir, adapter_spec, method_config):
""" Update the source files for specified method calls with
adapter function calls that trigger the methods.
Args:
src_dir (Path): path to folder for method call sources
adapter_spec (list): adapter specifications with methods
method_config (dict): project specific method configs
"""
method_adapters = [a for a in adapter_spec if is_method_adapter(a)]
for adapter in method_adapters:
for method in adapter["methods"]:
method_src = src_dir / (method["name"] + ".c")
with method_src.open("r+") as src_file:
old_src = src_file.read()
new_src = generate_src_code(adapter, method, old_src, method_config)
method_src.unlink()
with open(method_src.with_suffix(".cpp"), "w", encoding="utf-8") as dst_file:
dst_file.write(new_src)
method_header = src_dir / (method["name"] + ".h")
with method_header.open("r+") as header_file:
old_header = header_file.read()
new_header = generate_header_code(method_config, old_header)
header_file.seek(0)
header_file.write(new_header)
header_file.truncate()
def generate_header_code(method_config, old_header):
""" Change header code to include project specific adapter wrapper.
Args:
method_config (dict): project specific method settings
old_header (string): header source code
Returns:
new_header (string): modified header source code
"""
adapter_pattern = r'(?<=#include ")adapter_wrapper\.hh'
if method_config["adapter_declarations_change"]:
new_header = re.sub(
adapter_pattern, method_config["adapter_declarations_change"], old_header
)
return new_header
return old_header
def generate_src_code(adapter, method, old_src, method_config):
""" Generate the method call source code to trigger the method call
Args:
adapter (dict): adapter specification
method (dict): method specification
old_src (string): method call source code with mock call
method_config (dict): project specific method settings
Returns:
new_src (string): method call source code with adapter function call
that triggers method
"""
call_pattern = (
r" \/\* CSP method call\n"
r"(.*)ADAPTER->METHOD(.*)"
r" CSP method call end\*\/\n"
)
dummy_pattern = (
r" \/\* C dummy call\*\/" r".*" r" \/\* C dummy call end\*\/\n"
)
comment = (
r"\/\* Used for running spm without csp, such as silver\n"
r" This code should be replaced when using csp. \*\/\n"
)
name = method["name"]
adapter_name = [adapter["name"].title()]
namespace = []
if method["namespace"] != "":
namespace = [method["namespace"]]
adapter = "::".join(namespace + adapter_name) + "->" + name
pre_call = method_config["method_call_wrapper"]["pre"]
post_call = method_config["method_call_wrapper"]["post"]
sans_comment = re.sub(comment, "", old_src)
sans_dummy_call = re.sub(dummy_pattern, "", sans_comment, flags=re.DOTALL)
new_src = re.sub(
call_pattern, pre_call + r"\1" + adapter_call + r"\2" + post_call, sans_dummy_call, flags=re.DOTALL
)
return new_src
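# A rough sketch (hypothetical content) of the rewrite done above. Given a
# method 'foo' in adapter 'adapter' with namespace 'ns', a source block like
#
#     /* CSP method call
#     ADAPTER->METHOD(arg);
#     CSP method call end*/
#
# is rewritten (plus any configured pre/post wrappers) to:
#
#     ns::Adapter->foo(arg);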
if __name__ == "__main__":
main()

View File

@ -0,0 +1,179 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Module for handling the update of model yaml files."""
import argparse
from pathlib import Path
from ruamel.yaml import YAML
from pybuild.interface.application import Application
from pybuild.interface.base import BaseApplication
class BadYamlFormat(Exception):
"""Exception to raise when signal is not in/out signal."""
def __init__(self, signal):
self.message = f"{signal} is not in-signal or out-signal."
class UpdateYmlFormat(BaseApplication):
"""Class to handle the update of model yaml files."""
def __init__(self, base_application):
self.base_application = base_application
self.raw = {}
self.app_insignals = self.get_insignals_name()
self.app_outsignals = self.get_outsignals_name()
def read_translation(self, translation_file):
"""Read specification of the yaml file.
Args:
translation_file (Path): file with specs.
"""
if not translation_file.is_file():
return
with open(translation_file, encoding="utf-8") as translation:
yaml = YAML(typ='safe', pure=True)
self.raw = yaml.load(translation)
def parse_groups(self, signal_groups):
"""Parse signal groups.
Args:
signal_groups (dict): Hal/dp signal group in yaml file.
"""
for interface_type, group_definitions in signal_groups.items():
for group in group_definitions:
for signals in group.values():
self.check_signals({interface_type: signals})
def parse_hal_definitions(self, hal):
"""Parse hal.
Args:
hal (dict): hal in yaml file.
"""
self.parse_groups(hal)
def parse_service_definitions(self, service):
"""Parse service.
Args:
service (dict): service in yaml file.
"""
for interface, definition in service.items():
for apis in definition.get('properties', []):
for signals in apis.values():
self.check_signals({interface: signals})
def check_signals(self, signals_definition):
"""check signal direction(in-signal or out-signal).
Args:
signals_definition (dict): parsed yaml file.
"""
for specifications in signals_definition.values():
for specification in specifications:
in_out_signal = [key for key in specification.keys() if 'signal' in key]
if in_out_signal == []:
raise ValueError(f"signal is not defined for property: {specification['property']} !")
if "in" in in_out_signal[0]:
if specification[in_out_signal[0]] not in self.app_insignals:
if specification[in_out_signal[0]] in self.app_outsignals:
specification["outsignal"] = specification[in_out_signal[0]]
else:
raise BadYamlFormat(specification[in_out_signal[0]])
if "out" in in_out_signal[0]:
if specification[in_out_signal[0]] not in self.app_outsignals:
if specification[in_out_signal[0]] in self.app_insignals:
specification["insignal"] = specification[in_out_signal[0]]
else:
raise BadYamlFormat(specification[in_out_signal[0]])
if in_out_signal[0] == "signal":
if specification["signal"] in self.app_insignals:
specification["insignal"] = specification["signal"]
del specification["signal"]
elif specification["signal"] in self.app_outsignals:
specification["outsignal"] = specification["signal"]
del specification["signal"]
else:
raise BadYamlFormat(specification[in_out_signal[0]])
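# A minimal sketch (hypothetical signal) of the normalisation above: a plain
# 'signal' key is rewritten to 'insignal' or 'outsignal' depending on which
# list of application signals contains it.
#
#     spec = {'property': 'bar', 'signal': 'sVcFoo_D_Bar'}
#     # if 'sVcFoo_D_Bar' is in self.app_insignals, spec becomes
#     # {'property': 'bar', 'insignal': 'sVcFoo_D_Bar'}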
def get_insignals_name(self):
"""Base application in-signals.
"""
app_insignals = []
for signal in self.base_application.insignals:
app_insignals.append(signal.name)
return app_insignals
def get_outsignals_name(self):
"""Base application out-signals.
"""
app_outsignals = []
for signal in self.base_application.outsignals:
app_outsignals.append(signal.name)
return app_outsignals
def parse_definition(self, definition):
"""Parses all definition files
Args:
definition (list(Path)): Definition files
"""
for translation in definition:
self.read_translation(translation)
self.check_signals(self.raw.get("signals", {}))
self.parse_groups(self.raw.get("signal_groups", {}))
self.parse_hal_definitions(self.raw.get("hal", {}))
self.parse_service_definitions(self.raw.get("service", {}))
self.write_to_file(translation)
def write_to_file(self, translation):
"""Write in/out signals to file
Args:
spec (dict): Specification
output (Path): File to write
"""
with open(translation, "w", encoding="utf-8") as new_file:
yaml = YAML()
yaml.dump(self.raw, new_file)
def get_app(config):
""" Get an app specification for the current project
Args:
config (pathlib.Path): Path to the ProjectCfg.json.
Returns:
app (Application): pybuild project.
"""
app = Application()
app.parse_definition(config)
return app
def parse_args():
"""Parse command line arguments
Returns:
(Namespace): Parsed command line arguments
"""
parser = argparse.ArgumentParser()
parser.add_argument("config", help="The SPA2 project config file", type=Path)
return parser.parse_args()
def main():
"""Main function for update model yaml."""
args = parse_args()
app = get_app(args.config)
translation_files = app.get_translation_files()
uymlf = UpdateYmlFormat(app)
uymlf.parse_definition(translation_files)
if __name__ == "__main__":
main()

View File

@ -0,0 +1,350 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Python module used for handling zone controller specifications"""
from ruamel.yaml import YAML
from pybuild.lib import logger
from pybuild.interface.base import BaseApplication
LOGGER = logger.create_logger("base")
class BadYamlFormat(Exception):
"""Exception to raise when in/out signal is not defined."""
def __init__(self, message):
self.message = message
super().__init__(message)
class ZCAL(BaseApplication):
"""Zone controller abstraction layer"""
def __repr__(self):
"""String representation of ZCAL"""
return (
f"<ZCAL {self.name}"
f" app_side insignals: {len(self.signal_names['other']['insignals'])}"
f" app_side outsignals: {len(self.signal_names['other']['outsignals'])}>"
)
def __init__(self, base_application):
"""Create the interface object
Currently, there is no verification that the signal is used.
TODO: Check if the signal is a set or a get somewhere. Maybe in here.
Args:
base_application (BaseApplication): Primary object of an interface
Usually a raster, but can be an application or a model too.
"""
self.name = ""
self.zc_translations = {}
self.composition_spec = {}
self.signal_names = {}
self.e2e_sts_signals = set()
self.update_bit_signals = set()
self.clear_signal_names()
self.interface_enums = set()
self.base_application = base_application
self.translations_files = []
self.device_domain = base_application.get_domain_mapping()
self.signal_primitives_list = set()
@staticmethod
def read_translation_files(translation_files, keys=None):
""" Searches translation files (yaml) for given keys at top level. Raises an
error if conflicting configurations are found among the files. Return a dict
containing the aggregated data found in the translation files for the keys
provided in the function call.
Args:
translation_files (list): List of paths to files to search.
keys (list): List of keys to search among translation files for (all present keys by default).
Returns:
specs (dict): Dictionary containing provided keys with the aggregated data
stored under those keys in translation files.
"""
specs = {}
for translation_file in translation_files:
with open(translation_file, encoding="utf-8") as translation:
yaml = YAML(typ='safe', pure=True)
raw = yaml.load(translation)
used_keys = raw.keys() if keys is None else keys
for key in used_keys:
new_elements = raw.get(key, None)
if isinstance(new_elements, list):
specs[key] = specs.get(key, []) + new_elements
elif isinstance(new_elements, dict):
specs[key] = specs.get(key, {})
if {**specs[key], **new_elements} != {**new_elements, **specs[key]}:
LOGGER.error(
"Conflicting configuration found for key '%s' among translation files: %s",
key,
translation_files,
)
specs[key].update(new_elements)
return specs
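# A minimal sketch (hypothetical files) of the aggregation above: list values
# are concatenated, dict values are merged, and conflicting dict entries are
# logged as errors.
#
#     # a.yaml: signals: [{insignal: s1}]
#     # b.yaml: signals: [{insignal: s2}]
#     ZCAL.read_translation_files(['a.yaml', 'b.yaml'])
#     # -> {'signals': [{'insignal': 's1'}, {'insignal': 's2'}]}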
def add_signals(self, signals, signal_type="insignal", *args):
"""Add signal names and properties
Args:
signals (list(Signals)): Signals to use
signal_type (str): 'insignals' or 'outsignals'
"""
type_in_zc = "insignals" if signal_type == "outsignals" else "outsignals"
for signal in signals:
if signal.name not in self.zc_translations:
continue
LOGGER.debug("Adding signal: %s", signal)
for translation in self.zc_translations[signal.name]:
signal_property = translation["property"]
struct_name = translation["struct_name"]
self.check_signal_property(struct_name, signal_property, type_in_zc)
self.signal_names["zc"][type_in_zc].add(signal_property)
self.signal_names["other"][signal_type].add(signal.name)
for e2e_sts_signal_name in self.e2e_sts_signals:
if e2e_sts_signal_name not in self.signal_names["other"]["insignals"]:
LOGGER.warning("E2E check signal %s not used in any model.", e2e_sts_signal_name)
else:
self.signal_names["other"][signal_type].add(e2e_sts_signal_name)
for update_bit_signal_name in self.update_bit_signals:
if update_bit_signal_name not in self.signal_names["other"]["insignals"]:
LOGGER.warning("Update bit signal %s not used in any model.", update_bit_signal_name)
else:
self.signal_names["other"][signal_type].add(update_bit_signal_name)
LOGGER.debug("Registered signal names: %s", self.signal_names)
def check_signal_property(self, struct_name, property_name, signal_type):
"""Check if we have only one signal written for the same property.
Args:
struct_name (str): signal struct name
property_name (str): signal property
signal_type (str): insignal or outsignal
"""
signal_primitive_spec = ".".join([struct_name, property_name, signal_type])
if signal_primitive_spec in self.signal_primitives_list:
error_msg = (f"You can't write {property_name} in {struct_name} as"
f" {signal_type} since this primitive has been used."
" Run model_yaml_verification to identify exact models.")
raise Exception(error_msg)
self.signal_primitives_list.add(signal_primitive_spec)
def parse_definition(self, definition):
"""Parses all translation files and populates class interface data.
Args:
definition (list(Path)): Definition files
"""
raw = self.read_translation_files(definition)
self.composition_spec = {
key: raw[key] for key in ("port_interfaces", "data_types", "calls") if key in raw
}
ports_info = {}
for port_name, port in raw.get("ports", {}).items():
signal_struct = port.get("element", {})
if signal_struct:
self.populate_signal_translations(signal_struct)
ports_info[port_name] = {
**self.get_port_info(signal_struct),
"interface": port.get("interface")
}
else:
raise Exception(f"Port {port_name} has no element.")
self.composition_spec["ports"] = ports_info
@staticmethod
def get_port_info(signal_struct):
"""Extract port information from signal elements in port. Raises exception
if signal elements are not exclusively sent in one direction.
Args:
signal_struct (dict): Signal dict containing list of signal elements
Returns:
port_info (dict): Dict containing port direction and if any elements
should have an update bit associated with them.
"""
struct_name = list(signal_struct.keys())[0]
signal_elements = signal_struct[struct_name]
update_elements = set()
direction = None
for element in signal_elements:
if "insignal" in element:
temp_dir = "IN"
elif "outsignal" in element:
temp_dir = "OUT"
else:
raise BadYamlFormat(f"in/out signal for element in { struct_name } is missing.")
if direction is not None and direction != temp_dir:
raise BadYamlFormat(f"Signal { struct_name } has both in and out elements.")
direction = temp_dir
if element.get("updateBit", False):
update_elements.add(struct_name)
port_info = {"direction": direction}
if update_elements:
port_info["enable_update"] = list(update_elements)
return port_info
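# A small sketch (hypothetical struct) of the extraction above:
#
#     ZCAL.get_port_info({'MyStruct': [
#         {'insignal': 'sVcFoo_D_Bar', 'property': 'bar', 'updateBit': True},
#     ]})
#     # -> {'direction': 'IN', 'enable_update': ['MyStruct']}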
def populate_signal_translations(self, struct_specifications):
"""Populate class translations data.
Args:
struct_specifications (dict): Dict with signal structs to/from a port.
"""
enumerations = self.base_application.enumerations
struct_name = list(struct_specifications.keys())[0]
signal_definitions = struct_specifications[struct_name]
for signal_definition in signal_definitions:
if "insignal" in signal_definition:
signal_name = signal_definition["insignal"]
base_signals = self.base_application.insignals
elif "outsignal" in signal_definition:
signal_name = signal_definition["outsignal"]
base_signals = self.base_application.outsignals
else:
raise BadYamlFormat(f"in/out signal for { signal_name } is missing.")
base_properties = None
for base_signal in base_signals:
if signal_name == base_signal.name:
matching_base_signal = base_signal
base_properties = self.base_application.get_signal_properties(
matching_base_signal
)
if base_properties is None:
raise BadYamlFormat(f"in/out signal for { signal_name } is missing.")
if base_properties["type"] in enumerations:
if 'init' in signal_definition:
init_value = signal_definition["init"]
else:
if enumerations[base_properties['type']]['default_value'] is not None:
init_value = enumerations[base_properties['type']]['default_value']
else:
LOGGER.warning('Initializing enumeration %s to "zero".', base_properties['type'])
init_value = [
k for k, v in enumerations[base_properties['type']]['members'].items() if v == 0
][0]
else:
init_value = signal_definition.get("init", 0)
update_bit = signal_definition.get("updateBit", False)
e2e_status = signal_definition.get("e2eStatus", False)
group = signal_definition.get("group", struct_name)
translation = {
"range": {
"min": base_properties.get("min", "-"),
"max": base_properties.get("max", "-")
},
"offset": base_properties.get("offset", "-"),
"factor": base_properties.get("factor", "-"),
"property": signal_definition["property"],
"init": init_value,
"struct_name": struct_name,
"variable_type": base_properties.get("type"),
"description": base_properties.get("description"),
"unit": base_properties.get("unit"),
"debug": base_properties.get("debug", False),
"dependability": e2e_status,
"update_bit": update_bit
}
if signal_name not in self.zc_translations:
self.zc_translations[signal_name] = []
self.zc_translations[signal_name].append(translation)
if update_bit:
update_bit_property = f"{struct_name}UpdateBit"
update_signal_name = f"yVc{group}_B_{update_bit_property}"
if signal_name == update_signal_name:
error_msg = f"Don't put updateBit status signals ({update_signal_name}) in yaml interface files."
raise BadYamlFormat(error_msg)
self.zc_translations[update_signal_name] = [
{
"property": update_bit_property,
"variable_type": "Bool",
"property_interface_type": "Bool",
"offset": "-",
"factor": "-",
"range": {
"min": "-",
"max": "-",
},
"init": 1,
"struct_name": struct_name,
"description": f"Update bit signal for signal {struct_name}.",
"unit": None,
"dependability": False,
"update_bit": True
}
]
self.update_bit_signals.add(update_signal_name)
if e2e_status:
if "outsignal" in signal_definition:
error_msg = "E2e status not expected for outsignals"
raise BadYamlFormat(error_msg)
e2e_sts_property = f"{struct_name}E2eSts"
e2e_sts_signal_name = f"sVc{group}_D_{e2e_sts_property}"
if signal_name == e2e_sts_signal_name:
error_msg = f"Don't put E2E status signals ({e2e_sts_signal_name}) in yaml interface files."
raise BadYamlFormat(error_msg)
if e2e_sts_signal_name not in self.zc_translations:
self.zc_translations[e2e_sts_signal_name] = [
{
"property": e2e_sts_property,
"variable_type": "UInt8",
"property_interface_type": "UInt8",
"offset": 0,
"factor": 1,
"range": {
"min": 0,
"max": 255
},
"init": 255,
"struct_name": struct_name,
"description": f"E2E status code for E2E protected signal(s) {signal_name}.",
"unit": None,
"debug": False,
"dependability": True,
"update_bit": False
}
]
self.e2e_sts_signals.add(e2e_sts_signal_name)
else:
translation = self.zc_translations[e2e_sts_signal_name][0]
translation["description"] = translation["description"][:-1] + f", {signal_name}."
def clear_signal_names(self):
"""Clear signal names
Clears defined signal names (but not signal properties).
"""
self.signal_names = {
"zc": {"insignals": set(), "outsignals": set()},
"other": {"insignals": set(), "outsignals": set()}
}
def to_dict(self):
"""Method to generate dict to be saved as yaml
Returns:
spec (dict): Signalling specification
"""
spec = {"consumer": [], "producer": []}
signal_roles = ["consumer", "producer"]
signal_types = ["insignals", "outsignals"]
for signal_role, signal_type in zip(signal_roles, signal_types):
for signal_name in self.signal_names["other"][signal_type]:
for signal_spec in self.zc_translations[signal_name]:
spec[signal_role].append({**signal_spec, "variable": signal_name})
return spec

4
pybuild/lib/__init__.py Normal file
View File

@ -0,0 +1,4 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""pybuild.lib."""

View File

@ -0,0 +1,127 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Module for various helper functions."""
import json
import collections
from pathlib import Path
from subprocess import getoutput
def get_repo_root():
""" Return absolute path to repository where script is executed, regardless
of the current script's location.
Returns:
path (str): Absolute, canonical path to the top-level repository.
If not a git repository, returns the current working directory.
"""
# getoutput does not raise on failure; it returns git's error text instead.
output = getoutput('git rev-parse --show-toplevel')
root = Path(output).resolve()
if output.startswith('fatal') or not root.is_dir():
root = Path.cwd().resolve()
return str(root)
def create_dir(path: Path):
"""If the directory for a given directory path does not exist, create it.
Including parent directories.
Args:
path (Path): Path to directory.
Returns:
path (Path): Path to directory.
"""
if not path.is_dir():
path.mkdir(parents=True)
return path
def merge_dicts(dict1, dict2, handle_collision=lambda a, b: a, merge_recursively=False):
"""Merge two dicts.
Args:
dict1 (dict): dict to merge
dict2 (dict): dict to merge
handle_collision (function(arg1, arg2)): function to resolve key collisions,
default keeps original value in dict1
merge_recursively (bool): if set to True it merges nested dicts recursively with handle_collision
resolving collisions with non-dict types.
Returns:
dict: the result of the two merged dicts
"""
result = dict1.copy()
for key, value in dict2.items():
if key not in result:
result.update({key: value})
elif isinstance(result[key], dict) and \
isinstance(value, dict) and \
merge_recursively:
result[key] = merge_dicts(result[key], value, handle_collision, merge_recursively)
else:
result[key] = handle_collision(result[key], value)
return result
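# A short usage sketch of merge_dicts:
#
#     merge_dicts({'a': 1, 'b': {'x': 1}}, {'a': 2, 'b': {'y': 2}},
#                 handle_collision=lambda a, b: b, merge_recursively=True)
#     # -> {'a': 2, 'b': {'x': 1, 'y': 2}}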
def deep_dict_update(base, add):
"""Recursively update a dict that may contain sub-dicts.
Args:
base (dict): The base dict will be updated with the contents
of the add dict
add (dict): This dict will be added to the base dict
Returns:
dict: the updated base dict is returned
"""
for key, value in add.items():
if key not in base:
base[key] = value
elif isinstance(value, dict):
deep_dict_update(base[key], value)
return base
def deep_json_update(json_file, dict_to_merge):
""" Recursively update a json file with the content of a dict.
Args:
json_file (path): json file.
dict_to_merge (dict): Dictionary that will be merged into json file, overwriting values and adding keys.
"""
with open(json_file, 'r', encoding="utf-8") as fle:
json_dict = json.load(fle)
merged_dict = merge_dicts(
json_dict,
dict_to_merge,
handle_collision=lambda a, b: b,
merge_recursively=True
)
with open(json_file, 'w', encoding="utf-8") as fle:
json.dump(merged_dict, fle, indent=2)
def recursive_default_dict():
"""Returns recursively defined default dict. This allows people to insert
arbitrarily complex nested data into the dict without getting KeyErrors.
Returns:
defaultdict(self): A new defaultdict instance, recursively defined.
"""
return collections.defaultdict(recursive_default_dict)
def to_normal_dict(weird_dict):
"""Converts nested dict to normal, suitable for YAML/JSON dumping.
Args:
weird_dict (dict): Any dict-like item that can be converted to a dict.
Returns:
dict(nested): An identical nested dict structure, but using real dicts.
"""
for key, value in weird_dict.items():
if isinstance(value, dict):
weird_dict[key] = to_normal_dict(value)
return dict(weird_dict)
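# A short usage sketch of the two helpers above:
#
#     data = recursive_default_dict()
#     data['a']['b']['c'] = 1   # no KeyError at any level
#     to_normal_dict(data)      # -> {'a': {'b': {'c': 1}}}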

55
pybuild/lib/logger.py Normal file
View File

@ -0,0 +1,55 @@
# Copyright 2024 Volvo Car Corporation
# Licensed under Apache 2.0.
"""Module for logging."""
import os
import sys
import logging
LEVEL_NOTSET = 'NOTSET'
LEVEL_DEBUG = 'DEBUG'
LEVEL_INFO = 'INFO'
LEVEL_WARNING = 'WARNING'
LEVEL_ERROR = 'ERROR'
LEVEL_CRITICAL = 'CRITICAL'
LEVELS = {
LEVEL_NOTSET: logging.NOTSET,
LEVEL_DEBUG: logging.DEBUG,
LEVEL_INFO: logging.INFO,
LEVEL_WARNING: logging.WARNING,
LEVEL_ERROR: logging.ERROR,
LEVEL_CRITICAL: logging.CRITICAL
}
def parse_log_level(log_level_name):
"""Convert textual log_level_name to numerical ones defined in logging module."""
level = log_level_name.upper()
if level not in LEVELS:
print(f'Log level "{log_level_name}" invalid, valid list: {", ".join(LEVELS.keys())}', file=sys.stderr)
level = LEVEL_DEBUG
return LEVELS[level]
def create_logger(name, handler=None, log_format=None):
"""Create a logger.
If the handler already has a log format, it will be replaced.
Args:
name (str): Name of the logger
handler (obj): Handler for the logger. Default used if not supplied.
log_format (str): Format for the handler. Default used if not supplied.
Returns:
logger (obj): A logger with a handler and log format
"""
new_logger = logging.getLogger(name)
new_logger.setLevel(parse_log_level(os.getenv('LOG_LEVEL', LEVEL_INFO)))
if handler is None:
handler = logging.StreamHandler(sys.stdout)
if log_format is None:
log_format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'
handler.setFormatter(logging.Formatter(log_format))
new_logger.addHandler(handler)
return new_logger
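# A short usage sketch; the level is taken from the LOG_LEVEL environment
# variable (INFO by default):
#
#     LOGGER = create_logger('my_module')
#     LOGGER.info('hello')
#     # e.g. "2024-05-28 13:05:22,000 - my_module - INFO - hello"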

View File

@ -0,0 +1,78 @@
% Copyright 2024 Volvo Car Corporation
% Licensed under Apache 2.0.
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%
% Author: Henrik Wahlqvist
% Date: 31-01-2019
% Purpose: This is for automatically generating c-files from MATLAB models.
% This program can also be used as daily PyBuild code generation.
%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function BuildAutomationPyBuild(mode, exitOnFailure, modelList)
try
addpath(genpath([pwd '/Projects']));
addpath(genpath([pwd '/matlab-scripts']));
initFile = 'Init_PyBuild(false);';
disp(['Running init file: ' initFile])
evalin('base', initFile);
%Update all models unless a list is provided:
if ~exist('modelList', 'var')
disp('Updating all models...');
modelList = gatherAllModels();
end
updateModels(mode, pwd, modelList);
disp('Done.');
catch err
disp(getReport(err))
if exitOnFailure
quit force;
end
end
if exitOnFailure
exit;
end
bdclose Vcc_Lib;
end
function models = gatherAllModels()
% Function for gathering all models in the repo.
startdir = pwd;
models = {};
modelsFolder = [startdir '/Models/'];
env_ssp = getenv('SSP');
if isempty(env_ssp)
disp('ALL models')
ssps = dir([modelsFolder '*']);
else
ssp_dir = [modelsFolder env_ssp '*'];
disp(['All models in ' ssp_dir])
ssps = dir(ssp_dir);
end
for i=1:length(ssps)
if ~ssps(i).isdir
continue;
end
ssp = ssps(i).name;
currSspFolder = [modelsFolder ssp '/'];
cd(currSspFolder)
modelsInSsp = dir('Vc*');
for j=1:length(modelsInSsp)
% Make sure it is a directory
if ~modelsInSsp(j).isdir
continue;
end
model = modelsInSsp(j).name;
% Make sure the directory contains an .mdl file
if isfile([model '/' model '.mdl'])
models = [models ['Models/' ssp '/' model '/' model '.mdl']];
end
end
cd(startdir);
end
end

View File

@ -0,0 +1,125 @@
% Copyright 2024 Volvo Car Corporation
% Licensed under Apache 2.0.
function Generate_A2L(CodegenFunction, A2lStylesheetName, model_ending)
if nargin < 3
model_ending = '_Tmp';
end
fprintf('\n\nGeneration of %s started\n\n', [CodegenFunction '.a2l']);
workspace = getenv('WORKSPACE');
tl_config_dir = fullfile([workspace, '/ConfigDocuments/TL4_3_settings']);
if exist(tl_config_dir, 'dir')
TargetInfoDir = fullfile([tl_config_dir '/TargetInfo']);
TargetConfigDir = fullfile([tl_config_dir '/TargetConfig']);
else
TargetInfoDir = fullfile([getenv('TL_ROOT') '/Matlab/TL/ApplicationBuilder/BoardPackages/HostPC64/MSVC']);
TargetConfigDir = fullfile([getenv('TL_ROOT') '/Matlab/TL/TargetConfiguration/x86_64/MSVC']);
end
ApplicationName = [CodegenFunction model_ending];
BuildName = ['Build' CodegenFunction model_ending];
dsdd_manage_build('Create',...
'Name', BuildName,...
'Application', ApplicationName,...
'TargetInfoDir', TargetInfoDir,...
'TargetConfigDir', TargetConfigDir);
enumDataTypesObj = dsdd('CreateEnumDataTypes', ['/' ApplicationName '/' BuildName]);
enumNames = GetUsedEnumerations(CodegenFunction);
for idx = 1:length(enumNames)
enumName = enumNames{idx};
underlyingDataType = CalculateUnderlyingDataType(CodegenFunction, enumName);
[hDDTypedef, ~, ~] = tlEnumDataType('CreateDDTypedef',...
'SLEnumType', enumName,...
'TypedefGroup', ['/' ApplicationName '/' BuildName '/EnumDataTypes'],...
'CreateTemplate', 'off');
dsdd('SetUnderlyingEnumDataType', hDDTypedef, underlyingDataType);
dsdd('SetTag', hDDTypedef, [enumName '_tag']);
end
dsdd_export_a2l_file(...
'Application', ApplicationName,...
'Build', BuildName,...
'TargetInfoDir', TargetInfoDir,...
'TargetConfigDir', TargetConfigDir,...
'StyleSheet', A2lStylesheetName,...
'File', [CodegenFunction '.a2l'],...
'ProjectFrame', 'off',...
'OverwriteCalProperties', 'on',...
'MergeA2lModules', 'on',...
'asap2version', '1.40',...
'UseLookupStructures', 'off',...
'ExternalVariables', 'on',...
'UseUnderlyingEnumDataTypeInfo', 'on');
% Delete created enums, these can cause problems for upcoming code generation calls on models
% using the same enumerations.
dsdd('Delete', enumDataTypesObj);
fprintf('\n\nGeneration of %s finished\n\n', [CodegenFunction '.a2l']);
end
function enumerations = GetUsedEnumerations(modelName)
%GETUSEDENUMERATIONS Get all enumeration names used in a given model.
unsupportedByTLAPI = {'TL_DummyTrigger', 'TL_Enable', 'TL_Function', 'TL_SimFrame'};
tlBlocksTmp = find_system(modelName, 'FindAll', 'on', 'LookUnderMasks', 'All', 'RegExp', 'on', 'MaskType', 'TL_*');
maskTypesTmp = get_param(tlBlocksTmp, 'MaskType');
supportedTlBlocks = tlBlocksTmp(ismember(maskTypesTmp, unsupportedByTLAPI)==0);
[allDataTypes, invalidIndices, ~] = tl_get(supportedTlBlocks, 'output.type');
validTlBlocks = supportedTlBlocks(~invalidIndices);
validDataTypes = allDataTypes(~invalidIndices);
enumBlocks = validTlBlocks(strcmp(validDataTypes, 'IMPLICIT_ENUM'));
unitsTmp = tl_get(enumBlocks, 'output.unit');
if ~iscell(unitsTmp)
unitsTmp = {unitsTmp};
end
if ~isempty(unitsTmp)
units = unitsTmp(cellfun(@(x) ~isempty(x), unitsTmp));
enumerations = unique(erase(units, ['-', ',', '$']));
else
enumerations = {};
end
end
function underlyingDataType = CalculateUnderlyingDataType(modelName, enumName)
%CALCULATEUNDERLYINGDATATYPE Calculate best fitting data type given a name of an enumeration.
% This is done by finding min and max values, then choosing a fitting data type, as small as possible.
enumMemberNames = enumeration(enumName);
if isempty(enumMemberNames)
error([...
'Cannot get enum name %s from model %s.',...
'Enum name should match the unit in applicable blocks, e.g. -,$EnumName.'...
], enumName, modelName);
end
% int32 is super class of Simulink.IntEnumType
enumMemberValues = int32(enumMemberNames);
minValue = min(enumMemberValues);
maxValue = max(enumMemberValues);
% TODO Consider forcing signed like CS (or ARXML, from database?) does, seems to be int8 specifically
underlyingDataType = '';
if minValue >= 0
if maxValue <= 255
underlyingDataType = 'UInt8';
elseif maxValue <= 65535
underlyingDataType = 'UInt16';
end
elseif minValue >= -128
if maxValue <= 127
underlyingDataType = 'Int8';
elseif maxValue <= 32767
underlyingDataType = 'Int16';
end
elseif minValue >= -32768
if maxValue <= 32767
underlyingDataType = 'Int16';
end
end
if isempty(underlyingDataType)
error(...
'Unhandled enum size, name: %s, min: %d, max: %d. Valid types are uint8/16 and int8/16',...
enumName, minValue, maxValue);
end
end

Some files were not shown because too many files have changed in this diff.