Updated documentation

Change-Id: I65c140c3ab156406ba8a6ee18eba60692438e4b4
Henrik Wahlqvist 2024-09-20 16:42:37 +02:00
parent c1e138cb47
commit 635045a3f2
39 changed files with 382 additions and 757 deletions


@ -1,39 +1,37 @@
# powertrain-build
A Continuous Integration (CI) build system, testing all configurations where a TargetLink model is used.
## General Information about powertrain-build
- powertrain-build is fast.
- More parallelization of jobs in the CI system makes it faster.
- Code generation is moved to the developer's PC.
- Code generation is done once for all projects using pre-processor directives.
- C code reviews are now possible in Gerrit.
- powertrain-build adds signal consistency checks.
- Unit tests of the build system are introduced.
- Its quality is assured.
- powertrain-build creates new variable classes with unique code decorations.
- Post-processing C code is not necessary.
- ASIL-classed variables get declared at the source.
- Memory can be optimized at compilation by using short addressing for different variable classes.
- The same models can be used with more than two different suppliers, for instance SPA2's Core System Platform (CSP).
- powertrain-build fixes incorrect handling of NVM variables.
## Project Structure
- `docs/`: This directory holds all the extra documentation about the project.
- `playbooks/`: Directory where we keep Ansible playbooks that are executed in the jobs we use in this project.
- `powertrain_build/`: Main directory of the project. All the application source code is kept here. It is divided into different Python modules:
  - `interface/`
  - `lib/`
  - `zone_controller/`
  - `templates/`: Template `.html` files.
- `matlab_scripts/`: Collection of m-scripts which can be used for generating powertrain-build compatible source code from Simulink models.
- `roles/`: Directory where we keep Ansible roles that are executed in the jobs we use in this project.
- `test_data/`: Directory where we keep test data for the unit tests.
- `tests/`: Directory where we keep the unit tests for our application source code. The tests are structured in a similar way to what we have inside the `powertrain_build/` directory. Tests for the `interface`, `lib`, and `zone_controller` modules are split into `tests/interface/`, `tests/lib/`, and `tests/zone_controller/`, respectively. Other tests are kept inside the `tests/powertrain_build/` directory.
- `zuul.d/`: Directory where we keep our Zuul jobs.
## How to use powertrain-build


@ -0,0 +1,22 @@
# Pre Processor Directives
powertrain-build uses pre-processor directives instead of code switches.
This means that all code is generated, but the compiler chooses what to compile
depending on the settings of the directives.
![vcc_lib_preProcessor.jpg](images/preProcessor.jpg)
## Using Pre Processor Calibration Switches to Switch Signals
For this, use the LogicalCodeSwitch block, otherwise you will end up with dead code.
## Code Switches for Signals Consumed in Model
Signals that are produced under a code switch and used locally in the model
need to have a switch on the signal too.
The switch should be the LogicalCodeSwitch.
## Requirements
New pre-processor directives, e.g. Vc.Pvc.Demo, must also be defined in a code switch configuration document.
Note that they are defined as, for example, `Vc.Pvc.Demo` in the configuration document and `Vc_Pvc_Demo` in the Simulink block.
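As a rough illustration of the concept, this is a sketch of what guarded superset C code could look like; the header name and directive follow the naming examples above, and the actual generated code differs:

```c
/* Illustrative sketch only; real generated code and header names differ. */
#include "VcCodeSwDefines.h"  /* assumed to define the directives, e.g. Vc_Pvc_Demo */

#ifdef Vc_Pvc_Demo
/* Compiled only for projects where the Vc.Pvc.Demo code switch is active. */
static void VcDemoFeature(void)
{
    /* feature-specific code */
}
#endif
```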


@ -1,20 +0,0 @@
# Run the build command
`powertrain_build/build.py` is the starting point for generating all the files needed for building a complete SW using the supplier make environment. The script takes one positional argument, and that is a [project_config](project_config.md).
This script acts as a command line wrapper for [build](../../powertrain_build/build.py).
```none
usage: build.py [-h] [-cd] [-d] [-na] [-if] [-V] proj_config
positional arguments:
proj_config the project root configuration file
optional arguments:
-h, --help show this help message and exit
-cd, --core_dummy generate core dummy code to enable integration with old supplier code
-d, --debug activate the debug log
-na, --no_abort do not abort due to errors
-if, --interface generate interface report
-V, --version show application version and exit
```


@ -2,37 +2,36 @@
## Imports
Allowed imports in production code are listed below.
- Built-in packages.
- Third party packages.
- Libraries.
If you need a function from another executable script, refactor it into a library and import it from both places.
Allowed imports in test code are listed below.
- Built-in packages.
- Third party packages.
- Test libraries.
- Functions from the code that is being tested.
Use the full path when importing.
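A minimal sketch of the intended import style; the project-library module name below is hypothetical:

```python
# Illustrative only: always import via the full path.
import json                                   # built-in package
from os import path                           # built-in package, imported via its full path
# Project libraries follow the same pattern (module name here is hypothetical):
# from powertrain_build.lib import logger
```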
## Style Guides
- The source code should follow [PEP-8](https://www.python.org/dev/peps/pep-0008/).
- Exceptions to these rules are defined in the `tox.ini` or `<module>/tox.ini` and `.pylintrc` or `<module>/.pylintrc` files.
- Docstring documentation shall be done according to [PEP257 style guide](https://www.python.org/dev/peps/pep-0257/).
## Style Guide Verification
Test that the code is following the PEP-8 standards using
[flake8](https://pypi.org/project/flake8/) by running `flake8 <module>` and
[pylint](https://www.pylint.org/) by running `pylint <module>`.
The ignored folders should have their own configurations, to keep some kind of standard.
Plugins that should be used with flake8:
@ -40,7 +39,8 @@ Plugins that should be used with flake8:
## Comments
Comments inside the code (not docstrings) should explain WHY something is implemented the way it is, not WHAT the implementation does.
An exception to this rule is regular expressions, which can be documented (but shall always be tested).
Instead of commenting on what the code does, write relevant tests that show this.
@ -48,16 +48,17 @@ Commented and unused code shall be removed.
## Executable scripts
All Python scripts that are executed as stand-alone scripts should be located in the `<module>` directory in the project root folder.
The executable scripts shall use argparse if any argument is needed for the execution of the script.
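A minimal, illustrative skeleton of such a stand-alone script; the script and its arguments are examples, not an existing tool:

```python
"""Illustrative stand-alone script skeleton; script and argument names are examples."""
import argparse


def parse_args():
    parser = argparse.ArgumentParser(description="Example stand-alone script.")
    parser.add_argument("project_config", help="path to the project configuration file")
    parser.add_argument("--debug", action="store_true", help="activate the debug log")
    return parser.parse_args()


def main():
    args = parse_args()
    print(f"Running with {args.project_config}, debug={args.debug}")


if __name__ == "__main__":
    main()
```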
## Libraries
Common code should be located in the `<module>/lib` folder.
If a library is only used by other libraries, it is allowed to put the files in a `<module>/lib/<library>` folder.
If the libraries require arguments, they shall not be set as required in argparse.
Instead, they shall have a check to see if they are present when needed.
This makes it easier to import a library from scripts that only sometimes use it.
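A rough sketch of that pattern, with hypothetical argument and function names:

```python
"""Illustrative library sketch: its argument is not required in argparse,
but is checked at the point where it is actually needed."""
import argparse


def add_arguments(parser):
    # Not required=True, so scripts that never publish can still import this library.
    parser.add_argument("--gerrit-url", default=None, help="Gerrit base URL (hypothetical)")


def publish(args):
    if args.gerrit_url is None:
        raise ValueError("--gerrit-url is needed when publishing")
    print(f"Publishing to {args.gerrit_url}")


if __name__ == "__main__":
    cli = argparse.ArgumentParser()
    add_arguments(cli)
    publish(cli.parse_args())
```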
The library folders should all have an `__init__.py` containing a docstring.
@ -65,20 +66,30 @@ Placement of functions:
- If a function from an executable script is needed elsewhere, refactor it into a library and import it from both places.
- Only functions used by the library itself, functions that use the library, or functions used by more than one script should be in a library.
- Do not move a function to a library because you think it will be used by other scripts later.
Wait until you can refactor both scripts to use the function.
## Tests
All tests shall be located in the `tests` folder.
Unit tests shall be named `test_<submodule>.py` and function tests shall be named `func_<submodule>.py`.
Example: `lib/gerrit.py` shall have a `tests/<module>/test_gerrit.py`.
Additional files needed by the tests shall be located in a separate folder: `test_data`.
The test shall test WHAT the module is doing, not HOW.
That means that you shall test as many public functions as possible, but do not explicitly test private functions (def `__<function>`).
Instead of testing the private functions, test the public functions in such a way that the private functions are being tested as well.
This makes it much easier to refactor how the code is implemented while keeping tests for what it does unmodified.
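A small sketch of this style of test; the function under test is a hypothetical example, not part of the real code base:

```python
"""Illustrative unit test sketch; the tested function is a hypothetical example."""
import unittest


def normalize_signal_name(name):
    """Hypothetical production function under test."""
    return name.strip().lower().replace(".", "_")


class TestNormalizeSignalName(unittest.TestCase):
    """Tests WHAT the function does, only through its public interface."""

    def test_dots_are_replaced_with_underscores(self):
        self.assertEqual(normalize_signal_name("Vc.Pvc.Demo"), "vc_pvc_demo")

    def test_surrounding_whitespace_is_removed(self):
        self.assertEqual(normalize_signal_name("  VcAesTx "), "vcaestx")


if __name__ == "__main__":
    unittest.main()
```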
Some duplication of code is acceptable in the test code,
as this should be more focused on DAMP (Descriptive And Meaningful Phrases) than DRY (Don't Repeat Yourself) code.
While we want both, DRY sometimes makes the test code hard to read, and then DAMP should be prioritized in the test code.
Mocking inside of unit tests shall be limited to external systems (such as Gerrit) and other processes (such as Matlab).
Mocking inside of function tests shall be limited to external systems (such as Gerrit).
The code shall be unit tested using the Python package unittest.
The unit tests are run by `pytest.exe /\*\*/test\_\*.py` in the project root folder.
The function tests are run by `pytest.exe /\*\*/func\_\*.py` in the project root folder.
All tests can be run by `python -m unittest discover -s`.

Binary image files changed (previews not shown).


@ -0,0 +1 @@
<mxfile host="Electron" modified="2022-09-29T16:30:39.925Z" agent="5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) draw.io/20.2.3 Chrome/102.0.5005.167 Electron/19.0.11 Safari/537.36" etag="HUNseBqCKg2znbYkdtvJ" version="20.2.3" type="device"><diagram id="pMotE5WckEuNcqNOJ_rL" name="Page-1">5Zlbk5owFMc/jY9lgIiXR3Xd9qHd2Y6tvbzspBAl3UCcEBX76ZtAQOCw3dotrjPKg+SfC8n5HU4u9NAsSt8KvAk/8ICwnmsHaQ/d9Fz1G7rqTyuHXBmPvFxYCxrkknMUFvQXMaJt1C0NSFIrKDlnkm7qos/jmPiypmEh+L5ebMVZ/akbvCZAWPiYQfULDWSYqyPPPurvCF2HxZMd2+T8wP7jWvBtbJ7Xc5F9o688O8JFW6Z8EuKA7ysSmvfQTHAu87sonRGmbVuYLa93+0Ru2W9BYvk3FXafPqeP7MvK/fVR3i3Jd/F+47wxreww2xp7pKnlmw7LQ2GkbJhEN2T30HQfUkkWG+zr3L1yC6WFMmIq5ajbFY/lLY4o0x4x41tBiVDN3RE1+GkiBX8kM864yJpGtzN9mWpVPftpnTJW0e3sp3TTcyIkSZ80iVMaWjkw4RGR4qCKmApvhoUXGuftD0x6f3SFvpHCihcUTLFxvnXZ9BGAujEMTuDhtvEIr4bHYHhhPBDgcThc0fvhjZ1/4+F0xKPfxuN63g/PuzAeA8ADsCBxMNHztErFPFbiNMBJmMFx6iC0fo+lJCLOFNdGJYFiblbWmdbp/hEfSan8aprX9990JcszqZvUtJElDiZxOvJ8zCQAK40GWGUX1T+fPDcBQAeo8PVa+BaaIAxLuqt3ow26ecI9p6qDR//qo3r89VAjsOb9N9Wqi41mS964EcmHjZYkFmsiQUuZF5YD/3fHHALHXPrzVC6x6DR8g5d/NrNf4lMvCxd9MJ06IFyMzjmdjp8PF92wMJHndTAM3QaGPozaXlvYHnTFoVj2VkBMaQxYqDHLusGBAcvprcXamNG1juS+MpTigqbajFRtvSYmI6JBoJ/VivmEKK8zF6bbTscwXdQMbCMAc9DCshlJ/x9LuIcrQ12XK6PLCnWoOd+8dqhz4FbubvnhQVln68vkiiYhZI+fJeM443Oi8QCad5P3HQK5rE1EczryXDgdjc/6qsBdxIKIHVWmvhYmg8IGF8MELqCDzdXgAK9Iyz77vDjgyjnSR/HWASv7doeltlV/OaPs6mhzU64AXouRCxfVVC98V8rylp/srhMTGjyPyXHOygkumK2fCYfbn6vgU54X/oFP25FTd3jgwtmauF0uzy6YjjO6NDrwiwggA09sqycIId7oclG61h+UrRXjez/EQlo4jrnEkvL4wa2cHjCy0ocQDP8g7J4nVBdQssgHO93wLMjOd8qkSe94kFwCPfWMSJhOKGX8irsmNLCGXo292r9argfwu32IX+25LO9kD1DJ4+fo/CD2+M0fzX8D</diagram></mxfile>

Binary image files changed (previews not shown).


@ -1,9 +1,10 @@
# powertrain-build Documentation
- [Design Standards](./design_standards.md)
- [powertrain-build](./powertrain_build.md)
- [powertrain-build Introduction](./powertrain_build_introduction.md)
- [powertrain-build Architecture](./powertrain_build_architecture.md)
- [powertrain-build Deployment](./powertrain_build_deployment.md)
- [Signal Interface Tool](./signal_interface_tool.md)
- [Signaling In powertrain-build](./signaling_in_powertrain_build.md)
- [Signal Inconsistency In Check](./signal_consistency_in_check.md)


@ -1,92 +0,0 @@
# Introduction to PCC Build System's documentation
## Target
- Improved build speed
- Easier integration of code from other sources than TargetLink
- Improved quality
- Design documentation
- Unit tests
- Better code structure
- Better target code (no postprocessing, all code explicitly assigned to memory areas)
## Overview
The big conceptual change is to move the variability from the code generation stage to the compilation stage.
![Old build system](./images/old_build_system.PNG)
Build a superset c-code, which can be configured via preprocessor directives.
![New build system](./images/new_build_system.PNG)
As the code generation step takes a significant part of the build time, distributing this step to the time of commit, instead of when building the SW, will reduce build time. Only generating one set of c-code, instead of one per variant, will further reduce build time.
The unit c-code also has a unit metadata file (in json-format) which the build system will use. This metadata file acts as a unit code generator abstraction. I.e. the metadata file contains all information needed by the build system and it is independent from how the unit code was created. This will make integration of hand written c-code much easier.
## Summary of changes
This section describes the changes made to the build system (Data dictionary, scripts, configuration files, etc.), but not the changes needed to be done in the model files. These changes are described in [model_changes](model_changes.md)
1. No TargetLink classes use pragma-directives. These are replaced with header files. The header files are different per supplier and include the supplier specific header files. When the project is set up in the version control system, the correct header for the supplier is included with externals. I.e. the configuration is done in the version control system. This means that no post processing of c-files is necessary, which reduces build time.
2. Data dictionary changes
- Pragma directives replaced by includes (see above)
- Removal of all datatype sizes in the classes (i.e. CVC_DISP replaces CVC_DISP8, CVC_DISP16 and CVC_DISP32). TargetLink 3.3 groups variable declarations by size without different classes.
- New NVM-ram types are added
- Removed old legacy classes which should no longer be used
- Dependability classes are added, and postprocessing of dependability code is removed.
Scripts are provided to update the models with the new classes. The Dependability classes are located in variable class groups, to make creating models easier (less clutter in the TargetLink class dialog).
![Variable Classes](./images/DD_class_example.PNG)
3. Data dictionary additions for code switch replacement
- Create a file to be included with the preprocessor defines for the configuration, by creating a new module in DD0/Pool/Modules. The name of the module will be the name of the header-file included in the c-files. I.e. MT_CompilerSwitches in the example below.
- Create a new file specification in the module (by right clicking "file specifications" and selecting "Create FileSpecification"). Fill in the values below.
![Header File Specification](./images/TL-DD-preprocessor1.jpg)
- Create another new file specification in the module (by right clicking "file specifications" and selecting "Create FileSpecification"). Fill in the values below.
![Source File Specification](./images/TL-DD-preprocessor2.jpg)
- Create a new variable class (by right clicking "Variable Class" and selecting "Create Variable Class"). Fill in the values below, but add MERGEABLE in the "Optimization" option.
![Mergeable Variable Class](./images/TL-DD-MACRO.jpg)
4. No models are needed for VcExt*, VcDebug*. C-code and A2L-files are generated directly, which saves build time.
5. Variables are declared in the units which write to them, with the exception of variables in the supplier interface. I.e.
- VcIntVar is removed.
- Classes of unit outputs in the models are changed from CVC_EXT to CVC_DISP. Scripts are provided to modify the models.
6. All units are checked in together with their c-code and a json unit-definition file. See [unit_config](unit_config.md) for the structure of that file. This leads to code generator independence and vastly improved build time, as the most time consuming step is distributed.
7. New project configuration file, see [project_config](project_config.md)
- Config separated in a global config file, and one project specific file.
- More flexibility for different file structures (config, rather than code change)
8. Interface checks are introduced to guarantee that interfaces are consistent before build. Increases quality and reduces build time.
9. Improved maintainability by
- A more modular design.
- Documentation of the build system.
- Unit tests for the build system.
10. A new temp-model is generated to move the TL-outport to the subsystem where the signal is created. This is needed to be able to remove the unused outports from the generated c-code, and at the same time have the possibility to create a test harness. Is there a way to avoid this without using the TL-DD or making temp models?
## Limitations in TL3.3
When using TargetLink 3.3 and preprocessor directives, there are some limitations of what can be used in subsystems under these directives.
1. Enabled subsystems cannot use the reset states option.
2. The outports of enabled subsystems cannot use the reset when disabled option.
![Enable and Port reset options examples](./images/reset_port_SL_example.PNG)
After discussion with dSpace, they are considering removing the above mentioned limitations, in TL 4.4 or later. This needs to be confirmed with dSpace.
Another issue is that the TargetLink code generation optimizer considers preprocessor ifs as any other control flow (e.g. an ordinary if statement). Enabling the control flow code generation optimization (the MOVEABLE optimization in the class), can lead to the removal of code, calibration data or measurement variables when using preprocessor ifs.
See [model_changes](model_changes.md) for workarounds for these limitations.


@ -1,71 +0,0 @@
# Necessary Model Changes
This section describes the necessary model changes that have to be done for the new build environment to work. Most of these changes will be automated by model conversion scripts.
## Replacement of the VCC Codeswitch blocks
1. Add a constant of the type you defined above (`IFDEF_MACRO_EXT_DEFINE`) and a TargetLink pre-processor IF block
![Pre-processor IF example](./images/add-preproc-block.jpg)
2. The pre-processor IF block is found here:
![Vcc_Lib](./images/TL-preproc-lib.jpg)
3. Add a Simulink action port to the subsystem
![Action port](./images/change_to_actionport.jpg)
4. Connect the TargetLink pre-processor IF block to the action port of the subsystem
![Action port connection](./images/connect_to_subsystem.jpg)
## Use of the VCC Codeswitch blocks as constants
TargetLink does not accept using the same pre-processor define for both pre-processor ifs and as constants. The workaround is to create a new define, which is set to the original define in the `LocalDefs.h` file created by the code-generation scripts.
![Pre-processor constant example](./images/DuplicatedDefines.PNG)
`VcAesAir_OPortMvd_LocalDefs.h`:
```c
#include "VcCodeSwDefines.h"
#define Vc_Aes_B_CodeGenDsl_CN (Vc_Aes_B_CodeGenDsl)
```
## Modification of ts datastore blocks
The datastore block for sampling time, *ts*, must be modified so that different time constants can be used when building the superset c-code for different projects.
Change the class of the datastore block to *"EXTERN_GLOBAL_ALIAS"* and fill in the name of the constant defining the sampling time in the *Name* field.
The build scripts generate a header file which defines the correct time constant value.
![ts datastore block](./images/ts_change.PNG)
## Declaration of output variables in unit
The `VcIntVar.c` file is removed; the unit producing a signal should declare it. This avoids interface integration issues and saves build time. **NB: Model change required**
- TL-Class for outports need to be changed to `CVC_DISP` in all models.
- A script is written which modifies all outports to `CVC_DISP`.
## Addition of new dependability TL-classes
Dependability classes are added, and postprocessing of dependability code is removed. Scripts are provided which automatically changes the classes of a model.
## Enabled subsystem outports
Subsystems with any subsystem parent, which have a TL-preprocessor switch, cannot have a "reset-port" state. See below for workarounds.
- The scripts converting the models will detect if the reset states option is used, and will not use a preprocessor-if for that subsystem. This will lead to increased RAM/ROM usage. It is recommended to refactor the model (split into several models), if this is a possibility.
- All outports with the reset when inactive option enabled, will have that option disabled by the conversion script. This will be shown in the logfile. The designer needs to consider if this change is acceptable and if not, the following design workaround is recommended.
![Reset port workaround](./images/reset_port_workaround.PNG)
## Other workarounds
The workaround for the optimization problem is to remove the MOVEABLE optimization when the new build system is introduced. This will bring us back to the same behavior as we had when running TL 2.1.5.
This will make it easier to understand how the target code works, as it will behave as in Simulink. Furthermore, it will be easier to predict maximum CPU load. The downside is that it costs about 0.1% extra CPU load, as measured the last time.


@ -56,7 +56,7 @@ bash/command window.
See picture below for details.
![MatlabEnvVar](images/MatlabEnvVar.JPG)
#### Code generation with Embedded Coder with Matlab2019b
@ -132,7 +132,64 @@ py -3.6 -m powertrain_build.wrapper --codegen --models Models/ICEAES/VcAesTx/VcA
NOTE: Building a project (--build) does not work if a model requires a
preprocessor directive that does not exist in any configuration file.
### Errors
#### Wrong installation path of Matlab 2017B
If Matlab 2017B is installed in the wrong path, you would get this fault trace:
```bash
$ py -3.6 -m powertrain_build.wrapper --codegen --models Models/PVCTM/VcScHmi/VcScHmi.mdl
2024-09-09 13:45:36,512 - C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\site-packages\powertrain_build\__init__.py - INFO - Current powertrain-build version is 0.1.1.dev1
2024-09-09 13:45:36,623 - C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\site-packages\powertrain_build\__init__.py - WARNING - powertrain-build version does not match requirements!
2024-09-09 13:45:37,215 - C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\site-packages\powertrain_build\wrapper.py - INFO - Affected models: Models/PVCTM/VcScHmi/VcScHmi.mdl
INFO:C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\site-packages\powertrain_build\wrapper.py:Affected models: Models/PVCTM/VcScHmi/VcScHmi.mdl
2024-09-09 13:45:37,215 - C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\site-packages\powertrain_build\wrapper.py - INFO - Preparing workspace for powertrain-build!
INFO:C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\site-packages\powertrain_build\wrapper.py:Preparing workspace for powertrain-build!
2024-09-09 13:45:37,215 - C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\site-packages\powertrain_build\wrapper.py - INFO - Running powertrain-build generate code!
INFO:C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\site-packages\powertrain_build\wrapper.py:Running powertrain-build generate code!
Traceback (most recent call last):
File "C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\site-packages\powertrain_build\pt_matlab.py", line 392, in run_m_script
p_matlab.run(command=" ".join(cmd))
File "C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\site-packages\powertrain_build\pt_win32.py", line 67, in run
self.startup_info,
pywintypes.error: (2, 'CreateProcess', 'The system cannot find the file specified.')
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\site-packages\powertrain_build\wrapper.py", line 502, in <module>
sys.exit(main())
File "C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\site-packages\powertrain_build\wrapper.py", line 498, in main
return wrapper.run()
File "C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\site-packages\powertrain_build\wrapper.py", line 408, in run
exit_code |= self.build_automation(mode="codegen")
File "C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\site-packages\powertrain_build\wrapper.py", line 351, in build_automation
self.run_m_script(script_name, wrap_cmd=False, attempts=2)
File "C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\site-packages\powertrain_build\pt_matlab.py", line 423, in run_m_script
if p_matlab.poll() == pt_win32.STILL_ACTIVE:
File "C:\Users\<user>\AppData\Local\Programs\Python\Python36\lib\site-packages\powertrain_build\pt_win32.py", line 131, in poll
return win32process.GetExitCodeProcess(self.process_handle)
pywintypes.error: (6, 'GetExitCodeProcess', 'The handle is invalid.')
```
**NOTE:** Instead of `pywintypes.error: (2, 'CreateProcess', 'The system cannot find the file specified.')`
you might get `pywintypes.error: (5, CreateProcess, Access is denied.)`.
**Solution:**
1. Set an environment variable: [MatlabEnvVar](#set-matlab-2017-as-environmental-variable).
1. Specify your Matlab path using the `--matlab-bin` argument:
```bash
py -3.6 -m powertrain_build.wrapper --codegen --models Models/ICEAES/VcAesTx/VcAesTx.mdl --matlab-bin="C:\Program Files\MATLAB\R2017b\bin\matlab.exe"
```
## What to commit
Using powertrain_build we need to commit:
@ -143,8 +200,7 @@ Using powertrain_build we need to commit:
- Files like `tl_defines_XxxYyy.h`, `VcXxxYyy.h`, `VcXxxYyy.c`, `VcXxxYyy.a2l`,
`VcXxxYyy_OPortMvd_LocalDefs.h`
- Files in `tests` if needed
- Configuration files, e.g. `ConfigDocuments/SPM_Codeswitch_Setup.csv`, see [pre processor directives](./PreProcessorDirectives.md).
```txt
gitrepo/
@ -190,14 +246,14 @@ gitrepo/
│ ├── VcPvcDemo_par
```
## Summary of signals in powertrain_build
[Signal Summary](./signaling_in_powertrain_build.md)
## Signal Interface Tool
[Signal Interface Tool](./signal_interface_tool.md)
## Signal Interface Inconsistency
[Signal inconsistency in check](./signal_consistency_in_check.md)


@ -22,16 +22,12 @@ the model.
3) Generate c-code using TargetLink.
4) Compile.
Such a process had to be done every time we wanted software, and we have X projects to maintain, which could take a while.
## Software generation after powertrain_build
powertrain_build uses preprocessor blocks in the models, which means that the generated code has "#ifdef SOME_VAR" inside.
It enables switching between projects using variables, which means we can generate code for each model and store them in ./Models/.
1) Gather all c-/h-files used in ABC_123.
2) Set variables according to codeswitch documents.
@ -40,10 +36,7 @@ variales, which means we can generate code for each model and store them in ./Mo
## Summary
Before powertrain_build, we had to generate code for each model (after modifying them through Matlab scripts) for each project and then compile. Now the latest version of the code is already in the repo; we just need to gather files, set variables and compile.
## How to start with powertrain_build
@ -51,80 +44,49 @@ To get familiar with powertrain_build, build.py is a good start since it contain
modules. Four important modules could also help you understand powertrain_build better:
build_proj_cfg, unit_cfg, feature_cfg, signal_interface.
### build_proj_cfg
This module is used to read ./Projects/<project_name>/Config/ProjectCfg.json and ./ConfigDocuments/BaseConfig.json,
which provide the location of the configuration files.
The location of Base_Config is also stored in ProjectCfg.
build_proj_cfg also provides methods of gathering information for other modules.
### feature_cfg
The feature_cfg module is also called the codeswitch module.
It reads from ./ConfigDocuments/SPM_Codeswitch_Setup*.csv and provides methods for
retrieving the currently used configurations of a unit.
The first row of SPM_Codeswitch_Setup*.csv lists all projects and the first column lists codeswitch names.
The values under each project state whether the corresponding codeswitch is active in this project.
### unit_cfg
The model list of a project is in ./Projects/<project_name>/local_cof/raster.json.
Then unit_cfg reads unit definitions from the models in the model list (/Models/<model_name>/pybuild_cfg/<model_name>.json).
This module also provides methods to retrieve any unit definition, such as inports and outports,
of a unit and all existing units.
### signal_interface
This module gets the supplier interface of the project from ./Projects/<project_name>/Config/ActiveInterfaces/.
It provides methods of checking signal interface consistency between Supplier and VCC SW.
It also reads inport/outport information from all units in the raster and checks inport/outport consistency internally between VCC SW-units.
### Additional Information
For more information about the project and unit configuration files see
[project configuration](./project_config.md)
and
[unit configuration](./unit_config.md)
chapters.
## Why powertrain_build, what are the advantages?
A Continuous Integration (CI) Build System needs to test all configurations
where a TargetLink model is used.
- It's faster
- There is more parallelization of jobs in the CI system, so it's faster.
- Code generation is moved to the developers PC
- Code generation is done once for all projects using pre-processor directives.
- C code review is now possible in gerrit!
- The total build time on a software is 15 minutes for all 9 projects
- powertrain_build adds signal consistency checks.
- Unit tests of the Build system are introduced
- So it's quality assured
- It's easy to learn for new employees.
- powertrain_build creates new variable classes with unique code decorations
- This means that post processing C code is not necessary.
- This means that ASIL classed variables get declared at the source.
- This means that we can optimize memory at compilation, and save memory in GEP3 through short addressing different variable classes.
- This means we have control of all build steps
- This enables us to use the same models in more than 2 different suppliers, for instance SPA2's Core System Platform.
- powertrain_build thus fixes incorrect handling of NVM variables.
### Legacy, how it used to be
The old system was scripted in Matlab.
The code was not refactored, but built
upon for many years. Hence the complexity of the code was high.
As seen in the figure below, the models used code switches to generate different code for different projects.
The Legacy Build process can be followed in this sharepoint
site:
![powertrain_buildLEgacy](./images/powertrain_build_legacy.png)
### What is it
- Made in Python.
- Instead of Code Switches in the TargetLink models, there are C Pre Processor directives.
In this way, we will have superset C code representing the superset Simulink models.
- Unit tests of build scripts as well as html documentation.
- Signal consistency checks by creating JSON files of the code.
![powertrain_build](./images/powertrain_build.png)
@ -144,11 +106,11 @@ site:
### TargetLink Limitations
- dSPACE delivers removal of limitation in TargetLink release 2019B, first said to be delivered 2018B.
These options give code generation errors:
- Enabled subsystems, which reset the states when enabling them.
- Out ports which are reset when disabling a subsystem, which are used in subsystems under a pre-processor block.
- Solution of TargetLink limitation is refactoring of models.
### powertrain_build dependencies


@ -18,7 +18,8 @@ Information on how to deploy powertrain_build can be found [here](./powertrain_b
## powertrain_build Development
If you want to develop powertrain_build, you can run it directly from the Git repositories.
You probably need a separate virtual environment, as any installed release versions of powertrain_build would interfere:
```shell
python3 -m venv powertrain_build_venv


@ -1,10 +1,12 @@
# Project Configuration Files
Configuration files needed by the build system. The "entry point" is the project configuration file given as a command line argument to build.py. This file can contain references to other configuration files and to the software [unit definition files](unit_config.md).
## Project Config
The project config file contains project specific configuration settings and references to other configuration files.
It is a json file that should be located in the project root.
The project configuration file can override keys/values in the base configuration file.
Example ProjectCfg.json:
@ -14,18 +16,19 @@ Example ProjectCfg.json:
"BaseConfig" : "../../ConfigDocuments/BaseConfig.json", "BaseConfig" : "../../ConfigDocuments/BaseConfig.json",
"UnitCfgs": "conf.local/rasters.json", "UnitCfgs": "conf.local/rasters.json",
"ProjectInfo" : { "ProjectInfo" : {
"projConfig" : "VED4_GENIII", "projConfig" : "ABC_123",
"a2LFileName": "VEA_VED4_SPA.a2l", "a2LFileName": "A2L.a2l",
"ecuSupplier" : "Denso", "ecuSupplier" : "CSP",
"ecuType" : "G2", "ecuType" : "",
"unitCfgDeliveryDir": "./output/UnitCfgs" "unitCfgDeliveryDir": "./output/UnitCfgs"
} }
} }
``` ```
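As a rough illustration of the override behaviour, here is a simplified sketch; the real merge logic in powertrain_build may differ (for example, it may merge nested keys instead of replacing them):

```python
"""Simplified sketch of how a project config could override base config defaults."""
import json
from pathlib import Path


def load_project_config(project_cfg_path):
    project_cfg_path = Path(project_cfg_path)
    with open(project_cfg_path, encoding="utf-8") as handle:
        project_cfg = json.load(handle)
    # BaseConfig is given relative to the project configuration file.
    base_path = project_cfg_path.parent / project_cfg["BaseConfig"]
    with open(base_path, encoding="utf-8") as handle:
        base_cfg = json.load(handle)
    merged = dict(base_cfg)
    merged.update(project_cfg)  # project keys/values take precedence over base defaults
    return merged
```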
## Base Config
The base config file shares the same structure as the project config file and
can be used for non project-specific configuration settings and default values.
Example BaseConfig.json:
@ -56,9 +59,10 @@ Example BaseConfig.json:
}
```
## Units Config
The units config file contains information about included software units and scheduling rasters.
The software units are executed in the order they are defined within each time raster definition.
```json
{
@ -87,11 +91,12 @@ The units config file contains information about included software units and sch
}
```
## Configuration Settings
### File Versioning
The build system compares the version information in the configuration files with the application version
to make sure a consistent configuration is used.
- "ConfigFileVersion": "0.2.1"
- Project configuration file version.
@ -104,15 +109,18 @@ The build system compares the version information in the configuration files wit
### projConfig
The name of the project.
This name is used in all the configuration files to identify the project.
### ecuSupplier
Ecu supplier name.
This is used to choose supplier dependent code generation (possibly in combination with ECU Type), e.g. the core dummy file generation.
### ecuType
Ecu type name.
This is used to choose supplier dependent code generation (possibly in combination with ECU Supplier), e.g. the core dummy file generation.
### unitCfgDeliveryDir
@ -132,21 +140,31 @@ The log files destination directory.
### configDir
The path to a folder containing all the configuration files of the project.
Used to find codeswitches, core-id and DID definition files.
### interfaceCfgDir
The path to a folder with csv-files defining the supplier interface configuration.
The files shall be character-separated files, with the delimiter ';'.
The following files shall exist in the folder:
CAN-Input.csv, CAN-Output.csv, EMS-Input.csv, EMS-Output.csv, LIN-Input.csv, LIN-Output.csv, Private CAN-Input.csv and Private CAN-Output.csv.
### prjUnitSrcDir
A file path where the superset of the source code files are found.
This path can/shall use wildcards.
E.g. "./Models/SSP*/Beta/Vc*/src", will match all folders under the Models folder which start with SSP,
and then all folders in Beta starting with Vc, which have a src folder.
The build system only includes files from software units referenced by the units config file.
### prjUnitCfgDir
A file path to the unit definition files.
This file is a json file containing all the relevant meta data for the function.
E.g. input parameters, output parameters, calibration labels, local measurement variables, etc.
The unit definition file must match the filename pattern "config_*.json".
### coreDummyFileName
@ -158,7 +176,11 @@ If declared, this module is included in the build. If the string is empty no mod
### NvmConfig
This key configures the NVM area sizes, and the filename of the c-files generated to define the NVM.
The NVM is defined by six structs.
The reason for using c-structs is to guarantee the order the variables are declared in memory.
The c-standard does not specify in which order global variables are allocated in memory.
However, the standard says that struct members should be placed in memory in the order they are declared.
```json
{
@ -171,7 +193,11 @@ This key configures the NVM area sizes, and the filename of the c-files generate
### baseNvmStructs
This json file holds the order for NVM signals in the structs; it also holds area size and allowed signals.
We want to preserve the order for signals.
So signals should never be removed from this list.
If a signal is not used anymore it should not be removed, it should instead be marked with 'Position_*'.
E.g. for Position_16Bit_195, the signal position will then be filled in by the build script with a signal found in a model that is not found in the json.
```json
{
@ -204,4 +230,5 @@ The key "SampleTimes" defines the names of the available time rasters, and the v
### Rasters
The key "Rasters" defines which units are scheduled in that raster,
and the order of the list defines the order the units are executed within the raster.


@ -1,83 +1,70 @@
# Signal Consistency in Check
-------------------------------
[TOC]
## Introduction
Signal inconsistency check will be performed on all changes containing .mdl files.
Reports are created and uploaded to artifactory.
One report is created for each project, showcasing information about signals relevant for the specific project.
![project_index](./images/signal_inconsistency_project_index.png)
Inconsistencies that will be gated on (listed below) will also be displayed in the log.
- Unproduced inports.
- Produced but not used outports.
- Signals added to \<model>\_Unconsumed_Sig.csv.
![jenkins_log_example](./images/signal_inconsistency_log_example.png)
## Checks
The following checks will be performed on models in a Gerrit change.
1. Unproduced inports.
    - Signals that are configured to be consumed by the model but are not produced internally (models) and/or externally (interface list).
1. Produced but not used outports.
    - Signals that are configured to be produced by the model but do not have a consumer internally (models) and/or externally (interface list).
1. Signals added to \<model\>\_Unconsumed_Sig.csv.
    - Signals that are defined to **not have a consumer** but **are consumed** internally (models) and/or externally (interface list).
## \<model\>\_Unconsumed_Sig.csv
For "Produced but not used outports" that are intended for consumption only after being merged to master,
it is possible to disable the check (see the sketch below the screenshot):
1. Create pybuild_cfg/\<model\>\_Unconsumed_Sig.csv.
1. Add the header "Unconsumed_Signals" and populate it with the produced signals (outports only) you want to omit.
![unconsumed_csv_example](./images/signal_inconsistency_unconsumed_csv_example.png)
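A minimal sketch of creating such a file with Python's csv module; the model name and the signal name are made up.

```python
# Sketch: create pybuild_cfg/<model>_Unconsumed_Sig.csv (hypothetical model/signal names).
import csv
from pathlib import Path

cfg_dir = Path("pybuild_cfg")
cfg_dir.mkdir(exist_ok=True)

with open(cfg_dir / "VcDummyOne_Unconsumed_Sig.csv", "w", newline="") as csv_file:
    writer = csv.writer(csv_file)
    writer.writerow(["Unconsumed_Signals"])          # header expected by the check
    writer.writerow(["sVcDummyOne_D_EngLoadReqEl"])  # produced outport to omit
```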
## Running Checks and Creating Reports
Single model:
```bash
py -3.6 -m powertrain_build.signal_inconsistency_check -m VcModel1 -r
```
Multiple models:
```bash
py -3.6 -m powertrain_build.signal_inconsistency_check -m VcModel1 VcModel2 VcModel3 -r
```
Running without report creation:
```bash
py -3.6 -m powertrain_build.signal_inconsistency_check -m VcModel1 VcModel2 VcModel3
```
## Limitations
- The project specific report, e.g. SigCheckAll_\<PROJECT\>.html, will only show information about signals configured to be used in project "PROJECT".
- Reports do not display information about check 3; it is only displayed in the build logs.
- The log does not display information about non-gating inconsistencies, e.g.:
  - "Outports that are generated more than once in the listed configuration(s)".
  - "Inports that have different variable definitions than the producing outport".
  - "In-/Out-ports that have different variable definitions than in the interface definition file."
@ -4,8 +4,7 @@
## Introduction
powertrain-build contains scripts for both signal consistency checks and signal interface information.
If you type the following in git bash:
@ -13,7 +12,7 @@ If you type the following in git bash:
```bash
py -3.6 -m powertrain_build.wrapper --help
```
## Signal Interface Report
The signal interface tool generates HTML reports. The following example shows how to generate the report:
@ -24,11 +23,11 @@ py -3.6 -m powertrain_build.wrapper --build ABC_123 --interface
A project specific report will be available here: `Projects\ABC_123\output\Reports\SigIf.html`.
This report only displays the signals that exist in that project.
## Signal Consistency Report
The signal consistency report displays, per model:
* **Missing signals**, Inports whose signals are not generated in the listed configuration(s).
* **Unused signals**, Outports that are generated, but not used in the listed configuration(s).
* **Multiple defined signals**, Outports that are generated more than once in the listed configuration(s).
* **Internal signal inconsistencies**, Inports that have different variable definitions than the producing outport.
@ -1,102 +1,94 @@
# Summary of Signals in powertrain_build
-----------------------------------
[TOC]
## Where Are the Variables Defined in the Code?
### Outsignals From Models
```text
* <model_name>.c
* VcExtVar*.c - if it is listed in the interface.
```
### Insignals to Models
```text
* <model_name>.c - if another model is producing it.
* VcExtVar*.c - if it is in the interface.
* VcDummy_spm.c - if neither in interface nor models.
```
### Outsignals in the Interface List
```text
* <model_name>.c - if a model is producing it.
* VcDummy.c - if no model is producing it.
* VcExtVar*.c - outsignals in the interface list are all defined in this file.
```
### Insignals in the Interface List
```text
* VcExtVar*.c - this goes for both used and unused signals for ecm projects.
```
### Signal Flow Within a Project
Signals within a project can be divided into 4 types:
```text
* external_outsignals - outsignals in the supplier interface list.
* external_insignals - insignals in the supplier interface list.
* internal_outsignals - outsignals from models but not in the supplier interface list.
* internal_insignals - insignals to the models but not in the supplier interface list.
```
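As a rough set-based sketch of how these four types relate (not powertrain-build code; all signal names are made up):

```python
# Sketch: the four signal types expressed with sets (hypothetical signal names).
interface_outsignals = {"sVcDummyOne_D_EngLoadReqEl"}   # supplier interface list, out
interface_insignals = {"sVcEc_n_Eng"}                   # supplier interface list, in
model_outports = {"sVcDummyOne_D_EngLoadReqEl", "rVcDummyTwo_m_CylTarAct"}
model_inports = {"sVcEc_n_Eng", "rVcDummyTwo_m_CylTarAct"}

external_outsignals = interface_outsignals                    # outsignals in the supplier interface list
external_insignals = interface_insignals                      # insignals in the supplier interface list
internal_outsignals = model_outports - interface_outsignals   # model outports not in the interface list
internal_insignals = model_inports - interface_insignals      # model inports not in the interface list
print(internal_outsignals, internal_insignals)
```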
As shown in the picture below, if a model takes internal\_outsignals from another model,
that model is regarded as the consumer and the other model as the supplier.
![powertrain_buildConsummer&Supplier](images/supplier-consumer_Model.PNG)
If the consumer model expects more insignals than the supplier model or the supplier interface can provide,
then these insignals are marked as missing signals.
![powertrain_buildMissingSignals](images/MissingSignals.PNG)
If the supplier model or interface provides more signals than the expected insignals,
then these signals are marked as unused signals.
![powertrain_buildUnusedSignals](images/UnusedSignals.PNG)
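Continuing the set-based sketch above, missing and unused signals could be derived roughly like this (illustrative only, made-up names):

```python
# Sketch: derive missing and unused signals from produced/consumed sets (hypothetical names).
consumer_insignals = {"sVcEc_n_Eng", "rVcDummyTwo_m_CylTarAct", "sVcDummyFive_X_Missing"}
produced_signals = {"sVcEc_n_Eng", "rVcDummyTwo_m_CylTarAct", "sVcDummySix_X_NeverUsed"}

missing_signals = consumer_insignals - produced_signals   # expected but not provided
unused_signals = produced_signals - consumer_insignals    # provided but never consumed
print("missing:", missing_signals)
print("unused:", unused_signals)
```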
The picture below indicates the signal flow within a project.
The external interface list defines external in-ports and external out-ports.
The first flow indicates the normal signal flow within a project:
external\_insignals defined in VcExtVar.c enter model1 and the internal\_outsignals match the internal\_insignals of the next model;
the external\_outsignals then come from model3 and are defined in model3.c.
The second signal flow indicates the missing signal situation.
When internal\_insignals are missing,
a VcDummy\_spm.c file is generated to define the missing outport variables in the outport interface of the previous model.
The last signal flow explains two special cases:
(1) unused signals; (2) external\_outsignals not produced by models.
Unused signals are ignored by the models, and external\_outsignals that are defined in the signal interface but
not produced by models are defined in VcDummy.c instead of model7.c.
![powertrain_buildSignalFlow](images/SignalFlow.PNG)
## Compilation Process
```text
Compile -> Link -> fail -> Update VcDummy and VcDummy_spm (remove multiple defs, add missing defs).
Compile -> Link -> fail -> Update VcDummy and VcDummy_spm.
Compile -> Link -> fail -> Update VcDummy and VcDummy_spm.
Compile -> succeed.
```
Compiling and linking SPM only works on the first try in gcc.
Multiple definitions or missing definitions are not allowed.
The iterations are to fix the inconsistencies between SPM and EMS.
If we knew what was defined in the EMS, we could generate VcDummy and VcDummy\_spm on the first try every time.
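A schematic sketch of that iteration; `compile_and_link` and `update_dummy_files` are hypothetical placeholders, not powertrain-build APIs.

```python
# Schematic sketch of the compile/link/fix iteration; not the real build code.
def iterative_build(compile_and_link, update_dummy_files, max_attempts=4):
    """compile_and_link() -> (success, errors); update_dummy_files(errors) fixes VcDummy*."""
    for attempt in range(1, max_attempts + 1):
        success, errors = compile_and_link()
        if success:
            return attempt
        update_dummy_files(errors)  # remove multiple defs, add missing defs
    raise RuntimeError("build did not converge")


# Toy usage: pretend the first two link attempts fail.
results = iter([(False, ["missing def"]), (False, ["multiple def"]), (True, [])])
print(iterative_build(lambda: next(results), lambda errors: None))  # -> 3
```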
@ -1,246 +0,0 @@
# ToDo items
This is a temporary list of todo items for the PCC build system pilot.
These should be moved to JIRA at some point in time.
## Target Link
* How to handle State Flow function classes, to avoid static data not being
allocated to a defined memory area? Right now they are set to STATIC_FNC,
but probably they should be set to default, and the Templates should be
updated in the TL data dictionary.
* Reset of subsystems and ports
* State flow objects.
## Matlab
* Change the parsing for configurations, so that the workaround for
enabled subsystems with reset (no TL-preprocessor only macro), is
detected as a configuration
* Create a separate class for the ts datastore block. Now it uses
EXTERN_GLOBAL_ALIAS, which is used for all Core-blocks (low prio, as
all models use the ts-block, which means that the header file
VcUnitTsDefines is included in all models anyway)
* Remove the class EXTERN_CAL_ALIAS? Is this class used for anything?
* Check that the new configuration with TL pre-processor block, works in MIL.
Probably the header file MT_Compileswitches is needed for the simulation
for it to work.
* Add the matlab ts variable to the constant to the macro definition
block.
* Change the format of the configs key to python format (which saves time
in the total build)
* NVM-ram definition script - parse stateflow for NVM classes!
* Make a separate Matlab function for parsing stateflow variables
* generateTLUnit
* Remove the workaround for a TL problem in replaceCodeSwitch.m,
where a macro cannot be used as a constant in a TL preprocessor block,
if that is solved in TL.
## Python
* How will the interface check work with models with different names, but
the same signal names? Test!
* Where to remove VcDebug*?
Consider removing them from the Unitdefinition, and adding them
in the build system if debug is enabled.
* Document the format for the LocalDefines.h files
* Consider refactoring Local defines. Make that file a json definition,
and include the result in the global config header file.
* Consider using either format or % for string formatting. This is now
mixed in the code
* The names for the debug switches are not according to the naming convention.
Shall we correct this?
* Consider if we shall separate Bool and the other one byte data types,
as some CPUs have HW bit addressing (think infineon has that)?
* Check the matlab scripts *parseModelInfo.m* and *parseCodeSwitches.m* and
find the bug, which generates empty configs. Should be ['all'].
* In VcPpmPsm: yVcEc_B_AutoStartRcfSet, yVcEc_B_CEMNodeAlive,
sVcScIn_v_VehSpdLgtMax, and sVcEc_Te_EngClnt have empty configs.
* **PROBABLE ERROR** Input signals which are unused due to using
goto-blocks without corresponding from-blocks. When opening a model,
this is checked for, but only on the top level subsystem.
Fix this check, so that it runs on all levels of subsystems.
* Add a method for retrieving the core ids from the unit definitions file
(see :doc:`unit_config`)
## Done items
* The UnitConfigs._parse_all_unit_configs method doesn't seem to parse all json-configs.
If there are several versions of one unit, e.g. VcAesSupM and VcAesSupM__gen3,
both of these unit-config files seem not to be parsed.
* Only parses units in the units.json config list. This list was not updated.
* Dependability functions (and even Vcc_Lib blocks) use a class CVC_VAR_STAT.
This class is currently not included in the powertrain_build TL-DD. Add this class.
Furthermore, how to handle the ASIL classes for lib-block which set classes in the init-scripts.
* Variables are both in "outputs" and "locals" (example VcAesSupM__gen3, yVcAesSupM_B_SupChrgrErr)
* The parsing of variables for the json file does not remove the "__gen3" suffix for variable names.
* Write email to dSpace regarding:
* Optimization of pre-processor directives. Why are they considered as control flow?
Give example of calibration constant cVcAesVnt_B_UseOldBrkSig,
which is removed even though it is not "MOVEABLE".
* A2L-file is currently generated by TL => All parameters are included in the A2L
in the final version the unit definition file (.json). mergeA2l checks the configuration for all labels,
variables, and function definition, and removes them if they are not included.
* Remove the local defs from the FeatureConfigs.gen_unit_cfg_header_file() (VcCodeSwDefines.h).
* Generation of the Ts macro for all the units.
This is needed to reflect the scheduling frequency of the function in the current project.
* The models need to be updated with another name than the 'ts'.
E.g. ts_[unit_name], and the class should be changed to an external macro.
Add this change to the model conversion script.
* **DONE** - A header file which defines the scheduling defines for all units.
* The above file needs to be included in all files.
* **DONE** - a matlab function updateTsDatastore.m is created.
* OPortMvd.mdl does not seem to have the outports moved.
* LookUnderMasks has to be 'on', as the top.
* Handling of NVM. I.E.
* NVM-Ram blocks includes references to CVC_EXT8/16/32 in the mask init code
-> they need to be replaced by the new blocks.
* Parse the simulink models, and generate the unit definition information for NVM (Matlab-script).
* Generation of structs and defines? Check the Matlab implementation for a specification.
There is no matlab-implementation!
* generateTLUnit
* When generating units in isolation, it is possible to get the same file name
for definitions and tables. E.g. tl_defines_AF.h.
* par.m files are not loaded properly in the init-script.
Update to latest gen3 env will probably solve this problem.
* Add the newScripts folder and subfolders to the matlab path (init script?).
* Run the postprocess_source_files after code generation on all c&h-files.
* Other Matlab issues:
* The new NVM block gives algebraic loopbacks in some implementations,
the block needs to be redesigned to be able to be a drop in replacement for the old block.
* Add VcConst source code to output/SourceCode.
* Generate did defines does not work, no Floats are defined! Fix this!
* Add change models to generate the init function, and not the Restart function.
* No, the code gets messier as the INIT function is reserved for the init of the DISP* classes.
Keep the old way of generating the code.
* Check the debug switches for input and output parameters.
Are both allowed? If so, check that the debug functions are called both at the beginning and
the end of the time raster, in the matlab implementation.
* Matlab has debug_in and debug_out, update the gen debug script and the sched script.
* Missing #include "CVC_DISP_END.h".
* VcVmcEm.h is missing a #include “CVC_DISP_END.h” at line 4423ish.
Is this a TL code generation bug? rerun codegen for VcVmcEm.
* VcTrsmShfShkMon.h at line 1451.
* VcSpMon.h at line 911.
* VcSpEc.c line ? (after defines section).
* VcDeTfr.c line 605 (after defines section).
* Is the LocalConfig file overwritten after a model update? No, not any more.
* StateFlow objects are not possible to remove with TL-preprocessor blocks.
* Identify State flow models which are under TL preprocessor blocks, and refactor those models,
by moving the stateflow model to a separate simulink model.
* Model or DD-fault : The init function is not in the CVC_CODE section!
(The same behaviour exists in the current build as well).
Updated DD, set property Restart function name to "default".
* replaceNVMBlocks
* split unit json-config file into two files. One for the interface, and one for the rest of the info.
The interface config needs to be loaded for all units to perform the interface check.
The other file only needs to be loaded for the units within the project being built.
## Moved to JIRA
* Add a prefix to subsystem names which will avoid compilation errors when
subsystem names start with numbers after the structure number.
E.g. 1210_12VStartEnable should not be allowed.
* Search for all constant blocks, and identify which include a codeswitch.
Replace the identified blocks with macros (use the VccLib block).
* Handling of NVM. I.E.
* Vectors are not supported. Add vector support!
* Memory from earlier codegenerations not implemented. Implement!
* A2L file generation for NVM-ram needs to be **IMPLEMENTED**!
* Local defines:
* Add the model name as a prefix to all local defines, to ensure a separate namespace.
This is needed in the scripts, as the names of the defines of all units are aggregated into one dict.
* Dependability models have the wrong classes, so they will use the normal memory areas.
The models need to be updated with the new TL dependability variable classes,
which are defined in the updated DD.
* Update powertrain_build to handle classes with the format "ASIL/CVC_**ASIL*[ABC]".
* Build shall fail at the end, and all errors shall be presented
(avoids running the script many times, and gradually finding the latent errors)
(This could be a requirement).
* Refactor code so that each function stores errors, and returns them at the end.
* Consider moving the reading of config files to the classes that abstract the configs.
* VcCoreDummy.c does not include the supplier header file.
This must be fixed, preferably via a supplier independent file.
* Have included VcSupplierCoreAbstraction.h; it contains some defines, which might make it not compile.
Needs to be tested.
* All functions should return an error log, and continue the build.
Stop the build at the end, and present all errors.
* The debug functions are not scheduled. Add this!
* Make it an option to generate debug functionality (this could be removed for production SW).
* Currently all variables, regardless of the time rasters they are used in,
are set at the beginning of each time raster. This costs some extra CPU performance.
Consider changing this.
* If the legacy excel configuration files are going to be used for more than a few more months,
write a project consistency check that checks that all config files have the same projects in them.
In the test project, projects were missing in the SPM_Codeswitch_Setup.xls
that are defined in the SPMEMSInterfaceRequirements.xls.
* Move the tmp source code generation folder to a separate location outside the models directory
(as it is not possible to delete them until matlab is closed).
@ -1,12 +1,13 @@
# Unit Definition File
This unit definition file contains all the metadata needed for the build system to include the software unit in the build
and to create the necessary files (e.g. A2L-files, NVM allocation).
It is also used to perform consistency checks in the system.
For TargetLink models, this unit definition file is created by the source generation scripts.
The unit definition file contains nine major keys - version, outports, inports, nvm, core, dids, pre_procs, local_vars and calib_consts.
These are described below.
Example config_*.json:
@ -14,9 +15,9 @@ Example config_*.json:
{
"version": "0.2.1",
"outports": {
"sVcDummyOne_D_EngLoadReqEl": {
"handle": "VcDummyOne/VcDummyOne/Subsystem/VcDummyOne/tlop_VcAc_Tq_AcLoad5",
"name": "sVcDummyOne_D_EngLoadReqEl",
"configs": ["all"],
"description": "Request change in electrical loads controlled by CEM",
"type": "Int16",
@ -28,12 +29,12 @@ Example config_*.json:
"class": "CVC_EXT"}},
"inports": {
"sVcEc_n_Eng": {
"handle": "VcDummyOne/VcDummyOne/Subsystem/VcDummyOne/tlip_VcEc_n_Eng",
"name": "sVcEc_n_Eng",
"configs": [
["all"],
["Vc_Dummy_One_B_CodegenFastCat == 1"],
["Vc_Dummy_One_B_CodegenDpfRegen == 1"]],
"description": "Engine speed",
"type": "Float32",
"unit": "rpm",
@ -43,7 +44,7 @@ Example config_*.json:
"max": 10000,
"class": "CVC_EXT"}},
"calib_consts": {
"cVcDummyTwo_B_ThrCtrlStrtWght": {
"type": "Bool",
"unit": "g/s",
"description": "Switch to weight target cylinder flow during hand-over from start throttle",
@ -52,11 +53,11 @@ Example config_*.json:
"lsb": "1",
"offset": "0",
"class": "CVC_CAL",
"handle": "VcDummyTwo/VcDummyTwo/Subsystem/VcDummyTwo/VcDummyTwo/1_AirTarDyn/11_CylTarStrt/B_ThrCtrlStrtWght",
"configs": ["all"],
"width": [1]}},
"local_vars": {
"rVcDummyTwo_m_CylTarAct": {
"type": "Float32",
"unit": "mg/stk",
"description": "Target cylinder charge flow for aircharge control",
@ -65,7 +66,7 @@ Example config_*.json:
"lsb": "1",
"offset": "0",
"class": "CVC_DISP",
"handle": "VcDummyTwo/VcDummyTwo/Subsystem/VcDummyTwo/VcDummyTwo/1_AirTarDyn/11_CylTarStrt/Switch1",
"configs": ["all"],
"width": 1}},
"core": {
@ -73,15 +74,15 @@ Example config_*.json:
"VcEvImmoBCM": {
"API_blk": [
{
"path": "VcDummyThree/VcDummyThree/Subsystem/VcDummyThree/VcDummyThree/1000_ImobConnectionLayer/1600_Diag/1620_CoreIfNew/Dem_SetEventStatusPF1",
"config": [
"Vc_NewDiagnosticCoreIF == 1"]},
{
"path": "VcDummyThree/VcDummyThree/Subsystem/VcDummyThree/VcDummyThree/1000_ImobConnectionLayer/1600_Diag/1620_CoreIfNew/Dem_SetEventStatusPP1",
"config": [
"Vc_NewDiagnosticCoreIF == 1"]}],
"blk_name": "NamedConstant1",
"subsystem": "VcDummyThree/VcDummyThree/Subsystem/VcDummyThree/VcDummyThree/1000_ImobConnectionLayer/1600_Diag/1620_CoreIfNew",
"API_blk_type": "Dem_SetEventStatus Pre-Passed",
"description": "",
"type": "",
@ -97,10 +98,10 @@ Example config_*.json:
"Ranking": {},
"TstId": {}},
"dids": {
"yVcDummyFour_B_DriveCycleActive": {
"name": "yVcDummyFour_B_DriveCycleActive",
"description": "Driver has entered the driving cycle 1= Active 0 = Not Active",
"handle": "VcDummyFour/VcDummyFour/Subsystem/VcDummyFour/yVcPsm_B_DriveCycleActive",
"configs": ["Vc_D_CodegenHev > 0"],
"type": "Bool",
"unit": "-",
@ -111,15 +112,15 @@ Example config_*.json:
"class": "CVC_DISP"}},
"nvm": { },
"pre_procs" : [
"Vc_Dummy_Four_B_CodeGen2Trbo",
"Vc_Dummy_Four_B_CodeGenBstPeak",
"Vc_Dummy_Four_B_CodeGenTrbo",
"Vc_Dummy_Four_B_CodeGenTrboMode06",
"Vc_Dummy_Four_B_CodeGenTrboOverSpd"]
}
```
## Unit Definition Data
### outports, inports, calib_consts, local_vars and nvm
@ -133,9 +134,9 @@ The keys outports, inports and nvm have the following keys, which defines them:
### handle
This is a handle to where the variable/parameter is created (outports) or used (inports & nvm).
For TargetLink this is a string identifying the block in the model,
e.g. "VcDummyOne/VcDummyOne/Subsystem/VcDummyOne/yVcVmcPmm_B_SsActive9".
### name
@ -147,19 +148,18 @@ Which codeswitches this variable depends on.
For TargetLink this information is parsed from the model structure,
and depends on the use of pre-processor directives.
It can have the following formats:
* a list of lists of config strings.
  * [[cs1 and cs2] or [cs3 and cs4]].
* a list of config strings.
  * [cs1 and cs2].
* or a string.
  * (cs).
E.g. [["Vc_Dummy_One_B_CodegenFastCat == 1"],["Vc_Dummy_One_B_CodegenDpfRegen == 1"]]
means that the signal is active in the configuration if the following configuration expression evaluates to TRUE:
(Vc_Dummy_One_B_CodegenFastCat == 1) OR (Vc_Dummy_One_B_CodegenDpfRegen == 1).
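As an illustration of how such an entry can be evaluated (the outer list is OR, each inner list is AND), here is a small sketch; `config_active` is a hypothetical helper, not the powertrain-build implementation, and it only handles the "all" keyword and simple `==` expressions.

```python
# Sketch: evaluate a "configs" entry against code switch values (hypothetical helper).
def config_active(configs, switches):
    """configs is a list of lists of expressions: outer list is OR, inner list is AND."""
    def expr_true(expr):
        if expr == "all":
            return True
        name, operator, value = expr.split()  # e.g. "Vc_Dummy_One_B_CodegenFastCat == 1"
        assert operator == "=="
        return switches.get(name) == int(value)

    return any(all(expr_true(expr) for expr in group) for group in configs)


configs = [["Vc_Dummy_One_B_CodegenFastCat == 1"], ["Vc_Dummy_One_B_CodegenDpfRegen == 1"]]
switches = {"Vc_Dummy_One_B_CodegenFastCat": 0, "Vc_Dummy_One_B_CodegenDpfRegen": 1}
print(config_active(configs, switches))  # True
```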
### description
@ -179,7 +179,7 @@ The offset used to convert the variable value from HEX to Physical.
### lsb
The value of a bit (lsb - least significant bit).
The factor used to convert the variable value from HEX to Physical.
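In other words, assuming the usual fixed-point convention, the physical value is recovered from the raw (HEX) value as physical = raw * lsb + offset; a minimal sketch:

```python
# Sketch: raw (HEX) value to physical value, assuming physical = raw * lsb + offset.
def to_physical(raw, lsb, offset):
    return raw * float(lsb) + float(offset)


print(to_physical(0x0A, lsb="1", offset="0"))  # 10.0
```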
### min
@ -196,9 +196,8 @@ The storage class of the variable. I.e. which type of memory the variable/parame
### core
The unit's core IDs have the following different types - Events, IUMPR, FIDs, TestID and Ranking
(which is not a part of the core, but is included here for simplicity).
```json
{
@ -206,13 +205,13 @@ TODO: Remove some of the keys for the Core Identifiers. subsystem, type, unit, o
"NameOfId": {
"API_blk": [
{
"path": "VcDummyThree/VcDummyThree/Subsystem/VcDummyThree/VcDummyThree/1000_ImobConnectionLayer/1600_Diag/1620_CoreIfNew/Dem_SetEventStatusPF1",
"config": ["Vc_NewDiagnosticCoreIF == 1"]},
{
"path": "VcDummyThree/VcDummyThree/Subsystem/VcDummyThree/VcDummyThree/1000_ImobConnectionLayer/1600_Diag/1620_CoreIfNew/Dem_SetEventStatusPP1",
"config": ["Vc_NewDiagnosticCoreIF == 1"]}],
"blk_name": "NamedConstant1",
"subsystem": "VcDummyThree/VcDummyThree/Subsystem/VcDummyThree/VcDummyThree/1000_ImobConnectionLayer/1600_Diag/1620_CoreIfNew",
"API_blk_type": "Dem_SetEventStatus Pre-Passed",
"description": "",
"type": "",
@ -227,29 +226,27 @@ TODO: Remove some of the keys for the Core Identifiers. subsystem, type, unit, o
}
```
The first key under the ID-type key is the name of the ID.
The value of that key is a dict with the following keys:
### API_blk
The value of this key is a list of dicts;
these dicts define the paths to all the instances where this ID is used in the model,
and in which configurations the ID is active.
### API_blk_type
The value of this key is a string, which defines the type of API block that is used for this ID.
### blk_name
The value of this key is a string, which defines the name of the block in Simulink.
### dids
The dids defined in the unit.
### pre_procs
Contains a list of strings, which define the preprocessor names used in the unit for configuration.
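A small sketch of reading such a unit definition file and pulling out a few of the keys described above; the file name is hypothetical and the id-type key ("Events") is only an example.

```python
# Sketch: read a unit definition file and list some of its keys (hypothetical file name).
import json

with open("config_VcDummyOne.json", encoding="utf-8") as cfg_file:
    unit_cfg = json.load(cfg_file)

print(unit_cfg["version"])
print(sorted(unit_cfg["outports"]))                       # outport names
print(unit_cfg.get("pre_procs", []))                      # preprocessor names
print(list(unit_cfg.get("core", {}).get("Events", {})))   # core IDs of one id-type, if present
```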
@ -24,10 +24,10 @@ if requirement_path.exists():
expected_package = "powertrain-build==" + __version__
for line in requirement_file:
    if expected_package in line:
        LOGGER.info('powertrain-build version matched requirements!')
        break
    elif "powertrain-build==" in line and expected_package not in line:
        LOGGER.warning('powertrain-build version does not match requirements!')
        break
else:
@ -365,7 +365,7 @@ class PyBuildWrapper(pt_matlab.Matlab):
"""Execute powertrain-build.

Returns:
    exit_code: Exit code from powertrain-build build step.
"""
try:
    exit_code = build.build(