How to Perform Integration Testing in Automotive (ISO 26262) in Subsystem Level

Last updated: 2020-04-03

Background and Environment

One question increasingly asked of VectorCAST Field Application Engineers concerns best practices for integration testing under the ISO 26262 standard. Until recently, most of our Automotive customers focused their ISO 26262 compliance efforts (including code coverage) on unit testing only. That unit testing effort is usually interpreted liberally and is at times performed with test environments that include more than just one file.

Integration testing, on the other hand, refers to something quite different in the standard. As ISO 26262 Part 6, 10.2 states:

In this subphase, the particular integration levels are tested against the software architectural design, and the interfaces between the software units and the software components are tested. The steps of the integration and the tests of the software components are to correspond directly to the hierarchical architecture of the software.

The standard then describes which tests (and which code coverage levels, namely function and call coverage) should be conducted at that level. But the exact definition of integration testing is a grey area. How do you define a “software unit”? Is a file a software unit? A module? An application running within the AUTOSAR framework?

Integration as a subsystem, or testing the application level

One possible answer customers have given to the question “what is integration testing for you?” is the application. Usually, these customers have integrated an AUTOSAR OS (or, previously, some flavor of OSEK) into their code base. That operating system launches a number of tasks that themselves launch a number of applications. The system is cyclical, with the tasks being relaunched at regular but different intervals (e.g., 10 ms, 100 ms, 500 ms, etc.).


Figure 1: Simplified RTOS structure

The customer's goal is then to test the applications as they are launched by the task function, i.e. at the “top” of AUTOSAR. The application can be executed in isolation from the other apps, or with the other tasks launched as usual, except for the one being tested (which runs wrapped in the test harness). Once the test is completed, the whole system can be stopped and the results of the test captured.

Solution

For these customers, we deployed VectorCAST/C++. VectorCAST/C++ is a unit, module, and integration test tool, in the sense that it can test a variable number of units from an application within one environment. Independently of the number of units put in the environment, VectorCAST/C++ runs tests by calling individual functions directly and setting global variables. Hence it is often referred to as a “white box testing” solution, although this term also suffers from multiple definitions.

The integration we delivered used the customer's production compiler and executed tests on the target. This can of course be done when the board is not running any operating system. In this case, however, the OS (AUTOSAR) is started first. Once the tasks initialize, one of them launches the VectorCAST test harness. VectorCAST's integration with the facilities available in the customer's environment then monitors the execution of the test and captures the results. The specific methods for doing so are numerous (including port reading and memory capture via the debugger) and are customized to fit user requirements.

Possible configurations

This solution answered our customers' need by providing a way to run a test on top of AUTOSAR. Any OS calls present in the code are not stubbed, and the harness runs as part of a task. Because the OS is usually customized to the needs of the customer, the matching VectorCAST integration must also be customized.

There can be a number of ways such an integration can be refined. Since many of these possible choices may have an impact on the scope and meaning of tests, they should be discussed with your VectorCAST Field Application Engineer during deployment of the solution:

  • Should the test harness run an application in isolation of all others (so only the application tested effectively runs) or should it launch all the other applications (with the exception of the application being tested)?
  • How many files should be put under test, as opposed to compiled in (not stubbed) or stubbed? This is an especially important matter if the test harness must operate in one of the more frequently reset tasks. If the harness is so complex that it cannot run to completion before the task is reset, the test will fail. Keeping the number of files under test to a strict minimum reduces the odds of this happening, and the code not under test can be kept unstubbed, or even instrumented for code coverage.
  • Since this approach will necessarily entail some modifications to the file where the tasks are launched, should these be done in a separate copy, or can they be made part of the build, provided they only become effective when relevant VectorCAST compilation macros are provided during compilation?
  • Finally, should a special linker file be developed to accommodate the different requirements of the test harness, given that AUTOSAR linker files tend to be highly segmented and may not be suitable for it?

Conclusion

The subsystem approach is certainly very flexible. It also offers the possibility of recycling test cases developed at the unit level in integration test environments. This way, customers can progressively reuse test cases from unit to module to integration testing, which not only confers some time savings but also makes it possible to find bugs in the smallest amount of code under test possible. Indeed, it makes sense that bugs that can be found at the unit level be captured and resolved at that level, as waiting for them to occur in larger, more complex environments would make the debugging and fixing process longer and less efficient.

Please see How to Perform Integration Testing in Automotive (ISO 26262) in System Level for customers who opt not to use the subsystem-level approach.

For more information on VectorCAST, visit us at www.vector.com.
