Article (Scientific journals)
Test Generation and Test Prioritization for Simulink Models with Dynamic Behavior
Matinnejad, Reza; Nejati, Shiva; Briand, Lionel et al.
2019, in IEEE Transactions on Software Engineering, 45(9), pp. 919-944
Peer reviewed
 

Files


Full Text: paper.pdf (Author postprint, 2.31 MB)

Details



Keywords :
Simulink models; search-based software testing; test generation; test prioritization; test oracle; output diversity; signal features; structural coverage
Abstract :
[en] All engineering disciplines are founded on and rely on models, although they may differ in the purposes and usages of modeling. Among these disciplines, the engineering of Cyber-Physical Systems (CPSs) particularly relies on models with dynamic behavior (i.e., models that exhibit time-varying changes). The Simulink modeling platform greatly appeals to CPS engineers because it captures dynamic behavior models and provides seamless support for two indispensable engineering activities: (1) automated verification of abstract system models via model simulation, and (2) automated generation of system implementations via code generation. We identify three main challenges in the verification and testing of Simulink models with dynamic behavior, namely the incompatibility, oracle, and scalability challenges. We propose a Simulink testing approach that attempts to address these challenges. Specifically, we propose a black-box test generation approach, implemented using meta-heuristic search, that aims to maximize diversity in the output signals generated by Simulink models. We argue that, in the CPS domain, test oracles are likely to be manual and are therefore the main cost driver of testing. To lower the cost of manual test oracles, we propose a test prioritization algorithm that automatically ranks the test cases generated by our test generation algorithm according to their likelihood of revealing a fault. Engineers can then select, according to their test budget, a subset of the most highly ranked test cases. To demonstrate scalability, we evaluate our testing approach on industrial Simulink models. Our evaluation shows that our test generation and test prioritization approaches outperform baseline techniques based on random testing and structural coverage.
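
Illustrative sketch: the abstract describes generating tests that maximize diversity among a Simulink model's output signals and ranking tests by their likelihood of revealing a fault. The Python sketch below shows one simple, hypothetical way to rank simulated test outputs by output diversity, using plain Euclidean distance between sampled signals and a greedy farthest-first ordering; it is not the authors' implementation, and the function names and example signals are assumptions made here for illustration only.

import numpy as np

def signal_distance(s1: np.ndarray, s2: np.ndarray) -> float:
    # Dissimilarity of two output signals sampled at the same time steps
    # (plain Euclidean distance; the paper's actual measures may differ).
    return float(np.linalg.norm(s1 - s2))

def greedy_diversity_ranking(outputs: list[np.ndarray]) -> list[int]:
    # Order test cases so that each newly selected test maximizes its minimum
    # distance to the outputs of the tests selected so far (farthest-first).
    remaining = set(range(len(outputs)))
    # Seed with the test whose output is, in total, farthest from all others.
    first = max(remaining,
                key=lambda i: sum(signal_distance(outputs[i], outputs[j])
                                  for j in remaining if j != i))
    order = [first]
    remaining.remove(first)
    while remaining:
        nxt = max(remaining,
                  key=lambda i: min(signal_distance(outputs[i], outputs[j])
                                    for j in order))
        order.append(nxt)
        remaining.remove(nxt)
    return order

if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 100)
    # Hypothetical output signals from four simulations of a model under test.
    outputs = [np.sin(2 * np.pi * t),
               np.sin(2 * np.pi * t) + 0.05,
               np.cos(2 * np.pi * t),
               np.zeros_like(t)]
    print(greedy_diversity_ranking(outputs))  # prints an index ordering, most diverse first

A test budget can then be applied by keeping only the first k indices of the returned ordering, mirroring the idea of selecting the most highly ranked test cases for manual oracle checking.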
Disciplines :
Computer science
Author, co-author :
Matinnejad, Reza;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
Nejati, Shiva;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
Briand, Lionel;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
Bruckmann, Thomas;  Delphi Automotive Systems, Luxembourg
External co-authors :
yes
Language :
English
Title :
Test Generation and Test Prioritization for Simulink Models with Dynamic Behavior
Publication date :
September 2019
Journal title :
IEEE Transactions on Software Engineering
ISSN :
0098-5589
Publisher :
Institute of Electrical and Electronics Engineers, New York, United States
Volume :
45
Issue :
9
Pages :
919-944
Peer reviewed :
Peer reviewed
Focus Area :
Computational Sciences
European Projects :
H2020 - 694277 - TUNE - Testing the Untestable: Model Testing of Complex Software-Intensive Systems
Funders :
CE - Commission Européenne [BE]
Available on ORBilu :
since 25 February 2018

Statistics


Number of views: 347 (101 by Unilu)
Number of downloads: 1545 (54 by Unilu)
Scopus citations®: 46 (42 without self-citations)
WoS citations: 43
