Abstract
Testing embedded systems software has become a costly activity as these systems grow more complex to meet rising demands, so testing processes must be both effective and affordable. An ideal testing process begins with validated requirements and starts as early as possible, so that requirements defects can be fixed before they propagate and become more expensive to address. It should also facilitate the creation of test procedures and automate their execution. We propose a novel methodology for testing functional requirements. Its activities rely on standard notations: UCM for modeling scenarios derived from requirements, TDL for describing test cases, and TTCN-3 for executing test procedures; other test scripting languages can also be used with our methodology. The methodology is automated through model transformation, which generates the test artifacts. Its main goals are to leverage requirements represented as scenarios, to replace natural-language test case descriptions with test scenarios in TDL, and to generate executable test procedures. We demonstrate the feasibility of the proposed approach on a public case study and provide an empirical evaluation using a case study from the avionics domain.
Notes
A gate is a point of communication for exchanging information between components; it also specifies the data that can be exchanged.
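In the generated TTCN-3 (see Rule 2 in Appendix 3), such a gate type is mapped to a message-based port type that fixes which data may flow through it. The following is a minimal sketch, not taken from the case study, with hypothetical module, type and field names:

```
module GateExample {
  // Hypothetical data types carried over the gate; in the generated code these
  // would come from the data module derived from the UCM data model.
  type record dataOut { charstring command }
  type record dataIn  { charstring status  }

  // The gate itself: a message-based port type restricted to dataOut/dataIn.
  type port GT_Example message {
    inout dataOut;
    inout dataIn;
  }
}
```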
Obtained from Philip Makedonski, University of Göttingen.
References
Adolph, S., Cockburn, A., Bramble, P.: Patterns for Effective Use Cases. Addison-Wesley Longman Publishing Co., Inc., Boston (2002)
Baker, P., Dai, Z.R., Grabowski, J., Schieferdecker, I., Williams, C.: Model-Driven Testing: Using the UML Testing Profile. Springer, Berlin (2007). ISBN 9783540725626
Bertolino, A., Fantechi, A., Gnesi, S., Lami, G.: Product line use cases: Scenario-based specification and testing of requirements. In: Software Product Lines, pp. 425–445. Springer, Berlin Heidelberg (2006)
Boniol, F., Wiels, V.: The landing gear system case study. In: ABZ 2014: The Landing Gear Case Study, pp. 1–18. Springer (2014)
Boulet, P., Amyot, D., Stepien, B.: Towards the generation of tests in the test description language from use case map models. In: SDL 2015: Model-Driven Engineering for Smart Cities, pp. 193–201. Springer (2015)
Briand, L., Labiche, Y.: A UML-based approach to system testing. Softw. Syst. Model. 1(1), 10–42 (2002)
Buhr, R.J.A.: Use case maps as architectural entities for complex systems. IEEE Trans. Softw. Eng. 24(12), 1131–1155 (1998)
RTCA DO-178A: Software Considerations in Airborne Systems and Equipment Certification. RTCA, Committee SC-152, 22 March 1985
RTCA DO-178C: Software Considerations in Airborne Systems and Equipment Certification. Available from RTCA at www.rtca.org
Dvorak, D.: NASA study on Flight Software Complexity. NASA office of chief engineer (2009)
Elberzhager, F., Rosbach, A., Münch, J., Eschbach, R.: Reducing test effort: a systematic mapping study on existing approaches. Inf. Softw. Technol. 54(10), 1092–1106 (2012)
Hasling, B., Goetz, H., Beetz, K.: Model based testing of system requirements using UML use case models. In: 2008 1st International Conference on Software Testing, Verification, and Validation, pp. 367–376. IEEE (2008, April)
Heckel, R., Lohmann, M.: Towards model-driven testing. Electron. Notes Theor. Comput. Sci. 82(6), 33–43 (2003). ISBN 1571-0661
jUCMNav tool: http://jucmnav.softwareengineering.ca/ucm/bin/view/ProjetSEG/WebHome
ETSI ES 203 119-1: The Test Description Language (TDL). http://www.etsi.org/deliver/etsi_es/203100_203199/20311901/01.03.01_60/es_20311901v010301p.pdf. See also http://www.etsi.org/technologies-clusters/technologies/test-description-language
UCM quick reference: http://jucmnav.softwareengineering.ca/ucm/pub/UCM/VirLibTutorial99/UCMquickRef.pdf
ETSI ES 201 873-1: TTCN-3 Core Language. http://www.etsi.org/deliver/etsi_es/201800_201899/20187301/04.08.01_60/es_20187301v040801p.pdf
Hovsepyan, A., Van Landuyt, D., Michiels, S., Joosen, W., Rangel, G., Fernandez Briones, J., Depauw, J.: Model-driven software development of safety-critical avionics systems: an experience report. In: 1st International Workshop on Model-Driven Development Processes and Practices co-located with ACM/IEEE 17th International Conference on Model Driven Engineering Languages & Systems (MoDELS 2014), vol. 1249 (2014, September)
ITU-T Recommendation Z.151: User Requirements Notation (URN), Language Definition. http://www.itu.int/rec/T-REC-Z.151/en
Kealey, J., Amyot, D.: Enhanced use case map traversal semantics. In: Gaudin, E., Najm, E., Reed, R. (eds.) SDL 2007. LNCS, vol. 4745, pp. 133–149. Springer, Heidelberg (2007)
Leite, J.C.S.P., Hadad, G., Doorn, J., Kaplan, G.: A scenario construction process. Requir. Eng. J. 5(1), 38–61 (2000)
Makedonski, P., Adamis, G., Käärik, M., Ulrich, A., Wendland, M.-F., Wiles, A.: Bringing TDL to users: a hands-on tutorial. In: User Conference on Advanced Automated Testing (UCAAT 2014), Munich (2014)
Marrone, S., Flammini, F., Mazzocca, N., Nardone, R., Vittorini, V.: Towards model-driven V&V assessment of railway control systems. Int. J. Softw. Tools Technol. Transf. 16(6), 669–683 (2014)
Nebut, C., Fleurey, F., Le Traon, Y., Jezequel, J.M.: Automatic test generation: a use case driven approach. IEEE Trans. Softw. Eng. 32(3), 140–155 (2006)
Nogueira, S., Sampaio, A., Mota, A.: Test generation from state based use case models. Formal Asp. Comput. 26(3), 441–490 (2014)
Ryser, J., Glinz, M.: A scenario-based approach to validating and testing software systems using statecharts. In: Proceedings of 12th International Conference on Software and Systems Engineering and Their Applications (1999, December)
Sarmiento, E., Sampaio do Prado Leite, J. C., Almentero, E.: C&L: generating model based test cases from natural language requirements descriptions. In: 2014 IEEE 1st International Workshop on Requirements Engineering and Testing (RET), pp. 32–38. IEEE (2014, August)
Schätz, B.: 10 years model-driven – what did we achieve? In: Proceedings of the Second Eastern European Regional Conference on the Engineering of Computer Based Systems (ECBS-EERC '11). IEEE Computer Society, Washington, DC, USA (2011). doi:10.1109/ECBS-EERC.2011.42
Somé, S. S., Cheng, X.: An approach for supporting system-level test scenarios generation from textual use cases. In: Proceedings of the 2008 ACM Symposium on Applied computing, pp. 724–729. ACM (2008, March)
Ulrich, A., Jell, S., Votintseva, A., Kull, A.: The ETSI Test Description Language TDL and its application. In: 2014 2nd International Conference on Model-Driven Engineering and Software Development (MODELSWARD), pp. 601–608. IEEE (2014, January)
Broy, M., Kirstan, S., Krcmar, H., Schätz, B.: What is the benefit of a model-based design of embedded software systems in the car industry? doi:10.4018/978-1-61350-438-3.ch013
Zhang, M., Yue, T., Ali, S., Zhang, H., Wu, J.: A systematic approach to automatically derive test cases from use cases specified in restricted natural languages. In: Proceedings of the 8th International Conference on System Analysis and Modeling: Models and Reusability (SAM’14) (2014)
Acknowledgements
This research was supported by CRIAQ, Esterline CMC Electronics, Solutions Isoneo and Mitacs-Accelerated Graduate Research Internship Program. Project title: Test Automation with TTCN-3, Grant Numbers: FR05066, FR05067.
Additional information
Communicated by Professor Jean-Marc Jezequel.
Appendices
Appendix 1: TDL specification developed from UCM [successful deployment] scenario model

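The full TDL specification is given in the paper as a listing. As a flavor of the concrete TDL syntax summarized in Appendix 3 (Rules 6, 8 and 11), the fragment below is a minimal illustrative sketch of a test description for a successful-deployment scenario; all identifiers (LGS_Data, TC_LGS, Tester_1, LGS_1, handleDown, gearsLockedDown) are hypothetical and not taken from the original listing.

```
Test Description (LGS_Data) TD_SuccessfulDeployment {
    use configuration: TC_LGS;
    {
        Tester_1 sends instance handleDown to LGS_1;
        LGS_1 sends instance gearsLockedDown to Tester_1;
        set verdict to pass;
    }
}
```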
Appendix 2: The Test Description module in TTCN-3 mapped from TDL Test Description

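The generated TTCN-3 module is likewise given in the paper as a listing. The sketch below only indicates what applying Rules 5 to 8, 11 and 23 of Appendix 3 to the TDL fragment sketched in Appendix 1 could produce; identifiers remain hypothetical, and the system clause and control part are added solely to make the sketch self-contained.

```
// Illustrative sketch only; all identifiers are hypothetical.
module TD_SuccessfulDeployment {
  import from LGS_Data all;   // Rule 6: import of the data proxy module
  import from TC_LGS all;     // Rule 6: import of the test configuration module

  // Rule 6 adds a test case; the system clause is added here so that the
  // map statement of Rule 5 has a test system interface to refer to.
  testcase TD_SuccessfulDeployment_TC() runs on Tester_1 system LGS_1 {
    map(mtc:g_tester, system:g_sut);            // Rule 5: connect tester gate to SUT gate
    g_tester.send(handleDown_S);                // Rule 8: stimulus (instance suffixed _S, Rule 23)
    alt {                                       // Rule 7
      [] g_tester.receive(gearsLockedDown_R) {  // Rule 8: expected response (suffixed _R)
        setverdict(pass);                       // Rule 11
      }
    }
  }

  // Control part added only to make the sketch executable on its own.
  control {
    execute(TD_SuccessfulDeployment_TC());
  }
}
```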
Appendix 3: Transformation rules from TDL to TTCN-3 implemented in Xtend
| Rule | TDL metamodel element (abstract syntax) | Our TDL concrete syntax | Equivalent TTCN-3 statements | Description |
|---|---|---|---|---|
| Rule 1 | TestConfiguration | Test Configuration <tc_name> | module <tc_name> { } | Map to a module statement with the name <tc_name> |
| Rule 2 | GateType | Gate Type <gt_name> accepts dataOut, dataIn; | type port <gt_name> message { inout dataOut; inout dataIn; } | Map to a message-based port type statement that declares the concrete data to be exchanged over the port |
| Rule 3 | ComponentType | Component Type <ct_name> { gate types: <gt_name>; } instantiate <comp_name1> as Tester of type <ct_name> having { gate <g_name1> of type <gt_name>; } | type component <comp_name1> { port <gt_name> <g_name1>; } | Map to a component type statement and associate a port to it. The port is not a system port |
| Rule 4 | ComponentType | Component Type <ct_name> { gate types: <gt_name>; } instantiate <comp_name2> as SUT of type <ct_name> having { gate <g_name2> of type <gt_name>; } | type component <comp_name2> { port <gt_name> <g_name2>; } | Map to a component type statement and associate a port of the test system interface to it |
| Rule 5 | Connection | connect <g_name1> to <g_name2> | map(mtc:<g_name1>, system:<g_name2>) | Map to a map statement where a test component port is mapped to a test system interface port |
| Rule 6 | TestDescription | Test Description (<dataproxy>) <td_name> { use configuration: <tc_name>; { } } | module <td_name> { import from <dataproxy> all; import from <tc_name> all; testcase _TC() runs on <comp_name1> { } } | Map to a module statement with the name <td_name>. The TDL <DataProxy> element passed as an optional formal parameter is mapped to an import statement of the <DataProxy> to be used in the module. The TDL Test Configuration associated with the TestDescription is mapped to an import statement of the Test Configuration module. A test case definition is added |
| Rule 7 | AlternativeBehaviour | alternatively { } | alt { } | Map to an alt statement |
| Rule 8 | Interaction | <comp_name1> sends instance <instance_outX> to <comp_name2> | <comp_name1>.send(<instance_outX>) | Map to a send statement that sends a stimulus message |
| | | <comp_name2> sends instance <instance_inX> to <comp_name1> | <comp_name1>.receive(<instance_inX>) | Map to a receive statement that receives a response when the sending source is an SUT component |
| Rule 9 | VerdictType | Verdict <verdict_value> | verdicttype | <verdict_value> takes one of the values {inconclusive, pass, fail}. No mapping is necessary since these values exist in TTCN-3 |
| Rule 10 | TimeUnit | Time Unit <time_unit> | N/A | <time_unit> takes one of the values {tick, nanosecond, microsecond, millisecond, second, minute, hour}. No mapping is necessary; a float value is used to represent the time in seconds |
| Rule 11 | VerdictAssignment | set verdict to <verdict_value> | setverdict(<verdict_value>) | Map to a setverdict statement |
| Rule 12 | Action | perform action <action_name> | function <action_name>() runs on <ct_name> { } <action_name>(); | Map to a function signature and to a function call. The function body is refined later if applicable |
| Rule 13 | Stop | stop | stop | Map to a stop statement within an alt statement |
| Rule 14 | Break | break | break | Map to a break statement within an alt statement |
| Rule 15 | Timer | timer <timer_name> | timer <timer_name> | Map to a timer definition statement |
| Rule 16 | TimerStart | start <timer_name> for (time_unit) | <timer_name>.start(time_unit); | Map to a start statement |
| Rule 17 | TimerStop | stop <timer_name> | <timer_name>.stop; | Map to a stop statement |
| Rule 18 | TimeOut | <timer_name> times out | <timer_name>.timeout; | Map to a timeout statement |
| Rule 19 | Quiescence/Wait | is quiet for (time_unit); waits for (time_unit) | timer <timer_name> <timer_name>.start(time_unit); <timer_name>.timeout | Map to a timer definition statement, a start statement and a timeout statement |
| Rule 20 | InterruptBehaviour | interrupt | stop | Map to a stop statement |
| Rule 21 | BoundedLoopBehaviour | repeat <number> times | repeat | Map to a repeat statement. The repeat is used as the last statement in the alt behavior. It should be used once for each possible alternative |
| Rule 22 | DataSet | Data Set <DataSet_name> { } | type record <DataSet_name>Type { } | Map a Data Set to a record type named after DataSet_name and suffixed with "Type" |
| Rule 23 | DataInstance | instance <instance_name>; | [<instance_name>_S;] [<instance_name>_R;] | Map an instance to a variable named after instance_name and suffixed either with "_S" for a stimulus or with "_R" for a response |
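Appendix 3's title notes that these rules are implemented in Xtend. The snippet below is not the authors' implementation; it is a minimal sketch of how a single rule (Rule 2, GateType) can be expressed with an Xtend template expression, where the GateType class is a simplified, hypothetical stand-in for the corresponding TDL metamodel element.

```
import java.util.List
import org.eclipse.xtend.lib.annotations.Data

// Simplified, hypothetical stand-in for the TDL GateType metamodel element.
@Data class GateType {
    String name
    List<String> dataInstances
}

class GateTypeRule {
    // Sketch of Rule 2: map a TDL gate type to a message-based TTCN-3 port type.
    def String toTtcn3(GateType gt) '''
        type port «gt.name» message {
            «FOR d : gt.dataInstances»
            inout «d»;
            «ENDFOR»
        }
    '''
}
```

For example, new GateTypeRule().toTtcn3(new GateType("GT_LGS", #["dataOut", "dataIn"])) yields a port type of the form shown in Rule 2 above.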
Appendix 4
TDL Test Configuration metamodel

The Test Description metamodel

Appendix 5: The concrete metamodel of the UCM notation

Appendix 6: The scenario metamodel

The data metamodel

Cite this article
Kesserwan, N., Dssouli, R., Bentahar, J. et al. From use case maps to executable test procedures: a scenario-based approach. Softw Syst Model 18, 1543–1570 (2019). https://doi.org/10.1007/s10270-017-0620-y