• Evelin Halling
• Jüri Vain
• Artem Boyarchuk
• Oleg Illiashenko




Keywords: Model-based testing, Test scenario description language, Timed automata, Verification by model checking, Conformance testing.


Abstract: In mission-critical systems a single failure can have catastrophic consequences. This places high demands on the timely detection of design faults and runtime failures. With traditional software testing methods, detecting deeply nested faults that occur only sporadically is almost impossible. The discovery of such bugs can be facilitated by generating well-targeted test cases in which the test scenario is explicitly specified. On the other hand, the excess of implementation detail in manually crafted test scripts makes the test results hard to understand and interpret. This paper defines the high-level test scenario specification language TDLTP for specifying complex test scenarios relevant to model-based testing of mission-critical systems. The syntax and semantics of the TDLTP operators are defined, and the transformation rules that map its declarative expressions to executable Uppaal Timed Automata test models are specified. The scalability of the method is demonstrated on the TUT100 satellite software integration testing case study.
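The abstract refers to mapping declarative scenario expressions onto executable Uppaal Timed Automata test models. As an illustration only (neither TDLTP's concrete syntax nor the paper's transformation rules are reproduced here), the following minimal Python sketch shows the core idea behind such a test model: locations, a clock, and guarded transitions. The scenario names (`req`, `resp`) and the 5-time-unit response deadline are hypothetical.

```python
# Minimal sketch of a timed-automaton test model (illustrative only).
# A transition is (source, action, guard, reset, target); guards are
# predicates over the current clock value, in the spirit of Uppaal TA.

class TimedAutomaton:
    def __init__(self, initial):
        self.location = initial
        self.clock = 0.0
        self.transitions = []  # list of (src, action, guard, reset, dst)

    def add_transition(self, src, action, guard, reset, dst):
        self.transitions.append((src, action, guard, reset, dst))

    def delay(self, d):
        """Let time pass; the clock advances uniformly."""
        self.clock += d

    def step(self, action):
        """Take the first enabled transition labelled `action`.
        Returns True on success, False if no guard is satisfied
        (which a tester would interpret as a failed conformance check)."""
        for src, act, guard, reset, dst in self.transitions:
            if src == self.location and act == action and guard(self.clock):
                if reset:
                    self.clock = 0.0
                self.location = dst
                return True
        return False

# Hypothetical scenario: a request must be answered within 5 time units.
ta = TimedAutomaton("idle")
ta.add_transition("idle", "req", lambda c: True, True, "waiting")
ta.add_transition("waiting", "resp", lambda c: c <= 5, False, "idle")

ta.step("req")        # send the stimulus; the clock is reset to 0
ta.delay(3.0)         # the system under test responds after 3 time units
ok = ta.step("resp")  # guard c <= 5 holds, so the step succeeds
```

Replaying an observed timed trace against such a model and checking that every step succeeds is the essence of timed conformance testing; the paper's contribution is generating these models automatically from TDLTP scenarios rather than writing them by hand.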


How to Cite

Halling, E., Vain, J., Boyarchuk, A., & Illiashenko, O. (2019). TEST SCENARIO SPECIFICATION LANGUAGE FOR MODEL-BASED TESTING. International Journal of Computing, 18(4), 408-421. https://doi.org/10.47839/ijc.18.4.1611