
Testing Avionics Software: Is FMI up to the Task?

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11246)

Abstract

This paper compares two test engine architectures, one based on the RT-Tester test system and one based on FMI, and analyzes how these different approaches satisfy the verification and validation needs of safety-critical avionics software. The study is based on an aircraft controller application, which motivates the requirements for the test engine designs.


Notes

  1. Handling of discrete I/O is necessary to start or reset the aircraft controller. Without discrete I/O, automated testing is thus not possible.

  2. Controlling external devices is essential to reach the test goals, for instance, verifying the stability of the SUT with respect to unexpected timings of incoming CAN messages (see the CAN timing sketch after this list).

  3. The devices connected to the aircraft controller are configured via software parameters, and different tests may have to exercise different parameter settings. These settings include, for example, the number of heating controllers and the association of heaters to heating controllers. It must therefore be possible to straightforwardly adapt the set of simulations, the simulation parameters, and the order in which the simulations are executed for each single test (see the configuration sketch after this list).

  4. A predictable execution order is necessary because the different simulations and checkers running on the test engine may depend on one another (see the scheduling sketch after this list).

  5. Note that this is only one possibility for defining the interface to the actual aircraft controller. It would likewise be possible to collect the interfaces of a specific hardware device in one FMU, or to structure the interfaces by application. This decision, however, does not affect the overall architecture.

  6. Observe that this requirement is strongly related to predictable scheduling and a suitable scheduling policy, as specified by requirement REQ-SCHEDULING.
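
The following is a minimal sketch, in Python, of the kind of test-side stimulation described in note 2: a CAN traffic generator that perturbs message timing within configurable bounds so that the SUT's robustness against unexpected arrival times can be exercised. All names (CanFrame, perturbed_schedule) are illustrative assumptions; neither RT-Tester's nor the FMI-based test engine's actual interfaces are shown.

```python
# Illustrative sketch only (cf. note 2): generate CAN frames whose inter-arrival
# times deviate from the nominal period by a bounded, reproducible jitter.
import random
from dataclasses import dataclass

@dataclass
class CanFrame:
    can_id: int
    payload: bytes

def perturbed_schedule(frames, nominal_period_s, jitter_s, seed=0):
    """Yield (send_time_s, frame) pairs with bounded random timing jitter."""
    rng = random.Random(seed)          # fixed seed keeps the test run reproducible
    t = 0.0
    for frame in frames:
        t += nominal_period_s + rng.uniform(-jitter_s, jitter_s)
        yield max(t, 0.0), frame

if __name__ == "__main__":
    frames = [CanFrame(0x123, bytes([i])) for i in range(5)]
    for when, frame in perturbed_schedule(frames, nominal_period_s=0.010, jitter_s=0.004):
        print(f"t={when:.4f} s  id=0x{frame.can_id:03X}  data={frame.payload.hex()}")
```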
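
Note 3 calls for per-test adaptation of simulations, parameters, and execution order. The configuration sketch below shows one hypothetical way to express this; the field names (number of heating controllers, heater mapping) follow the note, but the structure itself is an assumption rather than the configuration format used by either test engine.

```python
# Hypothetical per-test configuration (cf. note 3). The simulations are listed
# in the order in which the test engine is expected to execute them.
from dataclasses import dataclass, field

@dataclass
class SimulationConfig:
    name: str                               # e.g. a simulation FMU or test-engine component
    parameters: dict = field(default_factory=dict)

@dataclass
class TestConfig:
    test_id: str
    simulations: list                       # executed in the listed order

TC_HEATING_001 = TestConfig(
    test_id="TC-HEATING-001",
    simulations=[
        SimulationConfig("discrete_io_sim", {"reset_line": "DIO_3"}),
        SimulationConfig("heating_controller_sim",
                         {"num_controllers": 2,
                          "heater_map": {"heater_1": 1, "heater_2": 1, "heater_3": 2}}),
        SimulationConfig("can_bus_sim", {"timing_jitter_s": 0.004}),
    ],
)
```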
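
Notes 4 and 6 require a predictable scheduling policy because simulations and checkers may depend on one another. The scheduling sketch below illustrates one simple policy satisfying that requirement: components are ordered once according to their declared dependencies and then stepped in the same fixed order at every communication point. It is a deliberately simplified illustration, not the FMI master algorithm or RT-Tester's scheduler.

```python
# Illustrative deterministic scheduler (cf. notes 4 and 6): a fixed, dependency-
# respecting order is computed once and reused at every communication step.
from graphlib import TopologicalSorter   # Python 3.9+

def fixed_order(dependencies):
    """Return a deterministic execution order honoring 'runs after' dependencies."""
    return list(TopologicalSorter(dependencies).static_order())

def run(components, dependencies, step_s, steps):
    order = fixed_order(dependencies)
    t = 0.0
    for _ in range(steps):
        for name in order:               # same order at every step -> reproducible results
            components[name](t, step_s)
        t += step_s

if __name__ == "__main__":
    trace = []
    components = {name: (lambda t, h, name=name: trace.append((round(t, 3), name)))
                  for name in ("can_bus_sim", "heating_controller_sim", "stability_checker")}
    # The checker must run after both simulations it observes.
    dependencies = {"stability_checker": ["can_bus_sim", "heating_controller_sim"]}
    run(components, dependencies, step_s=0.01, steps=2)
    print(trace)
```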


Author information


Corresponding author

Correspondence to Jörg Brauer.



Copyright information

© 2018 Springer Nature Switzerland AG

About this paper


Cite this paper

Brauer, J., Möller, O., Peleska, J. (2018). Testing Avionics Software: Is FMI up to the Task? In: Margaria, T., Steffen, B. (eds) Leveraging Applications of Formal Methods, Verification and Validation. Distributed Systems. ISoLA 2018. Lecture Notes in Computer Science, vol 11246. Springer, Cham. https://doi.org/10.1007/978-3-030-03424-5_32


  • DOI: https://doi.org/10.1007/978-3-030-03424-5_32

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-03423-8

  • Online ISBN: 978-3-030-03424-5

  • eBook Packages: Computer Science, Computer Science (R0)
