
Maintaining the health of software monitors

  • SI: SwHM
  • Published in Innovations in Systems and Software Engineering

Abstract

Software health management (SWHM) techniques complement the rigorous verification and validation processes applied to safety-critical systems before deployment. These techniques monitor deployed software in its execution environment, serving as the last line of defense against the effects of a critical fault. SWHM monitors use information from the specification and implementation of the monitored software to detect violations, predict possible failures, and help the system recover from faults. Changes to the monitored software, such as adding new functionality or fixing defects, therefore have the potential to impact the correctness of both the monitored software and the SWHM monitor. In this work, we describe how the results of a software change impact analysis technique, Directed Incremental Symbolic Execution (DiSE), can be applied to monitored software to identify the potential impact of the changes on the SWHM monitor software. The results of DiSE can then be used by other analysis techniques, e.g., testing and debugging, to help preserve and improve the integrity of the SWHM monitor as the monitored software evolves.
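To make the idea concrete, the following is a minimal, hypothetical Java sketch (all class, method, and variable names are invented for illustration; this is not the paper's implementation). It pairs a few monitor checks with the monitored methods they observe and uses an impacted-method set, of the kind a change impact analysis such as DiSE might report, to flag the checks that should be re-validated after the monitored software changes.

import java.util.List;
import java.util.Set;

// Hypothetical sketch: SWHM-style monitor checks are paired with the monitored
// methods they observe, and an impacted-method set (as a change impact analysis
// such as DiSE might report) flags the checks that need re-validation.
public class MonitorRevalidationSketch {

    // One monitor check: the monitored method it observes and the bounds it enforces.
    record Check(String monitoredMethod, double lowerBound, double upperBound) {
        boolean violates(double observed) {
            return observed < lowerBound || observed > upperBound;
        }
    }

    // Checks whose observed methods were impacted by a change may no longer match
    // the evolved software's behavior, so they are candidates for re-testing.
    static List<Check> checksNeedingRevalidation(List<Check> checks, Set<String> impactedMethods) {
        return checks.stream()
                .filter(c -> impactedMethods.contains(c.monitoredMethod()))
                .toList();
    }

    public static void main(String[] args) {
        List<Check> checks = List.of(
                new Check("GuidanceController.computePitch", -30.0, 30.0),   // invented names
                new Check("SensorFusion.estimateAltitude", 0.0, 20000.0));

        // Impacted methods as an impact analysis might report them (hypothetical output).
        Set<String> impacted = Set.of("SensorFusion.estimateAltitude");

        System.out.println("Re-validate: " + checksNeedingRevalidation(checks, impacted));
        System.out.println("Violation?   " + checks.get(0).violates(42.0)); // true: pitch out of bounds
    }
}

The point of the sketch is only that the impacted set, rather than the whole program, determines which monitor checks warrant renewed attention as the monitored software evolves.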


Notes

  1. We use the term reliability to mean ‘continuity of correct service’ as specified in [4].

References

  1. Al-Khanjari ZA, Woodward MR, Ramadhan HA, Kutti NS (2005) The efficiency of critical slicing in fault localization. Softw Qual Control 13:129–153


  2. Apiwattanapong T, Orso A, Harrold MJ (2007) Jdiff: a differencing technique and tool for object-oriented programs. Autom Softw Eng 14(1):3–36


  3. Arnold RS, Bohner SA (1993) Impact analysis-towards a framework for comparison. In: Proceedings of the conference on software maintenance, ICSM ’93, pp 292–301

  4. Avizienis A, Laprie JC, Randell B, Landwehr C (2004) Basic concepts and taxonomy of dependable and secure computing. IEEE Trans Dependable Secur Comput 1:11–33


  5. Backes J, Person S, Rungta N, Tkachuk O (2013) Regression verification using impact summaries. In: International SPIN symposium on model checking of software, Stony Brook, NY, USA, 8–9 July 2013

  6. Binkley D (1999) The application of program slicing to regression testing. In: Information and software technology special issue on program slicing, pp 583–594

  7. Buse RP, Weimer WR (2010) Automatically documenting program changes. In: Proceedings of the IEEE/ACM international conference on automated software engineering, ASE ’10. ACM, New York, NY, USA, pp 33–42

  8. Choco: Choco main page (2010) http://www.emn.fr/z-info/choco-solver/

  9. Clarke LA (1976) A program testing system. In: Proceedings of the 1976 annual conference, ACM ’76, pp 488–491

  10. CVC3: CVC3 page (2010) http://www.cs.nyu.edu/acsys/cvc3

  11. Darwiche A (2009) Modeling and reasoning with Bayesian networks. Cambridge University Press, Cambridge


  12. Dubey A, Karsai G, Mahadevan N (2011) Model-based software health management for real-time systems. In: 2011 IEEE aerospace conference, pp 1–18

  13. Godefroid P, Lahiri SK, Rubio-Gonzalez C (2010) Incremental compositional dynamic test generation. Tech Rep MSR-TR-2010-11, Microsoft Research

  14. Gupta R, Jean M, Harrold MJ, Soffa ML (1992) An approach to regression testing using slicing. In: ICSM, pp 299–308

  15. Harrold MJ, Jones JA, Li T, Liang D, Orso A, Pennings M, Sinha S, Spoon SA, Gujarathi A (2001) Regression test selection for Java software. In: OOPSLA, pp 312–326

  16. IASolver: IASolver page (2010) http://www.cs.brandeis.edu/tim/Applets/IAsolver.html

  17. Jin W, Orso A, Xie T (2010) Automated behavioral regression testing. In: ICST, pp 137–146

  18. Joshi A, Heimdahl M (2005) Model-based safety analysis of Simulink models using SCADE design verifier. In: SAFECOMP, LNCS, vol 3688, pp 122–135

  19. King JC (1976) Symbolic execution and program testing. Commun ACM 19(7):385–394


  20. Law J, Rothermel G (2003) Incremental dynamic impact analysis for evolving software systems. In: Proceedings of the 14th international symposium on software reliability engineering, ISSRE ’03, pp 430–439

  21. Law J, Rothermel G (2003) Whole program path-based dynamic impact analysis. In: ICSE ’03: proceedings of the 25th international conference on software engineering. IEEE Computer Society, Washington, DC, USA, pp 308–318

  22. Luo J, Pattipati KR, Qiao L, Chigusa S (2008) Model-based prognostic techniques applied to a suspension system. IEEE Trans Syst Man Cybern Part A 38(5):1156–1168


  23. Orso A, Apiwattanapong T, Harrold MJ (2003) Leveraging field data for impact analysis and regression testing. In: Proceedings of the ESEC/FSE-11, pp 128–137

  24. Păsăreanu C, Rungta N (2010) Symbolic PathFinder: symbolic execution of Java bytecode. In: ASE, pp 179–180

  25. Pearl J (1988) Probabilistic reasoning in intelligent systems: networks of plausible inference. Morgan Kaufmann, San Mateo


  26. Pecheur C, Cimatti A, Cimatti R (2003) Formal verification of diagnosability via symbolic model checking. In: Proceedings of the 18th international joint conference on artificial intelligence IJCAI03, pp 363–369

  27. Person S, Dwyer MB, Elbaum S, Pǎsǎreanu CS (2008) Differential symbolic execution. In: FSE, pp 226–237

  28. Person S, Yang G, Rungta N, Khurshid S (2011) Directed incremental symbolic execution. In: Proceedings of the 32nd ACM SIGPLAN conference on Programming language design and implementation, PLDI ’11. ACM, New York, NY, USA, pp 504–515

  29. Pike L, Niller S, Wegmann N (2011) Runtime verification for ultra-critical systems. In: Proceedings of the 2nd international conference on runtime verification, LNCS. Springer, Berlin

  30. Păsăreanu CS, Mehlitz PC, Bushnell DH, Gundy-Burlet K, Lowry M, Person S, Pape M (2008) Combining unit-level symbolic execution and system-level concrete execution for testing NASA software. In: ISSTA, pp 15–25

  31. Qi D, Roychoudhury A, Liang Z (2010) Test generation to expose changes in evolving programs. In: ASE, pp 397–406

  32. Raghavan S, Rohana R, Leon D, Podgurski A, Augustine V (2004) Dex: a semantic-graph differencing tool for studying changes in large code bases. In: ICSM, pp 188–197

  33. Ren X, Ryder BG, Stoerzer M, Tip F (2005) Chianti: a change impact analysis tool for Java programs. In: ICSE, pp 664–665

  34. Rungta N, Person S, Branchaud J (2012) A change impact analysis to characterize evolving program behaviors. In: ICSM

  35. SAE-ARP4761: Guidelines and methods for conducting the safety assessment process on civil airborne systems and equipment. SAE International (1996)

  36. Santelices R, Chittimalli PK, Apiwattanapong T, Orso A, Harrold M (2008) Test-suite augmentation for evolving software. In: ASE, pp 218–227

  37. Schumann J, Bajwa A, Berg P, Thirumalainambi R (2010) Parametric testing of launch vehicle FDDR models. In: AIAA SPACE 2010 conference & exposition

  38. Schumann J, Mbaya T, Mengshoel O (2011) Bayesian software health management for aircraft guidance, navigation, and control. In: Proceedings of Conference on Prognostics and Health Management (PHM-2011)

  39. Schumann J, Mengshoel O, Mbaya T (2011) Integrated software and sensor health management for small spacecraft. In: Proc SMC-IT

  40. Strichman O, Godlin B (2008) Regression verification—a practical way to verify programs. Springer, Berlin


  41. Taneja K, Xie T, Tillmann N, de Halleux J (2011) Express: guided path exploration for efficient regression test generation. In: Proceedings of the ISSTA, pp 1–11

  42. Taneja K, Xie T, Tillmann N, de Halleux J, Schulte W (2009) Guided path exploration for regression test generation. In: ICSE, new ideas and emerging results, pp 311–314

  43. Visser W, Geldenhuys J, Dwyer MB (2012) Green: Reducing, reusing and recycling constraints in program analysis. In: ESEC/FSE ’12 (to appear)

  44. Visser W, Havelund K, Brat GP, Park S, Lerda F (2003) Model checking programs. Autom Softw Eng 10(2):203–232


  45. Xu Z, Rothermel G (2009) Directed test suite augmentation. In: APSEC, pp 406–413

  46. Yang G, Dwyer MB, Rothermel G (2009) Regression model checking. In: ICSM, pp 115–124

  47. Yang G, Păsăreanu CS, Khurshid S (2012) Memoized symbolic execution. In: ISSTA, pp 144–154

  48. Yin Z, Yuan D, Zhou Y, Pasupathy S, Bairavasundaram L (2011) How do fixes become bugs? In: Proceedings of the 19th ACM SIGSOFT symposium and the 13th European conference on Foundations of software engineering, ESEC/FSE ’11. ACM, New York, NY, USA, pp 26–36


Acknowledgments

The authors thank Johann Schumann for his valuable insights on Bayesian networks and their use within the software health management domain, and for discussions on how change impact analysis results can be applied in this domain to maintain the health of the monitors. The authors also thank Paul Miner and Ben Di Vito for their helpful comments, which improved the paper.

Author information


Corresponding author

Correspondence to Suzette Person.


About this article

Cite this article

Person, S., Rungta, N. Maintaining the health of software monitors. Innovations Syst Softw Eng 9, 257–269 (2013). https://doi.org/10.1007/s11334-013-0213-z

