DOI: 10.1145/2372251.2372290
Research article

Testing highly complex system of systems: an industrial case study

Published: 19 September 2012

ABSTRACT

Context: Systems of systems (SoS) are highly complex and are integrated at multiple levels (unit, component, system, system of systems). Many characteristics of SoS (such as operational and managerial independence of constituent systems, integration of systems into a system of systems, and composition from systems that are themselves complex) make their development and testing challenging.

Contribution: This paper provides an understanding of SoS testing in large-scale industry settings with respect to challenges and how to address them.

Method: The research method used is case study research. As data collection methods we used interviews, documentation analysis, and fault slippage data.
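Fault slippage data builds on the faults-slip-through concept of Damm, Lundberg, and Wohlin: a fault "slips through" when it is detected at a later test level than the one where it could most cost-effectively have been found. A minimal sketch of that computation is shown below; the phase names and fault records are hypothetical, not taken from the case company.

```python
# Faults-slip-through (FST) sketch. A fault record pairs the test level where
# the fault was actually found with the level where it "belonged" (i.e. where
# it would have been cheapest to find). All data here is illustrative.

PHASES = ["unit", "component", "system", "system-of-systems"]

# (level where found, level where it belonged)
faults = [
    ("system", "unit"),              # slipped through unit and component test
    ("unit", "unit"),                # caught at the right level
    ("system-of-systems", "system"), # slipped into SoS-level testing
    ("component", "unit"),
]

def fst_ratio(faults, phases):
    """Fraction of faults found at a later level than their cheapest level."""
    order = {p: i for i, p in enumerate(phases)}
    slipped = sum(
        1 for found, belonged in faults if order[found] > order[belonged]
    )
    return slipped / len(faults)

print(f"FST ratio: {fst_ratio(faults, PHASES):.0%}")  # prints "FST ratio: 75%"
```

Aggregating this ratio per test level shows which levels let the most faults slip downstream, which is how such data can point at process areas to improve.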

Results: We identified challenges related to SoS with respect to fault slippage, test turn-around time, and test maintainability. We also classified the testing challenges into general testing challenges, challenges amplified by SoS, and challenges that are SoS specific. Interestingly, the interviewees agreed on the challenges even though we sampled them with diversity in mind, which indicates that the number of interviews conducted was sufficient to answer our research questions. We also identified solution proposals to the challenges, categorized under four classes: developer quality assurance, function test, testing at all levels, and requirements engineering and communication.

Conclusion: We conclude that although over half of the challenges we identified can be categorized as general testing challenges, SoS nevertheless have their own unique and amplified challenges stemming from SoS characteristics. Furthermore, the interviews and the fault slippage data pointed to different areas of the software process for improvement, which indicates that using only one of these methods would have led to an incomplete picture of the challenges in the case company.


Published in:

ESEM '12: Proceedings of the ACM-IEEE International Symposium on Empirical Software Engineering and Measurement
September 2012, 338 pages
ISBN: 9781450310567
DOI: 10.1145/2372251
Copyright © 2012 ACM
Publisher: Association for Computing Machinery, New York, NY, United States
Overall acceptance rate: 130 of 594 submissions, 22%
