Objective evaluation of software architectures in driver assistance systems

Methods – quality model – metrics

  • Special Issue Paper
Computer Science - Research and Development

Abstract

This paper describes methods, and results obtained with them, for the objective evaluation of software architectures in the automotive embedded domain. The software architecture is the key factor in fulfilling the non-functional requirements placed on a software system; scalability, extensibility and portability are among the major criteria. Until now, however, there has been no approach that allows the quality of such software architectures to be evaluated and measured objectively and quantitatively. The approach described here aims to close this gap and offers software architects methods and tools that can be applied automatically to any existing architecture draft to measure an objective ‘quality value’.

First, a quality model is developed that consists of several quality characteristics and quality attributes adapted and specialized to the specific needs of the automotive embedded software domain. After the relevant quality attributes have been identified, eight objective architecture metrics are developed and presented.
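
As a rough illustration only (the paper’s actual model and attribute names are not reproduced here), such a quality model can be viewed as a small hierarchy: quality characteristics are refined into quality attributes, which are in turn quantified by objective architecture metrics. The Java sketch below uses hypothetical names purely to show this structure.

    import java.util.List;

    /** Hypothetical sketch of a quality-model hierarchy; all names are illustrative, not the paper's. */
    public class QualityModelSketch {

        /** An objective architecture metric contributing to a quality attribute. */
        record Metric(String name) {}

        /** A quality attribute refined into one or more metrics. */
        record Attribute(String name, List<Metric> metrics) {}

        /** A top-level quality characteristic composed of attributes. */
        record Characteristic(String name, List<Attribute> attributes) {}

        public static void main(String[] args) {
            Characteristic maintainability = new Characteristic("Maintainability",
                    List.of(new Attribute("Modularity",
                            List.of(new Metric("Coupling between modules"),
                                    new Metric("Module size distribution")))));
            System.out.println(maintainability);
        }
    }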

The entire methodology can be integrated into existing development processes: the suggested steps and artifacts can be added as an optional extension. To give a better understanding of the surrounding environment, the current development process and the necessary extensions are also explained.

All metrics have been implemented in a fully functional prototype tool that can be operated via a graphical user interface (GUI) on any Java-compliant system without further prerequisites. Users can freely configure which metrics are applied and how they are weighted, allowing an individual evaluation tailored to the software’s specific needs and requirements.
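
As a minimal sketch of how such a configurable evaluation could work, assuming the tool normalizes each metric to a score between 0 and 1 and combines the selected metrics as a weighted average (the metric names and the class and method names below are hypothetical and not taken from the tool):

    import java.util.LinkedHashMap;
    import java.util.Map;

    /** Hypothetical sketch: combine per-metric scores into one overall quality value. */
    public class WeightedEvaluation {

        /**
         * Aggregates normalized metric scores (each assumed to lie in [0, 1]) into a
         * single quality value using user-configured weights. Metrics without a
         * weight entry are treated as disabled for this evaluation.
         */
        public static double qualityValue(Map<String, Double> scores,
                                          Map<String, Double> weights) {
            double weightedSum = 0.0;
            double weightTotal = 0.0;
            for (Map.Entry<String, Double> e : scores.entrySet()) {
                Double w = weights.get(e.getKey());
                if (w == null || w == 0.0) {
                    continue; // metric not selected by the user
                }
                weightedSum += w * e.getValue();
                weightTotal += w;
            }
            return weightTotal == 0.0 ? 0.0 : weightedSum / weightTotal;
        }

        public static void main(String[] args) {
            Map<String, Double> scores = new LinkedHashMap<>();   // hypothetical metric scores
            scores.put("modularity", 0.8);
            scores.put("interfaceComplexity", 0.6);
            scores.put("portability", 0.9);

            Map<String, Double> weights = new LinkedHashMap<>();  // user-configured weighting
            weights.put("modularity", 2.0);
            weights.put("interfaceComplexity", 1.0);
            weights.put("portability", 1.0);

            System.out.printf("Quality value: %.3f%n", qualityValue(scores, weights));
        }
    }

Changing the weights shifts the emphasis of the evaluation, so the same architecture can be scored against different non-functional priorities.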

To demonstrate its benefit for automotive applications and its suitability for long-term establishment in the software development process, the approach is evaluated in two consecutive steps. First, its general functioning and applicability are confirmed by applying the metrics to several small case studies, which also provided a detailed understanding of the metrics and served to create an initial balancing and weighting. Second, the approach is applied to a complex, real-world example from the driver assistance domain: the entire longitudinal dynamics software architecture of BMW’s driver assistance systems was refactored, with the metrics used to monitor progress and maintain a goal-oriented, iterative and incremental procedure.

Author information

Corresponding author

Correspondence to Dirk Ahrens.

About this article

Cite this article

Ahrens, D., Frey, A., Pfeiffer, A. et al. Objective evaluation of software architectures in driver assistance systems. Comput Sci Res Dev 28, 23–43 (2013). https://doi.org/10.1007/s00450-011-0185-x
