Abstract
[Context and motivation] It is desirable that requirements engineering methods are reliable, that is, that they can be repeated with the same results. Risk assessment methods, however, often have low reliability when they identify risk mitigations for a system based on expert judgement. [Question/problem] Our goal is to assess the reliability of an availability risk assessment method for telecom infrastructures, and to identify possibilities for improving its reliability. [Principal ideas/results] We propose an experimental validation of reliability, and report on its application. We give a detailed analysis of sources of variation, explain how we controlled them and validated their mitigations, and motivate the statistical procedure used to analyse the outcome. [Contribution] Our results can be used to improve the reliability of risk assessment methods. Our approach to validating reliability can be useful for assessing the reliability of other methods.
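The statistical procedure is only motivated in the abstract, not reproduced here. As an illustration only, the minimal sketch below assumes the reliability analysis rests on an inter-rater agreement coefficient such as Krippendorff's alpha for nominal data; the function name, the data layout (units by assessors, None for a missing rating), and the example ratings are hypothetical and not taken from the paper.

# Minimal sketch: Krippendorff's alpha (nominal scale) over a units-by-assessors table.
# None marks a missing rating; units with fewer than two ratings contribute nothing.
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(data):
    """data: list of units, each a list of category labels or None."""
    coincidences = Counter()                      # o_ck: coincidence matrix
    for unit in data:
        values = [v for v in unit if v is not None]
        m = len(values)
        if m < 2:
            continue                              # unpairable unit
        for c, k in permutations(values, 2):      # all ordered pairs within the unit
            coincidences[(c, k)] += 1.0 / (m - 1)

    categories = {c for c, _ in coincidences}
    n_c = {c: sum(coincidences[(c, k)] for k in categories) for c in categories}
    n = sum(n_c.values())                         # total number of pairable values

    # Observed and expected disagreement for the nominal metric.
    d_o = sum(v for (c, k), v in coincidences.items() if c != k)
    d_e = sum(n_c[c] * n_c[k] for c in categories for k in categories if c != k) / (n - 1)
    return 1.0 - d_o / d_e if d_e else 1.0

# Hypothetical example: three assessors classify the availability risk of four components.
ratings = [
    ["high", "high", "high"],
    ["low",  "low",  None  ],
    ["high", "low",  "high"],
    ["low",  "low",  "low" ],
]
print(round(krippendorff_alpha_nominal(ratings), 3))   # 0.667

Values of alpha close to 1 indicate that independent assessors produce the same classifications, which is the sense of reliability the abstract refers to.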