Abstract
Context and Motivation: Our recent work leverages cognitive psychology research on human errors to improve standard fault-based requirements inspections. Question: The empirical study presented in this paper investigates the effectiveness of the newly developed Human Error Abstraction Assist (HEAA) tool in helping inspectors identify human errors to guide fault detection during requirements inspections. Results: The results showed that the HEAA tool, though effective, presented challenges during the error abstraction process. Contribution: In this experience report, we describe the major challenges encountered during the study execution and the lessons learned for future replications.
Acknowledgment
This work was supported by NSF Awards 1423279 and 1421006. The authors would like to thank the students of the Software Requirements course at North Dakota State University for participating in this study.