Abstract
Interviews are the most widely used elicitation technique in requirements engineering (RE). However, conducting a requirements elicitation interview is challenging, and mistakes made in the design or conduct of an interview can create problems in the later stages of requirements analysis. Empirical evidence about effective pedagogical approaches for training novices to conduct requirements elicitation interviews is scarce. In this paper, we present a novel pedagogical approach for training student analysts in the art of elicitation interviews. Our study was conducted in two parts: first, we performed an observational study of interviews conducted by novices and derived a classification of the most common mistakes; second, we used this list of mistakes to monitor the students' progress across three sets of interviews and to identify individual areas for improvement. We conducted an empirical study involving role-playing and authentic assessment over two semesters with two different cohorts of students. In the first semester, 110 students, organized into 28 groups, conducted three interviews with stakeholders. We qualitatively analysed the data to identify and classify the mistakes made in their first interview only. In the second semester, 138 students in 34 groups participated, and we monitored and analysed their progress across all three interviews using the list of mistakes from the first study. In the first study, we identified 34 unique mistakes classified into seven high-level themes: question formulation, question omission, interview order, communication skills, analyst behaviour, customer interaction, and teamwork and planning. In the second study, we found that the students struggled mostly in the areas of question formulation, question omission and interview order, and did not manage to improve these skills across the three interviews. Our study presents a novel and repeatable pedagogical design, and our findings extend the body of knowledge on RE education and training by providing an empirically grounded categorization of mistakes made by novices. We also offer an analysis of the main pain points to which instructors should pay particular attention when designing elicitation training.
Notes
The authors playing the role of customer and observer were not available to take an active role during the class term in study 2; therefore, we do not have customer think-aloud or observation data for study 2.
Acknowledgements
The authors would like to thank all the students who participated in this project. This research was approved by the University of Technology Sydney's Research Ethics Committee, under the number ETH17-1266. This work was partially supported by the National Science Foundation under grant CCF-1718377.
Appendices
Appendix 1: Interview Questionnaire
Please rate your agreement with the following statements about QUESTION FORMULATION * [Strongly Agree (1), Agree (2), Not sure (3), Disagree (4), Strongly Disagree (5)]
The analyst asked vague questions
The analyst asked technical questions
The analyst asked questions that appeared irrelevant to me
The analyst asked the customer for solutions
The analyst asked long and overly complex questions
The analyst formulated their questions in a way that appeared incorrect to me
Please rate your agreement with the following statements about QUESTION OMISSION * [Strongly Agree (1), Agree (2), Not sure (3), Disagree (4), Strongly Disagree (5)]
The analyst DID NOT ask for additional stakeholders
The analyst DID NOT ask probing questions to confirm their understanding
The analyst DID NOT ask about the existing system or business process
The analyst DID NOT ask questions about feature prioritisation
The analyst DID NOT ask for information about the problem domain
The analyst DID NOT identify goals and success criteria
The analyst DID NOT ask all the questions that I consider relevant
Please rate your agreement with the following statements about ORDER OF INTERVIEW * [Strongly Agree (1), Agree (2), Not sure (3), Disagree (4), Strongly Disagree (5)]
The analyst DID NOT perform a summary at the end of the interview
The analyst started the interview by asking direct questions about the system
The analyst asked questions in an order that appeared incorrect to me
The analyst repeated the same questions multiple times
Please rate your agreement with the following statements about COMMUNICATION SKILLS * [Strongly Agree (1), Agree (2), Not sure (3), Disagree (4), Strongly Disagree (5)]
The dialogue style used by the analyst appeared unnatural to me
The analyst showed poor communication skills
The analyst showed poor listening skills
The analyst spoke with a low and unclear tone
Please rate your agreement with the following statements about ANALYST BEHAVIOUR * [Strongly Agree (1), Agree (2), Not sure (3), Disagree (4), Strongly Disagree (5)]
The analyst showed lack of confidence
The analyst appeared overconfident or arrogant
The analyst showed a passive attitude
The analyst showed a behaviour that appeared unprofessional to me
Please rate your agreement with the following statements about CUSTOMER INTERACTION * [Strongly Agree (1), Agree (2), Not sure (3), Disagree (4), Strongly Disagree (5)]
The analyst DID NOT create rapport with the customer
The analyst tried to influence the customer
The analyst interrupted the customer
Please rate your agreement with the following statements about TEAMWORK and PLANNING * [Strongly Agree (1), Agree (2), Not sure (3), Disagree (4), Strongly Disagree (5)]
There was lack of coordination and choreography among team members
The analyst did NOT manage their time in a proper way
The analyst showed a lack of preparation on the domain
The analyst looked like they did not plan the interview
There were long pauses during the interview
Appendix 2: Group performance based on SRS Document Assessment
We selected three groups each from those with the top marks, average marks and the lowest marks. We were interested in whether their performance during the interviews correlated with the understanding reflected in their SRS documents. For ease of visualization, we divided the interview themes into two categories: domain-specific aspects of the elicitation interview (question formulation, question omission and interview order) and social aspects of the interview (communication skills, analyst behaviour, customer interaction, and teamwork and planning). Higher scores indicate better performance and lower scores indicate poorer performance (Figs. 20, 21, 22, 23, 24, 25).
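As a rough illustration of the kind of aggregation behind Figs. 20, 21, 22, 23, 24 and 25, the sketch below averages per-theme interview scores into the two categories and checks their rank correlation with SRS marks. This is a hypothetical minimal sketch, not the authors' actual analysis code: the group identifiers, theme scores and SRS marks are invented, and the use of SciPy's spearmanr is our assumption.

```python
# Hypothetical sketch: aggregate per-theme interview performance scores into the
# two categories described in Appendix 2 and check their rank correlation with
# SRS document marks. All group names, scores and marks below are invented for
# illustration only; this is not the authors' analysis script.
from scipy.stats import spearmanr

DOMAIN_THEMES = ["question_formulation", "question_omission", "interview_order"]
SOCIAL_THEMES = ["communication_skills", "analyst_behaviour",
                 "customer_interaction", "teamwork_and_planning"]

# Per-group theme scores (higher = better performance) paired with the SRS mark.
groups = {
    "G01": ({"question_formulation": 4.1, "question_omission": 3.8, "interview_order": 4.0,
             "communication_skills": 4.5, "analyst_behaviour": 4.2,
             "customer_interaction": 4.4, "teamwork_and_planning": 4.0}, 85),
    "G02": ({"question_formulation": 3.2, "question_omission": 2.9, "interview_order": 3.1,
             "communication_skills": 3.8, "analyst_behaviour": 3.5,
             "customer_interaction": 3.6, "teamwork_and_planning": 3.3}, 70),
    "G03": ({"question_formulation": 2.4, "question_omission": 2.1, "interview_order": 2.5,
             "communication_skills": 3.0, "analyst_behaviour": 2.8,
             "customer_interaction": 2.9, "teamwork_and_planning": 2.6}, 55),
}

def category_mean(theme_scores, themes):
    """Average the theme scores that belong to one category."""
    return sum(theme_scores[t] for t in themes) / len(themes)

domain = [category_mean(scores, DOMAIN_THEMES) for scores, _ in groups.values()]
social = [category_mean(scores, SOCIAL_THEMES) for scores, _ in groups.values()]
marks = [mark for _, mark in groups.values()]

print("Domain-specific aspects vs SRS marks:", spearmanr(domain, marks))
print("Social aspects vs SRS marks:", spearmanr(social, marks))
```

With so few groups the correlation coefficients are only indicative; the point of the sketch is the two-category aggregation, not a statistical claim.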
Cite this article
Bano, M., Zowghi, D., Ferrari, A. et al. Teaching requirements elicitation interviews: an empirical study of learning from mistakes. Requirements Eng 24, 259–289 (2019). https://doi.org/10.1007/s00766-019-00313-0