
The inconsistency between theory and practice in managing inconsistency in requirements engineering

Published in: Empirical Software Engineering

Abstract

The problem of inconsistency in requirements engineering (RE) has been in the spotlight of the RE community since the early 1990s. In the early years, inconsistency was perceived in the literature as a problem to be eliminated on sight. More recently, it has become recognized that maintaining consistency at all times is not only infeasible but even counterproductive. Based on this recognition, paradigms and tools for managing inconsistency have been proposed in the RE literature. However, over the same period, inconsistency as perceived and managed in practice has not received much attention. Our research aims to better understand the phenomenon of inconsistency and the strategies used to address it in RE practice. This paper describes an empirical study investigating practitioners’ perceptions of inconsistency manifestations in RE, their attitudes towards these manifestations, and the strategies they apply to address them. The findings of this research led to two contributions: (a) an explanation of how the ideas of the RE field about managing RE inconsistency are reflected in practitioners’ perceptions of the inconsistency that they encounter in their daily work, and (b) the identification of some barriers that appear to be hindering practitioners’ adoption of the RE field’s inconsistency management strategies, together with possible reasons underlying these barriers.


Notes

  1. A set of statements is logically inconsistent if and only if anything can be proved from it. In RE, a set of artifacts, treated as the specification of a CBS, is considered inconsistent if there is anything among them, including logical inconsistency, that prevents the validation of the specification as meeting the requirements of the CBS’s client.

  2. The expected cost of an event is the product of its probability and the cost of its damages.

  3. This section uses explanatory text from a paper titled “Reasoning about Inconsistency in RE: Separating the Wheat from the Chaff” by a permutation of the same authors that was published in the 2016 Proceedings of the 11th International Conference on Evaluation of Novel Software Approaches to Software Engineering (Zamansky et al. 2016).

  4. Note that what is called “incorrectness” here can be described as “inconsistency with the Env”; X is incorrect if and only if X and Env are inconsistent. Hence, there is a logic to calling all sorts of problems with an SRS “inconsistency”.

  5. The vocabulary, inherited from the cognitive sciences, needs an explanation. A “normative response” is the cognitive psychologist’s term for what can be called a “correct answer”; it does not refer to the usual, i.e., normal, response. This term can be confusing in the context of this research because cognitive science is being applied to a situation in which the usual, i.e., normal, response is incorrect, i.e., non-normative.

  6. This explanation is inspired by the explanation given by Hadar and Leron (2008), which distills the DTP as described by Stanovich and West (2000) and Kahneman (2002).

  7. To focus on the prevention of perpendicular collisions, we ignore the parts of the specification that ensure progress, i.e., that the traffic light eventually shows green to each direction.

References

  • Bagheri E, Ghorbani AA (2008) Experiences on the belief-theoretic integration of para-consistent conceptual models. In: Proceedings of the 19th IEEE Australian Conference on Software Engineering (ASWEC), pp 357–366

  • Brower RS, Jeong HS (2008) Grounded analysis: Beyond description to derive theory from qualitative data. In: Yang K, Miller GJ (eds) Handbook of Research Methods in Public Administration. Taylor & Francis, Boca Raton, pp 823–839

  • Cho A (2016) Gravitational waves, Einstein’s ripples in spacetime, spotted for first time. Science Magazine (11 February 2016, viewed on 20 January 2018) http://www.sciencemag.org/news/2016/02/gravitational-waves-einstein-s-ripples-spacetime-spotted-first-time

  • Easterbrook S, Nuseibeh B (1996) Using viewpoints for inconsistency management. Softw Eng J 11:31–43

  • Easterbrook S, Chechik M (2001) A framework for multi-valued reasoning over inconsistent viewpoints. In: Proceedings of the 23rd International Conference on Software Engineering (ICSE), pp 411–420

  • Ejersbo LR, Engelhardt R, Frølunde L, Hanghøj T, Magnussen R, Misfeldt M (2008) Balancing product design and theoretical insights. In: Kelly AE, Lesh RA, Baek JY (eds) Handbook of Design Research Methods in Education: Innovations in Science, Technology, Engineering, and Mathematics Learning and Teaching. Routledge, New York, pp 149–163

  • Ernst NA, Borgida A, Mylopoulos J, Jureta IJ (2012) Agile requirements evolution via paraconsistent reasoning. In: Proceedings of the 24th International Conference on Advanced Information Systems Engineering (CAiSE), pp 382–397

  • Evans JSBT, Over DE (1997) Rationality in reasoning: The problem of deductive competence. Cahiers Psychol Cogn/Curr Psychol Cogn 16:3–38

  • Finkelstein A, Gabbay D, Hunter A, Kramer J, Nuseibeh B (1994) Inconsistency handling in multiperspective specifications. IEEE Trans Softw Eng (TSE) 20:569–578

  • Finkelstein A (2000) A foolish consistency: Technical challenges in consistency management. In: Ibrahim M, Küng J, Revell N (eds) Proceedings of the 11th International Conference on Database and Expert Systems Applications (DEXA). Volume 1873 of LNCS. Springer, Berlin, pp 1–5

  • Gervasi V, Zowghi D (2005) Reasoning about inconsistencies in natural language requirements. ACM Trans Softw Eng Methodol (TOSEM) 14:277–330

  • Gigerenzer G, Todd PM, the ABC Research Group (1999) Simple Heuristics That Make Us Smart. Oxford University Press, New York

  • Gilbert DT (2002) Inferential correction. In: Gilovich T, Griffin D, Kahneman D (eds) Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge University Press, Cambridge, pp 167–184

  • Hadar I, Leron U (2008) How intuitive is object-oriented design? Commun ACM 51:41–46

  • Hadar I (2013) When intuition and logic clash: The case of the object-oriented paradigm. Sci Comput Program 78:1407–1426

  • Hadar I, Zamansky A (2015a) Cognitive factors in inconsistency management. In: Proceedings of the 23rd IEEE International Requirements Engineering Conference (RE), pp 226–229

  • Hadar I, Zamansky A (2015b) When a paradigm is inconsistent with intuition: The case of inconsistency management. In: Proceedings of the 3rd International Workshop on Cognitive Aspects of Information Systems Engineering (COGNISE). Volume 215 of Lecture Notes in Business Information Processing, pp 107–113

  • Hayes I, Jackson M, Jones C (2003) Determining the specification of a control system from that of its environment. In: Araki K, Gnesi S, Mandrioli D (eds) Formal Methods (FME). Volume 2805 of LNCS. Springer, Berlin, pp 154–169

  • Jones CB, Hayes IJ, Jackson MA (2007) Deriving specifications for systems that are connected to the physical world. In: Jones CB, Liu Z, Woodcock J (eds) Formal Methods and Hybrid Real-Time Systems. Volume 4700 of LNCS. Springer, Berlin

  • Jureta IJ, Borgida A, Ernst NA, Mylopoulos J (2010) Techne: Towards a new generation of requirements modeling languages with goals, preferences, and inconsistency handling. In: Proceedings of the 18th IEEE International Requirements Engineering Conference (RE), pp 115–124

  • Kahneman D (2002) Nobel Prize Lecture, December 8, 2002: Maps of bounded rationality: A perspective on intuitive judgment and choice. http://www.nobel.se/economics/laureates/2002/kahnemann-lecture.pdf

  • Kamalrudin M, Hosking J, Grundy J (2017) MaramaAIC: Tool support for consistency management and validation of requirements. Autom Softw Eng 24:1–45

  • Mercier H, Sperber D (2011) Why do humans reason? Arguments for an argumentative theory. Behav Brain Sci 34:57–74

  • Mu K, Liu W, Jin Z, Lu R, Yue A, Bell D (2009) Handling inconsistency in distributed software requirements specifications based on prioritized merging. Fundam Inf 91:631–670

  • Myers MD, Avison D (1997) Qualitative research in information systems. Manag Inf Syst Q 21:241–242

  • Nuseibeh B, Easterbrook S (1999) The process of inconsistency management: A framework for understanding. In: Proceedings of the 10th International Workshop on Database and Expert Systems Applications (DEXA), pp 364–368

  • Nuseibeh B, Kramer J, Finkelstein A (1994) A framework for expressing the relationships between multiple views in requirements specification. IEEE Trans Softw Eng 20:760–773

  • Nuseibeh B, Easterbrook S (2000a) Requirements engineering: A roadmap. In: Proceedings of the Conference on The Future of Software Engineering, at the 22nd International Conference on Software Engineering (ICSE), pp 35–46

  • Nuseibeh B, Easterbrook S, Russo A (2000b) Leveraging inconsistency in software development. IEEE Comput 33:24–29

  • Nuseibeh B, Easterbrook S, Russo A (2001) Making inconsistency respectable in software development. J Syst Softw 58:171–180

  • Nuseibeh B, Kramer J, Finkelstein A (2003) Viewpoints: Meaningful relationships are difficult! In: Proceedings of the 25th International Conference on Software Engineering (ICSE), pp 676–681

  • Robson C, McCartan K (2016) Real World Research. Wiley, New York

  • Rodrigues O, d’Avila Garcez AS, Russo A (2004) Reasoning about requirements evolution using clustered belief revision. In: Bazzan A, Labidi S (eds) Advances in Artificial Intelligence — SBIA 2004, Proceedings of the 17th Brazilian Symposium on Artificial Intelligence (SBIA), pp 41–51

  • Runeson P, Höst M (2009) Guidelines for conducting and reporting case study research in software engineering. Empir Softw Eng 14:131–164

  • Russo A, Miller R, Nuseibeh B, Kramer J (2002) An abductive approach for analysing event-based requirements specifications. In: Proceedings of the 18th International Conference on Logic Programming (ICLP), pp 22–37

  • Spanoudakis G, Zisman A (2001) Inconsistency management in software engineering: Survey and open research issues. In: Chang SK (ed) Handbook of Software Engineering and Knowledge Engineering, vol 1. World Scientific Publishing, Singapore, pp 329–380

  • Stanovich KE, West RF (2000) Individual differences in reasoning: Implications for the rationality debate. Behav Brain Sci 23:645–726

  • Stanovich KE (2005) The Robot’s Rebellion: Finding Meaning in the Age of Darwin. University of Chicago Press, Chicago

  • Strauss A, Corbin J (1990) Basics of Qualitative Research: Grounded Theory Procedures and Techniques. Sage Publications, Thousand Oaks

  • Strauss A, Corbin J (1994) Grounded theory methodology — an overview. In: Denzin NK, Lincoln YS (eds) Handbook of Qualitative Research. Sage Publications, Thousand Oaks, pp 273–285

  • Strauss A, Corbin J (1998) Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory. Sage Publications, Thousand Oaks

  • Todd PM, Gigerenzer G (2012) Ecological Rationality: Intelligence in the World. Oxford University Press, New York

  • van der Aa H, Leopold H, Reijers HA (2017) Comparing textual descriptions to process models — the automatic detection of inconsistencies. Inf Syst 64:447–460

  • Walsham G (2006) Doing interpretive research. Eur J Inf Syst 15:320–330

  • Whitworth E, Biddle R (2007) The social nature of Agile teams. In: Agile 2007 (AGILE 2007), pp 26–36

  • Wieringa RJ, Heerkens JMG (2006) The methodological soundness of requirements engineering papers: A conceptual framework and two case studies. Requir Eng J 11:295–307

  • Wohlin C, Runeson P, Höst M, Ohlsson MC, Regnell B, Wesslén A (2012) Experimentation in Software Engineering. Springer, Heidelberg

  • Zamansky A, Hadar I, Berry DM (2016) Reasoning about inconsistency in RE: Separating the wheat from the chaff. In: Proceedings of the 11th International Conference on Evaluation of Novel Software Approaches to Software Engineering (ENASE), pp 377–382. http://www.scitepress.org/Papers/2016/59286/59286.pdf

  • Zave P, Jackson M (1997) Four dark corners of requirements engineering. ACM Trans Softw Eng Methodol (TOSEM) 6:1–30

  • Zowghi D, Gervasi V, McRae A (2001) Using default reasoning to discover inconsistencies in natural language requirements. In: Proceedings of the 8th IEEE Asia-Pacific Software Engineering Conference (APSEC), pp 133–140

  • Zowghi D, Gervasi V (2002) The three Cs of requirements: Consistency, completeness, and correctness. In: Proceedings of the 8th International Workshop on Requirements Engineering: Foundations for Software Quality (REFSQ). Essener Informatik Beiträge, Essen


Acknowledgments

The authors thank Yael David for her help in translating Hebrew quotes into English. Daniel Berry’s work was supported in part by a Canadian NSERC grant NSERC-RGPIN227055-15. Anna Zamansky’s work was supported by the Israel Science Foundation under grant agreement 817/15.

Author information


Corresponding author

Correspondence to Irit Hadar.

Additional information

Communicated by: Tony Gorschek

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix A: Illustrative Examples

This appendix gives some examples of using the ZJVF to manage inconsistencies in the RE for three easily described CBSs.

A.1 Traffic Light

This example shows the ZJVF interplay applied to how a traffic light at a 4-way, 2-perpendicular-street intersection can help prevent perpendicular collisions among vehicles being driven through the intersection. The interplay is apparent even when very high level assertions are used for D, S, and R. We start with R and S in the hopes that just S is enough to entail R:

R::

There are no perpendicular collisions at a 4-way intersection.

S::

The 4-way traffic light above the 4-way intersection ensures that at no time does it show green to perpendicular directions. (See Note 7.)

Unfortunately, S by itself is not strong enough to entail R. What is missing? We have to have domain assumptions D that say something about the behavior of drivers and of their vehicles.

D::

All drivers obey the traffic light and all vehicles obey their drivers.

Now D, S together are strong enough to entail R.
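This entailment can be sketched in propositional terms. The encoding below is ours, not the paper’s, and is grossly simplified: a handful of hypothetical atoms stand in for the light’s state, driver obedience (D), and the occurrence of a perpendicular collision.

```python
from itertools import product

def S(g_ns, g_ew):
    # S: the light never shows green to perpendicular directions
    return not (g_ns and g_ew)

def collision_possible(g_ns, g_ew, obey):
    # A perpendicular collision can occur only if both directions enter
    # the intersection: either both see green, or someone disobeys.
    return (g_ns and g_ew) or not obey

def entails_R(assume_D):
    # D, S |- R: in every world where S (and, if assumed, D) holds,
    # no perpendicular collision is possible.
    for g_ns, g_ew, obey in product([True, False], repeat=3):
        if not S(g_ns, g_ew):
            continue                 # world violates S; irrelevant
        if assume_D and not obey:
            continue                 # world violates D; excluded when D assumed
        if collision_possible(g_ns, g_ew, obey):
            return False
    return True

print(entails_R(assume_D=False))  # S alone does not entail R
print(entails_R(assume_D=True))   # D, S together entail R
```

The exhaustive search over worlds is just a toy stand-in for the semantic entailment relation ⊩ used in the appendix.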

However, D is in fact not true. An occasional driver does not obey a traffic light. Independently, an occasional vehicle does not do what its driver has commanded it to do, perhaps because it is broken. Fortunately, the probability that D is not true is very low, say ε, because an overwhelming majority of drivers obey traffic lights, and an even larger majority of vehicles do what their drivers command them to do. Because of this very low probability, we as a society have decided to accept the risk of disobedient drivers and disobedient vehicles to obtain the benefit of smooth intersectional traffic flow that traffic lights provide while preventing most but not all perpendicular collisions. So, we permanently tolerate the inconsistency between D and the real world, the D that is a lie.
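Footnote 2’s definition (expected cost = probability × cost of damages) makes this societal trade-off concrete. All numbers below are hypothetical; only the comparison logic follows the footnote.

```python
def expected_cost(probability, damage):
    # Footnote 2: expected cost of an event = its probability x the cost of its damages
    return probability * damage

eps = 1e-6                    # assumed probability, per light cycle, that D fails
collision_damage = 5_000_000  # assumed cost of one perpendicular collision
resolution_cost = 50          # assumed per-cycle cost of a stronger S (e.g., barriers)

tolerate = expected_cost(eps, collision_damage)
print(tolerate < resolution_cost)  # tolerating the inconsistency is cheaper
```

Under these assumed numbers, tolerating the false D costs 5 in expectation per cycle versus 50 for resolving it, which mirrors society’s acceptance of the risk.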

For an R that includes There are no rear-end and head-on collisions., S is really incapable of having anything to do with entailing R, and it falls entirely to domain assumptions that would include Drivers are careful enough to avoid hitting vehicles going in the same or the opposite direction. Also, probably the very existence of a red light in any direction increases the probability of rear-end collisions among the vehicles that are facing the red light.

Getting back to the original R and S, is there any way that the D that is a lie can be made unnecessary? There are two possible ways:

  1. We could decide that all perpendicular collisions are acceptable, i.e., weaken R so that it no longer requires no perpendicular collisions. Then D, and in fact S, are unnecessary.

  2. We could try to strengthen S, to make it unnecessary for drivers to obey traffic lights and vehicles to obey drivers.

The first approach is clearly not acceptable to society. So all that is left is the second approach. There are a number of ways that S can be strengthened, two of which are:

  (a) Have the traffic light pop a steel wall out of the street at the stop lines that face a red light.

  (b) Have the traffic light send a signal to all vehicles on the road that face a red light, a signal that causes the vehicles to stop.

Approach (a) will work, but has some other side effects that make it not a good idea. Among these side effects are that vehicles will collide with the steel wall and there will be more rear-end collisions. Approach (b) will work, but then there are some other assumptions needed about the domain:

D::

Vehicles obey traffic lights.

That this new D is a lie would have to be permanently tolerated, just as for the original D. As a society, we would probably prefer the new D to the original one, because we know that properly constructed self-driving vehicles are in fewer accidents than are human drivers. However, we need to ensure that we are able to build vehicles that can be controlled by traffic lights. Of course, unless the vehicles are quite smart and know how to slow down without hitting other vehicles, there will be an increase in rear-end collisions.

A.2 Aircraft

This example shows the ZJVF interplay applied to how the rare, but usually catastrophic collision between flying aircraft and flying birds is managed in aviation. Here too, the interplay is apparent even when very high level assertions are used for D, S, and R. We start with R and S in the hopes that just S is enough to entail R:

R::

A flying aircraft does not crash when it collides with birds in the air.

S::

An aircraft, made of aluminum, is able to achieve sufficient speed that it lifts off the ground and flies through the air.

Here too S is not strong enough to entail R, mainly because an aircraft built with aluminum and flying at flying speeds will break into pieces if it hits a bird in the air.

A domain assumption D that would allow D, S to entail R is:

D::

There are no birds in the air anywhere near any aircraft.

This D is not true, although the probability of its not being true is observably quite low, say ε.

If we try strengthening S to

S::

An aircraft is built with a thick enough fuselage that the aircraft will not break up in a collision with a bird,

then S will not entail the flying part of R. The aircraft will be too heavy to fly. So, if we insist that an aircraft both fly and not crash when it collides with a bird, and we cannot change the material of the aircraft, we are stuck. There is no way that D, S can entail R.

We could decide to permanently tolerate that D is not true, because the probability of its not being true is low enough. This permanent toleration is logically equivalent to changing R to state that an aircraft collides with a bird and crashes with probability less than ε, where ε is the low probability that D is not true. This toleration is exactly the state of aircraft building today. We as a society have decided to accept the risk of crashes for the benefits of flying.
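The claimed equivalence between permanently tolerating a false D and weakening R to a probabilistic statement can be illustrated with a small simulation; the probability and the sample size are hypothetical.

```python
import random

random.seed(0)
eps = 0.001  # assumed probability that D is false: a bird is near the aircraft

def flight_crashes():
    # Under the original S, the aircraft crashes exactly when D fails.
    return random.random() < eps

n = 200_000
rate = sum(flight_crashes() for _ in range(n)) / n
# The weakened R: the observed crash rate stays (comfortably) below a bound near eps.
print(rate < 2 * eps)
```

The simulated crash rate tracks the probability that D is false, which is exactly what the weakened, probabilistic R tolerates.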

If neither the change in R nor the toleration of the falseness of D is acceptable, another alternative is to find a metal that is super light and super strong so that an aircraft built with it both flies and does not break up when it hits a bird while flying.

A.3 Sluice Gates

A more complete operationalization was proposed by Hayes et al. (2003) and Jones et al. (2007) in which the three elements, D, S, and R are played against each other to derive the specification S of a system that will meet a set of suitable requirements R in the context of the real-world environment that meets the domain assumptions D well enough. Here “suitable”, “real-world”ness, and “meet … well enough” are decided by the stakeholders of the system with respect to the real world as they see it. As the reader is able to consult the original reports, their operationalization is only summarized here.

Hayes, Jackson, and Jones’s idea is to take each requirement R1 and, if R1 does not already hold in the environment, to find a feature, specified by S1, combined with an assumption about the environment D1, such that D1, S1 ⊩ R1. This balancing act may require adjusting any of D1, S1, R1, and the part of the world that is considered to be the environment, to allow D1, S1 ⊩ R1 to hold. The balancing act may also require adjusting any of D1, S1, and R1 to allow each to be a conjunct in the global D, S, and R, respectively, for which D, S ⊩ R holds. The authors demonstrate their method by carrying it out to derive D, S, and R for a system to control the opening and closing of sluice gates (Hayes et al. 2003; Jones et al. 2007).
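A minimal sketch of one step of this balancing act, under the assumption that D1, S1, and R1 are propositional constraints checked by exhaustive search. The representation and the sluice-gate atoms are ours, chosen only for illustration.

```python
from itertools import product

# Each assertion is a predicate over a world (a dict of hypothetical atoms).
ATOMS = ["gate_open", "water_high", "motor_on"]

def worlds():
    for values in product([True, False], repeat=len(ATOMS)):
        yield dict(zip(ATOMS, values))

def entails(assumptions, goal):
    # assumptions |- goal iff goal holds in every world satisfying all assumptions
    return all(goal(w) for w in worlds() if all(a(w) for a in assumptions))

# Hypothetical sluice-gate fragment:
R1 = lambda w: not w["water_high"] or w["gate_open"]  # gate is open when water is high
S1 = lambda w: not w["motor_on"] or w["gate_open"]    # running the motor opens the gate
D1 = lambda w: not w["water_high"] or w["motor_on"]   # operator runs motor at high water

print(entails([S1], R1))       # S1 alone is too weak; R1 must be balanced against D1
print(entails([D1, S1], R1))   # D1, S1 |- R1
```

When the first check fails, the method calls for adjusting D1, S1, R1, or the boundary of the environment until the entailment holds, then repeating for each conjunct of the global D, S ⊩ R.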

Appendix B: Interview Guide and Questionnaire Questions

B.1 Interview guiding questions

  1. What is inconsistency?

  2. Please provide several (at least 3) examples of inconsistencies.

  3. What is inconsistency in requirements engineering?

  4. Please provide several (at least 3) examples of inconsistencies in requirements engineering.

  5. How do you feel about the inconsistencies you described in (2)?

  6. How do you feel about the inconsistencies you described in (3)?

  7. What do you think should be done in each example?

  8. Imagine there is a case in which there is an inconsistency in the requirements such that, in rare cases, the system will behave inconsistently; however, resolving this situation would be of high cost. What would you do?

Interviewee’s Background Questions:

  1. Education

  2. Profession/occupation

  3. Years of experience (if in different occupations, detail accordingly)

  4. Current position

  5. Experience with Agile development (y/n)

B.2 Questionnaire questions

  1. What is inconsistency in requirements engineering?

  2. Have you encountered inconsistency in your work? If so, give an example.

  3. How do you feel about the inconsistency you described in (2)?

  4. Give an example of a very severe inconsistency in RE.

  5. Give an example of a less severe inconsistency in RE.

  6. Give an example of an inconsistency in RE that can be tolerated (does not have to be resolved).

Respondent’s Background Questions:

  1. Education

  2. Years of experience

  3. Current position

  4. Experience with Agile development (y/n)

Appendix C: Participant Demographics

Tables 1 and 2 provide the demographics of the interview and questionnaire participants respectively.

Table 1 Interviewees
Table 2 Questionnaire respondents


About this article


Cite this article

Hadar, I., Zamansky, A. & Berry, D.M. The inconsistency between theory and practice in managing inconsistency in requirements engineering. Empir Software Eng 24, 3972–4005 (2019). https://doi.org/10.1007/s10664-019-09718-5

