
Methodology Support in CASE Tools and Its Impact on Individual Acceptance and Use: A Controlled Experiment

Empirical Software Engineering

Abstract

This paper reports the results of a controlled experiment investigating whether the methodology support offered by a CASE tool has an impact on the tool’s acceptance and actual use by individuals. Subjects used the process modelling tool SPEARMINT to complete a partial process model and remove all inconsistencies. Half of the subjects used a variant of SPEARMINT that corrected consistency violations automatically and silently, whilst the other half used a variant that notified them of inconsistencies both immediately and persistently but performed no automatic correction. Acceptance was measured, and actual use predicted, on the basis of the technology acceptance model, supplemented by beliefs about consistency rules.

The impact of the form of automated consistency assurance applied for hierarchical consistency rules was found to be significant at the 0.05 level (type I error probability 0.027), explaining 71.6% of the variance in CASE tool acceptance. However, intention to use, and thus predicted use, was of the same magnitude for both variants of SPEARMINT, whereas perceived usefulness and perceived ease of use were affected in opposite directions. The internal validity of these findings was threatened by validity and reliability issues related to beliefs about consistency rules; further research is needed to develop valid constructs and reliable scales for them.

Following the experiment, a small survey among experienced users of SPEARMINT found that different forms of automated consistency assurance were preferred depending on individual, consistency-rule, and task characteristics. Based on these findings, it is recommended that vendors provide CASE tools with adaptable methodology support, which allows users to fit automated consistency assurance to the task at hand.
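The experimental manipulation is easiest to see as two interchangeable strategies behind the same editing hook. The Python sketch below is purely illustrative: the paper does not show SPEARMINT’s implementation, and all names here are hypothetical. It contrasts the silent auto-correction of the first variant with the immediate, persistent critique of the second, and adds the per-rule adaptability the paper recommends.

```python
from abc import ABC, abstractmethod

class ConsistencyRule(ABC):
    """A consistency rule over a process model (hypothetical API)."""

    @abstractmethod
    def check(self, model):
        """Return a list of violations found in the model."""

    @abstractmethod
    def correct(self, model, violation):
        """Repair a single violation in place."""

class AutoCorrectAssurance:
    """Variant A: violations are corrected automatically and silently."""

    def on_edit(self, model, rules):
        for rule in rules:
            for violation in rule.check(model):
                rule.correct(model, violation)  # no notification to the user

class CritiqueAssurance:
    """Variant B: violations are reported immediately and persistently,
    but the tool never corrects them itself."""

    def __init__(self, notify):
        self.notify = notify    # e.g. a callback into the tool's UI
        self.open_issues = []   # persists until the user resolves them

    def on_edit(self, model, rules):
        self.open_issues = [v for rule in rules for v in rule.check(model)]
        for violation in self.open_issues:
            self.notify(violation)  # immediate feedback, left uncorrected

def assurance_for(rule, preferences, notify=print):
    """Adaptable methodology support, as the paper recommends:
    let the user choose the assurance form per rule and per task."""
    if preferences.get(rule.__class__.__name__) == "auto":
        return AutoCorrectAssurance()
    return CritiqueAssurance(notify)
```

Modelling the two variants as interchangeable strategies behind one hook is what would make the recommended adaptability cheap: the tool could switch assurance forms per rule or per task without touching the rules themselves.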



Author information

Correspondence to Jörg Zettel.

Additional information

This work originates from the author’s time at the Fraunhofer Institute for Experimental Software Engineering (IESE), Sauerwiesen 6, 67661 Kaiserslautern, Germany.


About this article

Cite this article

Zettel, J. Methodology Support in CASE Tools and Its Impact on Individual Acceptance and Use: A Controlled Experiment. Empir Software Eng 10, 367–394 (2005). https://doi.org/10.1007/s10664-005-1287-5
