A CSCW Requirements Engineering CASE Tool: Development and usability evaluation

https://doi.org/10.1016/j.infsof.2014.02.009

Highlights

  • Six metrics were used to measure effectiveness, efficiency and satisfaction.

  • Facial expression analysis was used to assess users’ satisfaction throughout the test.

  • The evaluation of the tool was reported by applying ISO/IEC 25062:2006.

  • We found some flaws in the tool, despite its good usability figures.

  • The lessons learnt are presented to help in other usability assessment studies.

Abstract

Context

CSRML Tool 2012 is a Requirements Engineering CASE Tool for the Goal-Oriented Collaborative Systems Requirements Modeling Language (CSRML).

Objective

This work aims at evaluating the usability of CSRML Tool 2012, thus identifying any possible usability flaw to be solved in the next releases of the application, as well as giving a general overview on how to develop a DSL tool similar to the one evaluated in this work by means of Visual Studio Modelling SDK.

Method

In this evaluation, which was reported by following the ISO/IEC 25062:2006 Common Industry Format for usability tests, 28 fourth-year Computer Science students took part. They were asked to carry out a series of modifications to an incomplete CSRML requirements specification. Usability was assessed by measuring the task completion rate, the elapsed time, the number of accesses to the tool’s help system and the requests for the instructor’s verbal assistance. The participants’ arousal and pleasantness were assessed by analyzing their facial expressions, and their satisfaction was gathered through a USE questionnaire.

Results

In spite of obtaining high usability levels, the test outcome revealed some usability flaws that should be addressed.

Conclusion

The important lessons learnt from this evaluation are also applicable to the success of other usability tests as well as to the development of new CASE tools.

Introduction

Usability is defined by the International Organization for Standardization (ISO 9241-11 [36]) as “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use”. This means that if a product (a software application in our case) does not provide effectiveness, efficiency and satisfaction to its users, it is not usable, and will therefore probably not be used. This was the case with Google Wave [64], a web application that was part email, part micro-blogging service, part instant messaging, part collaborative discussion tool, and so on. Its creators said that “it would set a new benchmark for interactivity”, but the truth is that the tool had too many features to remain usable, which forced Google to discontinue its development two years after its public preview due to very low usage. Nevertheless, the Wave issue can be considered an isolated failure among Google’s long list of successes. In fact, its search engine is the most widely used, probably because of its usability [21]. When users ask the Google search engine to find webpages, images, videos, news or anything else, they are very likely to find what they were looking for (effectiveness) in a considerably short period of time (efficiency) and, finally, to be satisfied with the simple search procedure and its results (satisfaction). Taking these three features into account, we can deduce why almost everybody uses Google: it is usable.

However, it is not only web applications that need to achieve usability: every piece of software endowed with a user interface (UI) must be usable. For this reason, in this work we evaluate the usability of a computer-aided software engineering (CASE) tool, specifically a Requirements Engineering (RE) CASE tool. The object of this assessment is CSRML Tool 2012 (hereinafter abbreviated to CT’12), the CASE tool supporting CSRML (Collaborative Systems Requirements Modeling Language) [61], a Goal-Oriented RE language [41] able to model the special sort of requirements needed to specify a Computer Supported Cooperative Work (CSCW) system. These requirements, which are based on the management of the system participants and their capabilities, the specification of the tasks they are supposed to perform, and the awareness of the other task participants (see Section 2), make CSRML a graphically complex language. The UI of the tool must therefore be complete enough to deal with all the CSRML modeling elements, including five different diagram editors with cross-referencing facilities. As the understandability of CSRML had already been validated empirically by means of a family of experiments [62], in this work the usability of a tool supporting all CSRML features is evaluated to assess the extent to which users are able to specify CSCW requirements with it. By evaluating the usability of the CSRML CASE tool it will thus be possible to improve it according to any usability flaws detected. Indeed, the more usable CT’12 is, the more likely it is that the requirements of CSCW systems will be properly specified with it, thus avoiding several potential sources of error, such as missing information and the incorrect translation of requirements into written natural language [66].

With this aim, a usability test was documented using the ISO/IEC Common Industry Format (CIF) for usability tests [38]. To enrich this test, we also applied different evaluation techniques: one for the definition of the experimental tasks [15] and two others for evaluating user satisfaction, by measuring the participants’ emotions [20] and by gathering their opinions through a survey [43]. Twenty-eight fourth-year Computer Science students participated in the test. They were asked to carry out a set of tasks on an incomplete CSRML requirements specification corresponding to an on-line collaborative multiplayer game. The usability of the tool was measured through several variables, such as task completion rate, elapsed time, a satisfaction questionnaire, and the users’ pleasantness and arousal when using the tool (gathered by analyzing video recordings of their facial expressions). It is worth highlighting the importance of carrying out usability evaluations [3] as part of the CT’12 development process: as shown in this work, it has enabled us to find several usability flaws that are being addressed in the next version of CT’12, to be launched shortly. A general instructional overview of how to develop a DSL tool like CT’12 is also included, to be used by the community as a guide to developing and evaluating new DSL CASE tools.
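The effectiveness and efficiency measures mentioned above lend themselves to simple computation. The following sketch uses hypothetical data and common metric definitions (completion rate as the share of correctly finished tasks, and a time-based efficiency as goals achieved per unit time); the exact formulas used in the study may differ:

```python
from statistics import mean

def completion_rate(outcomes):
    """Effectiveness: fraction of tasks completed correctly (1 = done, 0 = not)."""
    return sum(outcomes) / len(outcomes)

def time_based_efficiency(outcomes, times):
    """Efficiency: goals achieved per second, averaged over tasks."""
    return mean(done / t for done, t in zip(outcomes, times))

# Hypothetical outcomes and elapsed times (seconds) for one participant's ten tasks
outcomes = [1, 1, 1, 0, 1, 1, 1, 1, 1, 1]
times = [30, 45, 60, 120, 50, 40, 55, 65, 35, 70]

print(f"completion rate: {completion_rate(outcomes):.0%}")
print(f"efficiency: {time_based_efficiency(outcomes, times):.4f} goals/s")
```

Aggregating such per-participant figures across the sample yields the summary statistics reported in a CIF-style usability report.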

This paper is structured as follows: after this introduction, related work is discussed in Section 2. Section 3 presents the DSL tool development process and the usability test report in accordance with the ISO/IEC CIF. Section 4 analyzes the results of that report, and Section 5 discusses the threats to the validity of this evaluation. Section 6 summarizes the lessons learnt from the test and, finally, Section 7 presents our conclusions and further work.

Section snippets

Related work

CT’12 is a CASE tool that supports the specification of CSCW (Computer Supported Cooperative Work) requirements using the CSRML language [61]. CSRML is a Goal-Oriented Requirements Engineering (RE) language inspired by the i* framework [12], [42], focusing on the specification of the special kind of requirements that collaborative systems usually have [59]. These software systems are studied under the CSCW discipline, which addresses “how collaborative activities and their

Usability report

This section deals with our experimental method, the design of the experiment, the procedure followed and the results obtained, in accordance with the template provided by the ISO/IEC 25062:2006 international standard [38]. The method was designed to comply with the ISO/IEC’s Common Industry Format (CIF) for usability test reports. Hence, this section has been organized by following the ISO/IEC 25062:2006, which standardizes the types of information to be captured in tests with users, so that

Results interpretation

This section contains an analysis and interpretation of the results. The directed tasks obtained a 99.55% completion rate, as can be seen in Table 7. From the detailed results shown in Table 11 in Appendix C, we can see that only one participant failed to correctly perform one of the directed tasks (which can be considered an isolated mistake), so it can be said that the subjects understood the training session perfectly.
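The 99.55% figure is consistent with the experimental setup: assuming 8 directed tasks per participant (as the Conclusions section states), the 28 participants produced 224 directed-task instances, of which exactly one was performed incorrectly:

```python
participants = 28
directed_tasks = 8   # directed tasks per participant during the training session
failed = 1           # the single incorrectly performed directed task

total = participants * directed_tasks   # 224 task instances in all
rate = (total - failed) / total
print(f"directed-task completion rate: {rate:.2%}")
```

223/224 rounds to 99.55%, matching Table 7.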

Regarding the effectiveness of the non-directed tasks, the participants

Threats to validity

Next, the usability test is analyzed according to the four types of threats to validity proposed in [67].

Lessons learnt

The aim of this section is not only to show what we learnt in this study, but also to contribute to other usability assessment studies in which our successes can be extrapolated and our failures avoided.

(LL1) First of all, it is worth pointing out the almost-one-hundred-percent result obtained by the participants in the directed tasks. In the light of this positive result, it can be said that it is appropriate to give a practical training session (learning by doing) rather than giving a

Conclusions and further work

This work describes the usability evaluation of CSRML Tool 2012 (CT’12), the CASE tool that provides support to CSRML [61], in which twenty-eight fourth-year Computer Science students took part. The participants had to complete 8 directed tasks during a training session and continue with another 10 non-directed tasks (grouped in 3 difficulty levels), whose results would be used to measure the tool’s usability. Besides task completion rates, other dependent values were considered, such as total

Acknowledgements

This research has been funded by the Spanish Ministry of Economy and Competitiveness and by the FEDER funds of the EU under the project Grant insPIre (TIN2012-34003). It has also been funded by the Spanish Ministry of Education, Culture and Sport through the FPU scholarship (AP2010-0259). We would like to thank the members of SymbiaIT and the fourth-year Computer Science students of the University of Castilla-La Mancha for their valuable collaboration.

References (69)

  • J. Lee et al., Structuring requirement specifications with goals, Inf. Softw. Technol. (2001)
  • G. Peevers et al., A usability comparison of three alternative message formats for an SMS banking service, Int. J. Hum. Comput. Stud. (2008)
  • A. Sutcliffe et al., Investigating user experience in Second Life for collaborative learning, Int. J. Hum. Comput. Stud. (2012)
  • M.A. Teruel et al., Analyzing the understandability of requirements engineering languages for CSCW systems: a family of experiments, Inf. Softw. Technol. (2012)
  • M.H. Tran et al., Using an experimental study to develop group awareness support for real-time distributed collaborative writing, Inf. Softw. Technol. (2006)
  • A. Vasalou et al., Cultural differences, experience with social networks and the nature of “true commitment” in Facebook, Int. J. Hum. Comput. Stud. (2010)
  • G.S. Walia et al., A systematic literature review to identify and classify software requirement errors, Inf. Softw. Technol. (2009)
  • P.G. Aitken, .NET Graphics and Printing: A Comprehensive Tutorial and Reference for Developers, Optimax Pub,...
  • C. Ardito, P. Buono, D. Caivano, Usability evaluation: a survey of software development organizations, in: 23rd...
  • A. Bangor et al., An empirical evaluation of the system usability scale, Int. J. Hum. Comput. Interact. (2008)
  • V.R. Basili et al., Building knowledge through families of experiments, IEEE Trans. Softw. Eng. (1999)
  • I. Blau, A. Caspi, What Type of Collaboration Helps? Psychological Ownership, Perceived Learning and Outcome Quality of...
  • J. Brooke, SUS – a quick and dirty usability scale, Usability Eval. Ind. (1996)
  • P.H. Carstensen et al., Computer supported cooperative work: new challenges to systems design
  • J. Castro, M. Kolp, J. Mylopoulos, A requirements-driven development methodology, 13th Int. Conf. on Advanced...
  • W.G. Cochran et al., Experimental Designs (1992)
  • A. Cockburn, Writing Effective Use Cases (2000)
  • S. Cook et al., Domain-Specific Development with Visual Studio DSL Tools (2007)
  • L.J. Cronbach et al., My current thoughts on coefficient alpha and successor procedures, Educ. Psychol. Measur. (2004)
  • J.A. Cruz-Lemus et al., Assessing the understandability of UML statechart diagrams with composite states—a family of empirical studies, Empir. Softw. Eng. (2009)
  • L.M. Cysneiros et al., Non-functional requirements elicitation
  • P.M.A. Desmet, Measuring emotions: development and application of an instrument to measure emotional responses to products
  • D. Dudek et al., Is Google the answer? A study into usability of search engines, Libr. Rev. (2007)
  • C.A. Ellis et al., Groupware: some issues and experiences, Commun. ACM (1991)