Abstract
The definition of user experience (UX) is broad and covers several aspects. The job of a programmer is very specific and demanding: programmers use different systems and tools to carry out their programming tasks. We consider the programmer as a specific kind of user, who employs programming environments and other software development artifacts, and we therefore refer to this particular kind of UX as Programmer eXperience (PX). Several authors have addressed different aspects of PX, including, among others, language features, programming learning factors and programmer performance. Usability is a relevant aspect of UX, as well as an important aspect of programming environments. Heuristic evaluation is an inspection method for evaluating the usability of interactive software systems. Following the methodology proposed by Quiñones et al., we developed a new set of 12 specific heuristics that incorporates concepts of UX and usability for programming environments. The heuristics were also validated following that methodology, and the results obtained for the different effectiveness criteria were satisfactory. However, the set of heuristics could be further refined and validated in new scenarios or case studies.
1 Introduction
Programming environments are systems that provide different tools that facilitate the work of a programmer. The definition of UX is quite broad and refers to the perceptions of a user in relation to different products, systems and/or services [1]. In this work, we consider the programmer as a user and Programmer eXperience (PX) a specific kind of UX.
In previous work, we found that PX is related to the motivation and perception of the programmer. PX is also related to the choice of development tools, such as programming environments. We also found that most studies focus on the usability of programming environments. One way to evaluate usability is through heuristic evaluation; to perform a heuristic evaluation, a specific set of heuristics is needed to find usability problems in the corresponding domain.
In this paper, we present a set of heuristics that incorporates aspects of UX and usability to evaluate programming environments. We followed the methodology proposed by Quiñones et al. in 2018 [2] to develop the set of heuristics. This methodology contains 8 stages, which allow the development of heuristics and their validation. We developed a new set of 12 specific heuristics that incorporates concepts of UX and usability of programming environments. The set of heuristics was validated through two experiments established in the methodology used: expert judgment and heuristic evaluation. The results obtained allowed us to establish the effectiveness of the proposed set. In both experiments we evaluated several criteria and dimensions.
The paper is organized as follows. Section 2 introduces the theoretical background. Section 3 describes the methodology followed to develop the set of heuristics and the results obtained. Finally, in Sect. 4 we present conclusions and future work.
2 Theoretical Background
2.1 User eXperience
User eXperience (UX) is defined by the International Organization for Standardization (ISO) 9241-210 as follows: “user’s perceptions and responses that result from the use and/or anticipated use of a system, product, or service” [1]. Several authors have proposed models to explain UX; one of them is Morville’s “Honeycomb” [3]. This model is composed of seven aspects, which are briefly described below:
- Useful: the system or product must be useful and meet a need; otherwise, there is no justification for the product.
- Usable: ease of use is vital. The system must be simple and easy to use. Usability is necessary but not sufficient.
- Desirable: the visual aesthetics of the product or system should be balanced and related to the elements of emotional design.
- Findable: the system must be easy to navigate and incorporate objects that are easy to find, so users can find what they need.
- Accessible: the system must be accessible to all users, including those with disabilities.
- Credible: companies, products or services must be trustworthy to users.
- Valuable: the system must deliver value to its sponsors. The user experience must contribute to customer satisfaction and/or advance the mission, depending on the organization.
2.2 Usability
Usability is an important element within the user experience. ISO 9241-11:2018 [4] defines usability as follows: “extent to which a system, product or service can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use”. The standard also defines effectiveness, efficiency and satisfaction as follows:
- Effectiveness: accuracy and completeness with which users achieve specified goals.
- Efficiency: resources used in relation to the results achieved.
- Satisfaction: extent to which the user’s physical, cognitive and emotional responses that result from the use of a system, product or service meet the user’s needs and expectations.
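The standard gives these definitions verbally rather than as formulas. As a minimal illustrative sketch (an assumption on our part, not prescribed by ISO 9241-11), effectiveness and efficiency are often operationalized as simple ratios:

```latex
% Illustrative operationalization; not prescribed by ISO 9241-11
\[
  \mathit{Effectiveness} = \frac{\text{goals achieved successfully}}{\text{goals attempted}},
  \qquad
  \mathit{Efficiency} = \frac{\mathit{Effectiveness}}{\text{time or other resources expended}}
\]
```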
Nielsen [5] explained in 1993 that usability has five attributes that allow it to be evaluated and, possibly, measured:
- Learnability: the system can be learned and used quickly by the user.
- Efficiency: related to the level of productivity that the user can achieve when using the system.
- Memorability: related to how easily the system is remembered by non-frequent users.
- Errors: related to the number of mistakes a user makes when performing a task; the system should have a low error rate.
- Subjective satisfaction: users must be satisfied when using the system.
2.3 Programmer eXperience
We consider the programmer as a user of different systems and artifacts in their work. One kind of system is the programming environment. In a systematic literature review [6] we found several articles that address usability in programming environments for different users (end users and professional users) [7,8,9,10]. In previous work, we also carried out experiments to evaluate the perceived usability of programming environments used by students and professional programmers. The results showed that two of the three Integrated Development Environments (IDEs) evaluated obtained low usability scores [11]. This indicates that usability is an important aspect that must be developed and improved in programming environments. In addition, we found articles that propose sets of heuristics to evaluate programming environments [12] and programming languages together with programming environments [13].
As mentioned earlier, usability is an important element of the programmer experience when the programmer uses a programming environment. However, it is not the only factor that affects PX; other elements must also be considered, such as the aspects indicated in the “honeycomb” model of user experience.
The programmer experience is closely related to the user experience because PX considers the perceptions of programmers about systems, products and services (in this work, the systems are programming environments). In addition, PX has a personal component related to the programmer’s intrinsic and extrinsic motivations. Other abilities, such as technical and social skills, can also facilitate the programmer’s performance in a development team [6].
2.4 Heuristic Evaluation
Heuristic evaluation (HE) is an inspection method that allows us to evaluate the usability of interfaces [14]. An HE is carried out by a group of evaluators, who may or may not be experts. The evaluators examine the interface and judge its compliance with usability principles.
These usability principles are heuristics grouped into sets. A set of heuristics can be general, as in the case of Nielsen’s heuristics, or specific to the kind of product to be evaluated, such as social networks, programming languages or programming environments.
An HE is less expensive than a test with users and can be performed in a short period of time [14].
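As a rough illustration of how the findings of an HE are often consolidated (problems matched to the heuristics they violate and rated by severity), the following minimal Python sketch uses hypothetical evaluator data and names; it is not part of the process described in this paper.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical findings from a heuristic evaluation: each evaluator reports
# (problem description, violated heuristic id, severity on a 0-4 scale).
findings = {
    "evaluator_1": [
        ("Compiler error message is unclear", "H-5", 3),
        ("Refactoring cannot be undone", "H-3", 4),
    ],
    "evaluator_2": [
        ("Compiler error message is unclear", "H-5", 2),
    ],
}

# Consolidate duplicate problems across evaluators and average their severity ratings.
consolidated = defaultdict(list)
for reports in findings.values():
    for problem, heuristic, severity in reports:
        consolidated[(heuristic, problem)].append(severity)

for (heuristic, problem), severities in sorted(consolidated.items()):
    print(f"{heuristic}: {problem} (mean severity {mean(severities):.1f})")
```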
3 Methodology
The development of the heuristics to evaluate programming environments was based on the methodology proposed by Quiñones et al. [2]. This methodology contains 8 stages, which can be carried out iteratively, optionally or in an overlapping way, depending on the specific case. Each stage is briefly explained below:
1. Exploratory stage: perform a literature review.
2. Experimental stage: analyze data obtained in different experiments to collect additional information that was not identified in the previous stage.
3. Descriptive stage: select and prioritize the most important topics from all the information collected in the previous stages.
4. Correlational stage: match the features of the specific application domain with the usability/UX attributes and existing heuristics (and/or other relevant elements).
5. Selection stage: keep, adapt and/or discard the existing sets of usability/UX heuristics selected in Step 3 (and/or other relevant elements).
6. Specification stage: formally specify the new set of usability/UX heuristics.
7. Validation stage: validate the new set of heuristics through several experiments, in terms of its effectiveness and efficiency in evaluating the specific application.
8. Refinement stage: refine and improve the new set of heuristics based on the feedback obtained in the previous stage.
This methodology makes it possible to obtain a set of heuristics for a specific domain, incorporating aspects of usability and user experience. In this work, the specific domain is the programming environments used by programmers.
3.1 Heuristics Development
We followed the methodology carefully. We collected relevant information from different scientific databases related to the specific domain and found several features, usability/UX attributes and sets of heuristics. Among the sets of heuristics found, two sets concerning programming environments were used in the development of our new set; we refer to them as the base sets.
The first set evaluates programming environments [12] and has 9 heuristics. It is an adaptation of the set proposed by Nielsen to evaluate programming environments. In this work, we identify this set with the abbreviation NA (Nielsen Adapted) followed by the corresponding number. Table 1 shows the set of heuristics adapted from Nielsen.
The second set evaluates programming languages and programming environments [13]. This set is more extensive than the previous one and includes aspects of learning, languages and programming environments. We identify this set with the abbreviation EL (Environments and Languages) followed by the corresponding number. Table 2 shows the EL set with its 13 heuristics.
We considered experiments carried out by other authors to complement the information collected previously. We performed our own experiments in order to incorporate features or usability problems that were not found in the reviewed articles.
We then prioritized the information collected and decided which aspects, attributes and sets of heuristics were important for the development of our new set. We also decided which actions to take on the existing sets of heuristics: create, adapt, maintain or eliminate. The actions performed on these sets of heuristics are shown in Fig. 1.
The first version of the proposed set of heuristics is shown in Table 3. The resulting set is composed of 12 heuristics, created by considering the sets of heuristics found (the base sets) and performing the actions described above.
The proposed set contains 9 adapted heuristics, one maintained heuristic and two new heuristics. We identify each heuristic with the abbreviation PE (Programming Environment) followed by its corresponding number.
3.2 Heuristics Validation
We validated the set of heuristics with two experiments, following the guidelines of Quiñones et al. [2]: expert judgment and heuristic evaluation. To perform the expert judgment validation, we applied an online survey, using a questionnaire that evaluates each heuristic regarding its usefulness, clarity, ease of use and completeness. In all cases we used a 5-point Likert scale, where 1 is the lowest score and 5 is the best score. We asked UX experts about the new set of heuristics proposed to evaluate usability/UX in programming environments. The answers obtained reached satisfactory scores in all dimensions; Figure 2 shows the averages obtained. We also received several comments that helped us improve the heuristics.
To perform the heuristic evaluation, we considered an IDE as a case study and carried out a heuristic evaluation with two groups: the experimental group, which worked with the heuristics proposed by us, and the control group, which worked with a summarized version of the heuristics of Kölling and McKay [13]. The results obtained were favorable for our proposed heuristics in various effectiveness criteria, such as the number of correct associations of problems with heuristics, the number of specific usability/UX problems identified, and the number of usability/UX problems identified that were rated as more severe. For example, the number of correct associations of problems with heuristics is shown in Eq. (1) for the experimental group and in Eq. (2) for the control group. The experimental group obtained better results than the control group with regard to correct associations.
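Equations (1) and (2) are not reproduced in this extract. As an illustrative formulation only (our assumption, not the authors’ actual equations), this criterion can be expressed as the proportion of identified problems that a group associated with the correct heuristic:

```latex
% Illustrative only; not the paper's Eq. (1) or Eq. (2)
\[
  \mathit{CA}_{\text{group}} =
  \frac{\#\,\text{problems correctly associated with a heuristic}}
       {\#\,\text{problems identified by the group}}
\]
```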
The comments obtained from the experts in both experiments (expert judgment and heuristic evaluation) were considered, and modifications were made to the set of heuristics. In general, a description of the example image illustrating heuristic compliance was added to each heuristic. In addition, we improved the consistency of the heuristics, adjusting names, definitions and explanations in some cases. All of these changes were minor but improved the accuracy of the heuristics.
3.3 The Set of Usability/UX Heuristics for Programming Environments
The set of heuristics that we proposed and validated for programming environments is shown in Table 4. In this table, each heuristic is identified by its id, name and definition. The first ten heuristics are adaptations of the base heuristics and the last two correspond to newly created heuristics.
Each heuristic must be defined in detail, so we used a template with several elements, following the methodology adopted.
An example of the template used to define the heuristics is shown in Table 5. In this table we can see the Automatic Feedback heuristic (PE-12). The elements contained in the template are those indicated by the Quiñones et al. methodology [2]:
- Id: contains the heuristic identifier.
- Name: indicates the name of the heuristic.
- Definition: a brief definition of the heuristic.
- Explanation: a more detailed explanation of the heuristic.
- Explanation of the figure: a detailed explanation of the example figure, which represents the fulfillment of the heuristic, indicating the environment used and the operating system, in addition to the context of the figure.
- Example figure: an image that represents the fulfillment of the heuristic.
- Benefits: identification of the expected benefits associated with the fulfillment of the heuristic.
- Usability attributes and/or UX aspects: aspects associated with the heuristic.
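As an illustration only, the template above can be modeled as a simple data structure. The following Python sketch mirrors the template elements; the class name and all example values are hypothetical and are not taken from Table 5.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class HeuristicSpec:
    """Specification template for a usability/UX heuristic (fields mirror the elements above)."""
    id: str                  # heuristic identifier, e.g. "PE-12"
    name: str                # name of the heuristic
    definition: str          # brief definition
    explanation: str         # more detailed explanation
    figure_explanation: str  # detailed explanation of the example figure
    example_figure: str      # path or reference to the image showing fulfillment
    benefits: List[str] = field(default_factory=list)       # expected benefits
    ux_attributes: List[str] = field(default_factory=list)  # associated usability/UX aspects

# Hypothetical, illustrative instance (not the actual content of Table 5).
pe12 = HeuristicSpec(
    id="PE-12",
    name="Automatic Feedback",
    definition="The environment provides feedback on the programmer's actions automatically.",
    explanation="Feedback such as error highlighting appears without an explicit request.",
    figure_explanation="Screenshot of an IDE underlining a syntax error while code is typed.",
    example_figure="figures/pe12_feedback.png",
    benefits=["Errors are detected earlier"],
    ux_attributes=["Usable", "Efficiency"],
)
print(pe12.name)
```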
4 Conclusions and Future Work
A programmer is a specific user of systems, products and services. We focus this work on programmer experience (PX), which we identify as a specific case of user experience (UX). PX is affected by the tools that programmers use; one of the most commonly used tools is the programming environment.
In this work, we presented the development of a set of heuristics to evaluate programming environments, which includes aspects of usability/UX. The development was carried out under a rigorous methodology that allowed us to obtain a specific set made up of 12 heuristics. This set was validated through two experiments: expert judgment and heuristic evaluation. In both cases, the results obtained allowed us to validate the heuristics, which were then refined to obtain the final set presented in this work.
This work focuses on one of the most important artifacts that a programmer uses: the programming environment. As future work, we intend to further validate the set of heuristics in new contexts and case studies. We also intend to develop sets of heuristics to evaluate other software development artifacts, such as program code, design documents or programming languages.
References
ISO 9241-210: Ergonomics of human-system interaction - Part 210: Human-centred design for interactive systems. International Organization for Standardization, Geneva (2019)
Quiñones, D., Rusu, C., Rusu, V.: A methodology to develop usability/user experience heuristics. Comput. Stand. Interfaces 59, 109–129 (2018)
Morville, P.: User experience honeycomb. http://semanticstudios.com/user_experience_design/. Accessed 14 Dec 2019
ISO 9241-11: Ergonomics of human-system interaction - Part 11: Usability: Definitions and concepts. International Organization for Standardization, Geneva (2018)
Nielsen, J.: Usability Engineering. AP Professional, Boston (1993)
Morales, J., Rusu, C., Botella, F., Quiñones, D.: Programmer eXperience: a systematic literature review. IEEE Access 7, 71079–71094 (2019)
Cao, J., Fleming, S.D., Burnett, M.: An exploration of design opportunities for ‘gardening’ end-user programmers’ ideas. In: Proceedings of the IEEE Symposium Visual Languages and Human-Centric Computing (VL/HCC), Pittsburgh, USA, pp. 35–42 (2011)
Kuttal, S.K., Sarma, A., Rothermel, G., Wang, Z.: What happened to my application? Helping end users comprehend evolution through variation management. Inf. Softw. Technol. 103, 55–74 (2018)
Smith, J., Brown, C., Murphy-Hill, E.: Flower: navigating program flow in the IDE. In: Proceedings IEEE Symposium Visual Languages Human-Centric Computing (VL/HCC), Raleigh, USA, pp. 19–23 (2017)
Traver, V.J.: On compiler error messages: what they say and what they mean. Adv. Hum.-Comput. Interact. 2010, 1–26 (2010)
Morales, J., Botella, F., Rusu, C., Quiñones, D.: How “friendly” integrated development environments are? In: Meiselwitz, G. (ed.) HCII 2019. LNCS, vol. 11578, pp. 80–91. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-21902-4_7
Kline, R.B., Seffah, A.: Evaluation of integrated software development environments: challenges and results from three empirical studies. Int. J. Hum.-Comput. Stud. 63(6), 607–627 (2005)
Kölling, M., McKay, F.: Heuristic evaluation for novice programming systems. ACM Trans. Comput. Educ. (TOCE) 16(3), 12 (2016)
Nielsen, J., Molich, R.: Heuristic evaluation of user interfaces. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 249–256. ACM (1990)
Acknowledgment
We are grateful to all the experts who participated in the survey. We also thank all the participants in the heuristic evaluation. Jenny Morales is a beneficiary of an INF-PUCV doctoral scholarship.