Abstract
We performed a series of usability studies to evaluate an education department portal, the New York State Education Department (NYSED) website (www.nysed.gov), in order to measure the quality of a user’s experience when interacting with specific sections of this Web site. The study comprised two phases: (1) a heuristic evaluation and a cognitive walkthrough of 25 web pages of the site; and (2) user testing of three components of the site that were redesigned based on the findings and recommendations from Phase 1. The results will assist NYSED in identifying opportunities for improving customer service and enhancing the website.
Keywords
- User Satisfaction
- Usability Issue
- Heuristic Evaluation
- Human Service Professional
- Improve Customer Service
1 Introduction
The Internet has become an integral part of people’s lives. People rely on web-based information systems like websites to access the information they need on the Internet. These websites generally contain an extensive amount of information spread across a significant number of web pages, so it is important to address usability issues when designing such portals. The International Organization for Standardization (ISO) standard 9241 defines usability as “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use” [4]. Nielsen & Mack [10] define usability problems as “any aspect of the design where a change would lead to improved system measures”. Website usability can be measured through various human-computer interaction (HCI) techniques. This paper addresses usability issues with a large government education website using heuristic evaluation, cognitive walkthrough, and user testing.
The New York State Education Department’s (NYSED) website (www.nysed.gov) provides an electronic means for a wide variety of people to access information from and conduct business with the State Education Department (SED). The Department’s Web presence consists of approximately 600,000 pages, including the main site and about 60 sub-sites. The scope of this site includes the Department’s main entry portal (at www.nysed.gov) as well as sites maintained by various program offices.
In this study, we aim to evaluate the NYSED website by conducting a series of usability studies to measure the quality of a user’s experience when interacting with specific sections of this large website. We want to find out whether or not users can accomplish their respective tasks, and measure the overall effectiveness of the resources which are offered via the internet. The results of the usability analysis will assist NYSED in identifying opportunities for improving customer service and enhancing the website.
The following sections include a discussion on previous work, details about how we conducted heuristics evaluation, cognitive walkthrough and user testing of the NYSED site, our findings, and improvement recommendations for the site.
2 Previous Work
2.1 Usability Testing Methods
Heuristic evaluation has been widely used to identify usability problems in the user interface design of websites [10, 11]. A small number of evaluators examine the interface and judge its compliance with widely used usability principles. The cognitive walkthrough is a popular method for evaluating the design of a user interface that addresses how well the interface supports “exploratory learning” [12].
Shneiderman and Plaisant [14] describe eight golden rules of interface design: striving for consistency, catering to universal usability, offering informative feedback, designing dialogs to yield closure, preventing errors, permitting easy reversal of actions, supporting an internal locus of control, and reducing short-term memory load. Blackmon, Polson, Kitajima, and Lewis [2] proposed the Cognitive Walkthrough for the Web (CWW), which can be used in website design and usability evaluation.
In addition to the expert evaluation methods such as heuristic evaluation and cognitive walkthrough, researchers also rely on user testing to evaluate website usability. User testing relies on user comments and user experience of using the website, and it is usually carried out in a lab environment [16].
Iterative design of user interfaces involves redesigning the interfaces based on findings from user testing. Nielsen [9] found that, across four case studies, the median improvement in overall usability was 165% from the first to the last iteration, and the median improvement per iteration was 38%. He suggested iterating through at least three versions of the interface.
2.2 Previous Related Usability Studies
Jones [5] presented a case study applying discount usability techniques (including prototypes, heuristic evaluation, etc.) in a federal government agency on its system WISQARS (Web-based Injury Statistics Query and Reporting System). The results of the study were positive and raised usability issues on several other Injury Center web efforts. In 2009, Zhou [17] analyzed a Canadian government website using theories of usability engineering and information architecture. The study sought to integrate a large amount of information through a clear structure and a friendly interface within a small screen, allowing users to see and locate information faster. Following W3C recommendations and standards, Murenin and Tabrizi [8] identified useful guidelines for interactive websites involving user registration and searching within static fields of a database. The website they redesigned became optimized for viewing across different browsers and profiles, including text browsers, printing profiles of graphical browsers, and mobile devices [8].
Sutcliffe [15] employed heuristics to evaluate three airline websites and found that although attractiveness and aesthetic design are key factors in usability evaluation, further research would need to be carried out to assess how these properties affect different user groups.
In a usability test of a distance continuing education website for human service professionals, Levine & Chaparro [6] evaluated the website’s ease of use, efficiency, and user satisfaction with the targeted users. The evaluation identified usability issues that were critical to know before releasing the site to the public. It addressed the need to focus on the potential consumers of the site (human service professionals) rather than just on the site’s functionality [6].
McMillen and Pehrsson [7] employed a cost-effective strategy to improve the usability of the Bibliotherapy Education Project’s (BEP) counselor education website. They identified three themes for improvement: graphics and visual presentation, organization of textual content, and workflow.
Akilli’s study [1] focused on user satisfaction with an educational website using a single usability technique, a user satisfaction questionnaire based on the Object-Action Interface (OAI) model. The study found the website user-friendly but lacking in flexibility and stimulating attributes.
3 Phase 1: Heuristic Evaluation and Cognitive Walkthrough
3.1 Heuristic Evaluation
Two evaluators independently conducted a heuristic evaluation of 25 web pages using the adapted website checklist designed by Gaffney [3]. The heuristics considered were navigation, functionality, control, language, authority, feedback, consistency, and visual clarity. Each checklist item was rated on a 5-point scale (1-Never, 2-Seldom, 3-Sometimes, 4-Frequently, 5-Always), with a Not Applicable (N/A) option.
We computed the average score for each of the eight usability categories from the 5-point ratings, excluding N/A responses. An average score close to five indicates good usability in that area; a score close to one signifies poor usability. Figure 1 shows the combined average ratings of the evaluators for each of the eight aspects of usability.
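The per-category scoring can be sketched as follows. This is a minimal illustration of the averaging rule only; the rating values shown are hypothetical, not the actual checklist responses from the study.

```python
def category_average(ratings):
    """Average the numeric 1-5 ratings, excluding N/A responses."""
    numeric = [r for r in ratings if r != "N/A"]
    return sum(numeric) / len(numeric) if numeric else None

# Hypothetical checklist responses for two of the eight categories.
responses = {
    "navigation":     [4, 3, 5, "N/A", 4],
    "visual clarity": [3, 3, 4, 4, "N/A"],
}

# One average per category, as plotted in Figure 1.
averages = {cat: category_average(r) for cat, r in responses.items()}
print(averages)  # {'navigation': 4.0, 'visual clarity': 3.5}
```

Excluding N/A before averaging keeps inapplicable checklist items from dragging a category’s score toward the low end of the scale.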
Evaluators felt generally positive about the usability of the website. Results indicated that the navigation, consistency, and visual clarity aspects of the website usability need to be given more attention.
The two evaluators gave similar ratings for the majority of categories, with Evaluator 2 slightly more generous than Evaluator 1. Relatively large discrepancies exist for language and consistency; these disagreements were caused by a few problems identified by Evaluator 1 that Evaluator 2 did not detect.
3.2 Cognitive Walkthrough
An evaluator conducted cognitive walkthroughs on all eight scenarios provided by NYSED. While accomplishing the tasks described in the scenario narratives, the evaluator recorded each mental and operational step on a Cognitive Walkthrough Evaluation Sheet [13]. Overall, the site lays out its information well, as evidenced by the high success rate of the walkthrough evaluations. Results indicated that the structure and search functionality of the site need improvement.
Since heuristic evaluation and cognitive walkthrough rely on the opinions and judgments of a small number of evaluators, the findings may be limited in generalizability and exhaustiveness. In the next section, we describe a formal user-centered experiment involving real users to identify usability problems and measure task effectiveness on the website. The results will contribute to a more robust website for targeted NYSED users.
4 Phase 2: User Testing
User testing was carried out on the site after it had been improved based on the recommendations from Phase 1. Seven participants, recruited by NYSED, evaluated three components of the site: General Education Development (GED), Office of Teaching, and General, which had been redesigned based on the changes recommended after the heuristic evaluation and cognitive walkthrough. Table 1 shows the demographic information of the seven participants. NYSED also provided a total of eight tasks: three for the GED site (Tasks 1.1, 1.2, and 1.3), four for the Office of Teaching site (Tasks 2.1, 2.2, 2.3, and 2.4), and one for the General site (Task 3.1). Each participant was asked to complete the eight tasks using the NYSED sites. Before the study, each participant signed a consent form and filled out a background questionnaire. Next, each participant searched the site for information to complete the eight tasks. Each session was recorded using TechSmith UserVue, an online service that enables the experimenter to remotely observe and record participants’ screens as they navigate applications and sites. We then reviewed the recordings, compiled the results, and provided a report to NYSED.
Task performance was evaluated from two perspectives: task effectiveness and task efficiency. Task effectiveness was judged based on the accuracy of results, as summarized in Table 2, where “√” indicates a correct result and “x” a wrong one. As the table shows, for Tasks 1.1, 1.2, 1.3, 2.2, and 2.3, more than half of the participants obtained correct results. For Tasks 2.1, 2.4, and 3.1, fewer than half did, which suggests potential areas for improvement.
Task efficiency (Table 3) was evaluated using the time participants spent on the tasks; here “x” indicates a time that was not collected because the participant completed Task 2.2 while doing Task 2.1. Unsurprisingly, the tasks identified as having poor effectiveness (Tasks 2.1, 2.4, and 3.1) also had the longest completion times, suggesting that these tasks were particularly difficult for the participants: many could not find the correct answers even though they spent far more time on these tasks.
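The two task metrics can be sketched as below. The per-participant results here are hypothetical placeholders (the actual data are in Tables 2 and 3); `None` stands in for a time that was not collected, as with the “x” entries above.

```python
def success_rate(outcomes):
    """Fraction of participants with a correct result (1 = correct, 0 = wrong)."""
    return sum(outcomes) / len(outcomes)

def mean_time(times):
    """Mean completion time in seconds, skipping uncollected (None) values."""
    collected = [t for t in times if t is not None]
    return sum(collected) / len(collected) if collected else None

# Hypothetical data: one easy GED task and one difficult Office of Teaching task.
task_1_1 = {"correct": [1, 1, 1, 0, 1, 1, 1], "secs": [60, 75, 50, 120, 80, 65, 70]}
task_2_4 = {"correct": [0, 1, 0, 0, 1, 0, 0], "secs": [300, 240, None, 280, 260, 310, 290]}

for name, task in [("1.1", task_1_1), ("2.4", task_2_4)]:
    print(name, success_rate(task["correct"]), mean_time(task["secs"]))
```

With data shaped like this, a low success rate paired with a high mean completion time flags a task as a candidate for redesign, which matches how Tasks 2.1, 2.4, and 3.1 were identified.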
Overall, the participants performed well for the tasks related to the GED site, but performed poorly for the tasks related to the Office of Teaching site and the General site.
5 Recommendations and Conclusions
At the end of the second phase, we recommended the following changes: (1) improving the categorization of the top menus and submenus; (2) highlighting important links; (3) restructuring the website to show the requirements for different user profiles; (4) avoiding ambiguous content so users know how to interpret it; (5) avoiding long pages by organizing content in a clear structure so users do not get lost in navigation; and (6) clarifying link names to avoid confusion.
Our study proved successful in identifying usability problems of the NYSED site. We conclude that a series of studies is necessary, with stakeholders involved throughout the usability testing process. We were constrained by a limited number of experts, participants, and tasks. Future work should investigate whether and how different user groups (e.g., teachers, parents, and students) behave differently when using the site. Despite these limitations, we believe our study highlights the critical importance of involving stakeholders in design and content development [6], and of including a sequence of testing phases in the usability testing process of websites.
References
Akilli, G.K.: User satisfaction evaluation of an educational website. Turk. Online J. Educ. Technol. 4(1), 85–92 (2002)
Blackmon, M.H., Polson, P.G., Kitajima, M., Lewis, C.: Cognitive walkthrough for the web. In: Proceedings of CHI, vol. 4, pp. 463–470 (2002)
Gaffney, G.: Web site evaluation checklist v1.1. Information and Design Pty Ltd (1998). http://www.infodesign.com.au/ftp/WebCheck.pdf. Accessed 20 March 2007
ISO 9241-11. Ergonomic requirements for office work with visual display terminals (VDTs): guidance on usability, International Standards Organisations (1998)
Jones, C.P.: Lessons learned from discount usability engineering for the federal government. In: Society for Technical Communication’s 50th Annual Conference Proceedings, pp. 333–338 (2003)
Levine, J., Chaparro, B.S.: Usability study of a distance continuing education website for human service professionals. J. Technol. Hum. Serv. 25(4), 23–39 (2007)
McMillen, P.S., Pehrsson, D.E.: Improving a counselor education web site through usability testing: the bibliotherapy education project. Counselor Educ. Supervision 29, 122–136 (2009)
Murenin, C.A., Tabrizi, M.H.N.: Development of usable and accessible web-portals using W3C standards. In: Proceedings of the International Conference on Information Technology: Coding and Computing, vol. 2, pp. 829–831 (2005)
Nielsen, J.: Iterative interface design. IEEE Computer 26, 32–41 (1993)
Nielsen, J., Mack, R.L. (eds.): Usability inspection methods. John Wiley and Sons, New York (1994)
Nielsen, J.: Designing web usability. New Riders Publishing, Indiana (2000)
Rieman, J., Franzke, M., Redmiles, D.: Usability evaluation with the cognitive walkthrough (1995). http://www.sigchi.org/chi95/proceedings/tutors/jr_bdy.htm. Accessed 15 December 2010
Rowley, D.E., Rhoades, D.G.: The cognitive jogthrough: a fast-paced user interface evaluation procedure. In: Proceedings of the Conference on Human Factors in Computing Systems, pp. 389–395 (1992)
Shneiderman, B., Plaisant, C.: Designing the user interface: strategies for effective human-computer interaction. Addison Wesley, Boston (2010)
Sutcliffe, A.: Assessing the reliability of heuristic evaluation for website attractiveness and usability. In: Proceedings of the 35th Hawaii International Conference on System Sciences, vol. 5, pp. 183–198 (2002)
Tan, W., Liu, D., Bishu, R.: Web evaluation: heuristic evaluation vs. user testing. Int. J. Ind. Ergon. 39, 621–627 (2009)
Zhou, X.-Y.: Usage-centered design for government websites. In: Proceedings of the Second International Conference on Information and Computing Science, pp. 305–308 (2009)
© 2015 Springer International Publishing Switzerland
Yuan, X., Yang, H., Moorhead, K., DeMers, K. (2015). Evaluating an Education Department Portal: A Case Study. In: Marcus, A. (eds) Design, User Experience, and Usability: Interactive Experience Design. DUXU 2015. Lecture Notes in Computer Science(), vol 9188. Springer, Cham. https://doi.org/10.1007/978-3-319-20889-3_23
Print ISBN: 978-3-319-20888-6
Online ISBN: 978-3-319-20889-3