Abstract
Usability is a core concept in HCI and is a common quality attribute for the design and evaluation of interactive systems. However, usability is a fluid construct and requires context-specific frameworks to be clearly defined and operationalized. Academic search user interfaces (SUIs) include the search portals of academic and research libraries, digital data repositories, academic data aggregators, and commercial publishers. In addition to information lookup, academic SUIs serve scientific information seeking in learning, exploration, and problem-solving.
Researchers in library and information science (LIS) have intensively studied information seeking behavior. In recent years, exploratory search has gained attention from LIS researchers, and experimental SUI features have been prototyped to support information seeking. In the meantime, many academic and research libraries have conducted usability evaluations and adopted discovery systems as part of their SUIs. However, there is a lack of context-specific usability models for guiding academic SUI implementation and evaluation.
This study takes the perspectives of information seeking tasks and usability contextualization to propose a formative conceptual framework of academic SUI usability. Information seeking tasks from information seeking and behavior models are integrated on the basis of the exploratory search paradigm. These tasks are then mapped to the layered usability construct to show how academic information seeking tasks may be supported to achieve high usability. Future studies should focus on developing contextualized academic SUI usability models with measurement metrics to guide the empirical implementation and evaluation of academic SUIs.
1 Introduction
Searching for information has become an increasingly important human activity in the modern world. As the explosion of information has become commonplace for knowledge workers, search has been widely used to ease the stress of information overload. Contemporary search happens in many different types of data collections: general Web search, image collections, ecommerce sites, government data, health and medical databases, and digital libraries and archives. While many see the search engine Google as synonymous with search, Amazon users would understand that it takes more than an omnibox to identify one suitable product in an enormous online store. Faceting, sorting, checking rankings, reading reviews, and comparing product details are among the common techniques an online shopper uses in an ecommerce environment. General purpose search user interfaces (SUIs), therefore, do not satisfy all search needs [1]. In other words, specialized SUIs are necessary when information seeking needs and contexts go beyond a general Web search.
Academic SUIs include the search portals of academic and research libraries, digital data repositories, academic data aggregators, and publishers. The design of an SUI is important to information seeking activities and can influence information seeking behavior. For example, a simplistic design such as Google Search requires users to issue longer queries to compensate for the lack of a functional faceting mechanism; ineffective SUIs can waste the cognitive resources of the information seeker [2]. For academic information seekers, the use of Google Search is convenient but comes at the cost of search result quality [3].
Due to the intensive use of information in scientific activities, library and information science (LIS) researchers have long been interested in the information seeking and retrieval behavior of scholars. However, although scientific researchers spend much of their time in information seeking [4], how academic SUIs could better support scientific information seeking remains under-studied. As Bates [5] has pointed out, “[t]o optimize information search, … various design layers need to be recognized, understood, and designed for in an interface that nonetheless feels simple and natural to the end user.” Other researchers have also framed search as “data exploration for knowledge building” [6] to call for the design of future search interfaces.
As a core concept of human-computer interaction (HCI), usability has been used as an attribute for quality measurement of interactive systems. To achieve scientific understanding and better user experience of academic SUIs, usability evaluation has received attention among LIS researchers and professionals. According to a survey, 85% of academic libraries of the Association of Research Libraries (ARL) have conducted usability testing on their websites or OPAC [7]. However, usability is a fluid concept and the variation in perspective has led to different usability conceptualizations and evaluation approaches, which makes usability difficult to operationalize in practice. As a result, many usability measurement studies fall short in validity and reliability [8, 9].
The lack of consensus in usability conceptualization has also caused usability evaluation studies in digital libraries and discovery tools to be mostly ad hoc. In order to create academic SUIs with high usability, a clearly articulated conceptual framework of usability is critical for the design and evaluation of academic SUIs. Such a framework should bring the scientific information seeking knowledge together with contextualized usability conceptualization to further conceptual development and empirical examination of academic SUIs.
Conceptual models are critical for theory-informed design, and researchers have adopted models to guide the design of SUIs. Jackson et al. [10] designed an exploratory search interface to support scholarly activities in searching an Internet archive (webarchive.ca), following Shneiderman’s [11] visual information seeking mantra as a design principle and the chess analogy of Hearst et al. [12] as a task model. Many experimental SUI features have recently been prototyped under the exploratory search paradigm [13,14,15]. The development of contextualized usability models for academic SUIs would therefore provide guidance for the design and evaluation of SUIs for academic search.
A major difference between academic SUIs and general interactive systems is that academic SUIs take content, rather than the system’s functional features, as the purpose of the interaction. Content relevance hence becomes a key criterion of successful retrieval. In addition, the design of academic SUIs is further complicated by the huge volume and idiosyncratic nature of the content used by users from various disciplines. Researchers [16] have therefore called for the study of the work needs, patterns, and workflows of researchers in order to integrate internal and external content with search services. As Shneiderman and Plaisant [17] point out, “[t]he conversion of information needs … to interface actions is a large cognitive step.” The objective of this study, therefore, is to explore the issue of contextualized academic SUI usability modeling through the perspectives of academic information seeking tasks and contextualized usability factors.
2 Literature Analysis
While LIS researchers have intensively studied the information seeking behavior of scientists, research issues related to HCI have received relatively little attention from information retrieval researchers [18]. Pettigrew and McKechnie [19], after a content analysis of six major information science journals, found that HCI represents only 2% of the published articles. Many usability evaluation studies on academic and research library websites have also paid less attention to usability concepts and specific SUI features. As Bates [5] pointed out, interface design specific to searching is an under-studied and promising research area in information seeking and HCI.
2.1 Usability
Usability as a quality attribute of interactive systems has critical implications for the implementation and evaluation of information systems. However, the lack of a clear definition of usability as a conceptual construct has led to problems in the principled design and measurement of interactive systems. This predicament is shown by the popularity of the term user experience (UX) as both a replacement for and a synonym of usability to denote the broader aspects of human experience with products and services. The conceptual evolution and overlap are evidenced by the 2012 rebranding of the Usability Professionals’ Association as the User Experience Professionals Association and by the subtitle “improving the user experience” on the U.S. government usability website (www.usability.gov) [20]. In many cases, the terms usability, user experience, and human-centered design have been used interchangeably without carrying specific meanings [21].
One reason for the lack of conceptual clarity in usability is that the evolution of context has changed the nature of the interaction that usability, as an academic term, once denoted. For example, usability was discussed at a time when “[m]ost computer software in use today is unnecessarily difficult to understand, hard to learn, and complicated to use” [22]. Users, instead of systems, were once the target of improvement, in that “[u]sability depends heavily on users’ abilities to map their goals onto a system’s capabilities” [23]. The commonly referenced ISO 9241-11:1998 was conceived at a time that favored “evaluation of usability by user based measurement of effectiveness, efficiency, and satisfaction, as this was a convincing way of demonstrating the existence of usability problems to system developers” [24]. When these contexts no longer hold true, reconceptualization or the creation of new concepts becomes necessary. That is why the forthcoming revision of the ISO usability guidelines incorporates UX perspectives under the satisfaction aspect [24].
Usability Frameworks.
General usability frameworks such as ISO standard 9241-11 [25] and Nielsen’s heuristic evaluation [26] are commonly used in conducting empirical usability design and evaluation. It should be noted that these models are often defined with different factors. The ISO standard 9241-11 defines usability with three factors of (1) effectiveness, (2) efficiency, and (3) satisfaction; while Nielsen [27] defines usability with five quality components of (1) learnability, (2) efficiency, (3) memorability, (4) errors, and (5) satisfaction. Research reviews have thus pointed out the lack of consensus in the definition of usability [9, 28, 29].
Among the attempts to clarify the concept of usability, Alonso-Ríos et al. [30] proposed a usability taxonomy with six factors of (1) knowability, (2) operability, (3) efficiency, (4) robustness, (5) safety, and (6) subjective satisfaction with sub-attributes discussed under each factor. Similarly, Seffah et al. [31] reviewed various usability standards and models, and proposed a Quality in Use Integrated Measurement (QUIM) model with ten factors (efficiency, effectiveness, productivity, satisfaction, learnability, safety, trustfulness, accessibility, universality, and usefulness), 26 sub-factors, and measurement metrics.
The existence of multiple definitions with varied factors evidences that usability is a multi-dimensional concept [32]. What troubles a unified definition of usability is that the included factors are related to each other [31, 33], making analysis difficult. A solution for conceptual clarification is to specify the use context of the usability framework. In fact, this emphasis on context is addressed in the ISO 9241-11 standard, which defines usability as the “[e]xtent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use” [25].
Contextualizing Usability.
Fidel [34] discussed the design of context-specific information systems and suggested that a context-specific system serves a particular community of users and that its design may rely on the use of a context-general system. The efforts to address the issue of usability in context can be seen in the specialized models, metrics, and instruments developed for measuring usability in various fields. The Questionnaire for User Interface Satisfaction [35, 36] is a usability scale for interface measurement that includes screen design and terminology as specific factors. To ease the difficulty of usability implementation and the dependency on evaluator expertise, Lin et al. [37] developed a comprehensive index of software interface usability based on human information processing theory. Eight factors (compatibility, consistency, flexibility, learnability, minimal action, minimal memory load, perceptual limitation, and user guidance) are indexed, and the resulting Purdue Usability Testing Questionnaire (PUTQ) contains 100 questions.
Many contextualized usability models and measurement instruments are based on the ISO 9241-11 model’s factors of effectiveness, efficiency, and satisfaction. Some models are also informed by theories, while most are based on literature reviews and domain features. Evaluation tools are often built on the developed models and contain metrics and/or questionnaires. Usability contextualization is emphasized by researchers [38, 39]. As Bevan and Macleod [22] state: “[t]he ideal way to specify and measure usability would be to specify the features and attributes required to make a product usable, and measure whether they are present in the implemented product”, thus enabling quality to be designed into a product. They also suggest that usability can only be measured empirically “by assessing effectiveness, efficiency, and satisfaction with which representative users carry out representative tasks in representative environments” [22].
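To make the three measures concrete, the following sketch computes simple per-task summaries from hypothetical test observations. The record fields, the 1–7 satisfaction scale, and the specific formulas (completion rate, goals per unit time, mean rating) are illustrative choices; ISO 9241-11 names the factors but leaves concrete metrics to the evaluator.

```python
from statistics import mean

def usability_metrics(tasks):
    """Summarize ISO 9241-11-style measures from per-task observations.

    Each record is a dict with:
      completed    -- bool, whether the representative task succeeded
      seconds      -- float, time on task
      satisfaction -- post-task rating on an assumed 1-7 scale
    """
    # Effectiveness: share of representative tasks completed successfully.
    effectiveness = mean(1.0 if t["completed"] else 0.0 for t in tasks)
    # Efficiency: goals achieved per unit time (failed tasks contribute 0).
    efficiency = mean((1.0 if t["completed"] else 0.0) / t["seconds"] for t in tasks)
    # Satisfaction: mean post-task rating.
    satisfaction = mean(t["satisfaction"] for t in tasks)
    return {"effectiveness": effectiveness,
            "efficiency_per_sec": efficiency,
            "satisfaction": satisfaction}

# Hypothetical observations from three representative tasks.
observations = [
    {"completed": True, "seconds": 40.0, "satisfaction": 6},
    {"completed": False, "seconds": 95.0, "satisfaction": 3},
    {"completed": True, "seconds": 60.0, "satisfaction": 5},
]
print(usability_metrics(observations))
```

In practice such summaries would be computed per user and per task type across the representative environments Bevan and Macleod describe; this sketch only shows the arithmetic shape of the three factors.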
2.2 Academic Information Seeking
Among academic SUIs, the websites of academic and research libraries have been studied more than those of publishers, aggregators, and academic databases. Broadly speaking, the websites of academic and research libraries are academic SUIs, since the central function of a library website is content discovery and delivery. Academic searchers engage in scientific problem-solving with browsing and search activities. The process of academic information seeking is therefore usually complicated, longitudinal, and exploratory in nature.
As Shneiderman and Plaisant [17] point out, the weaknesses of traditional search interfaces include “difficulty in repeating searches across multiple databases, weak methods for discovering where to narrow broad searches, poor integration with other tools.” Contemporary academic SUIs have made progress in the capacity of combining content sources through federated search and discovery tools, although tool integration has not been greatly improved and academic library websites are still often complicated and low in usability due to the issue of resource management [40, 41]. As Web search engines have gradually evolved from keyword matching and Boolean search to semantic search, the representation of search results has also progressed from “search for links” to “search for information;” yet the topic of academic search continues to be under-studied [42].
Nel and Fourie [4], for example, found that about one-third of veterinary researchers spend more than 50% of their research time on information seeking and that they rely on electronic journal articles, scientific databases, and internet search tools as their sources of information. Such scientific information needs suggest that researchers should rely primarily on academic SUIs. However, according to a large-scale survey of researcher information behavior, a new generation of researchers across disciplines shows a strong preference for Google/Google Scholar for information seeking [43]. While academic SUIs and Google Search should complement each other, academic SUIs would better satisfy the information needs of scientists if they offered better usability.
Information Seeking Tasks.
LIS researchers have investigated the information behavior of scientists and developed influential descriptive models of the information seeking processes. Wilson [44] reviewed information seeking models and indicated that the models are at different levels of information behavior, information-seeking behavior, and information search behavior. Descriptive models such as Wilson’s [44] model of information behavior provide overarching descriptions of information behavior; Ingwersen’s [45] cognitive model of IR interaction is a high-level overview of information retrieval; whereas Saracevic’s [46] stratified model of IR interaction analyzes the information retrieval process from the interface perspective. Such models provide a basis for understanding academic searchers and their interaction with academic SUIs.
Due to the complexity of information needs and content, academic SUIs are less studied, and most library website usability evaluation studies are ad hoc in nature. This complexity is evident in the question raised by Bates [47] of “how much and what type of activity the user should be able to direct the system to do at once.” Although Bates [48] has proposed embedding search tasks in SUIs to support academic search, embedding tasks in an SUI is difficult. As Shneiderman and Plaisant [17] explain, “[t]he conversion of information needs … to interface actions is a large cognitive step.”
Järvelin and Ingwersen [49] propose that information seeking and retrieval research should pay more attention to tasks and their contexts, which include information retrieval, information seeking, and work task contexts. Bates [48] emphasizes the importance of information seeking tasks in academic search by discussing how six scholarly browsing techniques commonly used by scientists could be embedded in SUIs: footnote chasing, citation searching, journal run, area scanning, subject searching, and author searching. These browsing techniques are partially supported in existing SUIs but have not been fully developed and integrated to support academic search.
Researchers have described the behavioral tasks of academic searchers from experience and empirical studies and formulated these tasks in information seeking models. These models offer components at the task level in addition to conceptual and procedural descriptions. For example, Kuhlthau [50] proposed the six-stage Information Search Process model with specific tasks (task initiation, topic selection, prefocus exploration, focus formulation, information collection, and search closure). Ellis [51] identified six behavioral characteristics of scholarly information seeking: starting, chaining, browsing, differentiating, monitoring, and extracting. Bates [48] used the metaphor of berrypicking to describe the evolving nature of academic search and suggested six common browsing techniques to be supported by SUIs. Belkin’s [52] model of the standard view of information retrieval illustrates information seeking behavior as an interaction between information need and text, with four tasks: comparison, text retrieval, judgment, and modification. Shneiderman et al. [53] also propose an SUI model with four phases: formulation, action, review of results, and refinement. These models offer task descriptions grounded in the information seeking and retrieval practice of academic searchers and are thus critical to the conceptualization of academic SUI usability.
Prior studies of information seeking models and tasks have been used as lenses for further understanding the information seeking behavior and tasks of scientists. Al-Suqri [54] integrated the work of Ellis [51], Kuhlthau [55], and Wilson [44] to propose an information seeking behavior model and verified the model elements through a qualitative study. Following the grounded theory approach of Ellis [56], Moral et al. [57] analyzed data collected from one focus group and eight interviews with computer science researchers in a modeling study. An information-seeking process model was developed from the 169 derived concepts, with eight information seeking purposes: (1) obtain relevant information about a topic, (2) elaborate a state-of-the-art, (3) find again a forgotten reference, (4) update a bibliography, (5) find specific information, (6) find a reference for a citation, (7) browse a document collection, and (8) incorporate a set of documents into an existing local collection. In addition, in the information seeking task sub-model, five first-level tasks are identified: exploration, reading, filtering, single information-seeking, and chained information-seeking [57].
Exploratory Search.
Early research in search characterized scientific information seeking as exploratory browsing and search. The exploratory nature of information seeking behavior is discussed by Bates [48] as “berrypicking,” depicting information seeking as evolving through iterations of browsing and retrieval tasks via feedback and query refinement. The resurgence of exploratory search since the mid-2000s [58, 59] has brought attention to the exploratory nature of search and its importance in the design of SUI features in support of academic search tasks.
Exploratory search is one approach to dealing with the complexity of search, which traditional search systems are not built to support [60]. As empirical study has shown, exploratory search can be characterized by the use of short queries, maximum scroll depth, and long task completion time [61]. Marchionini [62] differentiated lookup from browsing and proposed the two features of learning and investigation in exploratory search. The idea that exploratory search is closely related to learning and problem solving [62, 63] has enriched exploratory search as a field of study and demonstrated the uniqueness of academic search. This extended definition of exploratory search is therefore widely used by contemporary exploratory search researchers, and many of the prototyped exploratory search features conceptually follow the processes of lookup, learn, and investigate.
With the evolution of technology and data science, exploratory search researchers are able to prototype SUI features to support learning and investigation. Examples include temporal presentation of single search query results [15], temporal comparison of multiple entities [10], overviews of searched-area topic flows [14], comparative overviews of query result documents [14, 15], and spatial presentation of query results [13, 64]. At the task level, these developments echo Shneiderman’s [11] proposal of visual data types and information retrieval tasks (overview, zoom, filter, details-on-demand, relate, history, and extract). In addition to learning and investigation, the exploratory search prototypes have also given considerable management capacity to the academic searcher; for example, displaying documents collected in a session for document management and the search trail for query management [15]. Contemporary academic search, therefore, can contain the four processes of lookup, learn, investigate, and manage within one SUI (Fig. 1).
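As a rough illustration of how the four processes might be operationalized in practice, the sketch below classifies SUI interaction-log events into lookup, learn, investigate, and manage. All event names and their groupings are invented for illustration; they are not taken from the cited prototypes.

```python
# Hypothetical mapping of interaction-log events to the four
# exploratory search processes (lookup, learn, investigate, manage).
# Event names and groupings are illustrative assumptions only.
PROCESS_OF_EVENT = {
    "query_submit": "lookup",
    "result_click": "lookup",
    "facet_select": "learn",
    "topic_overview_open": "learn",
    "entity_compare": "investigate",
    "temporal_view_open": "investigate",
    "save_document": "manage",
    "search_trail_open": "manage",
}

def summarize_session(events):
    """Count how often each process occurs in one search session."""
    counts = {"lookup": 0, "learn": 0, "investigate": 0, "manage": 0}
    for event in events:
        process = PROCESS_OF_EVENT.get(event)
        if process is not None:  # ignore events outside the mapping
            counts[process] += 1
    return counts

# A hypothetical session log mixing all four processes.
session = ["query_submit", "facet_select", "result_click",
           "entity_compare", "save_document", "query_submit"]
print(summarize_session(session))
```

A summary like this could show, for instance, whether a prototype's learning and investigation features are actually exercised or whether sessions remain dominated by lookup.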
Discovery Tools.
In recent years, discovery tools have become increasingly dominant in academic and research libraries [65] for integrating information resources across institutional repositories, subscribed commercial publication databases, open access resources, aggregation services, and Web resources. Discovery tools offer a simple, Google-like search interface and attempt to address the growing volume and complexity of data. As libraries gradually adopt discovery tools, research has shown that users favor the Google-type single search box, although usability and content integration remain major problems [66]. Hanrath and Kottman [67] also point out that existing discovery tools lack sufficient integration between the discovery tool interface and content providers such as publishers and content aggregators.
Academic search tasks occupy the middle ground of the search activity levels of move, tactic, stratagem, and strategy proposed by Bates [47]. These search tasks appear in both research lines of information seeking modeling and exploratory search and could serve as the foundation for usability modeling of academic SUIs. With the strength of searching across information resources through a single search interface, discovery tools have the potential to serve as contextualized academic SUIs if academic search tasks are embedded through the implementation of exploratory search features.
2.3 Academic SUI Usability
While there is a lack of research in the conceptualization of academic SUI usability, usability studies of digital libraries may inform the design of academic SUIs. Usability has received attention from digital library researchers because content and usability are important to users when evaluating digital libraries [68]. In the context of digital library research, researchers have defined usability to mean that “a system has visible working functionality familiar to its users, maximum readability, and useful content that is supported by its environment and aligned with context of use” [69]. Tsakonas et al. [70] proposed a contextualized digital library interaction model in which usability and usefulness are placed at the same conceptual level (Fig. 2). Usability in this model is defined as the interaction between user and system, and usefulness as the interaction between user and content. Such a conceptualization reflects the unique importance of content in the digital library context, and content is therefore central to the usability conceptualization of academic SUIs.
Jeng [72] reviewed usability definitions and proposed a digital library usability model that adds learnability to the ISO factors of effectiveness, efficiency, and satisfaction. Similarly, also based on the ISO 9241-11 usability factors and Nielsen [73], Joo and Lee [74] developed a model for digital library usability evaluation that includes the four dimensions of efficiency, effectiveness, satisfaction, and learnability. A study [69] surveying and contrasting usability definitions between library researchers and practitioners found 11 attributes of usability in a library context. As researchers [29] indicated after reviewing digital library evaluation models, the major challenge in usability evaluation of digital libraries remains finding consensus on the definition of usability. This definition issue is complicated by the finding that the digital library usability factors, like those of general usability frameworks, are correlated with one another [71, 72]. This lack of consensus in definition demonstrates the need for contextualized efforts to define and develop usability models for academic SUIs.
3 A Formative Model
As Fidel [34] discussed, formative models describe the required context under which a desired behavior happens. A conceptualization of usability for academic SUIs, therefore, needs to describe the conditions under which information seeking tasks would be better supported before metrics can be specified. Information seeking tasks are grounded in information seeking and behavior models and can be integrated into the exploratory search processes to map with related usability constructs.
Given the complicated nature of usability, it is not surprising that researchers have not been able to reach consensus on the academic SUI usability factors. Some research has, however, approached the issue of user interface usability from a different perspective. Parush [75], in discussing conceptual models and design for interactive systems, proposed five human factors for assessing the implications of conceptual models: (1) mental models and understanding, (2) location awareness, (3) visual search effectiveness, (4) operational load, and (5) working memory load. These factors seem appropriate for intermediating between user tasks and usability factors, given that they are human behavioral performance constructs and can also be interpreted as usability evidenced through interaction.
Due to the lack of clearly defined usability factors in academic SUIs, this study takes the layered architecture approach of Folmer and Bosch [28] by including levels of related usability factors and indicators to form a framework of usability. The layered elements are organized to demonstrate the rough relationships between the layers without specific matches among the elements. Based on the above analysis of the literature, the proposed framework puts together a formative model of academic SUI usability by mapping the academic search tasks of users to the usability constructs, indicating the specific conditions to be achieved in order to effectively support the academic searcher. This mapping has led to the academic SUI usability framework depicted in Table 1.
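One way to picture the layered idea is as a small data structure running from information seeking tasks through intermediating human-factor constructs to general usability factors. The specific pairings below are hypothetical examples for illustration; they are not the contents of Table 1.

```python
# Illustrative sketch of the layered mapping: information seeking task
# -> intermediating human-factor constructs (after Parush [75])
# -> general usability factors. All pairings are hypothetical examples.
LAYERED_MAPPING = {
    "browsing": {
        "human_factors": ["visual search effectiveness", "location awareness"],
        "usability_factors": ["effectiveness", "learnability"],
    },
    "query_formulation": {
        "human_factors": ["mental models and understanding", "working memory load"],
        "usability_factors": ["efficiency", "learnability"],
    },
    "document_management": {
        "human_factors": ["operational load", "working memory load"],
        "usability_factors": ["efficiency", "satisfaction"],
    },
}

def usability_factors_for(task):
    """Look up which usability factors a task's support would be judged by."""
    entry = LAYERED_MAPPING.get(task)
    return entry["usability_factors"] if entry else []

print(usability_factors_for("browsing"))
```

The point of the structure is only that each layer constrains the next without a one-to-one match between elements, which is the rough-relationship property the framework describes.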
The matching of the information seeking tasks involves identifying and interpreting the information seeking tasks from prior studies. For example, “Details-on-demand” in Shneiderman [11], which focuses on selected items for details, is similar to Reading in the Prefocus stage of Kuhlthau [50], in which reading is “Reading to become informed” and is associated with the strategy of “Reading to learn about topic” (p. 238) after locating relevant information. They are therefore grouped under the task of Reading. The task Browsing is taken broadly to mean going through resources, such as scanning topic or spatial areas, online collections, or lists of resources. This broad definition follows Ellis [51], although similar activities and behavior are sometimes termed exploration.
The information seeking tasks are generally grouped into the categories of lookup, learn, explore, and manage. These categories roughly match the search activities of lookup, learn, and investigate of Marchionini [62] and Kuhlthau’s [55] description of actions in the Information Search Process model: seeking background information, seeking relevant information, and seeking relevant/focused information. The process of Manage is added as an extension of the information seeking processes, reflecting the user capacities offered by contemporary search systems and especially the recent features in exploratory search SUI prototypes.
4 Conclusion
The development of a context-specific usability framework through the identification of user information seeking tasks and the integration of usability constructs is the focus of this study. This study stipulates the relationship between the information-seeking tasks and the usability constructs in the academic SUI context to reveal how scientific information seeking behavior could be supported in the SUI to achieve high usability. As Shneiderman and Plaisant [17] have indicated, there is a huge gap between information needs and interface actions. Well-developed usability conceptual frameworks, however, would guide the implementation of academic SUIs and their empirical evaluation for validity and reliability.
Usability is a broad field of study significant to research and to design in industry. This study aims at the conceptualization of a contextualized usability framework grounded in the existing knowledge of information seeking tasks and usability. This framework may be further refined and developed into context-specific usability models of, and evaluation tools for, academic SUIs. The architecture approach to modeling usability in the context of academic SUIs is pragmatic in presenting the usability constructs at a general level. Future studies should further clarify usability factors and indicators according to context.
A number of SUI features unseen in the traditional information seeking models are supported in recent exploratory SUI prototypes, and more innovative features could be developed in the future [6]. These features especially enhance the exploration and management processes in academic search and mostly support open-ended tasks at a higher cognitive level, which would trigger the next browsing and search moves. In contrast to general search, in which search is meant to satisfy the information needs of the searcher, future academic SUIs could include more personalization and collaborative academic information management features to facilitate learning and investigation tasks. The development of strategic SUI features such as collaboration, recommendation, and adaptation, and of their corresponding usability constructs, is also needed for future academic SUIs.
As Galitz [79] indicated, interface design must start with knowing the user. Ethnographic studies may provide new understanding of representative user tasks and information seeking scenarios, since the academic search context has changed with the adoption of discovery systems and the development of exploratory search research. A desirable next step would be to integrate the SUI features developed in exploratory search prototypes into existing discovery systems for empirical evaluation and improvement. Such implementations would be better guided by clearly contextualized academic SUI usability models.
References
Swanson, T.A., Green, J.: Why we are not Google: lessons from a library web site usability study. J. Acad. Librariansh. 37, 222–229 (2011)
Gooding, P.: Exploring the information behaviour of users of Welsh Newspapers Online through web log analysis. J. Doc. 72, 232–246 (2016)
Kroll, S., Forsman, R.: A Slice of Research Life: Information Support for Research in the United States. OCLC Research, Dublin (2010)
Nel, M.A., Fourie, I.: Information behavior and expectations of veterinary researchers and their requirements for academic library services. J. Acad. Librariansh. 42, 44–54 (2016)
Bates, M.J.: Many paths to theory: the creative process in the information sciences. In: Sonnenwald, D.H. (ed.) Theory Development in the Information Sciences, pp. 21–49. University of Texas Press, Austin (2016)
Wilson, M.L., Kules, B., Schraefel, M.C., Shneiderman, B.: From keyword search to exploration: designing future search interfaces for the web. Found. Trends® Web Sci. 2, 1–97 (2010)
Chen, Y.-H., Germain, C.A., Yang, H.: An exploration into the practices of library web usability in ARL academic libraries. J. Am. Soc. Inf. Sci. Technol. 60, 953–968 (2009)
Faulkner, L.: Beyond the five-user assumption: benefits of increased sample sizes in usability testing. Behav. Res. Methods Instrum. Comput. 35, 379–383 (2003)
Hornbæk, K.: Current practice in measuring usability: challenges to usability studies and research. Int. J. Hum.-Comput. Stud. 64, 79–102 (2006)
Jackson, A., Lin, J., Milligan, I., Ruest, N.: Desiderata for exploratory search interfaces to web archives in support of scholarly activities. Presented at the (2016)
Shneiderman, B.: The eyes have it: a task by data type taxonomy for information visualizations. In: Proceedings IEEE Symposium on Visual Languages, pp. 336–343. IEEE Computer Society, Boulder (1996)
Hearst, M.A., Smalley, P., Chandler, C.: Faceted metadata for information architecture and search. In: CHI Course for CHI. ACM, Montréal (2006)
Glowacka, D., Ruotsalo, T., Konuyshkova, K., Kaski, S., Jacucci, G.: Directing exploratory search: reinforcement learning from user interactions with keywords. In: Proceedings of the 2013 International Conference on Intelligent User Interfaces, pp. 117–128. ACM (2013)
Medlar, A., Ilves, K., Wang, P., Buntine, W., Glowacka, D.: PULP: a system for exploratory search of scientific literature. In: Proceedings of the 39th International ACM SIGIR Conference on Research and Development in Information Retrieval, pp. 1133–1136. ACM, New York (2016)
Singh, J., Nejdl, W., Anand, A.: Expedition: a time-aware exploratory search system designed for scholars. In: Proceedings of the 39th International ACM SIGIR Conference on Research and Development in Information Retrieval, pp. 1105–1108. ACM, New York (2016)
Bourg, C., Coleman, R., Erway, R.: Support for the Research Process: An Academic Library Manifesto. OCLC Research, Dublin (2009)
Shneiderman, B., Plaisant, C.: Designing the User Interface: Strategies for Effective Human-Computer Interaction. Pearson Education, Inc., Upper Saddle River (2005)
Ahmed, S.M.Z., McKnight, C., Oppenheim, C.: A review of research on human-computer interfaces for online information retrieval systems. Electron. Libr. 27, 96–116 (2009)
Pettigrew, K.E., McKechnie, L.E.F.: The use of theory in information science research. J. Am. Soc. Inf. Sci. Technol. 52, 62–73 (2001)
U.S. Department of Health and Human Services. https://www.usability.gov/
Merholz, P.: Peter in conversation with Don Norman about UX & innovation (2007). http://adaptivepath.org/ideas/e000862/
Bevan, N., Macleod, M.: Usability measurement in context. Behav. Inf. Technol. 13, 132–145 (1994)
Borgman, C.L.: Designing digital libraries for usability. In: Bishop, A.P., House, N.A.V., Buttenfield, B.P. (eds.) Digital Library Use: Social Practice in Design and Evaluation, pp. 85–118. The MIT Press, Cambridge (2003)
Bevan, N., Carter, J., Harker, S.: ISO 9241-11 revised: what have we learnt about usability since 1998? In: Kurosu, M. (ed.) HCI 2015. LNCS, vol. 9169, pp. 143–151. Springer, Cham (2015). doi:10.1007/978-3-319-20901-2_13
International Standards Organization: ISO 9241-11:1998(en), Ergonomic requirements for office work with visual display terminals (VDTs) — Part 11: Guidance on usability. https://www.iso.org/obp/ui/#iso:std:iso:9241:-11:ed-1:v1:en
Nielsen, J.: Heuristic evaluation. In: Nielsen, J., Mack, R.L. (eds.) Usability Inspection Methods, pp. 25–62. Wiley, New York (1994)
Nielsen, J.: Usability 101: Introduction to Usability. https://www.nngroup.com/articles/usability-101-introduction-to-usability/
Folmer, E., Bosch, J.: Architecting for usability: a survey. J. Syst. Softw. 70, 61–78 (2004)
Heradio, R., Fernandez-Amoros, D., Javier Cabrerizo, F., Herrera-Viedma, E.: A review of quality evaluation of digital libraries based on users’ perceptions. J. Inf. Sci. 38, 269–283 (2012)
Alonso-Ríos, D., Vázquez-García, A., Mosqueira-Rey, E., Moret-Bonillo, V.: Usability: a critical analysis and a taxonomy. Int. J. Hum.-Comput. Interact. 26, 53–74 (2009)
Seffah, A., Donyaee, M., Kline, R.B., Padda, H.K.: Usability measurement and metrics: a consolidated model. Softw. Qual. J. 14, 159–178 (2006)
Alshamari, M., Mayhew, P.: Technical review: current issues of usability testing. IETE Tech. Rev. 26, 402–406 (2009)
Hornbæk, K., Law, E.L.-C.: Meta-analysis of correlations among usability measures. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 617–626. ACM (2007)
Fidel, R.: Human Information Interaction: An Ecological Approach to Information Behavior. MIT Press, Cambridge (2012)
Chin, J.P., Diehl, V.A., Norman, K.L.: Development of an instrument measuring user satisfaction of the human-computer interface. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 213–218. ACM, New York (1988)
Perlman, G.: Questionnaire for User Interface Satisfaction. http://garyperlman.com/quest/quest.cgi?form=QUIS
Lin, H.X., Choong, Y.Y., Salvendy, G.: A proposed index of usability: a method for comparing the relative usability of different software systems. Behav. Inf. Technol. 16, 267–277 (1997)
Brooke, J.: SUS: a quick and dirty usability scale. In: Jordan, P.W., Thomas, B., Weerdmeester, B.A., McClelland, I.L. (eds.) Usability Evaluation in Industry, pp. 189–194. Taylor & Francis, Milton Park (1996)
Vilar, P.: Information behaviour of scholars. Libellarium J. Hist. Writ. 7, 17–39 (2015)
Pennington, B.: ERM UX: electronic resources management and the user experience. Ser. Rev. 41, 194–198 (2015)
Teague-Rector, S., Ballard, A., Pauley, S.K.: The North Carolina State University libraries search experience: usability testing tabbed search interfaces for academic libraries. J. Web Librariansh. 5, 80–95 (2011)
Khabsa, M., Wu, Z., Giles, C.L.: Towards better understanding of academic search. Presented at the (2016)
Carpenter, J., Wetheridge, L., Tanner, S.: Researchers of tomorrow: the research behaviour of Generation Y doctoral students. British Library and Joint Information Systems Committee (JISC) of the Higher Education Funding Council (HEFCE) (2012)
Wilson, T.D.: Models in information behaviour research. J. Doc. 55, 249–270 (1999)
Ingwersen, P.: Cognitive perspectives of information retrieval interaction: elements of a cognitive IR theory. J. Doc. 52, 3–50 (1996)
Saracevic, T.: The stratified model of information retrieval interaction: extension and applications. In: Proceedings of the Annual Meeting-American Society for Information Science, pp. 313–327. Learned Information (Europe) Ltd. (1997)
Bates, M.J.: Where should the person stop and the information search interface start? Inf. Process. Manag. 26, 575–591 (1990)
Bates, M.J.: The design of browsing and berrypicking techniques for the online search interface. Online Rev. 13, 407–424 (1989)
Järvelin, K., Ingwersen, P.: Information seeking research needs extension towards tasks and technology. Inf. Res. Int. Electron. J. 10, paper 212 (2004)
Kuhlthau, C.C.: Developing a model of the library search process: cognitive and affective aspects. Res. Q. 28, 232–242 (1988)
Ellis, D.: A behavioural model for information retrieval system design. J. Inf. Sci. 15, 237–247 (1989)
Belkin, N.J.: Interaction with texts: information retrieval as information seeking behavior. Inf. Retr. 93, 55–66 (1993)
Shneiderman, B., Byrd, D., Croft, W.B.: Clarifying search: a user-interface framework for text searches. D-Lib Mag. 3, 18–20 (1997)
Al-Suqri, M.N.: Information-seeking behavior of social science scholars in developing countries: a proposed model. Int. Inf. Libr. Rev. 43, 1–14 (2011)
Kuhlthau, C.C.: Inside the search process: information seeking from the user's perspective. J. Am. Soc. Inf. Sci. 42, 361–371 (1991)
Ellis, D.: A behavioural approach to information retrieval system design. J. Doc. 45, 171–212 (1989)
Moral, C., De Antonio, A., Ferre, X.: A visual UML-based conceptual model of information-seeking by computer science researchers. Inf. Process. Manag. 53, 963–988 (2016)
White, R.W., Kules, B., Bederson, B.: Exploratory search interfaces: categorization, clustering and beyond: report on the XSI 2005 workshop at the human-computer interaction laboratory, University of Maryland. In: ACM SIGIR Forum, pp. 52–56. ACM (2005)
Wilson, M.L.: Search User Interface Design. Morgan & Claypool Publishers, San Rafael (2011)
Ahn, J., Brusilovsky, P.: Adaptive visualization for exploratory information retrieval. Inf. Process. Manag. 49, 1139–1164 (2013)
Athukorala, K., Głowacka, D., Jacucci, G., Oulasvirta, A., Vreeken, J.: Is exploratory search different? A comparison of information search behavior for exploratory and lookup tasks. J. Assoc. Inf. Sci. Technol. 67, 2635–2651 (2016)
Marchionini, G.: Exploratory search: from finding to understanding. Commun. ACM 49, 41–46 (2006)
Marchionini, G.: Information Seeking in Electronic Environments. Cambridge University Press, Cambridge (1995)
Roux, C.: Using spatialisation to support exploratory search behaviour (2016)
Wells, D.: Library discovery systems and their users: a case study from Curtin University library. Aust. Acad. Res. Libr. 47, 92–105 (2016)
Niu, X., Zhang, T., Chen, H.: Study of user search activities with two discovery tools at an academic library. Int. J. Hum.-Comput. Interact. 30, 422–433 (2014)
Hanrath, S., Kottman, M.: Use and usability of a discovery tool in an academic library. J. Web Librariansh. 9, 1–21 (2015)
Xie, I.: Evaluation of digital libraries: criteria and problems from users perspectives. Libr. Inf. Sci. Res. 28, 433–452 (2006)
Chen, Y.-H., Germain, C.A., Rorissa, A.: Defining usability: how library practice differs from published research. Portal-Libr. Acad. 11, 599–628 (2011)
Tsakonas, G., Kapidakis, S., Papatheodorou, C.: Evaluation of user interaction in digital libraries. In: Notes of the DELOS WP7 Workshop on the Evaluation of Digital Libraries, Padua, Italy. Citeseer (2004)
Tsakonas, G., Papatheodorou, C.: Analysing and evaluating usefulness and usability in electronic information services. J. Inf. Sci. 32, 400–419 (2006)
Jeng, J.: Usability assessment of academic digital libraries: effectiveness, efficiency, satisfaction, and learnability. Libri 55, 96–121 (2005)
Nielsen, J.: Usability Engineering. AP Professional, Boston (1993)
Joo, S., Lee, J.Y.: Measuring the usability of academic digital libraries: instrument development and validation. Electron. Libr. 29, 523–537 (2011)
Parush, A.: Conceptual Design for Interactive Systems: Designing for Performance and User Experience. Morgan Kaufmann/Elsevier, Amsterdam (2015)
Xie, I., Cool, C.: Understanding help seeking within the context of searching digital libraries. J. Am. Soc. Inf. Sci. Technol. 60, 477–494 (2009)
Xie, I., Joo, S., Bennett-Kapusniak, R.: User involvement and system support in applying search tactics. J. Assoc. Inf. Sci. Technol. 68, 1165–1185 (2016)
Palmer, C.L., Teffeau, L.C., Pirmann, C.M.: Scholarly Information Practices in the Online Environment: Themes from the Literature and Implications for Library Service Development. OCLC Research, Dublin (2009)
Galitz, W.O.: The Essential Guide to User Interface Design: An Introduction to GUI Design Principles and Techniques. Wiley, Indianapolis (2007)
© 2017 Springer International Publishing AG
Chen, T., Gross, M. (2017). Usability Modeling of Academic Search User Interface. In: Marcus, A., Wang, W. (eds.) Design, User Experience, and Usability: Understanding Users and Contexts. DUXU 2017. Lecture Notes in Computer Science, vol. 10290. Springer, Cham. https://doi.org/10.1007/978-3-319-58640-3_2
Print ISBN: 978-3-319-58639-7
Online ISBN: 978-3-319-58640-3