Users’ evaluation of digital libraries (DLs): Their uses, their criteria, and their assessment
Introduction
The availability of the Internet has brought dramatic changes to millions of people in terms of how they collect, organize, disseminate, access, and use information. Perceptions of digital libraries vary and evolve over time, and many definitions of digital libraries have been proposed. The concept of a digital library means different things to different people; even the key players in the development and use of digital libraries understand them differently. To librarians, a digital library is another form of a physical library; to computer scientists, a digital library is a distributed text-based information system or a networked multimedia information system; to end users, digital libraries are similar to the World Wide Web with improvements in performance, organization, functionality, and usability (Fox, Akscyn, Furuta, & Leggett, 1995). Borgman’s (1999) two competing visions of digital libraries have stimulated further discussion of the definition of a digital library among researchers and practitioners. The common elements of a digital library definition identified by the Association of Research Libraries (1995) are more widely accepted by digital library researchers:
- The digital library is not a single entity.
- The digital library requires technology to link the resources of many.
- The linkages between the many digital libraries and information services are transparent to the end users.
- Universal access to digital libraries and information services is a goal.
- Digital library collections are not limited to document surrogates; they extend to digital artifacts that cannot be represented or distributed in printed formats.
The Digital Library Initiatives I and II, funded by the National Science Foundation (NSF) and other federal agencies, have advanced the technical as well as the social, behavioral, and economic research needed to design and develop digital libraries. Millions of dollars have been invested in the development of digital libraries. Many questions remain unanswered: whether users use them, how they use them, and what facilitates or hinders their access to information in these digital libraries. These questions cannot be answered without the evaluation of existing digital libraries. We need to assess the usability of digital libraries in order to evaluate their full potential (Blandford & Buchanan, 2003). Moreover, users’ model of digital libraries and digital libraries’ model of users are different (Saracevic, 2004), so it is important to understand users’ perspectives of digital libraries. In addition, needs assessment and evaluation are also essential for the iterative design of digital libraries (Van House, Butler, Ogle, & Schiff, 1996). In order to answer these questions, we first need to clarify the following: (1) What are the digital library (DL) evaluation criteria? (2) Who determines these criteria, or how are they determined?
Just as definitions of digital libraries vary, so do DL evaluation criteria. A major challenge for DL evaluation is to identify what to evaluate and how to evaluate non-intrusively at low cost (Borgman & Larsen, 2003). DL evaluation is a challenging task due to the complicated technology, rich content, and variety of users involved (Borgman et al., 2000, Saracevic and Covi, 2000). The most recognized DL evaluation criteria are derived from evaluation criteria for traditional libraries, IR system performance, and human-computer interaction (Chowdhury and Chowdhury, 2003, Marchionini, 2000, Saracevic, 2000, Saracevic and Covi, 2000). Very few studies actually apply all the DL evaluation criteria to assess a digital library; many focus on evaluating the usability of digital libraries. After reviewing usability tests in selected academic digital libraries, Jeng, 2005a, Jeng, 2005b found that ease of use, satisfaction, efficiency, and effectiveness are the main criteria applied. Some evaluation studies extend to assessing the performance, content, and services of digital libraries, while service evaluation mainly concentrates on digital reference (Carter & Janes, 2000). Other evaluation studies also look into the impact of digital libraries (Marchionini, 2000).
Little research has investigated users’ evaluation of digital libraries, in particular their criteria and their actual assessment of digital libraries. Xie (2006) pointed out that even though researchers have developed DL evaluation criteria and conducted studies to evaluate existing digital libraries or prototypes, there is a lack of user input regarding evaluation criteria. Evaluation of digital libraries by users applying their own criteria is needed. The remaining questions related to DL evaluation are: What is the relationship between users’ use and evaluation of digital libraries? And what is the relationship between users’ perceived DL evaluation criteria and their actual evaluation?
Section snippets
Research problem and research questions
Evaluation of digital libraries is an essential component of the design of effective digital libraries. Digital libraries are designed for users to use. However, most research on the evaluation of digital libraries has applied criteria from the researchers themselves; in particular, these studies focus on usability. Little has been done on the identification and ranking of DL evaluation criteria from users’ perspectives. Furthermore, less has evaluated digital libraries by applying
Literature review
The emergence of digital libraries calls for their evaluation. Evaluation is a research activity, and it has both theoretical and practical impact (Marchionini, 2000). An evaluation is a judgment of worth. The objective of DL evaluation is to assess to what extent a digital library meets its objectives and to offer suggestions for improvement (Chowdhury & Chowdhury, 2003). Even though there are no standard evaluation criteria and evaluation techniques for DL
Sampling
Subjects were recruited from the School of Information Studies, University of Wisconsin–Milwaukee. Three reasons were considered for the recruitment: (1) These subjects have a need to understand digital libraries and have some experience with the use of digital libraries. (2) These subjects are the targeted audience for similar types of digital libraries. (3) This study is an extension of the author’s previous study (Xie, 2006). That is why it is important to recruit subjects who share similar
Results
The results present answers to the three research questions in relation to users’ use, their evaluation criteria, and their actual evaluation of the two selected digital libraries. Fig. 3 illustrates the structure of the results.
Discussion
Borgman et al. (2000) suggest that technical complexity, the variety of content, uses, and users, and the lack of evaluation methods contribute to the problem of DL evaluation. Citing Marchionini’s (2000) metaphor, which likens the evaluation of a digital library to the evaluation of a marriage, Saracevic (2004) well illustrates the complexity of DL evaluation. However, the only people who can best evaluate a marriage are the people involved in it. That also applies to the digital library
Conclusion
Digital library evaluation is not an easy task. The value of digital libraries needs to be judged by the users of digital libraries. Subjects of this study not only rated the importance of the DL evaluation criteria developed by another group of users in a previous study but also evaluated the two digital libraries by applying these criteria. The results of this study yielded some interesting findings. The use of digital libraries shows that the design of digital libraries affects how users
Acknowledgement
The author would like to thank Adrienne Wiegert, Eleanore Bednarek, and Jessy Olson for their assistance on data analysis and anonymous reviewers for their insightful comments.
References (47)
- An evaluation of faculty use of the digital library at Ankara University, Turkey. Journal of Academic Librarianship (2006).
- Interfaces and tools for the Library of Congress National Digital Library Program. Information Processing and Management (1998).
- Evaluation measures for interactive information retrieval. Information Processing and Management (1992).
- Understanding user acceptance of digital libraries: What are the roles of interface characteristics, organizational context, and individual differences? International Journal of Human-Computer Studies (2002).
- Association of Research Libraries. (1995). Definition and purposes of a digital library. Retrieved July 17, 2002, from:...
- Functionality, usability and accessibility: Iterative user-centered assessment strategies for digital libraries. Performance Measurement and Metrics (2006).
- Digital libraries: Situating use in changing information infrastructure. Journal of the American Society for Information Science (2000).
- Workshop report: Usability of digital libraries @JCDL’02. SIGIR Forum (2002).
- Blandford, A. & Buchanan, G. (2003). Usability of digital libraries: A source of creative tensions with technical...
- Evaluation of digital library impact and user communities by analysis of usage patterns. D-Lib Magazine (2002).
- Usage analysis for the identification of research trends in digital libraries. D-Lib Magazine.
- What are digital libraries? Competing visions. Information Processing and Management.
- Evaluating digital libraries for teaching and learning in undergraduate education: A case study of the Alexandria Digital Earth Prototype (ADEPT). Library Trends.
- The design and evaluation of interactivities in a digital library. D-Lib Magazine.
- Usability evaluation of digital libraries. Science and Technology Libraries.
- Unobtrusive data analysis of digital reference questions and service at the Internet Public Library: An exploratory study. Library Trends.
- Studying digital library users over time: A follow-up survey of Early Canadiana Online. Information Research.
- A framework for evaluating digital library services. D-Lib Magazine.
- Introduction to digital libraries.
- Evaluating on TIME: A framework for the expert evaluation of digital library interface usability. International Journal on Digital Libraries.
- Digital libraries. Communications of the ACM.
- Evaluation of digital libraries. International Journal on Digital Libraries.