Capitalizing on new forms of academic library’s intellectual assets: a new library mobile application proposition


Abstract

Library and information science experts around the globe are currently exploring ways of capitalizing on student workflow data generated within library walls. Within this realm, the researchers designed and pilot-tested a user-driven, lightweight application that envisions the library as a crucial contributor of co-curricular data to the contextual integrity of learner profiles. The prototype usability test, conducted in December 2018 with 30 students at the University of West Attica, Greece, aimed not only to record participants’ perspectives on the application but also to trace their attitudes towards this new kind of intervention. Post-test questionnaires yielded a variety of positive, richly textured comments indicating students’ interest in the emerging conversation around the capitalization of library use data. Participants felt positive about the need to develop a culture that fosters a reconsideration of the constituents of library value and their new, dynamic role in the educational context. The pilot-tested application could serve as a reference for improving academic library use data collection practices.


Notes

  1. According to Beagle (2004) and REBIUN (2003), the Information Commons is a new library model: a dynamic, student-centered setting that accommodates all the information and IT services necessary to support learning and research in the university. Since 2000, Information Commons have expanded their facilities and scope of activities by incorporating tutorial programs, writing centers, and faculty development centers with a new focus on student learning, while many have also taken the additional step of changing their designation to Learning Commons.

  2. More information available from: https://library.hud.ac.uk/blogs/lidp/

References

  • Abbott, A., Cyranoski, D., Jones, N., Maher, B., Schiermeier, Q., & Van Noorden, R. (2010). Metrics: Do metrics matter? Nature News, 465(7300), 860–862. https://doi.org/10.1038/465860a.

  • Allison, D. (2015). Measuring the academic impact of libraries. portal: Libraries and the Academy, 15(1), 29–40. https://doi.org/10.1353/pla.2015.0001.

  • Arminio, J., Roberts, D. C., & Bonfiglio, R. (2009). The professionalization of student learning practice: An ethos of scholarship. About Campus, 14(1), 16–20. https://doi.org/10.1002/abc.279.

  • Asher, A., Briney, K. A., Perry, M. R., Goben, A., Salo, D., Robertshaw, M. B., & Jones, K. M. (2018). SPEC Kit 360: Learning analytics. Washington, DC: Association of Research Libraries, September 2018. https://doi.org/10.29242/spec.360.

  • Assila, A., De Oliveira, K.M. & Ezzedine, H. (2016). Standardized usability questionnaires: Features and quality focus. Electronic Journal of Computer Science and Information Technology: eJCIST, 6(1). http://ejcsit.uniten.edu.my/index.php/ejcsit/article/view/96. Accessed 20 Nov 2018.

  • Bangor, A., Kortum, P., & Miller, J. (2009). Determining what individual SUS scores mean: Adding an adjective rating scale. Journal of Usability Studies, 4(3), 114–123.

  • Beagle, D. (2004). From information commons to learning commons, white paper. In: University of Southern California Leavey Library 2004 Conference on Information Commons: Learning Space Beyond the Classroom.

  • Bennetot Pruvot, E., Estermann, T., & Kupriyanova, V. (2017). Public Funding Observatory Report 2017. European University Association, Brussels, Belgium. https://www.eua.eu/downloads/publications/eua-pfo-report-december-2017.pdf. Accessed 15 Oct 2018.

  • Brooke, J. (1996). SUS-A quick and dirty usability scale. Usability Evaluation in Industry, 189(194), 4–7.

  • Chan, J. C., & Lam, S. F. (2010). Effects of different evaluative feedback on students’ self-efficacy in learning. Instructional Science, 38(1), 37–58. https://doi.org/10.1007/s11251-008-9077-2.

  • Chen, H., Doty, P., Mollman, C., Niu, X., Yu, J., & Zhang, T. (2015). Library assessment and data analytics in the big data era: practice and policies. Proceedings of the Association for Information Science and Technology, 52(1), 1–4. https://doi.org/10.1002/pra2.2015.14505201002.

  • Clark, M., De Leon, E., Edgar, L. & Perrin, J. (2012). Library Usability: Tools for Usability Testing in the Library. In: Texas Library Association Annual Conference, March 2012.

  • Cormack, A. N. (2016). A data protection framework for learning analytics. Journal of Learning Analytics, 3(1), 91–106. https://doi.org/10.18608/jla.2016.31.6.

  • Donham, J., & Green, C. W. (2004). Perspectives on… developing a culture of collaboration: Librarian as consultant. The Journal of Academic Librarianship, 30(4), 314–321. https://doi.org/10.1016/j.acalib.2004.04.005.

  • Dow, R. F. (1998). Using assessment criteria to determine library quality. Journal of Academic Librarianship, 24, 277–281.

  • Drucker, P. F. (1993). The post-capitalist society. Oxford: Butterworth Heinemann. Cited in De Jong, T. (2010), Linking social capital to knowledge productivity: An explorative study on the relationship between social capital and learning in knowledge-productive networks, Enschede.

  • Duval, E. (2011). Attention please!: Learning analytics for visualization and recommendation. In: Proceedings of the 1st International Conference on Learning Analytics and Knowledge, February 27–March 1, 2011, Banff, Alberta, Canada [online]. ACM. pp. 9–17. https://doi.org/10.1145/2090116.2090118.

  • Germano, M. A., & Stretch-Stephenson, S. M. (2012). Strategic value planning for libraries. The Bottom Line, 25(2), 71–88. https://doi.org/10.1108/08880451211256405.

  • Goldhaber, M.H. (1997). The attention economy and the Net. First Monday, 2(4). https://doi.org/10.5210/fm.v2i4.519.

  • Harrison, R., & Kessels, J. (2004). Human resource development in a knowledge economy: An organizational view. New York: Palgrave Macmillan.

  • Hellenic Quality Assurance and Accreditation Agency (2017). Higher education quality report. HQA Council. https://www.adip.gr/data/HQA_report2016.pdf. Accessed 24 Jan 2019.

  • Jantti, M.H. (2014). Aspiring to excellence: Maximizing data to sustain, shift and reshape a library for the future. In: Library Assessment Conference, Association of Research Libraries, Seattle. pp. 1–9. https://ro.uow.edu.au/cgi/viewcontent.cgi?referer=https://scholar.google.com/&httpsredir=1&article=1506&context=asdpapers. Accessed 12 Aug 2018.

  • Jivet, I., Scheffel, M., Specht, M. & Drachsler, H. (2018). License to evaluate: Preparing learning analytics dashboards for educational practice. In: Proceedings of the 8th International Conference on Learning Analytics and Knowledge, March 7–9, 2018. Sydney, ACM. pp. 31–40. http://hdl.handle.net/1820/9138. Accessed 10 Dec 2018.

  • Jong, T.D. (2010). Linking social capital to knowledge productivity. PhD Thesis, Proefschrift, Universiteit Twente. https://ris.utwente.nl/ws/portalfiles/portal/6082371/thesis_T_de_Jong.pdf. Accessed 23 Nov 2018.

  • Kay, D., & Van Harmelen, M. (2015). Activity data – delivering benefits from the data deluge [online]. JISC. https://www.jisc.ac.uk/guides/activity-data-delivering-benefits-from-the-data-deluge. Accessed 20 Sept 2018.

  • Kezar, A. (2003). Enhancing innovative partnerships: Creating a change model for academic and student affairs collaboration. Innovative Higher Education, 28(2), 137–156. https://doi.org/10.1023/B:IHIE.0000006289.31227.25.

  • Kluger, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119(2), 254–284.

  • Kroski, E. (2008). On the move with the mobile web: Libraries and mobile technologies. Library Technology Reports, 44(5), 1–48 http://hdl.handle.net/10760/12463. Accessed 15 June 2018.

  • Kyrillidou, M. (1997). The use of statistics by libraries in North America. In: 6th Panhellenic Academic Libraries Conference, November 5–7, Athens. http://hdl.handle.net/10760/9814. Accessed 12 May 2018.

  • Kyrillidou, M., & Cook, C. (2008). The evolution of measurement and evaluation of libraries: A perspective from the Association of Research Libraries. Library Trends, 56(4), 888–909. http://hdl.handle.net/2142/9498. Accessed 12 May 2018.

  • Lakes, K. D. (2013). Restricted sample variance reduces generalizability. Psychological Assessment, 25(2), 643–650. https://doi.org/10.1037/a0030912.

  • Lakos, A. (2007). Evidence-based library management: The leadership challenge. Portal: Libraries and the Academy, 7(4), 431–450. Johns Hopkins University Press. https://doi.org/10.1353/pla.2007.0049.

  • Landøy, A., & Repanovici, A. (2010). Knowing the needs: a system for evaluating the university library. In Proceedings of the International Conference on QQML2009 (pp. 329–334). https://doi.org/10.1142/9789814299701_0041.

  • Lessick, S. (2015). Enhancing library impact through technology. Journal of the Medical Library Association, 103(4), 222–233. https://doi.org/10.3163/1536-5050.103.4.015.

  • Lewis, J. R. (1995). IBM computer usability satisfaction questionnaires: Psychometric evaluation and instructions for use. International Journal of Human Computer Interaction, 7(1), 57–78. https://doi.org/10.1080/10447319509526110.

  • Lim, Y.K., Pangam, A., Periyasami, S. & Aneja, S. (2006). Comparative analysis of high-and low-fidelity prototypes for more valid usability evaluations of mobile devices. [online]. In: Proceedings of the 4th Nordic conference on Human-computer interaction: changing roles, October 14–18, 2006, Oslo. ACM. pp. 291–300, https://doi.org/10.1145/1182475.1182506.

  • Long, D. (2016). Librarians and student affairs professionals as collaborators for student learning and success. PhD Thesis. Illinois State University, https://doi.org/10.30707/ETD2016.Long.D.

  • Luna Scott, C. (2015). The Futures of Learning 3: what kind of pedagogies for the 21st century? UNESCO Education Research and Foresight, Paris. ERF Working Papers Series, No. 15, https://rpinternacional.com.co/wp-content/uploads/2018/05/ARTICULO-5.pdf. Accessed 25 Mar 2018.

  • Murray, A.L. (2014). The Academic Library and High-Impact Practices for Student Retention: Perspectives of Library Deans. PhD Thesis. Western Kentucky University. http://digitalcommons.wku.edu/diss/57. Accessed 14 Feb 2018.

  • Mylonas, P. (2017). Turning Greece into an education hub. [online]. National Bank of Greece Sectoral report. https://www.nbg.gr/greek/the-group/press-office/e-spot/reports/Documents/Education.pdf. Accessed 17 Feb 2019.

  • Nielsen, J. (1993). Usability engineering. Boston: Academic Press.

  • Oakleaf, M. (2010). The value of academic libraries: A comprehensive research review and report. Chicago: Association of College & Research Libraries.

  • Oakleaf, M. (2018). LIILA project Report, Library Integration in Institutional Learning Analytics. https://library.educause.edu/~/media/files/library/2018/11/liila.pdf. Accessed 22 Nov 2018.

  • Parsons, G. (2010). Information provision for HE distance learners using mobile devices. The Electronic Library, 28(2), 231–244. https://doi.org/10.1108/02640471011033594.

  • Red de Bibliotecas Universitarias Españolas (REBIUN) (2003). Centros de Recursos para el Aprendizaje y la Investigación: Un nuevo modelo de biblioteca universitaria, Conferencia de Rectores de las Universidades Españolas, Madrid.

  • Salonen, P., Vauras, M., & Efklides, A. (2005). Social interaction: What can it tell us about metacognition and coregulation in learning? European Psychologist, 10(3), 199–208.

  • Sant-Geronikolou, S. (2018a). Greek and Spanish University Community Perspective of Challenges Affecting Library Integration in Learning Analytics Initiatives. Poster presented at 10th Qualitative and Quantitative Methods in Libraries International Conference (QQML2018), Chania (Greece), 22 - 25 May 2018. http://eprints.rclis.org/32914/. Accessed 10 Dec 2018.

  • Sant-Geronikolou, S. (2018b). Understanding in-library use data lifecycle within Greek and Spanish higher education ecosystems. Library Hi Tech News, 35(7):13-17. https://doi.org/10.1108/LHTN-10-2017-0077.

  • Sauro, J. (2018). Can you use a single item to predict SUS scores? MeasuringU [online]. 4 December 2018. https://measuringu.com/single-item-sus/. Accessed 16 Nov 2018.

  • Showers, B., & Stone, G. (2014). Safety in numbers: Developing a shared analytics Service for Academic Libraries. Performance Measurement and Metrics, 15(1/2), 13–22.

  • Shute, V. J. (2008). Focus on formative feedback. Review of Educational Research, 78(1), 153–189.

  • Somerville, M. M., & Harlan, S. (2008). From information commons to learning commons and learning spaces: An evolutionary context. In: Schader, B. (Ed.), Learning commons: Evolution and collaborative essentials (pp. 1–36).

  • Soria, K.M., Peterson, K., Fransen, J. & Nackerud, S., (2017). The Impact of Academic Library Resources on First-Year Students’ Learning Outcomes. [online]. Research Library Issues, 290(2017): 5–20. http://publications.arl.org/rli290/. Accessed 22 Sept 2018.

  • Stone, G., & Ramsden, B. (2013). Library impact data project: looking for the link between library usage and student attainment. College and Research Libraries, 74(6), 546–559. https://doi.org/10.5860/crl12-406.

  • Strakantouna, V. (2005). Epexeryasia Prosopikon Dedomenon kai Prostasia tes Idiotikotetas sto Sychrono Perivallon ton Vivliothekon kai Eperesion Pleroforeses [processing of personal data and protection of privacy in the modern environment of libraries and information services]. Master's thesis, Ionian University, Department of Archives and Library Science.

  • Tevaniemi, J., Poutanen, J., & Lähdemäki, R. (2015). Library as a partner in co-designing learning spaces: A case study at Tampere University of Technology, Finland. New Review of Academic Librarianship, 21(3), 304–324. https://doi.org/10.1080/13614533.2015.1025147.

  • Town, S. (2015). Implementing the value scorecard. Performance Measurement and Metrics, 16(3), 234–251. https://doi.org/10.1108/PMM-10-2015-0033.

  • Trowler, V. (2010). Student engagement literature review. The Higher Education Academy, 11(1), 1–15.

  • Tullis, T. & Albert, W. (2008a). Tips and Tricks for Measuring the User Experience [PowerPoint presentation]. Usability and User Experience, UPA-Boston’s Seventh Annual Mini UPA Conference, 28 May 2008. https://www.bentley.edu/files/centers/duc/albert-tullis-tips-trix-2008.pdf. Accessed 16 Sept 2018.

  • Tullis, T. & Albert, W. (2008b). Measuring the User Experience: Collecting, analyzing and Presenting Usability Metrics. 1st ed. Burlington: Morgan Kaufmann.

  • Väätäjä, H. & Roto, V. (2010). Mobile questionnaires for user experience evaluation. In: CHI'10 Extended Abstracts on Human Factors in Computing Systems, April 10–15, 2010. Atlanta. ACM. pp. 3361–3366, https://doi.org/10.1145/1753846.1753985.

  • Van Trigt, M. (2016). How data can improve the quality of higher education. [online]. SURF Net. https://www.surf.nl/binaries/content/assets/surf/en/knowledgebase/2016/whitepaper-learning-analytics_en-def.pdf. Accessed 23 Jan 2017.

  • Wang, C. Y., Ke, H. R., & Lu, W. C. (2012). Design and performance evaluation of mobile web services in libraries: a case study of the oriental institute of technology library. The Electronic Library, 30(1), 33–50. https://doi.org/10.1108/02640471211204051.

  • Yanosky, R. & Arroway, P. (2015). The Analytics Landscape in Higher Education, 2015. Louisville, ECAR, https://library.educause.edu/~/media/files/library/2015/5/ers1504cl.pdf. Accessed 8 July 2018.

Acknowledgements

The authors would like to thank all research and pilot trial participants for their help and valuable contributions to developing and evaluating the CLIC Library App prototype.

Author information

Corresponding author

Correspondence to Stavroula Sant-Geronikolou.

Ethics declarations

Applications, utilities, clipart credits

Microsoft Office Word / Excel 2010, Adobe Acrobat, Google Forms, Pixabay.com, Pnging.com, Wikipedia.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix 1

1.1 Group A (CSUQ) usability test items and response averages

Lewis’ Computer System Usability Questionnaire (CSUQ), adapted from the original 19 items.

Item no | Item wording | Avg
1 | The application architecture and navigation make sense | 4.92
2 | The information organization and representation were understandable | 5.38
3 | It was easy to find the information I needed | 4.69
4 | It was easy to use the system | 5.08
5 | The interface of the system is pleasant | 5.62
6 | I was able to complete the tasks and scenarios quickly using the system | 5.54
7 | I believe I could become more productive using the system | 6.00
8 | On-screen messages and other documentation provided with the system were clear | 5.69
9 | This system had all the functions and capabilities I expected it to have | 5.92
10 | I believe I would like to use the system frequently | 5.92

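For reference, the overall CSUQ score is conventionally reported as the arithmetic mean of the item ratings (Lewis, 1995). The short Python sketch below simply averages the ten Group A item means listed above; it assumes the adaptation retains a 7-point agreement scale on which higher values indicate stronger agreement, which the table itself does not state.

# Minimal sketch: overall CSUQ score as the mean of the item ratings (Lewis, 1995),
# computed from the Group A item averages reported above. Assumes a 1-7 agreement
# scale where higher values indicate stronger agreement (not stated in the table).
group_a_item_means = [4.92, 5.38, 4.69, 5.08, 5.62, 5.54, 6.00, 5.69, 5.92, 5.92]
overall_csuq = sum(group_a_item_means) / len(group_a_item_means)
print(round(overall_csuq, 2))  # ~5.48 under the stated assumptions

Because the overall score is linear in the item ratings, averaging the reported item means is equivalent to averaging the per-respondent overall scores.
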

1.1.1 Qualitative items Group A – Open-ended questions

  1. What aspects of the system do you find most useful?
  2. Do you believe it fosters collaboration and networking?
  3. Do you find the app engaging? Would you use it?
  4. When exploring the application, were you confused at any point?
  5. Were there any features that you would like to see added to the application?
  6. Do you believe the application functionalities contribute to supporting student progress?
  7. Do you feel the system jeopardizes user privacy in any way?
  8. Choose at least 4 adjectives that best describe the application.
  9. Once the system is refined, would you recommend it to other users or your library?
  10. Any comments to add?

1.2 Group B (SUS) usability test items and response averages

Item no | Item wording | Avg
1 | I think that I would like to use this system frequently | 2.93
2 | I found the system unnecessarily complex | 3.79
3 | I thought the system was easy to use | 2.57
4 | I think that I would need the support of a technical person to be able to use this system | 1.93
5 | I found the various functions in this system were well integrated | 3.00
6 | I thought there was too much inconsistency in this system | 2.57
7 | I would imagine that most people would learn to use this system very quickly | 3.21
8 | I found the system inflexible | 3.29
9 | I felt very confident using the system | 2.29
10 | I needed to learn a lot of things before I could get going with this system | 2.43

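For reference, SUS responses are conventionally converted to a 0-100 score with Brooke's (1996) procedure: odd-numbered (positively worded) items contribute (response - 1), even-numbered (negatively worded) items contribute (5 - response), and the sum of contributions is multiplied by 2.5. The Python sketch below applies that procedure to the Group B item averages listed above; it assumes these averages are raw, non-reverse-coded responses on the 1-5 scale, which the table does not state.

# Minimal sketch of the standard SUS scoring procedure (Brooke, 1996), applied to
# the Group B item averages reported above. Assumes the averages are raw,
# non-reverse-coded responses on the 1-5 scale (not stated in the table).
group_b_item_means = [2.93, 3.79, 2.57, 1.93, 3.00, 2.57, 3.21, 3.29, 2.29, 2.43]

def sus_score(responses):
    """Convert ten SUS item responses (1-5) to a single 0-100 SUS score."""
    total = 0.0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded and contribute (r - 1);
        # even items are negatively worded and contribute (5 - r).
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

print(round(sus_score(group_b_item_means), 1))  # ~50.0 under the stated assumptions

Because the SUS score is linear in the responses, applying the formula to item means gives the same value as averaging per-respondent SUS scores; under Bangor et al.'s (2009) adjective scale, a score in this region would fall near the "OK" rating.
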

1.2.1 Qualitative items Group B – Open-ended questions

  1. What aspects of the system do you find most useful?
  2. Do you believe it fosters collaboration and networking?
  3. Do you find the app engaging? Would you use it?
  4. When exploring the application, were you confused at any point?
  5. Were there any features that you would like to see added to the application?
  6. Do you have any concerns about the application’s compatibility with existing library operations?
  7. Do you believe the application functionalities contribute to supporting student progress?
  8. Can CLIC data visualizations contribute to facilitating library planning and patron support services?
  9. Do you feel the system jeopardizes user privacy in any way?
  10. Choose at least 4 adjectives that best describe the application.
  11. Once the system is refined, would you recommend it to other users or your library?
  12. Any comments to add?

About this article

Cite this article

Sant-Geronikolou, S., Kouis, D. & Koulouris, A. Capitalizing on new forms of academic library’s intellectual assets: a new library mobile application proposition. Educ Inf Technol 24, 3707–3730 (2019). https://doi.org/10.1007/s10639-019-09944-w
