Connecting the dots: An exploratory study on learning analytics adoption factors, experience, and priorities

https://doi.org/10.1016/j.iheduc.2021.100794

Highlights

  • Connections of key adoption factors vary among institutions with different LA experience.

  • Experienced institutions prioritised exploring over measuring a learning phenomenon.

  • Experienced institutions were more concerned with methods than constraints of data.

  • Experienced institutions engaged primary stakeholders more equally.

  • Special attention is needed for institutional context, people issues, and ethics and privacy.

Abstract

Existing studies have shed light on policies and strategies for learning analytics (LA) adoption, yet there is limited understanding of associations among factors that influence adoption processes, or of how priorities change as institutional experience with LA increases. This paper addresses this gap by presenting a study based on interviews with institutional leaders from 27 European higher education institutions. Results showed that experienced institutions demonstrated more interest in exploring learning behaviour and pedagogical reformation than in simply measuring a phenomenon. Experienced institutions also paid more attention to methodological approaches to LA than to data constraints, and demonstrated a broader involvement of teachers and students. This paper also identifies inter-related connections between prevailing challenges that impede the scaling of LA. Based on the results, we suggest regular evaluations of LA adoption to ensure the alignment of strategy and desired changes. We also identify three areas that require particular attention when forming short-term goals for LA at different phases of adoption.

Introduction

Learning analytics (LA) is commonly defined as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” (Long, Siemens, Conole, & Gašević, 2011, para. 5). It has been repeatedly reported as an essential educational technology to support adaptive learning (Adams Becker et al., 2017; EDUCAUSE, 2018; Johnson et al., 2016). Research has also identified a number of benefits of LA, including targeted course offerings, personalised learning, improved learning outcomes, teaching performance, and curriculum design, and enhanced post-educational employment opportunities (Avella, Kebritchi, Nunn, & Kanai, 2016). In the context of higher education institutions (HEIs), there is an increasing demand to measure, demonstrate, and improve performance. As a result, LA has emerged as a new solution to addressing issues around retention, progression, and the enhancement of student success (Ferguson, 2012; Siemens, Dawson, & Lynch, 2013).

In recent years, the development of LA has been notable in Europe, where large-scale research projects have been commissioned by the European Commission to build a community to facilitate knowledge exchange, e.g., the LACE project (http://www.laceproject.eu/), and to engage various stakeholders in the process of policy formation, e.g., the SHEILA project (https://sheilaproject.eu/). Specifically, LA has been highlighted as an area of strategic interest in UK higher education (Scotland, 2018; Shacklock, 2016) as a response to continuous pressure from the change in funding models, a marketisation culture, international competition, and Brexit upheavals. An early report published by Jisc (Sclater, 2014) showed emerging interest among UK HEIs in using LA to enhance the student learning experience in areas such as feedback, learner agency, and learning performance and progression. In the following year, a survey carried out by the Heads of eLearning Forum (HelF) found that among the 53 UK HEIs that responded, a third were preparing to implement LA, a fifth had partially implemented LA, and one had fully implemented LA. Nevertheless, about half of the institutions did not use LA at all. Subsequently, the UK Higher Education Commission carried out an investigation (Shacklock, 2016, p.26) in response to the increasing importance of data and analytics in higher education. One of the recommendations made by the report (ibid.) states, “All HEIs should consider introducing an appropriate learning analytics system to improve student support/performance at their institution”. Importantly, this report highlights the crucial role of senior leadership in driving the adoption of LA forward with a strong strategic vision. In response, we started a series of interviews in the same year with senior managers from 21 UK HEIs and 6 HEIs in mainland Europe to understand existing activities and plans concerning learning analytics, so as to formulate strategic directions for institutional adoption.

While interest in LA among HEIs continues to grow, limited strategic planning for LA deployment has been identified as a critical factor in the narrow scaling of adoption in higher education (Siemens et al., 2013; Viberg, Hatakka, Bälter, & Mavroudi, 2018). As Colvin et al. (2016) point out, a strategic vision that responds to the needs of an organisation is critical for long-term impact and the development of institutional capability for LA. It has also been argued that the complex nature of higher education structures and their ecosystem requires a complexity leadership model that can respond to external changes in an agile fashion and turn tensions between the need to innovate and the need to produce into opportunities for LA (Dawson et al., 2018). Driven by the need to unpack and address tensions that might occur when LA is introduced to an institution, we set out to explore potential connections between factors that can influence LA adoption. While existing studies highlight key influences of social factors on LA adoption, such as leadership, strategy, culture, ethics and privacy, and analytic capabilities (Colvin et al., 2016; Greller & Drachsler, 2012; Siemens, Dawson, & Eshleman, 2018; Tsai et al., 2018), there has been limited research that systematically investigates the relationships between these factors, which arguably need to be treated as a whole rather than as individual components in a complex system (Uhl-Bien, Marion, & McKelvey, 2007). In recent years, there have been consistent calls for evaluations of LA in terms of its impact on learning and institutional practice as a whole (Ferguson & Clow, 2017; Kitto, Shum, & Gibson, 2018; Viberg et al., 2018). Crucial to evaluation are clear short-term goals that allow institutions to review, clarify, and adjust their vision so as to sustain the impetus of innovation over the long haul (Kotter, 2006).
In light of this, we also investigate whether connections between LA adoption factors vary between institutions that are comparatively new to LA and those that have had formal engagement with LA, i.e., initiatives supported by the university, for more than a year. We explore two questions in this study:

  • (1).

    How do factors that influence LA adoption processes in higher education associate with each other?

  • (2).

    Do the connections of adoption factors differ between institutions that have adopted LA for less than a year and those that have adopted LA for more than a year?

This paper presents the findings of an exploratory study based on 29 interviews with 27 European HEIs. We analysed the data using mixed methods. That is, we followed a qualitative approach to collect and code the data thematically. Following that, we used a quantitative method, Epistemic Network Analysis (ENA), to identify and visualise the connections between the thematic codes, so as to guide further inspection of the interview data. ENA uses statistical computation to visualise connections between concepts that could otherwise be difficult to observe or systematically present from qualitative data. This study is not intended to generalise observations of, or make comparisons between, HEIs in mainland Europe and the UK. Instead, the intention is to obtain in-depth information about the adoption of LA in a small group of institutions that share a certain degree of cultural similarity in a geographical region, for the purpose of informing institutional strategy for LA. In the rest of the paper, we first discuss key literature that has contributed to our understanding of prominent challenges of LA and key factors that influence the success of LA. Thereafter, we detail the methods that we have undertaken to collect, process, and analyse the interview data. Finally, we present prominent connections between selected themes, including goals, approaches, ethics, challenges, stakeholder involvement, and success. This paper concludes with three areas that require particular attention when forming short-term goals for LA at different phases of adoption.

In this section, we first discuss the prevailing challenges associated with the adoption of LA in higher education, so as to provide a perspective on the ‘challenge’ codes that we used to analyse collected interview data. Then, we review the existing adoption framework for LA to identify critical factors that influence LA deployment and provide a theoretical background of the other codes presented in this paper.

Research has found that the most significant challenges confronting HEIs in terms of LA adoption are not technical, but social (Ferguson, 2012; Howell, Roberts, Seaman, & Gibson, 2018; Roberts, Howell, Seaman, & Gibson, 2016; Siemens, 2013; Siemens et al., 2013; Tsai & Gašević, 2017; Tsai et al., 2018). For example, Tsai et al. (2018) identified three areas of prominent challenges: the demand for resources, ethics and privacy, and stakeholder involvement. Resources primarily refer to data, funding, and people. In terms of data, significant issues concern the quality and scope of data that can reflect learning experiences accurately. In addition, the difficulty of obtaining and integrating data from various sources has also been consistently reported as a major challenge (Arroway, Morgan, O'Keefe, & Yanosky, 2016; Siemens, 2013). With regard to funding, EDUCAUSE studies (Arroway et al., 2016; Yanosky, 2009) reveal that LA often needs to compete with other institutional priorities, resulting in a challenge of obtaining sufficient financial support to supply an enabling infrastructure. Human resources primarily concern the capability and capacity to implement LA and to act on the results it produces. For example, a shortage of skilled people who can process, analyse, and interpret data, and link it with pedagogy, has been identified as a prominent challenge that widens the gap between needs and solutions (Norris & Baer, 2013; Tsai & Gašević, 2017; Rienties, Herodotou, Olney, Schencks, & Boroowa, 2018), thereby impeding the scaling of LA research to enterprise solutions (Siemens et al., 2013). In addition, as LA serves the purpose of informing learning, teaching, and managerial decisions with data-based evidence, it is crucial to develop ‘data literacy’ among key users; that is, the skill to interpret the analysis of data critically (Wolff, Moore, Zdrahal, Hlosta, & Kuzilek, 2016).

LA involves the use of personal data to provide targeted support. As a result, ethics and privacy issues have also emerged as a significant challenge that remains unresolved (Siemens et al., 2013) and continues to affect buy-in from key stakeholders (Drachsler & Greller, 2016). Some prominent concerns include the risk of intruding on privacy, the difficulty of assuming informed consent due to unequal power relationships (Roberts et al., 2016; Slade & Prinsloo, 2013), the dilemma between keeping data anonymous and extracting the most value from it (Drachsler & Greller, 2012), the potential to demotivate or stereotype learners, the deprivation of learner autonomy (Roberts et al., 2016), data integrity, and potential data misuse (Howell, Roberts, Seaman, & Gibson, 2018). These unresolved issues have inspired the development of DELICATE (Drachsler & Greller, 2016), a checklist for institutions, researchers, and educators to self-check ethical and privacy requirements for carrying out LA.

Finally, stakeholder involvement has been considered crucial to the success of LA deployment. As educational systems are stable and resistant to change (Macfadyen, Dawson, Pardo, & Gašević, 2014), for academics to adopt LA, they need to perceive it to be pedagogically useful (Gašević, Dawson, & Siemens, 2015) as well as necessary in terms of addressing existing learning or teaching challenges (Ali, Asadi, Gašević, Jovanovic, & Hatala, 2013; Howell, Roberts, Seaman, & Gibson, 2018). However, discrepancies in existing experience and knowledge of data among different stakeholders often result in the challenge of finding common ground and meeting everyone's expectations (Tsai et al., 2018). Moreover, academics' existing heavy workloads often lead to anxiety about their limited capacity to incorporate LA into teaching (Tsai, Poquet, Dawson, Pardo, & Gašević, 2019; Howell, Roberts, Seaman, & Gibson, 2018). In addition, insufficient support from institutional leadership to drive strategic planning and policy development for LA has also been identified as a critical challenge that slows down the deployment of LA at the institutional level (Colvin, Dawson, Wade, & Gašević, 2017; Norris & Baer, 2013; Siemens et al., 2013).

In light of the prevailing challenges, several frameworks have been proposed to ensure effective and ethical adoption of LA.

Colvin, Dawson, Wade, and Gašević (2017) reviewed current models of LA deployment and identified three types of models: input, output, and process models. Input models focus on antecedent affordances that enable LA. For example, Greller and Drachsler (2012) proposed an LA framework to gauge understanding and expectations towards learning analytics among key stakeholders. This framework consists of six critical dimensions: stakeholders, objectives, data, method, constraints, and competences. Among these dimensions, the constraints dimension in particular focuses on ethical and privacy limitations. Another established input model is the Learning Analytics Readiness Instrument (LARI) (Oster, Lonn, Pistilli, & Brown, 2016), which also includes six key elements, including culture, data management expertise, data analysis expertise, communication and policy application, and training. Unlike the framework proposed by Greller and Drachsler (2012), LARI was designed to identify the processes institutions use when discerning their readiness to implement learning analytics. Interestingly, the data factor, which was the largest component in the alpha analysis of this instrument (Arnold, Lonn, & Pistilli, 2014), fell out of the beta version of LARI. Instead, the culture factor emerged as the most influential factor when evaluating institutional readiness (Oster, Lonn, Pistilli, & Brown, 2016). This finding highlights the crucial role of institutional culture prior to or during the early adoption of LA.

Process models lay out key steps in the process of adopting LA (Colvin, Dawson, Wade, & Gašević, 2017). For example, the SHEILA framework (Tsai et al., 2018) builds on the RAPID Outcomes Mapping Approach (ROMA) (Young & Mendizabal, 2009), which defines six operational dimensions after the initial goal-setting step. These dimensions cover political contexts, stakeholders, desired changes, engagement strategy, capacity assessment, and evaluation. The SHEILA framework expands upon ROMA with action plans, challenge mitigation, and policy prompts. The framework aims to facilitate readiness assessment, strategy formation, and policy development. Unlike the two input models (Greller & Drachsler, 2012; Oster, Lonn, Pistilli, & Brown, 2016), which did not emphasise connections between the suggested elements, this process model highlights an iterative cycle that treats factors influencing LA adoption as mutually influential variables. For example, when identifying a (new) desired change, all the other dimensions need to be (re-)examined to ensure their connections. This applies to any strategic decision made in relation to a particular action, challenge, or policy process.

An output model of LA presents adoption maturity in multiple levels (Colvin, Dawson, Wade, & Gašević, 2017). For example, an LA sophistication model proposed by Siemens et al. (2013) includes five stages of maturity: awareness, experimentation, LA adoption (at student, faculty, and organisation levels), organisational transformation, and sector transformation. The model presents a vision of ideal progression of LA deployment, even though the reality is still far from the transformation stages (Viberg et al., 2018), and issues that hamper scalability are often more entwined than linear (Tsai, Poquet, Dawson, Pardo, & Gašević, 2019). Based on the input, process, and output models introduced in this section, we can see four intertwined factors that influence the adoption of LA (Fig. 1). The context factors involve the political context of the institution, the drivers for LA, the resource capacity (funding, technology, capability), and the institutional culture. The strategy factors involve defining objectives, methods to adopt LA (including communication strategy), evaluation, and policy. The people factors include all issues related to stakeholders, ethics, and privacy. The challenge factors involve challenges or constraints related to any of the other three factors.

As Fig. 1 shows, issues that confront HEIs in the adoption of LA tend to be entangled in a complex social system. The purpose of our study is to unfold the intertwined relationships between these social factors and learn from the changing patterns that might emerge from institutions with different experience of LA, so as to provide insights that may help institutions evaluate LA adoption progress, adjust short-term goals, and examine a long-term vision. To this end, an exploratory study was carried out to investigate key factors that influence LA deployment in European HEIs. Specifically, we adopted a quantitative ethnographic approach to visualise patterns of associations between emerging themes (represented by codes) in the data. We elaborate on this method and the data collection process in the next section.
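The core computation behind ENA can be illustrated with a simplified sketch: for each unit of analysis (here, an institution), code co-occurrences are counted within a moving window of coded utterances, the resulting vectors are spherically normalised, and the units are projected into a low-dimensional space via singular value decomposition. The code names and interview data below are hypothetical placeholders, not the study's actual coding scheme, and the sketch omits rotations and other refinements of the rENA implementation.

```python
from itertools import combinations
import numpy as np

# Hypothetical codes -- illustrative only, not the actual SHEILA scheme.
CODES = ["Goal", "Ethics", "Chlngs", "StratDev"]
PAIRS = list(combinations(range(len(CODES)), 2))  # all code pairs

def cooccurrence_vector(stanzas, window=2):
    """Count code co-occurrences within a moving window of stanzas,
    then spherically normalise so vector length does not reflect
    sheer volume of talk."""
    counts = np.zeros(len(PAIRS))
    for end in range(len(stanzas)):
        seen = set().union(*stanzas[max(0, end - window + 1):end + 1])
        for k, (i, j) in enumerate(PAIRS):
            if CODES[i] in seen and CODES[j] in seen:
                counts[k] += 1
    norm = np.linalg.norm(counts)
    return counts / norm if norm else counts

# Two toy "institutions", each a sequence of coded utterances (stanzas)
inst_a = [{"Goal"}, {"Ethics", "Chlngs"}, {"Goal", "StratDev"}]
inst_b = [{"Chlngs"}, {"Chlngs", "Ethics"}, {"Ethics"}]

X = np.vstack([cooccurrence_vector(s) for s in (inst_a, inst_b)])
# Centre and project onto principal dimensions via SVD, as ENA does
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
points = Xc @ Vt.T  # each row: one institution positioned in the ENA space
```

Plotting `points` alongside the code pairs that load on each dimension yields the kind of network comparison shown later in Fig. 2, where differences in connection strength between institution groups become visually inspectable.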

Section snippets

A quantitative ethnographic approach

A quantitative ethnographic approach highlights the role of culture in turning a large set of data into meaningful information (Shaffer, 2017). The cultural significance in the interpretation of data is grounded in the tradition of ethnographic studies that are interested in how and why people attribute certain meanings to what they say or do (Taylor, 2001). Fundamentally, a quantitative ethnographic approach marries the strengths of both qualitative and quantitative research – the former

Thematic analysis

The epistemic network in Fig. 2 shows the connections between the 21 thematic units under the implementation and readiness variables:

  • (1).

    Goal (goals)

  • (2).

    Exp (experience)

  • (3).

    EDW (education data warehouse)

  • (4).

    Policy (policies)

  • (5).

    Apprch (approach)

  • (6).

    PrimUsr (primary users)

  • (7).

    Scope (scope)

  • (8).

    Anl.Elm (analytics elements)

  • (9).

    Int (interventions)

  • (10).

    StratDev (strategy development)

  • (11).

    Eval (evaluation)

  • (12).

    Succ (success)

  • (13).

    Ethics (ethics)

  • (14).

    ChIngs (challenges)

  • (15).

    Tech (technology)

  • (16).

    Fund (funding)

  • (17).

    Anl.Cul (analytical culture)

  • (18).

    Lead (leadership)

  • (19).

    Anl.Cap

Discussion

In this paper, we examine the following questions using an exploratory research approach:

  • (1).

    How do factors that influence LA adoption processes in higher education associate with each other?

  • (2).

    Do the connections between adoption factors differ between institutions that have adopted LA for less than a year and those that have adopted LA for more than a year?

We use ENA analysis to visualise connections among prominent factors that influence the adoption of LA, and we further inspect the connections

Acknowledgements

This work was supported by the Erasmus+ Programme of the European Union [562080-EPP-1-2015-1-BE-EPPKA3-PI-FORWARD]. The European Commission support for the production of this publication does not constitute an endorsement of the contents which reflects the views only of the authors, and the Commission will not be held responsible for any use which may be made of the information contained therein. We would like to thank the participants who participated in the study and contributed their

Declarations of Competing Interest

None.

References (55)

  • H. Drachsler et al.

    The pulse of learning analytics: Understandings and expectations from the stakeholders

  • H. Drachsler et al.

    Privacy and analytics: It’s a DELICATE issue. A checklist for trusted learning analytics

  • EDUCAUSE. (2018). NMC horizon report preview: 2018 higher education edition. Retrieved from...
  • R. Ferguson

    Learning analytics: Drivers, developments and challenges

    International Journal of Technology Enhanced Learning

    (2012)
  • R. Ferguson et al.

    Where is the evidence?: A call to action for learning analytics

  • R. Ferreira et al.

    Towards combined network and text analytics of student discourse in online discussions

  • D. Gašević et al.

    Let’s not forget: Learning analytics are about learning

    TechTrends

    (2015)
  • W. Greller et al.

    Translating learning into numbers: A generic framework for learning analytics

    Educational Technology & Society

    (2012)
  • J.A. Howell et al.

    Are we on our way to becoming a “helicopter university”? Academics’ views on learning analytics

    Technology Knowledge and Learning

    (2018)
  • L. Johnson et al.

    NMC horizon report: 2016 higher education edition

    (2016)
  • C. Colvin et al.

    Student retention and learning analytics: A snapshot of australian practices and a framework for advancement

    (2016)
  • K. Kitto et al.

    Embracing imperfection in learning analytics

  • J.P. Kotter

    Leading change: Why transformation efforts fail

    Harvard Business Review

    (2006)
  • P.D. Long et al.

    Proceedings of the First International Conference on Learning Analytics and Knowledge

    (2011)
  • L. Macfadyen et al.

    Numbers are not enough. Why e-learning analytics failed to inform an institutional strategic plan

    Journal of Educational Technology & Society

    (2012)
  • L. Macfadyen et al.

    Embracing big data in complex educational systems: The learning analytics imperative and the policy challenge

    Research & Practice in Assessment

    (2014)
  • Marquart, C. L., Swiecki, Z., Collier, W., Eagan, B., Woodward, R., & Shaffer, D. W. (2018). rENA: Epistemic network...
    1

    Current affiliation: Monash University, Dept of Data Science and AI, Faculty of Information Technology, 20 Exhibition Walk, Clayton, VIC 3800, Australia