1 Introduction

Digital literacy is not only a critical requisite for the success of students in online learning environments [1]; the acquisition of digital skills is also believed to be one of the necessary tools for facilitating lifelong learning [2]. In the words of Jones and Flannigan [3], “Being void of digital or visual literacy is akin to being handicapped” (p. 4).

The escalating digitalisation of society becomes clear in the variety of concepts that adopt the “e” of electronic as a prefix, such as e-Learning, e-Health or e-Government [4]. In an increasingly digital society, it has become imperative to develop people’s ability to use, understand and manage resources in a digital format. Hence, the development of digital competences is a chief economic concern in most developed countries [5]. The notion of functional illiteracy began to be commonly used once it became clear that, despite having the capability to read and write, some people could not derive meaning from what they read. The rise of functional illiteracy was at the origin of a new concern with alphabetization: the skill to read and to write was no longer enough to identify a person as literate. The introduction of the concept of functional literacy placed an emphasis on the ability to comprehend day-to-day information. More than joining words together, the functionally literate were required to demonstrate their capacity to extract sense from what they read [6]. Digital literacy has gone through this same process. The capacity to use a computer has become an insufficient criterion to define the digitally literate. In light of the growing demands of a knowledge-centred economy, people began to be aware that “working with digital systems and tools to perform most job tasks involve complex cognitive and metacognitive skills, over and above the basic ICT skills for operating computers.” (p. 7) [5]. As with alphabetization, it is fundamental to provide a depiction of people’s digital literacy, offering a deeper insight into the implementation and efficacy of current guidelines and policies for the promotion of digital skills [7].

Statistics on internet access remain insufficient for assessing individuals’ digital skills, since access is not an indicator in itself [8]. There are people online who have not acquired even the most basic Information and Communication Technology (ICT) competences: they are minimally functional, but still digitally illiterate. The assessment of digital literacy has been the subject of many studies [9]. The measurement of digital skills classically involves self-report, and some researchers argue that one of the most accurate methods of assessing people’s digital skills is to ask them to complete self-evaluations [5]. In contrast, task-based research has also been widely used for the assessment of digital literacy [10, 11]. The strengths and weaknesses associated with specific digital skills provide an important insight into what needs to be addressed by higher education institutions. They also create a necessary awareness of the significant disparities in digital skills proficiency that impact students’ experience in digital contexts [1].

Firstly, this paper explores the conceptual complexity of digital literacy by presenting some of its most important definitions and by proposing a threefold framework to define it, based on access to ICT, operational competences and conceptual skills. Subsequently, the issue of measurement is discussed and analysed. The methods section focuses on the development and application of the online questionnaire to Portuguese higher education students. The results section, where the collected data is processed and analysed, precedes a brief discussion that concludes this paper.

2 Digital Literacy in Higher Education

Digital literacy is an asset for students in the sense that it assists their full engagement in a society growingly dominated by digital contexts at a social, cultural and professional level [12]. The use of digital devices in the classroom has become common as early as elementary education [13], with children being familiar with those devices even before they enrol in formal education and as young as 2 years old [14].

2.1 A Problem of Definition

Although the need for digital skills has been widely recognised, their generic nature hinders their development. Programs such as the European Computer Driving License (ECDL) set the basic competences that individuals need to acquire, but they continue to fall short in their representation of the digital literacy concept. It is paramount to address the necessity to elucidate what it means to be digitally literate, in practice [14].

As digital skills grow in number and diversity, so do the terms associated with them. New media literacy, information literacy, photo-visual literacy, lateral literacy, reproduction literacy and visual literacy are some of the concepts usually associated with literacy in digital environments [3, 15]. The overuse of the word literacy is confusing, and the division of digital literacy into countless sub-groups fails to address the challenges that derive from its generic nature. These sub-groups, despite isolating some aspects of navigating digital platforms, do not put forward concrete competences. Hence, for clarity, this paper will disregard these subdivisions and solely use the term digital literacy.

The first step towards the pervasiveness of digital literacy is the clear outline of the skills it entails. Despite the plethora of existing designations, it is possible to dissect the notion of digital literacy into two types of definitions: conceptual and operational [16]. Operational descriptions refer to specific sets of tasks and concrete skills that compose digital literacy. Conceptual definitions, on the other hand, concern the broader cognitive, social and emotional side of dealing with digital resources and settings [2]. In most definitions it is possible to isolate these two components, regardless of how differently they can be presented.

Gilster [17] offered a very inclusive depiction of digital literacy by stating that it is “the ability to understand and use information in multiple formats from a wide variety of sources when it is presented via computers” (p. 1) [17]. Far from being a consensual definition, it managed to survive many updates of the term over the years mainly due to its wide ambit [18]. Martin [19] defines digital literacy as “the awareness, attitude and ability of individuals to appropriately use digital tools and facilities to identify, access, manage, integrate, evaluate, analyse and synthesize digital resources, construct new knowledge, create media expressions, and communicate with others, in the context of specific life situations, in order to enable constructive social action; and to reflect upon this process.” (pp. 135–136) [19]. Furthermore, digital literacy “refers to the assortment of cognitive-thinking strategies that consumers of digital information utilize. Digital literacy is usually regarded as a measure of the ability of users to perform tasks in digital environments.” (p. 6) [3]. More recently, Hague and Payton [20] stated that “to be digitally literate is to have access to a broad range of practices and cultural resources that you are able to apply to digital tools. It is the ability to make and share meaning in different modes and formats; to create, collaborate and communicate effectively and to understand how and when digital technologies can best be used to support these processes.” (p. 2).

All of the aforementioned definitions have conceptual and operational components. Conceptually, they mention cognitive strategies, cultural resources, awareness, understanding and reflection. Operationally speaking, the different authors cite the capacity to perform tasks, to create and exchange meaning and the ability to manoeuvre multiple formats of digital information. Additionally, they highlight the importance of the ability to access, manage, integrate and analyse data. Yet, as digital literacy has become increasingly operational and less theoretical, definitions of this type lack practical value. If the term is to be taken seriously and serve as a reference for real scenarios of digital literacy development, its delimitation needs to be more concrete and to itemize the skills that together compose digital literacy. The literature abounds with studies that discuss the complexity of defining the concept and of working with such a variety of definitions from equally varied authors. Nonetheless, it is paramount to move past this awareness and clearly discriminate each of the skills and tasks that individuals need to have and perform to achieve literacy in the digital world. There is a multiplicity of definitions for the concept of digital literacy, but many fail to provide a concrete description of the skills it entails [18].

The importance of delimiting the concept of digital literacy has driven the creation of reference frameworks. The European Commission [21] defined digital competence as the result of environmental factors (access to ICT) and individual competences (elemental use/operational competences, the ability to apply digital resources to everyday life and personal attitude). In terms of ICT access, it accounts for the availability of the necessary conditions to become proficient in digital contexts, such as access to computers, internet connection and mobile devices. The operational skills refer to the basic knowledge of operating a computer and navigating the internet. The application of digital resources to everyday life concerns the individual’s capacity to use technology for the purpose of learning, leisure, working and being social. Personal attitude, in turn, mainly requires creativity, responsible use and critical thinking. This view of digital competence adds an important element to digital literacy, which is the access to ICT.

3 Threefold Definition of Digital Literacy

This paper aims to provide a depiction of the level of digital literacy of university students. As with any assessment, it is imperative to define a set of guiding criteria. In this case, the criteria resulted from a deconstruction of the definition of digital literacy. Firstly, the concept of digital literacy was divided into three core sections: access to ICT, operational competences and conceptual skills. Secondly, each of these sections was then fractioned into specific sets of basic requirements that were later converted into specific tasks. Figure 1 illustrates the first part of the dissection of the term.

Fig. 1. Threefold definition of digital literacy
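As an illustration, the deconstruction described above can be sketched as a simple mapping from each core section to its assessment categories. This is a hypothetical encoding for clarity only; the category labels are taken from the paper’s framework (Tables 1 and 2), but the structure itself is not part of the original instrument:

```python
# Hypothetical encoding of the threefold framework: each core section of
# digital literacy maps to the categories it was fractioned into.
DIGITAL_LITERACY = {
    "access_to_ict": [
        "computers", "internet connection", "tablets", "smartphones",
    ],
    "operational_competences": [
        "computer basics", "internet navigation",
        "communication", "information search and management",
    ],
    "conceptual_skills": [
        "critical attitude", "ict in daily life",
        "social interaction", "online safety",
    ],
}

def categories(section: str) -> list[str]:
    """Return the assessment categories for one core section."""
    return DIGITAL_LITERACY[section]
```

Each category would, in turn, be broken down into the specific tasks that the questionnaire converts into items.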

3.1 Access to Technology

The first step in defining the set of essential digital skills is to assess and guarantee that people have access to the technology itself. Digital skills require digital logistics. The demand for swift communication and data exchange places pressure on the delivery of internet connections, and the European Union (EU) struggles with a low level of high-speed internet access. Some of the measures of the Digital Agenda for Europe include expanding internet coverage throughout the EU by investing in high bandwidth to enable internet access [4]. Lack of exposure to information technology impedes the progress of digital literacy, whereas time spent online provides the opportunity to acquire and consolidate digital skills [8]. As Isaias et al. [7] state, the provision of access to varied technological tools is fundamental to foster people’s familiarity with them. The first element of the definition that this paper proposes, access to ICT, accounts for people’s access to computers, internet connections, tablets and smartphones.

3.2 Operational Competences

Operational competences define the specific operational requirements of interacting with digital environments. They consist of four main sections: computer basics, internet navigation, communication and information search and management. Table 1 presents the several categories of operational skills and some of the tasks associated with them.

Table 1. Operational competences breakdown

Computer basics entails using core computer applications and software [3, 5, 15, 21], file management [15] and operating output and input devices [15]. Some of the tasks that best illustrate these competences include installing software, preparing PowerPoint presentations, renaming folders and scanning documents [22]. Internet navigation requires a knowledge of website basics [12, 23], safe online behaviour [5, 15] and the capacity to access resources for everyday use [15]. Connecting to a wireless device, creating a safe password, shopping online and setting a website as the browser’s homepage [23] are descriptive examples of what these competences involve. Communication is about information exchange [5, 15, 21, 24, 25] and social networking [15, 25]. In this case, it is mandatory that the user can open attachments, identify fraudulent emails and create a personal profile on a social network website. Finally, information search and management calls for information search and retrieval [3, 5, 15, 21, 24–26], filtering information [3, 5, 21, 25–27], information storage [5, 21, 25, 26], editing and producing information [3, 5, 21, 27] and multimodality [3, 15, 21, 26, 27]. Information search and management is a core precept of digital literacy, and its rising intricacy requires the user to be proficient in tasks such as reproducing content, assessing data reliability, understanding graphical displays and organising information.

3.3 Conceptual Skills

Conceptual skills refer to wider cognitive, social and emotional aspects of engaging with digital resources and settings. These skills were grouped into four categories: critical attitude, ICT in daily life, social interaction and online safety. Table 2 portrays the set of specific tasks and skills that exemplify each of the categories.

Table 2. Conceptual skills mapping

In this context, having a critical attitude demands critical thinking [5, 12, 15, 20, 21, 24] and the management of concurrent sources of stimulus [27]. A user needs to have autonomy and the capacity to recognise and address the influence of digital content, as well as to question its trustworthiness. Additionally, the unlimited sources of stimuli in computerised environments call for time management and real-time thinking [27]. The capacity to use ICT in daily life means that the user should be able to employ the appropriate technology to solve problems [15, 19] and to use that technology responsibly [21]. More specifically, individuals who are digitally literate need to frequently back up important files and data, use the internet to communicate and engage in online civic participation. Social interaction consists of social participation [15, 24] and the knowledge of and respect for netiquette [21, 26]. It is imperative to respect the norms of social conduct, have cultural awareness and be willing to collaborate with others. With respect to online safety, from a conceptual point of view people must be knowledgeable about the legal and ethical issues of digital environments [15, 21, 27] and aware of online deception [15, 20, 27]. Namely, digital literacy means complying with the legal rules of digital content, being aware of copyright issues and identifying and reporting inappropriate online activity.

4 Measuring Digital Literacy

Having provided a guiding definition of what digital literacy demands, it is important to explore the subject of assessment. The majority of the research on people’s digital literacy is supported by self-assessment instruments of data collection [8]. The administration of this type of survey is a habitual method of digital skills data collection. Some believe that the self-report of the participants’ skills offers a more precise illustration of their digital competence level [5]. Alternatively, some studies have used task-based instruments of data collection to appraise levels of digital literacy [10, 11].

4.1 Measurement Instruments

Students are the future workforce, and the data collected in empirical studies is very important to provide palpable information on how prepared that prospective workforce is. It is important to incorporate technology in education delivery, as there are countless benefits to requiring students to use technology to complete their assignments [9]. There are multiple studies where students are asked to evaluate their own IT skills, since self-evaluation is a critical strategy for digital literacy assessment [28]; the results speak for the students’ perception of their skills rather than their actual skill level. Kaminski et al. [9], for instance, applied a survey to university students that intended to measure their fluency in information technology; they used self-report as the assessment method and the FIT (fluent with information technology) model as the basis of their work.

Besides self-assessment, task-based research is also a method that has been used for the assessment of digital literacy. Alkali and Amichai-Hamburger [10], for example, conducted a study to assess the level of digital literacy of groups of scholars. They developed a ground-breaking research instrument that consisted of five tasks in which different types of digital skills were put to the test. The results of this study reflected a clear disparity among the participants’ ages and among their proficiency in certain skills, such as information literacy related abilities [10]. Van Deursen and van Dijk [11] also used a task-based research tool to assess the internet skills of the Dutch population in general. Their questionnaire had nine assignments that the participants had to complete. Similarly to other studies of this nature, the results revealed, among other conclusions, a difference of performance in certain skills.

Burton et al. [1] developed a survey to understand students’ perceptions of their digital skills and experience. The questions were not only about their skills, but also about their experience with institutional online services for learning. As with other studies, the analysis of the collected data showed levels of confidence and proficiency in some areas, such as online tools, and levels of discomfort when using certain database and spreadsheet options [1]. Also, Hargittai [8] developed an instrument to measure digital literacy levels that was mainly composed of self-report questions on several skills. The sample consisted of 100 randomly selected internet users. In addition to the self-report questionnaire, the author designed a quiz-like questionnaire to be applied to a portion of the sample [8].

The iSkills assessment, formerly known as the “ICT Literacy Assessment”, was developed by the Educational Testing Service (ETS). It is an internet-based instrument that evaluates students’ ability to use technology for the purpose of searching, organising and communicating data. The assessment has an approximate duration of seventy-five minutes and is centred on the students’ capacity to apply their problem-solving and critical-thinking competences when using technology [29].

Finally, the European/International Computer Driving License (E/ICDL) is a valuable example of a program that pursues the standardisation of digital skills [5]: “A high level of interoperability ensures greater competition in the online world, thus contributing to a greater development of the online services, but also offers new innovation opportunities, by offering a standard for IT services. Moreover, inclusion is promoted by pushing communication barriers.” (p. 35) [4].

5 Methods

The online questionnaire was the chosen research instrument to collect data on the digital literacy of higher education students. This instrument was selected for its simplicity of data entry, its speed and its capacity to flexibly reach a wide audience [30]. The online questionnaire measured access to ICT, operational competences and conceptual skills, and it was divided into four parts. The first part concerned the participants’ demographic information. The second section gathered data on the students’ access to ICT, to explore the differences in accessibility among different devices. The third part measured the participants’ operational competences when using a computer and the internet. Finally, the fourth section measured the students’ perception of their own conceptual skills.
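As a sketch of how responses to the three measured dimensions could be summarised, the snippet below averages hypothetical Likert-scale items per dimension. The item groupings mirror the questionnaire’s structure, but the response values, scale and scoring rule are invented for illustration and are not described in the paper:

```python
from statistics import mean

# Hypothetical 5-point Likert responses (1 = lowest, 5 = highest) for one
# respondent, grouped by the questionnaire's three measured dimensions.
responses = {
    "access_to_ict":           [5, 5, 3, 2],  # e.g. computer, internet, tablet, smartphone
    "operational_competences": [4, 5, 4, 3],
    "conceptual_skills":       [4, 3, 5, 4],
}

def dimension_scores(resp: dict[str, list[int]]) -> dict[str, float]:
    """Average each dimension's items into a single summary score."""
    return {dim: mean(items) for dim, items in resp.items()}

print(dimension_scores(responses))
```

A per-dimension average of this kind would make it easy to compare a respondent’s access, operational and conceptual profiles side by side.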

The online questionnaire was distributed online among four classes of higher education students who were selected via a sample of convenience. Although nonprobability sampling methods have several limitations, they can be valuable when applied to scenarios with no sampling frame [31].

6 Results

6.1 Sample Characteristics

The questionnaire received 177 valid responses. The sample of respondents consisted of 47% female students and 53% male students. The majority of the participants were between 18 and 24 years old (71%) and were enrolled at Universidade Aberta, Portugal (Portuguese Open University) (53%). Most of the participants were studying for a bachelor’s degree (81%) in Management (51%).

In terms of their experience with e-Learning, only 25% of them said that they had completed a course that was entirely delivered online. This percentage is even smaller (14%) for blended learning. Despite their reduced experience with digital learning, 85% of the respondents declared that having digital skills is very important for their learning proficiency.

6.2 Accessibility

The students’ access to technology was measured in terms of the availability of several devices in their everyday life and their use of those devices. Except for one respondent, all participants claimed to have a computer and an internet connection at home. In terms of the students’ unlimited access to technology (24/7), computers and internet connections were significantly predominant, as can be seen in Fig. 2.

Fig. 2. Students’ access to technology

Although there was a high percentage of students with unlimited access to a smartphone or a tablet, the percentage of students with no access was significant (19% and 36%, respectively). With respect to their use of computers, most students (51%) have been using them for 10–20 years, 26% for over 20 years and 20% for 6–10 years. Only 2% of the students claimed to have been using a computer for 1–5 years. Also, 98% of the participants stated that they use a computer on a daily basis, both for personal and academic reasons. In terms of using a computer at the university, 67% of the students stated that they do and 33% said they do not.

With regard to their use of the internet, 97% of the students claimed to use it daily. In terms of free access to an internet connection at their university, 61% of the students said that it was available, while 33% said they did not know.

6.3 Operational Literacy

Prior to the students’ self-assessment of their operational skills, it was important to know what type of computer training they had. Approximately 49% of the students claimed to have completed a training module or university course, 20% had completed a workshop and 8% said that they had obtained a certification from an official entity. On the other hand, 21% of the students claimed to have no training. Regardless of their training, the respondents had, overall, a positive perception of their digital skills. No student rated his/her digital skills as poor and only 14% regarded them as basic. The remaining participants stated that their digital skills were good (44%), very good (31%) or excellent (11%). The students equally displayed high levels of comfort when performing specific operational tasks (Fig. 3).

Fig. 3. Students’ levels of comfort when performing specific operational tasks

With the exception of three tasks, all operational tasks had comfort levels (very comfortable and comfortable) above 90%. In these three tasks, some students reported levels of discomfort (not very comfortable and uncomfortable): creating a chart from spreadsheet data (9%), online shopping (8%) and preparing a PowerPoint presentation (7%). When asked about the tasks that they could easily complete, the students exhibited the same level of confidence in their skills (Fig. 4).

Fig. 4. Tasks that the students can perform easily

All the tasks were selected by an expressive majority of the students (84% or more). Burning a CD and installing software were the tasks that the most students, around 15%, did not select as being easily performed. Finally, with respect to operational skills, the students were asked to assess specific skills according to their level of completion difficulty. Similarly to the previous questions, the students made a very positive self-assessment of their skills, as can be seen in Fig. 5.

Fig. 5. Level of difficulty in completing specific operational tasks

All the listed tasks were classified by the majority of the students as being very easy or easy. The tasks with the highest levels of difficulty were using cloud computing services (19% of the respondents selected can’t do it, very hard or hard), presenting information in a video (16%) and determining the reliability of data (11%).

6.4 Conceptual Literacy

Generally speaking, the students demonstrated a positive perception of their levels of proficiency in terms of conceptual skills. Firstly, the students were asked to rate the frequency with which they performed specific behaviours related to conceptual skills (Fig. 6).

Fig. 6. Frequency with which students have specific behaviours

As the figure above makes evident, the students most frequently respect the norms of online social conduct (95%), access resources for everyday use (92%) and use the internet for communication (89%). On the contrary, they rarely or never engage in online civic participation (53%) or report inappropriate online activity (63%). Furthermore, the students were asked to rate the adequacy of specific conceptual competences according to their online behaviour (Fig. 7).

Fig. 7. Competences used in the students’ online behaviour

According to the students, all the listed conceptual skills are adequate to their online behaviour. They especially highlighted autonomy, a flexible attitude and real-time thinking. The competences deemed less adequate include being aware of copyright issues (13% of the students found it not adequate or not adequate at all) and being aware of identity deception (10%). Finally, the participants were asked to classify their capacity to perform specific tasks related to conceptual skills (Fig. 8).

Fig. 8. Students’ capacity to perform specific tasks

The majority of the students were confident in their capacity to perform all of the listed tasks, especially naming folders meaningfully to facilitate information retrieval, preserving their own privacy, questioning the reliability of online content and complying with the legal rules of digital content. Only 9% of the students stated that their capacity to identify online threats and deal with internet viruses was fair or poor.

7 Discussion

As education progresses towards an increasingly digital format, it is paramount to assess the digital literacy of students and their capacity to be proficient in digital environments. Assumptions about students’ digital skills may cause damaging cleavages among students and may compromise the support that technology can provide to educational settings.

The results of the questionnaire revealed a higher education student population with wide access to technology, namely computers, internet connections and mobile devices. Nonetheless, according to the students, the latter are less available: a significant percentage of students claimed not to have access to a tablet or a smartphone. This insufficient access to mobile devices can hinder educators’ ambition of a ubiquitous online education, and plans to advance mobile learning need to account for this issue of access.

Despite the high number of students with some type of prior computer training, a relevant percentage of the sample claimed to have no training at all. With regard to the participants’ self-assessment of their level of digital competence, overall, all the students had a positive perception. More specifically, concerning operational tasks, an expressive majority of the students reported high levels of comfort and confidence in the performance of all the listed tasks. Nevertheless, certain tasks were classified as more difficult by some of the students, especially burning a CD, installing software, using cloud computing services and presenting information in a video format. With respect to conceptual skills, an unequivocal majority of the students made a positive assessment of all the listed tasks and of most of the related behaviours. Being aware of copyright issues and of identity deception were the aspects that some students found to be less aligned with their competences. Engaging in online civic participation and reporting inappropriate online activity were the behaviours that most of the students claimed to rarely or never perform.

8 Conclusion

The swift evolution of information technology has shaped a new type of literacy. Similarly to the vital skills of reading, writing and numeracy, digital competences are now part of being a fully functional individual. The concept of literacy has evolved to accommodate the characteristics of intricate information settings [6]. Technical competences are only a part of being digitally literate; individuals also need cognitive, social and emotional abilities [27]. Digital literacy also determines that the mastery of technical tools is insufficient in contexts where content is added by and shared among users. The creation and exchange of content have become priorities over those of a merely technical nature [16].

The analysis of the students’ digital competences allows educators to prepare the learning process accordingly. It provides a depiction of their strengths and weaknesses and an assessment of what technology should be employed or avoided. Also, it highlights the areas where training should focus. Issues of access and training need to be addressed if education is to pervasively adopt technology to support learning.

Some of the limitations of this study concern the use of a convenience sample and the use of a self-assessment tool. This study was conducted in Portugal, but further research can apply it to participants in other countries. Also, future studies could complement this self-assessment instrument with a quiz or a test of the actual skills to provide a more conclusive depiction of the students’ competences.