
1 Introduction

This paper describes a pedagogical design to capture 21st Century Skills. Already in 1999, there was a realisation that the workforce and the workplace landscape were changing rapidly, and that training would need to reflect these changes, in what was called “21st Century Skills for 21st Century Jobs” [14]. Despite the initial focus on the workplace, and the recognition that competency-based education is not a new concept [13], opportunities to re-surface much-desired student-centred pedagogies were also recognised [11].

With regard to scaling such approaches, one well-known effort in K-12 is the Programme for International Student Assessment (PISA), developed by the Organisation for Economic Co-operation and Development (OECD). Other attempts include the Assessment and Teaching of 21st Century Skills (ATC21S) project and the Collaborative Assessment Alliance.

These attempts have been criticised on a number of grounds [7]; one aspect of special interest to us, however, is that current approaches seem to be tightly coupled with specific tasks. Thus, the results obtained may reflect the design of the task as much as the students’ skills.

In contrast, we set out to develop a task-independent approach, so that it would scale while maintaining its flexibility. Our intention is to develop a pedagogical design that will be implemented as a software tool and deployed primarily in K-12 institutions, but also in Higher Education. While this is our initial focus, nothing in our design explicitly excludes informal education settings. Our approach is described below.

2 Pedagogical Design

2.1 Research Direction

Our Pedagogical Design is rooted in the Core Questions of this Research Project:

  1. What are 21st Century Skills?

  2. What learning innovations are being used to promote them?

  3. What techniques/methodologies are being employed to assess them?

  4. What technologies are being used to promote 21st Century Skills and their assessment?

They are given shape by the project objectives:

  • To create a common framework for how 21st Century Skills can be assessed;

  • To be able to assess informal learning and social activity from learners; in particular, to research new methods of assessment which can interpret, visualise and comparatively assess learning activity implicitly and continuously;

  • To create software tools in which multiple methods and approaches to assessment can take place.

2.2 Design Recommendations

With these questions and objectives focusing the initial research, a literature review was conducted around these areas to identify current trends in 21st Ce. skills, 21st Ce. skills assessment, and the state-of-the-art pedagogical design surrounding both of these areas.

From this research, the following pedagogical recommendations were made regarding where gaps in innovation currently exist within this space:

  • Vertical and horizontal mobility: Anything designed should be able to cross grade level and content area as opposed to being grade or subject specific.

  • Not activity specific: Anything designed should be more than a once-off activity and have longevity and breadth to it, as opposed to being a singular activity that a student/teacher only interacts with once.

  • Authentic classroom dynamic: Anything designed should fit within the authentic classroom dynamic and become an extension of regular classroom practice as opposed to being something that interferes, prohibits, or breaks up the standard rhythm of instruction.

Additionally, recommendations were made that whatever demonstrator was built should:

  • Activate student skill literacy: Student understanding of the skills is not being addressed; the global focus seems to have jumped straight to assessing the skills without first teaching them.

  • Be based in experiential learning: Rather than forcing a context for the skill, the pedagogy should, where possible, be rooted in a naturally occurring learning experience (a pseudo-experiential learning situation).

  • Offer formative assessment for learning: Typical assessment activities in this area are either summative or disjointedly formative; a more streamlined and continuous formative assessment that promotes true and deep learning is needed.

Lastly, it was recommended that:

  • The design be flexible: As this is clearly not a well-defined space, the demonstrator should be flexible and dynamic, offering many options for future design and extension of the original frame.

  • Data be viewed as baseline: The data generated from the experience should not be limiting and should be usable to establish a baseline for future development.

With this in mind, it was established that the best direction for development was the self-assessment space: self-assessment allows for the flexibility called for in the recommendations, is not a path being pursued by most developers at the moment, and therefore has the potential for more innovation. Additionally, self-assessment:

  • Activates student understanding of 21st Ce. skills, providing a knowledge base and direct instruction for what is otherwise implicit (literacy of 21st Ce. skills and assessment)

  • Is personalised and allows for goal setting, continuous feedback, strength and deficit identification and formative assessment

  • Allows for metacognitive awareness to increase student responsibility for skill development

  • Informs classroom choices in all three realms of the educative relationship (student-teacher-knowledge) and is universally applicable

The decision was then made to create a 21st Ce. skills self-assessment app. With the knowledge that this app would be trialled in Ireland, the frame chosen for the 21st Ce. skills was that used by the National Council for Curriculum and Assessment (NCCA) [8]. The NCCA refer to these skills as the Key Skills and have created ‘rubrics’ for them at both the Junior Cycle (K-8) and Senior Cycle (K-12) levels. For the purposes of the demonstrator, the Junior Cycle Key Skills frame has been selected. However, the pedagogical frame described below, the gamification framework, and the microinteraction design of the next sections are not limited to this setting and have been designed to be as generic and flexible as possible.

2.3 Pedagogical Frame

The pedagogical frame in this use case is based on assessment strategies for self-directed learning and utilises the conceptual design of manage, monitor and modify with regard to student behaviour around 21st Ce. skills. Specifically, the model of reference is the self-directed learning model of [3] and its process-design model for feedback and continuous learning.

Generally, the frame consists of phases, each of which:

  1. Starts with the identification of an experiential learning instance (a tagging of one of the identified 21st Ce. skills on the home page)

  2. Continues with benchmarked experiences (an answering of either a quick-answer multiple-choice or free-text question to activate student literacy and learning within the tagged skill)

  3. Ends with the selection of an exemplar (student uploading of personal evidence of work in the skill) and a self-assessment (self-rating based on reflection)

To support this, a frame was selected with the steps of each design phase being built using a blend of feedback spirals and metacognitively scaffolded benchmark prompts that are designed to activate experiential learning (using Bloom’s revised taxonomy [1], Wiggins & McTighe’s Six Facets of Understanding [16], and Zimmerman’s Phases and Subprocesses of Self-Regulation [17]).

With regard to the specific self-assessment activities, benchmark activities within each phase are based on Rolheiser’s growth scheme for teacher implementation of stages of student self-assessment [10], and student self-rating is done using a modified version of Marzano’s 4-Point Self-Assessment Scale [6].

With regard to the specific creation and scaffolding of content within the on-boarding, benchmarked experiences, and exemplar questions and tasks, Bloom’s revised taxonomy was used to formulate questions and tasks, as was the principle that knowledge acquisition needs to occur prior to knowledge application.
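To make the frame more concrete, the following TypeScript sketch models its main entities. The type and field names are illustrative assumptions rather than the demonstrator’s actual data model; only the five NCCA Key Skills, the phase structure (tagging, benchmarks, exemplar, self-rating), and the modified 4-point Marzano-style scale are taken from the text above.

```typescript
// Illustrative data model for the pedagogical frame (assumed names, not the actual implementation).

// The five NCCA Junior Cycle Key Skills used in the demonstrator.
type Skill =
  | "Collaboration"
  | "Communication"
  | "Creativity"
  | "Self-management"
  | "Information management";

// Self-rating on a modified 4-point (Marzano-style) scale.
type SelfRating = 1 | 2 | 3 | 4;

// A benchmark prompt: either quick-answer multiple choice or free text,
// scaffolded (e.g. along the levels of Bloom's revised taxonomy).
interface BenchmarkTask {
  skill: Skill;
  kind: "multipleChoice" | "freeText";
  prompt: string;
  choices?: string[]; // only for multiple-choice prompts
}

// A phase starts with tagging, continues with benchmark tasks,
// and ends with an exemplar upload plus a reflective self-assessment.
interface Phase {
  level: number; // phases increase in difficulty
  benchmarks: BenchmarkTask[];
  exemplar?: { description: string; attachmentUri: string };
  selfRating?: SelfRating;
}

// A student's progress in one skill: raw in-class tags plus the phases worked through.
interface SkillProgress {
  skill: Skill;
  tagTimestamps: Date[];
  phases: Phase[];
}
```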

3 Gamification Framework

This section describes a gamified system for the aforementioned pedagogical design, focusing mainly on a proof-of-concept tablet app. The system consists of a tablet app and a group of players, who are students. The way the system will be designed and deployed is explained below, using the 6D Gamification Design Framework [15].

3.1 Description of the Gamified System

The system consists of a tablet app, a website, and players who meet in real life to participate in class activities. Players with the role of student will use the tablet app. The setting is a physical and synchronous classroom environment for the majority of the game tasks, and other environments for a few tasks. No asynchronous teaching or learning is assumed, but neither is it prohibited.

The students will use the tablet app to identify each moment in class when they are active in one of the 21st Ce. skills defined by the NCCA (Collaboration, Communication, Creativity, Self-management, Information management) [8]. The home screen presents the students with a selection of the skills, and they have to tap the appropriate choice each time they have used a skill in the classroom (e.g., Alice taps “Creativity” after solving a new problem in Mathematics). To validate this input without interrupting the class, the app will occasionally ask the student to perform short benchmark tasks after they have tapped a skill. These validation benchmarks will not, however, appear every time the student selects a skill. The self-assessment activities are organised in levels (phases) of increasing difficulty and are rewarded as described in the following sections. A preliminary on-boarding phase has been designed so that it can be delivered by the teacher in class without taking up too much of a class session. Moreover, to clear a phase the student has to upload an exemplar of an achievement of theirs that reflects the skill.
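As an illustration of how the occasional validation and the phase-clearing rule could be encoded, consider the following TypeScript sketch. The function names and the five-tag threshold are assumptions for illustration only, not the app’s actual behaviour.

```typescript
// Sketch of the validation and phase-clearing rules described above (assumed names and thresholds).

interface PhaseStatus {
  level: number; // phases are organised in increasing difficulty
  benchmarksCompleted: number;
  benchmarksRequired: number;
  exemplarUploaded: boolean;
}

// Hypothetical rule: a short benchmark task is prompted only on every fifth tag,
// so most taps simply acknowledge the student and do not interrupt the class.
const TAGS_PER_BENCHMARK = 5;

function shouldPromptBenchmark(tagCount: number, phase: PhaseStatus): boolean {
  return (
    phase.benchmarksCompleted < phase.benchmarksRequired &&
    tagCount % TAGS_PER_BENCHMARK === 0
  );
}

// A phase is cleared only when all its benchmarks are done and an exemplar has been uploaded.
function phaseCleared(phase: PhaseStatus): boolean {
  return phase.benchmarksCompleted >= phase.benchmarksRequired && phase.exemplarUploaded;
}

// Clearing a phase moves the student to the next level of difficulty.
function nextPhase(phase: PhaseStatus): PhaseStatus {
  return {
    level: phase.level + 1,
    benchmarksCompleted: 0,
    benchmarksRequired: phase.benchmarksRequired,
    exemplarUploaded: false,
  };
}
```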

This gamified self-assessment process is suitable for both the Junior and the Senior Cycle and is not affected by pedagogical decisions with regard to the language of the assessment. Thus, it can facilitate multiple models of 21st Ce. Skills, multiple education systems, curricula, age groups, taught modules, or languages. Many of these benefits derive from the curriculum-independent nature of the self-assessment pedagogy itself, and not specifically from the gamification process.

The role of other stakeholders such as the teachers and parents is beyond the scope of this paper.

Fig. 1. The student user-experience map for the capturing of the 21st Ce. skills. A “Hero Journey” experience is designed by phases of increasing difficulty, microcredentials as rewards, and personalised solutions (exemplars) to “quests”.

3.2 Define Business Objectives

One main reason for the decision to gamify the process is that the self-assessment process is a continuous one. Indeed, the pedagogy is based on the continuous feedback spiral described in [3].

Since self-assessment is an iterative process, it is safe to assume that initial iterations will produce poorer results than subsequent ones. Competence in self-assessment depends greatly on familiarisation with the assessment language. Thus, it is important to keep motivation among students high until they reach a stage where they produce rich self-assessment material.

Gamification can facilitate getting the best out of students’ self-assessments by keeping them in a mental state of flow [4]. A state of flow is one where the students immerse themselves in their tasks and are thus more likely to respond in a qualitatively appropriate way.

3.3 Delineate Target Behaviours

The target behaviours are the following. The first is tagging: a player is expected to use the system to digitally tag a physical activity. Accordingly, a key performance indicator (KPI) of the system will be the amount of user activity related to identifying that a 21st Ce. skill has been used in the classroom.

The second target behaviour is for the player to explain their involvement with the skills. Here, a KPI of the system is the amount and the quality of user activity around the benchmark tasks during the phases, and the exemplars uploaded at the end of each phase (see Fig. 1).
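A minimal sketch of how these two KPIs could be computed from logged app events is given below; the event shape and function names are assumptions made for illustration, not the system’s actual logging format.

```typescript
// Illustrative KPI computation over logged app events (assumed event shape).

interface AppEvent {
  studentId: string;
  skill: string;
  kind: "tag" | "benchmarkCompleted" | "exemplarUploaded";
  qualityScore?: number; // e.g. a 0-1 rating of a benchmark answer or exemplar, if one is available
}

// KPI 1: volume of tagging activity per student.
function taggingActivity(events: AppEvent[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const e of events) {
    if (e.kind === "tag") {
      counts.set(e.studentId, (counts.get(e.studentId) ?? 0) + 1);
    }
  }
  return counts;
}

// KPI 2: amount and (where scored) average quality of benchmark/exemplar activity.
function explanationActivity(events: AppEvent[]): { count: number; meanQuality: number | null } {
  const relevant = events.filter(e => e.kind !== "tag");
  const scored = relevant.filter(e => typeof e.qualityScore === "number");
  const meanQuality = scored.length
    ? scored.reduce((sum, e) => sum + (e.qualityScore as number), 0) / scored.length
    : null;
  return { count: relevant.length, meanQuality };
}
```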

3.4 Describe Your Players

The players are young and relatively tech savvy (as we assume that their schools have provided them with tablet devices). While nothing in the pedagogical design or the overall gamification framework (phases, exemplars, etc.) strictly prescribes a tablet app, and both could be used with paper-based forms, the age of the players favours a digital solution.

The players, depending on their exact age, could have varying workloads, and this could affect their use of the system. New students could use the system more due to excitement about its novelty, while near-graduation students could be affected by the education system’s current high appreciation of examination results and focus on those rather than on 21st Ce. skills.

Fig. 2. Various designs for badges for the system. Benchmark badges also function as a progress indicator within a phase, while skill badges can indicate both progress across all phases and the Marzano scale. Badges can be verified—but not evaluated—by teachers. A “guide” avatar is designed to provide guidance to the students.

3.5 Devise Activity Loops

The main activity loop is to tag classroom activities in the system/app. Moreover, if the user has tagged a skill a set number of times, they will be asked to complete a short benchmark task. Finally, the users get to upload, for each skill, an exemplar of their work that represents their best example of what the skill looks like in practice. For the main activity, the feedback is a simple notification that the tagging has been performed (see the section on microinteractions below). For the benchmark and the exemplar tasks, the users receive digital badges within the system (see Fig. 2). These badges are designed so as to assign a status to users depending on their self-assessment and could include some teacher validation (validation rather than evaluation, in the sense of checking against plagiarism, etc.).
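One possible shape for the badge records behind this loop is sketched below. The names and fields are illustrative assumptions; the only constraints taken from the design are that benchmark and skill badges are distinct, and that teacher validation is a flag kept separate from the student’s self-assessment.

```typescript
// Sketch of badge records for the activity loops above (assumed names; not the project's code).

interface Badge {
  kind: "benchmark" | "skill";
  skill: string;
  phase: number;
  awardedAt: Date;
  teacherValidated: boolean; // validation (e.g. a plagiarism check), not evaluation
}

// Benchmark badges double as a progress indicator within a phase.
function awardBenchmarkBadge(skill: string, phase: number): Badge {
  return { kind: "benchmark", skill, phase, awardedAt: new Date(), teacherValidated: false };
}

// Skill badges are awarded when an exemplar clears a phase.
function awardSkillBadge(skill: string, phase: number): Badge {
  return { kind: "skill", skill, phase, awardedAt: new Date(), teacherValidated: false };
}

// A teacher can later validate a badge without changing the student's self-assessment.
function validateBadge(badge: Badge): Badge {
  return { ...badge, teacherValidated: true };
}
```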

3.6 Don’t Forget the Fun!

Given all the points above, it is expected that satisfaction (within-school, civic-duty-like fun, not necessarily playful fun) is going to be the key motivator for players to participate in the system. Fun is sought by expanding intrinsic motivation; it is not the goal that the aforementioned badges become a major motivating force. Rather, extrinsic motivators will provide moments of instant gratification for sticking with the system, while, using again the example of Alice tagging “Creativity” in a Mathematics activity, the Mathematics activity itself is supposed to be the playful fun of the system. This can be conveyed to the users via the app’s visual design and text. However, various benchmark tasks can be designed so as to have playful elements. A “guide” avatar, designed to provide guidance to the students, can also constitute an element of playful behaviour.

3.7 Deploy the Appropriate Tools

The appropriate tool here is a tablet app, intended to capture skills on the spot. Moreover, students can see their badges and previous exemplars in it.

A tablet is preferred since it is a mobile device that is less cumbersome for text input than a mobile phone. It allows on-the-spot capturing of skills and also the completion of benchmark tasks that require text input (e.g., “What does it mean to be excellent at Collaboration?”). The larger screen real estate of tablets also means that browsing history or getting an overview of exemplars works better than on a mobile phone.

As the players are young and tech savvy, they should not have any difficulty using this technology.

Overall, our gamification framework suggests the design of a finite game, where (i) mastery, ownership, and identity are the chief motivators, (ii) there are clear checkpoints as victory conditions, (iii) phases of increasing difficulty, rewards (badges), reinforcement through teacher validation of the badges, and quests (exemplars) are the game mechanics, and (iv) status, achievement, and feedback by the teacher are the social interactions.

4 Microinteraction Design

The pedagogical design and the gamification framework described above can result in many different implementations, but they all require a single interaction: to digitally tag the physical activity of the skill by tapping the appropriate choice each time they have used a skill in the classroom (e.g., Alice taps “Creativity” after solving a new problem in Mathematics). This interaction is a microinteraction: microinteractions are “contained product moments that revolve around a single use case—they have one main task” [12] and they consist of four parts:

Fig. 3. The microinteraction for the capturing of the 21st Ce. skills. A user performs some activity in the classroom and then tags it in the app by tapping the respective option. The system gives them feedback about the success of the microinteraction. Two possible designs for different tablet platforms are presented.

Triggers: The trigger (see Fig. 3) that initiates the microinteraction is the user. The user performs some activity in the classroom and then tags it in the app by tapping the respective option.

Rules: The rules for tagging are explained during an on-boarding phase to the students, and also by the teachers. It is anticipated that teachers would adapt the use of the tool to their teaching style. From the system’s point of view, the rule is that the microinteraction needs to be triggered and then it will give feedback to the user or will initiate a loop (see the fourth part of microinteractions below).

Feedback: Feedback needs to be kept to a minimum in order to avoid interruptions of teaching in the classroom. A “thumbs up” icon with an informative text about which skill has been tagged should appear (see Fig. 3).

Modes and loops: The extension loops of the tagging microinteraction for the capturing of the 21st Ce. skills are based on user behaviour, as described below (a sketch of this loop logic follows Fig. 4):

  • After a user taps a skill a certain number of times (as in Fig. 3), they are prompted to perform a benchmark task (see Fig. 4, left).

  • After they have performed all the benchmark tasks for a phase, they are asked to upload an exemplar of the skill to move to the next phase (see Fig. 4, right).

  • After the user has completed either a benchmark task or an exemplar, they receive their respective badge (see Fig. 2).

Fig. 4. Extension loops of the tagging microinteraction for the capturing of the 21st Ce. skills. After a user taps a skill for a certain number of times (as in Fig. 3) they are prompted to perform a benchmark task (left). After they have performed all the benchmark tasks of a phase they are asked to upload an exemplar of the skill to move to the next phase (right).
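Tying the four parts together, the following sketch condenses the trigger, rules, feedback, and loops of the tagging microinteraction into a single handler. The names and the tap threshold are assumptions for illustration; the only requirements carried over from the design are the minimal “thumbs up” feedback and the extension loops.

```typescript
// Compact sketch of the tagging microinteraction (assumed names and threshold).

type Feedback = { icon: "thumbsUp"; message: string }; // kept minimal so teaching is not interrupted
type LoopAction = "none" | "benchmarkTask" | "exemplarUpload";

interface MicrointeractionResult {
  feedback: Feedback;
  loop: LoopAction;
}

// Trigger: the user taps a skill. Rules: count the tap and decide whether a loop starts.
// Feedback: a short "thumbs up" note naming the tagged skill. Loops: benchmark or exemplar prompts.
function handleSkillTap(
  skill: string,
  tapCount: number,
  benchmarksRemaining: number,
  hasExemplar: boolean,
  tapsPerBenchmark = 5 // hypothetical threshold
): MicrointeractionResult {
  const feedback: Feedback = { icon: "thumbsUp", message: `Tagged: ${skill}` };

  if (benchmarksRemaining > 0 && tapCount % tapsPerBenchmark === 0) {
    return { feedback, loop: "benchmarkTask" };
  }
  if (benchmarksRemaining === 0 && !hasExemplar) {
    return { feedback, loop: "exemplarUpload" };
  }
  return { feedback, loop: "none" };
}
```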

Overall, the aforementioned microinteraction design has a twofold intention. Its simplicity aims to enhance the usability and the user experience of the system. Moreover, the interaction design needs to facilitate the use in an authentic classroom environment and not interrupt teaching.

5 Preliminary Findings

We have sense-checked the pedagogical framework and the microinteraction-based, gamified design with a small sample of teachers and students. The response was positive, and our design was described by these teachers as “filling the gap” in the area of skills assessment. The students perceived that it would benefit them by raising their awareness of the skills. However, as the sample was small, a trial with a prototype software application is needed before conclusive findings can be presented.

To this end, to date we have started developing a tablet software application which incorporates the ideas of this paper. Trials with the demonstrator have been scheduled with both the K-12 and the Higher Education sectors, and we anticipate that the data collected from these will yield interesting results.

6 Conclusions

In conclusion, the original project objectives of creating a framework for the assessment of 21st Ce. skills that would be independent of any singular (formal or informal) activity, and which could be implemented in a software application, were met successfully.

Moreover, we have designed a microinteraction-based gamified framework that accompanies the pedagogical design, which has the potential to enhance the user experience and the usability of skills assessment without interrupting the in-classroom activities.

A forthcoming trial with a software tool that incorporates the aforementioned principles will validate this approach, its flexibility, and its scalability.