Abstract
Authentic assessment is widely recognized as a valuable method that reflects real-world contexts, allowing students to apply their knowledge to practical challenges and prepare for their futures. Despite the pervasive influence of digital technologies in modern work and life, their role in authentic assessment—an approach centered on real-world relevance—remains poorly understood. Key questions persist regarding how digital technologies are integrated into authentic assessment practices. This review examines the nature and extent of technology integration in authentic assessment within higher education literature. Through systematic searching, screening, analysis, and synthesis, we identified 52 relevant studies. Our findings reveal significant variation in technology use across the four steps of authentic assessment. While digital tools are commonly employed in assessment task design (Step 2), there is limited consideration of broader digital contexts (Step 1) or integration into evaluative judgments and feedback mechanisms (Steps 3 and 4). We also identify diverse approaches to incorporating technology within the design phase. These differences in technology use reflect varying conceptualizations of authentic assessment, influencing its design, implementation, and learning outcomes. To provide educators with practical guidance, we build on a widely adopted stepwise model by introducing a structured framework for integrating digital technologies into authentic assessment. Finally, we highlight areas for future research and practice that may enhance authentic assessment through technology.
Introduction
Authentic assessment represents a shift from standardized and objective testing in higher education, allowing assessment to contribute to learning and to better prepare students for their future lives and careers (Jopp, 2020). Early work on authentic assessment stemmed from a recognition of the limitations of testing decontextualized knowledge, arguing that assessment should help students solve problems beyond school settings and evidence their achievements (Riley & Stern, 1998). This approach essentially treats authenticity as correspondence to the real world (Splitter, 2009) and regards assessment tasks as opportunities for students to apply learned capabilities in real-world contexts.
Alternative interpretations of authentic assessment have been developed in the literature. Wald and Harland (2017), for example, argued that authenticity contains facets of the existential self and a degree of personal meaning. Learning and assessments become authentic not only because they involve real-world tasks but also because they enable students to be and become themselves, taking ownership of their learning and creating personal meaning during learning. McArthur (2023) further challenged the notion that authentic assessment serves primarily to mirror the real world. She argued for justifying the social value of authentic assessment: assessment tasks should not reinforce social structural barriers such as inequity. Instead, they should serve to advance and transform social practices.
The latest work by Ajjawi et al. (2024) proposes considerations of the degree of authenticity in assessment tasks rather than classifying a task as authentic or inauthentic. The authors identify psychological authenticity (assessment as enabling individual judgments), ontological fidelity (assessment as enabling the construction of a convincing narrative over knowledge and skills), and the necessary ambiguity and complexity associated with authentic assessment.
While there are differences in how authenticity is defined, as outlined above, authentic assessment can be broadly understood as an approach that not only equips students to tackle real-world work challenges by evaluating their application of pertinent knowledge, skills, and dispositions (Gulikers et al., 2004), but also integrates two additional core dimensions: personal authenticity and engagement with societal issues. Collectively, these dimensions establish authentic assessment as an inclusive and transformative framework for academic and professional development (Arnold & Croxford, 2025).
Given that digital technologies have shaped the real world of work, human capabilities, and arguably social structures, and that authentic assessment aims to mirror real-world contexts, it is perhaps unsurprising that digital technologies have been reported to increasingly mediate authentic assessment practices in higher education (Nieminen et al., 2023). A recent scoping review by Nieminen et al. (2023) broadly investigated the rationales behind incorporating digital technologies in authentic assessment: technologies have been used to improve assessment design and delivery, enhance students’ proficiency with digital tools, foster critical engagement with digital technologies, and prepare students for thoughtful, socially responsible participation in digital societies. Research has also examined how technology shapes authentic assessment (Lim et al., 2022) and addressed the practicalities of implementing e-authentic assessment (Raynault et al., 2022). The COVID-19 pandemic and the emergence of generative artificial intelligence have further catalyzed innovative online assessment practices, among which authentic assessment has been afforded a high profile (Advanced Study Institute Online Symposium, 2023). Despite these developments, the role of digital technologies in authentic assessment has not been clearly established in current research, which to some extent explains the apparent difficulties educators face when integrating technologies into authentic assessment (Jopp, 2020).
In this review, we aim to establish the nature and extent of technology integration reported in authentic assessment research. Specifically, we seek to address the overarching question: what is the role of technology in authentic assessment? We choose to conduct a systematic literature review to address this question, employing a structured and transparent method to identify, evaluate, and synthesize evidence. This approach provides a clear and comprehensive understanding of technology use in authentic assessment while offering valuable insights to guide future research and practice in integrating digital technology into authentic assessment (Munn et al., 2018).
Our conceptualization of digital technology aligns with Oliver’s (2013) perspective. While acknowledging that digital technologies possess distinct functionalities and facilitate specific actions, Oliver emphasizes their socially constructed nature. These technologies are not static; rather, they are deeply embedded in practice, shaped, interpreted, and adapted by users—in our context, higher education teachers. Aligning with such a broad conceptualization, we define digital technology as electronic tools, systems, devices, platforms, and resources that generate, process, and store data (Kumi-Yeboah et al., 2020). The technology enables users to access the internet, engage with online resources (Liu et al., 2025), and connect with the digital society in which they live (Bearman et al., 2023). It includes, but is not limited to, learning management systems, Microsoft Office applications, and social media sites.
Organizing Framework for Identifying Technology Use in Authentic Assessment
To articulate the nuances in technology use throughout authentic assessment, we drew on the widely referenced model for designing authentic assessment by Villarroel et al. (2018). Compared with dimension-based models for determining the authenticity of an assessment design (e.g., Ashford-Rowe et al., 2014; Gulikers et al., 2004; Osborne et al., 2013), this step-based model, which translates dimensions of authentic assessment into actionable steps, is most suitable for understanding the nature and extent of digital technology use in designing and implementing authentic assessment. It is particularly suited to frontline educators, who often struggle to dedicate adequate time and energy to transforming conceptual dimensions into practice given contextual constraints such as large class sizes and a culture heavily focused on testing (Brush & Saye, 2008). We further adapted the first step of the model from considering the workplace context to considering the broader context where learning is applied, so that the model caters to the varied interpretations of authentic assessment identified above. The adapted model (Figure 1) comprises four steps for authentic assessment design. The first step involves consideration of the broader context in which learning is applied. The second step involves designing assessment tasks. The third step concerns developing students’ evaluative judgment, “the capacity to judge the quality of work of self and others” (Tai et al., 2018, p. 472). The last step focuses on students gathering and responding to feedback, with the aim of developing an appreciation of feedback and making appropriate affective and behavioral responses to it (Carless & Boud, 2018).

Figure 1. Four-step model for authentic assessment adapted from Villarroel et al. (2018).
Method
The Search Process
The literature search followed the 2020 PRISMA guidelines (Page et al., 2021) and is recorded in Figure 2. We collaborated with an experienced higher education research librarian. After several pilot searches and expert consultation, we finalized the search string: “authentic assessment AND higher education OR HE OR college OR university* OR tertiary OR undergraduate” to examine the role of digital technologies in the various stages of authentic assessment. This approach was designed to capture the most relevant studies within the research team’s capacity. Given the multitude of digital technologies for teaching, learning, and assessment, we did not include any specific technology in the search string; instead, we identified technologies and determined whether a study included a detailed description of technology use during the manual screening process.

Figure 2. PRISMA flow diagram of the steps and results of article selection.
In December 2022, we applied the search string to the Scopus and Web of Science Core Collection databases in collaboration with the same librarian. Our choice of databases allowed us to capture the most relevant research studies, ensure the quality of included sources, and keep the review within a manageable scope. We limited the search to full-text articles written in English and published in peer-reviewed journals. As the early 2000s was when authentic assessment began to appear regularly in higher education research (McArthur, 2023), we further limited the search to dates between 2000 and 2022.
Database search results were entered into a reference manager, EndNote, to remove duplicates and then into Rayyan, a web-based systematic review tool, for title and abstract screening. The full-text screening was conducted by AH using the inclusion and exclusion criteria (Table 1). We also performed backward and forward searches by examining the references of articles undergoing full-text screening based on our eligibility criteria (see Table 1). Ambiguous cases were jointly reviewed and discussed by all authors.
Table 1. Inclusion and Exclusion Criteria.
We identified 314 articles from Scopus and 417 from Web of Science. We screened 601 articles based on title and abstract and 190 in full text. This led to 28 articles being included. We further obtained 15 articles from the backward and forward search, leading to 43 articles being included for review during our 2022 search. In November 2024, we updated the database search following the same procedure. This allowed us to identify nine additional articles. In total, 52 studies were included.
We used the Mixed Methods Appraisal Tool (Hong et al., 2018) to rate the quality of included papers as high, medium, or low (Appendix 1). The inter-rater reliability between two team members was 86%. The Mixed Methods Appraisal Tool (MMAT) offers a systematic approach to evaluating studies across various research designs—qualitative, quantitative, and mixed methods. It enables consistent and efficient review of mixed studies, ensuring that our conclusions and recommendations are trustworthy, valid, and reliable by assessing the overall quality of empirical research in this field. After quality appraisal, we extracted data from the included articles based on the research questions (Appendix 2).
The Data Analyzing Process
To analyze variations in technology use, we adopted the four-step model (Villarroel et al., 2018) as a guiding framework and employed the narrative synthesis method (Popay et al., 2006) to synthesize findings both within and across the four steps of technology use in authentic assessment. Narrative synthesis was chosen because the studies included diverse methodologies—quantitative, qualitative, and mixed-method designs—spanning intervention and non-intervention studies. This approach enabled us to construct a “trustworthy story” of the role of technology in each stage of authentic assessment by relying on textual evidence (Popay et al., 2006, p. 5).
We adhered to Popay et al.’s Guidance on the Conduct of Narrative Synthesis in Systematic Reviews (2006), which comprises four non-linear elements. The first element, developing a theory of how and why the intervention works, was omitted as our study included both intervention and non-intervention research. For the second element, developing a preliminary synthesis, we created a table to organize textual evidence, presenting how digital technology was used in authentic assessment and students’ learning experiences and outcomes. This table was iteratively refined and reviewed by all authors. The third element involved conducting textual analysis to examine the use of technology across the four steps of authentic assessment design and implementation. AH charted the data, and the research team collaboratively reviewed and refined the findings from the second and third elements. Lastly, the fourth element assessed the robustness of the synthesis by evaluating the strengths and limitations of the systematic search and narrative synthesis, ensuring the reliability of our conclusions.
Characteristics of Included Studies
The included articles covered various disciplines (Appendix 1), including medicine, nursing, business, STEM subjects, and teacher training. Studies from Australia (21/52), the UK (11/52), and the USA (5/52) dominated the sample. The studies also reported various research methods, including mixed-methods designs (30/52), qualitative designs (17/52), and quantitative designs (5/52). Empirical data sources included interviews, surveys, course evaluations, students’ assessment performance, student-generated assessment artifacts, course-related documents, and observational notes. Descriptive statistics and thematic analysis were the most frequently used data analysis methods.
Findings
This section uses the four-step model by Villarroel et al. (2018) as the organizing framework to present findings on the nature and extent of technology use in authentic assessment. Each sub-section describes the nature of the activities that occurred and the technologies used. Rather than exhaustively listing all studies associated with each step, we present a selection of representative articles to illustrate the findings. For transparency and reproducibility, Appendix 2 provides a comprehensive analysis of all included studies based on the four-step model.
Technology Use in Step 1: Considering the Broader Context (17/52)
Seventeen out of the 52 studies reported considerations of technology use beyond disciplinary learning. Among them, 11 studies explicitly considered technology use in future professional practice (e.g., de Beer et al., 2023; Dermo & Boyne, 2014). Such a consideration influenced educators’ technology choice in subsequent assessment design processes: the assessment task involved having students use technologies that have been adopted in corresponding professional work (e.g., Dawson et al., 2006). The other six studies did not address the role of technology in future work but instead emphasized preparing students for engagement in a digital society. In this context, discipline-specific knowledge and information are created, collaboratively constructed, disseminated, and shared through multimodal formats—such as videos and podcasts—or on social media platforms (Balderas et al., 2018; Nieminen et al., 2025).
Technology Use in Step 2: Designing Assessment Task (52/52)
This step concerns designing authentic assessment tasks or contexts for students to apply their learning (Villarroel et al., 2018). All studies reported integration of digital technologies, including, for example, simulation applications, video recording platforms, learning management systems (LMSs), e-portfolio systems, and social media platforms. Regardless of the kinds of technologies used, there were seven types of assessment tasks (see Figure 3), which we describe below in turn.

Figure 3. Variations of technology integration in Step 2.
Simulation (n = 14)
Simulation refers to instructional strategies that “replace or amplify real experiences with guided experiences that evoke or replicate substantial aspects of the real world in a fully interactive manner” (Gaba, 2007, p. 126). This corresponds closely to the view of authentic assessment as mirroring real-world practice and has therefore been used extensively as an authentic assessment task in the reviewed studies. Some studies developed roleplay tasks based on real-world scenarios or cases and used audio and video technologies, LMSs, and web applications to host materials related to the assessment task (Raymond et al., 2013). Other studies adopted computerized simulation solutions to mimic real-world situations, where students exercised problem-solving or decision-making skills in simulated environments (Way et al., 2021). One study, conducted by Gerard et al. (2024), incorporated insights from game-based learning, enhancing simulation-based assessment to provide a more engaging, motivating, and adaptive experience for students. Despite the differences in simulation fidelity between roleplay tasks and computerized simulations, both were reported to be associated with improved confidence, workplace readiness, knowledge application, skills development, and learning gains in the affective domains (Way et al., 2021).
Digital Production (n = 10)
This assessment engaged students in creating digital resources and artifacts to help them develop technology-related capabilities relevant to professional work and to connect their disciplinary learning with the needs of society. Students produced, for instance, online courseware in a teacher education program (Dawson et al., 2006), digital tours in a tourism program (Jopp, 2020), digital narratives in a physical education program (Sargent & Lynch, 2021), and podcasts that bridge disciplinary learning with societal issues (Wakefield et al., 2023). Generally, digital production projects moved students away from written assignments and provided an opportunity for creative self-expression through assessment tasks. In most cases (with one exception, Colbran et al., 2017), this provided additional motivation to engage with the assessment, developed technology-related skills (Dawson et al., 2006), and allowed for creativity and collaboration (Jopp, 2020), ultimately fostering a sense of ownership and connection to a professional community (Kohnke et al., 2021) or even to the broader civic community (Wakefield et al., 2023, 2024).
E-Portfolio (n = 10)
An e-portfolio is a digital collection of artifacts representing a learner’s achievements and development (L. H. Bryant & Chittum, 2013). In the reviewed studies, various digital platforms (e.g., LMSs or specialized e-portfolio systems) were used to host e-portfolios (Johnson-Leslie, 2009). In these studies, authenticity seemed to be operationalized as allowing students to document and exhibit their development, aligning more with ideas of psychological authenticity (Ajjawi et al., 2024) or personal meaningfulness (Wald & Harland, 2017). A range of positive learning outcomes associated with this assessment task has been reported. These included improvements in engagement in learning (Kavanagh & Raftery, 2017), collaborative skills (Bradley & Schofield, 2014), self-assessment and planning (Lewis & Gerbic, 2012), technological skills (ElSayary & Mohebi, 2025), and the development of professional identity among students (Baird et al., 2016).
Inquiry-Based Learning (n = 7)
In these projects, students were tasked to solve a problem over an extended period. The problems were based on real issues relevant to disciplinary learning or professional practice, including, for example, health promotion in medicine (Anderson et al., 2022), financial management in hospitality (Maniram, 2022), and animal behavior patterns in ecology (Wu et al., 2021). The problems allowed students to apply their learning in practice. Technologies primarily served to host project-related material, enable project management, and facilitate collaboration among project group members. A range of positive learning experiences and outcomes have been reported, including improved satisfaction and participation (Curtis et al., 2021), mastery of disciplinary and foundational knowledge (Wu et al., 2021), self-reflection and emotional intelligence (Maniram, 2022), creativity (Schonell & Macklin, 2019), and communication skills (Anderson et al., 2022).
Social Media Applications (n = 5)
Social media such as online wiki pages (Balderas et al., 2018), blogs (Durand, 2017) and tweets (Cacchione, 2015) have been reported as assessment tasks in which students share their understanding of and reflections on learning. These tasks were often collaborative, allowing students to learn from each other’s perspectives. This type of task aligns students’ experiences with the dynamics of the broader digital society, where knowledge and information are collaboratively created and shared in multimodal formats. It reframes students as active knowledge creators rather than passive learners (Nieminen et al., 2025). Additionally, it holds the potential to engage authentic audiences in evaluating the products developed by students, thereby enhancing the real-world relevance and impact of their work (Osborne et al., 2013). Positive outcomes such as student satisfaction (except for Balderas et al., 2018), development of digital literacy and sense of belonging to this digital community (Nieminen et al., 2025), improved collaborative creativity (Cacchione, 2015) and workplace readiness (Allen et al., 2014) have been reported variously across the studies in addition to gains in disciplinary learning (Durand, 2017).
Data Analysis and Reporting (n = 3)
This assessment task was implemented in study areas where quantitative analysis was essential. Students were given data analysis tasks typical in professional or disciplinary practice. In two studies, students used statistical analysis tools adopted in work practice (i.e., biomedical science and statistics; Dermo & Boyne, 2014; Morris et al., 2004). The other study engaged students in using digital technologies to creatively report financial data (van Rensburg et al., 2022). Interestingly, while all assessment tasks focused on quantitative analysis and reasoning skills, they reported positive learning experiences and learning outcomes in addition to this skills domain, including collaborative, reflective, and higher-order cognitive skills.
Presentation or Oral Exam (n = 3)
This assessment also drew on problems or cases relevant to the real world or the professional context. Students worked on the problems or cases, often collaboratively, and then presented their findings and analysis, which educators assessed. There were variations in assessment format: some studies opted for presentations (Carter et al., 2015), while others adopted interactive oral examinations in which students had opportunities to address follow-up questions, justify their analysis, and discuss it with the educators (Sotiriadou et al., 2020). Regardless of the format, this assessment placed a common emphasis on verbal communication skills. Technologies were used primarily to support, deliver, and organize the assessment. This assessment engaged most students, leading to satisfaction and active participation. Perhaps unsurprisingly, studies using this assessment reported improved communication and collaboration skills (e.g., conflict resolution), deeper understanding, and improved work-related competence (Carter et al., 2015). Additionally, two studies that adopted interactive oral exams suggested that this assessment turned the high-stakes end-of-semester assessment into a low-stakes one (Scott & Unsworth, 2018) and reduced opportunities for academic misconduct (Sotiriadou et al., 2020).
Technology Use in Step 3: Engaging Students in Judgment (16/52)
Villarroel et al. (2018) considered this step as engaging students with assessment criteria and performance standards to help them evaluate and judge their work. This engagement is primarily facilitated through the use of rubrics and exemplars. While 15 of the 52 studies reported using rubrics to assist students in evaluating and judging their own performance, only two reported the use of technology in this process, namely electronic rubrics to handle self-assessment (Cameron & Dickfos, 2014). The remaining 14 articles provided exemplary digital artifacts in digital production projects and e-portfolios to help students understand performance expectations and benchmark their performance (Panadero et al., 2018).
Technology Use in Step 4: Providing Feedback (21/52)
The majority of studies (15/21) used online platforms to manage the feedback process so that students could learn from feedback and improve their performance. In general, technology was used to enable ongoing feedback between peers and educators (Balderas et al., 2018), facilitate efficient feedback giving and receiving (Rausch et al., 2016), allow external stakeholders’ input into the evaluation of student work (de Beer et al., 2023), and replicate workplace-style iterative feedback (McLachlan & Tippett, 2024).
Discussion
Our review examines the extent and nature of technology integration in authentic assessment research. The 52 articles identified through a systematic literature search form the evidence base for understanding the role of digital technology in authentic assessment in higher education. Our analysis revealed that digital technology is consistently incorporated into authentic assessment task design at Step 2. Specifically, we identified seven distinct types of technology-integrated authentic assessment tasks. However, technology use in Steps 1, 3, and 4 is less frequent and more limited in scope. The following discussion explores these variations across all four steps, with a particular focus on Step 2, to identify gaps and implications for assessment practices.
Conceptualizations of Authenticity and Technology Integration
Authenticity in assessment has been conceptualized in various ways, with one prominent approach viewing authentic assessment as mirroring real-world tasks, problems, and experiences (Gulikers et al., 2004). Our review suggests that this perspective significantly influences how technology is integrated into authentic assessment research. All studies reported learning outcomes related to knowledge, skills, and capabilities essential for students’ future employment. Consequently, technologies were primarily used to develop work-related competencies (e.g., van Rensburg et al., 2022) or simulate technology-mediated professional practices (e.g., Kohnke et al., 2021). This operationalization of authenticity reduces assessment to a mechanism for evaluating students’ capital value in the labor market, measured by workplace-relevant skills and efficiencies (Nieminen & Carless, 2023). Technologies were introduced as tools to enhance assessment efficiency and, in some cases, to make the evaluation of these skills possible at all.
Some reviewed studies reported that authentic assessment contributed to professional identity development. However, this remains distinct from the broader notion of authenticity as self-expression and personal meaning-making in learning (Ajjawi et al., 2024; Wald & Harland, 2017). While professional identity is work-related, self-identity encompasses diverse roles, experiences, narratives, and beliefs. Our findings suggest that technology use in authentic assessment has largely focused on producing competent individuals rather than fostering broader personal development. Digital technologies can support human functioning (e.g., assistive technologies; Cowan et al., 2012), and their presence inevitably shapes self-perception (Belk, 2013). However, these considerations appear to be absent from the reviewed studies.
Similarly, while some researchers have reconsidered authentic assessment as having the potential to transform and improve social practices (McArthur, 2023), none of the reviewed studies used digital technologies to achieve such transformation. Six studies integrated disciplinary knowledge with broader civic engagement, helping students understand their field’s societal relevance by creating digital products or completing digital-based projects in disciplines such as community health (Anderson et al., 2022; de Beer et al., 2023), ecology (Wu et al., 2021), marketing (Spanjaard et al., 2023), and conservation biology (Wakefield et al., 2023, 2024). However, these initiatives did not explicitly employ technology to drive social change.
Variations in Technology Integration
Technology integration varied across different stages of authentic assessment design and implementation. About one-third of the reviewed studies (17/52) considered technology in the broader context of the assessment’s intended real-world application (Step 1), but the majority did not. This finding aligns with prior research (Nieminen et al., 2023), indicating that even when authentic assessment aims to mirror professional environments, these tasks do not always incorporate digital technologies. Baskerville et al. (2020) argued that contemporary reality is often digitally constructed before being reassembled in the physical world. If this is accurate, then current authentic assessment practices may risk inauthenticity by failing to reflect digital-first realities. Alternatively, leveraging digital technologies to recreate physical environments aligns with McArthur’s (2023) transformative approach to authentic assessment, contrasting with the prevailing use of technology to replicate real-world tasks.
All studies reported technology use in the seven types of assessment tasks (Step 2). While we could not establish a statistical correlation between task type, design intention, and assessment outcomes, all assessment tasks were associated with increased student satisfaction (e.g., Anderson et al., 2022; Curtis et al., 2021) and engagement (e.g., Sotiriadou et al., 2020; Way et al., 2021). Additionally, they supported disciplinary learning and understanding (Dermo & Boyne, 2014; Wu et al., 2021) and were linked to improvements in communication and collaboration skills (Cameron & Dickfos, 2014; Polly et al., 2018). However, five studies reported mixed student attitudes toward assessment tasks (e.g., Colbran et al., 2017). A possible explanation is that students did not perceive the relevance of these assessment tasks to their digital lives (Tong et al., 2020), which may have affected their goal-setting and engagement with learning activities (Cano & Berbén, 2009). This dissatisfaction warrants further empirical investigation.
Importantly, conceptualizations of authentic assessment influenced assessment task design. While different task types afford distinct learning activities and outcomes, some tasks (e.g., simulation) closely aligned with the idea of authenticity as workplace preparation. Others, such as e-portfolios, emphasized personal and professional identity development, while social media and digital production tasks fostered connections with communities beyond the university. These findings suggest that how authentic assessment is conceptualized affects both the choice of technology and the nature of digitally mediated activities within assessment tasks.
Regarding technology use for evaluative judgment (Step 3) and feedback (Step 4), results indicated limited adoption. When technology was employed, it primarily served to streamline assessment processes (Sabin et al., 2013) or manage multiple feedback sources (Polly et al., 2018). Only one study (McLachlan & Tippett, 2024) explored how students and educators used technology to enhance evaluative judgment and feedback quality through the Padlet platform. This interactive, industry-relevant tool replicated professional feedback dynamics, fostering “work-in-progress” dialogs to develop creative collaboration competencies. However, most other use cases showed little distinction from general technology-enhanced assessment practices, and no unique patterns of technology integration were identified in these steps.
Practical Implications
Building on our findings, we extend Villarroel et al.’s (2018) stepwise model (see Figure 1) by introducing a structured framework (see Figure 4) to guide educators in integrating digital technologies into authentic assessment. This framework represents an initial attempt to provide structured guidance, and we encourage educators to apply, test, and refine it across different educational settings.

Figure 4. Four-step model for technology integration in authentic assessment (adapted from Villarroel et al., 2018).
Our review highlights a critical shift in perspective: authentic assessment should not only enhance employability but also engage students as active participants in a digital society. Digital technology is primarily used to simulate professional environments or support student performance and reflection. However, even when tasks involve creating digital artifacts, few explicitly target digital skill development (see Appendix 2), leaving a gap in preparing students for the complexities of a technology-driven world.
A broader conceptualization of authenticity is essential—one that includes “authenticity to self” (Ajjawi et al., 2024), reflects the digital world (Nieminen et al., 2023), and fosters transformative social practices (McArthur, 2023). Educators should view students not just as future professionals but as digital citizens who actively shape and adopt technology (Hu et al., 2025; Liu et al., 2023). This approach promotes engagement through real-world simulations, cultivates essential digital skills (van Laar et al., 2017), encourages students as active knowledge creators (Nieminen et al., 2025), and enhances awareness of technology’s broader impact beyond the authentic assessment tasks in which they have participated (Osborne et al., 2013).
Rather than listing the seven types of technology-mediated authentic assessment tasks, we adopt a role-centered perspective for two reasons. First, designing assessments around students’ roles in the digital world (Step 1) and how technology supports these roles (Step 2) encourages creative and contextually relevant integration. Second, given the rapid evolution of digital technologies, rigid classifications risk obsolescence. A flexible, role-centered approach provides a more sustainable framework for integrating technology into authentic assessment.

For evaluative judgment (Step 3) and feedback (Step 4), most studies provided descriptions, but fewer than half incorporated digital technology. This aligns with Villarroel et al. (2018), who emphasize authentic assessment’s role in developing students’ capabilities within the broader framework of Assessment for Learning. Technology-mediated feedback plays a key role in engaging students and supporting educators’ design intentions. Our review highlights various ways practitioners used digital technology for feedback, including timely feedback (e.g., Watmough et al., 2016), personalized feedback (e.g., Hwang & Chang, 2024), and multi-source feedback (e.g., de Beer et al., 2023). These are commendable feedback practices in authentic assessment, aligning with the technology-enhanced assessment literature, which often focuses on augmenting and enhancing existing offline practices (Liu et al., 2025). Notably, McLachlan and Tippett (2024) demonstrated an exemplary practice for embedding authenticity into feedback mechanisms, integrating an industry-relevant interactive tool to mirror workplace feedback practices.
Implications for Future Research
Our review found diverse research designs and data sources, with most studies rated as high (n = 28) or medium (n = 20) in quality (see Appendix 1). Despite this, five key areas require further investigation, each with both research significance and practical implications.
First, most studies were conducted in applied disciplines or professional education programs, where authenticity is often assumed to mirror workplace demands. This assumption may not hold in non-applied fields, yet there has been little research on disciplinary influences on authentic assessment and technology use. Comparative studies across disciplines could provide valuable insights into how authenticity and technology integration vary.
Second, in all but one study, teachers selected the technologies used in authentic assessment. This raises concerns about the absence of student choice, described by Lim et al. (2023) as a lack of “freedom-of-choice” (p. 9043). Only Osborne et al. (2013) developed a process allowing students to choose technologies that they found appropriate for assessment tasks. Recognizing students’ agency in technology integration aligns with authenticity as a personally meaningful process (P. Bryant, 2023). It places students at the center of authentic assessment and acknowledges that technology use in reality is often self-initiated rather than imposed. Future research could explore the feasibility of student-driven technology choices in authentic assessment.
Third, there is limited exploration of students’ encounters with technologies in developing evaluative judgment and engaging with feedback. Understanding the relationship between digital technologies, external evaluations, assessment performance, and self-reflection is crucial to students’ self-authenticity: achieving goals, adapting to their environment, and receiving recognition (Schmader & Sedikides, 2018). Most reviewed studies treated technology as independent of evaluative judgment, which contrasts with emerging research showing that students increasingly use generative AI as a collaborative learning partner (Šedlbauer et al., 2024) and a tool to develop evaluative judgment (Bearman et al., 2024).
Fourth, while technology-mediated authentic assessment has been shown to enhance students’ satisfaction, engagement, content knowledge, workplace readiness, and professional identity development, evidence is primarily based on short-term impact indicators. Since authentic assessment aims to prepare students for future professional and societal roles (McArthur, 2023), longitudinal studies are needed to assess its long-term impact and track student learning trajectories over time.
Finally, Bearman et al. (2023, p. 296) described “a romanticised view of technology-mediated assessments,” where research often portrays innovative assessments as inherently beneficial. This trend was evident in our review and aligns with broader literature on educational technology. Few studies reported challenges, negative experiences, or unintended consequences of technology-mediated authentic assessment. Future research should adopt a post-digital perspective that critically examines the complexities of technology integration in assessment. This approach would provide a more nuanced understanding of how digital activities intersect with institutional and societal contexts (Fawns et al., 2023), allowing educators to design more effective and contextually relevant authentic assessments.
Limitations
The review focused on peer-reviewed journal articles published in English and sourced from two databases, potentially excluding studies in other languages, publication outlets, or databases. However, we are confident that our selected databases, along with additional articles identified through backward and forward searches and an updated search in 2024, captured the most representative and up-to-date literature. Search results are also inherently sensitive to the keywords used. Our search string did not include terms such as “competency-based assessment,” “alternative assessment,” or “performance assessment,” which share similarities with authentic assessment (see Gulikers et al., 2004; Palm, 2008). However, these terms have distinct conceptual foundations and merit separate literature reviews. Overall, our search strategy was appropriate for identifying authentic assessment literature: it was formulated in consultation with an experienced librarian and was consistent with a relevant prior review (Nieminen et al., 2023). Additionally, our analysis followed a stepwise authentic assessment framework. In practice, assessment design is often non-linear, and effective technology use requires an understanding of authenticity and resource implications (e.g., time, staffing, and funding). Future research should explore how these factors influence authentic assessment implementation and the role of technology in diverse educational contexts.
Conclusion
Our review established that while technologies have been integrated into authentic assessment, the nature of that integration is likely influenced by how authentic assessment is conceptualized. Currently, much of the technology use serves to allow the assessment to mimic the world of work, developing work-related capabilities, skills, and identities. There was little exploration of using technologies in assessment to foster students’ holistic development as individuals or as active members of the digitally mediated world, or to drive transformative changes in social realities rather than perpetuating existing structures. There were also considerable variations in technology integration during authentic assessment design and implementation. Technologies were mainly introduced to support defined assessment activities, but less so when educators considered the relevance of assessment to broader social contexts, the development of student evaluative judgment, and the way feedback is used to facilitate learning.
To help educators reap the benefits of authentic assessment through technology integration, we recommend clear consideration of what constitutes authentic assessment, how we position our students, the role of technologies at different stages of assessment design and implementation, and the digital society to which students belong. We further developed a framework to provide structured guidance on integrating digital technology into authentic assessment. We also highlight the need to explore disciplinary influences on technology-mediated authentic assessment, involve students in technology integration, use technologies for evaluative judgment and feedback negotiation, and evaluate the successes as well as the “failures” of technology-mediated authentic assessment.
Appendix
Technology Use in the Process of Authentic Assessment Design Reported by Included Studies.
| Author(s) & year | Consider broader context | Design authentic task | Judgement | Feedback | Learning experiences and outcomes |
|---|---|---|---|---|---|
| Cameron and Dickfos (2014) | n.a. | Simulations—roleplay: Videoed 3-min elevator pitch (Tech: Video cameras, LMS, a custom-built web application) | Tech: “Good” and “bad” elevator pitches; an automatically generated electronic rubric | n.a. | Communication skills; Communication self-efficacy; Workplace readiness. |
| Gemmell et al. (2010) | n.a. | Simulations—computerized simulation: Real-time hazard management simulation (Tech: An SMS messaging service, computer-based simulations, Turnitin UK) | n.a. | n.a. | No improved student academic performance; Student mixed attitudes. |
| Gerard et al. (2024) | n.a. | Simulations—computerized simulation: A gamified virtual simulation (GVS) designed in a 3D environment, which enables students to investigate a series of murder cases | Tech: Providing a video playthrough of the simulation | n.a. | Application of knowledge; Student engagement; Increased student independence and autonomy; Soft skill cultivation, such as resilience, adaptability, and strategic thinking; Discipline-related skill development. |
| Ghosh et al. (2020) | n.a. | Simulations—roleplay: Students’ efficient operations on board a merchant ship when emergencies happen (Tech: Simulated waves, strong winds, darkness, rain, smoke and sounding of the emergency alarms) | Non-tech: A rubric | n.a. | Improved academic performance. |
| Hwang and Chang (2024) | n.a. | Simulations—computerized simulations: A Reflective Cycle-Based Virtual Reality (RC-VR) approach in professional healthcare training, specifically in the context of maternal labor risk assessment | Tech: Self-assessing and conducting reflection based on learning records and interaction results stored in the RC-VR system | Providing feedback based on learning records and interaction results stored in the RC-VR system | Improved learning achievement; Enhanced sense of presence; Increased critical thinking awareness; Enhanced problem-solving awareness; Higher engagement in reflective learning. |
| Koretsky et al. (2022) | n.a. | Simulations—computerized simulations: Solving industrially situated engineering problems (Tech: A computerized simulation, MATLAB or Excel) | Non-tech: A rubric | n.a. | Workplace readiness; Student mixed attitudes. |
| Marriott (2007) | n.a. | Simulations—computerized simulations: Determining appropriate management plans according to the database of “virtual patients” (Tech: A self-designed computer program) | Non-tech: A framework of specific criteria | n.a. | Application of knowledge; Low opportunities for misconduct. |
| Rausch et al. (2016) | Considering how technologies have been used in the professional world | Simulations—computerized simulations: A computer-based office simulation (Tech: A computer-based office simulation, an online summative system) | n.a. | Rating Suite for rating | Unknown |
| Raymond et al. (2013) | n.a. | Simulations—roleplays: Students’ practices in female catheterization (Tech: The audio-visual equipment, video recording devices) | Non-tech: A set of marking criteria | n.a. | Enhanced confidence; Application of knowledge; Student satisfaction. |
| Sabin et al. (2013) | n.a. | Simulations—computerized simulations: Calculations of injection dosage (Tech: Authentic world online learning services) | Non-tech: A hierarchical diagnostic assessment framework | Giving individual feedback and overall performance via software | Unknown |
| Sherrett et al. (2013) | Considering how technologies have been used in the professional world | Simulations—computerized simulations: An authentic, industrially situated process development task for chemical engineering students (Tech: Industrially situated, computer-enabled learning environments) | n.a. | n.a. | Workplace readiness |
| Sundler et al. (2015) | n.a. | Simulations—roleplay: Videotaped students’ performance when they took care of “patients” generated by high-fidelity patient simulators (Tech: A clinical simulation laboratory, video-recording devices) | Non-tech: An objective structured clinical examination protocol | n.a. | Enhanced motivation; Workplace readiness. |
| Watmough et al. (2016) | n.a. | Simulations—roleplay: Videoed student performance under simulations (Tech: Video recording devices) | n.a. | A formal assessment tool called the Objective Simulation Assessment Tool (OSAT) | Improved learning experience; Workplace readiness; Developing emotional intelligence; Enhanced communication. |
| Way et al. (2021) | n.a. | Simulations—computerized simulations: Videoed presentations responding to a simulated workplace incident (Tech: Moodle, Moodle discussion forum, Moodle quiz, Kaltura, Turnitin) | n.a. | n.a. | Enhanced confidence; Student engagement; Developing emotional intelligence. |
| Colbran et al. (2017) | n.a. | Digital products—digital flashcards (Tech: Moodle, iTunes U, FlashCram) | Tech: Examples of digital flashcards | n.a. | Mixed attitudes (low perceived relevance to professional practice and to exam preparation) |
| Dawson et al. (2006) | Considering how technologies have been used in the professional world | Digital products—developing an ICT resource package for teaching (Tech: ICT resources currently used in secondary school science) | Tech: A web site which contained samples and copyright-free templates for preparing web quests | n.a. | Perceived and actual better ICT skills |
| Fulton et al. (2021) | Considering how technologies have been used in the professional world | Digital products—publishing course works on media platforms (Tech: Unknown) | n.a. | n.a. | Student satisfaction; Workplace readiness. |
| Jopp (2020) | n.a. | Digital products—designing digital walking tours (Tech: Google Maps, a computer lab “Digital Aquarium,” Blackboard) | Non-tech: A detailed rubric and instructions outlining the specific marking criteria | A computer lab and a discussion board to get technical support and guidance in the creation of walking tours | Student satisfaction; Student engagement; Enhanced communication; Creativity. |
| Kohnke et al. (2021) | Considering how technologies have been used in the professional world | Digital products—creating infographics (Tech: Unknown) | Tech: A variety of sample infographics | n.a. | Enhanced communication; Enhanced motivation; Professional identity formation; A sense of belonging in their desired imagined professional community. |
| McLachlan and Tippett (2024) | Considering how technologies have been used in the professional world | Digital products—developing a complete digital media artifact, such as a video, poster design, social media campaign, or website (Tech: Unknown) | Tech: Exemplars of “work-in-progress” feedback and creative collaboration presented on Padlet’s digital whiteboard | Padlet enabled asynchronous yet ongoing dialogue between students and instructors, replicating workplace-style iterative feedback, and allowed students to see how their projects evolved over time | Developing creative collaboration capabilities; Enhancing design thinking skills; Building confidence in creative practices; Improving content knowledge and technical skills; Workplace readiness. |
| Sargent and Lynch (2021) | n.a. | Digital products—video narratives (Tech: A discussion board, video devices) | n.a. | Teacher’s comments on discussion board | Ownership of the learning process; Enhanced reflective and self-assessing skills; Workplace readiness; Being less stressful. |
| Spanjaard et al. (2023) | Considering the outside digital world | Digital products—short movies (5–6 min) focusing on the social impact of marketing (Tech: iMovie, Canva, or Adobe Spark) | Tech: A teacher-created digital story | Peer’s and teacher’s feedback on the class online platform | Student engagement; Application of knowledge; Creativity and critical thinking; Development of digital literacy; Student satisfaction; Improved teamwork and communication ability; Improved academic performance; Being less stressful. |
| Wakefield et al. (2023) | Considering the outside digital world | Digital products—podcasts: creating a podcast in pairs on topics related to conservation biology (Tech: Audio recorders, mobile phones, laptops, and Audacity® software) | Tech: Exemplar podcasts; A partially filled digital marking rubric | n.a. | Improved teamwork and communication ability; Subject knowledge; Development of understanding interdisciplinary nature; Foster critical thinking and creative problem-solving. |
| Wakefield et al. (2024) | Considering the outside digital world | Digital products—team-based 5-min educational videos on pre-selected scientific topics (Tech: Camtasia software) | Tech: Exemplars of student-created films | n.a. | Development of digital literacy; Subject knowledge; Improved teamwork and communication ability; Foster critical thinking and creative problem-solving; Increased awareness of societal issues. |
| Baird et al. (2016) | n.a. | E-portfolios—professional development (Tech: Google Sites platforms) | Non-tech: Quality indicators of assessment items | n.a. | Workplace readiness; Professional identity formation; Professional value formation. |
| Bradley and Schofield (2014) | n.a. | E-portfolios—professional development (Tech: LMS) | Tech: An example of an e-portfolio | Teacher feedback via the e-portfolio comments tab | A sense of responsibility for professional development; Reflective and self-assessing skills. |
| ElSayary and Mohebi (2025) | Considering how technologies have been used in the professional world | E-portfolios—professional development (Tech: Unknown) | Tech: Instructors’ demonstration of the effective use of various applications; Peers’ e-portfolios; Applications with built-in templates and interactive elements | Video feedback; Collaborative tools and platforms (Tech: Google Apps, iCloud Applications, Padlet) | Workplace readiness; Socio-emotional knowledge; Technological knowledge; Metacognitive knowledge; Practical teaching skills. |
| Hastie and Sinelnikov (2007) | n.a. | E-portfolios—learning development (Tech: Unknown) | n.a. | n.a. | Student satisfaction; Collaborative skills; New ways of learning. |
| Herrington et al. (2014) | n.a. | E-portfolios—professional development; Social media—blogs (Tech: Unknown) | n.a. | Enhancing communication via chats, peer review and discussion forums | Student satisfaction; Ownership of the products; Professional value formation; Professional identity formation. |
| Johnson-Leslie (2009) | n.a. | E-portfolios—professional development (Tech: College LiveText, HyperStudio) | Non-tech: The clear outline of expectations regarding the e-portfolio assessment | n.a. | Making students’ learning experiences and development visible; Application of knowledge. |
| Kavanagh and Raftery (2017) | n.a. | E-portfolios—learning development (Tech: Unknown) | n.a. | n.a. | Student satisfaction; Creativity; Student engagement. |
| Lewis and Gerbic (2012) | n.a. | E-portfolios—professional development; Digital products—e-posters (Tech: Unknown) | n.a. | n.a. | Deep learning; Making students’ professional growth visible; Workplace readiness; Professional identity formation. |
| Polly et al. (2018) | n.a. | E-portfolios—professional development; Social media—blogs (Tech: Workshop UNSW in Moodle, WordPress) | n.a. | The Moodle Workshop (UNSW) tool developed for teacher, student self and/or peer review | Collaborative skills |
| Sidebotham et al. (2018) | n.a. | E-portfolios—professional development (Tech: Unknown) | Non-tech: The National Midwifery Competency standards | n.a. | A sense of responsibility for learning; Enhanced confidence; Professional identity formation; Workplace readiness. |
| Anderson et al. (2022) | n.a. | Project/inquiry-based learning—developing health promotion programs in local communities (Tech: LMS, MS Teams and Zoom) | n.a. | Zoom and MS Teams | Student satisfaction; Workplace readiness; Student engagement; Deep learning; Enhanced communication. |
| Curtis et al. (2021) | n.a. | Project/inquiry-based learning—live video presentations/screencasts/animations based on authentic cases (Tech: Unknown) | n.a. | n.a. | Student satisfaction; Student engagement; Workplace readiness; Deep learning. |
| de Beer et al. (2023) | Considering how technologies have been used in the professional world | Project/inquiry-based learning—a hybrid serious game, which supports and facilitates community health projects (Tech: A hybrid serious game called Carion, an online collaboration community called WijkLink Oost, LMS) | n.a. | An online collaboration community called “WijkLink Oost” | Positive self-perceived outcomes: collaborative problem-solving (CPS) skills, application of knowledge, increased student autonomy; No significant improvement in CPS skills across the broader cohort shown by quantitative results. |
| Maniram (2022) | n.a. | Project/inquiry-based learning—creating an online journal to reflect on the process of doing a hospitality financial management project (Tech: Unknown) | n.a. | n.a. | Enhanced self-efficacy and confidence; Developing emotional intelligence; Reflective skills. |
| Osborne et al. (2013) | n.a. | Project/inquiry-based learning—creating patients’ information leaflets (Tech: Students’ self-chosen project management tools, such as Do, Trello, and Team-Match) | n.a. | n.a. | Collaborative skills; Time management skills. |
| Schonell and Macklin (2019) | n.a. | Project/inquiry-based learning—company case studies (Tech: LMS, video-recording devices) | n.a. | Video debates and discussions | Application of knowledge; Student mixed attitudes (unpredictable developments and challenges); Creativity. |
| Wu et al. (2021) | n.a. | Project/inquiry-based learning—a web-based ecological inquiry project studying patterns in bear behaviour and spatial distribution (Tech: Archived BearCam photos, LMS) | Non-tech: A rubric for evaluating inquiry reports | n.a. | Developing science literacy and STEM skills; Engagement in civic life. |
| Allen et al. (2014) | n.a. | Social media—online blogs (Tech: Unknown) | n.a. | Discussion forums | Deep learning; Student satisfaction; Workplace readiness. |
| Balderas et al. (2018) | Considering the outside digital world | Social media—MediaWiki pages (Tech: Wiki) | Non-tech: A rubric | AMW: an open-source web application for qualitatively assessing wiki contributions | Unknown |
| Cacchione (2015) | Considering the outside digital world | Social media—tweets | n.a. | Teacher’s comments on students’ tweets | Improved language proficiency; Student satisfaction; Collaborative creativity. |
| Durand (2017) | n.a. | Social media—online blogs (Tech: Blog-creating platforms—WordPress, Wix, and Jimdo) | n.a. | A web-based reading and analysis tool for organising digital texts—Voyant Tools | Making students’ learning experiences visible; Socio-cultural cognitive awareness; Evolved worldviews; Intercultural knowledge and skills. |
| Nieminen et al. (2025) | Considering the outside digital world | Social media—editing Wikipedia articles about archaeological sites structured by students, sharing their findings with the public (Tech: 3D modeling digital tools, Virtual Reality tour, Wikipedia) | Tech: Exemplars of reflective journals, 3D modeling and Wikipedia articles | Group discussions and direct feedback from the instructor on digital tools like Zoom and Perusall | Application of knowledge; Critical thinking and interpretation; Development of digital literacy; Professional identity formation; Fostering students’ sense of agency in using, evaluating, and producing knowledge. |
| Carter et al. (2015) | n.a. | Presentations/Oral exams—live presentations of the root cause analysis of authentic cases (Tech: An online conferencing tool) | n.a. | n.a. | Professional identity formation; Collaborative skills. |
| Scott and Unsworth (2018) | n.a. | Presentations/Oral exams—digital vivas (Tech: A cloud-based file-sharing application called Filemail) | Tech: Rubric functions in the Virtual Learning Environment | Providing feedback and results via Turnitin | Being less stressful; Student mixed attitudes (relevance to professional practice). |
| Sotiriadou et al. (2020) | n.a. | Presentations/Oral exams—interactive oral exams (Tech: A booking system for the interactive oral times, a system for interactive oral exams) | Tech: An exemplar client interview to students to enable their preparation | n.a. | Student engagement; Low opportunities for misconduct; Professional identity formation; Communication skills; Workplace readiness. |
| Dermo and Boyne (2014) | Considering how technologies have been used in the professional world | Data analysis/reporting—genetic case studies (Tech: LMS, a DNA sequencing software which has been used in the industry, a system for online high-stake tests) | n.a. | n.a. | Discipline-relevant skills; Workplace readiness. |
| Morris et al. (2004) | Considering how technologies have been used in the professional world | Data analysis/reporting—running experiments of statistics (Tech: A statistical software package) | Non-tech: Marking criteria | n.a. | A shift to higher-order learning; Collaborative skills; Low opportunities for misconduct; Reflective skills; Improved learning experience. |
| van Rensburg et al. (2022) | Considering how technologies have been used in the professional world | Data analysis/reporting—creating digital financial reports in forms of presentations, videos, websites, smartphone applications, or social media platforms (Tech: Unknown) | Tech: Creative financial reports of current real-world companies | n.a. | Developing digital creativity |
Acknowledgements
We are deeply grateful to the anonymous reviewers for their invaluable feedback and insightful suggestions, which have significantly enhanced the quality of this manuscript.
Ethical Considerations and Consent to Participate
There are no human participants in this article and informed consent is not required.
Author Contributions
All authors contributed to conceptualization of the manuscript. AH extracted, analyzed, and charted the data, and QL validated the result. AH and QL conducted quality appraisal. AH and QL drafted the manuscript. All authors contributed to editing and revision, and approved the manuscript.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Data Availability Statement
All data generated or analyzed during this study are included in this published article.
