Abstract
Objective
Practical courses in medical microbiology aim to translate theory into hands-on skills, but students often struggle to link laboratory techniques with diagnostic procedures. Formative testing and self-assessment enhance knowledge retention and metacognitive awareness. This study integrated quiz-based self-assessment using an audience response system (OnlineTED) into a microbiological practical course to evaluate feasibility and to explore whether it is associated with differences in students' perceived learning outcomes, satisfaction, and engagement.
Methods
Medical students (n = 125) in their first clinical year participated in a diagnostic practical at Saarland University in 2023/24. Two cohorts were compared: an intervention group using OnlineTED-based quizzes during each course day and a control group without quizzes. Standardized evaluations assessed satisfaction, clarity of objectives, perceived workload, and learning outcomes in both cohorts.
Results
Overall evaluations were very high in both groups. No statistically significant differences were observed between groups, likely reflecting a ceiling effect. Descriptive differences in the distribution of top ratings favored the quiz cohort: more students awarded the top rating for clarity of learning objectives (57.9% vs 28.6%), relevance for the final exam (63.2% vs 38.1%), and learning outcome (68.4% vs 42.9%). In the quiz cohort, aggregated quiz accuracy increased from 63.6% on day 1 to 88.8% on day 6. Mean examination scores were descriptively higher in the quiz cohort (7.10 vs 6.65 points), and the exam failure rate was lower in the quiz group (11.29% vs 15.87%). Day-specific evaluations indicated variation in perceived success and sufficiency of time, highlighting sessions requiring adjustment.
Conclusion
Integrating interactive quizzes was feasible and well accepted, may have enhanced students' perceived attainment of learning goals, and provided valuable feedback for supervisors without increasing workload or reducing satisfaction. As structured evaluations, quizzes can guide targeted curriculum adjustments and optimization.
Introduction
Practical courses are essential for developing laboratory skills in medical microbiology. However, students often struggle to link techniques with diagnostic decision-making when feedback is delayed, which limits timely reflection and metacognitive engagement. Active self-assessment strengthens learning through the “testing effect”: retrieving information via quizzes fosters more durable memory than passive review. Studies in medical and undergraduate education confirm that repeated low-stakes quizzes enhance retention and motivation compared to high-stakes tests.1,2 Self-assessment complements retrieval practice by fostering learner autonomy.3 When combined with supervisor feedback, structured reflection improves performance and self-regulation. Learning analytics also show that frequent self-testing correlates with higher exam performance and more consistent preparation.4 Based on these principles, we implemented interactive quizzes (based on the educational online voting system OnlineTED) during a microbiology practical course to (I) encourage formative self-assessment, (II) explore potential effects on perceived learning and satisfaction, and (III) provide supervisors with real-time feedback on teaching effectiveness.
Materials and Methods
Study design and participants
This study evaluated the feasibility of quiz-based formative self-assessment delivered through an audience response system and explored its association with student perceptions and learning outcomes in a medical microbiology practical course.
Medical students (n = 125) in their first clinical year were included in the study as part of a medical microbiology diagnostic practical course at Saarland University, Germany, in 2023/2024. The course comprised six course days of 3 h each between December 5th and January 25th, followed by a final examination. Two cohorts were compared (the sessions for both groups were held on separate days, twice a week):
Quiz group (intervention): 62 attending students were given the chance to participate in short, voluntary, anonymous quizzes during each course day (mean voluntary participation 82.26%). No-quiz group (control): 63 students attended the same sessions without quizzes and completed only a voluntary end-of-course evaluation.
The study was designed as an exploratory cohort comparison and was not powered for small effect sizes. For individual quiz participation rates, refer to Table 1.
Table 1. Quiz Participant Numbers: Individual Participation of Students Among the Quiz Group (Total Numbers and Percentages) per Course Day and Diagnostic/Evaluation Question.
Implementation
Each course day began with a short lecture on pathogens, diagnostic principles, and therapy, followed by laboratory exercises based on clinical cases. Students’ participation was often hindered by a reluctance to respond to supervisors’ questions publicly. The quiz group completed a short interactive quiz (OnlineTED, onlineted.de, via personal mobile devices) of two questions covering diagnostic content and two further questions covering reflective items on workload and comprehension. Participation was anonymous and voluntary. Participant answers and participation rates per course day and question were monitored and evaluated via OnlineTED. Supervisors received feedback to adjust teaching, while individual results remained confidential. The control group completed only the standard end-of-course evaluation. Both groups completed standardized evaluations containing Likert-type items (1-6) assessing satisfaction, clarity of objectives, and learning outcomes, and a final examination. Additional quiz-specific data (daily quiz accuracy, perceived time sufficiency, methodological success) were collected electronically in the intervention group. The reporting of this study conforms to the Defined Criteria to Report Innovations in Education (DoCTRINE) 5 [see also Supplemental Material].
Statistical analysis
Group comparisons were performed using the Mann–Whitney U-test for ordinal outcomes and Fisher's exact test for categorical outcomes (significance level α = 0.05). Analyses were conducted using GraphPad Prism 10.5.0.
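For readers who wish to reproduce this kind of analysis outside GraphPad Prism, the two tests can be run with SciPy. The following is a minimal sketch; all ratings and counts below are hypothetical placeholders, not the study's data.

```python
# Sketch of the two group comparisons described above, using SciPy.
# All numbers here are illustrative, NOT the actual study data.
from scipy.stats import mannwhitneyu, fisher_exact

# Hypothetical Likert-type ratings (1-6), e.g. overall satisfaction
quiz_group    = [5, 6, 5, 6, 6, 5, 4, 6, 5, 5]
no_quiz_group = [5, 5, 6, 4, 5, 6, 5, 5, 4, 5]

# Two-sided Mann-Whitney U test for ordinal (Likert-type) outcomes
u_stat, p_ordinal = mannwhitneyu(quiz_group, no_quiz_group,
                                 alternative="two-sided")

# Fisher's exact test for a categorical outcome (e.g. exam pass/fail):
# rows = groups, columns = [passed, failed]; counts are illustrative
table = [[55, 7],    # quiz group
         [53, 10]]   # no-quiz group
odds_ratio, p_categorical = fisher_exact(table)

alpha = 0.05
print(f"Mann-Whitney U: U={u_stat:.1f}, P={p_ordinal:.3f}, "
      f"significant={p_ordinal < alpha}")
print(f"Fisher exact:   OR={odds_ratio:.2f}, P={p_categorical:.3f}, "
      f"significant={p_categorical < alpha}")
```

The Mann–Whitney U test is appropriate here because Likert-type ratings are ordinal rather than interval-scaled, and Fisher's exact test handles the small cell counts that a chi-squared test would struggle with.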
Exclusion criteria
One student left the practical course for personal scheduling reasons and was not considered further. Two additional students did not participate in the final examination for personal organizational reasons. Due to the anonymous character of the study, their evaluation and quiz data could not be identified and were therefore not excluded.
Results
Across both groups, course evaluations were highly favorable, with median ratings ranging from 5 to 6 across all domains. Overall satisfaction did not differ between the quiz and no-quiz groups (median = 5.0 in both groups; P = 0.95), indicating broad acceptance of the course irrespective of the intervention. Similarly, ratings for teaching materials, supervisor clarity, and course organization were uniformly high (median = 5-6).
Perceived course demands were moderate in both cohorts (median = 3.0), suggesting an appropriate level of challenge. The inclusion of quizzes was generally accepted among participants of the quiz group (82.26% mean participation). The following group comparisons showed no significant differences; however, descriptive trends suggested that for clarity of learning objectives, a higher proportion of participants in the quiz group assigned the top rating compared with the no-quiz group (57.9% vs 28.6%; P = 0.11). A similar pattern was observed for perceived relevance to the examination, indicated by participants awarding the highest score (63.2% for the quiz group vs 38.1% for the no-quiz group; P = 0.20). Although these differences did not reach statistical significance, they indicate a descriptive shift in the distribution of ratings toward the highest category in the quiz cohort (Figure 1A).

Figure 1. (A) Evaluation of participants giving the best grade in the quiz versus the no-quiz cohort in terms of practical course recommendation, learning outcome, encouragement for further studies, interdisciplinary relevance, examination relevance, and clarity. (B) Aggregated quiz accuracy per course day (anonymous participation). (C) Perceived time sufficiency during the practical course by students in the TED cohort. (D) Results of final examination, achieved score points, and exam failure rate.
Perceived learning outcomes showed a descriptive trend toward higher ratings in the quiz group (median = 6.0) than in the no-quiz group (median = 5.0), with more participants awarding the top rating (68.4% vs 42.9%; P = 0.13). Course recommendation followed a comparable descriptive trend (63.2% for the quiz group vs 47.4% for the no-quiz group; P = 0.29), which might indicate a high level of endorsement (Figure 1A).
Within the quiz cohort, quiz performance increased progressively from 63.6% correct responses on day 1 to 88.8% on day 6, indicating an overall increase in correct responses across course days; however, due to anonymous participation and varying response rates, this trend cannot be directly attributed to individual learning gains (Figure 1B). Day-specific evaluations of perceived time sufficiency might further reflect students’ methodological progress over the course duration (Figure 1C). Mean examination scores were descriptively higher in the quiz cohort (7.10 vs 6.65 points; P = 0.094), and the failure rate was descriptively lower (11.29% vs 15.87%). These differences did not reach statistical significance (Figure 1D).
Discussion
This study examined whether quiz-based self-assessment using an audience response system improved student perception and engagement in a microbiology practical course. No significant group differences emerged, likely due to ceiling effects; therefore, the present findings do not provide statistical evidence for a measurable effect of the quizzes on student evaluations. We attribute the positive overall course evaluations, among other factors, to a favorable supervisor–student ratio: additional co-supervisors were present to provide support, particularly during practical laboratory exercises such as microscopy. Nevertheless, descriptive trends favored the quiz group in clarity, relevance, and learning outcomes. In line with this, the quizzes may have contributed to the higher scores achieved in the final examination as well as to the lower failure rate. As no statistically significant difference was demonstrated, however, it must be acknowledged that the observed progression in quiz performance might simply reflect increasing familiarization with the quiz format rather than genuine knowledge acquisition from day 1 to day 6, which, along with the aforementioned ceiling effect, is a limitation of this study. Due to the anonymous design, individual longitudinal learning trajectories could not be assessed. Furthermore, the questionnaires (end-of-course evaluation and quiz questions) were not formally validated but were developed for this study. Finally, the sample size was not actively selected but resulted from the number of students enrolled in the microbiology course in the winter semester 2023/2024; this is an additional limitation, but it also highlights the practical and operational nature of the study.
Nevertheless, frequent low-stakes quizzes appear to reinforce constructive alignment by making learning objectives more transparent and encouraging reflection.6 Similar effects have been observed in microbiology and other disciplines, where formative testing clarifies expectations and enhances retention.7–9
The indicated improvement in daily quiz scores aligns with the “testing effect,” in which retrieval practice consolidates learning more effectively than re-exposure alone.10 However, alternative explanations such as varying participation and increasing familiarity with the quiz format must be considered.
Moreover, the use of anonymous quizzes may have reduced performance anxiety, enabling honest self-assessment and participation, as suggested before. 11 However, this interpretation remains speculative, as no direct measure of anxiety was collected. The high acceptance and stable workload ratings indicate that such tools can be integrated seamlessly into demanding practical courses. Day-specific feedback revealed meaningful variation in perceived methodological success and time allocation. Overall, this study should be interpreted as an exploratory evaluation of feasibility and student perception rather than as confirmatory evidence of improved learning outcomes.
Conclusion
Integrating interactive quizzes into a microbiology practical course was feasible and well accepted, and provided supervisors with real-time formative feedback. While no statistically significant improvements in course evaluations or examination performance were observed, descriptive trends warrant further evaluation in adequately powered and preferably randomized studies.
Supplemental Material
sj-pdf-1-mde-10.1177_23821205261431556 - Supplemental material for Integrating Quiz-Based Self-Assessment Into a Microbiology Practical Course: Effects on Student Perception and Learning Outcomes
Supplemental material, sj-pdf-1-mde-10.1177_23821205261431556 for Integrating Quiz-Based Self-Assessment Into a Microbiology Practical Course: Effects on Student Perception and Learning Outcomes by Maximilian O. Förster, Sören L. Becker and Philipp Jung in Journal of Medical Education and Curricular Development
sj-docx-2-mde-10.1177_23821205261431556 - Supplemental material for Integrating Quiz-Based Self-Assessment Into a Microbiology Practical Course: Effects on Student Perception and Learning Outcomes
Footnotes
References