Abstract
Background
Early clinical training in undergraduate medical education often occurs in unstructured environments, limiting student participation and competency development. The SimZones framework offers a progressive approach to simulation-based education, yet evidence for its implementation in first-year medical students remains limited.
Objective
To describe the design, implementation, and outcomes of a structured clinical simulation program for first-year medical students using the SimZones framework.
Methods
A descriptive study with mixed-methods analysis was conducted in 2024 with 116 first-year medical students at a Chilean medical school. Students participated in structured Zone 1 simulation activities throughout the academic year in small groups (5-6 students), assessed with 12 instruments (202 specific procedural steps, including 30 critical safety steps) developed and content-validated by a multidisciplinary expert panel. Activities targeted technical, communication, and attitudinal skills, supported by study guides, pre-tests, and formative rubrics. The program culminated in a Zone 2 integrative activity simulating primary care consultations with standardized patients, clinical documentation, and faculty-led debriefing using the Plus/Delta model. Performance was assessed with standardized rubrics, and qualitative observations were analyzed thematically.
Results
All 116 students completed the program. High performance was observed in attitudinal competencies (≥97%) and communication domains (≥89%), while technical skills showed variable achievement rates (range 63%-78%). Qualitative analysis identified strengths in empathy, professionalism, and teamwork, with areas for improvement in procedural technique, interview sequencing, and time management.
Conclusions
Early progressive simulation using the SimZones framework effectively develops foundational competencies in first-year medical students. The structured approach, combining deliberate practice with formative assessment and guided debriefing, supports competency-based medical education objectives while identifying specific areas requiring additional reinforcement in technical skill training.
Introduction
Early clinical training in undergraduate medical education faces a fundamental tension between providing meaningful learning experiences and ensuring patient safety. Traditional approaches often confine students to observational roles or unstructured clinical encounters, limiting their active participation in skill development and delaying the acquisition of professional identity.1 This pedagogical challenge has profound implications for competency-based medical education (CBME), which requires deliberate, progressive skill development from the earliest stages of training.2
Simulation-based education has emerged as a transformative strategy to address these limitations, offering a controlled, safe, and replicable learning environment where students can actively practice clinical, communication, and decision-making skills without risk to patients.3 Modern simulation-based education aligns closely with International Nursing Association for Clinical Simulation and Learning (INACSL) standards, which emphasize structured design, facilitation, and debriefing as essential components for effective learning outcomes.4 The theoretical foundation for early simulation integration rests on experiential learning theory, which emphasizes the importance of concrete experience, reflective observation, and active experimentation in the learning process.5 Meta-analyses consistently demonstrate that simulation-based medical education with deliberate practice yields superior results in skill acquisition and retention compared to traditional clinical education methods.3 Effective simulation-based learning depends on creating and maintaining psychologically safe environments that encourage commitment, reflection, and meaningful participation through structured orientation and prebriefing phases.6
The SimZones framework, developed by Roussin and Weinstock,7 provides a systematic approach to organizing simulation experiences across a continuum of complexity and learner autonomy. Zone 1 activities are highly structured and instructor-led, focusing on specific skill acquisition through deliberate practice. Zone 2 integrates multiple competencies in realistic scenarios with moderate instructor guidance, while higher zones progress toward independent performance in complex situations. This framework aligns with cognitive load theory and scaffolded learning principles, suggesting that novice learners benefit from structured guidance before advancing to more autonomous practice.8 Contemporary applications of the SimZones framework in medical education demonstrate its effectiveness across diverse educational contexts, particularly in supporting early learner progression through structured competency development.
The SimZones approach directly supports entrustable professional activity (EPA) development for early medical students, particularly EPA 1 (gathering history and performing physical examination), EPA 2 (prioritizing differential diagnosis), and EPA 6 (providing oral and written reports), by providing structured practice opportunities that build foundational competencies before clinical exposure.
Despite the theoretical appeal and reported benefits of the SimZones model, empirical evidence for its systematic implementation in early medical education remains limited. Most published studies focus on advanced learners or individual simulation sessions rather than comprehensive curricular integration.9–11 Furthermore, there is insufficient evidence regarding how first-year medical students perform across different competency domains when exposed to structured simulation experiences or which specific skills require additional reinforcement in early training.
Understanding these patterns is crucial for medical educators seeking to optimize early clinical education and align curricular design with CBME principles. If simulation-based learning is to fulfill its promise as a cornerstone of medical education, we need robust evidence about how to structure these experiences for maximum educational impact, particularly in the foundational years when professional identity and clinical reasoning patterns are being established.
This study aimed to address these gaps by describing the design, implementation, and outcomes of a comprehensive simulation-based education program for first-year medical students, grounded in the SimZones framework. Specifically, we sought to evaluate student performance across technical, communication, and attitudinal competencies, identify areas of strength and improvement, and provide insights for educators implementing similar structured simulation curricula.
Methods
Educational Design
This intervention was implemented during the first academic year of the medical program, using a sequential instructional design based on the SimZones framework.7 The program aimed to progressively develop clinical competencies through Zone 1 formative activities across the year, followed by a comprehensive integrative Zone 2 simulation. The educational strategy emphasized deliberate practice, structured simulation-based education, and formative assessment with continuous feedback.2,5
Study Period and Participants
This study was conducted from March to November 2024. All first-year medical students enrolled in the 2024 cohort were included (n = 116).
Inclusion Criteria
All first-year medical students actively enrolled in the curriculum during the study period.
Exclusion Criteria
Students who withdrew from the program or were absent during the Zone 2 integrative simulation were excluded.
Sample Size
This study included the complete population of first-year medical students (census sampling); therefore, no sample size calculation was performed.
Zone 1 Activities
Throughout the academic year, all 116 first-year students participated in standardized simulation sessions conducted in small groups (5-6 students) under the supervision of simulation-trained clinical faculty. Although sessions were run in small groups for logistical efficiency and peer-learning benefits, each student performed every Zone 1 procedural skill individually at least three times and received individual formative feedback using the 12 validated instruments. All Zone 1 evaluation was strictly formative, oriented toward learning and continuous improvement.
Assessment employed 12 technical evaluation instruments comprising 202 specific procedural steps, including 30 critical steps essential for patient safety. All instruments underwent content validation by a multidisciplinary expert panel comprising 16 professionals: 3 clinical simulation specialists, 5 medical education experts, and 8 practicing clinicians. The validation process included (1) initial instrument development based on literature review and institutional standards, considering the formative objectives of each activity, (2) expert panel review through extensive discussion and consensus among all faculty involved in activity design and student education, and (3) alpha pilot testing with faculty evaluators prior to implementation. We acknowledge that more formal and comprehensive validation of these tools constitutes an important limitation of our experience.
Each session was designed to target specific clinical competencies and included direct observation, immediate feedback, and the use of formative rubrics. Every student completed each activity at least twice and observed their peers’ performance and the feedback they received.
For each simulation, students had access to a study guide, a visible checklist or rubric, and a knowledge-based pre-test that was required for participation. This approach aimed to support the development of the “knows” and “knows how” levels of Miller's pyramid.
The skills practiced included the following:
- Hand hygiene: standardized technique with step-by-step verification.
- Personal protective equipment (PPE): proper donning and doffing with a focus on biosecurity.
- Professional presentation: appropriate attire, behavior, and identification.
- Basic clinical communication: greeting, open-ended questions, paraphrasing, verbal and non-verbal cues.
- Structured interview (pre-anamnesis): patient identification, chief complaint, data collection.
- Vital signs assessment: cuff placement, blood pressure (palpatory and auscultatory methods), heart rate measurement.
- Basic documentation: free-text clinical note after the simulated encounter.
All Zone 1 assessment instruments utilized binary evaluation (achieved/not achieved) and underwent content validation by the expert panel to ensure clinical accuracy, educational appropriateness, and alignment with evidence-based practice standards. Critical steps were identified based on patient safety implications and procedural integrity requirements.
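The binary (achieved/not achieved) scoring model with designated critical safety steps can be sketched in a few lines of code. The instrument content, step names, and data structures below are hypothetical illustrations, not the study's actual validated tools.

```python
# Hypothetical sketch of binary checklist scoring with critical safety steps.
# Step content is illustrative only, not the study's actual instruments.
from dataclasses import dataclass

@dataclass
class Step:
    description: str
    critical: bool = False  # critical steps are essential for patient safety

def score_checklist(steps, achieved):
    """Return (percent of steps achieved, critical-failure flag) for one attempt.

    `achieved` holds one boolean per step (achieved/not achieved).
    """
    assert len(steps) == len(achieved)
    pct = 100 * sum(achieved) / len(steps)
    # Missing any critical step is flagged regardless of the overall percentage
    critical_fail = any(s.critical and not ok for s, ok in zip(steps, achieved))
    return pct, critical_fail

# Example: an invented three-step hand-hygiene checklist with one critical step
hand_hygiene = [
    Step("Remove jewelry and watches"),
    Step("Apply soap to all hand surfaces", critical=True),
    Step("Dry hands with a single-use towel"),
]
pct, critical_fail = score_checklist(hand_hygiene, [True, True, False])
```

Tracking critical steps separately from the raw percentage reflects the idea that a high overall score cannot compensate for omitting a safety-essential action.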
Zone 2 Integrative Activity
At the end of the year, a Zone 2 integrative simulation was held at the Clinical Simulation Center of Hospital Padre Hurtado. Zone 2 functioned as an integrative activity combining (1) simulation skills developed in Zone 1, (2) fundamental course knowledge, and (3) non-technical competencies explored in parallel modules (communication workshops, professionalism reflections, etc.). The fundamental objective was to prepare first-year students for their first real clinical encounter, evaluating communication and teamwork in pursuit of collaborative objectives. This assessment consisted of a simulated primary care consultation with standardized patients (trained actors) in exam rooms designed to resemble real clinical settings. The simulation took place over two consecutive days.
Four clinical scripts were developed to represent common ambulatory conditions and were validated by the academic team. Student pairs completed the following sequence, with individual evaluation using separate rubrics focused on non-technical elements such as professionalism, patient communication, and collaborative work contribution:
1. Case review
2. Simulated consultation (20 min)
3. Vital signs assessment (blood pressure and heart rate)
4. Immediate structured feedback (10 min)
5. Free-form clinical documentation
6. Group debriefing facilitated using the Plus/Delta technique
For logistical and pedagogical reasons, both students had to complete the scenario collaboratively but did not necessarily perform all technical skills individually. All evaluation maintained a formative character as this was the students’ first integrative clinical simulation experience.
Each room was staffed by two evaluators trained in advance to ensure consistency. All students rotated through the same scenarios during the sessions.
Evaluation and Data Analysis
Student performance was assessed using standardized rubrics and checklists designed by the academic team and validated by expert consensus. Zone 1 employed 12 validated technical instruments (202 procedural steps, 30 critical), while Zone 2 used integrative assessment focusing on achievement percentages without numerical scoring. The results presented in Tables 1–3 correspond specifically to Zone 2 formative evaluation (integrative activity), not Zone 1. Zone 1 served as structured formative preparation for Zone 2. Zone 2 evaluation was designed to measure integration of (1) basic technical skills developed in Zone 1, (2) fundamental knowledge application, and (3) communicative and professional competencies developed in parallel modules. These tools were made available in advance, along with preparatory study materials. Technical, communication, and attitudinal dimensions were evaluated, and qualitative observations were recorded by the evaluators. Assessment instruments are provided as Supplementary File 1.
Table 1. Percentage of Achievement by Competency Indicator.
Table 2. Observed Strengths and Areas for Improvement.
Table 3. Qualitative Findings: Strengths and Improvement Areas by Priority.
Statistical Analysis
Descriptive statistics were calculated using Microsoft Excel (Microsoft Corporation, Redmond, WA). Quantitative data were analyzed to calculate the percentage of achievement for each indicator and category. Qualitative data were derived from 116 qualitative comments (one per student) recorded by faculty in the rubrics. These observations were thematically analyzed using manual inductive coding by a single investigator to maintain consistent criteria. Thematic saturation was reached after reviewing approximately 80% of comments without new categories emerging. Example of code derivation: comments such as “excellent collaborative work between both students” were coded as “teamwork” in the strengths category.
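The descriptive analysis described above (achievement percentages per indicator, plus frequency tallies of inductively coded comments) amounts to simple counting. The sketch below illustrates the arithmetic; the indicator names, records, and comment codes are invented for illustration and do not reproduce the study's data.

```python
# Illustrative sketch of the descriptive analysis: achievement percentage per
# indicator and frequency counts of coded qualitative comments.
# Indicator names, records, and codes are invented for illustration only.
from collections import Counter

def achievement_pct(records, indicator):
    """Percent of students achieving an indicator, excluding 'not applicable' (None)."""
    vals = [r[indicator] for r in records if r[indicator] is not None]
    return round(100 * sum(vals) / len(vals), 1)

records = [
    {"greets_patient": True, "bp_technique": False},
    {"greets_patient": True, "bp_technique": True},
    {"greets_patient": True, "bp_technique": None},  # not applicable this scenario
]

# One inductively assigned code per qualitative comment, tallied by frequency
codes = Counter(["teamwork", "teamwork", "time management"])

print(achievement_pct(records, "greets_patient"))  # 100.0
print(achievement_pct(records, "bp_technique"))    # 50.0
print(codes.most_common(1))                        # [('teamwork', 2)]
```

Excluding "not applicable" observations from the denominator, as in the table note for the Zone 2 results, keeps the percentage comparable across indicators evaluated in different scenarios.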
Ethical Considerations
This study was approved by the Research Ethics Committee of Facultad de Medicina Clínica Alemana, Universidad del Desarrollo (authorization letter attached as Supplementary File 2). Written informed consent was obtained from all participating students at the beginning of the simulation activities, after explaining the educational and research purposes of the program. Students were informed that participation was part of the curricular requirements and that data would be used for educational research purposes. All data were anonymized for analysis and reporting. The study adhered to the principles of the Declaration of Helsinki.
Results
Participation and General Performance
All 116 first-year medical students completed the integrative simulation activity, working in 58 pairs. All participants arrived punctually and engaged with the clinical simulation scenarios, meeting the key elements of the instructional design. Adherence to professional conduct was high: 100% of students greeted the patient appropriately, and over 95% wore suitable clinical attire and visible identification.
Performance by Competency Category
Student performance was assessed using standardized rubrics that included technical, communication, and attitudinal indicators from the Zone 2 integrative activity. High levels of achievement were observed in attitudinal competencies, with moderate to high performance in communication, while technical skills showed lower levels of performance (see Table 1).
Observed Strengths and Areas for Improvement
Instructors recorded qualitative comments based on student performance. Thematic analysis revealed specific patterns of strengths and improvement areas, with collaborative teamwork and effective interviewing skills being the most frequently observed positive behaviors (see Table 2).
Frequency of Qualitative Mentions
Analysis of qualitative feedback enabled the development of a priority-based framework for curricular enhancement, identifying high-priority areas requiring immediate attention and medium-priority areas for systematic development (see Table 3).
Detailed Evaluation by Competency Domain
A comprehensive breakdown of all evaluation elements by competency domain reveals differential performance patterns across technical, communicative, and attitudinal/professional competencies. The technical domain showed the lowest average achievement rate (75.5%), while the attitudinal/professional domain demonstrated the highest performance (95.5%), with the communication domain achieving intermediate results (82.4%) (see Table 4).
Table 4. Detailed Evaluation by Competency Domain—Zone 2 Integrative Activity (N = 116 Students).
Note. N total = 116 students (58 Group A + 58 Group B). “Not applicable” indicates situations where the indicator could not be evaluated due to specific scenario circumstances. Percentages calculated based on total N = 116 students. Formative evaluation oriented toward identifying strengths and improvement areas.
These findings demonstrate that students excelled in collaborative and communicative competencies, while technical procedural skills and time management emerged as high-priority areas requiring targeted intervention.
Discussion
The structured integration of clinical simulation from the beginning of the medical curriculum, employing 12 validated assessment instruments with 202 procedural steps including 30 critical safety components developed through expert consensus, showed a positive impact on the development of essential clinical competencies in first-year students. The instructional design, based on the SimZones framework and supported by expert-validated assessment tools, preparatory materials, and structured feedback, appears to have fostered an active, reflective, and learner-centered educational process.1,5,6
These findings align with previous literature that highlights the benefits of early simulation-based education, particularly in the development of communication, attitudinal, and basic clinical reasoning skills.2,4,13 Our results demonstrate strong alignment with INACSL standards for simulation design, particularly in the areas of structured prebriefing, facilitated debriefing, and competency-based evaluation frameworks. The high levels of adherence to professional behaviors—such as punctuality, appropriate dress, and respectful interaction—suggest that these competencies can be effectively taught and evaluated from the early stages of training, especially when deliberately integrated into the curriculum.
The priority-based analysis of improvement areas provides a structured framework for targeted curricular interventions that directly supports EPA development for early medical students. Our findings particularly support EPA 1 (gathering history and performing physical examination) through demonstrated communication competencies and EPA 6 (providing oral and written reports) through structured documentation and presentation skill development.
High-priority technical skill deficits, particularly in blood pressure measurement technique, align with literature emphasizing the need for deliberate practice in procedural competencies.11,14,15 Similarly, time management challenges reflect the cognitive load experienced by novice learners in complex simulated clinical environments, supporting the implementation of structured time management training protocols.16 The prominence of collaborative teamwork as a strength suggests that peer-based learning approaches embedded within the SimZones framework effectively develop interpersonal professional competencies.
The variability observed in technical performance, particularly in vital signs measurement, underscores the importance of reinforcing deliberate procedural practice. This is consistent with other studies that note a persistent gap between theoretical knowledge and practical execution in the early phases of medical training.3,8
In addition, the use of structured debriefing following the integrative simulation helped identify patterns of performance and opportunities for improvement. Prior research has shown that well-facilitated debriefing enhances critical reflection and promotes meaningful learning.9,13,17,18
This experience supports the value of structured, intentional, and reflective simulation-based learning. Combining Zone 1 activities (focused on specific skills) with Zone 2 scenarios (integrative, contextualized clinical encounters) offers a coherent educational strategy aligned with experiential learning principles and the staged development of clinical judgment.
Strengths of this study include explicit curricular alignment, the use of preparatory materials, visible rubrics, pre-activity tests, and faculty development for simulation facilitation.
Limitations
This study has several important limitations that must be acknowledged. First, social desirability bias may have influenced student performance during observed simulations, especially given the formative-evaluative nature of the activity. Second, we did not formally assess inter-rater reliability among faculty evaluators, which may have introduced variability in assessment outcomes. Third, the qualitative analysis framework could benefit from standardized categorization criteria to enhance reproducibility, although consistency was maintained through analysis by a single investigator.
Fourth, this is a single-institution study with a specific student population (n = 116), which limits generalizability to other contexts. However, we have discussed adaptability considerations for resource-limited settings, different cultural contexts, and larger programs. Fifth, we lack longitudinal follow-up on learning outcomes and assessment of skill transfer to real clinical settings. Sixth, as acknowledged, the formal validation of our assessment instruments was limited and represents an area for improvement in future implementations.
However, it is important to highlight that this same faculty/research team continues to accompany these students in subsequent years (2025, 2026) through clinical simulation and medical practice, positioning us uniquely to conduct the necessary longitudinal follow-up and evaluate retention and transfer of competencies developed in this initial experience. This continuity of the educational team represents a concrete opportunity for follow-up studies that address the identified limitations through real longitudinal evaluation of these same students in progressively more complex clinical contexts.
Future research could expand this model to upper-year students, incorporate interprofessional simulation, and assess the transfer of skills to real clinical settings.
This study contributes evidence supporting early, structured integration of simulation into the medical curriculum, particularly when intentionally designed to foster foundational clinical competencies.
Conclusion
The early and structured integration of clinical simulation within the first year of medical education promotes the development of essential clinical competencies. A structured design aligned with the SimZones framework enables the deliberate training of technical, communicational, and attitudinal skills in a safe and controlled environment. This experience highlights the value of early exposure to realistic clinical scenarios when supported by structured instruction, formative feedback, and guided reflection.
Implementing a simulation-based approach with increasing complexity, embedded in the curriculum from the outset, helps establish professional behaviors and clinical reasoning habits from the first stages of training. The priority-based analysis suggests that future implementations should emphasize technical skill reinforcement and time management training as foundational elements, followed by systematic enhancement of communication competencies. These results support the incorporation of structured simulation models as a standard component of early medical education, with targeted remediation protocols addressing specific competency gaps.
Supplemental Material
sj-pdf-1-mde-10.1177_23821205251404538 - Supplemental material for Early Integration of Clinical Simulation in Medical Students: A Progressive Experience Using SimZones
Supplemental material, sj-pdf-1-mde-10.1177_23821205251404538 for Early Integration of Clinical Simulation in Medical Students: A Progressive Experience Using SimZones by Karen Alejandra Medel Rodríguez, Carla Benaglio and José Ignacio Ortega Sepúlveda in Journal of Medical Education and Curricular Development
Supplemental Material
sj-pdf-2-mde-10.1177_23821205251404538 - Supplemental material for Early Integration of Clinical Simulation in Medical Students: A Progressive Experience Using SimZones
Supplemental material, sj-pdf-2-mde-10.1177_23821205251404538 for Early Integration of Clinical Simulation in Medical Students: A Progressive Experience Using SimZones by Karen Alejandra Medel Rodríguez, Carla Benaglio and José Ignacio Ortega Sepúlveda in Journal of Medical Education and Curricular Development
Acknowledgements
The authors thank the faculty members who participated as evaluators and facilitators in the simulation activities, the standardized patients who contributed to the realism of the scenarios, and the students for their enthusiastic participation in this educational innovation. The reporting of this study conforms to the STROBE statement for observational studies.12
Author Contributions
K.A.M.R. contributed to the study design, implementation of activities, data analysis, and final manuscript approval. C.B. contributed to the study design and final manuscript approval. J.I.O.S. led the study design, data analysis, manuscript writing, and final approval. All authors reviewed and approved the final version of the manuscript.
Funding
The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This project was funded by the Faculty of Medicine Clínica Alemana, Universidad del Desarrollo, as part of the implementation of a first-year medical curriculum course.
Declaration of Conflicting Interests
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Supplemental Material
Supplemental material for this article is available online.
References
