Abstract
Background:
Heralded as a teaching, assessment, and reflective tool, and increasingly as a longitudinal and holistic record of an educator’s development, medical educator portfolios (MEPs) are increasingly employed to evaluate progress, assess candidates for promotions and career switches, support reflection, and curate educational activities. However, despite their blossoming role, there is significant dissonance in the content and structure of MEPs. A systematic scoping review (SSR) is therefore proposed to identify what is known of MEPs and their contents.
Methods:
Krishna’s Systematic Evidence-Based Approach (SEBA) was adopted to structure this SSR of MEPs (henceforth SSR in SEBA). SEBA’s constructivist approach and relativist lens allow data from a variety of sources to be considered, painting a holistic picture of available information on MEPs.
Results:
From the 12 360 abstracts reviewed, 768 full-text articles were evaluated and 79 articles were included. Concurrent thematic and content analysis revealed similar themes and categories: (1) Definition and Functions of MEPs, (2) Implementing and Assessing MEPs, (3) Strengths and Limitations of MEPs and (4) Electronic MEPs.
Discussion:
This SSR in SEBA proffers a novel 5-staged evidence-based approach to constructing MEPs that allows for their consistent application and assessment. This 5-stage approach pivots on assessing and verifying the achievement of developmental milestones, or ‘micro-competencies’, that facilitate micro-credentialling and effective evaluation of a medical educator’s development and entrustability. This allows MEPs to be used as a reflective and collaborative tool and a basis for career planning.
Introduction
Portfolios provide a holistic and longitudinal self-portrait of a medical educator’s professional identity formation and career development.1-3 Built from self-selected material and reflections, portfolios3-7 differ sharply from logbooks, curricula vitae, course logs, and training folders as a better means of evaluating a professional holistically and longitudinally.1,3,4,8,9 These self-portraits have even been used by medical educators to illustrate their many roles10-15 for employment and promotion purposes.16-18 Indeed, medical educator portfolios (henceforth MEPs) circumnavigate the limitations posed by conventional assessment methods, which often focus upon research grants and publications5-7,15,18-20 to the detriment of appreciating the quality, breadth, depth19,21 and impact 11 of a medical educator’s roles as, amongst other things, a ‘Professional Expert’, ‘Facilitator’, ‘Information Provider’, ‘Enthusiast’, ‘Faculty Developer’, ‘Mentor’, ‘Undergraduate and Postgraduate Trainer’, ‘Curriculum Developer’, ‘Assessor and Assessment Creator’, ‘Influencer’, ‘Scholar’, ‘Innovator’, ‘Leader’ and ‘Researcher’. 13
The increasing use of electronic portfolios 22 has further boosted the visibility of MEPs,2,3,6,14,16-18,23,24 and expanded their use in collaborative work and mentoring, making MEPs a valuable tool for assessing medical educators1,17 and underlining their growing footprint in the medical education landscape.1,4,25
However, despite their much-heralded benefits,2,3,6,14,16-18,23,24 various considerations in the structure, implementation, and assessment of MEPs challenge their validity.2,7,14,18,21 A systematic scoping review (SSR) is proposed to study the current literature, enhance understanding of MEPs, their roles and their structure, and help design a consistent framework for MEPs that can be used across settings, purposes, and specialities, given an SSR’s ability to evaluate data26-30 from ‘various methodological and epistemological traditions’. 31
Methodology
To overcome the lack of structure and the reflexive nature of SSRs, which raise questions about their reproducibility and transparency, we adopt Krishna’s Systematic Evidence-Based Approach (henceforth SEBA)32-35 to guide this SSR (henceforth SSR in SEBA) of MEPs. SSRs in SEBA proffer accountable, transparent and reproducible reviews.
To enhance accountability and transparency, SSRs in SEBA employ an expert team to guide, oversee, and support all stages of SEBA. The expert team was composed of medical librarians from the Yong Loo Lin School of Medicine (YLLSoM) at the National University of Singapore (NUS) and the National Cancer Centre Singapore (NCCS), and local educational experts and clinicians at NCCS, the Palliative Care Institute Liverpool, YLLSoM and Duke-NUS Medical School. The expert team was involved in all stages of the SSR in SEBA.
SSRs in SEBA are built on a constructivist perspective. This perspective acknowledges the personalised, reflective and experiential aspects of development as a medical educator, as well as medical education’s nature as a sociocultural construct influenced by prevailing clinical, academic, personal, research, professional, ethical, psychosocial, emotional, legal, and educational factors.36-40 This enables SSRs in SEBA to map data on a specific topic from multiple angles and consider the factors influencing the adoption of MEPs.
To operationalise an SSR in SEBA, the research team adopted the principles of interpretivist analysis to enhance reflexivity and discussions30,41-43 in the Systematic Approach, Split Approach,44-47 Jigsaw Perspective, Funnelling Process, analysis of data from grey and black literature, and Synthesis of the SSR in SEBA, which make up SEBA’s 6 stages outlined in Figure 1.

The SEBA process.
Stage 1 of SEBA: Systematic approach
Determining the title and background of the review
The expert and research team worked together to determine the overall goals of the SSR and the population, context, and concept to be evaluated. With increasing focus on the evaluation of educational activities amongst clinical faculties, 48 it was deemed reasonable for MEPs to focus exclusively on educational activity and be distinct from a clinical portfolio, given prevailing suggestions that clinical accomplishments and development tend to cloud educational achievements. 1
Identifying the research question
Guided by the Population, Concept, and Context (PCC), the teams agreed upon the research questions. The primary research question was ‘what is known about medical educator portfolios?’. The secondary questions were ‘what are its components?’, ‘how are MEPs implemented?’ and ‘what are the strengths and weaknesses of current MEPs?’.
Inclusion criteria
All grey literature, peer-reviewed articles, narrative reviews, and systematic, scoping, and systematic scoping reviews published from 1st January 2000 to 31st December 2019 were included. The PCC and a PICOS format were adopted to guide the research process49,50 (see Supplemental File 1).
Searching
A search of 6 bibliographic databases (PubMed, Embase, PsycINFO, ERIC, Google Scholar and Scopus) was carried out between 17 November 2019 and 24 April 2020 for articles published between 2000 and 2019. Limiting the inclusion criteria to these dates was in keeping with Pham et al (2014)’s 51 approach of ensuring a viable and sustainable research process. The search process was structured along the processes set out for systematic scoping reviews. Additional articles were identified through snowballing.
The PubMed Search Strategy may be found in Supplemental File 2.
Extracting and charting
Using an abstract screening tool, members of the research team independently reviewed the titles and abstracts and created independent lists of titles to be reviewed. These lists were discussed online, and Sambunjak, Straus, Marusic’s 52 approach to ‘negotiated consensual validation’ was used to achieve consensus on the final list of articles to be scrutinised.
The 6 members of the research team independently reviewed all articles on the final list, discussed them online, and adopted Sambunjak, Straus, Marusic’s 52 approach to ‘negotiated consensual validation’ to achieve consensus on the final list of articles to be included.
Stage 2 of SEBA: Split approach
Three teams of researchers simultaneously and independently reviewed the 79 included full-text articles. The first team of 3 researchers independently summarised and tabulated the included full-text articles in keeping with recommendations drawn from Wong, Greenhalgh, Westhorp, Buckingham, Pawson’s 53 RAMESES publication standards: meta-narrative reviews and Popay, Roberts, Sowden, Petticrew, Arai, Rodgers, Britten, Roen, Duffy’s 54 ‘Guidance on the conduct of narrative synthesis in systematic reviews’. These individual efforts were compared and discussed by the 3 researchers and consensus was achieved on the final content and structure of the tabulated summaries. The tabulated summaries served to ensure that key aspects of included articles were not lost.
Concurrently, the second team of 3 researchers independently analysed the included articles using Braun, Clarke’s 55 approach to thematic analysis. 56 In phase 1 of Braun and Clarke’s approach, the research team carried out independent reviews, ‘actively’ reading the included articles to find meaning and patterns in the data.57-61 In phase 2, ‘codes’ were constructed from the ‘surface’ meaning and collated into a code book used to code and analyse the rest of the articles in an iterative step-by-step process. As new codes emerged, these were associated with previous codes and concepts. In phase 3, the categories were organised into themes that best depict the data. An inductive approach allowed themes to be ‘defined from the raw data without any predetermined classification’. 60 In phase 4, the themes were refined to best represent the whole data set. In phase 5, the research team discussed the results of their independent analyses online and at reviewer meetings, and ‘negotiated consensual validation’ was used to determine the final list of themes. 52
A third team of 3 researchers independently analysed the included articles using Hsieh, Shannon’s 62 approach to directed content analysis. Analysis using the directed content analysis approach involved ‘identifying and operationalising a priori coding categories’.62-67 The first stage saw the research team draw categories from Baldwin, Chandran, Gusic’s 68 article entitled ‘Guidelines for evaluating the educational performance of medical school faculty: priming a national conversation’ to guide the coding of the articles. Any data not captured by these codes were assigned a new code. In keeping with deductive category application, coding categories were reviewed and revised as required.
In the third stage, the research team discussed their findings online and used ‘negotiated consensual validation’ to achieve consensus on the categories delineated and the codes within them. The final codes were compared and discussed with the final author, who checked the primary data sources to ensure that the codes made sense and were consistently employed. Any differences in coding were resolved between the research team and the final author. ‘Negotiated consensual validation’ was used as a means of peer debrief in all 3 teams to further enhance the validity of the findings. 69
Results
A total of 12 360 abstracts were reviewed, 768 full-text articles were evaluated, and 79 articles were included (see Supplemental File 3). The themes identified using Braun, Clarke’s 55 approach to thematic analysis and the categories identified through Hsieh, Shannon’s 62 approach to directed content analysis were similar and included (1) Definition and functions of MEPs, (2) Developing and implementing MEPs, (3) Assessing MEPs, and (4) Strengths and limitations of MEPs and Electronic MEPs (E-MEPs). See Table 1.
Themes/categories by jigsaw perspective.
Stage 3 of SEBA: Jigsaw perspective
The Jigsaw Perspective sees the themes identified using Braun, Clarke’s 55 approach to thematic analysis and categories identified through use of Hsieh, Shannon’s 62 approach to directed content analysis reviewed by the research and expert teams as part of SEBA’s reiterative process. These discussions determined that there were significant overlaps and similarities between the themes and categories allowing them to be considered and presented in tandem.
Theme/Category 1: Definition and functions of MEPs
Definition of MEPs
MEPs are defined as a collection of documents spanning a period of time4,5,7,14,17,20,70,71 that seeks to demonstrate developing competencies,1-4,8,17-20,72 desirable character traits, 8 learning,4,8 and challenges and improvements made3,8,14 in the field of medical education. Curated by the individual, these documents reflect the medical educator’s perspective of their development1-4,8,17-20,72 and contain elements of feedback 73 and reflection on good and bad experiences.1-4,8
Functions of MEPs
MEPs are used by medical educators to highlight professional development, documentation, learning activities, educational undertakings, reflections and career planning, while institutions employ MEPs for assessment purposes.
MEPs serve several functions and are used by medical educators and institutions differently. First, medical educators use MEPs to highlight professional development. They record their appraisals,1,2,4,18,23 revalidations,1,3,4,23 accreditations,3,23,70,73 and promotions1,2,4,5,14,15,17-20,68,70,73,74 in an MEP, which can also be used when applying for specific roles within educational settings.1,6,18 Second, MEPs serve as a form of documentation: medical educators document their competencies,1-4,8,14,17-19,75 certify standards of professional performance,3,4 illustrate accomplishments and educational activities,1-4,7,17,18,25,70,71,76 demonstrate desirable character traits,1,8 highlight leadership roles and successes,1,8,17,19,68 and showcase teamwork. 8 Third, medical educators use MEPs as a learning tool to guide professional and personal improvements. They highlight experiences8,24,76,77 and reflections,4,6,14,70,74 capture feedback from learners, peers, mentors and supervisors, 4 set learning objectives and guide work towards their achievement,4,78 and help to plan future lessons based on past experiences.4,8
On the other hand, institutions employ MEPs for assessment purposes. MEPs serve as an assessment tool to facilitate the hiring and promotion of medical educators by selection committees,2,4-6,14,18,21,23,68 and to evaluate medical educators’ performance and impact.2,6,18,68 Furthermore, MEPs help with the review of program accreditation. 79
Theme/Category 2: Developing and implementing MEPs
Designing MEPs
MEPs attempt to capture longitudinal development. The design of prevailing MEPs occurs in a stepwise fashion, beginning with an understanding of the prevailing use of portfolios,16-18,73,76 the guiding principles behind their design structures,16-18,73,74,76 and their benefits and limitations.8,70 To contextualise MEPs to the particular setting, speciality, and desired role,3,4,8 designers often consult intended users 77 and experts.68,73,76
The dominant guiding principle in the design of the prototype is the need to balance structure1,3,4,7,77 and flexibility.8,25,73 Structure takes the form of including ‘critical’ domains to be curated 1 and employing a consistent format to ensure that practical 23 and local institutional needs,17,74 as well as minimum standards of MEPs, are met. 77 Flexibility8,25,73 revolves around the contents of the MEP, where scope is sought to effectively capture inventiveness and learning 73 as well as documentation and reflections.3,4 The prototype 4 is then piloted, and review by experts68,73,76 and feedback2,3 from a small group of users further refine its components. 77 The feedback and lessons learnt may be used to educate future users. 3
Components of MEPs
A variety of domains are listed within current MEPs. These domains reflect the setting and goals of the MEP. How these domains are selected and structured is often neither described nor discussed; they are thus curated in Table 2.
Components of MEP.
Implementation of MEPs
The steps in implementing current MEPs are similarly poorly described. Implementation of MEPs can be grouped under 4 themes – user training, assessor training, support, and integration into existing practice. First, with user training, teaching sessions should be carried out prior to the implementation of portfolio practice.8,23 This includes highlighting the purpose and benefits when introducing portfolios, which may increase portfolio uptake, and providing samples, templates, flowcharts and assessment criteria to medical educators for better clarity on how to create and use portfolios.7,8,17,68,73 Trainers in these sessions should stress the documentation of activities before details are forgotten, 1 highlight the use of portfolios as an active learning tool that allows for self-directed learning and the sharing of teaching philosophy and goals, introduce how one may interact with peers via dissemination of work,17,77 and explain how one should be discerning in the selection of evidence to include for reflection. 2 Second, assessor training can enhance the MEP’s reliability as an assessment tool, 4 help the institute’s promotion committee identify essential components of quality performance, and train assessors to work with one another to evaluate and interpret a portfolio. 73 Third, support should be provided through telephone calls or in person. Mentors, facilitators and tutors may facilitate reflection 2 and help review the portfolio or go through portfolio assessment criteria before evaluation.6,73,76 Additionally, administrative support, such as an information technology team to troubleshoot user problems, is essential for the successful implementation of MEPs. 23 Lastly, integration into practice may be done in a longitudinal manner where users fill in the portfolio over time; portfolios may be standalone or an adjunct to existing documentation methods like a curriculum vitae,16,76 and may also form part of summative assessments. 73
Theme/Category 3: Assessing MEPs
Table 3 summarises the key subthemes associated with assessments of MEPs including its use as a formative and summative tool, the type of evidence required for assessment, the development of assessment rubrics, and the assessors.
Assessment of MEP.
Theme/Category 4: Strengths and limitations of MEPs and E-MEPs
Table 4 showcases the strengths and limitations of MEPs and electronic-MEPs.
Strengths and limitations of MEPs and E-MEPs.
Stage 4 of SEBA: Funnelling
Reviewing the themes/categories identified through the Jigsaw Perspective and comparing them with the tabulated summaries highlighted in Supplemental File 4 allows verification of the themes/categories and ensures that there is no additional data to be included. The themes/categories are then reviewed again by the expert team to determine if they may be funnelled into larger themes/categories that will form the basis of the discussion.
Stage 5 of SEBA: Analysis of peer-reviewed and non-data driven literature
Evidence-based data from bibliographic databases (henceforth evidence-based publications) were separated from grey literature and from opinion pieces, perspectives, editorials, letters and other non-data-based articles drawn from bibliographic databases (henceforth non-data driven publications), and both were thematically analysed to determine if non-data driven accounts had influenced the final synthesis of the discussions and conclusions.
The key themes identified from the peer-reviewed evidence-based publications and non-data driven publications were identical and included:
Definition and functions of MEPs
Developing and implementing MEPs
Assessing MEPs
Strengths and limitations of MEPs and E-MEPs
There was consensus that the themes from the non-data driven and peer-reviewed evidence-based publications were similar and did not unduly bias the analysis.
Discussion
The narrative produced from consolidating the themes/categories/tabulated summaries was guided by the Best Evidence Medical Education (BEME) Collaboration guide and the STORIES (Structured approach to the Reporting In healthcare education of Evidence Synthesis) statement.
Stage 6: Synthesis of the SSR in SEBA
In answering its primary and secondary research questions, this SSR in SEBA provides a number of key insights into the creation and employment of MEPs. To begin with, MEPs chronicle the professional, personal, research, academic, education and learning journey and development of a medical educator through self-selected data points, descriptions and reflections. Medical educators see MEPs as a means of advancing their careers, capturing their experiences and reflections, and as a learning tool, whilst for institutions, MEPs provide a wider perspective of the medical educator and an additional source of data with which to evaluate an education program.
With evidence that they motivate lifelong learning and self-improvement, promote the acquisition of competency and career advancement, and benefit learners by improving the quality of teaching, patient safety and care, and program efficacy, MEPs are gradually gaining traction amongst medical educators and institutions. These developments underline the need to better structure MEPs to facilitate their wider use.
Here we proffer a 5-staged evidence-based approach to the construction and deployment of a MEP as shown in Figure 2.

Five stages of construction and deployment of MEP.
Stage 1: Mapping of the MEP
To begin with, a needs assessment should be carried out by the educational institute to determine the need, goals,17,74 support and practical issues9,25 associated with implementing such a process. Integral to this process must be an acceptable, transparent16,19,20,68,74 and verifiable1,6,18,19,21,68,77,80 means of evaluating the diverse contents of MEPs for accreditation and promotion.16-18 This takes the form of a purpose-designed MEP.
Stage 2: Designing the MEP
To maximise its impact, an MEP must include longitudinal quantitative and qualitative evidence15,19,20,68,71,74,75 that is accompanied by clear documentation and reflections and supplemented by evidence of the medical educator’s many educational roles,12,81-83 competencies,84-87 characteristics, expectations5,15,20,21,80 and attainment of specific professional standards, such as those set out by the Academy of Medical Educators (AoME)84,88-92 and the Accreditation Council for Graduate Medical Education (ACGME).91,93-96 Several considerations guide this design.
One, the competency-based assessments of progression set out by the Academy of Medical Educators 84 help ensure that key elements of this assessment process are contained within the MEP. These competency-based assessments of progression 84 also align a medical educator’s learning objectives with the relevant competency guidelines, local context7,68 and an institution’s promotion criteria,5,74 making the MEP more applicable across settings95,97 and outcomes.95,97
Two, a framework is needed that balances the flexibility to infuse personal data with the consistency required to ensure that critical data is included. Only when a balance of structure and flexibility is obtained can the portfolio be an accurate depiction of the beliefs, attitudes, behaviours, and professional identity of the medical educator. 3
Three, there must be adequate education of medical educators to ensure that they remain motivated to maintain this ‘living’ document and update it with their goals and plans for future career development. This will also foster effective use of the MEPs as a means of regular self-assessment, continuous education and reflection3,95 which will boost professional development. 3
Four, for ease of review, access and personalisation, electronic MEPs ought to be employed. Electronic MEPs also allow application and storage of diverse evidence such as digital media and recordings and provide a convenient means of collaboration with peers and mentors.1,4,7,9,25,98 However, it is crucial to keep an electronic MEP user-friendly2,25,98 and well supported, 9 to aid its adoption. 99
Five, the combination of this template and the use of an electronic platform facilitates adaptation to local requirements 68 and enables medical educators to personalise the MEP to their own needs, focuses, phases of their career 73 and learning style. 3
Based on these 5 considerations, we suggest that MEPs document the themes shown in Figure 3 below:

What should be documented in MEPs.
See also Supplemental File 5 for a MEP template based on these themes.
(Sections 3-10 should contain exemplars, innovations, evidence of progress/maturation of practice, evaluations, feedback and reflections and analysis of events, both positive and negative experiences)100,101
Stage 3: Implementing the MEP
Implementation of MEPs must be accompanied by the training of all users, assessors, and faculty as to the role, need, value and use of MEPs, as well as how they are assessed.70,102-105 Exemplars7,8 and scoring rubrics 19 can guide new users14,16 and ensure fair assessments and improved reliability.4,106 To provide support and guidance for users, assessors and faculty,99,102,104,107,108 coaches, supervisors, and increasingly mentors 1 should be made available to follow the learner’s progress. 99 In turn, these coaches, supervisors, and mentors must be provided with protected time 109 and administrative support to help design, update and troubleshoot issues. 105
Stage 4: Assessing the MEP
In the absence of a general standardised assessment rubric, 7 and to pre-empt issues with assessing the various domains and diverse designs through qualitative, quantitative, and/or mixed methods, local institutions could promote a homogenous portfolio structure, which would aid in the creation of an assessment rubric 19 and clear assessment criteria. 110 Such a rubric may be drawn from Glassick’s criteria of educational excellence,111-114 Miller’s pyramid, 115 the GNOME model of curriculum design,116-118 Kirkpatrick’s Model119,120 and the Association of American Medical Colleges (AAMC) Toolbox.121,122 These tools will help overcome concerns about the lack of transparency and consistency in prevailing assessments of MEPs.16,73,110,123-126
Stage 5: Updating and improving the MEP
As ‘living documents’ capturing the evolving self-concepts, professional and personal identities, and changing goals and experiences of medical educators, MEPs need to be adapted, pared, and reviewed. Here the data suggests the presence of ‘micro-competencies’. ‘Micro-competencies’ are effectively milestones that are formally assessed and verified using multisource assessments contained within the portfolio. They are evident in the developing medical educator’s entries within the portfolio. These entries, replete with learning objectives, reports of training approaches and assessments used, the feedback garnered from these sessions, evidence of the longer-term impact upon learners, and the medical educator’s own reflections and plans for refinement, provide evidence and verification of development.
‘Micro-competencies’ suggest that a medical educator’s skills, knowledge, and attitudes develop in stepwise competency-based stages beginning in early medical training and continuing until all the micro-competencies and competencies are met. These micro-competencies and competencies are then honed and refined by master medical educators. We see the use of these verified achievements of milestones as a natural progression of the concept of milestones within the context of MEPs. ‘Micro-competencies’ guide the medical educator’s development and inform appraisals of their progress, coping, conduct and development. Critically, rather than being merely standardized points to be met along the trajectory towards achieving a competency, micro-competencies within MEPs allow a number of refinements to the traditional concept of milestones.
One, micro-competencies are variable and determined with due consideration of the medical educator’s abilities, skills, level of practice, experience, training, and clinical and or professional roles and responsibilities as well as their practice settings and sociocultural context. This highlights the personalised features of micro-competencies.
Two, when considered in tandem with established milestones expected of all medical educators, micro-competencies also highlight the ‘general’ aspect of micro-competencies. The general aspect of micro-competencies is drawn from ‘stage specific requirements’ that all medical educators should achieve at a specific stage of their training.
Three, micro-competencies also vary with setting, stage of training, context, and time. Changes in these aspects of practice require re-evaluation of the medical educator’s micro-competencies. Micro-competencies allow tutors, supervisors, reviewers, mentors, coaches and assessors (henceforth faculty) and/or employers to evaluate progress, and provide medical educators with an opportunity to re-evaluate and reflect on their development and focus upon developing their learning plan.
Four, the concept of micro-competencies also acknowledges that a micro-competency may form the basis of more than one competency and that, without regular application, the associated abilities will degrade, particularly communication and skills-based micro-competencies. This highlights the time-specific nature of the micro-competency. Similarly, with medical educators often posted to different settings and/or participating in training in different specialties involving learners of different backgrounds, experience and training, timely re-evaluation of micro-competencies is needed.
Overall, the use of MEPs supports the notion that micro-credentialling 127 could be built upon the achievement of personalised and general micro-competencies. Micro-credentialling allows medical educators, organisations, evaluators and potential recruiters to see the specific settings that a medical educator can function within, the capacities or roles and responsibilities that they can adopt, the level of supervision required, and their overall progress towards attaining Entrustable Professional Activities (EPAs). With EPAs built on micro-credentials, the trajectory towards attaining a specific EPA, and the gaps along that course, are mapped out, aiding medical educators as they reflect upon and chart their course towards their overall goals. Progress captured in longitudinal assessments will also help medical educators and faculty to personalise training and support programs.
Overall, micro-competencies and their relationship with micro-credentialling and EPAs inform guidance on the personal, professional, and research expectations placed upon medical educators and steer effective career progression and the maturation of thought, philosophies, skills, and actions.
Limitations
Whilst our goal was to appreciate the scope of available literature on portfolios used by medical educators, this review is limited by the lack of longitudinal and holistic evaluations of portfolios.
Although the search process was vetted and overseen by the expert team, use of specific search terms and inclusion of only English language articles potentiates the possibility of key publications being omitted. In addition, whilst independent and concurrent use of thematic and content analysis by the team of researchers improved its trustworthiness through enhanced triangulation and transparency, biases cannot be entirely eradicated.
The inclusion of grey literature improves transparency in the synthesis of the discussion, but its themes may contain biased results and lend these opinion-based views a ‘veneer of respectability’ despite a lack of evidence to support them. This raises the question of whether grey literature should be accorded the same weight as published literature.
Conclusions
This SSR in SEBA has laid bare the range of data on MEPs and highlighted the gaps in prevailing concepts. Perhaps a critical consideration is the fact that MEPs continue to be used for a variety of roles and goals and remain influenced by local clinical, academic, personal, research, professional, ethical, psychosocial, emotional, cultural, societal, legal and educational factors underlining the heterogeneity of available data.
Recognising this fact, we propose to determine the key ‘ingredients’ of successful MEPs in a coming study. In the meantime, we look forward to continuing this discussion, evaluating how best to ensure this living document is effectively tended to and how effective and appropriate training and assessment processes can be set up to realise the full potential of MEPs.
Supplemental Material
Supplemental material for A Systematic Scoping Review on Portfolios of Medical Educators by Daniel Zhihao Hong, Annabelle Jia Sing Lim, Rei Tan, Yun Ting Ong, Anushka Pisupati, Eleanor Jia Xin Chong, Chrystie Wan Ning Quek, Jia Yin Lim, Jacquelin Jia Qi Ting, Min Chiam, Annelissa Mien Chew Chin, Alexia Sze Inn Lee, Limin Wijaya, Sandy Cook and Lalit Kumar Radha Krishna in Journal of Medical Education and Curricular Development:
sj-pdf-1-mde-10.1177_23821205211000356
sj-pdf-2-mde-10.1177_23821205211000356
sj-pdf-3-mde-10.1177_23821205211000356
sj-pdf-4-mde-10.1177_23821205211000356
sj-pdf-5-mde-10.1177_23821205211000356
sj-pdf-6-mde-10.1177_23821205211000356
Acknowledgements
The authors would like to dedicate this paper to the late Dr. S Radha Krishna whose advice and ideas were integral to the success of this study. The authors would like to thank the anonymous reviewers whose advice and feedback greatly improved this manuscript.
Funding:
The author(s) received no financial support for the research, authorship, and/or publication of this article.
Declaration of conflicting interests:
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Authors’ Contributions
DZH, AJSL, RT, YTO, AP, EJXC, CWNQ, JYL, JJQT, MC, AMCC, ASIL, LW, SC, and LKRK were involved in data curation, formal analysis, investigation, and preparing the original draft of the manuscript, as well as reviewing and editing the manuscript. All authors have read and approved the manuscript.
Availability of Data and Materials
All data generated or analysed during this review are included in this published article [and its supplementary files].
