Abstract

Introduction
Evidence-based medicine (EBM) has evolved as a key skill to be taught and learnt in medical education. There are several methods for teaching EBM, 1, 2, 3 but the aim of teaching should be to impart knowledge, attitudes and skills that improve clinician performance and patient care. Without adequate assessment it is difficult to know whether a teaching intervention has had the desired effect. How frequently is teaching directed to achieve these objectives, and how good are we at measuring learning achievement? An overview of assessments used in studies evaluating outcomes of EBM teaching was undertaken to address these questions.
There are many publications on the outcomes of EBM teaching, but little has been written about the coverage of educational domains in the tools used for assessing outcomes. A systematic review 4, 5 on this subject failed to examine whether studies covered established educational domains grounded in Bloom's taxonomy 6 and Kirkpatrick's hierarchy, 7 which allow the impact of teaching to be examined on a sliding scale, from a simple record of attendance to application of evidence in practice leading to improvements in health outcomes. We evaluated how existing EBM assessments rank on these scales. We also examined the measurement quality of the tools in terms of validity.
Methods
We developed a protocol to undertake the review using established methods for quality systematic reviews.
We searched, without language restrictions, Medline, Embase, ERIC, the Cochrane controlled trials register (CCTR), the Cochrane database of systematic reviews (CDSR), the database of reviews of effects (DARE), the Campbell collaboration, Best Evidence Medical Education (BEME), Science Citation Index (SCI), CINAHL and PsycINFO from database inception to May 2007 using the following terms and their variants: ‘evidence’, ‘evidence based medicine’, ‘evidence based practice’, ‘question framing’, ‘literature searching’, ‘critical appraisal’ combined using ‘AND’ with ‘assessment’, ‘evaluation’, ‘teach’, ‘learn’, or ‘tools’. We hand-searched the reference lists of known systematic reviews and identified relevant grey literature by contacting authors.
Table 1 Description of domains and subdomains of Bloom's taxonomy and Kirkpatrick levels in teaching and learning evidence-based medicine (EBM)
We extracted data on the coverage of the cognitive (subdomains: knowledge, comprehension, application, analysis, synthesis and evaluation), affective or attitude (subdomains: receiving, responding, valuing, organization and characterization), and psychomotor or skills (subdomains: perception, set, guided response, mechanism, complex overt response, adaptation and organization) domains in the assessments using Bloom's taxonomy 6 (Table 1). Learning achievement at the more advanced subdomains represents mastery of the subject matter. Data were gathered on educational objectives concerning the following key EBM steps: framing of the clinical information need into answerable questions, searching the literature, critical appraisal of relevant papers and application of findings. The assessments were graded for capture of impact on practice using the Kirkpatrick (modified) scale, 7 which evaluates outcomes of medical educational interventions (Table 1) at the following levels: participation or completion, modification of attitudes or perceptions, modification of knowledge or skills, health professional's behaviour, and change in delivery of care and health outcomes. Percentages were calculated, and the Chi-square test for trend was used to assess whether observed trends could have arisen by chance.
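The Chi-square test for trend referred to above can be sketched as follows. This is a minimal illustration, not the authors' actual analysis code: it implements a Cochran-Armitage-style test with equally spaced scores, applied here to the reported coverage counts for all four EBM steps across the six cognitive subdomains (33, 33, 27, 1, 1 and 0 of 113 tools).

```python
import math

def chi2_trend(successes, totals, scores=None):
    """Cochran-Armitage chi-square test for trend across ordered groups.

    successes: count of tools covering each level
    totals: number of tools assessed at each level
    Returns (chi-square statistic, p-value) with 1 degree of freedom.
    """
    k = len(successes)
    if scores is None:
        scores = list(range(k))  # equally spaced scores for the ordered levels
    N = sum(totals)
    p = sum(successes) / N  # overall proportion of coverage
    # Trend statistic: score-weighted deviations from expected counts
    t = sum(s * (a - n * p) for s, a, n in zip(scores, successes, totals))
    var = p * (1 - p) * (
        sum(n * s * s for s, n in zip(scores, totals))
        - sum(n * s for s, n in zip(scores, totals)) ** 2 / N
    )
    chi2 = t * t / var
    # Survival function of chi-square with 1 df: P(X > x) = erfc(sqrt(x/2))
    pval = math.erfc(math.sqrt(chi2 / 2))
    return chi2, pval

# Coverage of all four EBM steps across the six cognitive subdomains
# (knowledge ... evaluation), each out of 113 tools (from the coverage table)
stat, pval = chi2_trend([33, 33, 27, 1, 1, 0], [113] * 6)
```

With these counts the statistic is large and the p-value very small, consistent with the steep, non-chance decline in coverage toward the higher subdomains.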
Results
Literature identified
Figure 1 Study selection flow chart for systematic review of assessment tools for evidence-based medicine (EBM) teaching

Figure 2 Coverage of various subdomains of cognition in Bloom's Taxonomy of educational objectives among 113 assessment tools for EBM teaching. Data presented in Appendix 2. For description of subdomains in the cognitive (knowledge) domain, see Table 1

Figure 3 Coverage of various subdomains of the affective (attitudes) domain in Bloom's Taxonomy of educational objectives among 113 assessment tools for EBM teaching. Data presented in Appendix 2. For description of subdomains in the affective (attitude) domain, see Table 1
There were 113 relevant studies in which the instruments were administered in 72 (64%) postgraduate and 41 (36%) undergraduate settings. Nursing featured in nine (8%) studies, of which only three were in the undergraduate setting. Dentistry featured in only one (1%) undergraduate study. In 49 (36%) studies there was a pre- and post-course component. A questionnaire was used in 62 (46%) studies, of which 30 (22%) were self-administered. Seven contained open questions, 19 used multiple-choice questions and two used essays. Six studies asked questions in the form of interviews and seven studies used short answers to respond to questions. There were 23 randomized trials. The assessment tool was purported to be validated in 61 (45%) studies, with the validation being part of the study in 36 (26%); 16 (12%) gave no detail on how the tools had been validated. There was evidence of content validity in 45 (33%) tools, construct validity in 43 (32%), face validity in 41 (30%), concurrent criterion validity in 11 (8%) and predictive criterion validity in nine (7%) (Appendix 1).
Coverage of educational domains, subdomains and levels
As shown in Figures 2–5 (and Appendix 2), all four steps of EBM were covered in the knowledge and comprehension components of the cognitive domain in 33 (29%) studies. This fell to 27 (24%) for the use of learnt material in new and concrete situations (application), and to only one (1%) for the analysis and synthesis subdomains. None of the studies assessed the ability to judge the value of material for a given purpose (evaluation). In the affective domain, 34 (30%) studies measured all four steps of EBM for willingness to attend to particular phenomena or stimuli (receiving). This fell to 28 (25%) for actual measurement of participation (responding) and 21 (19%) for the valuing component. None of the studies went on to assess retention and incorporation of teaching into practice (organization and characterization). In the psychomotor domain, 34 (30%) studies measured the perception and set components in all four steps of EBM, while nine (8%) extended assessment to examining whether learnt responses had become habitual. None of the studies went beyond this to measure the higher subdomains.
Figure 4 Coverage of various subdomains of the psychomotor (skills) domain in Bloom's Taxonomy of educational objectives among 113 assessment tools for EBM teaching. Data presented in Appendix 2. For description of psychomotor (skills) subdomains, see Table 1

Figure 5 Coverage of various levels of the Kirkpatrick hierarchy of outcomes of educational interventions among 113 assessment tools for EBM teaching. Data presented in Appendix 2. For description of Kirkpatrick (modified) levels, see Table 1
Discussion
Principal findings
We found that no single tool covered the assessment of all EBM steps to the highest level of learning achievement possible. Tools were often restricted to assessing subdomains at the lower end of the achievement spectrum. Assessments of application in practice and impact on delivery of care were rarely covered. This is likely to be due to teaching being directed at achieving basic educational objectives rather than being aimed at changing practice. Thus it is not possible to measure whether there was transfer of learning to the workplace.
Strengths and weaknesses in relation to other studies
The validity of our findings depends on the robustness of our methodology. Previous reviews on this subject have been simplistic in their approach. 4, 5 While things should be made as simple as possible, they should never be simplified to the point that valuable information is lost. Thus earlier work failed to analyse information in ways that could yield a sound synthesis for EBM education. When previous reviewers 4 were challenged, 5 they accepted that they were unaware of Bloom's taxonomy and Kirkpatrick's hierarchy. We used a rigorous approach both in research synthesis methods and in educational philosophy. Being more up-to-date and comprehensive in searching, we captured more studies than previous reviews. This review attempts to go beyond a simple knowledge, attitudes and behaviours classification of learning. Our detailed analysis using all 20 subdomains of Bloom's taxonomy of learning objectives allows a comprehensive educational analysis of existing EBM curricula. Coverage of Kirkpatrick's hierarchy in the assessments further aided in evaluating whether education was directed at improving practice and clinical outcomes. Given so many domains and subdomains to assess, inevitably some subjectivity was involved in data extraction. We piloted our data extraction form and used reviewers in triplicate to obtain reliable judgements about the data. We are, therefore, confident that the limitations in current EBM teaching made explicit through our work merit consideration.
Clinical learning through reading scientific papers dates back to 1875, when Sir William Osler held the first journal club. 8 EBM has become a specialty in its own right, with a vast menu of methods for teaching, learning and assessment. Educationalists stipulate that a good curriculum should be accompanied by assessments that match the aims of the learning experience to be provided. The stated aim of EBM is to improve practice, so it was surprising that so few teaching efforts were accompanied by appropriately developed assessments. One might argue that undergraduate teaching need not go as far as application of EBM, although we would disagree with this viewpoint, as it is important at all levels to demonstrate EBM as a practical tool.
Limitations of the review
Owing to incomplete reporting in the studies, it was not possible to obtain the information required to fully assess the validity of the assessment tools. The data extracted were summarized according to what was reported in the studies.
We included randomized studies but assessed only the quality of their assessment tools. We did not undertake a review of effectiveness, which is where assessment of the quality of randomization, blinding, et cetera would be important. A Cochrane review 9 and other publications 1 have looked into this area in detail.
Implications for clinicians or policymakers
In our review, studies on postgraduate and continuing education constituted more than two-thirds of the included literature, and analysis within this subgroup confirmed our main finding that assessments tended not to focus on application. Ideally, EBM education should be clinically integrated, exploiting opportunities for just-in-time learning through on-the-job training. 10 Courses organized for classroom teaching should be integrated with clinical practice through follow-up. In this regard we consider stand-alone teaching to be merely a precursor to properly clinically integrated teaching, not an end in itself.
Future research
Teaching methods should be intended to impact directly on practice. Curriculum design should be accompanied by assessments that match the aims of the learning experience to be provided. EBM education can be designed to be clinically integrated, exploiting opportunities for just-in-time learning through on-the-job training. Incorporating EBM assessments within clinical assessment methods such as the mini clinical evaluation exercise, direct observation of procedural skills and objective structured assessment tools is the right step forward. 11, 12
Conclusions
Many EBM assessment instruments exist, but assessment is often restricted to lower levels of learning achievement, and changes to the delivery of care are rarely covered. Existing assessment tools with sound measurement properties may be adapted for use in teaching where the domains covered have content validity, matching the learning objectives set for the course. However, EBM education directed at improving application in practice will need further efforts to design teaching that improves practice and assessments that capture changes in it.
Footnotes
DECLARATIONS
Acknowledgements
None
Evaluations of measurement properties (validity) among 113 assessment tools for each and all of the key EBM steps

| Measurement property | n | % |
|---|---|---|
| Instrument purported to be validated | 61 | 45 |
| Part of study | 36 | 26 |
| Expert validity | 32 | 24 |
| Previously validated | 28 | 20 |
| Statistically tested | 32 | 24 |
| No explanation given | 16 | 12 |
| Validity | | |
| Construct validity: the extent to which a question measures a hypothetical construct | 43 | 32 |
| Content validity: the content matches the learning objectives | 45 | 33 |
| Face validity: on the face of it, do the questions appear fair and does the instrument make straightforward sense | 41 | 30 |
| Concurrent criterion validity: the instrument's performance compared with other validated instruments administered at the same time | 11 | 8 |
| Predictive criterion validity: follow-up to assess how well the instrument can foresee future performance | 9 | 7 |

Coverage of various subdomains and levels in Bloom's Taxonomy of educational objectives and Kirkpatrick's Hierarchy among 113 assessment tools for each and all of the key EBM steps. For descriptions of the subdomains and levels, see Table 1. Values are n (%).

| EBM steps | Question framing | Literature search | Critical appraisal | Application | All EBM steps |
|---|---|---|---|---|---|
| Cognitive domain | | | | | |
| Knowledge | 47 (42) | 68 (60) | 63 (56) | 39 (35) | 33 (29) |
| Comprehension | 45 (40) | 63 (56) | 57 (50) | 37 (32) | 33 (29) |
| Application | 35 (31) | 46 (40) | 42 (37) | 32 (28) | 27 (24) |
| Analysis | 4 (3) | 6 (5) | 5 (4) | 3 (3) | 1 (1) |
| Synthesis | 1 (1) | 2 (1) | 2 (2) | 1 (1) | 1 (1) |
| Evaluation | 0 | 0 | 0 | 0 | 0 |
| Affective domain | | | | | |
| Receiving | 45 (40) | 63 (56) | 56 (50) | 42 (37) | 34 (30) |
| Responding | 44 (39) | 59 (52) | 47 (42) | 35 (31) | 28 (25) |
| Valuing | 30 (27) | 41 (36) | 36 (32) | 27 (24) | 21 (19) |
| Organization | 0 | 0 | 0 | 0 | 0 |
| Characterization | 0 | 0 | 0 | 0 | 0 |
| Psychomotor (skills) domain | | | | | |
| Perception | 45 (40) | 61 (54) | 50 (44) | 40 (35) | 34 (30) |
| Set | 45 (40) | 58 (51) | 50 (44) | 40 (35) | 34 (30) |
| Guided response | 35 (31) | 44 (39) | 34 (30) | 31 (27) | 26 (23) |
| Mechanism | 13 (12) | 21 (19) | 12 (11) | 10 (9) | 9 (8) |
| Complex overt response | 2 (1) | 5 (4) | 1 (1) | 1 (1) | 0 |
| Adaptation | 1 (1) | 2 (1) | 0 | 0 | 0 |
| Organization | 1 (1) | 2 (1) | 0 | 0 | 0 |
| Kirkpatrick's levels | | | | | |
| Participation or completion | 28 (25) | 41 (36) | 38 (34) | 24 (21) | 26 (23) |
| Modification of attitudes | 45 (40) | 42 (37) | 37 (32) | 24 (21) | 21 (19) |
| Modification of knowledge or skills | 47 (42) | 45 (40) | 41 (36) | 26 (23) | 23 (20) |
| Health professional's behaviour | 15 (13) | 23 (20) | 20 (18) | 17 (15) | 11 (10) |
| Change in delivery of care and health outcomes | 8 (7) | 9 (8) | 9 (8) | 10 (9) | 4 (3) |
