Abstract
This study investigated handouts on common upper extremity problems for inaccuracies, distracting information, and concepts that reinforce common unhelpful cognitive biases. We reviewed handouts on upper extremity conditions from 2 electronic medical records and 2 professional associations. We categorized information as inaccurate, distracting, or at risk of reinforcing common unhelpful cognitive biases. Reading level, quality, and the ability of patients to process and take action were also rated. We found an average rate of 1.9 inaccurate statements per 100 words, 0.73 distracting statements per 100 words, and 2.1 statements reinforcing common unhelpful cognitive biases per 100 words. Handouts from professional societies were rated higher quality and had a higher reading grade level, and on average were constructed for better understandability. Patient handouts have a notable rate of inaccuracies, distractions, and information that may reinforce less adaptive cognitions. Greater attention is merited to making patient handouts readable, understandable, hopeful, and enabling.
Introduction
Clinicians often give patients handouts to help them learn about their condition. Some electronic medical records (EMRs) may include automatic orders for handouts. Many professional societies prepare handouts and make them available in web or print form. To our knowledge, the information in handouts is not as well studied as information available on the internet.
Information on the web varies in quality, accuracy, and reading level (1–5). Patients are at substantial risk of encountering health information of poor quality or information delivered using technical terminology that may better suit surgeons than patients. It is often difficult for patients to determine whether information provided on the web or in print is reliable. An effective teaching tool would allow people from diverse backgrounds and varying levels of health literacy to process, understand, and take action to promote their health (6). Technical information may be confusing for the lay person, and inaccurate information may be hard to identify (7,8). Patient educational materials are recommended to be written at or below a sixth grade reading level to be suitable for a majority of the American adult population.
Ideally, handouts would provide optimistic, enabling, and empowering information that can improve health and align choices with what matters most to an individual (their values). Suboptimal terminology can reinforce common unhelpful cognitive biases, may be inaccurate, and can provide information that is distracting from the key issues to be considered (9–12). We classified the information in patient handouts for hand and upper extremity conditions from 2 EMRs and 2 professional societies in order to identify information that is inaccurate, distracting, or has the potential to reinforce common unhelpful cognitive biases. Our second aim was to compare the reading level, quality, actionability, and understandability of the patient handouts.
Methods
Institutional review board approval was not applicable. We analyzed the patient handouts for ganglion cyst, carpal tunnel syndrome, De Quervain tendinopathy, Dupuytren disease, rotator cuff tendinopathy, lateral epicondylitis, medial epicondylitis, thumb arthritis, osteoarthritis, ulnar neuropathy, and trigger finger available on 2 EMRs, as well as handouts from one orthopedic and one hand surgery professional association. The EMR handouts are generated automatically when a diagnosis is entered, in contrast to handouts from professional societies, which are available on their websites or as paper pamphlets. A total of 37 handouts were analyzed.
One orthopedic resident and one fellowship-trained orthopedic hand surgeon categorized the information in patient handouts as follows: inaccurate statements, distracting statements, and statements that might reinforce common unhelpful cognitive biases (Table 1). The number of each type of misconception was recorded per 100 words per handout. Initial categorization was performed by the orthopedic resident, with each handout then reviewed by the orthopedic hand surgeon. Discrepancies were to be resolved by discussion; none required adjudication.
Categorization of Reinforcing Common Unhelpful Cognitive Biases, Inaccuracies, and Distractions.
Patient handouts were also assessed using the Flesch-Kincaid reading level, the DISCERN instrument, and the PEMAT tools. Reading level is typically reported as a grade level computed by a readability algorithm. For this study, we copied the content from each handout into a Microsoft Word document and determined the Flesch-Kincaid reading level. The DISCERN score was one of the first standardized quality indexes for consumer health information and categorizes information as excellent (63–75), good (51–62), fair (39–50), poor (27–38), or very poor (15–26) (8,13). The PEMAT Understandability and Actionability tools help measure whether patients with varying levels of health literacy can process information from handouts and whether they are able to act on the information provided. The understandability score takes into account the structure and organization of the handout, how numbers are presented, word choice, and the use of visual aids. Actionability depends on the use of visual aids and the explicit description of how patients can act on the information provided.
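For readers unfamiliar with how grade-level scores are derived, the published Flesch-Kincaid grade-level formula can be sketched as below. This is a minimal illustration, not the exact algorithm Microsoft Word uses; in particular, the syllable counter here is a naive vowel-group heuristic, so scores may differ slightly from Word's output.

```python
import re

def count_syllables(word: str) -> int:
    # Naive heuristic: count groups of consecutive vowels; trim a silent final 'e'.
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_kincaid_grade(text: str) -> float:
    # Flesch-Kincaid grade level:
    # 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / max(len(sentences), 1))
            + 11.8 * (syllables / max(len(words), 1))
            - 15.59)
```

Short, monosyllabic sentences yield a score near or below first grade, while long sentences with polysyllabic words push the score into the double digits, which is why dense medical prose often exceeds the recommended sixth grade level.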
Statistical Analysis
We counted the number of instances of inaccuracy, distraction, and potential reinforcement of common unhelpful cognitive biases in each patient handout per 100 words. For each category, we calculated the mean ± standard deviation (SD) per 100 words per patient handout. This was also done for the Flesch-Kincaid reading level, DISCERN, and PEMAT tools per patient handout. We compared mean rates of inaccuracies, distractions, potential reinforcement of common unhelpful cognitive biases, reading level, PEMAT scores, and DISCERN scores between sources of patient handouts using 1-way analysis of variance tests. Post hoc Tukey HSD pairwise comparisons were used to determine which categories were significantly different. We considered P < .05 a statistically significant difference.
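The per-handout normalization and summary statistics described above can be sketched as follows. The counts and word lengths here are hypothetical, purely to illustrate the calculation; the ANOVA and Tukey HSD steps would typically be run in a statistics package and are not shown.

```python
from statistics import mean, stdev

def rate_per_100_words(instance_count: int, word_count: int) -> float:
    # Normalize a raw count of flagged statements by handout length.
    return 100.0 * instance_count / word_count

# Hypothetical counts of inaccurate statements and handout word counts.
counts = [4, 7, 2, 5]
lengths = [250, 310, 180, 400]

rates = [rate_per_100_words(c, n) for c, n in zip(counts, lengths)]
summary = (mean(rates), stdev(rates))  # reported as mean ± SD per 100 words
```

Normalizing per 100 words lets handouts of very different lengths be compared on a common scale before group means are contrasted across sources.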
Results
There were an average of 1.9 inaccuracies per 100 words (SD 0.98), 0.73 instances of distracting information per 100 words (SD 0.73), and 2.1 (SD 0.98) statements that might reinforce common unhelpful cognitive biases per 100 words in the handouts (Table 2). Professional society 1 had less inaccurate information than EMR 1 and EMR 2 (P < .05; Table 2). Electronic medical record 2 had more distracting information than both professional societies and EMR 1 (P < .05). There was no difference in the number of statements potentially reinforcing common unhelpful cognitive biases between handouts from EMRs and those from professional societies.
Rate of Misconceptions, Distractions, and Inaccuracies per 100 Words per Handout.a
a Bold indicates statistically significant; Continuous variables as mean ± standard deviation.
The average Flesch-Kincaid reading level was 7.7 (SD 1.6), with a range from 5.3 to 11. Sixty-five percent of handouts were at a seventh grade reading level or above. The handouts available from professional societies had higher average reading levels than those from EMRs (P < .05; Table 3).
Readability and Quality Metrics.a
a Bold indicates statistically significant; Continuous variables as mean ± standard deviation.
On average, handouts from professional societies were more understandable to patients with diverse levels of health literacy than handouts from EMRs (mean 50 [SD 9.3], P < .05). The actionability scores varied significantly, but not by source (EMR vs professional society; P < .05; Table 3).
On average, the handouts were classified as fair quality using the DISCERN score (mean DISCERN score of 39 [SD 13]). Handouts from professional society 1 provided better quality information when DISCERN scores were compared with all of the other sources (P < .05; Table 3). DISCERN scores from patient handouts from EMR 1 and professional society 2 were not significantly different.
Discussion
We investigated inaccuracies, distracting information, and potential reinforcement of common unhelpful cognitive biases in patient handouts from EMRs and professional associations. Statements that reinforce common unhelpful cognitive biases and inaccurate statements were quite common, which raises the concern that handouts may reinforce less healthy mindsets that are known to contribute to greater pain intensity and magnitude of limitations. In other words, well-intentioned handouts have the potential to make a person feel more ill.
The results of this study should be interpreted along with its limitations. The ratings of potential reinforcement of less adaptive coping strategies are based on an interpretation of best available evidence and a motivation to avoid reinforcing worst-case thinking (catastrophic thinking) or fear of movement (kinesiophobia) given the moderate to large correlation of these factors with pain intensity and magnitude of limitations in people with upper limb conditions (1,2,14). This determination of potential reinforcement of less effective cognitive coping strategies was somewhat subjective, but there were no instances of debate. Given the importance of avoiding reinforcement of unhealthy mindsets, a low threshold for labeling a statement as potentially misleading seems justified. Reading level measures are largely based on a count of syllables, and there is debate regarding the degree to which this reflects reading level (9,10,15).
Statements that might reinforce common unhelpful cognitive biases are common in patient handouts, particularly those available in EMRs. Common cognitive errors that increase symptom intensity and magnitude of limitations fit the categories of “hurt represents harm,” “it’s taking too long,” and other types of worst-case thinking. These thoughts are inconsistent with most of the common hand and upper extremity problems, many of which are aspects of normal human aging while others are benign and self-limited. Statements that reinforce worst-case thinking, kinesiophobia, and other automatic but common unhelpful cognitive biases increase symptom intensity and magnitude of limitations (16–18). Misconceptions reinforced by common unhelpful cognitive biases might also lead people to choose options that are not consistent with what matters most to them (their values). We encourage writers of health information to take care to use the most hopeful, enabling, empowering language consistent with best evidence about a given disease.
Handouts from the EMRs were easier to read than handouts from professional medical associations. Feghhi et al demonstrated good reliability and accessibility of online pediatric orthopedic educational materials (19). We found that handouts from all 4 sources had an average reading-level grade higher than that recommended by the National Institutes of Health (NIH). This suggests that many patients may be unable to fully read and comprehend the information provided to them in the handouts. The reading level of professional association handouts was higher than that of the EMR handouts. A 2008 study found that only 2% of articles on The American Academy of Orthopaedic Surgeons (AAOS) patient-oriented web pages provided an appropriate reading level for patients (9). A 2014 study compared the reading level of the 2008 AAOS handouts with the revised patient-directed website and found that 84% of web pages remained above the eighth grade reading level (20). Multiple studies have found the reading level of content provided by professional associations to be higher than recommended by the NIH (9–12,21–24).
Professional society handouts had higher understandability scores than handouts from EMRs. Limited and low-quality visual aids in handouts from EMRs contributed to lower PEMAT scores; specifically, EMR handouts consistently scored poorly, largely due to confusing or unclear descriptions and inaccurate visual depictions. The variability seen in actionability scores depended on whether sources provided material that broke down actions into manageable, explicit steps and offered a clear structure to help patients take action.
Current evidence suggests that it is difficult for patients to find reliable information that is also optimistic, enabling, and empowering. Even patient handouts available in EMRs and those from professional societies can be difficult to read, distracting, inaccurate, and have the potential to reinforce common unhelpful cognitive biases. In other words, patient handouts intend to inform and empower, but they often mislead and misinform. It seems that greater attention to reading level, accuracy, relevance, and health promotion aspects of handouts is merited. Patient handouts can be written at a sixth grade reading level with greater attention to the language and concepts used. Medical jargon and colloquialisms have the potential to be distracting, inaccurate, or to reinforce unhelpful cognitive biases.
Footnotes
Authors’ Note
The study has been performed in accordance with the ethical standards in the 1964 Declaration of Helsinki and has been carried out in accordance with relevant regulations of the US Health Insurance Portability and Accountability Act (HIPAA). This work was performed at The Albany Medical Center; Albany, NY; and The University of Texas at Austin, Dell Medical School; Austin, TX.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
