Abstract
Introduction
Social media websites, such as Facebook, have made an effort to monitor and label news stories and op-eds that could be false or misleading. As such, we sought to evaluate fact checkers for news stories and op-eds that circulate on Facebook.
Methods
We searched all articles on HealthFeedback.org for the names and professional titles of reviewers and other quoted individuals cited in each article. We searched Twitter on March 10, 2021, to determine whether the reviewers and quoted individuals had accounts and noted their numbers of Twitter followers.
Results
The median number of Twitter followers for reviewers was 10,000 (since January 2020) versus 1012 (prior to January 2020).
Conclusion
Current fact-checking processes appear to be strongly associated with large Twitter followings. Greater transparency in the process of determining misinformation is needed.
Introduction
Social media websites, such as Facebook, have made an effort to monitor and label news stories and op-eds that could be false or misleading.1 HealthFeedback.org is one of dozens of organizations that review media articles for Facebook, but it focuses specifically on health and medical articles. It solicits two to four reviewers per article, who make a determination as to which articles are misleading. Reviewers apply to review articles identified by Facebook, in contrast to reviewers of scientific articles, who are invited, and reviewers do not disclose conflicts of interest. Fact checking for misinformation relies on unbiased and objective reviewer selection and processes to balance the protection of scientific dialogue with the removal of misinformation from Facebook. Hence, we sought to evaluate fact checkers for news stories and op-eds that circulate on Facebook.
Methods
On 3 October 2021, we searched and included all articles on HealthFeedback.org since the website's inception. We abstracted the names of reviewers and other quoted individuals cited in each article, their professional titles, the title of the article they reviewed, the number of articles reviewed, and the year of review(s). We searched Twitter on 10 March 2021 to determine whether the reviewers and quoted individuals had accounts and noted the number of Twitter followers. Data are presented as frequencies and are stratified into pre-Covid-19 and post-Covid-19 periods (before and since January 2020). We tested differences in means with a t-test, differences in medians with the Wilcoxon rank-sum test, and differences in categorical variables with the Fisher exact test. As data were publicly available, institutional review board approval was not required.
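The tests named above can be sketched in SciPy. This is an illustrative sketch only: the follower counts and contingency-table entries below are hypothetical placeholders, not the study's data.

```python
from scipy import stats

# Hypothetical Twitter follower counts for reviewers (not the study's data)
pre_2020 = [150, 800, 1012, 2500, 6000]         # before January 2020
post_2020 = [4000, 9000, 10000, 52000, 135000]  # since January 2020

# Difference in means: two-sample t-test
t_stat, t_p = stats.ttest_ind(pre_2020, post_2020)

# Difference in medians: Wilcoxon rank-sum test
w_stat, w_p = stats.ranksums(pre_2020, post_2020)

# Categorical comparison (e.g., active Twitter account yes/no):
# Fisher exact test on a 2x2 table of hypothetical counts
#         active  inactive
table = [[9, 5],    # pre-2020 reviewers
         [15, 4]]   # post-2020 reviewers
odds_ratio, f_p = stats.fisher_exact(table)
```

Follower counts are typically heavily right-skewed, which is presumably why the authors report medians and a rank-based test alongside means.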
Results
We found that of the 19 articles reviewed by HealthFeedback.org, seven (37%) were fact-checked since January 2020. The average number of reviewers per article was 3.6 prior to January 2020 and 2.7 since then.
Prior to January 2020, 64% of reviewers had active Twitter accounts, and the average follower count was 6986 (median 1012; Table 1). Since then, 79% of reviewers had active Twitter accounts, and the average follower count was 42,000 (median 10,000).
Table 1. Twitter activity among HealthFeedback.org fact checkers and quoted individuals. Footnote: Fisher exact test.
The average number of Twitter followers for quoted individuals was 161,123 (since January 2020) versus 314 (prior to January 2020).
Discussion
We found that reviewers who were selected to review articles for Facebook, and the individuals whose quotes are included, have large followings on Twitter, and in at least one case the reviewer had publicly stated an opinion of the article prior to the review. These findings are concerning in light of survey results from another study, which reported that social media was the biggest perceived source of Covid-19 misinformation among researchers, clinicians, and academics.2 Our findings are limited by the small sample size, but we included all articles and reviewers on HealthFeedback.org during our study time frame.
The spread of misinformation on Facebook raises concerns. Current fact-checking processes appear to be strongly associated with large Twitter followings. Further, reviewers appear to be selected in part on the basis of their social media presence. Greater transparency in the process of determining misinformation is needed.
Footnotes
Conflict of interest
Vinay Prasad discloses: research funding from Arnold Ventures; royalties from Johns Hopkins Press, Medscape, and MedPage; honoraria for grand rounds/lectures from universities, medical centers, non-profits, and professional societies; consulting for UnitedHealthcare and OptumRX; and, additionally, that the Plenary Session podcast has Patreon backers, YouTube, and Substack. Dr. Haslam has no financial or non-financial conflicts of interest to report.
Contributorship
Both AH and VP were involved with study design, data acquisition, and drafting of the manuscript. AH conducted the data analysis.
Ethical approval
As data were publicly available, institutional review board approval was not required.
Funding
The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by Arnold Ventures.
Guarantor
VP.
