Abstract
A recently published manuscript on histological changes induced in the pancreas by incretin-based medications has been widely criticized, both because the groups treated with incretin-based versus non-incretin-based medications were ill-matched and because of methodological problems in identifying glucagon-producing cells. A study making use of the same tissue bank is now available and does not confirm the bulk of the findings originally reported. This is an important opportunity to discuss the responsibility of authors to publish results that are potentially reproducible by other scientists, as an important quality criterion, and the responsibility of reviewers and editors in handling such manuscripts. The main conclusion is that attempts to reproduce controversial findings are a necessity when the essence of novel results is at stake.
Consider yourself sitting in a soccer stadium, watching a striker enter the box when, suddenly, he and a defender from the other team go to ground. Supporters of the attacking team demand a penalty, because they claim to have seen a foul by the defender, but you, as a relatively independent spectator, are not really sure what happened. In this situation, a human being called the referee needs to make a judgment and a decision, no matter how well he was able to view the details of what the defender did to the striker. He may ask one of his assistants at the sideline, but however well founded the decision might be, he will either have to blow his whistle in favor of a penalty or make everyone accept that the game continues without one. Even though this might be a professional-league match, perhaps deciding an important championship, which is to say, with a lot of money and financial interests involved, it belongs to the world of sports, driven by the idea of fair play, which includes accepting a referee’s decision even if a video replay later proves it wrong. The rules are exactly like that, perhaps because young people should learn that uncertainty always remains when human judgment and decision making are involved.
Now, consider watching a scientific dispute: the players are scientists, either in favor of (offensive player) or against (defender) a certain hypothesis. A situation arises in which data have been obtained that allow different judgments. For one team, the new finding raises questions about the overall risk-benefit relationship of a novel treatment; the other team first of all questions the technical integrity of the findings (due to methodological problems) and, second, does not agree with the far-reaching conclusions suggested by those who generated the original data and submitted them for publication. The soccer referee now turns into the editor-in-chief of a scientific journal, assisted by his linesmen (the referees), and he will have to judge the manuscript and make a decision. Like judging the details of a sports scene, this decision should, if at all possible, be based on a precise assessment: (1) Is the information novel? (2) Do the data (as obtained, hopefully based on a sound methodology) support the conclusions drawn? (3) Does it seem likely that the findings can be reproduced by other research groups? (4) Are the conclusions important and relevant for the field? As with decisions in a sports arena, a manuscript often does not allow one to answer all of these questions with a wholehearted yes.
But, again, a decision needs to be made, taking into account the limitations that decision-makers in the real world have to accept: if the data are of potential interest, any requests for revision should be reasonable with regard to the necessary effort; even experts in the field may not be able to fully judge the details and limitations of some of the methods used; different opinions are possible on the importance and novelty of certain findings; and this certainly applies to the question of whether certain findings have the potential to change clinical practice, for example because they shed new light on the benefits or risks of drug treatment.
Butler et al have previously1,2 and currently3 published diverse findings, at the level of epidemiology and pancreatic histology, raising concerns about a potential of incretin-based drugs to cause acute and perhaps chronic pancreatitis and proliferation of pancreatic exocrine cells, potentially indicating a long-term risk for pancreatic carcinoma, as well as changes in the endocrine pancreas (expansion of β- and α-cells, co-staining for insulin and glucagon, and the development of endocrine micro- and macro-adenomas expressing glucagon). Both the findings themselves (methodologically sound? reproducible?) and their interpretation (reason for a serious change in the risk-benefit assessment of GLP-1 receptor agonists and/or DPP-4 inhibitors?) have prompted a highly controversial debate, with letters to the editor4-6 from pharmaceutical companies marketing incretin-based drugs and commentaries from the academic world in the journal that originally published the latest results.
Lawyers hired by pharmaceutical companies have requested that original documents describing research findings be legally confiscated and searched for evidence of scientific misconduct, based on allegations that reporting such results, which rest on supposedly weak evidence, will negatively influence sales of drugs claimed to cause severe adverse consequences during long-term use in patients with type 2 diabetes.
This is, unfortunately, where the analogy between decision making in a soccer stadium and in the publication process ends. In the first environment, we are talking about sports: we accept the rules of that game, and we have to accept the imperfections of human decision-making, because accepting the referee’s verdict is part of the game itself.
From the point of view of potential authors, at best there is a clear-cut hypothesis that is either supported or refuted by the experimental results. Being too strongly driven by certain expectations in either direction may seriously jeopardize unbiased reporting. We suggest that the best standard for the value of results is their potential reproducibility. The limitations of the methods used are often best recognized by the researchers using them, and they need to be openly discussed. In this regard, the recent publication by Bonner-Weir and colleagues17 is an important next step in this scientific debate. It is a serious attempt to reproduce important published findings, to alert the scientific community to methodological limitations, and to help provide alternative interpretations. Such scientific debates are the foundation of academic science. It is important to recognize that Bonner-Weir et al17 identified several methodological weaknesses in Butler et al’s3 recent publication, including all too lax criteria for matching the different patient groups with regard to important clinical characteristics, but they found no evidence of gross negligence in the way Butler et al generated, processed, analyzed, and interpreted their data, and certainly no hint of willful misconduct or plain scientific fraud. Thus, the appropriate answer is a continuing scientific debate.
Unfortunately, the reality of modern science also has a different face. When lawyers, judges, and investigative journalists become involved in resolving scientific questions, and when scientists have to be prepared for allegations from pharmaceutical companies, academic liberty is in serious danger.
We believe that scientists should retain the liberty to present their data as well as their opinions and conclusions, even if these conclusions go against certain interests, be they scientific or financial in nature. In a way, this might be considered the basis of “scientific fair play.” It is the responsibility of fellow scientists to challenge such ideas and conclusions by reproducing certain experiments and by adding novel findings. For this reason, the analyses by Bonner-Weir and colleagues17 provide a helpful addition that guides the academic community toward more balanced conclusions. Judges are trained to, and always will, come to a verdict, even if their expertise for deciding scientific questions may be limited. Journalists are primarily interested in a story that sells their journal, and may therefore tend to convey the more provocative view based on an extreme position rather than searching for the (boring?) truth. It is our hope that scientific debates will remain within, and can be resolved by, the academic community, much as difficult but necessary and unavoidable decisions in soccer matches are made on the soccer field rather than in a courtroom. Attempts to reproduce challenging findings may often be required to reach the necessary degree of certainty about the value of questionable findings, much as the Fédération Internationale de Football Association (FIFA) is considering introducing novel standards such as goal-line technology and video replays as decision aids.
Footnotes
Declaration of Conflicting Interests
The author(s) declared the following potential conflicts of interest with respect to the research, authorship, and/or publication of this article: MAN has been member on advisory boards or has consulted with AstraZeneca, Boehringer Ingelheim, Eli Lilly & Co, GlaxoSmithKline, Hoffman La Roche, Menarini/Berlin Chemie, Merck, Sharp & Dohme, NovoNordisk, Versatis, and Intarcia Therapeutics, Inc. He has received grant support from Eli Lilly & Co, Menarini/Berlin-Chemie, Merck, Sharp & Dohme, Novartis Pharma, and Ypsomed. He has also served on the speakers’ bureau of AstraZeneca, Boehringer Ingelheim, Eli Lilly & Co, Hoffman La Roche, Menarini/Berlin Chemie, Merck, Sharp & Dohme, and NovoNordisk. JJM has participated in advisory boards or was invited as speaker by Astra Zeneca, BMS, Boehringer-Ingelheim, Eli Lilly, Intarcia Therapeutics, Inc, MSD, Novo Nordisk, Novartis, and Sanofi. He has received research support from Eli Lilly, Boehringer-Ingelheim, MSD, Novo Nordisk, Novartis, and Sanofi.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
