Abstract
The current crisis of confidence in psychological science has spurred field-wide reforms to enhance transparency, reproducibility, and replicability. To solidify these reforms within the scientific community, student courses on open science practices are essential. Here we describe the content of our Research Master course “Good Research Practices,” which we designed and taught at the University of Amsterdam. Supported by Chambers’ recent book The 7 Deadly Sins of Psychology, the course covered topics such as questionable research practices (QRPs), the importance of direct and conceptual replication studies, preregistration, and the public sharing of data, code, and analysis plans. We adopted a pedagogical approach that: (a) reduced teacher-centered lectures to a minimum; (b) emphasized practical training in open science practices; and (c) encouraged students to engage in the ongoing discussions in the open science community on social media platforms.
Introduction
Over the last eight years, psychological research has been in the midst of a “crisis of confidence” (e.g., Pashler & Wagenmakers, 2012; Simmons, Nelson, & Simonsohn, 2011). Central to the crisis is the increasing realization that common research practices may in fact be deeply problematic. Examples include poor study design (i.e., low statistical power; Button et al., 2013; Ioannidis, 2005), the field's reluctance to conduct direct replication studies (Pashler & Harris, 2012; Schmidt, 2009), and a bias to selectively report positive results (Francis, 2013; Scargle, 1999). Moreover, many researchers self-admit to the use of so-called questionable research practices (QRPs; John, Loewenstein, & Prelec, 2012). Hidden from the reader, QRPs exploit researchers' degrees of freedom in study design and analysis in order to produce significant findings. For instance, researchers may decide to stop data collection when the result reaches significance, exclude data points based on their impact on the results, or report unexpected findings as having been predicted from the start (Kerr, 1998; Simmons et al., 2011). The detrimental effect of these practices is evident from recent surveys and large-scale replication projects. For instance, a survey among over 1,500 scientists revealed that 90% believe there is indeed a crisis, with 52% observing a “significant crisis” (Baker, 2016). These perceptions are substantiated by large-scale replication efforts, which demonstrated replication rates from 36% to 77% (Camerer et al., 2018; Klein et al., 2014, 2018; Open Science Collaboration, 2015).
To combat the crisis of confidence, the scientific community has begun to adopt research standards that reduce cherry-picking and significance chasing. For instance, an effective practice that has quickly gained popularity is preregistration. When preregistering their studies, researchers outline their analysis plan before the data are collected. Because the analysis pipeline cannot be tailored to the data, researchers protect themselves against hindsight bias and other QRPs that may unwittingly contaminate the results. Researchers can choose to either preregister their study independently or integrate preregistration with the peer-review process (i.e., in the form of a Registered Report; Chambers, 2013). In addition, the scientific community has launched various initiatives to increase transparency. For instance, to encourage data sharing, Morey et al. (2016) started the Peer Reviewers' Openness (PRO) initiative. PRO signatories agree to provide a full review only for articles that share data and materials in a public repository, or provide reasons why this is not possible. Journals have also promoted transparency standards, for instance by signing up to the Transparency and Openness Promotion guidelines (TOP; Nosek et al., 2015), or by providing open science badges for preregistration and sharing of data and materials (Kidwell et al., 2016). Open science advocates have argued that the methodological reforms within the scientific community have been so substantial as to warrant descriptions such as “Revolution 2.0” (Spellman, 2015) or “Credibility Revolution” (Vazire, 2018).
In addition to the reforms within the research community itself, researchers have emphasized the need to overhaul methodological education. For instance, in the survey by Baker (2016), three of the five factors considered most promising for increasing the reproducibility in science were directly related to improvements in scientific training (i.e., “better statistical understanding,” “better mentoring/supervision,” and “better teaching”). Central among the proposed changes are offering lectures on the crisis of confidence and open scientific practices (Chopik, Bremner, Defever, & Keller, 2018; Funder et al., 2014; Munafò et al., 2017).
We believe that a course on good research practices deserves a place in the standard psychology curriculum, and that open scientific practices should be an inherent part of the methodological training of students for several reasons. First, without the proper education, students' opinions on the crisis of confidence tend to be “quite radical, superficial, or even emotional” (Chopik et al., 2018, p. 159). Educating students about the ongoing methodological changes allows them to develop informed opinions on these topics. Second, when students—the next generation of scientists—understand open science practices, they can confidently introduce them in their future labs. Third, students who pursue an academic career will ultimately be evaluated on whether they adhere to these practices. As journals and university policies are making increasing demands on transparency criteria, educating students about these practices seems advisable, if not imperative. Lastly, regardless of students' future career plans, advancing the methodological curriculum also benefits the students' development on a more general level. By following a course on good scientific practices, students learn to recognize scientific studies that meet certain quality standards reflected by, for instance, being preregistered, having open materials and data, being published as a Registered Report, including a power analysis, or reporting effect sizes. As such, a course on open science enhances students' skills to critically evaluate research, be it from the published literature or conducted by themselves, for instance as part of a thesis requirement.
Since 2015 we have offered the open science course “Good Research Practices” at the University of Amsterdam. The course covers the current crisis of confidence in psychological science and outlines attempts by the scientific community to increase the reliability and transparency in the field. “Good Research Practices” is a Research Master course; students generally know basic statistics and have had practical experience with the empirical cycle. This background makes it easier to understand the challenges and advantages of implementing open science practices. Nevertheless, the course is not technical in nature and mostly demands common sense—hence, the material may also be useful for a course for undergraduate students.
In this article, we aim to provide an overview of our course “Good Research Practices” in order to assist lecturers who intend to develop a similar course. Below we discuss the course objectives, describe our pedagogical approach, and illustrate the contents of two classes in more detail. Furthermore, we list the lecture topics together with suggested literature for students. Readers interested in the full course catalogue and materials can access them in our online appendix (accessible via https://osf.io/v3z7q/).
General Information
“Good Research Practices” is designed as a seven-week course comprising a total of 14 two-hour classes. A total of 43 Research Master Psychology students at the University of Amsterdam participated in the course last year (academic year 2018/2019), for which they were awarded six ECTS credits after completion (equivalent to 180 hours of work). Grading was based on a combination of bi-weekly quizzes about the background literature and the quality of the students’ short presentations and in-class assignments.
Course Objectives
In general, a course on good research practices should teach students how to critically review the scientific literature and how to conduct open, transparent, and reliable research. In addition, we wanted to immerse students in current debates and recent developments in the open science community. Specifically, our course had four objectives, as follows.
Our first objective was for students to reflect on various types of questionable research practices. In particular, we emphasized that researchers are not immune to biases (e.g., hindsight bias and confirmation bias) that cause them to selectively report analyses that yield publishable findings. To protect themselves against their own biases, researchers must rely on scientific practices that minimize hidden degrees of freedom (Wagenmakers, Wetzels, Borsboom, van der Maas, & Kievit, 2012). As primary course literature we used the book The 7 Deadly Sins of Psychology by Chambers (2017), which presents a clearly written, authoritative, and comprehensive account of the causes of and proposed solutions for the current crisis in psychological science.
Our second objective was to engage students in current debates and recent developments in the open science community. Social media platforms constitute a prominent stage for science communication and debates on research methods reforms. These platforms include Twitter, scientific blogs, and podcasts. As part of the curriculum, we encouraged students to stay informed about ongoing discussions and new developments within the open science movement, and to educate their peers in weekly “Newsflashes” on interesting debates, articles, or events. A list with Twitter handles, podcasts, and scientific blogs we recommended is available via https://osf.io/mcqa5/.
Our third objective was to let our students contribute to the curriculum themselves. We believe that students learn more when they are stimulated to actively participate in the course (e.g., Jang, Reeve, & Halusic, 2016; Reeve, 2016). Therefore, we adopted a flipped-classroom setting to reduce teacher-centered classes to a minimum. In this setting, students give short lectures, design in-class assignments, and lead group discussions.
Our fourth objective was to provide multiple perspectives on the open science movement. Therefore, we invited a series of guest speakers to present their most recent research projects, their perspectives on the developments within the scientific community, and their opinions on possible ways to resolve the crisis. Since the course was designed to illustrate the necessity and benefits of open science, we exclusively invited proponents of the open science movement. At the same time, we tried to select speakers who differ in their level of seniority, and who approach methodological reforms from different angles. In the current installment of the course, the guest speakers included a former student from the Research Master program (Bobby Lee Houtkoop), a science journalist (Hans van Maanen), metascience researchers (Balazs Aczel, Nick Brown, and Olmo van den Akker), and Chris Chambers, who is the chair of the Registered Reports committee at the Center for Open Science (https://cos.io/) and a leading force within the open science movement.
Pedagogical Approach
In line with our course objectives, we alternated regular classes with classes organized by students. Lectures in regular classes were given either by us or one of our guest speakers, and focused on the substantial impact that QRPs may have on the reliability of research findings. In particular, we explained why certain research practices can be considered “bad science” (Goldacre, 2009), and how such practices can be detected, and—importantly—avoided. The classes also featured specific in-class assignments and group discussions to deepen students’ understanding. In addition, the regular classes covered recent developments and debates within the open science movement; specifically, we reserved the last 20 minutes of each regular class for a “Newsflash” item, where students gave lightning presentations about relevant events, discussions, or articles they encountered on social media platforms that week. It should be noted that discussions following the lightning presentations were led by one of the lecturers, who could provide context and insight about the presented topics. These guided discussions are recommended, since students might not be aware that they are exposed to only a selective group of people who typically dominate these debates, and who may not be representative of the entire scientific community.
Classes given by students were structurally similar to regular classes. However, at about 10 minutes each, the student lectures were much shorter than regular lectures, leaving considerable time for active learning during the in-class assignments. Shorter student lectures also allowed us to have multiple groups present each week.
To encourage creativity and originality, students were instructed to base their lectures on relevant topics that had not already been elaborately discussed in their assigned readings. With respect to the in-class assignments, we emphasized that the exercises should have practical value for their peers; that is, the exercises should serve as training material for open science practices. Examples of this year’s in-class assignments are: tutorials on how to preregister a study or share data on the Open Science Framework (https://osf.io); trying out software tools that examine possible anomalies in individual articles (e.g., statcheck, Epskamp & Nuijten, 2016; or SPRITE, Heathers, Anaya, van der Zee, & Brown, 2018); or detecting hidden analytic flexibility in entire research fields (e.g., with a p-curve analysis as proposed by Simonsohn, Nelson, & Simmons, 2014). To illustrate our pedagogical approach, the next two sections describe a regular class and a student-organized class.
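To give a flavor of what such anomaly-detection tools do under the hood, the following Python sketch recomputes a p-value from a reported t statistic and flags inconsistencies with the reported p-value. This is a simplified stand-in, not statcheck itself (the actual statcheck package is written in R and additionally parses APA-formatted statistics out of article full texts); the function name and the mismatch tolerance are our own choices for illustration.

```python
# Minimal sketch of the consistency check behind tools like statcheck:
# recompute the p-value implied by a reported t statistic and its degrees
# of freedom, then compare it to the p-value reported in the article.
from scipy import stats

def check_t_test(t_value, df, reported_p, two_tailed=True, tol=0.005):
    """Recompute p for a t statistic and flag a mismatch with the report."""
    p = stats.t.sf(abs(t_value), df)  # one-tailed upper-tail probability
    if two_tailed:
        p *= 2
    return {"recomputed_p": p, "consistent": abs(p - reported_p) <= tol}

# Example: t(28) = 2.20 implies p ≈ .036, so a reported p = .036 is
# consistent, whereas a reported p = .01 would be flagged.
```

In class, an exercise like this makes concrete why rounding errors and misreported p-values are detectable from the published statistics alone, without access to the raw data.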
Example of a Regular Class: The Sin of Data Hoarding
The fifth week of the course focused on “The Sin of Data Hoarding” (Chambers, 2017), that is, the chapter on data sharing (for a recent special issue see Simons, 2018). As an expert on this topic we invited Bobby Lee Houtkoop, a former student from the same program. Houtkoop recently conducted and published a survey study to reveal why researchers are reluctant to share their data, and what can be done to overcome this reluctance (Houtkoop et al., 2018). In her lecture, Houtkoop discussed the dominant scientific culture in which data sharing is not the norm, even though data sharing offers unequivocal advantages for both the author and the scientific community. In cancer research, for instance, it was found that studies for which data were publicly shared received higher citation rates than studies for which data were not available (Piwowar, Day, & Fridsma, 2007). In addition, data sharing may improve the reputation or perceived integrity of the researcher. The scientific community benefits from data sharing since: (a) it increases the longevity of the data; (b) data can be reanalyzed and reused efficiently (e.g., for meta-analyses); and (c) statistical or reporting errors are more likely to be found (Vanpaemel, Vermorgen, Deriemaecker, & Storms, 2015; Wicherts, Borsboom, Kats, & Molenaar, 2006). Houtkoop then presented the methods and results of the survey study. The survey results demonstrated that data are shared only infrequently. Most respondents acknowledged the benefits and importance of data sharing in general; however, they perceived data sharing as less beneficial for their own research projects. Among the perceived barriers to data sharing are the respondents’ belief that data sharing is not a common practice in their fields, their preference to share data only upon request, their perception that data sharing requires additional work, and their perceived lack of training in data sharing.
Houtkoop's study sparked a lively discussion among the students about future research, about initiatives that encourage data sharing, but also about limitations of the study. In particular, the students were critical about potential biases in the results due to the low response rate of the survey (i.e., a response rate of only about 5%, which nevertheless translated into a sample of 600 respondents) and the self-selection of the respondents.
The end of the class featured a “Newsflash.” In that particular week, the science community was excitedly debating the results of the “Many Labs 2” project (Klein et al., 2018) which had just been published. In this project, the participating research teams conducted high-powered preregistered replications of 28 classic and contemporary findings across many samples and settings. The replication efforts showed that only 54% (i.e., 15 studies) could be replicated. In the newsflash, students discussed the article by Klein et al. (2018), the related news article published in The Atlantic titled “Psychology's replication crisis is running out of excuses” (Yong, 2018), and the BBC radio episode on the replication crisis (BBC Radio 4, 2018).
Example of a Student Class: The Sin of Data Hoarding
The student lecture continued where Houtkoop's study left off. The student presenters emphasized the benefits of data sharing and created a tutorial for their peers on how to archive and share data of simple empirical studies on the Open Science Framework (see also Soderberg, 2018). The objective of this lecture was to encourage their peers to ask their future thesis supervisors' permission to share the collected data in a public repository. The in-class assignment revolved around the Peer Reviewers' Openness initiative (PRO; Morey et al., 2016) mentioned in the introduction. Specifically, the students had their peers create a set of questions for the signatories of the PRO initiative, inquiring about signatories' post-PRO experiences with journals and editors, their attitude towards data sharing in their own research, and whether and how the signatories would improve the initiative. Students were divided into small groups and were instructed to read the article by Morey et al. (2016) on the PRO initiative. Then, each group had to propose concrete questions for the PRO signatories. In a plenary discussion, the students reviewed the questions, selected the ones they found most relevant, and created a survey. Since this exercise generated items that seemed informative and useful, the students who prepared the class decided to continue and execute the survey as a separate research project. Currently, the PRO initiative survey has elicited responses from over 120 of the current 340 signatories for whom email information could be retrieved (i.e., 37.4%).
Topics Covered
Topics Covered and Suggested Literature for the Course “Good Research Practices”
Student Evaluation and Recommendations for Future Courses
Student feedback was highly positive. Students particularly appreciated: (a) the guest lectures; (b) the group discussions about ongoing debates and recent articles; (c) the assigned literature (i.e., the course book and the additional articles), which was perceived as relevant and enjoyable; and (d) the teaching of important practical skills. The perceived work load was deemed appropriate, and students liked the fact that the course was designed to encourage regular work through quizzes and assignments.
Students were most critical about our emphasis on negative facets during regular classes, that is, QRPs and the crisis of confidence. Some students stated that discussing these aspects so frequently made them pessimistic about the current state of science. Furthermore, the students felt the two-hour classes were too short. In particular, students were disappointed that often only one group rather than two groups (as anticipated) could present during the student classes. This lack of time also repeatedly forced us to skip the weekly “Newsflashes.”
We believe the student feedback is constructive and helpful. We agree with the students that scheduling an additional hour for each class will reduce the time pressure. With regard to the focus on negative facets, we believe that the recognition of QRPs and “bad science” (Goldacre, 2009) is essential to motivate the methodological reorientation towards more transparency and rigor; on the other hand, our main objective was to inspire students to embrace open research practices, not to instill a sense of despair. As nicely put by Michèle Nuijten (2019), we want to “turn students into skeptics, not cynics.” Therefore, the next installment of our course will devote a larger proportion of time to the positive changes within the scientific community. For instance, we plan to restructure the lecture “Unreliability of Scientific Findings.” During this lecture, we focused mainly on the importance of conducting direct replications to determine the validity of alleged effects, and emphasized the lack thereof in the scientific literature. However, this lecture offers the opportunity to highlight recent large-scale replication efforts and multi-lab collaborations, such as the Open Science Collaboration (2015), the Many Labs projects (Klein et al., 2014, 2018), the ManyBabies project (Frank et al., 2017), and the Psychological Science Accelerator (Moshontz et al., 2018). In addition to a lecture which gives students a general overview of these collaborative efforts, it would be particularly interesting to invite a guest speaker who participated in one of these collaborations to share his or her experiences in working and publishing in such an environment.
Additionally, we would like to replace the lecture “Scientific Fraud” with a lecture on “Open Science within the University of Amsterdam” to educate our students on the concrete steps our university has taken to improve reproducibility, transparency, and openness. For instance, the ethical committee of the psychology department demands a detailed methods and analysis plan as a precondition for granting ethical approval for any research project; similarly, students are requested to write the introduction, methods, and analysis plan of their internship and thesis projects before data collection. Additionally, we would like to highlight the methodological and statistical consulting that is offered to both researchers and students, as well as several open science initiatives that were launched recently.
Concluding Remarks
Across 14 lectures, the course “Good Research Practices” taught psychology students about the causes of the crisis of confidence and about recent attempts by the scientific community to increase transparency, reproducibility, and replicability. In addition, students acquired practical skills on how to conduct research that is open, transparent, and reliable. We believe that this learning success was primarily due to the active role we gave students in our course. By being instructed to create lectures and in-class assignments that go beyond the assigned literature, students were able to choose articles covering topics that they considered most relevant for their future research projects. Furthermore, the students developed a sense of ownership for the lectures and in-class assignments, which facilitated ambitious student projects such as the PRO initiative survey.
As the scientific culture changes, practical knowledge on open scientific practices is becoming an increasingly important scientific skill. A course on this topic helps students not only to develop critical thinking, but also to get excited about conducting research that distinguishes sharply between its exploratory and confirmatory components. We hope that courses on open science practices inspire the future generation of psychological researchers to deliver psychology from the deadly sins that have so stained it in the past.
Footnotes
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This research was supported by a Netherlands Organisation for Scientific Research (NWO) grant to AS (406-17-568), a Veni grant from the NWO to DM (451-15-010), as well as a Vici grant from the NWO to EJW (016.Vici.170.083).
Author biographies
She has recently received an NWO Research Talent grant to develop methods for analysis blinding, validate them empirically, and make them accessible in the open-source software packages R and JASP and through teaching materials. Her teaching interests include Bayesian inference and open scientific practices. Since 2016, she has been a teaching assistant on the course “Good Research Practices.”
