Abstract
[Graphical abstract]
Introduction
Open science refers to a movement to make research and its processes freely available to access, read, and build upon. The goal of open science is to improve transparency, not only for other researchers but also for patients and the broader public.1-4 Implementing open science practices carries many advantages. For example, when research is published open access, the financial barrier to access is eliminated, allowing readers who cannot afford journal subscriptions, or who are not affiliated with a subscribing institution, to read it.1,4 Open access publication also increases the visibility and spread of research, since more people can download and read the articles.1,4 Furthermore, open science practices such as study registration and data sharing increase the transparency of research methods, which enables critical appraisal and replication.1,4
In August 2022, the White House's Office of Science and Technology Policy (OSTP) announced that US federal agencies are required to make all taxpayer-funded publications freely available to the public,5 a clear demonstration of the push toward open science practices. In addition, the government of Canada has begun to implement recommendations and action plans to ensure that open science practices are adopted in the research community, with the objective of providing recommendations and guidance on open access practices.6 The rationale for this push, according to the program, is that open science accelerates knowledge transfer, increases reproducibility, and leverages diversity and inclusion.6 Its first recommendation, "Canada should adopt an open science approach to federally funded scientific and research outputs," highlights the increased attention surrounding the open science concept.
The Transparency and Openness Promotion (TOP) guidelines,7 an initiative started by the Center for Open Science in 2015, evaluate journals based on their degree of compliance with transparency and openness standards. The initiative assigns a score, called the "TOP Factor," from 0 to 3 (with 3 being the best) for each of several open science categories, based on the journal's level of adherence. Journals are evaluated on citation standards, data transparency, analytic methods transparency, research materials transparency, design and analysis transparency, study registration, analysis plan registration, and replication.8 By evaluating journals using open science metrics, this initiative facilitates the push for transparency and improves research communication.
Despite the increasing attention toward open science practices, these are not yet translating into universal practice. A study by Ebrahimzadeh et al 9 audited the open science and data sharing practices of the Montreal Neurological Institute—Canada's first self-proclaimed Open Science institute—and concluded that the authors of only about half of all publications shared data. Rates at institutions less formally committed to open science are likely even lower. A study by Nutu et al 10 evaluated open science practices of high impact psychology journals and found very low adherence to prospective registration (3%) and data sharing (2%). Moreover, in imaging research, a paper by Hong et al 11 evaluated the adherence of diagnostic test accuracy (DTA) studies to reporting guidelines and revealed that only 55% of papers adhered to the STAndards for Reporting of Diagnostic accuracy studies. In addition, a study by Salameh et al 12 showed that recent DTA systematic reviews included only two thirds of the relevant reporting guideline items, which made the studies "not fully informative." These examples indicate a low level of implementation of, and adherence to, open science practices.
The primary objective of this study was to evaluate open science practices in imaging research. This included evaluating the open science policies of individual journals, as well as the rate of adherence to these policies in a sample of recent studies published in those journals. The secondary objective was to evaluate associations between specific journal characteristics (the 2021 journal impact factor, the category of journal [open access or hybrid], and the percentage of open access documents) and open science practice adherence in the articles.
Methods
Transparency Statement, Ethics, and Protocol Registration
The study protocol, data, analytical code, and materials associated with the project are available on our Open Science Framework project page (https://osf.io/gzv46). There were no major protocol deviations. Research ethics board approval was not required by our institution for this evaluation of published research.
Reporting Guideline
This study was reported using the STROBE 13 reporting guideline for cross-sectional study designs as guidance. A completed checklist can be found in Appendix 1.
Journal and Article Selection
We identified a list of radiology and nuclear medicine journals using Web of Science. 14 The full list of search terms used to identify the journals can be found in Appendix 2a. From the identified journals, we included only English-language journals that published empirical/primary research. We excluded journals related purely to radiation oncology, radiation therapy, or experimental physics, as our focus is on clinical imaging, as well as journals that were not in English or that published only review studies (Appendix 2b).
Journal Data Extraction
Data extraction was conducted using Microsoft Excel and was completed independently and in duplicate. Pilot data extraction was conducted on a sample of journals and articles. The authors involved in data acquisition were MK, PR, AK, NI, HA, HD, RA, and MZ; MM was the only radiologist among the authors. All conflicts were resolved by consensus or, when necessary, third-party arbitration.
From the included journals, we extracted (independently and in duplicate) information from the “instructions for authors” subsection of the journals’ website, in addition to any other available relevant information on the website (eg, journal policies).
The open science policies extracted included: requirements for study registration, use of reporting guidelines, ethical practice, data sharing, open access policies, and TOP Factor. Definitions of the variables evaluated in this paper can be found in Table 3. For authors who specified that their primary data were available upon request, we contacted them by email and requested their data, allowing a 14-day period from the time of the email for them to share it. Open access policies include the option to publish an article so that it is available free of charge to the reader. In addition, we extracted the article processing charge (APC) amount in US dollars and whether there were any methods to reduce the APC (eg, an APC discount for low-income countries). The TOP Factor is a metric that evaluates a journal's level of compliance with open science practices; we extracted TOP Factor scores directly from the TOP website. 8 These open science categories were derived from similar auditing studies,9,10 the TOP guidelines, 7 and previous literature guiding open science practices.1,3,8 A complete list of extraction parameters can be found in Appendix 3a. Data sources included the journal and publisher web pages, instructions for authors, and any other information identified from these sources. These data were extracted during February and March of 2023.
Article Data Extraction
Following journal data extraction, for each included journal we conducted a cross-sectional audit of the 10 most recently published empirical research articles. We chose December 31, 2022 as the latest publication date and worked backwards when screening for articles. Article PDFs were retrieved directly from PubMed if the article was open access, through the University of Ottawa Library, or via inter-library loan with the help of a librarian. We extracted from each article information on the same open science practices obtained at the journal level, and evaluated the article's compliance with its journal's policies. The extraction form can be found in Appendix 3b. We also categorized the articles by study design, as different types of studies may have different requirements or may adhere to open science practices at different rates. The 3 study design categories were prospective studies (eg, randomized controlled trials), retrospective observational studies (eg, case-control studies), and systematic reviews. A pilot test of the data extraction step was conducted on 20 articles prior to study commencement.
Open Science Score (OSS)
To quantify the open science performance of each journal and article, we created a journal and article Open Science Score (OSS) (Appendix 4). The OSS combines the numerous data points we extracted for each journal or article. All included variables are dichotomous: a score of 1 was given for each "Yes," and a score of 0 for "No" or "Not Mentioned." We calculated the OSS for each journal and article; a sample calculation can be found in Appendix 4b. Each open science component had equal weight in the equation. However, because ethical practices were evaluated using a number of sub-variables, we combined all of the ethical practice sub-variables into a single component with weight equal to that of the other variables. This weighting ensures that ethical practice, which requires evaluation of many subcomponents, contributes one score of equal weight to the other open science practices, allowing the OSS to represent a balanced grade.
The journal OSS included 6 variables: registration, use of reporting guideline, ethical practices, data sharing policy, the presence of an APC waiver, and whether the journal uses an open peer review system.
The article OSS included 5 variables: registration, use of a reporting guideline, ethical practices, data sharing, and whether the article was available through open access. Ethical practices comprised 3 sub-variables, each accounting for 1/3 of the ethical practices score: disclosure of project funding, disclosure of conflicts of interest, and whether the authors reported receiving research ethics board approval to conduct their study.
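The article-level scoring described above can be sketched as a small function. This is an illustrative reconstruction only (the function and parameter names are ours, not the authors'), assuming each dichotomous variable contributes 0 or 1 and the three ethical sub-variables are averaged into one equally weighted component:

```python
def article_oss(registration, reporting_guideline,
                funding_disclosed, coi_disclosed, ethics_approval,
                data_shared, open_access):
    """Illustrative article-level Open Science Score, as a percentage.

    Inputs are 0/1 flags ("Yes" = 1; "No" or "Not Mentioned" = 0).
    The 3 ethical sub-variables are averaged so that ethical practices
    carry the same weight as each of the other 4 components.
    """
    ethics = (funding_disclosed + coi_disclosed + ethics_approval) / 3
    components = [registration, reporting_guideline, ethics,
                  data_shared, open_access]
    return 100 * sum(components) / len(components)

# A hypothetical article that registered its protocol, disclosed funding
# and conflicts of interest, and is open access, but used no reporting
# guideline, lacked an ethics approval statement, and did not share data:
score = article_oss(1, 0, 1, 1, 0, 0, 1)  # (1 + 0 + 2/3 + 0 + 1)/5 ≈ 53.3%
```

An article meeting every criterion scores 100%; one meeting none scores 0%, matching the "balanced grade" intent described above.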
Data Analysis
For our secondary objective, we analyzed the relationship between specific journal characteristics (journal impact factor, category of journal [open access, hybrid, traditional], and percentage of open access articles) and the OSS. For journal category versus OSS, we used the Pearson chi-square test for independence. For journal impact factor and percentage of open access documents versus OSS, we used the Pearson correlation test. In addition, we compared article study design against the article OSS using the Tukey honest significant difference test. Data analysis was performed using RStudio and SAS Analytics software.
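The analysis itself was run in RStudio and SAS, but the two core tests can be illustrated in plain Python. This is a minimal sketch on made-up toy data (all names and values here are ours, not from the study):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def chi2_stat(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, r in enumerate(rows):
        for j, c in enumerate(cols):
            expected = r * c / total
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Toy example: journal impact factors vs journal OSS percentages.
r = pearson_r([2.1, 3.5, 5.0, 1.2], [60, 55, 58, 62])

# Toy contingency table: journal category (rows) vs high/low OSS (columns).
chi2 = chi2_stat([[10, 5], [8, 7], [6, 9]])
```

In practice one would use a statistics library for p-values and the Tukey test; the functions above only show what the two statistics measure.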
Results
A total of 82 journals were included; 53 journals were excluded because they related purely to radiation oncology, radiation therapy, or experimental physics, were not in English, or published only review studies. A total of 820 articles were included, representing 10 articles per included journal. A complete list of included journals and the extracted data is provided as a Supplemental Appendix.
Journal Results
The average OSS percentage for all journals was 58.3%. A list of individual OSS for each journal can be found in the Supplemental Material. The two journals that received a perfect OSS were BMC Medical Imaging and the Iranian Journal of Radiology. At the journal level, 15.9% of journals were fully open access and the percentage of open access documents across all journals was 48.0%. In terms of preprints, 51.2% of journals allowed submissions of articles that have been posted as preprints, 3.7% did not allow preprints, and 45.1% did not mention their policy regarding preprints. The median APC value was 3319 USD. The results of all other variables, at the journal level, can be found in Table 1 and Figure 1.
Table 1. Proportions and Percentages of Open Science and Data Sharing Practices at the Journal Level.

Figure 1. Proportions of open science practices at the journal level. All variables had an N-value of 82 unless otherwise specified. "Full Open Access" indicates which journals only publish open access articles. "Registration" refers to which journals had a prospective registration policy. "RG" and "RC" indicate which journals had a policy that required authors to follow a reporting guideline, or submit a reporting guideline checklist, respectively. "COPE" refers to which journals had a membership with the Committee On Publication Ethics. "Funding," "COI," and "Ethics Board" reveal whether journals required disclosure of funding, conflicts of interest, and approval from a research ethics board, respectively. "APC Waiver" indicates which journals provided a waiver for article processing charges associated with open access publishing.
Article Results
A total of 820 articles (10 per journal) were included in this study. The average Open Science Score (OSS) percentage for all articles was 31.8%. A list of individual OSS values for each article can be found in the Supplemental Material. Sixty-four (7.8%) articles registered their study protocol. Of those, 12 (1.5%) were systematic review protocols registered on registries such as the Open Science Framework and PROSPERO; the remaining 52 (6.3%) were clinical trial registrations, most of them on ClinicalTrials.gov. In terms of data sharing, 37.4% of articles included a data availability statement; 3.3% of articles included all raw study data in the paper itself; 6.5% of articles used publicly available data as their raw data; and 27.3% of articles stated that their data were available upon request. After we emailed the corresponding authors and asked them to provide their data within a 14-day period, only 9.4% of the 224 authors responded and provided their data. Moreover, 46.6% of all articles were available through open access, and 4.6% of all articles were posted as preprints (Appendix 5). The results of all other variables at the article level can be found in Table 2 and Figure 2.
Table 2. Proportions and Percentages of Open Science Practices at the Article Level.
*This variable had an N-value of 224.

Figure 2. Proportions of open science practices at the article level. All variables had an N-value of 820 unless otherwise specified. "Registration" refers to which articles prospectively registered their study protocol. "RG" and "RC" indicate which articles followed a reporting guideline, or submitted a reporting guideline checklist, respectively. "Funding," "COI," and "Ethics Board Approval" reveal whether articles disclosed sources of funding, conflicts of interest, and approval from a research ethics board, respectively. "Upon Request*" reveals which authors provided their primary data after being contacted. "Open Access" indicates which articles are accessible for free. "Preprint" indicates which articles were posted as a preprint prior to publication.
Table 3. Relevant Open Science Terms and Definitions.
Secondary Analysis
There was no correlation between journal impact factor and journal OSS (Pearson correlation coefficient of −.036).
Discussion
Imaging journals endorsed just over half of the evaluated open science practices, reflected in an average OSS of 58.3% across the 82 imaging journals we evaluated. The 820 recently published research articles from these journals showed lower compliance with open science practices, with an average OSS of 31.8%.
Previous research on open science practices provides context for these results and allows us to compare and contrast imaging research with other fields. Nutu et al 10 evaluated open science practices in psychology journals and found that 25% of journals had a registration policy, 67% had a data sharing policy, 47% endorsed reporting guidelines, and 87% required authors to disclose conflicts of interest. At the article level, only 3% of studies were prospectively registered, 2% shared their data, and 50% disclosed their conflicts of interest. On every metric, at both the journal and article level, our study revealed a higher level of open science practice. This could be interpreted as a sign that imaging journals are more adherent to open science practices than psychology journals; however, their study was published 4 years ago, so the differences may instead reflect progress in open science practices over time.
Siebert et al 19 evaluated data sharing recommendations in biomedical journals and randomized clinical trials and found that 30% of ICMJE-affiliated journals had explicit data-sharing policies and 22% of articles expressed intentions to share data. In comparison, our study found that 81.7% of imaging journals had a data sharing policy, and we retrieved primary data from 12.3% of articles. The lower rate of data sharing in our cohort may be related to the more stringent definition we applied: we required authors to actually share data upon request, rather than simply state in the paper that they would.
Hong et al 11 showed that only 4.9% of imaging diagnostic accuracy studies had preregistered protocols. Our cohort had a similarly low proportion of registered studies, indicating that further progress in imaging research protocol registration remains necessary. Studies in other fields, such as behavioral medicine, 20 have found similarly low rates of protocol registration (2%-12%).
A limitation of this study was that open science practices were difficult to quantify: at the time of this study, there was no agreed-upon standard in the literature for how best to evaluate open science practices, and extracting data on journal policies was a subjective process. To mitigate these related issues, we limited our data extraction questions to categorical answer options and created a formula that outputs an overall numeric score for open science practices. In addition, we conducted all data extraction in duplicate to limit subjectivity. Another limitation was that, for feasibility reasons, we were only able to include 10 articles per journal. To address this, all of our analyses compared articles and journals cumulatively; we did not conduct any analysis stratified per journal, for which the sample size would have been too small (n = 10). Finally, we inadvertently excluded the journal "Diagnostic & Interventional Radiology" due to an error in the screening process. In retrospect, this journal met our eligibility criteria and should have been included. This limitation should be considered when interpreting the results of this study; however, we anticipate its impact to be minimal given that we included 82 journals.
Educational programs, at academic institutions or through virtual events, that inform authors about the benefits of open science practices, such as protocol registration and the use of reporting guidelines, may increase the implementation of these practices. In addition, higher publication standards at the journal level may increase the use of open science practices at the article level. The main initiative spreading awareness on this topic is TOP, which provides a guideline for what good open science practices at the journal level look like and audits a large number of journals on their open science practices.
Social media plays a major role in promoting published studies, especially in the field of radiology. 21 A study by Pozdnyakov et al 22 explored the visibility benefits of posting research articles on social media. Their discussion further supports the need for open science promotion: open science makes it easier to discuss research content on social media, provides free reading for the general public, and increases the visibility of published studies.
Future research should continue to audit the implementation of open science practices, at both the journal and article level, and evaluate whether there is improvement or regression. In addition, since open science is a relatively new concept, more methods of quantifying and evaluating open science should be explored, beyond our Open Science Score, in order to establish a standard, validated method of evaluation. Lastly, more studies that evaluate open science as a whole, as opposed to only one or two components, would provide more context on the current state of open science across different areas of research.
Research Data
Supplemental material (sj-xlsx-1-caj-10.1177_08465371231211290) for "Cross-Sectional Evaluation of Open Science Practices at Imaging Journals: A Meta-Research Study" by Mohammed Kashif Al-Ghita, Kelly Cobey, David Moher, Mariska M.G. Leeflang, Sanam Ebrahimzadeh, Eric Lam, Paul Rooprai, Ahmed Al Khalil, Nabil Islam, Hamza Algodi, Haben Dawit, Robert Adamo, Mahdi Zeghal and Matthew D.F. McInnes, in Canadian Association of Radiologists Journal.
Footnotes
STROBE Statement—Checklist of Items That Should Be Included in Reports of Cross-Sectional Studies.
| Section | Item no. | Recommendation | Page |
|---|---|---|---|
| Title and abstract | 1 | (a) Indicate the study’s design with a commonly used term in the title or the abstract | 1 |
| | | (b) Provide in the abstract an informative and balanced summary of what was done and what was found | 2 |
| *Introduction* | | | |
| Background/rationale | 2 | Explain the scientific background and rationale for the investigation being reported | 4 |
| Objectives | 3 | State specific objectives, including any prespecified hypotheses | 6 |
| *Methods* | | | |
| Study design | 4 | Present key elements of study design early in the paper | 7 |
| Setting | 5 | Describe the setting, locations, and relevant dates, including periods of recruitment, exposure, follow-up, and data collection | 7 |
| Participants | 6 | (a) Give the eligibility criteria, and the sources and methods of selection of participants | 6 |
| Variables | 7 | Clearly define all outcomes, exposures, predictors, potential confounders, and effect modifiers. Give diagnostic criteria, if applicable | n/a |
| Data sources/measurement | 8* | For each variable of interest, give sources of data and details of methods of assessment (measurement). Describe comparability of assessment methods if there is more than one group | 9 |
| Bias | 9 | Describe any efforts to address potential sources of bias | n/a |
| Study size | 10 | Explain how the study size was arrived at | 8 |
| Quantitative variables | 11 | Explain how quantitative variables were handled in the analyses. If applicable, describe which groupings were chosen and why | 9 |
| Statistical methods | 12 | (a) Describe all statistical methods, including those used to control for confounding | 9 |
| | | (b) Describe any methods used to examine subgroups and interactions | 10 |
| | | (c) Explain how missing data were addressed | n/a |
| | | (d) If applicable, describe analytical methods taking account of sampling strategy | n/a |
| | | (e) Describe any sensitivity analyses | n/a |
| *Results* | | | |
| Participants | 13* | (a) Report numbers of individuals at each stage of study—for example, numbers potentially eligible, examined for eligibility, confirmed eligible, included in the study, completing follow-up, and analyzed | 10 |
| | | (b) Give reasons for non-participation at each stage | 10 |
| | | (c) Consider use of a flow diagram | n/a |
| Descriptive data | 14* | (a) Give characteristics of study participants (eg, demographic, clinical, social) and information on exposures and potential confounders | 10 |
| | | (b) Indicate number of participants with missing data for each variable of interest | n/a |
| Outcome data | 15* | Report numbers of outcome events or summary measures | 10 |
| Main results | 16 | (a) Give unadjusted estimates and, if applicable, confounder-adjusted estimates and their precision (eg, 95% confidence interval). Make clear which confounders were adjusted for and why they were included | 10 |
| | | (b) Report category boundaries when continuous variables were categorized | n/a |
| | | (c) If relevant, consider translating estimates of relative risk into absolute risk for a meaningful time period | n/a |
| Other analyses | 17 | Report other analyses done—for example, analyses of subgroups and interactions, and sensitivity analyses | 11 |
| *Discussion* | | | |
| Key results | 18 | Summarize key results with reference to study objectives | 12 |
| Limitations | 19 | Discuss limitations of the study, taking into account sources of potential bias or imprecision. Discuss both direction and magnitude of any potential bias | 13 |
| Interpretation | 20 | Give a cautious overall interpretation of results considering objectives, limitations, multiplicity of analyses, results from similar studies, and other relevant evidence | 13 |
| Generalizability | 21 | Discuss the generalizability (external validity) of the study results | 13 |
| *Other information* | | | |
| Funding | 22 | Give the source of funding and the role of the funders for the present study and, if applicable, for the original study on which the present article is based | 6 |
Appendix 2a: List of Included Journals
Keywords
radiol* OR imaging OR ROENTGENOLOG* OR nuclear
List of journals (from Web of Science)
Appendix 2b: List of Excluded Journals.
| 1 | Applied Radiation and Isotopes |
| 2 | Biomedical Optics Express |
| 3 | Brachytherapy |
| 4 | Cancer Biotherapy and Radiopharmaceuticals |
| 5 | Cancer Radiotherapie |
| 6 | Dose-Response |
| 7 | EJNMMI Physics |
| 8 | Health Physics |
| 9 | International Journal of Hyperthermia |
| 10 | International Journal of Radiation Biology |
| 11 | International Journal of Radiation Oncology Biology Physics |
| 12 | Journal of Applied Clinical Medical Physics |
| 13 | Journal of Biomedical Optics |
| 14 | Journal of Contemporary Brachytherapy |
| 15 | Journal of Innovative Optical Health Sciences |
| 16 | Optical Health Sciences |
| 17 | Journal of Radiation Research |
| 18 | Journal of Radiological Protection |
| 19 | Magnetic Resonance Materials in Physics Biology and Medicine |
| 20 | Magnetic Resonance Materials in Physics Biology & Medicine |
| 21 | Medical Dosimetry |
| 22 | Medical Image Analysis |
| 23 | Medical Physics |
| 24 | ISSN 0029-5566 |
| 25 | Photoacoustics |
| 26 | ISSN 1120-1797 |
| 27 | Practical Radiation Oncology |
| 28 | ISSN 1824-4785 |
| 29 | Radiation Protection Dosimetry |
| 30 | Radiation Research |
| 31 | Radiology and Oncology |
| 32 | Radioprotection |
| 33 | Radiotherapy & Oncology |
| 34 | ISSN 1438-9010 |
| 35 | Seminars in Radiation Oncology |
| 36 | ISSN 0179-7158 |
| 37 | Zeitschrift Fur Medizinische Physik |
| 38 | Ultraschall In Der Medizin |
| 39 | Seminars in Musculoskeletal Radiology |
| 40 | Concepts in Magnetic Resonance |
| 41 | ISSN 0887-2171 |
| 42 | ISSN 0301-5629 |
| 43 | Radiologe |
| 44 | Revista Espanola De Medicina Nuclear E Imagen Molecular |
| 45 | Seminars in Interventional Radiology |
| 46 | Seminars in Nuclear Medicine |
| 47 | Seminars in Roentgenology |
| 48 | Radiologic Clinics of North America |
| 49 | Radiographics |
| 50 | Neuroimaging Clinics of North America |
| 51 | Magnetic Resonance Imaging Clinics of North America |
| 52 | Diagnostic & Interventional Radiology |
| 53 | Journal of Medical Imaging & Health Informatics |
Appendix 3a. Journal Extraction Form
Section 1: Journal characteristics
Section 2a: Registration
Section 2b: Use of reporting guidelines
Section 2c: Ethics
Section 2d: Data Sharing
Section 2e: Article Processing Cost
Section 2f: TOP factor
Section 2g: Preprint and peer review
Appendix 3b: Article Screening and Extractions
Article screening:
Please exclude articles if they match any of the following characteristics
Article extractions:
Section 1: General info
Section 2a: Registration
Section 2b: Use of reporting guidelines
Section 2c: Ethics
Section 2d: Data sharing
Section 2e: Open access practices
Section 2f: Preprint
Appendix 4a
Appendix 4b
Appendix 5: The Frequency of Preprint Publications on Different Servers.

| Preprint server | Frequency |
|---|---|
|  | 13 |
|  | 13 |
|  | 2 |
|  | 2 |
|  | 1 |
|  | 1 |
| Total | 38 |
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
Supplemental Material
Supplemental material for this article is available online.
References