Written by Marcel LaFlamme
For almost a year, Open Science Indicators have made it possible to measure three Open Science practices: data sharing, code sharing, and preprint posting. Now, PLOS and DataSeer are adding a fourth indicator for protocol sharing. As we expand the tool’s capabilities, we invite your feedback on the approach we’ve taken in this preliminary data release.
What is a protocol?
The word protocol means different things in different fields, so first we want to clearly signal what types of research output are in scope for this indicator. In line with the community-developed definition used in the PRO-MaP recommendations, we are defining protocol as “detailed and/or step-by-step instructions for carrying out a research procedure.” Clinical study protocols, review protocols, registered report protocols, and other protocols describing a study that will take place in the future generally do not meet this definition, although we’re currently scoping work on a fifth indicator that is likely to include these outputs.
How we got here
In 2022, PLOS conducted a study of researcher practices and priorities around sharing detailed methods information including protocols. The study found that, while methods sections of research articles are regarded as adequate for evaluating study findings, they are not widely perceived as adequate for reproducing results or reusing a method in a different context. To ensure that methods information is usable for a range of research tasks, study participants reported publicly sharing protocols through channels including peer-reviewed publications, supplementary information, dedicated repositories, and other websites.
With these results in mind, we drafted a set of requirements built on our OSI measurement framework and consulted on them with stakeholders including tool providers, meta-researchers, and other methods experts. We then worked with DataSeer to operationalize the requirements. Our current approach detects links to or citations of outputs from an allowlist of publications and repositories known to focus on protocols. In keeping with our approach to measuring data and code sharing, we also detect relevant metadata from supplementary information where available. Please consult our methods documentation for more detail.
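To illustrate the allowlist-based detection described above, here is a minimal sketch in Python. The source list and function below are hypothetical examples for illustration only, not the actual DataSeer implementation or the real OSI allowlist.

```python
import re

# Illustrative allowlist of protocol-focused sources; the real list is
# maintained by PLOS and DataSeer and is more extensive.
PROTOCOL_SOURCES = [
    r"protocols\.io",
    r"protocol\s*exchange",
    r"nature\s*protocols",
    r"springer\s*protocols",
]

PATTERN = re.compile("|".join(PROTOCOL_SOURCES), re.IGNORECASE)

def shares_protocol(article_text: str) -> bool:
    """Return True if the article text links to or cites an
    allowlisted protocol publication or repository."""
    return bool(PATTERN.search(article_text))

print(shares_protocol("The full procedure is at https://www.protocols.io/view/example"))
print(shares_protocol("Methods were described in the text only."))
```

A real pipeline would of course parse structured reference lists and supplementary-information metadata rather than scanning raw text, but the core idea is the same: match links and citations against a curated list of known protocol sources.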
What the data say
From 2019 through mid-2023, the rate of protocol sharing for research articles published in PLOS journals hovered around 8%. In contrast to trends in other Open Science practices over the same period, these preliminary results suggest that adoption of protocol sharing by PLOS authors did not change appreciably. We assume that not all articles generate protocols, but the rate of protocol generation was not calculated for this release.
Among a comparator set of about 18,000 Open Access research articles from PubMed Central, the rate of protocol sharing did appear to increase from 10% in 2019 to 15% in the first half of 2023. Additional qualitative research may help to explain why these trendlines diverge, whether because of limitations in our data sources or actual differences in author behavior.
Preliminary analysis of the locations of protocols associated with PLOS articles indicates that a clear majority (84%) are shared as peer-reviewed publications, with Nature Protocols and the Springer Protocols collection as the most cited sources. Sharing in dedicated repositories like protocols.io and Protocol Exchange became less common over the reporting period, falling from 11% in 2019 to 2% in the first half of 2023, while sharing via supplementary information became more common. The use of repositories is often viewed as a best practice for protocol sharing because repository-hosted protocols can be updated as they evolve over time.
Our roadmap for further developing the protocols indicator includes adding detection of protocols on lab websites and other online locations. We plan to look more deeply at citations of published protocols, so that we can understand the extent to which authors are pointing to procedures actually used in their study as opposed to referencing protocols for some other reason. We also want to be able to assess how often researchers share their own protocols versus protocols created by others.
Just as importantly, we’d like to hear from you: are there publications or repositories missing from our allowlist? How should we address the limitations of an allowlist-based approach? And are there other ways of communicating detailed methods information that we should consider? We’d be grateful for your input by November 15; you can comment below or write to mlaflamme [at] plos.org to share your perspective.