
Why quantifying Openness matters

Written by Iain Hrynaszkiewicz

PLOS’ mission is to transform research communication for the good of science and society. We believe that Open Science is the key. Open Science supports rigor, reproducibility, inclusivity, and efficiency—and in doing so, results in accelerated progress, stronger research, and increased trust in results. Increasing the prevalence of Open Science is a prerequisite to reaping the full benefits. At PLOS, we work to foster openness through policy, and by creating opportunities for researchers to engage in Open Science practices as part of their everyday experience publishing in our journals.

But we can only succeed in driving Open Science adoption if we understand:

  1. The research communities we serve
  2. Current norms in those communities
  3. How Open Science behaviors change over time—and especially in response to new policies and offerings

Combined with other research activities, Open Science Indicators (OSIs) are an important source of data that help us to develop solutions that better serve PLOS authors, and the research community in general. Read on to learn more about why and how we’re measuring Open Science practices more comprehensively and consistently at PLOS.

What are Open Science Indicators?

Open Science Indicators are a new source of information on Open Science practices across the published literature. Produced in partnership with DataSeer, they use Natural Language Processing (NLP) and artificial intelligence (AI) to identify and measure Open Science practices in all PLOS articles, and in a comparator set of similar Open Access research articles. The dataset begins in January 2019, and is updated quarterly.
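
To give a sense of how a publicly shared OSI dataset can be explored, here is a minimal sketch in Python using pandas. It assumes a local CSV download of the dataset; the file name and the column names (year, is_data_shared, is_code_shared, has_preprint) are illustrative placeholders, not the published schema, so adjust them to match the actual release.

```python
# Minimal sketch: summarize Open Science practices per publication year.
# Assumptions: a local CSV copy of the OSI dataset; the column names below
# are illustrative placeholders, not the dataset's actual schema.
import pandas as pd

osi = pd.read_csv("open_science_indicators.csv")  # hypothetical local copy

# Boolean indicator columns for the three practices discussed in this post
practices = ["is_data_shared", "is_code_shared", "has_preprint"]

# Mean of a boolean column is the proportion of articles with that practice
summary = (
    osi.groupby("year")[practices]
    .mean()
    .mul(100)
    .round(1)
)
print(summary)  # percentage of articles sharing data, sharing code, and posting preprints, by year
```

A summary like this is the kind of high-level view the Indicators are meant to support: how common each practice is, and how that changes over time.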

Measuring Openness with Open Science Indicators

PLOS began developing Open Science Indicators in 2021 to meet our own need for better information. We needed a fuller understanding to inform a data-driven strategy that meets our aims to: 

  1. increase adoption of Open Science practices and
  2. increase evidence of the benefits and effects of these practices.

But until recently, there was no easy way to measure Open Science practices short of having an expert read every article. And we are not the only ones who need better information on Open Science practices: research funders, institutions, and policy makers all share the same challenge.

Before initiating OSIs, we first defined six guiding principles. While the collective requirements we developed for Open Science Indicators are, so far, unique, much of our approach builds on the work of others: meta-researchers, tool developers, and community-endorsed frameworks. Our inspirations include the FAIR principles, the Charité Metrics Dashboard, and a major piece of meta-research that analyzed different aspects of Open Science practice in PubMed Central. More recently, national efforts such as the French Open Science Monitor and the UK Reproducibility Network have, respectively, created solutions or explored requirements for monitoring open research practices to suit their needs.

In shaping our own guiding principles, we felt it was important that Open Science Indicators rely on the publicly available text and metadata included in any research publication, so that the measurements might be applied not just at PLOS, but across any publicly available research article. We wanted to enable comparison across fields, regions, and publication venues in the short term, and to lay the groundwork for a system that might be easily expanded to give a broader view of the whole scientific communications landscape in the future. And of course, we wanted to make the data publicly available to all, for use at PLOS and other organizations.

Selecting the first Indicators

PLOS journals support and facilitate a wide range of Open Science practices, so we needed to prioritize which Open Science Indicators to measure first. To begin, we selected three practices that are highly relevant to a large proportion of the researchers we serve, and so have high potential for broad adoption: data sharing, code sharing, and preprint posting.

Each has its own vital role to play in the system of Open Science.

Data sharing

Data are the raw output of primary research. Readers rely on raw scientific data to enhance their understanding of published research. Data are key for verification, replication, and reanalysis, to inform future investigations, and to inform systematic reviews or meta-analyses.

At PLOS, our pioneering data availability policy requires authors to make the raw data underlying their conclusions accessible (within ethical boundaries) at the time of publication. The PLOS data availability policy gives authors several options for sharing their data, depending on what works best for them and their manuscript. At the same time, we recognize that data repositories are the most efficient and impactful way to share. Indicators help us track progress towards best practices.

Code sharing

Data only mean so much without the context to understand how they were generated and analyzed, and the tools to validate and reproduce the work. Augmenting a research article with Open Code enhances understanding, facilitates reproducibility and reanalysis, promotes trust, and saves other researchers time and effort, leading to a more efficient scientific system overall.

Code sharing is strongly encouraged at all PLOS journals, and is required at PLOS Computational Biology.

Preprints

Preprints empower researchers to take control of their research communications for more efficient and inclusive sharing. Researchers use preprints to establish priority, broadcast results, seek community feedback, increase readership, and bolster grant, job, or tenure applications.

PLOS journals have always welcomed preprints, and since 2018, we’ve offered facilitated preprint posting at submission to help make the process even easier for our authors.

Looking ahead

Measuring data sharing, code sharing, and preprint posting is one step toward understanding the state of Open Science practices today, and charting the course ahead based on researchers’ real-world behavior. PLOS uses OSIs to guide our decision-making. For example:

  • Prioritizing which repositories to extend our Accessible Data icon to
  • Monitoring the effectiveness of new solutions, such as our integration with EarthArXiv, and PLOS Computational Biology’s code-sharing policy
  • Understanding the Open Science practices of researchers in different disciplines and regions, helping us to align our systems and policies with their needs, and sharing what we learn more widely 

In the future, we’ll add additional data points and new Indicators to the OSI dataset, and expand it to include more comparator content. We’re also combining the quantitative insights from OSIs with other sources of information, such as published qualitative research, as well as our own novel investigations into different communities and practices. Combining qualitative insights with diverse, responsibly applied quantitative insights provides a more complete picture of research quality and impact, and aligns with efforts to reform research assessment.

And we’ll continue to make the dataset publicly available for others to read, share, remix, and reuse, for the betterment of science, which is ultimately what Open Science aims to achieve.
