
Collaboratively seeking better solutions for monitoring Open Science

Authors: Iain Hrynaszkiewicz, Director, Open Research Solutions, PLOS; and Chris Heid, Head of Product, PLOS

Summary

Research by PLOS and Research Consulting has found a growing need for Open Science Indicators (Open Science monitoring solutions) among some funders and institutions, but implementation of monitoring solutions may be limited unless Open Science practices are a strategic priority for organisations. Research data sharing, and code and software sharing, are among the most important Open Science practices to monitor, but to be usable the information must be compatible with each organisation's own structure and nomenclature, and such information is not currently available. In the future, Open Science Indicators will need to monitor not just the prevalence but also the effects or qualities of Open Science practices.

What did we want to know and why?

PLOS launched Open Science Indicators (OSI) in 2022 with DataSeer to support our goal of increasing adoption of Open Science practices beyond open access to research articles. Previous research by PLOS and others has identified a similar Open Science “measurement problem” among funders and institutions, and a lack of suitable solutions to solve it. While OSI was designed to support PLOS’s needs, sharing the OSI data and methods openly is a way to positively influence Open Science activities outside of PLOS. This Open Science approach to OSI has been reflected in numerous reuses of the OSI dataset and results – by funders, governmental committees, institutions, researchers, solution developers and journals. We therefore wanted to learn more about the problem of monitoring Open Science practices from the perspective of funders and institutions – and the extent to which OSI or other solutions might help to solve it.
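Because the OSI dataset is shared openly (as downloadable files via Figshare), this kind of reuse can start with nothing more than loading the published data. As a minimal sketch – the file name and column names below are illustrative assumptions, not the dataset’s actual schema – one could compute the prevalence of data sharing over time like this:

```python
import pandas as pd

# Hypothetical local copy of the openly shared OSI dataset.
# File name and column names are illustrative; the real schema may differ.
osi = pd.read_csv("plos-open-science-indicators.csv")

# Prevalence of data sharing by publication year: the mean of a boolean
# column is the share of articles flagged as having openly shared data.
data_sharing_rate = osi.groupby("publication_year")["data_shared"].mean().round(3)
print(data_sharing_rate)
```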

How did we find out?

In collaboration with Research Consulting in the UK, we conducted a series of interviews between October and December 2023 with 24 representatives of 5 research funders and 10 institutions*, followed by 2 focus groups.

What did we learn?

Our interviews confirmed that, among our cohort at least, interest in Open Science (Research) Indicators is growing, and they pointed to an emerging link between Open Science practices and the quality and reliability of research – as shown, for example, by the UK Committee on Research Integrity’s interest in these indicators.

Of the various Open Science practices – beyond open access to research articles – that may be important to monitor, interest was greatest in research data sharing; code and software sharing; and the quality or “FAIRness” of data and metadata. This aligns to some extent with findings of previous research, with the exception of the Open Science practice of study registration (preregistration), which was the highest priority in a peer-reviewed study targeted at the biomedical research field.

Figure: Relative importance of 6 Open Science Indicators that emerged from our interviews, ranked during the focus groups

Open Science Indicators are being used by the senior leadership of a minority of the funders and institutions we interviewed, to track the effects of, or to develop, open science plans and policies. While “compliance monitoring” could imply punitive outcomes for non-compliance, this is not the intent. Rather, such activities aim to identify where researcher or institutional support is needed – in line with how OSIs are used at PLOS, to promote improvements in practice. For institutions in particular, Open Science Indicators are also being used at a more functional or departmental level, for example to make a case for budget or to provide training. Indeed, attention to Open Science practice is more likely to sit at a “grassroots” or departmental level than with senior leadership, and many organisations would support making Open Science more of a strategic priority. This desire for greater priority, or resources, for Open Science practices echoes what PLOS has heard from representatives of other publishers who want to see better research data sharing practice and policy at their journals.

Some institutions and funders have been experimenting with monitoring solutions, but participants in our research reported difficulties in gathering sufficiently comprehensive or detailed information to meet their needs. Resources such as PLOS Open Science Indicators could form part of an Open Science monitoring solution for funders and institutions, but a complete solution would require more comprehensive coverage of an organisation’s research outputs and better metadata quality, so that users can view results according to their organisation’s structure. Being able to benchmark an organisation against a relevant comparator organisation or group is also important – it gives context to results and helps make a case for intervention, while avoiding the (re)creation of institutional rankings based on the information.
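To illustrate the benchmarking idea, here is a minimal sketch – with entirely made-up records and a hypothetical “institution” field mapped onto an organisation’s own structure – that compares one organisation against a comparator group rather than producing a ranked league table:

```python
import pandas as pd

# Made-up example records: one row per article, with a hypothetical
# "institution" field aligned to the organisation's own nomenclature.
articles = pd.DataFrame({
    "institution": ["Uni A", "Uni A", "Uni A", "Uni B", "Uni B", "Uni C"],
    "data_shared": [True, False, True, True, True, False],
})

# Compare one organisation's data-sharing rate with the rest of the cohort,
# rather than ranking all institutions against each other.
focus = "Uni A"
focus_rate = articles.loc[articles["institution"] == focus, "data_shared"].mean()
peer_rate = articles.loc[articles["institution"] != focus, "data_shared"].mean()
print(f"{focus}: {focus_rate:.0%} of articles share data (comparator group: {peer_rate:.0%})")
```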

Our focus groups revealed several desirable characteristics of monitoring solutions that would make them more trustworthy to their users. These included:

  • Endorsement and recognition from a range of sector stakeholders
  • Transparency of data sources, data collection methods, datasets and analysis
  • Clear OSI definitions and specifications
  • Coverage of a large number of organisations
  • Use of open infrastructure based on Open Science principles

What happens next?

While PLOS Open Science Indicators is not a product or service as such, it is crucial for PLOS to understand the needs of our stakeholders and customers who share a desire to increase adoption of Open Science – if, as we aim, this is to be achieved beyond content published in PLOS’s journals. PLOS is involved in several initiatives in the burgeoning Open Science monitoring space, including a new initiative led by the French Open Science Monitor in collaboration with UNESCO. Also, in collaboration with DataSeer and others, we are developing Open Research indicator pilots for UK institutions, led by the UK Reproducibility Network (UKRN). The insights from this research are informing how we approach collaboration with these community initiatives. The results have also influenced our 2024 plans for OSI, both in terms of which new features or indicators to consider and in improvements to data quality – for users at and outside of PLOS.

A theme from our findings, and from what we are hearing from research institutions in our collaborations with the UKRN, is the need to think beyond the prevalence of Open Science practices to the effects and qualities of those practices. This is in line with the original principles and concept for OSI and, to use another widely adopted framework that informed OSI, with the ‘Interoperable’ and ‘Reusable’ (I and R) parts of the FAIR acronym. Watch this space for more information on work already underway to better understand the effects of Open Science practices.

Limitations of our findings

We did not target particular research topics or disciplines in these discussions, but we did focus on funders and institutions in the US and Europe, where Open Science policies may create a need to monitor Open Science. As such, our findings are not representative of all funders and institutions, and likely reflect the views of early adopters and innovators in this area.

Acknowledgements

*We are grateful to the following individuals and organisations for their participation in this research: Katharina Rieck, Austrian Science Fund; Dagmar Meyer, European Research Council Executive Agency; Janne Pölönen, and Marita Kari, Federation of Finnish Learned Societies; Hans de Jonge, and Jeroen Sondervan, Dutch Research Council (NWO); Sylvia Jeney, Swiss National Science Foundation; Keith Webster, Carnegie Mellon University; Yusuf Ozkan, Imperial College London; Phillip Spector, Johns Hopkins University; Malika Ihle, Ludwig Maximilian University of Munich; Mario Malički, Zach Chandler, and Steve Goodman, Stanford University; Ed Fay, University of Bristol; Michael Eadie, and Valerie McCutcheon, University of Glasgow; Laurian Williamson, University of Leicester; Alice Howarth, and Martin Wolf, University of Liverpool; Alison Ashmore, University of Nottingham.

We also thank Andrea Chiarelli (Principal Consultant) and the Research Consulting team for their contributions to this research.
