
PLOS Blogs: The Official PLOS Blog

How can publishers better meet the open research needs of funders and institutions?

Author: Iain Hrynaszkiewicz, Director, Open Research Solutions

Publishers that invest in simple workflow solutions can better meet the needs of funders and institutions that wish to support open research practices, research released this week by PLOS concludes.

Policies can be an effective solution for changing research culture and practice. A growing number of research-performing organisations (institutions) and funding agencies have policies that support open research practices — sharing of research data, code and software — as do publishers. Seeking to deepen our understanding of funder and institution needs related to open research, we surveyed more than 100 funders and institutions in 2020. We wanted to know if they are evaluating how researchers share data and code, how they are doing it, why they are doing it, and how satisfied they are with their ability to get these tasks done. Our results are available as a preprint along with an anonymised dataset.

Half of respondents had tried to evaluate if researchers are sharing data or code in the past and most plan to do this in the future. The most common reason for assessing data or code sharing was a desire to understand or compare the current state of practices of researchers and the third most common was to inform the development of policies, support or services with the information.

However, the approaches being used to assess open research practice appear to be unsatisfactory. Personal contact with researchers, such as via email, was the most common method. One respondent described the task as “A very manual process of looking at the individual articles we have funded to see how the data is shared.” For organisations that don’t — yet — have policies on open research, beyond open access to articles, their needs are even less well met. To plan for change, as one respondent from an institution noted, they need better information on how their faculty are currently practicing open research.

Overall, the results imply that the growth of policies and requirements for making research data and code available does not appear to be matched with capabilities or solutions for determining if these policies have been complied with. Measuring policy compliance was the second most common reason for evaluating open research practices. Further, there appear to be insufficient tools, or awareness of tools, to support organisations in guiding strategic and policy decisions that could benefit open research.

The issues that seem to be the most challenging for funders and institutions are also the most complex. Determining if research data have been made available in a reusable form, and determining why research data or code have not been shared were among the least satisfied tasks. Our previous research suggested researchers also need better solutions for finding research data they can reuse, and obtaining data that are not publicly available.

Another key finding is the limitation of the research paper as the primary mode of research assessment. The evaluation of open research practices associated with specific grants, rather than papers, was even more important to respondents.

However, while evaluating research papers for data and code sharing practices will not tell the whole story of open research, the results do imply that simple solutions implemented by journals and publishers can make some progress towards better meeting some of the needs we assessed in our survey. Many respondents were not satisfied with their ability to complete important tasks that relate to finding information in publications. 

Simple solutions more publishers could provide include:

  • Mandatory Data Availability Statements (DAS) in all relevant publications.
    Across the STM industry around 15% of papers include a DAS. Since we introduced our data availability policy in 2014, 100% of PLOS research articles include a DAS.
  • Supporting researchers to provide information on why research data (and code) are not publicly available with their publications.
    Time and again “data available on request” has been shown to be ineffective at supporting new research — and is not permitted in PLOS journals. 
  • Enabling and encouraging the use of data repositories.
    Recommending the use of data repositories is a useful step, but making them easily and freely accessible — integrated into the publishing process — can be even more effective. Rates of repository use are higher in journals that partner closely with repositories and remove cost barriers to their use.
  • Providing visible links to research data on publications.
    Many researchers also struggle to find data they can reuse, so PLOS will soon be experimenting with improving this functionality in our articles, and integrating the Dryad repository with submission.
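To illustrate the kind of "simple solution" a publisher workflow might implement, the sketch below checks a JATS-style article XML for a Data Availability Statement. The tag names, the `sec-type="data-availability"` attribute, and the sample document are assumptions for illustration only, not PLOS's actual production schema or tooling.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified JATS-style article for demonstration.
SAMPLE_ARTICLE = """
<article>
  <front>
    <article-meta><article-id>10.1371/journal.pone.0000001</article-id></article-meta>
  </front>
  <body>
    <sec sec-type="data-availability">
      <title>Data Availability</title>
      <p>All data are available from the Dryad repository.</p>
    </sec>
  </body>
</article>
"""

def has_data_availability_statement(article_xml: str) -> bool:
    """Return True if the article contains a Data Availability section.

    Looks for any <sec> element whose sec-type attribute is
    'data-availability', or whose <title> mentions data availability.
    """
    root = ET.fromstring(article_xml)
    for sec in root.iter("sec"):
        if sec.get("sec-type", "") == "data-availability":
            return True
        title = sec.find("title")
        if title is not None and "data availability" in (title.text or "").lower():
            return True
    return False

print(has_data_availability_statement(SAMPLE_ARTICLE))  # → True
```

A check like this could run at submission or production time to flag articles missing a DAS before publication; real systems would of course work against the publisher's actual schema and handle namespaces and edge cases.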

For the many researchers who, we know, regard compliance with funder policies as important, publishing their work in ways, or in journals, that support transparency in the availability of research data and code will help them meet their funders' and institutions' needs, as well as supporting the reuse and impact of their research.

  1. Thanks for sharing this interesting work. For what it’s worth, I think (as I wrote about 5 years ago in https://danielskatzblog.wordpress.com/2016/04/13/data-and-software-management-plans-must-be-public-and-should-be-machine-readable/) a primary issue from the funder’s point of view is that plans about the data and software that the project plans to produce are currently mostly not public or machine-readable. If the data and software in these plans were identifiable, then publishers could contribute by either affirming that this data and software was produced and shared, or could at least use the same identifiers and point back to the plan, letting the funders do this confirmation.

  2. What can publishers do? Wrong question. Publishers and researchers respond to incentives created by research funders. While research funders remain wedded to citation counting as the measure of research performance that is the metric that publishers and researchers will focus on.

  3. Thank you for this post Iain. Interesting observations, and good input for STM’s Research Data Program’s (stm-researchdata.org) effort to bring better alignment to funder and publisher data policies. We’ll be following the results of your initiatives closely to see how we can improve and extend publishers’ efforts to share, link and cite research data. Very much agree with your findings on the need for improved submission workflows for data and better linking between articles and research data. That is exactly what SCHOLIX does, the linking framework between Crossref, DataCite and OpenAIRE. Read here for the way publishers can easily adopt and implement SCHOLIX: https://www.stm-researchdata.org/wp-content/uploads/2021/09/Scholix-How-To-v3.pdf

