Author: Iain Hrynaszkiewicz, Director, Open Research Solutions
By investing in simple workflow solutions, publishers can better meet the needs of funders and institutions that wish to support open research practices, concludes research released this week by PLOS.
Policies can be an effective lever for changing research culture and practice. A growing number of research-performing organisations (institutions) and funding agencies have policies that support open research practices, such as the sharing of research data, code and software, as do publishers. Seeking to deepen our understanding of funder and institution needs related to open research, we surveyed more than 100 funders and institutions in 2020. We wanted to know whether they evaluate how researchers share data and code, how and why they do so, and how satisfied they are with their ability to get these tasks done. Our results are available as a preprint along with an anonymised dataset.
Half of respondents had previously tried to evaluate whether researchers share data or code, and most plan to do so in the future. The most common reason for assessing data or code sharing was a desire to understand or compare the current state of researchers' practices; the third most common was to inform the development of policies, support or services.
However, the approaches being used to assess open research practice appear to be unsatisfactory. Personal contact with researchers, such as via email, was the most common method. One respondent described the task as "A very manual process of looking at the individual articles we have funded to see how the data is shared." The needs of organisations that don't yet have open research policies beyond open access to articles are even less well met. To plan for change, as one respondent from an institution noted, they need better information on how their faculty currently practice open research.
Overall, the results imply that the growth of policies and requirements for making research data and code available has not been matched by capabilities or solutions for determining whether those policies are being complied with. Measuring policy compliance was the second most common reason for evaluating open research practices. Further, there appear to be insufficient tools, or insufficient awareness of tools, to support organisations in guiding strategic and policy decisions that could benefit open research.
The issues that seem most challenging for funders and institutions are also the most complex. Determining whether research data have been made available in a reusable form, and determining why research data or code have not been shared, were among the tasks respondents were least satisfied with their ability to complete. Our previous research suggested that researchers also need better solutions for finding research data they can reuse and for obtaining data that are not publicly available.
Another key finding concerns the limitations of the research paper as the primary unit of research assessment. Respondents considered the evaluation of open research practices associated with specific grants, rather than with papers, to be even more important.
However, while evaluating research papers for data and code sharing practices will not tell the whole story of open research, the results do suggest that simple solutions implemented by journals and publishers can go some way towards meeting the needs we assessed in our survey. Many respondents were not satisfied with their ability to complete important tasks that rely on finding information in publications.
Simple solutions more publishers could provide include:
- Mandatory Data Availability Statements (DAS) in all relevant publications.
Across the STM industry, around 15% of papers include a DAS. Since we introduced our data availability policy in 2014, 100% of PLOS research articles have included a DAS.
- Supporting researchers to provide information on why research data (and code) are not publicly available with their publications.
Time and again “data available on request” has been shown to be ineffective at supporting new research — and is not permitted in PLOS journals.
- Enabling and encouraging the use of data repositories.
Recommending the use of data repositories is a useful step, but making them easily and freely accessible — integrated into the publishing process — can be even more effective. Rates of repository use are higher in journals that partner closely with repositories and remove cost barriers to their use.
- Providing visible links to research data on publications.
Many researchers also struggle to find data they can reuse, which is why PLOS will soon be experimenting with improving this functionality in our articles, and integrating the Dryad repository with our submission process.
For the many researchers for whom we know compliance with funder policies is important, publishing their work in ways, or in journals, that support transparency in the availability of research data and code will help meet their funders' and institutions' needs, as well as supporting the reuse and impact of their research.