Opinion

Hindawi’s response to US Office of Science and Technology Policy

Can US Federal Research Agencies move beyond Public Access to fuel innovation and integrity in the research process itself?


We recently submitted a response to the US Office of Science and Technology Policy (OSTP) request for information (RFI): Public Access to Peer-Reviewed Scholarly Publications, Data and Code Resulting From Federally Funded Research.

The OSTP asked for thoughts on: limitations of and barriers to the effective communication of research outputs; opportunities for change; what more Federal agencies can do; the engagement of Federal agencies with other actors; and how American science leadership and American competitiveness would benefit from immediate access. Our submission, published in full below, was led by Dr Catriona J. MacCallum, Director of Open Science. In part, it reflects the upcoming final report* of the EU Open Science Policy Platform, to which she contributed as writing chair and member representative of the Open Access Scholarly Publishers Association.

Response from Hindawi Ltd to US OSTP RFI: Public Access to Peer-Reviewed Scholarly Publications, Data and Code Resulting From Federally Funded Research

(submitted by Dr Catriona J. MacCallum, Director of Open Science, Hindawi Ltd)*

Topic 1. What current limitations exist to the effective communication of research outputs (publications, data, and code) and how might communications evolve to accelerate public access while advancing the quality of scientific research? What are the barriers to and opportunities for change?

A. Limitations and barriers to the effective communication of research outputs:

  1. A hypercompetitive research culture that prioritises fashionable research, individual status and novelty over collaboration, research integrity and reusability1–8.

  2. A system of research communication in which the policies, practices and processes of communication are themselves neither evidence-based nor subject to independent scrutiny9.

  3. An academic evaluation, reward and career structure based on a limited set of outputs (heavily focussed on publications and primarily journal articles) that does not reward transparency, rigour, collaboration and the sharing and reuse of a variety of research outputs (such as data, code, etc.)1,10–14.

  4. A legacy publishing system based on the ownership and control of research outputs (e.g. via subscriptions), rather than on the creation of tools, products and services that maximise effective and reliable research communication in a globally networked digital age15,16.

    1. A publishing market where the reputation of researchers is tied to specific journal brands, with the Journal Impact Factor often used as a proxy for the quality of an individual research output rather than the merits of the researchers themselves17. As stated by DORA, the Journal Impact Factor and other journal-level metrics do not necessarily reflect the quality of an individual's published research, the range of individual outputs and/or the contributions of researchers18,19.

    2. A system of peer review that introduces delays and bias into research and is not open to the independent scrutiny needed to test its effectiveness and integrity20,21.

  5. No coherent, sustainable, open and interoperable infrastructure to support the effective communication of the full range of research outputs22. 

    1. In particular, much of the data and metadata that support the connections between different research outputs are controlled by commercial companies, kept closed and often monetised. As dependence on proprietary data providers grows, universities and research funders risk becoming completely reliant on a few large companies for critical evaluation and decision support23.

  6. A disparity in progress and motivation among different disciplines and institutions, among different actors and organisations, and among researchers at different stages of their career.

  7. A lack of policy alignment across local, regional, national and international jurisdictions, and no clear legal or regulatory framework for public or private individuals or actors. This disadvantages researchers collaborating in different jurisdictions and slows progress among other actors. 

A hypercompetitive research culture, alongside an evaluation system that ranks both researchers and institutions on a very limited set of proxy metrics, is perhaps the key barrier to the effective communication of research. The consequences include an unwillingness among researchers to collaborate and share research outputs, a tendency to maximise the number of publications ('salami slicing') and a culture in which cutting corners and selective reporting are acceptable if they put work in the most favourable light for publication5,7,24–26. All of these severely disadvantage the effective communication of research outputs.

Furthermore, research integrity and ethics are not commonly part of the education or continuing development of researchers and are not used as part of research or researcher evaluation. Several expert reports and surveys have demonstrated a lack of awareness, support, training and leadership around research and publication ethics and integrity, in particular among researchers27.

B: Opportunities for change

There are three opportunities for meaningful systemic change across all of scholarly communication, irrespective of discipline or jurisdiction.

  1. Cultural & Social: to foster a practice and process of research and scholarship that fuels innovation, promotes integrity, encourages collaboration, shares failure, celebrates success and rewards a diversity of talent, skills and performance. Key to this is:

    1. a wholesale change to the reward and tenure system to align the reputation and career progression of researchers, and the mission of publicly funded institutions, with the processes, practices and outputs that best serve science and society. This is applicable to every discipline and includes both applied and fundamental research.

  2. Technological: to create a truly open, reusable and interoperable infrastructure for scholarly communication that makes collaboration, dissemination and discovery as frictionless as possible23 (see responses to Topics 2 and 3).

  3. Economic and legal: to fundamentally shift the business relationships between scholarly publishers and the research community from a model based on ownership, control and journal brands to one based on value-added services, collaborative partnerships and community engagement (see response to Topic 3).

An effective research communication system must ensure that there is trust in the research processes, and in the reliability of published articles, data, code and other related outputs, including those that do not lend themselves to novelty or directly benefit personal status.

For example, researchers, institutions and publishers have little incentive to publish null, negative and inconclusive results. This has created substantial and damaging publication bias across the entire research system28,29. Publication bias and related issues are likely to exist in some form in all disciplines, including the arts, humanities and social sciences30, but research into the prevalence of such bias and its consequences has largely been limited to clinical and preclinical disciplines.

Researchers need to be empowered by a reward system that encourages them to collaborate and share their work openly, to be creative, honest and transparent and to take responsible risks. They should not be stigmatised for failure nor penalised for the publication and sharing of null, negative or inconclusive results.

Topic 2. What more can Federal agencies do to make tax-payer funded research results, including peer-reviewed author manuscripts, data, and code funded by the Federal Government, freely and publicly accessible in a way that minimizes delay, maximizes access, and enhances usability? How can the Federal Government engage with other sectors to achieve these goals?

A. What more Federal agencies can do:

  1. Ensure public access to research outputs:

    1. Mandate that peer-reviewed publications are made open access on publication, without any embargo period.

    2. Promote FAIR data principles and management31.

    3. Ensure that data management plans lay out the expectations for sharing the data and code underlying any published work.

    4. Mandate that all grantees include a Data Availability Statement in any publication32.

  2. Change academic culture and reward:

    1. Encourage the adoption and implementation of DORA at all US institutions and across all federal funding agencies33. Ensure there is also adherence to principles for the responsible use of metrics in research communication34.

    2. Work with publishers, data repositories and other service providers to enable article-data or article-code linking.

    3. Require citations to data and software code, and ensure these are given at least the same level of reward and recognition as citations to publications.

    4. Provide training and education at all stages of researchers’ careers on open access and open science. This includes training in:

      1. skills associated with research integrity, research and publication ethics, data/code stewardship, management and reuse. 

      2. the use of infrastructure and artificial intelligence tools to mine text and data at scale.

  3. Support the development of a fully open infrastructure for research communication:

    1. Support a model in which commercial players can develop and support open infrastructure using service-based business models that don’t involve ownership of this infrastructure or create dependencies on any single provider.

    2. Provide funding mechanisms, including for the creation and stewardship of data by institutions and repositories.

    3. Provide dedicated investment to sustain the maintenance and ongoing development of cybersafe infrastructures and services.

    4. Research, develop and implement community-agreed standards for different disciplines. Crucially, this involves:

      1. the support and widespread international adoption of community-based, community-governed persistent identifiers (PIDs), such as ORCID iDs, to track, trace and discover research outputs and the emergence of new disciplines, and to help fuel collaboration (see the sketch after this list).

      2. community-agreed, international metadata standards, where the metadata themselves are openly available for independent scrutiny so that effective services and tools can be built upon them.

  4. Create an economic, legal and policy framework for public access:

    1. Adopt an evidence-based approach to policy making by supporting and funding research about research (meta-research) as a direct bridge to policy development (see, for example, the US Center for the Science of Science and Innovation Policy35, the Meta-Research Innovation Center at Stanford (METRICS)36 and the Research on Research Institute (RORI)37).

      1. As with funded research projects and outputs, it is important to review and evaluate which policies do and do not work in different contexts.

      2. Monitor any unintended or negative consequences, whether for the research community, for other actors and entities (public and private), or for the communities and society that research serves.

      3. Apply the same principles of research integrity, reuse and access to policy development as apply to research practice and process. For example, policies should be available for independent scrutiny (e.g. peer review).

    2. Provide clear roles and rights for re-users and consumers of publicly funded research outputs; in particular, consumers should not be excluded on grounds of affordability.

    3. Remove obstacles that prevent low- and middle-income countries (LMICs) from contributing, reusing and collaborating within community-agreed standards.
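To make the role of open PIDs and open metadata concrete, the sketch below (an illustrative addition, not part of the formal response) walks from a researcher’s ORCID record to the DOIs of their outputs and any openly deposited links to related data or code, using only the public ORCID and Crossref REST APIs. The iD shown is ORCID’s well-known fictional test researcher, and the exact fields returned vary by record.

```python
# Illustrative sketch only: walking from an ORCID iD to open Crossref metadata.
# Assumes the public ORCID (pub.orcid.org) and Crossref (api.crossref.org)
# REST APIs; no API key, contract or proprietary intermediary is needed.
import requests

# ORCID's fictional test researcher (Josiah Carberry).
ORCID_ID = "0000-0002-1825-0097"

# 1. List the works attached to the ORCID record.
resp = requests.get(
    f"https://pub.orcid.org/v3.0/{ORCID_ID}/works",
    headers={"Accept": "application/json"},
    timeout=30,
)
resp.raise_for_status()

# 2. Collect the DOIs registered against those works.
dois = []
for group in resp.json().get("group", []):
    for summary in group.get("work-summary", []):
        for ext in summary.get("external-ids", {}).get("external-id", []):
            if ext.get("external-id-type") == "doi":
                dois.append(ext.get("external-id-value"))

# 3. For each DOI, fetch the open Crossref record and report any deposited
#    relations to other outputs (e.g. datasets or code supplements).
for doi in dict.fromkeys(dois):  # de-duplicate, preserving order
    record = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
    if record.status_code != 200:  # not every DOI is registered with Crossref
        continue
    msg = record.json().get("message", {})
    title = (msg.get("title") or ["(untitled)"])[0]
    relations = list(msg.get("relation", {}).keys())
    print(f"{doi} | {title} | relations: {relations or 'none deposited'}")
```

Because both identifier systems are community-governed and their metadata are open, the same traversal works for any funder, institution or service provider, without creating a dependency on a single commercial supplier.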

B. Engagement of Federal Agencies with other Actors

Enabling open and FAIR access to research outputs entails active partnerships and collaboration among all sectors and disciplines, including researchers, businesses and local communities, as well as institutions, research funders, governments and citizens. Importantly, if such a system is to be trusted and effective, it must also manage the needs and responsibilities of different stakeholders, communities and jurisdictions.

Such a multi-stakeholder environment can only function if there is a common understanding of the importance and value of enabling access to these outputs, and a shared responsibility among all stakeholders for how research is conducted, produced and shared openly and reliably.

At a minimum, Federal agencies need to:

  1. Align key research communication policies at a State and Federal level.

  2. Work with research funding agencies in Europe, China, Africa, India and South America, and also via the United Nations, to develop a global framework for, and standards of, access and reuse.

  3. Work with publishers, repositories, and other service and infrastructure providers to develop new business models for open access, FAIR data and open infrastructure.

  4. Work with Scholarly Societies and National Academies to develop codes of research integrity and principles of scholarship that are discipline-specific but aligned to community principles and common objectives.

Topic 3. How would American science leadership and American competitiveness benefit from immediate access to these resources? What are potential challenges and effective approaches for overcoming them? Analyses that weigh the trade-offs of different approaches and models, especially those that provide data, will be particularly helpful.

Openness is a vital instrument which, when used responsibly, can fuel a faster, more effective, more reliable, more trustworthy, more equitable and more innovative research communication system. Openness in science has the potential not only to respond to the world’s greatest practical challenges but also to benefit industry, technology, society and scholarly research itself. If delays and barriers to creating, sharing, verifying and discovering research can be removed, we can not only respond more quickly and effectively to public health emergencies (such as COVID-19) but also harness this collective knowledge to ensure that the US and other national economies benefit and the UN Sustainable Development Goals are achieved more quickly. This is an opportunity for all actors and organisations to contribute and benefit.

Hindawi Ltd is a case in point. We are a commercial, Open Access publisher that makes the content of all our journals openly and freely available. We are developing and implementing standards for open science, such as data sharing, and data and code linking and citation. We are strengthening our editorial and research integrity policies to enable reuse and discovery. We deposit all our metadata, including citations and abstracts, and make it publicly available via Crossref. We are developing new services and tools, in particular an open-source, end-to-end publishing management platform called Phenom38, and creating publishing partnerships to deliver these services at scale to other publishers, such as Wiley, AAAS and GeoScienceWorld, as well as for our own journals. We are doing this because openness allows us to innovate and gives us a commercial advantage, and because this is to the benefit, rather than at the expense, of science, society and the economy.
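As an illustration of what openly deposited metadata means in practice, the short sketch below (again an illustrative addition, not part of the submitted response) retrieves some of this metadata through the public Crossref REST API. The member lookup by publisher name is a convenience assumption; the first match should be checked rather than trusted.

```python
# Illustrative sketch only: retrieving openly deposited publisher metadata
# from the public Crossref REST API (api.crossref.org).
import requests

# 1. Find the publisher's Crossref member record instead of hard-coding an ID.
#    Assumption: the top match for this query is the intended publisher.
members = requests.get(
    "https://api.crossref.org/members",
    params={"query": "Hindawi"},
    timeout=30,
)
members.raise_for_status()
member = members.json()["message"]["items"][0]
print("Matched member:", member["primary-name"], "(id:", str(member["id"]) + ")")

# 2. Fetch a few recently deposited works from that member.
works = requests.get(
    f"https://api.crossref.org/members/{member['id']}/works",
    params={"rows": 3, "sort": "created", "order": "desc"},
    timeout=30,
)
works.raise_for_status()

# 3. Abstracts and reference lists travel with the open metadata
#    whenever the publisher deposits them.
for item in works.json()["message"]["items"]:
    title = (item.get("title") or ["(untitled)"])[0]
    print(item.get("DOI"), "|", title)
    print("   abstract deposited:", "abstract" in item,
          "| open references:", len(item.get("reference", [])))
```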

The US has an opportunity to take a global leadership position on the development of evidence-based policy and practice. This will only be achieved, however, if it is done in parallel with the development of processes and practices that maximize both 1) the reliability and usability of research outputs and 2) opportunities for collaboration and co-creation, both nationally and internationally. These processes and practices require dedicated tools, technology, appropriate funding and services set within an interoperable infrastructure and a clear legal and regulatory framework that permits different actors and entities, commercial and not-for-profit, to contribute to and gain from the system. These include but are not limited to:

i. Clear, relevant, evidence-based policies that aim to increase the availability and reuse of research outputs in a globally competitive context (see also response A4 to Topic 2)

ii. A global interoperable infrastructure of tools, services, hardware and software (see also response B2 to Topic 1 and A3 to Topic 2)

iii. Clear regulatory frameworks to manage the interests of different stakeholders (see also response A4 to Topic 2)

iv. A transparent, competitive market

It is in the interest of US markets and the US economy to ensure a transparent, competitive market that enables private companies, including small and medium-sized enterprises (SMEs), as well as publicly funded organisations such as universities and research-performing organisations, to contribute to and benefit from publicly accessible research outputs. This emerging market has not yet been fully exploited because of the constraints of the existing research communication system (including non-disclosure agreements, multi-year contract terms and privately negotiated prices for journal subscriptions), a perceived incompatibility with intellectual property rights (IPR) and competitiveness policies, and conflicting internal financial and legal rules.

Topic 4. Additional information.

At the heart of a system that prioritises access, reliability and reuse of research outputs are the researchers themselves. To harness their skills and expertise, all of the above needs to be embedded within a research culture that motivates experimentation, sharing, trust and collaboration, while ensuring there is space for individual creativity and exchange with society, as well as economic return. It must also facilitate equity of opportunity across the globe in how knowledge and expertise are contributed to this system, as well as in how they are accessed, disseminated, discovered and reused.

 

*Note that this response was informed by discussions and includes some text by Catriona J. MacCallum and others from the EU Open Science Policy Platform Final Report ‘Progress on Open Science: Towards a Shared Research Knowledge System’ led by the European Commission Directorate-General for Research and Innovation, which will be published later in May 2020. The final report was compiled by Eva Mendez (Chair, OSPP Mandate 2), Rebecca Lawrence (Editor and Coordinator), Catriona J. MacCallum (Writing Chair), Eva Moar (Rapporteur), together with all the members of the Open Science Policy Platform.

 

References

1. Smaldino, P. E. & McElreath, R. The natural selection of bad science. R. Soc. Open Sci. 3, (2016).

2. Alberts, B., Kirschner, M. W., Tilghman, S. & Varmus, H. Rescuing US biomedical research from its systemic flaws. Proc. Natl. Acad. Sci. 111, 5773–5777 (2014).

3. Jones, R. & Wilsdon, J. The Biomedical Bubble: Why UK research and innovation needs a greater diversity of priorities, politics, places and people. https://www.nesta.org.uk/report/biomedical-bubble/ (2018).

4. Moore, S., Neylon, C., Eve, M. P., O’Donnell, D. P. & Pattinson, D. “Excellence R Us”: university research and the fetishisation of excellence. Palgrave Commun. 3, 16105 (2017).

5. Tressoldi, P. E., Giofré, D., Sella, F. & Cumming, G. High Impact = High Statistical Standards? Not Necessarily So. PLOS ONE 8, e56180 (2013).

6. Brembs, B. Reliable novelty: New should not trump true. PLOS Biol. 17, e3000117 (2019).

7. Casadevall, A. & Fang, F. C. Causes for the Persistence of Impact Factor Mania. mBio 5, e00064-14 (2014).

8. Ioannidis, J. P. A. Concentration of the Most-Cited Papers in the Scientific Literature: Analysis of Journal Ecosystems. PLoS ONE 1, (2006).

9. Ioannidis, J. P. A. Meta-research: Why research on research matters. PLOS Biol. 16, e2005468 (2018).

10. Landis, S. C. et al. A call for transparent reporting to optimize the predictive value of preclinical research. Nature 490, 187–191 (2012).

11. Brembs, B., Button, K. & Munafò, M. Deep impact: unintended consequences of journal rank. Front. Hum. Neurosci. 7, 291 (2013).

12. Baas, J. & Fennell, C. When Peer Reviewers Go Rogue - Estimated Prevalence of Citation Manipulation by Reviewers Based on the Citation Patterns of 69,000 Reviewers. https://papers.ssrn.com/abstract=3339568 (2019).

13. Wouters, P. et al. Rethinking impact factors: better ways to judge a journal. Nature 569, 621 (2019).

14. Haustein, S. & Larivière, V. The Use of Bibliometrics for Assessing Research: Possibilities, Limitations and Adverse Effects. in Incentives and Performance (eds. Welpe, I. M., Wollersheim, J., Ringelhan, S. & Osterloh, M.) 121–139 (Springer International Publishing, 2015). doi:10.1007/978-3-319-09785-5_8.

15. Future of Scholarly Publishing and Scholarly Communication: Report of the Expert Group to the European Commission. (Publications Office of the European Union, 2019).

16. Fyfe, A. et al. Untangling Academic Publishing: A history of the relationship between commercial interests, academic prestige and the circulation of research. https://zenodo.org/record/546100 (2017) doi:10.5281/zenodo.546100.

17. Perneger, T. V. Citation analysis of identical consensus statements revealed journal-related bias. J. Clin. Epidemiol. 63, 660–664 (2010).

18. Larivière, V. et al. A simple proposal for the publication of journal citation distributions. bioRxiv 062109 (2016) doi:10.1101/062109.

19. DORA: San Francisco Declaration on Research Assessment. ASCB. http://www.ascb.org/dora/.

20. Lee, C. J. & Moher, D. Promote scientific integrity via journal peer review data. Science 357, 256–257 (2017).

21. Squazzoni, F. et al. Data Sharing and Research on Peer Review: A Call to Action. https://osf.io/sr6eg (2019) doi:10.31235/osf.io/sr6eg.

22. Neylon, C., Bilder, G. & Lin, J. Principles for Open Scholarly Infrastructures. 1–5 https://espace.curtin.edu.au/handle/20.500.11937/56273 (2015) doi:10.6084/M9.FIGSHARE.1314859.

23. Peters, P. A radically open approach to developing infrastructure for Open Science. Hindawi Blog https://medium.com/@Hindawi/https-about-hindawi-com-opinion-a-radically-open-approach-to-developing-infrastructure-for-open-science-d0e6a1dfb99f (2018).

24. Landis, S. C. et al. A call for transparent reporting to optimize the predictive value of preclinical research. Nature 490, 187–191 (2012).

25. Fang, F. C. & Casadevall, A. Retracted Science and the Retraction Index. Infect. Immun. 79, 3855–3859 (2011).

26. Macleod, M. R. et al. Risk of Bias in Reports of In Vivo Research: A Focus for Improvement. PLOS Biol. 13, e1002273 (2015).

27. Wellcome. What researchers think about the culture they work in. https://wellcome.ac.uk/reports/what-researchers-think-about-research-culture.

28. Fanelli, D. Negative results are disappearing from most disciplines and countries. Scientometrics 90, 891–904 (2012).

29. Ioannidis, J. P. A. Why Most Published Research Findings Are False. PLoS Med 2, e124 (2005).

30. Peels, R. Replicability and replication in the humanities. Res. Integr. Peer Rev. 4, 2 (2019).

31. European Commission Expert Group on FAIR Data. Turning FAIR into Reality: Final report and action plan. (Publications Office of the European Union, 2018). https://publications.europa.eu/en/publication-detail/-/publication/7769a148-f1f6-11e8-9982-01aa75ed71a1/language-en/format-PDF.

32. Colavizza, G., Hrynaszkiewicz, I., Staden, I., Whitaker, K. & McGillivray, B. The citation advantage of linking publications to research data. PLOS ONE 15, e0230416 (2020).

33. Schekman, R. & Patterson, M. Reforming research assessment. eLife 2, e00855 (2013).

34. The Leiden Manifesto for Research Metrics. http://www.leidenmanifesto.org/.

35. Center for the Science of Science and Innovation Policy. American Institutes for Research https://www.air.org/project/center-science-science-and-innovation-policy (2014).

36. Meta-Research Innovation Center at Stanford (METRICS). https://metrics.stanford.edu/.

37. Research on Research Institute. http://researchonresearch.org.

38. Phenom System and Publisher Services. Hindawi https://www.hindawi.com/publishing-partnerships/phenom-system-and-publisher-services/.

[END]
