
What is the evaluative object of Open Science?

presentation
posted on 2017-11-27, 14:01, authored by Clifford Tatum
Increased attention to Open Science in research policy raises questions about how to incorporate Open Science into research evaluation. Policy circles recognize the need for better alignment between the objectives of Open Science and the criteria used for evaluating research outcomes [1, 2]. Key factors contributing to this need include researchers’ publishing practices, wherein journal prestige is prioritized ahead of openness [3], and the additional effort entailed in opening up resources embedded in particular research practices [4]. Both point to a complex relationship between reputational incentives and the focus on impact in research evaluation. Recent contributions addressing this issue generally characterize the solution as a need to incentivize the implementation of Open Science and/or to expand research evaluation criteria to include aspects of openness [2]. However, it remains unclear how to facilitate evaluation of Open Science across heterogeneous research settings. In this presentation, I outline an “evaluative inquiry” approach to the assessment of Open Science. The evaluative inquiry framework [5] treats evaluation events as instances of knowledge production, carried out in close interaction with those being evaluated. This co-production orientation shifts evaluation from a strictly top-down exercise to a more dialogic process, and from strictly rewarding past output to also mobilizing future-oriented research planning. The aim of this approach is to situate evaluation in the context of local epistemic priorities, thereby linking evaluation to research practices rather than to administrative tasks. In this way, evaluation is enrolled in both the implementation and the assessment of Open Science.

References

[1] Dutch Ministry of Education, Culture and Science (OCW). 2017. “National Plan Open Science.” DOI: 10.4233/uuid:9e9fa82e-06c1-4d0d-9e20-5620259a6c65.
[2] European Commission. 2017. “Evaluation of Research Careers Fully Acknowledging Open Science Practices: Rewards, Incentives and/or Recognition for Researchers Practicing Open Science.” Directorate-General for Research and Innovation, Open Science and ERA Policy. https://ec.europa.eu/research/openscience/pdf/os_rewards_wgreport_final.pdf
[3] Nature Research. 2015. “Author Insights 2015 Survey.” DOI: 10.6084/m9.figshare.1425362.v7.
[4] Tatum, Clifford, Alex Rushforth, Thed N. van Leeuwen, and Sarah de Rijcke. 2017. “Epistemic Data Practices: Case Study Report for the CWTS/Elsevier Study on Open Data.” DOI: 10.17632/bwrnfb4bvh.1.
[5] Fochler, Maximilian, and Sarah de Rijcke. 2017. “Implicated in the Indicator Game? An Experimental Debate.” Engaging Science, Technology, and Society 3: 21–40. DOI: 10.17351/ests2017.108.
