NWB2023_Estimating expert review quality scores for journal articles with bibliometrics and artificial intelligence

Presentation posted on 2023-10-03, 16:39, authored by Mike Thelwall, Kayvan Kousha, Paul Wilson, Mahshid Abdoli, Meiko Makita, Emma Stuart, Jonathan Levitt

Bibliometric indicators are often used to help estimate the quality of academic journal articles, but there is little systematic evidence about the relationship between the two. This talk summarises the findings of studies of this relationship for quality scores from the UK Research Excellence Framework (REF) 2021, the world’s most financially important systematic academic expert review exercise. It also reports the extent to which machine learning can estimate the quality of journal articles from bibliometric information and metadata. These studies drew on an almost complete set of individual article quality scores from REF2021, the largest science-wide data set of its kind. The bibliometric analyses suggest, surprisingly, that there may be a positive association between quality scores and article or journal citation rates in all broad fields. This relationship is never very strong, however, and is weak in some social sciences and in the arts and humanities. The machine learning analyses developed a range of strategies for predicting quality scores alongside expert reviewers, with the aim of reducing the human labour needed for reviewing; a sample of the original expert reviewers was then asked to comment on the desirability of these strategies. Although a technically desirable strategy was developed, its ethical implications led to it not being recommended for future REFs.
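To make the prediction task concrete, the following is a minimal sketch, not the authors' actual pipeline, of estimating expert review quality scores from bibliometric features with a supervised regression model. The feature set, the synthetic data, and the choice of a gradient-boosted regressor are illustrative assumptions, not details reported in the talk.

# Illustrative sketch only: synthetic data and assumed feature names,
# not the REF2021 data or the authors' published method.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_articles = 500

# Hypothetical bibliometric and metadata features per article
X = np.column_stack([
    rng.lognormal(mean=1.0, sigma=1.0, size=n_articles),  # field-normalised citation rate (assumed)
    rng.lognormal(mean=1.5, sigma=0.8, size=n_articles),  # journal mean citation rate (assumed)
    rng.integers(1, 20, size=n_articles),                  # number of authors (assumed)
    rng.integers(0, 2, size=n_articles),                   # international collaboration flag (assumed)
])

# Synthetic stand-in for REF-style quality scores on a 1* to 4* scale
y = np.clip(1 + 0.8 * np.log1p(X[:, 0]) + rng.normal(0, 0.5, n_articles), 1, 4)

# Cross-validated regression: how much score variance the features explain
model = GradientBoostingRegressor(random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"Cross-validated R^2: {scores.mean():.2f} (+/- {scores.std():.2f})")

In a real exercise the cross-validated accuracy would be compared against expert agreement levels to decide whether such predictions could safely sit alongside, or replace part of, human review.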
