Altmetrics Poster MRT.pptx (122.77 kB)

Engineers Don't Blog and Other Stories (why Scopus uses subject area benchmarking)

poster
posted on 2015-10-08, 08:23, authored by Michael Taylor

•There are clear differences between subject areas in all types of alternative metrics.

•Scholarly Activity (and Citation) provides the broadest and densest data, closely followed by Social Activity.

•Popular, highly-visible general journals dominate Mass Media and Scholarly Commentary, but are less influential in terms of Citation and Scholarly Activity.

•Life Sciences (esp. Neuroscience and Psychology) are highly active in all areas of alternative metrics.

•Although Engineering attracts a reasonable level of Citation and is of substantial economic importance, it ranks at or near the bottom for all altmetric sources.

•Subjects with non-linguistic discourse tend to show the lowest levels of activity in discursive channels.

•When comparing publications, normalizing or benchmarking for subject area is essential for all alternative metrics and citation.

Methodology

•2.4M publications indexed by Scopus with a publishing date of 2014.

•Citation, Altmetric and Mendeley data sampled on 10 September 2015.

•Alternative metric data collated into four buckets, as defined by Snowball Metrics:

- Social Activity (Twitter, Facebook, Reddit and Google+).

- Scholarly Activity (Mendeley and CiteULike).

- Scholarly Commentary (blogs, Wikipedia, F1000 Prime, PubPeer, etc.).

- Mass Media.
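The four-bucket collation above can be sketched as a simple lookup table. The source names and the exact membership of each bucket below are illustrative assumptions; the study's full source list may differ.

```python
# Snowball Metrics buckets: an illustrative source-to-bucket mapping,
# not the exact list used in the poster's methodology.
BUCKETS = {
    "twitter": "Social Activity",
    "facebook": "Social Activity",
    "reddit": "Social Activity",
    "google+": "Social Activity",
    "mendeley": "Scholarly Activity",
    "citeulike": "Scholarly Activity",
    "blogs": "Scholarly Commentary",
    "wikipedia": "Scholarly Commentary",
    "f1000 prime": "Scholarly Commentary",
    "pubpeer": "Scholarly Commentary",
    "news": "Mass Media",
}

def bucket_for(source: str) -> str:
    """Return the Snowball Metrics bucket for an altmetric source."""
    return BUCKETS.get(source.lower(), "Unclassified")
```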

•Each publication's data was assigned to its journal's Scopus / ASJC mid-tier subject code (a many-to-one relationship – no fractional counting; total publications for all buckets are the same as given in 'Citation').

•Counts classified into 3 groups – no activity (pale pink), 1 count per publication (red), 2 or more counts (green).
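The three-way classification of counts can be expressed as a small helper; the function name is hypothetical, and the colour notes mirror the poster's legend.

```python
def activity_group(count: int) -> str:
    """Classify a publication's metric count into the poster's three groups."""
    if count == 0:
        return "no activity"        # pale pink in the poster
    if count == 1:
        return "1 count"            # red
    return "2 or more counts"       # green
```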

About Scopus benchmarking

•Scopus only computes a benchmark when there are a minimum of 2,500 publications in a cohort.

•A cohort is defined by its Scopus / ASJC subject code, its publishing window and (if N >= 2500 is achieved) its document type (e.g., conference paper, article, review, etc.).

•The publishing window for citation and Scholarly Activity is eighteen months, for other metrics it is two months.

•For each metric being benchmarked, the publication is ranked, the percentile edges calculated, and the percentile assigned.
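The benchmarking step above could be sketched roughly as follows. This is a minimal illustration assuming a cohort is simply a list of metric counts; the exact percentile-edge calculation Scopus uses is not specified in the poster, so a plain rank-below percentile is shown here.

```python
# Minimal sketch of within-cohort percentile benchmarking.
# The 2,500-publication threshold is from the poster; the percentile
# formula itself is a simplifying assumption.
MIN_COHORT_SIZE = 2500

def percentile_rank(cohort_counts, value):
    """Percentile (0-100) of `value` within its cohort, or None when the
    cohort is too small for Scopus to compute a benchmark."""
    if len(cohort_counts) < MIN_COHORT_SIZE:
        return None
    below = sum(1 for c in cohort_counts if c < value)
    return 100.0 * below / len(cohort_counts)
```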


The dataset is available here: http://dx.doi.org/10.17632/smjj59mbmb.1
