Crowdsourced disciplines and universal impact
The use of quantitative metrics to gauge the impact of scholarly publications, authors, and disciplines is predicated on the availability of reliable annotation data. Citation and download counts are widely available from digital libraries. However, current annotation systems rely on proprietary labels, are manually curated, and apply to journals or articles rather than to authors. To address these limitations, we present a social framework based on crowdsourced annotations of scholars, designed to keep pace with the rapidly evolving disciplinary and interdisciplinary landscape. The crowdsourcing approach has the added advantage that, when combined with citation information about the annotated authors, it enables the computation of discipline-specific statistics and discipline-neutral impact metrics. In turn, these impact metrics create an incentive for users to provide disciplinary annotations of authors.
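To make the idea of a discipline-neutral metric concrete, here is a minimal sketch of one possible normalization: an author's h-index divided by the mean h-index of scholars annotated with the same discipline. The specific choice of the h-index and of mean-based normalization is an illustrative assumption, not necessarily the definition used by the framework.

```python
from statistics import mean

def h_index(citations):
    """h-index: the largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def universal_h(author_citations, discipline_h_values):
    """Discipline-neutral score: the author's h-index divided by the mean
    h-index of scholars annotated with the same discipline.
    (Hypothetical normalization, for illustration only.)"""
    return h_index(author_citations) / mean(discipline_h_values)

# Hypothetical example: an author whose papers have these citation counts
# (h = 4), in a discipline whose annotated scholars average h = 8.
author = [10, 8, 5, 4, 3]
discipline_peers = [12, 8, 4]  # hypothetical h-indices of peers
print(universal_h(author, discipline_peers))  # → 0.5
```

Under such a scheme, a score above 1 would indicate above-average impact within one's own discipline, making authors from fields with very different citation practices comparable on a common scale.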