figshare
2019_csaudience_stability_and_mca.pdf (4.13 MB)
2019_neuroaudience_stability_and_mca.pdf (3.74 MB)
2020_mni_adm_stability_and_mca.pdf (6.34 MB)

Comparing Perturbation Models for Evaluating Stability of Neuroimaging Pipelines

Version 2 2020-01-31, 17:24
Version 1 2019-11-13, 17:15
presentation
posted on 2020-01-31, 17:24 authored by Greg KiarGreg Kiar
A lack of software reproducibility has become increasingly apparent in the last several years, calling into question the validity of scientific findings affected by published tools. Reproducibility issues may stem from numerous sources of error, including the underlying numerical stability of the algorithms and implementations employed. Various forms of instability have been observed in neuroimaging, including across operating system versions, under minor noise injections, and between theoretically equivalent implementations of the same algorithm. In this paper we explore the effect of several perturbation methods on a typical neuroimaging pipeline, using i) near-epsilon noise injections, ii) Monte Carlo Arithmetic, and iii) varying operating systems, to identify the quality and severity of their impact. The work presented here demonstrates that even low-order computational models such as the connectome estimation pipeline we used are susceptible to noise. This suggests that stability is a relevant axis upon which tools should be compared, developed, or improved, alongside more commonly considered axes such as accuracy/biological feasibility or performance. The heterogeneity observed across participants clearly illustrates that stability is a property not of the data or tools independently, but of their interaction. Characterization of stability should therefore be evaluated for specific analyses and performed on a representative set of subjects for consideration in subsequent statistical testing. Additionally, identifying how this relationship scales to higher-order models is an exciting next step which will be explored. Finally, the joint application of perturbation methods with post-processing approaches such as bagging or signal normalization may lead to the development of more numerically stable analyses while maintaining sensitivity to meaningful variation.
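The first perturbation method above, near-epsilon noise injection, can be illustrated with a minimal sketch. The function below is a hypothetical example (not the authors' implementation): it nudges each value in an array by a random number of units in the last place (ULPs), using NumPy's `np.spacing` to obtain each element's floating-point spacing, so the perturbation sits at the scale of machine epsilon relative to the value.

```python
import numpy as np

def inject_near_epsilon_noise(data, max_ulps=1, rng=None):
    """Perturb each element by a random multiple (in [-max_ulps, max_ulps])
    of its own floating-point spacing, i.e. a few ULPs of noise.

    Hypothetical sketch of near-epsilon noise injection; names and the
    exact perturbation scheme are assumptions for illustration.
    """
    rng = np.random.default_rng() if rng is None else rng
    data = np.asarray(data, dtype=np.float64)
    # Random integer number of ULPs per element.
    ulps = rng.integers(-max_ulps, max_ulps + 1, size=data.shape)
    # np.spacing(x) is the gap between x and the next representable float.
    return data + ulps * np.spacing(data)

x = np.array([1.0, 100.0, 1e-8])
x_perturbed = inject_near_epsilon_noise(x, rng=np.random.default_rng(0))
# The relative change is on the order of machine epsilon (~1e-16),
# yet repeated pipeline runs on such perturbed inputs can diverge.
```

Re-running a pipeline many times on inputs perturbed this way, and comparing the resulting connectomes, gives an empirical picture of the pipeline's numerical stability.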
