Data-driven Assessment of Structural Image Quality

Adon Rosen, David R. Roalf, Kosha Ruparel, Jason Blake, Kevin Seelaus, L. Prayosha Villa, Rastko Ciric, Philip A. Cook, Christos Davatzikos, Mark A. Elliott, Angel Garcia De La Garza, Efstathios D. Gennatas (+8 others)
2017, bioRxiv pre-print
Data quality is increasingly recognized as one of the most important confounding factors in brain imaging research. It is particularly important for studies of brain development, where age is systematically related to in-scanner motion and data quality. Prior work has demonstrated that in-scanner head motion biases estimates of structural neuroimaging measures, yet objective measures of data quality are not available for most structural brain images. Here we sought to identify quantitative measures of data quality for T1-weighted volumes, describe how such measures relate to cortical thickness, and delineate how this in turn may bias inference regarding associations with age in youth. Three highly trained raters provided manual ratings of 1,840 raw T1-weighted volumes. These images comprised a training set of 1,065 images from the Philadelphia Neurodevelopmental Cohort (PNC), a test set of 533 images from the PNC, and an external test set of 242 adults acquired on a different scanner. Manual ratings were compared to automated quality measures provided by the Preprocessed Connectomes Project's Quality Assurance Protocol (QAP), as well as to FreeSurfer's Euler number, which summarizes the topological complexity of the reconstructed cortical surface. Results revealed that the Euler number was consistently correlated with manual ratings across samples. Furthermore, the Euler number identified images scored "unusable" by human raters with a high degree of accuracy (AUC: 0.98-0.99) and outperformed proxy measures derived from functional timeseries acquired in the same scanning session. The Euler number was also significantly related to cortical thickness in a regionally heterogeneous pattern that was consistent across datasets and replicated prior results. Finally, data quality both inflated and obscured associations with age during adolescence. Taken together, these results indicate that reliable measures of data quality can be automatically derived from T1-weighted volumes, and that failing to control for data quality can systematically bias the results of studies of brain maturation.
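The Euler number cited in the abstract derives from the Euler characteristic of the reconstructed surface mesh, χ = V − E + F (vertices minus edges plus faces). A defect-free closed cortical surface is topologically a sphere with χ = 2; each topological defect (a hole or handle left by segmentation errors, which track poor data quality) lowers χ. The minimal sketch below is illustrative only and is not the FreeSurfer implementation; the function name and toy mesh are hypothetical:

```python
# Illustrative sketch (not FreeSurfer's code): Euler characteristic of a
# triangle mesh, chi = V - E + F. For a defect-free closed surface
# (a topological sphere) chi = 2; surface defects reduce this value.

def euler_characteristic(n_vertices, faces):
    """Return chi = V - E + F for a triangle mesh.

    faces: iterable of (i, j, k) vertex-index triples; each shared edge
    is counted exactly once via a set of sorted vertex pairs.
    """
    edges = set()
    for a, b, c in faces:
        for u, v in ((a, b), (b, c), (a, c)):
            edges.add((min(u, v), max(u, v)))
    return n_vertices - len(edges) + len(faces)

# Toy example: a tetrahedron, the simplest closed triangulated surface.
# V = 4, E = 6, F = 4, so chi = 4 - 6 + 4 = 2, as for any topological sphere.
faces = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
print(euler_characteristic(4, faces))  # -> 2
```

Because lower values indicate more reconstruction defects, a simple threshold on this quantity is enough to flag "unusable" scans, which is how the abstract's AUC of 0.98-0.99 should be read.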
doi:10.1101/125161