Management and Quality Control of Large Neuroimaging Datasets: Developments from the Barcelonaβeta Brain Research Center
Jordi Huguet, Carles Falcon, David Fusté, Sergi Girona, David Vicente, José Luis Molinuevo, Juan Domingo Gispert, Greg Operto
Presenting author: Greg Operto
Recent decades have witnessed a growing number of large to very large neuroimaging datasets. Collecting, hosting, managing, processing and reviewing these datasets is typically handled through a local neuroinformatics infrastructure, yet setting up such a system remains a hard task, particularly for groups operating their own imaging equipment. We propose a practical model guided by principles such as user involvement, light weight, modularity and reusability. This model is based on the experience of an eight-year-old institution managing cohort studies on Alzheimer's disease, the Barcelonaβeta Brain Research Center. The model gave rise to an ecosystem of tools aimed at improving quality control through seamless automatic processes, combined with a variety of library modules, command-line and graphical user interfaces, and instant messaging (IM) applets. This ecosystem was shaped around XNAT and is composed of independently reusable, freely available components, including:
- bx: provides command-based interaction with the XNAT data over a set of frequent use cases
- snaprate: assists with the review of processing outputs and collects quality assessment from a panel of experts
- nisnap: creates snapshots of segmentation maps
- bbrc-validator: runs systematic checks on imaging data and their derivatives regardless of the data type
- xnat-monitors: sends daily updates through IM integration
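The systematic-check pattern underlying a tool like bbrc-validator can be illustrated with a minimal sketch: a registry of independent check functions is run against a session's metadata and aggregated into a pass/fail report. All function and field names below are illustrative assumptions, not the actual bbrc-validator API.

```python
# Hypothetical sketch of a systematic-check runner in the spirit of
# bbrc-validator; names and session structure are illustrative only.

def has_t1(session):
    """Pass if the session contains at least one T1-weighted scan."""
    return any(scan["type"] == "T1" for scan in session["scans"])

def correct_slice_count(session, expected=176):
    """Pass if every T1 scan has the expected number of slices."""
    return all(scan["slices"] == expected
               for scan in session["scans"] if scan["type"] == "T1")

# New checks are added by appending to this registry, which keeps the
# validator modular and independently reusable.
CHECKS = [has_t1, correct_slice_count]

def validate(session):
    """Run every registered check and return a {check_name: passed} report."""
    return {check.__name__: bool(check(session)) for check in CHECKS}

session = {"scans": [{"type": "T1", "slices": 176},
                     {"type": "FLAIR", "slices": 192}]}
report = validate(session)  # e.g. {'has_t1': True, 'correct_slice_count': True}
```

Because each check is a plain function of the session, the same registry can feed a command-line report, a graphical review interface, or a daily IM digest.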
This paradigm generalizes to the broader community of researchers working with large neuroimaging datasets.