MEGqc - an automated and standardized quality control workflow for MEG BIDS data
Aaron Reer, Evgeniia Gapontseva, Jochen W. Rieger
Presenting author:
Aaron Reer
Due to the high sensitivity of the sensors, magnetoencephalography (MEG) data are susceptible to noise, which can severely corrupt data quality. Quality control (QC) of such data is therefore an important step for valid and reproducible science. However, the visual detection and annotation of artifacts in MEG data requires expertise, is a tedious and time-intensive task, and is hardly standardized. Since quality control is commonly done in an idiosyncratic fashion, it may also be subject to individual biases. Beyond minimizing such biases, standardized QC routines enable comparisons across datasets and acquisition sites, facilitating the quality assessment of both in-house and shared datasets.

We therefore developed MEGqc, a software tool for automated and standardized quality control of MEG recordings. MEGqc aims to help researchers standardize and speed up their quality control workflow and is designed to be easy and intuitive to use: only minimal user input (the path to the dataset) is required. To this end, the tool is tailored to the established BIDS standard. Among other metrics, MEGqc detects noise frequencies in the power spectral density (PSD) and calculates their relative power, computes several metrics describing the 'noisiness' of channels and/or epochs (e.g., standard deviation or peak-to-peak amplitudes), and quantifies EOG- and ECG-related noise both averaged over all channels and on a per-channel basis.

MEGqc generates BIDS-compliant HTML reports for interactive visualization. Moreover, it provides machine-readable JSON outputs, which allow integration into automated workflows. MEGqc is open source, can be found on GitHub, and its documentation is hosted on Read the Docs.
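To illustrate the kinds of metrics described above, the following is a minimal sketch of per-channel QC computations on synthetic data: channel-wise standard deviation and peak-to-peak amplitude as 'noisiness' measures, and detection of noise peaks in the PSD together with their relative power. This is an assumption-laden illustration using numpy/scipy on simulated signals, not MEGqc's actual implementation or API.

```python
# Illustrative sketch of per-channel QC metrics (not MEGqc's API):
# STD, peak-to-peak amplitude, and PSD noise peaks with relative power.
import numpy as np
from scipy.signal import welch, find_peaks

rng = np.random.default_rng(0)
sfreq = 1000.0                      # sampling frequency in Hz (assumed)
t = np.arange(0, 10, 1 / sfreq)     # 10 s of data

# Synthetic 3-channel "recording"; channel 0 carries 50 Hz line noise
data = rng.normal(0.0, 1e-12, (3, t.size))
data[0] += 5e-12 * np.sin(2 * np.pi * 50 * t)

# 'Noisiness' metrics per channel
stds = data.std(axis=1)
ptps = data.max(axis=1) - data.min(axis=1)

# Noise frequencies in the PSD and their relative power (channel 0)
freqs, psd = welch(data, fs=sfreq, nperseg=4096)
peaks, _ = find_peaks(psd[0], height=psd[0].mean() * 10)
rel_power = psd[0][peaks].sum() / psd[0].sum()

print("per-channel STD:", stds)
print("per-channel peak-to-peak:", ptps)
print("noise peaks (Hz):", freqs[peaks])
print("relative power of noise peaks:", rel_power)
```

In this toy example the noisy channel stands out in both the STD and peak-to-peak metrics, and the 50 Hz peak is flagged in the PSD; a real pipeline would additionally handle epoching and sensor-type-specific scaling.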