This course introduces you to best practices for processing neuroimaging data with a focus on data organization and campus computing resources.
We begin with software installation, HPC account setup, and a discussion of neuroimaging data. We then explore issues of data sharing and reproducibility, emphasizing the BIDS (Brain Imaging Data Structure) standard for naming and organizing neuroimaging data and the process of converting DICOM data into BIDS-compliant datasets.
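To illustrate the kind of organization BIDS prescribes, a minimal BIDS-compliant dataset (the subject and session labels here are hypothetical) might look like:

```
bids_dataset/
├── dataset_description.json
├── participants.tsv
└── sub-01/
    └── ses-01/
        ├── anat/
        │   └── sub-01_ses-01_T1w.nii.gz
        └── func/
            └── sub-01_ses-01_task-rest_bold.nii.gz
```

Every filename encodes the subject, session, and data type, so tools like the ones covered in this course can locate data automatically.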
Before introducing individual data processing pipelines, you'll spend time learning about the campus High-Performance Computing (HPC) cluster, especially its job submission system (SLURM) and Globus for efficient data transfer. Importantly, you'll use containerized applications to process data: DICOM conversion tools, anonymization software, quality assessment tools (e.g., MRIQC), fMRI preprocessing (fMRIPrep), and dMRI preprocessing tools (QSIPrep).
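As a sketch of how these pieces fit together, a SLURM batch script that runs a containerized fMRIPrep job might look like the following (the container image name, resource requests, and dataset paths are hypothetical placeholders, not course-specific values):

```shell
#!/bin/bash
#SBATCH --job-name=fmriprep_sub-01   # name shown in the job queue
#SBATCH --time=24:00:00              # wall-clock time limit
#SBATCH --cpus-per-task=8            # CPU cores for this job
#SBATCH --mem=32G                    # memory for this job

# Run fMRIPrep from a container image (image path is hypothetical);
# arguments follow the standard BIDS-App pattern:
#   <bids_dir> <output_dir> participant
apptainer run --cleanenv fmriprep.sif \
    /path/to/bids_dataset /path/to/derivatives participant \
    --participant-label 01 \
    --nprocs "$SLURM_CPUS_PER_TASK"
```

The script would be submitted with `sbatch`; SLURM reads the `#SBATCH` directives to schedule the job on the cluster.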
Course Credits
3