
Neuroimaging codes

Pipeline for NeuroImaging


Required Programs

These are the programs used in this pipeline. All of them except Mango and a batch-renaming application are run from the terminal.
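
Before starting, it can help to confirm that the command-line tools are available. This is only a minimal sketch: the binary names are taken from the commands used below, and several of them (the ANTs tools, c3d) are invoked through $ANTSPATH or a local ./ path in this pipeline, so adjust the check to match your installation.

# Minimal availability check for the command-line tools used in this pipeline.
# Adjust for tools invoked via $ANTSPATH or a local ./ path rather than the PATH.
for prog in dcm2nii c3d ANTS WarpImageMultiTransform N4BiasFieldCorrection fslval fslroi fslmaths fslmerge fslstats ; do
    if command -v "$prog" >/dev/null 2>&1 ; then
        echo "found:   $prog"
    else
        echo "MISSING: $prog"
    fi
done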


Extract .tar.gz files

gunzip  ./original/*.tar.gz

for i in ./original/*.tar ; do
    sudo tar xopvf $i
done

Convert DICOM files to NIfTI format

This program batch-converts DICOM files into NIfTI format (nii.gz). It also crops the output to remove the neck and non-brain tissue, resulting in a file with a "co" prefix, and attempts to align the images to a standard orientation.

./dcm2nii -a y -d n -e n -f n -g n -i n -p n -r n -x y ./original/*/

find ./original/ -type f -name "co*.nii.gz" -exec cp {} ./data/ \;
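
As a quick sanity check (a minimal sketch that only counts files), the number of cropped images copied into ./data/ should match the number of "co"-prefixed files dcm2nii produced:

# Count the "co"-prefixed images dcm2nii produced and the copies placed in ./data/;
# the two numbers should match before renaming.
find ./original/ -type f -name "co*.nii.gz" | wc -l
ls ./data/co*.nii.gz | wc -l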

At this point, rename the files with BetterRename.app or another program capable of batch-renaming files using regular expressions, so that the names match the format mmu12345_Xwk.nii.gz required by later steps. Use the following regular expression:

(\w{2})+(\d{8})+(\_)+(\d{10})+(\w{6})+(\w{3})+(\d{5})(\d{1})+(\w*)

Replace the matched text with the following, where X is the subject's age at scan (in weeks):

\6\7_Xwk
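
If a command-line alternative to a GUI renamer is preferred, a minimal sketch along the following lines could do the same job. This is an assumption-laden example: it uses GNU sed -E with POSIX character classes in place of \w and \d, mirrors the capture groups above, and X still has to be substituted by hand with each subject's age.

# Hypothetical command-line alternative to a GUI renamer, mirroring the capture
# groups above with POSIX character classes (sed has no \w or \d shorthand).
# X must still be replaced by each subject's age at scan.
for f in ./data/co*.nii.gz ; do
    new=$(basename "$f" .nii.gz | sed -E 's/([A-Za-z0-9_]{2})([0-9]{8})(_)([0-9]{10})([A-Za-z0-9_]{6})([A-Za-z0-9_]{3})([0-9]{5})([0-9])([A-Za-z0-9_]*)/\6\7_Xwk/')
    mv -n "$f" "$(dirname "$f")/${new}.nii.gz"
done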

Rigid affine alignment to a template for AC-PC alignment

This aligns all of the scans within the same general template space, but it does not affect the size or shape of the scan being aligned to the template. Note that it does not explicitly AC-PC align the images and does not change the scan origin the way the acpcdetect program in the ART package does.

TEMPLATEDIR="./Template Directory"   # folder holding the <Template> image; quoted because of the space in the name

for i in ./data/*.nii.gz ; do
    $ANTSPATH/ANTS 3 -m MI["$TEMPLATEDIR"/<Template>.nii.gz,$(dirname $i)/$(basename $i .nii.gz).nii.gz,1,32] -o $(dirname $i)/$(basename $i .nii.gz)ACPC -i 0 --number-of-affine-iterations 1x1x1 --rigid-affine true
    $ANTSPATH/WarpImageMultiTransform 3 $(dirname $i)/$(basename $i .nii.gz).nii.gz $(dirname $i)/$(basename $i .nii.gz)_acpc.nii.gz $(dirname $i)/$(basename $i .nii.gz)ACPCAffine.txt -R "$TEMPLATEDIR"/<Template>.nii.gz
done
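
Because ANTS writes the result as a plain-text ITK transform, a quick way to eyeball that only rotation and translation were estimated (a minimal check; the file names follow the -o prefix used above) is to print the transform parameters:

# Print each estimated transform; for a rigid alignment the first nine Parameters
# values form a rotation matrix (no scaling or shearing) and the last three are the translation.
for t in ./data/*ACPCAffine.txt ; do
    echo "== $t =="
    grep -E '^(Parameters|FixedParameters):' "$t"
done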

Reslice images to 0.35 mm isotropic resolution and set a consistent field of view

for i in ./data/*_acpc.nii.gz ; do
    ./c3d $(dirname $i)/$(basename $i .nii.gz).nii.gz -interpolation Cubic -resample-mm .35x.35x.35mm -trim-to-size 256x256x256vox -verbose -o $(dirname $i)/$(basename $i .nii.gz)_resampled.nii.gz
done
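
To confirm the reslicing worked, c3d's -info command prints each image's matrix size and voxel spacing, which should now read 256x256x256 voxels at 0.35 mm isotropic:

# Print dimensions and voxel spacing for each resampled image.
for i in ./data/*_resampled.nii.gz ; do
    ./c3d "$i" -info
done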

N4ITK Bias Field Correction

This removes the bias field generated by magnetic field inhomogeneity during the MRI scanning procedure. It is done by running N4 three times in succession with increasingly fine parameters (shrink factors of 8, 4, and 2).

for i in ./data/*_resampled.nii.gz ; do
    $ANTSPATH/N4BiasFieldCorrection -d 3 -i $(dirname $i)/$(basename $i .nii.gz).nii.gz -o $(dirname $i)/$(basename $i .nii.gz)_n4.nii.gz -s 8 -b [200] -c [50x50x50x50,0.000001]
    $ANTSPATH/N4BiasFieldCorrection -d 3 -i $(dirname $i)/$(basename $i .nii.gz)_n4.nii.gz -o $(dirname $i)/$(basename $i .nii.gz)_n4.nii.gz -s 4 -b [200] -c [50x50x50x50,0.000001]
    $ANTSPATH/N4BiasFieldCorrection -d 3 -i $(dirname $i)/$(basename $i .nii.gz)_n4.nii.gz -o $(dirname $i)/$(basename $i .nii.gz)_n4.nii.gz -s 2 -b [200] -c [50x50x50x50,0.000001]
done
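
For quality control, N4BiasFieldCorrection can also write out the estimated bias field by passing -o a bracketed pair. A minimal sketch for one scan (the file names here are placeholders, not files the pipeline creates):

# Re-run the finest pass on a single image and also save the estimated bias field
# for visual inspection; both file names below are only examples.
$ANTSPATH/N4BiasFieldCorrection -d 3 -i ./data/example_resampled_n4.nii.gz -o [./data/example_resampled_n4.nii.gz,./data/example_biasfield.nii.gz] -s 2 -b [200] -c [50x50x50x50,0.000001]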

Landmark Guided Template Matching to Generate Region of Interest

Place control-point landmarks in the ROI at the regions of maximal anatomical variability across scans; the script will then take the traced, gold-standard ROI in the templates folder and map it onto each scan. The guided registration script is located here.

TEMPLATEDIR="./data/Template Folder"   # folder holding guidedregistration.sh, the template, and its traced ROI; quoted because of the space

for i in ./data/*_n4.nii.gz ; do
    sh "$TEMPLATEDIR"/guidedregistration.sh "$TEMPLATEDIR"/<Template>.nii.gz "$TEMPLATEDIR"/<Template_roi>.nii.gz $(dirname $i)/$(basename $i .nii.gz).nii.gz $(dirname $i)/$(basename $i .nii.gz)_roi.nii.gz $(dirname $i)/$(basename $i .nii.gz)_OUTPUT 100x100x10 3
done

for i in ./data/*_OUTPUTSEGMENTED.nii.gz ; do
    ./c3d $(dirname $i)/$(basename $i .nii.gz).nii.gz -binarize -o $(dirname $i)/$(basename $i .nii.gz)_binary.nii.gz
done
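
A quick way to confirm that the binarization produced a non-empty mask for every scan is c3d's -voxel-sum command, which for a binary image reports the number of foreground voxels:

# A voxel sum of zero means the registration or binarization failed for that scan.
for i in ./data/*_OUTPUTSEGMENTED_binary.nii.gz ; do
    echo "== $i =="
    ./c3d "$i" -voxel-sum
done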

Learning Based Wrapper to Fix Landmark Matching ROI

At this point, trace a number of the subjects' regions of interest by hand to serve as a gold standard, then run them through the SegAdapter pipeline, which uses machine learning to correct systematic errors introduced during the Landmark Guided Template Matching step.
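
The ./bl command below takes three plain-text list files as its first arguments; these are assumed here to contain one NIfTI path per line, pairing each training scan with its manual (gold-standard) and automatic segmentations. A minimal sketch for building them (the TRAINING folder layout and the *_manual.nii.gz suffix are hypothetical, so match them to your own naming):

# Build the three list files for bl: training images, hand-traced segmentations,
# and the automatic segmentations to be corrected (assumed naming shown).
ls ./data/TRAINING/*_n4.nii.gz > ./data/inputIMAGEFILE.txt
ls ./data/TRAINING/*_manual.nii.gz > ./data/manualSegmentationFile.txt
ls ./data/TRAINING/*_OUTPUTSEGMENTED_binary.nii.gz > ./data/autoSegmentationFile.txt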

./bl ./data/inputIMAGEFILE.txt ./data/manualSegmentationFile.txt ./data/autoSegmentationFile.txt 1 2 4x4x4 .1 500  ./data/TRAINING/training

for i in ./data/*.nii.gz ; do
    ./sa $(dirname $i)/$(basename $i .nii.gz).nii.gz $(dirname $i)/$(basename $i .nii.gz).nii.gz ./data/TRAINING/training $(dirname $i)/$(basename $i .nii.gz)_CORR.nii.gz
done

Compute DICE Coefficient

for i in ./data/*.nii.gz; do
    # c3d's -overlap compares the last two images on the stack, so it must come after
    # the images are loaded; here it reports Dice/Jaccard for label 1 between the
    # landmark-based binary segmentation and the learning-corrected one.
    c3d -verbose $(dirname $i)/$(basename $i .nii.gz).nii.gz $(dirname $i)/$(basename $i .nii.gz)_OUTPUThipp_binary.nii.gz $(dirname $i)/$(basename $i .nii.gz)_CORR.nii.gz -overlap 1 > ./data/$(basename $i .nii.gz).txt
done

Extract ROI Volumes for Analysis

This splits each corrected ROI into left and right halves along the x axis and computes the volume and number of voxels contained within each region of interest for analysis.

find ./data/ -type f -name "*_forANALYSIS.nii.gz" -exec cp -fpv {} ./data/FINAL_ROI/ \;

for i in $(find ./data/FINAL_ROI/ -type f -name "*_CORR.nii.gz"); do
    # Split the image down the middle of the x axis into left and right halves.
    val=$(fslval $i dim1)
    xsize=$(echo "$val/2" | bc)
    fslroi $i $(dirname $i)/$(basename $i .nii.gz)_HPCl.nii.gz 0 $xsize 0 -1 0 -1
    xmin=$xsize; xsize=$(echo "$val-$xmin" | bc)
    fslroi $i $(dirname $i)/$(basename $i .nii.gz)_HPCr.nii.gz $xmin $xsize 0 -1 0 -1
    # Relabel the right half as 2 so the two sides stay distinguishable when recombined.
    fslmaths $(dirname $i)/$(basename $i .nii.gz)_HPCr.nii.gz -mul 2 $(dirname $i)/$(basename $i .nii.gz)_HPCr.nii.gz
    fslmerge -x $(dirname $i)/$(basename $i .nii.gz)_HPC.nii.gz $(dirname $i)/$(basename $i .nii.gz)_HPCr.nii.gz $(dirname $i)/$(basename $i .nii.gz)_HPCl.nii.gz
    # Split the recombined image by label: temp.nii.gz gets the background, while the
    # left (label 1) and right (label 2) ROIs are written back out as binary masks.
    c3d $(dirname $i)/$(basename $i .nii.gz)_HPC.nii.gz -split -oo $(dirname $i)/temp.nii.gz $(dirname $i)/$(basename $i .nii.gz)_HPCl.nii.gz $(dirname $i)/$(basename $i .nii.gz)_HPCr.nii.gz
done
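
As a consistency check (a minimal sketch using fslstats; the file names follow the suffixes generated above), the voxel counts of the left and right pieces should add up to that of the recombined image:

# Report voxel count and volume for the recombined ROI and its two halves;
# the left and right counts should sum to the combined count.
for i in $(find ./data/FINAL_ROI/ -type f -name "*_CORR_HPC.nii.gz"); do
    echo "== $i =="
    fslstats "$i" -V
    fslstats $(dirname $i)/$(basename $i _HPC.nii.gz)_HPCl.nii.gz -V
    fslstats $(dirname $i)/$(basename $i _HPC.nii.gz)_HPCr.nii.gz -V
done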

for i in ./data/FINAL_ROI/*_forANALYSIS.nii.gz; do
    $FSLDIR/bin/fslstats $i -V >$(dirname $i)/$(basename $i .nii.gz).txt
done

for i in ./data/FINAL_ROI/*right.txt; do
    newname=$(echo "$i" | sed 's/_acpc_resampled_n4_OUTPUTSEGMENTED_binary_Corrected_forANALYSIS_right/_right/g')
    mv "$i" "$newname"
done

for i in ./data/FINAL_ROI/*left.txt; do
    newname=$(echo "$i" | sed 's/_acpc_resampled_n4_OUTPUTSEGMENTED_binary_Corrected_forANALYSIS_left/_left/g')
    mv "$i" "$newname"
done

for i in ./data/FINAL_ROI/*forANALYSIS.txt; do
    rm -f $i
done

find ./data/FINAL_ROI/ -type f -name "*.txt" -exec cp -fpv {} ./data/FINAL_ROI/Volumes/ \;

Trim ROIs for 3D Rendering

This trims each ROI to its bounding box plus 5 voxels of padding, which greatly reduces file size and makes 3D visualization much clearer.

for i in ./data/*_Corrected.nii.gz ; do
    ./c3d $(dirname $i)/$(basename $i) -trim 5vox -o $(dirname $i)/$(basename $i .nii.gz)_forRENDERING.nii.gz
done

Organize ROI into a File using R

This script takes the region-of-interest volumes generated above and tabulates them in a single text file with Subject ID, Age, Left Volume, and Right Volume columns.

R
rm(list = ls(all = TRUE))
library(limma)
library(reshape)
library(outliers)
library(psych)
library(doBy)
library(gdata)
library(gplots)
library(lattice)
setwd("./data/FINAL_ROI/Volumes/")
files=list.files(path=".", recursive=TRUE)
all_files=data.frame(files=NULL)
for(i in seq(along=files))
{
    orig=read.table(files[i], header=FALSE, sep=" ")
    orig=rename(orig,c(V1="voxels",V2="volume"))
    all_files=rbind(all_files,orig)
}
all_files=subset(all_files, select=-c(V3))
files=removeExt(files)
all_files=cbind(files,all_files)
data=all_files
allRIGHT=subset(data,grepl("_HPCr",data$files))
allLEFT=subset(data,grepl("_HPCl",data$files))
Subjects=allRIGHT$files
Subjects=as.character(Subjects)
Age=gsub("(\\w+)_(\\d+)(\\w+)", "\\2", Subjects)
Subjects=gsub("(\\w+)_(\\d+)(\\w+)", "\\1", Subjects)
R_Voxels=allRIGHT$voxels
R_Volume=allRIGHT$volume
L_Voxels=allLEFT$voxels
L_Volume=allLEFT$volume
Hippocampal_Volumes=cbind(Subjects,Age,R_Volume,L_Volume)
write.table(Hippocampal_Volumes, file="./data/FINAL_ROI/Volumes/Hippocampal_ROI.txt",sep=",",row.names=FALSE,col.names=TRUE)

Yet to be implemented