In progress…


Single Subject Data Processing usually involves several preprocessing steps to reduce the influence of non-experimental sources of variance on the data. These steps usually include correcting for differences in slice acquisition time within each volume (TR), realigning each TR over the time course to correct for motion, coregistering the EPI (functional) images with the high-resolution (structural) images, normalizing the data to a standard template, and smoothing the data with a FWHM kernel.
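In AFNI, each of these preprocessing steps is carried out by an underlying command-line program. The mapping below is the conventional one, printed as a quick-reference table (a sketch — several steps have alternative programs):

```shell
# Conventional AFNI program behind each preprocessing step; stored in a
# variable and printed as a quick-reference table.
map=$(cat <<'EOF'
slice timing      -> 3dTshift
motion correction -> 3dvolreg
coregistration    -> align_epi_anat.py
normalization     -> @auto_tlrc
smoothing         -> 3dmerge (or 3dBlurToFWHM)
regression        -> 3dDeconvolve
EOF
)
echo "$map"
```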

After these steps, fMRI data acquired during a functional task are fit to a linear regression model representing the onsets and durations of each condition. It is this last step that finally generates the "pretty pictures" of activation on the brain.

Single Subject Processing in AFNI

AFNI has both a Graphical User Interface (GUI) and a Command-Line Interface (CLI) for performing all of these single-subject processing steps. The GUI option is uber_subject.py, which requires the installation of PyQT, whereas the CLI option is afni_proc.py, which should run on most computers with a fresh install of AFNI.

Installing Dependencies for uber_subject.py

These instructions are specific to Mac OS X 10.7 or later.

  1. Install AFNI
  2. Install Xcode (Free from AppStore)
  3. Install HomeBrew
    1. ruby -e "$(curl -fsSL …)"
  4. Install PyQT
    1. brew install pyqt
  5. Set Python Path
    1. export PYTHONPATH=/usr/local/lib/python2.7/site-packages:$PYTHONPATH
    2. Consider adding the above line to your .bashrc or .profile (assuming Bash Shell)
  6. Run any of the uber_* programs
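Step 5 only affects the current terminal session; to make it permanent, the export line can be appended to your shell startup file. A minimal sketch, assuming a Bash shell and the Homebrew Python 2.7 path used above:

```shell
# Persist the PYTHONPATH setting so new Bash sessions pick it up without
# re-exporting; creates ~/.bashrc if it does not yet exist.
echo 'export PYTHONPATH=/usr/local/lib/python2.7/site-packages:$PYTHONPATH' >> ~/.bashrc
grep -c 'PYTHONPATH' ~/.bashrc   # confirm at least one matching line
```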

Specify Analysis using GUI with uber_subject.py

I've launched uber_subject.py and specified information about my files and design. In particular, I've gone with the default options for processing blocks (time shift, co-register, normalize, motion correct, smooth, mask, and regress), specified an anatomic datafile (MPRAGE), and added two functional runs. I've written out four stimulus files, representing four conditions. The onsets in these files are specified in SECONDS, and within each file there is one row for each functional run.
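As a concrete sketch of that timing-file format (the filename matches those used in the afni_proc.py call later on, but the onset values here are invented for illustration; in AFNI's format a '*' marks a run in which the condition never occurs):

```shell
# Write an example stimulus timing file: one row per functional run,
# onsets in seconds (values are made up for illustration).
cat > times-afni_cond1.txt <<'EOF'
12.0 48.5 110.0 161.5
30.0 75.5 142.0
EOF
wc -l < times-afni_cond1.txt   # two rows, one per run
```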

Then it's on to specifying the options. I'll set an outlier limit of 10% (0.1), meaning that if 10% of the voxels in a TR are outliers, that TR is censored. I'll allow two CPUs to be used for processing, and fit the data using both a standard regression model and a REML estimator. The data will be coregistered using the LPC cost function and normalized to the TT_N27 (Colin27) brain. I'll set some contrasts of interest and remove the first 6 TRs of each run to allow for scanner warmup. Finally, the data will be blurred with a 6 mm kernel and censored at motion over 0.3 mm or 0.3 degrees (roughly).

Specify Analysis in afni_proc.py

Whether you use uber_subject.py or specify your analysis directly to afni_proc.py, you will end up with something like what is printed below. This syntax (generated by uber_subject.py) does all of the things described above, with the advantage that you could have just written it at the command line and scripted it out. It can be useful to use uber_subject.py to set up your initial options and then modify the resulting afni_proc.py commands as you see fit. There are a number of options in afni_proc.py that are not available (yet) in uber_subject.py.

set top_dir   = demo
set anat_dir  = $top_dir/anat
set epi_dir   = $top_dir/func
set stim_dir  = $top_dir/stim_times

# set subject and group identifiers
set subj      = Subject1
set group_id  = Controls

# run afni_proc.py to create a single subject processing script
afni_proc.py -subj_id $subj                                                   \
        -script proc.$subj -scr_overwrite                                     \
        -blocks tshift align tlrc volreg blur mask scale regress              \
        -copy_anat $anat_dir/T1MEMPRAGEs021a1001.nii.gz                       \
        -tcat_remove_first_trs 0                                              \
        -dsets                                                                \
            $epi_dir/fMRIFastLocalizer1s004a001.nii.gz                        \
            $epi_dir/fMRIFastLocalizer2s006a001.nii.gz                        \
        -align_opts_aea -giant_move                                           \
        -volreg_align_to first                                                \
        -volreg_align_e2a                                                     \
        -volreg_tlrc_warp                                                     \
        -blur_size 6.0                                                        \
        -regress_stim_times                                                   \
            $stim_dir/times-afni_cond1.txt                                    \
            $stim_dir/times-afni_cond2.txt                                    \
            $stim_dir/times-afni_cond3.txt                                    \
            $stim_dir/times-afni_cond4.txt                                    \
        -regress_stim_labels                                                  \
            Cond1 Cond2 Cond3 Cond4                                           \
        -regress_basis 'GAM'                                                  \
        -regress_censor_motion 0.3                                            \
        -regress_censor_outliers 0.1                                          \
        -regress_opts_3dD                                                     \
            -jobs 2                                                           \
            -gltsym 'SYM: Cond1 -Cond2' -glt_label 1 Cond1-Cond2              \
            -gltsym 'SYM: Cond1 -Cond3' -glt_label 2 Cond1-Cond3              \
            -gltsym 'SYM: Cond2 -Cond3' -glt_label 3 Cond2-Cond3              \
            -gltsym 'SYM: 0.333*Cond1 +0.333*Cond2 +0.333*Cond3'              \
                -glt_label 4 mean.CCC                                         \
            -gltsym 'SYM: Cond1 -0.5*Cond2 -0.5*Cond3' -glt_label 5 C-CC      \
        -regress_reml_exec                                                    \
        -regress_make_ideal_sum sum_ideal.1D                                  \
        -regress_est_blur_epits
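Once the command above writes proc.$subj, the generated script is run with tcsh, and the usual pattern logs all output to a file. A sketch (the subject ID is the one set above; the command is echoed rather than executed here, since actually running it requires AFNI and the input data):

```shell
# Compose the standard invocation of the generated tcsh script;
# '|&' is tcsh syntax that sends both stdout and stderr to tee.
subj=Subject1
cmd="tcsh -xef proc.$subj |& tee output.proc.$subj"
echo "$cmd"
```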

Reviewing Single Subject Results

afni_proc.py will copy all of the input files into a new results directory and then run the processing on those copies. Should you want to change options and re-run, you can simply delete that results folder and start again without worrying about deleting your original files.

Inside the results folder is a series of scripts that you should run. The @ss_review_driver script is likely the one to start with. Launching it will take you through a series of steps to check your data: it starts with the censored outliers and motion, checks the registration accuracy, and even pulls up activation maps for inspection. I highly recommend keeping a log of this information in a lab notebook or database for later reference!
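For that notebook log, the results directory also contains a non-interactive companion script, @ss_review_basic, whose text summary is easy to capture with tee. A runnable sketch of the logging pattern (echo stands in for ./@ss_review_basic, which only exists inside a real results folder):

```shell
# Log pattern: tee prints the review summary to the screen and saves a
# copy for the lab notebook. 'echo' is a stand-in for './@ss_review_basic'.
echo "subject review summary" | tee out.ss_review.Subject1.txt
```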

afni_singlesubject.txt · Last modified: 2014/07/13 10:19 by pmolfese