From Center for Cognitive Neuroscience

Back to all things Hoffman2

Back to all things MATLAB

If you have a bunch of subjects to process through SPM, why tie up your laptop slowly running each one through the GUI? And why consume a MATLAB license on rote work that compiled MATLAB code can do without a license?

Use Hoffman2's strengths and run that processing as one (or more!) jobs to get it done quickly and efficiently.

Create the Batch File(s)

  1. Launch SPM
  2. Click on the "Batch" button (to the left of "Quit").
  3. In the new window, go "File" > "New batch" to start with a new batch file.
  4. Using the "SPM" menu at the top of the window, add and modify the appropriate steps you would like to take and specify which data should be worked with.
  5. When you are satisfied with how things are set up, click "File" > "Save batch" and give your batch file a name (ending with ".mat").
  6. This is the basis for your SPM job.

After creating a Batch file for a single subject, those individuals comfortable with MATLAB programming should be capable of reading the contents of the Batch file and understanding how to apply its settings to N subjects with a few "for" loops and directory listings. We highly recommend that you save yourself the carpal tunnel from clicking and leverage the power of scripting.
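As a sketch of that scripting approach (the directory layout, subject names, and the realign step shown here are all hypothetical; the exact fields to modify depend on which SPM modules your batch contains), a loop like this could generate one batch file per subject from a single template:

```matlab
% Sketch: generate per-subject batch files from a single template batch.
% Hypothetical assumptions: the template was saved from the Batch editor
% (so it contains the standard 'matlabbatch' cell array), and each
% subject's functional images live under /u/home/you/data/<subject>/.
load('template_batch.mat');              % loads 'matlabbatch'
template = matlabbatch;
subjects = {'sub01', 'sub02', 'sub03'};

for i = 1:numel(subjects)
    matlabbatch = template;              % start from a fresh copy
    % Point the (hypothetical) realign step at this subject's scans.
    matlabbatch{1}.spm.spatial.realign.estwrite.data{1} = ...
        cellstr(spm_select('FPList', ...
            fullfile('/u/home/you/data', subjects{i}), '^f.*\.nii$'));
    save(sprintf('batch_%s.mat', subjects{i}), 'matlabbatch');
end
```

Each resulting .mat file can then be handed to the compiled SPM executable, one per job.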

Create the Wrapper Script

SPM is a wonderful tool because it can be compiled into a standalone executable using SPM8's own "make_exec" script. On Hoffman2, we have compiled it and placed it at

 /u/project/CCN/apps/spm_exec/spm8

Now you need to create a wrapper script that will call this executable properly. The example shown below may be found at

 /u/home/FMRI/apps/examples/spm/run_spm_job.sh

#!/bin/bash
qsub <<CMD
# Note: each "#$" directive must start at the beginning of the line (no leading spaces)
# Use current working directory
#$ -cwd
# Error stream is merged with the standard output
#$ -j y
# Use the bash shell for job execution
#$ -S /bin/bash
# Use your normal environment variables in the job
#$ -V
# Request 1 GB of RAM and an estimated 2 hours of runtime
#$ -l h_data=1G,h_rt=2:00:00
# Only email on abort
#$ -m a
# Name the job "spm"
#$ -N spm

# Load the module environment
. /u/local/Modules/default/init/modules.sh

# Load the newest MATLAB
module load matlab

# Run the compiled version of SPM8 on your batch file
/u/project/CCN/apps/spm_exec/spm8 "batch" "PATH_TO_YOUR_BATCH_FILE"
CMD


As always, normal job submission guidelines should be followed when creating this file, such as setting the time and memory limits appropriately.

The words "PATH_TO_YOUR_BATCH_FILE" should also be replaced with the actual full path to the .mat batch file you saved previously and want processed.
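For example, if your batch file were saved at the (hypothetical) path /u/home/joe/spm/sub01_batch.mat, the last line of the wrapper would become:

```shell
/u/project/CCN/apps/spm_exec/spm8 "batch" "/u/home/joe/spm/sub01_batch.mat"
```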

Make this wrapper script executable using the command

$ chmod 750 /path/to/your/wrapper/script

filling in the appropriate path to your script.

Submit the Job

Now just execute the wrapper script to submit the job. If we were running our example, we would do

 $ cd /u/home/FMRI/apps/examples/spm
 $ ./run_spm_job.sh

Wait for the Job to Run

Remember how to check on job status?
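As a quick reminder, the standard SGE/UGE command for listing your own jobs works on Hoffman2:

```shell
# List the status (queued, running, error) of all of your jobs
qstat -u $USER
```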

Using Interactive mode

For spm8 you need this in your .bashrc file:
export MATLABPATH=/u/project/CCN/apps/spm8
For spm12 you need this in your .bashrc file:
export MATLABPATH=/u/project/CCN/apps/spm12
 1. Start an interactive session via qrsh (for example):
 qrsh -l h_data=4G,h_rt=8:00:00 -pe shared 2

 2. Load the MATLAB module.  Use matlab/8.4 for spm8 and matlab/9.1 for spm12:
 module load matlab/9.1

 3. Set the MATLABPATH environment variable:
 export MATLABPATH=/u/project/CCN/apps/spm12

 4. Start MATLAB:
 matlab

 5. Start spm from the MATLAB prompt:
 >> spm

View the Results

Use SPM or whichever tool is appropriate to view the results.

Going Further

If you created a single batch file to process 100 subjects, that would work, but it would take a very long time to run through each subject in sequence.

But if you created 10 batch files with 10 subjects each and ran them as 10 separate jobs (the compiled version of SPM does not consume MATLAB licenses), the processing would finish roughly 10 times faster.
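As a sketch of that approach (a dry run: the batch-file names are hypothetical stand-ins, the `echo` should be removed to actually submit, and `module load matlab` should be run first so -V carries MATLAB's libraries into each job), a shell loop can submit one job per batch file:

```shell
#!/bin/bash
# Dry run: create stand-in batch files so the loop has something to list.
# In real use these would be the .mat files you saved from the Batch editor.
touch batch_sub01.mat batch_sub02.mat

for f in batch_*.mat; do
    # Remove 'echo' to actually submit; -b y lets qsub run a binary directly.
    echo qsub -cwd -j y -S /bin/bash -V \
        -l h_data=1G,h_rt=2:00:00 -m a -N "spm_${f%.mat}" \
        -b y /u/project/CCN/apps/spm_exec/spm8 batch "$PWD/$f"
done
```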

Consider parallelizing your jobs when working at scale.