How do I use Matlab's Distributed Computing features?

Matlab Distributed Computing Server (MDCS) is installed for Matlab versions R2014a, R2015a, R2015b, R2016a, and R2017a.

NOTE: the commands used to submit jobs have changed with R2017a. This page describes the commands for R2016a and earlier. For R2017a, run "configCluster" in Matlab and follow the printed instructions.

With this you can, for example, submit jobs directly to our job queue scheduler without having to use Slurm's commands directly. To do this, log in to the cluster's login node, e.g. with SSH (with X-forwarding enabled) or, perhaps more efficiently, with Thinlinc (see our Thinlinc guide).

Guide: Getting Started with Serial and Parallel MATLAB on Tintin

When logged in, load the matlab module, e.g. with "module load matlab/R2014a", and launch the GUI with just "matlab". A simple test case that can be run is the following (this is just one very simple example; the PDF guide linked above has more detailed information):

>> configCluster
>> ClusterInfo.clear
>> ClusterInfo.setWallTime('00:10:00')
>> ClusterInfo.setProjectName('b2015999')
>> ClusterInfo.setQueueName('node')
>> c = parcluster
>> job = c.batch(@parallel_example, 1, {90, 5}, 'pool', 15)
>> job.wait
>> job.fetchOutputs{:}
>> job.delete
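
Note that batch submits the job asynchronously, so you do not have to keep the Matlab session open while the job runs; you can reconnect to it later through the cluster object. A minimal sketch (the job index and states shown will of course differ on your system):

>> c = parcluster;
>> c.Jobs              % list your jobs under this cluster profile
>> job = c.Jobs(end);  % reconnect to the most recently submitted job
>> job.State           % e.g. 'queued', 'running' or 'finished'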

where parallel_example.m is a file with the following matlab function:

function t = parallel_example(nLoopIters, sleepTime)
  % Time a parfor loop that sleeps in each iteration
  t0 = tic;
  parfor idx = 1:nLoopIters
    A(idx) = idx;
    pause(sleepTime);  % simulate some work per iteration
  end
  t = toc(t0);
end

This will schedule a 16-task node job (15 workers + 1) on Tintin under the staff project (so you will have to change this to your own project name). For the moment, jobs are hard-coded to be node jobs. This means that if you instead request 17 tasks (16 + 1), you will get a two-node job, but only one core will be used on the second node. In that case you would request 32 tasks (31 + 1) instead.
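To illustrate, a full two-node job on Tintin's 16-core nodes would be requested as 31 pool workers plus the one task used by Matlab itself (same function and arguments as in the example above):

>> job = c.batch(@parallel_example, 1, {90, 5}, 'pool', 31)  % 31 + 1 = 32 tasks, i.e. 2 full nodes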

The curly brackets {90, 5} in the example contain the input arguments for the function to be called, in this example nLoopIters=90 and sleepTime=5.
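In other words, the batch call corresponds to evaluating the function with those arguments on the cluster, and the second argument (1) is the number of outputs to return. The equivalent call run directly in your own session would be:

>> t = parallel_example(90, 5)  % returns the elapsed time t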

For more information about Matlab's Distributed Computing features please see Matlab's HPC Portal.