HOWTO: Submit multiple jobs using parameters

Often users want to submit a large number of jobs all at once, with each job using different parameters.  These parameters could be anything, including the path of a data file or different input values for a program.  This how-to will show you how you can do this using a simple Python script, a CSV file, and a template script.  You will need to adapt this advice to your own situation.

Consider the following batch script:

#!/bin/bash
#SBATCH --ntasks-per-node=2
#SBATCH --time=1:00:00
#SBATCH --job-name=week42_data8

# Copy input data to the node's fast local disk
cp ~/week42/data/source1/data8.in $TMPDIR

cd $TMPDIR

# Run the analysis
full_analysis data8.in data8.out

# Copy results to proper folder
cp data8.out ~/week42/results

Let's say you need to submit 100 of these jobs on a weekly basis.  Each job uses a different data file as input.  You receive data from two different sources, and thus your data is located within two different folders.  All of the jobs from one week need to store their results in a single weekly results folder.  The output file name is based upon the input file name.

Creating a Template Script

As you can see, this job follows a general template.  There are three main parameters that change in each job:

  1. The week 
    • Used as part of the job name
    • Used to find the proper data file to copy to the node's local disk
    • Used to copy the results to the correct folder
  2. The data source
    • Used to find the proper data file to copy to the node's local disk
  3. The data file's name
    • Used as part of the job name
    • Used to find the proper data file to copy to the node's local disk
    • Used to specify both the input and output file to the program full_analysis
    • Used to copy the results to the correct folder

If we replace these parameters with variables, prefixed by the dollar sign $ and surrounded by curly braces { }, we get the following template script:

Slurm does not support using variables in #SBATCH directives, so we need to set the job name in the submit command.
#!/bin/bash
#SBATCH --ntasks-per-node=2
#SBATCH --time=1:00:00

# Copy input data to the node's fast local disk
cp ~/${WEEK}/data/${SOURCE}/${DATA}.in $TMPDIR
cd $TMPDIR

# Run the analysis 
full_analysis ${DATA}.in ${DATA}.out

# Copy results to proper folder
cp ${DATA}.out ~/${WEEK}/results

Automating Job Submission

We can now use the sbatch --export option to pass parameters to our template script; variables passed this way are set in the job's environment.  The format for passing parameters is:

sbatch --job-name=name --export=var_name=value[,var_name=value...] script_name
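
For example, a single job from our scenario could be submitted by hand like this (using the values from our original batch script):

sbatch --job-name=week42_data8 --export=WEEK=week42,SOURCE=source1,DATA=data8 template_1.sh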

Submitting 100 jobs using the sbatch --export option manually does not make our task much easier than modifying and submitting each job one by one.  To complete our task we need to automate the submission of our jobs.  We will do this using a Python script that submits our jobs with parameters it reads from a CSV file.

Note that Python was chosen for this task for its general ease of use and readability -- if you feel more comfortable using another scripting language, feel free to translate this Python code for your own use.

Here is the script that submits the jobs using the parameters:

The script below needs Python 3 to run.
Try python3 submit_jobs.py to run it.
#!/usr/bin/python3
## file: submit_jobs.py
import os
import csv, subprocess

home = os.path.expanduser('~')
parameter_file_full_path = home + "/testDir/jobs_mult_params/week42/job_params.csv"

with open(parameter_file_full_path, mode='r', newline='', encoding='utf-8-sig') as csvfile:
    reader = csv.reader(csvfile)
    for job in reader:
        submit_command = ("sbatch "
            "--job-name={0}_{1} "
            "--export=WEEK={0},SOURCE={1},DATA={2} template_1.sh").format(*job)

        print(submit_command)  # Print each submit command while testing
        # Uncomment the following 3 lines when done testing to submit the jobs
#        exit_status = subprocess.call(submit_command, shell=True)
#        if exit_status != 0:  # Check to make sure the job submitted
#            print("Job {0} failed to submit".format(submit_command))
print("Done submitting jobs!")

This script will open the CSV file specified by the variable parameter_file_full_path and step through the file line by line, submitting a job for each line using that line's values.  If the submit command returns a non-zero exit code, usually indicating the job was not submitted, we print a message to the display.  The jobs will be submitted using the general format:

sbatch --export=WEEK=WEEK_VALUE,SOURCE=SOURCE_VALUE,DATA=DATA_VALUE template_1.sh

Where WEEK_VALUE, SOURCE_VALUE, and DATA_VALUE are the first, second, and third comma-separated values in the CSV file's current row, and template_1.sh is the name of our template script.

Creating a CSV File

We now need to create a CSV file with parameters for each job.  This can be done with a regular text editor, with a spreadsheet editor such as Excel, or with a short script, as sketched after the example below.  By default you should use commas as your delimiter.

Here is our CSV file with parameters:

week42,source1,data1
week42,source1,data2
week42,source1,data3
...
week42,source2,data98
week42,source2,data99
week42,source2,data100
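
If you prefer to generate this file programmatically, here is a minimal sketch.  It writes job_params.csv to the current directory (move it to wherever parameter_file_full_path points), and the even 50/50 split of data files between source1 and source2 is just an assumption -- adjust it to match where your data actually lives.

#!/usr/bin/python3
## file: make_params.py
import csv

with open('job_params.csv', mode='w', newline='') as csvfile:
    writer = csv.writer(csvfile)
    for i in range(1, 101):
        # Assumption: data1-data50 come from source1 and data51-data100
        # come from source2; change this split to match your data layout.
        source = 'source1' if i <= 50 else 'source2'
        writer.writerow(['week42', source, 'data{0}'.format(i)])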

The submit script would read the first row of this CSV file, then form and execute the command:

sbatch --job-name=week42_data1 --export=WEEK=week42,SOURCE=source1,DATA=data1 template_1.sh

Submitting Jobs

Once all of the above is done, all you need to do to submit your jobs is make sure the CSV file is populated with the proper parameters and run the automatic submission script.

Before submitting a large number of jobs for the first time using this method, it is recommended that you test out your implementation with a small number of jobs.
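
For example, you could point parameter_file_full_path at a short test CSV such as the one below (the rows shown are just an illustration) and run the script with the submission lines still commented out, so the sbatch commands are printed rather than executed:

week42,source1,data1
week42,source2,data98

Once the printed commands look correct, uncomment the submission lines and run the script again to submit the jobs.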