A Quick Word on Files

Condor can work with files that live on the NFS server (castor, loxcyc, rattus) as well as files local to the execute host. If you plan to work with a ton of small files or a handful of large files, feel free to use the NFS server as the source for your files. If you have a bunch of large files to process, you'll likely be better off telling Condor to transfer the files to the execute host before running your job. Not only will you get better performance, but everyone else will still be able to use the NFS server, allowing you to save face at the same time... Trust me, I speak from experience.

To transfer files to the execute host, use the following directives:

should_transfer_files = YES
when_to_transfer_output = ON_EXIT
transfer_input_files = /full/path/to/infile1,/full/path/to/infile2,...

This can be done globally by placing these directives at the top of the recipe, or on a per-job basis by placing them before each "queue" directive.
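For example, here is a minimal sketch of the per-job form inside a recipe that already defines an executable; sample1.fastq and sample2.fastq are made-up input files, not part of any template below:

should_transfer_files = YES
when_to_transfer_output = ON_EXIT

# each queue statement uses the most recent transfer_input_files value
transfer_input_files = /full/path/to/sample1.fastq
queue

transfer_input_files = /full/path/to/sample2.fastq
queue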

Bowtie Template

universe=vanilla

environment="BOWTIE_INDEXES=/proj/genome/programs/bowtie-0.12.1/indexes OUTDIR=/full/path/to/output"

executable=/proj/genome/programs/bowtie-0.12.1/bowtie
arguments=hg19sp75spike -v 2 -k 11 -m 10 --best --strata -p 4 -q $(OUTDIR)/1184_1_1.fastq --un $(OUTDIR)/1184_1_1.unmapped.fa --max $(OUTDIR)/1184_1_1.repeat.fa $(OUTDIR)/1184_1_1.bowtie.txt

log=bowtie.$(Process).log
output=bowtie.$(Process).out
error=bowtie.$(Process).err

request_cpus = 4
request_memory = 8000
request_disk = 0

queue

It's important to set the "request_cpus" variable to match the -p option passed to bowtie. It's also probably a good idea to set "request_memory" to a more realistic value; request_memory is specified in megabytes, so 8000 is almost 8 GB.
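For instance (these numbers are illustrative, not measured), a bowtie run with -p 2 that peaks around 2 GB of memory would be matched by:

request_cpus = 2
request_memory = 2000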

ERANGE Template

universe=vanilla

environment="PYTHONPATH=/path/to/cistematic/root CISTEMATIC_ROOT=/path/to/cistematic/root ERANGEPATH=/path/to/erange"

executable=/bin/sh

log=rrpa.$(Process).log
output=rrpa.$(Process).out
error=rrpa.$(Process).err

getenv = true

arguments = $(ERANGEPATH)/doc/runRNAPairedAnalysis.sh hsapiens 1184_1_1 /proj/genome/gbdb/hg19/hg19.rmsk.db
queue

Tophat Template

universe=vanilla

environment = "PATH=/bin:/usr/bin:/usr/local/bin:/woldlab/castor/proj/genome/programs/${BOWTIE_DIR}"

executable=/usr/bin/python

error = ${BASE_DIR}/logs/tophat.$(Process).err
output = ${BASE_DIR}/logs/tophat.$(Process).out
log = ${BASE_DIR}/logs/tophat.$(Process).log

request_cpus = 6
request_memory = 4000
request_disk = 0

should_transfer_files = YES
when_to_transfer_output = ON_EXIT

transfer_input_files = ${ALL_FASTQ_FILES}
transfer_output_files = XFER/accepted_hits.bam
transfer_output_remaps = "accepted_hits.bam = ${RESULTS_DIR}/accepted_hits.bam"

arguments = "${TOPHAT_DIR}/tophat --bowtie1 -o XFER -p 6 -G ${GENES_DIR}/${GTF_FILE} --transcriptome-index ${GENES_DIR}/ --no-novel-juncs --library-type fr-unstranded ${INDEX_DIR}/${GENOME_BASE} ${FASTQ_FILES_1} ${FASTQ_FILES_2}"
queue
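Note that condor_submit's own macros use the $(NAME) form (like $(Process) above), so the ${NAME} tokens in this template are placeholders to fill in before the file is submitted. One way to do that, sketched here with made-up paths, is to define the values as submit macros at the top of the recipe and reference them with $(NAME) throughout:

# hypothetical locations; substitute your own paths
BASE_DIR=/full/path/to/project
TOPHAT_DIR=/full/path/to/tophat
BOWTIE_DIR=bowtie-0.12.1

environment = "PATH=/bin:/usr/bin:/usr/local/bin:/woldlab/castor/proj/genome/programs/$(BOWTIE_DIR)"
error = $(BASE_DIR)/logs/tophat.$(Process).err

The remaining ${NAME} references would be rewritten as $(NAME) in the same way.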
