#!/usr/bin/env bash
###################################################################
# ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ #
###################################################################
source $FSLDIR/etc/fslconf/fsl.sh
echo "Received options:" $@
args=`echo $@`
###################################################################
# Р Ћ 2015-2017
###################################################################
<< XCPENGINE
xcpEngine reads image processing pipelines (design files) and
deploys an assortment of multimodal analytic tools to run those
pipelines.
Development team:
Lead: Rastko Ciric
rastko@mail.med.upenn.edu
Maintainers and contributors:
Azeez Adebimpe
adebimpe@upenn.edu
Matt Cieslak
mcieslak@upenn.edu
Multimodal: Adon FG Rosen
adrose@mail.med.upenn.edu
PI: Ted Satterthwaite
sattertt@mail.med.upenn.edu
This program is free software: you can redistribute it and/or
modify it under the terms of the GNU General Public License as
published by the Free Software Foundation.
Any third-party programs distributed with xcpEngine are
distributed under their own licenses.
As a courtesy, we ask that you include the following citation
in publications that make use of this program:
Rastko Ciric, Adon F. G. Rosen, Guray Erus, Matthew Cieslak, Azeez Adebimpe,
Philip A. Cook, Danielle S. Bassett, Christos Davatzikos, Daniel H. Wolf,
Theodore D. Satterthwaite. Mitigating head motion artifact in functional
connectivity MRI. Nature Protocols.
Dependencies:
* FSL (available from FMRIB)
* R and the following packages (available from CRAN)
* RNifti
* pracma
* optparse
* signal
* ANTs (available from GitHub)
* AFNI (available from AFNI/NIfTI server)
* Convert3D (available from ITK-Snap server)
* Other packages may be required for implementation of specific
strategies, e.g. for denoising.
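(For example, assuming a standard R installation with CRAN access,
the listed R packages can typically be obtained with:
install.packages(c("RNifti", "pracma", "optparse", "signal")) )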
XCPENGINE
###################################################################
[[ -z ${XCPEDIR} ]] \
   && echo "[XCPEDIR is undefined. Aborting.]" && exit 1
source ${XCPEDIR}/core/constants
source ${XCPEDIR}/core/functions/library.sh
source ${XCPEDIR}/core/functions/cluster_submit
usage(){
   cat << endstream
___________________________________________________________________

Usage: `basename $0` -d <design> <options>

Compulsory arguments:
 -d : Primary design file for pipeline:
      The design file specifies the pipeline modules to
      be included in the current analysis and configures
      any parameters necessary to run the modules.
 -c : Cohort file for pipeline input:
      A comma-separated catalogue of the analytic sample.
      Each row corresponds to a subject, and each column
      corresponds either to an identifier or to an input.
 -o : Parent directory for pipeline output:
      A valid path on the current filesystem specifying
      the directory wherein all output from the current
      analysis will be written.

Optional arguments:
 -m : Execution mode:
      Input can be either 's' (for serial execution on a
      single machine) [default], 'c' (for execution on a
      computing cluster), or a path to a file (for execution
      on a computing cluster, subject to the specifications
      defined in the file).
 -i : Scratch space for pipeline intermediates:
      Some systems operate more quickly when temporary
      files are written in a dedicated scratch space. This
      argument enables a scratch space for intermediates.
 -r : Root directory for inputs:
      If all paths defined in the cohort file are defined
      relative to a root directory, then this argument will
      define the root directory. Otherwise, all paths will
      be treated as absolute.
 -t : Trace:
      Integer value (0-3) that indicates the level of
      verbosity during module execution. Higher levels
      reduce readability but provide useful information
      for troubleshooting.
      0 [default]: Human-readable explanations of
         processing steps and error traces only.
      1: Explicitly trace module-level computations.
         Print a workflow map when execution completes.
      2: Explicitly trace module- and utility-level
         computations.
      3: All commands called by the module and all
         children are traced and explicitly replicated
         in a log file.
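
Example (illustrative only; the design path, cohort path, output
path, and cohort column names below are placeholders rather than
shipped defaults):
 `basename $0` -d /path/to/design.dsn -c /path/to/cohort.csv -o /path/to/output -m s
 where cohort.csv might contain, for instance:
  id0,img
  sub-01,func/sub-01_task-rest_bold.nii.gz
  sub-02,func/sub-02_task-rest_bold.nii.gz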
endstream
}
###################################################################
# Set defaults (serial/local run), quiet
###################################################################
mode=s
trace=0
###################################################################
# Parse arguments
###################################################################
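# If the first positional argument is the literal, unexpanded string
# "$@" (as can happen when a wrapper script or container entrypoint
# forwards its arguments without expansion), discard it before
# option parsing.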
if [ "$1" == '"$@"' ]
then
shift
fi
while getopts "d:i:c:o:m:r:t:a:uhv" OPTION
do
case $OPTION in
d)
indesign=${OPTARG}
;;
i)
scratch=${OPTARG}
;;
c)
incohort=${OPTARG}
;;
o)
out_super=${OPTARG}
;;
m)
mode=${OPTARG}
;;
r)
inroot=${OPTARG}
;;
t)
trace=${OPTARG}
(( trace > 3 )) && trace=3
;;
v)
cat ${XCPEDIR}/core/VERSION
exit 1
;;
a)
asgt=( ${asgt[@]} -a ${OPTARG} )
;;
*)
usage
exit 1
;;
esac
done
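# Discard the options already consumed by getopts, leaving any
# remaining positional arguments in place.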
shift $((OPTIND-1))
###################################################################
# Validate and parse the design and cohort files.
# All pipeline variables are available after validation.
###################################################################
echo \$XCPEDIR is $XCPEDIR
export XCPEDIR
###################################################################
# Check the inputs
#python ${XCPEDIR}/checks/check_inputs.py ${args}
#if [ $? -eq 0 ]; then
# echo OK
#else
#echo Your inputs would result in a failed pipeline. Exiting.
#exit 1
#fi
echo \
"
Constructing a pipeline based on user specifications
····································································"
source ${XCPEDIR}/core/validateDesign
source ${XCPEDIR}/core/validateIntermediate
source ${XCPEDIR}/core/validateCohort
source ${XCPEDIR}/core/validateOutput
source ${XCPEDIR}/core/validateMode
echo \
"····································································"
###################################################################
# Obtain a timestamp and uuid for the current deployment of the engine.
###################################################################
export xcp_date=$(date +%Y%m%d%H%M)
export xcp_uuid=$(uuidgen)
echo "Date: $xcp_date"
echo "UUID: $xcp_uuid"
###################################################################
# Prepare to analyse the entire pipeline.
###################################################################
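# ${pipeline} is defined in the design file as a comma-separated list
# of module names; split it here into an ordered array of modules.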
modules=( ${pipeline//,/ } )
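# Initialise the Sentry SDK used for error reporting.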
python -c "import sentry_sdk; sentry_sdk.init('https://34b713b3ba2240329b2b671685006e94@sentry.io/1854243')"
###################################################################
# Verify that all dependencies are installed.
# Obtain version information for all general and specific
# dependencies.
###################################################################
source ${XCPEDIR}/core/checkDependenciesXCP
mkdir -p ${out_group}/dependencies
###################################################################
# Write version information to the version tracking file.
###################################################################
versions=${out_group}/dependencies/${analysis}_${xcp_date}_pipelineDescription.json
printx ${XCPEDIR}/core/ENVCHECK >> ${versions}
###################################################################
# ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ #
#
# EXECUTE THE PIPELINE
#
# ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ ⊗ #
###################################################################
printx ${XCPEDIR}/core/CONSOLE_HEADER
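# Flag the prestats module (pipeline context 1) for re-execution.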
prestats_rerun[1]=1
###################################################################
# Run the localiser first.
###################################################################
cxt=0
level=subject
module=LOCALISER
unset ids
printx ${XCPEDIR}/core/CONSOLE_MODULE_SUBMIT
for sub in ${!cohort[@]}
do
   subject=${cohort[sub]}
   ################################################################
   # Run the localiser.
   ################################################################
   mod_cmd="
      ${XCPEDIR}/core/xcpLocaliser
      -d ${design}
      -s ${sub}:${incohort}
      -o ${out_super}
      ${scratch} ${root_arg} ${asgt[@]}
      -t ${trace}
   "
   subject_parse ${subject} ${incohort}
   mkdir -p ${out[sub]}/${prefix[sub]}_logs
   case ${mode} in
   s)
      ${mod_cmd}
      ;;
   c)
      cluster_submit ${mod_cmd}
      ;;
   esac
done
###################################################################
# Wait for the localiser to run to completion. (cluster mode)
###################################################################
if [[ ${mode} == c ]]
then
   sleep 5
   ${XCPEDIR}/utils/qstatus ${ids}
fi
(( cxt++ ))
###################################################################
# Then, iterate through all modules.
###################################################################
for module in ${modules[@]}
do
   ################################################################
   # Obtain paths to the subject- and group-level scripts for
   # the current module, if each exists.
   ################################################################
   printx ${XCPEDIR}/core/CONSOLE_MODULE_SUBMIT
   modS=$(ls -d1 ${XCPEDIR}/modules/${module}/${module}.mod 2>/dev/null)
   modG=$(ls -d1 ${XCPEDIR}/modules/${module}/${module}.g.mod 2>/dev/null)
   ################################################################
   # SUBJECT-LEVEL PROCESSING
   #---------------------------------------------------------------
   # If the current module has a subject-level component, then
   # iterate through all subjects and run it.
   ################################################################
   level=subject
   if [[ -n ${modS} ]]
   then
      unset ids
      source ${XCPEDIR}/core/validateSubjects
      for sub in ${!cohort[@]}
      do
         ##########################################################
         # Before running the subject, verify that it has not
         # yielded erroneous output or been labelled for exclusion
         # for quality reasons.
         ##########################################################
         logfile=${out[sub]}/${prefix[sub]}_logs/${analysis}_${xcp_date}_${prefix[sub]}LOG
         ##########################################################
         # Execute the current module.
         ##########################################################
         source ${XCPEDIR}/core/defineDesign
         mod_cmd="
            ${modS}
            -d ${design[sub]}
            -c ${cxt}
            -t ${trace}
         "
         case ${mode} in
         s)
            ${mod_cmd}
            ;;
         c)
            cluster_submit ${mod_cmd}
            ;;
         esac
      done
      #############################################################
      # Wait for all subject-level routines to run to completion.
      #############################################################
      if [[ ${mode} == c ]]
      then
         sleep 5
         ${XCPEDIR}/utils/qstatus ${ids}
      fi
   fi
   ################################################################
   # GROUP-LEVEL PROCESSING
   #---------------------------------------------------------------
   # If the current module has a group-level component, then
   # execute it.
   ################################################################
   level=group
   if [[ -n ${modG} ]]
   then
      mod_cmd="
         ${modG}
         -s ${incohort}
         -c ${cxt}
         -o ${out_super}
         -t ${trace}
      "
      source ${XCPEDIR}/core/validateGroup
      case ${mode} in
      s)
         ${mod_cmd}
         ;;
      c)
         cluster_submit ${mod_cmd}
         sleep 5
         ${XCPEDIR}/utils/qstatus -e ${ids}
         ;;
      esac
   fi
   ################################################################
   # Advance the pipeline to the next context.
   ################################################################
   (( cxt++ ))
done
################################################################
# Generate report for each subject
################################################################
echo "generating report"
for sub in ${!cohort[@]}
do
   mkdir -p ${out[sub]}/figures
   #mv $(ls -f ${out[sub]}/regress/*svg) ${out[sub]}/figures/ 2>/dev/null
   echo $standard >> ${out[sub]}/template.txt
   ##########################################################
   # Remove any previous HTML report.
   ##########################################################
   rm ${out[sub]}/${prefix[sub]}_report.html 2>/dev/null
   ##########################################################
   # Generate the subject report.
   ##########################################################
   echo $pipeline > ${out[sub]}/module.csv
   ${XCPEDIR}/core/report.py -t ${out[sub]}/template.nii.gz -p ${prefix[sub]} \
      -o ${out[sub]} -m ${out[sub]}/module.csv
   python ${XCPEDIR}/core/sentry_setup.py -p ${prefix[sub]} -o ${out[sub]} -m ${out[sub]}/module.csv
   ##########################################################
   # Remove temporary files.
   ##########################################################
   rm -rf ${out[sub]}/module.csv ${out[sub]}/template.txt
   # Remove some data to clean up.
   rm -rf ${out[sub]}/*nii.gz 2>/dev/null
   rm -rf ${out[sub]}/prestats/*preprocessed.nii.gz 2>/dev/null
   rm -rf ${out[sub]}/prestats/*segmentation.nii.gz 2>/dev/null
   rm -rf ${out[sub]}/prestats/*structbrain.nii.gz 2>/dev/null
   rm -rf ${out[sub]}/prestats/*structmask.nii.gz 2>/dev/null
   rm -f ${out[sub]}/prestats/*struct.nii.gz 2>/dev/null
   rm -rf ${out[sub]}/fcon/*referenceVolume.nii.gz 2>/dev/null
   rm -rf ${out[sub]}/fcon/*/*.nii.gz 2>/dev/null
   rm -rf ${out[sub]}/reho/*referenceVolume.nii.gz 2>/dev/null
   rm -rf ${out[sub]}/alff/*referenceVolume.nii.gz 2>/dev/null
   rm -rf ${out[sub]}/roiquant/*referenceVolume.nii.gz 2>/dev/null
   rm -rf ${out[sub]}/roiquant/*/*.nii.gz 2>/dev/null
   rm -rf ${out[sub]}/qcfc/*.nii.gz 2>/dev/null
   rm -rf ${out[sub]}/qcfc/*voxts.png 2>/dev/null
   rm -rf ${out[sub]}/roiquant/*/*.nii.gz 2>/dev/null
done
###################################################################
# Conclude the analysis.
###################################################################
source ${XCPEDIR}/core/terminatePipeline
printx ${XCPEDIR}/core/CONSOLE_FOOTER