SPIM Workflow Manager For HPC

Pipeline input parameters are entered by the user into a config.yaml configuration file. In the first step, the raw .czi data are resaved into an HDF5 container in parallel on the cluster. Similarly, the individual time points are registered in parallel on the cluster, using fluorescent beads as fiducial markers. Subsequently, a non-parallel job executed by Snakemake consolidates the registration XML files into a single file, followed by time-lapse registration using the beads segmented during the spatial registration step. After this, the pipeline diverges into either parallel content-based fusion or parallel multi-view deconvolution. To achieve this divergence in practice, the Snakemake pipeline is launched from the Fiji plugin as two separate jobs using two different config.yaml files, set to execute content-based fusion and deconvolution, respectively. In the final stage of the pipeline, the fusion/deconvolution output is saved into a new HDF5 container.
[[Image:Drosophila.pdf]]
The figure shows the results of registration, fusion, and deconvolution at different time points.
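To make the structure described above concrete, the following is a minimal Snakefile sketch of such a pipeline: per-time-point rules that Snakemake can dispatch as parallel cluster jobs, a single consolidation job, and a final stage selected via config.yaml. The rule names, file paths, and config keys (num_timepoints, method) are illustrative assumptions, and the shell commands are placeholders rather than the plugin's actual Fiji calls.

<pre>
# Snakefile -- sketch only; rule, path, and config-key names are hypothetical.
configfile: "config.yaml"

TPS = range(config["num_timepoints"])   # assumed key: number of time points
METHOD = config["method"]               # assumed key: "fusion" or "deconvolution"

rule all:
    input:
        f"output/{METHOD}.h5"

# Step 1: resave every raw .czi time point into HDF5, one parallel job each.
rule resave:
    input:  "raw/tp{tp}.czi"
    output: "hdf5/tp{tp}.h5"
    shell:  "touch {output}  # placeholder for the actual Fiji resave call"

# Step 2: register each time point independently, beads as fiducial markers.
rule register:
    input:  "hdf5/tp{tp}.h5"
    output: "xml/tp{tp}.xml"
    shell:  "touch {output}  # placeholder for the bead-based registration call"

# Step 3: one non-parallel job merges the per-time-point XMLs and runs the
# time-lapse registration on the beads segmented in step 2.
rule merge_xml:
    input:  expand("xml/tp{tp}.xml", tp=TPS)
    output: "xml/merged.xml"
    shell:  "touch {output}  # placeholder for XML merge + time-lapse registration"

# Step 4: fuse or deconvolve (chosen via config.yaml) and write the result
# into a new HDF5 container; shown here as a single collecting rule.
rule fuse_or_deconvolve:
    input:
        merged="xml/merged.xml",
        h5=expand("hdf5/tp{tp}.h5", tp=TPS)
    output:
        f"output/{METHOD}.h5"
    shell:  "touch {output}  # placeholder for the fusion/deconvolution stage"
</pre>

Under this layout, launching the pipeline twice with two config.yaml files that differ only in the method key reproduces the divergence into fusion and deconvolution described above.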