= Usage =
Now you should see the plugin under {{bc | Plugins | Multiview Reconstruction | SPIM Workflow Manager for HPC}}. Upon invoking the plugin from the application menu, you are prompted for HEAppE credentials. Following a successful login, the main window is displayed, containing all jobs arranged in a table. In this context, the term ''job'' refers to a single pipeline run with specified parameters. The plugin actively queries HEAppE for information on the created jobs and updates the table accordingly.
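The update loop can be pictured as a periodic task that asks HEAppE for the current state of each job and refreshes the table. The following Java sketch illustrates the idea only; <code>HEAppEClient</code> and its method are hypothetical placeholders, not the plugin's actual code or the real HEAppE API.

<source lang="java">
import java.util.List;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

/** Sketch of the periodic job-table refresh; all names are hypothetical. */
public class JobTablePoller {

    /** Hypothetical stand-in for the HEAppE middleware client. */
    interface HEAppEClient {
        List<String> jobStates(); // e.g. one "id: state" entry per job
    }

    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    public void start(final HEAppEClient client) {
        // Query HEAppE every 30 s and redraw the table with fresh states.
        scheduler.scheduleAtFixedRate(new Runnable() {
            public void run() {
                for (String state : client.jobStates()) {
                    System.out.println(state); // a real UI would update a table model
                }
            }
        }, 0, 30, TimeUnit.SECONDS);
    }
}
</source>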
To create a new job, the plugin provides a wizard that lets you specify input and output data paths and set up a configuration file ''config.yaml'', which characterizes the dataset and defines the settings for individual workflow tasks. The plugin supports uploading local input image data to the remote HPC resource, reporting the progress and estimated remaining time.
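For orientation, such a configuration file might look as follows. This is a minimal illustrative sketch; the actual keys and values are defined by the workflow's own ''config.yaml'' template, so all names below are assumptions rather than the authoritative schema.

<source lang="yaml">
# Illustrative config.yaml sketch; key names are placeholder assumptions.
common:
  hdf5_xml_filename: "dataset.xml"   # dataset definition used by the pipeline
  ntimepoints: 90                    # number of time points in the series
  angles: "0,72,144,216,288"         # acquisition angles of the SPIM views

registration:                        # per-task settings follow the same pattern
  method: "interest_points"

fusion:
  downsample: 2
</source>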
Once you select a job for execution, its configuration file is sent to the cluster via HEAppE, which is responsible for the job life cycle from that point on. You can display a detailed progress dashboard showing the current state of every computational task of the selected job, as well as output logs useful for debugging.
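Conceptually, the hand-off consists of three steps: register the job, transfer ''config.yaml'' to the cluster, and submit. The Java sketch below shows this sequence with a hypothetical client wrapper; the method names are assumptions and do not correspond to the actual HEAppE interface.

<source lang="java">
import java.nio.file.Path;
import java.nio.file.Paths;

/** Sketch of handing a job over to HEAppE; all names are hypothetical. */
public class JobSubmission {

    /** Hypothetical wrapper around the HEAppE middleware. */
    interface HEAppEClient {
        long createJob(String pipelineName);         // register a new job
        void uploadFile(long jobId, Path localFile); // transfer a file to the cluster
        void submitJob(long jobId);                  // HEAppE owns the life cycle from here
    }

    public static void submit(HEAppEClient client) {
        long jobId = client.createJob("SPIM pipeline");
        client.uploadFile(jobId, Paths.get("config.yaml")); // send the configuration
        client.submitJob(jobId); // scheduling, execution and state tracking
                                 // now happen on the HEAppE side
    }
}
</source>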
[[File:ui_screens.PNG|1200px]]
Following a successfully finished pipeline, you can interactively examine the processed SPIM image data using BigDataServer [https://imagej.net/BigDataViewer], and download the resulting data along with a summary file containing key information about the performed job. Importantly, you can edit the corresponding local configuration file in a common text editor and restart an interrupted, finished, or failed job.
= HPC Cluster =