SPIM Workflow Manager For HPC

48 bytes added, 03:04, 12 December 2018
To create a new job, right-click in the main window and choose ''Create a new job''. A window with input and output data locations will pop up. You can either use the demonstration data on the Salomon cluster or specify your own input data location. Alternatively, you may choose your working directory (specified during login) as both your input and output data location. Once a new job is configured, you can upload your own data (if you chose this option in the previous step) by right-clicking on the job line and choosing ''Upload data''. When ''Done'' appears in the ''Upload'' column, you can start the job by {{bc | right click | Start job}}. The status of your job changes to ''Queued'', then to ''Running'', and finally to ''Finished'' when your pipeline finishes successfully.
The plugin provides a wizard for setting up the configuration file ''config.yaml'', which characterizes the dataset and defines settings for the individual workflow tasks. The plugin also supports uploading local input image data to the remote HPC resource, reporting progress and the estimated remaining time during the transfer.
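As a rough illustration, a ''config.yaml'' produced by the wizard might look like the sketch below. The key names and values here are hypothetical examples chosen to show the general shape of such a file, not the plugin's actual schema; the wizard itself is the authoritative source for the real settings.

```yaml
# Hypothetical sketch of a config.yaml for a SPIM workflow.
# All key names and values below are illustrative assumptions,
# not the plugin's actual configuration schema.
common:
  image_file_pattern: "spim_TL{t}_Angle{a}.tif"   # naming pattern of input images
  angles: "0,72,144,216,288"                      # acquisition angles of the views
  timepoints: "1-5"                               # range of timepoints to process
registration:
  channel: "green"                                # channel used for bead registration
fusion:
  downsample: 2                                   # fuse at half resolution
```

Each top-level section configures one workflow task, so a single file both characterizes the dataset (the ''common'' section) and carries the per-task settings that the workflow steps read.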