HPC Workflow Manager


Revision as of 06:41, 8 October 2019

General Information

The HPC Workflow Manager Client supports two workflow types:

  • SPIM; and
  • Macro.

This guide will only explain how to use the newly added Macro workflow type.

How to use

How to start the plugin

From the Fiji menu bar select Plugins > Multiview Reconstruction > HPC Workflow Manager and fill in the Login dialog that will appear. For example, see the filled in dialog in Figure 1.

How to login

You need to enter the username, password, and email for your account. If this is the first time you are using this installation of the program, you must create a new directory anywhere to use as a working directory. If you have used the HPC Workflow Manager in the past, you can reuse an existing working directory. Select the working directory by clicking the browse button or typing its path. The directory must already exist.

Press "Ok"; the dialog should disappear and a progress dialog should appear. If not, a new message should inform you of any error made while filling in the dialog.

How to create a new job

After the connection to the HPC cluster is made and the jobs are downloaded from the cluster, you should see a window like the one in Figure 2. If this is the first time you have run this plugin, the table will be empty.

Right-click in the empty table, or on an empty row of the table, to display the context menu; an example of the context menu is shown in Figure 3.

Select the first option “Create a new job”. The “Create job” window will appear. From the “Workflow Type” section select the “Macro Execution” option.

In the input data location, you must provide a directory that contains your Macro script (it should be named “user.ijm”). If this is the first time you are using the HPC Workflow plugin with Macro support, you can use the example found at the following link: https://github.com/MKrumnikl/Ij1MPIWrapper/tree/addFeatureScatter/src/main/resources/ExampleScripts/HelloWorld
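To illustrate what the plugin expects, a macro as simple as the following is enough to test the workflow. This is a hypothetical minimal sketch, not the contents of the linked HelloWorld example; save it as “user.ijm” in the input data directory:

```ijm
// user.ijm - hypothetical minimal ImageJ macro for testing the Macro workflow.
// The actual HelloWorld example in the linked repository may differ.
print("Hello from the HPC Workflow Manager");
```

The only fixed requirement stated above is the file name “user.ijm”; the macro body itself can be any valid ImageJ macro.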

In the node configuration select four nodes (4) by pressing the up arrow in the numeric field four times.

In the “Output data location” section leave the default option, “Output data location”, selected.

Now, the filled-in form should look like Figure 4. If you are using Linux save the “HelloWorld” example script in your home directory (“~/HelloWorld/user.ijm”) and use that path instead of “C:/Documents/HelloWorld”. When you are sure that the form is filled-in correctly press the “Create” button.

How to start a job

If you have created a new job, the main window should look roughly like Figure 5.

Here you can see the following columns:

  • “Job ID” – the job’s identification number;
  • “Status” – the job’s current status, which can be:
    • “Unknown” – the state of the job is not known;
    • “Configuring” – the job is being configured;
    • “Queued” – the job is in a queue and will be executed when nodes become available;
    • “Running” – the job has been executed and is currently running;
    • “Finished” – the job stopped running successfully, completing its tasks;
    • “Failed” – the job stopped running unsuccessfully; it did not complete its tasks;
    • “Canceled” – the job was stopped by the user; and
    • “Disposed” – the job was disposed.
  • “Creation time” – the time when the job was created.
  • “Start time” – the time when the job was last started.
  • “End time” – the time when the job last ended.
  • “Upload” – whether the job’s data was uploaded.
  • “Download” – whether the job’s data was downloaded.
  • “Workflow Type” – whether the job uses the SPIM or Macro workflow type.