
Labkit - How To Segment A Large Image On An HPC Cluster

High performance computing (HPC) clusters are helpful for processing, at very high speed, large images that do not fit on consumer computers. In order to use Labkit to segment a very large image on an HPC cluster, we train a classifier on a small sub-image locally and then run the Labkit command line tool on the cluster.

On the local computer:

  1. Download the dataset of interest, unzip
  2. Use BigStitcher FIJI plugin to convert the dataset to BDV HDF5 + XML format:
    • Install BigStitcher update site in Fiji
    • Run Plugins > BigStitcher > Batch Processing > Define dataset … (Use Bioformats importer and make sure to select the correct pixel size)
  3. Open the image in Labkit using: Plugins > Labkit > Open Image File With Labkit
  4. Continue as described in the quick automatic segmentation
  5. Save the trained classifier: Segmentation > Save Classifier …

On the HPC cluster:

  1. Copy the dataset HDF5 + XML to the cluster
  2. Copy the trained classifier to the cluster
  3. Download the Labkit command line tool
  4. Unzip the command line tool archive
  5. Edit the "Snakemake" file and change the following lines:
IMAGE = "/path/to/your/dataset.xml"
CLASSIFIER = "/path/to/your/pixel.classifier"
USE_GPU = "true"
  6. Run the snakemake script:
$ snakemake --cluster="sbatch --partition=gpu" --jobs=10 --local-cores=1 --restart-times=10
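The cluster-side steps above can be sketched as a short shell session. This is a minimal sketch under assumptions: the host alias `cluster`, the directory `~/labkit-run`, and the archive name `labkit-command-line-tool.zip` are placeholders, not taken from this page, and the download URL is elided because the page does not give one.

```shell
# From the local machine: copy the converted dataset (HDF5 + XML)
# and the trained classifier to the cluster.
# "cluster" and the remote directory are placeholders; adjust for your site.
scp dataset.h5 dataset.xml pixel.classifier cluster:~/labkit-run/

# On the cluster: unpack the Labkit command line tool archive
# (downloaded separately; archive name is a placeholder).
cd ~/labkit-run
unzip labkit-command-line-tool.zip

# After editing IMAGE, CLASSIFIER, and USE_GPU in the Snakemake file,
# submit the workflow to SLURM. --jobs caps concurrent cluster jobs,
# --restart-times retries failed chunks automatically.
snakemake --cluster="sbatch --partition=gpu" --jobs=10 --local-cores=1 --restart-times=10
```

Note that the `--cluster` value is quoted because it contains a space; without the quotes, `--partition=gpu` would be parsed as an argument to snakemake itself rather than passed to `sbatch`.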