PerMedCoE Covid19 Pilot workflow (PyCOMPSs)
main @ 7ef4b06

Workflow Type: COMPSs

COVID-19 Multiscale Modelling of the Virus and Patients’ Tissue Workflow

Uses multiscale simulations to predict patient-specific SARS‑CoV‑2 severity subtypes (moderate, severe or control), using single-cell RNA-Seq data, MaBoSS and PhysiBoSS. Boolean models are used to determine the behaviour of individual agents as a function of extracellular conditions and the concentration of different substrates, including the number of virions. Predictions of severity subtypes are based on a meta-analysis of personalised model outputs simulating cellular apoptosis regulation in epithelial cells infected by SARS‑CoV‑2.

The workflow uses the following building blocks, described in order of execution:

  1. High-throughput mutant analysis
  2. Single-cell processing
  3. Personalise patient
  4. PhysiBoSS
  5. Analysis of all simulations

For details on individual workflow steps, see the user documentation for each building block.

GitHub repository


Building Blocks

The BuildingBlocks folder contains the script to install the Building Blocks used in the COVID-19 Workflow.


The Workflow folder contains the workflow implementations.

It currently contains the implementation using PyCOMPSs and one using Snakemake (in progress).


The Resources folder contains dataset files.


The Tests folder contains the scripts that run each Building Block used in the workflow for the given small dataset. They can be executed individually for testing purposes.


Local machine

This section explains the requirements and usage of the COVID19 Workflow on a laptop or desktop computer.


Usage steps

  1. Clone this repository:
git clone
  2. Install the Building Blocks required for the COVID19 Workflow:
  3. Get the required Building Block images from the project B2DROP:
  • Required images:
    • MaBoSS.singularity
    • meta_analysis.singularity
    • PhysiCell-COVID19.singularity
    • single_cell.singularity

The path where these files are stored MUST be exported in the PERMEDCOE_IMAGES environment variable.
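As a quick sanity check, the following sketch exports the variable and counts how many of the four required image files are actually present (the default path below is a placeholder; replace it with wherever you stored the images):

```shell
# Placeholder default path: replace with the folder holding the images.
export PERMEDCOE_IMAGES="${PERMEDCOE_IMAGES:-$HOME/permedcoe_images}"

# Count how many of the four required image files exist in that folder.
found=0
for img in MaBoSS meta_analysis PhysiCell-COVID19 single_cell; do
  [ -f "${PERMEDCOE_IMAGES}/${img}.singularity" ] && found=$((found + 1))
done
echo "found ${found}/4 required images in ${PERMEDCOE_IMAGES}"
```

If the count is below 4, download the missing images (or build them as described below) before running the workflow.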

:warning: TIP: These containers can be built manually as follows (be patient since some of them may take some time):

  1. Clone the BuildingBlocks repository
    git clone
  2. Build the required Building Block images
    cd BuildingBlocks/Resources/images
    sudo singularity build MaBoSS.sif MaBoSS.singularity
    sudo singularity build meta_analysis.sif meta_analysis.singularity
    sudo singularity build PhysiCell-COVID19.sif PhysiCell-COVID19.singularity
    sudo singularity build single_cell.sif single_cell.singularity
    cd ../../..
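The four build commands above can also be expressed as a loop; this sketch is equivalent, with an added guard (not in the original instructions) that prints a message instead of failing when singularity is not installed:

```shell
# Build each .sif from its .singularity definition (run from
# BuildingBlocks/Resources/images; requires root/sudo privileges).
if command -v singularity >/dev/null 2>&1; then
  for def in MaBoSS meta_analysis PhysiCell-COVID19 single_cell; do
    sudo singularity build "${def}.sif" "${def}.singularity"
  done
else
  echo "singularity not found; install it or download the prebuilt images"
fi
```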

If using PyCOMPSs on a local PC (make sure that PyCOMPSs is installed):

  1. Go to the Workflows/PyCOMPSs folder

    cd Workflows/PyCOMPSs
  2. Execute ./

If using Snakemake on a local PC (make sure that Snakemake is installed):

  1. Go to the Workflows/SnakeMake folder

    cd Workflows/SnakeMake
  2. Execute ./

TIP: If you want to run the workflow with a different dataset, please update the script, setting the dataset variable to the new dataset folder and its file names.
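For instance, the relevant lines of the script could look like the following (the variable and path are illustrative, not the script's actual contents; the default points at the testing dataset shipped with the repository):

```shell
# Illustrative only: point `dataset` at the folder holding your input files.
dataset="../../Resources/data"
echo "workflow will read inputs from: ${dataset}"
```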

MareNostrum 4

This section explains the requirements and usage of the COVID19 Workflow on the MareNostrum 4 supercomputer.

Requirements in MN4

  • Access to MN4

All Building Blocks are already installed in MN4, and the COVID19 Workflow is available.

Usage steps in MN4

  1. Load the COMPSs, Singularity and permedcoe modules

    module load COMPSs/3.1
    module load singularity/3.5.2
    module use /apps/modules/modulefiles/tools/COMPSs/libraries
    module load permedcoe

    TIP: Include these module loads in your ${HOME}/.bashrc file to load them automatically at the start of each session.

    These commands load COMPSs and the permedcoe package, which provides all necessary dependencies, as well as the path to the singularity container images (PERMEDCOE_IMAGES environment variable) and the testing dataset (COVID19WORKFLOW_DATASET environment variable).

  2. Get a copy of the pilot workflow into your desired folder

    mkdir desired_folder
    cd desired_folder
  3. Go to Workflow/PyCOMPSs folder

    cd Workflow/PyCOMPSs
  4. Execute ./

This command launches a job into the job queuing system (SLURM) requesting 2 nodes (one node acting as half master and half worker, and the other as a full worker node) for 20 minutes, and it is prepared to use the singularity images already deployed in MN4 (at the path given by the PERMEDCOE_IMAGES environment variable). It uses the dataset located in the ../../Resources/data folder.
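Before executing the launch script, you can confirm that the permedcoe module exported the expected variables (both names come from the module description above):

```shell
# Print the variables set by the permedcoe module;
# "<unset>" means the module was not loaded in this session.
echo "PERMEDCOE_IMAGES:        ${PERMEDCOE_IMAGES:-<unset>}"
echo "COVID19WORKFLOW_DATASET: ${COVID19WORKFLOW_DATASET:-<unset>}"
```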

:warning: TIP: If you want to run the workflow with a different dataset, please edit the script and define the appropriate dataset path.

After the execution, a results folder will be available with the COVID19 Workflow results.

Mahti or Puhti

This section explains how to run the COVID19 Workflow on CSC supercomputers using Snakemake.


  • Install Snakemake (or check whether a version is already installed using module spider snakemake)
  • Install the workflow using the same steps as for the local machine, except that the containers have to be built elsewhere.


  1. Go to the Workflow/SnakeMake folder

    cd Workflow/SnakeMake
  2. Edit the script with the correct partition, account, and resource specifications.

  3. Execute ./

:warning: Snakemake provides a --cluster flag, but this functionality should be avoided, as it is not well suited to HPC systems.


License

Apache 2.0


This software has been developed for the PerMedCoE project, funded by the European Commission (EU H2020 951773).

Version History

main @ 7ef4b06 (earliest) Created 23rd May 2023 at 13:07 by Miguel Vazquez

Added replicate analysis to

Creators and Submitter
  • Javier Conejero


Created: 23rd May 2023 at 13:07

Last updated: 23rd May 2023 at 13:33
