COVID-19 Multiscale Modelling of the Virus and Patients’ Tissue Workflow
Description
Uses multiscale simulations to predict patient-specific SARS‑CoV‑2 severity subtypes (moderate, severe or control), using single-cell RNA-Seq data, MaBoSS and PhysiBoSS. Boolean models are used to determine the behaviour of individual agents as a function of extracellular conditions and the concentration of different substrates, including the number of virions. Predictions of severity subtypes are based on a meta-analysis of personalised model outputs simulating cellular apoptosis regulation in epithelial cells infected by SARS‑CoV‑2.
The workflow uses the following building blocks, described in order of execution:
- High-throughput mutant analysis
- Single-cell processing
- Personalise patient
- PhysiBoSS
- Analysis of all simulations
For details on individual workflow steps, see the user documentation for each building block.
Contents
Building Blocks
The `BuildingBlocks` folder contains the script to install the Building Blocks used in the COVID-19 Workflow.
Workflows
The `Workflows` folder contains the workflow implementations. It currently contains the PyCOMPSs implementation and the Snakemake implementation (in progress).
Resources
The `Resources` folder contains the dataset files.
Tests
The `Tests` folder contains the scripts that run each Building Block used in the workflow on the provided small dataset. They can be executed individually for testing purposes.
Instructions
Local machine
This section explains the requirements and usage of the COVID-19 Workflow on a laptop or desktop computer.
Requirements
- `permedcoe` package
- PyCOMPSs / Snakemake
- Singularity
Usage steps
- Clone this repository:
```bash
git clone https://github.com/PerMedCoE/covid-19-workflow.git
```
- Install the Building Blocks required for the COVID-19 Workflow:
```bash
covid-19-workflow/BuildingBlocks/./install_BBs.sh
```
- Get the required Building Block images from the project B2DROP:
- Required images:
- MaBoSS.singularity
- meta_analysis.singularity
- PhysiCell-COVID19.singularity
- single_cell.singularity
The path where these files are stored MUST be exported in the `PERMEDCOE_IMAGES` environment variable.
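For example, assuming the container images were downloaded to `$HOME/permedcoe_images` (the path is an assumption; use whatever directory you stored them in):

```bash
# Hypothetical location; point this at the directory that holds the container images
export PERMEDCOE_IMAGES=$HOME/permedcoe_images/
```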
:warning: TIP: These containers can be built manually as follows (be patient since some of them may take some time):
- Clone the `BuildingBlocks` repository:
```bash
git clone https://github.com/PerMedCoE/BuildingBlocks.git
```
- Build the required Building Block images:
```bash
cd BuildingBlocks/Resources/images
sudo singularity build MaBoSS.sif MaBoSS.singularity
sudo singularity build meta_analysis.sif meta_analysis.singularity
sudo singularity build PhysiCell-COVID19.sif PhysiCell-COVID19.singularity
sudo singularity build single_cell.sif single_cell.singularity
cd ../../..
```
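If you build the images yourself, point `PERMEDCOE_IMAGES` at the folder containing the resulting `.sif` files. A minimal sketch, assuming the commands above were run from the directory where `BuildingBlocks` was cloned:

```bash
# The built .sif files end up under the images folder of the cloned repository
export PERMEDCOE_IMAGES=$(pwd)/BuildingBlocks/Resources/images/
```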
If using PyCOMPSs on a local PC (make sure that PyCOMPSs is installed):
- Go to the `Workflows/PyCOMPSs` folder (`cd Workflows/PyCOMPSs`)
- Execute `./run.sh`
If using Snakemake on a local PC (make sure that Snakemake is installed):
- Go to the `Workflows/SnakeMake` folder (`cd Workflows/SnakeMake`)
- Execute `./run.sh`
TIP: If you want to run the workflow with a different dataset, update the `run.sh` script, setting the `dataset` variable to the new dataset folder and its file names.
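A minimal sketch of that change, assuming a hypothetical dataset folder named `my_dataset` under `Resources` (only the `dataset` variable name comes from the tip above; the folder name is made up):

```bash
# In run.sh: point the dataset variable at your own data folder (hypothetical path)
dataset=../../Resources/my_dataset
```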
MareNostrum 4
This section explains the requirements and usage of the COVID-19 Workflow on the MareNostrum 4 (MN4) supercomputer.
Requirements in MN4
- Access to MN4
All Building Blocks are already installed in MN4, and the COVID-19 Workflow is available.
Usage steps in MN4
- Load the `COMPSs`, `Singularity` and `permedcoe` modules:
```bash
export COMPSS_PYTHON_VERSION=3
module load COMPSs/3.1
module load singularity/3.5.2
module use /apps/modules/modulefiles/tools/COMPSs/libraries
module load permedcoe
```
TIP: Include these module commands in your `${HOME}/.bashrc` file so they are loaded automatically at the start of each session.

These commands load COMPSs and the `permedcoe` package, which provides all necessary dependencies, as well as the path to the Singularity container images (the `PERMEDCOE_IMAGES` environment variable) and the testing dataset (the `COVID19WORKFLOW_DATASET` environment variable).
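A sketch of that `.bashrc` addition (the same commands as above, simply appended to the file):

```bash
# Appended to ${HOME}/.bashrc so the MN4 environment is ready on login
export COMPSS_PYTHON_VERSION=3
module load COMPSs/3.1
module load singularity/3.5.2
module use /apps/modules/modulefiles/tools/COMPSs/libraries
module load permedcoe
```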
- Get a copy of the pilot workflow into your desired folder:
```bash
mkdir desired_folder
cd desired_folder
get_covid19workflow
```
- Go to the `Workflow/PyCOMPSs` folder (`cd Workflow/PyCOMPSs`)
- Execute `./launch.sh`
This command launches a job into the job queuing system (SLURM), requesting 2 nodes (one node acting as half master and half worker, plus another full worker node) for 20 minutes. It is prepared to use the Singularity images already deployed in MN4 (at the path given by the `PERMEDCOE_IMAGES` environment variable), and it uses the dataset located in the `../../Resources/data` folder.
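For reference, a job request of this shape is typically expressed with an `enqueue_compss` call along the following lines; this is a sketch only, the actual `launch.sh` shipped with the workflow may use different flags, and the application script name here is hypothetical:

```bash
# Illustrative sketch: submit the PyCOMPSs workflow to SLURM on 2 nodes for 20 minutes
enqueue_compss \
  --num_nodes=2 \
  --exec_time=20 \
  --lang=python \
  covid19_pilot.py   # hypothetical application script name
```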
:warning: TIP: If you want to run the workflow with a different dataset, please edit the `launch.sh` script and define the appropriate dataset path.
After the execution, a `results` folder will be available with the COVID-19 Workflow results.
Mahti or Puhti
This section explains how to run the COVID-19 Workflow on the CSC supercomputers (Mahti or Puhti) using Snakemake.
Requirements
- Install Snakemake, or check whether a version is already installed using `module spider snakemake` (see the sketch after this list).
- Install the workflow using the same steps as for the local machine, with the exception that the containers have to be built elsewhere.
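A quick check-and-load sequence for a site-provided Snakemake module; the module name and whether a default version exists are assumptions, so follow whatever `module spider` reports:

```bash
# Check whether the cluster already provides Snakemake
module spider snakemake
# Load it if available (use the exact name/version reported above)
module load snakemake
```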
Steps
- Go to the `Workflow/SnakeMake` folder (`cd Workflow/SnakeMake`)
- Edit `launch.sh` with the correct partition, account, and resource specifications.
Execute
./launch.sh
:warning: Snakemake provides a `--cluster` flag, but this functionality should be avoided, as it is not well suited for HPC systems.
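Instead, the usual pattern, and what `launch.sh` is expected to do (the script's exact contents may differ), is to run Snakemake inside a single batch allocation. A minimal sketch with made-up partition, account, and resource values:

```bash
#!/bin/bash
#SBATCH --partition=small        # hypothetical partition name
#SBATCH --account=project_000000 # hypothetical CSC project account
#SBATCH --time=00:20:00
#SBATCH --cpus-per-task=8

# Run the whole workflow inside this single allocation instead of using --cluster
snakemake --cores "$SLURM_CPUS_PER_TASK"
```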
License
Contact
This software has been developed for the PerMedCoE project, funded by the European Commission (EU H2020 951773).
Version History
- main @ 7ef4b06 (earliest), created 23rd May 2023 by Miguel Vazquez: Added replicate analysis to `b_2_run_per_patients.sh`