RM0 - Pipeline HPC

This repository contains the code for running a high-performance computing (HPC) pipeline that computes the mosquito basic reproduction number (RM), based on https://github.com/mpardo1/RM_mosquito

The pipeline is orchestrated using Snakemake, which automates tasks related to data preparation, model training, and prediction.

Data sources

The model requires data from the following sources:

For future forecasts, please also consider the following (not yet tested):

Data preparation

Download GPWv4 - Population Density

Please download the file gpw-v4-population-density-rev11_2020_2pt5_min_tif.zip, extract it, and place the GeoTIFF at the following path: data/gpw/gpw_v4_population_density_rev11_2020_2pt5_min.tif
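As a sketch, assuming the zip archive has already been downloaded manually from the data provider into the current directory, the following commands place the GeoTIFF where the pipeline expects it (if the extracted file has a different name, move or rename it to match):

# Create the directory expected by the pipeline
mkdir -p data/gpw

# Extract the archive into place
unzip gpw-v4-population-density-rev11_2020_2pt5_min_tif.zip -d data/gpw

# Confirm the GeoTIFF is at the expected path
ls data/gpw/gpw_v4_population_density_rev11_2020_2pt5_min.tif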

Deployment

1. Install Snakemake

We recommend using Mamba (a faster drop-in replacement for Conda). If you don't have Conda or Mamba installed, consider installing Miniforge.
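As a minimal sketch, Miniforge is typically installed by downloading and running its installer script for your platform (see the conda-forge Miniforge documentation for details):

# Download and run the Miniforge installer for the current OS/architecture
curl -L -O "https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-$(uname)-$(uname -m).sh"
bash Miniforge3-$(uname)-$(uname -m).sh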

Install Snakemake, Snakedeploy, and necessary plugins:

mamba create -c conda-forge -c bioconda --name snakemake snakemake=9.8.1 snakedeploy=0.11.0

If you're running on an HPC with SLURM, install additional plugins:

mamba install -n snakemake -c bioconda snakemake-executor-plugin-slurm=1.5.0

Activate the environment:

conda activate snakemake

2. Deploy the workflow

Create and move into a project directory:

mkdir -p path/to/project-workdir
cd path/to/project-workdir

Deploy the workflow using Snakedeploy:

snakedeploy deploy-workflow <URL_TO_THIS_REPO> . --tag <DESIRED_TAG>
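For example, to deploy this repository into the current directory (the tag below is illustrative; substitute the release tag you want to pin):

# Deploy the rm_hpc workflow, pinned to a (hypothetical) tag
snakedeploy deploy-workflow https://github.com/Mosquito-Alert/rm_hpc . --tag v1.0.0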

This will create two directories:

  • workflow/: contains the deployed Snakemake module
  • config/: contains configuration files

3. Configure workflow

Edit config/config.yaml to specify your settings (paths, parameters, etc.) according to your data and environment.

4. Run workflow

Local execution with conda

snakemake --cores all --sdm conda
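Before a full run, you can preview the planned jobs with Snakemake's dry-run flag:

# List the jobs that would be executed, without running anything
snakemake -n --sdm conda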

HPC execution with SLURM

Use the provided SLURM profile:

snakemake --cores all --sdm conda --profile slurm
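If no slurm profile is available in your working directory, the SLURM executor plugin installed above can also be selected directly on the command line; a sketch (the job limit is an arbitrary example):

# Alternative: pick the SLURM executor explicitly and cap concurrent jobs
snakemake --executor slurm --jobs 50 --sdm conda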

For advanced features such as cluster execution, cloud deployments, and workflow customization, see the Snakemake documentation.
