matsengrp/dasm-models
# DASM Models

Pre-trained DASM (Deep Antibody Selection Model) models for use with netam.

## Available Models

### DASMHumV1.0-4M

A 4M-parameter transformer-based model trained on paired heavy/light chain antibody sequences from human repertoires.

**Model Architecture:**

- Encoder: `PlaceholderEncoder`
- Model: `TransformerBinarySelectionModelWiggleAct`
- Parameters: 5 transformer layers, 8 attention heads, head dimension 32 (d_model = 256)
- Output: 20-dimensional (amino acid selection factors)
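As a rough sanity check (not from the source), the listed dimensions are consistent with the "4M" in the model name, assuming standard transformer blocks with feed-forward width 4 × d_model and ignoring embeddings, norms, and biases:

```python
# Back-of-the-envelope parameter count for the architecture above.
# Assumptions (not from the source): standard transformer blocks with
# feed-forward width 4 * d_model; embeddings, norms, and biases ignored.
n_layers = 5
n_heads = 8
d_model = n_heads * 32  # head dimension 32 -> d_model = 256

attn = 4 * d_model**2              # Q, K, V, and output projections
ffn = 2 * d_model * (4 * d_model)  # two feed-forward weight matrices
total = n_layers * (attn + ffn)

print(f"{total:,}")  # ~3.9M, consistent with the "4M" model name
```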

**Training Data:**

- Human antibody sequences with paired heavy and light chains
- Datasets: `v1tangCC`, `v1vanwinkleheavyTrainCC`, `v1jaffePairedCC`, `v1vanwinklelightTrainCC`

**Dependencies:**

- Neutral model: `ThriftyHumV0.2-59`
- Multihit model: `ThriftyHumV0.2-59-hc-tangshm`

## Usage

These models are automatically downloaded when requested through the pretrained model interface in netam:

```python
from netam import pretrained

# Load the DASM model
crepe = pretrained.load("DASMHumV1.0-4M")

# Use with paired heavy-light chain sequences
heavy_seq = "QVQLVESG..."  # Heavy chain amino acid sequence
light_seq = "DIQMTQSP..."  # Light chain amino acid sequence

# Get selection factors for both chains
selection_factors = crepe.model.selection_factors_of_aa_str((heavy_seq, light_seq))
```
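A purely illustrative sketch of how per-site selection factors might be interpreted downstream. The (sites × 20 amino acids) array shape is an assumption based on the 20-dimensional output described above, not netam's documented return type:

```python
import numpy as np

# Hypothetical example: treat selection factors as a (sites x 20) array,
# one column per amino acid. Values > 1 suggest substitutions to that
# amino acid are favored by selection; values < 1, disfavored.
# This shape and interpretation are assumptions for illustration only.
AA_ALPHABET = "ACDEFGHIKLMNPQRSTVWY"

# Toy factors for a 3-site sequence
factors = np.array([
    [0.5] * 20,
    [1.0] * 20,
    [2.0] * 20,
])
factors[0, AA_ALPHABET.index("W")] = 3.0  # site 0: tryptophan strongly favored

# Most-favored amino acid at each site (ties resolve to the first index)
best = [AA_ALPHABET[i] for i in factors.argmax(axis=1)]
print(best)  # ['W', 'A', 'A']
```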

## Version History

### v1.0.0

- Initial release with the DASMHumV1.0-4M model

## License

MIT License
