Pre-trained DASM (Deep Antibody Selection Model) models for use with netam.
The DASMHumV1.0-4M model is a 4M-parameter transformer trained on paired heavy- and light-chain antibody sequences from human repertoires.
Model Architecture:
- Encoder: PlaceholderEncoder
- Model: TransformerBinarySelectionModelWiggleAct
- Parameters: 5 transformer layers, 8 attention heads, 32 dimensions per head (d_model = 256)
- Output: 20-dimensional (one selection factor per amino acid)
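The listed hyperparameters imply a 256-dimensional model (8 heads x 32 dimensions each). As a rough illustrative sketch only, not netam's actual TransformerBinarySelectionModelWiggleAct implementation (the embedding, feed-forward width, and activation here are assumptions), a transformer with this shape can be written in PyTorch as:

```python
import torch
import torch.nn as nn

# Hypothetical sketch of a DASM-shaped transformer. Class and layer
# choices are assumptions for illustration, not netam's implementation.
class SelectionTransformerSketch(nn.Module):
    def __init__(self, n_aa=20, d_model=256, n_heads=8, n_layers=5):
        super().__init__()
        self.embed = nn.Embedding(n_aa + 1, d_model)  # +1 for a padding token
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # 20-dimensional output: one selection factor per amino acid
        self.head = nn.Linear(d_model, n_aa)

    def forward(self, tokens):
        return self.head(self.encoder(self.embed(tokens)))

model = SelectionTransformerSketch()
out = model(torch.randint(0, 21, (1, 120)))  # one sequence of length 120
print(out.shape)  # torch.Size([1, 120, 20])
```

Note that the default feed-forward width used here would not reproduce the 4M parameter count; it only illustrates the per-site, 20-dimensional output structure.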
Training Data:
- Human antibody sequences with paired heavy and light chains
- Datasets: v1tangCC, v1vanwinkleheavyTrainCC, v1jaffePairedCC, v1vanwinklelightTrainCC
Dependencies:
- Neutral model: ThriftyHumV0.2-59
- Multihit model: ThriftyHumV0.2-59-hc-tangshm
These models are downloaded automatically when requested through netam's pretrained model interface:
from netam import pretrained
# Load the DASM model
crepe = pretrained.load("DASMHumV1.0-4M")
# Use with paired heavy-light chain sequences
heavy_seq = "QVQLVESG..." # Heavy chain amino acid sequence
light_seq = "DIQMTQSP..." # Light chain amino acid sequence
# Get selection factors for both chains
selection_factors = crepe.model.selection_factors_of_aa_str((heavy_seq, light_seq))
Version History:
- Initial release with DASMHumV1.0-4M model
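Returning to the usage example above, downstream handling of the result can be sketched as follows. This assumes the selection factors come back as a per-site array with one column per amino acid; that shape is an assumption about the return type, so check the netam documentation for the actual structure:

```python
import numpy as np

AA_ALPHABET = "ACDEFGHIKLMNPQRSTVWY"  # the standard 20 amino acids

# Stand-in for model output: per-site selection factors with shape
# (sequence_length, 20). Real values would come from the model call.
rng = np.random.default_rng(0)
selection_factors = rng.gamma(shape=2.0, scale=0.5, size=(8, 20))

# Sites where some substitution is favored relative to a factor-of-one baseline:
favored_sites = np.where(selection_factors.max(axis=1) > 1.0)[0]

# The most-favored amino acid at each site:
best_aa = [AA_ALPHABET[i] for i in selection_factors.argmax(axis=1)]
print(favored_sites)
print("".join(best_aa))
```

The random gamma draws here are placeholders so the snippet runs standalone; in practice you would substitute the array returned by selection_factors_of_aa_str.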
MIT License