Releases · matsengrp/netam
Add pretrained DNSM models
What's Changed
- Pretrained DNSM models can now be downloaded and loaded with `pretrained.load` (Issue #161) by @matsen in #162 (see the usage sketch after this list)
- Implement Whichmut Trainer and Loss Function (Issue #152) by @jgallowa07 in #157
- model references by @matsen in #164
- Add masking support to ParentIndependentBinarySelectionModel by @jgallowa07 in #166
- Release pretrained DASM model with associated demo notebook by @matsen in #168
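A minimal usage sketch for the pretrained-model loading mentioned above. Only the `netam.pretrained.load` entry point is taken from the release note; the model identifier string is a hypothetical placeholder, so consult the demo notebook shipped with the DASM release for the actual supported names and interface.

```python
# Minimal sketch, assuming the `netam.pretrained.load` entry point named in the
# release note above. The model identifier is a placeholder, not a name confirmed
# by this changelog; see the associated demo notebook for the real options.
from netam import pretrained

crepe = pretrained.load("PLACEHOLDER_DNSM_MODEL_NAME")  # hypothetical identifier
print(type(crepe))  # inspect the returned pretrained model object
```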
Full Changelog: v0.2.3...v0.2.4
Add MIT License and fix PyPI dependencies
This release adds an MIT License to the project and fixes PyPI publishing by replacing the git dependency with the standard PyPI `fire` package.
Main changes:
- Add an MIT License, making the software open source and freely available
- Fix PyPI dependency: use the `fire` package from PyPI instead of a git dependency to enable PyPI publishing
This makes netam installable from PyPI via `pip install netam`.
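A quick post-install sanity check, shown as a sketch; it assumes only the package names mentioned above (`netam` and its `fire` dependency) and no netam-specific API.

```python
# Post-install sanity check: both imports should now resolve from PyPI-installed
# packages rather than a git dependency. No netam-specific API is assumed here.
import netam
import fire

print(netam.__file__)  # confirms which installed copy of netam is being imported
```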
Release for Thrifty and DNSM papers
This release marks the state of the repository for the Thrifty and DNSM papers.
v0.2.1
What's Changed
- allow strong preferences from pick_device using a string by @matsen in #85
- Support for dnsm-experiments-1 PR#35 by @willdumm in #86
- dnsm-experiments-1 #39 supporting PR by @willdumm in #87
- DXSM ambiguous sequences by @willdumm in #88
- New Dataset superclass by @willdumm in #89
- Chunked model evaluation by @willdumm in #91
Full Changelog: v0.2.0...v0.2.1
v0.2.0
What's Changed (Initial release)
- infrastructure and shmoof models by @matsen in #1
- Hyperparam opt, more models, more flexible training by @matsen in #2
- Fix branch length optimization using correct loss; WiggleAct by @matsen in #5
- Adding per-base inference by @matsen in #9
- Branch length optimization by @matsen in #11
- masking from child sequences; DNSM model bugfix; Yun branch lengths by @matsen in #14
- Add CI Tests by @willdumm in #21
- Standardization fixes; less radical LR reset by @matsen in #20
- Delete experiment.py by @matsen in #23
- syncing with epam update; better tensorboard; refactoring by @matsen in #26
- parallel branch length optimization by @matsen in #28
- Ability to parallelize between GPUs by @matsen in #30
- Simplified handling of DNSM datasets by @matsen in #32
- Bring over epam code to avoid circular dependency by @matsen in #34
- More flexible training; device bugfix; more hyperparams in yml; renaming to weight_decay by @matsen in #36
- Bring tests/test_sequences.py from epam by @matsen in #38
- Bringing over tests/test_molevol.py by @matsen in #40
- Try "warm up" phase by @matsen in #41
- Add ability to get attention maps by @matsen in #43
- Further development of attention maps; no weight decay for 1D parameters by @matsen in #45
- Minor fixes; pyproject.toml file by @matsen in #47
- Fix issues having to do with standardization by @matsen in #49
- don't record loss before training, which fixes incorrect logging by @matsen in #53
- Add `codon_prob.py` with a model to adjust codon probs by hit class by @willdumm in #50
- Format Docstrings by @willdumm in #59
- Refactor neutral_aa_mut_probs to return per-AA information by @matsen in #62
- First approximation to a DASM: 20 output dimensions but same loss by @matsen in #64
- Adding a per-AA loss to the DASM by @matsen in #66
- Better DASM handling of ambiguous amino acids by @matsen in #68
- Integrate the multihit model into the DNSM framework by @willdumm in #71
- Docstrings; multihit device fix by @matsen in #73
- Split forward functions in DXSM models by @willdumm in #74
- Renaming to CSP where appropriate, and other related things by @matsen in #75
- Replacing normalize_sub_probs with a check; fixing consistency problems by @matsen in #77
- Add loss weights keyword argument to DASMBurrito constructor by @willdumm in #78
- Release cleanup; weights-only crepe loading by @matsen in #80
- Release of thrifty models; pretrained module; demo notebook by @matsen in #81
- moving shared code to dxsm.py; add_shm_model_outputs_to_pcp_df by @matsen in #83
- Publish to PyPI by @willdumm in #82
New Contributors
Full Changelog: https://github.com/matsengrp/netam/commits/v0.2.0