
Commit 38b24dc

Igor Shilov authored and facebook-github-bot committed

release v1.2.0 (#500)

Summary: subj
Pull Request resolved: #500
Reviewed By: ashkan-software
Differential Revision: D39386971
Pulled By: ffuuugor
fbshipit-source-id: a8d8c813d4c8891ceaf9d249a8d02fec38e502c5

1 parent d49e9f0 commit 38b24dc

File tree

2 files changed: +21 −1 lines changed


CHANGELOG.md

+20
@@ -1,5 +1,25 @@
 # Changelog

+## v1.2
+
+### New ways to compute per sample gradients
+We're glad to present Opacus v1.2, which contains some major updates to per sample gradient computation mechanisms
+and includes all the good stuff from the recent PyTorch releases.
+* Functorch - per sample gradients for all
+* ExpandedWeights - yet another way to compute per sample gradients
+* See [Release notes](https://github.com/pytorch/opacus/releases/tag/v1.2.0)
+and [GradSampleModule README](https://github.com/pytorch/opacus/blob/main/opacus/grad_sample/README.md)
+for detailed feature explanation
+
+### Other improvements
+* Fix `utils.unfold2d` with non-symmetric pad/dilation/kernel_size/stride (#443)
+* Add support for "same" and "valid" padding for hooks-based grad sampler for convolution layers
+* Improve model validation to support frozen layers and catch copied parameters (#489)
+* Remove annoying logging from `set_to_none` (#471)
+* Improved documentation (#480, #478, #482, #485, #486, #487, #488)
+* Integration test improvements (#407, #479, #481, #473)
+
+
 ## v1.1.3
 ### Bug fixes
 * Support layers with a mix of frozen and learnable parameters (#437)
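For context on the changelog entry above: a per sample gradient (the quantity both the functorch and ExpandedWeights mechanisms compute) is one gradient per training example, rather than the single averaged gradient ordinary backprop produces; DP-SGD needs the per-example form so each gradient can be clipped individually before averaging. The following is a minimal illustrative sketch, not Opacus's implementation: a hypothetical one-parameter linear model with made-up data, in plain Python with no PyTorch or Opacus dependency.

```python
# Per sample gradients for a 1-parameter squared-error model:
#   loss_i = (w * x_i - y_i)^2, so dL_i/dw = 2 * (w * x_i - y_i) * x_i.
# (Hypothetical model and data, for illustration only.)

def per_sample_grads(w, xs, ys):
    # One gradient per example -- this is what functorch/ExpandedWeights
    # style mechanisms produce, instead of a single averaged gradient.
    return [2.0 * (w * x - y) * x for x, y in zip(xs, ys)]

def clip(g, max_norm):
    # Scale an individual gradient down so its magnitude is at most max_norm.
    n = abs(g)
    return g * min(1.0, max_norm / n) if n > 0 else g

w = 1.0
xs, ys = [1.0, 2.0, 10.0], [2.0, 3.0, 5.0]

grads = per_sample_grads(w, xs, ys)  # [-2.0, -4.0, 100.0]

# Ordinary batch gradient: the outlier example dominates the average.
batch_grad = sum(grads) / len(grads)

# DP-SGD-style update: clip each example's gradient first, then average,
# bounding any single example's influence on the step.
clipped = sum(clip(g, 1.0) for g in grads) / len(grads)

print(batch_grad, clipped)
```

With these numbers the outlier contributes a gradient of 100.0 to the plain average but only ±1.0 after per-sample clipping, which is why DP-SGD must materialize the individual gradients rather than clip the already-averaged one.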

opacus/version.py

+1-1
@@ -13,4 +13,4 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-__version__ = "1.1.3"
+__version__ = "1.2.0"

0 commit comments