This repository was archived by the owner on Jul 9, 2025. It is now read-only.

Commit 953d96e

authored
Update README.md
1 parent 71e02c4 commit 953d96e

File tree

1 file changed: 6 additions, 6 deletions

README.md

Lines changed: 6 additions & 6 deletions
@@ -16,17 +16,17 @@ With Tensorflow Similarity you can train two main types of models:
 
 ## What's new
 
+- [May 2022]: 0.16 major optimization release
+  * Cross-batch memory (XBM) loss added thanks to @chjort
+  * Many self-supervised improvements thanks to @dewball345
+  * Major layers and callback refactoring to make them faster and more flexible, e.g. `EvalCallback()` now supports split validation.
+  For full changes see [the changelog](./releases.md)
+
 - [Jan 2022]: 0.15 self-supervised release
   * Added support for self-supervised contrastive learning, including SimCLR, SimSiam, and Barlow Twins. Check out the in-depth [hello world notebook](examples/unsupervised_hello_world.ipynb) to get started.
   * Soft Nearest Neighbor Loss added thanks to [Abhishar Sinha](https://github.com/abhisharsinha)
   * Added GeneralizedMeanPooling2D support that improves similarity matching accuracy over GlobalMeanPooling2D.
   * Numerous speed optimizations and general bug fixes.
-- [Dec 2021]:
-  * Sampler speed optimizations and general bug fixes.
-- [Oct 2021]: 0.14 similarity release
-  * 0.14 is out, which includes various speed improvements and post-initial-release bug fixes.
-  * Added a `Samplers.*` IO [notebook](examples/sampler_io_cookbook.ipynb) detailing how to efficiently sample your data for successful training.
-
 
 For previous changes and more details, see [the changelog](./releases.md)
 
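The cross-batch memory (XBM) loss added in this release builds on a simple idea: keep a FIFO queue of embeddings and labels from recent batches so a pair-based metric loss can mine positives and negatives against many more candidates than a single mini-batch provides. The sketch below illustrates only the memory-queue mechanism in plain NumPy; the `EmbeddingMemory` class and its method names are invented for illustration and are not TensorFlow Similarity's actual `XBM` API.

```python
import numpy as np
from collections import deque


class EmbeddingMemory:
    """FIFO memory of (embedding, label) pairs from past batches.

    Illustrates the core idea behind cross-batch memory (XBM): a
    pair-based loss can compare the current batch against this queue
    instead of only against itself. Hypothetical helper, not the
    library's API.
    """

    def __init__(self, capacity: int):
        # deque(maxlen=...) silently drops the oldest entries once full.
        self._embeddings = deque(maxlen=capacity)
        self._labels = deque(maxlen=capacity)

    def enqueue(self, embeddings: np.ndarray, labels: np.ndarray) -> None:
        # Push each row of the batch; oldest entries fall off first.
        for emb, lab in zip(embeddings, labels):
            self._embeddings.append(emb)
            self._labels.append(int(lab))

    def contents(self) -> tuple[np.ndarray, np.ndarray]:
        # Stack the queue back into arrays for pairwise mining.
        return np.stack(list(self._embeddings)), np.array(self._labels)


# Usage: after each training step, push the batch into memory, then
# mine pairs between the fresh batch and the whole memory.
memory = EmbeddingMemory(capacity=4)
memory.enqueue(np.ones((3, 2)), np.array([0, 1, 0]))
memory.enqueue(np.zeros((3, 2)), np.array([2, 2, 2]))
embs, labs = memory.contents()
# Capacity is 4, so only the 4 most recent entries remain.
```

With a capacity many times the batch size, the effective number of comparable pairs per step grows substantially at a small memory cost, which is what makes the technique attractive.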

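The GeneralizedMeanPooling2D layer mentioned in the 0.15 notes generalizes average pooling with an exponent p: the output is (mean(x^p))^(1/p) over the spatial dimensions, which recovers average pooling at p = 1 and approaches max pooling as p grows. A minimal NumPy sketch of the computation, assuming a single (H, W, C) feature map; the `gem_pool` function name is invented for illustration and is not the library layer:

```python
import numpy as np


def gem_pool(feature_map: np.ndarray, p: float = 3.0,
             eps: float = 1e-6) -> np.ndarray:
    """Generalized-mean (GeM) pooling over the spatial dims of an
    (H, W, C) feature map. p=1 is average pooling; large p approaches
    per-channel max pooling. Conceptual sketch, not the library layer."""
    # Clip to a small positive floor so fractional powers are defined.
    clipped = np.clip(feature_map, eps, None)
    return np.mean(clipped ** p, axis=(0, 1)) ** (1.0 / p)


fmap = np.arange(8.0).reshape(2, 2, 2)  # tiny (H=2, W=2, C=2) map
avg_like = gem_pool(fmap, p=1.0)   # matches plain average pooling
maxish = gem_pool(fmap, p=50.0)    # approaches the per-channel max
```

In the actual layer, p is typically a learnable parameter, letting training choose where the pooling should sit between averaging and max-taking for the retrieval task at hand.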
0 commit comments
