Commit 7d9a9d0

Release: v4.2.0
1 parent c949516 commit 7d9a9d0

File tree: 5 files changed, +8 −8 lines

README.md (+1 −1)

@@ -224,7 +224,7 @@ Min, Patrick Lewis, Ledell Wu, Sergey Edunov, Danqi Chen, and Wen-tau Yih.
 …multilingual BERT into [DistilmBERT](https://github.com/huggingface/transformers/tree/master/examples/distillation) and a German version of DistilBERT.
 1. **[SqueezeBert](https://huggingface.co/transformers/model_doc/squeezebert.html)** released with the paper [SqueezeBERT: What can computer vision teach NLP about efficient neural networks?](https://arxiv.org/abs/2006.11316) by Forrest N. Iandola, Albert E. Shaw, Ravi Krishna, and Kurt W. Keutzer.
 1. **[T5](https://huggingface.co/transformers/model_doc/t5.html)** (from Google AI) released with the paper [Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer](https://arxiv.org/abs/1910.10683) by Colin Raffel and Noam Shazeer and Adam Roberts and Katherine Lee and Sharan Narang and Michael Matena and Yanqi Zhou and Wei Li and Peter J. Liu.
-1. **[TAPAS](https://huggingface.co/transformers/master/model_doc/tapas.html)** (from Google AI) released with the paper [TAPAS: Weakly Supervised Table Parsing via Pre-training](https://arxiv.org/abs/2004.02349) by Jonathan Herzig, Paweł Krzysztof Nowak, Thomas Müller, Francesco Piccinno and Julian Martin Eisenschlos.
+1. **[TAPAS](https://huggingface.co/transformers/model_doc/tapas.html)** (from Google AI) released with the paper [TAPAS: Weakly Supervised Table Parsing via Pre-training](https://arxiv.org/abs/2004.02349) by Jonathan Herzig, Paweł Krzysztof Nowak, Thomas Müller, Francesco Piccinno and Julian Martin Eisenschlos.
 1. **[Transformer-XL](https://huggingface.co/transformers/model_doc/transformerxl.html)** (from Google/CMU) released with the paper [Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context](https://arxiv.org/abs/1901.02860) by Zihang Dai*, Zhilin Yang*, Yiming Yang, Jaime Carbonell, Quoc V. Le, Ruslan Salakhutdinov.
 1. **[XLM](https://huggingface.co/transformers/model_doc/xlm.html)** (from Facebook) released together with the paper [Cross-lingual Language Model Pretraining](https://arxiv.org/abs/1901.07291) by Guillaume Lample and Alexis Conneau.
 1. **[XLM-ProphetNet](https://huggingface.co/transformers/model_doc/xlmprophetnet.html)** (from Microsoft Research) released with the paper [ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training](https://arxiv.org/abs/2001.04063) by Yu Yan, Weizhen Qi, Yeyun Gong, Dayiheng Liu, Nan Duan, Jiusheng Chen, Ruofei Zhang and Ming Zhou.
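The only change here stabilizes the TAPAS documentation link by dropping the /master/ path segment, so the README points at the released docs rather than the development build. For orientation, a minimal sketch of using the model the entry describes; the checkpoint name and toy table are illustrative, not part of this commit:

    import pandas as pd
    from transformers import TapasForQuestionAnswering, TapasTokenizer

    # Illustrative checkpoint; any TAPAS checkpoint from the model hub works the same way.
    checkpoint = "google/tapas-base-finetuned-wtq"
    tokenizer = TapasTokenizer.from_pretrained(checkpoint)
    model = TapasForQuestionAnswering.from_pretrained(checkpoint)

    # TAPAS consumes a flat table (every cell as a string) plus natural-language queries.
    table = pd.DataFrame({"Actor": ["Brad Pitt", "Leonardo DiCaprio"], "Age": ["56", "45"]})
    inputs = tokenizer(table=table, queries=["How old is Brad Pitt?"], return_tensors="pt")
    outputs = model(**inputs)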

docs/source/conf.py (+1 −1)

@@ -26,7 +26,7 @@
 # The short X.Y version
 version = u''
 # The full version, including alpha/beta/rc tags
-release = u'4.1.1'
+release = u'4.2.0'
 # Prefix link to point to master, comment this during version release and uncomment below line
 extlinks = {'prefix_link': ('https://github.com/huggingface/transformers/blob/master/%s', '')}
 # Prefix link to always point to corresponding version, uncomment this during version release
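The two comments in this hunk describe a release dance built on Sphinx's sphinx.ext.extlinks: during development the :prefix_link: role resolves against master, and at release time that line is commented out in favor of a version-pinned prefix. A sketch of the two variants; the exact pinned URL pattern is an assumption inferred from the comments, since the pinned line is not shown in this diff:

    # Development: :prefix_link:`examples/README.md` -> .../blob/master/examples/README.md
    extlinks = {"prefix_link": ("https://github.com/huggingface/transformers/blob/master/%s", "")}

    # Release (assumed form): pin links to the tag so docs keep working after master moves on.
    # extlinks = {"prefix_link": ("https://github.com/huggingface/transformers/blob/v4.2.0/%s", "")}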

docs/source/index.rst (+3 −3)

@@ -181,9 +181,9 @@ and conversion utilities for the following models:
 32. :doc:`T5 <model_doc/t5>` (from Google AI) released with the paper `Exploring the Limits of Transfer Learning with a
     Unified Text-to-Text Transformer <https://arxiv.org/abs/1910.10683>`__ by Colin Raffel and Noam Shazeer and Adam
     Roberts and Katherine Lee and Sharan Narang and Michael Matena and Yanqi Zhou and Wei Li and Peter J. Liu.
-33. `TAPAS <https://huggingface.co/transformers/master/model_doc/tapas.html>`__ (from Google AI) released with the
-    paper `TAPAS: Weakly Supervised Table Parsing via Pre-training <https://arxiv.org/abs/2004.02349>`__ by Jonathan
-    Herzig, Paweł Krzysztof Nowak, Thomas Müller, Francesco Piccinno and Julian Martin Eisenschlos.
+33. :doc:`TAPAS <model_doc/tapas>` (from Google AI) released with the paper `TAPAS: Weakly Supervised Table Parsing via
+    Pre-training <https://arxiv.org/abs/2004.02349>`__ by Jonathan Herzig, Paweł Krzysztof Nowak, Thomas Müller,
+    Francesco Piccinno and Julian Martin Eisenschlos.
 34. :doc:`Transformer-XL <model_doc/transformerxl>` (from Google/CMU) released with the paper `Transformer-XL:
     Attentive Language Models Beyond a Fixed-Length Context <https://arxiv.org/abs/1901.02860>`__ by Zihang Dai*,
     Zhilin Yang*, Yiming Yang, Jaime Carbonell, Quoc V. Le, Ruslan Salakhutdinov.

setup.py (+2 −2)

@@ -26,7 +26,7 @@
 
 2. Commit these changes with the message: "Release: VERSION"
 
-3. Add a tag in git to mark the release: "git tag VERSION -m'Adds tag VERSION for pypi' "
+3. Add a tag in git to mark the release: "git tag VERSION -m 'Adds tag VERSION for pypi' "
    Push the tag to git: git push --tags origin master
 
 4. Build both the sources and the wheel. Do not change anything in setup.py between
@@ -248,7 +248,7 @@ def run(self):
 
 setup(
     name="transformers",
-    version="4.2.0.dev0",  # expected format is one of x.y.z.dev0, or x.y.z.rc1 or x.y.z (no to dashes, yes to dots)
+    version="4.2.0",  # expected format is one of x.y.z.dev0, or x.y.z.rc1 or x.y.z (no to dashes, yes to dots)
     author="Thomas Wolf, Lysandre Debut, Victor Sanh, Julien Chaumond, Sam Shleifer, Patrick von Platen, Sylvain Gugger, Google AI Language Team Authors, Open AI team Authors, Facebook AI Authors, Carnegie Mellon University Authors",
     author_email="[email protected]",
     description="State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch",

src/transformers/__init__.py (+1 −1)

@@ -22,7 +22,7 @@
 # to defer the actual importing for when the objects are requested. This way `import transformers` provides the names
 # in the namespace without actually importing anything (and especially none of the backends).
 
-__version__ = "4.2.0dev0"
+__version__ = "4.2.0"
 
 # Work around to update TensorFlow's absl.logging threshold which alters the
 # default Python logging output behavior when present.
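Note that the pre-release string "4.2.0dev0" lacked the dot that setup.py's stated x.y.z.dev0 format calls for; bumping to "4.2.0" also removes that inconsistency. Once the release is installed, the version is visible at runtime; a minimal check using the real transformers and packaging APIs:

    import transformers
    from packaging import version

    print(transformers.__version__)  # "4.2.0" for this release
    assert version.parse(transformers.__version__) >= version.parse("4.2.0")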
