@@ -35,6 +35,8 @@ Choose the right framework for every part of a model's lifetime:
- Move a single model between TF2.0/PyTorch frameworks at will
- Seamlessly pick the right framework for training, evaluation, production

+ Experimental support for Flax with a few models right now, expected to grow in the coming months.
+
Contents
-----------------------------------------------------------------------------------------------------------------------
@@ -52,8 +54,8 @@ The documentation is organized in five parts:
- **MODELS** for the classes and functions related to each model implemented in the library.
- **INTERNAL HELPERS** for the classes and functions we use internally.

- The library currently contains PyTorch and Tensorflow implementations, pre-trained model weights, usage scripts and
- conversion utilities for the following models:
+ The library currently contains PyTorch, TensorFlow and Flax implementations, pretrained model weights, usage scripts
+ and conversion utilities for the following models:

..
    This list is updated automatically from the README with `make fix-copies`. Do not update manually!
@@ -166,6 +168,95 @@ conversion utilities for the following models:
34. `Other community models <https://huggingface.co/models>`__, contributed by the `community
    <https://huggingface.co/users>`__.

+
+ The table below represents the current support in the library for each of those models, whether they have a Python
+ tokenizer (called "slow"), a "fast" tokenizer backed by the 🤗 Tokenizers library, and whether they have support in
+ PyTorch, TensorFlow and/or Flax.
+
+ ..
+     This table is updated automatically from the auto modules with `make fix-copies`. Do not update manually!
+
+ .. rst-class:: center-aligned-table
+
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | Model                       | Tokenizer slow | Tokenizer fast | PyTorch support | TensorFlow support | Flax support |
+ +=============================+================+================+=================+====================+==============+
+ | ALBERT                      | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | BART                        | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | BERT                        | ✅             | ✅             | ✅              | ✅                 | ✅           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | Bert Generation             | ✅             | ❌             | ✅              | ❌                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | Blenderbot                  | ✅             | ❌             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | CTRL                        | ✅             | ❌             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | CamemBERT                   | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | DPR                         | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | DeBERTa                     | ✅             | ❌             | ✅              | ❌                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | DistilBERT                  | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | ELECTRA                     | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | Encoder decoder             | ❌             | ❌             | ✅              | ❌                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | FairSeq Machine-Translation | ✅             | ❌             | ✅              | ❌                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | FlauBERT                    | ✅             | ❌             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | Funnel Transformer          | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | LXMERT                      | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | LayoutLM                    | ✅             | ✅             | ✅              | ❌                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | Longformer                  | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | Marian                      | ✅             | ❌             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | MobileBERT                  | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | OpenAI GPT                  | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | OpenAI GPT-2                | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | Pegasus                     | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | ProphetNet                  | ✅             | ❌             | ✅              | ❌                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | RAG                         | ✅             | ❌             | ✅              | ❌                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | Reformer                    | ✅             | ✅             | ✅              | ❌                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | RetriBERT                   | ✅             | ✅             | ✅              | ❌                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | RoBERTa                     | ✅             | ✅             | ✅              | ✅                 | ✅           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | SqueezeBERT                 | ✅             | ✅             | ✅              | ❌                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | T5                          | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | Transformer-XL              | ✅             | ❌             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | XLM                         | ✅             | ❌             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | XLM-RoBERTa                 | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | XLMProphetNet               | ✅             | ❌             | ✅              | ❌                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | XLNet                       | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | mBART                       | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+ | mT5                         | ✅             | ✅             | ✅              | ✅                 | ❌           |
+ +-----------------------------+----------------+----------------+-----------------+--------------------+--------------+
+
+

.. toctree::
    :maxdepth: 2
    :caption: Get started