@@ -46,7 +46,10 @@ The documentation is organized in five parts:
- **ADVANCED GUIDES** contains more advanced guides that are more specific to a given script or part of the library.
- **RESEARCH** focuses on tutorials that have less to do with how to use the library but more about general research in
  transformers models
- - **PACKAGE REFERENCE** contains the documentation of each public class and function.
+ - The last three sections contain the documentation of each public class and function, grouped in:
+ - **MAIN CLASSES** for the main classes exposing the important APIs of the library.
+ - **MODELS** for the classes and functions related to each model implemented in the library.
+ - **INTERNAL HELPERS** for the classes and functions we use internally.

The library currently contains PyTorch and TensorFlow implementations, pre-trained model weights, usage scripts and
conversion utilities for the following models:
@@ -188,50 +191,60 @@ conversion utilities for the following models:
.. toctree::
:maxdepth: 2
- :caption: Package Reference
+ :caption: Main Classes

main_classes/configuration
- main_classes/output
+ main_classes/logging
main_classes/model
- main_classes/tokenizer
- main_classes/pipelines
- main_classes/trainer
main_classes/optimizer_schedules
+ main_classes/output
+ main_classes/pipelines
main_classes/processors
- main_classes/logging
+ main_classes/tokenizer
+ main_classes/trainer
+
+ .. toctree::
+ :maxdepth: 2
+ :caption: Models
+
+ model_doc/albert
model_doc/auto
- model_doc/encoderdecoder
+ model_doc/bart
model_doc/bert
- model_doc/gpt
- model_doc/transformerxl
- model_doc/gpt2
- model_doc/xlm
- model_doc/xlnet
- model_doc/roberta
- model_doc/distilbert
- model_doc/ctrl
+ model_doc/bertgeneration
model_doc/camembert
- model_doc/albert
- model_doc/xlmroberta
- model_doc/flaubert
- model_doc/bart
- model_doc/t5
- model_doc/electra
+ model_doc/ctrl
model_doc/dialogpt
- model_doc/reformer
- model_doc/marian
- model_doc/longformer
- model_doc/retribert
- model_doc/mobilebert
+ model_doc/distilbert
model_doc/dpr
- model_doc/pegasus
- model_doc/mbart
+ model_doc/electra
+ model_doc/encoderdecoder
+ model_doc/flaubert
model_doc/fsmt
model_doc/funnel
- model_doc/lxmert
- model_doc/bertgeneration
model_doc/layoutlm
+ model_doc/longformer
+ model_doc/lxmert
+ model_doc/marian
+ model_doc/mbart
+ model_doc/mobilebert
+ model_doc/gpt
+ model_doc/gpt2
+ model_doc/pegasus
model_doc/rag
+ model_doc/reformer
+ model_doc/retribert
+ model_doc/roberta
+ model_doc/t5
+ model_doc/transformerxl
+ model_doc/xlm
+ model_doc/xlmroberta
+ model_doc/xlnet
+
+ .. toctree::
+ :maxdepth: 2
+ :caption: Internal Helpers
+
internal/modeling_utils
- internal/tokenization_utils
internal/pipelines_utils
+ internal/tokenization_utils
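For context on the pages regrouped above, the classes documented under the new ``Main Classes`` and ``Models`` captions are the ones usually reached through the Auto classes. Below is a minimal usage sketch, assuming ``transformers`` and PyTorch are installed; the checkpoint name is only illustrative.

```python
from transformers import AutoConfig, AutoModel, AutoTokenizer

# Illustrative checkpoint; any hub checkpoint with PyTorch weights works.
checkpoint = "bert-base-uncased"

config = AutoConfig.from_pretrained(checkpoint)        # documented under main_classes/configuration
tokenizer = AutoTokenizer.from_pretrained(checkpoint)  # documented under main_classes/tokenizer
model = AutoModel.from_pretrained(checkpoint)          # concrete model classes live under model_doc/*

inputs = tokenizer("Hello world!", return_tensors="pt")
outputs = model(**inputs)

# Depending on the library version, outputs is a plain tuple or a ModelOutput
# (see main_classes/output); indexing with 0 returns the last hidden state either way.
last_hidden_state = outputs[0]
print(last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```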