
# T5 Language Models

T5 (Text-To-Text Transfer Transformer) is a sequence-to-sequence model that frames all NLP tasks as text-to-text problems. Originally introduced by Raffel et al. in *Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer*, this model enables a single architecture to be applied across translation, summarization, classification, question answering, and more.
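To make the text-to-text framing concrete, here is a minimal sketch of how tasks are cast as prefixed input strings, following the prefix conventions described in the T5 paper. The `to_text_to_text` helper and its prefix table are illustrative assumptions, not part of this repository's API:

```python
# Sketch of T5's text-to-text framing: every task is expressed as
# "task prefix + input text" -> "output text", so a single
# sequence-to-sequence model can serve all tasks.
# The prefixes follow the examples given in the T5 paper;
# this helper function itself is hypothetical.

def to_text_to_text(task: str, text: str) -> str:
    """Prepend a task-specific prefix to form the model's input string."""
    prefixes = {
        "translate_en_de": "translate English to German: ",
        "summarize": "summarize: ",
        "cola": "cola sentence: ",  # grammatical-acceptability classification
    }
    return prefixes[task] + text

# Translation and classification become the same kind of input:
print(to_text_to_text("translate_en_de", "The house is wonderful."))
# -> "translate English to German: The house is wonderful."
print(to_text_to_text("cola", "The course is jumping well."))
# -> "cola sentence: The course is jumping well."
```

The model then emits the answer as plain text (a German sentence, the word "acceptable"/"unacceptable", etc.), which is what lets one architecture and one loss cover such different tasks.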

For more information on using our T5 implementation, visit its model page in our documentation.