T5 (Text-To-Text Transfer Transformer) is a sequence-to-sequence model that frames all NLP tasks as text-to-text problems. Originally introduced by Raffel et al. in "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer", it applies a single architecture and training objective across translation, summarization, classification, question answering, and more.
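The text-to-text framing means every task is expressed as an input string (with a task prefix) mapped to an output string. A minimal sketch of this idea, using task prefixes from the T5 paper (the `make_t5_input` helper is hypothetical, for illustration only):

```python
def make_t5_input(task_prefix: str, text: str) -> str:
    """Prepend a task prefix so one seq2seq model can serve any task."""
    return f"{task_prefix}: {text}"

# Translation, summarization, and classification all share one string-in,
# string-out interface; only the prefix changes.
examples = [
    make_t5_input("translate English to German", "The house is wonderful."),
    make_t5_input("summarize", "state authorities dispatched emergency crews ..."),
    make_t5_input("cola sentence", "The course is jumping well."),  # acceptability
]
for e in examples:
    print(e)
```

Because outputs are also plain text, even classification targets (e.g. "acceptable" / "not acceptable") are generated as strings rather than predicted as class indices.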
For more information on using our T5 implementation, visit its model page in our documentation.