What:

T5 is a universal language model that reframes every NLP problem as a text-to-text problem: the input is a string, the output is a string, regardless of the task. Instead of masking single words like BERT, its pre-training objective masks variable-length spans of tokens (span corruption); T5 then predicts the masked spans auto-regressively.
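A minimal sketch of span corruption: each masked span is replaced in the input by a sentinel token (`<extra_id_0>`, `<extra_id_1>`, ...), and the target lists each sentinel followed by the tokens it replaced. This version takes fixed span positions for clarity; the real T5 objective samples spans randomly (15% of tokens, mean span length 3).

```python
def corrupt_spans(tokens, spans):
    """T5-style span corruption with pre-chosen (start, length) spans.

    Returns (input_text, target_text), where each masked span in the
    input is replaced by a sentinel and the target pairs each sentinel
    with the tokens it hid, ending with a final sentinel.
    """
    inputs, targets = [], []
    prev_end = 0
    for i, (start, length) in enumerate(spans):
        sentinel = f"<extra_id_{i}>"
        inputs.extend(tokens[prev_end:start])  # keep unmasked tokens
        inputs.append(sentinel)                # stand-in for the span
        targets.append(sentinel)
        targets.extend(tokens[start:start + length])  # the hidden tokens
        prev_end = start + length
    inputs.extend(tokens[prev_end:])
    targets.append(f"<extra_id_{len(spans)}>")  # end-of-targets sentinel
    return " ".join(inputs), " ".join(targets)

text = "Thank you for inviting me to your party last week".split()
src, tgt = corrupt_spans(text, [(2, 2), (8, 1)])
# src: "Thank you <extra_id_0> me to your party <extra_id_1> week"
# tgt: "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"
```

The decoder learns to emit only the masked material, which keeps targets short and pre-training cheap compared with reconstructing the full sequence.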

Prefixes:

| Task | Input (with prefix) | Output |
| --- | --- | --- |
| Sentiment | sentiment: I loved the movie. | positive |
| Translation | translate English to French: How are you? | Comment ça va ? |
| Summarization | summarize: The article discusses... | The article says... |
| QA | question: What is the capital of Spain? context: Spain is... | Madrid |
| Grammar correction | grammar: This sentence are bad. | This sentence is bad. |
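Because the prefix is just plain text prepended to the input, the same model handles every task with no task-specific heads. A small sketch of assembling prefixed inputs (the helper name is hypothetical; the prefix strings follow the table above):

```python
def t5_input(prefix, text):
    """Prepend a task prefix so one text-to-text model can route the task."""
    return f"{prefix}: {text}"

# Build a mixed-task batch: every example is just a string.
batch = [
    t5_input("sentiment", "I loved the movie."),
    t5_input("translate English to French", "How are you?"),
    t5_input("summarize", "The article discusses..."),
]
# batch[0] == "sentiment: I loved the movie."
```

At fine-tuning and inference time these strings are tokenized and fed to the encoder; the decoder generates the answer string, whether that is "positive", a French sentence, or a summary.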