What:
T5 (Text-to-Text Transfer Transformer) is a unified language model that reframes every NLP problem as a text-to-text problem: both input and output are plain strings. Instead of masking single tokens like BERT, its pre-training objective masks variable-length spans of text, replacing each span with a sentinel token. T5 then predicts the masked spans auto-regressively.
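The span-corruption objective can be sketched in pure Python. This is an illustrative simplification, not the library implementation: real T5 pre-training samples span positions and lengths randomly, while here the spans are passed in explicitly. The sentinel naming `<extra_id_N>` follows T5's vocabulary.

```python
SENTINEL = "<extra_id_{}>"

def span_corrupt(tokens, spans):
    """Replace (start, length) spans with sentinel tokens.

    Returns (input_tokens, target_tokens): the corrupted encoder input
    and the auto-regressive target the model is trained to predict.
    `spans` must be non-overlapping and sorted by start index.
    """
    inp, tgt = [], []
    pos = 0
    for i, (start, length) in enumerate(spans):
        inp.extend(tokens[pos:start])       # keep unmasked text
        inp.append(SENTINEL.format(i))      # one sentinel per span
        tgt.append(SENTINEL.format(i))      # target: sentinel + span text
        tgt.extend(tokens[start:start + length])
        pos = start + length
    inp.extend(tokens[pos:])
    tgt.append(SENTINEL.format(len(spans))) # final sentinel terminates target
    return inp, tgt

tokens = "Thank you for inviting me to your party last week".split()
inp, tgt = span_corrupt(tokens, [(2, 2), (8, 1)])
print(" ".join(inp))
# Thank you <extra_id_0> me to your party <extra_id_1> week
print(" ".join(tgt))
# <extra_id_0> for inviting <extra_id_1> last <extra_id_2>
```

Masking the spans "for inviting" and "last" reproduces the example from the original T5 paper: the model sees the sentinels in the input and must regenerate the missing spans, delimited by those same sentinels, in the output.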
Prefixes:
| Task | Input (with prefix) | Output |
|---|---|---|
| Sentiment | sentiment: I loved the movie. | positive |
| Translation | translate English to French: How are you? | Comment ça va ? |
| Summarization | summarize: The article discusses... | The article says... |
| QA | question: What is the capital of Spain? context: Spain is... | Madrid |
| Grammar correction | grammar: This sentence are bad. | This sentence is bad. |
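Because every task is just a prefixed string, task formatting reduces to string concatenation. A minimal sketch of this framing, using the prefixes from the table (the helper name `to_text_to_text` is mine, not from T5):

```python
def to_text_to_text(prefix, text, context=None):
    """Format any task as one input string: '<prefix>: <text>'.

    The model emits the answer as plain text, so a single seq2seq
    model with a single loss covers every task in the table.
    """
    if context is not None:
        return f"{prefix}: {text} context: {context}"
    return f"{prefix}: {text}"

print(to_text_to_text("sentiment", "I loved the movie."))
# sentiment: I loved the movie.
print(to_text_to_text("question", "What is the capital of Spain?",
                      context="Spain is..."))
# question: What is the capital of Spain? context: Spain is...
```

The payoff of this design is that adding a new task requires no new output head or task-specific architecture, only a new prefix and training examples.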