## t5-base-fine-tuned-on-jfleg
A T5-base model fine-tuned on the JFLEG dataset for **text2text-generation**, applied here to grammatical error correction.

## Model Description:
T5 is an encoder-decoder model pre-trained on a multi-task mixture of unsupervised and supervised tasks, each of which is converted into a text-to-text format. T5 works well on a variety of tasks out of the box by prepending a task-specific prefix to the input, e.g., for translation: `translate English to German: …`, for summarization: `summarize: …`.
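For illustration, the same pre-trained `t5-base` checkpoint can be steered to different tasks purely through the input prefix (a minimal sketch using the `transformers` pipeline; the prefixes are those from the T5 paper):

```python
from transformers import pipeline

t5 = pipeline("text2text-generation", model="t5-base")

# The task is selected entirely by the prefix on the input text.
print(t5("translate English to German: How are you?")[0]["generated_text"])
print(t5("summarize: T5 casts every NLP task as text-to-text. ...")[0]["generated_text"])
```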

The T5 model was presented in **Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer** by Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu.

## Pre-Processing:
For this grammar-correction task, we prepend the prefix “grammar: ” to each input sentence.
```
grammar: Your Sentence
```
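
As a sketch, preparing a batch of inputs might look like this (the helper and example sentences are illustrative, not part of the model's API):

```python
# Hypothetical helper: prepend the task prefix before tokenization.
def add_prefix(sentences):
    return ["grammar: " + s for s in sentences]

inputs = add_prefix(["She are moving here.", "I likes turtles."])
# -> ['grammar: She are moving here.', 'grammar: I likes turtles.']
```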

## How to use:
You can use this model directly with the `pipeline` API for text2text-generation, as in the sketch below.
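
A minimal sketch, assuming the model is hosted on the Hugging Face Hub (the model ID below is a placeholder taken from this card's title; substitute the actual repo path):

```python
from transformers import pipeline

# Placeholder model ID; replace with the actual Hub repo path.
corrector = pipeline("text2text-generation", model="t5-base-fine-tuned-on-jfleg")

# Prepend the "grammar: " prefix used during fine-tuning.
output = corrector("grammar: She are moving here.")
print(output[0]["generated_text"])
```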