Commit 965ce1c (parent 754bb8f): Update README.md
## t5-base-fine-tuned-on-jfleg

T5-base model fine-tuned on the JFLEG dataset with the objective of **text2text-generation**.
# Model Description:

T5 is an encoder-decoder model pre-trained on a multi-task mixture of unsupervised and supervised tasks, each of which is converted into a text-to-text format.

T5 works well on a variety of tasks out-of-the-box by prepending a different prefix to the input corresponding to each task, e.g., for translation: translate English to German: …, for summarization: summarize: ….
The T5 model was presented in **Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer** by Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, and Peter J. Liu.
## Pre-Processing:

For this task of grammar correction, we prepend the prefix “grammar: ” to each of the input sentences.
```
grammar: Your sentence
```
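As a minimal sketch, this prefixing step is a one-line helper (the lowercase “grammar: ” prefix is taken from the section above):

```python
def add_grammar_prefix(sentence: str) -> str:
    # Prepend the task prefix this model expects on every input sentence.
    return "grammar: " + sentence

print(add_grammar_prefix("He are moving here."))
# → grammar: He are moving here.
```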
## How to use:

You can use this model directly with the pipeline for text2text-generation.
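For example, with the Transformers `pipeline` API (the repo id below is a placeholder assumption; substitute this model's actual Hugging Face Hub path):

```python
from transformers import pipeline

# Assumption: replace "t5-base-fine-tuned-on-jfleg" with the model's actual Hub repo id.
corrector = pipeline("text2text-generation", model="t5-base-fine-tuned-on-jfleg")

# The input must carry the "grammar: " task prefix described above.
text = "grammar: This sentences has has bads grammar."
print(corrector(text)[0]["generated_text"])
```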