Modfiededition committed on
Commit
965ce1c
1 Parent(s): 754bb8f

Update README.md

Files changed (1)
  1. README.md +15 -3
README.md CHANGED
@@ -1,11 +1,23 @@
- # Model Description:

- This model is a T5-base model which is pre-trained on the C4 dataset and fine-tuned on the JFLEG dataset. T5 is an encoder-decoder model pre-trained on a multi-task mixture of unsupervised and supervised tasks and for which each task is converted into a text-to-text format.
  The T5 model was presented in **Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer** by Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu.
 
  ## Pre-Processing:
- For this task of grammar correction, we’ll use the prefix “grammar: “ to each of the input sentences. This is done because T5 models are able to perform multiple tasks like translation and summarization with a single model, and a unique prefix is used for each task so that the model learns which task to perform.
 
+ ## t5-base-fine-tuned-on-jfleg
+ T5-base model fine-tuned on the JFLEG dataset with the objective of **text2text-generation**.
 
+ # Model Description:
+ T5 is an encoder-decoder model pre-trained on a multi-task mixture of unsupervised and supervised tasks, in which each task is converted into a text-to-text format.
+ T5 works well on a variety of tasks out-of-the-box by prepending a different prefix to the input corresponding to each task, e.g., for translation: translate English to German: …, for summarization: summarize: ….
 
  The T5 model was presented in **Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer** by Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu.
 
  ## Pre-Processing:
+ For this task of grammar correction, we’ll add the prefix “grammar: ” to each of the input sentences:
+ ```
+ grammar: Your Sentence
+ ```
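As a rough illustration of this pre-processing step, the prefix can be prepended before tokenization. This is only a sketch: the model id `Modfiededition/t5-base-fine-tuned-on-jfleg` is an assumption inferred from the repository name, and the example sentence is made up.

```python
# Sketch: prepend the "grammar: " task prefix before tokenizing.
# Assumption: the model id below matches this repository.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "Modfiededition/t5-base-fine-tuned-on-jfleg"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

sentence = "He are moving here."
inputs = tokenizer("grammar: " + sentence, return_tensors="pt")
outputs = model.generate(**inputs, max_length=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```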
+
+ ## How to use:
+ You can use this model directly with the pipeline for text2text-generation.
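For instance, a minimal sketch with the Transformers `pipeline` API (the model id `Modfiededition/t5-base-fine-tuned-on-jfleg` is an assumption inferred from the repository name):

```python
# Sketch: grammar correction via the text2text-generation pipeline.
from transformers import pipeline

corrector = pipeline(
    "text2text-generation",
    model="Modfiededition/t5-base-fine-tuned-on-jfleg",
)

# Remember to prepend the "grammar: " prefix described under Pre-Processing.
result = corrector("grammar: He are moving here.", max_length=64)
print(result[0]["generated_text"])
```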