Modfiededition committed: Update README.md

Commit cd18109 (1 parent: 42e659d)

Files changed (1): README.md (+3 −3)
## t5-base-fine-tuned-on-jfleg

T5-base model fine-tuned on the [JFLEG dataset](https://huggingface.co/datasets/jfleg) with the objective of **text2text-generation**.

# Model Description:

T5 is an encoder-decoder model pre-trained on a multi-task mixture of unsupervised and supervised tasks, in which each task is converted into a text-to-text format. T5 works well on a variety of tasks out of the box by prepending a task-specific prefix to the input, e.g., for translation: `translate English to German: …`; for summarization: `summarize: …`.

The T5 model was presented in [**Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer**](https://arxiv.org/pdf/1910.10683.pdf) by Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, and Peter J. Liu.

## Pre-Processing:

For this grammar-correction task, we prepend the prefix "grammar: " to each input sentence:

```
Grammar: Your Sentence
```
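The pre-processing step is plain string concatenation; here is a minimal sketch (the helper name is mine for illustration, not from the repo):

```python
def add_grammar_prefix(sentence: str) -> str:
    """Prepend the task prefix the fine-tuned T5 model expects."""
    return "grammar: " + sentence

# Prefix a small batch of raw sentences before feeding them to the model.
examples = ["He are moving here.", "I am looks forward to seeing you."]
batch = [add_grammar_prefix(s) for s in examples]
print(batch[0])  # grammar: He are moving here.
```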
## How to use:

You can use this model directly with the pipeline for detecting and correcting grammatical mistakes.

```
from transformers import pipeline
```
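A fuller usage sketch, under stated assumptions: the Hub repo id below is inferred from the author and model names on this page and is not confirmed by the README itself, and the generation arguments are illustrative:

```python
def correct_grammar(sentence: str) -> str:
    """Correct one sentence with the fine-tuned T5 checkpoint.

    Assumption: the repo id below is inferred from the author/model names
    on this page; replace it if the actual checkpoint id differs.
    """
    from transformers import pipeline  # heavy dependency, imported lazily

    corrector = pipeline(
        "text2text-generation",
        model="Modfiededition/t5-base-fine-tuned-on-jfleg",
    )
    # Inputs must carry the "grammar: " prefix from the pre-processing step.
    result = corrector("grammar: " + sentence, max_length=64)
    return result[0]["generated_text"]

# Example (downloads the checkpoint on first call):
# print(correct_grammar("He are moving here."))
```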