patrickvonplaten and jstjohn committed
Commit: 39ecb93
Parent: 9d8f8ca

Fix R vs S denoising documentation swap. (#2)


- Fix R vs S denoising documentation swap. (71c34c5c6e3e54c5bf2cee360c3edfaa42acc1b3)


Co-authored-by: John St John <jstjohn@users.noreply.huggingface.co>

Files changed (1):
  1. README.md (+4 -4)
README.md CHANGED
@@ -99,9 +99,9 @@ This model was contributed by [Daniel Hesslow](https://huggingface.co/Seledorn).
 The following shows how one can predict masked passages using the different denoising strategies.
 Given the size of the model the following examples need to be run on at least a 40GB A100 GPU.
 
-### R-Denoising
+### S-Denoising
 
-For *R-Denoising*, please make sure to prompt the text with the prefix `[S2S]` as shown below.
+For *S-Denoising*, please make sure to prompt the text with the prefix `[S2S]` as shown below.
 
 ```python
 from transformers import T5ForConditionalGeneration, AutoTokenizer
@@ -120,9 +120,9 @@ print(tokenizer.decode(outputs[0]))
 # -> <pad>. Dudley was a very good boy, but he was also very stupid.</s>
 ```
 
-### S-Denoising
+### R-Denoising
 
-For *S-Denoising*, please make sure to prompt the text with the prefix `[NLU]` as shown below.
+For *R-Denoising*, please make sure to prompt the text with the prefix `[NLU]` as shown below.
 
 ```python
 from transformers import T5ForConditionalGeneration, AutoTokenizer
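
For reference, here is a minimal sketch of what the corrected *S-Denoising* usage looks like after this swap. The checkpoint name (`google/ul2`), the example passage, and the generation settings are illustrative assumptions; only the import, the `[S2S]` prefix, and the decode call come from the diff above.

```python
# Minimal sketch of the corrected S-Denoising usage (post-swap).
# Assumptions for illustration: the "google/ul2" checkpoint name, the input
# passage, and max_length; the prefix convention is the one fixed above.
import torch
from transformers import T5ForConditionalGeneration, AutoTokenizer

model = T5ForConditionalGeneration.from_pretrained(
    "google/ul2", torch_dtype=torch.bfloat16
).to("cuda")  # the README notes these examples need at least a 40GB A100
tokenizer = AutoTokenizer.from_pretrained("google/ul2")

# S-Denoising: prefix the text with [S2S]; for R-Denoising, use [NLU] instead.
input_string = (
    "[S2S] The Dursleys had a small son called Dudley and "
    "in their opinion there was no finer boy anywhere <extra_id_0>"
)
inputs = tokenizer(input_string, return_tensors="pt").input_ids.to("cuda")

outputs = model.generate(inputs, max_length=200)
print(tokenizer.decode(outputs[0]))
```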