agemagician committed
Commit 89b9d2c
1 Parent(s): 7ac377f

update readme with more examples

Files changed (1): README.md +66 -0
README.md CHANGED
@@ -124,6 +124,8 @@ The model is mostly meant to be fine-tuned on a supervised dataset. See the [mod
 
  ### How to use
 
+ The following shows how one can extract the last hidden representation for the model.
+
  ```python
  from transformers import T5Tokenizer, LongT5Model
 
@@ -136,6 +138,70 @@ outputs = model(**inputs)
  last_hidden_states = outputs.last_hidden_state
  ```
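As a side note on the example above: the raw `last_hidden_state` is one vector per token. If a single fixed-size representation is wanted (e.g. for retrieval or clustering), a common approach is masked mean pooling. The sketch below is not part of the model card; it uses a dummy tensor in place of real model outputs, so it runs without downloading the model.

```python
import torch

def mean_pool(last_hidden_state: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    """Average token embeddings, ignoring padding positions."""
    mask = attention_mask.unsqueeze(-1).to(last_hidden_state.dtype)  # (batch, seq, 1)
    summed = (last_hidden_state * mask).sum(dim=1)                   # (batch, hidden)
    counts = mask.sum(dim=1).clamp(min=1e-9)                         # (batch, 1)
    return summed / counts

# Dummy stand-in for outputs.last_hidden_state (batch=2, seq=4, hidden=8)
hidden = torch.randn(2, 4, 8)
mask = torch.tensor([[1, 1, 1, 0], [1, 1, 0, 0]])  # second sequence has 2 padding tokens
pooled = mean_pool(hidden, mask)
print(pooled.shape)  # torch.Size([2, 8])
```

With real model outputs, `hidden` would be `outputs.last_hidden_state` and `mask` the `attention_mask` returned by the tokenizer.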
 
+ The following shows how one can predict masked passages using the different denoising strategies.
+
+ ### S-Denoising
+
+ For *S-Denoising*, please make sure to prompt the text with the prefix `[S2S]` as shown below.
+
+ ```python
+ from transformers import LongT5ForConditionalGeneration, T5Tokenizer
+ import torch
+
+ model = LongT5ForConditionalGeneration.from_pretrained("agemagician/mlong-t5-tglobal-xl", low_cpu_mem_usage=True, torch_dtype=torch.bfloat16).to("cuda")
+ tokenizer = T5Tokenizer.from_pretrained("agemagician/mlong-t5-tglobal-xl")
+
+ input_string = "[S2S] Mr. Dursley was the director of a firm called Grunnings, which made drills. He was a big, solid man with a bald head. Mrs. Dursley was thin and blonde and more than the usual amount of neck, which came in very useful as she spent so much of her time craning over garden fences, spying on the neighbours. The Dursleys had a small son called Dudley and in their opinion there was no finer boy anywhere <extra_id_0>"
+
+ inputs = tokenizer(input_string, return_tensors="pt").input_ids.to("cuda")
+
+ outputs = model.generate(inputs, max_length=200)
+
+ print(tokenizer.decode(outputs[0]))
+ ```
+
+ ### R-Denoising
+
+ For *R-Denoising*, please make sure to prompt the text with the prefix `[NLU]` as shown below.
+
+ ```python
+ from transformers import LongT5ForConditionalGeneration, T5Tokenizer
+ import torch
+
+ model = LongT5ForConditionalGeneration.from_pretrained("agemagician/mlong-t5-tglobal-xl", low_cpu_mem_usage=True, torch_dtype=torch.bfloat16).to("cuda")
+ tokenizer = T5Tokenizer.from_pretrained("agemagician/mlong-t5-tglobal-xl")
+
+ input_string = "[NLU] Mr. Dursley was the director of a firm called <extra_id_0>, which made <extra_id_1>. He was a big, solid man with a bald head. Mrs. Dursley was thin and <extra_id_2> of neck, which came in very useful as she spent so much of her time <extra_id_3>. The Dursleys had a small son called Dudley and <extra_id_4>"
+
+ inputs = tokenizer(input_string, return_tensors="pt", add_special_tokens=False).input_ids.to("cuda")
+
+ outputs = model.generate(inputs, max_length=200)
+
+ print(tokenizer.decode(outputs[0]))
+ ```
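A note on reading R-Denoising output: T5-style models answer span-corruption prompts with a sequence that interleaves the sentinel tokens with the predicted fills (e.g. `<extra_id_0> span0 <extra_id_1> span1 ...`). The helper below is a hypothetical post-processing sketch, not part of the model card or the `transformers` API; it works on the decoded string, and the `decoded` value here is a made-up illustration rather than real model output.

```python
import re

def parse_sentinel_spans(decoded: str) -> dict:
    """Map each <extra_id_N> sentinel in a decoded output string to its predicted fill."""
    # Split on sentinel tokens while keeping them in the result list.
    parts = re.split(r"(<extra_id_\d+>)", decoded)
    spans = {}
    current = None
    for part in parts:
        m = re.fullmatch(r"<extra_id_(\d+)>", part)
        if m:
            current = int(m.group(1))           # next non-sentinel chunk belongs to this id
        elif current is not None:
            spans[current] = part.replace("</s>", "").strip()
            current = None
    return spans

# Illustrative decoded string (invented, not actual model output):
decoded = "<pad> <extra_id_0> Grunnings <extra_id_1> drills <extra_id_2> blonde and had a lot of neck</s>"
print(parse_sentinel_spans(decoded))
# {0: 'Grunnings', 1: 'drills', 2: 'blonde and had a lot of neck'}
```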
+
+ ### X-Denoising
+
+ For *X-Denoising*, please make sure to prompt the text with the prefix `[NLG]` as shown below.
+
+ ```python
+ from transformers import LongT5ForConditionalGeneration, T5Tokenizer
+ import torch
+
+ model = LongT5ForConditionalGeneration.from_pretrained("agemagician/mlong-t5-tglobal-xl", low_cpu_mem_usage=True, torch_dtype=torch.bfloat16).to("cuda")
+ tokenizer = T5Tokenizer.from_pretrained("agemagician/mlong-t5-tglobal-xl")
+
+ input_string = "[NLG] Mr. Dursley was the director of a firm called Grunnings, which made drills. He was a big, solid man with a bald head. Mrs. Dursley was thin and blonde and more than the usual amount of neck, which came in very useful as she spent so much of her time craning over garden fences, spying on the neighbours. The Dursleys had a small son called Dudley and in their opinion there was no finer boy anywhere. <extra_id_0>"
+
+ inputs = tokenizer(input_string, return_tensors="pt", add_special_tokens=False).input_ids.to("cuda")
+
+ outputs = model.generate(inputs, max_length=200)
+
+ print(tokenizer.decode(outputs[0]))
+ ```
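All three examples above hard-code a CUDA device and `bfloat16`. On a machine without a GPU, a reasonable fallback is to select the device and dtype at runtime; the snippet below is a sketch of that choice, not part of the model card, and the commented `from_pretrained` call only shows how the selected values would slot into the examples above.

```python
import torch

# bfloat16 on GPU (as in the examples above), float32 as a safe CPU fallback.
device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.bfloat16 if device == "cuda" else torch.float32

print(device, dtype)

# The loading call would then become (sketch, not executed here):
# model = LongT5ForConditionalGeneration.from_pretrained(
#     "agemagician/mlong-t5-tglobal-xl", low_cpu_mem_usage=True, torch_dtype=dtype
# ).to(device)
```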
+
  ### BibTeX entry and citation info
 
  ```bibtex