asahi417 committed
Commit 6a4a582
1 Parent(s): f278e5b

model update

Files changed (1)
  1. README.md +22 -10
README.md CHANGED
@@ -214,14 +214,14 @@ This model is fine-tuned version of [google/mt5-small](https://huggingface.co/go
 [lmqg/qg_squad](https://huggingface.co/datasets/lmqg/qg_squad) (dataset_name: default) via [`lmqg`](https://github.com/asahi417/lm-question-generation).
 
 
-Please cite our paper if you use the model ([TBA](TBA)).
+Please cite our paper if you use the model ([https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)).
 
 ```
 
 @inproceedings{ushio-etal-2022-generative,
-    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration: {A} {U}nified {B}enchmark and {E}valuation",
+    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
     author = "Ushio, Asahi and
-        Alva-Manchego, Fernando and
+        Alva-Manchego, Fernando and
         Camacho-Collados, Jose",
     booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
     month = dec,
@@ -238,17 +238,27 @@ Please cite our paper if you use the model ([TBA](TBA)).
 - **Training data:** [lmqg/qg_squad](https://huggingface.co/datasets/lmqg/qg_squad) (default)
 - **Online Demo:** [https://autoqg.net/](https://autoqg.net/)
 - **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
-- **Paper:** [TBA](TBA)
+- **Paper:** [https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)
 
 ### Usage
+- With [`lmqg`](https://github.com/asahi417/lm-question-generation#lmqg-language-model-for-question-generation-)
 ```python
 
-from transformers import pipeline
+from lmqg import TransformersQG
+# initialize model
+model = TransformersQG(language='en', model='lmqg/mt5-small-squad')
+# model prediction
+question = model.generate_q(list_context=["William Turner was an English painter who specialised in watercolour landscapes"], list_answer=["William Turner"])
+
+```
 
-model_path = 'lmqg/mt5-small-squad'
-pipe = pipeline("text2text-generation", model_path)
+- With `transformers`
+```python
 
-# Question Generation
+from transformers import pipeline
+# initialize model
+pipe = pipeline("text2text-generation", 'lmqg/mt5-small-squad')
+# question generation
 question = pipe('<hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.')
 
 ```
@@ -299,11 +309,12 @@ The following hyperparameters were used during fine-tuning:
 The full configuration can be found at [fine-tuning config file](https://huggingface.co/lmqg/mt5-small-squad/raw/main/trainer_config.json).
 
 ## Citation
+```
 
 @inproceedings{ushio-etal-2022-generative,
-    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration: {A} {U}nified {B}enchmark and {E}valuation",
+    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
     author = "Ushio, Asahi and
-        Alva-Manchego, Fernando and
+        Alva-Manchego, Fernando and
         Camacho-Collados, Jose",
     booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
     month = dec,
@@ -312,3 +323,4 @@ The full configuration can be found at [fine-tuning config file](https://hugging
     publisher = "Association for Computational Linguistics",
 }
 
+```