asahi417 committed
Commit
570f968
1 Parent(s): 4f4a32d

model update

Files changed (1): README.md +22 -10
README.md CHANGED
@@ -283,14 +283,14 @@ This model is fine-tuned version of [facebook/bart-large](https://huggingface.co
 [lmqg/qg_squad](https://huggingface.co/datasets/lmqg/qg_squad) (dataset_name: default) via [`lmqg`](https://github.com/asahi417/lm-question-generation).
 
 
-Please cite our paper if you use the model ([TBA](TBA)).
+Please cite our paper if you use the model ([https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)).
 
 ```
 
 @inproceedings{ushio-etal-2022-generative,
-    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration: {A} {U}nified {B}enchmark and {E}valuation",
+    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
     author = "Ushio, Asahi and
-    Alva-Manchego, Fernando and
+    Alva-Manchego, Fernando and
     Camacho-Collados, Jose",
     booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
     month = dec,
@@ -307,17 +307,27 @@ Please cite our paper if you use the model ([TBA](TBA)).
 - **Training data:** [lmqg/qg_squad](https://huggingface.co/datasets/lmqg/qg_squad) (default)
 - **Online Demo:** [https://autoqg.net/](https://autoqg.net/)
 - **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
-- **Paper:** [TBA](TBA)
+- **Paper:** [https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)
 
 ### Usage
+- With [`lmqg`](https://github.com/asahi417/lm-question-generation#lmqg-language-model-for-question-generation-)
 ```python
 
-from transformers import pipeline
-
-model_path = 'lmqg/bart-large-squad'
-pipe = pipeline("text2text-generation", model_path)
-
-# Question Generation
+from lmqg import TransformersQG
+# initialize model
+model = TransformersQG(language='en', model='lmqg/bart-large-squad')
+# model prediction
+question = model.generate_q(list_context=["William Turner was an English painter who specialised in watercolour landscapes"], list_answer=["William Turner"])
+
+```
+
+- With `transformers`
+```python
+
+from transformers import pipeline
+# initialize model
+pipe = pipeline("text2text-generation", 'lmqg/bart-large-squad')
+# question generation
 question = pipe('<hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.')
 
 ```
@@ -371,11 +381,12 @@ The following hyperparameters were used during fine-tuning:
 The full configuration can be found at [fine-tuning config file](https://huggingface.co/lmqg/bart-large-squad/raw/main/trainer_config.json).
 
 ## Citation
+```
 
 @inproceedings{ushio-etal-2022-generative,
-    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration: {A} {U}nified {B}enchmark and {E}valuation",
+    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
     author = "Ushio, Asahi and
-    Alva-Manchego, Fernando and
+    Alva-Manchego, Fernando and
     Camacho-Collados, Jose",
     booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
     month = dec,
@@ -384,3 +395,4 @@ The full configuration can be found at [fine-tuning config file](https://hugging
     publisher = "Association for Computational Linguistics",
 }
 
+```
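The `transformers` example in the updated usage section expects the answer span to be wrapped in `<hl>` tokens inside the context before it is passed to the pipeline. A minimal sketch of building such an input — the `highlight_answer` helper is hypothetical, not part of `lmqg`, which does this internally:

```python
def highlight_answer(context: str, answer: str) -> str:
    """Wrap the first occurrence of the answer span in <hl> tokens,
    the input format used by the pipeline example in this README."""
    if answer not in context:
        raise ValueError("answer must appear verbatim in the context")
    return context.replace(answer, f"<hl> {answer} <hl>", 1)

context = ("Beyonce further expanded her acting career, starring as blues singer "
           "Etta James in the 2008 musical biopic, Cadillac Records.")
print(highlight_answer(context, "Beyonce"))
# -> '<hl> Beyonce <hl> further expanded her acting career, ...'
```

The resulting string can be fed directly to `pipe(...)` as shown in the diff above; only the first occurrence is highlighted, since one question is generated per answer span.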