asahi417 committed
Commit 69719e1
Parent: 9986e23

model update

Files changed (1)
  1. README.md +22 -10
README.md CHANGED
@@ -53,14 +53,14 @@ This model is fine-tuned version of [t5-large](https://huggingface.co/t5-large)
  [lmqg/qg_subjqa](https://huggingface.co/datasets/lmqg/qg_subjqa) (dataset_name: restaurants) via [`lmqg`](https://github.com/asahi417/lm-question-generation).


- Please cite our paper if you use the model ([TBA](TBA)).
+ Please cite our paper if you use the model ([https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)).

  ```

  @inproceedings{ushio-etal-2022-generative,
- title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration: {A} {U}nified {B}enchmark and {E}valuation",
+ title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
  author = "Ushio, Asahi and
- Alva-Manchego, Fernando and
+ Alva-Manchego, Fernando and
  Camacho-Collados, Jose",
  booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
  month = dec,
@@ -77,17 +77,27 @@ Please cite our paper if you use the model ([TBA](TBA)).
  - **Training data:** [lmqg/qg_subjqa](https://huggingface.co/datasets/lmqg/qg_subjqa) (restaurants)
  - **Online Demo:** [https://autoqg.net/](https://autoqg.net/)
  - **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
- - **Paper:** [TBA](TBA)
+ - **Paper:** [https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)

  ### Usage
+ - With [`lmqg`](https://github.com/asahi417/lm-question-generation#lmqg-language-model-for-question-generation-)
  ```python

- from transformers import pipeline
+ from lmqg import TransformersQG
+ # initialize model
+ model = TransformersQG(language='en', model='lmqg/t5-large-subjqa-vanilla-restaurants')
+ # model prediction
+ question = model.generate_q(list_context=["William Turner was an English painter who specialised in watercolour landscapes"], list_answer=["William Turner"])
+
+ ```

- model_path = 'lmqg/t5-large-subjqa-vanilla-restaurants'
- pipe = pipeline("text2text-generation", model_path)
+ - With `transformers`
+ ```python

- # Question Generation
+ from transformers import pipeline
+ # initialize model
+ pipe = pipeline("text2text-generation", 'lmqg/t5-large-subjqa-vanilla-restaurants')
+ # question generation
  question = pipe('generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.')

  ```
@@ -126,11 +136,12 @@ The following hyperparameters were used during fine-tuning:
  The full configuration can be found at [fine-tuning config file](https://huggingface.co/lmqg/t5-large-subjqa-vanilla-restaurants/raw/main/trainer_config.json).

  ## Citation
+ ```

  @inproceedings{ushio-etal-2022-generative,
- title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration: {A} {U}nified {B}enchmark and {E}valuation",
+ title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
  author = "Ushio, Asahi and
- Alva-Manchego, Fernando and
+ Alva-Manchego, Fernando and
  Camacho-Collados, Jose",
  booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
  month = dec,
@@ -139,3 +150,4 @@ The full configuration can be found at [fine-tuning config file](https://huggingface.co/lmqg/t5-large-subjqa-vanilla-restaurants/raw/main/trainer_config.json).
  publisher = "Association for Computational Linguistics",
  }

+ ```
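A note on the `transformers` usage added in this revision: the pipeline expects the answer span wrapped in `<hl>` tokens and the `generate question:` task prefix, exactly as in the README example above. The sketch below builds that input from a raw context/answer pair; the `highlight_answer` helper is illustrative and not part of the model card.

```python
# Minimal sketch of preparing input for the text2text-generation pipeline.
# The <hl> markers and "generate question: " prefix follow the README example;
# the highlight_answer helper itself is illustrative, not part of the model card.
from transformers import pipeline

def highlight_answer(context: str, answer: str) -> str:
    """Wrap the answer span in <hl> tokens and prepend the task prefix."""
    highlighted = context.replace(answer, f"<hl> {answer} <hl>", 1)
    return f"generate question: {highlighted}"

pipe = pipeline("text2text-generation", "lmqg/t5-large-subjqa-vanilla-restaurants")
prompt = highlight_answer(
    "Beyonce further expanded her acting career, starring as blues singer "
    "Etta James in the 2008 musical biopic, Cadillac Records.",
    "Beyonce",
)
# the pipeline returns a list of dicts; the question text is under 'generated_text'
print(pipe(prompt)[0]["generated_text"])
```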
 
 
 
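The `lmqg` example passes parallel `list_context` / `list_answer` arguments, which suggests `generate_q` can take several context/answer pairs in one call. A sketch under that assumption (the batching behaviour is inferred from the argument names, not verified here):

```python
# Sketch assuming generate_q batches over parallel lists, as the
# list_context / list_answer argument names in the README suggest.
from lmqg import TransformersQG

model = TransformersQG(language='en', model='lmqg/t5-large-subjqa-vanilla-restaurants')
contexts = [
    "William Turner was an English painter who specialised in watercolour landscapes",
    "Beyonce starred as blues singer Etta James in the 2008 musical biopic, Cadillac Records.",
]
answers = ["William Turner", "Etta James"]
# assumed: one generated question per (context, answer) pair, in order
questions = model.generate_q(list_context=contexts, list_answer=answers)
for question in questions:
    print(question)
```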
 
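The card also keeps pointing at the fine-tuning configuration (`trainer_config.json`). A minimal sketch of pulling that file to inspect the hyperparameters, assuming `huggingface_hub` is installed:

```python
# Download the trainer_config.json referenced in the README and print the
# fine-tuning hyperparameters it contains; the file is cached locally.
import json
from huggingface_hub import hf_hub_download

config_path = hf_hub_download(
    repo_id="lmqg/t5-large-subjqa-vanilla-restaurants",
    filename="trainer_config.json",
)
with open(config_path) as f:
    config = json.load(f)
for key, value in config.items():
    print(f"{key}: {value}")
```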