asahi417 committed on
Commit 80d5e1f
1 Parent(s): 604029a

model update

Files changed (1)
  1. README.md +75 -82
README.md CHANGED
@@ -7,7 +7,7 @@ metrics:
  - rouge-l
  - bertscore
  - moverscore
- language: en
  datasets:
  - lmqg/qg_dequad
  pipeline_tag: text2text-generation
@@ -15,18 +15,18 @@ tags:
  - question generation
  - answer extraction
  widget:
- - text: "generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records."
  example_title: "Question Generation Example 1"
- - text: "generate question: Beyonce further expanded her acting career, starring as blues singer <hl> Etta James <hl> in the 2008 musical biopic, Cadillac Records."
  example_title: "Question Generation Example 2"
- - text: "generate question: Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, <hl> Cadillac Records <hl> ."
  example_title: "Question Generation Example 3"
- - text: "<hl> Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records. <hl> Her performance in the film received praise from critics, and she garnered several nominations for her portrayal of James, including a Satellite Award nomination for Best Supporting Actress, and a NAACP Image Award nomination for Outstanding Supporting Actress."
  example_title: "Answer Extraction Example 1"
- - text: "Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records. <hl> Her performance in the film received praise from critics, and she garnered several nominations for her portrayal of James, including a Satellite Award nomination for Best Supporting Actress, and a NAACP Image Award nomination for Outstanding Supporting Actress. <hl>"
  example_title: "Answer Extraction Example 2"
  model-index:
- - name: lmqg/mt5-small-dequad-multitask
  results:
  - task:
  name: Text2text Generation
@@ -36,67 +36,48 @@ model-index:
  type: default
  args: default
  metrics:
- - name: BLEU4
- type: bleu4
- value: 0.008153318257935705
- - name: ROUGE-L
- type: rouge-l
- value: 0.10153326763371277
- - name: METEOR
- type: meteor
- value: 0.12181097136639749
- - name: BERTScore
- type: bertscore
- value: 0.8038890473051649
- - name: MoverScore
- type: moverscore
- value: 0.551016955735025
- - name: QAAlignedF1Score (BERTScore)
- type: qa_aligned_f1_score_bertscore
- value: 0.8001787025118497
- - name: QAAlignedRecall (BERTScore)
- type: qa_aligned_recall_bertscore
- value: 0.812320688466097
- - name: QAAlignedPrecision (BERTScore)
- type: qa_aligned_precision_bertscore
- value: 0.7891379324540309
- - name: QAAlignedF1Score (MoverScore)
- type: qa_aligned_f1_score_moverscore
- value: 0.5399337820733731
- - name: QAAlignedRecall (MoverScore)
- type: qa_aligned_recall_moverscore
- value: 0.5427046152241427
- - name: QAAlignedPrecision (MoverScore)
- type: qa_aligned_precision_moverscore
- value: 0.537701100527795
  ---

- # Model Card of `lmqg/mt5-small-dequad-multitask`
- This model is fine-tuned version of [google/mt5-small](https://huggingface.co/google/mt5-small) for question generation task on the
- [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) (dataset_name: default) via [`lmqg`](https://github.com/asahi417/lm-question-generation).
- This model is fine-tuned on the answer extraction task as well as the question generation.

- Please cite our paper if you use the model ([https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)).
-
- ```
-
- @inproceedings{ushio-etal-2022-generative,
- title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
- author = "Ushio, Asahi and
- Alva-Manchego, Fernando and
- Camacho-Collados, Jose",
- booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
- month = dec,
- year = "2022",
- address = "Abu Dhabi, U.A.E.",
- publisher = "Association for Computational Linguistics",
- }
-
- ```

  ### Overview
  - **Language model:** [google/mt5-small](https://huggingface.co/google/mt5-small)
- - **Language:** en
  - **Training data:** [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) (default)
  - **Online Demo:** [https://autoqg.net/](https://autoqg.net/)
  - **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
@@ -105,44 +86,57 @@ Please cite our paper if you use the model ([https://arxiv.org/abs/2210.03992](h
  ### Usage
  - With [`lmqg`](https://github.com/asahi417/lm-question-generation#lmqg-language-model-for-question-generation-)
  ```python
-
  from lmqg import TransformersQG
  # initialize model
- model = TransformersQG(language='en', model='lmqg/mt5-small-dequad-multitask')
  # model prediction
- question_answer = model.generate_qa("William Turner was an English painter who specialised in watercolour landscapes")

  ```

  - With `transformers`
  ```python
-
  from transformers import pipeline
- # initialize model
- pipe = pipeline("text2text-generation", 'lmqg/mt5-small-dequad-multitask')
  # answer extraction
- answer = pipe('extract answers: <hl> Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records. <hl> Her performance in the film received praise from critics, and she garnered several nominations for her portrayal of James, including a Satellite Award nomination for Best Supporting Actress, and a NAACP Image Award nomination for Outstanding Supporting Actress.')
 
  # question generation
- question = pipe('generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.')

  ```

- ## Evaluation Metrics


- ### Metrics

- | Dataset | Type | BLEU4 | ROUGE-L | METEOR | BERTScore | MoverScore | Link |
- |:--------|:-----|------:|--------:|-------:|----------:|-----------:|-----:|
- | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) | default | 0.008 | 0.102 | 0.122 | 0.804 | 0.551 | [link](https://huggingface.co/lmqg/mt5-small-dequad-multitask/raw/main/eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_dequad.default.json) |


- ### Metrics (QAG)

- | Dataset | Type | QA Aligned F1 Score (BERTScore) | QA Aligned F1 Score (MoverScore) | Link |
- |:--------|:-----|--------------------------------:|---------------------------------:|-----:|
- | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) | default | 0.8 | 0.54 | [link](https://huggingface.co/lmqg/mt5-small-dequad-multitask/raw/main/eval/metric.first.answer.paragraph.questions_answers.lmqg_qg_dequad.default.json) |
-

@@ -165,11 +159,10 @@ The following hyperparameters were used during fine-tuning:
  - gradient_accumulation_steps: 4
  - label_smoothing: 0.15

- The full configuration can be found at [fine-tuning config file](https://huggingface.co/lmqg/mt5-small-dequad-multitask/raw/main/trainer_config.json).

  ## Citation
  ```
-
  @inproceedings{ushio-etal-2022-generative,
  title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
  author = "Ushio, Asahi and
 
  - rouge-l
  - bertscore
  - moverscore
+ language: de
  datasets:
  - lmqg/qg_dequad
  pipeline_tag: text2text-generation

  - question generation
  - answer extraction
  widget:
+ - text: "generate question: Empfangs- und Sendeantenne sollen in ihrer Polarisation übereinstimmen, andernfalls <hl> wird die Signalübertragung stark gedämpft. <hl>"
  example_title: "Question Generation Example 1"
+ - text: "generate question: das erste weltweit errichtete Hermann Brehmer <hl> 1855 <hl> im niederschlesischen ''Görbersdorf'' (heute Sokołowsko, Polen)."
  example_title: "Question Generation Example 2"
+ - text: "generate question: Er muss Zyperngrieche sein und wird direkt für <hl> fünf Jahre <hl> gewählt (Art. 43 Abs. 1 der Verfassung) und verfügt über weitreichende Exekutivkompetenzen."
  example_title: "Question Generation Example 3"
+ - text: "extract answers: Sommerzeit <hl> Frühling <hl>: Umstellung von Normalzeit auf Sommerzeit die Uhr wird um eine Stunde ''vor''gestellt. Herbst: Umstellung von Sommerzeit auf Normalzeit – die Uhr wird um eine Stunde ''zurück''gestellt. Als Sommerzeit wird die gegenüber der Zonenzeit meist um eine Stunde vorgestellte Uhrzeit bezeichnet, die während eines bestimmten Zeitraums im Sommerhalbjahr (und oft auch etwas darüber hinaus) als gesetzliche Zeit dient. Eine solche Regelung wird fast nur in Ländern der gemäßigten Zonen angewandt. Die mitteleuropäische Sommerzeit beginnt am letzten Sonntag im März um 2:00 Uhr MEZ, indem die Stundenzählung um eine Stunde von 2:00 Uhr auf 3:00 Uhr vorgestellt wird. Sie endet jeweils am letzten Sonntag im Oktober um 3:00 Uhr MESZ, indem die Stundenzählung um eine Stunde von 3:00 Uhr auf 2:00 Uhr zurückgestellt wird."
  example_title: "Answer Extraction Example 1"
+ - text: "extract answers: Iran === Landwirtschaft === Die landwirtschaftliche Nutzfläche beträgt trotz zahlreicher Gebirge und Wüsten 10 % der Landesfläche, wobei ein Drittel künstlich bewässert wird. Die Landwirtschaft ist einer der größten Arbeitgeber des Landes. Wichtige Produkte sind Pistazien, Weizen, Reis, Zucker, Baumwolle, Früchte, Nüsse, Datteln, Wolle und Kaviar. Seit der Revolution von 1979 wurde der Anbau von Weintrauben wegen des islamischen Alkoholverbots auf den 200.000 Hektar Rebfläche fast vollständig auf Tafeltrauben und Rosinen umgestellt. Bei Rosinen ist <hl> der Iran <hl> inzwischen nach der Türkei der zweitgrößte Exporteur der Welt, bei Safran mit ungefähr 90 % Marktanteil des globalen Bedarfs mit Abstand der größte."
  example_title: "Answer Extraction Example 2"
  model-index:
+ - name: lmqg/mt5-small-dequad-qg-ae
  results:
  - task:
  name: Text2text Generation
 
  type: default
  args: default
  metrics:
+ - name: BLEU4 (Question Generation)
+ type: bleu4_question_generation
+ value: 0.82
+ - name: ROUGE-L (Question Generation)
+ type: rouge_l_question_generation
+ value: 10.15
+ - name: METEOR (Question Generation)
+ type: meteor_question_generation
+ value: 12.18
+ - name: BERTScore (Question Generation)
+ type: bertscore_question_generation
+ value: 80.39
+ - name: MoverScore (Question Generation)
+ type: moverscore_question_generation
+ value: 55.1
+ - name: QAAlignedF1Score-BERTScore (Question & Answer Generation)
+ type: qa_aligned_f1_score_bertscore_question_answer_generation
+ value: 80.02
+ - name: QAAlignedRecall-BERTScore (Question & Answer Generation)
+ type: qa_aligned_recall_bertscore_question_answer_generation
+ value: 81.23
+ - name: QAAlignedPrecision-BERTScore (Question & Answer Generation)
+ type: qa_aligned_precision_bertscore_question_answer_generation
+ value: 78.91
+ - name: QAAlignedF1Score-MoverScore (Question & Answer Generation)
+ type: qa_aligned_f1_score_moverscore_question_answer_generation
+ value: 53.99
+ - name: QAAlignedRecall-MoverScore (Question & Answer Generation)
+ type: qa_aligned_recall_moverscore_question_answer_generation
+ value: 54.27
+ - name: QAAlignedPrecision-MoverScore (Question & Answer Generation)
+ type: qa_aligned_precision_moverscore_question_answer_generation
+ value: 53.77
  ---

+ # Model Card of `lmqg/mt5-small-dequad-qg-ae`
+ This model is a fine-tuned version of [google/mt5-small](https://huggingface.co/google/mt5-small) for question generation and answer extraction, trained jointly on [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) (dataset_name: default) via [`lmqg`](https://github.com/asahi417/lm-question-generation).
 
 
  ### Overview
  - **Language model:** [google/mt5-small](https://huggingface.co/google/mt5-small)
+ - **Language:** de
  - **Training data:** [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) (default)
  - **Online Demo:** [https://autoqg.net/](https://autoqg.net/)
  - **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
 
  ### Usage
  - With [`lmqg`](https://github.com/asahi417/lm-question-generation#lmqg-language-model-for-question-generation-)
  ```python
  from lmqg import TransformersQG
+
  # initialize model
+ model = TransformersQG(language="de", model="lmqg/mt5-small-dequad-qg-ae")
+
  # model prediction
+ question_answer_pairs = model.generate_qa("das erste weltweit errichtete Hermann Brehmer 1855 im niederschlesischen ''Görbersdorf'' (heute Sokołowsko, Polen).")

  ```
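
The `generate_qa` call above returns the generated question–answer pairs for the input paragraph. A minimal sketch of consuming the result, assuming the list-of-`(question, answer)`-tuples format documented in the `lmqg` repository (the loop below is illustrative, not part of the model card):

```python
# Sketch: print each generated pair.
# Assumes question_answer_pairs is a list of (question, answer) tuples,
# as documented in the lmqg repository.
for question, answer in question_answer_pairs:
    print(f"Q: {question}")
    print(f"A: {answer}")
```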

  - With `transformers`
  ```python
  from transformers import pipeline
+
+ pipe = pipeline("text2text-generation", "lmqg/mt5-small-dequad-qg-ae")
+
  # answer extraction
+ answer = pipe("extract answers: Sommerzeit <hl> Frühling <hl>: Umstellung von Normalzeit auf Sommerzeit – die Uhr wird um eine Stunde ''vor''gestellt. Herbst: Umstellung von Sommerzeit auf Normalzeit – die Uhr wird um eine Stunde ''zurück''gestellt. Als Sommerzeit wird die gegenüber der Zonenzeit meist um eine Stunde vorgestellte Uhrzeit bezeichnet, die während eines bestimmten Zeitraums im Sommerhalbjahr (und oft auch etwas darüber hinaus) als gesetzliche Zeit dient. Eine solche Regelung wird fast nur in Ländern der gemäßigten Zonen angewandt. Die mitteleuropäische Sommerzeit beginnt am letzten Sonntag im März um 2:00 Uhr MEZ, indem die Stundenzählung um eine Stunde von 2:00 Uhr auf 3:00 Uhr vorgestellt wird. Sie endet jeweils am letzten Sonntag im Oktober um 3:00 Uhr MESZ, indem die Stundenzählung um eine Stunde von 3:00 Uhr auf 2:00 Uhr zurückgestellt wird.")
+
  # question generation
+ question = pipe("generate question: Empfangs- und Sendeantenne sollen in ihrer Polarisation übereinstimmen, andernfalls <hl> wird die Signalübertragung stark gedämpft. <hl>")

  ```
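
With plain `transformers`, the two prompt formats can also be chained to build question–answer pairs: extract an answer span first, then mark it with `<hl>` in the paragraph and generate the matching question. The sketch below illustrates this under the assumption that the extracted span occurs verbatim in the paragraph; the chaining itself is not prescribed by the model card.

```python
from transformers import pipeline

pipe = pipeline("text2text-generation", "lmqg/mt5-small-dequad-qg-ae")

paragraph = ("Empfangs- und Sendeantenne sollen in ihrer Polarisation übereinstimmen, "
             "andernfalls wird die Signalübertragung stark gedämpft.")

# Step 1: answer extraction over the highlighted sentence (here the whole paragraph).
answer = pipe(f"extract answers: <hl> {paragraph} <hl>")[0]["generated_text"]

# Step 2: highlight the extracted span in the paragraph and generate the question.
# Assumes the extracted span appears verbatim in the paragraph.
highlighted = paragraph.replace(answer, f"<hl> {answer} <hl>", 1)
question = pipe(f"generate question: {highlighted}")[0]["generated_text"]

print(question, "->", answer)
```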

+ ## Evaluation


+ - ***Metric (Question Generation)***: [raw metric file](https://huggingface.co/lmqg/mt5-small-dequad-qg-ae/raw/main/eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_dequad.default.json)

+ | | Score | Type | Dataset |
+ |:-----------|--------:|:--------|:-----------------------------------------------------------------|
+ | BERTScore | 80.39 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | Bleu_1 | 10.13 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | Bleu_2 | 4.24 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | Bleu_3 | 1.89 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | Bleu_4 | 0.82 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | METEOR | 12.18 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | MoverScore | 55.1 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | ROUGE_L | 10.15 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |


+ - ***Metric (Question & Answer Generation)***: [raw metric file](https://huggingface.co/lmqg/mt5-small-dequad-qg-ae/raw/main/eval/metric.first.answer.paragraph.questions_answers.lmqg_qg_dequad.default.json)

+ | | Score | Type | Dataset |
+ |:--------------------------------|--------:|:--------|:-----------------------------------------------------------------|
+ | QAAlignedF1Score (BERTScore) | 80.02 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | QAAlignedF1Score (MoverScore) | 53.99 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | QAAlignedPrecision (BERTScore) | 78.91 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | QAAlignedPrecision (MoverScore) | 53.77 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | QAAlignedRecall (BERTScore) | 81.23 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |
+ | QAAlignedRecall (MoverScore) | 54.27 | default | [lmqg/qg_dequad](https://huggingface.co/datasets/lmqg/qg_dequad) |


  - gradient_accumulation_steps: 4
  - label_smoothing: 0.15

+ The full configuration can be found at [fine-tuning config file](https://huggingface.co/lmqg/mt5-small-dequad-qg-ae/raw/main/trainer_config.json).
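
If needed, the linked `trainer_config.json` can also be read programmatically; a small sketch using `huggingface_hub` (the file name comes from the link above, and the key names are assumed to mirror the hyperparameter list):

```python
import json
from huggingface_hub import hf_hub_download

# Download trainer_config.json from the model repository and inspect it.
config_path = hf_hub_download(repo_id="lmqg/mt5-small-dequad-qg-ae", filename="trainer_config.json")
with open(config_path) as f:
    trainer_config = json.load(f)

# Assumed keys, mirroring the hyperparameter list above; .get() returns None if absent.
print(trainer_config.get("gradient_accumulation_steps"), trainer_config.get("label_smoothing"))
```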
 
  ## Citation
  ```
  @inproceedings{ushio-etal-2022-generative,
  title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
  author = "Ushio, Asahi and