asahi417 committed
Commit a896544
1 parent: 2db81ee

model update
README.md CHANGED
@@ -26,7 +26,7 @@ widget:
 - text: "extract answers: 지난 22일 아프리카TV는 BJ 철구가 서비스 정지 처분을 받았음을 밝혔다. 서비스 정지 처분을 사유는 철구가 10대 청소년에게 유해한 장면을 방송으로 내보냈기 때문이었다. 문제가 된 장면은 BJ 철구가 미성년자는 시청할 수 없게 하는 19세 시청 가능 설정을 하지 않은 채 흡연하는 모습을 여과 없이 드러낸 장면이다. 아프리카TV는 청소년 보호 정책의 '청소년들이 해로운 환경으로부터 보호받을 수 있도록 조치한다'라고 조항을 근거로 철구에게 서비스 정지 처분을 내렸다. 흡연 이외에 음주 방송 등도 19세 시청 가능 설정을 해야만 방송할 수 있다. <hl> 게다가 철구의 방송 정지 처분은 이번에 처음이 아니라 16번 째기 때문에 더욱더 논란이 되고 있다. <hl>"
   example_title: "Answer Extraction Example 2"
 model-index:
-- name: lmqg/mt5-base-koquad-multitask
+- name: lmqg/mt5-base-koquad-qg-ae
   results:
   - task:
       name: Text2text Generation
@@ -51,34 +51,49 @@ model-index:
     - name: MoverScore (Question Generation)
       type: moverscore_question_generation
       value: 83.24
-    - name: QAAlignedF1Score-BERTScore
-      type: qa_aligned_f1_score_bertscore
+    - name: QAAlignedF1Score-BERTScore (Question & Answer Generation)
+      type: qa_aligned_f1_score_bertscore_question_answer_generation
       value: 80.28
-    - name: QAAlignedRecall-BERTScore
-      type: qa_aligned_recall_bertscore
+    - name: QAAlignedRecall-BERTScore (Question & Answer Generation)
+      type: qa_aligned_recall_bertscore_question_answer_generation
       value: 83.91
-    - name: QAAlignedPrecision-BERTScore
-      type: qa_aligned_precision_bertscore
+    - name: QAAlignedPrecision-BERTScore (Question & Answer Generation)
+      type: qa_aligned_precision_bertscore_question_answer_generation
       value: 77.03
-    - name: QAAlignedF1Score-MoverScore
-      type: qa_aligned_f1_score_moverscore
+    - name: QAAlignedF1Score-MoverScore (Question & Answer Generation)
+      type: qa_aligned_f1_score_moverscore_question_answer_generation
       value: 81.97
-    - name: QAAlignedRecall-MoverScore
-      type: qa_aligned_recall_moverscore
+    - name: QAAlignedRecall-MoverScore (Question & Answer Generation)
+      type: qa_aligned_recall_moverscore_question_answer_generation
       value: 86.43
-    - name: QAAlignedPrecision-MoverScore
-      type: qa_aligned_precision_moverscore
+    - name: QAAlignedPrecision-MoverScore (Question & Answer Generation)
+      type: qa_aligned_precision_moverscore_question_answer_generation
       value: 78.1
+    - name: BLEU4 (Answer Extraction)
+      type: bleu4_answer_extraction
+      value: 34.98
+    - name: ROUGE-L (Answer Extraction)
+      type: rouge_l_answer_extraction
+      value: 83.83
+    - name: METEOR (Answer Extraction)
+      type: meteor_answer_extraction
+      value: 61.26
+    - name: BERTScore (Answer Extraction)
+      type: bertscore_answer_extraction
+      value: 96.14
+    - name: MoverScore (Answer Extraction)
+      type: moverscore_answer_extraction
+      value: 95.2
     - name: AnswerF1Score (Answer Extraction)
-      type: answer_f1_score_answer_extraction
+      type: answer_f1_score__answer_extraction
       value: 88.43
     - name: AnswerExactMatch (Answer Extraction)
       type: answer_exact_match_answer_extraction
       value: 83.02
 ---
 
-# Model Card of `lmqg/mt5-base-koquad-multitask`
-This model is fine-tuned version of [google/mt5-base](https://huggingface.co/google/mt5-base) for question generation task and answer extraction jointly on the [lmqg/qg_koquad](https://huggingface.co/datasets/lmqg/qg_koquad) (dataset_name: default) via [`lmqg`](https://github.com/asahi417/lm-question-generation).
+# Model Card of `lmqg/mt5-base-koquad-qg-ae`
+This model is fine-tuned version of [google/mt5-base](https://huggingface.co/google/mt5-base) for question generation and answer extraction jointly on the [lmqg/qg_koquad](https://huggingface.co/datasets/lmqg/qg_koquad) (dataset_name: default) via [`lmqg`](https://github.com/asahi417/lm-question-generation).
 
 
 ### Overview
@@ -95,7 +110,7 @@ This model is fine-tuned version of [google/mt5-base](https://huggingface.co/goo
 from lmqg import TransformersQG
 
 # initialize model
-model = TransformersQG(language="ko", model="lmqg/mt5-base-koquad-multitask")
+model = TransformersQG(language="ko", model="lmqg/mt5-base-koquad-qg-ae")
 
 # model prediction
 question_answer_pairs = model.generate_qa("1990년 영화 《 남부군 》에서 단역으로 영화배우 첫 데뷔에 이어 같은 해 KBS 드라마 《지구인》에서 단역으로 출연하였고 이듬해 MBC 《여명의 눈동자》를 통해 단역으로 출연하였다.")
@@ -106,7 +121,7 @@ question_answer_pairs = model.generate_qa("1990년 영화 《 남부군 》에
 ```python
 from transformers import pipeline
 
-pipe = pipeline("text2text-generation", "lmqg/mt5-base-koquad-multitask")
+pipe = pipeline("text2text-generation", "lmqg/mt5-base-koquad-qg-ae")
 
 # answer extraction
 answer = pipe("generate question: 1990년 영화 《 <hl> 남부군 <hl> 》에서 단역으로 영화배우 첫 데뷔에 이어 같은 해 KBS 드라마 《지구인》에서 단역으로 출연하였고 이듬해 MBC 《여명의 눈동자》를 통해 단역으로 출연하였다.")
@@ -119,7 +134,7 @@ question = pipe("extract answers: 또한 스피어스는 많은 새로운 여성
 ## Evaluation
 
 
-- ***Metric (Question Generation)***: [raw metric file](https://huggingface.co/lmqg/mt5-base-koquad-multitask/raw/main/eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_koquad.default.json)
+- ***Metric (Question Generation)***: [raw metric file](https://huggingface.co/lmqg/mt5-base-koquad-qg-ae/raw/main/eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_koquad.default.json)
 
 | | Score | Type | Dataset |
 |:-----------|--------:|:--------|:-----------------------------------------------------------------|
@@ -133,7 +148,7 @@ question = pipe("extract answers: 또한 스피어스는 많은 새로운 여성
 | ROUGE_L | 28.55 | default | [lmqg/qg_koquad](https://huggingface.co/datasets/lmqg/qg_koquad) |
 
 
-- ***Metric (Question & Answer Generation)***: [raw metric file](https://huggingface.co/lmqg/mt5-base-koquad-multitask/raw/main/eval/metric.first.answer.paragraph.questions_answers.lmqg_qg_koquad.default.json)
+- ***Metric (Question & Answer Generation)***: [raw metric file](https://huggingface.co/lmqg/mt5-base-koquad-qg-ae/raw/main/eval/metric.first.answer.paragraph.questions_answers.lmqg_qg_koquad.default.json)
 
 | | Score | Type | Dataset |
 |:--------------------------------|--------:|:--------|:-----------------------------------------------------------------|
@@ -145,7 +160,7 @@ question = pipe("extract answers: 또한 스피어스는 많은 새로운 여성
 | QAAlignedRecall (MoverScore) | 86.43 | default | [lmqg/qg_koquad](https://huggingface.co/datasets/lmqg/qg_koquad) |
 
 
-- ***Metric (Answer Generation)***: [raw metric file](https://huggingface.co/lmqg/mt5-base-koquad-multitask/raw/main/eval/metric.first.answer.paragraph_sentence.answer.lmqg_qg_koquad.default.json)
+- ***Metric (Answer Extraction)***: [raw metric file](https://huggingface.co/lmqg/mt5-base-koquad-qg-ae/raw/main/eval/metric.first.answer.paragraph_sentence.answer.lmqg_qg_koquad.default.json)
 
 | | Score | Type | Dataset |
 |:-----------------|--------:|:--------|:-----------------------------------------------------------------|
@@ -181,7 +196,7 @@ The following hyperparameters were used during fine-tuning:
 - gradient_accumulation_steps: 2
 - label_smoothing: 0.15
 
-The full configuration can be found at [fine-tuning config file](https://huggingface.co/lmqg/mt5-base-koquad-multitask/raw/main/trainer_config.json).
+The full configuration can be found at [fine-tuning config file](https://huggingface.co/lmqg/mt5-base-koquad-qg-ae/raw/main/trainer_config.json).
 
 ## Citation
 ```
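The `generate question:` prompts in the card's examples wrap the answer span in `<hl>` highlight tokens before passing the text to the model. A minimal sketch of building such a prompt, assuming the first occurrence of the answer is the span to highlight (`make_qg_prompt` is our illustrative helper, not part of the `lmqg` library):

```python
def make_qg_prompt(paragraph: str, answer: str) -> str:
    # Wrap the first occurrence of the answer span in <hl> markers,
    # matching the input format shown in the model card's examples.
    highlighted = paragraph.replace(answer, f"<hl> {answer} <hl>", 1)
    return f"generate question: {highlighted}"

prompt = make_qg_prompt(
    "1990년 영화 《 남부군 》에서 단역으로 영화배우 첫 데뷔에 이어 같은 해 "
    "KBS 드라마 《지구인》에서 단역으로 출연하였다.",
    "남부군",
)
```

The resulting string has the same shape as the card's `pipe("generate question: … <hl> 남부군 <hl> …")` input.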
eval/metric.first.answer.paragraph.questions_answers.lmqg_qg_koquad.default.json CHANGED
@@ -1,5 +1 @@
-<<<<<<< HEAD
-{"test": {"QAAlignedF1Score (BERTScore)": 0.8027605790024022, "QAAlignedRecall (BERTScore)": 0.8391231145490027, "QAAlignedPrecision (BERTScore)": 0.7702733584383046, "QAAlignedF1Score (MoverScore)": 0.8197272130915817, "QAAlignedRecall (MoverScore)": 0.8642533114826558, "QAAlignedPrecision (MoverScore)": 0.7810095393516585}, "validation": {"QAAlignedF1Score (BERTScore)": 0.8270067657961688, "QAAlignedRecall (BERTScore)": 0.8366250100173795, "QAAlignedPrecision (BERTScore)": 0.8181326643551915, "QAAlignedF1Score (MoverScore)": 0.8616471419521426, "QAAlignedRecall (MoverScore)": 0.8687903601401978, "QAAlignedPrecision (MoverScore)": 0.8559235660539652}}
-=======
 {"test": {"QAAlignedF1Score (BERTScore)": 0.8027605791423696, "QAAlignedRecall (BERTScore)": 0.8391231148330259, "QAAlignedPrecision (BERTScore)": 0.7702733584479435, "QAAlignedF1Score (MoverScore)": 0.8197379797060238, "QAAlignedRecall (MoverScore)": 0.8642724490753525, "QAAlignedPrecision (MoverScore)": 0.7810142692565831}, "validation": {"QAAlignedF1Score (BERTScore)": 0.8270067656226541, "QAAlignedRecall (BERTScore)": 0.8366250098715292, "QAAlignedPrecision (BERTScore)": 0.8181326641770484, "QAAlignedF1Score (MoverScore)": 0.8616624306894872, "QAAlignedRecall (MoverScore)": 0.8688060705285917, "QAAlignedPrecision (MoverScore)": 0.8559387631638197}}
->>>>>>> f6f982b5193f2a958e645f57b4457c79bdf5a15c
eval/metric.first.answer.paragraph_sentence.answer.lmqg_qg_koquad.default.json CHANGED
@@ -1,5 +1 @@
-<<<<<<< HEAD
-{"validation": {"Bleu_1": 0.7234977041324892, "Bleu_2": 0.6232681100476516, "Bleu_3": 0.4884212653629802, "Bleu_4": 0.33498483094197157, "METEOR": 0.5963769689946691, "ROUGE_L": 0.8159680680273093, "BERTScore": 0.9524190997058138, "MoverScore": 0.9417069414119854, "AnswerF1Score": 84.91490896277583, "AnswerExactMatch": 79.48317724592438}, "test": {"Bleu_1": 0.7493322781678949, "Bleu_2": 0.6538723254802299, "Bleu_3": 0.5138896890145673, "Bleu_4": 0.3497542417446219, "METEOR": 0.6126328482686303, "ROUGE_L": 0.8383120221719403, "BERTScore": 0.9613852180822252, "MoverScore": 0.9520114109966031, "AnswerF1Score": 88.43194707215045, "AnswerExactMatch": 83.02115851543532}}
-=======
 {"validation": {"Bleu_1": 0.7234977041324892, "Bleu_2": 0.6232681100476516, "Bleu_3": 0.4884212653629802, "Bleu_4": 0.33498483094197157, "METEOR": 0.5963769689946691, "ROUGE_L": 0.8159680680273093, "BERTScore": 0.9524191018611327, "MoverScore": 0.9417059572948544, "AnswerF1Score": 84.91490896277583, "AnswerExactMatch": 79.48317724592438}, "test": {"Bleu_1": 0.7493322781678949, "Bleu_2": 0.6538723254802299, "Bleu_3": 0.5138896890145673, "Bleu_4": 0.3497542417446219, "METEOR": 0.6126328482686303, "ROUGE_L": 0.8383120221719403, "BERTScore": 0.9613852186352686, "MoverScore": 0.952047569907686, "AnswerF1Score": 88.43194707215045, "AnswerExactMatch": 83.02115851543532}}
->>>>>>> f6f982b5193f2a958e645f57b4457c79bdf5a15c
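The scores in the README's model-index come from these raw metric files: metrics stored on a 0-1 scale are multiplied by 100 and rounded to two decimals, while `AnswerF1Score` and `AnswerExactMatch` are already on a 0-100 scale. A small sketch of that reading (our interpretation of the rounding convention, using test-split values from the file above):

```python
import json

# Test-split values copied from the raw metric file above.
raw = json.loads(
    '{"test": {"Bleu_4": 0.3497542417446219, '
    '"ROUGE_L": 0.8383120221719403, '
    '"AnswerF1Score": 88.43194707215045}}'
)

# 0-1 scale metrics become two-decimal percentages in the card.
bleu4 = round(100 * raw["test"]["Bleu_4"], 2)      # BLEU4 (Answer Extraction) in the card
rouge_l = round(100 * raw["test"]["ROUGE_L"], 2)   # ROUGE-L (Answer Extraction)

# AnswerF1Score is already on a 0-100 scale in the raw file.
answer_f1 = round(raw["test"]["AnswerF1Score"], 2)
```

These reproduce the card's 34.98, 83.83, and 88.43 respectively.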