asahi417 committed on
Commit 0bad4a2
1 Parent(s): 0c3716c

model update

Files changed (1)
  1. README.md +0 -23
README.md CHANGED
@@ -51,21 +51,6 @@ model-index:
       - name: MoverScore (Question Generation)
         type: moverscore_question_generation
         value: 57.15
-      - name: BLEU4 (Question & Answer Generation (with Gold Answer))
-        type: bleu4_question_answer_generation_with_gold_answer
-        value: 3.8
-      - name: ROUGE-L (Question & Answer Generation (with Gold Answer))
-        type: rouge_l_question_answer_generation_with_gold_answer
-        value: 23.43
-      - name: METEOR (Question & Answer Generation (with Gold Answer))
-        type: meteor_question_answer_generation_with_gold_answer
-        value: 24.87
-      - name: BERTScore (Question & Answer Generation (with Gold Answer))
-        type: bertscore_question_answer_generation_with_gold_answer
-        value: 77.22
-      - name: MoverScore (Question & Answer Generation (with Gold Answer))
-        type: moverscore_question_answer_generation_with_gold_answer
-        value: 54.79
       - name: QAAlignedF1Score-BERTScore (Question & Answer Generation (with Gold Answer))
         type: qa_aligned_f1_score_bertscore_question_answer_generation_with_gold_answer
         value: 81.98
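The hunk above drops five corpus-level generation metrics (BLEU-4, ROUGE-L, METEOR, BERTScore, MoverScore) from the card's `model-index` YAML. For readers who want to reproduce such numbers, here is a minimal, hedged sketch using the Hugging Face `evaluate` library; the prediction and reference strings are invented placeholders, not data from this repository:

```python
# Sketch only: recomputing corpus BLEU-4 for generated questions/answers.
# The `evaluate` library and its "bleu" metric are real; the inputs below
# are placeholders for illustration.
import evaluate

bleu = evaluate.load("bleu")
predictions = ["quando hanno attaccato la siria e l' egitto ?"]
references = [["quando hanno attaccato la siria e l' egitto ?"]]
# max_order=4 yields BLEU-4, matching the bleu4_* entry removed above.
result = bleu.compute(predictions=predictions, references=references, max_order=4)
print(result["bleu"])
```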
@@ -167,20 +152,12 @@ question = pipe("extract answers: <hl> Il 6 ottobre 1973 , la Siria e l' Egitto,
 
  | | Score | Type | Dataset |
  |:--------------------------------|--------:|:--------|:-----------------------------------------------------------------|
- | BERTScore | 77.22 | default | [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) |
- | Bleu_1 | 23.77 | default | [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) |
- | Bleu_2 | 13.36 | default | [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) |
- | Bleu_3 | 6.73 | default | [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) |
- | Bleu_4 | 3.8 | default | [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) |
- | METEOR | 24.87 | default | [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) |
- | MoverScore | 54.79 | default | [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) |
  | QAAlignedF1Score (BERTScore) | 81.98 | default | [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) |
  | QAAlignedF1Score (MoverScore) | 56.35 | default | [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) |
  | QAAlignedPrecision (BERTScore) | 81.19 | default | [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) |
  | QAAlignedPrecision (MoverScore) | 56 | default | [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) |
  | QAAlignedRecall (BERTScore) | 82.83 | default | [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) |
  | QAAlignedRecall (MoverScore) | 56.75 | default | [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) |
- | ROUGE_L | 23.43 | default | [lmqg/qg_itquad](https://huggingface.co/datasets/lmqg/qg_itquad) |
 
 
  - ***Metric (Answer Extraction)***: [raw metric file](https://huggingface.co/lmqg/mt5-base-itquad-qg-ae/raw/main/eval/metric.first.answer.paragraph_sentence.answer.lmqg_qg_itquad.default.json)
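For context, the `pipe(...)` call visible in the second hunk's header comes from the card's usage example. A minimal sketch of the equivalent answer-extraction call with the `transformers` pipeline follows; the model id is taken from the metric-file URL above, and the Italian input is truncated exactly as in the hunk header:

```python
# Sketch only: reproduces the kind of call shown in the hunk context above.
# The model id comes from the raw-metric-file URL in this README; the input
# sentence is truncated in the source, so "..." stands in for the rest.
from transformers import pipeline

pipe = pipeline("text2text-generation", model="lmqg/mt5-base-itquad-qg-ae")

# Answer extraction: the target sentence is wrapped in <hl> highlight tokens.
output = pipe("extract answers: <hl> Il 6 ottobre 1973 , la Siria e l' Egitto, ... <hl>")
print(output)
```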
 