asahi417 committed on
Commit 484b0cf
1 Parent(s): d6098bd

model update

Files changed (1):
  1. README.md +2 -32
README.md CHANGED
@@ -46,29 +46,6 @@ model-index:
     - name: MoverScore
       type: moverscore
       value: 0.6251283990068167
-  - task:
-      name: Text2text Generation
-      type: text2text-generation
-    dataset:
-      name: lmqg/qg_squad
-      type: default
-      args: default
-    metrics:
-    - name: BLEU4
-      type: bleu4
-      value: 0.17865591948915446
-    - name: ROUGE-L
-      type: rouge-l
-      value: 0.4434755425309365
-    - name: METEOR
-      type: meteor
-      value: 0.20137442726325325
-    - name: BERTScore
-      type: bertscore
-      value: 0.9023929154360358
-    - name: MoverScore
-      type: moverscore
-      value: 0.6095406387914699
 ---
 
 # Language Models Fine-tuning on Question Generation: `lmqg/t5-large-subjqa-books`
@@ -93,8 +70,7 @@ model_path = 'lmqg/t5-large-subjqa-books'
 pipe = pipeline("text2text-generation", model_path)
 
 # Question Generation
-input_text = 'generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.'
-question = pipe(input_text)
+question = pipe('generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.')
 ```
 
 ## Evaluation Metrics
@@ -104,15 +80,9 @@ question = pipe(input_text)
 
 | Dataset | Type | BLEU4 | ROUGE-L | METEOR | BERTScore | MoverScore | Link |
 |:--------|:-----|------:|--------:|-------:|----------:|-----------:|-----:|
-| [lmqg/qg_subjqa](https://huggingface.co/datasets/lmqg/qg_subjqa) | books | 3.533435659461028e-06 | 0.23681615122464109 | 0.20826196682882675 | 0.9288704804100916 | 0.6251283990068167 | [link](https://huggingface.co/lmqg/t5-large-subjqa-books/raw/main/eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_subjqa.books.json) |
-
+| [lmqg/qg_subjqa](https://huggingface.co/datasets/lmqg/qg_subjqa) | books | 0.0 | 0.237 | 0.208 | 0.929 | 0.625 | [link](https://huggingface.co/lmqg/t5-large-subjqa-books/raw/main/eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_subjqa.books.json) |
 
 
-### Out-of-domain Metrics
-
-| Dataset | Type | BLEU4 | ROUGE-L | METEOR | BERTScore | MoverScore | Link |
-|:--------|:-----|------:|--------:|-------:|----------:|-----------:|-----:|
-| [lmqg/qg_squad](https://huggingface.co/datasets/lmqg/qg_squad) | default | 0.17865591948915446 | 0.4434755425309365 | 0.20137442726325325 | 0.9023929154360358 | 0.6095406387914699 | [link](https://huggingface.co/lmqg/t5-large-subjqa-books/raw/main/eval_ood/metric.first.sentence.paragraph_answer.question.lmqg_qg_squad.default.json) |
 
 
 ## Training hyperparameters
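
For reference, the updated usage snippet runs end-to-end as sketched below. This is a minimal sketch assuming the `transformers` library is installed; the `from transformers import pipeline` import and the `print` call are added here for completeness and are not part of the diff.

```python
from transformers import pipeline

# Load the fine-tuned question generation model via the text2text-generation pipeline
model_path = 'lmqg/t5-large-subjqa-books'
pipe = pipeline("text2text-generation", model_path)

# Question generation: the answer span is wrapped in <hl> highlight tokens
question = pipe('generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.')
print(question)  # a list of dicts, e.g. [{'generated_text': '...'}]
```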
 
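
The in-domain row itself appears unchanged apart from presentation: the new values match the old full-precision metrics rounded to three decimal places, as the quick check below illustrates (the numbers are simply restated from the removed and added table rows of the diff).

```python
# Metrics from the removed in-domain row, keyed by metric name
old_row = {
    "BLEU4": 3.533435659461028e-06,
    "ROUGE-L": 0.23681615122464109,
    "METEOR": 0.20826196682882675,
    "BERTScore": 0.9288704804100916,
    "MoverScore": 0.6251283990068167,
}

# Rounding to three decimals reproduces the added row: 0.0, 0.237, 0.208, 0.929, 0.625
for name, value in old_row.items():
    print(name, round(value, 3))
```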