asahi417 committed
Commit d021c2b
1 Parent(s): 03f5b5f

model update

Files changed (1)
  1. README.md +5 -35
README.md CHANGED
@@ -14,11 +14,11 @@ pipeline_tag: text2text-generation
 tags:
 - question generation
 widget:
-- text: "generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records."
+- text: "<hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records."
   example_title: "Question Generation Example 1"
-- text: "generate question: Beyonce further expanded her acting career, starring as blues singer <hl> Etta James <hl> in the 2008 musical biopic, Cadillac Records."
+- text: "Beyonce further expanded her acting career, starring as blues singer <hl> Etta James <hl> in the 2008 musical biopic, Cadillac Records."
   example_title: "Question Generation Example 2"
-- text: "generate question: Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, <hl> Cadillac Records <hl> ."
+- text: "Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, <hl> Cadillac Records <hl> ."
   example_title: "Question Generation Example 3"
 model-index:
 - name: lmqg/bart-large-subjqa-restaurants
@@ -46,29 +46,6 @@ model-index:
     - name: MoverScore
       type: moverscore
       value: 0.6356671815651991
-  - task:
-      name: Text2text Generation
-      type: text2text-generation
-    dataset:
-      name: lmqg/qg_squad
-      type: default
-      args: default
-    metrics:
-    - name: BLEU4
-      type: bleu4
-      value: 0.023769132392289873
-    - name: ROUGE-L
-      type: rouge-l
-      value: 0.27350674683342796
-    - name: METEOR
-      type: meteor
-      value: 0.09227251650839328
-    - name: BERTScore
-      type: bertscore
-      value: 0.8847807145448456
-    - name: MoverScore
-      type: moverscore
-      value: 0.5400628455451874
 ---
 
 # Language Models Fine-tuning on Question Generation: `lmqg/bart-large-subjqa-restaurants`
@@ -93,8 +70,7 @@ model_path = 'lmqg/bart-large-subjqa-restaurants'
 pipe = pipeline("text2text-generation", model_path)
 
 # Question Generation
-input_text = 'generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.'
-question = pipe(input_text)
+question = pipe('<hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.')
 ```
 
 ## Evaluation Metrics
@@ -104,15 +80,9 @@ question = pipe(input_text)
 
 | Dataset | Type | BLEU4 | ROUGE-L | METEOR | BERTScore | MoverScore | Link |
 |:--------|:-----|------:|--------:|-------:|----------:|-----------:|-----:|
-| [lmqg/qg_subjqa](https://huggingface.co/datasets/lmqg/qg_subjqa) | restaurants | 0.05540154382690559 | 0.24773274232304157 | 0.224617708822779 | 0.9322543442249298 | 0.6356671815651991 | [link](https://huggingface.co/lmqg/bart-large-subjqa-restaurants/raw/main/eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_subjqa.restaurants.json) |
-
+| [lmqg/qg_subjqa](https://huggingface.co/datasets/lmqg/qg_subjqa) | restaurants | 0.055 | 0.248 | 0.225 | 0.932 | 0.636 | [link](https://huggingface.co/lmqg/bart-large-subjqa-restaurants/raw/main/eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_subjqa.restaurants.json) |
 
 
-### Out-of-domain Metrics
-
-| Dataset | Type | BLEU4 | ROUGE-L | METEOR | BERTScore | MoverScore | Link |
-|:--------|:-----|------:|--------:|-------:|----------:|-----------:|-----:|
-| [lmqg/qg_squad](https://huggingface.co/datasets/lmqg/qg_squad) | default | 0.023769132392289873 | 0.27350674683342796 | 0.09227251650839328 | 0.8847807145448456 | 0.5400628455451874 | [link](https://huggingface.co/lmqg/bart-large-subjqa-restaurants/raw/main/eval_ood/metric.first.sentence.paragraph_answer.question.lmqg_qg_squad.default.json) |
 
 
 ## Training hyperparameters
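
For reference, the updated usage reads as follows when assembled into a standalone script. This is a minimal sketch built only from the snippet shown in the diff above; it assumes the `transformers` package is installed and the model can be fetched from the Hugging Face Hub, and the final `print` call is added here purely for illustration.

```python
# Minimal sketch of the README usage after this commit (see the diff above).
# Assumes `transformers` is installed and the model is available on the Hub.
from transformers import pipeline

model_path = 'lmqg/bart-large-subjqa-restaurants'
pipe = pipeline("text2text-generation", model_path)

# Question Generation: after this commit the "generate question: " prefix is
# dropped, and the answer span is marked with <hl> tokens only.
question = pipe('<hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.')
print(question)  # added for illustration; not part of the README snippet
```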
 
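The rewritten in-domain metrics row appears to carry the same scores as before, rounded to three decimal places, while the out-of-domain qg_squad table is dropped entirely. The sketch below, using the full-precision values from the removed row, reproduces the new figures; the `raw_scores` dictionary is just an illustrative helper.

```python
# The new table values match the old full-precision scores rounded to 3 d.p.
raw_scores = {
    "BLEU4": 0.05540154382690559,
    "ROUGE-L": 0.24773274232304157,
    "METEOR": 0.224617708822779,
    "BERTScore": 0.9322543442249298,
    "MoverScore": 0.6356671815651991,
}
print({name: round(value, 3) for name, value in raw_scores.items()})
# {'BLEU4': 0.055, 'ROUGE-L': 0.248, 'METEOR': 0.225, 'BERTScore': 0.932, 'MoverScore': 0.636}
```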