asahi417 committed
Commit 534072c
1 Parent(s): 1ea855d

model update

Files changed (1):
1. README.md (+5 -35)
README.md CHANGED
@@ -14,11 +14,11 @@ pipeline_tag: text2text-generation
 tags:
 - question generation
 widget:
-- text: "generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records."
+- text: "<hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records."
   example_title: "Question Generation Example 1"
-- text: "generate question: Beyonce further expanded her acting career, starring as blues singer <hl> Etta James <hl> in the 2008 musical biopic, Cadillac Records."
+- text: "Beyonce further expanded her acting career, starring as blues singer <hl> Etta James <hl> in the 2008 musical biopic, Cadillac Records."
   example_title: "Question Generation Example 2"
-- text: "generate question: Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, <hl> Cadillac Records <hl> ."
+- text: "Beyonce further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, <hl> Cadillac Records <hl> ."
   example_title: "Question Generation Example 3"
 model-index:
 - name: lmqg/bart-large-subjqa-books
@@ -46,29 +46,6 @@ model-index:
     - name: MoverScore
       type: moverscore
       value: 0.6244982140336275
-  - task:
-      name: Text2text Generation
-      type: text2text-generation
-    dataset:
-      name: lmqg/qg_squad
-      type: default
-      args: default
-    metrics:
-    - name: BLEU4
-      type: bleu4
-      value: 0.024707668170383792
-    - name: ROUGE-L
-      type: rouge-l
-      value: 0.2738334540375138
-    - name: METEOR
-      type: meteor
-      value: 0.09336329466493658
-    - name: BERTScore
-      type: bertscore
-      value: 0.8819915806304527
-    - name: MoverScore
-      type: moverscore
-      value: 0.5391954986190068
 ---
 
 # Language Models Fine-tuning on Question Generation: `lmqg/bart-large-subjqa-books`
@@ -93,8 +70,7 @@ model_path = 'lmqg/bart-large-subjqa-books'
 pipe = pipeline("text2text-generation", model_path)
 
 # Question Generation
-input_text = 'generate question: <hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.'
-question = pipe(input_text)
+question = pipe('<hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.')
 ```
 
 ## Evaluation Metrics
@@ -104,15 +80,9 @@ question = pipe(input_text)
 
 | Dataset | Type | BLEU4 | ROUGE-L | METEOR | BERTScore | MoverScore | Link |
 |:--------|:-----|------:|--------:|-------:|----------:|-----------:|-----:|
-| [lmqg/qg_subjqa](https://huggingface.co/datasets/lmqg/qg_subjqa) | books | 3.8775125297960566e-06 | 0.2370773529266555 | 0.20603930224653336 | 0.9283541731185314 | 0.6244982140336275 | [link](https://huggingface.co/lmqg/bart-large-subjqa-books/raw/main/eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_subjqa.books.json) |
-
+| [lmqg/qg_subjqa](https://huggingface.co/datasets/lmqg/qg_subjqa) | books | 0.0 | 0.237 | 0.206 | 0.928 | 0.624 | [link](https://huggingface.co/lmqg/bart-large-subjqa-books/raw/main/eval/metric.first.sentence.paragraph_answer.question.lmqg_qg_subjqa.books.json) |
 
 
-### Out-of-domain Metrics
-
-| Dataset | Type | BLEU4 | ROUGE-L | METEOR | BERTScore | MoverScore | Link |
-|:--------|:-----|------:|--------:|-------:|----------:|-----------:|-----:|
-| [lmqg/qg_squad](https://huggingface.co/datasets/lmqg/qg_squad) | default | 0.024707668170383792 | 0.2738334540375138 | 0.09336329466493658 | 0.8819915806304527 | 0.5391954986190068 | [link](https://huggingface.co/lmqg/bart-large-subjqa-books/raw/main/eval_ood/metric.first.sentence.paragraph_answer.question.lmqg_qg_squad.default.json) |
 
 
 ## Training hyperparameters
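
A minimal sketch of how the usage snippet reads after this commit: the model now takes the raw passage with `<hl>` tokens around the answer span, without the `generate question: ` prefix the previous revision required. The `from transformers import pipeline` import and the `model_path` assignment are assumed from the unchanged part of the README (visible in the hunk header); this is a sketch of the updated card, not part of the diff itself.

```python
from transformers import pipeline

# Question-generation model fine-tuned on the books split of SubjQA.
model_path = 'lmqg/bart-large-subjqa-books'
pipe = pipeline("text2text-generation", model_path)

# Highlight the answer span with <hl> tokens; no task prefix is needed after this commit.
question = pipe('<hl> Beyonce <hl> further expanded her acting career, starring as blues singer Etta James in the 2008 musical biopic, Cadillac Records.')
print(question)  # e.g. [{'generated_text': '...'}]
```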