model update
README.md
CHANGED
      value: 0.6587523781250593
---

# Model Card of `lmqg/mbart-large-cc25-ruquad`
This model is a fine-tuned version of [facebook/mbart-large-cc25](https://huggingface.co/facebook/mbart-large-cc25) for the question generation task on the
[lmqg/qg_ruquad](https://huggingface.co/datasets/lmqg/qg_ruquad) dataset (dataset_name: default) via [`lmqg`](https://github.com/asahi417/lm-question-generation).

### Overview
- **Language model:** [facebook/mbart-large-cc25](https://huggingface.co/facebook/mbart-large-cc25)
- **Language:** ru
### Usage

```python
from transformers import pipeline

# The fine-tuned checkpoint on the Hugging Face Hub
model_path = "lmqg/mbart-large-cc25-ruquad"
pipe = pipeline("text2text-generation", model_path)

# Question Generation: the answer span is wrapped in <hl> tokens
question = pipe('Нелишним будет отметить, что, развивая это направление, Д. И. Менделеев, поначалу априорно выдвинув идею о температуре, при которой высота мениска будет нулевой, <hl> в мае 1860 года <hl> провёл серию опытов.')
```
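As the snippet above shows, the answer span inside the input paragraph must be wrapped in `<hl>` tokens. A minimal sketch of preparing such an input and unwrapping the pipeline result follows; the helper names are ours, not part of `lmqg`, while the `[{"generated_text": ...}]` output shape is the standard `transformers` text2text pipeline format.

```python
from typing import Dict, List


def highlight_answer(context: str, answer: str, hl_token: str = "<hl>") -> str:
    """Wrap the first occurrence of `answer` in highlight tokens,
    producing the input format the question generation model expects."""
    idx = context.find(answer)
    if idx < 0:
        raise ValueError("answer span not found in context")
    return context[:idx] + f"{hl_token} {answer} {hl_token}" + context[idx + len(answer):]


def first_generation(outputs: List[Dict[str, str]]) -> str:
    """Extract the generated question from a text2text-generation
    pipeline result of the form [{"generated_text": ...}]."""
    return outputs[0]["generated_text"]


print(highlight_answer("Менделеев провёл серию опытов в мае 1860 года.", "в мае 1860 года"))
# Менделеев провёл серию опытов <hl> в мае 1860 года <hl>.
```

The highlighted string can be passed directly to `pipe(...)` as in the example above.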

## Evaluation Metrics
The full configuration of hyperparameters used during fine-tuning can be found in the [fine-tuning config file](https://huggingface.co/lmqg/mbart-large-cc25-ruquad/raw/main/trainer_config.json).

## Citation
Please cite our paper if you use the model ([TBA](TBA)).

```
@inproceedings{ushio-etal-2022-generative,
    title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration: {A} {U}nified {B}enchmark and {E}valuation",
    author = "Ushio, Asahi  and
      Alva-Manchego, Fernando  and
      Camacho-Collados, Jose",
    booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
    month = dec,
    year = "2022",
    address = "Abu Dhabi, U.A.E.",
    publisher = "Association for Computational Linguistics",
}
```