added results on grammatical error correction
README.md
CHANGED
@@ -154,6 +154,37 @@ We use the binary formulation of this task (positive vs. negative).
</details>

### Grammatical error correction

[ASK-RAW](https://huggingface.co/datasets/ltg/ask-gec) is a dataset for Norwegian grammatical error correction (GEC) created by [Matias Jentoft (2023)](https://www.duo.uio.no/handle/10852/103885).

<details>
<summary>Method (click to expand)</summary>

* Evaluation setting: zero-shot and few-shot settings via natural language generation using the greedy decoding strategy (a minimal prompting sketch follows this list).
* Prompt: ```"Her er eksempler på perfekt korrigering av grammatiske feil:\n\nTekst: {source_text}\nKorreksjon:{target_text}"``` (English: "Here are examples of perfect correction of grammatical errors:\n\nText: {source_text}\nCorrection:{target_text}")
* Few-shot results show the average scores across 5 repetitions.
* Evaluation script: https://github.com/ltgoslo/norallm/blob/main/initial_evaluation/gec.py
* Performance metrics: the evaluation uses [ERRANT](https://github.com/chrisjbryant/errant/tree/main), which identifies edit spans and then calculates the F_{0.5} score between the gold edits and the predicted edits (see the simplified scoring sketch below).
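
A minimal sketch of this prompting and decoding setup is shown below, assuming a Hugging Face causal LM. The model name, few-shot example handling, and output truncation are illustrative assumptions; the canonical implementation is the evaluation script linked above.

```python
# Hypothetical sketch of zero-/few-shot GEC prompting with greedy decoding.
# Model choice, example formatting, and output truncation are assumptions;
# see the linked gec.py for the actual evaluation code.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "norallm/normistral-7b-warm"  # assumption: any of the evaluated models
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

def build_prompt(source_text: str, examples=()) -> str:
    """Instruction header, k solved (source, target) demonstrations, then the query."""
    prompt = "Her er eksempler på perfekt korrigering av grammatiske feil:\n"
    for src, tgt in examples:
        prompt += f"\nTekst: {src}\nKorreksjon:{tgt}\n"
    prompt += f"\nTekst: {source_text}\nKorreksjon:"
    return prompt

def correct(source_text: str, examples=()) -> str:
    inputs = tokenizer(build_prompt(source_text, examples), return_tensors="pt").to(model.device)
    # Greedy decoding: do_sample=False always picks the most probable next token.
    output = model.generate(**inputs, max_new_tokens=256, do_sample=False)
    generated = output[0, inputs["input_ids"].shape[1]:]
    # Keep only the first generated line, i.e. the correction for this query.
    return tokenizer.decode(generated, skip_special_tokens=True).split("\n")[0].strip()
```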

</details>
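
For intuition about the metric, here is a simplified, self-contained illustration of span-based F_{0.5} scoring over exact-match edits. The edit representation and the toy numbers are hypothetical; the actual evaluation relies on ERRANT's own alignment, edit extraction, and scorer.

```python
# Simplified span-based F_{0.5} scoring. An "edit" is a (start, end, replacement)
# tuple over source tokens and must match a gold edit exactly to count as a
# true positive; ERRANT's real scorer also handles alignment and error types.

def f_beta(tp: int, fp: int, fn: int, beta: float = 0.5) -> float:
    """F_beta; beta = 0.5 weights precision twice as heavily as recall."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision == 0.0 or recall == 0.0:
        return 0.0
    return (1 + beta**2) * precision * recall / (beta**2 * precision + recall)

def score(gold_edits, predicted_edits) -> float:
    gold, pred = set(gold_edits), set(predicted_edits)
    tp = len(gold & pred)   # predicted edits that exactly match a gold edit
    fp = len(pred - gold)   # spurious predicted edits
    fn = len(gold - pred)   # gold edits the system missed
    return f_beta(tp, fp, fn)

# Toy example (hypothetical edits): the system finds one of two gold edits and
# proposes one spurious edit, so precision = recall = 0.5 and F_0.5 = 0.5.
gold = [(0, 1, "Han"), (3, 4, "gikk")]
pred = [(0, 1, "Han"), (5, 6, "hjem")]
print(round(score(gold, pred), 3))  # 0.5
```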

<details open>
<summary>Results on [the ASK corpus](https://huggingface.co/datasets/ltg/ask-gec) (ERRANT F_{0.5})</summary>

|Model|0-shot (F0.5)|1-shot (F0.5)|32-shot (F0.5)|
|---|---|---|---|
|NorMistral-7b-warm|**40.8**|41.8|48.5|
|NorMistral-7b-scratch|22.1|28.8|42.1|
|NorBLOOM-7b|8.7|24.5|32.0|
|NB-GPT-J|9.1|28.2|30.6|
|GPT-Sw3-6.7B|30.5|42.9|**50.6**|
|GPT-Sw3-6.7B-v2|40.6|**43.4**|49.8|
|Falcon-7B|10.8|12.4|15.5|
|Mistral-7B-v0.1|26.0|27.4|30.6|

</details>

### Machine translation