Add distilled version Cometinho

#2
by BramVanroy - opened
Files changed (1)
  1. README.md +24 -1
README.md CHANGED
@@ -36,7 +36,7 @@ reference = ["They were able to control the fire.", "Schools and kindergartens o
  comet_score = comet_metric.compute(predictions=hypothesis, references=reference, sources=source)
  ```
 
- It has several configurations, named after the COMET model to be used. It will default to `wmt20-comet-da` (previously known as `wmt-large-da-estimator-1719`). Alternate models that can be chosen include `wmt20-comet-qe-da`, `wmt21-comet-mqm`, `wmt21-cometinho-da`, `wmt21-comet-qe-mqm` and `emnlp20-comet-rank`.
+ It has several configurations, named after the COMET model to be used. It will default to `wmt20-comet-da` (previously known as `wmt-large-da-estimator-1719`). Alternate models that can be chosen include `wmt20-comet-qe-da`, `wmt21-comet-mqm`, `wmt21-cometinho-da`, `wmt21-comet-qe-mqm` and `emnlp20-comet-rank`. Notably, a distilled model is also available: it is 80% smaller and 2.128x faster, while performing close to the non-distilled alternatives. You can use it with the identifier `eamt22-cometinho-da`. This version, called Cometinho, was selected as [best paper](https://aclanthology.org/2022.eamt-1.9) at EAMT 2022, the annual conference of the European Association for Machine Translation.
 
  It also has several optional arguments:
 
@@ -138,9 +138,32 @@ Also, calculating the COMET metric involves downloading the model from which fea
  publisher = "Association for Computational Linguistics",
  url = "https://www.aclweb.org/anthology/2020.emnlp-main.213",
  pages = "2685--2702",
+ }
+ ```
 
+ For the distilled version:
+
+ ```bibtex
+ @inproceedings{rei-etal-2022-searching,
+     title = "Searching for {COMETINHO}: The Little Metric That Could",
+     author = "Rei, Ricardo and
+       Farinha, Ana C and
+       de Souza, Jos{\'e} G.C. and
+       Ramos, Pedro G. and
+       Martins, Andr{\'e} F.T. and
+       Coheur, Luisa and
+       Lavie, Alon",
+     booktitle = "Proceedings of the 23rd Annual Conference of the European Association for Machine Translation",
+     month = jun,
+     year = "2022",
+     address = "Ghent, Belgium",
+     publisher = "European Association for Machine Translation",
+     url = "https://aclanthology.org/2022.eamt-1.9",
+     pages = "61--70",
+ }
  ```
 
+
  ## Further References
 
  - [COMET website](https://unbabel.github.io/COMET/html/index.html)
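
For context on what this change adds, here is a minimal sketch of the configuration choice the diff describes. Note that `KNOWN_CONFIGS` and `resolve_comet_config` are hypothetical names used purely for illustration and are not part of the metric's real API; only the configuration identifiers themselves come from the README.

```python
# Hypothetical sketch of the configuration selection described above.
# The names below are illustrative, not the metric's actual API.
DEFAULT_CONFIG = "wmt20-comet-da"  # previously known as wmt-large-da-estimator-1719

KNOWN_CONFIGS = {
    "wmt20-comet-da",
    "wmt20-comet-qe-da",
    "wmt21-comet-mqm",
    "wmt21-cometinho-da",
    "wmt21-comet-qe-mqm",
    "emnlp20-comet-rank",
    "eamt22-cometinho-da",  # distilled Cometinho model added in this PR
}

def resolve_comet_config(name=None):
    """Return the requested configuration name, falling back to the default."""
    if name is None:
        return DEFAULT_CONFIG
    if name not in KNOWN_CONFIGS:
        raise ValueError(f"Unknown COMET configuration: {name!r}")
    return name
```

With the metric itself, selecting the distilled model would then look something like `comet_metric = load_metric('comet', 'eamt22-cometinho-da')`, assuming the library's usual two-argument configuration form; the rest of the `compute` call shown in the README is unchanged.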