dsai-artur-zygadlo
committed on
Commit: dd48ac3
Parent(s): 10cf51c
Update README.md
README.md
CHANGED
@@ -41,13 +41,13 @@ We fine-tuned TrelBERT to [KLEJ benchmark](klejbenchmark.com) tasks and achieved
 |NKJP-NER|94.4|
 |CDSC-E|93.9|
 |CDSC-R|93.6|
-|CBD|
+|CBD|76.1|
 |PolEmo2.0-IN|89.3|
 |PolEmo2.0-OUT|78.1|
 |DYK|67.4|
 |PSC|95.7|
 |AR|86.1|
-|__avg__|__86.
+|__avg__|__86.1__|
 
 For fine-tuning to KLEJ tasks we used [Polish RoBERTa](https://github.com/sdadas/polish-roberta) scripts, which we modified to use `transformers` library. In the CBD task, we set the maximum sequence length to 128 and implemented the same preprocessing procedure as in the MLM phase.
 
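The paragraph in the diff above only sketches the fine-tuning setup for CBD (a `transformers`-based classifier, maximum sequence length 128, and the same preprocessing as in the MLM phase). Below is a minimal, hypothetical sketch of such a setup; the checkpoint id, the two-label head, and the exact preprocessing rules (masking user mentions and URLs) are assumptions for illustration, not details confirmed by this commit.

```python
# Hypothetical sketch of the CBD fine-tuning setup described above.
# Assumptions (not taken from the commit): checkpoint id, 2 labels,
# and the exact MLM-style preprocessing (masking mentions and URLs).
import re

from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "deepsense-ai/trelbert"  # assumed checkpoint id
MAX_LENGTH = 128                      # stated for the CBD task

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)


def preprocess(text: str) -> str:
    """Assumed MLM-style preprocessing: mask user mentions and URLs."""
    text = re.sub(r"@\w+", "@anonymized_account", text)
    return re.sub(r"https?://\S+", "http://url.removed", text)


def encode(texts):
    """Tokenize a batch of tweets with the 128-token limit used for CBD."""
    return tokenizer(
        [preprocess(t) for t in texts],
        truncation=True,
        padding="max_length",
        max_length=MAX_LENGTH,
        return_tensors="pt",
    )


# Example forward pass on a single (anonymized) tweet.
batch = encode(["Przykładowy tweet @uzytkownik https://example.com"])
logits = model(**batch).logits  # shape: (1, 2)
```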