dsai-artur-zygadlo committed
Commit c431a5d
1 Parent(s): 6894a2f

Update README.md

Files changed (1): README.md (+2 −2)
@@ -36,7 +36,7 @@ CC BY 4.0
 
 We fine-tuned TrelBERT to [KLEJ benchmark](klejbenchmark.com) tasks and achieved the following results:
 
-|model name | score |
+|task name|score|
 |--|--|
 |NKJP-NER|94.4|
 |CDSC-E|93.9|
@@ -47,7 +47,7 @@ We fine-tuned TrelBERT to [KLEJ benchmark](klejbenchmark.com) tasks and achieved
 |DYK|67.4|
 |PSC|95.7|
 |AR|86.1|
-|avg|86.0|
+|__avg__|__86.0__|
 
 For fine-tuning to KLEJ tasks we used [Polish RoBERTa](https://github.com/sdadas/polish-roberta) scripts, which we modified to use `transformers` library. In the CBD task, we set the maximum sequence length to 128 and implemented the same preprocessing procedure as in the MLM phase.
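The CBD setting mentioned in the diff (maximum sequence length of 128) can be sketched as follows. This is a minimal illustration of the truncation step only, not the authors' actual fine-tuning code; with the `transformers` library, the equivalent effect comes from passing `truncation=True, max_length=128` to the tokenizer.

```python
# Minimal sketch of the max-sequence-length setting used for the CBD task.
# Illustration only; not the authors' fine-tuning script. With the
# `transformers` library, the same result comes from
# tokenizer(text, truncation=True, max_length=128).

MAX_SEQ_LEN = 128  # maximum sequence length for the CBD task, per the README


def truncate(token_ids, max_len=MAX_SEQ_LEN):
    # Keep at most `max_len` token ids, dropping the tail.
    return token_ids[:max_len]


ids = list(range(300))  # stand-in for a long tokenized tweet
print(len(truncate(ids)))  # -> 128
```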