mihaimasala committed on
Commit 1241907
1 Parent(s): c3fd437

Added links to paper

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -11,7 +11,7 @@ language:
 ## Pretrained juridical BERT model for Romanian
 
 BERT Romanian juridical model trained using a masked language modeling (MLM) and next sentence prediction (NSP) objective.
-It was introduced in this [paper](TODO). Two BERT models were released: **jurBERT-base** and **jurBERT-large**, all versions uncased.
+It was introduced in this [paper](https://aclanthology.org/2021.nllp-1.8/). Two BERT models were released: **jurBERT-base** and **jurBERT-large**, all versions uncased.
 
 | Model | Weights | L | H | A | MLM accuracy | NSP accuracy |
 |----------------|:---------:|:------:|:------:|:------:|:--------------:|:--------------:|
@@ -98,7 +98,7 @@ We report Mean AUC and Std AUC on the task of predicting the outcome of a case.
 | *jurBERT-base* | *59.65* | *1.16* |
 | *jurBERT-base + hf* | **61.46**| **1.76** |
 
-For complete results and discussion please refer to the [paper](TODO).
+For complete results and discussion please refer to the [paper](https://aclanthology.org/2021.nllp-1.8/).
 
 ### BibTeX entry and citation info
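
For readers of the model card touched by this commit, here is a minimal sketch of loading the released base model for masked-token prediction with the Hugging Face `transformers` library. The model identifier `readerbench/jurBERT-base` and the example sentence are assumptions for illustration; they are not stated in this commit.

```python
# Minimal sketch: masked-token prediction with jurBERT via the `transformers`
# library. The model id "readerbench/jurBERT-base" is an assumption, not
# stated in this commit.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("readerbench/jurBERT-base")
model = AutoModelForMaskedLM.from_pretrained("readerbench/jurBERT-base")

# Hypothetical uncased Romanian legal sentence with one masked token.
text = f"instanta a admis {tokenizer.mask_token} formulata de reclamant."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and decode the highest-scoring prediction.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```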