mihaimasala committed
Commit
af9617d
1 Parent(s): e9cbeb8

Added links to paper

Files changed (1): README.md +2 -2
README.md CHANGED

@@ -11,7 +11,7 @@ language:
 ## Pretrained juridical BERT model for Romanian
 
 BERT Romanian juridical model trained using a masked language modeling (MLM) and next sentence prediction (NSP) objective.
-It was introduced in this [paper](TODO). Two BERT models were released: **jurBERT-base** and **jurBERT-large**, all versions uncased.
+It was introduced in this [paper](https://aclanthology.org/2021.nllp-1.8/). Two BERT models were released: **jurBERT-base** and **jurBERT-large**, all versions uncased.
 
 | Model | Weights | L | H | A | MLM accuracy | NSP accuracy |
 |----------------|:---------:|:------:|:------:|:------:|:--------------:|:--------------:|
@@ -88,7 +88,7 @@ We report Mean AUC and Std AUC on the task of predicting the outcome of a case.
 | *jurBERT-large* | *82.04* | *0.64* |
 
 
-For complete results and discussion please refer to the [paper](TODO).
+For complete results and discussion please refer to the [paper](https://aclanthology.org/2021.nllp-1.8/).
 
 ### BibTeX entry and citation info
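The model card says jurBERT was pretrained with a masked language modeling (MLM) objective. As a toy illustration of what MLM input preparation looks like (not the actual jurBERT training code, and with simplified masking that omits BERT's 10% random-token / 10% keep variants), assuming plain whitespace tokens and a hypothetical `mask_tokens` helper:

```python
import random

def mask_tokens(tokens, mask_prob=0.15, mask_token="[MASK]", seed=1):
    """Toy BERT-style MLM masking: choose ~15% of positions as
    prediction targets and replace them with [MASK]. The label at a
    masked position is the original token; unmasked positions get None."""
    rng = random.Random(seed)
    masked = list(tokens)
    labels = [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok
            masked[i] = mask_token
    return masked, labels

# A made-up Romanian legal phrase, whitespace-tokenized for illustration.
tokens = "instanta a admis cererea de chemare in judecata".split()
masked, labels = mask_tokens(tokens)
# With seed=1 only the first token is masked:
# masked -> ['[MASK]', 'a', 'admis', 'cererea', 'de', 'chemare', 'in', 'judecata']
```

During real pretraining the model would be asked to predict each masked-out label from context; the MLM accuracy column in the table above measures exactly this kind of prediction.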