Sabine committed on
Commit
e0d78f4
1 Parent(s): 32092db

Update README.md

Files changed (1): README.md (+1, −1)
README.md CHANGED
````diff
@@ -135,7 +135,7 @@ Outputs:
 ['I', 'Ġlove', 'Ġsalad']
 ```
 
-So I think this is not fundamentally linked to the model itself.
+The pretraining of LegalRoBERTa was restricted by the size of the available legal corpora, and the number of pretraining steps is small compared to popular domain-adapted models. This makes LegalRoBERTa significantly **under-trained**.
 
 ## BibTeX entry and citation info
 
````
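The `Ġ` prefix in the tokenizer output above comes from RoBERTa's byte-level BPE, which remaps every raw byte to a printable Unicode character; the space byte (0x20) lands on `Ġ` (U+0120), so a token for `" love"` renders as `Ġlove`. A minimal sketch of that byte-to-unicode table, following the GPT-2/RoBERTa convention (the function name here is illustrative, not part of LegalRoBERTa's code):

```python
def bytes_to_unicode() -> dict[int, str]:
    """Map each byte 0-255 to a printable Unicode character, as in
    GPT-2/RoBERTa byte-level BPE. Printable bytes map to themselves;
    the rest are shifted into unused codepoints starting at U+0100."""
    # Bytes kept as-is: printable ASCII and most of Latin-1.
    keep = (list(range(ord("!"), ord("~") + 1))
            + list(range(ord("¡"), ord("¬") + 1))
            + list(range(ord("®"), ord("ÿ") + 1)))
    mapping = {b: chr(b) for b in keep}
    shift = 0
    for b in range(256):
        if b not in mapping:
            mapping[b] = chr(256 + shift)  # remap to an unused codepoint
            shift += 1
    return mapping

table = bytes_to_unicode()
# The space byte (0x20) is the 33rd remapped byte, giving chr(288) == 'Ġ'.
print(table[ord(" ")])                      # Ġ
print("".join(table[b] for b in b" love"))  # Ġlove
```

Because this remapping happens before BPE merges, a leading `Ġ` simply marks a token that began with a space in the original text; it is a property of the tokenizer, not of the model weights.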