XLM-RoBERTa models with continued pretraining on the MultiLegalPile
Joel Niklaus (joelniklaus)
AI & ML interests: Pretraining, Instruction Tuning, Domain Adaptation, Benchmarks, Legal Datasets
Models (40)

joelniklaus/legal-swiss-longformer-base (Fill-Mask)
joelniklaus/legal-swiss-roberta-base (Fill-Mask)
joelniklaus/legal-swiss-roberta-large (Fill-Mask)
joelniklaus/legal-croatian-roberta-base (Fill-Mask)
joelniklaus/legal-english-longformer-base
joelniklaus/legal-english-roberta-large (Fill-Mask)
joelniklaus/legal-english-roberta-base (Fill-Mask)
joelniklaus/legal-xlm-roberta-base (Fill-Mask)
joelniklaus/legal-xlm-roberta-large (Fill-Mask)
joelniklaus/legal-portuguese-roberta-base (Fill-Mask)
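Since these checkpoints are published as fill-mask models, they can be loaded with the Hugging Face `transformers` fill-mask pipeline. A minimal sketch, using `joelniklaus/legal-xlm-roberta-base` from the list above; the example sentence is illustrative, and XLM-RoBERTa-style tokenizers use `<mask>` as the mask token:

```python
from transformers import pipeline

# Load one of the fill-mask checkpoints listed above (downloads on first use).
fill_mask = pipeline("fill-mask", model="joelniklaus/legal-xlm-roberta-base")

# Predict the masked token in an (illustrative) legal sentence.
predictions = fill_mask("The court dismissed the <mask>.")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

Each prediction is a dict containing the filled token (`token_str`), its probability (`score`), and the completed sequence.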
Datasets (26)

joelniklaus/Multi_Legal_Pile
joelniklaus/Multi_Legal_Pile_Commercial
joelniklaus/legalnero
joelniklaus/greek_legal_ner
joelniklaus/legal-mc4
joelniklaus/MultiLegalPile_Chunks_4000
joelniklaus/eurlex_resources
joelniklaus/lextreme
joelniklaus/MultiLegalPileWikipediaFiltered
joelniklaus/EU_Wikipedias