---
license: mit
---
## Model description
LegalBert is a BERT-base-cased model fine-tuned on a subset of the case.law
corpus. Further details can be found in this paper:

*A Dataset for Statutory Reasoning in Tax Law Entailment and Question Answering*
Nils Holzenberger, Andrew Blair-Stanek and Benjamin Van Durme
Proceedings of the 2020 Natural Legal Language Processing (NLLP) Workshop, 24 August 2020
## Usage
```python
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("jhu-clsp/LegalBert")
model = AutoModel.from_pretrained("jhu-clsp/LegalBert")
```
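Once loaded, the model can be used like any other BERT encoder to produce contextual embeddings for legal text. The snippet below is a minimal sketch; the sample sentence is purely illustrative and not taken from the training corpus.

```python
import torch

# Tokenize a short legal passage and obtain contextual embeddings.
inputs = tokenizer(
    "The taxpayer shall file a return for the taxable year.",
    return_tensors="pt",
)
with torch.no_grad():
    outputs = model(**inputs)

# Last hidden states: one 768-dimensional vector per token,
# shape (batch_size, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)
```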
## Citation
```bibtex
@inproceedings{holzenberger20dataset,
  author    = {Nils Holzenberger and
               Andrew Blair{-}Stanek and
               Benjamin Van Durme},
  title     = {A Dataset for Statutory Reasoning in Tax Law Entailment and Question
               Answering},
  booktitle = {Proceedings of the Natural Legal Language Processing Workshop 2020
               co-located with the 26th {ACM} {SIGKDD} International Conference on
               Knowledge Discovery {\&} Data Mining {(KDD} 2020), Virtual Workshop,
               August 24, 2020},
  series    = {{CEUR} Workshop Proceedings},
  volume    = {2645},
  pages     = {31--38},
  publisher = {CEUR-WS.org},
  year      = {2020},
  url       = {http://ceur-ws.org/Vol-2645/paper5.pdf},
}
```