shenbinqian committed
Commit: fdcc6dc
1 Parent(s): db06be0
Update README.md
README.md CHANGED

@@ -20,7 +20,7 @@ should probably proofread and complete it, then remove this comment. -->
 # roberta-large-finetuned-abbr-unfiltered-plod
 
 This model is a fine-tuned version of [roberta-large](https://huggingface.co/roberta-large) on the [PLODv2 unfiltered dataset](https://github.com/shenbinqian/PLODv2-CLM4AbbrDetection).
-It is released with our LREC-COLING 2024 publication (
+It is released with our LREC-COLING 2024 publication [Using character-level models for efficient abbreviation and long-form detection](https://aclanthology.org/2024.lrec-main.270/). It achieves the following results on the test set:
 
 Results on abbreviations:
 - Precision: 0.8916
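For reference, a minimal sketch of how one might load a checkpoint like this for token classification with the Hugging Face transformers library. The Hub repository ID and the example sentence are assumptions for illustration, not taken from this commit.

```python
# Minimal sketch: run the fine-tuned checkpoint as a token-classification
# (abbreviation / long-form tagging) pipeline with Hugging Face transformers.
# The Hub ID below is an assumption; substitute the actual repository path.
from transformers import pipeline

model_id = "shenbinqian/roberta-large-finetuned-abbr-unfiltered-plod"  # assumed Hub ID

tagger = pipeline(
    "token-classification",
    model=model_id,
    aggregation_strategy="simple",  # merge word pieces back into whole spans
)

# Example sentence containing an abbreviation and its long form.
print(tagger("Light dissolved inorganic carbon (DIC) was measured alongside total organic carbon (TOC)."))
```

The same checkpoint can also be loaded explicitly with AutoTokenizer and AutoModelForTokenClassification if finer control over decoding is needed.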