Update README.md

README.md CHANGED

@@ -5,15 +5,13 @@ tags:
 license: mit
 pipeline_tag: feature-extraction
 widget:
-- text: "
+- text: "<ENT> ER </ENT> crowding has become a wide-spread problem."
 ---
 
 ## KRISSBERT
 
 Entity linking faces significant challenges such as prolific variations and prevalent ambiguities, especially in high-value domains with myriad entities. Standard classification approaches suffer from the annotation bottleneck and cannot effectively handle unseen entities. Zero-shot entity linking has emerged as a promising direction for generalizing to new entities, but it still requires example gold entity mentions during training and canonical descriptions for all entities, both of which are rarely available outside of Wikipedia ([Logeswaran et al., 2019](https://aclanthology.org/P19-1335.pdf); [Wu et al., 2020](https://aclanthology.org/2020.emnlp-main.519.pdf)). We explore Knowledge-RIch Self-Supervision (KRISS) and train a contextual encoder (KRISSBERT) for entity linking by leveraging readily available unlabeled text and domain knowledge.
 
-![Illustration of knowledge-rich self-supervised entity linking.](kriss-fig.png)
-
 This KRISSBERT is initialized with [PubMedBERT](https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract) parameters, and then trained using self-supervised examples generated by combining [PubMed](https://pubmed.ncbi.nlm.nih.gov/) abstracts and the [UMLS](https://www.nlm.nih.gov/research/umls/index.html) ontology. Experiments on seven standard biomedical entity linking datasets show that KRISSBERT attains a new state of the art, outperforming prior self-supervised methods by as much as 20 absolute points in accuracy.
 
 See [Zhang et al., 2021](https://arxiv.org/abs/2112.07887) for details.
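The new widget example marks the mention with `<ENT> ... </ENT>` tags, matching the model's `feature-extraction` pipeline tag. A minimal sketch of how such a marked mention might be encoded is below; the checkpoint name, the `mark_mention` helper, and the choice of `[CLS]` pooling are assumptions for illustration, not taken from this model card — check the card and the paper's released code for the exact recipe.

```python
# Sketch: encode an <ENT>-marked mention with KRISSBERT via transformers.
# Assumptions (hypothetical, not from the card): checkpoint name,
# mark_mention helper, and [CLS] pooling as the mention embedding.

def mark_mention(text: str, start: int, end: int) -> str:
    """Wrap the character span [start, end) in the <ENT> ... </ENT>
    markers shown in the widget example."""
    return f"{text[:start]}<ENT> {text[start:end]} </ENT>{text[end:]}"

def encode_mention(marked_text: str,
                   model_name: str = "microsoft/BiomedNLP-KRISSBERT-PubMed-UMLS-EL"):
    """Return the [CLS] vector of the marked sentence as a mention embedding.
    Requires the `transformers` library; downloads the model on first use."""
    from transformers import AutoTokenizer, AutoModel  # deferred: heavy dependency
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)
    inputs = tokenizer(marked_text, return_tensors="pt")
    outputs = model(**inputs)
    return outputs.last_hidden_state[:, 0, :]  # shape: (1, hidden_size)

if __name__ == "__main__":
    sentence = "ER crowding has become a wide-spread problem."
    marked = mark_mention(sentence, 0, 2)  # mark the ambiguous "ER" mention
    print(marked)  # <ENT> ER </ENT> crowding has become a wide-spread problem.
    # vec = encode_mention(marked)  # uncomment to compute the embedding
```

In the KRISS setup, such mention embeddings are compared against embeddings of UMLS-derived entity contexts via nearest-neighbor search; this sketch covers only the encoding step.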