Tags: Token Classification · Transformers · PyTorch · English · bert · chemistry · biology · zero-shot · BERT · PubMedBERT · Inference Endpoints
Commit ed03fce ("Update README.md")
Committed by MilosKosRad; 1 parent: 0494f32
Files changed (1): README.md (+3 −3)
README.md CHANGED
@@ -30,7 +30,7 @@ library_name: transformers
 
 This model was created during the research collaboration between Bayer Pharma and The Institute for Artificial Intelligence Research and Development of Serbia.
 The model is trained on 26 biomedical Named Entity (NE) classes and can perform zero-shot inference. It can also be further fine-tuned for new classes with just a few examples (few-shot learning).
-For more details about our method, please see the paper ["A transformer-based method for zero and few-shot biomedical named entity recognition"](https://arxiv.org/abs/2305.04928). The model corresponds to the PubMedBERT-based model trained with 1 in the first segment (see the paper for more details).
+For more details about our method, please see the paper ["From Zero to Hero: Harnessing Transformers for Biomedical Named Entity Recognition in Zero- and Few-shot Contexts"](https://arxiv.org/abs/2305.04928). The model corresponds to the PubMedBERT-based model trained with 1 in the first segment (see the paper for more details).
 
 The model takes two strings as input. String1 is the NE label being searched for in the second string. String2 is the short text in which to search for the NE (represented by String1).
 The model outputs a list of ones (corresponding to the found Named Entities) and zeros (corresponding to the other, non-NE tokens) of String2.
@@ -145,12 +145,12 @@ Code used for training and testing the model is available at https://github.com/
 
 If you use this model, or are inspired by it, please cite the following paper:
 
-Košprdić M., Prodanović N., Ljajić A., Bašaragin B., Milošević N., 2023. A transformer-based method for zero and few-shot biomedical named entity recognition. arXiv preprint arXiv:2305.04928. https://arxiv.org/abs/2305.04928
+Košprdić M., Prodanović N., Ljajić A., Bašaragin B., Milošević N., 2023. From Zero to Hero: Harnessing Transformers for Biomedical Named Entity Recognition in Zero- and Few-shot Contexts. arXiv preprint arXiv:2305.04928. https://arxiv.org/abs/2305.04928
 
 or in BibTeX:
 ```
 @misc{kosprdic2023transformerbased,
-  title={A transformer-based method for zero and few-shot biomedical named entity recognition},
+  title={From Zero to Hero: Harnessing Transformers for Biomedical Named Entity Recognition in Zero- and Few-shot Contexts},
   author={Miloš Košprdić and Nikola Prodanović and Adela Ljajić and Bojana Bašaragin and Nikola Milošević},
   year={2023},
   eprint={2305.04928},
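
For reference, the two-string interface described in the README above (String1 = entity label, String2 = text to search) can be exercised with the standard `transformers` API. The sketch below is illustrative only: the checkpoint id `MilosKosRad/BioNER` and the example strings are assumptions rather than values taken from this diff, and the authors' exact pre- and post-processing may differ (see the training/testing repository linked in the README).

```python
# Minimal zero-shot biomedical NER sketch for the two-string interface described above.
# Assumptions (not confirmed by this diff): the checkpoint id and the example strings
# are placeholders; swap in the actual model repo and inputs you use.
import torch
from transformers import AutoTokenizer, BertForTokenClassification

model_id = "MilosKosRad/BioNER"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = BertForTokenClassification.from_pretrained(model_id, num_labels=2)
model.eval()

string1 = "Drug"                                            # NE label to search for
string2 = "The patient was given 100 mg of aspirin daily."  # text to search in

# String1 goes into the first segment, String2 into the second:
# [CLS] label [SEP] text [SEP]
encodings = tokenizer(
    string1,
    string2,
    truncation=True,
    max_length=512,
    return_tensors="pt",
)

with torch.no_grad():
    logits = model(**encodings).logits  # shape: (1, sequence_length, 2)

# Per-token predictions: 1 = token is part of the searched entity, 0 = it is not.
predictions = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(encodings["input_ids"][0])
for token, label in zip(tokens, predictions):
    print(f"{token}\t{label}")
```

In practice one would typically keep only the predictions for the String2 segment (for example by filtering on `token_type_ids`) when mapping the 0/1 labels back onto words.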