fl399 committed
Commit 64afb2b
1 Parent(s): ec3f68b

Update README.md

Files changed (1): README.md +37 -0
README.md CHANGED
@@ -23,9 +23,46 @@ datasets:
  ### Expected input and output
  The input should be a string of biomedical entity names, e.g., "covid infection" or "Hydroxychloroquine". The [CLS] embedding of the last layer is regarded as the output.
 
+ #### Extracting embeddings from SapBERT
+
+ The following script converts a list of strings (entity names) into embeddings.
+ ```python
+ import numpy as np
+ import torch
+ from tqdm.auto import tqdm
+ from transformers import AutoTokenizer, AutoModel
+
+ tokenizer = AutoTokenizer.from_pretrained("cambridgeltl/SapBERT-from-PubMedBERT-fulltext")
+ model = AutoModel.from_pretrained("cambridgeltl/SapBERT-from-PubMedBERT-fulltext").cuda()
+
+ # replace with your own list of entity names
+ all_names = ["covid-19", "Coronavirus infection", "high fever", "Tumor of posterior wall of oropharynx"]
+
+ bs = 128 # batch size during inference
+ all_embs = []
+ for i in tqdm(np.arange(0, len(all_names), bs)):
+     toks = tokenizer.batch_encode_plus(all_names[i:i+bs],
+                                        padding="max_length",
+                                        max_length=25,
+                                        truncation=True,
+                                        return_tensors="pt")
+     toks_cuda = {}
+     for k,v in toks.items():
+         toks_cuda[k] = v.cuda()
+     cls_rep = model(**toks_cuda)[0][:,0,:] # use CLS representation as the embedding
+     all_embs.append(cls_rep.cpu().detach().numpy())
+
+ all_embs = np.concatenate(all_embs, axis=0)
+ ```
+
+ For more details about training and eval, see the SapBERT [github repo](https://github.com/cambridgeltl/sapbert).
+
  ### SapBERT-PubMedBERT
  SapBERT by [Liu et al. (2020)](https://arxiv.org/pdf/2010.11784.pdf). Trained with [UMLS](https://www.nlm.nih.gov/research/umls/licensedcontent/umlsknowledgesources.html) 2020AA (English only), using [microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext](https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext) as the base model.
 
+
+
+
  ### Citation
  ```bibtex
  @inproceedings{liu-etal-2021-self,
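
The committed script above stops once `all_embs` is built (one row per entity name). As a minimal usage sketch that is not part of this commit — the function `nearest_entities` and the `query_embs` input are illustrative assumptions, with `query_embs` presumed to be computed from query mentions by the same script — the matrix can back a simple cosine nearest-neighbour lookup for entity linking:

```python
# Minimal sketch, not part of the commit above: cosine nearest-neighbour
# lookup over the `all_embs` matrix produced by the committed script.
# `all_names`/`all_embs` come from that script; `query_embs` is assumed to be
# built the same way from the query mentions. `nearest_entities` is a
# hypothetical helper, not a SapBERT API.
import numpy as np

def nearest_entities(query_embs, all_embs, k=1):
    """For each query embedding, return indices of the k most similar entity rows."""
    # L2-normalise both sides (epsilon guards against zero vectors) so that
    # a plain dot product equals cosine similarity.
    q = query_embs / (np.linalg.norm(query_embs, axis=1, keepdims=True) + 1e-12)
    e = all_embs / (np.linalg.norm(all_embs, axis=1, keepdims=True) + 1e-12)
    sims = q @ e.T  # shape: (n_queries, n_entities)
    return np.argsort(-sims, axis=1)[:, :k]

# Example (assuming query_embs was computed for ["covid infection"]):
# idx = nearest_entities(query_embs, all_embs)[0, 0]
# all_names[idx]  # expected to be a COVID-related name such as "covid-19"
```

For a UMLS-scale name inventory, the dense matrix product would typically be replaced by an approximate nearest-neighbour index; the exact search above is only meant to show what the embeddings are for.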