Entity-centric reference encoder?

#2
by mhill - opened

Does this model include the entity-centric reference encoder? If not, are you planning to release that part of the model, or the code to train the model with that encoder?

Same question.

I'm struggling to figure out how to use this model without entity embeddings. The run_entity_linking script builds a FAISS index from some vector files, but those are not included in the model files.
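For what it's worth, the indexing step itself is straightforward once you have the vectors; it's the vectors that are missing. Something like the following is presumably what the script does internally. This is just a minimal sketch: the file names embs.npy and cuis.json are placeholders I made up, not files shipped with the model.

import json
import numpy as np
import faiss  # pip install faiss-cpu

# Hypothetical inputs: an (N, hidden_size) array of prototype embeddings
# and a parallel list of UMLS CUIs, one entry per row.
embs = np.load("embs.npy").astype("float32")
with open("cuis.json") as f:
    cuis = json.load(f)

# Exact inner-product index; with L2-normalized vectors this is cosine similarity.
faiss.normalize_L2(embs)
index = faiss.IndexFlatIP(embs.shape[1])
index.add(embs)
faiss.write_index(index, "entities.index")

But without the prototype vectors themselves (or the code and data to regenerate them), this doesn't get us very far.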

Same question. I am unsure how to use the model for inference. So far I've tried using it like any other transformer model for computing word embeddings, but I'm still not sure whether this is how the model was intended to be used, and if so, what to do next:

import numpy as np
import torch
from tqdm.auto import tqdm
from transformers import AutoTokenizer, AutoModel

model_name = "microsoft/BiomedNLP-KRISSBERT-PubMed-UMLS-EL"

def get_word_embeddings(tokenizer, model, names):
    bs = 128  # batch size during inference
    all_embs = []
    for i in tqdm(range(0, len(names), bs)):
        toks = tokenizer(names[i:i+bs],
                         padding="max_length",
                         max_length=25,
                         truncation=True,
                         return_tensors="pt")
        toks_cuda = {k: v.cuda() for k, v in toks.items()}
        with torch.no_grad():
            # use the CLS representation as the embedding
            cls_rep = model(**toks_cuda)[0][:, 0, :]
        all_embs.append(cls_rep.cpu().numpy())
    return np.concatenate(all_embs, axis=0)

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name).cuda()
ent_emb = get_word_embeddings(tokenizer, model, ent_l)  # ent_l: your list of entity name strings
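If these CLS embeddings are meaningful, the natural next step would be nearest-neighbor lookup against them. Here is a minimal sketch of that step, reusing ent_emb and ent_l from above and assuming cosine similarity is the right metric (I haven't verified this against the paper, and it ignores whatever mention markers or prototype voting the intended KRISSBERT pipeline uses):

import faiss  # pip install faiss-cpu

# Index the candidate entity embeddings computed above.
ent_emb = ent_emb.astype("float32")
faiss.normalize_L2(ent_emb)
index = faiss.IndexFlatIP(ent_emb.shape[1])
index.add(ent_emb)

# Encode a mention with the same encoder and retrieve the closest entities.
mention_emb = get_word_embeddings(tokenizer, model, ["heart attack"]).astype("float32")
faiss.normalize_L2(mention_emb)
scores, ids = index.search(mention_emb, k=5)
print([ent_l[i] for i in ids[0]])

Whether this matches what the authors intended is exactly what I'd like them to confirm.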

This approach to training a model for entity linking has great potential, and I'm extremely grateful to Microsoft for their research and for releasing their models. It would be great if they elaborated a little on how this model can be used for entity linking, assuming a valid UMLS license. Otherwise, I'm hoping we (the community) can collaborate to figure out how to use it.

I am also trying to figure out the utility of this model for embedding, since the self-supervised encoding part is missing from the model files. I am not sure whether it should be used like other linking models or not.
