Relation-distilled BERT (RelBERT): a lexical relation embedding model driven by pre-trained language models.
RelBERT produces high-quality semantic embeddings of word pairs, powered by a pre-trained language model. Install relbert via pip,
pip install relbert
and play with RelBERT models.
from relbert import RelBERT

model = RelBERT('relbert/relbert-roberta-large')
vector = model.get_embedding(['Tokyo', 'Japan'])  # shape of (1024, )
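A common way to use these word-pair embeddings is to compare two of them with cosine similarity, so that analogous pairs (e.g. Tokyo:Japan and Paris:France) score higher than unrelated ones. A minimal sketch of that comparison, using small placeholder vectors in place of real `model.get_embedding` outputs (this sketch does not call RelBERT itself):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Placeholder vectors; in practice these would be the 1024-dimensional
# outputs of model.get_embedding(['Tokyo', 'Japan']) and
# model.get_embedding(['Paris', 'France']).
vec_tokyo_japan = [0.2, -0.5, 0.8, 0.1]
vec_paris_france = [0.3, -0.4, 0.7, 0.0]

score = cosine_similarity(vec_tokyo_japan, vec_paris_france)
print(f"similarity: {score:.4f}")
```

In practice one would embed a query pair and rank candidate pairs by this score to find the most relationally similar one.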
See more information below.