---
language: en
tags:
- sentence-embeddings
- sentence-similarity
- dual-encoder
---
# cambridgeltl/trans-encoder-bi-simcse-roberta-large
An unsupervised sentence encoder (bi-encoder) proposed by Liu et al. (2021). The model is trained on unlabelled sentence pairs sampled from STS 2012-2016, STS-b, and SICK-R, using `princeton-nlp/unsup-simcse-roberta-large` as the base model. Use the `[CLS]` token representation (before the pooler) as the sentence embedding.
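
For reference, here is a minimal sketch (not an official snippet) of how the `[CLS]`-before-pooler embedding can be extracted with Hugging Face Transformers; the example sentences and the cosine-similarity comparison are illustrative only.

```python
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "cambridgeltl/trans-encoder-bi-simcse-roberta-large"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

sentences = [
    "A man is playing a guitar.",       # illustrative example
    "Someone is playing an instrument.",
]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# [CLS] representation before the pooler: the first token of the last hidden layer.
embeddings = outputs.last_hidden_state[:, 0, :]

# Cosine similarity between the two sentence embeddings.
similarity = torch.nn.functional.cosine_similarity(
    embeddings[0].unsqueeze(0), embeddings[1].unsqueeze(0)
)
print(similarity.item())
```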
## Citation

```bibtex
@article{liu2021trans,
  title={Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations},
  author={Liu, Fangyu and Jiao, Yunlong and Massiah, Jordan and Yilmaz, Emine and Havrylov, Serhii},
  journal={arXiv preprint arXiv:2109.13059},
  year={2021}
}
```