arXiv:2104.08821

SimCSE: Simple Contrastive Learning of Sentence Embeddings

Published on Apr 18, 2021
Authors: Tianyu Gao, Xingcheng Yao, Danqi Chen

Abstract

This paper presents SimCSE, a simple contrastive learning framework that greatly advances state-of-the-art sentence embeddings. We first describe an unsupervised approach, which takes an input sentence and predicts itself in a contrastive objective, with only standard dropout used as noise. This simple method works surprisingly well, performing on par with previous supervised counterparts. We find that dropout acts as minimal data augmentation, and removing it leads to a representation collapse. Then, we propose a supervised approach, which incorporates annotated pairs from natural language inference datasets into our contrastive learning framework by using "entailment" pairs as positives and "contradiction" pairs as hard negatives. We evaluate SimCSE on standard semantic textual similarity (STS) tasks, and our unsupervised and supervised models using BERT base achieve an average of 76.3% and 81.6% Spearman's correlation respectively, a 4.2% and 2.2% improvement compared to the previous best results. We also show -- both theoretically and empirically -- that the contrastive learning objective regularizes pre-trained embeddings' anisotropic space to be more uniform, and it better aligns positive pairs when supervised signals are available.
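
As a concrete illustration of the two objectives described above, here is a minimal PyTorch sketch. It is not the authors' released implementation: the `encoder` callable (anything mapping token ids to fixed-size sentence embeddings, e.g. a BERT model with pooling), the function and variable names, and the batching are assumptions made for this example; the 0.05 temperature follows the setting reported in the paper.

```python
import torch
import torch.nn.functional as F

def unsup_simcse_loss(encoder, input_ids, attention_mask, temperature=0.05):
    """Unsupervised SimCSE: encode the same batch twice. Because dropout is
    active in train mode, the two passes produce two different embeddings of
    each sentence, which form the positive pair; all other in-batch
    sentences serve as negatives (an InfoNCE objective)."""
    z1 = encoder(input_ids, attention_mask)  # (batch, dim), dropout mask #1
    z2 = encoder(input_ids, attention_mask)  # (batch, dim), dropout mask #2
    # Pairwise cosine similarities between the two views: (batch, batch).
    sim = F.cosine_similarity(z1.unsqueeze(1), z2.unsqueeze(0), dim=-1) / temperature
    # Sentence i's positive is column i, i.e. the diagonal of the matrix.
    labels = torch.arange(sim.size(0), device=sim.device)
    return F.cross_entropy(sim, labels)

def sup_simcse_loss(h, h_pos, h_neg, temperature=0.05):
    """Supervised SimCSE: for premise embeddings h, the entailment-hypothesis
    embeddings h_pos are positives and the contradiction-hypothesis
    embeddings h_neg are hard negatives, appended as extra logit columns."""
    sim_pos = F.cosine_similarity(h.unsqueeze(1), h_pos.unsqueeze(0), dim=-1)
    sim_neg = F.cosine_similarity(h.unsqueeze(1), h_neg.unsqueeze(0), dim=-1)
    sim = torch.cat([sim_pos, sim_neg], dim=1) / temperature  # (batch, 2*batch)
    labels = torch.arange(sim.size(0), device=h.device)
    return F.cross_entropy(sim, labels)
```

The key detail in the unsupervised loss is that the encoder must stay in train mode: the two forward passes differ only in their sampled dropout masks. With dropout disabled, the two views become identical and the objective degenerates, consistent with the representation collapse the abstract mentions.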

Models citing this paper 35

Datasets citing this paper 0

Spaces citing this paper 22

Collections including this paper 1