rbroc committed on
Commit e924706
1 Parent(s): fe55222

Update README.md

Files changed (1)
README.md +2 -2
README.md CHANGED
@@ -7,7 +7,7 @@ tags:
 - speech
 ---
 
-### Contrastive user encoder
+### Contrastive user encoder (single post)
 This model is a `DistilBertModel` trained by fine-tuning `distilbert-base-uncased` on author-based triplet loss.
 
 #### Details
@@ -17,7 +17,7 @@ Training and evaluation details are provided in our EMNLP Findings paper:
 
 #### Training
 We fine-tuned DistilBERT on triplets consisting of:
-- a Reddit submission from a given user (the "anchor");
+- a single Reddit submission from a given user (the "anchor") - see ```rbroc/contrastive-user-encoder-multipost``` for a model trained on aggregated embeddings of multiple anchors;
 - an additional post from the same user (a "positive example");
 - a post from a different, randomly selected user (the "negative example")
 
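For readers who want to see what the triplet objective described in the README amounts to, here is a minimal sketch. The mean pooling, the margin value, and the use of `torch.nn.TripletMarginLoss` are illustrative assumptions; they are not taken from this commit or the associated paper.

```python
# Minimal sketch of the anchor/positive/negative triplet setup described above.
# Assumptions (not specified by this commit): mean pooling over token embeddings
# and torch's built-in TripletMarginLoss with margin=1.0.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModel.from_pretrained("distilbert-base-uncased")

def embed(texts):
    """Mean-pool the last hidden state into one vector per post (assumed pooling)."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    hidden = model(**batch).last_hidden_state        # (batch, seq_len, dim)
    mask = batch["attention_mask"].unsqueeze(-1)     # (batch, seq_len, 1)
    return (hidden * mask).sum(1) / mask.sum(1)      # (batch, dim)

# One triplet: anchor and positive from the same user, negative from a different user.
anchor   = embed(["A Reddit submission written by user A ..."])
positive = embed(["Another post by the same user A ..."])
negative = embed(["A post by a randomly selected user B ..."])

loss = torch.nn.TripletMarginLoss(margin=1.0)(anchor, positive, negative)
loss.backward()  # an optimizer step would follow during fine-tuning
```

The triplet loss pulls the anchor embedding toward the positive post and pushes it away from the negative one, so posts by the same author end up close together in embedding space.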