Use BERT instead of ELECTRA?
#1 by PhilipMay - opened
Hi,
happy to see that you are using our model to train this. :-)
Just wanted to drop by and say that ELECTRA does not provide good performance when working with sentence embeddings (semantic embeddings).
The exact reason is unknown, but BERT and RoBERTa models work better...
Maybe you want to use deepset/gbert-base or, even better, deepset/gbert-large - they might perform better...