
klue-cross-encoder-v1

  • A cross-encoder model fine-tuned from klue/bert-base
  • This model was trained using SentenceTransformers Cross-Encoder class.

Training

  • Training schedule: sts(10)-sts(10) (two successive STS fine-tuning runs of 10 epochs each)

  • STS: seed=111, epochs=10, lr=1e-4, eps=1e-6, warmup=10% of steps, max_seq_len=128, train_batch=128 (32 for the small model), (albert 13m/7G); training code (a minimal sketch is shown after the results table below)

  • Evaluation code, test code

  • Evaluation results:

    Model                               korsts   klue-sts   glue(stsb)   stsb_multi_mt(en)
    albert-small-kor-cross-encoder-v1   0.8455   0.8526     0.8513       0.7976
    klue-cross-encoder-v1               0.8262   0.8833     0.8512       0.7889
    kpf-cross-encoder-v1                0.8799   0.9133     0.8626       0.8027
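
The hyperparameters above map onto a standard SentenceTransformers CrossEncoder fine-tuning loop. The following is a minimal, hypothetical sketch (not the author's actual training script): it assumes KLUE STS loaded via the datasets library as the training set and shows a single STS pass, whereas the schedule above runs STS fine-tuning twice.

import math
import torch
from torch.utils.data import DataLoader
from datasets import load_dataset
from sentence_transformers import CrossEncoder, InputExample

torch.manual_seed(111)  # seed=111 as listed above

# KLUE STS similarity labels range from 0 to 5; scale them to 0..1 for a single regression label
klue_sts = load_dataset('klue', 'sts')
train_samples = [InputExample(texts=[row['sentence1'], row['sentence2']],
                              label=row['labels']['label'] / 5.0)
                 for row in klue_sts['train']]

model = CrossEncoder('klue/bert-base', num_labels=1, max_length=128)

train_dataloader = DataLoader(train_samples, shuffle=True, batch_size=128)
num_epochs = 10
warmup_steps = math.ceil(len(train_dataloader) * num_epochs * 0.1)  # 10% warm-up

model.fit(train_dataloader=train_dataloader,
          epochs=num_epochs,
          warmup_steps=warmup_steps,
          optimizer_params={'lr': 1e-4, 'eps': 1e-6})
model.save('klue-cross-encoder-v1')

For evaluation, sentence_transformers provides CECorrelationEvaluator, which computes the Pearson/Spearman correlation between predicted scores and gold STS labels.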

Usage and Performance

Pre-trained models can be used like this:

from sentence_transformers import CrossEncoder

# This example loads the kpf checkpoint; 'bongsoo/klue-cross-encoder-v1' is loaded the same way (scores will differ)
model = CrossEncoder('bongsoo/kpf-cross-encoder-v1')
scores = model.predict([('오늘 날씨가 좋다', '오늘 등산을 한다'), ('오늘 날씨가 흐리다', '오늘 비가 내린다')])
print(scores)
[0.10161418 0.45563662]

The model predicts a similarity score for each sentence pair; higher scores indicate that the two sentences are more semantically similar.

You can also use this model without sentence_transformers, loading it directly with the Transformers AutoTokenizer and AutoModelForSequenceClassification classes:
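
A minimal sketch, assuming the checkpoint carries a single-label sequence-classification head (which is how SentenceTransformers Cross-Encoder models are saved) and that a sigmoid is applied to the logits to mirror CrossEncoder.predict:

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = 'bongsoo/kpf-cross-encoder-v1'  # or 'bongsoo/klue-cross-encoder-v1'
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

pairs = [('오늘 날씨가 좋다', '오늘 등산을 한다'),
         ('오늘 날씨가 흐리다', '오늘 비가 내린다')]

# Tokenize each sentence pair together so the model scores them jointly (cross-encoding)
features = tokenizer([p[0] for p in pairs], [p[1] for p in pairs],
                     padding=True, truncation=True, return_tensors='pt')

with torch.no_grad():
    logits = model(**features).logits           # shape: (batch_size, 1)
    scores = torch.sigmoid(logits).squeeze(-1)  # sigmoid matches CrossEncoder's default for single-label models
print(scores)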
