
SetFit with jhgan/ko-sroberta-multitask

This is a SetFit model that can be used for text classification. It uses jhgan/ko-sroberta-multitask as the Sentence Transformer embedding model and a LogisticRegression instance as the classification head.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
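
As a minimal sketch of that two-step procedure (the tiny dataset below is an illustrative placeholder, not the data this model was trained on):

from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Start from the base Sentence Transformer; SetFit attaches a fresh LogisticRegression head.
model = SetFitModel.from_pretrained("jhgan/ko-sroberta-multitask")

# A handful of labeled examples per class is enough for SetFit (placeholder examples).
train_dataset = Dataset.from_dict({
    "text": [
        "계약부서 승인 절차가 궁금합니다",
        "QR코드 스캔 후 필요한 서류는?",
        "오늘 기분이 어때?",
        "너는 누구야?",
    ],
    "label": ["rag", "rag", "general", "general"],
})

trainer = Trainer(
    model=model,
    args=TrainingArguments(batch_size=16, num_epochs=1),
    train_dataset=train_dataset,
)
trainer.train()  # 1) contrastive fine-tuning of the body, 2) fitting the classification head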

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: jhgan/ko-sroberta-multitask
  • Classification head: a LogisticRegression instance
  • Number of Classes: 2

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: https://arxiv.org/abs/2209.11055

Model Labels

Label Examples
rag
  • 'QR코드 스캔 후 필요한 서류와 절차는?'
  • '연구용역사업의 원가계산서 관련, 일정 금액 이상 지출 승인은 누구에게 받나요?'
  • '계약부서 승인 없이 지급신청 시 주의할 점은?'
general
  • '아래 글의 요지 좀 설명해줘.\n \n 다른 문화권에서 온 여자와 데이트. 관계에 대해 좋은 점이 많이 있습니다. 공통된 직업적 관심사, 동일한 성욕, 그리고 서로를 존중한다는 점은 제게는 새로운 관계입니다(항상 남성에 대해 안 좋은 태도를 가진 여자들과만 사귀어 왔죠). 그녀는 저를 정말 사랑해요. \n \n 하지만 장기적인 생존 가능성에 대해 몇 가지 심각한 우려가 있습니다. 하나는 부모님에 관한 것입니다. 제 부모님은 우리가 사귀는 사이라는 사실을 알게 되자 "네가 미국에 머물 수 있는 티켓이라는 걸 기억하라"고 말씀하셨어요. 우리가 진짜 사귀는 사이라는 사실을 알게 된 부모님은 제가 얼마나 버는지 알고 싶어 하셨고(저는 대학원생입니다), 존경의 표시로 은퇴한 부모님을 부양하는 전통에 대해 제가 괜찮은지 확인하고 싶어 하셨습니다(부모님은 그런 도움이 필요 없을 만큼 잘 살고 계시지만요). 여자친구는 이에 대해 부모님의 의견에 동의하며 제가 괜찮지 않다면 돈을 더 벌어서 직접 해야 한다고 말했습니다. 또한 여자친구는 제가 이전에 결혼했고 지금은 이혼했다는 사실을 부모님이 '절대 알 수 없다'고 말합니다. \n \n 제가 극복하거나 간과할 수 있었던 다른 문제들도 있지만(한 가지 예로, 그녀는 사교적이지 않지만 저는 사교적입니다), 이러한 문제들이 결합되어 그녀와의 미래는 앞으로 많은 문제가 예고되어 있고 위험하다고 느낍니다. 이전 결혼 생활에서 저는 그런 징후를 무시하고 대가를 치렀고, 그 역사를 반복하고 싶지 않습니다. 동시에 저와 성적으로도 잘 어울리는 파트너가 있다는 것은 정말 좋은 일입니다. \n \n 다른 사람들은 이런 다문화적인 상황에서 어떤 경험을 했는지, 특히 장기적인 경험이 있다면 어떤지 궁금합니다.'
  • '너는 누구냐니까'
  • '문제와 몇 가지 답 옵션("A", "B", "C", "D"와 연관된)이 주어집니다. 상식적인 지식을 바탕으로 정답을 선택해야 합니다. 연상에 기반한 답은 피하고, 답안 세트는 연상을 넘어서는 상식을 파악하기 위해 의도적으로 선택된 것입니다. 'A', 'B', 'C', 'D', 'E' 중 하나를 제외하고는 다른 문자를 생성하지 말고 각 문제에 대해 하나의 답만 작성하세요.\n\n폰이라는 이름은 매우 다재다능할 수 있지만, 모든 부품이 중요한 것은 무엇일까요?\n(A)체스 게임 (B)계획 (C)체스 세트 (D)체커 (E)노스 캐롤라이나'

Evaluation

Metrics

Label Accuracy
all 0.9952
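
The value above is the label accuracy reported by SetFit's evaluation on a held-out split. A hedged sketch of computing the same metric yourself (the two evaluation examples below are placeholders taken from the label examples above):

from setfit import SetFitModel
from sklearn.metrics import accuracy_score

model = SetFitModel.from_pretrained("NTIS/sroberta-embedding")

eval_texts = ["계약부서 승인 없이 지급신청 시 주의할 점은?", "너는 누구냐니까"]
eval_labels = ["rag", "general"]

preds = model.predict(eval_texts)
print(accuracy_score(eval_labels, preds))  # label accuracy over the held-out examples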

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("NTIS/sroberta-embedding")
# Run inference
preds = model("원장 인계 전 필요한 절차는?")
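
Batched inputs and class probabilities work the same way (the texts below are illustrative examples from this card):

texts = ["QR코드 스캔 후 필요한 서류와 절차는?", "너는 누구냐니까"]
preds = model.predict(texts)        # e.g. ['rag', 'general']
probs = model.predict_proba(texts)  # per-class probabilities from the LogisticRegression head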

Training Details

Training Set Metrics

Training set Min Median Max
Word count   2   24.824 722

Label   Training Sample Count
rag     553
general 447

Training Hyperparameters

  • batch_size: (64, 64)
  • num_epochs: (4, 4)
  • max_steps: -1
  • sampling_strategy: oversampling
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: True
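
These values map onto SetFit's TrainingArguments. A hedged sketch of the corresponding configuration (arguments not shown here keep the values listed above or their defaults):

from sentence_transformers.losses import CosineSimilarityLoss
from setfit import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
args = TrainingArguments(
    batch_size=(64, 64),                 # (embedding fine-tuning, classifier training)
    num_epochs=(4, 4),
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    sampling_strategy="oversampling",
    loss=CosineSimilarityLoss,
    warmup_proportion=0.1,
    seed=42,
)
# `args` would then be passed to setfit.Trainer together with the train/eval datasets.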

Training Results

Epoch Step Training Loss Validation Loss
0.0001 1 0.2655 -
0.0063 50 0.2091 -
0.0126 100 0.2327 -
0.0189 150 0.1578 -
0.0253 200 0.0836 -
0.0316 250 0.0274 -
0.0379 300 0.0068 -
0.0442 350 0.0032 -
0.0505 400 0.0013 -
0.0568 450 0.0012 -
0.0632 500 0.0009 -
0.0695 550 0.0006 -
0.0758 600 0.0004 -
0.0821 650 0.0004 -
0.0884 700 0.0003 -
0.0947 750 0.0003 -
0.1011 800 0.0003 -
0.1074 850 0.0002 -
0.1137 900 0.0002 -
0.1200 950 0.0002 -
0.1263 1000 0.0002 -
0.1326 1050 0.0001 -
0.1390 1100 0.0001 -
0.1453 1150 0.0001 -
0.1516 1200 0.0001 -
0.1579 1250 0.0001 -
0.1642 1300 0.0001 -
0.1705 1350 0.0001 -
0.1769 1400 0.0001 -
0.1832 1450 0.0001 -
0.1895 1500 0.0001 -
0.1958 1550 0.0001 -
0.2021 1600 0.0 -
0.2084 1650 0.0001 -
0.2148 1700 0.0001 -
0.2211 1750 0.0 -
0.2274 1800 0.0001 -
0.2337 1850 0.0 -
0.2400 1900 0.0 -
0.2463 1950 0.0 -
0.2527 2000 0.0 -
0.2590 2050 0.0 -
0.2653 2100 0.0 -
0.2716 2150 0.0 -
0.2779 2200 0.0 -
0.2842 2250 0.0 -
0.2906 2300 0.0 -
0.2969 2350 0.0 -
0.3032 2400 0.0 -
0.3095 2450 0.0 -
0.3158 2500 0.0 -
0.3221 2550 0.0 -
0.3284 2600 0.0 -
0.3348 2650 0.0 -
0.3411 2700 0.0 -
0.3474 2750 0.0 -
0.3537 2800 0.0 -
0.3600 2850 0.0 -
0.3663 2900 0.0 -
0.3727 2950 0.0 -
0.3790 3000 0.0 -
0.3853 3050 0.0 -
0.3916 3100 0.0 -
0.3979 3150 0.0 -
0.4042 3200 0.0 -
0.4106 3250 0.0 -
0.4169 3300 0.0 -
0.4232 3350 0.0 -
0.4295 3400 0.0 -
0.4358 3450 0.0 -
0.4421 3500 0.0 -
0.4485 3550 0.0 -
0.4548 3600 0.0 -
0.4611 3650 0.0 -
0.4674 3700 0.0 -
0.4737 3750 0.0 -
0.4800 3800 0.0 -
0.4864 3850 0.0 -
0.4927 3900 0.0 -
0.4990 3950 0.0 -
0.5053 4000 0.0 -
0.5116 4050 0.0 -
0.5179 4100 0.0 -
0.5243 4150 0.0 -
0.5306 4200 0.0 -
0.5369 4250 0.0 -
0.5432 4300 0.0 -
0.5495 4350 0.0004 -
0.5558 4400 0.0001 -
0.5622 4450 0.0 -
0.5685 4500 0.0096 -
0.5748 4550 0.0 -
0.5811 4600 0.0 -
0.5874 4650 0.0 -
0.5937 4700 0.0 -
0.6001 4750 0.0 -
0.6064 4800 0.0 -
0.6127 4850 0.0 -
0.6190 4900 0.0 -
0.6253 4950 0.0 -
0.6316 5000 0.0 -
0.6379 5050 0.0 -
0.6443 5100 0.0 -
0.6506 5150 0.0 -
0.6569 5200 0.0 -
0.6632 5250 0.0 -
0.6695 5300 0.0 -
0.6758 5350 0.0 -
0.6822 5400 0.0 -
0.6885 5450 0.0 -
0.6948 5500 0.0 -
0.7011 5550 0.0 -
0.7074 5600 0.0 -
0.7137 5650 0.0 -
0.7201 5700 0.0 -
0.7264 5750 0.0 -
0.7327 5800 0.0 -
0.7390 5850 0.0 -
0.7453 5900 0.0 -
0.7516 5950 0.0 -
0.7580 6000 0.0 -
0.7643 6050 0.0 -
0.7706 6100 0.0 -
0.7769 6150 0.0 -
0.7832 6200 0.0 -
0.7895 6250 0.0 -
0.7959 6300 0.0 -
0.8022 6350 0.0 -
0.8085 6400 0.0 -
0.8148 6450 0.0 -
0.8211 6500 0.0 -
0.8274 6550 0.0 -
0.8338 6600 0.0 -
0.8401 6650 0.0 -
0.8464 6700 0.0 -
0.8527 6750 0.0 -
0.8590 6800 0.0 -
0.8653 6850 0.0 -
0.8717 6900 0.0 -
0.8780 6950 0.0 -
0.8843 7000 0.0 -
0.8906 7050 0.0 -
0.8969 7100 0.0 -
0.9032 7150 0.0 -
0.9096 7200 0.0 -
0.9159 7250 0.0 -
0.9222 7300 0.0 -
0.9285 7350 0.0 -
0.9348 7400 0.0 -
0.9411 7450 0.0 -
0.9474 7500 0.0 -
0.9538 7550 0.0 -
0.9601 7600 0.0 -
0.9664 7650 0.0 -
0.9727 7700 0.0 -
0.9790 7750 0.0 -
0.9853 7800 0.0 -
0.9917 7850 0.0 -
0.9980 7900 0.0 -
1.0 7916 - 0.0096
1.0043 7950 0.0 -
1.0106 8000 0.0 -
1.0169 8050 0.0 -
1.0232 8100 0.0 -
1.0296 8150 0.0 -
1.0359 8200 0.0 -
1.0422 8250 0.0 -
1.0485 8300 0.0 -
1.0548 8350 0.0 -
1.0611 8400 0.0 -
1.0675 8450 0.0 -
1.0738 8500 0.0 -
1.0801 8550 0.0 -
1.0864 8600 0.0 -
1.0927 8650 0.0 -
1.0990 8700 0.0 -
1.1054 8750 0.0 -
1.1117 8800 0.0 -
1.1180 8850 0.0 -
1.1243 8900 0.0 -
1.1306 8950 0.0 -
1.1369 9000 0.0 -
1.1433 9050 0.0 -
1.1496 9100 0.0 -
1.1559 9150 0.0 -
1.1622 9200 0.0 -
1.1685 9250 0.0 -
1.1748 9300 0.0 -
1.1812 9350 0.0 -
1.1875 9400 0.0 -
1.1938 9450 0.0 -
1.2001 9500 0.0 -
1.2064 9550 0.0 -
1.2127 9600 0.0 -
1.2191 9650 0.0 -
1.2254 9700 0.0 -
1.2317 9750 0.0 -
1.2380 9800 0.0 -
1.2443 9850 0.0 -
1.2506 9900 0.0 -
1.2569 9950 0.0 -
1.2633 10000 0.0 -
1.2696 10050 0.0 -
1.2759 10100 0.0 -
1.2822 10150 0.0 -
1.2885 10200 0.0 -
1.2948 10250 0.0 -
1.3012 10300 0.0 -
1.3075 10350 0.0 -
1.3138 10400 0.0 -
1.3201 10450 0.0 -
1.3264 10500 0.0 -
1.3327 10550 0.0 -
1.3391 10600 0.0 -
1.3454 10650 0.0 -
1.3517 10700 0.0 -
1.3580 10750 0.0 -
1.3643 10800 0.0 -
1.3706 10850 0.0 -
1.3770 10900 0.0 -
1.3833 10950 0.0 -
1.3896 11000 0.0 -
1.3959 11050 0.0 -
1.4022 11100 0.0 -
1.4085 11150 0.0 -
1.4149 11200 0.0 -
1.4212 11250 0.0 -
1.4275 11300 0.0 -
1.4338 11350 0.0 -
1.4401 11400 0.0 -
1.4464 11450 0.0 -
1.4528 11500 0.0 -
1.4591 11550 0.0 -
1.4654 11600 0.0 -
1.4717 11650 0.0 -
1.4780 11700 0.0 -
1.4843 11750 0.0 -
1.4907 11800 0.0 -
1.4970 11850 0.0 -
1.5033 11900 0.0 -
1.5096 11950 0.0 -
1.5159 12000 0.0 -
1.5222 12050 0.0 -
1.5285 12100 0.0 -
1.5349 12150 0.0 -
1.5412 12200 0.0 -
1.5475 12250 0.0 -
1.5538 12300 0.0 -
1.5601 12350 0.0 -
1.5664 12400 0.0 -
1.5728 12450 0.0 -
1.5791 12500 0.0 -
1.5854 12550 0.0 -
1.5917 12600 0.0 -
1.5980 12650 0.0 -
1.6043 12700 0.0 -
1.6107 12750 0.0 -
1.6170 12800 0.0 -
1.6233 12850 0.0 -
1.6296 12900 0.0 -
1.6359 12950 0.0 -
1.6422 13000 0.0 -
1.6486 13050 0.0 -
1.6549 13100 0.0 -
1.6612 13150 0.0 -
1.6675 13200 0.0 -
1.6738 13250 0.0 -
1.6801 13300 0.0 -
1.6865 13350 0.0 -
1.6928 13400 0.0 -
1.6991 13450 0.0 -
1.7054 13500 0.0 -
1.7117 13550 0.0 -
1.7180 13600 0.0 -
1.7244 13650 0.0 -
1.7307 13700 0.0 -
1.7370 13750 0.0 -
1.7433 13800 0.0 -
1.7496 13850 0.0 -
1.7559 13900 0.0 -
1.7623 13950 0.0 -
1.7686 14000 0.0 -
1.7749 14050 0.0 -
1.7812 14100 0.0 -
1.7875 14150 0.0 -
1.7938 14200 0.0 -
1.8002 14250 0.0 -
1.8065 14300 0.0 -
1.8128 14350 0.0 -
1.8191 14400 0.0 -
1.8254 14450 0.0 -
1.8317 14500 0.0 -
1.8380 14550 0.0 -
1.8444 14600 0.0 -
1.8507 14650 0.0 -
1.8570 14700 0.0 -
1.8633 14750 0.0 -
1.8696 14800 0.0 -
1.8759 14850 0.0 -
1.8823 14900 0.0 -
1.8886 14950 0.0 -
1.8949 15000 0.0 -
1.9012 15050 0.0 -
1.9075 15100 0.0 -
1.9138 15150 0.0 -
1.9202 15200 0.0 -
1.9265 15250 0.0 -
1.9328 15300 0.0 -
1.9391 15350 0.0 -
1.9454 15400 0.0 -
1.9517 15450 0.0 -
1.9581 15500 0.0 -
1.9644 15550 0.0 -
1.9707 15600 0.0 -
1.9770 15650 0.0 -
1.9833 15700 0.0 -
1.9896 15750 0.0 -
1.9960 15800 0.0 -
2.0 15832 - 0.0096
2.0023 15850 0.0 -
2.0086 15900 0.0 -
2.0149 15950 0.0 -
2.0212 16000 0.0 -
2.0275 16050 0.0 -
2.0339 16100 0.0 -
2.0402 16150 0.0 -
2.0465 16200 0.0 -
2.0528 16250 0.0 -
2.0591 16300 0.0 -
2.0654 16350 0.0 -
2.0718 16400 0.0 -
2.0781 16450 0.0 -
2.0844 16500 0.0 -
2.0907 16550 0.0 -
2.0970 16600 0.0 -
2.1033 16650 0.0 -
2.1097 16700 0.0 -
2.1160 16750 0.0 -
2.1223 16800 0.0 -
2.1286 16850 0.0 -
2.1349 16900 0.0 -
2.1412 16950 0.0 -
2.1475 17000 0.0 -
2.1539 17050 0.0 -
2.1602 17100 0.0 -
2.1665 17150 0.0 -
2.1728 17200 0.0 -
2.1791 17250 0.0 -
2.1854 17300 0.0 -
2.1918 17350 0.0 -
2.1981 17400 0.0 -
2.2044 17450 0.0 -
2.2107 17500 0.0 -
2.2170 17550 0.0 -
2.2233 17600 0.0 -
2.2297 17650 0.0 -
2.2360 17700 0.0 -
2.2423 17750 0.0 -
2.2486 17800 0.0 -
2.2549 17850 0.0 -
2.2612 17900 0.0 -
2.2676 17950 0.0 -
2.2739 18000 0.0 -
2.2802 18050 0.0 -
2.2865 18100 0.0 -
2.2928 18150 0.0 -
2.2991 18200 0.0 -
2.3055 18250 0.0 -
2.3118 18300 0.0 -
2.3181 18350 0.0 -
2.3244 18400 0.0 -
2.3307 18450 0.0 -
2.3370 18500 0.0 -
2.3434 18550 0.0 -
2.3497 18600 0.0 -
2.3560 18650 0.0 -
2.3623 18700 0.0 -
2.3686 18750 0.0 -
2.3749 18800 0.0 -
2.3813 18850 0.0 -
2.3876 18900 0.0 -
2.3939 18950 0.0 -
2.4002 19000 0.0 -
2.4065 19050 0.0 -
2.4128 19100 0.0 -
2.4192 19150 0.0 -
2.4255 19200 0.0 -
2.4318 19250 0.0 -
2.4381 19300 0.0 -
2.4444 19350 0.0 -
2.4507 19400 0.0 -
2.4570 19450 0.0 -
2.4634 19500 0.0 -
2.4697 19550 0.0 -
2.4760 19600 0.0 -
2.4823 19650 0.0 -
2.4886 19700 0.0 -
2.4949 19750 0.0 -
2.5013 19800 0.0 -
2.5076 19850 0.0 -
2.5139 19900 0.0 -
2.5202 19950 0.0 -
2.5265 20000 0.0 -
2.5328 20050 0.0 -
2.5392 20100 0.0 -
2.5455 20150 0.0 -
2.5518 20200 0.0 -
2.5581 20250 0.0 -
2.5644 20300 0.0 -
2.5707 20350 0.0 -
2.5771 20400 0.0 -
2.5834 20450 0.0 -
2.5897 20500 0.0 -
2.5960 20550 0.0 -
2.6023 20600 0.0 -
2.6086 20650 0.0 -
2.6150 20700 0.0 -
2.6213 20750 0.0 -
2.6276 20800 0.0 -
2.6339 20850 0.0 -
2.6402 20900 0.0 -
2.6465 20950 0.0 -
2.6529 21000 0.0 -
2.6592 21050 0.0 -
2.6655 21100 0.0 -
2.6718 21150 0.0 -
2.6781 21200 0.0 -
2.6844 21250 0.0 -
2.6908 21300 0.0 -
2.6971 21350 0.0 -
2.7034 21400 0.0 -
2.7097 21450 0.0 -
2.7160 21500 0.0 -
2.7223 21550 0.0 -
2.7287 21600 0.0 -
2.7350 21650 0.0 -
2.7413 21700 0.0 -
2.7476 21750 0.0 -
2.7539 21800 0.0 -
2.7602 21850 0.0 -
2.7665 21900 0.0 -
2.7729 21950 0.0 -
2.7792 22000 0.0 -
2.7855 22050 0.0 -
2.7918 22100 0.0 -
2.7981 22150 0.0 -
2.8044 22200 0.0 -
2.8108 22250 0.0 -
2.8171 22300 0.0 -
2.8234 22350 0.0 -
2.8297 22400 0.0 -
2.8360 22450 0.0 -
2.8423 22500 0.0 -
2.8487 22550 0.0 -
2.8550 22600 0.0 -
2.8613 22650 0.0 -
2.8676 22700 0.0 -
2.8739 22750 0.0 -
2.8802 22800 0.0 -
2.8866 22850 0.0 -
2.8929 22900 0.0 -
2.8992 22950 0.0 -
2.9055 23000 0.0 -
2.9118 23050 0.0 -
2.9181 23100 0.0 -
2.9245 23150 0.0 -
2.9308 23200 0.0 -
2.9371 23250 0.0 -
2.9434 23300 0.0 -
2.9497 23350 0.0 -
2.9560 23400 0.0 -
2.9624 23450 0.0 -
2.9687 23500 0.0 -
2.9750 23550 0.0 -
2.9813 23600 0.0 -
2.9876 23650 0.0 -
2.9939 23700 0.0 -
3.0 23748 - 0.0128
3.0003 23750 0.0 -
3.0066 23800 0.0 -
3.0129 23850 0.0 -
3.0192 23900 0.0 -
3.0255 23950 0.0 -
3.0318 24000 0.0 -
3.0382 24050 0.0 -
3.0445 24100 0.0 -
3.0508 24150 0.0 -
3.0571 24200 0.0 -
3.0634 24250 0.0 -
3.0697 24300 0.0 -
3.0760 24350 0.0 -
3.0824 24400 0.0 -
3.0887 24450 0.0 -
3.0950 24500 0.0 -
3.1013 24550 0.0 -
3.1076 24600 0.0 -
3.1139 24650 0.0 -
3.1203 24700 0.0 -
3.1266 24750 0.0 -
3.1329 24800 0.0 -
3.1392 24850 0.0 -
3.1455 24900 0.0 -
3.1518 24950 0.0 -
3.1582 25000 0.0 -
3.1645 25050 0.0 -
3.1708 25100 0.0 -
3.1771 25150 0.0 -
3.1834 25200 0.0 -
3.1897 25250 0.0 -
3.1961 25300 0.0 -
3.2024 25350 0.0 -
3.2087 25400 0.0 -
3.2150 25450 0.0 -
3.2213 25500 0.0 -
3.2276 25550 0.0 -
3.2340 25600 0.0 -
3.2403 25650 0.0 -
3.2466 25700 0.0 -
3.2529 25750 0.0 -
3.2592 25800 0.0 -
3.2655 25850 0.0 -
3.2719 25900 0.0 -
3.2782 25950 0.0 -
3.2845 26000 0.0 -
3.2908 26050 0.0 -
3.2971 26100 0.0 -
3.3034 26150 0.0 -
3.3098 26200 0.0 -
3.3161 26250 0.0 -
3.3224 26300 0.0 -
3.3287 26350 0.0 -
3.3350 26400 0.0 -
3.3413 26450 0.0 -
3.3477 26500 0.0 -
3.3540 26550 0.0 -
3.3603 26600 0.0 -
3.3666 26650 0.0 -
3.3729 26700 0.0 -
3.3792 26750 0.0 -
3.3855 26800 0.0 -
3.3919 26850 0.0 -
3.3982 26900 0.0 -
3.4045 26950 0.0 -
3.4108 27000 0.0 -
3.4171 27050 0.0 -
3.4234 27100 0.0 -
3.4298 27150 0.0 -
3.4361 27200 0.0 -
3.4424 27250 0.0 -
3.4487 27300 0.0 -
3.4550 27350 0.0 -
3.4613 27400 0.0 -
3.4677 27450 0.0 -
3.4740 27500 0.0 -
3.4803 27550 0.0 -
3.4866 27600 0.0 -
3.4929 27650 0.0 -
3.4992 27700 0.0 -
3.5056 27750 0.0 -
3.5119 27800 0.0 -
3.5182 27850 0.0 -
3.5245 27900 0.0 -
3.5308 27950 0.0 -
3.5371 28000 0.0 -
3.5435 28050 0.0 -
3.5498 28100 0.0 -
3.5561 28150 0.0 -
3.5624 28200 0.0 -
3.5687 28250 0.0 -
3.5750 28300 0.0 -
3.5814 28350 0.0 -
3.5877 28400 0.0 -
3.5940 28450 0.0 -
3.6003 28500 0.0 -
3.6066 28550 0.0 -
3.6129 28600 0.0 -
3.6193 28650 0.0 -
3.6256 28700 0.0 -
3.6319 28750 0.0 -
3.6382 28800 0.0 -
3.6445 28850 0.0 -
3.6508 28900 0.0 -
3.6572 28950 0.0 -
3.6635 29000 0.0 -
3.6698 29050 0.0 -
3.6761 29100 0.0 -
3.6824 29150 0.0 -
3.6887 29200 0.0 -
3.6950 29250 0.0 -
3.7014 29300 0.0 -
3.7077 29350 0.0 -
3.7140 29400 0.0 -
3.7203 29450 0.0 -
3.7266 29500 0.0 -
3.7329 29550 0.0 -
3.7393 29600 0.0 -
3.7456 29650 0.0 -
3.7519 29700 0.0 -
3.7582 29750 0.0 -
3.7645 29800 0.0 -
3.7708 29850 0.0 -
3.7772 29900 0.0 -
3.7835 29950 0.0 -
3.7898 30000 0.0 -
3.7961 30050 0.0 -
3.8024 30100 0.0 -
3.8087 30150 0.0 -
3.8151 30200 0.0 -
3.8214 30250 0.0 -
3.8277 30300 0.0 -
3.8340 30350 0.0 -
3.8403 30400 0.0 -
3.8466 30450 0.0 -
3.8530 30500 0.0 -
3.8593 30550 0.0 -
3.8656 30600 0.0 -
3.8719 30650 0.0 -
3.8782 30700 0.0 -
3.8845 30750 0.0 -
3.8909 30800 0.0 -
3.8972 30850 0.0 -
3.9035 30900 0.0 -
3.9098 30950 0.0 -
3.9161 31000 0.0 -
3.9224 31050 0.0 -
3.9288 31100 0.0 -
3.9351 31150 0.0 -
3.9414 31200 0.0 -
3.9477 31250 0.0 -
3.9540 31300 0.0 -
3.9603 31350 0.0 -
3.9666 31400 0.0 -
3.9730 31450 0.0 -
3.9793 31500 0.0 -
3.9856 31550 0.0 -
3.9919 31600 0.0 -
3.9982 31650 0.0 -
4.0 31664 - 0.0117
  • The bold row denotes the saved checkpoint.

Framework Versions

  • Python: 3.9.18
  • SetFit: 1.0.3
  • Sentence Transformers: 2.2.1
  • Transformers: 4.32.1
  • PyTorch: 1.10.0
  • Datasets: 2.20.0
  • Tokenizers: 0.13.3

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}