
SetFit with mini1013/master_domain

This is a SetFit model that can be used for Text Classification. It uses mini1013/master_domain as the Sentence Transformer embedding model and a LogisticRegression instance as the classification head.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer (a minimal code sketch of this two-stage setup is shown below).
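
The two steps above correspond to the two components packaged in this model: a Sentence Transformer body that produces embeddings and a LogisticRegression head that classifies them. The following is a minimal sketch of that two-stage setup, assuming the model_body / model_head attribute names used by recent SetFit releases; calling the model directly runs both stages at once, as shown under Direct Use for Inference below.

from setfit import SetFitModel

# Load the fine-tuned Sentence Transformer body and the LogisticRegression head.
model = SetFitModel.from_pretrained("mini1013/master_cate_el1")

texts = ["3RSYS R200 RGB (블랙) 미들타워 컴온씨앤씨(주)"]  # example title taken from this card
embeddings = model.model_body.encode(texts)    # step 1: contrastively fine-tuned embeddings
preds = model.model_head.predict(embeddings)   # step 2: classification head on those features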

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: mini1013/master_domain
  • Classification head: a LogisticRegression instance
  • Number of Classes: 10
  • Base model: klue/roberta-base
  • Model size: approximately 111M parameters (F32)

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: Efficient Few-Shot Learning Without Prompts (https://arxiv.org/abs/2209.11055)

Model Labels

Label 9
  • 'APC BK500EI UPS배터리 무정전전원장치 300W 500VA 다피(dappy)'
  • '리안리 SP750 80PLUS GOLD (WHITE) 주식회사 브라보세컨즈'
  • 'APC Smart UPS C 2000VA Tower 무정전전원장치 - smc2000ic 주식회사 파인인프라'
Label 2
  • '3RSYS R200 RGB (블랙) 미들타워 컴온씨앤씨(주)'
  • 'DAVEN AQUA (블랙) 주식회사 꿈누리'
  • 'w 대원TMT DW-H1200 허브랙 (H1200×D800×W600/25U/회색) (착불배송) (주)원영씨앤씨'
Label 0
  • '인텔 코어i7-13세대 13700K 랩터레이크 정품 에어캡배송 (주)신우밀루유떼'
  • 'AMD 라이젠5-4세대 5600X (버미어)벌크포장 AS 3년 태성에프앤비(주)'
  • '[INTEL] 코어10세대 i7-10700 벌크 병행 쿨러미포함 (코멧레이크) (주)컴퓨존'
Label 4
  • 'SAPPHIRE 라데온 RX 7900 GRE PURE D6 16GB 주식회사 꿈누리'
  • 'ASRock 라데온 RX 7900 XTX Phantom Gaming OC D6 24GB 대원씨티에스 주식회사 에스씨엠인포텍'
  • '[HY] INNO3D 지포스 GT1030 D5 2GB LP 무소음 (주)제이케이존'
Label 8
  • '잘만 ZM-STC10 (2g) 주식회사 피씨사자'
  • '3RSYS APB BAR 35 (주)컴퓨존'
  • 'LP30 ARGB PSU 커버 화이트 주식회사보성닷컴'
Label 6
  • 'NEXTU NEXT-206NEC EX 에스앤와이'
  • 'LANstar PCI-E 내부 SATA3 4포트 카드/LS-PCIE-4SATA/PC 내부에 SATA3 4포트 생성/발열 방지용 방열판/LP 브라켓 포함 디피시스템'
  • 'NEXTU NEXT-405NEC LP 에스앤와이'
Label 3
  • 'V-Color BLACK DDR5-5200 CL42 STANDARD 벌크 (8GB) (주)가이드컴'
  • 'TEAMGROUP T-Force DDR5 6000 CL38 Delta RGB 화이트 패키지 32GB(16Gx2) (주)서린씨앤아이'
  • 'ADATA DDR5-5600 CL46 (16GB)/정품판매점/하이닉스A다이/언락/평생 제한 보증/R 주식회사 에이알씨앤아이'
Label 5
  • 'ASRock H510M-HDV/M.2 SE 에즈윈 주식회사디케이'
  • 'DK ASRock B760M PG Riptide D5 에즈윈 주식회사디케이'
  • '[ ] GIGABYTE B650 AORUS ELITE AX ICE 제이씨현 뉴비시스템즈'
Label 7
  • '아틱 P14 PWM PST 블랙 VALUE 5팩 (주)서린씨앤아이'
  • '앱코 타이폰 120X5 CPU 쿨러 알루미늄 방열판 주식회사 지디스엠알오'
  • 'Thermalright Peerless Assassin 120 SE 서린 태성에프앤비(주)'
Label 1
  • '엠비에프 CAT.7 SFTP 금도금 UTP 3중 쉴드 패치코드 기가비트 랜케이블 0.5M (MBF-U705G) 주식회사 아크런 (Akrun Co., Ltd.)'
  • 'MBF-C5E305R 305M 레드 BOX CAT.5E UTP 랜케이블 컴샷정보'
  • '엠비에프 CAT.5e UTP 제작형 랜케이블 박스 MBF-C5E305Y 옐로우 305m (주)아토닉스'

Evaluation

Metrics

Across all labels, the reported metric value is 0.9098.
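
The score above is reported as a single aggregate value across all labels. As a rough, untested sketch of how such a score can be recomputed, assuming the metric is accuracy and that a labeled held-out set with the same integer labels is available (the two examples below are copied from the Model Labels table purely as placeholders):

from setfit import SetFitModel

model = SetFitModel.from_pretrained("mini1013/master_cate_el1")

# Placeholder evaluation data; replace with a real held-out split.
eval_texts = [
    "3RSYS R200 RGB (블랙) 미들타워 컴온씨앤씨(주)",
    "아틱 P14 PWM PST 블랙 VALUE 5팩 (주)서린씨앤아이",
]
eval_labels = [2, 7]  # integer labels as listed under Model Labels

preds = model.predict(eval_texts)
accuracy = sum(int(p) == y for p, y in zip(preds, eval_labels)) / len(eval_labels)
print(accuracy)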

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("mini1013/master_cate_el1")
# Run inference
preds = model("앱코 NCORE G30 트루포스 (블랙) 미들타워 컴퓨터 케이스  오케이 바이오")
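
preds holds the predicted label for the input title, one of the integer labels 0 through 9 listed under Model Labels above. Passing a list of strings instead of a single string returns one prediction per input.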

Training Details

Training Set Metrics

Word count per training example: min 4, median 9.206, max 18
Training samples per label: 50 for each of the 10 labels (0 through 9), for 500 training examples in total.

Training Hyperparameters

  • batch_size: (512, 512)
  • num_epochs: (20, 20)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 40
  • body_learning_rate: (2e-05, 2e-05)
  • head_learning_rate: 2e-05
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
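
For reference, the hyperparameters above map onto SetFit's TrainingArguments. The snippet below is a minimal, untested sketch of such a run; the two-example train_dataset is purely illustrative (actual training used 50 examples per label, as noted above), and the remaining arguments are left at SetFit defaults, which appear to match the rest of the list.

from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Placeholder few-shot data with "text" / "label" columns.
train_dataset = Dataset.from_dict({
    "text": [
        "3RSYS R200 RGB (블랙) 미들타워 컴온씨앤씨(주)",
        "아틱 P14 PWM PST 블랙 VALUE 5팩 (주)서린씨앤아이",
    ],
    "label": [2, 7],
})

# Start from the embedding body named in this card; a LogisticRegression
# head is attached by default.
model = SetFitModel.from_pretrained("mini1013/master_domain")

args = TrainingArguments(
    batch_size=(512, 512),
    num_epochs=(20, 20),
    body_learning_rate=(2e-05, 2e-05),
    head_learning_rate=2e-05,
    sampling_strategy="oversampling",
    num_iterations=40,
    warmup_proportion=0.1,
    seed=42,
)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()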

Training Results

Epoch Step Training Loss Validation Loss
0.0127 1 0.4969 -
0.6329 50 0.2753 -
1.2658 100 0.0677 -
1.8987 150 0.014 -
2.5316 200 0.0023 -
3.1646 250 0.0001 -
3.7975 300 0.0001 -
4.4304 350 0.0001 -
5.0633 400 0.0001 -
5.6962 450 0.0 -
6.3291 500 0.0001 -
6.9620 550 0.0001 -
7.5949 600 0.0 -
8.2278 650 0.0 -
8.8608 700 0.0 -
9.4937 750 0.0 -
10.1266 800 0.0 -
10.7595 850 0.0 -
11.3924 900 0.0 -
12.0253 950 0.0 -
12.6582 1000 0.0 -
13.2911 1050 0.0 -
13.9241 1100 0.0 -
14.5570 1150 0.0 -
15.1899 1200 0.0 -
15.8228 1250 0.0 -
16.4557 1300 0.0 -
17.0886 1350 0.0 -
17.7215 1400 0.0 -
18.3544 1450 0.0 -
18.9873 1500 0.0 -
19.6203 1550 0.0 -

Framework Versions

  • Python: 3.10.12
  • SetFit: 1.1.0.dev0
  • Sentence Transformers: 3.1.1
  • Transformers: 4.46.1
  • PyTorch: 2.4.0+cu121
  • Datasets: 2.20.0
  • Tokenizers: 0.20.0
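
To approximate this environment, the release versions above can be pinned at install time. A sketch (PyTorch with the cu121 build and the listed SetFit development version may require separate installation steps):

pip install "sentence-transformers==3.1.1" "transformers==4.46.1" "datasets==2.20.0" "tokenizers==0.20.0" setfit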

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}