Model Card for seojin0128/hw-midm-7B-nsmc
Model Description
Fine-tuning of midm-bitext-S-7B-inst-v1
This model is KT-AI/midm-bitext-S-7B-inst-v1 fine-tuned on NSMC, the Naver movie review dataset.
Given a movie review included in the prompt, the model directly generates the prediction text '긍정' (positive) or '부정' (negative).
The resulting fine-tuned model achieves 90.0% accuracy.
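The prompt-in, label-out flow described above can be sketched as follows. The prompt template and helper names here are illustrative assumptions, not the exact format used during fine-tuning; the actual generation step would call the fine-tuned model via `transformers`.

```python
# Sketch of the inference flow: build a prompt from a review and map the
# generated text to a label. The template and helper names are assumptions.

def build_prompt(review: str) -> str:
    # Ask the model to answer with '긍정' (positive) or '부정' (negative).
    return (
        "다음 영화 리뷰의 감정을 '긍정' 또는 '부정'으로 분류하세요.\n"
        f"리뷰: {review}\n"
        "감정:"
    )

def parse_label(generated: str) -> str:
    # The model generates the label text directly; read it off the output.
    text = generated.strip()
    if text.startswith("긍정"):
        return "긍정"
    if text.startswith("부정"):
        return "부정"
    return "unknown"

# In practice build_prompt()'s output would be fed to the model's
# generate(); here only the pre/post-processing logic is shown.
print(parse_label("긍정입니다."))  # -> 긍정
```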
Train and test datasets
Training used 2,000 samples drawn from the NSMC train split.
Evaluation used 1,000 samples drawn from the NSMC test split.
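One way to draw the 2,000-train / 1,000-test subsets reproducibly is a seeded sample; this is a sketch, and the NSMC dataset id in the comment is an assumption, not confirmed by the card.

```python
import random

def take_subset(rows, n, seed=42):
    # Deterministically sample n rows so the fine-tuning run is reproducible.
    rng = random.Random(seed)
    return rng.sample(rows, n)

# With the Hugging Face datasets library this could look like
# (dataset id "e9t/nsmc" is an assumption):
#   from datasets import load_dataset
#   nsmc = load_dataset("e9t/nsmc")
#   train = take_subset(list(nsmc["train"]), 2000)
#   test  = take_subset(list(nsmc["test"]), 1000)

# Demo on dummy rows shaped like NSMC records:
dummy = [{"document": f"review {i}", "label": i % 2} for i in range(5000)]
train = take_subset(dummy, 2000)
test = take_subset(dummy, 1000)
print(len(train), len(test))  # -> 2000 1000
```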
Figures: training step loss, confusion matrix, and accuracy / classification report.
Training procedure
The following bitsandbytes quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: bfloat16
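The listed values map onto the `transformers` quantization config roughly as follows; a minimal sketch, assuming `torch` and `bitsandbytes` are installed.

```python
# Sketch: the quantization settings above expressed as a BitsAndBytesConfig.
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # load_in_4bit: True
    bnb_4bit_quant_type="nf4",              # bnb_4bit_quant_type: nf4
    bnb_4bit_use_double_quant=False,        # bnb_4bit_use_double_quant: False
    bnb_4bit_compute_dtype=torch.bfloat16,  # bnb_4bit_compute_dtype: bfloat16
)
# Passed as quantization_config= to AutoModelForCausalLM.from_pretrained().
```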
Framework versions
- PEFT 0.7.0
Model tree for seojin0128/hw-midm-7B-nsmc
Base model: KT-AI/midm-bitext-S-7B-inst-v1