9B_241128_sft_special-task_241202

This model is a fine-tuned version of saves/Yi-1.5-9B-sft-241128, trained on the following datasets: 10.TCM-SRT, 2.TCM-DS, 3.TCM-DID, 4.TCM-FT-Lite, 5.TCM-CHGD, 6.Med-Treat, 7.TCM-Clin, 8.TCMeEE, 9.TCM-LitData, A_problem, B_problem, C_problem, D_problem, SPD-5038-gpt4oc, zl_2, and zl_3. It achieves the following results on the evaluation set:

  • Loss: 0.3506
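A minimal sketch of loading this checkpoint for local inference with `transformers`. The repo id is assumed from the model name above (prepend the owner namespace if loading from the Hub), and BF16 matches the stored tensor type:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

def generate(prompt: str, model_id: str = "9B_241128_sft_special-task_241202") -> str:
    """Load the fine-tuned checkpoint and generate a completion.

    The repo id is assumed from the model name above; prepend the owner
    namespace (e.g. "user/9B_241128_...") when loading from the Hugging
    Face Hub, or pass a local checkpoint directory instead.
    """
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # weights are stored in BF16
        device_map="auto",           # place layers across available GPUs
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=256)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Calling `generate("...")` downloads roughly 17.7 GB of weights on first use, so a GPU with sufficient memory (or CPU offload) is required.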

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2.5e-06
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • distributed_type: multi-GPU
  • num_devices: 6
  • gradient_accumulation_steps: 5
  • total_train_batch_size: 120
  • total_eval_batch_size: 24
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.05
  • num_epochs: 2.0
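The two "total" batch sizes above are derived, not independent settings: the effective train batch is per-device batch × devices × gradient-accumulation steps, while eval skips accumulation. A quick sanity check of that arithmetic:

```python
# Reproduce the effective batch sizes from the hyperparameters above.
train_batch_size = 4             # per-device train batch
eval_batch_size = 4              # per-device eval batch
num_devices = 6                  # multi-GPU
gradient_accumulation_steps = 5

total_train_batch_size = train_batch_size * num_devices * gradient_accumulation_steps
total_eval_batch_size = eval_batch_size * num_devices  # no accumulation at eval time

print(total_train_batch_size)  # 120
print(total_eval_batch_size)   # 24
```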

Training results

| Training Loss | Epoch  | Step | Validation Loss |
|---------------|--------|------|-----------------|
| 0.2796        | 1.8335 | 1000 | 0.3508          |
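The single logged row implies the rough size of the combined training set, which the card does not state directly. This is only a back-of-the-envelope estimate from step 1000 landing at epoch 1.8335 with the effective batch size of 120:

```python
# Estimate steps per epoch and training-set size from the logged row above.
step, epoch = 1000, 1.8335
total_train_batch_size = 120

steps_per_epoch = step / epoch                            # optimizer steps per epoch
examples_per_epoch = steps_per_epoch * total_train_batch_size

print(round(steps_per_epoch))     # 545
print(round(examples_per_epoch))  # 65449 (rough estimate of training examples)
```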

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu121
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 8.83B params (Safetensors) · Tensor type: BF16