LoRA fine-tuning on Wikipedia-10, applying counterfactual data augmentation (CDA)

  • Dataset: Wikipedia-10
  • Target modules: ["q_proj", "k_proj", "v_proj", "o_proj", "gate_proj", "up_proj", "down_proj"]
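
The target-module list above attaches adapters to every attention and MLP projection in each transformer block. Below is a minimal sketch of the equivalent setup with the Hugging Face `peft` library; the base checkpoint and the `r`/`lora_alpha`/`lora_dropout` values are illustrative assumptions, not values reported for this run.

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# Hypothetical base checkpoint; a 7B-class model matches the 6.74B param count below.
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

lora_config = LoraConfig(
    task_type="CAUSAL_LM",
    r=16,               # rank (assumed)
    lora_alpha=32,      # scaling factor (assumed)
    lora_dropout=0.05,  # (assumed)
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()
```

Targeting all seven projections is the usual "full" LoRA configuration, as opposed to adapting the attention projections only.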
Training metrics:

```json
{
    "epoch": 0.05217151712326255,
    "total_flos": 2.605973017460736e+18,
    "train_loss": 0.03771647383272648,
    "train_runtime": 18281.2091,
    "train_samples": 1226723,
    "train_samples_per_second": 3.501,
    "train_steps_per_second": 0.109
}
```

The fractional epoch (≈0.052) corresponds to roughly 64k of the 1,226,723 training samples, processed in about 5.1 hours.

Training script: https://github.com/ao9000/bias-bench/blob/main/experiments/run_clm.py
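
The card names CDA, but the augmentation step itself lives in the training script. For reference, here is a minimal sketch of the common two-sided gender-swap variant of CDA; the word-pair list is a tiny illustrative subset, and real implementations resolve the her/him/his ambiguity with part-of-speech information.

```python
import re

# Tiny illustrative subset; real CDA word lists are much larger.
SWAP_PAIRS = {
    "he": "she", "she": "he",
    "him": "her", "her": "him",  # crude: "her" can also correspond to "his"
    "his": "her",
    "man": "woman", "woman": "man",
}
_PATTERN = re.compile(r"\b(" + "|".join(SWAP_PAIRS) + r")\b", re.IGNORECASE)

def cda_swap(text: str) -> str:
    """Return a counterfactual copy of text with gendered terms exchanged."""
    def swap(match):
        word = match.group(0)
        repl = SWAP_PAIRS[word.lower()]
        # Preserve the capitalization of the original token.
        return repl.capitalize() if word[0].isupper() else repl
    return _PATTERN.sub(swap, text)

def augment(examples):
    # Two-sided CDA: keep each original example and add its counterfactual.
    return examples + [cda_swap(e) for e in examples]

print(cda_swap("He told his manager that she was late."))
# -> "She told her manager that he was late."
```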

Model size: 6.74B params · Tensor type: BF16 (Safetensors)
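
To run inference, the adapter can be loaded on top of its base model with `peft`. A hypothetical loading sketch follows; `<user>/<this-repo>` stands in for the actual Hub id, and the base checkpoint is an assumption based on the 6.74B param count.

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",  # assumed 7B-class base
    torch_dtype=torch.bfloat16,  # matches the BF16 tensor type above
)
model = PeftModel.from_pretrained(base, "<user>/<this-repo>")
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
```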