erx-llama-3-8b-high / train_results.json
Uploaded by bdsaglam, commit a19b3ea (verified): "Upload folder using huggingface_hub"
{
"epoch": 1.0,
"total_flos": 2.052344509146071e+17,
"train_loss": 0.08958469805575578,
"train_runtime": 1690.0157,
"train_samples_per_second": 5.249,
"train_steps_per_second": 0.656
}
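These fields are the standard summary metrics that Hugging Face's `Trainer` writes to `train_results.json` at the end of training. A minimal sketch of loading the file and deriving rough totals from it; note the reported rates are rounded, so the derived sample and step counts are approximate:

```python
import json

# The metrics exactly as reported above in train_results.json.
metrics = json.loads("""{
  "epoch": 1.0,
  "total_flos": 2.052344509146071e+17,
  "train_loss": 0.08958469805575578,
  "train_runtime": 1690.0157,
  "train_samples_per_second": 5.249,
  "train_steps_per_second": 0.656
}""")

# Derived quantities: runtime (seconds) times throughput gives
# approximate totals for the one epoch recorded here.
total_samples = metrics["train_runtime"] * metrics["train_samples_per_second"]
total_steps = metrics["train_runtime"] * metrics["train_steps_per_second"]

# Samples per step, i.e. the effective (global) batch size.
effective_batch = total_samples / total_steps
```

With the numbers above this works out to roughly 8,871 samples over about 1,109 optimizer steps, implying an effective batch size of about 8.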