Mini_DPO_test02-Mistral-7B-Instruct-v0.2-slerp / model.safetensors.index.json

Commit History

Upload folder using huggingface_hub
9cf7168
verified

MaziyarPanahi committed on