---
library_name: transformers
base_model:
- axolotl-ai-co/romulus-mistral-nemo-12b-simpo
datasets:
- jondurbin/gutenberg-dpo-v0.1
- nbeerbower/gutenberg2-dpo
license: apache-2.0
---
# Mistral-Nemo-Gutenberg-Doppel-12B
axolotl-ai-co/romulus-mistral-nemo-12b-simpo finetuned on jondurbin/gutenberg-dpo-v0.1 and nbeerbower/gutenberg2-dpo.
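A minimal inference sketch with transformers. The repo id and generation settings below are assumptions for illustration, not values published on this card; the model inherits its chat template from the Mistral Nemo base.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id for this card; replace with the actual model id if it differs.
model_id = "nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Write the opening paragraph of a gothic short story."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Sampling settings are illustrative defaults, not tuned recommendations.
outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```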
## Method
ORPO-tuned for 3 epochs on 2x A100 GPUs.
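The exact training script isn't published here. The sketch below shows an equivalent setup with TRL's `ORPOTrainer`, assuming the standard `prompt`/`chosen`/`rejected` preference columns in both datasets; the beta, learning rate, and batch size are placeholder assumptions, not hyperparameters reported on this card.

```python
import torch
from datasets import concatenate_datasets, load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import ORPOConfig, ORPOTrainer

base_model = "axolotl-ai-co/romulus-mistral-nemo-12b-simpo"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model, torch_dtype=torch.bfloat16)

# Combine the two Gutenberg preference datasets, keeping only the shared
# prompt/chosen/rejected columns so they can be concatenated.
columns = ["prompt", "chosen", "rejected"]
train_dataset = concatenate_datasets([
    load_dataset("jondurbin/gutenberg-dpo-v0.1", split="train").select_columns(columns),
    load_dataset("nbeerbower/gutenberg2-dpo", split="train").select_columns(columns),
])

config = ORPOConfig(
    output_dir="mistral-nemo-gutenberg-doppel-12b",
    num_train_epochs=3,                 # matches the 3 epochs stated above
    beta=0.1,                           # assumed ORPO weighting, not reported here
    per_device_train_batch_size=1,      # assumed; scale with available memory
    gradient_accumulation_steps=8,      # assumed
    learning_rate=5e-6,                 # assumed
    bf16=True,
)

trainer = ORPOTrainer(
    model=model,
    args=config,
    train_dataset=train_dataset,
    processing_class=tokenizer,  # older TRL versions take tokenizer= instead
)
trainer.train()
```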