Check G-reen/EXPERIMENT-DPO-m7b2-2-merged (https://huggingface.co/G-reen/EXPERIMENT-DPO-m7b2-2-merged) for details.