DPO dataset (#11) opened 2 months ago by Andriy
Adding Evaluation Results (#10) opened 3 months ago by leaderboard-pr-bot, 1 reply
Version for Qwen1.5-72B (#9) opened 4 months ago by TNTOutburst
SlimOrca for DPO? (#8) opened 4 months ago by yleo
How to deploy the model and try it myself? (#4) opened 5 months ago by LeonBlue, 1 reply
Which chat template should I use? (#3) opened 5 months ago by wyxwangmed, 1 reply
New Leader! (#2) opened 5 months ago by DKRacingFan, 6 replies
GPTQ model is available (#1) opened 5 months ago by MaziyarPanahi, 6 replies