---
base_model:
- meta-llama/Llama-3.2-1B
---
## Model Description
This is the meta-llama/Llama-3.2-1B base model fine-tuned on the mlabonne/orpo-dpo-mix-40k dataset.
## Evaluation Results
We used lm-evaluation-harness from EleutherAI to evaluate this fine-tuned version of meta-llama/Llama-3.2-1B on the HellaSwag benchmark.
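A run like the one above can be reproduced with the lm-evaluation-harness CLI. This is a sketch: `<your-finetuned-model>` is a placeholder for this checkpoint's Hub id or local path (not stated in this card), and the batch size is an arbitrary choice.

```shell
# Install EleutherAI's evaluation harness.
pip install lm-eval

# Zero-shot HellaSwag evaluation of a Hugging Face model.
# <your-finetuned-model> is a placeholder, not a real repo id.
lm_eval --model hf \
  --model_args pretrained=<your-finetuned-model> \
  --tasks hellaswag \
  --num_fewshot 0 \
  --batch_size 8
```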
### Hellaswag
| Tasks |Version|Filter|n-shot| Metric | |Value | |Stderr|
|---------|------:|------|-----:|--------|---|-----:|---|-----:|
|hellaswag| 1|none | 0|acc |↑ |0.4773|± |0.0050|
|         |       |none  |     0|acc_norm|↑  |0.6358|±  |0.0048|
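The reported Stderr column is consistent with the standard error of a proportion, sqrt(p(1-p)/n). A quick sanity check, assuming the HellaSwag validation split used by the harness has 10,042 examples (an assumption, not taken from the table):

```python
import math

# Reported zero-shot accuracy from the table above.
acc = 0.4773

# Assumed size of the HellaSwag validation split.
n = 10042

# Standard error of a proportion: sqrt(p * (1 - p) / n).
stderr = math.sqrt(acc * (1 - acc) / n)
print(round(stderr, 4))  # close to the reported 0.0050
```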