# palmer
palmer-003 aims for state-of-the-art performance through a Merging of Experts plus fine-tuning approach: each expert is consolidated into a single model, which is then fine-tuned on useful textual data.
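The core idea of consolidating experts can be sketched as parameter averaging across models that share an architecture. This is a minimal illustrative sketch on toy state dicts, not the actual recipe used for palmer-003; the `merge_experts` helper is hypothetical.

```python
import torch

def merge_experts(state_dicts):
    """Uniformly average the parameters of models with identical architecture.

    Each input is a state dict mapping parameter names to tensors of the
    same shape; the result is one merged state dict.
    """
    merged = {}
    for key in state_dicts[0]:
        merged[key] = torch.stack(
            [sd[key].float() for sd in state_dicts]
        ).mean(dim=0)
    return merged

# Toy example: two "experts" with a single shared layer.
expert_a = {"linear.weight": torch.ones(2, 2)}
expert_b = {"linear.weight": torch.zeros(2, 2)}
merged = merge_experts([expert_a, expert_b])
print(merged["linear.weight"])  # every entry is 0.5
```

In practice the merged model would then be fine-tuned on the target text data, as described above.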

### Evaluation

```
Model            | ARC-C  | OBQA   | HellaSwag | PIQA   | Winogrande | Average
tinyllama        | 0.3029 | 0.3600 | 0.5935    | 0.7329 | 0.5959     | 0.5170
palmer-002-2401  | 0.3311 | 0.3600 | 0.5981    | 0.7416 | 0.6006     | 0.5266
babbage-002      | 0.3285 | 0.3620 | 0.6380    | 0.7606 | 0.6085     | 0.5395
palmer-003       | 0.3370 | 0.3740 | 0.6128    | 0.7486 | 0.6535     | 0.5451
```