|
---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- mlabonne/Marcoro14-7B-slerp
- mlabonne/NeuralBeagle14-7B
---

# Maverick-7B
|
|
|
Maverick-7B is a merge of the following models, created with mergekit (a minimal usage sketch follows the list):

* [mlabonne/Marcoro14-7B-slerp](https://huggingface.co/mlabonne/Marcoro14-7B-slerp)
* [mlabonne/NeuralBeagle14-7B](https://huggingface.co/mlabonne/NeuralBeagle14-7B)
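
## 💻 Usage

The merged model loads like any other 🤗 Transformers causal LM. The snippet below is a minimal sketch rather than official usage code: the repository id is a placeholder (substitute the actual Hub path of this merge), and `device_map="auto"` assumes `accelerate` is installed.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "your-username/Maverick-7B"  # placeholder: replace with the real Hub path of this merge

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision keeps the 7B weights within a single ~16 GB GPU
    device_map="auto",
)

prompt = "What is a merged language model?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```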
|
|
|
|
|
## 🏆 Evaluation
|
|
|
### TruthfulQA

| Task |Version|Metric|Value | |Stderr|
|-------------|------:|------|-----:|---|-----:|
|truthfulqa_mc| 1|mc1 |0.5165|± |0.0175|
| | |mc2 |0.6661|± |0.0152|
|
|
|
### GPT4ALL

| Task |Version| Metric |Value | |Stderr|
|-------------|------:|--------|-----:|---|-----:|
|arc_challenge| 0|acc |0.6442|± |0.0140|
| | |acc_norm|0.6570|± |0.0139|
|arc_easy | 0|acc |0.8645|± |0.0070|
| | |acc_norm|0.8304|± |0.0077|
|boolq | 1|acc |0.8850|± |0.0056|
|hellaswag | 0|acc |0.6813|± |0.0047|
| | |acc_norm|0.8571|± |0.0035|
|openbookqa | 0|acc |0.3640|± |0.0215|
| | |acc_norm|0.4800|± |0.0224|
|piqa | 0|acc |0.8324|± |0.0087|
| | |acc_norm|0.8460|± |0.0084|
|winogrande | 0|acc |0.7869|± |0.0115|
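
The tables above follow the output format of EleutherAI's lm-evaluation-harness. As a rough reproduction sketch (an assumption, not the exact command used for this card: task names such as `truthfulqa_mc` match older harness releases, and the API surface varies between versions):

```python
from lm_eval import evaluator

# Sketch: run the zero-shot tasks reported above with the legacy lm-evaluation-harness API.
results = evaluator.simple_evaluate(
    model="hf-causal",
    model_args="pretrained=your-username/Maverick-7B",  # placeholder repository id
    tasks=[
        "truthfulqa_mc", "arc_challenge", "arc_easy", "boolq",
        "hellaswag", "openbookqa", "piqa", "winogrande",
    ],
    batch_size=8,
)
print(results["results"])
```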
|
|
|