---
datasets:
- Sao10K/Claude-3-Opus-Instruct-15K
- abacusai/SystemChat-1.1
- Ba2han/DollyLlama-5k
language:
- en
---
hf-causal-experimental (pretrained='E:/text-generation-webui/models/phi-984-dora'), limit: None, provide_description: False, num_fewshot: 5, batch_size: None
| Task |Version| Metric |Value | |Stderr|
|---------------------------------|------:|--------|-----:|---|-----:|
|arc_challenge | 0|acc |0.5870|± |0.0144|
| | |acc_norm|0.6067|± |0.0143|
|hendrycksTest-abstract_algebra | 1|acc |0.3500|± |0.0479|
| | |acc_norm|0.3500|± |0.0479|
|hendrycksTest-college_biology | 1|acc |0.8264|± |0.0317|
| | |acc_norm|0.8264|± |0.0317|
|hendrycksTest-college_chemistry | 1|acc |**0.4900**|± |0.0502|
| | |acc_norm|0.4900|± |0.0502|
|hendrycksTest-college_mathematics| 1|acc |**0.3900**|± |0.0490|
| | |acc_norm|0.3900|± |0.0490|
|hendrycksTest-college_physics | 1|acc |**0.4020**|± |0.0488|
| | |acc_norm|0.4020|± |0.0488|
|winogrande | 0|acc |**0.7309**|± |0.0125|
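For reference, the run above can be approximated with the 0.3.x `lm-evaluation-harness` Python API. This is only a hedged sketch: the task list is abbreviated, the local model path is copied from the header above as a placeholder, and the argument names reflect that older release rather than the current CLI.

```python
from lm_eval import evaluator

# Placeholder path taken from the evaluation header; substitute your own checkpoint
# directory or Hub repo id.
results = evaluator.simple_evaluate(
    model="hf-causal-experimental",
    model_args="pretrained=E:/text-generation-webui/models/phi-984-dora",
    tasks=["arc_challenge", "winogrande", "hendrycksTest-college_biology"],
    num_fewshot=5,  # matches the 5-shot setting reported above
)
print(results["results"])
```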
**We have Llama-3 at home!**
The model was trained on filtered versions of the datasets tagged above, plus a few thousand additional examples generated with llama-3-70B.
Use the **Zephyr template** with any system message. The default system message should be:
You are a smart, friendly and helpful assistant.
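A minimal sketch of that prompt layout with `transformers`, assuming the standard Zephyr turn markers; the model path is a placeholder reused from the evaluation header, and the user prompt is purely illustrative.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder path; substitute the actual checkpoint directory or Hub repo id.
model_id = "E:/text-generation-webui/models/phi-984-dora"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

system = "You are a smart, friendly and helpful assistant."
user = "Explain the difference between acc and acc_norm in one sentence."

# Zephyr-style layout: each turn opens with a <|role|> header and non-assistant
# turns are closed with the EOS token; generation continues after <|assistant|>.
prompt = f"<|system|>\n{system}</s>\n<|user|>\n{user}</s>\n<|assistant|>\n"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```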