---
license: apache-2.0
datasets:
- garage-bAInd/Open-Platypus
language:
- en
---
![image/png](https://cdn-uploads.huggingface.co/production/uploads/6468ce47e134d050a58aa89c/cKySe1S5IW_KnbZpKmozQ.png)

<a href="https://www.buymeacoffee.com/PulsarAI" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 60px !important;width: 217px !important;" ></a>
# Nebula-7B

Original weights of Nebula-7B, fine-tuned from [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1).
## LoRA Weights

The original LoRA weights are available at [PulsarAI/Nebula-7B-Lora](https://huggingface.co/PulsarAI/Nebula-7B-Lora).
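A minimal usage sketch with the `transformers` library is shown below. The Alpaca-style prompt template is an assumption based on the Open-Platypus training data, not a format documented in this card; adjust it if the model expects a different template.

```python
# Hypothetical usage sketch for Nebula-7B (assumptions: the model loads with
# AutoModelForCausalLM, and an Alpaca-style instruction prompt is appropriate).

def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in an Alpaca-style template (assumed format)."""
    return (
        "### Instruction:\n\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )

def generate(instruction: str, model_id: str = "PulsarAI/Nebula-7B") -> str:
    """Download the model and generate a response for one instruction."""
    # Imported here so build_prompt() stays usable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Calling `generate("What is the capital of France?")` will download roughly 14 GB of weights on first use, so a GPU with sufficient memory (or quantized loading) is advisable.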
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)

Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Nebula-7B).
| Metric                | Value |
|-----------------------|-------|
| Avg.                  | 53.93 |
| ARC (25-shot)         | 59.3  |
| HellaSwag (10-shot)   | 83.46 |
| MMLU (5-shot)         | 57.0  |
| TruthfulQA (0-shot)   | 45.56 |
| Winogrande (5-shot)   | 76.4  |
| GSM8K (5-shot)        | 14.86 |
| DROP (3-shot)         | 40.96 |