Quantization made by Richard Erkhov.
pythia-70m-deduped-cleansharegpt - bnb 4-bit
- Model creator: https://huggingface.co/HWERI/
- Original model: https://huggingface.co/HWERI/pythia-70m-deduped-cleansharegpt/
Original model description:
---
license: apache-2.0
datasets:
  - CaterinaLac/sharegpt-deduplicated
language:
  - en
  - zh
  - fr
  - es
---
Model Card
Pythia-70m-deduped fine-tuned on a cleaned version of the ShareGPT data.
Open LLM Leaderboard Evaluation Results
Detailed results can be found here
| Metric | Value |
|---|---|
| Avg. | 25.34 |
| ARC (25-shot) | 25.68 |
| HellaSwag (10-shot) | 25.4 |
| MMLU (5-shot) | 23.12 |
| TruthfulQA (0-shot) | 51.15 |
| Winogrande (5-shot) | 52.01 |
| GSM8K (5-shot) | 0.0 |
| DROP (3-shot) | 0.0 |
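The reported "Avg." appears to be the unweighted mean of the seven benchmark scores; a quick check in Python:

```python
# Leaderboard scores from the table above.
scores = {
    "ARC (25-shot)": 25.68,
    "HellaSwag (10-shot)": 25.4,
    "MMLU (5-shot)": 23.12,
    "TruthfulQA (0-shot)": 51.15,
    "Winogrande (5-shot)": 52.01,
    "GSM8K (5-shot)": 0.0,
    "DROP (3-shot)": 0.0,
}

# Unweighted mean, rounded to two decimals as on the leaderboard.
avg = round(sum(scores.values()) / len(scores), 2)
print(avg)  # 25.34
```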