Model Card for Cerebras 1.3B Dollyfied

This is a finetuned version of the Cerebras 1.3B model, created using the Databricks Labs Dolly framework.

Model Details

Model Description

This is a finetuned version of Cerebras' 1.3-billion-parameter model that has been trained to follow instructions.

It was finetuned using Databricks' Dolly training tools and was trained for 2 epochs.

Uses

This is a simple GPT-style chatbot that has been finetuned to follow instructions. Its factual knowledge of the world should be considered suspect at best.
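Since the model was instruction-tuned with the Dolly v1 training code, prompts likely need to follow the Alpaca-style template that code uses. The exact wording below is an assumption based on the Dolly v1 repository, not confirmed for this specific checkpoint, so verify it against the training config before relying on it:

```python
# Alpaca-style prompt template used by the Dolly v1 training code.
# Assumption: this checkpoint was trained with the same template.
PROMPT_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

def build_prompt(instruction: str) -> str:
    """Wrap a raw instruction in the instruction-following template."""
    return PROMPT_TEMPLATE.format(instruction=instruction)

print(build_prompt("Name three primary colors."))
```

Feed the resulting string to the model as-is; generation should then continue after the `### Response:` marker.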

Direct Use

If you put it to a use, please let me know.

[More Information Needed]

Downstream Use [optional]

[More Information Needed]

Out-of-Scope Use

Any form of use where any form of accuracy is needed. Do not take medical or financial advice from this model.

[More Information Needed]

Bias, Risks, and Limitations

Limitations... yes, I am sure there are many.

Environmental Impact

Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).

  • Hardware Type: 8x A100 (this run finished while I was still downloading the model I was actually training.)
  • Minutes used: 17
  • Cloud Provider: Lambda GPU Cloud
  • Compute Region: USA
  • Carbon Emitted: [More Information Needed]

Open LLM Leaderboard Evaluation Results

Detailed results can be found here

  • Avg.: 27.1
  • ARC (25-shot): 27.73
  • HellaSwag (10-shot): 37.91
  • MMLU (5-shot): 26.66
  • TruthfulQA (0-shot): 40.14
  • Winogrande (5-shot): 52.72
  • GSM8K (5-shot): 0.0
  • DROP (3-shot): 4.54
  • Model size: 1.42B params (Safetensors)
  • Tensor types: BF16, U8
