
emailgen-pythia-410m-deduped


This model is a fine-tuned version of EleutherAI/pythia-410m-deduped on email data. It achieves the following results on the evaluation set:

  • Loss: 2.1018
  • Accuracy: 0.6157
  • Perplexity: 8.181
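
The reported perplexity is consistent with the evaluation loss, since perplexity = exp(loss); a quick check in Python:

import math

eval_loss = 2.1018  # evaluation loss reported above
print(round(math.exp(eval_loss), 3))  # ≈ 8.181, matching the reported perplexity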

Model description

  • fine-tuned on a dataset of emails for 4 epochs
  • intended use: "text completion" of partially written emails

Usage example

from transformers import pipeline

model_tag = "postbot/emailgen-pythia-410m-deduped"
generator = pipeline(
    "text-generation",
    model=model_tag,
)

prompt = """
Hello, 

Following up on the bubblegum shipment."""

result = generator(
    prompt,
)  # generate a completion of the prompt
print(result[0]["generated_text"])
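
For finer control over decoding (output length, sampling), the model can also be loaded with the Auto classes; a minimal sketch, where the generation settings (max_new_tokens, temperature) are illustrative choices rather than values from this card:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_tag = "postbot/emailgen-pythia-410m-deduped"
tokenizer = AutoTokenizer.from_pretrained(model_tag)
model = AutoModelForCausalLM.from_pretrained(model_tag)

prompt = "Hello, \n\nFollowing up on the bubblegum shipment."
inputs = tokenizer(prompt, return_tensors="pt")

# illustrative decoding settings; adjust as needed
with torch.no_grad():
    outputs = model.generate(
        **inputs,
        max_new_tokens=64,
        do_sample=True,
        temperature=0.7,
        pad_token_id=tokenizer.eos_token_id,
    )
print(tokenizer.decode(outputs[0], skip_special_tokens=True))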

Open LLM Leaderboard Evaluation Results

Detailed results can be found here

| Metric              | Value |
|---------------------|------:|
| Avg.                | 26.65 |
| ARC (25-shot)       | 27.9  |
| HellaSwag (10-shot) | 40.04 |
| MMLU (5-shot)       | 27.35 |
| TruthfulQA (0-shot) | 38.2  |
| Winogrande (5-shot) | 52.09 |
| GSM8K (5-shot)      | 0.0   |
| DROP (3-shot)       | 0.99  |