NOTE: there is currently a bug with the Hugging Face API for OPT models. Please use the Colab notebook to test :)
Why write the rest of your email when you can generate it?
```python
from transformers import pipeline

model_tag = "pszemraj/opt-125m-email-generation"
generator = pipeline(
    "text-generation",
    model=model_tag,
    use_fast=False,
    do_sample=False,
)

prompt = """
Hello,

Following up on the bubblegum shipment."""

generator(
    prompt,
    max_length=96,
)  # generate
```
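The pipeline call above returns a list of dicts containing the prompt plus the generated continuation. A minimal post-processing sketch for pulling out just the model's continuation (the helper name here is illustrative, not part of the card):

```python
# The text-generation pipeline returns [{"generated_text": "<prompt + continuation>"}],
# so slicing off the prompt leaves only the generated email text.
def continuation(outputs, prompt):
    full = outputs[0]["generated_text"]
    return full[len(prompt):].strip()

# Illustrative output shape (not an actual model response):
outputs = [{"generated_text": "Hello,\n\nFollowing up on the bubblegum shipment. It has arrived."}]
prompt = "Hello,\n\nFollowing up on the bubblegum shipment."
print(continuation(outputs, prompt))
```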
- Colab notebook for testing/use
This model is a fine-tuned version of facebook/opt-125m on the `email_body` field of train + validation (combined to get more data) from the aeslc dataset.

- Emails, phone numbers, etc., were excluded (on a best-effort basis) in a dataset-preparation step using `clean-text` in Python.
- Note that the hosted API is restricted to generating 64 tokens; you can generate longer emails by using the model in a `text-generation` pipeline directly.
- OPT models cannot be used commercially.
- Here is a GitHub gist for a script that generates emails in the console or writes them to a text file.

It achieves the following results on the evaluation set:

- Loss: 2.5552
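The PII-scrubbing step mentioned above can be sketched with plain regexes (the card uses the `clean-text` package; this stand-in is a simplified illustration, and the patterns and placeholder tokens are assumptions, not the card's actual configuration):

```python
import re

# Simplified stand-in for the dataset-preparation scrub:
# replace email addresses and US-style phone numbers with placeholders.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE_RE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

def scrub(text: str) -> str:
    text = EMAIL_RE.sub("<EMAIL>", text)
    text = PHONE_RE.sub("<PHONE>", text)
    return text

print(scrub("Reach me at jane@corp.com or 555-123-4567."))
```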
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
- Transformers 4.20.1
- Pytorch 1.11.0+cu113
- Tokenizers 0.12.1