---
license: apache-2.0
tags:
  - generated_from_trainer
  - email generation
  - email
datasets:
  - aeslc
  - postbot/multi_emails_kw
widget:
  - text: Thursday pay invoice need asap thanks Pierre good morning dear Harold
    example_title: invoice
  - text: dear elia when will space be ready need urgently regards ronald
    example_title: space ready
  - text: >-
      Tuesday need talk with you important stuff dear jonathan status war in
      Syria
    example_title: war status
  - text: dear bob will back wednesday need urgently regards elena
    example_title: return wednesday
  - text: dear mary thanks for your last invoice need know when payment be
    example_title: last invoice
  - text: >-
      pct1_dropremainder rounding may truncate the last examples in a dataset if
      the number of examples in your dataset don’t divide evenly by 100 dear bob
    example_title: pct1_dropremainder
  - text: >-
      dear joseph have all invoices ready Monday next invoice in 30 days have
      great weekend
    example_title: next invoice
  - text: >-
      dear mary I have couple questions on new contract we agreed on need know
      thoughts regarding contract
    example_title: contract
  - text: Friday will make report due soon please thanks dear john
    example_title: report due soon
  - text: >-
      need take photos sunday want finish thursday photo exhibition need urgent
      help thanks dear john
    example_title: photo exhibition
  - text: Tuesday need talk with you important stuff dear reginald
    example_title: important talk
  - text: dear maria how are you doing thanks very much
    example_title: thanks
  - text: >-
      dear james tomorrow will prepare file for june report before leave need
      know when leave
    example_title: file for june report
parameters:
  min_length: 16
  max_length: 256
  no_repeat_ngram_size: 2
  do_sample: false
  num_beams: 8
  early_stopping: true
  repetition_penalty: 5.5
  length_penalty: 0.9
---

# t5-base-kw2email-v4

Note: this model was still trained on postbot/multi_emails_kw, which contains only part of the Sony emails.

This version improves on prior "base" versions by using training hyperparameters more closely aligned with those of bigscience/T0.

This model is a fine-tuned version of pszemraj/t5-base-kw2email-v3.5.
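
As a quick illustration, the sketch below runs the model through the `transformers` text2text-generation pipeline using the same decoding settings as the widget `parameters` above. The repo id `pszemraj/t5-base-kw2email-v4` is assumed from the card title; substitute the actual checkpoint path if it differs.

```python
# Minimal inference sketch. The repo id below is assumed from the card title;
# replace it with the actual checkpoint path if it differs.
from transformers import pipeline

generator = pipeline(
    "text2text-generation",
    model="pszemraj/t5-base-kw2email-v4",
)

# keyword-style prompt, as in the widget examples above
prompt = "dear elia when will space be ready need urgently regards ronald"

# decoding settings mirror the `parameters` block in the metadata
result = generator(
    prompt,
    min_length=16,
    max_length=256,
    no_repeat_ngram_size=2,
    do_sample=False,
    num_beams=8,
    early_stopping=True,
    repetition_penalty=5.5,
    length_penalty=0.9,
)
print(result[0]["generated_text"])
```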

## Model description

A T5-base sequence-to-sequence model fine-tuned to turn a short keyword-style prompt into a full email (see the widget examples above).

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.001
- train_batch_size: 8
- eval_batch_size: 2
- seed: 42
- distributed_type: multi-GPU
- gradient_accumulation_steps: 32
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 2
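
For reference, here is a minimal sketch of how these values map onto `Seq2SeqTrainingArguments` from `transformers`. It is an illustration only, not the author's training script; the output directory is a placeholder and anything not listed above (data preprocessing, mixed precision, logging) is omitted.

```python
# Illustrative mapping of the listed hyperparameters onto Seq2SeqTrainingArguments.
# NOT the author's actual training script; output_dir is a placeholder and unlisted
# options (mixed precision, logging, evaluation strategy, ...) are omitted.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./t5-base-kw2email-v4",  # placeholder
    learning_rate=1e-3,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=32,      # 8 x 32 (x GPUs) -> effective batch size 256
    seed=42,
    num_train_epochs=2,
    lr_scheduler_type="cosine",
    warmup_ratio=0.03,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```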

## Training results

### Framework versions

- Transformers 4.21.2
- Pytorch 1.12.1+cu113
- Datasets 2.4.0
- Tokenizers 0.12.1