---
license: apache-2.0
tags:
- generated_from_trainer
- email generation
- email
datasets:
- aeslc
- postbot/multi_emails_kw
widget:
- text: Thursday pay invoice need asap thanks Pierre good morning dear Harold
  example_title: invoice
- text: dear elia when will space be ready need urgently regards ronald
  example_title: space ready
- text: Tuesday need talk with you important stuff dear jonathan status war in Syria
  example_title: war status
- text: dear bob will back wednesday need urgently regards elena
  example_title: return wednesday
- text: dear mary thanks for your last invoice need know when payment be
  example_title: last invoice
- text: pct1_dropremainder rounding may truncate the last examples in a dataset if
    the number of examples in your dataset don’t divide evenly by 100 dear bob
  example_title: pct1_dropremainder
- text: dear joseph have all invoices ready Monday next invoice in 30 days have great
    weekend
  example_title: next invoice
- text: dear mary I have couple questions on new contract we agreed on need know thoughts
    regarding contract
  example_title: contract
- text: Friday will make report due soon please thanks dear john
  example_title: report due soon
- text: need take photos sunday want finish thursday photo exhibition need urgent
    help thanks dear john
  example_title: photo exhibition
- text: Tuesday need talk with you important stuff dear reginald
  example_title: important talk
- text: dear maria how are you doing thanks very much
  example_title: thanks
- text: dear james tomorrow will prepare file for june report before leave need know
    when leave
  example_title: file for june report
parameters:
  min_length: 16
  max_length: 256
  no_repeat_ngram_size: 2
  do_sample: false
  num_beams: 8
  early_stopping: true
  repetition_penalty: 5.5
  length_penalty: 0.9
base_model: pszemraj/t5-base-kw2email-v3.5
---
# t5-base-kw2email-v4


This version **improves on prior "base" versions** by using training hyperparameters more closely aligned with those of [bigscience/T0](https://huggingface.co/bigscience/T0).

This model is a fine-tuned version of [pszemraj/t5-base-kw2email-v3.5](https://huggingface.co/pszemraj/t5-base-kw2email-v3.5) on the `aeslc` and `postbot/multi_emails_kw` datasets listed in the metadata above.

## Model description

A T5-base model that generates short email drafts from keyword-style prompts (see the widget examples above), fine-tuned from [pszemraj/t5-base-kw2email-v3.5](https://huggingface.co/pszemraj/t5-base-kw2email-v3.5).

## Intended uses & limitations

The model is intended to draft short emails from keyword or shorthand prompts, as in the widget examples above; its limitations have not yet been documented.
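
A minimal usage sketch (not part of the original card): it loads the model with the `transformers` pipeline and passes the generation settings from the `parameters` block in the metadata above. The repo id `pszemraj/t5-base-kw2email-v4` is assumed from the card title, and the prompt is one of the widget examples.

```python
from transformers import pipeline

# Assumed repo id, inferred from the card title.
generator = pipeline("text2text-generation", model="pszemraj/t5-base-kw2email-v4")

# One of the widget example prompts from the metadata above.
prompt = "dear bob will back wednesday need urgently regards elena"

# Generation settings mirror the `parameters` block in the metadata.
result = generator(
    prompt,
    min_length=16,
    max_length=256,
    no_repeat_ngram_size=2,
    do_sample=False,
    num_beams=8,
    early_stopping=True,
    repetition_penalty=5.5,
    length_penalty=0.9,
)
print(result[0]["generated_text"])
```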

## Training and evaluation data

Training used the `aeslc` and `postbot/multi_emails_kw` datasets listed in the metadata above; evaluation details have not been documented.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 8
- eval_batch_size: 2
- seed: 42
- distributed_type: multi-GPU
- gradient_accumulation_steps: 32
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 2
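
For orientation, a rough sketch (assumed, not the original training script) of how the settings above map onto `Seq2SeqTrainingArguments`. The Adam betas and epsilon listed are the library defaults, and the total train batch size of 256 is the per-device batch of 8 times the 32 gradient-accumulation steps.

```python
from transformers import Seq2SeqTrainingArguments

# Illustrative mapping of the reported hyperparameters (output_dir is hypothetical).
args = Seq2SeqTrainingArguments(
    output_dir="./t5-base-kw2email-v4",
    learning_rate=1e-3,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=32,  # 8 * 32 = effective batch of 256
    seed=42,
    lr_scheduler_type="cosine",
    warmup_ratio=0.03,
    num_train_epochs=2,
    # Adam betas=(0.9, 0.999) and epsilon=1e-8 are the transformers defaults.
)
```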

### Training results



### Framework versions

- Transformers 4.21.2
- Pytorch 1.12.1+cu113
- Datasets 2.4.0
- Tokenizers 0.12.1