---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: t5_recommendation_sports_equipment_english
  results: []
---

# t5_recommendation_sports_equipment_english

This model is a fine-tuned version of [t5-large](https://huggingface.co/t5-large) on an unspecified dataset (the trainer did not record a dataset name).
It achieves the following results on the evaluation set:
- Loss: 0.4438
- Rouge1: 72.2222
- Rouge2: 66.6667
- Rougel: 72.2222
- Rougelsum: 72.2222
- Gen Len: 4.0952
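
A minimal inference sketch, assuming a standard seq2seq setup. The checkpoint id, prompt text, and generation settings below are illustrative guesses; the card does not document the prompt format:

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Hypothetical hub id; substitute the actual repo path for this checkpoint.
checkpoint = "t5_recommendation_sports_equipment_english"
tokenizer = T5Tokenizer.from_pretrained(checkpoint)
model = T5ForConditionalGeneration.from_pretrained(checkpoint)

# The training prompt format is undocumented; a plain English query is an assumption.
inputs = tokenizer("I want to take up badminton.", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=16)  # Gen Len above suggests ~4 tokens
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```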

## Model description

Judging by its name, this appears to be t5-large fine-tuned to recommend sports equipment from English-language input; no further description has been provided.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 40
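
As a rough guide, these values map onto a `Seq2SeqTrainingArguments` configuration along these lines. The output directory and the `predict_with_generate` flag are assumptions not stated in the card:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./t5_recommendation_sports_equipment_english",  # placeholder path
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,   # 32 x 4 = 128 total train batch size
    lr_scheduler_type="linear",
    num_train_epochs=40,
    predict_with_generate=True,      # assumption: needed to compute the ROUGE numbers above
)
```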

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log        | 1.0   | 1    | 9.9716          | 12.6943 | 0.0     | 12.4298 | 12.3728   | 19.0    |
| No log        | 2.0   | 2    | 10.1466         | 10.0457 | 0.0     | 9.8413  | 9.8341    | 19.0    |
| No log        | 3.0   | 3    | 8.3378          | 10.7204 | 0.0     | 10.4782 | 10.4681   | 19.0    |
| No log        | 4.0   | 4    | 7.3021          | 10.7204 | 0.0     | 10.4782 | 10.4681   | 19.0    |
| No log        | 5.0   | 5    | 6.3242          | 10.5739 | 0.0     | 10.3628 | 10.3550   | 19.0    |
| No log        | 6.0   | 6    | 5.4331          | 10.3340 | 0.7937  | 10.3340 | 10.2659   | 19.0    |
| No log        | 7.0   | 7    | 4.7152          | 10.8955 | 0.7937  | 10.9896 | 10.8598   | 18.9524 |
| No log        | 8.0   | 8    | 3.9937          | 14.1311 | 3.5923  | 14.2840 | 13.9026   | 15.0952 |
| No log        | 9.0   | 9    | 3.1163          | 16.2812 | 1.0025  | 16.1905 | 16.0614   | 6.4762  |
| No log        | 10.0  | 10   | 2.3306          | 23.1746 | 7.1429  | 23.1746 | 23.6508   | 4.1429  |
| No log        | 11.0  | 11   | 1.9695          | 21.4286 | 7.1429  | 21.4286 | 21.7460   | 4.0476  |
| No log        | 12.0  | 12   | 1.5552          | 24.1270 | 7.1429  | 23.9683 | 24.1270   | 3.9048  |
| No log        | 13.0  | 13   | 0.8986          | 9.0476  | 0.0     | 9.0476  | 9.0476    | 3.7619  |
| No log        | 14.0  | 14   | 0.7398          | 18.2540 | 2.3810  | 18.2540 | 18.2540   | 4.1905  |
| No log        | 15.0  | 15   | 0.6966          | 12.6984 | 0.0     | 11.9048 | 12.6984   | 3.6667  |
| No log        | 16.0  | 16   | 0.6352          | 32.5397 | 14.2857 | 32.5397 | 31.7460   | 3.7619  |
| No log        | 17.0  | 17   | 0.5722          | 43.6508 | 23.8095 | 43.6508 | 43.6508   | 4.0952  |
| No log        | 18.0  | 18   | 0.5628          | 43.6508 | 23.8095 | 43.6508 | 43.6508   | 3.8571  |
| No log        | 19.0  | 19   | 0.5526          | 43.1746 | 23.8095 | 43.0159 | 42.8571   | 3.8571  |
| No log        | 20.0  | 20   | 0.5522          | 48.4127 | 38.0952 | 48.4127 | 48.4127   | 3.7619  |
| No log        | 21.0  | 21   | 0.5201          | 42.8571 | 28.5714 | 42.6190 | 42.6984   | 4.2381  |
| No log        | 22.0  | 22   | 0.5262          | 36.9841 | 19.0476 | 36.9841 | 36.9841   | 4.2857  |
| No log        | 23.0  | 23   | 0.5093          | 38.0952 | 23.8095 | 37.5397 | 37.9365   | 4.1429  |
| No log        | 24.0  | 24   | 0.4818          | 45.6349 | 33.3333 | 45.0794 | 45.3968   | 4.1429  |
| No log        | 25.0  | 25   | 0.4547          | 50.7937 | 38.0952 | 50.0    | 50.7937   | 4.1429  |
| No log        | 26.0  | 26   | 0.4455          | 50.7937 | 38.0952 | 50.0    | 50.7937   | 4.1429  |
| No log        | 27.0  | 27   | 0.4660          | 53.1746 | 42.8571 | 53.1746 | 53.1746   | 4.0476  |
| No log        | 28.0  | 28   | 0.4825          | 53.1746 | 42.8571 | 53.1746 | 53.1746   | 4.0     |
| No log        | 29.0  | 29   | 0.4928          | 53.1746 | 42.8571 | 53.1746 | 53.1746   | 4.0476  |
| No log        | 30.0  | 30   | 0.4838          | 57.4603 | 42.8571 | 57.1429 | 57.1429   | 4.0476  |
| No log        | 31.0  | 31   | 0.4955          | 60.3175 | 47.6190 | 60.3175 | 60.3175   | 4.0476  |
| No log        | 32.0  | 32   | 0.5066          | 62.6984 | 52.3810 | 62.6984 | 62.6984   | 4.1429  |
| No log        | 33.0  | 33   | 0.5189          | 62.6984 | 52.3810 | 62.6984 | 62.6984   | 4.1905  |
| No log        | 34.0  | 34   | 0.5234          | 62.6984 | 52.3810 | 62.6984 | 62.6984   | 4.1905  |
| No log        | 35.0  | 35   | 0.5225          | 62.6984 | 52.3810 | 62.6984 | 62.6984   | 4.1905  |
| No log        | 36.0  | 36   | 0.5225          | 62.6984 | 52.3810 | 62.6984 | 62.6984   | 4.1905  |
| No log        | 37.0  | 37   | 0.5058          | 62.2222 | 52.3810 | 61.9048 | 61.9048   | 4.1429  |
| No log        | 38.0  | 38   | 0.4861          | 70.6349 | 61.9048 | 69.8413 | 69.8413   | 4.1905  |
| No log        | 39.0  | 39   | 0.4625          | 70.6349 | 61.9048 | 69.8413 | 69.8413   | 4.1905  |
| No log        | 40.0  | 40   | 0.4438          | 72.2222 | 66.6667 | 72.2222 | 72.2222   | 4.0952  |
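
The ROUGE columns above can presumably be reproduced with the Hugging Face `evaluate` library; a minimal sketch, using a made-up prediction/reference pair purely for illustration:

```python
import evaluate

rouge = evaluate.load("rouge")

# Made-up example pair; replace with real model outputs and gold labels.
predictions = ["table tennis racket"]
references = ["table tennis racket"]

scores = rouge.compute(predictions=predictions, references=references)
# Returns rouge1 / rouge2 / rougeL / rougeLsum F-measures in [0, 1];
# the table above reports them scaled to percentages.
print(scores)
```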


### Framework versions

- Transformers 4.26.0
- Pytorch 2.3.0+cu121
- Datasets 2.8.0
- Tokenizers 0.13.3