---
license: bsd-3-clause
tags:
- generated_from_trainer
datasets:
- mbpp
model-index:
- name: codet5p-770m-py-sanitized-chrf-1-False-1e-05-0.1-lora
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# codet5p-770m-py-sanitized-chrf-1-False-1e-05-0.1-lora

This model is a fine-tuned version of [Salesforce/codet5p-770m-py](https://huggingface.co/Salesforce/codet5p-770m-py) on the mbpp dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8630
- Score: 18.8495
- Char Order: 6
- Word Order: 0
- Beta: 2
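
The Score, Char Order, Word Order, and Beta fields above correspond to the chrF metric (character n-gram F-score with char order 6, word order 0, and beta 2). A minimal sketch of how such a score can be computed with the Hugging Face `evaluate` package; the prediction/reference strings below are placeholders:

```python
# Hedged sketch: computing chrF with the same settings reported above.
import evaluate

chrf = evaluate.load("chrf")

predictions = ["def add(a, b):\n    return a + b"]    # model outputs (placeholder)
references = [["def add(a, b):\n    return a + b"]]   # gold solutions, one list per prediction

result = chrf.compute(
    predictions=predictions,
    references=references,
    char_order=6,   # matches "Char Order: 6"
    word_order=0,   # matches "Word Order: 0" (plain chrF, not chrF++)
    beta=2,         # matches "Beta: 2"
)
print(result["score"])
```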

## Model description

This checkpoint is a LoRA (parameter-efficient) fine-tune of [Salesforce/codet5p-770m-py](https://huggingface.co/Salesforce/codet5p-770m-py), the 770M-parameter CodeT5+ variant further tuned on Python, adapted to the MBPP programming-problem dataset and evaluated with the chrF scores reported above.

## Intended uses & limitations

The model is intended for generating short Python functions from natural-language problem descriptions in the style of MBPP tasks. Generated code should be reviewed and tested before use: the evaluation above reports only chrF (surface similarity to reference solutions), not execution-based correctness such as pass@k.
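
A minimal usage sketch, assuming the LoRA adapter weights are loaded on top of the base checkpoint with PEFT; the adapter repository id below is a placeholder:

```python
# Hedged sketch: loading the LoRA adapter on top of the base CodeT5+ model.
# The adapter id is a placeholder, not a confirmed repository name.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
from peft import PeftModel

base_id = "Salesforce/codet5p-770m-py"
adapter_id = "your-username/codet5p-770m-py-sanitized-chrf-1-False-1e-05-0.1-lora"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForSeq2SeqLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(model, adapter_id)
model.eval()

prompt = "Write a python function to find the minimum of two numbers."
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```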

## Training and evaluation data

The model was fine-tuned and evaluated on the [mbpp](https://huggingface.co/datasets/mbpp) dataset; the model name suggests the sanitized configuration, a hand-verified subset of MBPP (Mostly Basic Python Problems) that pairs short natural-language task descriptions with reference Python solutions and test cases.
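
A minimal sketch of loading that subset with the `datasets` library, assuming the sanitized configuration of the Hub `mbpp` dataset was used:

```python
# Hedged sketch: loading the sanitized MBPP configuration from the Hugging Face Hub.
from datasets import load_dataset

mbpp = load_dataset("mbpp", "sanitized")
print(mbpp)                       # splits include train / validation / test / prompt

example = mbpp["train"][0]
print(example["prompt"])          # natural-language task description
print(example["code"])            # reference Python solution
```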

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 6
- eval_batch_size: 6
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 50
- mixed_precision_training: Native AMP
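
The same settings expressed as a `Seq2SeqTrainingArguments` sketch; only the values listed above are taken from the run, and everything else (including the output directory) is a placeholder:

```python
# Hedged sketch: the listed hyperparameters as Seq2SeqTrainingArguments.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="codet5p-770m-py-mbpp-lora",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=6,
    per_device_eval_batch_size=6,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=50,
    fp16=True,                               # "Native AMP" mixed precision
)
```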

### Training results

| Training Loss | Epoch | Step | Validation Loss | Score   | Char Order | Word Order | Beta |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:----------:|:----------:|:----:|
| 1.1537        | 1.0   | 20   | 1.1773          | 15.9967 | 6          | 0          | 2    |
| 1.1524        | 2.0   | 40   | 1.1736          | 16.1464 | 6          | 0          | 2    |
| 1.1553        | 3.0   | 60   | 1.1665          | 16.8836 | 6          | 0          | 2    |
| 1.1204        | 4.0   | 80   | 1.1551          | 16.9562 | 6          | 0          | 2    |
| 1.1368        | 5.0   | 100  | 1.1360          | 16.8337 | 6          | 0          | 2    |
| 1.0961        | 6.0   | 120  | 1.1074          | 16.8866 | 6          | 0          | 2    |
| 1.0747        | 7.0   | 140  | 1.0538          | 17.2168 | 6          | 0          | 2    |
| 1.0228        | 8.0   | 160  | 0.9966          | 17.8457 | 6          | 0          | 2    |
| 0.9791        | 9.0   | 180  | 0.9571          | 17.9549 | 6          | 0          | 2    |
| 0.9597        | 10.0  | 200  | 0.9356          | 18.8652 | 6          | 0          | 2    |
| 0.9466        | 11.0  | 220  | 0.9255          | 18.7134 | 6          | 0          | 2    |
| 0.9206        | 12.0  | 240  | 0.9180          | 18.5846 | 6          | 0          | 2    |
| 0.9194        | 13.0  | 260  | 0.9119          | 19.1993 | 6          | 0          | 2    |
| 0.8938        | 14.0  | 280  | 0.9065          | 19.2773 | 6          | 0          | 2    |
| 0.8856        | 15.0  | 300  | 0.9019          | 19.1140 | 6          | 0          | 2    |
| 0.9002        | 16.0  | 320  | 0.8981          | 19.0127 | 6          | 0          | 2    |
| 0.8733        | 17.0  | 340  | 0.8945          | 19.1110 | 6          | 0          | 2    |
| 0.8815        | 18.0  | 360  | 0.8914          | 18.9013 | 6          | 0          | 2    |
| 0.8788        | 19.0  | 380  | 0.8887          | 18.7065 | 6          | 0          | 2    |
| 0.8895        | 20.0  | 400  | 0.8862          | 18.8140 | 6          | 0          | 2    |
| 0.8727        | 21.0  | 420  | 0.8839          | 18.9816 | 6          | 0          | 2    |
| 0.8533        | 22.0  | 440  | 0.8819          | 18.8941 | 6          | 0          | 2    |
| 0.8542        | 23.0  | 460  | 0.8800          | 18.8941 | 6          | 0          | 2    |
| 0.8397        | 24.0  | 480  | 0.8786          | 18.9750 | 6          | 0          | 2    |
| 0.8337        | 25.0  | 500  | 0.8772          | 18.8138 | 6          | 0          | 2    |
| 0.8439        | 26.0  | 520  | 0.8759          | 18.8138 | 6          | 0          | 2    |
| 0.8403        | 27.0  | 540  | 0.8744          | 18.9115 | 6          | 0          | 2    |
| 0.8423        | 28.0  | 560  | 0.8732          | 18.8551 | 6          | 0          | 2    |
| 0.8303        | 29.0  | 580  | 0.8720          | 18.8551 | 6          | 0          | 2    |
| 0.8299        | 30.0  | 600  | 0.8711          | 18.8677 | 6          | 0          | 2    |
| 0.8235        | 31.0  | 620  | 0.8701          | 18.8649 | 6          | 0          | 2    |
| 0.8207        | 32.0  | 640  | 0.8694          | 18.8780 | 6          | 0          | 2    |
| 0.8244        | 33.0  | 660  | 0.8685          | 18.8499 | 6          | 0          | 2    |
| 0.8114        | 34.0  | 680  | 0.8678          | 18.8717 | 6          | 0          | 2    |
| 0.8249        | 35.0  | 700  | 0.8672          | 18.7947 | 6          | 0          | 2    |
| 0.8193        | 36.0  | 720  | 0.8666          | 18.8761 | 6          | 0          | 2    |
| 0.8146        | 37.0  | 740  | 0.8662          | 19.0580 | 6          | 0          | 2    |
| 0.807         | 38.0  | 760  | 0.8657          | 18.7895 | 6          | 0          | 2    |
| 0.8012        | 39.0  | 780  | 0.8651          | 18.7895 | 6          | 0          | 2    |
| 0.8065        | 40.0  | 800  | 0.8648          | 18.7348 | 6          | 0          | 2    |
| 0.8106        | 41.0  | 820  | 0.8644          | 18.8183 | 6          | 0          | 2    |
| 0.7992        | 42.0  | 840  | 0.8642          | 18.8183 | 6          | 0          | 2    |
| 0.8058        | 43.0  | 860  | 0.8639          | 18.8729 | 6          | 0          | 2    |
| 0.7893        | 44.0  | 880  | 0.8636          | 18.8729 | 6          | 0          | 2    |
| 0.8162        | 45.0  | 900  | 0.8634          | 18.9041 | 6          | 0          | 2    |
| 0.8106        | 46.0  | 920  | 0.8632          | 18.8495 | 6          | 0          | 2    |
| 0.7955        | 47.0  | 940  | 0.8632          | 18.8495 | 6          | 0          | 2    |
| 0.8172        | 48.0  | 960  | 0.8631          | 18.8495 | 6          | 0          | 2    |
| 0.8024        | 49.0  | 980  | 0.8630          | 18.8495 | 6          | 0          | 2    |
| 0.8086        | 50.0  | 1000 | 0.8630          | 18.8495 | 6          | 0          | 2    |


### Framework versions

- Transformers 4.30.0.dev0
- Pytorch 2.0.1
- Datasets 2.13.1
- Tokenizers 0.13.3