---
license: bsd-3-clause
tags:
- generated_from_trainer
datasets:
- mbpp
model-index:
- name: codet5p-770m-py-codebleu-32-True-1e-06-0.1
  results: []
---

# codet5p-770m-py-codebleu-32-True-1e-06-0.1

This model is a fine-tuned version of [Salesforce/codet5p-770m-py](https://huggingface.co/Salesforce/codet5p-770m-py) on the mbpp dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8087
- Codebleu: 0.0867
- Ngram Match Score: 0.0137
- Weighted Ngram Match Score: 0.0422
- Syntax Match Score: 0.1204
- Dataflow Match Score: 0.0824
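
The checkpoint can be loaded like any CodeT5+ seq2seq model. A minimal sketch, assuming the fine-tuned weights are published under this repository name (substitute the full hub id) and that the base model's `T5ForConditionalGeneration` interface carries over:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Assumption: replace with the full hub id of this fine-tuned checkpoint.
checkpoint = "codet5p-770m-py-codebleu-32-True-1e-06-0.1"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = T5ForConditionalGeneration.from_pretrained(checkpoint)

# MBPP-style prompt: a natural-language description of a small Python task.
prompt = "Write a python function to find the maximum of two numbers."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_length=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```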

## Model description

This checkpoint fine-tunes [Salesforce/codet5p-770m-py](https://huggingface.co/Salesforce/codet5p-770m-py), the 770M-parameter Python variant of CodeT5+, on the MBPP dataset for Python code generation. Evaluation uses CodeBLEU, which combines surface n-gram overlap with weighted n-gram, AST syntax, and dataflow matching.

## Intended uses & limitations

The model is intended for generating short Python functions from natural-language task descriptions in the style of MBPP. The modest CodeBLEU scores on the evaluation set (0.0867 overall) suggest generated code should be reviewed and tested before use; behavior on other programming languages or out-of-distribution prompts has not been evaluated.

## Training and evaluation data

Training and evaluation used the [mbpp](https://huggingface.co/datasets/mbpp) dataset (Mostly Basic Python Problems), a collection of roughly 1,000 crowd-sourced Python programming tasks, each pairing a natural-language description with a reference solution and automated test cases; see the loading sketch below.
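
A minimal sketch of loading the dataset with the `datasets` library (which splits this particular run used for training versus evaluation is not recorded in this card):

```python
from datasets import load_dataset

mbpp = load_dataset("mbpp")  # splits: train, test, validation, prompt

example = mbpp["train"][0]
print(example["text"])       # natural-language task description
print(example["code"])       # reference Python solution
print(example["test_list"])  # assert-based tests for the task
```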

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 6
- eval_batch_size: 6
- seed: 42
- gradient_accumulation_steps: 32
- total_train_batch_size: 192
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 50
- mixed_precision_training: Native AMP
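
For reference, these settings map directly onto the standard `Seq2SeqTrainingArguments`. A minimal sketch; the `output_dir` and `predict_with_generate` flag are assumptions, and the Adam betas/epsilon listed above are the optimizer defaults, so they need no explicit setting:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="codet5p-770m-py-codebleu-32-True-1e-06-0.1",  # assumption
    learning_rate=1e-6,
    per_device_train_batch_size=6,
    per_device_eval_batch_size=6,
    gradient_accumulation_steps=32,  # 6 * 32 = 192 effective batch size
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=50,
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
    predict_with_generate=True,      # assumption: decode text for CodeBLEU eval
)
```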

### Training results

| Training Loss | Epoch | Step | Validation Loss | Codebleu | Ngram Match Score | Weighted Ngram Match Score | Syntax Match Score | Dataflow Match Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-----------------:|:--------------------------:|:------------------:|:--------------------:|
| 1.9228        | 0.51  | 1    | 0.9113          | 0.0047   | 0.0000            | 0.0000                     | 0.0048             | 0.0070               |
| 0.9857        | 1.52  | 3    | 0.9112          | 0.0047   | 0.0000            | 0.0000                     | 0.0048             | 0.0070               |
| 0.9734        | 2.54  | 5    | 0.9112          | 0.0069   | 0.0000            | 0.0001                     | 0.0067             | 0.0105               |
| 0.9624        | 3.56  | 7    | 0.9111          | 0.0074   | 0.0000            | 0.0002                     | 0.0072             | 0.0112               |
| 0.9586        | 4.57  | 9    | 0.9107          | 0.0087   | 0.0000            | 0.0003                     | 0.0092             | 0.0126               |
| 0.9708        | 5.59  | 11   | 0.9097          | 0.0140   | 0.0000            | 0.0019                     | 0.0178             | 0.0168               |
| 0.9667        | 6.6   | 13   | 0.9092          | 0.0171   | 0.0000            | 0.0034                     | 0.0202             | 0.0216               |
| 0.9791        | 7.62  | 15   | 0.9058          | 0.0211   | 0.0000            | 0.0057                     | 0.0255             | 0.0258               |
| 0.9702        | 8.63  | 17   | 0.9048          | 0.0317   | 0.0001            | 0.0144                     | 0.0366             | 0.0391               |
| 0.9563        | 9.65  | 19   | 0.9034          | 0.0398   | 0.0007            | 0.0192                     | 0.0477             | 0.0468               |
| 0.9654        | 10.67 | 21   | 0.8927          | 0.0482   | 0.0014            | 0.0215                     | 0.0583             | 0.0566               |
| 0.9458        | 11.68 | 23   | 0.8898          | 0.0602   | 0.0043            | 0.0275                     | 0.0742             | 0.0684               |
| 0.9523        | 12.7  | 25   | 0.8866          | 0.0647   | 0.0053            | 0.0286                     | 0.0829             | 0.0705               |
| 0.942         | 13.71 | 27   | 0.8847          | 0.0786   | 0.0091            | 0.0338                     | 0.1069             | 0.0789               |
| 0.94          | 14.73 | 29   | 0.8648          | 0.0798   | 0.0099            | 0.0357                     | 0.1079             | 0.0803               |
| 0.9025        | 15.75 | 31   | 0.8604          | 0.0809   | 0.0105            | 0.0363                     | 0.1122             | 0.0782               |
| 0.9058        | 16.76 | 33   | 0.8577          | 0.0815   | 0.0107            | 0.0362                     | 0.1132             | 0.0789               |
| 0.893         | 17.78 | 35   | 0.8543          | 0.0816   | 0.0110            | 0.0363                     | 0.1132             | 0.0789               |
| 0.8959        | 18.79 | 37   | 0.8524          | 0.0805   | 0.0109            | 0.0362                     | 0.1113             | 0.0782               |
| 0.877         | 19.81 | 39   | 0.8422          | 0.0808   | 0.0118            | 0.0385                     | 0.1113             | 0.0782               |
| 0.861         | 20.83 | 41   | 0.8374          | 0.0811   | 0.0118            | 0.0385                     | 0.1113             | 0.0789               |
| 0.8365        | 21.84 | 43   | 0.8376          | 0.0827   | 0.0119            | 0.0386                     | 0.1132             | 0.0810               |
| 0.8293        | 22.86 | 45   | 0.8331          | 0.0853   | 0.0126            | 0.0390                     | 0.1180             | 0.0824               |
| 0.8288        | 23.87 | 47   | 0.8246          | 0.0852   | 0.0134            | 0.0421                     | 0.1180             | 0.0810               |
| 0.8175        | 24.89 | 49   | 0.8141          | 0.0852   | 0.0134            | 0.0421                     | 0.1180             | 0.0810               |
| 0.6345        | 25.4  | 50   | 0.8087          | 0.0867   | 0.0137            | 0.0422                     | 0.1204             | 0.0824               |
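
The CodeBLEU components reported above (n-gram, weighted n-gram, syntax, and dataflow match) can be computed with, for example, the `codebleu` package; the exact scorer used by this run is not recorded in the card, so treat this as an illustrative sketch:

```python
# pip install codebleu
from codebleu import calc_codebleu

references  = ["def add(x, y):\n    return x + y"]
predictions = ["def add(a, b):\n    return a + b"]

result = calc_codebleu(references, predictions, lang="python",
                       weights=(0.25, 0.25, 0.25, 0.25))
# Returns the same components reported above: 'codebleu',
# 'ngram_match_score', 'weighted_ngram_match_score',
# 'syntax_match_score', 'dataflow_match_score'.
print(result)
```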


### Framework versions

- Transformers 4.30.0.dev0
- Pytorch 2.0.1
- Datasets 2.13.1
- Tokenizers 0.13.3