---
license: cc-by-nc-4.0
base_model: MCG-NJU/videomae-base
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: videomae-base-finetuned-chickenbehaviour-2
  results: []
---


# videomae-base-finetuned-chickenbehaviour-2

This model is a fine-tuned version of [MCG-NJU/videomae-base](https://huggingface.co/MCG-NJU/videomae-base) on an unknown dataset; the model name indicates a chicken-behaviour video classification task.
It achieves the following results on the evaluation set:
- Loss: 1.1357
- Accuracy: 0.6697
- Precision: 0.6429
- Recall: 0.6697
- F1: 0.6354
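
Note that recall equals accuracy in every logged epoch of the results table below, which is the signature of weighted-average recall (class-weighted recall reduces to overall accuracy). Precision and F1 were therefore most likely computed as weighted averages as well. A minimal sketch of such a `compute_metrics` function, assuming scikit-learn and weighted averaging (the averaging mode is inferred from the numbers, not stated by the card):

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Weighted-average metrics; with weighted averaging, recall
    equals accuracy, matching the pattern in the results table."""
    preds = np.argmax(eval_pred.predictions, axis=-1)
    labels = eval_pred.label_ids
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```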

## Model description

More information needed

## Intended uses & limitations

More information needed
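
As a minimal usage sketch, the checkpoint can be loaded for video classification through the standard `transformers` API. The repo id below is the model name from this card and may need the owner's namespace prefix; the 16-frame clip length is the VideoMAE-base default:

```python
import numpy as np
import torch
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

# Replace with the full Hub repo id (owner/name) of this checkpoint.
ckpt = "videomae-base-finetuned-chickenbehaviour-2"
processor = VideoMAEImageProcessor.from_pretrained(ckpt)
model = VideoMAEForVideoClassification.from_pretrained(ckpt)

# A clip of 16 RGB frames (VideoMAE-base default); real frames would come
# from a decoded video rather than random noise.
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(16)]

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```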

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged reconstruction follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 127240
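
A sketch of the corresponding `TrainingArguments`: the listed values map directly, while `output_dir` and the evaluation/save cadence are assumptions not confirmed by this card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="videomae-base-finetuned-chickenbehaviour-2",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,               # Adam betas/epsilon from the list above
    adam_beta2=0.999,             # (these are also the Trainer defaults)
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=127240,             # "training_steps" above
    evaluation_strategy="epoch",  # assumed from the per-epoch rows below
    save_strategy="epoch",        # assumed
)
```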

### Training results

| Training Loss | Epoch | Step   | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:------:|:---------------:|:--------:|:---------:|:------:|:------:|
| 2.035         | 0.01  | 1591   | 1.6961          | 0.5116   | 0.4264    | 0.5116 | 0.4139 |
| 1.5431        | 1.01  | 3182   | 1.4395          | 0.5898   | 0.5240    | 0.5898 | 0.5167 |
| 1.4118        | 2.01  | 4773   | 1.3632          | 0.6051   | 0.5553    | 0.6051 | 0.5535 |
| 1.3413        | 3.01  | 6364   | 1.3312          | 0.6021   | 0.5516    | 0.6021 | 0.5384 |
| 1.2969        | 4.01  | 7955   | 1.2739          | 0.6212   | 0.6122    | 0.6212 | 0.5663 |
| 1.2636        | 5.01  | 9546   | 1.3212          | 0.6058   | 0.6187    | 0.6058 | 0.5430 |
| 1.2231        | 6.01  | 11137  | 1.2543          | 0.6242   | 0.6461    | 0.6242 | 0.5747 |
| 1.1989        | 7.01  | 12728  | 1.2378          | 0.6405   | 0.6356    | 0.6405 | 0.5869 |
| 1.1566        | 8.01  | 14319  | 1.2124          | 0.6528   | 0.6199    | 0.6528 | 0.5942 |
| 1.1145        | 9.01  | 15910  | 1.1803          | 0.6476   | 0.6341    | 0.6476 | 0.6052 |
| 1.0567        | 10.01 | 17501  | 1.2577          | 0.6266   | 0.6279    | 0.6266 | 0.5969 |
| 1.0172        | 11.01 | 19092  | 1.1961          | 0.6570   | 0.6369    | 0.6570 | 0.6083 |
| 0.9817        | 12.01 | 20683  | 1.2287          | 0.6620   | 0.6499    | 0.6620 | 0.6049 |
| 0.9279        | 13.01 | 22274  | 1.2358          | 0.6549   | 0.6504    | 0.6549 | 0.6213 |
| 0.8913        | 14.01 | 23865  | 1.1815          | 0.6681   | 0.6325    | 0.6681 | 0.6308 |
| 0.8559        | 15.01 | 25456  | 1.3212          | 0.6391   | 0.6392    | 0.6391 | 0.6037 |
| 0.8083        | 16.01 | 27047  | 1.3073          | 0.6231   | 0.6251    | 0.6231 | 0.6006 |
| 0.7662        | 17.01 | 28638  | 1.2982          | 0.6462   | 0.6252    | 0.6462 | 0.6214 |
| 0.7363        | 18.01 | 30229  | 1.3019          | 0.6575   | 0.6428    | 0.6575 | 0.6264 |
| 0.6787        | 19.01 | 31820  | 1.3867          | 0.6511   | 0.6368    | 0.6511 | 0.6230 |
| 0.6433        | 20.01 | 33411  | 1.4019          | 0.6365   | 0.6375    | 0.6365 | 0.6139 |
| 0.5969        | 21.01 | 35002  | 1.4419          | 0.6341   | 0.6212    | 0.6341 | 0.6104 |
| 0.563         | 22.01 | 36593  | 1.4778          | 0.6509   | 0.6293    | 0.6509 | 0.6170 |
| 0.5252        | 23.01 | 38184  | 1.4864          | 0.6433   | 0.6316    | 0.6433 | 0.6214 |
| 0.5           | 24.01 | 39775  | 1.6704          | 0.6233   | 0.6273    | 0.6233 | 0.6023 |
| 0.4622        | 25.01 | 41366  | 1.6658          | 0.6488   | 0.6260    | 0.6488 | 0.6119 |
| 0.4292        | 26.01 | 42957  | 1.6428          | 0.6495   | 0.6287    | 0.6495 | 0.6243 |
| 0.4044        | 27.01 | 44548  | 1.6703          | 0.6587   | 0.6311    | 0.6587 | 0.6387 |
| 0.3952        | 28.01 | 46139  | 1.7576          | 0.6330   | 0.6171    | 0.6330 | 0.6123 |
| 0.3681        | 29.01 | 47730  | 1.9032          | 0.6554   | 0.6349    | 0.6554 | 0.6231 |
| 0.3541        | 30.01 | 49321  | 1.9508          | 0.6445   | 0.6320    | 0.6445 | 0.6207 |
| 0.322         | 31.01 | 50912  | 2.1317          | 0.6226   | 0.6277    | 0.6226 | 0.6099 |
| 0.3239        | 32.01 | 52503  | 1.9785          | 0.6509   | 0.6321    | 0.6509 | 0.6328 |
| 0.301         | 33.01 | 54094  | 2.2050          | 0.6436   | 0.6259    | 0.6436 | 0.6097 |
| 0.28          | 34.01 | 55685  | 2.2268          | 0.6320   | 0.6319    | 0.6320 | 0.6174 |
| 0.2742        | 35.01 | 57276  | 2.3538          | 0.6419   | 0.6239    | 0.6419 | 0.6158 |
| 0.2433        | 36.01 | 58867  | 2.3947          | 0.6478   | 0.6237    | 0.6478 | 0.6184 |
| 0.2677        | 37.01 | 60458  | 2.4007          | 0.6455   | 0.6285    | 0.6455 | 0.6234 |
| 0.2316        | 38.01 | 62049  | 2.5197          | 0.6297   | 0.6246    | 0.6297 | 0.6120 |
| 0.2229        | 39.01 | 63640  | 2.5478          | 0.6506   | 0.6322    | 0.6506 | 0.6235 |
| 0.215         | 40.01 | 65231  | 2.5168          | 0.6445   | 0.6455    | 0.6445 | 0.6209 |
| 0.2032        | 41.01 | 66822  | 2.6607          | 0.6443   | 0.6304    | 0.6443 | 0.6161 |
| 0.1957        | 42.01 | 68413  | 2.6434          | 0.6219   | 0.6206    | 0.6219 | 0.6059 |
| 0.1839        | 43.01 | 70004  | 2.6378          | 0.6480   | 0.6182    | 0.6480 | 0.6202 |
| 0.1672        | 44.01 | 71595  | 2.8355          | 0.6330   | 0.6175    | 0.6330 | 0.6095 |
| 0.1554        | 45.01 | 73186  | 2.8833          | 0.6297   | 0.6180    | 0.6297 | 0.6090 |
| 0.1525        | 46.01 | 74777  | 2.8732          | 0.6499   | 0.6212    | 0.6499 | 0.6247 |
| 0.1443        | 47.01 | 76368  | 2.7936          | 0.6513   | 0.6240    | 0.6513 | 0.6297 |
| 0.1361        | 48.01 | 77959  | 2.8815          | 0.6443   | 0.6187    | 0.6443 | 0.6230 |
| 0.1351        | 49.01 | 79550  | 3.0703          | 0.6429   | 0.6244    | 0.6429 | 0.6175 |
| 0.1196        | 50.01 | 81141  | 3.0275          | 0.6424   | 0.6250    | 0.6424 | 0.6190 |
| 0.111         | 51.01 | 82732  | 3.1255          | 0.6419   | 0.6281    | 0.6419 | 0.6189 |
| 0.1119        | 52.01 | 84323  | 3.1854          | 0.6471   | 0.6299    | 0.6471 | 0.6215 |
| 0.1069        | 53.01 | 85914  | 3.2136          | 0.6384   | 0.6251    | 0.6384 | 0.6195 |
| 0.093         | 54.01 | 87505  | 3.3125          | 0.6506   | 0.6145    | 0.6506 | 0.6155 |
| 0.0901        | 55.01 | 89096  | 3.3028          | 0.6384   | 0.6277    | 0.6384 | 0.6217 |
| 0.0776        | 56.01 | 90687  | 3.3315          | 0.6488   | 0.6272    | 0.6488 | 0.6298 |
| 0.0837        | 57.01 | 92278  | 3.4385          | 0.6558   | 0.6374    | 0.6558 | 0.6242 |
| 0.0701        | 58.01 | 93869  | 3.3800          | 0.6440   | 0.6321    | 0.6440 | 0.6286 |
| 0.0682        | 59.01 | 95460  | 3.4473          | 0.6542   | 0.6344    | 0.6542 | 0.6262 |
| 0.0763        | 60.01 | 97051  | 3.4505          | 0.6315   | 0.6149    | 0.6315 | 0.6148 |
| 0.0629        | 61.01 | 98642  | 3.4402          | 0.6504   | 0.6233    | 0.6504 | 0.6253 |
| 0.0552        | 62.01 | 100233 | 3.4402          | 0.6537   | 0.6324    | 0.6537 | 0.6315 |
| 0.0463        | 63.01 | 101824 | 3.5300          | 0.6466   | 0.6217    | 0.6466 | 0.6217 |
| 0.0471        | 64.01 | 103415 | 3.6793          | 0.6511   | 0.6346    | 0.6511 | 0.6223 |
| 0.0448        | 65.01 | 105006 | 3.6850          | 0.6450   | 0.6265    | 0.6450 | 0.6170 |
| 0.0362        | 66.01 | 106597 | 3.6585          | 0.6483   | 0.6265    | 0.6483 | 0.6242 |
| 0.0419        | 67.01 | 108188 | 3.6285          | 0.6344   | 0.6192    | 0.6344 | 0.6169 |
| 0.0309        | 68.01 | 109779 | 3.6657          | 0.6490   | 0.6264    | 0.6490 | 0.6269 |
| 0.0312        | 69.01 | 111370 | 3.7123          | 0.6417   | 0.6239    | 0.6417 | 0.6205 |
| 0.0315        | 70.01 | 112961 | 3.7538          | 0.6490   | 0.6224    | 0.6490 | 0.6189 |
| 0.0294        | 71.01 | 114552 | 3.7064          | 0.6483   | 0.6234    | 0.6483 | 0.6237 |
| 0.0282        | 72.01 | 116143 | 3.7945          | 0.6429   | 0.6247    | 0.6429 | 0.6192 |
| 0.0275        | 73.01 | 117734 | 3.7550          | 0.6528   | 0.6297    | 0.6528 | 0.6272 |
| 0.0319        | 74.01 | 119325 | 3.7407          | 0.6509   | 0.6289    | 0.6509 | 0.6234 |
| 0.021         | 75.01 | 120916 | 3.7527          | 0.6532   | 0.6290    | 0.6532 | 0.6270 |
| 0.0159        | 76.01 | 122507 | 3.7780          | 0.6516   | 0.6241    | 0.6516 | 0.6243 |
| 0.0133        | 77.01 | 124098 | 3.7923          | 0.6499   | 0.6272    | 0.6499 | 0.6240 |
| 0.0125        | 78.01 | 125689 | 3.8070          | 0.6504   | 0.6263    | 0.6504 | 0.6217 |
| 0.0132        | 79.01 | 127240 | 3.7964          | 0.6506   | 0.6264    | 0.6506 | 0.6225 |
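
The trajectory shows a clear overfitting pattern: training loss falls from about 2.0 to 0.01, while validation loss bottoms out near epoch 14 (1.1815) and then climbs past 3.7. The headline metrics at the top of this card (loss 1.1357) beat every per-epoch row, which is consistent with a best checkpoint being restored at the end and evaluated on a separate held-out split, though the card does not say so. A hedged sketch of `TrainingArguments` settings that keep the best checkpoint rather than the overfit final one (all values are assumptions):

```python
from transformers import TrainingArguments, EarlyStoppingCallback

args_best = TrainingArguments(
    output_dir="videomae-base-finetuned-chickenbehaviour-2",  # assumed
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,        # restore the lowest-loss checkpoint
    metric_for_best_model="eval_loss",
    greater_is_better=False,
)

# Passed to Trainer(callbacks=...), this would also stop training once
# validation loss stops improving, instead of running all 127,240 steps.
callbacks = [EarlyStoppingCallback(early_stopping_patience=5)]
```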


### Framework versions

- Transformers 4.39.1
- Pytorch 2.1.0
- Datasets 2.18.0
- Tokenizers 0.15.2