---
tags:
- generated_from_keras_callback
model-index:
- name: juancopi81/mutopia_guitar_mmm
  results: []
---


# juancopi81/mutopia_guitar_mmm

This model was trained from scratch on an unknown dataset.
It achieves the following results at the end of training (epoch 14):
- Train Loss: 0.1448
- Validation Loss: 1.9892

## Model description

More information needed

## Intended uses & limitations

More information needed
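
As a placeholder while this section is completed, the sketch below shows one way the checkpoint could be loaded for inference. It assumes the repository hosts a TensorFlow causal language model with a compatible tokenizer, and the prompt string is purely illustrative; neither is confirmed by this card.

```python
from transformers import AutoTokenizer, TFAutoModelForCausalLM

# Assumption: the repo contains a TF causal-LM checkpoint plus a tokenizer
# (not confirmed by this card).
model_id = "juancopi81/mutopia_guitar_mmm"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForCausalLM.from_pretrained(model_id)

# Generate a continuation from an illustrative prompt (placeholder text,
# not a documented token of this model's vocabulary).
inputs = tokenizer("PIECE_START", return_tensors="tf")
outputs = model.generate(**inputs, max_length=128, do_sample=True, top_k=50)
print(tokenizer.decode(outputs[0]))
```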

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: AdamWeightDecay
  - learning rate: WarmUp over 1,000 steps to an initial rate of 5e-4, then PolynomialDecay (linear, power=1.0, cycle=False) over 1,025 steps down to 0.0
  - beta_1: 0.9, beta_2: 0.999, epsilon: 1e-08, amsgrad: False
  - decay: 0.0, weight_decay_rate: 0.01
- training_precision: mixed_float16
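
The serialized optimizer above appears to match what `transformers.create_optimizer` produces: a 1,000-step warmup to 5e-4 followed by a linear decay over 1,025 steps. The sketch below rebuilds an equivalent optimizer together with the mixed-precision policy; the `num_train_steps` value of 2,025 is inferred as warmup_steps + decay_steps and is an assumption, not taken from this card.

```python
import tensorflow as tf
from transformers import create_optimizer

# training_precision: mixed_float16
tf.keras.mixed_precision.set_global_policy("mixed_float16")

# Rebuild an optimizer equivalent to the serialized config above.
optimizer, lr_schedule = create_optimizer(
    init_lr=5e-4,             # initial_learning_rate
    num_train_steps=2025,     # assumption: 1000 warmup + 1025 decay steps
    num_warmup_steps=1000,
    weight_decay_rate=0.01,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    power=1.0,                # linear decay to end_learning_rate = 0.0
)
```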

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 2.2881     | 2.2059          | 0     |
| 1.7702     | 1.8533          | 1     |
| 1.4625     | 1.6948          | 2     |
| 1.2876     | 1.6865          | 3     |
| 1.1926     | 1.6414          | 4     |
| 1.1329     | 1.6360          | 5     |
| 1.1069     | 1.6448          | 6     |
| 1.0408     | 1.6207          | 7     |
| 0.8939     | 1.5837          | 8     |
| 0.7265     | 1.5901          | 9     |
| 0.5902     | 1.6261          | 10    |
| 0.4489     | 1.7007          | 11    |
| 0.3223     | 1.7940          | 12    |
| 0.2158     | 1.9032          | 13    |
| 0.1448     | 1.9892          | 14    |
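
Since these losses are (presumably) mean token-level cross-entropy in nats, they can be read as perplexities via `exp(loss)`: the best validation loss of 1.5837 at epoch 8 corresponds to a perplexity of roughly 4.87, while the final validation loss of 1.9892 at epoch 14 corresponds to roughly 7.31. The widening gap between train and validation loss after epoch 8 is the usual overfitting pattern.

```python
import math

# Perplexity from mean cross-entropy loss (assuming the loss is in nats).
def perplexity(loss: float) -> float:
    return math.exp(loss)

print(round(perplexity(1.5837), 2))  # ~4.87, best validation loss (epoch 8)
print(round(perplexity(1.9892), 2))  # ~7.31, final validation loss (epoch 14)
```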


### Framework versions

- Transformers 4.22.2
- TensorFlow 2.8.2
- Datasets 2.5.1
- Tokenizers 0.12.1