---
tags:
- generated_from_keras_callback
model-index:
- name: juancopi81/mutopia_guitar_mmm
  results: []
---

<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->

# juancopi81/mutopia_guitar_mmm

This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.5798
- Validation Loss: 1.5411
- Epoch: 0

## Model description

More information needed

## Intended uses & limitations

More information needed
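
Pending details from the author, below is a minimal loading sketch. It assumes the checkpoint exposes the standard Transformers TF causal-LM interface and ships its own tokenizer (suggested by the Keras training callback, but not confirmed by this card); the prompt string is a hypothetical placeholder, since the token vocabulary used for training is not documented here.

```python
from transformers import AutoTokenizer, TFAutoModelForCausalLM

model_id = "juancopi81/mutopia_guitar_mmm"

# Assumption: the repository contains TF weights and a tokenizer config
# compatible with the Auto classes.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForCausalLM.from_pretrained(model_id)

# Hypothetical prompt: replace with whatever token sequence the model was
# actually trained on (not documented in this card).
inputs = tokenizer("PROMPT_TOKENS_GO_HERE", return_tensors="tf")
outputs = model.generate(**inputs, max_length=256, do_sample=True)
print(tokenizer.decode(outputs[0]))
```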

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'WarmUp', 'config': {'initial_learning_rate': 5e-07, 'decay_schedule_fn': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 5e-07, 'decay_steps': 5726, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, '__passive_serialization__': True}, 'warmup_steps': 1000, 'power': 1.0, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: mixed_float16
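
For reference, the serialized optimizer config above can be reconstructed with the Transformers TF helpers. This is a sketch derived from the listed values, not the original training script:

```python
import tensorflow as tf
from transformers import AdamWeightDecay, WarmUp

# training_precision: mixed_float16
tf.keras.mixed_precision.set_global_policy("mixed_float16")

# Linear (power=1.0) decay of the learning rate from 5e-7 to 0.0 over 5,726 steps.
decay_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=5e-7,
    decay_steps=5726,
    end_learning_rate=0.0,
    power=1.0,
)

# 1,000 warmup steps before handing off to the decay schedule.
lr_schedule = WarmUp(
    initial_learning_rate=5e-7,
    decay_schedule_fn=decay_schedule,
    warmup_steps=1000,
)

# AdamWeightDecay with the betas, epsilon, and weight-decay rate from the config.
optimizer = AdamWeightDecay(
    learning_rate=lr_schedule,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-8,
)
```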

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 0.5798     | 1.5411          | 0     |


### Framework versions

- Transformers 4.22.1
- TensorFlow 2.8.2
- Datasets 2.5.1
- Tokenizers 0.12.1