---
tags:
- generated_from_keras_callback
model-index:
- name: juancopi81/mutopia_guitar_mmm
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# juancopi81/mutopia_guitar_mmm
This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.5368
- Validation Loss: 1.5482
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
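
No usage example is given in the original card. As a purely illustrative sketch, the checkpoint could be loaded with the Transformers TensorFlow API as below; this assumes the repository contains a TensorFlow causal-language-model checkpoint and a compatible tokenizer, which is not documented here and should be verified against the actual files.

```python
# Hypothetical usage sketch -- assumes a TF causal-LM checkpoint and tokenizer
# are present in the repo; this is not confirmed by the card itself.
from transformers import AutoTokenizer, TFAutoModelForCausalLM

repo_id = "juancopi81/mutopia_guitar_mmm"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = TFAutoModelForCausalLM.from_pretrained(repo_id)

# "PIECE_START" is a placeholder prompt token; replace it with whatever
# start token the tokenizer actually expects.
inputs = tokenizer("PIECE_START", return_tensors="tf")
outputs = model.generate(**inputs, max_length=128, do_sample=True, top_k=50)
print(tokenizer.decode(outputs[0]))
```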
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'WarmUp', 'config': {'initial_learning_rate': 5e-07, 'decay_schedule_fn': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 5e-07, 'decay_steps': 350, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, '__passive_serialization__': True}, 'warmup_steps': 1000, 'power': 1.0, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: mixed_float16
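
The optimizer dictionary above (AdamWeightDecay with a linear WarmUp into a PolynomialDecay, weight decay rate 0.01) matches the shape of what `transformers.create_optimizer` produces. A hedged reconstruction under that assumption is sketched below; the step counts are inferred from the logged schedule (`warmup_steps=1000`, `decay_steps=350`, i.e. roughly 1350 total steps) and should be adjusted to the real training run.

```python
# Hedged reconstruction of the logged optimizer config, assuming it was built
# with transformers' create_optimizer helper (not confirmed by the card).
import tensorflow as tf
from transformers import create_optimizer

optimizer, lr_schedule = create_optimizer(
    init_lr=5e-07,           # initial_learning_rate from the logged config
    num_train_steps=1350,    # inferred: warmup_steps (1000) + decay_steps (350)
    num_warmup_steps=1000,   # warmup_steps from the logged config
    weight_decay_rate=0.01,  # weight_decay_rate from the logged config
)
# beta_1=0.9, beta_2=0.999, epsilon=1e-08 are create_optimizer's defaults
# and already match the logged values.

# training_precision: mixed_float16 corresponds to the Keras mixed precision policy.
tf.keras.mixed_precision.set_global_policy("mixed_float16")
```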
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 0.5368 | 1.5482 | 0 |
### Framework versions
- Transformers 4.22.1
- TensorFlow 2.8.2
- Datasets 2.5.1
- Tokenizers 0.12.1