---
license: other
base_model: nvidia/mit-b0
tags:
- generated_from_keras_callback
model-index:
- name: greathero/mit-b0-finetuned-contrails-morethanx35newercontrailsdataset
  results: []
---
|
|
|
|
|
|
# greathero/mit-b0-finetuned-contrails-morethanx35newercontrailsdataset
|
|
|
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) for contrail semantic segmentation; the training dataset is not documented in this card.
At the end of training (epoch 49), it achieves the following results:
- Train Loss: 0.0016
- Validation Loss: 0.0026
- Validation Mean Iou: 0.9068
- Validation Mean Accuracy: 0.9377
- Validation Overall Accuracy: 0.9993
- Validation Accuracy Unlabeled: 1.0
- Validation Accuracy Notlabeled: 0.9997
- Validation Accuracy Otherclass: 1.0
- Validation Accuracy Contrail: 0.7513
- Validation Iou Unlabeled: 1.0
- Validation Iou Notlabeled: 0.9993
- Validation Iou Otherclass: 0.9993
- Validation Iou Contrail: 0.6286
- Epoch: 49
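As a sanity check, the reported Validation Mean Iou is the unweighted mean of the four per-class IoUs, (1.0 + 0.9993 + 0.9993 + 0.6286) / 4 ≈ 0.9068, and Validation Mean Accuracy is likewise the mean of the per-class accuracies.

Since the card otherwise lacks usage instructions, here is a minimal inference sketch, not the author's documented workflow. It assumes the checkpoint loads through the TensorFlow SegFormer classes in `transformers`; the input file name is a placeholder, and the class ordering (Unlabeled, Notlabeled, Otherclass, Contrail) is inferred from the metric names above and should be verified against the checkpoint's `id2label` mapping.

```python
from PIL import Image
import tensorflow as tf
from transformers import SegformerImageProcessor, TFSegformerForSemanticSegmentation

checkpoint = "greathero/mit-b0-finetuned-contrails-morethanx35newercontrailsdataset"

# If the repo does not ship a preprocessor config, fall back to the base model:
# SegformerImageProcessor.from_pretrained("nvidia/mit-b0")
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = TFSegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("satellite_frame.png").convert("RGB")  # placeholder file name
inputs = processor(images=image, return_tensors="tf")

# SegFormer outputs logits at 1/4 of the input resolution: (batch, num_labels, H/4, W/4)
logits = model(**inputs).logits

# Upsample to the original resolution and take the per-pixel argmax
logits = tf.transpose(logits, [0, 2, 3, 1])            # -> (batch, H/4, W/4, num_labels)
upsampled = tf.image.resize(logits, image.size[::-1])  # PIL size is (width, height)
segmentation = tf.argmax(upsampled, axis=-1)[0].numpy()

print(model.config.id2label)  # check which index corresponds to "Contrail"
```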
|
|
|
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
|
- optimizer: Adam
  - learning_rate: 6e-05
  - beta_1: 0.9
  - beta_2: 0.999
  - epsilon: 1e-07
  - amsgrad: False
  - weight_decay: None
  - clipnorm: None
  - global_clipnorm: None
  - clipvalue: None
  - use_ema: False
  - ema_momentum: 0.99
  - ema_overwrite_frequency: None
  - jit_compile: True
  - is_legacy_optimizer: False
- training_precision: float32
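For orientation, the sketch below shows how an optimizer with the configuration above could be rebuilt in TensorFlow/Keras and attached to a SegFormer model for fine-tuning. It is an illustrative reconstruction, not the author's training script: the loss, data pipeline, and label mapping are not documented here, and the four-class head is assumed from the metric names.

```python
import tensorflow as tf
from transformers import TFSegformerForSemanticSegmentation

# Adam settings copied from the hyperparameter list above
optimizer = tf.keras.optimizers.Adam(
    learning_rate=6e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    jit_compile=True,
)

# Assumed 4-class head (Unlabeled, Notlabeled, Otherclass, Contrail)
model = TFSegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",
    num_labels=4,
)

# transformers TF models compute their built-in loss when none is passed to compile()
model.compile(optimizer=optimizer)
# model.fit(train_dataset, validation_data=val_dataset, epochs=50)  # datasets not documented here
```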
|
|
|
### Training results

| Train Loss | Validation Loss | Validation Mean Iou | Validation Mean Accuracy | Validation Overall Accuracy | Validation Accuracy Unlabeled | Validation Accuracy Notlabeled | Validation Accuracy Otherclass | Validation Accuracy Contrail | Validation Iou Unlabeled | Validation Iou Notlabeled | Validation Iou Otherclass | Validation Iou Contrail | Epoch |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.2395 | 0.0367 | 0.2496 | 0.25 | 0.9983 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.9983 | 0.0 | 0.0 | 0 |
| 0.0286 | 0.0204 | 0.2496 | 0.25 | 0.9983 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.9983 | 0.0 | 0.0 | 1 |
| 0.0194 | 0.0156 | 0.2496 | 0.25 | 0.9983 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.9983 | 0.0 | 0.0 | 2 |
| 0.0157 | 0.0144 | 0.2496 | 0.25 | 0.9983 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.9983 | 0.0 | 0.0 | 3 |
| 0.0141 | 0.0137 | 0.2496 | 0.25 | 0.9983 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.9983 | 0.0 | 0.0 | 4 |
| 0.0133 | 0.0132 | 0.2496 | 0.25 | 0.9983 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.9983 | 0.0 | 0.0 | 5 |
| 0.0126 | 0.0125 | 0.2496 | 0.25 | 0.9983 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.9983 | 0.0 | 0.0 | 6 |
| 0.0118 | 0.0117 | 0.2496 | 0.25 | 0.9983 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.9983 | 0.0 | 0.0 | 7 |
| 0.0112 | 0.0110 | 0.2551 | 0.2555 | 0.9983 | 0.0 | 1.0000 | 0.0222 | 0.0 | 0.0 | 0.9983 | 0.0220 | 0.0 | 8 |
| 0.0106 | 0.0103 | 0.2744 | 0.2756 | 0.9983 | 0.0180 | 1.0000 | 0.0832 | 0.0013 | 0.0179 | 0.9983 | 0.0801 | 0.0013 | 9 |
| 0.0100 | 0.0094 | 0.3172 | 0.3225 | 0.9983 | 0.0527 | 1.0000 | 0.2344 | 0.0030 | 0.0521 | 0.9983 | 0.2154 | 0.0030 | 10 |
| 0.0091 | 0.0087 | 0.3117 | 0.3138 | 0.9983 | 0.1026 | 1.0000 | 0.1470 | 0.0055 | 0.1014 | 0.9983 | 0.1415 | 0.0055 | 11 |
| 0.0083 | 0.0076 | 0.3844 | 0.3941 | 0.9983 | 0.1540 | 1.0000 | 0.4189 | 0.0037 | 0.1516 | 0.9983 | 0.3840 | 0.0037 | 12 |
| 0.0073 | 0.0070 | 0.4402 | 0.4781 | 0.9983 | 0.1997 | 0.9999 | 0.6616 | 0.0511 | 0.1978 | 0.9983 | 0.5167 | 0.0479 | 13 |
| 0.0067 | 0.0062 | 0.5049 | 0.5454 | 0.9983 | 0.4868 | 1.0000 | 0.6893 | 0.0057 | 0.4702 | 0.9983 | 0.5456 | 0.0056 | 14 |
| 0.0060 | 0.0056 | 0.5133 | 0.5326 | 0.9983 | 0.4799 | 1.0000 | 0.6283 | 0.0221 | 0.4722 | 0.9983 | 0.5608 | 0.0218 | 15 |
| 0.0055 | 0.0056 | 0.5975 | 0.6504 | 0.9984 | 0.7018 | 0.9999 | 0.8058 | 0.0942 | 0.6731 | 0.9984 | 0.6317 | 0.0868 | 16 |
| 0.0051 | 0.0050 | 0.5800 | 0.6003 | 0.9984 | 0.6893 | 0.9999 | 0.6644 | 0.0476 | 0.6707 | 0.9984 | 0.6048 | 0.0462 | 17 |
| 0.0048 | 0.0047 | 0.6598 | 0.6878 | 0.9984 | 0.8252 | 0.9998 | 0.7809 | 0.1453 | 0.8013 | 0.9984 | 0.7095 | 0.1298 | 18 |
| 0.0045 | 0.0045 | 0.6596 | 0.7122 | 0.9984 | 0.9057 | 0.9998 | 0.7739 | 0.1693 | 0.8117 | 0.9984 | 0.6774 | 0.1508 | 19 |
| 0.0042 | 0.0042 | 0.6956 | 0.7259 | 0.9985 | 0.8669 | 0.9998 | 0.8294 | 0.2074 | 0.8609 | 0.9985 | 0.7396 | 0.1835 | 20 |
| 0.0039 | 0.0045 | 0.6824 | 0.6971 | 0.9985 | 0.8710 | 0.9999 | 0.7933 | 0.1243 | 0.8579 | 0.9985 | 0.7546 | 0.1186 | 21 |
| 0.0038 | 0.0041 | 0.7516 | 0.7748 | 0.9985 | 0.9612 | 0.9998 | 0.8779 | 0.2605 | 0.9397 | 0.9985 | 0.8395 | 0.2287 | 22 |
| 0.0035 | 0.0038 | 0.7683 | 0.8089 | 0.9986 | 0.9667 | 0.9997 | 0.9334 | 0.3359 | 0.9356 | 0.9986 | 0.8546 | 0.2843 | 23 |
| 0.0034 | 0.0037 | 0.7905 | 0.8212 | 0.9986 | 0.9639 | 0.9996 | 0.9182 | 0.4031 | 0.9508 | 0.9986 | 0.8856 | 0.3270 | 24 |
| 0.0032 | 0.0037 | 0.7938 | 0.8154 | 0.9987 | 0.9515 | 0.9997 | 0.9112 | 0.3992 | 0.9469 | 0.9987 | 0.8939 | 0.3358 | 25 |
| 0.0031 | 0.0036 | 0.8250 | 0.8524 | 0.9988 | 0.9806 | 0.9996 | 0.9431 | 0.4864 | 0.9772 | 0.9988 | 0.9315 | 0.3924 | 26 |
| 0.0030 | 0.0035 | 0.8204 | 0.8392 | 0.9987 | 0.9736 | 0.9997 | 0.9820 | 0.4015 | 0.9736 | 0.9987 | 0.9659 | 0.3435 | 27 |
| 0.0029 | 0.0033 | 0.8431 | 0.8689 | 0.9988 | 0.9917 | 0.9997 | 0.9834 | 0.5011 | 0.9910 | 0.9988 | 0.9653 | 0.4172 | 28 |
| 0.0027 | 0.0034 | 0.8337 | 0.8539 | 0.9988 | 0.9945 | 0.9997 | 0.9875 | 0.4341 | 0.9945 | 0.9988 | 0.9648 | 0.3767 | 29 |
| 0.0027 | 0.0033 | 0.8518 | 0.8834 | 0.9988 | 1.0 | 0.9996 | 0.9931 | 0.5411 | 0.9972 | 0.9988 | 0.9802 | 0.4311 | 30 |
| 0.0025 | 0.0031 | 0.8622 | 0.8848 | 0.9990 | 0.9986 | 0.9997 | 0.9958 | 0.5449 | 0.9986 | 0.9990 | 0.9890 | 0.4622 | 31 |
| 0.0025 | 0.0031 | 0.8667 | 0.8975 | 0.9989 | 0.9986 | 0.9996 | 0.9986 | 0.5933 | 0.9979 | 0.9989 | 0.9897 | 0.4801 | 32 |
| 0.0024 | 0.0031 | 0.8607 | 0.8791 | 0.9990 | 1.0 | 0.9998 | 0.9986 | 0.5181 | 1.0 | 0.9990 | 0.9877 | 0.4563 | 33 |
| 0.0022 | 0.0031 | 0.8783 | 0.9102 | 0.9990 | 1.0 | 0.9996 | 0.9917 | 0.6497 | 1.0 | 0.9990 | 0.9876 | 0.5266 | 34 |
| 0.0022 | 0.0029 | 0.8780 | 0.9016 | 0.9991 | 1.0 | 0.9997 | 0.9931 | 0.6138 | 1.0 | 0.9991 | 0.9903 | 0.5225 | 35 |
| 0.0022 | 0.0030 | 0.8853 | 0.9238 | 0.9991 | 1.0 | 0.9996 | 0.9986 | 0.6969 | 0.9972 | 0.9991 | 0.9965 | 0.5484 | 36 |
| 0.0021 | 0.0030 | 0.8859 | 0.9156 | 0.9991 | 1.0 | 0.9996 | 0.9986 | 0.6641 | 1.0 | 0.9991 | 0.9979 | 0.5466 | 37 |
| 0.0021 | 0.0029 | 0.8865 | 0.9165 | 0.9991 | 1.0 | 0.9997 | 1.0 | 0.6662 | 0.9993 | 0.9991 | 0.9952 | 0.5526 | 38 |
| 0.0021 | 0.0028 | 0.8932 | 0.9227 | 0.9992 | 1.0 | 0.9997 | 0.9986 | 0.6926 | 1.0 | 0.9992 | 0.9965 | 0.5771 | 39 |
| 0.0020 | 0.0029 | 0.8929 | 0.9235 | 0.9992 | 1.0 | 0.9997 | 0.9986 | 0.6959 | 1.0 | 0.9992 | 0.9979 | 0.5747 | 40 |
| 0.0020 | 0.0028 | 0.8935 | 0.9252 | 0.9992 | 1.0 | 0.9996 | 0.9986 | 0.7026 | 1.0 | 0.9991 | 0.9979 | 0.5767 | 41 |
| 0.0019 | 0.0028 | 0.8934 | 0.9144 | 0.9992 | 1.0 | 0.9998 | 0.9986 | 0.6590 | 1.0 | 0.9992 | 0.9986 | 0.5758 | 42 |
| 0.0019 | 0.0028 | 0.8945 | 0.9174 | 0.9992 | 1.0 | 0.9997 | 0.9986 | 0.6713 | 1.0 | 0.9992 | 0.9979 | 0.5807 | 43 |
| 0.0018 | 0.0027 | 0.9009 | 0.9327 | 0.9992 | 1.0 | 0.9997 | 0.9986 | 0.7324 | 1.0 | 0.9992 | 0.9979 | 0.6065 | 44 |
| 0.0019 | 0.0027 | 0.9028 | 0.9385 | 0.9992 | 1.0 | 0.9996 | 1.0 | 0.7543 | 1.0 | 0.9992 | 0.9993 | 0.6125 | 45 |
| 0.0017 | 0.0028 | 0.9048 | 0.9362 | 0.9992 | 1.0 | 0.9997 | 1.0 | 0.7453 | 1.0 | 0.9992 | 1.0 | 0.6199 | 46 |
| 0.0017 | 0.0029 | 0.9043 | 0.9362 | 0.9992 | 1.0 | 0.9997 | 0.9986 | 0.7466 | 1.0 | 0.9992 | 0.9979 | 0.6202 | 47 |
| 0.0017 | 0.0027 | 0.9060 | 0.9365 | 0.9993 | 1.0 | 0.9997 | 0.9986 | 0.7477 | 1.0 | 0.9993 | 0.9979 | 0.6270 | 48 |
| 0.0016 | 0.0026 | 0.9068 | 0.9377 | 0.9993 | 1.0 | 0.9997 | 1.0 | 0.7513 | 1.0 | 0.9993 | 0.9993 | 0.6286 | 49 |
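The column names above (overall/mean accuracy plus per-category accuracy and IoU) match the output of the `mean_iou` metric from the `evaluate` library, which the standard SegFormer fine-tuning examples use; whether that exact code produced these numbers is an assumption. A minimal sketch of computing the same quantities from predicted and reference label maps:

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Dummy 4-class label maps standing in for real predictions and ground truth
pred_map = np.random.randint(0, 4, size=(256, 256))
label_map = np.random.randint(0, 4, size=(256, 256))

results = metric.compute(
    predictions=[pred_map],
    references=[label_map],
    num_labels=4,
    ignore_index=255,      # assumed ignore value; adjust to the dataset
    reduce_labels=False,
)
print(results["mean_iou"], results["per_category_iou"])
```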
|
|
|
|
|
### Framework versions

- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.15.0
- Tokenizers 0.15.0
|
|