---
license: apache-2.0
base_model: microsoft/conditional-detr-resnet-50
tags:
- generated_from_trainer
model-index:
- name: cdetr-mist1-brain-gt-tumors-8ah-6l
  results: []
---
# cdetr-mist1-brain-gt-tumors-8ah-6l
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 2.7389
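Since the card does not yet include a usage snippet, here is a minimal inference sketch using the standard 🤗 Transformers object-detection API. The hub id of this fine-tuned checkpoint is not stated above, so `model_id` points at the base `microsoft/conditional-detr-resnet-50` weights as a stand-in — replace it with the path or hub id of this checkpoint to use the fine-tuned model.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Assumption: swap this for the hub id or local path of the fine-tuned checkpoint.
model_id = "microsoft/conditional-detr-resnet-50"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForObjectDetection.from_pretrained(model_id)
model.eval()

# Stand-in image; load a real scan with Image.open(...) instead.
image = Image.new("RGB", (640, 480))

with torch.no_grad():
    inputs = processor(images=image, return_tensors="pt")
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) detections above a threshold.
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=[image.size[::-1]]
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```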
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
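With the `linear` scheduler, the learning rate decays from 1e-05 toward 0 over the run's 11 500 optimizer steps (115 steps per epoch × 100 epochs, matching the Step column in the results table). A minimal sketch of that schedule, assuming zero warmup steps since none are listed:

```python
def linear_lr(step, base_lr=1e-05, total_steps=11500, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear warmup + decay schedule."""
    if step < warmup_steps:
        # Linear warmup from 0 to base_lr (unused here: warmup_steps=0 assumed).
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr at the end of warmup to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

print(linear_lr(0))      # → 1e-05
print(linear_lr(5750))   # → 5e-06 (halfway through training)
print(linear_lr(11500))  # → 0.0
```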
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
5.4149 | 1.0 | 115 | 4.3974 |
3.9453 | 2.0 | 230 | 3.6520 |
3.7269 | 3.0 | 345 | 3.7602 |
3.5898 | 4.0 | 460 | 3.5671 |
3.486 | 5.0 | 575 | 3.4912 |
3.4073 | 6.0 | 690 | 3.4095 |
3.4181 | 7.0 | 805 | 3.3183 |
3.3603 | 8.0 | 920 | 3.1111 |
3.2777 | 9.0 | 1035 | 3.1992 |
3.2851 | 10.0 | 1150 | 3.3997 |
3.266 | 11.0 | 1265 | 3.2861 |
3.2803 | 12.0 | 1380 | 3.1813 |
3.1733 | 13.0 | 1495 | 2.9838 |
3.2094 | 14.0 | 1610 | 3.1175 |
3.1718 | 15.0 | 1725 | 3.0064 |
3.1303 | 16.0 | 1840 | 3.0869 |
3.0897 | 17.0 | 1955 | 3.0306 |
3.0233 | 18.0 | 2070 | 2.9479 |
3.0156 | 19.0 | 2185 | 2.9145 |
3.0277 | 20.0 | 2300 | 2.8919 |
3.0847 | 21.0 | 2415 | 2.9321 |
3.0333 | 22.0 | 2530 | 2.9128 |
3.0126 | 23.0 | 2645 | 2.8627 |
2.9968 | 24.0 | 2760 | 3.0186 |
3.0295 | 25.0 | 2875 | 3.0148 |
3.0294 | 26.0 | 2990 | 3.0341 |
3.0395 | 27.0 | 3105 | 2.9997 |
3.0445 | 28.0 | 3220 | 3.0575 |
2.9761 | 29.0 | 3335 | 2.9707 |
3.0075 | 30.0 | 3450 | 2.9392 |
3.0198 | 31.0 | 3565 | 2.9122 |
2.9782 | 32.0 | 3680 | 2.9471 |
2.9773 | 33.0 | 3795 | 3.0306 |
2.9528 | 34.0 | 3910 | 2.8513 |
2.9228 | 35.0 | 4025 | 2.8997 |
2.9221 | 36.0 | 4140 | 2.8646 |
2.8933 | 37.0 | 4255 | 2.8871 |
2.8925 | 38.0 | 4370 | 2.9407 |
2.9069 | 39.0 | 4485 | 2.9625 |
2.9246 | 40.0 | 4600 | 2.9946 |
2.9089 | 41.0 | 4715 | 2.8936 |
2.8573 | 42.0 | 4830 | 2.8272 |
2.8768 | 43.0 | 4945 | 2.9868 |
2.9666 | 44.0 | 5060 | 2.9200 |
2.958 | 45.0 | 5175 | 2.8755 |
2.8923 | 46.0 | 5290 | 2.8518 |
2.9204 | 47.0 | 5405 | 2.9000 |
2.9644 | 48.0 | 5520 | 2.8969 |
2.9011 | 49.0 | 5635 | 2.7918 |
2.9329 | 50.0 | 5750 | 2.9139 |
2.9031 | 51.0 | 5865 | 2.7796 |
2.9029 | 52.0 | 5980 | 2.8025 |
2.9555 | 53.0 | 6095 | 2.9121 |
2.9366 | 54.0 | 6210 | 2.9035 |
2.8871 | 55.0 | 6325 | 2.8759 |
2.863 | 56.0 | 6440 | 2.8540 |
2.8897 | 57.0 | 6555 | 2.8401 |
2.828 | 58.0 | 6670 | 2.8590 |
2.8221 | 59.0 | 6785 | 2.9255 |
2.835 | 60.0 | 6900 | 2.9809 |
2.886 | 61.0 | 7015 | 2.9907 |
2.8227 | 62.0 | 7130 | 2.8283 |
2.7864 | 63.0 | 7245 | 2.8258 |
2.8179 | 64.0 | 7360 | 2.9504 |
2.7944 | 65.0 | 7475 | 2.8042 |
2.7986 | 66.0 | 7590 | 2.8307 |
2.7567 | 67.0 | 7705 | 2.8060 |
2.7552 | 68.0 | 7820 | 2.7994 |
2.7933 | 69.0 | 7935 | 2.8493 |
2.7393 | 70.0 | 8050 | 2.8409 |
2.7357 | 71.0 | 8165 | 2.8086 |
2.7264 | 72.0 | 8280 | 2.7773 |
2.7614 | 73.0 | 8395 | 2.8937 |
2.7279 | 74.0 | 8510 | 2.8887 |
2.745 | 75.0 | 8625 | 2.8274 |
2.7225 | 76.0 | 8740 | 2.7971 |
2.7094 | 77.0 | 8855 | 2.8685 |
2.7306 | 78.0 | 8970 | 2.8482 |
2.6844 | 79.0 | 9085 | 2.7372 |
2.6949 | 80.0 | 9200 | 2.8149 |
2.7342 | 81.0 | 9315 | 2.7647 |
2.6813 | 82.0 | 9430 | 2.7666 |
2.7161 | 83.0 | 9545 | 2.8437 |
2.6953 | 84.0 | 9660 | 2.7895 |
2.6714 | 85.0 | 9775 | 2.7683 |
2.6611 | 86.0 | 9890 | 2.7004 |
2.6714 | 87.0 | 10005 | 2.7183 |
2.6655 | 88.0 | 10120 | 2.7043 |
2.6509 | 89.0 | 10235 | 2.7705 |
2.6266 | 90.0 | 10350 | 2.7152 |
2.6677 | 91.0 | 10465 | 2.7295 |
2.6438 | 92.0 | 10580 | 2.7018 |
2.6267 | 93.0 | 10695 | 2.7063 |
2.6286 | 94.0 | 10810 | 2.7798 |
2.6043 | 95.0 | 10925 | 2.7712 |
2.6188 | 96.0 | 11040 | 2.7614 |
2.6028 | 97.0 | 11155 | 2.7405 |
2.621 | 98.0 | 11270 | 2.7415 |
2.61 | 99.0 | 11385 | 2.7415 |
2.6164 | 100.0 | 11500 | 2.7389 |
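Note that the validation loss bottoms out at 2.7004 around epoch 86 and drifts slightly upward afterwards, so the final-epoch loss (2.7389) is not the best observed; selecting the checkpoint with the lowest validation loss may be preferable, assuming per-epoch checkpoints were saved. A minimal sketch of scanning a log like the one above for the best epoch (only a few rows are reproduced here):

```python
# A few (epoch, validation_loss) rows copied from the table above.
log = [
    (79, 2.7372),
    (86, 2.7004),
    (88, 2.7043),
    (92, 2.7018),
    (100, 2.7389),
]

# Pick the epoch with the lowest validation loss.
best_epoch, best_loss = min(log, key=lambda row: row[1])
print(best_epoch, best_loss)  # → 86 2.7004
```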
### Framework versions
- Transformers 4.35.0
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.14.1