
# AhamadShaik/SegFormer_RESIZE_x.5

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset. It achieves the following results (final epoch) on the evaluation set:

- Train Loss: 0.0497
- Train Dice Coef: 0.8670
- Train IoU: 0.7679
- Validation Loss: 0.0477
- Validation Dice Coef: 0.8831
- Validation IoU: 0.7923
- Train LR: 1e-10
- Epoch: 99
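
The checkpoint was trained with TensorFlow (see the framework versions below), so it can be loaded through the Transformers TF classes. The following is a minimal inference sketch, not a documented usage recipe: the image path and the post-processing are illustrative assumptions, and the label semantics of the unknown training dataset are not recorded in this card.

```python
# Minimal inference sketch (assumptions: standard SegFormer preprocessing;
# "example.png" is a hypothetical input; label meanings are undocumented).
import tensorflow as tf
from PIL import Image
from transformers import SegformerImageProcessor, TFSegformerForSemanticSegmentation

processor = SegformerImageProcessor()  # training-time preprocessing settings are not documented
model = TFSegformerForSemanticSegmentation.from_pretrained("AhamadShaik/SegFormer_RESIZE_x.5")

image = Image.open("example.png").convert("RGB")
inputs = processor(images=image, return_tensors="tf")

# SegFormer emits logits at 1/4 of the processed resolution: (batch, num_labels, H/4, W/4).
logits = model(**inputs).logits

# Upsample to the original image size, then take a per-pixel argmax.
# (For a single-logit binary head, threshold a sigmoid instead of using argmax.)
logits = tf.image.resize(tf.transpose(logits, [0, 2, 3, 1]), image.size[::-1])
mask = tf.argmax(logits, axis=-1)[0].numpy()
```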

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 1e-10, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
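
The optimizer dictionary above is a serialized Keras Adam configuration. For readability, this is how the equivalent optimizer would be constructed directly; note that the recorded `learning_rate` of 1e-10 is the value at the end of training (see the LR column in the results table), not the initial rate:

```python
# Reconstruction of the logged optimizer config (a sketch; the model and loss
# it was applied to are not documented in this card).
import tensorflow as tf

optimizer = tf.keras.optimizers.Adam(
    learning_rate=1e-10,  # final recorded value, not the starting LR (which was 1e-04)
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    jit_compile=True,
)
```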

### Training results

| Train Loss | Train Dice Coef | Train IoU | Validation Loss | Validation Dice Coef | Validation IoU | Train LR | Epoch |
|:----------:|:---------------:|:---------:|:---------------:|:--------------------:|:--------------:|:--------:|:-----:|
| 0.2269 | 0.5814 | 0.4248 | 0.1165 | 0.7019 | 0.5504 | 1e-04 | 0 |
| 0.1305 | 0.6934 | 0.5423 | 0.0877 | 0.7790 | 0.6433 | 1e-04 | 1 |
| 0.1116 | 0.7311 | 0.5867 | 0.0729 | 0.8299 | 0.7120 | 1e-04 | 2 |
| 0.0985 | 0.7624 | 0.6241 | 0.0648 | 0.8555 | 0.7491 | 1e-04 | 3 |
| 0.0918 | 0.7766 | 0.6431 | 0.0711 | 0.8271 | 0.7098 | 1e-04 | 4 |
| 0.0869 | 0.7877 | 0.6566 | 0.0607 | 0.8552 | 0.7492 | 1e-04 | 5 |
| 0.0818 | 0.7993 | 0.6722 | 0.0555 | 0.8665 | 0.7662 | 1e-04 | 6 |
| 0.0753 | 0.8136 | 0.6906 | 0.0544 | 0.8701 | 0.7719 | 1e-04 | 7 |
| 0.0719 | 0.8216 | 0.7016 | 0.0530 | 0.8725 | 0.7754 | 1e-04 | 8 |
| 0.0715 | 0.8221 | 0.7027 | 0.0588 | 0.8610 | 0.7579 | 1e-04 | 9 |
| 0.0673 | 0.8304 | 0.7139 | 0.0502 | 0.8766 | 0.7820 | 1e-04 | 10 |
| 0.0634 | 0.8388 | 0.7260 | 0.0520 | 0.8757 | 0.7806 | 1e-04 | 11 |
| 0.0617 | 0.8435 | 0.7328 | 0.0513 | 0.8776 | 0.7831 | 1e-04 | 12 |
| 0.0731 | 0.8230 | 0.7046 | 0.0540 | 0.8722 | 0.7752 | 1e-04 | 13 |
| 0.0612 | 0.8439 | 0.7335 | 0.0523 | 0.8749 | 0.7793 | 1e-04 | 14 |
| 0.0568 | 0.8534 | 0.7473 | 0.0537 | 0.8779 | 0.7842 | 1e-04 | 15 |
| 0.0549 | 0.8569 | 0.7529 | 0.0486 | 0.8817 | 0.7903 | 5e-06 | 16 |
| 0.0526 | 0.8607 | 0.7584 | 0.0470 | 0.8849 | 0.7953 | 5e-06 | 17 |
| 0.0516 | 0.8641 | 0.7633 | 0.0478 | 0.8844 | 0.7946 | 5e-06 | 18 |
| 0.0523 | 0.8625 | 0.7610 | 0.0483 | 0.8817 | 0.7901 | 5e-06 | 19 |
| 0.0507 | 0.8662 | 0.7661 | 0.0475 | 0.8842 | 0.7941 | 5e-06 | 20 |
| 0.0504 | 0.8664 | 0.7665 | 0.0477 | 0.8832 | 0.7924 | 5e-06 | 21 |
| 0.0504 | 0.8674 | 0.7682 | 0.0474 | 0.8833 | 0.7926 | 5e-06 | 22 |
| 0.0501 | 0.8655 | 0.7657 | 0.0475 | 0.8833 | 0.7926 | 2.5e-07 | 23 |
| 0.0498 | 0.8677 | 0.7687 | 0.0471 | 0.8845 | 0.7944 | 2.5e-07 | 24 |
| 0.0504 | 0.8665 | 0.7672 | 0.0470 | 0.8846 | 0.7946 | 2.5e-07 | 25 |
| 0.0502 | 0.8677 | 0.7686 | 0.0472 | 0.8844 | 0.7943 | 2.5e-07 | 26 |
| 0.0502 | 0.8662 | 0.7667 | 0.0477 | 0.8833 | 0.7925 | 2.5e-07 | 27 |
| 0.0507 | 0.8667 | 0.7670 | 0.0462 | 0.8853 | 0.7957 | 1.25e-08 | 28 |
| 0.0495 | 0.8685 | 0.7701 | 0.0475 | 0.8841 | 0.7937 | 1.25e-08 | 29 |
| 0.0503 | 0.8669 | 0.7676 | 0.0472 | 0.8840 | 0.7936 | 1.25e-08 | 30 |
| 0.0495 | 0.8689 | 0.7704 | 0.0471 | 0.8854 | 0.7959 | 1.25e-08 | 31 |
| 0.0496 | 0.8681 | 0.7693 | 0.0474 | 0.8844 | 0.7942 | 1.25e-08 | 32 |
| 0.0502 | 0.8665 | 0.7667 | 0.0480 | 0.8823 | 0.7912 | 1.25e-08 | 33 |
| 0.0499 | 0.8663 | 0.7668 | 0.0467 | 0.8852 | 0.7955 | 6.25e-10 | 34 |
| 0.0498 | 0.8668 | 0.7676 | 0.0471 | 0.8844 | 0.7943 | 6.25e-10 | 35 |
| 0.0505 | 0.8653 | 0.7653 | 0.0480 | 0.8821 | 0.7908 | 6.25e-10 | 36 |
| 0.0497 | 0.8687 | 0.7702 | 0.0471 | 0.8847 | 0.7947 | 6.25e-10 | 37 |
| 0.0506 | 0.8660 | 0.7662 | 0.0476 | 0.8838 | 0.7935 | 6.25e-10 | 38 |
| 0.0499 | 0.8678 | 0.7688 | 0.0473 | 0.8849 | 0.7951 | 1e-10 | 39 |
| 0.0499 | 0.8668 | 0.7676 | 0.0476 | 0.8839 | 0.7935 | 1e-10 | 40 |
| 0.0500 | 0.8672 | 0.7679 | 0.0478 | 0.8829 | 0.7921 | 1e-10 | 41 |
| 0.0500 | 0.8670 | 0.7677 | 0.0468 | 0.8845 | 0.7944 | 1e-10 | 42 |
| 0.0502 | 0.8668 | 0.7673 | 0.0474 | 0.8837 | 0.7932 | 1e-10 | 43 |
| 0.0500 | 0.8666 | 0.7671 | 0.0476 | 0.8832 | 0.7926 | 1e-10 | 44 |
| 0.0495 | 0.8682 | 0.7695 | 0.0474 | 0.8839 | 0.7935 | 1e-10 | 45 |
| 0.0495 | 0.8680 | 0.7690 | 0.0474 | 0.8842 | 0.7938 | 1e-10 | 46 |
| 0.0502 | 0.8666 | 0.7671 | 0.0474 | 0.8840 | 0.7937 | 1e-10 | 47 |
| 0.0501 | 0.8668 | 0.7673 | 0.0473 | 0.8840 | 0.7936 | 1e-10 | 48 |
| 0.0498 | 0.8676 | 0.7686 | 0.0470 | 0.8842 | 0.7939 | 1e-10 | 49 |
| 0.0495 | 0.8677 | 0.7690 | 0.0477 | 0.8831 | 0.7924 | 1e-10 | 50 |
| 0.0496 | 0.8694 | 0.7713 | 0.0471 | 0.8846 | 0.7945 | 1e-10 | 51 |
| 0.0496 | 0.8686 | 0.7699 | 0.0467 | 0.8851 | 0.7953 | 1e-10 | 52 |
| 0.0495 | 0.8688 | 0.7701 | 0.0469 | 0.8848 | 0.7949 | 1e-10 | 53 |
| 0.0497 | 0.8677 | 0.7686 | 0.0468 | 0.8848 | 0.7950 | 1e-10 | 54 |
| 0.0492 | 0.8689 | 0.7704 | 0.0473 | 0.8845 | 0.7944 | 1e-10 | 55 |
| 0.0498 | 0.8678 | 0.7687 | 0.0473 | 0.8837 | 0.7932 | 1e-10 | 56 |
| 0.0502 | 0.8668 | 0.7672 | 0.0471 | 0.8838 | 0.7934 | 1e-10 | 57 |
| 0.0497 | 0.8670 | 0.7676 | 0.0469 | 0.8840 | 0.7936 | 1e-10 | 58 |
| 0.0500 | 0.8680 | 0.7690 | 0.0473 | 0.8837 | 0.7933 | 1e-10 | 59 |
| 0.0497 | 0.8681 | 0.7692 | 0.0467 | 0.8840 | 0.7937 | 1e-10 | 60 |
| 0.0496 | 0.8685 | 0.7694 | 0.0474 | 0.8844 | 0.7944 | 1e-10 | 61 |
| 0.0506 | 0.8659 | 0.7660 | 0.0474 | 0.8838 | 0.7933 | 1e-10 | 62 |
| 0.0496 | 0.8677 | 0.7689 | 0.0472 | 0.8850 | 0.7953 | 1e-10 | 63 |
| 0.0498 | 0.8669 | 0.7675 | 0.0468 | 0.8836 | 0.7930 | 1e-10 | 64 |
| 0.0498 | 0.8675 | 0.7684 | 0.0471 | 0.8843 | 0.7942 | 1e-10 | 65 |
| 0.0499 | 0.8680 | 0.7691 | 0.0472 | 0.8842 | 0.7941 | 1e-10 | 66 |
| 0.0499 | 0.8677 | 0.7688 | 0.0474 | 0.8835 | 0.7928 | 1e-10 | 67 |
| 0.0501 | 0.8655 | 0.7656 | 0.0466 | 0.8855 | 0.7960 | 1e-10 | 68 |
| 0.0499 | 0.8673 | 0.7682 | 0.0480 | 0.8825 | 0.7913 | 1e-10 | 69 |
| 0.0494 | 0.8682 | 0.7698 | 0.0470 | 0.8851 | 0.7955 | 1e-10 | 70 |
| 0.0499 | 0.8676 | 0.7685 | 0.0475 | 0.8837 | 0.7932 | 1e-10 | 71 |
| 0.0500 | 0.8672 | 0.7681 | 0.0467 | 0.8855 | 0.7960 | 1e-10 | 72 |
| 0.0502 | 0.8662 | 0.7664 | 0.0473 | 0.8829 | 0.7919 | 1e-10 | 73 |
| 0.0498 | 0.8670 | 0.7679 | 0.0474 | 0.8846 | 0.7947 | 1e-10 | 74 |
| 0.0501 | 0.8665 | 0.7671 | 0.0480 | 0.8827 | 0.7916 | 1e-10 | 75 |
| 0.0493 | 0.8677 | 0.7689 | 0.0473 | 0.8836 | 0.7930 | 1e-10 | 76 |
| 0.0496 | 0.8678 | 0.7687 | 0.0474 | 0.8843 | 0.7942 | 1e-10 | 77 |
| 0.0495 | 0.8679 | 0.7689 | 0.0472 | 0.8844 | 0.7943 | 1e-10 | 78 |
| 0.0496 | 0.8679 | 0.7690 | 0.0470 | 0.8846 | 0.7945 | 1e-10 | 79 |
| 0.0501 | 0.8673 | 0.7683 | 0.0473 | 0.8836 | 0.7931 | 1e-10 | 80 |
| 0.0497 | 0.8679 | 0.7691 | 0.0471 | 0.8839 | 0.7936 | 1e-10 | 81 |
| 0.0496 | 0.8681 | 0.7693 | 0.0475 | 0.8836 | 0.7931 | 1e-10 | 82 |
| 0.0495 | 0.8689 | 0.7703 | 0.0474 | 0.8836 | 0.7930 | 1e-10 | 83 |
| 0.0496 | 0.8685 | 0.7697 | 0.0470 | 0.8845 | 0.7945 | 1e-10 | 84 |
| 0.0504 | 0.8665 | 0.7669 | 0.0477 | 0.8833 | 0.7926 | 1e-10 | 85 |
| 0.0496 | 0.8677 | 0.7690 | 0.0478 | 0.8830 | 0.7921 | 1e-10 | 86 |
| 0.0493 | 0.8682 | 0.7694 | 0.0470 | 0.8837 | 0.7931 | 1e-10 | 87 |
| 0.0495 | 0.8677 | 0.7688 | 0.0475 | 0.8835 | 0.7929 | 1e-10 | 88 |
| 0.0499 | 0.8668 | 0.7673 | 0.0471 | 0.8844 | 0.7942 | 1e-10 | 89 |
| 0.0495 | 0.8682 | 0.7694 | 0.0476 | 0.8836 | 0.7930 | 1e-10 | 90 |
| 0.0499 | 0.8672 | 0.7679 | 0.0475 | 0.8835 | 0.7929 | 1e-10 | 91 |
| 0.0496 | 0.8676 | 0.7685 | 0.0478 | 0.8831 | 0.7923 | 1e-10 | 92 |
| 0.0500 | 0.8677 | 0.7686 | 0.0475 | 0.8838 | 0.7934 | 1e-10 | 93 |
| 0.0495 | 0.8677 | 0.7686 | 0.0471 | 0.8837 | 0.7931 | 1e-10 | 94 |
| 0.0494 | 0.8680 | 0.7691 | 0.0473 | 0.8835 | 0.7930 | 1e-10 | 95 |
| 0.0500 | 0.8656 | 0.7659 | 0.0465 | 0.8848 | 0.7950 | 1e-10 | 96 |
| 0.0494 | 0.8678 | 0.7690 | 0.0477 | 0.8821 | 0.7907 | 1e-10 | 97 |
| 0.0498 | 0.8681 | 0.7691 | 0.0475 | 0.8843 | 0.7942 | 1e-10 | 98 |
| 0.0497 | 0.8670 | 0.7679 | 0.0477 | 0.8831 | 0.7923 | 1e-10 | 99 |
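
One observation on the Train LR column: the rate falls from 1e-04 in steps of roughly 1/20 (1e-04 → 5e-06 → 2.5e-07 → 1.25e-08 → 6.25e-10) and then sits at a floor of 1e-10 from epoch 39 onward. This pattern is consistent with a plateau-based scheduler such as Keras's `ReduceLROnPlateau` with `factor=0.05` and `min_lr=1e-10`, but the actual callback is not documented; the sketch below is an inference from the table, not the recorded training setup.

```python
# Hypothetical reconstruction of the LR schedule implied by the table
# (assumption: ReduceLROnPlateau; the monitor and patience values are guesses).
import tensorflow as tf

lr_callback = tf.keras.callbacks.ReduceLROnPlateau(
    monitor="val_loss",
    factor=0.05,   # matches the observed 1e-04 -> 5e-06 -> 2.5e-07 -> ... drops
    patience=4,    # approximate, inferred from the spacing of the drops
    min_lr=1e-10,  # matches the floor reached from epoch 39 onward
)
# model.fit(..., epochs=100, callbacks=[lr_callback])
```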

### Framework versions

- Transformers 4.27.1
- TensorFlow 2.11.0
- Datasets 2.10.1
- Tokenizers 0.13.2