segformer-b0-finetuned-morphpadver1-no_ckp-2

This model is a fine-tuned version of nvidia/mit-b0 on the NICOPOI-9/morphpad_512_4class dataset. It achieves the following results on the evaluation set (an example of loading the checkpoint for inference follows the metrics list):

  • Loss: 0.0023
  • Mean Iou: 0.9987
  • Mean Accuracy: 0.9994
  • Overall Accuracy: 0.9994
  • Accuracy 0-0: 0.9992
  • Accuracy 0-90: 0.9991
  • Accuracy 90-0: 0.9994
  • Accuracy 90-90: 0.9998
  • Iou 0-0: 0.9991
  • Iou 0-90: 0.9984
  • Iou 90-0: 0.9984
  • Iou 90-90: 0.9991
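
The sketch below shows one way to run this checkpoint for semantic segmentation with the Transformers SegFormer classes. It is a minimal, hedged example: the checkpoint name comes from this card, but the input image path is a placeholder, and it assumes the image processor configuration was pushed alongside the model (otherwise fall back to the nvidia/mit-b0 processor).

```python
# Minimal sketch: loading the fine-tuned checkpoint for semantic segmentation.
# The image path is a placeholder; adjust preprocessing to match your data.
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

checkpoint = "NICOPOI-9/segformer-b0-finetuned-morphpadver1-no_ckp-2"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.png").convert("RGB")  # placeholder input
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) class indices: 0-0, 0-90, 90-0, 90-90
```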

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows this list):

  • learning_rate: 6e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 5
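
The hyperparameters above map directly onto the Hugging Face Trainer API. The sketch below is not the authors' training script, only a hedged reconstruction of those values; the dataset objects and the fine-tuned model's output directory name are placeholders.

```python
# Hedged reconstruction of the listed hyperparameters with the HF Trainer API.
# Dataset loading and metric computation are placeholders, not from this card.
from transformers import SegformerForSemanticSegmentation, Trainer, TrainingArguments

model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0", num_labels=4  # 4 classes: 0-0, 0-90, 90-0, 90-90
)

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-morphpadver1-no_ckp-2",
    learning_rate=6e-05,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    # AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default optimizer.
)

# trainer = Trainer(
#     model=model,
#     args=training_args,
#     train_dataset=train_dataset,  # placeholder: NICOPOI-9/morphpad_512_4class train split
#     eval_dataset=eval_dataset,    # placeholder: evaluation split
# )
# trainer.train()
```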

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.2037 | 0.1112 | 1000 | 0.1512 | 0.8947 | 0.9437 | 0.9447 | 0.9614 | 0.9508 | 0.8858 | 0.9769 | 0.9353 | 0.8554 | 0.8502 | 0.9381 |
| 0.0768 | 0.2224 | 2000 | 0.0772 | 0.9450 | 0.9715 | 0.9720 | 0.9772 | 0.9613 | 0.9599 | 0.9878 | 0.9643 | 0.9249 | 0.9246 | 0.9664 |
| 0.1051 | 0.3337 | 3000 | 0.0555 | 0.9622 | 0.9807 | 0.9809 | 0.9885 | 0.9656 | 0.9794 | 0.9893 | 0.9798 | 0.9440 | 0.9446 | 0.9804 |
| 0.0426 | 0.4449 | 4000 | 0.0438 | 0.9706 | 0.9851 | 0.9852 | 0.9908 | 0.9785 | 0.9805 | 0.9905 | 0.9852 | 0.9557 | 0.9558 | 0.9858 |
| 0.03 | 0.5561 | 5000 | 0.0386 | 0.9735 | 0.9866 | 0.9867 | 0.9897 | 0.9893 | 0.9758 | 0.9916 | 0.9866 | 0.9589 | 0.9602 | 0.9886 |
| 0.0255 | 0.6673 | 6000 | 0.0314 | 0.9788 | 0.9892 | 0.9894 | 0.9926 | 0.9796 | 0.9880 | 0.9968 | 0.9902 | 0.9671 | 0.9664 | 0.9914 |
| 0.0446 | 0.7786 | 7000 | 0.0353 | 0.9822 | 0.9910 | 0.9911 | 0.9935 | 0.9873 | 0.9890 | 0.9944 | 0.9915 | 0.9727 | 0.9725 | 0.9921 |
| 0.0188 | 0.8898 | 8000 | 0.0323 | 0.9836 | 0.9917 | 0.9919 | 0.9964 | 0.9861 | 0.9882 | 0.9962 | 0.9932 | 0.9741 | 0.9730 | 0.9941 |
| 0.017 | 1.0010 | 9000 | 0.0167 | 0.9895 | 0.9947 | 0.9948 | 0.9966 | 0.9916 | 0.9931 | 0.9977 | 0.9946 | 0.9844 | 0.9842 | 0.9951 |
| 0.0135 | 1.1122 | 10000 | 0.0191 | 0.9890 | 0.9945 | 0.9946 | 0.9975 | 0.9882 | 0.9949 | 0.9974 | 0.9952 | 0.9829 | 0.9825 | 0.9955 |
| 0.0164 | 1.2234 | 11000 | 0.0147 | 0.9912 | 0.9956 | 0.9956 | 0.9979 | 0.9942 | 0.9922 | 0.9980 | 0.9957 | 0.9867 | 0.9863 | 0.9961 |
| 0.0602 | 1.3347 | 12000 | 0.0195 | 0.9913 | 0.9956 | 0.9957 | 0.9972 | 0.9929 | 0.9937 | 0.9986 | 0.9960 | 0.9866 | 0.9864 | 0.9962 |
| 0.0095 | 1.4459 | 13000 | 0.0229 | 0.9899 | 0.9949 | 0.9950 | 0.9970 | 0.9941 | 0.9899 | 0.9987 | 0.9959 | 0.9840 | 0.9832 | 0.9966 |
| 0.0434 | 1.5571 | 14000 | 0.0094 | 0.9942 | 0.9971 | 0.9971 | 0.9985 | 0.9947 | 0.9965 | 0.9987 | 0.9970 | 0.9913 | 0.9914 | 0.9971 |
| 0.0108 | 1.6683 | 15000 | 0.0110 | 0.9935 | 0.9967 | 0.9968 | 0.9983 | 0.9931 | 0.9967 | 0.9989 | 0.9973 | 0.9897 | 0.9896 | 0.9974 |
| 0.0091 | 1.7796 | 16000 | 0.0094 | 0.9946 | 0.9973 | 0.9973 | 0.9981 | 0.9954 | 0.9973 | 0.9983 | 0.9972 | 0.9919 | 0.9918 | 0.9973 |
| 0.0417 | 1.8908 | 17000 | 0.0123 | 0.9930 | 0.9965 | 0.9966 | 0.9969 | 0.9931 | 0.9972 | 0.9988 | 0.9961 | 0.9887 | 0.9898 | 0.9976 |
| 0.0101 | 2.0020 | 18000 | 0.0080 | 0.9954 | 0.9977 | 0.9977 | 0.9980 | 0.9959 | 0.9982 | 0.9987 | 0.9975 | 0.9933 | 0.9930 | 0.9978 |
| 0.0072 | 2.1132 | 19000 | 0.0193 | 0.9943 | 0.9971 | 0.9972 | 0.9981 | 0.9939 | 0.9978 | 0.9987 | 0.9975 | 0.9911 | 0.9906 | 0.9978 |
| 0.0077 | 2.2244 | 20000 | 0.0056 | 0.9967 | 0.9984 | 0.9984 | 0.9986 | 0.9975 | 0.9984 | 0.9990 | 0.9980 | 0.9956 | 0.9952 | 0.9981 |
| 0.0048 | 2.3357 | 21000 | 0.0052 | 0.9969 | 0.9985 | 0.9985 | 0.9991 | 0.9973 | 0.9981 | 0.9994 | 0.9983 | 0.9956 | 0.9955 | 0.9983 |
| 0.0052 | 2.4469 | 22000 | 0.0155 | 0.9962 | 0.9981 | 0.9981 | 0.9992 | 0.9948 | 0.9989 | 0.9994 | 0.9982 | 0.9941 | 0.9941 | 0.9983 |
| 0.0042 | 2.5581 | 23000 | 0.0042 | 0.9976 | 0.9988 | 0.9988 | 0.9989 | 0.9979 | 0.9990 | 0.9994 | 0.9984 | 0.9967 | 0.9967 | 0.9985 |
| 0.004 | 2.6693 | 24000 | 0.0041 | 0.9976 | 0.9988 | 0.9988 | 0.9995 | 0.9977 | 0.9986 | 0.9994 | 0.9984 | 0.9967 | 0.9968 | 0.9986 |
| 0.0044 | 2.7806 | 25000 | 0.0056 | 0.9971 | 0.9986 | 0.9986 | 0.9992 | 0.9968 | 0.9987 | 0.9995 | 0.9987 | 0.9956 | 0.9956 | 0.9986 |
| 0.0036 | 2.8918 | 26000 | 0.0236 | 0.9960 | 0.9980 | 0.9980 | 0.9988 | 0.9946 | 0.9991 | 0.9995 | 0.9985 | 0.9935 | 0.9932 | 0.9987 |
| 0.0038 | 3.0030 | 27000 | 0.0156 | 0.9969 | 0.9985 | 0.9985 | 0.9992 | 0.9963 | 0.9989 | 0.9996 | 0.9988 | 0.9952 | 0.9951 | 0.9987 |
| 0.0028 | 3.1142 | 28000 | 0.0037 | 0.9979 | 0.9989 | 0.9990 | 0.9992 | 0.9978 | 0.9992 | 0.9995 | 0.9988 | 0.9970 | 0.9969 | 0.9988 |
| 0.0044 | 3.2254 | 29000 | 0.0030 | 0.9983 | 0.9992 | 0.9992 | 0.9993 | 0.9987 | 0.9993 | 0.9994 | 0.9989 | 0.9978 | 0.9978 | 0.9989 |
| 0.0051 | 3.3367 | 30000 | 0.0111 | 0.9975 | 0.9987 | 0.9988 | 0.9993 | 0.9969 | 0.9990 | 0.9998 | 0.9989 | 0.9961 | 0.9961 | 0.9989 |
| 0.0027 | 3.4479 | 31000 | 0.0176 | 0.9964 | 0.9982 | 0.9982 | 0.9995 | 0.9945 | 0.9992 | 0.9996 | 0.9990 | 0.9939 | 0.9939 | 0.9989 |
| 0.0126 | 3.5591 | 32000 | 0.0095 | 0.9973 | 0.9986 | 0.9987 | 0.9996 | 0.9963 | 0.9992 | 0.9995 | 0.9990 | 0.9956 | 0.9956 | 0.9989 |
| 0.0097 | 3.6703 | 33000 | 0.0044 | 0.9973 | 0.9986 | 0.9987 | 0.9997 | 0.9962 | 0.9989 | 0.9997 | 0.9989 | 0.9956 | 0.9958 | 0.9988 |
| 0.0066 | 3.7816 | 34000 | 0.0030 | 0.9982 | 0.9991 | 0.9991 | 0.9996 | 0.9977 | 0.9993 | 0.9997 | 0.9990 | 0.9974 | 0.9976 | 0.9987 |
| 0.0024 | 3.8928 | 35000 | 0.0025 | 0.9987 | 0.9993 | 0.9993 | 0.9996 | 0.9990 | 0.9991 | 0.9996 | 0.9991 | 0.9983 | 0.9982 | 0.9991 |
| 0.0032 | 4.0040 | 36000 | 0.0026 | 0.9985 | 0.9992 | 0.9992 | 0.9996 | 0.9983 | 0.9994 | 0.9996 | 0.9990 | 0.9979 | 0.9979 | 0.9990 |
| 0.003 | 4.1152 | 37000 | 0.0025 | 0.9986 | 0.9993 | 0.9993 | 0.9995 | 0.9987 | 0.9993 | 0.9997 | 0.9991 | 0.9981 | 0.9981 | 0.9990 |
| 0.0023 | 4.2264 | 38000 | 0.0023 | 0.9987 | 0.9994 | 0.9994 | 0.9996 | 0.9989 | 0.9992 | 0.9997 | 0.9991 | 0.9984 | 0.9984 | 0.9991 |
| 0.0068 | 4.3377 | 39000 | 0.0025 | 0.9987 | 0.9993 | 0.9993 | 0.9993 | 0.9988 | 0.9995 | 0.9997 | 0.9991 | 0.9983 | 0.9982 | 0.9990 |
| 0.0058 | 4.4489 | 40000 | 0.0030 | 0.9980 | 0.9990 | 0.9990 | 0.9996 | 0.9970 | 0.9996 | 0.9997 | 0.9991 | 0.9968 | 0.9971 | 0.9989 |
| 0.0039 | 4.5601 | 41000 | 0.0022 | 0.9989 | 0.9994 | 0.9994 | 0.9995 | 0.9993 | 0.9993 | 0.9997 | 0.9992 | 0.9985 | 0.9985 | 0.9992 |
| 0.0021 | 4.6713 | 42000 | 0.0022 | 0.9988 | 0.9994 | 0.9994 | 0.9995 | 0.9990 | 0.9994 | 0.9997 | 0.9992 | 0.9984 | 0.9984 | 0.9991 |
| 0.0016 | 4.7826 | 43000 | 0.0023 | 0.9987 | 0.9993 | 0.9993 | 0.9995 | 0.9987 | 0.9995 | 0.9996 | 0.9992 | 0.9982 | 0.9981 | 0.9992 |
| 0.0283 | 4.8938 | 44000 | 0.0023 | 0.9987 | 0.9994 | 0.9994 | 0.9992 | 0.9991 | 0.9994 | 0.9998 | 0.9991 | 0.9984 | 0.9984 | 0.9991 |
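
The IoU and accuracy columns above are the standard semantic-segmentation metrics. The sketch below shows how such numbers are typically computed with the evaluate library's mean_iou metric; the toy masks, the ignore_index value, and the label mapping are assumptions for illustration, not details documented in this card.

```python
# Hedged sketch: computing mean IoU and per-class accuracy with the `evaluate` library.
# num_labels=4 matches the four classes in this card; ignore_index is an assumption.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy predicted and reference masks with class ids 0..3 (0-0, 0-90, 90-0, 90-90).
pred_mask = np.random.randint(0, 4, size=(512, 512))
ref_mask = np.random.randint(0, 4, size=(512, 512))

results = metric.compute(
    predictions=[pred_mask],
    references=[ref_mask],
    num_labels=4,
    ignore_index=255,       # assumption: no pixels actually use the ignore label here
    reduce_labels=False,
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"], results["per_category_accuracy"])
```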

Framework versions

  • Transformers 4.48.3
  • PyTorch 2.1.0
  • Datasets 3.2.0
  • Tokenizers 0.21.0