segformer-b1-from-scratch-run1

This model was trained from scratch on the samitizerxu/kelp_data_rgbaa_swin_nir dataset. It achieves the following results on the evaluation set (a sketch of the IoU metric follows the list):

  • IoU (kelp): 0.0067
  • Loss: 0.9872
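
For reference, the IoU (kelp) figure is the standard intersection-over-union between the predicted and ground-truth kelp masks. Below is a minimal sketch of the metric, assuming binary per-pixel masks; the evaluation code for this run is not included in the card and may differ:

```python
import numpy as np

def kelp_iou(pred_mask: np.ndarray, true_mask: np.ndarray) -> float:
    """Intersection-over-union for the positive (kelp) class of two binary masks."""
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    intersection = np.logical_and(pred, true).sum()
    union = np.logical_or(pred, true).sum()
    # Convention: an empty union (no kelp predicted or present) scores 0 here;
    # some implementations return 1.0 or skip such images instead.
    return float(intersection / union) if union > 0 else 0.0
```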

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
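
The card only names the dataset (samitizerxu/kelp_data_rgbaa_swin_nir, per the summary above). A minimal, hedged loading sketch, assuming the dataset is hosted on the Hugging Face Hub; split names and columns are not documented here:

```python
from datasets import load_dataset

# Dataset id taken from this card; available splits/columns are assumptions.
ds = load_dataset("samitizerxu/kelp_data_rgbaa_swin_nir")
print(ds)
```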

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.1
  • train_batch_size: 22
  • eval_batch_size: 22
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • num_epochs: 40
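
A hedged sketch of how these values might map onto transformers TrainingArguments; the actual training script is not part of this card, argument names follow the 4.37 API, and the Adam settings listed above match the Trainer defaults:

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="segformer-b1-from-scratch-run1",
    learning_rate=0.1,
    per_device_train_batch_size=22,
    per_device_eval_batch_size=22,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=40,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the default optimizer setting.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```

For context, 0.1 is an unusually high learning rate for Adam-based training of SegFormer models (published recipes typically use values around 6e-5), which may be related to the flat validation metrics in the table below.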

Training results

Training Loss   Epoch   Step   IoU (Kelp)   Validation Loss
0.9999 0.15 30 0.0067 0.9872
1.0 0.29 60 0.0067 0.9872
0.9933 0.44 90 0.0067 0.9872
0.998 0.59 120 0.0067 0.9872
1.0 0.73 150 0.0067 0.9872
0.9998 0.88 180 0.0067 0.9872
0.9998 1.02 210 0.0067 0.9872
1.0 1.17 240 0.0082 0.9861
0.9976 1.32 270 0.0069 0.9869
0.9995 1.46 300 0.0070 0.9868
0.9967 1.61 330 0.0067 0.9872
0.9945 1.76 360 0.0067 0.9872
1.0 1.9 390 0.0067 0.9872
0.9992 2.05 420 0.0067 0.9872
0.9991 2.2 450 0.0067 0.9872
0.997 2.34 480 0.0067 0.9872
0.999 2.49 510 0.0067 0.9872
0.9999 2.63 540 0.0067 0.9872
0.9991 2.78 570 0.0067 0.9872
0.9987 2.93 600 0.0067 0.9872
0.9999 3.07 630 0.0067 0.9872
0.9983 3.22 660 0.0067 0.9872
0.9973 3.37 690 0.0067 0.9872
0.9987 3.51 720 0.0067 0.9872
0.9915 3.66 750 0.0067 0.9872
0.9984 3.8 780 0.0067 0.9872
0.9992 3.95 810 0.0067 0.9872
0.9993 4.1 840 0.0067 0.9872
1.0 4.24 870 0.0067 0.9872
0.9998 4.39 900 0.0067 0.9872
0.9999 4.54 930 0.0067 0.9872
0.9995 4.68 960 0.0067 0.9872
0.998 4.83 990 0.0067 0.9872
0.9989 4.98 1020 0.0067 0.9872
0.9975 5.12 1050 0.0067 0.9872
0.9993 5.27 1080 0.0067 0.9872
0.9971 5.41 1110 0.0067 0.9872
0.9944 5.56 1140 0.0067 0.9872
0.9967 5.71 1170 0.0067 0.9872
0.9986 5.85 1200 0.0067 0.9872
0.9994 6.0 1230 0.0067 0.9872
0.9997 6.15 1260 0.0067 0.9872
0.9998 6.29 1290 0.0067 0.9872
0.999 6.44 1320 0.0067 0.9872
0.9996 6.59 1350 0.0067 0.9872
1.0 6.73 1380 0.0067 0.9872
0.9999 6.88 1410 0.0067 0.9872
0.9933 7.02 1440 0.0067 0.9872
0.998 7.17 1470 0.0067 0.9872
0.9968 7.32 1500 0.0067 0.9872
0.997 7.46 1530 0.0067 0.9872
0.9981 7.61 1560 0.0067 0.9872
0.9992 7.76 1590 0.0067 0.9872
0.9999 7.9 1620 0.0067 0.9872
0.9964 8.05 1650 0.0067 0.9872
0.9999 8.2 1680 0.0067 0.9872
0.9941 8.34 1710 0.0067 0.9872
0.9963 8.49 1740 0.0067 0.9872
0.998 8.63 1770 0.0067 0.9872
0.9989 8.78 1800 0.0067 0.9872
1.0 8.93 1830 0.0067 0.9872
1.0 9.07 1860 0.0067 0.9872
0.9974 9.22 1890 0.0067 0.9872
0.9989 9.37 1920 0.0067 0.9872
0.9989 9.51 1950 0.0067 0.9872
0.996 9.66 1980 0.0067 0.9872
0.9995 9.8 2010 0.0067 0.9872
0.9973 9.95 2040 0.0067 0.9872
0.9957 10.1 2070 0.0067 0.9872
0.9996 10.24 2100 0.0067 0.9872
1.0 10.39 2130 0.0067 0.9872
0.9967 10.54 2160 0.0067 0.9872
0.9989 10.68 2190 0.0067 0.9872
0.9989 10.83 2220 0.0067 0.9872
0.9994 10.98 2250 0.0067 0.9872
0.9992 11.12 2280 0.0067 0.9872
0.9973 11.27 2310 0.0067 0.9872
0.9993 11.41 2340 0.0067 0.9872
0.9973 11.56 2370 0.0067 0.9872
0.9996 11.71 2400 0.0067 0.9872
1.0 11.85 2430 0.0067 0.9872
0.9989 12.0 2460 0.0067 0.9872
1.0 12.15 2490 0.0067 0.9872
0.9987 12.29 2520 0.0067 0.9872
0.9914 12.44 2550 0.0067 0.9872
0.9974 12.59 2580 0.0067 0.9872
1.0 12.73 2610 0.0067 0.9872
0.999 12.88 2640 0.0067 0.9872
1.0 13.02 2670 0.0067 0.9872
0.9991 13.17 2700 0.0067 0.9872
0.9979 13.32 2730 0.0067 0.9872
1.0 13.46 2760 0.0067 0.9872
0.9973 13.61 2790 0.0067 0.9872
0.9995 13.76 2820 0.0067 0.9872
0.9973 13.9 2850 0.0067 0.9872
0.9961 14.05 2880 0.0067 0.9872
0.9907 14.2 2910 0.0067 0.9872
0.9984 14.34 2940 0.0067 0.9872
0.9986 14.49 2970 0.0067 0.9872
0.9935 14.63 3000 0.0067 0.9872
0.998 14.78 3030 0.0067 0.9872
0.9982 14.93 3060 0.0067 0.9872
0.9956 15.07 3090 0.0067 0.9872
0.9991 15.22 3120 0.0067 0.9872
0.9985 15.37 3150 0.0067 0.9872
0.9958 15.51 3180 0.0067 0.9872
0.9998 15.66 3210 0.0067 0.9872
0.9972 15.8 3240 0.0067 0.9872
0.9996 15.95 3270 0.0067 0.9872
0.9965 16.1 3300 0.0067 0.9872
0.9983 16.24 3330 0.0067 0.9872
0.9993 16.39 3360 0.0067 0.9872
0.9962 16.54 3390 0.0067 0.9872
0.9985 16.68 3420 0.0067 0.9872
0.9998 16.83 3450 0.0067 0.9872
0.9993 16.98 3480 0.0067 0.9872
0.9993 17.12 3510 0.0067 0.9872
0.9998 17.27 3540 0.0067 0.9872
1.0 17.41 3570 0.0067 0.9872
0.9999 17.56 3600 0.0067 0.9872
0.9993 17.71 3630 0.0067 0.9872
0.999 17.85 3660 0.0067 0.9872
0.9975 18.0 3690 0.0067 0.9872
0.9993 18.15 3720 0.0067 0.9872
1.0 18.29 3750 0.0067 0.9872
0.9983 18.44 3780 0.0067 0.9872
0.9994 18.59 3810 0.0067 0.9872
0.9993 18.73 3840 0.0067 0.9872
0.9982 18.88 3870 0.0067 0.9872
0.9997 19.02 3900 0.0067 0.9872
0.9955 19.17 3930 0.0067 0.9872
0.9992 19.32 3960 0.0067 0.9872
0.9592 19.46 3990 0.0067 0.9872
0.9897 19.61 4020 0.0067 0.9872
0.9994 19.76 4050 0.0067 0.9872
0.9989 19.9 4080 0.0067 0.9872
0.9995 20.05 4110 0.0067 0.9872
0.9995 20.2 4140 0.0067 0.9872
0.9938 20.34 4170 0.0067 0.9872
0.9987 20.49 4200 0.0067 0.9872
0.9999 20.63 4230 0.0067 0.9872
0.9994 20.78 4260 0.0067 0.9872
0.9954 20.93 4290 0.0067 0.9872
0.9975 21.07 4320 0.0067 0.9872
0.9997 21.22 4350 0.0067 0.9872
0.9978 21.37 4380 0.0067 0.9872
0.9994 21.51 4410 0.0067 0.9872
0.9985 21.66 4440 0.0067 0.9872
0.9998 21.8 4470 0.0067 0.9872
0.998 21.95 4500 0.0067 0.9872
0.9983 22.1 4530 0.0067 0.9872
0.9989 22.24 4560 0.0067 0.9872
0.9973 22.39 4590 0.0067 0.9872
0.9961 22.54 4620 0.0067 0.9872
0.9984 22.68 4650 0.0067 0.9872
1.0 22.83 4680 0.0067 0.9872
0.9949 22.98 4710 0.0067 0.9872
0.9989 23.12 4740 0.0067 0.9872
0.9998 23.27 4770 0.0067 0.9872
0.9999 23.41 4800 0.0067 0.9872
0.9996 23.56 4830 0.0067 0.9872
0.9974 23.71 4860 0.0067 0.9872
0.9997 23.85 4890 0.0067 0.9872
0.9999 24.0 4920 0.0067 0.9872
0.9962 24.15 4950 0.0067 0.9872
0.9996 24.29 4980 0.0067 0.9872
0.9999 24.44 5010 0.0067 0.9872
0.9973 24.59 5040 0.0067 0.9872
0.9996 24.73 5070 0.0067 0.9872
0.9995 24.88 5100 0.0067 0.9872
0.9999 25.02 5130 0.0067 0.9872
0.9988 25.17 5160 0.0067 0.9872
1.0 25.32 5190 0.0067 0.9872
1.0 25.46 5220 0.0067 0.9872
0.9996 25.61 5250 0.0067 0.9872
0.9965 25.76 5280 0.0067 0.9872
0.9976 25.9 5310 0.0067 0.9872
1.0 26.05 5340 0.0067 0.9872
0.9989 26.2 5370 0.0067 0.9872
0.9996 26.34 5400 0.0067 0.9872
0.9998 26.49 5430 0.0067 0.9872
1.0 26.63 5460 0.0067 0.9872
0.9996 26.78 5490 0.0067 0.9872
0.9972 26.93 5520 0.0067 0.9872
0.9984 27.07 5550 0.0067 0.9872
0.9961 27.22 5580 0.0067 0.9872
1.0 27.37 5610 0.0067 0.9872
0.9977 27.51 5640 0.0067 0.9872
0.9969 27.66 5670 0.0067 0.9872
0.9971 27.8 5700 0.0067 0.9872
0.9986 27.95 5730 0.0067 0.9872
0.9995 28.1 5760 0.0067 0.9872
0.9992 28.24 5790 0.0067 0.9872
0.9976 28.39 5820 0.0067 0.9872
0.9994 28.54 5850 0.0067 0.9872
0.998 28.68 5880 0.0067 0.9872
0.9952 28.83 5910 0.0067 0.9872
0.9998 28.98 5940 0.0067 0.9872
0.9937 29.12 5970 0.0067 0.9872
0.9989 29.27 6000 0.0067 0.9872
0.9993 29.41 6030 0.0067 0.9872
0.9989 29.56 6060 0.0067 0.9872
0.999 29.71 6090 0.0067 0.9872
0.9939 29.85 6120 0.0067 0.9872
1.0 30.0 6150 0.0067 0.9872
0.9996 30.15 6180 0.0067 0.9872
0.9994 30.29 6210 0.0067 0.9872
0.999 30.44 6240 0.0067 0.9872
1.0 30.59 6270 0.0067 0.9872
0.9956 30.73 6300 0.0067 0.9872
0.9971 30.88 6330 0.0067 0.9872
0.9985 31.02 6360 0.0067 0.9872
1.0 31.17 6390 0.0067 0.9872
0.9987 31.32 6420 0.0067 0.9872
0.9992 31.46 6450 0.0067 0.9872
0.9996 31.61 6480 0.0067 0.9872
0.9998 31.76 6510 0.0067 0.9872
0.9989 31.9 6540 0.0067 0.9872
1.0 32.05 6570 0.0067 0.9872
0.9966 32.2 6600 0.0067 0.9872
0.9994 32.34 6630 0.0067 0.9872
0.9987 32.49 6660 0.0067 0.9872
0.9993 32.63 6690 0.0067 0.9872
0.9971 32.78 6720 0.0067 0.9872
0.9971 32.93 6750 0.0067 0.9872
0.9929 33.07 6780 0.0067 0.9872
0.9997 33.22 6810 0.0067 0.9872
0.9978 33.37 6840 0.0067 0.9872
1.0 33.51 6870 0.0067 0.9872
0.9991 33.66 6900 0.0067 0.9872
0.9971 33.8 6930 0.0067 0.9872
0.9999 33.95 6960 0.0067 0.9872
0.9999 34.1 6990 0.0067 0.9872
0.9997 34.24 7020 0.0067 0.9872
1.0 34.39 7050 0.0067 0.9872
0.9986 34.54 7080 0.0067 0.9872
0.9996 34.68 7110 0.0067 0.9872
0.9994 34.83 7140 0.0067 0.9872
0.9997 34.98 7170 0.0067 0.9872
0.9999 35.12 7200 0.0067 0.9872
0.9969 35.27 7230 0.0067 0.9872
1.0 35.41 7260 0.0067 0.9872
0.9984 35.56 7290 0.0067 0.9872
0.9961 35.71 7320 0.0067 0.9872
0.9988 35.85 7350 0.0067 0.9872
0.9985 36.0 7380 0.0067 0.9872
0.9997 36.15 7410 0.0067 0.9872
1.0 36.29 7440 0.0067 0.9872
0.9987 36.44 7470 0.0067 0.9872
0.9991 36.59 7500 0.0067 0.9872
0.9992 36.73 7530 0.0067 0.9872
0.9999 36.88 7560 0.0067 0.9872
0.9996 37.02 7590 0.0067 0.9872
0.9995 37.17 7620 0.0067 0.9872
0.9998 37.32 7650 0.0067 0.9872
0.9969 37.46 7680 0.0067 0.9872
0.9989 37.61 7710 0.0067 0.9872
0.9992 37.76 7740 0.0067 0.9872
0.9959 37.9 7770 0.0067 0.9872
0.9987 38.05 7800 0.0067 0.9872
0.998 38.2 7830 0.0067 0.9872
0.9992 38.34 7860 0.0067 0.9872
0.9992 38.49 7890 0.0067 0.9872
0.9993 38.63 7920 0.0067 0.9872
0.9997 38.78 7950 0.0067 0.9872
0.9976 38.93 7980 0.0067 0.9872
1.0 39.07 8010 0.0067 0.9872
0.9959 39.22 8040 0.0067 0.9872
0.9973 39.37 8070 0.0067 0.9872
0.9996 39.51 8100 0.0067 0.9872
1.0 39.66 8130 0.0067 0.9872
0.9986 39.8 8160 0.0067 0.9872
0.9999 39.95 8190 0.0067 0.9872

The validation IoU and loss are essentially flat from the first evaluation onward (0.0067 and 0.9872, with only a brief deviation around steps 240-300), which suggests this run did not learn to segment the kelp class.

Framework versions

  • Transformers 4.37.1
  • PyTorch 2.1.2
  • Datasets 2.16.1
  • Tokenizers 0.15.1
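
To approximate this environment, a hedged requirements pin (assuming standard PyPI package names; the torch build may need a CUDA-specific wheel):

```
transformers==4.37.1
torch==2.1.2
datasets==2.16.1
tokenizers==0.15.1
```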
Model size: 13.7M parameters (Safetensors, tensor type F32)