segformer-b0-finetuned-morphpadver1-hgo-3

This model is a fine-tuned version of nvidia/mit-b0 on the NICOPOI-9/morphpad_hgo_512_4class dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0870
  • Mean IoU: 0.9780
  • Mean Accuracy: 0.9889
  • Overall Accuracy: 0.9888
  • Accuracy 0-0: 0.9913
  • Accuracy 0-90: 0.9886
  • Accuracy 90-0: 0.9892
  • Accuracy 90-90: 0.9863
  • IoU 0-0: 0.9811
  • IoU 0-90: 0.9785
  • IoU 90-0: 0.9754
  • IoU 90-90: 0.9771

Model description

More information needed

Intended uses & limitations

More information needed
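
While the card does not document intended uses in detail, the checkpoint can be loaded like any SegFormer semantic-segmentation model. The snippet below is a minimal inference sketch, assuming the standard transformers image-processor/model API and a hypothetical local image file (example.png); the four label ids correspond to the orientation classes (0-0, 0-90, 90-0, 90-90) reported above.

```python
# Minimal inference sketch (assumes a hypothetical local image "example.png").
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForSemanticSegmentation

checkpoint = "NICOPOI-9/segformer-b0-finetuned-morphpadver1-hgo-3"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# SegFormer predicts logits at 1/4 of the input resolution; upsample them
# to the original image size before taking the per-pixel argmax.
logits = torch.nn.functional.interpolate(
    outputs.logits,
    size=image.size[::-1],  # (height, width)
    mode="bilinear",
    align_corners=False,
)
pred_mask = logits.argmax(dim=1)[0]  # (H, W) tensor of class ids 0..3
```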

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 80
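
These values map directly onto the transformers TrainingArguments. The sketch below shows an equivalent configuration together with the mit-b0 initialization; the class-id ordering and the dataset/label preprocessing are assumptions (the card does not describe them) and would need to match NICOPOI-9/morphpad_hgo_512_4class.

```python
# Sketch of a training configuration mirroring the hyperparameters above.
# The id2label ordering is an assumption; dataset preparation is not shown.
from transformers import SegformerForSemanticSegmentation, TrainingArguments

id2label = {0: "0-0", 1: "0-90", 2: "90-0", 3: "90-90"}
label2id = {name: idx for idx, name in id2label.items()}

model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",
    num_labels=len(id2label),
    id2label=id2label,
    label2id=label2id,
)

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-morphpadver1-hgo-3",
    learning_rate=6e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    optim="adamw_torch",          # AdamW (torch) with default betas/epsilon
    lr_scheduler_type="linear",
    num_train_epochs=80,
)

# These arguments would then be passed to transformers.Trainer together with
# the prepared train/eval datasets (preprocessing not shown here).
```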

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean IoU | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | IoU 0-0 | IoU 0-90 | IoU 90-0 | IoU 90-90 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.2394 | 2.5445 | 4000 | 1.2392 | 0.2273 | 0.3754 | 0.3738 | 0.4516 | 0.3985 | 0.2299 | 0.4215 | 0.2520 | 0.2205 | 0.1802 | 0.2565 |
| 1.0858 | 5.0891 | 8000 | 1.0986 | 0.2889 | 0.4462 | 0.4463 | 0.4510 | 0.3646 | 0.5196 | 0.4498 | 0.3208 | 0.2453 | 0.2792 | 0.3104 |
| 1.0059 | 7.6336 | 12000 | 1.0060 | 0.3295 | 0.4934 | 0.4935 | 0.4793 | 0.6104 | 0.4583 | 0.4256 | 0.3472 | 0.3209 | 0.3057 | 0.3442 |
| 0.9457 | 10.1781 | 16000 | 0.9497 | 0.3565 | 0.5236 | 0.5240 | 0.4976 | 0.6276 | 0.5356 | 0.4335 | 0.3777 | 0.3459 | 0.3490 | 0.3532 |
| 0.93 | 12.7226 | 20000 | 0.9072 | 0.3759 | 0.5397 | 0.5412 | 0.4649 | 0.6012 | 0.6437 | 0.4491 | 0.4073 | 0.3550 | 0.3549 | 0.3863 |
| 0.8403 | 15.2672 | 24000 | 0.8349 | 0.4263 | 0.5967 | 0.5959 | 0.6497 | 0.5299 | 0.6555 | 0.5517 | 0.4548 | 0.4083 | 0.3976 | 0.4443 |
| 0.8826 | 17.8117 | 28000 | 0.6734 | 0.5172 | 0.6784 | 0.6787 | 0.6590 | 0.7037 | 0.6902 | 0.6606 | 0.5517 | 0.4864 | 0.4837 | 0.5469 |
| 0.8028 | 20.3562 | 32000 | 0.4582 | 0.6843 | 0.8116 | 0.8120 | 0.7923 | 0.7953 | 0.8322 | 0.8268 | 0.6986 | 0.6686 | 0.6646 | 0.7056 |
| 0.8461 | 22.9008 | 36000 | 0.2845 | 0.8071 | 0.8931 | 0.8928 | 0.9089 | 0.8839 | 0.8825 | 0.8971 | 0.8322 | 0.7928 | 0.7850 | 0.8182 |
| 0.7034 | 25.4453 | 40000 | 0.3351 | 0.7724 | 0.8716 | 0.8713 | 0.8859 | 0.8680 | 0.8485 | 0.8839 | 0.7928 | 0.7649 | 0.7472 | 0.7849 |
| 1.0428 | 27.9898 | 44000 | 0.1882 | 0.8750 | 0.9334 | 0.9331 | 0.9467 | 0.9277 | 0.9323 | 0.9267 | 0.8912 | 0.8673 | 0.8647 | 0.8767 |
| 0.3497 | 30.5344 | 48000 | 0.1620 | 0.8982 | 0.9464 | 0.9461 | 0.9582 | 0.9539 | 0.9279 | 0.9455 | 0.9163 | 0.8878 | 0.8840 | 0.9046 |
| 0.0803 | 33.0789 | 52000 | 0.1314 | 0.9187 | 0.9577 | 0.9575 | 0.9711 | 0.9517 | 0.9508 | 0.9574 | 0.9300 | 0.9101 | 0.9121 | 0.9224 |
| 0.1394 | 35.6234 | 56000 | 0.1271 | 0.9228 | 0.9598 | 0.9597 | 0.9648 | 0.9612 | 0.9530 | 0.9603 | 0.9332 | 0.9163 | 0.9167 | 0.9250 |
| 0.0579 | 38.1679 | 60000 | 0.1170 | 0.9351 | 0.9665 | 0.9664 | 0.9723 | 0.9652 | 0.9619 | 0.9665 | 0.9446 | 0.9300 | 0.9282 | 0.9377 |
| 0.1097 | 40.7125 | 64000 | 0.1121 | 0.9402 | 0.9690 | 0.9691 | 0.9634 | 0.9748 | 0.9732 | 0.9644 | 0.9477 | 0.9380 | 0.9334 | 0.9416 |
| 0.0615 | 43.2570 | 68000 | 0.1069 | 0.9458 | 0.9720 | 0.9721 | 0.9707 | 0.9713 | 0.9777 | 0.9685 | 0.9523 | 0.9424 | 0.9426 | 0.9459 |
| 0.0425 | 45.8015 | 72000 | 0.0967 | 0.9540 | 0.9764 | 0.9764 | 0.9789 | 0.9740 | 0.9798 | 0.9729 | 0.9609 | 0.9506 | 0.9514 | 0.9529 |
| 0.0396 | 48.3461 | 76000 | 0.0991 | 0.9599 | 0.9795 | 0.9795 | 0.9827 | 0.9766 | 0.9783 | 0.9805 | 0.9656 | 0.9562 | 0.9560 | 0.9617 |
| 0.4319 | 50.8906 | 80000 | 0.0975 | 0.9583 | 0.9786 | 0.9787 | 0.9759 | 0.9826 | 0.9769 | 0.9792 | 0.9607 | 0.9566 | 0.9551 | 0.9608 |
| 0.0299 | 53.4351 | 84000 | 0.0959 | 0.9662 | 0.9828 | 0.9827 | 0.9858 | 0.9837 | 0.9807 | 0.9811 | 0.9712 | 0.9655 | 0.9637 | 0.9644 |
| 0.0282 | 55.9796 | 88000 | 0.0933 | 0.9687 | 0.9841 | 0.9841 | 0.9866 | 0.9830 | 0.9823 | 0.9846 | 0.9729 | 0.9665 | 0.9655 | 0.9698 |
| 0.1038 | 58.5242 | 92000 | 0.0864 | 0.9707 | 0.9852 | 0.9851 | 0.9882 | 0.9874 | 0.9842 | 0.9809 | 0.9727 | 0.9709 | 0.9698 | 0.9695 |
| 0.0246 | 61.0687 | 96000 | 0.0990 | 0.9722 | 0.9859 | 0.9859 | 0.9885 | 0.9873 | 0.9847 | 0.9832 | 0.9764 | 0.9716 | 0.9706 | 0.9703 |
| 0.0208 | 63.6132 | 100000 | 0.0839 | 0.9749 | 0.9873 | 0.9873 | 0.9901 | 0.9897 | 0.9850 | 0.9845 | 0.9781 | 0.9748 | 0.9726 | 0.9743 |
| 0.0749 | 66.1578 | 104000 | 0.0874 | 0.9761 | 0.9879 | 0.9879 | 0.9901 | 0.9888 | 0.9875 | 0.9851 | 0.9798 | 0.9757 | 0.9734 | 0.9753 |
| 0.0798 | 68.7023 | 108000 | 0.0866 | 0.9753 | 0.9875 | 0.9875 | 0.9908 | 0.9881 | 0.9883 | 0.9830 | 0.9792 | 0.9759 | 0.9722 | 0.9740 |
| 0.1236 | 71.2468 | 112000 | 0.0933 | 0.9775 | 0.9886 | 0.9886 | 0.9911 | 0.9895 | 0.9870 | 0.9868 | 0.9813 | 0.9773 | 0.9748 | 0.9765 |
| 0.0165 | 73.7913 | 116000 | 0.0894 | 0.9785 | 0.9891 | 0.9891 | 0.9902 | 0.9912 | 0.9867 | 0.9886 | 0.9818 | 0.9782 | 0.9759 | 0.9783 |
| 0.0204 | 76.3359 | 120000 | 0.0885 | 0.9792 | 0.9895 | 0.9895 | 0.9909 | 0.9913 | 0.9874 | 0.9884 | 0.9823 | 0.9791 | 0.9766 | 0.9788 |
| 0.7823 | 78.8804 | 124000 | 0.0870 | 0.9780 | 0.9889 | 0.9888 | 0.9913 | 0.9886 | 0.9892 | 0.9863 | 0.9811 | 0.9785 | 0.9754 | 0.9771 |
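
The per-class accuracy and IoU columns are the standard semantic-segmentation metrics. As an illustration only (using toy masks, not the actual evaluation set), such numbers can be computed with the Hugging Face evaluate library's mean_iou metric:

```python
# Illustrative mean-IoU computation with the `evaluate` library (toy masks).
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

predictions = [np.random.randint(0, 4, size=(512, 512))]  # predicted class ids
references = [np.random.randint(0, 4, size=(512, 512))]   # ground-truth class ids

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=4,
    ignore_index=255,
    reduce_labels=False,
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"], results["per_category_accuracy"])
```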

Framework versions

  • Transformers 4.48.3
  • PyTorch 2.1.0
  • Datasets 3.2.0
  • Tokenizers 0.21.0