segformer-b0-finetuned-segments-chargers-2-15

This model is a fine-tuned version of nvidia/mit-b0 on the dskong07/chargers-large-v0.1 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5246
  • Mean Iou: 0.6936
  • Mean Accuracy: 0.7865
  • Overall Accuracy: 0.8737
  • Accuracy Unlabeled: nan
  • Accuracy Screen: 0.8112
  • Accuracy Body: 0.8204
  • Accuracy Cable: 0.5689
  • Accuracy Plug: 0.7794
  • Accuracy Void-background: 0.9527
  • Iou Unlabeled: nan
  • Iou Screen: 0.6428
  • Iou Body: 0.7165
  • Iou Cable: 0.4784
  • Iou Plug: 0.7602
  • Iou Void-background: 0.8699
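The model can be used for semantic segmentation through the `transformers` API. Below is a minimal, untested inference sketch: the checkpoint id comes from this card, but the helper names, the image path, and the wiring are illustrative assumptions.

```python
# Hedged inference sketch for this SegFormer checkpoint. The checkpoint id is
# from this card; segment() and its image path are hypothetical wiring.
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

CHECKPOINT = "dskong07/segformer-b0-finetuned-segments-chargers-2-15"

def logits_to_label_map(logits: torch.Tensor, size: tuple) -> torch.Tensor:
    """SegFormer predicts at 1/4 resolution; upsample to `size`, then argmax."""
    upsampled = torch.nn.functional.interpolate(
        logits, size=size, mode="bilinear", align_corners=False
    )
    return upsampled.argmax(dim=1)[0]  # (H, W) tensor of per-pixel class ids

def segment(image_path: str) -> torch.Tensor:
    """Run the checkpoint on one image and return its predicted label map."""
    processor = SegformerImageProcessor.from_pretrained(CHECKPOINT)
    model = SegformerForSemanticSegmentation.from_pretrained(CHECKPOINT).eval()
    image = Image.open(image_path).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return logits_to_label_map(logits, (image.height, image.width))
```

The returned label map assigns each pixel one of the card's classes (screen, body, cable, plug, void-background, plus the unlabeled id).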

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 50
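The hyperparameters above map onto `transformers`' `TrainingArguments` roughly as sketched below. This is a configuration fragment only, under the assumption that training used the standard `Trainer` loop; dataset loading, the image processor, and metric computation are omitted, and the output path is hypothetical.

```python
# Hedged sketch of the training configuration; "outputs/..." is a made-up path.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="outputs/segformer-b0-chargers",  # hypothetical
    learning_rate=6e-05,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",          # AdamW with betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
# training_args would then be passed to transformers.Trainer together with the
# model, the train/eval datasets, and a compute_metrics function.
```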

Training results

Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Screen | Accuracy Body | Accuracy Cable | Accuracy Plug | Accuracy Void-background | Iou Unlabeled | Iou Screen | Iou Body | Iou Cable | Iou Plug | Iou Void-background
1.5655 0.4545 20 1.5839 0.2857 0.4743 0.6852 nan 0.0011 0.9372 0.2006 0.5724 0.6603 0.0 0.0011 0.5394 0.1392 0.4020 0.6327
1.0387 0.9091 40 1.1224 0.4674 0.5660 0.7519 nan 0.3246 0.9455 0.1082 0.7175 0.7344 nan 0.3106 0.5946 0.0854 0.6450 0.7013
0.8506 1.3636 60 0.9374 0.4166 0.5127 0.7649 nan 0.0088 0.9269 0.0955 0.7074 0.8249 nan 0.0088 0.6031 0.0855 0.6173 0.7685
1.316 1.8182 80 0.8349 0.5009 0.5943 0.7935 nan 0.5485 0.7621 0.1866 0.5533 0.9208 nan 0.4427 0.6167 0.1428 0.5252 0.7769
1.266 2.2727 100 0.7590 0.5428 0.6460 0.7983 nan 0.5567 0.9028 0.2308 0.7277 0.8120 nan 0.4578 0.6421 0.1727 0.6712 0.7704
0.8593 2.7273 120 0.6299 0.5461 0.6352 0.8205 nan 0.5860 0.8529 0.1316 0.7085 0.8969 nan 0.4586 0.6578 0.1155 0.6799 0.8187
0.8198 3.1818 140 0.7950 0.5250 0.6838 0.7766 nan 0.8901 0.6993 0.2443 0.7259 0.8594 nan 0.4157 0.5596 0.1809 0.6593 0.8094
0.5949 3.6364 160 0.5951 0.5629 0.6478 0.8330 nan 0.5695 0.8356 0.1649 0.7372 0.9320 nan 0.4646 0.6751 0.1432 0.6980 0.8337
0.6526 4.0909 180 0.6001 0.5719 0.6773 0.8276 nan 0.6910 0.8250 0.2582 0.7077 0.9048 nan 0.4875 0.6667 0.1978 0.6758 0.8316
0.604 4.5455 200 0.5667 0.5745 0.6851 0.8281 nan 0.6975 0.8459 0.2699 0.7244 0.8878 nan 0.5012 0.6688 0.2147 0.6592 0.8287
0.9183 5.0 220 0.5883 0.5631 0.6876 0.8149 nan 0.8346 0.7327 0.2647 0.6826 0.9237 nan 0.5150 0.6250 0.2047 0.6529 0.8180
0.7613 5.4545 240 0.5747 0.5662 0.6809 0.8123 nan 0.6507 0.8818 0.2722 0.7655 0.8342 nan 0.5111 0.6512 0.2208 0.6570 0.7907
0.5511 5.9091 260 0.5322 0.5933 0.6901 0.8376 nan 0.6194 0.8284 0.3510 0.7269 0.9249 nan 0.4969 0.6815 0.2498 0.6996 0.8388
0.4982 6.3636 280 0.5335 0.5795 0.6651 0.8320 nan 0.5108 0.8423 0.3209 0.7286 0.9230 nan 0.4651 0.6735 0.2494 0.6851 0.8243
0.4108 6.8182 300 0.5281 0.5856 0.7001 0.8354 nan 0.8443 0.7602 0.2416 0.7109 0.9438 nan 0.5248 0.6554 0.2105 0.6875 0.8499
0.3376 7.2727 320 0.5272 0.6009 0.7188 0.8306 nan 0.8168 0.8113 0.3663 0.7066 0.8927 nan 0.5315 0.6551 0.2982 0.6852 0.8345
0.7766 7.7273 340 0.5079 0.6229 0.7564 0.8391 nan 0.8021 0.7792 0.5319 0.7504 0.9184 nan 0.5374 0.6640 0.3668 0.6997 0.8466
0.2429 8.1818 360 0.4761 0.6026 0.6890 0.8481 nan 0.6583 0.8449 0.2794 0.7283 0.9341 nan 0.5177 0.6870 0.2522 0.6996 0.8566
0.3841 8.6364 380 0.5158 0.6200 0.7652 0.8325 nan 0.8574 0.7548 0.5769 0.7249 0.9122 nan 0.5241 0.6429 0.3738 0.7084 0.8507
0.1936 9.0909 400 0.4504 0.6434 0.7384 0.8578 nan 0.7268 0.8556 0.4359 0.7526 0.9212 nan 0.5913 0.7055 0.3452 0.7196 0.8552
0.2613 9.5455 420 0.4961 0.6302 0.7245 0.8470 nan 0.7415 0.7658 0.3729 0.7789 0.9634 nan 0.5875 0.6679 0.3161 0.7391 0.8403
0.4556 10.0 440 0.5313 0.6149 0.7346 0.8324 nan 0.8388 0.7388 0.4024 0.7559 0.9374 nan 0.5525 0.6306 0.3420 0.7108 0.8384
0.3093 10.4545 460 0.5350 0.6375 0.7984 0.8363 nan 0.8549 0.7216 0.6702 0.8157 0.9293 nan 0.5122 0.6350 0.4503 0.7317 0.8582
0.3501 10.9091 480 0.4808 0.6231 0.7178 0.8476 nan 0.7342 0.8303 0.3612 0.7393 0.9242 nan 0.5390 0.6785 0.3196 0.7237 0.8544
0.2578 11.3636 500 0.4619 0.6534 0.7584 0.8571 nan 0.7772 0.7877 0.5154 0.7582 0.9533 nan 0.5924 0.6883 0.4019 0.7278 0.8567
0.2126 11.8182 520 0.4903 0.6421 0.7441 0.8454 nan 0.7623 0.7699 0.5032 0.7381 0.9471 nan 0.5930 0.6614 0.4031 0.7148 0.8381
0.2819 12.2727 540 0.4747 0.6415 0.7479 0.8513 nan 0.7971 0.7860 0.4807 0.7309 0.9447 nan 0.5606 0.6747 0.3977 0.7159 0.8586
0.4983 12.7273 560 0.4771 0.6471 0.7305 0.8557 nan 0.6815 0.8187 0.4402 0.7633 0.9489 nan 0.5985 0.6893 0.3828 0.7182 0.8468
0.2204 13.1818 580 0.5358 0.6215 0.7475 0.8318 nan 0.8870 0.6911 0.4973 0.7026 0.9593 nan 0.5409 0.6175 0.4127 0.6943 0.8421
0.2746 13.6364 600 0.4943 0.6590 0.7715 0.8484 nan 0.7947 0.7570 0.6081 0.7496 0.9482 nan 0.6194 0.6638 0.4429 0.7312 0.8377
0.2906 14.0909 620 0.4353 0.6656 0.7480 0.8677 nan 0.6908 0.8820 0.4950 0.7480 0.9239 nan 0.6107 0.7228 0.4001 0.7317 0.8629
0.2525 14.5455 640 0.4791 0.6443 0.7224 0.8571 nan 0.6094 0.8913 0.4654 0.7341 0.9119 nan 0.5499 0.7035 0.3952 0.7189 0.8540
0.1833 15.0 660 0.4864 0.6493 0.7763 0.8484 nan 0.8969 0.7388 0.5243 0.7716 0.9499 nan 0.5615 0.6540 0.4238 0.7441 0.8632
0.1939 15.4545 680 0.4617 0.6640 0.7551 0.8626 nan 0.7545 0.8548 0.5001 0.7434 0.9229 nan 0.6174 0.7057 0.4156 0.7235 0.8580
0.2707 15.9091 700 0.4290 0.6809 0.7779 0.8708 nan 0.7613 0.8438 0.5709 0.7756 0.9381 nan 0.6280 0.7198 0.4356 0.7521 0.8690
0.3119 16.3636 720 0.5260 0.6407 0.7639 0.8423 nan 0.8560 0.7194 0.5168 0.7685 0.9585 nan 0.5350 0.6394 0.4234 0.7476 0.8579
0.3174 16.8182 740 0.5124 0.6571 0.7863 0.8448 nan 0.8463 0.7425 0.6094 0.7940 0.9396 nan 0.5712 0.6501 0.4540 0.7623 0.8480
0.2184 17.2727 760 0.4456 0.6717 0.7646 0.8642 nan 0.8054 0.8222 0.4617 0.7932 0.9402 nan 0.6414 0.7052 0.3969 0.7615 0.8538
0.1249 17.7273 780 0.4879 0.6582 0.7493 0.8532 nan 0.7889 0.7806 0.4841 0.7405 0.9525 nan 0.6257 0.6726 0.4193 0.7299 0.8434
0.1712 18.1818 800 0.5040 0.6731 0.7851 0.8483 nan 0.8385 0.7146 0.5750 0.8302 0.9672 nan 0.6569 0.6503 0.4501 0.7794 0.8289
0.2217 18.6364 820 0.5103 0.6633 0.7491 0.8509 nan 0.7282 0.7919 0.5329 0.7477 0.9450 nan 0.6338 0.6720 0.4389 0.7388 0.8331
0.3263 19.0909 840 0.5004 0.6648 0.7623 0.8554 nan 0.8285 0.7740 0.4984 0.7578 0.9530 nan 0.6267 0.6735 0.4297 0.7451 0.8489
0.1662 19.5455 860 0.4651 0.6780 0.7726 0.8644 nan 0.8039 0.8349 0.5213 0.7743 0.9287 nan 0.6384 0.7013 0.4381 0.7540 0.8582
0.1361 20.0 880 0.4748 0.6767 0.7656 0.8678 nan 0.7775 0.8272 0.5149 0.7618 0.9467 nan 0.6257 0.7085 0.4340 0.7511 0.8644
0.2124 20.4545 900 0.4660 0.6767 0.7776 0.8666 nan 0.8329 0.8000 0.5421 0.7603 0.9528 nan 0.6190 0.7016 0.4474 0.7487 0.8671
0.2573 20.9091 920 0.4938 0.6776 0.7773 0.8651 nan 0.8314 0.8083 0.5014 0.8013 0.9440 nan 0.6328 0.6987 0.4245 0.7700 0.8617
0.1497 21.3636 940 0.4430 0.6854 0.7821 0.8745 nan 0.7735 0.8398 0.5606 0.7905 0.9461 nan 0.6077 0.7236 0.4501 0.7661 0.8793
0.1861 21.8182 960 0.5097 0.6506 0.7678 0.8542 nan 0.8649 0.7894 0.5257 0.7242 0.9346 nan 0.5470 0.6768 0.4381 0.7176 0.8735
0.1572 22.2727 980 0.4777 0.6711 0.7551 0.8675 nan 0.7502 0.8347 0.4718 0.7713 0.9476 nan 0.6129 0.7084 0.4165 0.7517 0.8658
0.1082 22.7273 1000 0.4540 0.6867 0.7899 0.8677 nan 0.7930 0.8204 0.6061 0.7906 0.9395 nan 0.6199 0.7075 0.4731 0.7686 0.8645
0.1285 23.1818 1020 0.4463 0.6797 0.7735 0.8694 nan 0.7947 0.8460 0.5223 0.7717 0.9326 nan 0.6129 0.7137 0.4458 0.7557 0.8703
0.1978 23.6364 1040 0.4781 0.6904 0.7899 0.8704 nan 0.8070 0.7945 0.5618 0.8244 0.9619 nan 0.6368 0.7081 0.4667 0.7763 0.8642
0.1879 24.0909 1060 0.4848 0.6803 0.7783 0.8687 nan 0.8016 0.8162 0.5624 0.7626 0.9489 nan 0.6187 0.7084 0.4558 0.7497 0.8689
0.102 24.5455 1080 0.4841 0.6775 0.7682 0.8680 nan 0.7574 0.8436 0.5251 0.7786 0.9365 nan 0.6052 0.7108 0.4453 0.7586 0.8675
0.2149 25.0 1100 0.4672 0.6908 0.7884 0.8737 nan 0.8144 0.8204 0.5707 0.7849 0.9515 nan 0.6215 0.7174 0.4772 0.7633 0.8747
1.1322 25.4545 1120 0.4990 0.6663 0.7486 0.8681 nan 0.7371 0.8434 0.4733 0.7423 0.9470 nan 0.5989 0.7132 0.4169 0.7335 0.8689
0.0783 25.9091 1140 0.4601 0.6975 0.8088 0.8743 nan 0.8220 0.8114 0.6712 0.7896 0.9496 nan 0.6191 0.7188 0.5039 0.7694 0.8760
0.1457 26.3636 1160 0.4470 0.6896 0.7797 0.8757 nan 0.7891 0.8530 0.5372 0.7795 0.9395 nan 0.6254 0.7267 0.4578 0.7617 0.8764
0.1361 26.8182 1180 0.5201 0.6678 0.7768 0.8604 nan 0.8519 0.7749 0.5550 0.7467 0.9552 nan 0.5847 0.6838 0.4638 0.7387 0.8679
0.1691 27.2727 1200 0.4951 0.6754 0.7539 0.8697 nan 0.7242 0.8636 0.4831 0.7634 0.9356 nan 0.6207 0.7152 0.4287 0.7453 0.8672
0.1999 27.7273 1220 0.4614 0.6979 0.8111 0.8711 nan 0.8696 0.7905 0.6040 0.8397 0.9518 nan 0.6315 0.7069 0.4962 0.7861 0.8686
0.0889 28.1818 1240 0.4832 0.6938 0.7830 0.8728 nan 0.8059 0.8247 0.5632 0.7714 0.9496 nan 0.6636 0.7167 0.4688 0.7563 0.8635
0.1142 28.6364 1260 0.4679 0.6879 0.7740 0.8739 nan 0.7970 0.8397 0.5295 0.7576 0.9464 nan 0.6528 0.7206 0.4518 0.7450 0.8692
0.0918 29.0909 1280 0.5233 0.6846 0.7799 0.8648 nan 0.7956 0.7887 0.5848 0.7716 0.9589 nan 0.6379 0.6956 0.4797 0.7548 0.8551
0.1316 29.5455 1300 0.5407 0.6767 0.7714 0.8627 nan 0.8172 0.7958 0.5277 0.7653 0.9511 nan 0.6302 0.6903 0.4545 0.7513 0.8573
0.1038 30.0 1320 0.5312 0.6896 0.7940 0.8614 nan 0.8361 0.7694 0.6171 0.7919 0.9552 nan 0.6347 0.6822 0.5066 0.7728 0.8519
0.0852 30.4545 1340 0.5382 0.6821 0.7743 0.8662 nan 0.8174 0.8112 0.5254 0.7705 0.9471 nan 0.6370 0.6993 0.4586 0.7549 0.8608
0.1675 30.9091 1360 0.5164 0.6961 0.7941 0.8706 nan 0.7941 0.8111 0.6478 0.7668 0.9505 nan 0.6395 0.7098 0.5145 0.7536 0.8630
0.1527 31.3636 1380 0.4914 0.6975 0.7940 0.8734 nan 0.8204 0.8185 0.5917 0.7903 0.9492 nan 0.6434 0.7152 0.4932 0.7670 0.8686
0.1452 31.8182 1400 0.4774 0.6926 0.7818 0.8759 nan 0.7997 0.8295 0.5545 0.7711 0.9541 nan 0.6384 0.7226 0.4719 0.7560 0.8740
0.118 32.2727 1420 0.5084 0.6836 0.7877 0.8683 nan 0.8095 0.8179 0.5938 0.7752 0.9421 nan 0.6059 0.7065 0.4776 0.7570 0.8712
0.084 32.7273 1440 0.5185 0.6837 0.7724 0.8705 nan 0.7650 0.8410 0.5464 0.7686 0.9413 nan 0.6181 0.7139 0.4632 0.7545 0.8688
0.0946 33.1818 1460 0.5205 0.6840 0.7788 0.8700 nan 0.8178 0.8142 0.5450 0.7654 0.9516 nan 0.6270 0.7076 0.4633 0.7523 0.8701
0.1055 33.6364 1480 0.4754 0.6966 0.7909 0.8787 nan 0.8024 0.8560 0.5764 0.7817 0.9381 nan 0.6386 0.7327 0.4761 0.7567 0.8791
0.0648 34.0909 1500 0.4594 0.7028 0.8001 0.8794 nan 0.8077 0.8491 0.6292 0.7747 0.9400 nan 0.6393 0.7337 0.5006 0.7623 0.8784
0.105 34.5455 1520 0.4706 0.6978 0.8011 0.8757 nan 0.8362 0.8296 0.6192 0.7779 0.9424 nan 0.6291 0.7220 0.4987 0.7616 0.8775
1.4787 35.0 1540 0.4627 0.7087 0.7987 0.8810 nan 0.7919 0.8393 0.6184 0.7920 0.9521 nan 0.6504 0.7338 0.5105 0.7724 0.8763
0.1147 35.4545 1560 0.4701 0.7046 0.8014 0.8762 nan 0.7854 0.8391 0.6550 0.7862 0.9412 nan 0.6453 0.7250 0.5151 0.7680 0.8699
0.195 35.9091 1580 0.4830 0.6947 0.7915 0.8748 nan 0.8244 0.8209 0.5901 0.7709 0.9514 nan 0.6360 0.7180 0.4902 0.7551 0.8743
0.3486 36.3636 1600 0.4924 0.6965 0.7919 0.8764 nan 0.8062 0.8362 0.5987 0.7725 0.9459 nan 0.6405 0.7245 0.4880 0.7541 0.8752
0.1099 36.8182 1620 0.4828 0.6952 0.7860 0.8761 nan 0.7993 0.8391 0.5788 0.7665 0.9462 nan 0.6440 0.7243 0.4825 0.7524 0.8730
0.0522 37.2727 1640 0.4881 0.6936 0.7887 0.8723 nan 0.8043 0.8286 0.5928 0.7742 0.9436 nan 0.6372 0.7160 0.4903 0.7572 0.8674
0.1351 37.7273 1660 0.4858 0.6964 0.7871 0.8757 nan 0.8118 0.8421 0.5696 0.7700 0.9418 nan 0.6471 0.7234 0.4857 0.7540 0.8718
0.203 38.1818 1680 0.5016 0.6945 0.7850 0.8725 nan 0.8046 0.8431 0.5647 0.7771 0.9356 nan 0.6460 0.7160 0.4839 0.7596 0.8671
0.0954 38.6364 1700 0.4769 0.7112 0.8060 0.8779 nan 0.8099 0.8198 0.6541 0.7918 0.9541 nan 0.6605 0.7232 0.5307 0.7715 0.8699
0.0995 39.0909 1720 0.4993 0.6996 0.7862 0.8747 nan 0.7803 0.8414 0.5899 0.7760 0.9431 nan 0.6558 0.7209 0.4950 0.7601 0.8661
0.1813 39.5455 1740 0.4946 0.6987 0.7918 0.8757 nan 0.8139 0.8281 0.5865 0.7809 0.9493 nan 0.6520 0.7207 0.4905 0.7589 0.8713
0.0621 40.0 1760 0.4762 0.7033 0.7927 0.8803 nan 0.8028 0.8447 0.5892 0.7781 0.9485 nan 0.6568 0.7321 0.4905 0.7598 0.8774
0.0687 40.4545 1780 0.4884 0.7011 0.7915 0.8787 nan 0.7979 0.8304 0.6021 0.7714 0.9559 nan 0.6511 0.7281 0.4949 0.7560 0.8752
0.1386 40.9091 1800 0.4828 0.6969 0.7889 0.8782 nan 0.7934 0.8515 0.5913 0.7668 0.9417 nan 0.6379 0.7306 0.4859 0.7523 0.8778
0.0459 41.3636 1820 0.4872 0.6968 0.7956 0.8762 nan 0.8166 0.8278 0.6053 0.7798 0.9488 nan 0.6313 0.7230 0.4910 0.7618 0.8769
0.0411 41.8182 1840 0.4718 0.7066 0.7973 0.8807 nan 0.7948 0.8468 0.6196 0.7786 0.9467 nan 0.6525 0.7339 0.5061 0.7630 0.8772
0.0401 42.2727 1860 0.4735 0.7065 0.7990 0.8800 nan 0.8039 0.8356 0.6113 0.7926 0.9513 nan 0.6451 0.7300 0.5087 0.7711 0.8777
1.8317 42.7273 1880 0.4900 0.6958 0.7804 0.8790 nan 0.7847 0.8469 0.5523 0.7673 0.9509 nan 0.6388 0.7309 0.4760 0.7554 0.8779
0.1147 43.1818 1900 0.4926 0.6940 0.7874 0.8764 nan 0.8138 0.8472 0.5618 0.7744 0.9396 nan 0.6277 0.7249 0.4798 0.7588 0.8787
0.0765 43.6364 1920 0.4913 0.6995 0.7907 0.8788 nan 0.7868 0.8574 0.5862 0.7845 0.9387 nan 0.6380 0.7324 0.4888 0.7614 0.8772
0.1342 44.0909 1940 0.4832 0.6969 0.7836 0.8793 nan 0.7621 0.8697 0.5768 0.7730 0.9365 nan 0.6321 0.7356 0.4813 0.7571 0.8786
0.0698 44.5455 1960 0.4760 0.7022 0.7965 0.8799 nan 0.8003 0.8392 0.6173 0.7756 0.9500 nan 0.6399 0.7314 0.4997 0.7599 0.8801
0.0881 45.0 1980 0.4828 0.7027 0.7946 0.8807 nan 0.8076 0.8461 0.5947 0.7775 0.9472 nan 0.6397 0.7330 0.4984 0.7609 0.8816
0.0903 45.4545 2000 0.4890 0.7004 0.7894 0.8796 nan 0.8024 0.8472 0.5696 0.7810 0.9468 nan 0.6392 0.7313 0.4905 0.7620 0.8792
0.1647 45.9091 2020 0.4961 0.7043 0.8010 0.8782 nan 0.8182 0.8313 0.6124 0.7948 0.9483 nan 0.6389 0.7256 0.5088 0.7711 0.8772
0.0829 46.3636 2040 0.4977 0.7014 0.7929 0.8785 nan 0.8011 0.8502 0.5970 0.7755 0.9409 nan 0.6419 0.7295 0.4988 0.7598 0.8769
0.1189 46.8182 2060 0.4946 0.7029 0.7989 0.8773 nan 0.8201 0.8269 0.5936 0.8040 0.9500 nan 0.6383 0.7231 0.5006 0.7769 0.8757
0.0862 47.2727 2080 0.5052 0.7012 0.8009 0.8752 nan 0.8232 0.8213 0.6186 0.7936 0.9481 nan 0.6410 0.7184 0.5038 0.7703 0.8726
0.0754 47.7273 2100 0.5120 0.6969 0.7851 0.8758 nan 0.7846 0.8404 0.5816 0.7725 0.9464 nan 0.6457 0.7239 0.4867 0.7575 0.8708
0.1364 48.1818 2120 0.5077 0.6958 0.7822 0.8769 nan 0.7784 0.8646 0.5686 0.7654 0.9341 nan 0.6430 0.7293 0.4803 0.7529 0.8737
0.1038 48.6364 2140 0.5018 0.6965 0.7898 0.8760 nan 0.8126 0.8410 0.5757 0.7779 0.9419 nan 0.6405 0.7232 0.4853 0.7590 0.8745
0.1046 49.0909 2160 0.4999 0.6968 0.7912 0.8755 nan 0.8105 0.8361 0.5884 0.7772 0.9439 nan 0.6402 0.7216 0.4901 0.7588 0.8736
0.0701 49.5455 2180 0.5165 0.6942 0.7917 0.8743 nan 0.8236 0.8410 0.5828 0.7745 0.9367 nan 0.6330 0.7200 0.4862 0.7573 0.8746
0.5858 50.0 2200 0.5246 0.6936 0.7865 0.8737 nan 0.8112 0.8204 0.5689 0.7794 0.9527 nan 0.6428 0.7165 0.4784 0.7602 0.8699
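The headline Mean IoU and Mean Accuracy are unweighted averages over the five labeled classes; the nan "Unlabeled" column is excluded. This can be checked against the final-epoch row:

```python
# Reproduce the reported means from the final-epoch per-class metrics above
# (the nan "Unlabeled" class is left out of both averages).
per_class_iou = {
    "screen": 0.6428, "body": 0.7165, "cable": 0.4784,
    "plug": 0.7602, "void-background": 0.8699,
}
per_class_acc = {
    "screen": 0.8112, "body": 0.8204, "cable": 0.5689,
    "plug": 0.7794, "void-background": 0.9527,
}

mean_iou = sum(per_class_iou.values()) / len(per_class_iou)
mean_acc = sum(per_class_acc.values()) / len(per_class_acc)
print(round(mean_iou, 4), round(mean_acc, 4))  # 0.6936 0.7865
```

As a side note, 2200 total steps over 50 epochs imply 44 optimizer steps per epoch, i.e. roughly 88 training images at batch size 2.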

Framework versions

  • Transformers 4.48.3
  • Pytorch 2.6.0+cpu
  • Datasets 3.2.0
  • Tokenizers 0.21.0
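To approximate the training environment, the listed library versions can be pinned from PyPI (the `+cpu` tag above indicates a CPU-only PyTorch build; the plain `2.6.0` wheel is used here as a stand-in):

```shell
pip install "transformers==4.48.3" "torch==2.6.0" "datasets==3.2.0" "tokenizers==0.21.0"
```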