# yolo_finetuned_fruits
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unspecified fruit-detection dataset (banana, orange, and apple classes). It achieves the following results on the evaluation set:
- Loss: 0.8182
- mAP: 0.5572
- mAP@50: 0.8422
- mAP@75: 0.5925
- mAP (small): -1.0
- mAP (medium): 0.4995
- mAP (large): 0.5815
- mAR@1: 0.406
- mAR@10: 0.7081
- mAR@100: 0.771
- mAR (small): -1.0
- mAR (medium): 0.6571
- mAR (large): 0.7893
- mAP (banana): 0.4184
- mAR@100 (banana): 0.755
- mAP (orange): 0.5804
- mAR@100 (orange): 0.7667
- mAP (apple): 0.6727
- mAR@100 (apple): 0.7914

A value of -1.0 means the evaluation set contains no objects in that size bucket, so the metric is undefined there.
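For a quick sanity check, the checkpoint can be run through the Transformers `object-detection` pipeline. This is a minimal sketch, assuming the model is published under the repo id `aiarenm/yolo_finetuned_fruits` and that `fruits.jpg` is a local test image:

```python
# Minimal inference sketch; repo id taken from this page, image path assumed.
from transformers import pipeline

detector = pipeline("object-detection", model="aiarenm/yolo_finetuned_fruits")

for detection in detector("fruits.jpg"):
    # Each detection is a dict with a class label (banana/orange/apple),
    # a confidence score, and a pixel-coordinate bounding box.
    print(detection["label"], round(detection["score"], 3), detection["box"])
```

The pipeline wraps the YOLOS image processor and model, including the post-processing that converts raw logits and box embeddings into thresholded detections.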
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
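These settings correspond to a `transformers.TrainingArguments` configuration along the following lines. This is a reconstruction from the list above, not the original training script; the output directory name is an assumption:

```python
# Hypothetical reconstruction of the training configuration listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="yolo_finetuned_fruits",  # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",                 # AdamW with default betas/epsilon
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```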
### Training results
Training Loss | Epoch | Step | Validation Loss | mAP | mAP@50 | mAP@75 | mAP (S) | mAP (M) | mAP (L) | mAR@1 | mAR@10 | mAR@100 | mAR (S) | mAR (M) | mAR (L) | mAP Banana | mAR@100 Banana | mAP Orange | mAR@100 Orange | mAP Apple | mAR@100 Apple
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
No log | 1.0 | 60 | 1.3832 | 0.0662 | 0.1157 | 0.0675 | -1.0 | 0.0739 | 0.0823 | 0.2406 | 0.4375 | 0.5766 | -1.0 | 0.3429 | 0.6114 | 0.0589 | 0.585 | 0.0233 | 0.5476 | 0.1164 | 0.5971 |
No log | 2.0 | 120 | 1.1698 | 0.1353 | 0.2296 | 0.1411 | -1.0 | 0.084 | 0.153 | 0.2702 | 0.4985 | 0.5976 | -1.0 | 0.3 | 0.6393 | 0.1204 | 0.64 | 0.0371 | 0.45 | 0.2482 | 0.7029 |
No log | 3.0 | 180 | 1.1113 | 0.2256 | 0.3806 | 0.2196 | -1.0 | 0.2253 | 0.2425 | 0.2869 | 0.541 | 0.6777 | -1.0 | 0.4429 | 0.7111 | 0.2027 | 0.6825 | 0.0814 | 0.619 | 0.3926 | 0.7314 |
No log | 4.0 | 240 | 1.1347 | 0.2893 | 0.4969 | 0.3288 | -1.0 | 0.227 | 0.3081 | 0.31 | 0.5639 | 0.6804 | -1.0 | 0.5 | 0.708 | 0.2585 | 0.6575 | 0.1758 | 0.6524 | 0.4336 | 0.7314 |
No log | 5.0 | 300 | 1.2339 | 0.3139 | 0.5879 | 0.2967 | -1.0 | 0.331 | 0.3322 | 0.3008 | 0.5406 | 0.632 | -1.0 | 0.5429 | 0.6512 | 0.206 | 0.575 | 0.2292 | 0.681 | 0.5065 | 0.64 |
No log | 6.0 | 360 | 0.9967 | 0.4014 | 0.6469 | 0.4216 | -1.0 | 0.3962 | 0.4191 | 0.358 | 0.6481 | 0.7231 | -1.0 | 0.6286 | 0.742 | 0.3186 | 0.6625 | 0.2708 | 0.7238 | 0.6147 | 0.7829 |
No log | 7.0 | 420 | 1.0597 | 0.4142 | 0.7448 | 0.4478 | -1.0 | 0.3237 | 0.4436 | 0.3425 | 0.6121 | 0.6731 | -1.0 | 0.5143 | 0.6985 | 0.3237 | 0.6575 | 0.3926 | 0.6762 | 0.5264 | 0.6857 |
No log | 8.0 | 480 | 0.8963 | 0.4543 | 0.7499 | 0.457 | -1.0 | 0.4688 | 0.4784 | 0.3545 | 0.6702 | 0.7344 | -1.0 | 0.6357 | 0.75 | 0.3374 | 0.72 | 0.4997 | 0.7548 | 0.526 | 0.7286 |
1.1464 | 9.0 | 540 | 0.9669 | 0.4473 | 0.706 | 0.4926 | -1.0 | 0.3025 | 0.4892 | 0.369 | 0.6374 | 0.719 | -1.0 | 0.5643 | 0.7461 | 0.3263 | 0.6775 | 0.4506 | 0.731 | 0.5649 | 0.7486 |
1.1464 | 10.0 | 600 | 0.9448 | 0.4642 | 0.7241 | 0.5078 | -1.0 | 0.4185 | 0.4969 | 0.3656 | 0.6285 | 0.7131 | -1.0 | 0.5714 | 0.7363 | 0.3286 | 0.6875 | 0.4589 | 0.7262 | 0.6051 | 0.7257 |
1.1464 | 11.0 | 660 | 0.9464 | 0.4645 | 0.7321 | 0.4897 | -1.0 | 0.393 | 0.502 | 0.381 | 0.6472 | 0.7229 | -1.0 | 0.6071 | 0.7425 | 0.3356 | 0.6925 | 0.4614 | 0.7476 | 0.5964 | 0.7286 |
1.1464 | 12.0 | 720 | 0.9143 | 0.4816 | 0.7601 | 0.5366 | -1.0 | 0.4266 | 0.5129 | 0.37 | 0.6644 | 0.7434 | -1.0 | 0.6143 | 0.7642 | 0.3459 | 0.7225 | 0.4752 | 0.7333 | 0.6236 | 0.7743 |
1.1464 | 13.0 | 780 | 0.8523 | 0.5186 | 0.7851 | 0.5524 | -1.0 | 0.4882 | 0.5457 | 0.3976 | 0.6733 | 0.7352 | -1.0 | 0.6071 | 0.7542 | 0.3849 | 0.7475 | 0.5316 | 0.7238 | 0.6394 | 0.7343 |
1.1464 | 14.0 | 840 | 0.8937 | 0.5077 | 0.7907 | 0.557 | -1.0 | 0.4944 | 0.5348 | 0.3906 | 0.6622 | 0.748 | -1.0 | 0.6571 | 0.7649 | 0.3544 | 0.7125 | 0.5471 | 0.7571 | 0.6215 | 0.7743 |
1.1464 | 15.0 | 900 | 0.8502 | 0.52 | 0.8012 | 0.5662 | -1.0 | 0.4128 | 0.5524 | 0.4075 | 0.669 | 0.7367 | -1.0 | 0.5857 | 0.7619 | 0.3692 | 0.715 | 0.5478 | 0.7381 | 0.6432 | 0.7571 |
1.1464 | 16.0 | 960 | 0.8644 | 0.515 | 0.8117 | 0.5603 | -1.0 | 0.4659 | 0.5399 | 0.3743 | 0.6587 | 0.7319 | -1.0 | 0.5857 | 0.7537 | 0.3634 | 0.7325 | 0.5928 | 0.769 | 0.5889 | 0.6943 |
0.6981 | 17.0 | 1020 | 0.8121 | 0.5252 | 0.8011 | 0.5394 | -1.0 | 0.4677 | 0.5524 | 0.3924 | 0.6975 | 0.7589 | -1.0 | 0.6571 | 0.7767 | 0.347 | 0.73 | 0.5675 | 0.7667 | 0.6611 | 0.78 |
0.6981 | 18.0 | 1080 | 0.8345 | 0.5364 | 0.8268 | 0.5691 | -1.0 | 0.5232 | 0.559 | 0.3959 | 0.6849 | 0.7559 | -1.0 | 0.6571 | 0.7713 | 0.3633 | 0.7425 | 0.58 | 0.7595 | 0.6659 | 0.7657 |
0.6981 | 19.0 | 1140 | 0.8186 | 0.531 | 0.8115 | 0.5705 | -1.0 | 0.4991 | 0.5533 | 0.3908 | 0.6815 | 0.748 | -1.0 | 0.65 | 0.7623 | 0.3856 | 0.7525 | 0.5816 | 0.7571 | 0.6257 | 0.7343 |
0.6981 | 20.0 | 1200 | 0.7999 | 0.562 | 0.8515 | 0.598 | -1.0 | 0.4755 | 0.5915 | 0.4071 | 0.7006 | 0.7647 | -1.0 | 0.6143 | 0.7881 | 0.4081 | 0.755 | 0.6069 | 0.7762 | 0.6711 | 0.7629 |
0.6981 | 21.0 | 1260 | 0.8050 | 0.5545 | 0.8284 | 0.6088 | -1.0 | 0.4877 | 0.5829 | 0.3939 | 0.704 | 0.7691 | -1.0 | 0.6429 | 0.7885 | 0.4096 | 0.7625 | 0.608 | 0.7619 | 0.6459 | 0.7829 |
0.6981 | 22.0 | 1320 | 0.8181 | 0.5503 | 0.813 | 0.573 | -1.0 | 0.4993 | 0.5756 | 0.4033 | 0.7077 | 0.77 | -1.0 | 0.6714 | 0.787 | 0.4089 | 0.7375 | 0.5731 | 0.7667 | 0.6689 | 0.8057 |
0.6981 | 23.0 | 1380 | 0.8230 | 0.553 | 0.8332 | 0.5737 | -1.0 | 0.4984 | 0.5758 | 0.4016 | 0.7059 | 0.7706 | -1.0 | 0.6571 | 0.7882 | 0.405 | 0.7575 | 0.5733 | 0.7571 | 0.6807 | 0.7971 |
0.6981 | 24.0 | 1440 | 0.8177 | 0.5512 | 0.8339 | 0.5859 | -1.0 | 0.5113 | 0.5758 | 0.402 | 0.701 | 0.7675 | -1.0 | 0.6357 | 0.7882 | 0.4096 | 0.7525 | 0.5716 | 0.7643 | 0.6724 | 0.7857 |
0.5444 | 25.0 | 1500 | 0.8300 | 0.558 | 0.8348 | 0.5836 | -1.0 | 0.4993 | 0.5835 | 0.4049 | 0.7054 | 0.7706 | -1.0 | 0.65 | 0.7902 | 0.4095 | 0.75 | 0.5775 | 0.7619 | 0.6868 | 0.8 |
0.5444 | 26.0 | 1560 | 0.8121 | 0.5618 | 0.8348 | 0.5814 | -1.0 | 0.5067 | 0.5873 | 0.4102 | 0.7154 | 0.7776 | -1.0 | 0.6714 | 0.7954 | 0.4142 | 0.7575 | 0.5854 | 0.7667 | 0.6859 | 0.8086 |
0.5444 | 27.0 | 1620 | 0.8138 | 0.5582 | 0.8328 | 0.5901 | -1.0 | 0.5006 | 0.5824 | 0.4064 | 0.7073 | 0.7702 | -1.0 | 0.6571 | 0.7883 | 0.4105 | 0.755 | 0.5879 | 0.7643 | 0.6762 | 0.7914 |
0.5444 | 28.0 | 1680 | 0.8084 | 0.5597 | 0.842 | 0.5932 | -1.0 | 0.4993 | 0.5837 | 0.4052 | 0.704 | 0.7677 | -1.0 | 0.6571 | 0.7854 | 0.4212 | 0.7525 | 0.5829 | 0.7619 | 0.6748 | 0.7886 |
0.5444 | 29.0 | 1740 | 0.8162 | 0.5581 | 0.8424 | 0.593 | -1.0 | 0.4995 | 0.5829 | 0.4052 | 0.7065 | 0.7719 | -1.0 | 0.6571 | 0.79 | 0.4187 | 0.76 | 0.5811 | 0.7643 | 0.6745 | 0.7914 |
0.5444 | 30.0 | 1800 | 0.8182 | 0.5572 | 0.8422 | 0.5925 | -1.0 | 0.4995 | 0.5815 | 0.406 | 0.7081 | 0.771 | -1.0 | 0.6571 | 0.7893 | 0.4184 | 0.755 | 0.5804 | 0.7667 | 0.6727 | 0.7914 |
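The mAP/mAR columns follow the COCO conventions (averaged over IoU thresholds 0.50:0.95 unless a specific threshold or size bucket is named) and can be computed with `torchmetrics`. A minimal sketch with illustrative boxes, not the actual evaluation data:

```python
# Sketch of COCO-style detection metrics with torchmetrics; tensors are toy data.
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

metric = MeanAveragePrecision(iou_type="bbox", class_metrics=True)

preds = [{
    "boxes": torch.tensor([[10.0, 12.0, 95.0, 100.0]]),  # xyxy, pixels
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),  # e.g. 0 = banana
}]
targets = [{
    "boxes": torch.tensor([[10.0, 10.0, 100.0, 100.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
results = metric.compute()
# results contains map, map_50, map_75, map_small/medium/large, mar_1/10/100,
# and, with class_metrics=True, per-class values (map_per_class, mar_100_per_class).
# Size-bucketed entries come back as -1.0 when no ground-truth boxes fall in
# that size range, which is why the "small" columns above are -1.0.
print(results["map"], results["map_50"], results["mar_100"])
```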
### Framework versions
- Transformers 4.51.3
- PyTorch 2.6.0+cu124
- Datasets 3.5.1
- Tokenizers 0.21.1