powerinfer-seq-cls
This model is a fine-tuned version of PowerInfer/SmallThinker-3B-Preview on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.0290
Model description
More information needed
Intended uses & limitations
More information needed
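No usage example is documented. As a minimal sketch only: the `-seq-cls` suffix and the reported validation loss suggest a sequence-classification head, so the checkpoint can presumably be loaded with `AutoModelForSequenceClassification`. The label set and the expected input format are assumptions, not documented facts.

```python
# Minimal loading sketch (assumption: this checkpoint carries a sequence-classification head).
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model_id = "HAO-K/powerinfer-seq-cls"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# The actual label set is not documented; id2label may only contain generic LABEL_<n> entries.
inputs = tokenizer("Example input text", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted_id = logits.argmax(dim=-1).item()
print(predicted_id, model.config.id2label.get(predicted_id, "unknown label"))
```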
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20
- mixed_precision_training: Native AMP
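For reference, these hyperparameters correspond roughly to the following `TrainingArguments` configuration. This is a reconstruction from the list above, not the author's training script; the output directory, dataset handling, and the multi-GPU launch command are omitted or assumed.

```python
# Approximate TrainingArguments matching the listed hyperparameters (reconstruction, not the original script).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="powerinfer-seq-cls",           # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=16,            # 16 per device x 16 accumulation steps -> reported total of 256
    optim="adamw_torch",                       # AdamW with default betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="cosine_with_restarts",
    warmup_ratio=0.1,
    num_train_epochs=20,
    fp16=True,                                 # "Native AMP"; fp16 assumed here (bf16 is also possible)
)
```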
Training results
Training Loss | Epoch | Step | Validation Loss |
---|---|---|---|
1.7722 | 0.0571 | 30 | 1.4541 |
0.9933 | 0.1143 | 60 | 0.7808 |
0.4922 | 0.1714 | 90 | 0.5142 |
0.3496 | 0.2286 | 120 | 0.3577 |
0.2746 | 0.2857 | 150 | 0.3452 |
0.2295 | 0.3429 | 180 | 0.1985 |
0.1529 | 0.4000 | 210 | 0.1752 |
0.1471 | 0.4572 | 240 | 0.1584 |
0.1819 | 0.5143 | 270 | 0.2583 |
0.1638 | 0.5715 | 300 | 0.2517 |
0.1294 | 0.6286 | 330 | 0.1405 |
0.1388 | 0.6858 | 360 | 0.1234 |
0.1207 | 0.7429 | 390 | 0.1893 |
0.1079 | 0.8001 | 420 | 0.1293 |
0.1075 | 0.8572 | 450 | 0.2412 |
0.1034 | 0.9144 | 480 | 0.2327 |
0.1196 | 0.9715 | 510 | 0.1451 |
0.0994 | 1.0295 | 540 | 0.0909 |
0.1103 | 1.0867 | 570 | 0.0970 |
0.1109 | 1.1438 | 600 | 0.1160 |
0.1 | 1.2010 | 630 | 0.1187 |
0.0858 | 1.2581 | 660 | 0.1644 |
0.0992 | 1.3153 | 690 | 0.1040 |
0.0689 | 1.3724 | 720 | 0.0764 |
0.0681 | 1.4296 | 750 | 0.0853 |
0.0905 | 1.4867 | 780 | 0.1184 |
0.0957 | 1.5439 | 810 | 0.1317 |
0.0868 | 1.6010 | 840 | 0.2817 |
0.1281 | 1.6582 | 870 | 0.2166 |
0.0785 | 1.7153 | 900 | 0.0928 |
0.0738 | 1.7725 | 930 | 0.2114 |
0.1337 | 1.8296 | 960 | 0.1321 |
0.1074 | 1.8868 | 990 | 0.1436 |
0.0829 | 1.9439 | 1020 | 0.1643 |
0.0942 | 2.0019 | 1050 | 0.1076 |
0.098 | 2.0591 | 1080 | 0.0937 |
0.0771 | 2.1162 | 1110 | 0.0598 |
0.0684 | 2.1734 | 1140 | 0.0551 |
0.1826 | 2.2305 | 1170 | 0.1241 |
0.1139 | 2.2877 | 1200 | 0.0567 |
0.0837 | 2.3448 | 1230 | 0.1561 |
0.1254 | 2.4020 | 1260 | 0.0968 |
0.0854 | 2.4591 | 1290 | 0.0820 |
0.0769 | 2.5163 | 1320 | 0.1383 |
0.0776 | 2.5734 | 1350 | 0.1381 |
0.0553 | 2.6306 | 1380 | 0.0787 |
0.1121 | 2.6877 | 1410 | 0.1219 |
0.0585 | 2.7449 | 1440 | 0.0777 |
0.0595 | 2.8020 | 1470 | 0.0906 |
0.0523 | 2.8591 | 1500 | 0.1116 |
0.0501 | 2.9163 | 1530 | 0.0475 |
0.076 | 2.9734 | 1560 | 0.1763 |
0.0663 | 3.0314 | 1590 | 0.0593 |
0.0576 | 3.0886 | 1620 | 0.0571 |
0.0369 | 3.1457 | 1650 | 0.0646 |
0.0537 | 3.2029 | 1680 | 0.0503 |
0.0474 | 3.2600 | 1710 | 0.0802 |
0.0698 | 3.3172 | 1740 | 0.1044 |
0.0566 | 3.3743 | 1770 | 0.1519 |
0.0466 | 3.4315 | 1800 | 0.0743 |
0.045 | 3.4886 | 1830 | 0.0652 |
0.0565 | 3.5458 | 1860 | 0.0635 |
0.0325 | 3.6029 | 1890 | 0.0801 |
0.0415 | 3.6601 | 1920 | 0.0729 |
0.048 | 3.7172 | 1950 | 0.0544 |
0.0369 | 3.7744 | 1980 | 0.0577 |
0.0406 | 3.8315 | 2010 | 0.0514 |
0.0437 | 3.8887 | 2040 | 0.0552 |
0.0445 | 3.9458 | 2070 | 0.0773 |
0.0472 | 4.0038 | 2100 | 0.0496 |
0.0495 | 4.0610 | 2130 | 0.0641 |
0.0472 | 4.1181 | 2160 | 0.0457 |
0.0367 | 4.1753 | 2190 | 0.0638 |
0.0404 | 4.2324 | 2220 | 0.0868 |
0.0529 | 4.2896 | 2250 | 0.0360 |
0.0306 | 4.3467 | 2280 | 0.0610 |
0.0405 | 4.4039 | 2310 | 0.0797 |
0.0447 | 4.4610 | 2340 | 0.0750 |
0.0552 | 4.5182 | 2370 | 0.0374 |
0.0337 | 4.5753 | 2400 | 0.0397 |
0.0306 | 4.6325 | 2430 | 0.0565 |
0.0382 | 4.6896 | 2460 | 0.0534 |
0.036 | 4.7468 | 2490 | 0.0461 |
0.0429 | 4.8039 | 2520 | 0.0609 |
0.0273 | 4.8611 | 2550 | 0.0404 |
0.0323 | 4.9182 | 2580 | 0.0407 |
0.0366 | 4.9754 | 2610 | 0.0406 |
0.0295 | 5.0333 | 2640 | 0.0511 |
0.0458 | 5.0905 | 2670 | 0.0414 |
0.0274 | 5.1476 | 2700 | 0.0399 |
0.0404 | 5.2048 | 2730 | 0.0474 |
0.0332 | 5.2619 | 2760 | 0.0425 |
0.0318 | 5.3191 | 2790 | 0.0532 |
0.0336 | 5.3762 | 2820 | 0.0397 |
0.03 | 5.4334 | 2850 | 0.0383 |
0.0289 | 5.4905 | 2880 | 0.0513 |
0.0355 | 5.5477 | 2910 | 0.0321 |
0.0283 | 5.6048 | 2940 | 0.0406 |
0.0269 | 5.6620 | 2970 | 0.0300 |
0.0239 | 5.7191 | 3000 | 0.0424 |
0.0285 | 5.7763 | 3030 | 0.0430 |
0.0283 | 5.8334 | 3060 | 0.0440 |
0.0328 | 5.8906 | 3090 | 0.0558 |
0.0273 | 5.9477 | 3120 | 0.0366 |
0.0516 | 6.0057 | 3150 | 0.0690 |
0.0468 | 6.0629 | 3180 | 0.0417 |
0.0323 | 6.1200 | 3210 | 0.0457 |
0.0259 | 6.1772 | 3240 | 0.0385 |
0.0302 | 6.2343 | 3270 | 0.0382 |
0.0254 | 6.2915 | 3300 | 0.0426 |
0.0312 | 6.3486 | 3330 | 0.0378 |
0.0289 | 6.4058 | 3360 | 0.0356 |
0.0388 | 6.4629 | 3390 | 0.0767 |
0.0294 | 6.5201 | 3420 | 0.0464 |
0.0303 | 6.5772 | 3450 | 0.0373 |
0.0247 | 6.6344 | 3480 | 0.0638 |
0.0213 | 6.6915 | 3510 | 0.0408 |
0.0298 | 6.7487 | 3540 | 0.0443 |
0.0229 | 6.8058 | 3570 | 0.0400 |
0.0194 | 6.8630 | 3600 | 0.0399 |
0.024 | 6.9201 | 3630 | 0.0435 |
0.0217 | 6.9773 | 3660 | 0.0387 |
0.0229 | 7.0352 | 3690 | 0.0383 |
0.0269 | 7.0924 | 3720 | 0.0404 |
0.0296 | 7.1495 | 3750 | 0.0724 |
0.0271 | 7.2067 | 3780 | 0.0490 |
0.0245 | 7.2638 | 3810 | 0.0377 |
0.0255 | 7.3210 | 3840 | 0.0407 |
0.0226 | 7.3781 | 3870 | 0.0369 |
0.0223 | 7.4353 | 3900 | 0.0341 |
0.0247 | 7.4924 | 3930 | 0.0385 |
0.0234 | 7.5496 | 3960 | 0.0369 |
0.0222 | 7.6067 | 3990 | 0.0372 |
0.014 | 7.6639 | 4020 | 0.0391 |
0.0252 | 7.7210 | 4050 | 0.0432 |
0.0318 | 7.7782 | 4080 | 0.0397 |
0.0254 | 7.8353 | 4110 | 0.0398 |
0.0262 | 7.8925 | 4140 | 0.0352 |
0.0199 | 7.9496 | 4170 | 0.0382 |
0.024 | 8.0076 | 4200 | 0.0363 |
0.0241 | 8.0648 | 4230 | 0.0327 |
0.0238 | 8.1219 | 4260 | 0.0336 |
0.0212 | 8.1791 | 4290 | 0.0372 |
0.0207 | 8.2362 | 4320 | 0.0336 |
0.0265 | 8.2934 | 4350 | 0.0369 |
0.0211 | 8.3505 | 4380 | 0.0498 |
0.0242 | 8.4077 | 4410 | 0.0635 |
0.0257 | 8.4648 | 4440 | 0.0594 |
0.0205 | 8.5220 | 4470 | 0.0565 |
0.0215 | 8.5791 | 4500 | 0.0487 |
0.0211 | 8.6363 | 4530 | 0.0449 |
0.0257 | 8.6934 | 4560 | 0.0452 |
0.0248 | 8.7506 | 4590 | 0.0401 |
0.0187 | 8.8077 | 4620 | 0.0419 |
0.0175 | 8.8649 | 4650 | 0.0434 |
0.0333 | 8.9220 | 4680 | 0.0457 |
0.0238 | 8.9792 | 4710 | 0.0466 |
0.0232 | 9.0371 | 4740 | 0.0474 |
0.0231 | 9.0943 | 4770 | 0.0502 |
0.0232 | 9.1514 | 4800 | 0.0479 |
0.0177 | 9.2086 | 4830 | 0.0476 |
0.0296 | 9.2657 | 4860 | 0.0497 |
0.023 | 9.3229 | 4890 | 0.0457 |
0.0237 | 9.3800 | 4920 | 0.0465 |
0.0221 | 9.4372 | 4950 | 0.0461 |
0.0231 | 9.4943 | 4980 | 0.0309 |
0.0221 | 9.5515 | 5010 | 0.0329 |
0.0198 | 9.6086 | 5040 | 0.0334 |
0.0194 | 9.6658 | 5070 | 0.0308 |
0.0188 | 9.7229 | 5100 | 0.0281 |
0.0168 | 9.7801 | 5130 | 0.0276 |
0.0199 | 9.8372 | 5160 | 0.0282 |
0.0202 | 9.8944 | 5190 | 0.0288 |
0.0148 | 9.9515 | 5220 | 0.0301 |
0.0152 | 10.0095 | 5250 | 0.0303 |
0.015 | 10.0667 | 5280 | 0.0302 |
0.0145 | 10.1238 | 5310 | 0.0300 |
0.0181 | 10.1810 | 5340 | 0.0303 |
0.0174 | 10.2381 | 5370 | 0.0300 |
0.0166 | 10.2953 | 5400 | 0.0300 |
0.0208 | 10.3524 | 5430 | 0.0300 |
0.0201 | 10.4096 | 5460 | 0.0301 |
0.0159 | 10.4667 | 5490 | 0.0302 |
0.0192 | 10.5239 | 5520 | 0.0302 |
0.0114 | 10.5810 | 5550 | 0.0302 |
0.0198 | 10.6382 | 5580 | 0.0302 |
0.0186 | 10.6953 | 5610 | 0.0301 |
0.0178 | 10.7525 | 5640 | 0.0299 |
0.0174 | 10.8096 | 5670 | 0.0298 |
0.0179 | 10.8668 | 5700 | 0.0298 |
0.0169 | 10.9239 | 5730 | 0.0298 |
0.0159 | 10.9811 | 5760 | 0.0298 |
0.02 | 11.0391 | 5790 | 0.0298 |
0.0163 | 11.0962 | 5820 | 0.0298 |
0.0191 | 11.1534 | 5850 | 0.0273 |
0.0188 | 11.2105 | 5880 | 0.0412 |
0.0327 | 11.2677 | 5910 | 0.0970 |
0.0592 | 11.3248 | 5940 | 0.1420 |
0.0542 | 11.3820 | 5970 | 0.0973 |
0.0776 | 11.4391 | 6000 | 0.0648 |
0.0597 | 11.4962 | 6030 | 0.1299 |
0.0566 | 11.5534 | 6060 | 0.0834 |
0.049 | 11.6105 | 6090 | 0.0560 |
0.0638 | 11.6677 | 6120 | 0.0450 |
0.0412 | 11.7248 | 6150 | 0.1571 |
0.0606 | 11.7820 | 6180 | 0.1047 |
0.0363 | 11.8391 | 6210 | 0.0540 |
0.0315 | 11.8963 | 6240 | 0.0546 |
0.0397 | 11.9534 | 6270 | 0.0920 |
0.0272 | 12.0114 | 6300 | 0.0515 |
0.0402 | 12.0686 | 6330 | 0.0834 |
0.0393 | 12.1257 | 6360 | 0.0482 |
0.0302 | 12.1829 | 6390 | 0.0605 |
0.0354 | 12.2400 | 6420 | 0.1459 |
0.0379 | 12.2972 | 6450 | 0.0443 |
0.0317 | 12.3543 | 6480 | 0.0440 |
0.0278 | 12.4115 | 6510 | 0.0516 |
0.0224 | 12.4686 | 6540 | 0.0533 |
0.031 | 12.5258 | 6570 | 0.0474 |
0.0469 | 12.5829 | 6600 | 0.1061 |
0.0739 | 12.6401 | 6630 | 0.0689 |
0.0301 | 12.6972 | 6660 | 0.0511 |
0.0344 | 12.7544 | 6690 | 0.0428 |
0.0305 | 12.8115 | 6720 | 0.0441 |
0.0282 | 12.8687 | 6750 | 0.0439 |
0.0238 | 12.9258 | 6780 | 0.0448 |
0.041 | 12.9830 | 6810 | 0.0395 |
0.0273 | 13.0410 | 6840 | 0.0427 |
0.0284 | 13.0981 | 6870 | 0.0736 |
0.0296 | 13.1553 | 6900 | 0.0824 |
0.3257 | 13.2124 | 6930 | 0.1177 |
0.1631 | 13.2696 | 6960 | 0.0749 |
0.0364 | 13.3267 | 6990 | 0.0722 |
0.0289 | 13.3839 | 7020 | 0.0501 |
0.0252 | 13.4410 | 7050 | 0.0722 |
0.033 | 13.4982 | 7080 | 0.0498 |
0.045 | 13.5553 | 7110 | 0.0712 |
0.0354 | 13.6125 | 7140 | 0.0592 |
0.0255 | 13.6696 | 7170 | 0.0491 |
0.0382 | 13.7268 | 7200 | 0.1050 |
0.0373 | 13.7839 | 7230 | 0.0591 |
0.0381 | 13.8411 | 7260 | 0.0626 |
0.0265 | 13.8982 | 7290 | 0.0612 |
0.0263 | 13.9554 | 7320 | 0.0870 |
0.0299 | 14.0133 | 7350 | 0.0602 |
0.0289 | 14.0705 | 7380 | 0.0501 |
0.0255 | 14.1276 | 7410 | 0.0454 |
0.0223 | 14.1848 | 7440 | 0.0581 |
0.0246 | 14.2419 | 7470 | 0.0464 |
0.0221 | 14.2991 | 7500 | 0.0478 |
0.0286 | 14.3562 | 7530 | 0.0839 |
0.0279 | 14.4134 | 7560 | 0.0606 |
0.0287 | 14.4705 | 7590 | 0.0443 |
0.0186 | 14.5277 | 7620 | 0.0591 |
0.0213 | 14.5848 | 7650 | 0.0548 |
0.0239 | 14.6420 | 7680 | 0.0619 |
0.0258 | 14.6991 | 7710 | 0.0739 |
0.0385 | 14.7563 | 7740 | 0.0575 |
0.0338 | 14.8134 | 7770 | 0.0795 |
0.0321 | 14.8706 | 7800 | 0.0482 |
0.0271 | 14.9277 | 7830 | 0.0462 |
0.0236 | 14.9849 | 7860 | 0.0465 |
0.0249 | 15.0429 | 7890 | 0.0438 |
0.0214 | 15.1000 | 7920 | 0.0697 |
0.0295 | 15.1572 | 7950 | 0.0677 |
0.0257 | 15.2143 | 7980 | 0.0403 |
0.0247 | 15.2715 | 8010 | 0.0490 |
0.0272 | 15.3286 | 8040 | 0.0453 |
0.0197 | 15.3858 | 8070 | 0.0507 |
0.0209 | 15.4429 | 8100 | 0.0561 |
0.0155 | 15.5001 | 8130 | 0.0569 |
0.0243 | 15.5572 | 8160 | 0.0494 |
0.0247 | 15.6144 | 8190 | 0.0508 |
0.0305 | 15.6715 | 8220 | 0.0517 |
0.0219 | 15.7287 | 8250 | 0.0370 |
0.0239 | 15.7858 | 8280 | 0.0427 |
0.0231 | 15.8430 | 8310 | 0.0311 |
0.0215 | 15.9001 | 8340 | 0.0315 |
0.0202 | 15.9573 | 8370 | 0.0288 |
0.0258 | 16.0152 | 8400 | 0.0299 |
0.021 | 16.0724 | 8430 | 0.0289 |
0.0191 | 16.1295 | 8460 | 0.0296 |
0.019 | 16.1867 | 8490 | 0.0310 |
0.0233 | 16.2438 | 8520 | 0.0365 |
0.0183 | 16.3010 | 8550 | 0.0275 |
0.0203 | 16.3581 | 8580 | 0.0296 |
0.0218 | 16.4153 | 8610 | 0.0282 |
0.0172 | 16.4724 | 8640 | 0.0274 |
0.0197 | 16.5296 | 8670 | 0.0273 |
0.0189 | 16.5867 | 8700 | 0.0329 |
0.0174 | 16.6439 | 8730 | 0.0285 |
0.019 | 16.7010 | 8760 | 0.0294 |
0.0191 | 16.7582 | 8790 | 0.0379 |
0.0246 | 16.8153 | 8820 | 0.0348 |
0.0179 | 16.8725 | 8850 | 0.0372 |
0.0207 | 16.9296 | 8880 | 0.0449 |
0.0195 | 16.9868 | 8910 | 0.0361 |
0.0148 | 17.0448 | 8940 | 0.0388 |
0.0193 | 17.1019 | 8970 | 0.0430 |
0.0122 | 17.1591 | 9000 | 0.0422 |
0.0167 | 17.2162 | 9030 | 0.0337 |
0.0218 | 17.2734 | 9060 | 0.0345 |
0.0173 | 17.3305 | 9090 | 0.0389 |
0.02 | 17.3877 | 9120 | 0.0464 |
0.0155 | 17.4448 | 9150 | 0.0391 |
0.0212 | 17.5020 | 9180 | 0.0370 |
0.0187 | 17.5591 | 9210 | 0.0362 |
0.0195 | 17.6163 | 9240 | 0.0367 |
0.0221 | 17.6734 | 9270 | 0.0443 |
0.0191 | 17.7306 | 9300 | 0.0375 |
0.0199 | 17.7877 | 9330 | 0.0391 |
0.0201 | 17.8449 | 9360 | 0.0373 |
0.0179 | 17.9020 | 9390 | 0.0356 |
0.0178 | 17.9592 | 9420 | 0.0375 |
0.0202 | 18.0171 | 9450 | 0.0323 |
0.0186 | 18.0743 | 9480 | 0.0364 |
0.0156 | 18.1314 | 9510 | 0.0299 |
0.0151 | 18.1886 | 9540 | 0.0295 |
0.0174 | 18.2457 | 9570 | 0.0295 |
0.0175 | 18.3029 | 9600 | 0.0309 |
0.0166 | 18.3600 | 9630 | 0.0315 |
0.0176 | 18.4172 | 9660 | 0.0305 |
0.0196 | 18.4743 | 9690 | 0.0308 |
0.0142 | 18.5315 | 9720 | 0.0328 |
0.0175 | 18.5886 | 9750 | 0.0311 |
0.0199 | 18.6458 | 9780 | 0.0304 |
0.0132 | 18.7029 | 9810 | 0.0305 |
0.016 | 18.7601 | 9840 | 0.0305 |
0.0172 | 18.8172 | 9870 | 0.0300 |
0.0146 | 18.8744 | 9900 | 0.0299 |
0.0187 | 18.9315 | 9930 | 0.0300 |
0.016 | 18.9887 | 9960 | 0.0300 |
0.0151 | 19.0467 | 9990 | 0.0304 |
0.0146 | 19.1038 | 10020 | 0.0308 |
0.0151 | 19.1610 | 10050 | 0.0307 |
0.0156 | 19.2181 | 10080 | 0.0305 |
0.0181 | 19.2753 | 10110 | 0.0287 |
0.0147 | 19.3324 | 10140 | 0.0285 |
0.0139 | 19.3896 | 10170 | 0.0283 |
0.0145 | 19.4467 | 10200 | 0.0282 |
0.0195 | 19.5039 | 10230 | 0.0286 |
0.0146 | 19.5610 | 10260 | 0.0290 |
0.0139 | 19.6182 | 10290 | 0.0291 |
0.0199 | 19.6753 | 10320 | 0.0290 |
0.0131 | 19.7325 | 10350 | 0.0289 |
0.0149 | 19.7896 | 10380 | 0.0290 |
0.0176 | 19.8468 | 10410 | 0.0291 |
0.0172 | 19.9039 | 10440 | 0.0290 |
0.0167 | 19.9611 | 10470 | 0.0290 |
Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.20.3
Model tree for HAO-K/powerinfer-seq-cls
- Base model: Qwen/Qwen2.5-3B
- Fine-tuned: Qwen/Qwen2.5-3B-Instruct
- Fine-tuned: PowerInfer/SmallThinker-3B-Preview
- Fine-tuned: HAO-K/powerinfer-seq-cls (this model)