# DeepSeek-R1-Distill-Qwen-1.5B-2-contract-sections-classification-v4-50
This model is a fine-tuned version of deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B on an unknown dataset. It achieves the following results on the evaluation set (a sketch of how such metrics are typically computed follows the list):
- Loss: 0.9474
- Accuracy (Evaluate): 0.7023
- Precision (Evaluate): 0.7283
- Recall (Evaluate): 0.7032
- F1 (Evaluate): 0.7003
- Accuracy (sklearn): 0.7023
- Precision (sklearn): 0.7252
- Recall (sklearn): 0.7023
- F1 (sklearn): 0.6989
- Label accuracy (Objeto): 0.8554
- Label accuracy (Obrigacoes): 0.7795
- Label accuracy (Valor): 0.6017
- Label accuracy (Vigencia): 0.5932
- Label accuracy (Rescisao): 0.6676
- Label accuracy (Foro): 0.9654
- Label accuracy (Reajuste): 0.4306
- Label accuracy (Fiscalizacao): 0.6435
- Label accuracy (Publicacao): 0.8177
- Label accuracy (Pagamento): 0.4457
- Label accuracy (Casos Omissos): 0.8276
- Label accuracy (Sancoes): 0.7339
- Label accuracy (Dotacao Orcamentaria): 0.7802
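Each headline metric is reported twice, once from the 🤗 Evaluate library and once from scikit-learn, followed by a per-label accuracy for each of the 13 contract-section classes. As a minimal sketch (not the author's actual evaluation code), the scikit-learn side is typically computed as below; `y_true`, `y_pred`, and `label_names` are assumed inputs, and per-label accuracy is taken to be per-class recall (confusion-matrix diagonal over true class counts), one common definition consistent with the numbers reported here.

```python
# Hypothetical sketch of the "(sklearn)" metrics and per-label accuracies.
# `y_true`, `y_pred` (integer class ids) and `label_names` are assumptions.
import numpy as np
from sklearn.metrics import (
    accuracy_score,
    confusion_matrix,
    precision_recall_fscore_support,
)

def sklearn_metrics(y_true, y_pred, label_names):
    # Weighted averaging is assumed; weighted recall equals accuracy,
    # matching the identical Accuracy/Recall values in the list above.
    precision, recall, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average="weighted", zero_division=0
    )
    metrics = {
        "accuracy": accuracy_score(y_true, y_pred),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
    # Per-label accuracy as per-class recall: diagonal of the confusion
    # matrix divided by the number of true examples of each class.
    cm = confusion_matrix(y_true, y_pred, labels=range(len(label_names)))
    per_label = cm.diagonal() / cm.sum(axis=1).clip(min=1)
    for name, acc in zip(label_names, per_label):
        metrics[f"label_accuracy_{name}"] = float(acc)
    return metrics
```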
## Model description
More information needed
## Intended uses & limitations
More information needed
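Pending that information, a plausible way to load this adapter for inference is sketched below. The repository id and base model are taken from this card; the label count (13, from the per-label metrics above) and all preprocessing choices are assumptions, not documented behavior.

```python
# Hedged sketch: loading this PEFT adapter for contract-section
# classification. Label mapping and preprocessing are assumptions;
# consult the repository files for the authoritative setup.
import torch
from transformers import AutoTokenizer
from peft import AutoPeftModelForSequenceClassification

repo_id = "marcelovidigal/DeepSeek-R1-Distill-Qwen-1.5B-2-contract-sections-classification-v4-50"

tokenizer = AutoTokenizer.from_pretrained("deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B")
model = AutoPeftModelForSequenceClassification.from_pretrained(
    repo_id,
    num_labels=13,  # assumption: 13 contract-section labels per the metrics above
)
model.eval()

# Classify one contract clause (example text is illustrative only).
inputs = tokenizer("CLÁUSULA DÉCIMA - DO FORO ...", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(int(logits.argmax(dim=-1)))  # index into the 13 section labels
```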
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (restated as a `TrainingArguments` sketch after this list):
- learning_rate: 1e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
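For convenience, the list above maps onto `transformers.TrainingArguments` roughly as follows; this is a hedged reconstruction, and `output_dir` plus anything not listed above are placeholders rather than the author's actual settings.

```python
# Hedged reconstruction of the hyperparameters listed above.
# output_dir is a placeholder; unlisted arguments keep their defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="contract-sections-classifier",  # placeholder
    learning_rate=1e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",  # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    fp16=True,  # "Native AMP" mixed-precision training
)
```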
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy (Evaluate) | Precision (Evaluate) | Recall (Evaluate) | F1 (Evaluate) | Accuracy (sklearn) | Precision (sklearn) | Recall (sklearn) | F1 (sklearn) | Label Acc. (Objeto) | Label Acc. (Obrigacoes) | Label Acc. (Valor) | Label Acc. (Vigencia) | Label Acc. (Rescisao) | Label Acc. (Foro) | Label Acc. (Reajuste) | Label Acc. (Fiscalizacao) | Label Acc. (Publicacao) | Label Acc. (Pagamento) | Label Acc. (Casos Omissos) | Label Acc. (Sancoes) | Label Acc. (Dotacao Orcamentaria) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
3.9512 | 1.0 | 1000 | 3.6400 | 0.0653 | 0.1038 | 0.0796 | 0.0507 | 0.0653 | 0.1226 | 0.0653 | 0.0461 | 0.0021 | 0.0067 | 0.0143 | 0.1129 | 0.1219 | 0.3077 | 0.0036 | 0.0 | 0.0739 | 0.0072 | 0.0 | 0.0550 | 0.3297 |
3.3304 | 2.0 | 2000 | 3.1552 | 0.0877 | 0.1109 | 0.1027 | 0.0684 | 0.0877 | 0.1335 | 0.0877 | 0.0645 | 0.0145 | 0.0202 | 0.0659 | 0.1627 | 0.1219 | 0.3077 | 0.0036 | 0.0 | 0.4236 | 0.0072 | 0.0148 | 0.0550 | 0.1374 |
2.7872 | 3.0 | 3000 | 2.7514 | 0.1517 | 0.1497 | 0.1604 | 0.1408 | 0.1517 | 0.1648 | 0.1517 | 0.1429 | 0.0785 | 0.1414 | 0.2321 | 0.1680 | 0.1330 | 0.2923 | 0.0071 | 0.0946 | 0.5123 | 0.0181 | 0.2660 | 0.0642 | 0.0769 |
2.4261 | 4.0 | 4000 | 2.5576 | 0.2005 | 0.1879 | 0.1996 | 0.1902 | 0.2005 | 0.1960 | 0.2005 | 0.1943 | 0.2273 | 0.2677 | 0.2350 | 0.1890 | 0.0997 | 0.3192 | 0.0214 | 0.1104 | 0.4187 | 0.0181 | 0.5320 | 0.1009 | 0.0549 |
2.2045 | 5.0 | 5000 | 2.4390 | 0.2325 | 0.2250 | 0.2249 | 0.2204 | 0.2325 | 0.2313 | 0.2325 | 0.2267 | 0.3037 | 0.3367 | 0.2350 | 0.2310 | 0.1330 | 0.3231 | 0.0427 | 0.1104 | 0.4286 | 0.0254 | 0.5714 | 0.1284 | 0.0549 |
2.0064 | 6.0 | 6000 | 2.3316 | 0.2695 | 0.2628 | 0.2597 | 0.2554 | 0.2695 | 0.2714 | 0.2695 | 0.2637 | 0.4008 | 0.3468 | 0.2407 | 0.2336 | 0.1690 | 0.3577 | 0.2206 | 0.1167 | 0.4877 | 0.0507 | 0.5862 | 0.1376 | 0.0275 |
1.8509 | 7.0 | 7000 | 2.2336 | 0.2933 | 0.2896 | 0.2810 | 0.2782 | 0.2933 | 0.2989 | 0.2933 | 0.2876 | 0.4690 | 0.3586 | 0.2436 | 0.2598 | 0.1745 | 0.3962 | 0.2384 | 0.1451 | 0.4828 | 0.0797 | 0.6404 | 0.1376 | 0.0275 |
1.7072 | 8.0 | 8000 | 2.1432 | 0.321 | 0.3267 | 0.3075 | 0.3084 | 0.321 | 0.3354 | 0.321 | 0.3181 | 0.5186 | 0.3653 | 0.2493 | 0.2625 | 0.1911 | 0.4192 | 0.2847 | 0.2587 | 0.4926 | 0.1196 | 0.6650 | 0.1376 | 0.0330 |
1.6043 | 9.0 | 9000 | 2.0637 | 0.3443 | 0.3542 | 0.3288 | 0.3306 | 0.3443 | 0.3637 | 0.3443 | 0.3414 | 0.5702 | 0.3822 | 0.2521 | 0.2808 | 0.2022 | 0.4615 | 0.3132 | 0.3123 | 0.5172 | 0.1341 | 0.6601 | 0.1560 | 0.0330 |
1.4611 | 10.0 | 10000 | 1.9872 | 0.3822 | 0.3950 | 0.3639 | 0.3670 | 0.3822 | 0.4059 | 0.3822 | 0.3803 | 0.5971 | 0.4242 | 0.2751 | 0.3386 | 0.2133 | 0.5154 | 0.3203 | 0.4322 | 0.5369 | 0.1993 | 0.6601 | 0.1743 | 0.0440 |
1.3565 | 11.0 | 11000 | 1.9195 | 0.4037 | 0.4203 | 0.3862 | 0.3889 | 0.4037 | 0.4298 | 0.4037 | 0.4018 | 0.6116 | 0.4360 | 0.2951 | 0.3491 | 0.2188 | 0.5615 | 0.3310 | 0.5047 | 0.5567 | 0.2428 | 0.6601 | 0.1927 | 0.0604 |
1.2776 | 12.0 | 12000 | 1.8541 | 0.4235 | 0.4395 | 0.4056 | 0.4077 | 0.4235 | 0.4499 | 0.4235 | 0.4220 | 0.6281 | 0.4663 | 0.3209 | 0.3596 | 0.2410 | 0.5962 | 0.3416 | 0.5205 | 0.5665 | 0.2681 | 0.6601 | 0.2385 | 0.0659 |
1.2015 | 13.0 | 13000 | 1.7973 | 0.4415 | 0.4623 | 0.4297 | 0.4304 | 0.4415 | 0.4723 | 0.4415 | 0.4417 | 0.6384 | 0.4848 | 0.3524 | 0.3885 | 0.2604 | 0.6192 | 0.3452 | 0.4353 | 0.5714 | 0.2862 | 0.7931 | 0.3119 | 0.0989 |
1.0948 | 14.0 | 14000 | 1.7435 | 0.465 | 0.4844 | 0.4533 | 0.4516 | 0.465 | 0.4969 | 0.465 | 0.4648 | 0.6653 | 0.5253 | 0.3954 | 0.3963 | 0.2825 | 0.6731 | 0.3488 | 0.4479 | 0.5862 | 0.2645 | 0.7980 | 0.3670 | 0.1429 |
1.0335 | 15.0 | 15000 | 1.6933 | 0.4788 | 0.4981 | 0.4701 | 0.4663 | 0.4788 | 0.5114 | 0.4788 | 0.4789 | 0.6880 | 0.5269 | 0.4069 | 0.4042 | 0.2936 | 0.6923 | 0.3488 | 0.4511 | 0.5911 | 0.3007 | 0.8030 | 0.4128 | 0.1923 |
0.9816 | 16.0 | 16000 | 1.6456 | 0.4968 | 0.5181 | 0.4918 | 0.4848 | 0.4968 | 0.5318 | 0.4968 | 0.4966 | 0.7004 | 0.5455 | 0.4298 | 0.4094 | 0.3102 | 0.7115 | 0.3488 | 0.4511 | 0.7044 | 0.3043 | 0.8030 | 0.4495 | 0.2253 |
0.9221 | 17.0 | 17000 | 1.5969 | 0.5142 | 0.5474 | 0.5091 | 0.5038 | 0.5142 | 0.5601 | 0.5142 | 0.5150 | 0.7149 | 0.5875 | 0.4470 | 0.4173 | 0.3324 | 0.7538 | 0.3523 | 0.4637 | 0.7143 | 0.2717 | 0.7980 | 0.4954 | 0.2692 |
0.8315 | 18.0 | 18000 | 1.5564 | 0.5325 | 0.5665 | 0.5267 | 0.5212 | 0.5325 | 0.5794 | 0.5325 | 0.5332 | 0.7335 | 0.6145 | 0.4527 | 0.4199 | 0.4017 | 0.7692 | 0.3523 | 0.4763 | 0.7192 | 0.2681 | 0.7980 | 0.5229 | 0.3187 |
0.8019 | 19.0 | 19000 | 1.5088 | 0.551 | 0.5805 | 0.5418 | 0.5361 | 0.551 | 0.5949 | 0.551 | 0.5508 | 0.7335 | 0.6768 | 0.4728 | 0.4278 | 0.4266 | 0.7885 | 0.3523 | 0.4858 | 0.7291 | 0.2790 | 0.7980 | 0.5321 | 0.3407 |
0.7627 | 20.0 | 20000 | 1.4605 | 0.5627 | 0.5898 | 0.5555 | 0.5485 | 0.5627 | 0.6039 | 0.5627 | 0.5621 | 0.7355 | 0.6869 | 0.4814 | 0.4331 | 0.4460 | 0.8269 | 0.3523 | 0.4858 | 0.7389 | 0.2971 | 0.8030 | 0.5505 | 0.3846 |
0.6963 | 21.0 | 21000 | 1.4205 | 0.5835 | 0.6192 | 0.5789 | 0.5722 | 0.5835 | 0.6276 | 0.5835 | 0.5826 | 0.75 | 0.6953 | 0.4842 | 0.4357 | 0.4709 | 0.8654 | 0.3523 | 0.5804 | 0.7389 | 0.3261 | 0.8030 | 0.6055 | 0.4176 |
0.6698 | 22.0 | 22000 | 1.3759 | 0.5938 | 0.6262 | 0.5887 | 0.5825 | 0.5938 | 0.6348 | 0.5938 | 0.5931 | 0.7583 | 0.7020 | 0.4986 | 0.4436 | 0.5097 | 0.8692 | 0.3559 | 0.5741 | 0.7389 | 0.3442 | 0.8079 | 0.6055 | 0.4451 |
0.6175 | 23.0 | 23000 | 1.3362 | 0.601 | 0.6335 | 0.5976 | 0.5908 | 0.601 | 0.6412 | 0.601 | 0.6001 | 0.7665 | 0.7088 | 0.5043 | 0.4357 | 0.5263 | 0.8692 | 0.3594 | 0.5804 | 0.7438 | 0.3551 | 0.8079 | 0.6330 | 0.4780 |
0.5775 | 24.0 | 24000 | 1.3042 | 0.6102 | 0.6430 | 0.6066 | 0.5991 | 0.6102 | 0.6513 | 0.6102 | 0.6089 | 0.7810 | 0.7256 | 0.5129 | 0.4357 | 0.5485 | 0.8731 | 0.3594 | 0.5836 | 0.7438 | 0.3587 | 0.8079 | 0.6606 | 0.4945 |
0.5471 | 25.0 | 25000 | 1.2682 | 0.619 | 0.6507 | 0.6167 | 0.6089 | 0.619 | 0.6586 | 0.619 | 0.6177 | 0.7851 | 0.7340 | 0.5186 | 0.4436 | 0.5485 | 0.8769 | 0.3879 | 0.5868 | 0.7635 | 0.3768 | 0.8079 | 0.6881 | 0.5 |
0.5339 | 26.0 | 26000 | 1.2367 | 0.6262 | 0.6564 | 0.6243 | 0.6163 | 0.6262 | 0.6637 | 0.6262 | 0.6241 | 0.7996 | 0.7407 | 0.5272 | 0.4462 | 0.5540 | 0.8808 | 0.3879 | 0.5931 | 0.7635 | 0.3804 | 0.8128 | 0.6972 | 0.5330 |
0.4904 | 27.0 | 27000 | 1.2076 | 0.6352 | 0.6632 | 0.6339 | 0.6257 | 0.6352 | 0.6706 | 0.6352 | 0.6329 | 0.8079 | 0.7508 | 0.5301 | 0.4541 | 0.5706 | 0.8846 | 0.3950 | 0.5962 | 0.7783 | 0.3804 | 0.8128 | 0.6972 | 0.5824 |
0.4679 | 28.0 | 28000 | 1.1778 | 0.6405 | 0.6669 | 0.6392 | 0.6313 | 0.6405 | 0.6741 | 0.6405 | 0.6382 | 0.8058 | 0.7525 | 0.5301 | 0.4672 | 0.5762 | 0.8846 | 0.4021 | 0.6088 | 0.7833 | 0.4022 | 0.8177 | 0.6972 | 0.5824 |
0.4567 | 29.0 | 29000 | 1.1544 | 0.6468 | 0.6700 | 0.6461 | 0.6373 | 0.6468 | 0.6772 | 0.6468 | 0.6440 | 0.8140 | 0.7593 | 0.5330 | 0.4698 | 0.5900 | 0.8885 | 0.4057 | 0.6088 | 0.7833 | 0.4022 | 0.8177 | 0.7064 | 0.6209 |
0.4424 | 30.0 | 30000 | 1.1306 | 0.652 | 0.6728 | 0.6513 | 0.6422 | 0.652 | 0.6806 | 0.652 | 0.6490 | 0.8202 | 0.7643 | 0.5330 | 0.4777 | 0.5983 | 0.8962 | 0.4057 | 0.6120 | 0.7833 | 0.4094 | 0.8177 | 0.7064 | 0.6429 |
0.409 | 31.0 | 31000 | 1.1112 | 0.657 | 0.6785 | 0.6577 | 0.6486 | 0.657 | 0.6857 | 0.657 | 0.6543 | 0.8244 | 0.7559 | 0.5415 | 0.4751 | 0.6066 | 0.9115 | 0.4093 | 0.6278 | 0.7833 | 0.4203 | 0.8227 | 0.7064 | 0.6648 |
0.4016 | 32.0 | 32000 | 1.0915 | 0.664 | 0.6861 | 0.6644 | 0.6564 | 0.664 | 0.6922 | 0.664 | 0.6614 | 0.8326 | 0.7626 | 0.5559 | 0.4856 | 0.6122 | 0.9115 | 0.4128 | 0.6278 | 0.7833 | 0.4312 | 0.8227 | 0.7064 | 0.6923 |
0.3937 | 33.0 | 33000 | 1.0746 | 0.6685 | 0.6937 | 0.6695 | 0.6620 | 0.6685 | 0.6979 | 0.6685 | 0.6658 | 0.8306 | 0.7694 | 0.5702 | 0.4856 | 0.6122 | 0.9423 | 0.4128 | 0.6246 | 0.7833 | 0.4312 | 0.8227 | 0.7156 | 0.7033 |
0.3765 | 34.0 | 34000 | 1.0577 | 0.6727 | 0.6971 | 0.6744 | 0.6670 | 0.6727 | 0.7010 | 0.6727 | 0.6700 | 0.8388 | 0.7660 | 0.5759 | 0.4934 | 0.6150 | 0.9462 | 0.4164 | 0.6246 | 0.7931 | 0.4348 | 0.8227 | 0.7156 | 0.7253 |
0.3727 | 35.0 | 35000 | 1.0426 | 0.6785 | 0.7022 | 0.6800 | 0.6729 | 0.6785 | 0.7055 | 0.6785 | 0.6756 | 0.8409 | 0.7677 | 0.5903 | 0.5171 | 0.6150 | 0.9538 | 0.4164 | 0.6309 | 0.7931 | 0.4348 | 0.8227 | 0.7156 | 0.7418 |
0.3593 | 36.0 | 36000 | 1.0304 | 0.6803 | 0.7041 | 0.6810 | 0.6743 | 0.6803 | 0.7072 | 0.6803 | 0.6773 | 0.8430 | 0.7694 | 0.5903 | 0.5249 | 0.6177 | 0.9577 | 0.4199 | 0.6309 | 0.7931 | 0.4348 | 0.8227 | 0.7064 | 0.7418 |
0.3532 | 37.0 | 37000 | 1.0170 | 0.6827 | 0.7052 | 0.6838 | 0.6774 | 0.6827 | 0.7075 | 0.6827 | 0.6795 | 0.8450 | 0.7694 | 0.5903 | 0.5276 | 0.6260 | 0.9615 | 0.4199 | 0.6309 | 0.7980 | 0.4384 | 0.8227 | 0.7064 | 0.7527 |
0.3486 | 38.0 | 38000 | 1.0061 | 0.685 | 0.7071 | 0.6857 | 0.6801 | 0.685 | 0.7082 | 0.685 | 0.6814 | 0.8450 | 0.7727 | 0.5931 | 0.5354 | 0.6316 | 0.9615 | 0.4199 | 0.6309 | 0.7980 | 0.4384 | 0.8227 | 0.7064 | 0.7582 |
0.3383 | 39.0 | 39000 | 0.9962 | 0.6863 | 0.7083 | 0.6869 | 0.6812 | 0.6863 | 0.7099 | 0.6863 | 0.6829 | 0.8471 | 0.7710 | 0.5931 | 0.5354 | 0.6399 | 0.9615 | 0.4235 | 0.6309 | 0.7980 | 0.4420 | 0.8227 | 0.7064 | 0.7582 |
0.3375 | 40.0 | 40000 | 0.9883 | 0.6885 | 0.7114 | 0.6887 | 0.6837 | 0.6885 | 0.7123 | 0.6885 | 0.6851 | 0.8512 | 0.7727 | 0.5931 | 0.5381 | 0.6482 | 0.9615 | 0.4235 | 0.6341 | 0.7980 | 0.4457 | 0.8227 | 0.7064 | 0.7582 |
0.3105 | 41.0 | 41000 | 0.9787 | 0.692 | 0.7138 | 0.6925 | 0.6875 | 0.692 | 0.7143 | 0.692 | 0.6885 | 0.8492 | 0.7744 | 0.5931 | 0.5538 | 0.6565 | 0.9615 | 0.4235 | 0.6372 | 0.7980 | 0.4457 | 0.8227 | 0.7064 | 0.7802 |
0.3194 | 42.0 | 42000 | 0.9722 | 0.6953 | 0.7173 | 0.6957 | 0.6910 | 0.6953 | 0.7178 | 0.6953 | 0.6919 | 0.8512 | 0.7761 | 0.5989 | 0.5669 | 0.6565 | 0.9615 | 0.4306 | 0.6404 | 0.7980 | 0.4457 | 0.8227 | 0.7156 | 0.7802 |
0.3241 | 43.0 | 43000 | 0.9658 | 0.6987 | 0.7218 | 0.6987 | 0.6950 | 0.6987 | 0.7210 | 0.6987 | 0.6953 | 0.8533 | 0.7811 | 0.6017 | 0.5774 | 0.6620 | 0.9615 | 0.4306 | 0.6435 | 0.8030 | 0.4457 | 0.8276 | 0.7156 | 0.7802 |
0.3078 | 44.0 | 44000 | 0.9608 | 0.6993 | 0.7227 | 0.6992 | 0.6957 | 0.6993 | 0.7214 | 0.6993 | 0.6958 | 0.8533 | 0.7811 | 0.6017 | 0.5801 | 0.6620 | 0.9654 | 0.4306 | 0.6435 | 0.8030 | 0.4457 | 0.8276 | 0.7156 | 0.7802 |
0.3084 | 45.0 | 45000 | 0.9572 | 0.6997 | 0.7234 | 0.7002 | 0.6964 | 0.6997 | 0.7222 | 0.6997 | 0.6964 | 0.8512 | 0.7811 | 0.6017 | 0.5853 | 0.6620 | 0.9654 | 0.4306 | 0.6435 | 0.8030 | 0.4457 | 0.8276 | 0.7248 | 0.7802 |
0.3223 | 46.0 | 46000 | 0.9536 | 0.702 | 0.7275 | 0.7031 | 0.6997 | 0.702 | 0.7249 | 0.702 | 0.6987 | 0.8533 | 0.7811 | 0.6017 | 0.5853 | 0.6648 | 0.9654 | 0.4306 | 0.6530 | 0.8177 | 0.4457 | 0.8276 | 0.7339 | 0.7802 |
0.2982 | 47.0 | 47000 | 0.9505 | 0.7017 | 0.7279 | 0.7028 | 0.6997 | 0.7017 | 0.7250 | 0.7017 | 0.6984 | 0.8554 | 0.7795 | 0.6017 | 0.5906 | 0.6648 | 0.9654 | 0.4306 | 0.6435 | 0.8177 | 0.4457 | 0.8276 | 0.7339 | 0.7802 |
0.3047 | 48.0 | 48000 | 0.9490 | 0.7017 | 0.7278 | 0.7029 | 0.6998 | 0.7017 | 0.7249 | 0.7017 | 0.6985 | 0.8533 | 0.7795 | 0.6017 | 0.5906 | 0.6676 | 0.9654 | 0.4306 | 0.6435 | 0.8177 | 0.4457 | 0.8276 | 0.7339 | 0.7802 |
0.3013 | 49.0 | 49000 | 0.9477 | 0.7027 | 0.7284 | 0.7038 | 0.7007 | 0.7027 | 0.7255 | 0.7027 | 0.6994 | 0.8554 | 0.7795 | 0.6017 | 0.5906 | 0.6676 | 0.9654 | 0.4306 | 0.6530 | 0.8177 | 0.4457 | 0.8276 | 0.7339 | 0.7802 |
0.3005 | 50.0 | 50000 | 0.9474 | 0.7023 | 0.7283 | 0.7032 | 0.7003 | 0.7023 | 0.7252 | 0.7023 | 0.6989 | 0.8554 | 0.7795 | 0.6017 | 0.5932 | 0.6676 | 0.9654 | 0.4306 | 0.6435 | 0.8177 | 0.4457 | 0.8276 | 0.7339 | 0.7802 |
### Framework versions
- PEFT 0.14.0
- Transformers 4.48.3
- PyTorch 2.6.0+cu124
- Datasets 3.3.0
- Tokenizers 0.21.0