# turkish-hs-2class-prediction
This model is a fine-tuned version of dbmdz/bert-base-turkish-uncased on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.5811
- Accuracy: 0.8879
- Macro F1: 0.8829
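A minimal inference sketch using the `transformers` pipeline API. The pipeline task name and call pattern are standard; the actual label names and their meanings are not stated in this card, so check the model's `config.json` (`id2label`) before interpreting predictions.

```python
from transformers import pipeline

def classify(texts, model_id="HrantDinkFoundation/turkish-hs-2class-prediction"):
    """Return (label, score) pairs for a batch of Turkish texts.

    Note: the returned labels (e.g. "LABEL_0"/"LABEL_1") follow the
    model's id2label mapping, whose semantics are not documented here.
    """
    clf = pipeline("text-classification", model=model_id)
    return [(r["label"], r["score"]) for r in clf(texts)]

if __name__ == "__main__":
    # Downloads the model weights on first use.
    print(classify(["örnek bir cümle"]))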
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 20
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
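The hyperparameters above can be expressed as the keyword arguments one would pass to `transformers.TrainingArguments`; this is a hedged reconstruction, and `output_dir` is a hypothetical path not stated in the card.

```python
# Keyword arguments mirroring the hyperparameters listed above.
# "output_dir" is a hypothetical placeholder; warmup and weight decay
# are not stated in the card, so transformers defaults would apply.
training_kwargs = dict(
    output_dir="turkish-hs-2class-prediction",  # hypothetical path
    learning_rate=5e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=20,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
# from transformers import TrainingArguments
# args = TrainingArguments(**training_kwargs)  # requires torch/accelerate
```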
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Macro F1 |
|---|---|---|---|---|---|
0.5915 | 0.1460 | 100 | 0.4860 | 0.7703 | 0.7399 |
0.4414 | 0.2920 | 200 | 0.3925 | 0.8122 | 0.8044 |
0.3889 | 0.4380 | 300 | 0.3714 | 0.8323 | 0.8180 |
0.3713 | 0.5839 | 400 | 0.3467 | 0.8487 | 0.8376 |
0.3442 | 0.7299 | 500 | 0.3240 | 0.8569 | 0.8505 |
0.3427 | 0.8759 | 600 | 0.3084 | 0.8678 | 0.8612 |
0.3295 | 1.0219 | 700 | 0.3067 | 0.8651 | 0.8598 |
0.282 | 1.1679 | 800 | 0.3136 | 0.8678 | 0.8631 |
0.29 | 1.3139 | 900 | 0.3050 | 0.8696 | 0.8604 |
0.2993 | 1.4599 | 1000 | 0.3068 | 0.8660 | 0.8614 |
0.2835 | 1.6058 | 1100 | 0.2951 | 0.8769 | 0.8697 |
0.2751 | 1.7518 | 1200 | 0.3079 | 0.8742 | 0.8694 |
0.2717 | 1.8978 | 1300 | 0.2912 | 0.8806 | 0.8732 |
0.2615 | 2.0438 | 1400 | 0.2887 | 0.8833 | 0.8779 |
0.2413 | 2.1898 | 1500 | 0.3005 | 0.8824 | 0.8779 |
0.2446 | 2.3358 | 1600 | 0.3081 | 0.8788 | 0.8742 |
0.2022 | 2.4818 | 1700 | 0.3216 | 0.8815 | 0.8754 |
0.245 | 2.6277 | 1800 | 0.3109 | 0.8861 | 0.8812 |
0.2341 | 2.7737 | 1900 | 0.3369 | 0.8742 | 0.8712 |
0.239 | 2.9197 | 2000 | 0.2988 | 0.8879 | 0.8828 |
0.2228 | 3.0657 | 2100 | 0.3174 | 0.8824 | 0.8751 |
0.1819 | 3.2117 | 2200 | 0.3371 | 0.8815 | 0.8747 |
0.1841 | 3.3577 | 2300 | 0.3331 | 0.8861 | 0.8810 |
0.2208 | 3.5036 | 2400 | 0.3237 | 0.8906 | 0.8864 |
0.1934 | 3.6496 | 2500 | 0.3337 | 0.8897 | 0.8852 |
0.1947 | 3.7956 | 2600 | 0.3446 | 0.8861 | 0.8798 |
0.2043 | 3.9416 | 2700 | 0.3472 | 0.8888 | 0.8836 |
0.154 | 4.0876 | 2800 | 0.3549 | 0.8888 | 0.8842 |
0.1657 | 4.2336 | 2900 | 0.3664 | 0.8833 | 0.8764 |
0.1456 | 4.3796 | 3000 | 0.3905 | 0.8879 | 0.8829 |
0.1774 | 4.5255 | 3100 | 0.3801 | 0.8861 | 0.8807 |
0.1971 | 4.6715 | 3200 | 0.3943 | 0.8842 | 0.8796 |
0.182 | 4.8175 | 3300 | 0.3607 | 0.8906 | 0.8855 |
0.1751 | 4.9635 | 3400 | 0.3829 | 0.8879 | 0.8828 |
0.123 | 5.1095 | 3500 | 0.4142 | 0.8833 | 0.8776 |
0.1678 | 5.2555 | 3600 | 0.4128 | 0.8888 | 0.8833 |
0.1339 | 5.4015 | 3700 | 0.4287 | 0.8879 | 0.8820 |
0.164 | 5.5474 | 3800 | 0.4406 | 0.8842 | 0.8798 |
0.1521 | 5.6934 | 3900 | 0.4191 | 0.8906 | 0.8855 |
0.1476 | 5.8394 | 4000 | 0.4343 | 0.8806 | 0.8750 |
0.171 | 5.9854 | 4100 | 0.4305 | 0.8861 | 0.8802 |
0.1171 | 6.1314 | 4200 | 0.4552 | 0.8824 | 0.8780 |
0.1237 | 6.2774 | 4300 | 0.4561 | 0.8870 | 0.8808 |
0.1261 | 6.4234 | 4400 | 0.4696 | 0.8833 | 0.8778 |
0.1345 | 6.5693 | 4500 | 0.4848 | 0.8888 | 0.8830 |
0.1172 | 6.7153 | 4600 | 0.5006 | 0.8842 | 0.8797 |
0.1574 | 6.8613 | 4700 | 0.4770 | 0.8833 | 0.8782 |
0.1144 | 7.0073 | 4800 | 0.4923 | 0.8851 | 0.8803 |
0.1129 | 7.1533 | 4900 | 0.5090 | 0.8861 | 0.8811 |
0.1086 | 7.2993 | 5000 | 0.5437 | 0.8879 | 0.8833 |
0.1155 | 7.4453 | 5100 | 0.5360 | 0.8861 | 0.8814 |
0.121 | 7.5912 | 5200 | 0.5260 | 0.8851 | 0.8790 |
0.0984 | 7.7372 | 5300 | 0.5583 | 0.8806 | 0.8756 |
0.1267 | 7.8832 | 5400 | 0.5423 | 0.8861 | 0.8809 |
0.1087 | 8.0292 | 5500 | 0.5492 | 0.8879 | 0.8832 |
0.1117 | 8.1752 | 5600 | 0.5604 | 0.8851 | 0.8808 |
0.112 | 8.3212 | 5700 | 0.5614 | 0.8833 | 0.8784 |
0.1037 | 8.4672 | 5800 | 0.5707 | 0.8815 | 0.8773 |
0.0813 | 8.6131 | 5900 | 0.5736 | 0.8806 | 0.8761 |
0.1034 | 8.7591 | 6000 | 0.5672 | 0.8842 | 0.8793 |
0.0985 | 8.9051 | 6100 | 0.5761 | 0.8824 | 0.8778 |
0.1043 | 9.0511 | 6200 | 0.5739 | 0.8833 | 0.8785 |
0.082 | 9.1971 | 6300 | 0.5752 | 0.8851 | 0.8802 |
0.1019 | 9.3431 | 6400 | 0.5816 | 0.8815 | 0.8767 |
0.0743 | 9.4891 | 6500 | 0.5814 | 0.8842 | 0.8793 |
0.1045 | 9.6350 | 6600 | 0.5843 | 0.8815 | 0.8767 |
0.0743 | 9.7810 | 6700 | 0.5831 | 0.8833 | 0.8784 |
0.1138 | 9.9270 | 6800 | 0.5811 | 0.8879 | 0.8829 |
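The logged steps allow a rough back-calculation of the training-set size; this is an assumption-based estimate, not a figure stated in the card.

```python
# From the first table row: 100 optimizer steps correspond to
# 0.1460 epochs, and the train batch size is 16.
steps_per_epoch = 100 / 0.1460            # ≈ 685 steps per epoch
train_examples = steps_per_epoch * 16     # ≈ 11,000 training examples
print(round(steps_per_epoch), round(train_examples))
```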
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0