layoutlmv3-finetuned

This model is a fine-tuned version of microsoft/layoutlmv3-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3160
  • Precision: 0.8865
  • Recall: 0.8880
  • F1: 0.8872
  • Accuracy: 0.9313
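
For reference, below is a minimal inference sketch for this checkpoint. The repo id `kaydee/layoutlmv3-finetuned` is taken from this page; the card does not document the label set or the expected input pipeline, so the use of the processor's built-in OCR (which requires a local Tesseract install) is an assumption, not documented usage.

```python
from PIL import Image
from transformers import AutoProcessor, AutoModelForTokenClassification

# Repo id taken from this model card.
model_id = "kaydee/layoutlmv3-finetuned"

# apply_ocr=True makes the processor extract words/boxes itself via
# pytesseract (assumption: the model expects OCR'd words and boxes).
processor = AutoProcessor.from_pretrained(model_id, apply_ocr=True)
model = AutoModelForTokenClassification.from_pretrained(model_id)

image = Image.open("document.png").convert("RGB")  # any document page image
encoding = processor(image, return_tensors="pt")

logits = model(**encoding).logits
pred_ids = logits.argmax(-1).squeeze().tolist()
labels = [model.config.id2label[i] for i in pred_ids]

tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"][0].tolist())
print(list(zip(tokens, labels)))
```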

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • training_steps: 4000
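
As a sketch only, these values map onto transformers.TrainingArguments roughly as follows. It assumes a single device (so train_batch_size corresponds to per_device_train_batch_size) and evaluation every 100 steps, matching the results table below; the actual training script is not included in this card.

```python
from transformers import TrainingArguments

# Hedged mapping of the listed hyperparameters onto TrainingArguments;
# model/dataset setup and the Trainer itself are omitted.
args = TrainingArguments(
    output_dir="layoutlmv3-finetuned",
    learning_rate=1e-5,
    per_device_train_batch_size=4,   # assumes a single device
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch",             # AdamW, betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    max_steps=4000,
    eval_strategy="steps",           # results table reports eval every 100 steps
    eval_steps=100,
)
```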

Training results

| Training Loss | Epoch   | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 0.3155  | 100  | 0.3799          | 0.8272    | 0.8281 | 0.8276 | 0.8989   |
| No log        | 0.6309  | 200  | 0.3640          | 0.8102    | 0.8403 | 0.8250 | 0.8976   |
| No log        | 0.9464  | 300  | 0.3647          | 0.8169    | 0.8445 | 0.8305 | 0.8983   |
| No log        | 1.2618  | 400  | 0.3327          | 0.8465    | 0.8486 | 0.8475 | 0.9095   |
| 0.3176        | 1.5773  | 500  | 0.3260          | 0.8389    | 0.8509 | 0.8449 | 0.9106   |
| 0.3176        | 1.8927  | 600  | 0.3263          | 0.8450    | 0.8489 | 0.8469 | 0.9090   |
| 0.3176        | 2.2082  | 700  | 0.3157          | 0.8531    | 0.8593 | 0.8562 | 0.9165   |
| 0.3176        | 2.5237  | 800  | 0.3015          | 0.8532    | 0.8665 | 0.8598 | 0.9157   |
| 0.3176        | 2.8391  | 900  | 0.2953          | 0.8486    | 0.8644 | 0.8565 | 0.9170   |
| 0.2295        | 3.1546  | 1000 | 0.3092          | 0.8611    | 0.8731 | 0.8671 | 0.9191   |
| 0.2295        | 3.4700  | 1100 | 0.2928          | 0.8707    | 0.8695 | 0.8701 | 0.9227   |
| 0.2295        | 3.7855  | 1200 | 0.2885          | 0.8650    | 0.8725 | 0.8688 | 0.9230   |
| 0.2295        | 4.1009  | 1300 | 0.2968          | 0.8727    | 0.8717 | 0.8722 | 0.9221   |
| 0.2295        | 4.4164  | 1400 | 0.3031          | 0.8654    | 0.8714 | 0.8684 | 0.9201   |
| 0.1703        | 4.7319  | 1500 | 0.3076          | 0.8628    | 0.8725 | 0.8676 | 0.9188   |
| 0.1703        | 5.0473  | 1600 | 0.2884          | 0.8791    | 0.8710 | 0.8751 | 0.9251   |
| 0.1703        | 5.3628  | 1700 | 0.3150          | 0.8669    | 0.8763 | 0.8716 | 0.9216   |
| 0.1703        | 5.6782  | 1800 | 0.3061          | 0.8634    | 0.8812 | 0.8722 | 0.9232   |
| 0.1703        | 5.9937  | 1900 | 0.2930          | 0.8776    | 0.8785 | 0.8780 | 0.9264   |
| 0.1333        | 6.3091  | 2000 | 0.3095          | 0.8726    | 0.8804 | 0.8765 | 0.9255   |
| 0.1333        | 6.6246  | 2100 | 0.2997          | 0.8757    | 0.8801 | 0.8779 | 0.9262   |
| 0.1333        | 6.9401  | 2200 | 0.3002          | 0.8783    | 0.8801 | 0.8792 | 0.9278   |
| 0.1333        | 7.2555  | 2300 | 0.2980          | 0.8795    | 0.8837 | 0.8816 | 0.9289   |
| 0.1333        | 7.5710  | 2400 | 0.3057          | 0.8813    | 0.8822 | 0.8818 | 0.9282   |
| 0.1112        | 7.8864  | 2500 | 0.3050          | 0.8799    | 0.8791 | 0.8795 | 0.9280   |
| 0.1112        | 8.2019  | 2600 | 0.3030          | 0.8819    | 0.8819 | 0.8819 | 0.9296   |
| 0.1112        | 8.5174  | 2700 | 0.3190          | 0.8664    | 0.8831 | 0.8747 | 0.9249   |
| 0.1112        | 8.8328  | 2800 | 0.3137          | 0.8821    | 0.8822 | 0.8821 | 0.9283   |
| 0.1112        | 9.1483  | 2900 | 0.3110          | 0.8883    | 0.8820 | 0.8851 | 0.9308   |
| 0.0895        | 9.4637  | 3000 | 0.3184          | 0.8769    | 0.8851 | 0.8809 | 0.9283   |
| 0.0895        | 9.7792  | 3100 | 0.3067          | 0.8769    | 0.8891 | 0.8829 | 0.9294   |
| 0.0895        | 10.0946 | 3200 | 0.3161          | 0.8819    | 0.8871 | 0.8845 | 0.9306   |
| 0.0895        | 10.4101 | 3300 | 0.3251          | 0.8762    | 0.8874 | 0.8818 | 0.9280   |
| 0.0895        | 10.7256 | 3400 | 0.3123          | 0.8863    | 0.8851 | 0.8857 | 0.9309   |
| 0.0788        | 11.0410 | 3500 | 0.3160          | 0.8865    | 0.8880 | 0.8872 | 0.9313   |
| 0.0788        | 11.3565 | 3600 | 0.3205          | 0.8835    | 0.8870 | 0.8852 | 0.9303   |
| 0.0788        | 11.6719 | 3700 | 0.3249          | 0.8798    | 0.8900 | 0.8849 | 0.9297   |
| 0.0788        | 11.9874 | 3800 | 0.3192          | 0.8833    | 0.8874 | 0.8853 | 0.9300   |
| 0.0788        | 12.3028 | 3900 | 0.3192          | 0.8838    | 0.8889 | 0.8864 | 0.9303   |
| 0.0690        | 12.6183 | 4000 | 0.3201          | 0.8824    | 0.8895 | 0.8859 | 0.9299   |
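
The precision/recall/F1/accuracy columns match the usual token-classification evaluation pattern. As a minimal sketch only: the card does not include the evaluation code, so the seqeval-based scoring and the `id2label` mapping below are assumptions.

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
id2label = model.config.id2label  # assumes `model` loaded as in the usage sketch above

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Drop positions labeled -100 (special tokens / padding) before scoring.
    true_labels = [[id2label[l] for l in row if l != -100] for row in labels]
    true_preds = [
        [id2label[p] for p, l in zip(prow, lrow) if l != -100]
        for prow, lrow in zip(preds, labels)
    ]
    results = seqeval.compute(predictions=true_preds, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```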

Framework versions

  • Transformers 4.49.0.dev0
  • Pytorch 2.5.1+cu124
  • Datasets 3.3.0
  • Tokenizers 0.21.0