Neuria_BERT_Graficos

This model is a fine-tuned version of dccuchile/bert-base-spanish-wwm-cased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0062
  • Accuracy: 1.0

Model description

More information needed

Intended uses & limitations

More information needed
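
No usage documentation is provided yet. As a minimal, hypothetical sketch (assuming the checkpoint carries a sequence-classification head, which the accuracy metric above suggests, and Spanish-language inputs, per the base model), it could be loaded through the standard Transformers pipeline. The label set comes from the undocumented fine-tuning data, so the example input and any output labels are illustrative only.

```python
# Hypothetical usage sketch: assumes this is a Spanish text-classification checkpoint.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="neuria99/Neuria_BERT_Graficos",  # repo id from the model tree below
)

# Illustrative input only; the real input domain and label names are undocumented.
print(classifier("Muéstrame la evolución mensual de las ventas por región"))
```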

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a rough TrainingArguments equivalent is sketched after the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 16
  • total_train_batch_size: 256
  • optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 50
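
As a rough illustration, these settings map onto a Transformers TrainingArguments configuration like the sketch below. The dataset, preprocessing, and label count are not documented in this card, so those parts are placeholders and assumptions, not the original training script.

```python
# Hypothetical reconstruction of the training configuration from the hyperparameters above.
# The label count and evaluation schedule are assumptions; the data pipeline is omitted
# because the card does not document it.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    TrainingArguments,
)

base_model = "dccuchile/bert-base-spanish-wwm-cased"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(
    base_model,
    num_labels=2,  # placeholder: the real number of classes is not documented
)

training_args = TrainingArguments(
    output_dir="Neuria_BERT_Graficos",
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,  # effective train batch size: 16 * 16 = 256
    num_train_epochs=50,
    lr_scheduler_type="linear",
    optim="adamw_torch",             # AdamW; betas=(0.9, 0.999) and epsilon=1e-08 are the defaults
    seed=42,
    eval_strategy="epoch",           # assumption: the results table shows one evaluation per epoch
)

# These arguments would be passed to transformers.Trainer together with the
# (undocumented) tokenized train/eval datasets and a compute_metrics function
# that reports accuracy.
```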

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 1.7306        | 0.8767  | 4    | 1.4663          | 0.3756   |
| 1.5419        | 1.8767  | 8    | 1.2706          | 0.5902   |
| 1.3248        | 2.8767  | 12   | 1.0288          | 0.6927   |
| 1.0823        | 3.8767  | 16   | 0.7660          | 0.8146   |
| 0.8438        | 4.8767  | 20   | 0.5362          | 0.9220   |
| 0.6149        | 5.8767  | 24   | 0.3765          | 0.9463   |
| 0.456         | 6.8767  | 28   | 0.2742          | 0.9512   |
| 0.3428        | 7.8767  | 32   | 0.2059          | 0.9610   |
| 0.2517        | 8.8767  | 36   | 0.1648          | 0.9659   |
| 0.1923        | 9.8767  | 40   | 0.1213          | 0.9756   |
| 0.1519        | 10.8767 | 44   | 0.1018          | 0.9756   |
| 0.1172        | 11.8767 | 48   | 0.0906          | 0.9756   |
| 0.0879        | 12.8767 | 52   | 0.0812          | 0.9805   |
| 0.0688        | 13.8767 | 56   | 0.0633          | 0.9854   |
| 0.0538        | 14.8767 | 60   | 0.0473          | 0.9854   |
| 0.0435        | 15.8767 | 64   | 0.0367          | 0.9902   |
| 0.0381        | 16.8767 | 68   | 0.0274          | 0.9902   |
| 0.027         | 17.8767 | 72   | 0.0330          | 0.9951   |
| 0.0236        | 18.8767 | 76   | 0.0346          | 0.9902   |
| 0.0211        | 19.8767 | 80   | 0.0269          | 0.9902   |
| 0.0201        | 20.8767 | 84   | 0.0234          | 0.9951   |
| 0.0176        | 21.8767 | 88   | 0.0203          | 0.9951   |
| 0.0151        | 22.8767 | 92   | 0.0189          | 0.9951   |
| 0.0152        | 23.8767 | 96   | 0.0108          | 1.0      |
| 0.0157        | 24.8767 | 100  | 0.0141          | 0.9951   |
| 0.0133        | 25.8767 | 104  | 0.0093          | 1.0      |
| 0.0123        | 26.8767 | 108  | 0.0153          | 0.9951   |
| 0.0123        | 27.8767 | 112  | 0.0096          | 1.0      |
| 0.0112        | 28.8767 | 116  | 0.0111          | 0.9951   |
| 0.0111        | 29.8767 | 120  | 0.0088          | 1.0      |
| 0.0114        | 30.8767 | 124  | 0.0137          | 0.9951   |
| 0.0104        | 31.8767 | 128  | 0.0085          | 1.0      |
| 0.012         | 32.8767 | 132  | 0.0066          | 1.0      |
| 0.0106        | 33.8767 | 136  | 0.0081          | 1.0      |
| 0.0103        | 34.8767 | 140  | 0.0083          | 1.0      |
| 0.0093        | 35.8767 | 144  | 0.0095          | 0.9951   |
| 0.0099        | 36.8767 | 148  | 0.0144          | 0.9951   |
| 0.0088        | 37.8767 | 152  | 0.0094          | 1.0      |
| 0.0085        | 38.8767 | 156  | 0.0075          | 1.0      |
| 0.0081        | 39.8767 | 160  | 0.0067          | 1.0      |
| 0.0082        | 40.8767 | 164  | 0.0060          | 1.0      |
| 0.0079        | 41.8767 | 168  | 0.0058          | 1.0      |
| 0.0078        | 42.8767 | 172  | 0.0058          | 1.0      |
| 0.0078        | 43.8767 | 176  | 0.0060          | 1.0      |
| 0.0076        | 44.8767 | 180  | 0.0061          | 1.0      |
| 0.0077        | 45.8767 | 184  | 0.0062          | 1.0      |
| 0.0079        | 46.8767 | 188  | 0.0064          | 1.0      |
| 0.0078        | 47.8767 | 192  | 0.0065          | 1.0      |
| 0.0081        | 48.8767 | 196  | 0.0062          | 1.0      |
| 0.0071        | 49.8767 | 200  | 0.0062          | 1.0      |

Framework versions

  • Transformers 4.49.0
  • Pytorch 2.4.1
  • Datasets 2.19.1
  • Tokenizers 0.21.0
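
As a small convenience check (not part of the original card), locally installed versions can be compared against these pins:

```python
# Print installed library versions to compare against the pinned versions above.
import datasets
import tokenizers
import torch
import transformers

for name, module in [
    ("Transformers", transformers),
    ("Pytorch", torch),
    ("Datasets", datasets),
    ("Tokenizers", tokenizers),
]:
    print(f"{name}: {module.__version__}")
```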

Model size: 110M params (Safetensors, F32)

Model tree for neuria99/Neuria_BERT_Graficos: fine-tuned from dccuchile/bert-base-spanish-wwm-cased