bert-base-spanish-wwm-cased
This model is a fine-tuned version of dccuchile/bert-base-spanish-wwm-cased on an unspecified dataset. It achieves the following results on the evaluation set (a minimal loading sketch appears after the list):
- Loss: 0.9387
- F1 Macro: 0.8879
- F1: 0.9207
- F1 Neg: 0.8551
- Acc: 0.8975
- Prec: 0.9015
- Recall: 0.9407
- Mcc: 0.7774
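As a usage illustration, the hedged sketch below loads the checkpoint for inference with the transformers Auto classes. It assumes the repository id dtorber/bert-base-spanish-wwm-cased and a sequence-classification head; since the card does not document the labels, the meaning of the predicted class index is an assumption.

```python
# Minimal inference sketch. The repository id below is taken from this card's
# path; the task appears to be binary sequence classification (the card
# reports F1 and F1 Neg), but the label meanings are not documented, so the
# interpretation of the predicted index is an assumption.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "dtorber/bert-base-spanish-wwm-cased"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("Este es un texto de ejemplo.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())
```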
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a TrainingArguments sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
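The hedged sketch below shows how these hyperparameters map onto transformers.TrainingArguments. The actual dataset, preprocessing, and metric computation are not described in this card, so the tiny dummy dataset is a placeholder only; the Adam settings listed above match the Trainer defaults and need no explicit override. Note that fp16=True (native AMP) requires a CUDA device.

```python
# Hedged training sketch mapping the hyperparameters listed above onto
# transformers.TrainingArguments. The dummy dataset is a placeholder for the
# undocumented training/evaluation data; num_labels=2 is an assumption based
# on the binary-style metrics (F1, F1 Neg) reported in the card.
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "dccuchile/bert-base-spanish-wwm-cased"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=2)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

# Placeholder data; replace with the real training and evaluation sets.
dummy = Dataset.from_dict({"text": ["ejemplo positivo", "ejemplo negativo"], "label": [1, 0]})
dummy = dummy.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="bert-base-spanish-wwm-cased-finetuned",
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    fp16=True,                    # native AMP mixed-precision training
    evaluation_strategy="epoch",  # assumption: the card reports per-epoch validation metrics
)

trainer = Trainer(model=model, args=args, train_dataset=dummy, eval_dataset=dummy)
trainer.train()
```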
Training results
| Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 | F1 Neg | Acc | Prec | Recall | Mcc |
|---|---|---|---|---|---|---|---|---|---|---|
| 0.2306 | 1.0 | 2061 | 0.5481 | 0.8775 | 0.9167 | 0.8382 | 0.89 | 0.88 | 0.9565 | 0.7614 |
| 0.0717 | 2.0 | 4122 | 0.7879 | 0.8615 | 0.8903 | 0.8328 | 0.8675 | 0.9348 | 0.8498 | 0.7293 |
| 0.0203 | 3.0 | 6183 | 0.9309 | 0.8795 | 0.9105 | 0.8485 | 0.8875 | 0.916 | 0.9051 | 0.7591 |
| 0.0146 | 4.0 | 8244 | 0.9387 | 0.8879 | 0.9207 | 0.8551 | 0.8975 | 0.9015 | 0.9407 | 0.7774 |
| 0.0103 | 5.0 | 10305 | 1.0163 | 0.8774 | 0.9126 | 0.8421 | 0.8875 | 0.8969 | 0.9289 | 0.7558 |
Framework versions
- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2