---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-xls-r-300m-Fleurs_AMMI_AFRIVOICE_LRSC-ln-10hrs-v2
  results: []
---

# wav2vec2-xls-r-300m-Fleurs_AMMI_AFRIVOICE_LRSC-ln-10hrs-v2

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on a 10-hour Lingala (`ln`) dataset combining Fleurs, AMMI, AFRIVOICE, and LRSC data, as the model name indicates. It achieves the following results on the evaluation set (a usage sketch follows these metrics):

- Loss: 0.4351
- Wer: 0.2577
- Cer: 0.0833
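
A minimal inference sketch, assuming the checkpoint is published on the Hugging Face Hub under the author's namespace (the exact Hub id is an assumption; adjust it to wherever the weights are hosted):

```python
from transformers import pipeline

# Assumed Hub id, inferred from the card's author and model name.
asr = pipeline(
    "automatic-speech-recognition",
    model="sulaimank/wav2vec2-xls-r-300m-Fleurs_AMMI_AFRIVOICE_LRSC-ln-10hrs-v2",
)

# wav2vec2 checkpoints expect 16 kHz mono audio; the pipeline decodes and
# resamples common audio formats via ffmpeg.
print(asr("sample.wav")["text"])
```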

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reconstructing them follows the list):

- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 2
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
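
A minimal sketch of the corresponding `TrainingArguments`; the output directory is hypothetical, and logging/saving options are not stated in the card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-ln-10hrs-v2",  # hypothetical
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed-precision training
)
```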

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|
| 7.7647        | 1.0   | 768   | 3.1933          | 1.0    | 1.0    |
| 3.0093        | 2.0   | 1536  | 2.8646          | 1.0    | 1.0    |
| 2.8666        | 3.0   | 2304  | 2.8188          | 1.0    | 1.0    |
| 2.7248        | 4.0   | 3072  | 2.2736          | 1.0    | 0.7102 |
| 1.8518        | 5.0   | 3840  | 1.1205          | 0.9523 | 0.2684 |
| 1.2788        | 6.0   | 4608  | 0.8644          | 0.6379 | 0.1739 |
| 1.0685        | 7.0   | 5376  | 0.7393          | 0.5324 | 0.1498 |
| 0.9306        | 8.0   | 6144  | 0.6640          | 0.4750 | 0.1369 |
| 0.8376        | 9.0   | 6912  | 0.6044          | 0.4290 | 0.1282 |
| 0.7758        | 10.0  | 7680  | 0.5699          | 0.4055 | 0.1215 |
| 0.7303        | 11.0  | 8448  | 0.5588          | 0.3849 | 0.1159 |
| 0.6748        | 12.0  | 9216  | 0.5249          | 0.3716 | 0.1115 |
| 0.6509        | 13.0  | 9984  | 0.5161          | 0.3550 | 0.1074 |
| 0.6141        | 14.0  | 10752 | 0.5091          | 0.3599 | 0.1076 |
| 0.5903        | 15.0  | 11520 | 0.4849          | 0.3486 | 0.1050 |
| 0.5676        | 16.0  | 12288 | 0.4614          | 0.3460 | 0.1053 |
| 0.5421        | 17.0  | 13056 | 0.4494          | 0.3476 | 0.1080 |
| 0.5363        | 18.0  | 13824 | 0.4598          | 0.3189 | 0.0980 |
| 0.5139        | 19.0  | 14592 | 0.4553          | 0.3157 | 0.0976 |
| 0.5002        | 20.0  | 15360 | 0.4492          | 0.3118 | 0.0960 |
| 0.4847        | 21.0  | 16128 | 0.4415          | 0.3145 | 0.0972 |
| 0.467         | 22.0  | 16896 | 0.4287          | 0.2988 | 0.0923 |
| 0.4558        | 23.0  | 17664 | 0.4367          | 0.2970 | 0.0927 |
| 0.4442        | 24.0  | 18432 | 0.4265          | 0.3019 | 0.0951 |
| 0.4274        | 25.0  | 19200 | 0.4247          | 0.2910 | 0.0912 |
| 0.424         | 26.0  | 19968 | 0.4205          | 0.2900 | 0.0915 |
| 0.403         | 27.0  | 20736 | 0.4131          | 0.2909 | 0.0915 |
| 0.4017        | 28.0  | 21504 | 0.4068          | 0.3002 | 0.0985 |
| 0.393         | 29.0  | 22272 | 0.4118          | 0.2895 | 0.0910 |
| 0.3852        | 30.0  | 23040 | 0.4275          | 0.2857 | 0.0897 |
| 0.3783        | 31.0  | 23808 | 0.4145          | 0.2852 | 0.0911 |
| 0.3776        | 32.0  | 24576 | 0.4200          | 0.2781 | 0.0882 |
| 0.3667        | 33.0  | 25344 | 0.4174          | 0.2778 | 0.0876 |
| 0.3582        | 34.0  | 26112 | 0.4129          | 0.2827 | 0.0902 |
| 0.3517        | 35.0  | 26880 | 0.4231          | 0.2780 | 0.0870 |
| 0.3479        | 36.0  | 27648 | 0.4058          | 0.2806 | 0.0910 |
| 0.3394        | 37.0  | 28416 | 0.4375          | 0.2757 | 0.0868 |
| 0.3336        | 38.0  | 29184 | 0.4320          | 0.2723 | 0.0861 |
| 0.3267        | 39.0  | 29952 | 0.4165          | 0.2750 | 0.0871 |
| 0.3179        | 40.0  | 30720 | 0.4198          | 0.2729 | 0.0865 |
| 0.3157        | 41.0  | 31488 | 0.4152          | 0.2721 | 0.0861 |
| 0.3097        | 42.0  | 32256 | 0.4007          | 0.2753 | 0.0886 |
| 0.306         | 43.0  | 33024 | 0.4471          | 0.2705 | 0.0853 |
| 0.3061        | 44.0  | 33792 | 0.4154          | 0.2649 | 0.0849 |
| 0.2931        | 45.0  | 34560 | 0.4506          | 0.2611 | 0.0832 |
| 0.2994        | 46.0  | 35328 | 0.4287          | 0.2669 | 0.0851 |
| 0.2972        | 47.0  | 36096 | 0.4188          | 0.2670 | 0.0859 |
| 0.2836        | 48.0  | 36864 | 0.4350          | 0.2554 | 0.0828 |
| 0.283         | 49.0  | 37632 | 0.4167          | 0.2620 | 0.0845 |
| 0.2765        | 50.0  | 38400 | 0.4314          | 0.2634 | 0.0849 |
| 0.278         | 51.0  | 39168 | 0.4097          | 0.2702 | 0.0892 |
| 0.2744        | 52.0  | 39936 | 0.4255          | 0.2591 | 0.0835 |
| 0.2662        | 53.0  | 40704 | 0.4293          | 0.2595 | 0.0836 |
| 0.2663        | 54.0  | 41472 | 0.4423          | 0.2583 | 0.0826 |
| 0.264         | 55.0  | 42240 | 0.4313          | 0.2587 | 0.0835 |
| 0.2549        | 56.0  | 43008 | 0.4441          | 0.2614 | 0.0842 |
| 0.2586        | 57.0  | 43776 | 0.4259          | 0.2611 | 0.0857 |
| 0.2577        | 58.0  | 44544 | 0.4258          | 0.2570 | 0.0837 |
| 0.2486        | 59.0  | 45312 | 0.4332          | 0.2606 | 0.0845 |
| 0.2473        | 60.0  | 46080 | 0.4402          | 0.2589 | 0.0834 |
| 0.2464        | 61.0  | 46848 | 0.4263          | 0.2567 | 0.0826 |
| 0.2442        | 62.0  | 47616 | 0.4527          | 0.2563 | 0.0819 |
| 0.2444        | 63.0  | 48384 | 0.4351          | 0.2577 | 0.0833 |
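
The Wer and Cer columns are word- and character-error rates. A sketch of how such metrics are typically computed with the `evaluate` library (an assumption; the card does not include the evaluation script):

```python
import evaluate

# Assumption: WER/CER computed with Hugging Face `evaluate`.
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["mbote na yo"]  # hypothetical model transcriptions
references = ["mbote na yo"]   # hypothetical reference transcripts

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```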

### Framework versions

- Transformers 4.46.1
- Pytorch 2.1.0+cu118
- Datasets 3.1.0
- Tokenizers 0.20.1