Moroccan-Darija-STT-large-turbo-v1.6.2

This model is a fine-tuned version of openai/whisper-large-v3-turbo on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3670
  • WER: 93.8922
  • CER: 53.0699
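
The WER and CER above are percentages: edit distance between reference and hypothesis, normalized by reference length, over words and characters respectively. The actual evaluation most likely used a library such as `evaluate` or `jiwer`; the pure-Python sketch below is only an illustration of how these numbers are defined.

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (insertions, deletions, substitutions)."""
    dp = list(range(len(hyp) + 1))  # one rolling row of the DP table
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, 1):
            # deletion, insertion, substitution/match
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (r != h))
    return dp[-1]

def wer(ref, hyp):
    """Word error rate, in percent."""
    ref_words = ref.split()
    return 100.0 * edit_distance(ref_words, hyp.split()) / len(ref_words)

def cer(ref, hyp):
    """Character error rate, in percent."""
    return 100.0 * edit_distance(list(ref), list(hyp)) / len(ref)
```

Note that both metrics can exceed 100% when the hypothesis is much longer than the reference, which explains rows like the 105.15 WER at step 500 in the training table.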

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 4.375e-06
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 7
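
With `lr_scheduler_type: linear` and `lr_scheduler_warmup_ratio: 0.1`, the learning rate ramps linearly from 0 to the peak over the first 10% of optimizer steps, then decays linearly to 0. A minimal sketch of that schedule, mirroring the behavior of transformers' `get_linear_schedule_with_warmup` (the `total_steps` value in the usage comment is illustrative, not taken from this run):

```python
def linear_lr_with_warmup(step, total_steps, peak_lr=4.375e-06, warmup_ratio=0.1):
    """Learning rate at a given optimizer step: linear warmup, then linear decay to 0."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return peak_lr * step / max(1, warmup_steps)
    return peak_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# e.g. with a hypothetical 1000-step run: LR peaks at step 100, reaches 0 at step 1000
```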

Training results

| Training Loss | Epoch  | Step | Validation Loss | WER      | CER     |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------:|
| 0.9764        | 0.3067 | 50   | 0.4338          | 95.5238  | 48.4807 |
| 0.7447        | 0.6135 | 100  | 0.3480          | 76.8742  | 31.1994 |
| 0.7076        | 0.9202 | 150  | 0.3278          | 82.1703  | 38.6822 |
| 0.6132        | 1.2270 | 200  | 0.3237          | 77.6606  | 34.7094 |
| 0.562         | 1.5337 | 250  | 0.3166          | 73.6780  | 31.1944 |
| 0.6057        | 1.8405 | 300  | 0.3114          | 91.5997  | 47.9874 |
| 0.4726        | 2.1472 | 350  | 0.3175          | 81.8859  | 38.8882 |
| 0.5005        | 2.4540 | 400  | 0.3188          | 85.1657  | 41.4337 |
| 0.4421        | 2.7607 | 450  | 0.3149          | 82.8397  | 40.7699 |
| 0.3551        | 3.0675 | 500  | 0.3303          | 105.1456 | 64.8954 |
| 0.3845        | 3.3742 | 550  | 0.3288          | 81.1747  | 38.4896 |
| 0.3513        | 3.6810 | 600  | 0.3292          | 83.2580  | 41.5519 |
| 0.3658        | 3.9877 | 650  | 0.3317          | 99.4896  | 54.9820 |
| 0.4258        | 4.2945 | 700  | 0.3384          | 89.1064  | 46.7882 |
| 0.4206        | 4.6012 | 750  | 0.3433          | 98.1844  | 56.1390 |
| 0.3115        | 4.9080 | 800  | 0.3406          | 89.1315  | 48.7577 |
| 0.3037        | 5.2147 | 850  | 0.3473          | 76.4809  | 38.0487 |
| 0.35          | 5.5215 | 900  | 0.3525          | 85.2410  | 44.4437 |
| 0.3754        | 5.8282 | 950  | 0.3535          | 97.1888  | 58.1035 |
| 0.2915        | 6.1350 | 1000 | 0.3661          | 94.5616  | 54.5327 |
| 0.2725        | 6.4417 | 1050 | 0.3676          | 99.8494  | 58.5798 |
| 0.2822        | 6.7485 | 1100 | 0.3670          | 93.8922  | 53.0699 |

Framework versions

  • Transformers 4.48.0.dev0
  • Pytorch 2.5.1+cu124
  • Datasets 3.1.0
  • Tokenizers 0.21.0
Model size: 809M parameters (F32, Safetensors)