mms-1b-all-lin-Fleurs_AMMI_AFRIVOICE_LRSC-1hrs-v1

This model is a fine-tuned version of facebook/mms-1b-all, trained on roughly one hour of Lingala ("lin") speech drawn from the Fleurs, AMMI, AFRIVOICE, and LRSC datasets (per the model name; the dataset field of the auto-generated card was left unset). It achieves the following results on the evaluation set:

  • Loss: 1.7464
  • WER: 0.4690
  • CER: 0.1857
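
WER (word error rate) and CER (character error rate) are edit-distance metrics; lower is better, and values above 1.0 are possible early in training when the model emits more tokens than the reference contains. Below is a minimal transcription sketch, assuming the checkpoint loads with the standard MMS/Wav2Vec2 CTC classes from transformers; `audio.wav` is a placeholder for a 16 kHz mono recording:

```python
# Minimal inference sketch (assumptions: the checkpoint uses the standard
# Wav2Vec2 CTC classes that MMS models ship with; "audio.wav" is a
# placeholder for a 16 kHz mono recording).
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "asr-africa/mms-1b-all-lin-Fleurs_AMMI_AFRIVOICE_LRSC-1hrs-v1"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

speech, _ = librosa.load("audio.wav", sr=16_000)  # MMS expects 16 kHz input
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: take the most likely token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```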

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.001
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 100
  • mixed_precision_training: Native AMP
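
These settings map directly onto transformers TrainingArguments. The sketch below is a reconstruction from the list above, not the authors' training script; `output_dir` is a placeholder, and `fp16=True` stands in for "Native AMP":

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# This is a reconstruction, not the authors' training script; output_dir
# is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mms-1b-all-lin-Fleurs_AMMI_AFRIVOICE_LRSC-1hrs-v1",  # placeholder
    learning_rate=1e-3,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed-precision training
)
```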

Training results

Training Loss  Epoch  Step  Validation Loss  WER  CER
20.2969 1.0 20 15.4344 6.0492 1.9074
8.6598 2.0 40 4.6574 1.0 0.9999
4.0919 3.0 60 3.6494 1.0177 0.8475
3.5479 4.0 80 3.1117 0.9919 0.7177
2.9834 5.0 100 2.9935 0.9976 0.6736
2.9875 6.0 120 2.6889 0.9079 0.6285
2.4481 7.0 140 2.5176 0.8753 0.5858
2.1694 8.0 160 2.3393 0.8359 0.5096
1.9925 9.0 180 2.1131 0.8073 0.4976
1.7516 10.0 200 2.1095 0.7797 0.4831
1.5563 11.0 220 1.9102 0.7538 0.4248
1.3814 12.0 240 1.8076 0.7351 0.4033
1.2278 13.0 260 1.7135 0.7027 0.3646
1.0993 14.0 280 1.6182 0.6884 0.3347
0.992 15.0 300 1.5669 0.6584 0.3120
0.8873 16.0 320 1.5504 0.6491 0.2943
1.0836 17.0 340 1.5024 0.6229 0.2854
0.9552 18.0 360 1.4542 0.6003 0.2668
0.6556 19.0 380 1.4784 0.5956 0.2609
0.6097 20.0 400 1.4696 0.5847 0.2499
0.5452 21.0 420 1.4980 0.5749 0.2443
0.5073 22.0 440 1.4862 0.5674 0.2359
0.5055 23.0 460 1.4599 0.5463 0.2297
0.4414 24.0 480 1.4955 0.5574 0.2279
0.4092 25.0 500 1.4877 0.5516 0.2280
0.3752 26.0 520 1.4439 0.5471 0.2241
0.3616 27.0 540 1.4631 0.5346 0.2196
0.3228 28.0 560 1.4857 0.5285 0.2142
0.3132 29.0 580 1.4594 0.5257 0.2156
0.2877 30.0 600 1.5222 0.5288 0.2123
0.2993 31.0 620 1.5179 0.5294 0.2132
0.2699 32.0 640 1.5192 0.5233 0.2130
0.2635 33.0 660 1.5113 0.5144 0.2090
0.2331 34.0 680 1.5547 0.5217 0.2098
0.2374 35.0 700 1.5334 0.5034 0.2026
0.2317 36.0 720 1.5067 0.5143 0.2057
0.2121 37.0 740 1.5589 0.5067 0.2051
0.2023 38.0 760 1.5842 0.5053 0.2048
0.2002 39.0 780 1.6043 0.5167 0.2033
0.1981 40.0 800 1.5686 0.5096 0.2049
0.1902 41.0 820 1.6279 0.5056 0.2026
0.1743 42.0 840 1.5759 0.5089 0.2022
0.1892 43.0 860 1.6137 0.5065 0.1999
0.1641 44.0 880 1.6219 0.4974 0.1981
0.4117 45.0 900 1.5925 0.4973 0.1964
0.1497 46.0 920 1.6325 0.4967 0.1973
0.1592 47.0 940 1.5893 0.4992 0.1976
0.1399 48.0 960 1.6137 0.4985 0.1980
0.145 49.0 980 1.5947 0.4862 0.1967
0.129 50.0 1000 1.6438 0.4972 0.1977
0.1198 51.0 1020 1.6853 0.5019 0.1970
0.1266 52.0 1040 1.6279 0.4993 0.1967
0.1176 53.0 1060 1.6381 0.4965 0.1973
0.1157 54.0 1080 1.6612 0.4891 0.1942
0.1126 55.0 1100 1.6502 0.4867 0.1946
0.1078 56.0 1120 1.7070 0.4874 0.1939
0.1074 57.0 1140 1.6772 0.4939 0.1961
0.1145 58.0 1160 1.7170 0.4863 0.1927
0.0998 59.0 1180 1.7114 0.4850 0.1923
0.1024 60.0 1200 1.7010 0.4838 0.1932
0.1089 61.0 1220 1.6970 0.4762 0.1897
0.0886 62.0 1240 1.7086 0.4762 0.1898
0.0971 63.0 1260 1.6991 0.4776 0.1901
0.0932 64.0 1280 1.6583 0.4761 0.1889
0.0847 65.0 1300 1.7442 0.4809 0.1887
0.0913 66.0 1320 1.7359 0.4772 0.1910
0.0892 67.0 1340 1.6782 0.4797 0.1902
0.0818 68.0 1360 1.7167 0.4844 0.1920
0.0857 69.0 1380 1.7230 0.4797 0.1897
0.0878 70.0 1400 1.6981 0.4860 0.1928
0.0739 71.0 1420 1.7313 0.4735 0.1862
0.0757 72.0 1440 1.7053 0.4767 0.1883
0.0716 73.0 1460 1.7432 0.4792 0.1884
0.075 74.0 1480 1.7191 0.4770 0.1878
0.0666 75.0 1500 1.7157 0.4760 0.1895
0.0701 76.0 1520 1.7501 0.4779 0.1873
0.0743 77.0 1540 1.7318 0.4750 0.1873
0.0676 78.0 1560 1.7164 0.4736 0.1867
0.062 79.0 1580 1.7338 0.4658 0.1851
0.0637 80.0 1600 1.7325 0.4730 0.1876
0.0697 81.0 1620 1.7165 0.4693 0.1861
0.0584 82.0 1640 1.7529 0.4700 0.1863
0.0597 83.0 1660 1.7759 0.4687 0.1850
0.0658 84.0 1680 1.7439 0.4683 0.1860
0.0555 85.0 1700 1.7395 0.4721 0.1870
0.0617 86.0 1720 1.7525 0.4685 0.1855
0.0641 87.0 1740 1.7330 0.4705 0.1849
0.0542 88.0 1760 1.7356 0.4704 0.1862
0.0582 89.0 1780 1.7464 0.4690 0.1857
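
The WER and CER columns above can be computed with the Hugging Face evaluate library. A hedged sketch follows; the `predictions` and `references` lists are hypothetical placeholders, not data from this model's evaluation set:

```python
# Sketch of WER/CER computation with the `evaluate` library (requires jiwer).
# `predictions` and `references` are hypothetical placeholder transcripts.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["mbote na yo"]    # hypothetical model output
references = ["mbote na bino"]   # hypothetical ground truth

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```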

Framework versions

  • Transformers 4.48.1
  • PyTorch 2.5.1+cu124
  • Datasets 3.2.0
  • Tokenizers 0.21.0