whisper-ft-large-1000-f

This model is a fine-tuned version of openai/whisper-large-v3-turbo; the training dataset is not specified. It achieves the following results on the evaluation set:

  • Loss: 3.9591
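
The checkpoint can be loaded like any other Whisper model. Below is a minimal inference sketch using the transformers ASR pipeline, assuming the repo id snaoi-csl/whisper-ft-large-1000-f; "sample.wav" is a placeholder path:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as an automatic-speech-recognition pipeline.
asr = pipeline(
    "automatic-speech-recognition",
    model="snaoi-csl/whisper-ft-large-1000-f",
)

# Transcribe a local audio file ("sample.wav" is a placeholder).
result = asr("sample.wav")
print(result["text"])
```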

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them to Seq2SeqTrainingArguments follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: PagedAdamW 8-bit (OptimizerNames.PAGED_ADAMW_8BIT) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 10
  • training_steps: 40
  • mixed_precision_training: Native AMP
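
A minimal sketch of how these values map onto transformers' Seq2SeqTrainingArguments. The output_dir and the evaluation cadence are assumptions (the results table below logs validation loss every 2 steps); everything else comes straight from the list above:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-ft-large-1000-f",  # assumed, not recorded on the card
    learning_rate=1e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    optim="paged_adamw_8bit",   # PagedAdamW 8-bit; requires bitsandbytes
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=10,
    max_steps=40,
    fp16=True,                  # native AMP mixed-precision training
    eval_strategy="steps",      # assumed from the eval cadence in the table
    eval_steps=2,
)
```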

Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 4.6008        | 0.0029 | 2    | 6.9022          |
| 4.3327        | 0.0057 | 4    | 6.9022          |
| 4.6788        | 0.0086 | 6    | 6.7334          |
| 4.1284        | 0.0114 | 8    | 5.6978          |
| 2.5702        | 0.0143 | 10   | 4.6321          |
| 1.5963        | 0.0171 | 12   | 4.3133          |
| 1.3669        | 0.02   | 14   | 4.2577          |
| 1.1867        | 0.0229 | 16   | 4.3040          |
| 1.7891        | 0.0257 | 18   | 4.3839          |
| 1.14          | 0.0286 | 20   | 5.0947          |
| 0.7743        | 0.0314 | 22   | 4.4104          |
| 0.965         | 0.0343 | 24   | 4.0427          |
| 0.8264        | 0.0371 | 26   | 3.9472          |
| 0.7508        | 0.04   | 28   | 3.9840          |
| 0.4857        | 0.0429 | 30   | 4.0175          |
| 0.609         | 0.0457 | 32   | 4.2039          |
| 1.2697        | 0.0486 | 34   | 4.1989          |
| 0.6072        | 0.0514 | 36   | 4.0805          |
| 0.6064        | 0.0543 | 38   | 3.9941          |
| 0.7662        | 0.0571 | 40   | 3.9591          |

Framework versions

  • Transformers 4.47.1
  • PyTorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0
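
To check that a local environment matches these pins before reproducing the run, a small sketch:

```python
import datasets
import tokenizers
import torch
import transformers

# Versions recorded on this model card.
expected = {
    "transformers": "4.47.1",
    "torch": "2.5.1+cu121",
    "datasets": "3.2.0",
    "tokenizers": "0.21.0",
}

for module in (transformers, torch, datasets, tokenizers):
    name = module.__name__
    print(f"{name}: installed {module.__version__}, card pins {expected[name]}")
```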