Error when using this model from faster-whisper
#15
by lzrhahahahaha · opened
Hi,
I tried to load this model using faster-whisper, but got the following error:
File "./F5-TTS/src/f5_tts/eval/utils_eval.py", line 374, in load_asr_model
model = WhisperModel(model_size, device="cuda", compute_type="float16", download_root=ckpt_dir)
File ".../.conda_envs/myenv/lib/python3.10/site-packages/faster_whisper/transcribe.py", line 141, in __init__
self.hf_tokenizer = tokenizers.Tokenizer.from_file(tokenizer_file)
Exception: data did not match any variant of untagged enum ModelWrapper at line 264903 column 3
My faster-whisper version is 0.10.1 and my tokenizers version is 0.15.2. Other faster-whisper models (like large-v3) can be loaded normally.
Please help me with this issue. Thanks!
lzrhahahahaha changed discussion status to closed
It works fine on my end with a fresh Python virtual environment and a fresh install of faster-whisper. My guess is that there is a dependency conflict between incompatible packages in your Python environment, or that some old cached model files are interfering.
I recommend:
- Creating a fresh Python venv (using uv, conda, or python -m venv). Install pytorch, transformers, and faster-whisper in this environment and try again.
- Clearing/deleting old cached model directories downloaded from Hugging Face. Usually these are in your home directory under ~/.cache/huggingface/hub.
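The fresh-environment step can also be sketched from Python's standard library. This is just an illustration — the environment name fresh-env and the final pip line are examples, not exact commands from this thread:

```python
# Sketch: create a clean virtual environment with the stdlib `venv` module
# (equivalent to `python -m venv fresh-env`; the name is just an example).
import subprocess
import sys
import venv
from pathlib import Path

env_dir = Path("fresh-env")
venv.EnvBuilder(with_pip=True).create(env_dir)

# The new interpreter lives under bin/ (Scripts/ on Windows).
py = env_dir / ("Scripts" if sys.platform == "win32" else "bin") / "python"

# Confirm the interpreter really belongs to the new environment.
result = subprocess.run(
    [str(py), "-c", "import sys; print(sys.prefix)"],
    capture_output=True, text=True,
)
print(result.stdout.strip())

# Then, inside this environment, reinstall only what you need, e.g.:
#   fresh-env/bin/pip install torch transformers faster-whisper
```

In practice you would just run python -m venv fresh-env (or the uv/conda equivalent) in a shell; the point is simply to start from an environment with no conflicting package versions.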
Thanks for your reply.
I solved the issue by upgrading faster-whisper and tokenizers.
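For anyone hitting the same error, a quick way to check which versions are actually installed (the package names are the ones on PyPI; this only reports versions, it does not upgrade anything):

```python
# Print the installed versions of the two packages involved.
# To upgrade them first: pip install -U faster-whisper tokenizers
from importlib.metadata import PackageNotFoundError, version

for pkg in ("faster-whisper", "tokenizers"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```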
Thanks!