#13 batch inference error (1) · opened 2 days ago by 404dreamer
#12 Error in preprocessing prompt inputs · opened 3 days ago by darvec
#11 cannot import name 'Qwen2_5_VLImageProcessor' (on vLLM) (4) · opened 6 days ago by cbrug
#10 Update preprocessor_config.json · opened 9 days ago by Isotr0py
#9 Hardware Requirements · opened 11 days ago by shreyas0985
#8 Vision tokens missing from chat template · opened 12 days ago by depasquale
#7 ERROR:hf-to-gguf:Model Qwen2_5_VLForConditionalGeneration is not supported · opened 13 days ago by li-gz
#6 docs(readme): fix typo in README.md · opened 17 days ago by BjornMelin
#4 Out of Memory on two H100 (80GB) each and load_in_8_bit = True · opened 22 days ago by Maverick17
#3 Model Memory Requirements (2) · opened 24 days ago by nvip1204
#2 Video Inference - TypeError: process_vision_info() got an unexpected keyword argument 'return_video_kwargs' (2) · opened 24 days ago by hmanju
#1 Qwen/Qwen2.5-VL-72B-Instruct-AWQ and Qwen/Qwen2.5-VL-40<B-Instruct-AWQ please (6) · opened 26 days ago by devops724