This model was converted to the OpenVINO IR format using the following command:

```
optimum-cli export openvino -m "{local-dir}/Dolphin3.0-Mistral-24B" \
  --task text-generation-with-past \
  --weight-format int4 \
  --ratio 1 \
  --group-size 128 \
  --dataset wikitext2 \
  --disable-stateful \
  --all-layers \
  --awq \
  --scale-estimation \
  "{local-dir}/Dolphin3.0-Mistral-24B-int4_asym-awq-se-ns-ov"
```
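To run the exported model, something like the following should work with `optimum-intel`. This is a minimal sketch, not a tested recipe: it assumes `optimum[openvino]` is installed, and the prompt and generation parameters are illustrative.

```python
# Minimal inference sketch using optimum-intel (assumes `pip install optimum[openvino]`).
# The repo id matches this model card; swap in a local path if you exported it yourself.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "Echo9Zulu/Dolphin3.0-Mistral-24B-int4_asym-awq-se-ns-ov"

# Loads the OpenVINO IR directly; no re-export is needed since the weights
# are already quantized to int4.
model = OVModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

prompt = "Explain the difference between a process and a thread."  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```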