fastllm model for Qwen-7B-Chat-fp16
GitHub address: https://github.com/ztxz16/fastllm
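
A model in this format is loaded through fastllm rather than the standard Transformers pipeline. Below is a minimal usage sketch, assuming the fastllm Python bindings (fastllm_pytools) are installed and the weights here are a converted fastllm `.flm` file; the file name is hypothetical and should be replaced with the actual file in this repository.

```python
# Minimal sketch: load a fastllm-format Qwen-7B-Chat model and run one query.
# Assumes fastllm_pytools is built/installed per the fastllm README and that
# "qwen-7b-chat-fp16.flm" (hypothetical name) is the converted weight file.
from fastllm_pytools import llm

# Load the converted fp16 model file.
model = llm.model("qwen-7b-chat-fp16.flm")

# Single-turn chat query; response() returns the generated text.
print(model.response("Hello, who are you?"))
```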