4-, 5-, and 6-bit quants of nisten/zelensky-78b.
Use this as a commander model.
No speculative decoding support.
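A minimal sketch of how one of these quants might be loaded with llama-cpp-python; the quant filename pattern, context size, and prompt below are assumptions for illustration, not taken from this card:

```python
# Sketch: load a quantized GGUF of this model via llama-cpp-python.
# The filename glob and settings are placeholders/assumptions;
# substitute the actual quant file shipped in this repository.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="nisten/zelensky-78b",   # base repo name from this card
    filename="*q4_k_m.gguf",         # hypothetical 4-bit quant filename pattern
    n_ctx=4096,                      # context window (assumed)
    n_gpu_layers=-1,                 # offload all layers to GPU if available
)

# Example "commander" style prompt: ask the model to direct worker steps.
out = llm("Plan the next step for the worker agents:", max_tokens=128)
print(out["choices"][0]["text"])
```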