Tags: MLX · English · llama

TinyLlama-1.1B-Chat-v1.0-4bit

This model was converted to MLX format from TinyLlama/TinyLlama-1.1B-Chat-v1.0. Refer to the original model card for more details on the model.

Use with mlx

# Install MLX and fetch the example scripts
pip install mlx
git clone https://github.com/ml-explore/mlx-examples.git
cd mlx-examples/llms/hf_llm
# Run generation with the 4-bit quantized model
python generate.py --model mlx-community/TinyLlama-1.1B-Chat-v1.0-4bit --prompt "My name is"
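
The model can also be loaded programmatically. The snippet below is a minimal sketch assuming the separate mlx-lm package is installed (pip install mlx-lm), which is not mentioned in the original card; the helper names and argument defaults may differ between mlx-lm versions.

# Minimal sketch, assuming the mlx-lm package is installed.
from mlx_lm import load, generate

# Download the 4-bit model and its tokenizer from the Hub and load them.
model, tokenizer = load("mlx-community/TinyLlama-1.1B-Chat-v1.0-4bit")

# Generate a completion for a short prompt and print it.
response = generate(model, tokenizer, prompt="My name is", verbose=True)
print(response)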