Tags: GGUF, reasoning, conversational, thinking, tiny, small, Inference Endpoints

Sapling Dream V1

Introducing SaplingDream, a compact GPT model with 0.5 billion parameters, based on the Qwen/Qwen2.5-0.5B-Instruct architecture. The model was fine-tuned on an RTX 4060 (8 GB) for a bit over two days on ~0.3B tokens...
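Because the weights are distributed in GGUF format, the model can be run locally with llama.cpp-compatible tooling. Below is a minimal sketch using the llama-cpp-python bindings; the GGUF filename is a placeholder assumption, so substitute whichever quantization file you actually downloaded from the repository.

```python
# Minimal local-inference sketch for a GGUF build of SaplingDream.
# Assumes llama-cpp-python is installed (pip install llama-cpp-python)
# and that a GGUF file from the repo has been downloaded locally.
# "SaplingDream_V1-0.5B.Q8_0.gguf" is a placeholder filename, not confirmed by the card.
from llama_cpp import Llama

llm = Llama(
    model_path="SaplingDream_V1-0.5B.Q8_0.gguf",  # placeholder path
    n_ctx=2048,   # context window; adjust to taste
    n_threads=4,  # CPU threads
)

# The base model is instruction-tuned, so the chat API is the natural entry point.
response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain what a GGUF file is in one sentence."}],
    max_tokens=128,
    temperature=0.7,
)

print(response["choices"][0]["message"]["content"])
```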

Datasets & Resources

Evaluation Loss Chart

Our Apps & Socials

Chat Assistant | Support Us | GitHub

Long live the Islamic Republic of Pakistan; Glory to the Islamic Republic of Pakistan 🇵🇰

Format: GGUF
Model size: 494M params
Architecture: qwen2
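One way to fetch the GGUF file programmatically is via huggingface_hub, as in the sketch below. The repository's exact file names are not listed on the card, so the snippet discovers them at runtime rather than assuming a specific quantization name.

```python
# Sketch: download a GGUF file from the model repository with huggingface_hub.
# pip install huggingface_hub
from huggingface_hub import HfApi, hf_hub_download

repo_id = "XeTute/SaplingDream_V1-0.5B-GGUF"

# List the files actually present in the repository and keep the GGUF ones.
files = HfApi().list_repo_files(repo_id)
gguf_files = [f for f in files if f.endswith(".gguf")]
print(gguf_files)

# Download the first GGUF variant found; pick a specific quantization if you prefer.
local_path = hf_hub_download(repo_id=repo_id, filename=gguf_files[0])
print("Downloaded to:", local_path)
```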

Model tree for XeTute/SaplingDream_V1-0.5B-GGUF

Base model: Qwen/Qwen2.5-0.5B
Quantized (2): this model

