
Arcee-Lite is a compact yet powerful 1.5B-parameter language model developed as part of the DistillKit open-source project. Despite its small size, Arcee-Lite demonstrates impressive performance, particularly on the MMLU (Massive Multitask Language Understanding) benchmark.
GGUFs available here
## Key Features
- Model Size: 1.5 billion parameters
- MMLU Score: 55.93
- Distillation Source: Phi-3-Medium
- Enhanced Performance: merged with other high-performing distilled models
## About DistillKit
DistillKit is our new open-source project focused on creating efficient, smaller models that maintain high performance. Arcee-Lite is one of the first models to emerge from this initiative.
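DistillKit's exact training recipe lives in its repository, but the core idea of logit-based knowledge distillation is straightforward: train the small student model to match the softened output distribution of a larger teacher (here, Phi-3-Medium). The sketch below is a minimal PyTorch illustration of that loss term; the function name and temperature value are illustrative assumptions, not DistillKit's actual API.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Soften both distributions with a temperature, then minimize the
    # KL divergence between teacher and student.
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * temperature ** 2
```

In practice this term is usually combined with the standard cross-entropy loss on the ground-truth labels, weighted by a mixing coefficient.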
## Performance
Arcee-Lite showcases remarkable capabilities for its size:
- Achieves a 55.93 score on the MMLU benchmark (see the evaluation sketch after this list)
- Demonstrates exceptional performance across various tasks
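For reference, MMLU scores like the one above are commonly produced with EleutherAI's lm-evaluation-harness. The sketch below shows what a 5-shot run might look like via its Python API; the repo id and few-shot count are assumptions, and harness version and prompt format can shift scores noticeably (see the note at the end of this card).

```python
# pip install lm-eval
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=arcee-ai/arcee-lite",  # assumed repo id
    tasks=["mmlu"],
    num_fewshot=5,
)
print(results["results"]["mmlu"])
```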
## Use Cases
Arcee-Lite is suitable for a wide range of applications where a balance between model size and performance is crucial (see the usage sketch after this list):
- Embedded systems
- Mobile applications
- Edge computing
- Resource-constrained environments
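As a quick start, the model can be loaded like any other causal LM with the `transformers` library. This is a minimal sketch, not an official recipe: the repo id `arcee-ai/arcee-lite`, dtype, and generation settings are assumptions you may need to adjust for your hardware.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "arcee-ai/arcee-lite"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "Summarize knowledge distillation in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```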

Please note that our internal evaluations were consistently higher than the corresponding scores on the OpenLLM Leaderboard. They should be used only to compare relative performance between models, not weighed against the leaderboard itself.