---
base_model:
- Qwen/Qwen2.5-14B-Instruct-1M
- deepseek-ai/DeepSeek-R1
library_name: transformers
license: cc-by-nc-nd-4.0
tags:
- reasoning
- R1
- 1M
- fast
- Deca
- Deca-AI
- Deca-2
- Qwen
---
|
## This is the old version of Deca 2 mini. Use deca-ai/2-mini-beta

The Deca 2 family of models, currently in beta, is built on architectures such as DeepSeek R1 and Qwen 2.5, with a focus on fast, efficient text generation. It also comes with a **1 million**-token context window.
While Deca 2 is focused on text generation for now, its foundation is designed to scale: as more capabilities are added, it will evolve into a more powerful, any-to-any model.