Request: Create distill of Mistral Small 24B

#128
by Kenshiro-28 - opened

Congratulations on your legendary R1 model! I was wondering if you’d consider releasing an official R1 distill of Mistral Small 24B? A distilled version could offer improved efficiency and accessibility for those with limited resources.

https://huggingface.co/mistralai/Mistral-Small-24B-Instruct-2501

Thank you! I wonder whether an official distill would have any advantage over the datasets used in the Dolphin models.

It surely would have a great advantage. DeepSeek could use their original, huge dataset for it, meaning it would be a true DeepSeek model at its core. I'm pretty sure the creators of Dolphin would welcome it too, since they could use that model as the base for their Dolphin model instead of the original Mistral model. It would be a completely different flavor, allowing merges with other Mistral-based models.
