This model serves as the reasoning model behind the RAG module of LaQwenTa. It is a fine-tuned version of Qwen/Qwen3-0.6B intended for educational use, primarily multiple-choice question answering (MCQA), and was fine-tuned on ~20k STEM MCQA examples.
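
Below is a minimal usage sketch with the Hugging Face `transformers` library, assuming the model follows the standard Qwen3 chat interface. The MCQA prompt shown is illustrative only; it is not necessarily the exact format used during fine-tuning.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model id taken from this card; loading follows the usual Qwen3 recipe.
model_id = "jeanprbt/MNLP_M2_rag_model"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Example MCQA-style prompt (hypothetical formatting).
question = (
    "What is the time complexity of binary search on a sorted array?\n"
    "A. O(n)\nB. O(log n)\nC. O(n log n)\nD. O(1)\n"
    "Answer with the letter of the correct option."
)
messages = [{"role": "user", "content": question}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

# Generate the model's answer and decode only the newly produced tokens.
outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```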

Model size: 596M params (Safetensors)
Tensor type: F32

Model tree for jeanprbt/MNLP_M2_rag_model

Finetuned from: Qwen/Qwen3-0.6B

Dataset used to train jeanprbt/MNLP_M2_rag_model