Mit-1.5B

Model Description

The Mit series of large language models (LLMs) by WinkingFace is designed to seamlessly integrate intuitive conversational abilities with advanced multi-step reasoning. Unlike conventional AI models that focus solely on generating responses, Mit enhances structured thinking, contextual understanding, and function-calling precision to deliver more accurate and insightful interactions.

Mit models are adaptable, capable of switching between standard conversational tasks and complex reasoning. By incorporating refined logical inference mechanisms, Mit achieves superior accuracy in judgment, decision-making, and long-form analytical tasks.

Built on the open-source Qwen platform, Mit has undergone extensive architectural refinements and performance optimizations to align more effectively with real-world applications. Our fine-tuning efforts emphasize deeper contextual awareness, enhanced response coherence, and improved execution of function-calling, making Mit a powerful and versatile AI system.

Requirements

Mit's code is integrated into WinkingFace's custom version of transformers, and we recommend using this modified version for optimal compatibility.

To prevent errors such as:

KeyError: 'mit'

install the custom transformers package with the following command:

pip install git+https://github.com/WinkingFaceAI/tfm-recooked.git

This ensures seamless functionality and avoids compatibility issues with the model.
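Once the custom package is installed, the model can be loaded through the standard transformers interface. The sketch below is a minimal, hypothetical quickstart: the repository id WinkingFace/Mit-1.5B and the chat-template workflow are assumptions based on common transformers conventions, not APIs documented here.

```python
# Assumed repo id, taken from the model card title; adjust if your copy differs.
MODEL_ID = "WinkingFace/Mit-1.5B"


def build_messages(user_prompt: str) -> list:
    """Build a minimal chat-style message list for apply_chat_template()."""
    return [{"role": "user", "content": user_prompt}]


if __name__ == "__main__":
    # Requires the custom transformers fork installed with the command above;
    # the stock package may raise KeyError: 'mit' when resolving the model type.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )

    # Render the chat messages into model input ids and generate a reply.
    input_ids = tokenizer.apply_chat_template(
        build_messages("Summarize the benefits of multi-step reasoning."),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output_ids = model.generate(input_ids, max_new_tokens=256)

    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

The generation parameters (max_new_tokens, dtype, device placement) are illustrative defaults; tune them for your hardware and task.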

License

This code repository and the model weights are licensed under the Apache 2.0 License. The Mit series fully supports commercial use and allows modifications and derivative works, including but not limited to distillation for training other LLMs.

Please note that:

Mit-0.5B, Mit-1.5B, Mit-3B, and Mit-7B are derived from the Qwen series, which is also licensed under the Apache 2.0 License.

Contact

For any questions or inquiries, feel free to contact us here 📨.
