T5-small distilled from T5-flan-base on OpenOrca's FLAN dataset with a Seq2seq (sequence-level) distillation methodology. It serves as the baseline for Seq2seq distillation of T5 encoder-decoder models. It is released under the Apache 2.0 License.
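
A minimal sketch of what sequence-level (Seq2seq) distillation looks like in practice: the teacher generates target sequences for FLAN-style prompts and the student is fine-tuned on those pseudo-labels with ordinary cross-entropy. The teacher checkpoint name, hyperparameters, and prompt below are assumptions for illustration; the card does not document the exact training setup.

```python
# Sketch of sequence-level (Seq2seq) distillation, assuming the teacher is
# google/flan-t5-base and the student is t5-small; hyperparameters are
# illustrative only and not taken from this card.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

teacher_name = "google/flan-t5-base"   # assumed teacher checkpoint
student_name = "t5-small"              # student architecture (~60M params)

tokenizer = AutoTokenizer.from_pretrained(teacher_name)
teacher = AutoModelForSeq2SeqLM.from_pretrained(teacher_name).eval()
student = AutoModelForSeq2SeqLM.from_pretrained(student_name)

optimizer = torch.optim.AdamW(student.parameters(), lr=3e-4)

def distill_step(prompts):
    # 1) Teacher generates target sequences (pseudo-labels) for the prompts.
    enc = tokenizer(prompts, return_tensors="pt", padding=True, truncation=True)
    with torch.no_grad():
        pseudo = teacher.generate(**enc, max_new_tokens=64)
    labels = pseudo.clone()
    labels[labels == tokenizer.pad_token_id] = -100  # ignore padding in the loss

    # 2) Student is trained with cross-entropy against the teacher outputs.
    out = student(input_ids=enc.input_ids,
                  attention_mask=enc.attention_mask,
                  labels=labels)
    out.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return out.loss.item()

# Example usage with a single FLAN-style prompt:
loss = distill_step(["Answer the question: What is the capital of France?"])
print(f"distillation loss: {loss:.4f}")
```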

The distilled model is a T5-small model with 60M parameters, not a T5-flan-small model.
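
A minimal inference sketch, assuming the checkpoint loads with the standard transformers Seq2Seq classes; the prompt format is only illustrative.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "Sayan01/T5-Flan-Small"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("Translate English to German: The house is wonderful.",
                   return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```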
