RoBERTa

This model handles the zero-width non-joiner (ZWNJ) character used in Persian writing. It was also trained on new corpora covering multiple text types, with a new vocabulary.
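
As a quick reference, here is a minimal usage sketch assuming the standard Hugging Face `transformers` API and the model ID `HooshvareLab/roberta-fa-zwnj-base`; the Persian example word and fill-mask sentence are illustrative only, not taken from the training data.

```python
# Minimal sketch: load the tokenizer and masked-LM head, then inspect how a
# word containing a zero-width non-joiner (U+200C) is tokenized.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_id = "HooshvareLab/roberta-fa-zwnj-base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# "می‌رود" ("goes") contains a ZWNJ between its two parts; the model card states
# the model is built to handle this character rather than ignore it.
print(tokenizer.tokenize("می‌رود"))

# Fill-mask example using the tokenizer's own mask token.
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill(f"این مدل برای زبان فارسی {tokenizer.mask_token} شده است."))
```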

Questions?

Post a GitHub issue on the ParsRoBERTa Issues repo.
