
ai4bharat/IndicBERTv2-MLM-only
Fill-Mask • Updated • 245k • 7
IndicBERT v2 is a multilingual BERT model pretrained on IndicCorp v2, an Indic monolingual corpus of 20.9 billion tokens, covering 24 constitutionally recognized Indian languages.
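
A minimal usage sketch with the Hugging Face `transformers` fill-mask pipeline, based on the task and model ID shown on this card. The Hindi example sentence is purely illustrative, and the code assumes the tokenizer exposes a standard mask token rather than hard-coding one.

```python
from transformers import pipeline

# Load the model for masked-token prediction (task taken from this card).
fill_mask = pipeline("fill-mask", model="ai4bharat/IndicBERTv2-MLM-only")

# Use the tokenizer's own mask token instead of assuming "[MASK]".
mask = fill_mask.tokenizer.mask_token

# Illustrative Hindi sentence: "New Delhi is the ___ of India."
for prediction in fill_mask(f"नई दिल्ली भारत की {mask} है।"):
    print(prediction["token_str"], round(prediction["score"], 3))
```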