This model is a retraining of "emanjavacas/GysBERT-v2": all layers were trained for 15 epochs on the "arch-be/brabant-xvii" dataset with a masked language modelling objective.
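The retraining described above could be reproduced along the following lines, a minimal sketch using the Hugging Face `transformers` and `datasets` libraries. The model and dataset names and the 15 epochs come from this card; the text column name, batch size, learning rate, and masking probability are illustrative assumptions, not the card's actual settings.

```python
# Hypothetical sketch of the masked-language-model retraining described above.
# Only MODEL_NAME, DATASET_NAME, and NUM_EPOCHS come from the model card;
# everything else is an assumed default.

MODEL_NAME = "emanjavacas/GysBERT-v2"
DATASET_NAME = "arch-be/brabant-xvii"
NUM_EPOCHS = 15  # stated in the model card


def train():
    # Heavy dependencies are imported inside the function so the constants
    # above can be inspected without transformers/datasets installed.
    from datasets import load_dataset
    from transformers import (
        AutoModelForMaskedLM,
        AutoTokenizer,
        DataCollatorForLanguageModeling,
        Trainer,
        TrainingArguments,
    )

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    # No layers are frozen, so all weights are updated during training.
    model = AutoModelForMaskedLM.from_pretrained(MODEL_NAME)

    dataset = load_dataset(DATASET_NAME, split="train")
    tokenized = dataset.map(
        # Assumes the dataset exposes a "text" column.
        lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
        batched=True,
        remove_columns=dataset.column_names,
    )

    # Standard MLM collator: randomly masks tokens for the model to predict.
    collator = DataCollatorForLanguageModeling(
        tokenizer=tokenizer, mlm_probability=0.15
    )
    args = TrainingArguments(
        output_dir="gysbert-v2-brabant-xvii",
        num_train_epochs=NUM_EPOCHS,
        per_device_train_batch_size=16,  # assumption
        learning_rate=5e-5,              # assumption
    )
    Trainer(
        model=model,
        args=args,
        train_dataset=tokenized,
        data_collator=collator,
    ).train()
```

Calling `train()` runs the full retraining; the actual run behind this model may have used different hyperparameters.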