Classifier
This model is a fine-tuned version of distilbert/distilbert-base-uncased on an unknown dataset. Its results on the evaluation set are reported in the training results table below.
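Since the card does not include a usage snippet, a minimal inference sketch using the transformers pipeline API is shown below. The repository id is a placeholder, as the card does not state where this checkpoint is published.

```python
from transformers import pipeline

# Placeholder repository id; point it at wherever this checkpoint is actually
# hosted on the Hub or saved locally.
model_id = "your-username/classifier"

# The checkpoint is a DistilBERT sequence-classification model, so the
# text-classification pipeline loads the tokenizer and model together.
classifier = pipeline("text-classification", model=model_id)

print(classifier("An example sentence to classify."))
# Output is a list like [{'label': ..., 'score': ...}]; the label names depend
# on the id2label mapping used during fine-tuning, which this card does not state.
```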
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
Training hyperparameters

The following hyperparameters were used during training:

Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|---|---|---|---|---|---|
| No log | 0 | 0 | 0.6991 | 0.3573 | 0.4668 |
| No log | 0.6006 | 188 | 0.3642 | 0.8505 | 0.7246 |
| No log | 1.2013 | 376 | 0.3155 | 0.8761 | 0.7717 |
| 0.3491 | 1.8019 | 564 | 0.3068 | 0.8833 | 0.7972 |
| 0.3491 | 2.4026 | 752 | 0.3198 | 0.8833 | 0.8016 |
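The Accuracy and F1 columns come from a metric function evaluated every 188 training steps. Below is a minimal sketch of a compute_metrics hook of the kind typically passed to the Hugging Face Trainer; the number of labels, the F1 averaging mode, and all hyperparameters are assumptions, since the card does not state them.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Base checkpoint named in the card; num_labels=2 is an assumption, as the card
# does not say how many classes the classifier has.
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert/distilbert-base-uncased", num_labels=2
)
tokenizer = AutoTokenizer.from_pretrained("distilbert/distilbert-base-uncased")

def compute_metrics(eval_pred):
    """Produce the Accuracy and F1 columns reported in the results table."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        # Binary F1 is assumed here; the averaging mode is not stated in the card.
        "f1": f1_score(labels, preds),
    }
```

Such a function would be supplied to the Trainer through its compute_metrics argument, with evaluation scheduled every 188 steps to match the cadence of the table above.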