# API Endpoint Extractor Model
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

This model is a fine-tuned version of [distilbert/distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased) on an unspecified dataset (recorded as `None` by the Trainer).
It achieves the following results on the evaluation set:

- Loss: 1.2141
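
The card does not record the task head, so exact usage depends on how the model was exported. A minimal sketch, assuming a token-classification head that tags API-endpoint mentions in text and a hypothetical repo id `your-username/api-endpoint-extractor`:

```python
from transformers import pipeline

# Hypothetical repo id -- replace with the actual model location.
# Assumes the fine-tune added a token-classification head for tagging
# API endpoints; adjust the task string if the head differs.
extractor = pipeline(
    "token-classification",
    model="your-username/api-endpoint-extractor",
    aggregation_strategy="simple",  # merge sub-word pieces into full spans
)

text = "Send a POST request to /api/v1/users to create an account."
for span in extractor(text):
    print(span["entity_group"], span["word"], round(span["score"], 3))
```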

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (mirrored in the sketch after the list):
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- distributed_type: tpu
- optimizer: AdamW (`OptimizerNames.ADAMW_TORCH`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
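
A minimal sketch of how these settings map onto `transformers.TrainingArguments`. The learning rate is not recorded on this card, so it is left at the library default; `output/` is a placeholder, and TPU distribution (`distributed_type: tpu`) is configured by the launcher (e.g. `accelerate`), not by these arguments:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above. learning_rate is not recorded
# on the card, so the library default applies; "output/" is a placeholder.
training_args = TrainingArguments(
    output_dir="output/",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",          # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,               # betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=3,
)
```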

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 1.0   | 1    | 1.1371          |
| No log        | 2.0   | 2    | 1.1911          |
| No log        | 3.0   | 3    | 1.2141          |

### Framework versions

- Transformers 4.53.1
- Pytorch 2.6.0+cpu
- Datasets 4.0.0
- Tokenizers 0.21.2
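
To verify that a local environment matches these versions before reproducing the run, a quick check (assumes the four packages are installed):

```python
import datasets
import tokenizers
import torch
import transformers

# Expected, per the card: Transformers 4.53.1, Pytorch 2.6.0+cpu,
# Datasets 4.0.0, Tokenizers 0.21.2.
for name, module in [
    ("transformers", transformers),
    ("torch", torch),
    ("datasets", datasets),
    ("tokenizers", tokenizers),
]:
    print(f"{name}: {module.__version__}")
```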