V2 -- 86 perplexity, 28% accuracy (epoch 0: train_loss 4.610025, valid_loss 4.462641, accuracy 0.289884, perplexity 86.716248, time 2:34:50)
README.md
CHANGED
@@ -1 +1,19 @@
 gpt2-turkish-wiki
+
+The current version is a demo only, trained on a small amount of Turkish Wikipedia text.
+
+Based on a modified version of https://github.com/piegu/fastai-projects/blob/master/finetuning-English-GPT2-any-language-Portuguese-HuggingFace-fastaiv2_FAST.ipynb
+
+Inference is not very good at the moment.
+
+epoch  train_loss  valid_loss  accuracy  perplexity  time
+0      4.373726    5.398773    0.264228  221.134857  02:56
+1      4.264910    5.344171    0.267870  209.384140  02:54
+
+
+TODO: the full Turkish Wikipedia dump is a 3 GB XML file.
+
+Training for 1 epoch on the full Turkish Wikipedia gave some good results; this page will be updated when the full model is available.
+epoch  train_loss  valid_loss  accuracy  perplexity  time
+0      3.948997    4.001249    0.330571  54.666405   2:41:54
+
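
For reference, the perplexity values in the tables above are simply exp(valid_loss), so the two columns track each other. A minimal check in Python (not part of the training notebook):

```python
import math

# valid_loss values from the tables above; perplexity = exp(valid_loss)
for valid_loss in (5.398773, 5.344171, 4.001249):
    print(f"valid_loss={valid_loss:.6f}  perplexity={math.exp(valid_loss):.6f}")
# Prints roughly 221.134857, 209.384140 and 54.666405,
# matching the perplexity column above.
```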
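
The README does not include a usage snippet yet; the sketch below shows one plausible way to run inference with the Hugging Face transformers library, assuming the fine-tuned checkpoint and tokenizer were exported to a local directory. The `gpt2-turkish-wiki` path, the prompt, and the sampling settings are placeholders, not taken from the notebook.

```python
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# Placeholder: directory (or Hub id) where the fine-tuned Turkish GPT-2
# checkpoint and its tokenizer were saved after training.
MODEL_DIR = "gpt2-turkish-wiki"

tokenizer = GPT2TokenizerFast.from_pretrained(MODEL_DIR)
model = GPT2LMHeadModel.from_pretrained(MODEL_DIR)
model.eval()

prompt = "Türkiye'nin başkenti"  # "The capital of Turkey"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Sampled generation; output quality is expected to be limited, as noted above.
output_ids = model.generate(
    input_ids,
    max_length=60,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```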