MaxLSB committed · Commit 4116090 (verified) · Parent: 26c01be

Update README.md

Files changed (1): README.md (+24 -0)

README.md CHANGED
 
We used LightEval for evaluation, with custom tasks for the French benchmarks. The models were evaluated with `temperature=0`.
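`temperature=0` corresponds to greedy (deterministic) decoding. LightEval itself is run as a separate harness; the snippet below is only a minimal sketch of reproducing that decoding regime with `transformers`. The model repo id is an assumed placeholder, not confirmed by this commit.

```python
# Minimal sketch of the deterministic decoding setting used for evaluation.
# temperature=0 is equivalent to greedy decoding (do_sample=False in transformers).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MaxLSB/Luth-1.7B-Instruct"  # assumption: replace with the actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

messages = [{"role": "user", "content": "Explique brièvement le théorème de Pythagore."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

# do_sample=False -> greedy decoding, the same deterministic regime as temperature=0
output_ids = model.generate(input_ids, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```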
### Evaluation Visualizations

**French Evaluation:**

![French Evaluation](media/french_evaluation.png)
 
**English Evaluation:**

![English Evaluation](media/english_evaluation.png)
### French Benchmark Scores

| Benchmark | Qwen3-1.7B | SmolLM2-1.7B-Instruct | Qwen2.5-1.5B-Instruct | Luth-1.7B-Instruct |
|-----------------|------------|-----------------------|-----------------------|--------------------|
| ifeval-fr | 54.53 | 31.24 | 32.90 | <u>57.67</u> |
| gpqa-diamond-fr | 26.90 | 21.83 | 28.93 | <u>38.58</u> |
| mmlu-fr | 28.46 | 33.73 | 46.25 | <u>49.66</u> |
| math-500-fr | 60.80 | 11.20 | 32.20 | <u>64.00</u> |
| arc-chall-fr | 33.28 | 28.57 | 32.68 | <u>35.16</u> |
| hellaswag-fr | 24.86 | <u>49.58</u> | 34.34 | 31.93 |
### English Benchmark Scores

| Benchmark | Qwen3-1.7B | SmolLM2-1.7B-Instruct | Qwen2.5-1.5B-Instruct | Luth-1.7B-Instruct |
|-----------------|------------|-----------------------|-----------------------|--------------------|
| ifeval-en | <u>68.39</u> | 48.24 | 39.93 | 65.80 |
| gpqa-diamond-en | <u>31.82</u> | 24.75 | 30.30 | 31.82 |
| mmlu-en | 52.74 | 50.27 | 59.81 | <u>60.19</u> |
| math-500-en | 69.20 | 22.40 | 56.00 | <u>70.00</u> |
| arc-chall-en | 36.09 | 42.32 | 41.04 | <u>42.24</u> |
| hellaswag-en | 46.96 | <u>66.94</u> | 64.48 | 58.55 |
## Citation

```bibtex