Qubitium committed (verified)
Commit 8bc4b26 · 1 Parent(s): 575ff1e

Fix compat with HF Transformers `model.save_pretrained()`


The latest transformers release (4.49.0), and likely earlier versions too, now validates the generation config file on `model.save_pretrained()`. Since non-default `top_k` and `temperature` values are set, `do_sample` must be set to `true`, or else this configuration is self-conflicting.
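The conflict can be illustrated with a small sketch of the kind of consistency check transformers applies (a hypothetical helper, not the actual validation code): sampling-only knobs such as `temperature`, `top_k`, and `top_p` have no effect in greedy decoding, so non-default values alongside `do_sample: false` are flagged.

```python
def check_generation_config(cfg: dict) -> list:
    """Hypothetical sketch: flag sampling parameters that are set to
    non-default values while do_sample is false (greedy decoding)."""
    # Default values as documented for transformers' GenerationConfig.
    sampling_defaults = {"temperature": 1.0, "top_k": 50, "top_p": 1.0}
    issues = []
    if not cfg.get("do_sample", False):
        for key, default in sampling_defaults.items():
            if key in cfg and cfg[key] != default:
                issues.append(
                    f"`{key}`={cfg[key]} has no effect when `do_sample` is false"
                )
    return issues

old_cfg = {"do_sample": False, "temperature": 0.0, "top_k": 5, "top_p": 0.85}
new_cfg = {**old_cfg, "do_sample": True}

print(check_generation_config(old_cfg))  # three conflicts flagged
print(check_generation_config(new_cfg))  # []
```

With `do_sample` flipped to `true`, as in this commit, the check passes and `save_pretrained()` no longer rejects the config.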

Files changed (1)
  1. generation_config.json +15 -15
generation_config.json CHANGED
@@ -1,15 +1,15 @@
-{
-  "max_new_tokens": 1000,
-  "do_sample": false,
-  "use_cache": true,
-  "temperature": 0.0,
-  "top_k": 5,
-  "top_p": 0.85,
-  "repetition_penalty": 1.05,
-  "pad_token_id": 3,
-  "bos_token_id": 1,
-  "eos_token_id": 2,
-  "user_token_id": 4,
-  "bot_token_id": 5,
-  "start_token_id": 1
-}
+{
+  "max_new_tokens": 1000,
+  "do_sample": true,
+  "use_cache": true,
+  "temperature": 0.0,
+  "top_k": 5,
+  "top_p": 0.85,
+  "repetition_penalty": 1.05,
+  "pad_token_id": 3,
+  "bos_token_id": 1,
+  "eos_token_id": 2,
+  "user_token_id": 4,
+  "bot_token_id": 5,
+  "start_token_id": 1
+}