ErrorAI committed
Commit 9595945 · verified · 1 Parent(s): 260b0f9

Training in progress, step 385, checkpoint
last-checkpoint/README.md ADDED
@@ -0,0 +1,202 @@
---
base_model: Xenova/tiny-random-Phi3ForCausalLM
library_name: peft
---

# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->



## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->



- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]


#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary



## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]

### Framework versions

- PEFT 0.13.2
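The card's "How to Get Started with the Model" section is still a placeholder. A minimal loading sketch, assuming this commit's files sit in a local `last-checkpoint/` directory and using the base model declared in the metadata above:

```python
# Minimal sketch: load the LoRA adapter from this checkpoint on top of the base model.
# Assumes ./last-checkpoint contains the files shown in this commit.
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained("Xenova/tiny-random-Phi3ForCausalLM")
model = PeftModel.from_pretrained(base, "last-checkpoint")  # reads adapter_config.json + adapter_model.safetensors
tokenizer = AutoTokenizer.from_pretrained("last-checkpoint")

inputs = tokenizer("Hello", return_tensors="pt")
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=8)[0]))
```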
last-checkpoint/adapter_config.json ADDED
@@ -0,0 +1,31 @@
{
  "alpha_pattern": {},
  "auto_mapping": null,
  "base_model_name_or_path": "Xenova/tiny-random-Phi3ForCausalLM",
  "bias": "none",
  "fan_in_fan_out": null,
  "inference_mode": true,
  "init_lora_weights": true,
  "layer_replication": null,
  "layers_pattern": null,
  "layers_to_transform": null,
  "loftq_config": {},
  "lora_alpha": 16,
  "lora_dropout": 0.05,
  "megatron_config": null,
  "megatron_core": "megatron.core",
  "modules_to_save": null,
  "peft_type": "LORA",
  "r": 8,
  "rank_pattern": {},
  "revision": null,
  "target_modules": [
    "down_proj",
    "gate_up_proj",
    "o_proj",
    "qkv_proj"
  ],
  "task_type": "CAUSAL_LM",
  "use_dora": false,
  "use_rslora": false
}
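For reference, a `peft.LoraConfig` equivalent to this JSON can be written directly in Python; a sketch of the adapter setup it records (rank 8, alpha 16, dropout 0.05):

```python
# Sketch: a LoraConfig matching the adapter_config.json above (peft 0.13.x).
from peft import LoraConfig

config = LoraConfig(
    r=8,               # LoRA rank
    lora_alpha=16,     # effective scaling is lora_alpha / r = 2.0
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["down_proj", "gate_up_proj", "o_proj", "qkv_proj"],
)
```

Note that `qkv_proj` and `gate_up_proj` are Phi-3's fused attention and MLP projections, so a single LoRA pair per layer covers Q/K/V (and gate+up) at once.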
last-checkpoint/adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:cfe6fc6df51d06ee936c0477399646cd75ff000d17d37e3546334ff1bb5116cb
size 30696
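Like `optimizer.pt`, `rng_state.pth`, `scheduler.pt`, and `tokenizer.model` below, this file is stored as a Git LFS pointer: the repo records only a SHA-256 `oid` and a byte `size`. A small sketch for verifying a downloaded file against such a pointer:

```python
# Sketch: verify a downloaded file against the oid/size in its Git LFS pointer.
import hashlib
from pathlib import Path

def verify_lfs(path: str, oid: str, size: int) -> bool:
    data = Path(path).read_bytes()
    return len(data) == size and hashlib.sha256(data).hexdigest() == oid

print(verify_lfs(
    "last-checkpoint/adapter_model.safetensors",
    oid="cfe6fc6df51d06ee936c0477399646cd75ff000d17d37e3546334ff1bb5116cb",
    size=30696,
))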
last-checkpoint/added_tokens.json ADDED
@@ -0,0 +1,13 @@
{
  "<|assistant|>": 32001,
  "<|endoftext|>": 32000,
  "<|end|>": 32007,
  "<|placeholder1|>": 32002,
  "<|placeholder2|>": 32003,
  "<|placeholder3|>": 32004,
  "<|placeholder4|>": 32005,
  "<|placeholder5|>": 32008,
  "<|placeholder6|>": 32009,
  "<|system|>": 32006,
  "<|user|>": 32010
}
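These are the Phi-3 chat-role tokens appended after the 32,000-entry SentencePiece vocabulary. A quick sketch to confirm the mapping once the checkpoint's tokenizer is loaded (assuming the LFS-backed `tokenizer.model` has been fetched):

```python
# Sketch: the added tokens above should round-trip through the tokenizer.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("last-checkpoint")
assert tok.convert_tokens_to_ids("<|assistant|>") == 32001
assert tok.convert_tokens_to_ids("<|user|>") == 32010
assert tok.eos_token_id == 32000  # <|endoftext|>, per special_tokens_map.json below
```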
last-checkpoint/optimizer.pt ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:88b23314e217f8991861757cd55afbc9bc4c21e3aa4e15f2419485d79fcd2451
size 67326
last-checkpoint/rng_state.pth ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:973d9879f912c51152093c19bae8acaf4c7aad9def9c5404775b9d0cea67c7e8
size 14244
last-checkpoint/scheduler.pt ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:aa44d66df4458ea783f3e65cc7f2c60ca4e0f16b8d7237402e7fb2588f2426a9
size 1064
last-checkpoint/special_tokens_map.json ADDED
@@ -0,0 +1,30 @@
{
  "bos_token": {
    "content": "<s>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "eos_token": {
    "content": "<|endoftext|>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "<|endoftext|>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "<unk>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
last-checkpoint/tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
last-checkpoint/tokenizer.model ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347
size 499723
last-checkpoint/tokenizer_config.json ADDED
@@ -0,0 +1,131 @@
{
  "add_bos_token": true,
  "add_eos_token": false,
  "add_prefix_space": null,
  "added_tokens_decoder": {
    "0": {
      "content": "<unk>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "1": {
      "content": "<s>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "2": {
      "content": "</s>",
      "lstrip": false,
      "normalized": false,
      "rstrip": true,
      "single_word": false,
      "special": false
    },
    "32000": {
      "content": "<|endoftext|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "32001": {
      "content": "<|assistant|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": true,
      "single_word": false,
      "special": true
    },
    "32002": {
      "content": "<|placeholder1|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": true,
      "single_word": false,
      "special": true
    },
    "32003": {
      "content": "<|placeholder2|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": true,
      "single_word": false,
      "special": true
    },
    "32004": {
      "content": "<|placeholder3|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": true,
      "single_word": false,
      "special": true
    },
    "32005": {
      "content": "<|placeholder4|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": true,
      "single_word": false,
      "special": true
    },
    "32006": {
      "content": "<|system|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": true,
      "single_word": false,
      "special": true
    },
    "32007": {
      "content": "<|end|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": true,
      "single_word": false,
      "special": true
    },
    "32008": {
      "content": "<|placeholder5|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": true,
      "single_word": false,
      "special": true
    },
    "32009": {
      "content": "<|placeholder6|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": true,
      "single_word": false,
      "special": true
    },
    "32010": {
      "content": "<|user|>",
      "lstrip": false,
      "normalized": false,
      "rstrip": true,
      "single_word": false,
      "special": true
    }
  },
  "bos_token": "<s>",
  "chat_template": "{% if not add_generation_prompt is defined %}{% set add_generation_prompt = false %}{% endif %}{% set loop_messages = messages %}{% for message in loop_messages %}{% set content = '<|start_header_id|>' + message['role'] + '<|end_header_id|>\n\n'+ message['content'] | trim + '<|eot_id|>' %}{% if loop.index0 == 0 %}{% set content = bos_token + content %}{% endif %}{{ content }}{% endfor %}{% if add_generation_prompt %}{{ '<|start_header_id|>assistant<|end_header_id|>\n\n' }}{% endif %}",
  "clean_up_tokenization_spaces": false,
  "eos_token": "<|endoftext|>",
  "legacy": true,
  "model_max_length": 4096,
  "pad_token": "<|endoftext|>",
  "padding_side": "left",
  "sp_model_kwargs": {},
  "tokenizer_class": "LlamaTokenizer",
  "unk_token": "<unk>",
  "use_default_system_prompt": false
}
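Note that the `chat_template` here emits Llama-3-style `<|start_header_id|>`/`<|eot_id|>` markers rather than the Phi-3 `<|user|>`/`<|assistant|>` tokens defined above, so rendered prompts will not use the added special tokens. A sketch of rendering a conversation with this template:

```python
# Sketch: render a conversation through the chat_template from tokenizer_config.json.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("last-checkpoint")
messages = [{"role": "user", "content": "Hi there"}]
prompt = tok.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
# -> <s><|start_header_id|>user<|end_header_id|>\n\nHi there<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n
```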
last-checkpoint/trainer_state.json ADDED
@@ -0,0 +1,2728 @@
{
  "best_metric": null,
  "best_model_checkpoint": null,
  "epoch": 0.011796156291411019,
  "eval_steps": 500,
  "global_step": 385,
  "is_hyper_param_search": false,
  "is_local_process_zero": true,
  "is_world_process_zero": true,
  "log_history": [
    {"epoch": 3.0639366990677975e-05, "grad_norm": 0.24610228836536407, "learning_rate": 2e-05, "loss": 10.4009, "step": 1},
    {"epoch": 6.127873398135595e-05, "grad_norm": 0.2093752771615982, "learning_rate": 4e-05, "loss": 10.3915, "step": 2},
    {"epoch": 9.191810097203391e-05, "grad_norm": 0.2284351885318756, "learning_rate": 6e-05, "loss": 10.3906, "step": 3},
    {"epoch": 0.0001225574679627119, "grad_norm": 0.2171163409948349, "learning_rate": 8e-05, "loss": 10.3869, "step": 4},
    {"epoch": 0.00015319683495338985, "grad_norm": 0.2038799673318863, "learning_rate": 0.0001, "loss": 10.3949, "step": 5},
    {"epoch": 0.00018383620194406782, "grad_norm": 0.2396441251039505, "learning_rate": 9.999989500822154e-05, "loss": 10.4008, "step": 6},
    {"epoch": 0.0002144755689347458, "grad_norm": 0.26151952147483826, "learning_rate": 9.999958003332706e-05, "loss": 10.4042, "step": 7},
    {"epoch": 0.0002451149359254238, "grad_norm": 0.24929848313331604, "learning_rate": 9.999905507663936e-05, "loss": 10.4017, "step": 8},
    {"epoch": 0.0002757543029161018, "grad_norm": 0.24374260008335114, "learning_rate": 9.999832014036307e-05, "loss": 10.4036, "step": 9},
    {"epoch": 0.0003063936699067797, "grad_norm": 0.24356040358543396, "learning_rate": 9.999737522758472e-05, "loss": 10.4014, "step": 10},
    {"epoch": 0.0003370330368974577, "grad_norm": 0.22188138961791992, "learning_rate": 9.99962203422726e-05, "loss": 10.3966, "step": 11},
    {"epoch": 0.00036767240388813565, "grad_norm": 0.23609934747219086, "learning_rate": 9.999485548927685e-05, "loss": 10.3977, "step": 12},
    {"epoch": 0.0003983117708788136, "grad_norm": 0.24605651199817657, "learning_rate": 9.999328067432943e-05, "loss": 10.3925, "step": 13},
    {"epoch": 0.0004289511378694916, "grad_norm": 0.2491612583398819, "learning_rate": 9.9991495904044e-05, "loss": 10.3905, "step": 14},
    {"epoch": 0.0004595905048601696, "grad_norm": 0.2473774403333664, "learning_rate": 9.998950118591606e-05, "loss": 10.3967, "step": 15},
    {"epoch": 0.0004902298718508476, "grad_norm": 0.24023188650608063, "learning_rate": 9.998729652832273e-05, "loss": 10.3919, "step": 16},
    {"epoch": 0.0005208692388415255, "grad_norm": 0.249563068151474, "learning_rate": 9.998488194052287e-05, "loss": 10.3966, "step": 17},
    {"epoch": 0.0005515086058322036, "grad_norm": 0.2716507911682129, "learning_rate": 9.998225743265693e-05, "loss": 10.3958, "step": 18},
    {"epoch": 0.0005821479728228815, "grad_norm": 0.251860648393631, "learning_rate": 9.997942301574701e-05, "loss": 10.3934, "step": 19},
    {"epoch": 0.0006127873398135594, "grad_norm": 0.2491675615310669, "learning_rate": 9.997637870169672e-05, "loss": 10.3971, "step": 20},
    {"epoch": 0.0006434267068042374, "grad_norm": 0.2549178898334503, "learning_rate": 9.997312450329115e-05, "loss": 10.3912, "step": 21},
    {"epoch": 0.0006740660737949153, "grad_norm": 0.24476251006126404, "learning_rate": 9.99696604341969e-05, "loss": 10.3781, "step": 22},
    {"epoch": 0.0007047054407855934, "grad_norm": 0.24391278624534607, "learning_rate": 9.996598650896192e-05, "loss": 10.3703, "step": 23},
    {"epoch": 0.0007353448077762713, "grad_norm": 0.2502497434616089, "learning_rate": 9.996210274301546e-05, "loss": 10.3699, "step": 24},
    {"epoch": 0.0007659841747669493, "grad_norm": 0.24495452642440796, "learning_rate": 9.995800915266809e-05, "loss": 10.3641, "step": 25},
    {"epoch": 0.0007966235417576273, "grad_norm": 0.24135306477546692, "learning_rate": 9.995370575511151e-05, "loss": 10.3599, "step": 26},
    {"epoch": 0.0008272629087483053, "grad_norm": 0.22803382575511932, "learning_rate": 9.99491925684186e-05, "loss": 10.3602, "step": 27},
    {"epoch": 0.0008579022757389832, "grad_norm": 0.23162369430065155, "learning_rate": 9.994446961154325e-05, "loss": 10.3633, "step": 28},
    {"epoch": 0.0008885416427296612, "grad_norm": 0.2333999127149582, "learning_rate": 9.993953690432031e-05, "loss": 10.3599, "step": 29},
    {"epoch": 0.0009191810097203392, "grad_norm": 0.2367755025625229, "learning_rate": 9.993439446746558e-05, "loss": 10.3586, "step": 30},
    {"epoch": 0.0009498203767110172, "grad_norm": 0.2526112496852875, "learning_rate": 9.992904232257555e-05, "loss": 10.36, "step": 31},
    {"epoch": 0.0009804597437016952, "grad_norm": 0.2687104642391205, "learning_rate": 9.99234804921275e-05, "loss": 10.3609, "step": 32},
    {"epoch": 0.0010110991106923731, "grad_norm": 0.27773356437683105, "learning_rate": 9.991770899947926e-05, "loss": 10.3589, "step": 33},
    {"epoch": 0.001041738477683051, "grad_norm": 0.28705063462257385, "learning_rate": 9.991172786886922e-05, "loss": 10.3587, "step": 34},
    {"epoch": 0.001072377844673729, "grad_norm": 0.296935111284256, "learning_rate": 9.990553712541617e-05, "loss": 10.3589, "step": 35},
    {"epoch": 0.0011030172116644071, "grad_norm": 0.3149632513523102, "learning_rate": 9.989913679511919e-05, "loss": 10.3625, "step": 36},
    {"epoch": 0.001133656578655085, "grad_norm": 0.3028753995895386, "learning_rate": 9.989252690485755e-05, "loss": 10.3755, "step": 37},
    {"epoch": 0.001164295945645763, "grad_norm": 0.3272389769554138, "learning_rate": 9.988570748239062e-05, "loss": 10.3758, "step": 38},
    {"epoch": 0.0011949353126364409, "grad_norm": 0.35342222452163696, "learning_rate": 9.987867855635776e-05, "loss": 10.3697, "step": 39},
    {"epoch": 0.0012255746796271188, "grad_norm": 0.37038740515708923, "learning_rate": 9.987144015627809e-05, "loss": 10.3771, "step": 40},
    {"epoch": 0.001256214046617797, "grad_norm": 0.435486763715744, "learning_rate": 9.986399231255056e-05, "loss": 10.3806, "step": 41},
    {"epoch": 0.0012868534136084749, "grad_norm": 0.43793022632598877, "learning_rate": 9.985633505645364e-05, "loss": 10.3783, "step": 42},
    {"epoch": 0.0013174927805991528, "grad_norm": 0.3990364372730255, "learning_rate": 9.98484684201453e-05, "loss": 10.3722, "step": 43},
    {"epoch": 0.0013481321475898307, "grad_norm": 0.3756003677845001, "learning_rate": 9.984039243666283e-05, "loss": 10.3695, "step": 44},
    {"epoch": 0.0013787715145805088, "grad_norm": 0.38554298877716064, "learning_rate": 9.983210713992268e-05, "loss": 10.3747, "step": 45},
    {"epoch": 0.0014094108815711868, "grad_norm": 0.3810689449310303, "learning_rate": 9.98236125647204e-05, "loss": 10.3769, "step": 46},
    {"epoch": 0.0014400502485618647, "grad_norm": 0.3995368182659149, "learning_rate": 9.981490874673039e-05, "loss": 10.3882, "step": 47},
    {"epoch": 0.0014706896155525426, "grad_norm": 0.3769223690032959, "learning_rate": 9.980599572250584e-05, "loss": 10.3825, "step": 48},
    {"epoch": 0.0015013289825432207, "grad_norm": 0.38782820105552673, "learning_rate": 9.97968735294785e-05, "loss": 10.3838, "step": 49},
    {"epoch": 0.0015319683495338987, "grad_norm": 0.3910014033317566, "learning_rate": 9.978754220595861e-05, "loss": 10.3778, "step": 50},
    {"epoch": 0.0015626077165245766, "grad_norm": 0.41532161831855774, "learning_rate": 9.977800179113463e-05, "loss": 10.3535, "step": 51},
    {"epoch": 0.0015932470835152545, "grad_norm": 0.36595699191093445, "learning_rate": 9.97682523250732e-05, "loss": 10.3524, "step": 52},
    {"epoch": 0.0016238864505059326, "grad_norm": 0.4012646973133087, "learning_rate": 9.975829384871884e-05, "loss": 10.3407, "step": 53},
    {"epoch": 0.0016545258174966106, "grad_norm": 0.3944823443889618, "learning_rate": 9.974812640389388e-05, "loss": 10.3522, "step": 54},
    {"epoch": 0.0016851651844872885, "grad_norm": 0.4177343547344208, "learning_rate": 9.973775003329824e-05, "loss": 10.3444, "step": 55},
    {"epoch": 0.0017158045514779664, "grad_norm": 0.45602279901504517, "learning_rate": 9.97271647805093e-05, "loss": 10.3406, "step": 56},
    {"epoch": 0.0017464439184686445, "grad_norm": 0.48096930980682373, "learning_rate": 9.971637068998159e-05, "loss": 10.3446, "step": 57},
    {"epoch": 0.0017770832854593225, "grad_norm": 0.4903767704963684, "learning_rate": 9.970536780704678e-05, "loss": 10.3438, "step": 58},
    {"epoch": 0.0018077226524500004, "grad_norm": 0.502999484539032, "learning_rate": 9.969415617791336e-05, "loss": 10.3429, "step": 59},
    {"epoch": 0.0018383620194406783, "grad_norm": 0.47343501448631287, "learning_rate": 9.968273584966644e-05, "loss": 10.3432, "step": 60},
    {"epoch": 0.0018690013864313562, "grad_norm": 0.5070235729217529, "learning_rate": 9.96711068702677e-05, "loss": 10.341, "step": 61},
    {"epoch": 0.0018996407534220344, "grad_norm": 0.49711889028549194, "learning_rate": 9.965926928855499e-05, "loss": 10.3328, "step": 62},
    {"epoch": 0.0019302801204127123, "grad_norm": 0.48925232887268066, "learning_rate": 9.964722315424227e-05, "loss": 10.3333, "step": 63},
    {"epoch": 0.0019609194874033904, "grad_norm": 0.48452919721603394, "learning_rate": 9.963496851791935e-05, "loss": 10.3311, "step": 64},
    {"epoch": 0.001991558854394068, "grad_norm": 0.492946982383728, "learning_rate": 9.962250543105167e-05, "loss": 10.3337, "step": 65},
    {"epoch": 0.0020221982213847463, "grad_norm": 0.5217669606208801, "learning_rate": 9.960983394598009e-05, "loss": 10.3336, "step": 66},
    {"epoch": 0.002052837588375424, "grad_norm": 0.5229796767234802, "learning_rate": 9.959695411592068e-05, "loss": 10.3299, "step": 67},
    {"epoch": 0.002083476955366102, "grad_norm": 0.5266199707984924, "learning_rate": 9.95838659949645e-05, "loss": 10.332, "step": 68},
    {"epoch": 0.0021141163223567802, "grad_norm": 0.5694423317909241, "learning_rate": 9.957056963807736e-05, "loss": 10.325, "step": 69},
    {"epoch": 0.002144755689347458, "grad_norm": 0.5378698706626892, "learning_rate": 9.955706510109957e-05, "loss": 10.3161, "step": 70},
    {"epoch": 0.002175395056338136, "grad_norm": 0.4891819655895233, "learning_rate": 9.954335244074574e-05, "loss": 10.3199, "step": 71},
    {"epoch": 0.0022060344233288142, "grad_norm": 0.4915742874145508, "learning_rate": 9.952943171460455e-05, "loss": 10.3132, "step": 72},
    {"epoch": 0.002236673790319492, "grad_norm": 0.4707483649253845, "learning_rate": 9.951530298113847e-05, "loss": 10.3078, "step": 73},
    {"epoch": 0.00226731315731017, "grad_norm": 0.5317016243934631, "learning_rate": 9.950096629968352e-05, "loss": 10.2982, "step": 74},
    {"epoch": 0.0022979525243008478, "grad_norm": 0.5227375626564026, "learning_rate": 9.948642173044905e-05, "loss": 10.2931, "step": 75},
    {"epoch": 0.002328591891291526, "grad_norm": 0.4966718852519989, "learning_rate": 9.94716693345175e-05, "loss": 10.2857, "step": 76},
    {"epoch": 0.002359231258282204, "grad_norm": 0.498467355966568, "learning_rate": 9.945670917384403e-05, "loss": 10.2837, "step": 77},
    {"epoch": 0.0023898706252728818, "grad_norm": 0.5161658525466919, "learning_rate": 9.944154131125642e-05, "loss": 10.2833, "step": 78},
    {"epoch": 0.00242050999226356, "grad_norm": 0.4879140853881836, "learning_rate": 9.942616581045473e-05, "loss": 10.2878, "step": 79},
    {"epoch": 0.0024511493592542376, "grad_norm": 0.503729522228241, "learning_rate": 9.941058273601096e-05, "loss": 10.289, "step": 80},
    {"epoch": 0.0024817887262449157, "grad_norm": 0.48749202489852905, "learning_rate": 9.939479215336893e-05, "loss": 10.2847, "step": 81},
    {"epoch": 0.002512428093235594, "grad_norm": 0.5124345421791077, "learning_rate": 9.93787941288439e-05, "loss": 10.2828, "step": 82},
    {"epoch": 0.0025430674602262716, "grad_norm": 0.5024367570877075, "learning_rate": 9.936258872962228e-05, "loss": 10.2694, "step": 83},
    {"epoch": 0.0025737068272169497, "grad_norm": 0.4918675720691681, "learning_rate": 9.934617602376142e-05, "loss": 10.2753, "step": 84},
    {"epoch": 0.002604346194207628, "grad_norm": 0.4941501021385193, "learning_rate": 9.932955608018933e-05, "loss": 10.2718, "step": 85},
    {"epoch": 0.0026349855611983056, "grad_norm": 0.5033401846885681, "learning_rate": 9.931272896870426e-05, "loss": 10.2693, "step": 86},
    {"epoch": 0.0026656249281889837, "grad_norm": 0.49800774455070496, "learning_rate": 9.929569475997457e-05, "loss": 10.269, "step": 87},
    {"epoch": 0.0026962642951796614, "grad_norm": 0.581160306930542, "learning_rate": 9.927845352553831e-05, "loss": 10.2758, "step": 88},
    {"epoch": 0.0027269036621703395, "grad_norm": 0.6085376143455505, "learning_rate": 9.926100533780303e-05, "loss": 10.2805, "step": 89},
    {"epoch": 0.0027575430291610177, "grad_norm": 0.5927174687385559, "learning_rate": 9.924335027004536e-05, "loss": 10.259, "step": 90},
    {"epoch": 0.0027881823961516954, "grad_norm": 0.6134704351425171, "learning_rate": 9.92254883964108e-05, "loss": 10.2574, "step": 91},
    {"epoch": 0.0028188217631423735, "grad_norm": 0.6188032627105713, "learning_rate": 9.920741979191331e-05, "loss": 10.257, "step": 92},
    {"epoch": 0.0028494611301330517, "grad_norm": 0.6642237901687622, "learning_rate": 9.918914453243508e-05, "loss": 10.2553, "step": 93},
    {"epoch": 0.0028801004971237294, "grad_norm": 0.670950710773468, "learning_rate": 9.917066269472623e-05, "loss": 10.2507, "step": 94},
    {"epoch": 0.0029107398641144075, "grad_norm": 0.6101608276367188, "learning_rate": 9.91519743564044e-05, "loss": 10.2564, "step": 95},
    {"epoch": 0.002941379231105085, "grad_norm": 0.48022058606147766, "learning_rate": 9.913307959595444e-05, "loss": 10.2585, "step": 96},
    {"epoch": 0.0029720185980957633, "grad_norm": 0.6569609045982361, "learning_rate": 9.911397849272813e-05, "loss": 10.2623, "step": 97},
    {"epoch": 0.0030026579650864415, "grad_norm": 0.6302333474159241, "learning_rate": 9.909467112694384e-05, "loss": 10.2634, "step": 98},
    {"epoch": 0.003033297332077119, "grad_norm": 0.6016459465026855, "learning_rate": 9.907515757968613e-05, "loss": 10.2572, "step": 99},
    {"epoch": 0.0030639366990677973, "grad_norm": 0.6104916930198669, "learning_rate": 9.905543793290551e-05, "loss": 10.2613, "step": 100},
    {"epoch": 0.003094576066058475, "grad_norm": 0.4617011845111847, "learning_rate": 9.903551226941801e-05, "loss": 10.2507, "step": 101},
    {"epoch": 0.003125215433049153, "grad_norm": 0.45665284991264343, "learning_rate": 9.901538067290485e-05, "loss": 10.2518, "step": 102},
    {"epoch": 0.0031558548000398313, "grad_norm": 0.4125213623046875, "learning_rate": 9.89950432279121e-05, "loss": 10.243, "step": 103},
    {"epoch": 0.003186494167030509, "grad_norm": 0.4446852207183838, "learning_rate": 9.897450001985039e-05, "loss": 10.253, "step": 104},
    {"epoch": 0.003217133534021187, "grad_norm": 0.4295095205307007, "learning_rate": 9.895375113499439e-05, "loss": 10.2319, "step": 105},
    {"epoch": 0.0032477729010118653, "grad_norm": 0.4166061580181122, "learning_rate": 9.893279666048261e-05, "loss": 10.2316, "step": 106},
    {"epoch": 0.003278412268002543, "grad_norm": 0.4115273654460907, "learning_rate": 9.891163668431695e-05, "loss": 10.2319, "step": 107},
    {"epoch": 0.003309051634993221, "grad_norm": 0.4369446635246277, "learning_rate": 9.889027129536237e-05, "loss": 10.2299, "step": 108},
    {"epoch": 0.003339691001983899, "grad_norm": 0.420180082321167, "learning_rate": 9.886870058334644e-05, "loss": 10.228, "step": 109},
    {"epoch": 0.003370330368974577, "grad_norm": 0.389148473739624, "learning_rate": 9.88469246388591e-05, "loss": 10.2292, "step": 110},
    {"epoch": 0.003400969735965255, "grad_norm": 0.37859535217285156, "learning_rate": 9.882494355335211e-05, "loss": 10.2265, "step": 111},
    {"epoch": 0.003431609102955933, "grad_norm": 0.3675991892814636, "learning_rate": 9.880275741913884e-05, "loss": 10.2227, "step": 112},
    {"epoch": 0.003462248469946611, "grad_norm": 0.3687768876552582, "learning_rate": 9.878036632939374e-05, "loss": 10.2206, "step": 113},
    {"epoch": 0.003492887836937289, "grad_norm": 0.3665653467178345, "learning_rate": 9.875777037815202e-05, "loss": 10.2206, "step": 114},
    {"epoch": 0.003523527203927967, "grad_norm": 0.3494964838027954, "learning_rate": 9.873496966030923e-05, "loss": 10.2228, "step": 115},
    {"epoch": 0.003554166570918645, "grad_norm": 0.372764527797699, "learning_rate": 9.871196427162092e-05, "loss": 10.2236, "step": 116},
    {"epoch": 0.0035848059379093226, "grad_norm": 0.35803404450416565, "learning_rate": 9.868875430870216e-05, "loss": 10.2178, "step": 117},
    {"epoch": 0.0036154453049000008, "grad_norm": 0.3608494699001312, "learning_rate": 9.866533986902713e-05, "loss": 10.2158, "step": 118},
    {"epoch": 0.003646084671890679, "grad_norm": 0.39143475890159607, "learning_rate": 9.86417210509288e-05, "loss": 10.2166, "step": 119},
    {"epoch": 0.0036767240388813566, "grad_norm": 0.3595784604549408, "learning_rate": 9.861789795359842e-05, "loss": 10.2183, "step": 120},
    {"epoch": 0.0037073634058720347, "grad_norm": 0.32507047057151794, "learning_rate": 9.859387067708517e-05, "loss": 10.2171, "step": 121},
    {"epoch": 0.0037380027728627124, "grad_norm": 0.3041350543498993, "learning_rate": 9.85696393222957e-05, "loss": 10.2085, "step": 122},
    {"epoch": 0.0037686421398533906, "grad_norm": 0.3051709830760956, "learning_rate": 9.854520399099377e-05, "loss": 10.2146, "step": 123},
    {"epoch": 0.0037992815068440687, "grad_norm": 0.362964004278183, "learning_rate": 9.85205647857997e-05, "loss": 10.2013, "step": 124},
    {"epoch": 0.0038299208738347464, "grad_norm": 0.3657281994819641, "learning_rate": 9.849572181019007e-05, "loss": 10.1957, "step": 125},
    {"epoch": 0.0038605602408254246, "grad_norm": 0.3641434609889984, "learning_rate": 9.847067516849717e-05, "loss": 10.196, "step": 126},
    {"epoch": 0.0038911996078161027, "grad_norm": 0.3638749420642853, "learning_rate": 9.844542496590872e-05, "loss": 10.194, "step": 127},
    {"epoch": 0.003921838974806781, "grad_norm": 0.35783156752586365, "learning_rate": 9.84199713084672e-05, "loss": 10.193, "step": 128},
    {"epoch": 0.003952478341797458, "grad_norm": 0.329607218503952, "learning_rate": 9.839431430306965e-05, "loss": 10.2009, "step": 129},
    {"epoch": 0.003983117708788136, "grad_norm": 0.41538935899734497, "learning_rate": 9.836845405746704e-05, "loss": 10.1916, "step": 130},
    {"epoch": 0.004013757075778814, "grad_norm": 0.3260403573513031, "learning_rate": 9.834239068026387e-05, "loss": 10.1979, "step": 131},
    {"epoch": 0.0040443964427694925, "grad_norm": 0.3089582026004791, "learning_rate": 9.83161242809178e-05, "loss": 10.2174, "step": 132},
    {"epoch": 0.004075035809760171, "grad_norm": 0.3140828013420105, "learning_rate": 9.828965496973906e-05, "loss": 10.209, "step": 133},
    {"epoch": 0.004105675176750848, "grad_norm": 0.30631038546562195, "learning_rate": 9.826298285789002e-05, "loss": 10.2078, "step": 134},
    {"epoch": 0.004136314543741526, "grad_norm": 0.2800963222980499, "learning_rate": 9.823610805738479e-05, "loss": 10.2003, "step": 135},
    {"epoch": 0.004166953910732204, "grad_norm": 0.25573664903640747, "learning_rate": 9.820903068108871e-05, "loss": 10.1959, "step": 136},
    {"epoch": 0.004197593277722882, "grad_norm": 0.2738712728023529, "learning_rate": 9.818175084271786e-05, "loss": 10.1859, "step": 137},
    {"epoch": 0.0042282326447135605, "grad_norm": 0.3030584156513214, "learning_rate": 9.815426865683857e-05, "loss": 10.1822, "step": 138},
    {"epoch": 0.004258872011704238, "grad_norm": 0.3009280860424042, "learning_rate": 9.812658423886698e-05, "loss": 10.1876, "step": 139},
    {"epoch": 0.004289511378694916, "grad_norm": 0.34544259309768677, "learning_rate": 9.809869770506856e-05, "loss": 10.1817, "step": 140},
    {"epoch": 0.004320150745685594, "grad_norm": 0.3761567175388336, "learning_rate": 9.807060917255757e-05, "loss": 10.183, "step": 141},
    {"epoch": 0.004350790112676272, "grad_norm": 0.33650821447372437, "learning_rate": 9.804231875929661e-05, "loss": 10.1803, "step": 142},
    {"epoch": 0.00438142947966695, "grad_norm": 0.37510374188423157, "learning_rate": 9.80138265840961e-05, "loss": 10.1853, "step": 143},
    {"epoch": 0.0044120688466576284, "grad_norm": 0.2876983880996704, "learning_rate": 9.798513276661386e-05, "loss": 10.1988, "step": 144},
    {"epoch": 0.004442708213648306, "grad_norm": 0.26207780838012695, "learning_rate": 9.79562374273544e-05, "loss": 10.1959, "step": 145},
    {"epoch": 0.004473347580638984, "grad_norm": 0.27307620644569397, "learning_rate": 9.792714068766872e-05, "loss": 10.1919, "step": 146},
    {"epoch": 0.004503986947629662, "grad_norm": 0.479770302772522, "learning_rate": 9.789784266975352e-05, "loss": 10.1815, "step": 147},
    {"epoch": 0.00453462631462034, "grad_norm": 0.4118158519268036, "learning_rate": 9.786834349665083e-05, "loss": 10.1935, "step": 148},
    {"epoch": 0.004565265681611018, "grad_norm": 0.4143422245979309, "learning_rate": 9.783864329224752e-05, "loss": 10.203, "step": 149},
    {"epoch": 0.0045959050486016955, "grad_norm": 0.3514252305030823, "learning_rate": 9.780874218127464e-05, "loss": 10.1834, "step": 150},
    {"epoch": 0.004626544415592374, "grad_norm": 0.3279332220554352, "learning_rate": 9.777864028930705e-05, "loss": 10.186, "step": 151},
    {"epoch": 0.004657183782583052, "grad_norm": 0.2270706295967102, "learning_rate": 9.774833774276278e-05, "loss": 10.1987, "step": 152},
    {"epoch": 0.00468782314957373, "grad_norm": 0.3779345154762268, "learning_rate": 9.771783466890254e-05, "loss": 10.2011, "step": 153},
    {"epoch": 0.004718462516564408, "grad_norm": 0.32367634773254395, "learning_rate": 9.768713119582927e-05, "loss": 10.21, "step": 154},
    {"epoch": 0.004749101883555085, "grad_norm": 0.2722915709018707, "learning_rate": 9.765622745248739e-05, "loss": 10.1845, "step": 155},
    {"epoch": 0.0047797412505457635, "grad_norm": 0.25113949179649353, "learning_rate": 9.762512356866252e-05, "loss": 10.1843, "step": 156},
    {"epoch": 0.004810380617536442, "grad_norm": 0.2509434223175049, "learning_rate": 9.75938196749807e-05, "loss": 10.1794, "step": 157},
    {"epoch": 0.00484101998452712, "grad_norm": 0.2345142364501953, "learning_rate": 9.7562315902908e-05, "loss": 10.1776, "step": 158},
    {"epoch": 0.004871659351517798, "grad_norm": 0.23735208809375763, "learning_rate": 9.753061238474991e-05, "loss": 10.1806, "step": 159},
    {"epoch": 0.004902298718508475, "grad_norm": 0.2244131863117218, "learning_rate": 9.749870925365077e-05, "loss": 10.1812, "step": 160},
    {"epoch": 0.004932938085499153, "grad_norm": 0.24302279949188232, "learning_rate": 9.746660664359326e-05, "loss": 10.184, "step": 161},
    {"epoch": 0.0049635774524898315, "grad_norm": 0.2579200267791748, "learning_rate": 9.743430468939777e-05, "loss": 10.1815, "step": 162},
    {"epoch": 0.00499421681948051, "grad_norm": 0.2534882426261902, "learning_rate": 9.740180352672188e-05, "loss": 10.1845, "step": 163},
    {"epoch": 0.005024856186471188, "grad_norm": 0.24429906904697418, "learning_rate": 9.736910329205978e-05, "loss": 10.1765, "step": 164},
    {"epoch": 0.005055495553461866, "grad_norm": 0.24666520953178406, "learning_rate": 9.733620412274173e-05, "loss": 10.1804, "step": 165},
    {"epoch": 0.005086134920452543, "grad_norm": 0.2347945123910904, "learning_rate": 9.730310615693338e-05, "loss": 10.1822, "step": 166},
    {"epoch": 0.005116774287443221, "grad_norm": 0.22700455784797668, "learning_rate": 9.726980953363536e-05, "loss": 10.1818, "step": 167},
    {"epoch": 0.005147413654433899, "grad_norm": 0.23626674711704254, "learning_rate": 9.723631439268248e-05, "loss": 10.1759, "step": 168},
    {"epoch": 0.0051780530214245776, "grad_norm": 0.257414311170578, "learning_rate": 9.720262087474335e-05, "loss": 10.1768, "step": 169},
    {"epoch": 0.005208692388415256, "grad_norm": 0.3152066171169281, "learning_rate": 9.716872912131964e-05, "loss": 10.1714, "step": 170},
    {"epoch": 0.005239331755405933, "grad_norm": 0.28246307373046875, "learning_rate": 9.713463927474559e-05, "loss": 10.1716, "step": 171},
    {"epoch": 0.005269971122396611, "grad_norm": 0.22445334494113922, "learning_rate": 9.710035147818735e-05, "loss": 10.1757, "step": 172},
    {"epoch": 0.005300610489387289, "grad_norm": 0.30164289474487305, "learning_rate": 9.706586587564237e-05, "loss": 10.175, "step": 173},
    {"epoch": 0.005331249856377967, "grad_norm": 0.3072303235530853, "learning_rate": 9.703118261193885e-05, "loss": 10.1654, "step": 174},
    {"epoch": 0.0053618892233686455, "grad_norm": 0.32241764664649963, "learning_rate": 9.699630183273507e-05, "loss": 10.159, "step": 175},
    {"epoch": 0.005392528590359323, "grad_norm": 0.29490017890930176, "learning_rate": 9.696122368451886e-05, "loss": 10.1653, "step": 176},
    {"epoch": 0.005423167957350001, "grad_norm": 0.3198687434196472, "learning_rate": 9.69259483146069e-05, "loss": 10.1572, "step": 177},
    {"epoch": 0.005453807324340679, "grad_norm": 0.3245239555835724, "learning_rate": 9.689047587114416e-05, "loss": 10.1551, "step": 178},
    {"epoch": 0.005484446691331357, "grad_norm": 0.33736497163772583, "learning_rate": 9.685480650310318e-05, "loss": 10.1626, "step": 179},
    {"epoch": 0.005515086058322035, "grad_norm": 0.378255695104599, "learning_rate": 9.681894036028365e-05, "loss": 10.151, "step": 180},
    {"epoch": 0.005545725425312713, "grad_norm": 0.2807524502277374, "learning_rate": 9.678287759331152e-05, "loss": 10.1632, "step": 181},
    {"epoch": 0.005576364792303391, "grad_norm": 0.29772505164146423, "learning_rate": 9.674661835363858e-05, "loss": 10.193, "step": 182},
    {"epoch": 0.005607004159294069, "grad_norm": 0.2409011572599411, "learning_rate": 9.671016279354171e-05, "loss": 10.1763, "step": 183},
    {"epoch": 0.005637643526284747, "grad_norm": 0.26572665572166443, "learning_rate": 9.667351106612223e-05, "loss": 10.1766, "step": 184},
    {"epoch": 0.005668282893275425, "grad_norm": 0.31308165192604065, "learning_rate": 9.663666332530541e-05, "loss": 10.1814, "step": 185},
    {"epoch": 0.005698922260266103, "grad_norm": 0.21770384907722473, "learning_rate": 9.659961972583959e-05, "loss": 10.1631, "step": 186},
    {"epoch": 0.005729561627256781, "grad_norm": 0.20671813189983368, "learning_rate": 9.656238042329574e-05, "loss": 10.1553, "step": 187},
    {"epoch": 0.005760200994247459, "grad_norm": 0.23726466298103333, "learning_rate": 9.652494557406666e-05, "loss": 10.1508, "step": 188},
    {"epoch": 0.005790840361238137, "grad_norm": 0.23543891310691833, "learning_rate": 9.648731533536643e-05, "loss": 10.1515, "step": 189},
    {"epoch": 0.005821479728228815, "grad_norm": 0.24708674848079681, "learning_rate": 9.644948986522966e-05, "loss": 10.1499, "step": 190},
    {"epoch": 0.005852119095219493, "grad_norm": 0.24789170920848846, "learning_rate": 9.641146932251089e-05, "loss": 10.1515, "step": 191},
    {"epoch": 0.00588275846221017, "grad_norm": 0.3027416467666626, "learning_rate": 9.637325386688389e-05, "loss": 10.1453, "step": 192},
    {"epoch": 0.0059133978292008485, "grad_norm": 0.3192403018474579, "learning_rate": 9.6334843658841e-05, "loss": 10.1505, "step": 193},
    {"epoch": 0.005944037196191527, "grad_norm": 0.25966107845306396, "learning_rate": 9.62962388596925e-05, "loss": 10.1594, "step": 194},
    {"epoch": 0.005974676563182205, "grad_norm": 0.23529139161109924, "learning_rate": 9.625743963156582e-05, "loss": 10.162, "step": 195},
    {"epoch": 0.006005315930172883, "grad_norm": 0.22195586562156677, "learning_rate": 9.621844613740494e-05, "loss": 10.151, "step": 196},
    {"epoch": 0.00603595529716356, "grad_norm": 0.4857119619846344, "learning_rate": 9.617925854096975e-05, "loss": 10.146, "step": 197},
    {"epoch": 0.006066594664154238, "grad_norm": 0.34034472703933716, "learning_rate": 9.613987700683526e-05, "loss": 10.1656, "step": 198},
    {"epoch": 0.0060972340311449165, "grad_norm": 0.36044079065322876, "learning_rate": 9.610030170039094e-05, "loss": 10.1633, "step": 199},
    {"epoch": 0.006127873398135595, "grad_norm": 0.314235657453537, "learning_rate": 9.606053278784009e-05, "loss": 10.1535, "step": 200},
    {"epoch": 0.006158512765126273, "grad_norm": 0.23128853738307953, "learning_rate": 9.602057043619903e-05, "loss": 10.1636, "step": 201},
    {"epoch": 0.00618915213211695, "grad_norm": 0.2757967412471771, "learning_rate": 9.598041481329653e-05, "loss": 10.165, "step": 202},
    {"epoch": 0.006219791499107628, "grad_norm": 0.3894402086734772, "learning_rate": 9.5940066087773e-05, "loss": 10.1658, "step": 203},
    {"epoch": 0.006250430866098306, "grad_norm": 0.2699827253818512, "learning_rate": 9.589952442907979e-05, "loss": 10.1837, "step": 204},
    {"epoch": 0.0062810702330889845, "grad_norm": 0.2653277814388275, "learning_rate": 9.585879000747855e-05, "loss": 10.1644, "step": 205},
    {"epoch": 0.006311709600079663, "grad_norm": 0.2447018176317215, "learning_rate": 9.581786299404045e-05, "loss": 10.1589, "step": 206},
    {"epoch": 0.006342348967070341, "grad_norm": 0.2339245080947876, "learning_rate": 9.57767435606455e-05, "loss": 10.1542, "step": 207},
    {"epoch": 0.006372988334061018, "grad_norm": 0.2381335347890854, "learning_rate": 9.57354318799818e-05, "loss": 10.1471, "step": 208},
    {"epoch": 0.006403627701051696, "grad_norm": 0.2182401418685913, "learning_rate": 9.569392812554478e-05, "loss": 10.1462, "step": 209},
    {"epoch": 0.006434267068042374, "grad_norm": 0.21164676547050476, "learning_rate": 9.565223247163661e-05, "loss": 10.1469, "step": 210},
    {"epoch": 0.006464906435033052, "grad_norm": 0.24600906670093536, "learning_rate": 9.56103450933653e-05, "loss": 10.1559, "step": 211},
    {"epoch": 0.0064955458020237306, "grad_norm": 0.23725466430187225, "learning_rate": 9.556826616664407e-05, "loss": 10.1552, "step": 212},
    {"epoch": 0.006526185169014408, "grad_norm": 0.23444445431232452, "learning_rate": 9.552599586819058e-05, "loss": 10.1545, "step": 213},
    {"epoch": 0.006556824536005086, "grad_norm": 0.21064426004886627, "learning_rate": 9.548353437552618e-05, "loss": 10.1484, "step": 214},
    {"epoch": 0.006587463902995764, "grad_norm": 0.2170412838459015, "learning_rate": 9.544088186697515e-05, "loss": 10.1495, "step": 215},
    {"epoch": 0.006618103269986442, "grad_norm": 0.26441171765327454, "learning_rate": 9.539803852166403e-05, "loss": 10.1564, "step": 216},
    {"epoch": 0.00664874263697712, "grad_norm": 0.22346778213977814, "learning_rate": 9.535500451952077e-05, "loss": 10.1438, "step": 217},
    {"epoch": 0.006679382003967798, "grad_norm": 0.20265920460224152, "learning_rate": 9.531178004127403e-05, "loss": 10.152, "step": 218},
    {"epoch": 0.006710021370958476, "grad_norm": 0.20692609250545502, "learning_rate": 9.52683652684524e-05, "loss": 10.1467, "step": 219},
    {"epoch": 0.006740660737949154, "grad_norm": 0.19775721430778503, "learning_rate": 9.522476038338365e-05, "loss": 10.156, "step": 220},
    {"epoch": 0.006771300104939832, "grad_norm": 0.21164998412132263, "learning_rate": 9.518096556919396e-05, "loss": 10.1525, "step": 221},
    {"epoch": 0.00680193947193051, "grad_norm": 0.22803594172000885, "learning_rate": 9.513698100980715e-05, "loss": 10.1542, "step": 222},
    {"epoch": 0.0068325788389211875, "grad_norm": 0.18837936222553253, "learning_rate": 9.509280688994389e-05, "loss": 10.1421, "step": 223},
    {"epoch": 0.006863218205911866, "grad_norm": 0.24142253398895264, "learning_rate": 9.504844339512095e-05, "loss": 10.1327, "step": 224},
    {"epoch": 0.006893857572902544, "grad_norm": 0.24025557935237885, "learning_rate": 9.500389071165046e-05, "loss": 10.1365, "step": 225},
    {"epoch": 0.006924496939893222, "grad_norm": 0.25372639298439026, "learning_rate": 9.495914902663901e-05,
1590
+ "loss": 10.1389,
1591
+ "step": 226
1592
+ },
1593
+ {
1594
+ "epoch": 0.0069551363068839,
1595
+ "grad_norm": 0.26540708541870117,
1596
+ "learning_rate": 9.491421852798696e-05,
1597
+ "loss": 10.1319,
1598
+ "step": 227
1599
+ },
1600
+ {
1601
+ "epoch": 0.006985775673874578,
1602
+ "grad_norm": 0.2537267804145813,
1603
+ "learning_rate": 9.486909940438762e-05,
1604
+ "loss": 10.1329,
1605
+ "step": 228
1606
+ },
1607
+ {
1608
+ "epoch": 0.007016415040865255,
1609
+ "grad_norm": 0.31383460760116577,
1610
+ "learning_rate": 9.48237918453265e-05,
1611
+ "loss": 10.132,
1612
+ "step": 229
1613
+ },
1614
+ {
1615
+ "epoch": 0.007047054407855934,
1616
+ "grad_norm": 0.2152584046125412,
1617
+ "learning_rate": 9.477829604108044e-05,
1618
+ "loss": 10.1469,
1619
+ "step": 230
1620
+ },
1621
+ {
1622
+ "epoch": 0.007077693774846612,
1623
+ "grad_norm": 0.19982336461544037,
1624
+ "learning_rate": 9.473261218271685e-05,
1625
+ "loss": 10.1642,
1626
+ "step": 231
1627
+ },
1628
+ {
1629
+ "epoch": 0.00710833314183729,
1630
+ "grad_norm": 0.1662013679742813,
1631
+ "learning_rate": 9.46867404620929e-05,
1632
+ "loss": 10.148,
1633
+ "step": 232
1634
+ },
1635
+ {
1636
+ "epoch": 0.007138972508827968,
1637
+ "grad_norm": 0.20403601229190826,
1638
+ "learning_rate": 9.464068107185476e-05,
1639
+ "loss": 10.1515,
1640
+ "step": 233
1641
+ },
1642
+ {
1643
+ "epoch": 0.007169611875818645,
1644
+ "grad_norm": 0.24628746509552002,
1645
+ "learning_rate": 9.459443420543667e-05,
1646
+ "loss": 10.1535,
1647
+ "step": 234
1648
+ },
1649
+ {
1650
+ "epoch": 0.007200251242809323,
1651
+ "grad_norm": 0.16836899518966675,
1652
+ "learning_rate": 9.454800005706033e-05,
1653
+ "loss": 10.157,
1654
+ "step": 235
1655
+ },
1656
+ {
1657
+ "epoch": 0.0072308906098000015,
1658
+ "grad_norm": 0.17160779237747192,
1659
+ "learning_rate": 9.450137882173384e-05,
1660
+ "loss": 10.1332,
1661
+ "step": 236
1662
+ },
1663
+ {
1664
+ "epoch": 0.00726152997679068,
1665
+ "grad_norm": 0.19151553511619568,
1666
+ "learning_rate": 9.445457069525108e-05,
1667
+ "loss": 10.1318,
1668
+ "step": 237
1669
+ },
1670
+ {
1671
+ "epoch": 0.007292169343781358,
1672
+ "grad_norm": 0.21934491395950317,
1673
+ "learning_rate": 9.440757587419077e-05,
1674
+ "loss": 10.13,
1675
+ "step": 238
1676
+ },
1677
+ {
1678
+ "epoch": 0.007322808710772035,
1679
+ "grad_norm": 0.16131079196929932,
1680
+ "learning_rate": 9.436039455591573e-05,
1681
+ "loss": 10.1289,
1682
+ "step": 239
1683
+ },
1684
+ {
1685
+ "epoch": 0.007353448077762713,
1686
+ "grad_norm": 0.21789006888866425,
1687
+ "learning_rate": 9.431302693857195e-05,
1688
+ "loss": 10.1351,
1689
+ "step": 240
1690
+ },
1691
+ {
1692
+ "epoch": 0.007384087444753391,
1693
+ "grad_norm": 0.24290093779563904,
1694
+ "learning_rate": 9.426547322108789e-05,
1695
+ "loss": 10.1265,
1696
+ "step": 241
1697
+ },
1698
+ {
1699
+ "epoch": 0.0074147268117440695,
1700
+ "grad_norm": 0.2682904005050659,
1701
+ "learning_rate": 9.421773360317347e-05,
1702
+ "loss": 10.1266,
1703
+ "step": 242
1704
+ },
1705
+ {
1706
+ "epoch": 0.007445366178734748,
1707
+ "grad_norm": 0.34069767594337463,
1708
+ "learning_rate": 9.416980828531943e-05,
1709
+ "loss": 10.125,
1710
+ "step": 243
1711
+ },
1712
+ {
1713
+ "epoch": 0.007476005545725425,
1714
+ "grad_norm": 0.2901553809642792,
1715
+ "learning_rate": 9.412169746879632e-05,
1716
+ "loss": 10.1281,
1717
+ "step": 244
1718
+ },
1719
+ {
1720
+ "epoch": 0.007506644912716103,
1721
+ "grad_norm": 0.18873432278633118,
1722
+ "learning_rate": 9.407340135565374e-05,
1723
+ "loss": 10.1392,
1724
+ "step": 245
1725
+ },
1726
+ {
1727
+ "epoch": 0.007537284279706781,
1728
+ "grad_norm": 0.18116723001003265,
1729
+ "learning_rate": 9.402492014871951e-05,
1730
+ "loss": 10.156,
1731
+ "step": 246
1732
+ },
1733
+ {
1734
+ "epoch": 0.007567923646697459,
1735
+ "grad_norm": 0.21934951841831207,
1736
+ "learning_rate": 9.397625405159875e-05,
1737
+ "loss": 10.1424,
1738
+ "step": 247
1739
+ },
1740
+ {
1741
+ "epoch": 0.0075985630136881374,
1742
+ "grad_norm": 0.30539992451667786,
1743
+ "learning_rate": 9.392740326867304e-05,
1744
+ "loss": 10.1459,
1745
+ "step": 248
1746
+ },
1747
+ {
1748
+ "epoch": 0.007629202380678816,
1749
+ "grad_norm": 0.34416231513023376,
1750
+ "learning_rate": 9.387836800509962e-05,
1751
+ "loss": 10.1471,
1752
+ "step": 249
1753
+ },
1754
+ {
1755
+ "epoch": 0.007659841747669493,
1756
+ "grad_norm": 0.30565956234931946,
1757
+ "learning_rate": 9.382914846681047e-05,
1758
+ "loss": 10.1363,
1759
+ "step": 250
1760
+ },
1761
+ {
1762
+ "epoch": 0.007690481114660171,
1763
+ "grad_norm": 0.1922389566898346,
1764
+ "learning_rate": 9.377974486051147e-05,
1765
+ "loss": 10.1499,
1766
+ "step": 251
1767
+ },
1768
+ {
1769
+ "epoch": 0.007721120481650849,
1770
+ "grad_norm": 0.24070623517036438,
1771
+ "learning_rate": 9.373015739368152e-05,
1772
+ "loss": 10.162,
1773
+ "step": 252
1774
+ },
1775
+ {
1776
+ "epoch": 0.007751759848641527,
1777
+ "grad_norm": 0.29451125860214233,
1778
+ "learning_rate": 9.368038627457165e-05,
1779
+ "loss": 10.1614,
1780
+ "step": 253
1781
+ },
1782
+ {
1783
+ "epoch": 0.007782399215632205,
1784
+ "grad_norm": 0.27838650345802307,
1785
+ "learning_rate": 9.363043171220423e-05,
1786
+ "loss": 10.1661,
1787
+ "step": 254
1788
+ },
1789
+ {
1790
+ "epoch": 0.007813038582622884,
1791
+ "grad_norm": 0.20925530791282654,
1792
+ "learning_rate": 9.358029391637195e-05,
1793
+ "loss": 10.1536,
1794
+ "step": 255
1795
+ },
1796
+ {
1797
+ "epoch": 0.007843677949613562,
1798
+ "grad_norm": 0.19836044311523438,
1799
+ "learning_rate": 9.35299730976371e-05,
1800
+ "loss": 10.1423,
1801
+ "step": 256
1802
+ },
1803
+ {
1804
+ "epoch": 0.00787431731660424,
1805
+ "grad_norm": 0.19826310873031616,
1806
+ "learning_rate": 9.347946946733055e-05,
1807
+ "loss": 10.1335,
1808
+ "step": 257
1809
+ },
1810
+ {
1811
+ "epoch": 0.007904956683594916,
1812
+ "grad_norm": 0.1715853065252304,
1813
+ "learning_rate": 9.342878323755095e-05,
1814
+ "loss": 10.1356,
1815
+ "step": 258
1816
+ },
1817
+ {
1818
+ "epoch": 0.007935596050585594,
1819
+ "grad_norm": 0.18618403375148773,
1820
+ "learning_rate": 9.337791462116381e-05,
1821
+ "loss": 10.1337,
1822
+ "step": 259
1823
+ },
1824
+ {
1825
+ "epoch": 0.007966235417576273,
1826
+ "grad_norm": 0.16666516661643982,
1827
+ "learning_rate": 9.332686383180055e-05,
1828
+ "loss": 10.1348,
1829
+ "step": 260
1830
+ },
1831
+ {
1832
+ "epoch": 0.00799687478456695,
1833
+ "grad_norm": 0.20730982720851898,
1834
+ "learning_rate": 9.327563108385773e-05,
1835
+ "loss": 10.1422,
1836
+ "step": 261
1837
+ },
1838
+ {
1839
+ "epoch": 0.008027514151557629,
1840
+ "grad_norm": 0.18355782330036163,
1841
+ "learning_rate": 9.322421659249603e-05,
1842
+ "loss": 10.1407,
1843
+ "step": 262
1844
+ },
1845
+ {
1846
+ "epoch": 0.008058153518548307,
1847
+ "grad_norm": 0.17472055554389954,
1848
+ "learning_rate": 9.31726205736394e-05,
1849
+ "loss": 10.1408,
1850
+ "step": 263
1851
+ },
1852
+ {
1853
+ "epoch": 0.008088792885538985,
1854
+ "grad_norm": 0.16438454389572144,
1855
+ "learning_rate": 9.312084324397416e-05,
1856
+ "loss": 10.1428,
1857
+ "step": 264
1858
+ },
1859
+ {
1860
+ "epoch": 0.008119432252529663,
1861
+ "grad_norm": 0.14644020795822144,
1862
+ "learning_rate": 9.306888482094806e-05,
1863
+ "loss": 10.1376,
1864
+ "step": 265
1865
+ },
1866
+ {
1867
+ "epoch": 0.008150071619520341,
1868
+ "grad_norm": 0.16809964179992676,
1869
+ "learning_rate": 9.301674552276942e-05,
1870
+ "loss": 10.1384,
1871
+ "step": 266
1872
+ },
1873
+ {
1874
+ "epoch": 0.00818071098651102,
1875
+ "grad_norm": 0.15989115834236145,
1876
+ "learning_rate": 9.29644255684061e-05,
1877
+ "loss": 10.1446,
1878
+ "step": 267
1879
+ },
1880
+ {
1881
+ "epoch": 0.008211350353501696,
1882
+ "grad_norm": 0.1357845813035965,
1883
+ "learning_rate": 9.291192517758474e-05,
1884
+ "loss": 10.1377,
1885
+ "step": 268
1886
+ },
1887
+ {
1888
+ "epoch": 0.008241989720492374,
1889
+ "grad_norm": 0.16372017562389374,
1890
+ "learning_rate": 9.28592445707897e-05,
1891
+ "loss": 10.1385,
1892
+ "step": 269
1893
+ },
1894
+ {
1895
+ "epoch": 0.008272629087483052,
1896
+ "grad_norm": 0.16150210797786713,
1897
+ "learning_rate": 9.280638396926218e-05,
1898
+ "loss": 10.148,
1899
+ "step": 270
1900
+ },
1901
+ {
1902
+ "epoch": 0.00830326845447373,
1903
+ "grad_norm": 0.17247039079666138,
1904
+ "learning_rate": 9.275334359499936e-05,
1905
+ "loss": 10.1395,
1906
+ "step": 271
1907
+ },
1908
+ {
1909
+ "epoch": 0.008333907821464408,
1910
+ "grad_norm": 0.19114305078983307,
1911
+ "learning_rate": 9.270012367075336e-05,
1912
+ "loss": 10.144,
1913
+ "step": 272
1914
+ },
1915
+ {
1916
+ "epoch": 0.008364547188455087,
1917
+ "grad_norm": 0.18105651438236237,
1918
+ "learning_rate": 9.264672442003034e-05,
1919
+ "loss": 10.1447,
1920
+ "step": 273
1921
+ },
1922
+ {
1923
+ "epoch": 0.008395186555445765,
1924
+ "grad_norm": 0.1573803424835205,
1925
+ "learning_rate": 9.259314606708964e-05,
1926
+ "loss": 10.135,
1927
+ "step": 274
1928
+ },
1929
+ {
1930
+ "epoch": 0.008425825922436443,
1931
+ "grad_norm": 0.18148738145828247,
1932
+ "learning_rate": 9.253938883694267e-05,
1933
+ "loss": 10.1308,
1934
+ "step": 275
1935
+ },
1936
+ {
1937
+ "epoch": 0.008456465289427121,
1938
+ "grad_norm": 0.20688171684741974,
1939
+ "learning_rate": 9.248545295535213e-05,
1940
+ "loss": 10.1298,
1941
+ "step": 276
1942
+ },
1943
+ {
1944
+ "epoch": 0.008487104656417799,
1945
+ "grad_norm": 0.24721446633338928,
1946
+ "learning_rate": 9.2431338648831e-05,
1947
+ "loss": 10.1292,
1948
+ "step": 277
1949
+ },
1950
+ {
1951
+ "epoch": 0.008517744023408476,
1952
+ "grad_norm": 0.24435733258724213,
1953
+ "learning_rate": 9.237704614464156e-05,
1954
+ "loss": 10.1251,
1955
+ "step": 278
1956
+ },
1957
+ {
1958
+ "epoch": 0.008548383390399154,
1959
+ "grad_norm": 0.2644619047641754,
1960
+ "learning_rate": 9.232257567079448e-05,
1961
+ "loss": 10.1229,
1962
+ "step": 279
1963
+ },
1964
+ {
1965
+ "epoch": 0.008579022757389832,
1966
+ "grad_norm": 0.2871146500110626,
1967
+ "learning_rate": 9.226792745604784e-05,
1968
+ "loss": 10.124,
1969
+ "step": 280
1970
+ },
1971
+ {
1972
+ "epoch": 0.00860966212438051,
1973
+ "grad_norm": 0.14572951197624207,
1974
+ "learning_rate": 9.221310172990616e-05,
1975
+ "loss": 10.147,
1976
+ "step": 281
1977
+ },
1978
+ {
1979
+ "epoch": 0.008640301491371188,
1980
+ "grad_norm": 0.24766798317432404,
1981
+ "learning_rate": 9.215809872261947e-05,
1982
+ "loss": 10.157,
1983
+ "step": 282
1984
+ },
1985
+ {
1986
+ "epoch": 0.008670940858361866,
1987
+ "grad_norm": 0.190408855676651,
1988
+ "learning_rate": 9.210291866518229e-05,
1989
+ "loss": 10.1525,
1990
+ "step": 283
1991
+ },
1992
+ {
1993
+ "epoch": 0.008701580225352544,
1994
+ "grad_norm": 0.16426639258861542,
1995
+ "learning_rate": 9.204756178933274e-05,
1996
+ "loss": 10.147,
1997
+ "step": 284
1998
+ },
1999
+ {
2000
+ "epoch": 0.008732219592343222,
2001
+ "grad_norm": 0.23785221576690674,
2002
+ "learning_rate": 9.19920283275515e-05,
2003
+ "loss": 10.1518,
2004
+ "step": 285
2005
+ },
2006
+ {
2007
+ "epoch": 0.0087628589593339,
2008
+ "grad_norm": 0.17590944468975067,
2009
+ "learning_rate": 9.19363185130608e-05,
2010
+ "loss": 10.1443,
2011
+ "step": 286
2012
+ },
2013
+ {
2014
+ "epoch": 0.008793498326324579,
2015
+ "grad_norm": 0.10858047753572464,
2016
+ "learning_rate": 9.18804325798236e-05,
2017
+ "loss": 10.1312,
2018
+ "step": 287
2019
+ },
2020
+ {
2021
+ "epoch": 0.008824137693315257,
2022
+ "grad_norm": 0.15725189447402954,
2023
+ "learning_rate": 9.18243707625424e-05,
2024
+ "loss": 10.1221,
2025
+ "step": 288
2026
+ },
2027
+ {
2028
+ "epoch": 0.008854777060305933,
2029
+ "grad_norm": 0.13659140467643738,
2030
+ "learning_rate": 9.176813329665841e-05,
2031
+ "loss": 10.1299,
2032
+ "step": 289
2033
+ },
2034
+ {
2035
+ "epoch": 0.008885416427296611,
2036
+ "grad_norm": 0.2288004457950592,
2037
+ "learning_rate": 9.17117204183505e-05,
2038
+ "loss": 10.1196,
2039
+ "step": 290
2040
+ },
2041
+ {
2042
+ "epoch": 0.00891605579428729,
2043
+ "grad_norm": 0.21320736408233643,
2044
+ "learning_rate": 9.165513236453419e-05,
2045
+ "loss": 10.1182,
2046
+ "step": 291
2047
+ },
2048
+ {
2049
+ "epoch": 0.008946695161277968,
2050
+ "grad_norm": 0.21424899995326996,
2051
+ "learning_rate": 9.15983693728607e-05,
2052
+ "loss": 10.1193,
2053
+ "step": 292
2054
+ },
2055
+ {
2056
+ "epoch": 0.008977334528268646,
2057
+ "grad_norm": 0.2179030030965805,
2058
+ "learning_rate": 9.154143168171592e-05,
2059
+ "loss": 10.1192,
2060
+ "step": 293
2061
+ },
2062
+ {
2063
+ "epoch": 0.009007973895259324,
2064
+ "grad_norm": 0.3053359389305115,
2065
+ "learning_rate": 9.148431953021947e-05,
2066
+ "loss": 10.1168,
2067
+ "step": 294
2068
+ },
2069
+ {
2070
+ "epoch": 0.009038613262250002,
2071
+ "grad_norm": 0.25327202677726746,
2072
+ "learning_rate": 9.142703315822357e-05,
2073
+ "loss": 10.1252,
2074
+ "step": 295
2075
+ },
2076
+ {
2077
+ "epoch": 0.00906925262924068,
2078
+ "grad_norm": 0.13767379522323608,
2079
+ "learning_rate": 9.136957280631212e-05,
2080
+ "loss": 10.138,
2081
+ "step": 296
2082
+ },
2083
+ {
2084
+ "epoch": 0.009099891996231358,
2085
+ "grad_norm": 0.25260189175605774,
2086
+ "learning_rate": 9.131193871579975e-05,
2087
+ "loss": 10.1382,
2088
+ "step": 297
2089
+ },
2090
+ {
2091
+ "epoch": 0.009130531363222037,
2092
+ "grad_norm": 0.25824061036109924,
2093
+ "learning_rate": 9.125413112873067e-05,
2094
+ "loss": 10.1408,
2095
+ "step": 298
2096
+ },
2097
+ {
2098
+ "epoch": 0.009161170730212713,
2099
+ "grad_norm": 0.31405767798423767,
2100
+ "learning_rate": 9.119615028787771e-05,
2101
+ "loss": 10.1411,
2102
+ "step": 299
2103
+ },
2104
+ {
2105
+ "epoch": 0.009191810097203391,
2106
+ "grad_norm": 0.28313931822776794,
2107
+ "learning_rate": 9.113799643674136e-05,
2108
+ "loss": 10.1334,
2109
+ "step": 300
2110
+ },
2111
+ {
2112
+ "epoch": 0.00922244946419407,
2113
+ "grad_norm": 0.18197022378444672,
2114
+ "learning_rate": 9.107966981954869e-05,
2115
+ "loss": 10.136,
2116
+ "step": 301
2117
+ },
2118
+ {
2119
+ "epoch": 0.009253088831184747,
2120
+ "grad_norm": 0.198178231716156,
2121
+ "learning_rate": 9.102117068125226e-05,
2122
+ "loss": 10.1561,
2123
+ "step": 302
2124
+ },
2125
+ {
2126
+ "epoch": 0.009283728198175425,
2127
+ "grad_norm": 0.32714104652404785,
2128
+ "learning_rate": 9.096249926752927e-05,
2129
+ "loss": 10.1519,
2130
+ "step": 303
2131
+ },
2132
+ {
2133
+ "epoch": 0.009314367565166104,
2134
+ "grad_norm": 0.19037552177906036,
2135
+ "learning_rate": 9.09036558247803e-05,
2136
+ "loss": 10.1544,
2137
+ "step": 304
2138
+ },
2139
+ {
2140
+ "epoch": 0.009345006932156782,
2141
+ "grad_norm": 0.16049005091190338,
2142
+ "learning_rate": 9.08446406001285e-05,
2143
+ "loss": 10.145,
2144
+ "step": 305
2145
+ },
2146
+ {
2147
+ "epoch": 0.00937564629914746,
2148
+ "grad_norm": 0.17063891887664795,
2149
+ "learning_rate": 9.078545384141839e-05,
2150
+ "loss": 10.1333,
2151
+ "step": 306
2152
+ },
2153
+ {
2154
+ "epoch": 0.009406285666138138,
2155
+ "grad_norm": 0.1404566466808319,
2156
+ "learning_rate": 9.072609579721491e-05,
2157
+ "loss": 10.1271,
2158
+ "step": 307
2159
+ },
2160
+ {
2161
+ "epoch": 0.009436925033128816,
2162
+ "grad_norm": 0.127190962433815,
2163
+ "learning_rate": 9.06665667168023e-05,
2164
+ "loss": 10.1289,
2165
+ "step": 308
2166
+ },
2167
+ {
2168
+ "epoch": 0.009467564400119494,
2169
+ "grad_norm": 0.13612677156925201,
2170
+ "learning_rate": 9.060686685018315e-05,
2171
+ "loss": 10.132,
2172
+ "step": 309
2173
+ },
2174
+ {
2175
+ "epoch": 0.00949820376711017,
2176
+ "grad_norm": 0.16601459681987762,
2177
+ "learning_rate": 9.054699644807723e-05,
2178
+ "loss": 10.126,
2179
+ "step": 310
2180
+ },
2181
+ {
2182
+ "epoch": 0.009528843134100849,
2183
+ "grad_norm": 0.12142466008663177,
2184
+ "learning_rate": 9.048695576192058e-05,
2185
+ "loss": 10.1321,
2186
+ "step": 311
2187
+ },
2188
+ {
2189
+ "epoch": 0.009559482501091527,
2190
+ "grad_norm": 0.19359469413757324,
2191
+ "learning_rate": 9.042674504386431e-05,
2192
+ "loss": 10.1388,
2193
+ "step": 312
2194
+ },
2195
+ {
2196
+ "epoch": 0.009590121868082205,
2197
+ "grad_norm": 0.1542837768793106,
2198
+ "learning_rate": 9.036636454677364e-05,
2199
+ "loss": 10.1341,
2200
+ "step": 313
2201
+ },
2202
+ {
2203
+ "epoch": 0.009620761235072883,
2204
+ "grad_norm": 0.16852296888828278,
2205
+ "learning_rate": 9.030581452422679e-05,
2206
+ "loss": 10.1315,
2207
+ "step": 314
2208
+ },
2209
+ {
2210
+ "epoch": 0.009651400602063561,
2211
+ "grad_norm": 0.15417806804180145,
2212
+ "learning_rate": 9.024509523051399e-05,
2213
+ "loss": 10.134,
2214
+ "step": 315
2215
+ },
2216
+ {
2217
+ "epoch": 0.00968203996905424,
2218
+ "grad_norm": 0.15264514088630676,
2219
+ "learning_rate": 9.018420692063624e-05,
2220
+ "loss": 10.135,
2221
+ "step": 316
2222
+ },
2223
+ {
2224
+ "epoch": 0.009712679336044918,
2225
+ "grad_norm": 0.1251794546842575,
2226
+ "learning_rate": 9.012314985030445e-05,
2227
+ "loss": 10.1384,
2228
+ "step": 317
2229
+ },
2230
+ {
2231
+ "epoch": 0.009743318703035596,
2232
+ "grad_norm": 0.13219048082828522,
2233
+ "learning_rate": 9.006192427593825e-05,
2234
+ "loss": 10.1378,
2235
+ "step": 318
2236
+ },
2237
+ {
2238
+ "epoch": 0.009773958070026274,
2239
+ "grad_norm": 0.15922632813453674,
2240
+ "learning_rate": 9.00005304546649e-05,
2241
+ "loss": 10.1374,
2242
+ "step": 319
2243
+ },
2244
+ {
2245
+ "epoch": 0.00980459743701695,
2246
+ "grad_norm": 0.14578090608119965,
2247
+ "learning_rate": 8.993896864431826e-05,
2248
+ "loss": 10.14,
2249
+ "step": 320
2250
+ },
2251
+ {
2252
+ "epoch": 0.009835236804007629,
2253
+ "grad_norm": 0.11984135210514069,
2254
+ "learning_rate": 8.987723910343769e-05,
2255
+ "loss": 10.1399,
2256
+ "step": 321
2257
+ },
2258
+ {
2259
+ "epoch": 0.009865876170998307,
2260
+ "grad_norm": 0.14472252130508423,
2261
+ "learning_rate": 8.981534209126694e-05,
2262
+ "loss": 10.1372,
2263
+ "step": 322
2264
+ },
2265
+ {
2266
+ "epoch": 0.009896515537988985,
2267
+ "grad_norm": 0.12021119147539139,
2268
+ "learning_rate": 8.975327786775315e-05,
2269
+ "loss": 10.1348,
2270
+ "step": 323
2271
+ },
2272
+ {
2273
+ "epoch": 0.009927154904979663,
2274
+ "grad_norm": 0.12422214448451996,
2275
+ "learning_rate": 8.969104669354562e-05,
2276
+ "loss": 10.1358,
2277
+ "step": 324
2278
+ },
2279
+ {
2280
+ "epoch": 0.009957794271970341,
2281
+ "grad_norm": 0.19478030502796173,
2282
+ "learning_rate": 8.962864882999482e-05,
2283
+ "loss": 10.1285,
2284
+ "step": 325
2285
+ },
2286
+ {
2287
+ "epoch": 0.00998843363896102,
2288
+ "grad_norm": 0.22339242696762085,
2289
+ "learning_rate": 8.956608453915125e-05,
2290
+ "loss": 10.126,
2291
+ "step": 326
2292
+ },
2293
+ {
2294
+ "epoch": 0.010019073005951697,
2295
+ "grad_norm": 0.23529845476150513,
2296
+ "learning_rate": 8.950335408376437e-05,
2297
+ "loss": 10.1268,
2298
+ "step": 327
2299
+ },
2300
+ {
2301
+ "epoch": 0.010049712372942375,
2302
+ "grad_norm": 0.21394489705562592,
2303
+ "learning_rate": 8.944045772728146e-05,
2304
+ "loss": 10.1266,
2305
+ "step": 328
2306
+ },
2307
+ {
2308
+ "epoch": 0.010080351739933054,
2309
+ "grad_norm": 0.27770131826400757,
2310
+ "learning_rate": 8.937739573384653e-05,
2311
+ "loss": 10.1165,
2312
+ "step": 329
2313
+ },
2314
+ {
2315
+ "epoch": 0.010110991106923732,
2316
+ "grad_norm": 0.17290028929710388,
2317
+ "learning_rate": 8.93141683682992e-05,
2318
+ "loss": 10.1316,
2319
+ "step": 330
2320
+ },
2321
+ {
2322
+ "epoch": 0.010141630473914408,
2323
+ "grad_norm": 0.22467806935310364,
2324
+ "learning_rate": 8.925077589617365e-05,
2325
+ "loss": 10.1453,
2326
+ "step": 331
2327
+ },
2328
+ {
2329
+ "epoch": 0.010172269840905086,
2330
+ "grad_norm": 0.21685166656970978,
2331
+ "learning_rate": 8.918721858369738e-05,
2332
+ "loss": 10.1677,
2333
+ "step": 332
2334
+ },
2335
+ {
2336
+ "epoch": 0.010202909207895764,
2337
+ "grad_norm": 0.2303144633769989,
2338
+ "learning_rate": 8.91234966977902e-05,
2339
+ "loss": 10.1491,
2340
+ "step": 333
2341
+ },
2342
+ {
2343
+ "epoch": 0.010233548574886443,
2344
+ "grad_norm": 0.17318762838840485,
2345
+ "learning_rate": 8.90596105060631e-05,
2346
+ "loss": 10.1422,
2347
+ "step": 334
2348
+ },
2349
+ {
2350
+ "epoch": 0.01026418794187712,
2351
+ "grad_norm": 0.16077402234077454,
2352
+ "learning_rate": 8.899556027681709e-05,
2353
+ "loss": 10.1349,
2354
+ "step": 335
2355
+ },
2356
+ {
2357
+ "epoch": 0.010294827308867799,
2358
+ "grad_norm": 0.10533782094717026,
2359
+ "learning_rate": 8.8931346279042e-05,
2360
+ "loss": 10.1316,
2361
+ "step": 336
2362
+ },
2363
+ {
2364
+ "epoch": 0.010325466675858477,
2365
+ "grad_norm": 0.13302670419216156,
2366
+ "learning_rate": 8.886696878241556e-05,
2367
+ "loss": 10.1211,
2368
+ "step": 337
2369
+ },
2370
+ {
2371
+ "epoch": 0.010356106042849155,
2372
+ "grad_norm": 0.12751758098602295,
2373
+ "learning_rate": 8.880242805730208e-05,
2374
+ "loss": 10.1179,
2375
+ "step": 338
2376
+ },
2377
+ {
2378
+ "epoch": 0.010386745409839833,
2379
+ "grad_norm": 0.20965084433555603,
2380
+ "learning_rate": 8.873772437475136e-05,
2381
+ "loss": 10.121,
2382
+ "step": 339
2383
+ },
2384
+ {
2385
+ "epoch": 0.010417384776830511,
2386
+ "grad_norm": 0.16965822875499725,
2387
+ "learning_rate": 8.86728580064976e-05,
2388
+ "loss": 10.1193,
2389
+ "step": 340
2390
+ },
2391
+ {
2392
+ "epoch": 0.010448024143821188,
2393
+ "grad_norm": 0.17672576010227203,
2394
+ "learning_rate": 8.860782922495822e-05,
2395
+ "loss": 10.1162,
2396
+ "step": 341
2397
+ },
2398
+ {
2399
+ "epoch": 0.010478663510811866,
2400
+ "grad_norm": 0.20148865878582,
2401
+ "learning_rate": 8.85426383032327e-05,
2402
+ "loss": 10.1224,
2403
+ "step": 342
2404
+ },
2405
+ {
2406
+ "epoch": 0.010509302877802544,
2407
+ "grad_norm": 0.2884007692337036,
2408
+ "learning_rate": 8.84772855151015e-05,
2409
+ "loss": 10.1101,
2410
+ "step": 343
2411
+ },
2412
+ {
2413
+ "epoch": 0.010539942244793222,
2414
+ "grad_norm": 0.1849764734506607,
2415
+ "learning_rate": 8.841177113502482e-05,
2416
+ "loss": 10.121,
2417
+ "step": 344
2418
+ },
2419
+ {
2420
+ "epoch": 0.0105705816117839,
2421
+ "grad_norm": 0.10992087423801422,
2422
+ "learning_rate": 8.834609543814149e-05,
2423
+ "loss": 10.1371,
2424
+ "step": 345
2425
+ },
2426
+ {
2427
+ "epoch": 0.010601220978774578,
2428
+ "grad_norm": 0.18129339814186096,
2429
+ "learning_rate": 8.828025870026789e-05,
2430
+ "loss": 10.1279,
2431
+ "step": 346
2432
+ },
2433
+ {
2434
+ "epoch": 0.010631860345765257,
2435
+ "grad_norm": 0.23670345544815063,
2436
+ "learning_rate": 8.821426119789662e-05,
2437
+ "loss": 10.1342,
2438
+ "step": 347
2439
+ },
2440
+ {
2441
+ "epoch": 0.010662499712755935,
2442
+ "grad_norm": 0.27441591024398804,
2443
+ "learning_rate": 8.814810320819551e-05,
2444
+ "loss": 10.1379,
2445
+ "step": 348
2446
+ },
2447
+ {
2448
+ "epoch": 0.010693139079746613,
2449
+ "grad_norm": 0.2867147922515869,
2450
+ "learning_rate": 8.808178500900635e-05,
2451
+ "loss": 10.1313,
2452
+ "step": 349
2453
+ },
2454
+ {
2455
+ "epoch": 0.010723778446737291,
2456
+ "grad_norm": 0.25631478428840637,
2457
+ "learning_rate": 8.801530687884378e-05,
2458
+ "loss": 10.1241,
2459
+ "step": 350
2460
+ },
2461
+ {
2462
+ "epoch": 0.01075441781372797,
2463
+ "grad_norm": 0.13319076597690582,
2464
+ "learning_rate": 8.794866909689408e-05,
2465
+ "loss": 10.1408,
2466
+ "step": 351
2467
+ },
2468
+ {
2469
+ "epoch": 0.010785057180718646,
2470
+ "grad_norm": 0.21450123190879822,
2471
+ "learning_rate": 8.7881871943014e-05,
2472
+ "loss": 10.1527,
2473
+ "step": 352
2474
+ },
2475
+ {
2476
+ "epoch": 0.010815696547709324,
2477
+ "grad_norm": 0.3608259856700897,
2478
+ "learning_rate": 8.781491569772966e-05,
2479
+ "loss": 10.1454,
2480
+ "step": 353
2481
+ },
2482
+ {
2483
+ "epoch": 0.010846335914700002,
2484
+ "grad_norm": 0.23542171716690063,
2485
+ "learning_rate": 8.774780064223523e-05,
2486
+ "loss": 10.1558,
2487
+ "step": 354
2488
+ },
2489
+ {
2490
+ "epoch": 0.01087697528169068,
2491
+ "grad_norm": 0.11694921553134918,
2492
+ "learning_rate": 8.76805270583919e-05,
2493
+ "loss": 10.1551,
2494
+ "step": 355
2495
+ },
2496
+ {
2497
+ "epoch": 0.010907614648681358,
2498
+ "grad_norm": 0.13456478714942932,
2499
+ "learning_rate": 8.761309522872657e-05,
2500
+ "loss": 10.1316,
2501
+ "step": 356
2502
+ },
2503
+ {
2504
+ "epoch": 0.010938254015672036,
2505
+ "grad_norm": 0.14492842555046082,
2506
+ "learning_rate": 8.754550543643079e-05,
2507
+ "loss": 10.1285,
2508
+ "step": 357
2509
+ },
2510
+ {
2511
+ "epoch": 0.010968893382662714,
2512
+ "grad_norm": 0.14882643520832062,
2513
+ "learning_rate": 8.747775796535941e-05,
2514
+ "loss": 10.1277,
2515
+ "step": 358
2516
+ },
2517
+ {
2518
+ "epoch": 0.010999532749653393,
2519
+ "grad_norm": 0.10642393678426743,
2520
+ "learning_rate": 8.740985310002956e-05,
2521
+ "loss": 10.1303,
2522
+ "step": 359
2523
+ },
2524
+ {
2525
+ "epoch": 0.01103017211664407,
2526
+ "grad_norm": 0.12168186902999878,
2527
+ "learning_rate": 8.734179112561935e-05,
2528
+ "loss": 10.1224,
2529
+ "step": 360
2530
+ },
2531
+ {
2532
+ "epoch": 0.011060811483634749,
2533
+ "grad_norm": 0.1298719346523285,
2534
+ "learning_rate": 8.727357232796667e-05,
2535
+ "loss": 10.1207,
2536
+ "step": 361
2537
+ },
2538
+ {
2539
+ "epoch": 0.011091450850625425,
2540
+ "grad_norm": 0.14631900191307068,
2541
+ "learning_rate": 8.720519699356804e-05,
2542
+ "loss": 10.137,
2543
+ "step": 362
2544
+ },
2545
+ {
2546
+ "epoch": 0.011122090217616103,
2547
+ "grad_norm": 0.15650637447834015,
2548
+ "learning_rate": 8.713666540957739e-05,
2549
+ "loss": 10.1305,
2550
+ "step": 363
2551
+ },
2552
+ {
2553
+ "epoch": 0.011152729584606782,
2554
+ "grad_norm": 0.15470795333385468,
2555
+ "learning_rate": 8.706797786380481e-05,
2556
+ "loss": 10.1333,
2557
+ "step": 364
2558
+ },
2559
+ {
2560
+ "epoch": 0.01118336895159746,
2561
+ "grad_norm": 0.12874755263328552,
2562
+ "learning_rate": 8.699913464471543e-05,
2563
+ "loss": 10.1355,
2564
+ "step": 365
2565
+ },
2566
+ {
2567
+ "epoch": 0.011214008318588138,
2568
+ "grad_norm": 0.19236956536769867,
2569
+ "learning_rate": 8.693013604142812e-05,
2570
+ "loss": 10.1359,
2571
+ "step": 366
2572
+ },
2573
+ {
2574
+ "epoch": 0.011244647685578816,
2575
+ "grad_norm": 0.10384097695350647,
2576
+ "learning_rate": 8.686098234371434e-05,
2577
+ "loss": 10.1301,
2578
+ "step": 367
2579
+ },
2580
+ {
2581
+ "epoch": 0.011275287052569494,
2582
+ "grad_norm": 0.10370136052370071,
2583
+ "learning_rate": 8.679167384199684e-05,
2584
+ "loss": 10.1379,
2585
+ "step": 368
2586
+ },
2587
+ {
2588
+ "epoch": 0.011305926419560172,
2589
+ "grad_norm": 0.13283184170722961,
2590
+ "learning_rate": 8.672221082734857e-05,
2591
+ "loss": 10.1367,
2592
+ "step": 369
2593
+ },
2594
+ {
2595
+ "epoch": 0.01133656578655085,
2596
+ "grad_norm": 0.10607551783323288,
2597
+ "learning_rate": 8.665259359149132e-05,
2598
+ "loss": 10.1376,
2599
+ "step": 370
2600
+ },
2601
+ {
2602
+ "epoch": 0.011367205153541528,
2603
+ "grad_norm": 0.09600384533405304,
2604
+ "learning_rate": 8.658282242679461e-05,
2605
+ "loss": 10.135,
2606
+ "step": 371
2607
+ },
2608
+ {
2609
+ "epoch": 0.011397844520532207,
2610
+ "grad_norm": 0.12220925837755203,
2611
+ "learning_rate": 8.651289762627436e-05,
2612
+ "loss": 10.1374,
2613
+ "step": 372
2614
+ },
2615
+ {
2616
+ "epoch": 0.011428483887522883,
2617
+ "grad_norm": 0.15407343208789825,
2618
+ "learning_rate": 8.644281948359178e-05,
2619
+ "loss": 10.1371,
2620
+ "step": 373
2621
+ },
2622
+ {
2623
+ "epoch": 0.011459123254513561,
2624
+ "grad_norm": 0.16631901264190674,
2625
+ "learning_rate": 8.637258829305199e-05,
2626
+ "loss": 10.1323,
2627
+ "step": 374
2628
+ },
2629
+ {
2630
+ "epoch": 0.01148976262150424,
2631
+ "grad_norm": 0.21785318851470947,
2632
+ "learning_rate": 8.63022043496029e-05,
2633
+ "loss": 10.1221,
2634
+ "step": 375
2635
+ },
2636
+ {
2637
+ "epoch": 0.011520401988494917,
2638
+ "grad_norm": 0.21979299187660217,
2639
+ "learning_rate": 8.623166794883393e-05,
2640
+ "loss": 10.1171,
2641
+ "step": 376
2642
+ },
2643
+ {
2644
+ "epoch": 0.011551041355485596,
2645
+ "grad_norm": 0.2309437394142151,
2646
+ "learning_rate": 8.616097938697475e-05,
2647
+ "loss": 10.119,
2648
+ "step": 377
2649
+ },
2650
+ {
2651
+ "epoch": 0.011581680722476274,
2652
+ "grad_norm": 0.21765172481536865,
2653
+ "learning_rate": 8.609013896089409e-05,
2654
+ "loss": 10.1197,
2655
+ "step": 378
2656
+ },
2657
+ {
2658
+ "epoch": 0.011612320089466952,
2659
+ "grad_norm": 0.201009601354599,
2660
+ "learning_rate": 8.601914696809845e-05,
2661
+ "loss": 10.1192,
2662
+ "step": 379
2663
+ },
2664
+ {
2665
+ "epoch": 0.01164295945645763,
2666
+ "grad_norm": 0.24598164856433868,
2667
+ "learning_rate": 8.594800370673083e-05,
2668
+ "loss": 10.1206,
2669
+ "step": 380
2670
+ },
2671
+ {
2672
+ "epoch": 0.011673598823448308,
2673
+ "grad_norm": 0.20482785999774933,
2674
+ "learning_rate": 8.587670947556954e-05,
2675
+ "loss": 10.1211,
2676
+ "step": 381
2677
+ },
2678
+ {
2679
+ "epoch": 0.011704238190438986,
2680
+ "grad_norm": 0.1774405539035797,
2681
+ "learning_rate": 8.580526457402692e-05,
2682
+ "loss": 10.1507,
2683
+ "step": 382
2684
+ },
2685
+ {
2686
+ "epoch": 0.011734877557429663,
2687
+ "grad_norm": 0.18236958980560303,
2688
+ "learning_rate": 8.573366930214806e-05,
2689
+ "loss": 10.1516,
2690
+ "step": 383
2691
+ },
2692
+ {
2693
+ "epoch": 0.01176551692442034,
2694
+ "grad_norm": 0.19878503680229187,
2695
+ "learning_rate": 8.566192396060953e-05,
2696
+ "loss": 10.1571,
2697
+ "step": 384
2698
+ },
2699
+ {
2700
+ "epoch": 0.011796156291411019,
2701
+ "grad_norm": 0.19371262192726135,
2702
+ "learning_rate": 8.559002885071819e-05,
2703
+ "loss": 10.1471,
2704
+ "step": 385
2705
+ }
+ ],
+ "logging_steps": 1,
+ "max_steps": 1538,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 1,
+ "save_steps": 385,
+ "stateful_callbacks": {
+ "TrainerControl": {
+ "args": {
+ "should_epoch_stop": false,
+ "should_evaluate": false,
+ "should_log": false,
+ "should_save": true,
+ "should_training_stop": false
+ },
+ "attributes": {}
+ }
+ },
+ "total_flos": 40170225401856.0,
+ "train_batch_size": 4,
+ "trial_name": null,
+ "trial_params": null
+ }
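
The trainer state above records one log entry per optimizer step (loss, learning rate, gradient norm) plus the resume metadata at the end of the file. Below is a minimal sketch of how one might inspect this checkpoint locally; the file path and the `log_history` key name are assumptions (the opening of the JSON is not shown in this diff), not something this commit documents.

```python
# Minimal sketch: inspect the trainer state shown in the diff above.
# Assumes the checkpoint directory has been downloaded locally and that the
# per-step entries live under the standard "log_history" key (assumption:
# the top of the file is not visible in this diff).
import json

with open("last-checkpoint/trainer_state.json") as f:
    state = json.load(f)

last = state["log_history"][-1]  # e.g. step 385 in this checkpoint
print(f"step {last['step']}: loss={last['loss']}, lr={last['learning_rate']}")
print(f"progress: {last['step']}/{state['max_steps']} steps "
      f"(save_steps={state['save_steps']}, batch size={state['train_batch_size']})")
```

With the values visible above, this would report step 385 of 1538, matching the `save_steps` interval at which this checkpoint was written.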
last-checkpoint/training_args.bin ADDED
@@ -0,0 +1,3 @@
 
 
 
 
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9cf46e560e6c11dc285014c129fad149dbc7ca148b6bb108c291065eb6edfbe9
+ size 6776
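
The three lines above are a Git LFS pointer, not the binary itself: `oid` is the sha256 of the real `training_args.bin` and `size` is its length in bytes. A minimal sketch for verifying a locally fetched copy against that pointer follows; it assumes the actual file has already been pulled (e.g. via `git lfs pull`) to the path shown.

```python
# Minimal sketch: check a downloaded training_args.bin against the LFS pointer
# above (oid = sha256 of the real file, size = its byte length).
import hashlib
import os

path = "last-checkpoint/training_args.bin"  # assumed local download location
expected_oid = "9cf46e560e6c11dc285014c129fad149dbc7ca148b6bb108c291065eb6edfbe9"
expected_size = 6776

digest = hashlib.sha256(open(path, "rb").read()).hexdigest()
assert os.path.getsize(path) == expected_size, "size mismatch with LFS pointer"
assert digest == expected_oid, "sha256 mismatch with LFS pointer"
print("training_args.bin matches its LFS pointer")
```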