2021-05-31 05:47:27,758	INFO	__main__	Namespace(adjust_lr=False, config='torchdistill/configs/sample/glue/sst2/kd/bert_base_uncased_from_bert_large_uncased.yaml', log='log/glue/sst2/kd/bert_base_uncased_from_bert_large_uncased.txt', private_output='leaderboard/glue/kd/bert_base_uncased_from_bert_large_uncased/', seed=None, student_only=False, task_name='sst2', test_only=False, world_size=1)
2021-05-31 05:47:27,789	INFO	__main__	Distributed environment: NO
Num processes: 1
Process index: 0
Local process index: 0
Device: cuda
Use FP16 precision: True

2021-05-31 05:47:35,040	WARNING	datasets.builder	Reusing dataset glue (/root/.cache/huggingface/datasets/glue/sst2/1.0.0/dacbe3125aa31d7f70367a07a8a9e72a5a0bfeb5fc42e75c9db75b96da6053ad)
2021-05-31 05:47:38,503	INFO	__main__	Start training
2021-05-31 05:47:38,503	INFO	torchdistill.models.util	[teacher model]
2021-05-31 05:47:38,503	INFO	torchdistill.models.util	Using the original teacher model
2021-05-31 05:47:38,503	INFO	torchdistill.models.util	[student model]
2021-05-31 05:47:38,504	INFO	torchdistill.models.util	Using the original student model
2021-05-31 05:47:38,504	INFO	torchdistill.core.distillation	Loss = 1.0 * OrgLoss
2021-05-31 05:47:38,504	INFO	torchdistill.core.distillation	Freezing the whole teacher model
2021-05-31 05:47:42,142	INFO	torchdistill.misc.log	Epoch: [0]  [   0/2105]  eta: 0:10:14  lr: 9.998416468725257e-05  sample/s: 14.042895693690195  loss: 0.3713 (0.3713)  time: 0.2918  data: 0.0070  max mem: 1909
2021-05-31 05:50:16,780	INFO	torchdistill.misc.log	Epoch: [0]  [ 500/2105]  eta: 0:08:16  lr: 9.20665083135392e-05  sample/s: 11.948494724127714  loss: 0.0681 (0.0908)  time: 0.3081  data: 0.0032  max mem: 3637
2021-05-31 05:52:52,257	INFO	torchdistill.misc.log	Epoch: [0]  [1000/2105]  eta: 0:05:42  lr: 8.414885193982581e-05  sample/s: 11.96046261450051  loss: 0.0413 (0.0710)  time: 0.3093  data: 0.0033  max mem: 3637
2021-05-31 05:55:28,379	INFO	torchdistill.misc.log	Epoch: [0]  [1500/2105]  eta: 0:03:08  lr: 7.623119556611244e-05  sample/s: 11.926241231549644  loss: 0.0231 (0.0601)  time: 0.3108  data: 0.0032  max mem: 3798
2021-05-31 05:58:04,665	INFO	torchdistill.misc.log	Epoch: [0]  [2000/2105]  eta: 0:00:32  lr: 6.831353919239906e-05  sample/s: 16.632809944065624  loss: 0.0248 (0.0530)  time: 0.3105  data: 0.0033  max mem: 3802
2021-05-31 05:58:37,981	INFO	torchdistill.misc.log	Epoch: [0] Total time: 0:10:56
2021-05-31 05:58:39,659	INFO	/usr/local/lib/python3.7/dist-packages/datasets/metric.py	Removing /root/.cache/huggingface/metrics/glue/sst2/default_experiment-1-0.arrow
2021-05-31 05:58:39,659	INFO	__main__	Validation: accuracy = 0.9151376146788991
2021-05-31 05:58:39,659	INFO	__main__	Updating ckpt at ./resource/ckpt/glue/sst2/kd/sst2-bert-base-uncased_from_bert-large-uncased
2021-05-31 05:58:40,894	INFO	torchdistill.misc.log	Epoch: [1]  [   0/2105]  eta: 0:10:41  lr: 6.665083135391924e-05  sample/s: 13.684057874084145  loss: 0.0106 (0.0106)  time: 0.3047  data: 0.0124  max mem: 3802
2021-05-31 06:01:18,103	INFO	torchdistill.misc.log	Epoch: [1]  [ 500/2105]  eta: 0:08:22  lr: 5.8733174980205864e-05  sample/s: 11.957572794359765  loss: 0.0071 (0.0139)  time: 0.3221  data: 0.0031  max mem: 3802
2021-05-31 06:03:54,053	INFO	torchdistill.misc.log	Epoch: [1]  [1000/2105]  eta: 0:05:45  lr: 5.081551860649249e-05  sample/s: 11.940883530020505  loss: 0.0065 (0.0137)  time: 0.3034  data: 0.0031  max mem: 3802
2021-05-31 06:06:29,428	INFO	torchdistill.misc.log	Epoch: [1]  [1500/2105]  eta: 0:03:08  lr: 4.28978622327791e-05  sample/s: 13.916417959969143  loss: 0.0063 (0.0133)  time: 0.3077  data: 0.0032  max mem: 3802
2021-05-31 06:09:03,728	INFO	torchdistill.misc.log	Epoch: [1]  [2000/2105]  eta: 0:00:32  lr: 3.4980205859065716e-05  sample/s: 11.953755940463552  loss: 0.0084 (0.0129)  time: 0.3038  data: 0.0032  max mem: 3802
2021-05-31 06:09:35,385	INFO	torchdistill.misc.log	Epoch: [1] Total time: 0:10:54
2021-05-31 06:09:37,066	INFO	/usr/local/lib/python3.7/dist-packages/datasets/metric.py	Removing /root/.cache/huggingface/metrics/glue/sst2/default_experiment-1-0.arrow
2021-05-31 06:09:37,066	INFO	__main__	Validation: accuracy = 0.9243119266055045
2021-05-31 06:09:37,067	INFO	__main__	Updating ckpt at ./resource/ckpt/glue/sst2/kd/sst2-bert-base-uncased_from_bert-large-uncased
2021-05-31 06:09:38,580	INFO	torchdistill.misc.log	Epoch: [2]  [   0/2105]  eta: 0:10:29  lr: 3.3317498020585904e-05  sample/s: 13.716311384138436  loss: 0.0019 (0.0019)  time: 0.2992  data: 0.0076  max mem: 3802
2021-05-31 06:12:12,846	INFO	torchdistill.misc.log	Epoch: [2]  [ 500/2105]  eta: 0:08:15  lr: 2.5399841646872525e-05  sample/s: 13.906025093019228  loss: 0.0049 (0.0055)  time: 0.2998  data: 0.0031  max mem: 3802
2021-05-31 06:14:49,099	INFO	torchdistill.misc.log	Epoch: [2]  [1000/2105]  eta: 0:05:43  lr: 1.7482185273159146e-05  sample/s: 10.724082628769473  loss: 0.0038 (0.0054)  time: 0.3174  data: 0.0031  max mem: 3802
2021-05-31 06:17:26,529	INFO	torchdistill.misc.log	Epoch: [2]  [1500/2105]  eta: 0:03:08  lr: 9.564528899445763e-06  sample/s: 11.955757780905381  loss: 0.0028 (0.0053)  time: 0.3269  data: 0.0032  max mem: 3802
2021-05-31 06:20:02,249	INFO	torchdistill.misc.log	Epoch: [2]  [2000/2105]  eta: 0:00:32  lr: 1.6468725257323833e-06  sample/s: 11.92430011293717  loss: 0.0027 (0.0053)  time: 0.2978  data: 0.0031  max mem: 3802
2021-05-31 06:20:34,146	INFO	torchdistill.misc.log	Epoch: [2] Total time: 0:10:55
2021-05-31 06:20:35,823	INFO	/usr/local/lib/python3.7/dist-packages/datasets/metric.py	Removing /root/.cache/huggingface/metrics/glue/sst2/default_experiment-1-0.arrow
2021-05-31 06:20:35,823	INFO	__main__	Validation: accuracy = 0.9288990825688074
2021-05-31 06:20:35,823	INFO	__main__	Updating ckpt at ./resource/ckpt/glue/sst2/kd/sst2-bert-base-uncased_from_bert-large-uncased
2021-05-31 06:20:36,974	INFO	__main__	[Teacher: bert-large-uncased]
2021-05-31 06:20:41,563	INFO	/usr/local/lib/python3.7/dist-packages/datasets/metric.py	Removing /root/.cache/huggingface/metrics/glue/sst2/default_experiment-1-0.arrow
2021-05-31 06:20:41,563	INFO	__main__	Test: accuracy = 0.9346330275229358
2021-05-31 06:20:43,462	INFO	__main__	[Student: bert-base-uncased]
2021-05-31 06:20:45,138	INFO	/usr/local/lib/python3.7/dist-packages/datasets/metric.py	Removing /root/.cache/huggingface/metrics/glue/sst2/default_experiment-1-0.arrow
2021-05-31 06:20:45,138	INFO	__main__	Test: accuracy = 0.9288990825688074
2021-05-31 06:20:45,138	INFO	__main__	Start prediction for private dataset(s)
2021-05-31 06:20:45,139	INFO	__main__	sst2/test: 1821 samples