--- OUTER FOLD 1/5 ---
INFO: Best params for fold 1: {'lr': 0.000756755929227675, 'hidden_dim': 64, 'batch_size': 64}
INFO: Fold 1 Val RMSE: 1.5386, MAE: 1.2724
--- OUTER FOLD 2/5 ---
INFO: Best params for fold 2: {'lr': 0.0007303755012255117, 'hidden_dim': 128, 'batch_size': 32}
INFO: Fold 2 Val RMSE: 1.4841, MAE: 1.2102
--- OUTER FOLD 3/5 ---
INFO: Best params for fold 3: {'lr': 0.0006175439707655367, 'hidden_dim': 128, 'batch_size': 64}
INFO: Fold 3 Val RMSE: 1.4478, MAE: 1.1961
--- OUTER FOLD 4/5 ---
INFO: Best params for fold 4: {'lr': 0.0007618320309633699, 'hidden_dim': 256, 'batch_size': 32}
INFO: Fold 4 Val RMSE: 1.5224, MAE: 1.2472
--- OUTER FOLD 5/5 ---
INFO: Best params for fold 5: {'lr': 0.0006858999160561152, 'hidden_dim': 64, 'batch_size': 32}
INFO: Fold 5 Val RMSE: 1.4781, MAE: 1.2242
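
For context, the per-fold lines above follow the usual nested cross-validation pattern: an outer K-fold split provides each fold's validation set, and hyperparameters are tuned only on that fold's training portion. A minimal sketch of that structure in Python, where tune_fn and eval_fn are hypothetical callables standing in for the inner hyperparameter search and the model training/evaluation (neither is shown in the log), and the shuffle/seed settings are assumptions:

import numpy as np
from sklearn.model_selection import KFold

def nested_cv(X, y, tune_fn, eval_fn, n_outer=5, seed=42):
    # tune_fn(X_tr, y_tr) -> best_params
    # eval_fn(best_params, X_tr, y_tr, X_val, y_val) -> (rmse, mae)
    outer = KFold(n_splits=n_outer, shuffle=True, random_state=seed)
    fold_rmse, fold_mae = [], []
    for fold, (tr, va) in enumerate(outer.split(X), start=1):
        print(f"--- OUTER FOLD {fold}/{n_outer} ---")
        best_params = tune_fn(X[tr], y[tr])  # inner search sees outer-train data only
        rmse, mae = eval_fn(best_params, X[tr], y[tr], X[va], y[va])
        print(f"INFO: Best params for fold {fold}: {best_params}")
        print(f"INFO: Fold {fold} Val RMSE: {rmse:.4f}, MAE: {mae:.4f}")
        fold_rmse.append(rmse)
        fold_mae.append(mae)
    return np.array(fold_rmse), np.array(fold_mae)
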
------ Nested Cross-Validation Summary ------
Unbiased Validation RMSE: 1.4942 ± 0.0325
Unbiased Validation MAE: 1.2300 ± 0.0271
VAL FOLD RMSEs: [1.5385734, 1.4840797, 1.4478317, 1.5224414, 1.4781064]
VAL FOLD MAEs: [1.27239, 1.2101815, 1.1960989, 1.2471735, 1.2242327]
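
The summary statistics can be reproduced directly from the per-fold values above; the ± term matches NumPy's default population standard deviation (ddof=0):

import numpy as np

fold_rmse = np.array([1.5385734, 1.4840797, 1.4478317, 1.5224414, 1.4781064])
fold_mae = np.array([1.27239, 1.2101815, 1.1960989, 1.2471735, 1.2242327])

# Mean ± std over the five outer folds (np.std uses ddof=0 by default).
print(f"Unbiased Validation RMSE: {fold_rmse.mean():.4f} ± {fold_rmse.std():.4f}")  # 1.4942 ± 0.0325
print(f"Unbiased Validation MAE: {fold_mae.mean():.4f} ± {fold_mae.std():.4f}")     # 1.2300 ± 0.0271
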
===== STEP 2: Final Model Training & Testing =====
INFO: Finding best hyperparameters on the FULL train/val set for final model...
INFO: Optimal hyperparameters for final model: {'lr': 0.0008903488639350984, 'hidden_dim': 64, 'batch_size': 64}
INFO: Training final model...
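
The log does not say which search strategy produces these parameters; the continuous learning-rate values suggest a random or Bayesian search. A minimal sketch assuming Optuna, with the search space inferred from the logged best params (the exact bounds and trial count are assumptions) and evaluate(params) a hypothetical helper that trains the model and returns validation RMSE:

import optuna

def make_objective(evaluate):
    def objective(trial):
        # Search space inferred from the logged best params; exact bounds are assumed.
        params = {
            "lr": trial.suggest_float("lr", 1e-4, 1e-2, log=True),
            "hidden_dim": trial.suggest_categorical("hidden_dim", [64, 128, 256]),
            "batch_size": trial.suggest_categorical("batch_size", [32, 64]),
        }
        return evaluate(params)  # hypothetical: train on the train split, score on the val split
    return objective

# Usage, once evaluate is defined:
# study = optuna.create_study(direction="minimize")
# study.optimize(make_objective(evaluate), n_trials=50)
# print(f"INFO: Optimal hyperparameters for final model: {study.best_params}")
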
===== STEP 3: Final Held-Out Test Evaluation =====
Test RMSE: 1.7444 (95% CI: [1.5304, 2.0141])
Test MAE: 1.3390 (95% CI: [1.2356, 1.4498])
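
The log does not state how these confidence intervals are computed; asymmetric bounds like these are typical of a percentile bootstrap over the held-out test predictions. A minimal sketch under that assumption, where y_test and y_pred are hypothetical arrays of test targets and final-model predictions:

import numpy as np

def bootstrap_ci(y_true, y_pred, metric, n_boot=2000, alpha=0.05, seed=0):
    # Percentile bootstrap: resample test points with replacement, recompute the metric.
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    rng = np.random.default_rng(seed)
    n = len(y_true)
    stats = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        stats.append(metric(y_true[idx], y_pred[idx]))
    lo, hi = np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return metric(y_true, y_pred), lo, hi

rmse = lambda yt, yp: float(np.sqrt(np.mean((yt - yp) ** 2)))
mae = lambda yt, yp: float(np.mean(np.abs(yt - yp)))

# point, lo, hi = bootstrap_ci(y_test, y_pred, rmse)
# print(f"Test RMSE: {point:.4f} (95% CI: [{lo:.4f}, {hi:.4f}])")
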