-- Quantization strategy:
 --   model.layers.0.self_attn                           8.0414 bpw - exp. error: 0.00231304
 --   model.layers.0.mlp                                 8.0346 bpw - exp. error: 0.00221935
 --   model.layers.1.self_attn                           8.0414 bpw - exp. error: 0.00190707
 --   model.layers.1.mlp                                 8.0346 bpw - exp. error: 0.00185876
 --   model.layers.2.self_attn                           8.0414 bpw - exp. error: 0.00080757
 --   model.layers.2.mlp                                 8.0346 bpw - exp. error: 0.00312320
 --   model.layers.3.self_attn                           8.0414 bpw - exp. error: 0.00055968
 --   model.layers.3.mlp                                 8.0346 bpw - exp. error: 0.00349023
 --   model.layers.4.self_attn                           8.0414 bpw - exp. error: 0.00060596
 --   model.layers.4.mlp                                 8.0346 bpw - exp. error: 0.00107729
 --   model.layers.5.self_attn                           8.0414 bpw - exp. error: 0.00063598
 --   model.layers.5.mlp                                 8.0346 bpw - exp. error: 0.00079957
 --   model.layers.6.self_attn                           8.0414 bpw - exp. error: 0.00071120
 --   model.layers.6.mlp                                 8.0346 bpw - exp. error: 0.00094647
 --   model.layers.7.self_attn                           8.0414 bpw - exp. error: 0.00074661
 --   model.layers.7.mlp                                 8.0346 bpw - exp. error: 0.00109008
 --   model.layers.8.self_attn                           8.0414 bpw - exp. error: 0.00085548
 --   model.layers.8.mlp                                 8.0346 bpw - exp. error: 0.00120750
 --   model.layers.9.self_attn                           8.0414 bpw - exp. error: 0.00090076
 --   model.layers.9.mlp                                 8.0346 bpw - exp. error: 0.00136397
 --   model.layers.10.self_attn                          8.0414 bpw - exp. error: 0.00101369
 --   model.layers.10.mlp                                8.0346 bpw - exp. error: 0.00152688
 --   model.layers.11.self_attn                          8.0414 bpw - exp. error: 0.00113892
 --   model.layers.11.mlp                                8.0346 bpw - exp. error: 0.00160677
 --   model.layers.12.self_attn                          8.0414 bpw - exp. error: 0.00124207
 --   model.layers.12.mlp                                8.0346 bpw - exp. error: 0.00166692
 --   model.layers.13.self_attn                          8.0414 bpw - exp. error: 0.00120159
 --   model.layers.13.mlp                                8.0346 bpw - exp. error: 0.00180155
 --   model.layers.14.self_attn                          8.0414 bpw - exp. error: 0.00130511
 --   model.layers.14.mlp                                8.0346 bpw - exp. error: 0.00180295
 --   model.layers.15.self_attn                          8.0414 bpw - exp. error: 0.00128359
 --   model.layers.15.mlp                                8.0346 bpw - exp. error: 0.00182972
 --   model.layers.16.self_attn                          8.0414 bpw - exp. error: 0.00143487
 --   model.layers.16.mlp                                8.0346 bpw - exp. error: 0.00190457
 --   model.layers.17.self_attn                          8.0414 bpw - exp. error: 0.00141536
 --   model.layers.17.mlp                                8.0346 bpw - exp. error: 0.00189976
 --   model.layers.18.self_attn                          8.0414 bpw - exp. error: 0.00147880
 --   model.layers.18.mlp                                8.0346 bpw - exp. error: 0.00194527
 --   model.layers.19.self_attn                          8.0414 bpw - exp. error: 0.00140794
 --   model.layers.19.mlp                                8.0346 bpw - exp. error: 0.00198472
 --   model.layers.20.self_attn                          8.0414 bpw - exp. error: 0.00139448
 --   model.layers.20.mlp                                8.0346 bpw - exp. error: 0.00206014
 --   model.layers.21.self_attn                          8.0414 bpw - exp. error: 0.00141150
 --   model.layers.21.mlp                                8.0346 bpw - exp. error: 0.00203777
 --   model.layers.22.self_attn                          8.0414 bpw - exp. error: 0.00136862
 --   model.layers.22.mlp                                8.0346 bpw - exp. error: 0.00203118
 --   model.layers.23.self_attn                          8.0414 bpw - exp. error: 0.00129910
 --   model.layers.23.mlp                                8.0346 bpw - exp. error: 0.00203162
 --   model.layers.24.self_attn                          8.0414 bpw - exp. error: 0.00127187
 --   model.layers.24.mlp                                8.0346 bpw - exp. error: 0.00199441
 --   model.layers.25.self_attn                          8.0414 bpw - exp. error: 0.00124861
 --   model.layers.25.mlp                                8.0346 bpw - exp. error: 0.00201886
 --   model.layers.26.self_attn                          8.0414 bpw - exp. error: 0.00122997
 --   model.layers.26.mlp                                8.0346 bpw - exp. error: 0.00196617
 --   model.layers.27.self_attn                          8.0414 bpw - exp. error: 0.00116638
 --   model.layers.27.mlp                                8.0346 bpw - exp. error: 0.00189536
 --   model.layers.28.self_attn                          8.0414 bpw - exp. error: 0.00104954
 --   model.layers.28.mlp                                8.0346 bpw - exp. error: 0.00180170
 --   model.layers.29.self_attn                          8.0414 bpw - exp. error: 0.00097699
 --   model.layers.29.mlp                                8.0346 bpw - exp. error: 0.00176433
 --   model.layers.30.self_attn                          8.0414 bpw - exp. error: 0.00095343
 --   model.layers.30.mlp                                8.0346 bpw - exp. error: 0.00172087
 --   model.layers.31.self_attn                          8.0414 bpw - exp. error: 0.00091330
 --   model.layers.31.mlp                                8.0346 bpw - exp. error: 0.00169612
 --   model.layers.32.self_attn                          8.0414 bpw - exp. error: 0.00098182
 --   model.layers.32.mlp                                8.0346 bpw - exp. error: 0.00188125
 --   model.layers.33.self_attn                          8.0414 bpw - exp. error: 0.00076354
 --   model.layers.33.mlp                                8.0346 bpw - exp. error: 0.00162950
 --   model.layers.34.self_attn                          8.0414 bpw - exp. error: 0.00082926
 --   model.layers.34.mlp                                8.0346 bpw - exp. error: 0.00180018
 --   model.layers.35.self_attn                          8.0414 bpw - exp. error: 0.00105421
 --   model.layers.35.mlp                                8.0346 bpw - exp. error: 0.00170070
 --   model.layers.36.self_attn                          8.0414 bpw - exp. error: 0.00073896
 --   model.layers.36.mlp                                8.0346 bpw - exp. error: 0.00176191
 --   model.layers.37.self_attn                          8.0414 bpw - exp. error: 0.00075559
 --   model.layers.37.mlp                                8.0346 bpw - exp. error: 0.00165809
 --   model.layers.38.self_attn                          8.0414 bpw - exp. error: 0.00065418
 --   model.layers.38.mlp                                8.0346 bpw - exp. error: 0.00117587
 --   model.layers.39.self_attn                          8.0414 bpw - exp. error: 0.00041204
 --   model.layers.39.mlp                                8.0346 bpw - exp. error: 0.00097704
 -- sum(log(err)): -530.376730
 -- max(err): 0.003490
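
The two aggregates above follow directly from the per-module expected errors: every `exp. error:` value contributes one term to the log-sum, and max(err) corresponds to the layer 3 MLP entry (0.00349023). A minimal sketch for re-deriving them, assuming the strategy log has been saved to a local text file (the filename below is an assumption, not part of this card):

```python
import math
import re

# Hypothetical path to a file containing the "-- model.layers..." log above.
LOG_PATH = "quantization_strategy.txt"

err_pattern = re.compile(r"exp\. error:\s*([0-9.]+)")

errors = []
with open(LOG_PATH) as f:
    for line in f:
        m = err_pattern.search(line)
        if m:
            errors.append(float(m.group(1)))

# Over the 80 per-module entries above, these should reproduce the
# reported aggregates: sum(log(err)) ~= -530.376730, max(err) ~= 0.003490.
print(f"modules:       {len(errors)}")
print(f"sum(log(err)): {sum(math.log(e) for e in errors):.6f}")
print(f"max(err):      {max(errors):.6f}")
```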

Model tree for pipilok/phi-4-unsloth-exl2-8bpw-hb8:
- Base model: microsoft/phi-4
- Finetuned: unsloth/phi-4
- Quantized: this model
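
For local inference, EXL2 quants like this one are typically loaded with the exllamav2 Python package. The sketch below follows the package's basic generator example; the local directory path, sampler settings, and prompt are placeholders and assumptions, not part of this card:

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Assumed local download location of this repository.
MODEL_DIR = "./phi-4-unsloth-exl2-8bpw-hb8"

config = ExLlamaV2Config()
config.model_dir = MODEL_DIR
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)          # split layers across available GPU memory
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7           # example sampler setting

print(generator.generate_simple(
    "Explain EXL2 quantization in one sentence.", settings, num_tokens=64))
```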