GGUF for IP FULL not Lite

#1
by QrusherZA - opened

Hi, is it possible to create the full version of the IP in GGUF? Or tell me how you did it and I will make it myself. I have the fp16 version, but City96's tools keep saying unknown architecture.

I can do that. To create this GGUF you need to modify the code of ComfyUI-GGUF, along with some changes in llama.cpp, to get the final quantized versions.

Bro, thanks for the quick response, man. Do you have a fork? I would love to see the changes you made if possible (no need to explain, I'm a C# developer but understand other languages as well).

I don’t maintain a fork, because every new model release requires different changes, so my local copy keeps evolving as needed. I’ll quantize the models and also post the required changes here. I’m out on the road today, but I’ll take care of it tomorrow.

Thanks bro, really appreciate it. Please don't rush on my account; it's not urgent for me. Get to it whenever you have time.

It's next on my list, as I'm already quantizing and uploading a huge model. I'll do it right after Christmas Day.

In the meantime, if you want, you can make these changes to convert.py to support Lynx.

Add these classes:

class ModelLynx(ModelTemplate):
    arch = "wan"
    ndims_fix = True
    keys_detect = [
        (
            "blocks.0.self_attn.ref_adapter.to_k_ref.weight",
            "blocks.1.self_attn.ref_adapter.to_k_ref.weight",
        )
    ]
    keys_hiprec = [
        ".modulation",  # nn.Parameter, can't load from BF16 ver
    ]

class ModelLynxIP(ModelTemplate):
    arch = "wan"
    ndims_fix = True
    keys_detect = [
        (
            "blocks.0.cross_attn.ip_adapter.to_k_ip.weight",
            "blocks.2.cross_attn.ip_adapter.to_k_ip.weight",
        )
    ]
    keys_hiprec = [
        ".modulation",  # nn.Parameter, can't load from BF16 ver
    ]

Also register both classes in arch_list = []:
ModelLynx, ModelLynxIP
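For context, here is a minimal sketch of how keys_detect-based architecture matching could work. The class attributes mirror the snippet above, but ModelTemplate, detect_arch, and the matching loop are illustrative assumptions, not the actual ComfyUI-GGUF code:

```python
class ModelTemplate:
    arch = "invalid"
    keys_detect = []

class ModelLynxIP(ModelTemplate):
    arch = "wan"
    keys_detect = [
        (
            "blocks.0.cross_attn.ip_adapter.to_k_ip.weight",
            "blocks.2.cross_attn.ip_adapter.to_k_ip.weight",
        )
    ]

# Register candidate architectures; order matters if detect tuples overlap.
arch_list = [ModelLynxIP]

def detect_arch(state_dict_keys):
    # A model matches when every key in at least one detect tuple is present.
    for model in arch_list:
        if any(all(k in state_dict_keys for k in group)
               for group in model.keys_detect):
            return model
    return None

keys = {
    "blocks.0.cross_attn.ip_adapter.to_k_ip.weight": None,
    "blocks.2.cross_attn.ip_adapter.to_k_ip.weight": None,
}
print(detect_arch(keys.keys()).__name__)  # ModelLynxIP
```

This is why the "unknown architecture" error appears for an unmodified convert.py: none of the registered detect tuples match the Lynx checkpoint's keys.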

Thank you, you are a star, my man. I got this working; I just had to make a small change in WanVideoWrapper for the Lynx modules, and bam, working with full IP <3. The change was just that the tensor, when converted to GGUF, was 2D instead of 3D, so I did a workaround for that.

Glad to help.

vantagewithai changed discussion status to closed
