ValueError: Adapter name(s) {'lora'} not in the list of present adapters: set().

#1 opened by Jerry8

import torch
from diffusers import FluxKontextPipeline

pipe = FluxKontextPipeline.from_pretrained("black-forest-labs/FLUX.1-Kontext-dev", torch_dtype=torch.bfloat16).to("cuda")
pipe.load_lora_weights("kontext-community/relighting-kontext-dev-lora-v3", weight_name="relighting-kontext-dev-lora-v3.safetensors", adapter_name="lora")
pipe.set_adapters(["lora"], adapter_weights=[1.0])

The LoRA weights fail to load: debugging shows that the keys in the LoRA state_dict do not match the parameter names of Flux's transformer, so no adapter gets registered. As a result, the later call pipe.set_adapters(["lora"], adapter_weights=[1.0]) raises the error above, because no adapter named 'lora' is present.
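A minimal diagnostic sketch of how to confirm this, assuming a recent diffusers with the PEFT backend plus huggingface_hub and safetensors installed (get_list_adapters is the diffusers helper for listing registered adapters; the repo and file names are the ones from the post):

import torch
from diffusers import FluxKontextPipeline
from huggingface_hub import hf_hub_download
from safetensors import safe_open

pipe = FluxKontextPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-Kontext-dev", torch_dtype=torch.bfloat16
).to("cuda")

# Inspect the raw LoRA key names before loading; a prefix that diffusers
# does not recognize (e.g. not "transformer.") would explain why the
# weights are skipped and no adapter is registered.
lora_path = hf_hub_download(
    "kontext-community/relighting-kontext-dev-lora-v3",
    "relighting-kontext-dev-lora-v3.safetensors",
)
with safe_open(lora_path, framework="pt") as f:
    for key in list(f.keys())[:10]:
        print(key)

pipe.load_lora_weights(lora_path, adapter_name="lora")

# Check whether the adapter actually registered before calling set_adapters;
# an empty result reproduces the "not in the list of present adapters" error.
print(pipe.get_list_adapters())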

I get the same error, and I apply the LoRA with pipe.fuse_lora(lora_scale=0.75) instead of set_adapters.
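For reference, a minimal sketch of that fuse_lora path (same repo and weight file as above). fuse_lora merges the LoRA deltas into the base weights rather than registering a named adapter, but it still depends on load_lora_weights matching the state_dict keys first, which is presumably why it hits the same error:

import torch
from diffusers import FluxKontextPipeline

pipe = FluxKontextPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-Kontext-dev", torch_dtype=torch.bfloat16
).to("cuda")

# If the state_dict keys match the transformer, this succeeds and
# fuse_lora bakes the LoRA into the base weights at scale 0.75,
# so no set_adapters call is needed afterwards.
pipe.load_lora_weights(
    "kontext-community/relighting-kontext-dev-lora-v3",
    weight_name="relighting-kontext-dev-lora-v3.safetensors",
)
pipe.fuse_lora(lora_scale=0.75)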
