Could a quanto model be used in Comfy? How?
Could this quanto-quantized model be used in ComfyUI?
I also wonder if there is a custom node that adds support for this quant, because currently it doesn't work:
got prompt
!!! Exception during processing !!! 'blocks.0.ffn.0.weight'
Traceback (most recent call last):
  File "C:\_stability_matrix\Data\Packages\ComfyUI\execution.py", line 427, in execute
    output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "C:\_stability_matrix\Data\Packages\ComfyUI\execution.py", line 270, in get_output_data
    return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "C:\_stability_matrix\Data\Packages\ComfyUI\custom_nodes\ComfyUI-Lora-Manager\py\metadata_collector\metadata_hook.py", line 172, in async_map_node_over_list_with_metadata
    results = await original_map_node_over_list(prompt_id, unique_id, obj, input_data_all, func, allow_interrupt, execution_block_cb, pre_execute_cb)
  File "C:\_stability_matrix\Data\Packages\ComfyUI\execution.py", line 244, in _async_map_node_over_list
    await process_inputs(input_dict, i)
  File "C:\_stability_matrix\Data\Packages\ComfyUI\execution.py", line 232, in process_inputs
    result = f(**inputs)
  File "C:\_stability_matrix\Data\Packages\ComfyUI\nodes.py", line 916, in load_unet
    model = comfy.sd.load_diffusion_model(unet_path, model_options=model_options)
  File "C:\_stability_matrix\Data\Packages\ComfyUI\comfy\sd.py", line 1184, in load_diffusion_model
    model = load_diffusion_model_state_dict(sd, model_options=model_options)
  File "C:\_stability_matrix\Data\Packages\ComfyUI\comfy\sd.py", line 1133, in load_diffusion_model_state_dict
    model_config = model_detection.model_config_from_unet(sd, "")
  File "C:\_stability_matrix\Data\Packages\ComfyUI\comfy\model_detection.py", line 617, in model_config_from_unet
    unet_config = detect_unet_config(state_dict, unet_key_prefix, metadata=metadata)
  File "C:\_stability_matrix\Data\Packages\ComfyUI\comfy\model_detection.py", line 351, in detect_unet_config
    dit_config["ffn_dim"] = state_dict['{}blocks.0.ffn.0.weight'.format(key_prefix)].shape[0]
KeyError: 'blocks.0.ffn.0.weight'
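A quick way to diagnose this is to dump the tensor names actually stored in the file and compare them against the key the loader expects (blocks.0.ffn.0.weight). This is a stdlib-only sketch that reads just the safetensors header (8-byte little-endian header length followed by a JSON header); the file name in the usage comment is a placeholder for your own model file:

```python
import json
import struct

def list_safetensors_keys(path):
    """Read only the JSON header of a .safetensors file and return its tensor names."""
    with open(path, "rb") as f:
        # First 8 bytes: little-endian u64 giving the JSON header length.
        (header_len,) = struct.unpack("<Q", f.read(8))
        header = json.loads(f.read(header_len))
    # "__metadata__" is an optional non-tensor entry in the header.
    return [k for k in header if k != "__metadata__"]

# Usage (placeholder path):
# for name in list_safetensors_keys("model.safetensors"):
#     print(name)
```

If the printed names carry quanto-style suffixes or a different block layout, that confirms the detection code simply isn't finding the key names it probes for.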
Got the same error myself. Exact same. Per Gemini:
You've hit another interesting error! This KeyError: 'blocks.0.ffn.0.weight' happens deep inside ComfyUI's model loading code, specifically when it's trying to figure out the architecture of the UNet model you're loading.
Here's what's happening:
ComfyUI's model loader correctly identifies that you're trying to load a DiT-style (Diffusion Transformer) model.
It then tries to determine the specific variant of the DiT model by checking for the existence of certain keys in the model's file.
Your model seems to be a variation that the loader doesn't fully recognize. It expects to find a key named 'blocks.0.ffn.0.weight' to determine one of the model's parameters, but that key doesn't exist in your model file, which causes the KeyError.
This is an issue with how ComfyUI detects the architecture of the specific UNet model you are trying to load. The model file likely has a slightly different structure than ComfyUI's automatic detection expects, which can happen with newer models or fine-tuned versions of models.
To move forward, you could try a few things:
Ensure you are using a model that is known to be compatible with the standard ComfyUI loaders. Sometimes model creators provide specific instructions or versions for ComfyUI.
You could check the source of the model (e.g., Hugging Face page) for any notes about its architecture or for a version that is structured differently.
If you're feeling adventurous, you could investigate modifying the model file itself to rename the keys to what ComfyUI expects, but that is a more advanced solution.
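On that last point: renaming keys in a .safetensors file only requires rewriting its JSON header, since the tensor bytes and their offsets are stored separately and stay valid. Below is a hypothetical stdlib-only sketch; the key mapping in the usage comment is purely illustrative (it is not a verified fix for this particular model), and you should work on a copy, never the original file:

```python
import json
import struct

def rename_safetensors_keys(src_path, dst_path, key_map):
    """Copy a .safetensors file, renaming tensor keys according to key_map."""
    with open(src_path, "rb") as f:
        # Header: 8-byte little-endian length, then that many bytes of JSON.
        (header_len,) = struct.unpack("<Q", f.read(8))
        header = json.loads(f.read(header_len))
        data = f.read()  # raw tensor bytes; offsets are relative, so they stay valid
    # Keys absent from key_map (including "__metadata__") pass through unchanged.
    renamed = {key_map.get(k, k): v for k, v in header.items()}
    blob = json.dumps(renamed).encode("utf-8")
    with open(dst_path, "wb") as f:
        f.write(struct.pack("<Q", len(blob)))
        f.write(blob)
        f.write(data)

# Illustrative only -- the real mapping depends on the keys your file actually has:
# rename_safetensors_keys("model.safetensors", "model_fixed.safetensors",
#                         {"blocks.0.ffn.net.0.weight": "blocks.0.ffn.0.weight"})
```

Note this only renames entries; if the tensors themselves are stored in a quantized layout the loader doesn't understand, renaming alone won't make them loadable.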
This is a model compatibility issue with ComfyUI's loader. Your workflow is likely correct, but the model file itself is causing the loader to fail.