---
license: other
license_name: circlestone-labs-non-commercial-license
license_link: https://huggingface.co/circlestone-labs/Anima/blob/main/LICENSE.md
language:
  - en
pipeline_tag: text-to-image
tags:
  - comfyui
  - diffusion-single-file
base_model:
  - circlestone-labs/Anima
base_model_relation: quantized
---

# Anima-DF11-ComfyUI

For more information (including how to compress models yourself), check out https://huggingface.co/DFloat11 and https://github.com/LeanModels/DFloat11

Feel free to request compression for other models as well, although models whose architectures I am unfamiliar with may be somewhat tricky for me.

## How to Use

### ComfyUI

Install the ComfyUI DFloat11 Extended node via the ComfyUI Manager. After installing, use the provided workflow JSON, or simply replace the "Load Diffusion Model" node of an existing workflow with the loader node provided by the extension. If you run into any issues, feel free to leave a comment. The workflow is also embedded in the PNG image below.

### diffusers

As far as I know, this model is not implemented in diffusers yet.

## Compression Details

This is the `pattern_dict` for compressing Anima-based models in ComfyUI:

```python
pattern_dict_comfyui = {
    r"t_embedder\.1": (
        "linear_1",
        "linear_2",
    ),
    r"blocks\.\d+": (
        "self_attn.q_proj",
        "self_attn.k_proj",
        "self_attn.v_proj",
        "self_attn.output_proj",
        "cross_attn.q_proj",
        "cross_attn.k_proj",
        "cross_attn.v_proj",
        "cross_attn.output_proj",
        "mlp.layer1",
        "mlp.layer2",
        "adaln_modulation_self_attn.1",
        "adaln_modulation_self_attn.2",
        "adaln_modulation_cross_attn.1",
        "adaln_modulation_cross_attn.2",
        "adaln_modulation_mlp.1",
        "adaln_modulation_mlp.2",
    ),
    r"llm_adapter\.embed": [],
    r"llm_adapter\.blocks\.\d+": (
        "self_attn.q_proj",
        "self_attn.k_proj",
        "self_attn.v_proj",
        "self_attn.o_proj",
        "cross_attn.q_proj",
        "cross_attn.k_proj",
        "cross_attn.v_proj",
        "cross_attn.o_proj",
        "mlp.0",
        "mlp.2",
    ),
}
```
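As a rough illustration of how such a dict is interpreted, each key is a regular expression matched against a module's dotted path in the model, and the associated tuple names the submodules under that path whose weights are compressed. The helper below is a hypothetical sketch (it is not part of DFloat11's API, and DFloat11 may match patterns differently, e.g. with `re.search` instead of `re.fullmatch`); it just shows the path-matching idea with a trimmed-down copy of the dict above:

```python
import re

# Trimmed-down copy of the pattern_dict above, for illustration only.
patterns = {
    r"blocks\.\d+": ("self_attn.q_proj", "mlp.layer1"),
    r"llm_adapter\.blocks\.\d+": ("self_attn.q_proj", "mlp.0"),
}

def matching_submodules(module_path, pattern_dict):
    """Return full paths of the submodules a pattern_dict selects for one module."""
    for pattern, submodules in pattern_dict.items():
        if re.fullmatch(pattern, module_path):
            return [f"{module_path}.{name}" for name in submodules]
    return []

print(matching_submodules("blocks.7", patterns))
# -> ['blocks.7.self_attn.q_proj', 'blocks.7.mlp.layer1']
print(matching_submodules("llm_adapter.blocks.0", patterns))
# -> ['llm_adapter.blocks.0.self_attn.q_proj', 'llm_adapter.blocks.0.mlp.0']
```

Note that `re.fullmatch` keeps `r"blocks\.\d+"` from accidentally capturing `llm_adapter.blocks.0`, which is why the two block families can carry different projection names (`output_proj` vs `o_proj`).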