TianxiangMa committed
Commit 4724aad · 0 Parent(s)

Duplicate from bytedance-research/Phantom

Co-authored-by: Tianxiang Ma <TianxiangMa@users.noreply.huggingface.co>

.gitattributes ADDED
@@ -0,0 +1,45 @@
+ *.7z filter=lfs diff=lfs merge=lfs -text
+ *.arrow filter=lfs diff=lfs merge=lfs -text
+ *.bin filter=lfs diff=lfs merge=lfs -text
+ *.bz2 filter=lfs diff=lfs merge=lfs -text
+ *.ckpt filter=lfs diff=lfs merge=lfs -text
+ *.ftz filter=lfs diff=lfs merge=lfs -text
+ *.gz filter=lfs diff=lfs merge=lfs -text
+ *.h5 filter=lfs diff=lfs merge=lfs -text
+ *.joblib filter=lfs diff=lfs merge=lfs -text
+ *.lfs.* filter=lfs diff=lfs merge=lfs -text
+ *.mlmodel filter=lfs diff=lfs merge=lfs -text
+ *.model filter=lfs diff=lfs merge=lfs -text
+ *.msgpack filter=lfs diff=lfs merge=lfs -text
+ *.npy filter=lfs diff=lfs merge=lfs -text
+ *.npz filter=lfs diff=lfs merge=lfs -text
+ *.onnx filter=lfs diff=lfs merge=lfs -text
+ *.ot filter=lfs diff=lfs merge=lfs -text
+ *.parquet filter=lfs diff=lfs merge=lfs -text
+ *.pb filter=lfs diff=lfs merge=lfs -text
+ *.pickle filter=lfs diff=lfs merge=lfs -text
+ *.pkl filter=lfs diff=lfs merge=lfs -text
+ *.pt filter=lfs diff=lfs merge=lfs -text
+ *.pth filter=lfs diff=lfs merge=lfs -text
+ *.rar filter=lfs diff=lfs merge=lfs -text
+ *.safetensors filter=lfs diff=lfs merge=lfs -text
+ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
+ *.tar.* filter=lfs diff=lfs merge=lfs -text
+ *.tar filter=lfs diff=lfs merge=lfs -text
+ *.tflite filter=lfs diff=lfs merge=lfs -text
+ *.tgz filter=lfs diff=lfs merge=lfs -text
+ *.wasm filter=lfs diff=lfs merge=lfs -text
+ *.xz filter=lfs diff=lfs merge=lfs -text
+ *.zip filter=lfs diff=lfs merge=lfs -text
+ *.zst filter=lfs diff=lfs merge=lfs -text
+ *tfevents* filter=lfs diff=lfs merge=lfs -text
+ id_eval.png filter=lfs diff=lfs merge=lfs -text
+ ip_eval_m_00.png filter=lfs diff=lfs merge=lfs -text
+ ip_eval_s.png filter=lfs diff=lfs merge=lfs -text
+ result1.gif filter=lfs diff=lfs merge=lfs -text
+ result2.gif filter=lfs diff=lfs merge=lfs -text
+ result3.gif filter=lfs diff=lfs merge=lfs -text
+ result4.gif filter=lfs diff=lfs merge=lfs -text
+ teaser.png filter=lfs diff=lfs merge=lfs -text
+ *.gif filter=lfs diff=lfs merge=lfs -text
+ *.png filter=lfs diff=lfs merge=lfs -text
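The `.gitattributes` entries above route every matching file through the Git LFS filter so only a small pointer is committed. As a minimal sketch (the parsing logic and sample input are illustrative, not part of this commit), the LFS-tracked patterns can be extracted from such a file like this:

```python
# Minimal sketch: list the patterns a .gitattributes file routes through
# Git LFS. Assumes the simple one-pattern-per-line layout used above.

def lfs_patterns(text: str) -> list[str]:
    patterns = []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        # A pattern is LFS-tracked when the lfs filter is among its attributes.
        if "filter=lfs" in parts[1:]:
            patterns.append(parts[0])
    return patterns

attrs = """\
*.safetensors filter=lfs diff=lfs merge=lfs -text
*.png filter=lfs diff=lfs merge=lfs -text
README.md -text
"""
print(lfs_patterns(attrs))  # ['*.safetensors', '*.png']
```

In practice these lines are usually generated by `git lfs track "<pattern>"` rather than written by hand.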
Phantom-Wan-1.3B.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3a23db1c11e23d350d24a418dcd2a252f50193097bca838d0c4c3ea8be3c0a84
+ size 5692970224
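Each of the ADDED weight files in this commit is stored as a three-line Git LFS pointer: the spec version, a `sha256` object id, and the byte size of the real file. A hedged sketch of parsing such a pointer (the helper name is mine, not from this repo):

```python
# Sketch: parse a Git LFS pointer file into its three fields
# (version, oid, size), following https://git-lfs.github.com/spec/v1.

def parse_lfs_pointer(text: str) -> dict:
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    fields["size"] = int(fields["size"])  # size is a decimal byte count
    return fields

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:3a23db1c11e23d350d24a418dcd2a252f50193097bca838d0c4c3ea8be3c0a84
size 5692970224
"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # 5692970224
```

The `size` field is what repository UIs display as the file size; the `oid` is the sha256 digest of the actual payload stored on the LFS server.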
Phantom_Wan_14B-00001-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7b1a6e11fd966b67d2bd921a09d18116aa75154cea5200e91e36ed8e856d2529
+ size 9782642624
Phantom_Wan_14B-00002-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4d689cfc16f6f5610efbde429a62a8acc24e6081f46e710b6b25a10209bba2c1
+ size 9524424752
Phantom_Wan_14B-00003-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2bc867be733059c38c80aba12957dd637d30e5bda21746debe570fcd9506e6bf
+ size 9450866864
Phantom_Wan_14B-00004-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:dde0dc06da5f2d72617961e9284dddb0c2ab4999139ee5020435d25163f4f0f8
+ size 9451004040
Phantom_Wan_14B-00005-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c917c8bffc3abcf6f8b3706cfcb55a178d9e4103117641156e39ff5f27a483d9
+ size 9524424840
Phantom_Wan_14B-00006-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:85f460d6bd95809fab90111e884f8cbf1dd3d69993ce971ac71fa53eb847dec0
+ size 9420714056
Phantom_Wan_14B.safetensors.index.json ADDED
@@ -0,0 +1,1102 @@
+ {
+ "metadata": {
+ "total_size": 57154077176
+ },
+ "weight_map": {
+ "patch_embedding.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "patch_embedding.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "text_embedding.0.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "text_embedding.0.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "text_embedding.2.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "text_embedding.2.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "time_embedding.0.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "time_embedding.0.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "time_embedding.2.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "time_embedding.2.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "time_projection.1.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "time_projection.1.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.modulation": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.self_attn.q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.self_attn.q.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.self_attn.k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.self_attn.k.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.self_attn.v.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.self_attn.v.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.self_attn.o.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.self_attn.o.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.self_attn.norm_q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.self_attn.norm_k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.norm3.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.norm3.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.cross_attn.q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.cross_attn.q.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.cross_attn.k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.cross_attn.k.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.cross_attn.v.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.cross_attn.v.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.cross_attn.o.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.cross_attn.o.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.cross_attn.norm_q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.cross_attn.norm_k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.ffn.0.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.ffn.0.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.ffn.2.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.0.ffn.2.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.modulation": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.self_attn.q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.self_attn.q.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.self_attn.k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.self_attn.k.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.self_attn.v.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.self_attn.v.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.self_attn.o.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.self_attn.o.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.self_attn.norm_q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.self_attn.norm_k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.norm3.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.norm3.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.cross_attn.q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.cross_attn.q.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.cross_attn.k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.cross_attn.k.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.cross_attn.v.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.cross_attn.v.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.cross_attn.o.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.cross_attn.o.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.cross_attn.norm_q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.cross_attn.norm_k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.ffn.0.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.ffn.0.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.ffn.2.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.1.ffn.2.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.modulation": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.self_attn.q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.self_attn.q.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.self_attn.k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.self_attn.k.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.self_attn.v.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.self_attn.v.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.self_attn.o.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.self_attn.o.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.self_attn.norm_q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.self_attn.norm_k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.norm3.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.norm3.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.cross_attn.q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.cross_attn.q.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.cross_attn.k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.cross_attn.k.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.cross_attn.v.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.cross_attn.v.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.cross_attn.o.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.cross_attn.o.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.cross_attn.norm_q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.cross_attn.norm_k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.ffn.0.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.ffn.0.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.ffn.2.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.2.ffn.2.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.modulation": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.self_attn.q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.self_attn.q.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.self_attn.k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.self_attn.k.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.self_attn.v.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.self_attn.v.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.self_attn.o.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.self_attn.o.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.self_attn.norm_q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.self_attn.norm_k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.norm3.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.norm3.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.cross_attn.q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.cross_attn.q.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.cross_attn.k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.cross_attn.k.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.cross_attn.v.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.cross_attn.v.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.cross_attn.o.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.cross_attn.o.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.cross_attn.norm_q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.cross_attn.norm_k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.ffn.0.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.ffn.0.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.ffn.2.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.3.ffn.2.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.modulation": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.self_attn.q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.self_attn.q.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.self_attn.k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.self_attn.k.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.self_attn.v.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.self_attn.v.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.self_attn.o.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.self_attn.o.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.self_attn.norm_q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.self_attn.norm_k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.norm3.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.norm3.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.cross_attn.q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.cross_attn.q.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.cross_attn.k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.cross_attn.k.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.cross_attn.v.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.cross_attn.v.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.cross_attn.o.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.cross_attn.o.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.cross_attn.norm_q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.cross_attn.norm_k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.ffn.0.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.ffn.0.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.ffn.2.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.4.ffn.2.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.modulation": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.self_attn.q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.self_attn.q.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.self_attn.k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.self_attn.k.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.self_attn.v.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.self_attn.v.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.self_attn.o.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.self_attn.o.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.self_attn.norm_q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.self_attn.norm_k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.norm3.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.norm3.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.cross_attn.q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.cross_attn.q.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.cross_attn.k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.cross_attn.k.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.cross_attn.v.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.cross_attn.v.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.cross_attn.o.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.cross_attn.o.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.cross_attn.norm_q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.cross_attn.norm_k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.ffn.0.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.ffn.0.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.ffn.2.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.5.ffn.2.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.6.modulation": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.6.self_attn.q.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.6.self_attn.q.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.6.self_attn.k.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.6.self_attn.k.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.6.self_attn.v.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.6.self_attn.v.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.6.self_attn.o.weight": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.6.self_attn.o.bias": "Phantom_Wan_14B-00001-of-00006.safetensors",
+ "blocks.6.self_attn.norm_q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.6.self_attn.norm_k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.6.norm3.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.6.norm3.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.6.cross_attn.q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.6.cross_attn.q.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.6.cross_attn.k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.6.cross_attn.k.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.6.cross_attn.v.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.6.cross_attn.v.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.6.cross_attn.o.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.6.cross_attn.o.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.6.cross_attn.norm_q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.6.cross_attn.norm_k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.6.ffn.0.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.6.ffn.0.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.6.ffn.2.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.6.ffn.2.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.modulation": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.self_attn.q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.self_attn.q.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.self_attn.k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.self_attn.k.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.self_attn.v.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.self_attn.v.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.self_attn.o.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.self_attn.o.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.self_attn.norm_q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.self_attn.norm_k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.norm3.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.norm3.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.cross_attn.q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.cross_attn.q.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.cross_attn.k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.cross_attn.k.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.cross_attn.v.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.cross_attn.v.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.cross_attn.o.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.cross_attn.o.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.cross_attn.norm_q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.cross_attn.norm_k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.ffn.0.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.ffn.0.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.ffn.2.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.7.ffn.2.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.modulation": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.self_attn.q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.self_attn.q.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.self_attn.k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.self_attn.k.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.self_attn.v.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.self_attn.v.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.self_attn.o.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.self_attn.o.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.self_attn.norm_q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.self_attn.norm_k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.norm3.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.norm3.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.cross_attn.q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.cross_attn.q.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.cross_attn.k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.cross_attn.k.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.cross_attn.v.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.cross_attn.v.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.cross_attn.o.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.cross_attn.o.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.cross_attn.norm_q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.cross_attn.norm_k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.ffn.0.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.ffn.0.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.ffn.2.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.8.ffn.2.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.modulation": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.self_attn.q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.self_attn.q.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.self_attn.k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.self_attn.k.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.self_attn.v.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.self_attn.v.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.self_attn.o.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.self_attn.o.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.self_attn.norm_q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.self_attn.norm_k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.norm3.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.norm3.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.cross_attn.q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.cross_attn.q.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.cross_attn.k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.cross_attn.k.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.cross_attn.v.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.cross_attn.v.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.cross_attn.o.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.cross_attn.o.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.cross_attn.norm_q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.cross_attn.norm_k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.ffn.0.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.ffn.0.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.ffn.2.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.9.ffn.2.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.modulation": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.self_attn.q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.self_attn.q.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.self_attn.k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.self_attn.k.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.self_attn.v.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.self_attn.v.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.self_attn.o.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.self_attn.o.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.self_attn.norm_q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.self_attn.norm_k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.norm3.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.norm3.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.cross_attn.q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.cross_attn.q.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.cross_attn.k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.cross_attn.k.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.cross_attn.v.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.cross_attn.v.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.cross_attn.o.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.cross_attn.o.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.cross_attn.norm_q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.cross_attn.norm_k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.ffn.0.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.ffn.0.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.ffn.2.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.10.ffn.2.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.modulation": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.self_attn.q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.self_attn.q.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.self_attn.k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.self_attn.k.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.self_attn.v.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.self_attn.v.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.self_attn.o.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.self_attn.o.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.self_attn.norm_q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.self_attn.norm_k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.norm3.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.norm3.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.cross_attn.q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.cross_attn.q.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.cross_attn.k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.cross_attn.k.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.cross_attn.v.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.cross_attn.v.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.cross_attn.o.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.cross_attn.o.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.cross_attn.norm_q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.cross_attn.norm_k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.ffn.0.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.ffn.0.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.ffn.2.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.11.ffn.2.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.12.modulation": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.12.self_attn.q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.12.self_attn.q.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
+ "blocks.12.self_attn.k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
346
+ "blocks.12.self_attn.k.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
347
+ "blocks.12.self_attn.v.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
348
+ "blocks.12.self_attn.v.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
349
+ "blocks.12.self_attn.o.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
350
+ "blocks.12.self_attn.o.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
351
+ "blocks.12.self_attn.norm_q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
352
+ "blocks.12.self_attn.norm_k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
353
+ "blocks.12.norm3.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
354
+ "blocks.12.norm3.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
355
+ "blocks.12.cross_attn.q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
356
+ "blocks.12.cross_attn.q.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
357
+ "blocks.12.cross_attn.k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
358
+ "blocks.12.cross_attn.k.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
359
+ "blocks.12.cross_attn.v.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
360
+ "blocks.12.cross_attn.v.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
361
+ "blocks.12.cross_attn.o.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
362
+ "blocks.12.cross_attn.o.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
363
+ "blocks.12.cross_attn.norm_q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
364
+ "blocks.12.cross_attn.norm_k.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
365
+ "blocks.12.ffn.0.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
366
+ "blocks.12.ffn.0.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
367
+ "blocks.12.ffn.2.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
368
+ "blocks.12.ffn.2.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
369
+ "blocks.13.modulation": "Phantom_Wan_14B-00002-of-00006.safetensors",
370
+ "blocks.13.self_attn.q.weight": "Phantom_Wan_14B-00002-of-00006.safetensors",
371
+ "blocks.13.self_attn.q.bias": "Phantom_Wan_14B-00002-of-00006.safetensors",
372
+ "blocks.13.self_attn.k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
373
+ "blocks.13.self_attn.k.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
374
+ "blocks.13.self_attn.v.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
375
+ "blocks.13.self_attn.v.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
376
+ "blocks.13.self_attn.o.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
377
+ "blocks.13.self_attn.o.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
378
+ "blocks.13.self_attn.norm_q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
379
+ "blocks.13.self_attn.norm_k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
380
+ "blocks.13.norm3.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
381
+ "blocks.13.norm3.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
382
+ "blocks.13.cross_attn.q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
383
+ "blocks.13.cross_attn.q.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
384
+ "blocks.13.cross_attn.k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
385
+ "blocks.13.cross_attn.k.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
386
+ "blocks.13.cross_attn.v.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
387
+ "blocks.13.cross_attn.v.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
388
+ "blocks.13.cross_attn.o.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
389
+ "blocks.13.cross_attn.o.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
390
+ "blocks.13.cross_attn.norm_q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
391
+ "blocks.13.cross_attn.norm_k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
392
+ "blocks.13.ffn.0.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
393
+ "blocks.13.ffn.0.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
394
+ "blocks.13.ffn.2.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
395
+ "blocks.13.ffn.2.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
396
+ "blocks.14.modulation": "Phantom_Wan_14B-00003-of-00006.safetensors",
397
+ "blocks.14.self_attn.q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
398
+ "blocks.14.self_attn.q.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
399
+ "blocks.14.self_attn.k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
400
+ "blocks.14.self_attn.k.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
401
+ "blocks.14.self_attn.v.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
402
+ "blocks.14.self_attn.v.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
403
+ "blocks.14.self_attn.o.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
404
+ "blocks.14.self_attn.o.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
405
+ "blocks.14.self_attn.norm_q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
406
+ "blocks.14.self_attn.norm_k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
407
+ "blocks.14.norm3.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
408
+ "blocks.14.norm3.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
409
+ "blocks.14.cross_attn.q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
410
+ "blocks.14.cross_attn.q.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
411
+ "blocks.14.cross_attn.k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
412
+ "blocks.14.cross_attn.k.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
413
+ "blocks.14.cross_attn.v.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
414
+ "blocks.14.cross_attn.v.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
415
+ "blocks.14.cross_attn.o.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
416
+ "blocks.14.cross_attn.o.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
417
+ "blocks.14.cross_attn.norm_q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
418
+ "blocks.14.cross_attn.norm_k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
419
+ "blocks.14.ffn.0.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
420
+ "blocks.14.ffn.0.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
421
+ "blocks.14.ffn.2.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
422
+ "blocks.14.ffn.2.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
423
+ "blocks.15.modulation": "Phantom_Wan_14B-00003-of-00006.safetensors",
424
+ "blocks.15.self_attn.q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
425
+ "blocks.15.self_attn.q.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
426
+ "blocks.15.self_attn.k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
427
+ "blocks.15.self_attn.k.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
428
+ "blocks.15.self_attn.v.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
429
+ "blocks.15.self_attn.v.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
430
+ "blocks.15.self_attn.o.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
431
+ "blocks.15.self_attn.o.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
432
+ "blocks.15.self_attn.norm_q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
433
+ "blocks.15.self_attn.norm_k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
434
+ "blocks.15.norm3.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
435
+ "blocks.15.norm3.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
436
+ "blocks.15.cross_attn.q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
437
+ "blocks.15.cross_attn.q.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
438
+ "blocks.15.cross_attn.k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
439
+ "blocks.15.cross_attn.k.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
440
+ "blocks.15.cross_attn.v.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
441
+ "blocks.15.cross_attn.v.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
442
+ "blocks.15.cross_attn.o.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
443
+ "blocks.15.cross_attn.o.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
444
+ "blocks.15.cross_attn.norm_q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
445
+ "blocks.15.cross_attn.norm_k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
446
+ "blocks.15.ffn.0.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
447
+ "blocks.15.ffn.0.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
448
+ "blocks.15.ffn.2.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
449
+ "blocks.15.ffn.2.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
450
+ "blocks.16.modulation": "Phantom_Wan_14B-00003-of-00006.safetensors",
451
+ "blocks.16.self_attn.q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
452
+ "blocks.16.self_attn.q.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
453
+ "blocks.16.self_attn.k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
454
+ "blocks.16.self_attn.k.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
455
+ "blocks.16.self_attn.v.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
456
+ "blocks.16.self_attn.v.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
457
+ "blocks.16.self_attn.o.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
458
+ "blocks.16.self_attn.o.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
459
+ "blocks.16.self_attn.norm_q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
460
+ "blocks.16.self_attn.norm_k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
461
+ "blocks.16.norm3.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
462
+ "blocks.16.norm3.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
463
+ "blocks.16.cross_attn.q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
464
+ "blocks.16.cross_attn.q.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
465
+ "blocks.16.cross_attn.k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
466
+ "blocks.16.cross_attn.k.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
467
+ "blocks.16.cross_attn.v.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
468
+ "blocks.16.cross_attn.v.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
469
+ "blocks.16.cross_attn.o.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
470
+ "blocks.16.cross_attn.o.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
471
+ "blocks.16.cross_attn.norm_q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
472
+ "blocks.16.cross_attn.norm_k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
473
+ "blocks.16.ffn.0.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
474
+ "blocks.16.ffn.0.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
475
+ "blocks.16.ffn.2.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
476
+ "blocks.16.ffn.2.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
477
+ "blocks.17.modulation": "Phantom_Wan_14B-00003-of-00006.safetensors",
478
+ "blocks.17.self_attn.q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
479
+ "blocks.17.self_attn.q.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
480
+ "blocks.17.self_attn.k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
481
+ "blocks.17.self_attn.k.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
482
+ "blocks.17.self_attn.v.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
483
+ "blocks.17.self_attn.v.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
484
+ "blocks.17.self_attn.o.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
485
+ "blocks.17.self_attn.o.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
486
+ "blocks.17.self_attn.norm_q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
487
+ "blocks.17.self_attn.norm_k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
488
+ "blocks.17.norm3.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
489
+ "blocks.17.norm3.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
490
+ "blocks.17.cross_attn.q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
491
+ "blocks.17.cross_attn.q.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
492
+ "blocks.17.cross_attn.k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
493
+ "blocks.17.cross_attn.k.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
494
+ "blocks.17.cross_attn.v.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
495
+ "blocks.17.cross_attn.v.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
496
+ "blocks.17.cross_attn.o.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
497
+ "blocks.17.cross_attn.o.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
498
+ "blocks.17.cross_attn.norm_q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
499
+ "blocks.17.cross_attn.norm_k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
500
+ "blocks.17.ffn.0.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
501
+ "blocks.17.ffn.0.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
502
+ "blocks.17.ffn.2.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
503
+ "blocks.17.ffn.2.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
504
+ "blocks.18.modulation": "Phantom_Wan_14B-00003-of-00006.safetensors",
505
+ "blocks.18.self_attn.q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
506
+ "blocks.18.self_attn.q.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
507
+ "blocks.18.self_attn.k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
508
+ "blocks.18.self_attn.k.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
509
+ "blocks.18.self_attn.v.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
510
+ "blocks.18.self_attn.v.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
511
+ "blocks.18.self_attn.o.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
512
+ "blocks.18.self_attn.o.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
513
+ "blocks.18.self_attn.norm_q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
514
+ "blocks.18.self_attn.norm_k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
515
+ "blocks.18.norm3.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
516
+ "blocks.18.norm3.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
517
+ "blocks.18.cross_attn.q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
518
+ "blocks.18.cross_attn.q.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
519
+ "blocks.18.cross_attn.k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
520
+ "blocks.18.cross_attn.k.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
521
+ "blocks.18.cross_attn.v.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
522
+ "blocks.18.cross_attn.v.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
523
+ "blocks.18.cross_attn.o.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
524
+ "blocks.18.cross_attn.o.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
525
+ "blocks.18.cross_attn.norm_q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
526
+ "blocks.18.cross_attn.norm_k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
527
+ "blocks.18.ffn.0.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
528
+ "blocks.18.ffn.0.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
529
+ "blocks.18.ffn.2.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
530
+ "blocks.18.ffn.2.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
531
+ "blocks.19.modulation": "Phantom_Wan_14B-00003-of-00006.safetensors",
532
+ "blocks.19.self_attn.q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
533
+ "blocks.19.self_attn.q.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
534
+ "blocks.19.self_attn.k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
535
+ "blocks.19.self_attn.k.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
536
+ "blocks.19.self_attn.v.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
537
+ "blocks.19.self_attn.v.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
538
+ "blocks.19.self_attn.o.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
539
+ "blocks.19.self_attn.o.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
540
+ "blocks.19.self_attn.norm_q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
541
+ "blocks.19.self_attn.norm_k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
542
+ "blocks.19.norm3.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
543
+ "blocks.19.norm3.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
544
+ "blocks.19.cross_attn.q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
545
+ "blocks.19.cross_attn.q.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
546
+ "blocks.19.cross_attn.k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
547
+ "blocks.19.cross_attn.k.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
548
+ "blocks.19.cross_attn.v.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
549
+ "blocks.19.cross_attn.v.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
550
+ "blocks.19.cross_attn.o.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
551
+ "blocks.19.cross_attn.o.bias": "Phantom_Wan_14B-00003-of-00006.safetensors",
552
+ "blocks.19.cross_attn.norm_q.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
553
+ "blocks.19.cross_attn.norm_k.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
554
+ "blocks.19.ffn.0.weight": "Phantom_Wan_14B-00003-of-00006.safetensors",
555
+ "blocks.19.ffn.0.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
556
+ "blocks.19.ffn.2.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
557
+ "blocks.19.ffn.2.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
558
+ "blocks.20.modulation": "Phantom_Wan_14B-00004-of-00006.safetensors",
559
+ "blocks.20.self_attn.q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
560
+ "blocks.20.self_attn.q.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
561
+ "blocks.20.self_attn.k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
562
+ "blocks.20.self_attn.k.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
563
+ "blocks.20.self_attn.v.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
564
+ "blocks.20.self_attn.v.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
565
+ "blocks.20.self_attn.o.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
566
+ "blocks.20.self_attn.o.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
567
+ "blocks.20.self_attn.norm_q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
568
+ "blocks.20.self_attn.norm_k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
569
+ "blocks.20.norm3.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
570
+ "blocks.20.norm3.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
571
+ "blocks.20.cross_attn.q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
572
+ "blocks.20.cross_attn.q.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
573
+ "blocks.20.cross_attn.k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
574
+ "blocks.20.cross_attn.k.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
575
+ "blocks.20.cross_attn.v.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
576
+ "blocks.20.cross_attn.v.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
577
+ "blocks.20.cross_attn.o.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
578
+ "blocks.20.cross_attn.o.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
579
+ "blocks.20.cross_attn.norm_q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
580
+ "blocks.20.cross_attn.norm_k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
581
+ "blocks.20.ffn.0.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
582
+ "blocks.20.ffn.0.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
583
+ "blocks.20.ffn.2.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
584
+ "blocks.20.ffn.2.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
585
+ "blocks.21.modulation": "Phantom_Wan_14B-00004-of-00006.safetensors",
586
+ "blocks.21.self_attn.q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
587
+ "blocks.21.self_attn.q.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
588
+ "blocks.21.self_attn.k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
589
+ "blocks.21.self_attn.k.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
590
+ "blocks.21.self_attn.v.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
591
+ "blocks.21.self_attn.v.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
592
+ "blocks.21.self_attn.o.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
593
+ "blocks.21.self_attn.o.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
594
+ "blocks.21.self_attn.norm_q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
595
+ "blocks.21.self_attn.norm_k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
596
+ "blocks.21.norm3.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
597
+ "blocks.21.norm3.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
598
+ "blocks.21.cross_attn.q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
599
+ "blocks.21.cross_attn.q.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
600
+ "blocks.21.cross_attn.k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
601
+ "blocks.21.cross_attn.k.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
602
+ "blocks.21.cross_attn.v.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
603
+ "blocks.21.cross_attn.v.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
604
+ "blocks.21.cross_attn.o.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
605
+ "blocks.21.cross_attn.o.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
606
+ "blocks.21.cross_attn.norm_q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
607
+ "blocks.21.cross_attn.norm_k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
608
+ "blocks.21.ffn.0.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
609
+ "blocks.21.ffn.0.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
610
+ "blocks.21.ffn.2.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
611
+ "blocks.21.ffn.2.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
612
+ "blocks.22.modulation": "Phantom_Wan_14B-00004-of-00006.safetensors",
613
+ "blocks.22.self_attn.q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
614
+ "blocks.22.self_attn.q.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
615
+ "blocks.22.self_attn.k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
616
+ "blocks.22.self_attn.k.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
617
+ "blocks.22.self_attn.v.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
618
+ "blocks.22.self_attn.v.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
619
+ "blocks.22.self_attn.o.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
620
+ "blocks.22.self_attn.o.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
621
+ "blocks.22.self_attn.norm_q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
622
+ "blocks.22.self_attn.norm_k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
623
+ "blocks.22.norm3.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
624
+ "blocks.22.norm3.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
625
+ "blocks.22.cross_attn.q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
626
+ "blocks.22.cross_attn.q.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
627
+ "blocks.22.cross_attn.k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
628
+ "blocks.22.cross_attn.k.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
629
+ "blocks.22.cross_attn.v.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
630
+ "blocks.22.cross_attn.v.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
631
+ "blocks.22.cross_attn.o.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
632
+ "blocks.22.cross_attn.o.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
633
+ "blocks.22.cross_attn.norm_q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
634
+ "blocks.22.cross_attn.norm_k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
635
+ "blocks.22.ffn.0.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
636
+ "blocks.22.ffn.0.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
637
+ "blocks.22.ffn.2.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
638
+ "blocks.22.ffn.2.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
639
+ "blocks.23.modulation": "Phantom_Wan_14B-00004-of-00006.safetensors",
640
+ "blocks.23.self_attn.q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
641
+ "blocks.23.self_attn.q.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
642
+ "blocks.23.self_attn.k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
643
+ "blocks.23.self_attn.k.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
644
+ "blocks.23.self_attn.v.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
645
+ "blocks.23.self_attn.v.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
646
+ "blocks.23.self_attn.o.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
647
+ "blocks.23.self_attn.o.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
648
+ "blocks.23.self_attn.norm_q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
649
+ "blocks.23.self_attn.norm_k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
650
+ "blocks.23.norm3.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
651
+ "blocks.23.norm3.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
652
+ "blocks.23.cross_attn.q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
653
+ "blocks.23.cross_attn.q.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
654
+ "blocks.23.cross_attn.k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
655
+ "blocks.23.cross_attn.k.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
656
+ "blocks.23.cross_attn.v.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
657
+ "blocks.23.cross_attn.v.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
658
+ "blocks.23.cross_attn.o.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
659
+ "blocks.23.cross_attn.o.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
660
+ "blocks.23.cross_attn.norm_q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
661
+ "blocks.23.cross_attn.norm_k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
662
+ "blocks.23.ffn.0.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
663
+ "blocks.23.ffn.0.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
664
+ "blocks.23.ffn.2.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
665
+ "blocks.23.ffn.2.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
666
+ "blocks.24.modulation": "Phantom_Wan_14B-00004-of-00006.safetensors",
667
+ "blocks.24.self_attn.q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
668
+ "blocks.24.self_attn.q.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
669
+ "blocks.24.self_attn.k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
670
+ "blocks.24.self_attn.k.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
671
+ "blocks.24.self_attn.v.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
672
+ "blocks.24.self_attn.v.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
673
+ "blocks.24.self_attn.o.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
674
+ "blocks.24.self_attn.o.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
675
+ "blocks.24.self_attn.norm_q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
676
+ "blocks.24.self_attn.norm_k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
677
+ "blocks.24.norm3.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
678
+ "blocks.24.norm3.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
679
+ "blocks.24.cross_attn.q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
680
+ "blocks.24.cross_attn.q.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
681
+ "blocks.24.cross_attn.k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
682
+ "blocks.24.cross_attn.k.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
683
+ "blocks.24.cross_attn.v.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.24.cross_attn.v.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.24.cross_attn.o.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.24.cross_attn.o.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.24.cross_attn.norm_q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.24.cross_attn.norm_k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.24.ffn.0.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.24.ffn.0.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.24.ffn.2.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.24.ffn.2.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.modulation": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.self_attn.q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.self_attn.q.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.self_attn.k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.self_attn.k.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.self_attn.v.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.self_attn.v.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.self_attn.o.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.self_attn.o.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.self_attn.norm_q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.self_attn.norm_k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.norm3.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.norm3.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.cross_attn.q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.cross_attn.q.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.cross_attn.k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.cross_attn.k.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.cross_attn.v.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.cross_attn.v.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.cross_attn.o.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.cross_attn.o.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.cross_attn.norm_q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.cross_attn.norm_k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.ffn.0.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.ffn.0.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.ffn.2.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.25.ffn.2.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.modulation": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.self_attn.q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.self_attn.q.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.self_attn.k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.self_attn.k.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.self_attn.v.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.self_attn.v.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.self_attn.o.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.self_attn.o.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.self_attn.norm_q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.self_attn.norm_k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.norm3.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.norm3.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.cross_attn.q.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.cross_attn.q.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.cross_attn.k.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.cross_attn.k.bias": "Phantom_Wan_14B-00004-of-00006.safetensors",
+ "blocks.26.cross_attn.v.weight": "Phantom_Wan_14B-00004-of-00006.safetensors",
738
+ "blocks.26.cross_attn.v.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.26.cross_attn.o.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.26.cross_attn.o.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.26.cross_attn.norm_q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.26.cross_attn.norm_k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.26.ffn.0.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.26.ffn.0.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.26.ffn.2.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.26.ffn.2.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.modulation": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.self_attn.q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.self_attn.q.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.self_attn.k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.self_attn.k.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.self_attn.v.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.self_attn.v.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.self_attn.o.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.self_attn.o.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.self_attn.norm_q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.self_attn.norm_k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.norm3.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.norm3.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.cross_attn.q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.cross_attn.q.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.cross_attn.k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.cross_attn.k.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.cross_attn.v.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.cross_attn.v.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.cross_attn.o.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.cross_attn.o.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.cross_attn.norm_q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.cross_attn.norm_k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.ffn.0.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.ffn.0.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.ffn.2.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.27.ffn.2.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.modulation": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.self_attn.q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.self_attn.q.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.self_attn.k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.self_attn.k.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.self_attn.v.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.self_attn.v.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.self_attn.o.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.self_attn.o.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.self_attn.norm_q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.self_attn.norm_k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.norm3.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.norm3.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.cross_attn.q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.cross_attn.q.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.cross_attn.k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.cross_attn.k.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.cross_attn.v.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.cross_attn.v.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.cross_attn.o.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.cross_attn.o.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.cross_attn.norm_q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.cross_attn.norm_k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.ffn.0.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.ffn.0.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.ffn.2.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.28.ffn.2.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.modulation": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.self_attn.q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.self_attn.q.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.self_attn.k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.self_attn.k.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.self_attn.v.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.self_attn.v.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.self_attn.o.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.self_attn.o.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.self_attn.norm_q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.self_attn.norm_k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.norm3.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.norm3.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.cross_attn.q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.cross_attn.q.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.cross_attn.k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.cross_attn.k.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.cross_attn.v.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.cross_attn.v.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.cross_attn.o.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.cross_attn.o.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.cross_attn.norm_q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.cross_attn.norm_k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.ffn.0.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.ffn.0.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.ffn.2.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.29.ffn.2.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.modulation": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.self_attn.q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.self_attn.q.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.self_attn.k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.self_attn.k.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.self_attn.v.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.self_attn.v.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.self_attn.o.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.self_attn.o.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.self_attn.norm_q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.self_attn.norm_k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.norm3.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.norm3.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.cross_attn.q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.cross_attn.q.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.cross_attn.k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.cross_attn.k.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.cross_attn.v.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.cross_attn.v.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.cross_attn.o.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.cross_attn.o.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.cross_attn.norm_q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.cross_attn.norm_k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.ffn.0.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.ffn.0.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.ffn.2.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.30.ffn.2.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.modulation": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.self_attn.q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.self_attn.q.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.self_attn.k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.self_attn.k.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.self_attn.v.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.self_attn.v.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.self_attn.o.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.self_attn.o.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.self_attn.norm_q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.self_attn.norm_k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.norm3.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.norm3.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.cross_attn.q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.cross_attn.q.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.cross_attn.k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.cross_attn.k.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.cross_attn.v.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.cross_attn.v.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.cross_attn.o.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.cross_attn.o.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.cross_attn.norm_q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.cross_attn.norm_k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.ffn.0.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.ffn.0.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.ffn.2.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.31.ffn.2.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.modulation": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.self_attn.q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.self_attn.q.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.self_attn.k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.self_attn.k.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.self_attn.v.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.self_attn.v.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.self_attn.o.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.self_attn.o.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.self_attn.norm_q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.self_attn.norm_k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.norm3.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.norm3.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.cross_attn.q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.cross_attn.q.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.cross_attn.k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.cross_attn.k.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.cross_attn.v.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.cross_attn.v.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.cross_attn.o.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.cross_attn.o.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.cross_attn.norm_q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.cross_attn.norm_k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.ffn.0.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.ffn.0.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.ffn.2.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.32.ffn.2.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.33.modulation": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.33.self_attn.q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.33.self_attn.q.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.33.self_attn.k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.33.self_attn.k.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.33.self_attn.v.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.33.self_attn.v.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.33.self_attn.o.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.33.self_attn.o.bias": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.33.self_attn.norm_q.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.33.self_attn.norm_k.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
+ "blocks.33.norm3.weight": "Phantom_Wan_14B-00005-of-00006.safetensors",
921
+ "blocks.33.norm3.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
922
+ "blocks.33.cross_attn.q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
923
+ "blocks.33.cross_attn.q.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
924
+ "blocks.33.cross_attn.k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
925
+ "blocks.33.cross_attn.k.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
926
+ "blocks.33.cross_attn.v.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
927
+ "blocks.33.cross_attn.v.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
928
+ "blocks.33.cross_attn.o.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
929
+ "blocks.33.cross_attn.o.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
930
+ "blocks.33.cross_attn.norm_q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
931
+ "blocks.33.cross_attn.norm_k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
932
+ "blocks.33.ffn.0.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
933
+ "blocks.33.ffn.0.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
934
+ "blocks.33.ffn.2.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
935
+ "blocks.33.ffn.2.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
936
+ "blocks.34.modulation": "Phantom_Wan_14B-00006-of-00006.safetensors",
937
+ "blocks.34.self_attn.q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
938
+ "blocks.34.self_attn.q.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
939
+ "blocks.34.self_attn.k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
940
+ "blocks.34.self_attn.k.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
941
+ "blocks.34.self_attn.v.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
942
+ "blocks.34.self_attn.v.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
943
+ "blocks.34.self_attn.o.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
944
+ "blocks.34.self_attn.o.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
945
+ "blocks.34.self_attn.norm_q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
946
+ "blocks.34.self_attn.norm_k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
947
+ "blocks.34.norm3.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
948
+ "blocks.34.norm3.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
949
+ "blocks.34.cross_attn.q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
950
+ "blocks.34.cross_attn.q.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
951
+ "blocks.34.cross_attn.k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
952
+ "blocks.34.cross_attn.k.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
953
+ "blocks.34.cross_attn.v.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
954
+ "blocks.34.cross_attn.v.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
955
+ "blocks.34.cross_attn.o.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
956
+ "blocks.34.cross_attn.o.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
957
+ "blocks.34.cross_attn.norm_q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
958
+ "blocks.34.cross_attn.norm_k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
959
+ "blocks.34.ffn.0.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
960
+ "blocks.34.ffn.0.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
961
+ "blocks.34.ffn.2.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
962
+ "blocks.34.ffn.2.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
963
+ "blocks.35.modulation": "Phantom_Wan_14B-00006-of-00006.safetensors",
964
+ "blocks.35.self_attn.q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
965
+ "blocks.35.self_attn.q.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
966
+ "blocks.35.self_attn.k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
967
+ "blocks.35.self_attn.k.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
968
+ "blocks.35.self_attn.v.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
969
+ "blocks.35.self_attn.v.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
970
+ "blocks.35.self_attn.o.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
971
+ "blocks.35.self_attn.o.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
972
+ "blocks.35.self_attn.norm_q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
973
+ "blocks.35.self_attn.norm_k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
974
+ "blocks.35.norm3.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
975
+ "blocks.35.norm3.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
976
+ "blocks.35.cross_attn.q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
977
+ "blocks.35.cross_attn.q.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
978
+ "blocks.35.cross_attn.k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
979
+ "blocks.35.cross_attn.k.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
980
+ "blocks.35.cross_attn.v.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
981
+ "blocks.35.cross_attn.v.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
982
+ "blocks.35.cross_attn.o.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
983
+ "blocks.35.cross_attn.o.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
984
+ "blocks.35.cross_attn.norm_q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
985
+ "blocks.35.cross_attn.norm_k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
986
+ "blocks.35.ffn.0.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
987
+ "blocks.35.ffn.0.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
988
+ "blocks.35.ffn.2.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
989
+ "blocks.35.ffn.2.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
990
+ "blocks.36.modulation": "Phantom_Wan_14B-00006-of-00006.safetensors",
991
+ "blocks.36.self_attn.q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
992
+ "blocks.36.self_attn.q.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
993
+ "blocks.36.self_attn.k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
994
+ "blocks.36.self_attn.k.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
995
+ "blocks.36.self_attn.v.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
996
+ "blocks.36.self_attn.v.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
997
+ "blocks.36.self_attn.o.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
998
+ "blocks.36.self_attn.o.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
999
+ "blocks.36.self_attn.norm_q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1000
+ "blocks.36.self_attn.norm_k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1001
+ "blocks.36.norm3.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1002
+ "blocks.36.norm3.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1003
+ "blocks.36.cross_attn.q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1004
+ "blocks.36.cross_attn.q.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1005
+ "blocks.36.cross_attn.k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1006
+ "blocks.36.cross_attn.k.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1007
+ "blocks.36.cross_attn.v.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1008
+ "blocks.36.cross_attn.v.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1009
+ "blocks.36.cross_attn.o.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1010
+ "blocks.36.cross_attn.o.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1011
+ "blocks.36.cross_attn.norm_q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1012
+ "blocks.36.cross_attn.norm_k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1013
+ "blocks.36.ffn.0.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1014
+ "blocks.36.ffn.0.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1015
+ "blocks.36.ffn.2.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1016
+ "blocks.36.ffn.2.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1017
+ "blocks.37.modulation": "Phantom_Wan_14B-00006-of-00006.safetensors",
1018
+ "blocks.37.self_attn.q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1019
+ "blocks.37.self_attn.q.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1020
+ "blocks.37.self_attn.k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1021
+ "blocks.37.self_attn.k.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1022
+ "blocks.37.self_attn.v.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1023
+ "blocks.37.self_attn.v.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1024
+ "blocks.37.self_attn.o.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1025
+ "blocks.37.self_attn.o.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1026
+ "blocks.37.self_attn.norm_q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1027
+ "blocks.37.self_attn.norm_k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1028
+ "blocks.37.norm3.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1029
+ "blocks.37.norm3.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1030
+ "blocks.37.cross_attn.q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1031
+ "blocks.37.cross_attn.q.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1032
+ "blocks.37.cross_attn.k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1033
+ "blocks.37.cross_attn.k.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1034
+ "blocks.37.cross_attn.v.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1035
+ "blocks.37.cross_attn.v.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1036
+ "blocks.37.cross_attn.o.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1037
+ "blocks.37.cross_attn.o.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1038
+ "blocks.37.cross_attn.norm_q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1039
+ "blocks.37.cross_attn.norm_k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1040
+ "blocks.37.ffn.0.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1041
+ "blocks.37.ffn.0.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1042
+ "blocks.37.ffn.2.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1043
+ "blocks.37.ffn.2.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1044
+ "blocks.38.modulation": "Phantom_Wan_14B-00006-of-00006.safetensors",
1045
+ "blocks.38.self_attn.q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1046
+ "blocks.38.self_attn.q.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1047
+ "blocks.38.self_attn.k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1048
+ "blocks.38.self_attn.k.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1049
+ "blocks.38.self_attn.v.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1050
+ "blocks.38.self_attn.v.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1051
+ "blocks.38.self_attn.o.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1052
+ "blocks.38.self_attn.o.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1053
+ "blocks.38.self_attn.norm_q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1054
+ "blocks.38.self_attn.norm_k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1055
+ "blocks.38.norm3.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1056
+ "blocks.38.norm3.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1057
+ "blocks.38.cross_attn.q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1058
+ "blocks.38.cross_attn.q.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1059
+ "blocks.38.cross_attn.k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1060
+ "blocks.38.cross_attn.k.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1061
+ "blocks.38.cross_attn.v.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1062
+ "blocks.38.cross_attn.v.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1063
+ "blocks.38.cross_attn.o.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1064
+ "blocks.38.cross_attn.o.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1065
+ "blocks.38.cross_attn.norm_q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1066
+ "blocks.38.cross_attn.norm_k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1067
+ "blocks.38.ffn.0.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1068
+ "blocks.38.ffn.0.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1069
+ "blocks.38.ffn.2.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1070
+ "blocks.38.ffn.2.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1071
+ "blocks.39.modulation": "Phantom_Wan_14B-00006-of-00006.safetensors",
1072
+ "blocks.39.self_attn.q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1073
+ "blocks.39.self_attn.q.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1074
+ "blocks.39.self_attn.k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1075
+ "blocks.39.self_attn.k.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1076
+ "blocks.39.self_attn.v.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1077
+ "blocks.39.self_attn.v.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1078
+ "blocks.39.self_attn.o.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1079
+ "blocks.39.self_attn.o.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1080
+ "blocks.39.self_attn.norm_q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1081
+ "blocks.39.self_attn.norm_k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1082
+ "blocks.39.norm3.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1083
+ "blocks.39.norm3.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1084
+ "blocks.39.cross_attn.q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1085
+ "blocks.39.cross_attn.q.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1086
+ "blocks.39.cross_attn.k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1087
+ "blocks.39.cross_attn.k.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1088
+ "blocks.39.cross_attn.v.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1089
+ "blocks.39.cross_attn.v.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1090
+ "blocks.39.cross_attn.o.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1091
+ "blocks.39.cross_attn.o.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1092
+ "blocks.39.cross_attn.norm_q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1093
+ "blocks.39.cross_attn.norm_k.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1094
+ "blocks.39.ffn.0.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1095
+ "blocks.39.ffn.0.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1096
+ "blocks.39.ffn.2.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1097
+ "blocks.39.ffn.2.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
1098
+ "head.modulation": "Phantom_Wan_14B-00006-of-00006.safetensors",
1099
+ "head.head.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
1100
+ "head.head.bias": "Phantom_Wan_14B-00006-of-00006.safetensors"
1101
+ }
1102
+ }
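The index file shown above maps each parameter name to the safetensors shard that stores it. A minimal, stdlib-only sketch of reading such a `weight_map` and grouping parameters by shard (the toy index excerpt and the helper name are illustrative, not part of the release):

```python
import json
from collections import defaultdict

# Toy excerpt of a sharded-checkpoint index, mirroring the structure above.
index_json = """
{
  "weight_map": {
    "blocks.39.self_attn.q.weight": "Phantom_Wan_14B-00006-of-00006.safetensors",
    "blocks.39.self_attn.q.bias": "Phantom_Wan_14B-00006-of-00006.safetensors",
    "head.head.weight": "Phantom_Wan_14B-00006-of-00006.safetensors"
  }
}
"""

def params_by_shard(index_text: str) -> dict:
    """Group parameter names by the shard file that holds them."""
    weight_map = json.loads(index_text)["weight_map"]
    shards = defaultdict(list)
    for name, shard in weight_map.items():
        shards[shard].append(name)
    return dict(shards)

shards = params_by_shard(index_json)
print(shards["Phantom_Wan_14B-00006-of-00006.safetensors"])
# → ['blocks.39.self_attn.q.weight', 'blocks.39.self_attn.q.bias', 'head.head.weight']
```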
README.md ADDED
---
license: apache-2.0
pipeline_tag: image-to-video
library_name: phantom
---

<h3 align="center">
Phantom: Subject-Consistent Video Generation via Cross-Modal Alignment
</h3>

<div style="display:flex;justify-content: center">
<a href="https://arxiv.org/abs/2502.11079"><img alt="Build" src="https://img.shields.io/badge/arXiv%20paper-2502.11079-b31b1b.svg"></a>
<a href="https://phantom-video.github.io/Phantom/"><img alt="Build" src="https://img.shields.io/badge/Project_page-More_visualizations-green"></a>
<a href="https://github.com/Phantom-video/Phantom"><img src="https://img.shields.io/static/v1?label=GitHub&message=Code&color=green&logo=github"></a>
</div>

><p align="center"> <span style="color:#137cf3; font-family: Gill Sans">Lijie Liu</span><sup>*</sup>, <span style="color:#137cf3; font-family: Gill Sans">Tianxiang Ma</span><sup>*</sup>, <span style="color:#137cf3; font-family: Gill Sans">Bingchuan Li</span><sup>*</sup>, <span style="color:#137cf3; font-family: Gill Sans">Zhuowei Chen</span><sup>*</sup>, <span style="color:#137cf3; font-family: Gill Sans">Jiawei Liu</span>, <span style="color:#137cf3; font-family: Gill Sans">Gen Li</span>, <span style="color:#137cf3; font-family: Gill Sans">Siyu Zhou</span>, <span style="color:#137cf3; font-family: Gill Sans">Qian He</span>, <span style="color:#137cf3; font-family: Gill Sans">Xinglong Wu</span></p> <br>
><span style="font-size: 16px"><sup>*</sup>Equal contribution, <sup>&dagger;</sup>Project lead</span> <br>
><span style="font-size: 16px">Intelligent Creation Team, ByteDance</span>

<p align="center">
  <img src="./assets/teaser.png" width=95%>
</p>

## 🔥 Latest News!
* May 27, 2025: 🎉 We have released Phantom-Wan-14B, a more powerful subject-to-video generation model.
* Apr 23, 2025: 😊 Thanks to [ComfyUI-WanVideoWrapper](https://github.com/kijai/ComfyUI-WanVideoWrapper/tree/dev) for bringing Phantom-Wan-1.3B to ComfyUI. Everyone is welcome to use it!
* Apr 21, 2025: 👋 Phantom-Wan is here! We adapted the Phantom framework to the [Wan2.1](https://github.com/Wan-Video/Wan2.1) video generation model. The inference code and checkpoint have been released.
* Apr 10, 2025: We have updated the [full version](https://arxiv.org/pdf/2502.11079v2) of the Phantom paper, which now includes more detailed descriptions of the model architecture and the dataset pipeline.
* Feb 16, 2025: We proposed **Phantom**, a novel subject-consistent video generation model, and released the [report](https://arxiv.org/pdf/2502.11079v1) publicly. For more video demos, please visit the [project page](https://phantom-video.github.io/Phantom/).


## 📑 Todo List
- [x] Inference code and checkpoint of Phantom-Wan-1.3B
- [x] Checkpoint of Phantom-Wan-14B
- [ ] Checkpoint of Phantom-Wan-14B Pro
- [ ] Open-source release of Phantom-Data
- [ ] Training code of Phantom-Wan

## 📖 Overview
Phantom is a unified video generation framework for single- and multi-subject references, built on existing text-to-video and image-to-video architectures. It achieves cross-modal alignment by training on text-image-video triplet data with a redesigned joint text-image injection model. It also emphasizes subject consistency in human generation, enabling enhanced ID-preserving video generation.

## ⚡️ Quickstart

### Installation
Clone the repo:
```sh
git clone https://github.com/Phantom-video/Phantom.git
cd Phantom
```

Install dependencies:
```sh
# Ensure torch >= 2.4.0
pip install -r requirements.txt
```

### Model Download
| Models | Download Link | Notes |
|------------------|---------------|-------|
| Phantom-Wan-1.3B | 🤗 [Huggingface](https://huggingface.co/bytedance-research/Phantom/blob/main/Phantom-Wan-1.3B.pth) | Supports both 480P and 720P |
| Phantom-Wan-14B | 🤗 [Huggingface](https://huggingface.co/bytedance-research/Phantom/tree/main) | Supports both 480P and 720P |

First, download the original Wan2.1-1.3B model, since Phantom-Wan relies on the Wan2.1 VAE and text encoder. Download Wan2.1-1.3B using huggingface-cli:
```sh
pip install "huggingface_hub[cli]"
huggingface-cli download Wan-AI/Wan2.1-T2V-1.3B --local-dir ./Wan2.1-T2V-1.3B
```

Then download the Phantom-Wan-1.3B and Phantom-Wan-14B models:
```sh
huggingface-cli download bytedance-research/Phantom --local-dir ./Phantom-Wan-Models
```
Alternatively, you can manually download the required models and place them in the `Phantom-Wan-Models` folder.
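Before running inference, it can help to verify that the downloads actually landed where the commands below expect them. A minimal stdlib sketch (the helper name and the file list are illustrative, not part of the repo):

```python
from pathlib import Path

def missing_files(root: str, names: list) -> list:
    """Return the expected files that are not present under `root`."""
    base = Path(root)
    return [n for n in names if not (base / n).exists()]

# Illustrative check; adjust the paths to match your local layout.
expected = ["Phantom-Wan-1.3B.pth"]
print(missing_files("./Phantom-Wan-Models", expected))
```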

### Run Subject-to-Video Generation

#### Phantom-Wan-1.3B

- Single-GPU inference

```sh
python generate.py --task s2v-1.3B --size 832*480 --ckpt_dir ./Wan2.1-T2V-1.3B --phantom_ckpt ./Phantom-Wan-Models/Phantom-Wan-1.3B.pth --ref_image "examples/ref1.png,examples/ref2.png" --prompt "暖阳漫过草地,扎着双马尾、头戴绿色蝴蝶结、身穿浅绿色连衣裙的小女孩蹲在盛开的雏菊旁。她身旁一只棕白相间的狗狗吐着舌头,毛茸茸尾巴欢快摇晃。小女孩笑着举起黄红配色、带有蓝色按钮的玩具相机,将和狗狗的欢乐瞬间定格。" --base_seed 42
```

- Multi-GPU inference using FSDP + xDiT USP

```sh
pip install "xfuser>=0.4.1"
torchrun --nproc_per_node=8 generate.py --task s2v-1.3B --size 832*480 --ckpt_dir ./Wan2.1-T2V-1.3B --phantom_ckpt ./Phantom-Wan-Models/Phantom-Wan-1.3B.pth --ref_image "examples/ref3.png,examples/ref4.png" --dit_fsdp --t5_fsdp --ulysses_size 4 --ring_size 2 --prompt "夕阳下,一位有着小麦色肌肤、留着乌黑长发的女人穿上有着大朵立体花朵装饰、肩袖处带有飘逸纱带的红色纱裙,漫步在金色的海滩上,海风轻拂她的长发,画面唯美动人。" --base_seed 42
```

> 💡 Note:
> * Changing `--ref_image` switches between single-reference and multi-reference subject-to-video generation. Use at most four reference images.
> * For the best results, describe the visual content of the reference images as accurately as possible in `--prompt`. For example, "examples/ref1.png" can be described as "a toy camera in yellow and red with blue buttons".
> * If the generated video is unsatisfactory, the most straightforward remedy is to change `--base_seed` or revise the description in `--prompt`.
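The `--ref_image` flag takes a comma-separated list of paths. A minimal sketch of the parsing and the four-image limit described above (the helper name is hypothetical, not taken from `generate.py`):

```python
def parse_ref_images(arg: str, max_refs: int = 4) -> list:
    """Split a comma-separated --ref_image argument and enforce the limit."""
    paths = [p.strip() for p in arg.split(",") if p.strip()]
    if not 1 <= len(paths) <= max_refs:
        raise ValueError(f"expected 1-{max_refs} reference images, got {len(paths)}")
    return paths

print(parse_ref_images("examples/ref1.png,examples/ref2.png"))
# → ['examples/ref1.png', 'examples/ref2.png']
```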

For inference examples, please refer to `infer.sh`. You will get the following generated results:

<table>
<tr>
<td><img src="./assets/result1.gif" alt="GIF 1" width="400"></td>
<td><img src="./assets/result2.gif" alt="GIF 2" width="400"></td>
</tr>
<tr>
<td><img src="./assets/result3.gif" alt="GIF 3" width="400"></td>
<td><img src="./assets/result4.gif" alt="GIF 4" width="400"></td>
</tr>
</table>


#### Phantom-Wan-14B

- Single-GPU inference

```sh
python generate.py --task s2v-14B --size 832*480 --frame_num 121 --sample_fps 24 --ckpt_dir ./Wan2.1-T2V-1.3B --phantom_ckpt ./Phantom-Wan-Models --ref_image "examples/ref12.png,examples/ref13.png" --prompt "扎着双丸子头,身着红黑配色并带有火焰纹饰服饰,颈戴金项圈、臂缠金护腕的哪吒,和有着一头淡蓝色头发,额间有蓝色印记,身着一袭白色长袍的敖丙,并肩坐在教室的座位上,他们专注地讨论着书本内容。背景为柔和的灯光和窗外微风拂过的树叶,营造出安静又充满活力的学习氛围。"
```

- Multi-GPU inference using FSDP + xDiT USP

```sh
pip install "xfuser>=0.4.1"
torchrun --nproc_per_node=8 generate.py --task s2v-14B --size 832*480 --frame_num 121 --sample_fps 24 --ckpt_dir ./Wan2.1-T2V-1.3B --phantom_ckpt ./Phantom-Wan-Models --ref_image "examples/ref14.png,examples/ref15.png,examples/ref16.png" --dit_fsdp --t5_fsdp --ulysses_size 8 --ring_size 1 --prompt "一位戴着黄色帽子、身穿黄色上衣配棕色背带的卡通老爷爷,在装饰有粉色和蓝色桌椅、悬挂着彩色吊灯且摆满彩色圆球装饰的清新卡通风格咖啡馆里,端起一只蓝色且冒着热气的咖啡杯,画面风格卡通、清新。"
```

> 💡 Note:
> * The currently released Phantom-Wan-14B model was trained on 480P data, but it can also generate videos at 720P and higher resolutions, though the results may be less stable. We plan to release a version further trained on 720P data in the future.
> * The Phantom-Wan-14B model was trained on 24 fps data, but it can also generate 16 fps videos, similar to the native Wan2.1, with a slight decline in quality.

For more inference examples, please refer to `infer.sh`. You will get the following generated results:

<table>
<tr>
<td><img src="./assets/result5.gif" alt="GIF 1" width="400"></td>
</tr>
<tr>
<td><img src="./assets/result7.gif" alt="GIF 2" width="400"></td>
</tr>
<tr>
<td><img src="./assets/result6.gif" alt="GIF 3" width="400"></td>
</tr>
</table>

> The GIF videos are compressed.


## Acknowledgements
We would like to express our gratitude to the SEED team for their support. Special thanks to Lu Jiang, Haoyuan Guo, Zhibei Ma, and Sen Wang for their assistance with the model and data. We are also very grateful to Siying Chen, Qingyang Li, and Wei Han for their help with the evaluation.

## ⭐ Citation

If Phantom is helpful, please help to ⭐ our [repo](https://github.com/Phantom-video/Phantom).

If you find this project useful for your research, please consider citing our [paper](https://arxiv.org/abs/2502.11079).

### BibTeX
```bibtex
@article{liu2025phantom,
  title={Phantom: Subject-consistent video generation via cross-modal alignment},
  author={Liu, Lijie and Ma, Tianxiang and Li, Bingchuan and Chen, Zhuowei and Liu, Jiawei and Li, Gen and Zhou, Siyu and He, Qian and Wu, Xinglong},
  journal={arXiv preprint arXiv:2502.11079},
  year={2025}
}
```

## 📧 Contact
If you have any comments or questions about this open-source project, please open a new issue or contact [Tianxiang Ma](https://tianxiangma.github.io/).
The following assets were added via Git LFS:

| File | SHA256 | Pointer size | Size |
|------|--------|--------------|------|
| assets/id_eval.png | ab852373a29f26cee4088e213c526ec0ce04a65b211d11b391eb2146928bd88e | 133 Bytes | 10.9 MB |
| assets/ip_eval_m_00.png | 012d541e58a000dc2ec3c1dd5a3264db8372100e368fafce49ebd4fa2bfeb8ba | 133 Bytes | 17 MB |
| assets/ip_eval_s.png | 78474c1418d3c8e4a1ca80e718d44e76ba59469916cca0127f07e6bd6cb8d35d | 133 Bytes | 11.1 MB |
| assets/ref1.png | 28586e41daf7f45c5e6b8e215cc8c55be08f32dae0f7b5b38540c95952d668c7 | 131 Bytes | 321 kB |
| assets/ref10.png | 19e6e4f071acc3110a9559bae3a6d6379eea6af8b44482608dbcc674bafb2c14 | 131 Bytes | 350 kB |
| assets/ref11.png | 7d5cc55c6360555c9bacffd0d3e3a8fd064f966919dd543448643e6dbcc883c8 | 132 Bytes | 1.36 MB |
| assets/ref12.png | b4e3360f2b931b1082b71a1afcaa4f92d76f08e132b043509e1fde4057ab79f2 | 131 Bytes | 887 kB |
| assets/ref13.png | c23af492984e7ce149f63782f4e378cb8eed92ad0726e3b5527507dce1d67534 | 131 Bytes | 623 kB |
| assets/ref14.png | 6cdc2b9f71f49e08f711ee723bc19e7e75f8970affea21ace36f297781269166 | 131 Bytes | 525 kB |
| assets/ref15.png | bf6b9552a942092d87371c123437761db6520a2f8465e47f2de772f66762c018 | 131 Bytes | 374 kB |
| assets/ref16.png | 23e53b75b1b5f52a18cf72e69d5b566dc30b85aa1215616d4ab70d51b4fe8147 | 131 Bytes | 801 kB |
| assets/ref17.png | 2e65bb33b6bf46d7bf0f430491c55b2cacb9fb318d8f8b8d4ef512872fe8111d | 132 Bytes | 2.72 MB |
| assets/ref18.png | 6299141e6ff65819b5028be1045522102b2418f8467602de88691b4b96df6a0b | 132 Bytes | 3.43 MB |
| assets/ref2.png | d152b9f0bb14e404a18a6a4fdfd67e1cb7f504e4f3be48aea095f4ca49e499d1 | 131 Bytes | 567 kB |
| assets/ref3.png | d0e1fb55f84f858ba929b2b55379963244dd2202790e576a28c5543e97148323 | 131 Bytes | 785 kB |
| assets/ref4.png | 1d358f1e10433ad414ffc3f84fc74f48bd42ef36d9ceb798eff80a1022823eee | 132 Bytes | 1.71 MB |
| assets/ref5.png | 3902f26beae4fc1209d072348330f67bb9639a1a82f32cbbd753d0ca4ae6755f | 132 Bytes | 1.43 MB |
| assets/ref6.png | 5b0afb49db6848b4c60a2dc9e1a45d8b6d0560d9cac793709aa9a9187b18b8c6 | 131 Bytes | 868 kB |
| assets/ref7.png | c2f531cbf3eee3eb2322a8d9b65f42244bfe4f033cc0ecdcc95286b61cc7b75a | 131 Bytes | 665 kB |
| assets/ref8.png | 4d1b1f9a6a50cea5b81802533345471ff19b348619c57f4537da97f8bed863a5 | 131 Bytes | 729 kB |
| assets/ref9.png | e010141d01bc4b832426eb701a4af4672e377a9aaa67204313838a5ae10a1c3d | 131 Bytes | 420 kB |
| assets/result1.gif | 49bbc4dbe2f9e9cef38ca20ee773b7a4554315eddeca291d9ec9fd7f4f51a2dc | 132 Bytes | 5.96 MB |
| assets/result2.gif | 566842209fc089d96e5c21f633e004b24f142cd968366c35d44d4cbe124a0672 | 132 Bytes | 8.56 MB |
| assets/result3.gif | 60168b7f3fdbc34a892c809e9e18fa103b773175427ab05e39ec08ab9ff6b2e5 | 133 Bytes | 13.1 MB |
| assets/result4.gif | 868f3d69f64c1c9f27ed4968968624b8e0093d98e709555d12a8851527d0dfde | 132 Bytes | 5.78 MB |
| assets/result5.gif | 86c5bf1e896b064c85643aeaf1041c3efa6f763a7cf5ae77983a54054d5cd0c2 | 132 Bytes | 9.9 MB |
| assets/result6.gif | 33070f0ec3d187ae67162750083338c898ffb6d8a5766f29b8bf20070a12c06c | 132 Bytes | 8.08 MB |
| assets/result7.gif | 281cc99d1a64247315e2310a9a44b7e5418e8dc7a662545b3ecb579043bbdd9c | 133 Bytes | 27.4 MB |
| assets/teaser.png | 86e150c7481222b572e84c14cfdf1a736e22a7ab348bf8a22f6adfc0131a878f | 132 Bytes | 3.37 MB |