W0222 16:33:50.453541 41281 warnings.py:110] /workspace/quip-sharp/lib/codebook/__init__.py:6: FutureWarning: `torch.library.impl_abstract` was renamed to `torch.library.register_fake`. Please use that instead; we will remove `torch.library.impl_abstract` in a future version of PyTorch.
  @torch.library.impl_abstract("quip_lib::decode_matvec_e8p")

W0222 16:33:50.454899 41281 warnings.py:110] /workspace/quip-sharp/lib/codebook/__init__.py:25: FutureWarning: `torch.library.impl_abstract` was renamed to `torch.library.register_fake`. Please use that instead; we will remove `torch.library.impl_abstract` in a future version of PyTorch.
  @torch.library.impl_abstract("quip_lib::decompress_packed_e8p")

I0222 16:33:50.570622 41281 utils.py:148] Note: detected 96 virtual cores but NumExpr set to maximum of 64, check "NUMEXPR_MAX_THREADS" environment variable.
I0222 16:33:50.570672 41281 utils.py:151] Note: NumExpr detected 96 cores but "NUMEXPR_MAX_THREADS" not set, so enforcing safe limit of 16.
I0222 16:33:50.570697 41281 utils.py:164] NumExpr defaulting to 16 threads.

W0222 16:33:51.252380 41281 warnings.py:110] /workspace/quip-sharp/lib/utils/matmul_had.py:92: FutureWarning: `torch.library.impl_abstract` was renamed to `torch.library.register_fake`. Please use that instead; we will remove `torch.library.impl_abstract` in a future version of PyTorch.
  @torch.library.impl_abstract("quip_lib::hadamard")
I0222 16:34:24.121408 41281 quantize_decompress_robust.py:121] Loaded codebook E8P12
I0222 16:34:24.121847 41281 quantize_decompress_robust.py:126] Progress: 6/50 done, 44 remaining
I0222 16:34:24.121875 41281 quantize_decompress_robust.py:131] Loading base model...
|
Loading weights:   0%|          | 0/453 [00:00<?, ?it/s]
[... per-parameter tqdm redraw lines omitted: steps 1-136 of 453, each printed twice by the terminal refresh ...]
Loading weights:  30%|███       | 136/453 [00:00<00:00, 3993.18it/s, Materializing param=model.layers.14.self_attn.q_proj.weight]
Loading weights: 30%|███ | 136/453 [00:00<00:00, 3989.57it/s, Materializing param=model.layers.14.self_attn.q_proj.weight]
Loading weights: 30%|███ | 137/453 [00:00<00:00, 4007.67it/s, Materializing param=model.layers.14.self_attn.v_proj.weight]
Loading weights: 30%|███ | 137/453 [00:00<00:00, 4003.93it/s, Materializing param=model.layers.14.self_attn.v_proj.weight]
Loading weights: 30%|███ | 138/453 [00:00<00:00, 4025.78it/s, Materializing param=model.layers.15.input_layernorm.weight]
Loading weights: 30%|███ | 138/453 [00:00<00:00, 4021.92it/s, Materializing param=model.layers.15.input_layernorm.weight]
Loading weights: 31%|███ | 139/453 [00:00<00:00, 4041.90it/s, Materializing param=model.layers.15.mlp.down_proj.weight]
Loading weights: 31%|███ | 139/453 [00:00<00:00, 4038.15it/s, Materializing param=model.layers.15.mlp.down_proj.weight]
Loading weights: 31%|███ | 140/453 [00:00<00:00, 4000.81it/s, Materializing param=model.layers.15.mlp.gate_proj.weight]
Loading weights: 31%|███ | 140/453 [00:00<00:00, 3997.02it/s, Materializing param=model.layers.15.mlp.gate_proj.weight]
Loading weights: 31%|███ | 141/453 [00:00<00:00, 4018.24it/s, Materializing param=model.layers.15.mlp.up_proj.weight]
Loading weights: 31%|███ | 141/453 [00:00<00:00, 4014.78it/s, Materializing param=model.layers.15.mlp.up_proj.weight]
Loading weights: 31%|███▏ | 142/453 [00:00<00:00, 4034.76it/s, Materializing param=model.layers.15.post_attention_layernorm.weight]
Loading weights: 31%|███▏ | 142/453 [00:00<00:00, 4030.96it/s, Materializing param=model.layers.15.post_attention_layernorm.weight]
Loading weights: 32%|███▏ | 143/453 [00:00<00:00, 4035.73it/s, Materializing param=model.layers.15.self_attn.k_proj.weight]
Loading weights: 32%|███▏ | 143/453 [00:00<00:00, 4032.01it/s, Materializing param=model.layers.15.self_attn.k_proj.weight]
Loading weights: 32%|███▏ | 144/453 [00:00<00:00, 4051.16it/s, Materializing param=model.layers.15.self_attn.o_proj.weight]
Loading weights: 32%|███▏ | 144/453 [00:00<00:00, 4047.82it/s, Materializing param=model.layers.15.self_attn.o_proj.weight]
Loading weights: 32%|███▏ | 145/453 [00:00<00:00, 4000.46it/s, Materializing param=model.layers.15.self_attn.q_proj.weight]
Loading weights: 32%|███▏ | 145/453 [00:00<00:00, 3996.91it/s, Materializing param=model.layers.15.self_attn.q_proj.weight]
Loading weights: 32%|███▏ | 146/453 [00:00<00:00, 4017.19it/s, Materializing param=model.layers.15.self_attn.v_proj.weight]
Loading weights: 32%|███▏ | 146/453 [00:00<00:00, 4013.92it/s, Materializing param=model.layers.15.self_attn.v_proj.weight]
Loading weights: 32%|███▏ | 147/453 [00:00<00:00, 4032.98it/s, Materializing param=model.layers.16.input_layernorm.weight]
Loading weights: 32%|███▏ | 147/453 [00:00<00:00, 4029.53it/s, Materializing param=model.layers.16.input_layernorm.weight]
Loading weights: 33%|███▎ | 148/453 [00:00<00:00, 4026.52it/s, Materializing param=model.layers.16.mlp.down_proj.weight]
Loading weights: 33%|███▎ | 148/453 [00:00<00:00, 4023.13it/s, Materializing param=model.layers.16.mlp.down_proj.weight]
Loading weights: 33%|███▎ | 149/453 [00:00<00:00, 4043.16it/s, Materializing param=model.layers.16.mlp.gate_proj.weight]
Loading weights: 33%|███▎ | 149/453 [00:00<00:00, 4039.87it/s, Materializing param=model.layers.16.mlp.gate_proj.weight]
Loading weights: 33%|███▎ | 150/453 [00:00<00:00, 4058.82it/s, Materializing param=model.layers.16.mlp.up_proj.weight]
Loading weights: 33%|███▎ | 150/453 [00:00<00:00, 4055.52it/s, Materializing param=model.layers.16.mlp.up_proj.weight]
Loading weights: 33%|███▎ | 151/453 [00:00<00:00, 4074.37it/s, Materializing param=model.layers.16.post_attention_layernorm.weight]
Loading weights: 33%|███▎ | 151/453 [00:00<00:00, 4070.94it/s, Materializing param=model.layers.16.post_attention_layernorm.weight]
Loading weights: 34%|███▎ | 152/453 [00:00<00:00, 4089.67it/s, Materializing param=model.layers.16.self_attn.k_proj.weight]
Loading weights: 34%|███▎ | 152/453 [00:00<00:00, 4086.18it/s, Materializing param=model.layers.16.self_attn.k_proj.weight]
Loading weights: 34%|███▍ | 153/453 [00:00<00:00, 4104.65it/s, Materializing param=model.layers.16.self_attn.o_proj.weight]
Loading weights: 34%|███▍ | 153/453 [00:00<00:00, 4101.31it/s, Materializing param=model.layers.16.self_attn.o_proj.weight]
Loading weights: 34%|███▍ | 154/453 [00:00<00:00, 4088.02it/s, Materializing param=model.layers.16.self_attn.q_proj.weight]
Loading weights: 34%|███▍ | 154/453 [00:00<00:00, 4084.45it/s, Materializing param=model.layers.16.self_attn.q_proj.weight]
Loading weights: 34%|███▍ | 155/453 [00:00<00:00, 4103.78it/s, Materializing param=model.layers.16.self_attn.v_proj.weight]
Loading weights: 34%|███▍ | 155/453 [00:00<00:00, 4100.55it/s, Materializing param=model.layers.16.self_attn.v_proj.weight]
Loading weights: 34%|███▍ | 156/453 [00:00<00:00, 4111.73it/s, Materializing param=model.layers.17.input_layernorm.weight]
Loading weights: 34%|███▍ | 156/453 [00:00<00:00, 4108.22it/s, Materializing param=model.layers.17.input_layernorm.weight]
Loading weights: 35%|███▍ | 157/453 [00:00<00:00, 4127.55it/s, Materializing param=model.layers.17.mlp.down_proj.weight]
Loading weights: 35%|███▍ | 157/453 [00:00<00:00, 4124.24it/s, Materializing param=model.layers.17.mlp.down_proj.weight]
Loading weights: 35%|███▍ | 158/453 [00:00<00:00, 4141.95it/s, Materializing param=model.layers.17.mlp.gate_proj.weight]
Loading weights: 35%|███▍ | 158/453 [00:00<00:00, 4138.77it/s, Materializing param=model.layers.17.mlp.gate_proj.weight]
Loading weights: 35%|███▌ | 159/453 [00:00<00:00, 4156.74it/s, Materializing param=model.layers.17.mlp.up_proj.weight]
Loading weights: 35%|███▌ | 159/453 [00:00<00:00, 4153.42it/s, Materializing param=model.layers.17.mlp.up_proj.weight]
Loading weights: 35%|███▌ | 160/453 [00:00<00:00, 4171.34it/s, Materializing param=model.layers.17.post_attention_layernorm.weight]
Loading weights: 35%|███▌ | 160/453 [00:00<00:00, 4167.84it/s, Materializing param=model.layers.17.post_attention_layernorm.weight]
Loading weights: 36%|███▌ | 161/453 [00:00<00:00, 4185.59it/s, Materializing param=model.layers.17.self_attn.k_proj.weight]
Loading weights: 36%|███▌ | 161/453 [00:00<00:00, 4181.81it/s, Materializing param=model.layers.17.self_attn.k_proj.weight]
Loading weights: 36%|███▌ | 162/453 [00:00<00:00, 4199.77it/s, Materializing param=model.layers.17.self_attn.o_proj.weight]
Loading weights: 36%|███▌ | 162/453 [00:00<00:00, 4196.40it/s, Materializing param=model.layers.17.self_attn.o_proj.weight]
Loading weights: 36%|███▌ | 163/453 [00:00<00:00, 4214.00it/s, Materializing param=model.layers.17.self_attn.q_proj.weight]
Loading weights: 36%|███▌ | 163/453 [00:00<00:00, 4210.79it/s, Materializing param=model.layers.17.self_attn.q_proj.weight]
Loading weights: 36%|███▌ | 164/453 [00:00<00:00, 4221.98it/s, Materializing param=model.layers.17.self_attn.v_proj.weight]
Loading weights: 36%|███▌ | 164/453 [00:00<00:00, 4218.59it/s, Materializing param=model.layers.17.self_attn.v_proj.weight]
Loading weights: 36%|███▋ | 165/453 [00:00<00:00, 4236.18it/s, Materializing param=model.layers.18.input_layernorm.weight]
Loading weights: 36%|███▋ | 165/453 [00:00<00:00, 4232.76it/s, Materializing param=model.layers.18.input_layernorm.weight]
Loading weights: 37%|███▋ | 166/453 [00:00<00:00, 4250.12it/s, Materializing param=model.layers.18.mlp.down_proj.weight]
Loading weights: 37%|███▋ | 166/453 [00:00<00:00, 4246.80it/s, Materializing param=model.layers.18.mlp.down_proj.weight]
Loading weights: 37%|███▋ | 167/453 [00:00<00:00, 4252.41it/s, Materializing param=model.layers.18.mlp.gate_proj.weight]
Loading weights: 37%|███▋ | 167/453 [00:00<00:00, 4248.80it/s, Materializing param=model.layers.18.mlp.gate_proj.weight]
Loading weights: 37%|███▋ | 168/453 [00:00<00:00, 4266.45it/s, Materializing param=model.layers.18.mlp.up_proj.weight]
Loading weights: 37%|███▋ | 168/453 [00:00<00:00, 4263.12it/s, Materializing param=model.layers.18.mlp.up_proj.weight]
Loading weights: 37%|███▋ | 169/453 [00:00<00:00, 4280.11it/s, Materializing param=model.layers.18.post_attention_layernorm.weight]
Loading weights: 37%|███▋ | 169/453 [00:00<00:00, 4276.55it/s, Materializing param=model.layers.18.post_attention_layernorm.weight]
Loading weights: 38%|███▊ | 170/453 [00:00<00:00, 4291.55it/s, Materializing param=model.layers.18.self_attn.k_proj.weight]
Loading weights: 38%|███▊ | 170/453 [00:00<00:00, 4288.06it/s, Materializing param=model.layers.18.self_attn.k_proj.weight]
Loading weights: 38%|███▊ | 171/453 [00:00<00:00, 4305.05it/s, Materializing param=model.layers.18.self_attn.o_proj.weight]
Loading weights: 38%|███▊ | 171/453 [00:00<00:00, 4301.77it/s, Materializing param=model.layers.18.self_attn.o_proj.weight]
Loading weights: 38%|███▊ | 172/453 [00:00<00:00, 4318.85it/s, Materializing param=model.layers.18.self_attn.q_proj.weight]
Loading weights: 38%|███▊ | 172/453 [00:00<00:00, 4315.70it/s, Materializing param=model.layers.18.self_attn.q_proj.weight]
Loading weights: 38%|███▊ | 173/453 [00:00<00:00, 4332.91it/s, Materializing param=model.layers.18.self_attn.v_proj.weight]
Loading weights: 38%|███▊ | 173/453 [00:00<00:00, 4329.75it/s, Materializing param=model.layers.18.self_attn.v_proj.weight]
Loading weights: 38%|███▊ | 174/453 [00:00<00:00, 4347.28it/s, Materializing param=model.layers.19.input_layernorm.weight]
Loading weights: 38%|███▊ | 174/453 [00:00<00:00, 4343.53it/s, Materializing param=model.layers.19.input_layernorm.weight]
Loading weights: 39%|███▊ | 175/453 [00:00<00:00, 4342.34it/s, Materializing param=model.layers.19.mlp.down_proj.weight]
Loading weights: 39%|███▊ | 175/453 [00:00<00:00, 4338.70it/s, Materializing param=model.layers.19.mlp.down_proj.weight]
Loading weights: 39%|███▉ | 176/453 [00:00<00:00, 4356.48it/s, Materializing param=model.layers.19.mlp.gate_proj.weight]
Loading weights: 39%|███▉ | 176/453 [00:00<00:00, 4353.17it/s, Materializing param=model.layers.19.mlp.gate_proj.weight]
Loading weights: 39%|███▉ | 177/453 [00:00<00:00, 4369.56it/s, Materializing param=model.layers.19.mlp.up_proj.weight]
Loading weights: 39%|███▉ | 177/453 [00:00<00:00, 4366.09it/s, Materializing param=model.layers.19.mlp.up_proj.weight]
Loading weights: 39%|███▉ | 178/453 [00:00<00:00, 4357.18it/s, Materializing param=model.layers.19.post_attention_layernorm.weight]
Loading weights: 39%|███▉ | 178/453 [00:00<00:00, 4353.58it/s, Materializing param=model.layers.19.post_attention_layernorm.weight]
Loading weights: 40%|███▉ | 179/453 [00:00<00:00, 4371.53it/s, Materializing param=model.layers.19.self_attn.k_proj.weight]
Loading weights: 40%|███▉ | 179/453 [00:00<00:00, 4368.28it/s, Materializing param=model.layers.19.self_attn.k_proj.weight]
Loading weights: 40%|███▉ | 180/453 [00:00<00:00, 4384.80it/s, Materializing param=model.layers.19.self_attn.o_proj.weight]
Loading weights: 40%|███▉ | 180/453 [00:00<00:00, 4381.57it/s, Materializing param=model.layers.19.self_attn.o_proj.weight]
Loading weights: 40%|███▉ | 181/453 [00:00<00:00, 4397.95it/s, Materializing param=model.layers.19.self_attn.q_proj.weight]
Loading weights: 40%|███▉ | 181/453 [00:00<00:00, 4394.89it/s, Materializing param=model.layers.19.self_attn.q_proj.weight]
Loading weights: 40%|████ | 182/453 [00:00<00:00, 4392.80it/s, Materializing param=model.layers.19.self_attn.v_proj.weight]
Loading weights: 40%|████ | 182/453 [00:00<00:00, 4389.24it/s, Materializing param=model.layers.19.self_attn.v_proj.weight]
Loading weights: 40%|████ | 183/453 [00:00<00:00, 4406.29it/s, Materializing param=model.layers.20.input_layernorm.weight]
Loading weights: 40%|████ | 183/453 [00:00<00:00, 4402.93it/s, Materializing param=model.layers.20.input_layernorm.weight]
Loading weights: 41%|████ | 184/453 [00:00<00:00, 4419.10it/s, Materializing param=model.layers.20.mlp.down_proj.weight]
Loading weights: 41%|████ | 184/453 [00:00<00:00, 4415.84it/s, Materializing param=model.layers.20.mlp.down_proj.weight]
Loading weights: 41%|████ | 185/453 [00:00<00:00, 4428.71it/s, Materializing param=model.layers.20.mlp.gate_proj.weight]
Loading weights: 41%|████ | 185/453 [00:00<00:00, 4425.18it/s, Materializing param=model.layers.20.mlp.gate_proj.weight]
Loading weights: 41%|████ | 186/453 [00:00<00:00, 4442.49it/s, Materializing param=model.layers.20.mlp.up_proj.weight]
Loading weights: 41%|████ | 186/453 [00:00<00:00, 4439.30it/s, Materializing param=model.layers.20.mlp.up_proj.weight]
Loading weights: 41%|████▏ | 187/453 [00:00<00:00, 4455.38it/s, Materializing param=model.layers.20.post_attention_layernorm.weight]
Loading weights: 41%|████▏ | 187/453 [00:00<00:00, 4451.92it/s, Materializing param=model.layers.20.post_attention_layernorm.weight]
Loading weights: 42%|████▏ | 188/453 [00:00<00:00, 4467.66it/s, Materializing param=model.layers.20.self_attn.k_proj.weight]
Loading weights: 42%|████▏ | 188/453 [00:00<00:00, 4464.35it/s, Materializing param=model.layers.20.self_attn.k_proj.weight]
Loading weights: 42%|████▏ | 189/453 [00:00<00:00, 4480.03it/s, Materializing param=model.layers.20.self_attn.o_proj.weight]
Loading weights: 42%|████▏ | 189/453 [00:00<00:00, 4476.46it/s, Materializing param=model.layers.20.self_attn.o_proj.weight]
Loading weights: 42%|████▏ | 190/453 [00:00<00:00, 4491.93it/s, Materializing param=model.layers.20.self_attn.q_proj.weight]
Loading weights: 42%|████▏ | 190/453 [00:00<00:00, 4488.54it/s, Materializing param=model.layers.20.self_attn.q_proj.weight]
Loading weights: 42%|████▏ | 191/453 [00:00<00:00, 4504.48it/s, Materializing param=model.layers.20.self_attn.v_proj.weight]
Loading weights: 42%|████▏ | 191/453 [00:00<00:00, 4501.29it/s, Materializing param=model.layers.20.self_attn.v_proj.weight]
Loading weights: 42%|████▏ | 192/453 [00:00<00:00, 4517.11it/s, Materializing param=model.layers.21.input_layernorm.weight]
Loading weights: 42%|████▏ | 192/453 [00:00<00:00, 4513.62it/s, Materializing param=model.layers.21.input_layernorm.weight]
Loading weights: 43%|████▎ | 193/453 [00:00<00:00, 4529.79it/s, Materializing param=model.layers.21.mlp.down_proj.weight]
Loading weights: 43%|████▎ | 193/453 [00:00<00:00, 4526.50it/s, Materializing param=model.layers.21.mlp.down_proj.weight]
Loading weights: 43%|████▎ | 194/453 [00:00<00:00, 4541.98it/s, Materializing param=model.layers.21.mlp.gate_proj.weight]
Loading weights: 43%|████▎ | 194/453 [00:00<00:00, 4538.83it/s, Materializing param=model.layers.21.mlp.gate_proj.weight]
Loading weights: 43%|████▎ | 195/453 [00:00<00:00, 4526.00it/s, Materializing param=model.layers.21.mlp.up_proj.weight]
Loading weights: 43%|████▎ | 195/453 [00:00<00:00, 4522.55it/s, Materializing param=model.layers.21.mlp.up_proj.weight]
Loading weights: 43%|████▎ | 196/453 [00:00<00:00, 4538.36it/s, Materializing param=model.layers.21.post_attention_layernorm.weight]
Loading weights: 43%|████▎ | 196/453 [00:00<00:00, 4535.11it/s, Materializing param=model.layers.21.post_attention_layernorm.weight]
Loading weights: 43%|████▎ | 197/453 [00:00<00:00, 4548.39it/s, Materializing param=model.layers.21.self_attn.k_proj.weight]
Loading weights: 43%|████▎ | 197/453 [00:00<00:00, 4544.96it/s, Materializing param=model.layers.21.self_attn.k_proj.weight]
Loading weights: 44%|████▎ | 198/453 [00:00<00:00, 4561.46it/s, Materializing param=model.layers.21.self_attn.o_proj.weight]
Loading weights: 44%|████▎ | 198/453 [00:00<00:00, 4558.18it/s, Materializing param=model.layers.21.self_attn.o_proj.weight]
Loading weights: 44%|████▍ | 199/453 [00:00<00:00, 4572.84it/s, Materializing param=model.layers.21.self_attn.q_proj.weight]
Loading weights: 44%|████▍ | 199/453 [00:00<00:00, 4569.56it/s, Materializing param=model.layers.21.self_attn.q_proj.weight]
Loading weights: 44%|████▍ | 200/453 [00:00<00:00, 4584.67it/s, Materializing param=model.layers.21.self_attn.v_proj.weight]
Loading weights: 44%|████▍ | 200/453 [00:00<00:00, 4581.26it/s, Materializing param=model.layers.21.self_attn.v_proj.weight]
Loading weights: 44%|████▍ | 201/453 [00:00<00:00, 4596.08it/s, Materializing param=model.layers.22.input_layernorm.weight]
Loading weights: 44%|████▍ | 201/453 [00:00<00:00, 4592.93it/s, Materializing param=model.layers.22.input_layernorm.weight]
Loading weights: 45%|████▍ | 202/453 [00:00<00:00, 4568.29it/s, Materializing param=model.layers.22.mlp.down_proj.weight]
Loading weights: 45%|████▍ | 202/453 [00:00<00:00, 4564.80it/s, Materializing param=model.layers.22.mlp.down_proj.weight]
Loading weights: 45%|████▍ | 203/453 [00:00<00:00, 4580.68it/s, Materializing param=model.layers.22.mlp.gate_proj.weight]
Loading weights: 45%|████▍ | 203/453 [00:00<00:00, 4577.43it/s, Materializing param=model.layers.22.mlp.gate_proj.weight]
Loading weights: 45%|████▌ | 204/453 [00:00<00:00, 4586.42it/s, Materializing param=model.layers.22.mlp.up_proj.weight]
Loading weights: 45%|████▌ | 204/453 [00:00<00:00, 4583.23it/s, Materializing param=model.layers.22.mlp.up_proj.weight]
Loading weights: 45%|████▌ | 205/453 [00:00<00:00, 4597.91it/s, Materializing param=model.layers.22.post_attention_layernorm.weight]
Loading weights: 45%|████▌ | 205/453 [00:00<00:00, 4594.77it/s, Materializing param=model.layers.22.post_attention_layernorm.weight]
Loading weights: 45%|████▌ | 206/453 [00:00<00:00, 4592.30it/s, Materializing param=model.layers.22.self_attn.k_proj.weight]
Loading weights: 45%|████▌ | 206/453 [00:00<00:00, 4588.91it/s, Materializing param=model.layers.22.self_attn.k_proj.weight]
Loading weights: 46%|████▌ | 207/453 [00:00<00:00, 4604.55it/s, Materializing param=model.layers.22.self_attn.o_proj.weight]
Loading weights: 46%|████▌ | 207/453 [00:00<00:00, 4601.31it/s, Materializing param=model.layers.22.self_attn.o_proj.weight]
Loading weights: 46%|████▌ | 208/453 [00:00<00:00, 4615.86it/s, Materializing param=model.layers.22.self_attn.q_proj.weight]
Loading weights: 46%|████▌ | 208/453 [00:00<00:00, 4612.68it/s, Materializing param=model.layers.22.self_attn.q_proj.weight]
Loading weights: 46%|████▌ | 209/453 [00:00<00:00, 4627.20it/s, Materializing param=model.layers.22.self_attn.v_proj.weight]
Loading weights: 46%|████▌ | 209/453 [00:00<00:00, 4624.15it/s, Materializing param=model.layers.22.self_attn.v_proj.weight]
Loading weights: 46%|████▋ | 210/453 [00:00<00:00, 4638.81it/s, Materializing param=model.layers.23.input_layernorm.weight]
Loading weights: 46%|████▋ | 210/453 [00:00<00:00, 4635.44it/s, Materializing param=model.layers.23.input_layernorm.weight]
Loading weights: 47%|████▋ | 211/453 [00:00<00:00, 4641.66it/s, Materializing param=model.layers.23.mlp.down_proj.weight]
Loading weights: 47%|████▋ | 211/453 [00:00<00:00, 4638.06it/s, Materializing param=model.layers.23.mlp.down_proj.weight]
Loading weights: 47%|████▋ | 212/453 [00:00<00:00, 4652.41it/s, Materializing param=model.layers.23.mlp.gate_proj.weight]
Loading weights: 47%|████▋ | 212/453 [00:00<00:00, 4649.10it/s, Materializing param=model.layers.23.mlp.gate_proj.weight]
Loading weights: 47%|████▋ | 213/453 [00:00<00:00, 4663.21it/s, Materializing param=model.layers.23.mlp.up_proj.weight]
Loading weights: 47%|████▋ | 213/453 [00:00<00:00, 4660.07it/s, Materializing param=model.layers.23.mlp.up_proj.weight]
Loading weights: 47%|████▋ | 214/453 [00:00<00:00, 4674.15it/s, Materializing param=model.layers.23.post_attention_layernorm.weight]
Loading weights: 47%|████▋ | 214/453 [00:00<00:00, 4670.74it/s, Materializing param=model.layers.23.post_attention_layernorm.weight]
Loading weights: 47%|████▋ | 215/453 [00:00<00:00, 4684.67it/s, Materializing param=model.layers.23.self_attn.k_proj.weight]
Loading weights: 47%|████▋ | 215/453 [00:00<00:00, 4681.43it/s, Materializing param=model.layers.23.self_attn.k_proj.weight]
Loading weights: 48%|████▊ | 216/453 [00:00<00:00, 4691.57it/s, Materializing param=model.layers.23.self_attn.o_proj.weight]
Loading weights: 48%|████▊ | 216/453 [00:00<00:00, 4688.46it/s, Materializing param=model.layers.23.self_attn.o_proj.weight]
Loading weights: 48%|████▊ | 217/453 [00:00<00:00, 4702.94it/s, Materializing param=model.layers.23.self_attn.q_proj.weight]
Loading weights: 48%|████▊ | 217/453 [00:00<00:00, 4699.83it/s, Materializing param=model.layers.23.self_attn.q_proj.weight]
Loading weights: 48%|████▊ | 218/453 [00:00<00:00, 4713.96it/s, Materializing param=model.layers.23.self_attn.v_proj.weight]
Loading weights: 48%|████▊ | 218/453 [00:00<00:00, 4710.71it/s, Materializing param=model.layers.23.self_attn.v_proj.weight]
Loading weights: 48%|████▊ | 219/453 [00:00<00:00, 4724.51it/s, Materializing param=model.layers.24.input_layernorm.weight]
Loading weights: 48%|████▊ | 219/453 [00:00<00:00, 4721.45it/s, Materializing param=model.layers.24.input_layernorm.weight]
Loading weights: 49%|████▊ | 220/453 [00:00<00:00, 4735.41it/s, Materializing param=model.layers.24.mlp.down_proj.weight]
Loading weights: 49%|████▊ | 220/453 [00:00<00:00, 4732.13it/s, Materializing param=model.layers.24.mlp.down_proj.weight]
Loading weights: 49%|████▉ | 221/453 [00:00<00:00, 4721.39it/s, Materializing param=model.layers.24.mlp.gate_proj.weight]
Loading weights: 49%|████▉ | 221/453 [00:00<00:00, 4718.24it/s, Materializing param=model.layers.24.mlp.gate_proj.weight]
Loading weights: 49%|████▉ | 222/453 [00:00<00:00, 4733.11it/s, Materializing param=model.layers.24.mlp.up_proj.weight]
Loading weights: 49%|████▉ | 222/453 [00:00<00:00, 4730.03it/s, Materializing param=model.layers.24.mlp.up_proj.weight]
Loading weights: 49%|████▉ | 223/453 [00:00<00:00, 4743.82it/s, Materializing param=model.layers.24.post_attention_layernorm.weight]
Loading weights: 49%|████▉ | 223/453 [00:00<00:00, 4740.72it/s, Materializing param=model.layers.24.post_attention_layernorm.weight]
Loading weights: 49%|████▉ | 224/453 [00:00<00:00, 4754.32it/s, Materializing param=model.layers.24.self_attn.k_proj.weight]
Loading weights: 49%|████▉ | 224/453 [00:00<00:00, 4751.24it/s, Materializing param=model.layers.24.self_attn.k_proj.weight]
Loading weights: 50%|████▉ | 225/453 [00:00<00:00, 4764.86it/s, Materializing param=model.layers.24.self_attn.o_proj.weight]
Loading weights: 50%|████▉ | 225/453 [00:00<00:00, 4761.69it/s, Materializing param=model.layers.24.self_attn.o_proj.weight]
Loading weights: 50%|████▉ | 226/453 [00:00<00:00, 4768.63it/s, Materializing param=model.layers.24.self_attn.q_proj.weight]
Loading weights: 50%|████▉ | 226/453 [00:00<00:00, 4765.58it/s, Materializing param=model.layers.24.self_attn.q_proj.weight]
Loading weights: 50%|█████ | 227/453 [00:00<00:00, 4779.01it/s, Materializing param=model.layers.24.self_attn.v_proj.weight]
Loading weights: 50%|█████ | 227/453 [00:00<00:00, 4775.79it/s, Materializing param=model.layers.24.self_attn.v_proj.weight]
Loading weights: 50%|█████ | 228/453 [00:00<00:00, 4789.10it/s, Materializing param=model.layers.25.input_layernorm.weight]
Loading weights: 50%|█████ | 228/453 [00:00<00:00, 4786.03it/s, Materializing param=model.layers.25.input_layernorm.weight]
Loading weights: 51%|█████ | 229/453 [00:00<00:00, 4799.38it/s, Materializing param=model.layers.25.mlp.down_proj.weight]
Loading weights: 51%|█████ | 229/453 [00:00<00:00, 4796.27it/s, Materializing param=model.layers.25.mlp.down_proj.weight]
Loading weights: 51%|█████ | 230/453 [00:00<00:00, 4809.62it/s, Materializing param=model.layers.25.mlp.gate_proj.weight]
Loading weights: 51%|█████ | 230/453 [00:00<00:00, 4806.55it/s, Materializing param=model.layers.25.mlp.gate_proj.weight]
Loading weights: 51%|█████ | 231/453 [00:00<00:00, 4819.94it/s, Materializing param=model.layers.25.mlp.up_proj.weight]
Loading weights: 51%|█████ | 231/453 [00:00<00:00, 4816.89it/s, Materializing param=model.layers.25.mlp.up_proj.weight]
Loading weights: 51%|█████ | 232/453 [00:00<00:00, 4830.11it/s, Materializing param=model.layers.25.post_attention_layernorm.weight]
Loading weights: 51%|█████ | 232/453 [00:00<00:00, 4826.90it/s, Materializing param=model.layers.25.post_attention_layernorm.weight]
Loading weights: 51%|█████▏ | 233/453 [00:00<00:00, 4838.22it/s, Materializing param=model.layers.25.self_attn.k_proj.weight]
Loading weights: 51%|█████▏ | 233/453 [00:00<00:00, 4835.04it/s, Materializing param=model.layers.25.self_attn.k_proj.weight]
Loading weights: 52%|█████▏ | 234/453 [00:00<00:00, 4846.73it/s, Materializing param=model.layers.25.self_attn.o_proj.weight]
Loading weights: 52%|█████▏ | 234/453 [00:00<00:00, 4843.67it/s, Materializing param=model.layers.25.self_attn.o_proj.weight]
Loading weights: 52%|█████▏ | 235/453 [00:00<00:00, 4850.53it/s, Materializing param=model.layers.25.self_attn.q_proj.weight]
Loading weights: 52%|█████▏ | 235/453 [00:00<00:00, 4847.19it/s, Materializing param=model.layers.25.self_attn.q_proj.weight]
Loading weights: 52%|█████▏ | 236/453 [00:00<00:00, 4855.35it/s, Materializing param=model.layers.25.self_attn.v_proj.weight]
Loading weights: 52%|█████▏ | 236/453 [00:00<00:00, 4852.12it/s, Materializing param=model.layers.25.self_attn.v_proj.weight]
Loading weights: 52%|█████▏ | 237/453 [00:00<00:00, 4865.19it/s, Materializing param=model.layers.26.input_layernorm.weight]
Loading weights: 52%|█████▏ | 237/453 [00:00<00:00, 4862.24it/s, Materializing param=model.layers.26.input_layernorm.weight]
Loading weights: 53%|█████▎ | 238/453 [00:00<00:00, 4875.50it/s, Materializing param=model.layers.26.mlp.down_proj.weight]
Loading weights: 53%|█████▎ | 238/453 [00:00<00:00, 4872.62it/s, Materializing param=model.layers.26.mlp.down_proj.weight]
Loading weights: 53%|█████▎ | 239/453 [00:00<00:00, 4885.70it/s, Materializing param=model.layers.26.mlp.gate_proj.weight]
Loading weights: 53%|█████▎ | 239/453 [00:00<00:00, 4882.66it/s, Materializing param=model.layers.26.mlp.gate_proj.weight]
Loading weights: 53%|█████▎ | 240/453 [00:00<00:00, 4895.62it/s, Materializing param=model.layers.26.mlp.up_proj.weight]
Loading weights: 53%|█████▎ | 240/453 [00:00<00:00, 4892.65it/s, Materializing param=model.layers.26.mlp.up_proj.weight]
Loading weights: 53%|█████▎ | 241/453 [00:00<00:00, 4905.50it/s, Materializing param=model.layers.26.post_attention_layernorm.weight]
Loading weights: 53%|█████▎ | 241/453 [00:00<00:00, 4902.26it/s, Materializing param=model.layers.26.post_attention_layernorm.weight]
Loading weights: 53%|█████▎ | 242/453 [00:00<00:00, 4913.31it/s, Materializing param=model.layers.26.self_attn.k_proj.weight]
Loading weights: 53%|█████▎ | 242/453 [00:00<00:00, 4910.34it/s, Materializing param=model.layers.26.self_attn.k_proj.weight]
Loading weights: 54%|█████▎ | 243/453 [00:00<00:00, 4921.16it/s, Materializing param=model.layers.26.self_attn.o_proj.weight]
Loading weights: 54%|█████▎ | 243/453 [00:00<00:00, 4917.95it/s, Materializing param=model.layers.26.self_attn.o_proj.weight]
Loading weights: 54%|█████▍ | 244/453 [00:00<00:00, 4930.77it/s, Materializing param=model.layers.26.self_attn.q_proj.weight]
Loading weights: 54%|█████▍ | 244/453 [00:00<00:00, 4927.63it/s, Materializing param=model.layers.26.self_attn.q_proj.weight]
Loading weights: 54%|█████▍ | 245/453 [00:00<00:00, 4922.68it/s, Materializing param=model.layers.26.self_attn.v_proj.weight]
Loading weights: 54%|█████▍ | 245/453 [00:00<00:00, 4919.50it/s, Materializing param=model.layers.26.self_attn.v_proj.weight]
Loading weights: 54%|█████▍ | 246/453 [00:00<00:00, 4933.30it/s, Materializing param=model.layers.27.input_layernorm.weight]
Loading weights: 54%|█████▍ | 246/453 [00:00<00:00, 4930.09it/s, Materializing param=model.layers.27.input_layernorm.weight]
Loading weights: 55%|█████▍ | 247/453 [00:00<00:00, 4942.43it/s, Materializing param=model.layers.27.mlp.down_proj.weight]
Loading weights: 55%|█████▍ | 247/453 [00:00<00:00, 4939.42it/s, Materializing param=model.layers.27.mlp.down_proj.weight]
Loading weights: 55%|█████▍ | 248/453 [00:00<00:00, 4951.76it/s, Materializing param=model.layers.27.mlp.gate_proj.weight]
Loading weights: 55%|█████▍ | 248/453 [00:00<00:00, 4948.77it/s, Materializing param=model.layers.27.mlp.gate_proj.weight]
Loading weights: 55%|█████▍ | 249/453 [00:00<00:00, 4947.50it/s, Materializing param=model.layers.27.mlp.up_proj.weight]
[… progress-bar redraws for steps 250–390 collapsed (model.layers.27 through model.layers.43, each step emitted twice by the console capture) …]
Loading weights: 86%|████████▋ | 391/453 [00:00<00:00, 5974.90it/s, Materializing param=model.layers.43.mlp.down_proj.weight]
Loading weights: 86%|████████▋ | 391/453 [00:00<00:00, 5972.12it/s, Materializing param=model.layers.43.mlp.down_proj.weight]
Loading weights: 87%|████████▋ | 392/453 [00:00<00:00, 5980.33it/s, Materializing param=model.layers.43.mlp.gate_proj.weight]
Loading weights: 87%|████████▋ | 392/453 [00:00<00:00, 5977.55it/s, Materializing param=model.layers.43.mlp.gate_proj.weight]
Loading weights: 87%|████████▋ | 393/453 [00:00<00:00, 5986.21it/s, Materializing param=model.layers.43.mlp.up_proj.weight]
Loading weights: 87%|████████▋ | 393/453 [00:00<00:00, 5983.51it/s, Materializing param=model.layers.43.mlp.up_proj.weight]
Loading weights: 87%|████████▋ | 394/453 [00:00<00:00, 5992.25it/s, Materializing param=model.layers.43.post_attention_layernorm.weight]
Loading weights: 87%|████████▋ | 394/453 [00:00<00:00, 5989.41it/s, Materializing param=model.layers.43.post_attention_layernorm.weight]
Loading weights: 87%|████████▋ | 395/453 [00:00<00:00, 5998.05it/s, Materializing param=model.layers.43.self_attn.k_proj.weight]
Loading weights: 87%|████████▋ | 395/453 [00:00<00:00, 5995.29it/s, Materializing param=model.layers.43.self_attn.k_proj.weight]
Loading weights: 87%|████████▋ | 396/453 [00:00<00:00, 6003.97it/s, Materializing param=model.layers.43.self_attn.o_proj.weight]
Loading weights: 87%|████████▋ | 396/453 [00:00<00:00, 6001.11it/s, Materializing param=model.layers.43.self_attn.o_proj.weight]
Loading weights: 88%|████████▊ | 397/453 [00:00<00:00, 6009.83it/s, Materializing param=model.layers.43.self_attn.q_proj.weight]
Loading weights: 88%|████████▊ | 397/453 [00:00<00:00, 6007.04it/s, Materializing param=model.layers.43.self_attn.q_proj.weight]
Loading weights: 88%|████████▊ | 398/453 [00:00<00:00, 6015.59it/s, Materializing param=model.layers.43.self_attn.v_proj.weight]
Loading weights: 88%|████████▊ | 398/453 [00:00<00:00, 6012.78it/s, Materializing param=model.layers.43.self_attn.v_proj.weight]
Loading weights: 88%|████████▊ | 399/453 [00:00<00:00, 6021.38it/s, Materializing param=model.layers.44.input_layernorm.weight]
Loading weights: 88%|████████▊ | 399/453 [00:00<00:00, 6018.65it/s, Materializing param=model.layers.44.input_layernorm.weight]
Loading weights: 88%|████████▊ | 400/453 [00:00<00:00, 6027.34it/s, Materializing param=model.layers.44.mlp.down_proj.weight]
Loading weights: 88%|████████▊ | 400/453 [00:00<00:00, 6024.48it/s, Materializing param=model.layers.44.mlp.down_proj.weight]
Loading weights: 89%|████████▊ | 401/453 [00:00<00:00, 6032.85it/s, Materializing param=model.layers.44.mlp.gate_proj.weight]
Loading weights: 89%|████████▊ | 401/453 [00:00<00:00, 6029.60it/s, Materializing param=model.layers.44.mlp.gate_proj.weight]
Loading weights: 89%|████████▊ | 402/453 [00:00<00:00, 6038.10it/s, Materializing param=model.layers.44.mlp.up_proj.weight]
Loading weights: 89%|████████▊ | 402/453 [00:00<00:00, 6035.27it/s, Materializing param=model.layers.44.mlp.up_proj.weight]
Loading weights: 89%|████████▉ | 403/453 [00:00<00:00, 6043.84it/s, Materializing param=model.layers.44.post_attention_layernorm.weight]
Loading weights: 89%|████████▉ | 403/453 [00:00<00:00, 6040.90it/s, Materializing param=model.layers.44.post_attention_layernorm.weight]
Loading weights: 89%|████████▉ | 404/453 [00:00<00:00, 6049.36it/s, Materializing param=model.layers.44.self_attn.k_proj.weight]
Loading weights: 89%|████████▉ | 404/453 [00:00<00:00, 6046.62it/s, Materializing param=model.layers.44.self_attn.k_proj.weight]
Loading weights: 89%|████████▉ | 405/453 [00:00<00:00, 6055.02it/s, Materializing param=model.layers.44.self_attn.o_proj.weight]
Loading weights: 89%|████████▉ | 405/453 [00:00<00:00, 6052.15it/s, Materializing param=model.layers.44.self_attn.o_proj.weight]
Loading weights: 90%|████████▉ | 406/453 [00:00<00:00, 6060.49it/s, Materializing param=model.layers.44.self_attn.q_proj.weight]
Loading weights: 90%|████████▉ | 406/453 [00:00<00:00, 6057.70it/s, Materializing param=model.layers.44.self_attn.q_proj.weight]
Loading weights: 90%|████████▉ | 407/453 [00:00<00:00, 6066.13it/s, Materializing param=model.layers.44.self_attn.v_proj.weight]
Loading weights: 90%|████████▉ | 407/453 [00:00<00:00, 6063.41it/s, Materializing param=model.layers.44.self_attn.v_proj.weight]
Loading weights: 90%|█████████ | 408/453 [00:00<00:00, 6071.89it/s, Materializing param=model.layers.45.input_layernorm.weight]
Loading weights: 90%|█████████ | 408/453 [00:00<00:00, 6068.96it/s, Materializing param=model.layers.45.input_layernorm.weight]
Loading weights: 90%|█████████ | 409/453 [00:00<00:00, 6077.28it/s, Materializing param=model.layers.45.mlp.down_proj.weight]
Loading weights: 90%|█████████ | 409/453 [00:00<00:00, 6074.55it/s, Materializing param=model.layers.45.mlp.down_proj.weight]
Loading weights: 91%|█████████ | 410/453 [00:00<00:00, 6082.98it/s, Materializing param=model.layers.45.mlp.gate_proj.weight]
Loading weights: 91%|█████████ | 410/453 [00:00<00:00, 6080.14it/s, Materializing param=model.layers.45.mlp.gate_proj.weight]
Loading weights: 91%|█████████ | 411/453 [00:00<00:00, 6088.19it/s, Materializing param=model.layers.45.mlp.up_proj.weight]
Loading weights: 91%|█████████ | 411/453 [00:00<00:00, 6085.37it/s, Materializing param=model.layers.45.mlp.up_proj.weight]
Loading weights: 91%|█████████ | 412/453 [00:00<00:00, 6093.81it/s, Materializing param=model.layers.45.post_attention_layernorm.weight]
Loading weights: 91%|█████████ | 412/453 [00:00<00:00, 6090.91it/s, Materializing param=model.layers.45.post_attention_layernorm.weight]
Loading weights: 91%|█████████ | 413/453 [00:00<00:00, 6099.31it/s, Materializing param=model.layers.45.self_attn.k_proj.weight]
Loading weights: 91%|█████████ | 413/453 [00:00<00:00, 6096.35it/s, Materializing param=model.layers.45.self_attn.k_proj.weight]
Loading weights: 91%|█████████▏| 414/453 [00:00<00:00, 6104.71it/s, Materializing param=model.layers.45.self_attn.o_proj.weight]
Loading weights: 91%|█████████▏| 414/453 [00:00<00:00, 6101.90it/s, Materializing param=model.layers.45.self_attn.o_proj.weight]
Loading weights: 92%|█████████▏| 415/453 [00:00<00:00, 6110.09it/s, Materializing param=model.layers.45.self_attn.q_proj.weight]
Loading weights: 92%|█████████▏| 415/453 [00:00<00:00, 6107.30it/s, Materializing param=model.layers.45.self_attn.q_proj.weight]
Loading weights: 92%|█████████▏| 416/453 [00:00<00:00, 6115.67it/s, Materializing param=model.layers.45.self_attn.v_proj.weight]
Loading weights: 92%|█████████▏| 416/453 [00:00<00:00, 6112.71it/s, Materializing param=model.layers.45.self_attn.v_proj.weight]
Loading weights: 92%|█████████▏| 417/453 [00:00<00:00, 6120.78it/s, Materializing param=model.layers.46.input_layernorm.weight]
Loading weights: 92%|█████████▏| 417/453 [00:00<00:00, 6117.76it/s, Materializing param=model.layers.46.input_layernorm.weight]
Loading weights: 92%|█████████▏| 418/453 [00:00<00:00, 6125.83it/s, Materializing param=model.layers.46.mlp.down_proj.weight]
Loading weights: 92%|█████████▏| 418/453 [00:00<00:00, 6122.92it/s, Materializing param=model.layers.46.mlp.down_proj.weight]
Loading weights: 92%|█████████▏| 419/453 [00:00<00:00, 6131.10it/s, Materializing param=model.layers.46.mlp.gate_proj.weight]
Loading weights: 92%|█████████▏| 419/453 [00:00<00:00, 6128.26it/s, Materializing param=model.layers.46.mlp.gate_proj.weight]
Loading weights: 93%|█████████▎| 420/453 [00:00<00:00, 6136.49it/s, Materializing param=model.layers.46.mlp.up_proj.weight]
Loading weights: 93%|█████████▎| 420/453 [00:00<00:00, 6133.33it/s, Materializing param=model.layers.46.mlp.up_proj.weight]
Loading weights: 93%|█████████▎| 421/453 [00:00<00:00, 6141.34it/s, Materializing param=model.layers.46.post_attention_layernorm.weight]
Loading weights: 93%|█████████▎| 421/453 [00:00<00:00, 6138.38it/s, Materializing param=model.layers.46.post_attention_layernorm.weight]
Loading weights: 93%|█████████▎| 422/453 [00:00<00:00, 6146.35it/s, Materializing param=model.layers.46.self_attn.k_proj.weight]
Loading weights: 93%|█████████▎| 422/453 [00:00<00:00, 6143.45it/s, Materializing param=model.layers.46.self_attn.k_proj.weight]
Loading weights: 93%|█████████▎| 423/453 [00:00<00:00, 6151.39it/s, Materializing param=model.layers.46.self_attn.o_proj.weight]
Loading weights: 93%|█████████▎| 423/453 [00:00<00:00, 6148.51it/s, Materializing param=model.layers.46.self_attn.o_proj.weight]
Loading weights: 94%|█████████▎| 424/453 [00:00<00:00, 6156.44it/s, Materializing param=model.layers.46.self_attn.q_proj.weight]
Loading weights: 94%|█████████▎| 424/453 [00:00<00:00, 6153.67it/s, Materializing param=model.layers.46.self_attn.q_proj.weight]
Loading weights: 94%|█████████▍| 425/453 [00:00<00:00, 6161.76it/s, Materializing param=model.layers.46.self_attn.v_proj.weight]
Loading weights: 94%|█████████▍| 425/453 [00:00<00:00, 6158.93it/s, Materializing param=model.layers.46.self_attn.v_proj.weight]
Loading weights: 94%|█████████▍| 426/453 [00:00<00:00, 6166.88it/s, Materializing param=model.layers.47.input_layernorm.weight]
Loading weights: 94%|█████████▍| 426/453 [00:00<00:00, 6164.11it/s, Materializing param=model.layers.47.input_layernorm.weight]
Loading weights: 94%|█████████▍| 427/453 [00:00<00:00, 6172.28it/s, Materializing param=model.layers.47.mlp.down_proj.weight]
Loading weights: 94%|█████████▍| 427/453 [00:00<00:00, 6169.58it/s, Materializing param=model.layers.47.mlp.down_proj.weight]
Loading weights: 94%|█████████▍| 428/453 [00:00<00:00, 6177.82it/s, Materializing param=model.layers.47.mlp.gate_proj.weight]
Loading weights: 94%|█████████▍| 428/453 [00:00<00:00, 6175.16it/s, Materializing param=model.layers.47.mlp.gate_proj.weight]
Loading weights: 95%|█████████▍| 429/453 [00:00<00:00, 6183.23it/s, Materializing param=model.layers.47.mlp.up_proj.weight]
Loading weights: 95%|█████████▍| 429/453 [00:00<00:00, 6180.47it/s, Materializing param=model.layers.47.mlp.up_proj.weight]
Loading weights: 95%|█████████▍| 430/453 [00:00<00:00, 6188.22it/s, Materializing param=model.layers.47.post_attention_layernorm.weight]
Loading weights: 95%|█████████▍| 430/453 [00:00<00:00, 6185.33it/s, Materializing param=model.layers.47.post_attention_layernorm.weight]
Loading weights: 95%|█████████▌| 431/453 [00:00<00:00, 6193.09it/s, Materializing param=model.layers.47.self_attn.k_proj.weight]
Loading weights: 95%|█████████▌| 431/453 [00:00<00:00, 6190.40it/s, Materializing param=model.layers.47.self_attn.k_proj.weight]
Loading weights: 95%|█████████▌| 432/453 [00:00<00:00, 6198.37it/s, Materializing param=model.layers.47.self_attn.o_proj.weight]
Loading weights: 95%|█████████▌| 432/453 [00:00<00:00, 6195.70it/s, Materializing param=model.layers.47.self_attn.o_proj.weight]
Loading weights: 96%|█████████▌| 433/453 [00:00<00:00, 6203.57it/s, Materializing param=model.layers.47.self_attn.q_proj.weight]
Loading weights: 96%|█████████▌| 433/453 [00:00<00:00, 6200.97it/s, Materializing param=model.layers.47.self_attn.q_proj.weight]
Loading weights: 96%|█████████▌| 434/453 [00:00<00:00, 6208.82it/s, Materializing param=model.layers.47.self_attn.v_proj.weight]
Loading weights: 96%|█████████▌| 434/453 [00:00<00:00, 6206.24it/s, Materializing param=model.layers.47.self_attn.v_proj.weight]
Loading weights: 96%|█████████▌| 435/453 [00:00<00:00, 6213.95it/s, Materializing param=model.layers.48.input_layernorm.weight]
Loading weights: 96%|█████████▌| 435/453 [00:00<00:00, 6211.20it/s, Materializing param=model.layers.48.input_layernorm.weight]
Loading weights: 96%|█████████▌| 436/453 [00:00<00:00, 6219.15it/s, Materializing param=model.layers.48.mlp.down_proj.weight]
Loading weights: 96%|█████████▌| 436/453 [00:00<00:00, 6216.38it/s, Materializing param=model.layers.48.mlp.down_proj.weight]
Loading weights: 96%|█████████▋| 437/453 [00:00<00:00, 6224.21it/s, Materializing param=model.layers.48.mlp.gate_proj.weight]
Loading weights: 96%|█████████▋| 437/453 [00:00<00:00, 6221.42it/s, Materializing param=model.layers.48.mlp.gate_proj.weight]
Loading weights: 97%|█████████▋| 438/453 [00:00<00:00, 6229.42it/s, Materializing param=model.layers.48.mlp.up_proj.weight]
Loading weights: 97%|█████████▋| 438/453 [00:00<00:00, 6226.67it/s, Materializing param=model.layers.48.mlp.up_proj.weight]
Loading weights: 97%|█████████▋| 439/453 [00:00<00:00, 6234.55it/s, Materializing param=model.layers.48.post_attention_layernorm.weight]
Loading weights: 97%|█████████▋| 439/453 [00:00<00:00, 6231.72it/s, Materializing param=model.layers.48.post_attention_layernorm.weight]
Loading weights: 97%|█████████▋| 440/453 [00:00<00:00, 6239.29it/s, Materializing param=model.layers.48.self_attn.k_proj.weight]
Loading weights: 97%|█████████▋| 440/453 [00:00<00:00, 6236.40it/s, Materializing param=model.layers.48.self_attn.k_proj.weight]
Loading weights: 97%|█████████▋| 441/453 [00:00<00:00, 6244.20it/s, Materializing param=model.layers.48.self_attn.o_proj.weight]
Loading weights: 97%|█████████▋| 441/453 [00:00<00:00, 6241.48it/s, Materializing param=model.layers.48.self_attn.o_proj.weight]
Loading weights: 98%|█████████▊| 442/453 [00:00<00:00, 6249.31it/s, Materializing param=model.layers.48.self_attn.q_proj.weight]
Loading weights: 98%|█████████▊| 442/453 [00:00<00:00, 6246.49it/s, Materializing param=model.layers.48.self_attn.q_proj.weight]
Loading weights: 98%|█████████▊| 443/453 [00:00<00:00, 6254.34it/s, Materializing param=model.layers.48.self_attn.v_proj.weight]
Loading weights: 98%|█████████▊| 443/453 [00:00<00:00, 6251.54it/s, Materializing param=model.layers.48.self_attn.v_proj.weight]
Loading weights: 98%|█████████▊| 444/453 [00:00<00:00, 6259.29it/s, Materializing param=model.layers.49.input_layernorm.weight]
Loading weights: 98%|█████████▊| 444/453 [00:00<00:00, 6256.62it/s, Materializing param=model.layers.49.input_layernorm.weight]
Loading weights: 98%|█████████▊| 445/453 [00:00<00:00, 6264.34it/s, Materializing param=model.layers.49.mlp.down_proj.weight]
Loading weights: 98%|█████████▊| 445/453 [00:00<00:00, 6261.54it/s, Materializing param=model.layers.49.mlp.down_proj.weight]
Loading weights: 98%|█████████▊| 446/453 [00:00<00:00, 6269.18it/s, Materializing param=model.layers.49.mlp.gate_proj.weight]
Loading weights: 98%|█████████▊| 446/453 [00:00<00:00, 6266.53it/s, Materializing param=model.layers.49.mlp.gate_proj.weight]
Loading weights: 99%|█████████▊| 447/453 [00:00<00:00, 6274.13it/s, Materializing param=model.layers.49.mlp.up_proj.weight]
Loading weights: 99%|█████████▊| 447/453 [00:00<00:00, 6271.36it/s, Materializing param=model.layers.49.mlp.up_proj.weight]
Loading weights: 99%|█████████▉| 448/453 [00:00<00:00, 6270.85it/s, Materializing param=model.layers.49.post_attention_layernorm.weight]
Loading weights: 99%|█████████▉| 448/453 [00:00<00:00, 6267.61it/s, Materializing param=model.layers.49.post_attention_layernorm.weight]
Loading weights: 99%|█████████▉| 449/453 [00:00<00:00, 6276.43it/s, Materializing param=model.layers.49.self_attn.k_proj.weight]
Loading weights: 99%|█████████▉| 449/453 [00:00<00:00, 6273.75it/s, Materializing param=model.layers.49.self_attn.k_proj.weight]
Loading weights: 99%|█████████▉| 450/453 [00:00<00:00, 6282.93it/s, Materializing param=model.layers.49.self_attn.o_proj.weight]
Loading weights: 99%|█████████▉| 450/453 [00:00<00:00, 6280.38it/s, Materializing param=model.layers.49.self_attn.o_proj.weight]
Loading weights: 100%|█████████▉| 451/453 [00:00<00:00, 6289.34it/s, Materializing param=model.layers.49.self_attn.q_proj.weight]
Loading weights: 100%|█████████▉| 451/453 [00:00<00:00, 6286.72it/s, Materializing param=model.layers.49.self_attn.q_proj.weight]
Loading weights: 100%|█████████▉| 452/453 [00:00<00:00, 6295.89it/s, Materializing param=model.layers.49.self_attn.v_proj.weight]
Loading weights: 100%|█████████▉| 452/453 [00:00<00:00, 6293.45it/s, Materializing param=model.layers.49.self_attn.v_proj.weight]
Loading weights: 100%|██████████| 453/453 [00:00<00:00, 6302.56it/s, Materializing param=model.norm.weight]
Loading weights: 100%|██████████| 453/453 [00:00<00:00, 6300.13it/s, Materializing param=model.norm.weight]
Loading weights: 100%|██████████| 453/453 [00:00<00:00, 6293.14it/s, Materializing param=model.norm.weight] |
| I0222 16:34:24.735307 41281 quantize_decompress_robust.py:149] 50 layers total |
| I0222 16:34:24.735381 41281 quantize_decompress_robust.py:159] === Layer 6/50 === |
| I0222 16:34:27.713109 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 16:34:27.713336 41281 quip.py:389] mean square of Wr: 0.0003589858242776245 |
| I0222 16:34:27.722298 41281 quip.py:390] difference between Hr and Hr.T: 9.775161743164062e-06 |
| I0222 16:34:27.722443 41281 quip.py:391] max abs of Hr: 4.624887943267822 |
| I0222 16:34:27.730944 41281 quip.py:392] min diag of Lhr: 0.9359808564186096 |
| I0222 16:34:36.831238 41281 misc.py:25] /tmp/q2_temp/6_qkv.pt frob error: 0.16626720130443573 |
| I0222 16:34:36.831572 41281 misc.py:26] /tmp/q2_temp/6_qkv.pt proxy error: 0.004825077950954437 |
| I0222 16:34:37.288379 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 16:34:39.943430 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:34:39.943617 41281 quip.py:389] mean square of Wr: 0.0019445267971605062 |
| I0222 16:34:39.943944 41281 quip.py:390] difference between Hr and Hr.T: 2.09808349609375e-05 |
| I0222 16:34:39.944055 41281 quip.py:391] max abs of Hr: 14.199299812316895 |
| I0222 16:34:39.944150 41281 quip.py:392] min diag of Lhr: 1.4030505418777466 |
| I0222 16:34:48.027232 41281 misc.py:25] /tmp/q2_temp/6_o.pt frob error: 0.22180239856243134 |
| I0222 16:34:48.027362 41281 misc.py:26] /tmp/q2_temp/6_o.pt proxy error: 0.0258488692343235 |
| I0222 16:34:48.436775 41281 quantize_decompress_robust.py:194] o done |
| I0222 16:34:59.782377 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:34:59.782883 41281 quip.py:389] mean square of Wr: 0.0010649937903508544 |
| I0222 16:34:59.783166 41281 quip.py:390] difference between Hr and Hr.T: 3.5762786865234375e-05 |
| I0222 16:34:59.783275 41281 quip.py:391] max abs of Hr: 39.674983978271484 |
| I0222 16:34:59.783355 41281 quip.py:392] min diag of Lhr: 3.589798927307129 |
| I0222 16:35:20.085968 41281 misc.py:25] /tmp/q2_temp/6_up.pt frob error: 0.1227526068687439 |
| I0222 16:35:20.086100 41281 misc.py:26] /tmp/q2_temp/6_up.pt proxy error: 0.03258279338479042 |
| I0222 16:35:23.207643 41281 quantize_decompress_robust.py:212] up done |
| I0222 16:35:31.127265 41281 quip.py:388] mean square of W: 1.000000238418579 |
| I0222 16:35:31.127592 41281 quip.py:389] mean square of Wr: 0.003196085337549448 |
| I0222 16:35:31.130311 41281 quip.py:390] difference between Hr and Hr.T: 2.5510787963867188e-05 |
| I0222 16:35:31.130976 41281 quip.py:391] max abs of Hr: 17.19301986694336 |
| I0222 16:35:31.131070 41281 quip.py:392] min diag of Lhr: 2.7954678535461426 |
| I0222 16:36:02.251144 41281 misc.py:25] /tmp/q2_temp/6_down.pt frob error: 0.12354319542646408 |
| I0222 16:36:02.251285 41281 misc.py:26] /tmp/q2_temp/6_down.pt proxy error: 0.04477909207344055 |
| I0222 16:36:03.324254 41281 quantize_decompress_robust.py:226] down done |
| I0222 16:36:03.686825 41281 quantize_decompress_robust.py:79] Saved progress for layer 6 (416 MB) |
| I0222 16:36:03.687082 41281 quantize_decompress_robust.py:239] Layer 6 done in 99.0s [7/50] |
| I0222 16:36:03.687133 41281 quantize_decompress_robust.py:159] === Layer 7/50 === |
| I0222 16:36:06.512980 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:36:06.513193 41281 quip.py:389] mean square of Wr: 0.000402717269025743 |
| I0222 16:36:06.513667 41281 quip.py:390] difference between Hr and Hr.T: 8.106231689453125e-06 |
| I0222 16:36:06.513780 41281 quip.py:391] max abs of Hr: 4.7170515060424805 |
| I0222 16:36:06.513867 41281 quip.py:392] min diag of Lhr: 1.0006486177444458 |
| I0222 16:36:15.919051 41281 misc.py:25] /tmp/q2_temp/7_qkv.pt frob error: 0.1618097722530365 |
| I0222 16:36:15.919236 41281 misc.py:26] /tmp/q2_temp/7_qkv.pt proxy error: 0.005370061378926039 |
| I0222 16:36:16.634470 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 16:36:19.564512 41281 quip.py:388] mean square of W: 0.9999999403953552 |
| I0222 16:36:19.564687 41281 quip.py:389] mean square of Wr: 0.0017649403307586908 |
| I0222 16:36:19.564998 41281 quip.py:390] difference between Hr and Hr.T: 1.4543533325195312e-05 |
| I0222 16:36:19.565114 41281 quip.py:391] max abs of Hr: 11.913304328918457 |
| I0222 16:36:19.565211 41281 quip.py:392] min diag of Lhr: 1.418710708618164 |
| I0222 16:36:28.323707 41281 misc.py:25] /tmp/q2_temp/7_o.pt frob error: 0.18744780123233795 |
| I0222 16:36:28.323941 41281 misc.py:26] /tmp/q2_temp/7_o.pt proxy error: 0.030367540195584297 |
| I0222 16:36:28.623557 41281 quantize_decompress_robust.py:194] o done |
| I0222 16:36:41.252835 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:36:41.253344 41281 quip.py:389] mean square of Wr: 0.0009514886187389493 |
| I0222 16:36:41.253634 41281 quip.py:390] difference between Hr and Hr.T: 2.956390380859375e-05 |
| I0222 16:36:41.253741 41281 quip.py:391] max abs of Hr: 35.41178894042969 |
| I0222 16:36:41.253837 41281 quip.py:392] min diag of Lhr: 3.3686931133270264 |
| I0222 16:36:59.287367 41281 misc.py:25] /tmp/q2_temp/7_up.pt frob error: 0.12106442451477051 |
| I0222 16:36:59.287488 41281 misc.py:26] /tmp/q2_temp/7_up.pt proxy error: 0.033247094601392746 |
| I0222 16:37:02.370314 41281 quantize_decompress_robust.py:212] up done |
| I0222 16:37:10.426152 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:37:10.426468 41281 quip.py:389] mean square of Wr: 0.0025785588659346104 |
| I0222 16:37:10.429183 41281 quip.py:390] difference between Hr and Hr.T: 1.8715858459472656e-05 |
| I0222 16:37:10.429845 41281 quip.py:391] max abs of Hr: 12.603669166564941 |
| I0222 16:37:10.429954 41281 quip.py:392] min diag of Lhr: 2.533036708831787 |
| I0222 16:37:44.821143 41281 misc.py:25] /tmp/q2_temp/7_down.pt frob error: 0.12118113785982132 |
| I0222 16:37:44.821273 41281 misc.py:26] /tmp/q2_temp/7_down.pt proxy error: 0.046561334282159805 |
| I0222 16:37:45.916494 41281 quantize_decompress_robust.py:226] down done |
| I0222 16:37:46.285243 41281 quantize_decompress_robust.py:79] Saved progress for layer 7 (416 MB) |
| I0222 16:37:46.285511 41281 quantize_decompress_robust.py:239] Layer 7 done in 102.6s [8/50] |
| I0222 16:37:46.285560 41281 quantize_decompress_robust.py:159] === Layer 8/50 === |
| I0222 16:37:49.021543 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 16:37:49.021736 41281 quip.py:389] mean square of Wr: 0.0003759975079447031 |
| I0222 16:37:49.022253 41281 quip.py:390] difference between Hr and Hr.T: 8.344650268554688e-06 |
| I0222 16:37:49.022366 41281 quip.py:391] max abs of Hr: 4.095651626586914 |
| I0222 16:37:49.022452 41281 quip.py:392] min diag of Lhr: 0.9816886782646179 |
| I0222 16:37:57.833284 41281 misc.py:25] /tmp/q2_temp/8_qkv.pt frob error: 0.15450403094291687 |
| I0222 16:37:57.833442 41281 misc.py:26] /tmp/q2_temp/8_qkv.pt proxy error: 0.005846114829182625 |
| I0222 16:37:58.454804 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 16:38:00.911305 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 16:38:00.911473 41281 quip.py:389] mean square of Wr: 0.0022546867839992046 |
| I0222 16:38:00.911812 41281 quip.py:390] difference between Hr and Hr.T: 2.6226043701171875e-05 |
| I0222 16:38:00.911927 41281 quip.py:391] max abs of Hr: 15.226912498474121 |
| I0222 16:38:00.912004 41281 quip.py:392] min diag of Lhr: 1.6767549514770508 |
| I0222 16:38:09.419327 41281 misc.py:25] /tmp/q2_temp/8_o.pt frob error: 0.15140853822231293 |
| I0222 16:38:09.419471 41281 misc.py:26] /tmp/q2_temp/8_o.pt proxy error: 0.03089079260826111 |
| I0222 16:38:09.737112 41281 quantize_decompress_robust.py:194] o done |
| I0222 16:38:21.865915 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:38:21.866426 41281 quip.py:389] mean square of Wr: 0.000907133100554347 |
| I0222 16:38:21.866738 41281 quip.py:390] difference between Hr and Hr.T: 3.0517578125e-05 |
| I0222 16:38:21.866845 41281 quip.py:391] max abs of Hr: 33.66083908081055 |
| I0222 16:38:21.866964 41281 quip.py:392] min diag of Lhr: 3.442572832107544 |
| I0222 16:38:41.189775 41281 misc.py:25] /tmp/q2_temp/8_up.pt frob error: 0.11971818655729294 |
| I0222 16:38:41.189916 41281 misc.py:26] /tmp/q2_temp/8_up.pt proxy error: 0.03281329199671745 |
| I0222 16:38:44.336200 41281 quantize_decompress_robust.py:212] up done |
| I0222 16:38:52.166524 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:38:52.166843 41281 quip.py:389] mean square of Wr: 0.004406275227665901 |
| I0222 16:38:52.169569 41281 quip.py:390] difference between Hr and Hr.T: 3.0517578125e-05 |
| I0222 16:38:52.170243 41281 quip.py:391] max abs of Hr: 22.718990325927734 |
| I0222 16:38:52.170352 41281 quip.py:392] min diag of Lhr: 3.3187122344970703 |
| I0222 16:39:26.631816 41281 misc.py:25] /tmp/q2_temp/8_down.pt frob error: 0.11861241608858109 |
| I0222 16:39:26.631953 41281 misc.py:26] /tmp/q2_temp/8_down.pt proxy error: 0.04898221418261528 |
| I0222 16:39:27.807429 41281 quantize_decompress_robust.py:226] down done |
| I0222 16:39:28.295262 41281 quantize_decompress_robust.py:79] Saved progress for layer 8 (416 MB) |
| I0222 16:39:28.295527 41281 quantize_decompress_robust.py:239] Layer 8 done in 102.0s [9/50] |
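The "Saved progress for layer N" messages (and the earlier "Progress: 6/50 done, 44 remaining") indicate the run checkpoints after each layer so it can resume mid-job. A minimal sketch of that resume pattern is below; the file name, JSON format, and function names are assumptions for illustration, not quip-sharp's actual implementation.

```python
import json
import os
import tempfile

def load_done(path):
    """Return the set of layer indices completed on previous runs."""
    if os.path.exists(path):
        with open(path) as f:
            return set(json.load(f))
    return set()

def mark_done(path, done, layer):
    """Record a finished layer, using write-then-rename for crash safety."""
    done.add(layer)
    tmp = path + ".tmp"
    with open(tmp, "w") as f:
        json.dump(sorted(done), f)
    os.replace(tmp, path)  # atomic on POSIX: readers never see a partial file

progress_path = os.path.join(tempfile.gettempdir(), "q2_progress.json")
done = load_done(progress_path)
for layer in range(50):
    if layer in done:
        continue  # skip layers quantized on a previous run
    # ... quantize qkv / o / up / down projections for this layer ...
    mark_done(progress_path, done, layer)
```

The write-then-rename step matters here: each layer takes ~100 s and produces ~416 MB of state, so a crash mid-write must not corrupt the progress record that the next invocation reads.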
| I0222 16:39:28.295572 41281 quantize_decompress_robust.py:159] === Layer 9/50 === |
| I0222 16:39:31.125866 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 16:39:31.126063 41281 quip.py:389] mean square of Wr: 0.0004379386955406517 |
| I0222 16:39:31.126586 41281 quip.py:390] difference between Hr and Hr.T: 5.7220458984375e-06 |
| I0222 16:39:31.126700 41281 quip.py:391] max abs of Hr: 5.072149276733398 |
| I0222 16:39:31.126790 41281 quip.py:392] min diag of Lhr: 1.072903037071228 |
| I0222 16:39:40.432020 41281 misc.py:25] /tmp/q2_temp/9_qkv.pt frob error: 0.1489240676164627 |
| I0222 16:39:40.432151 41281 misc.py:26] /tmp/q2_temp/9_qkv.pt proxy error: 0.008931471966207027 |
| I0222 16:39:40.885722 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 16:39:43.463335 41281 quip.py:388] mean square of W: 1.000000238418579 |
| I0222 16:39:43.463522 41281 quip.py:389] mean square of Wr: 0.0026325953658670187 |
| I0222 16:39:43.463868 41281 quip.py:390] difference between Hr and Hr.T: 2.765655517578125e-05 |
| I0222 16:39:43.463984 41281 quip.py:391] max abs of Hr: 19.14781379699707 |
| I0222 16:39:43.464083 41281 quip.py:392] min diag of Lhr: 1.8359148502349854 |
| I0222 16:39:52.420343 41281 misc.py:25] /tmp/q2_temp/9_o.pt frob error: 0.1515427678823471 |
| I0222 16:39:52.420479 41281 misc.py:26] /tmp/q2_temp/9_o.pt proxy error: 0.03595173358917236 |
| I0222 16:39:52.738111 41281 quantize_decompress_robust.py:194] o done |
| I0222 16:40:05.084970 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:40:05.085483 41281 quip.py:389] mean square of Wr: 0.0007785016787238419 |
| I0222 16:40:05.085791 41281 quip.py:390] difference between Hr and Hr.T: 2.574920654296875e-05 |
| I0222 16:40:05.085899 41281 quip.py:391] max abs of Hr: 30.747955322265625 |
| I0222 16:40:05.086011 41281 quip.py:392] min diag of Lhr: 3.1936140060424805 |
| I0222 16:40:24.854889 41281 misc.py:25] /tmp/q2_temp/9_up.pt frob error: 0.11965135484933853 |
| I0222 16:40:24.855238 41281 misc.py:26] /tmp/q2_temp/9_up.pt proxy error: 0.03186498582363129 |
| I0222 16:40:28.073627 41281 quantize_decompress_robust.py:212] up done |
| I0222 16:40:35.758543 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:40:35.758859 41281 quip.py:389] mean square of Wr: 0.0023127724416553974 |
| I0222 16:40:35.761603 41281 quip.py:390] difference between Hr and Hr.T: 1.4185905456542969e-05 |
| I0222 16:40:35.762270 41281 quip.py:391] max abs of Hr: 11.487878799438477 |
| I0222 16:40:35.762376 41281 quip.py:392] min diag of Lhr: 2.4355156421661377 |
| I0222 16:41:09.720964 41281 misc.py:25] /tmp/q2_temp/9_down.pt frob error: 0.11769086867570877 |
| I0222 16:41:09.721215 41281 misc.py:26] /tmp/q2_temp/9_down.pt proxy error: 0.04867607355117798 |
| I0222 16:41:10.817868 41281 quantize_decompress_robust.py:226] down done |
| I0222 16:41:11.200380 41281 quantize_decompress_robust.py:79] Saved progress for layer 9 (416 MB) |
| I0222 16:41:11.200671 41281 quantize_decompress_robust.py:239] Layer 9 done in 102.9s [10/50] |
| I0222 16:41:11.200716 41281 quantize_decompress_robust.py:159] === Layer 10/50 === |
| I0222 16:41:14.054067 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:41:14.054280 41281 quip.py:389] mean square of Wr: 0.0003575569426175207 |
| I0222 16:41:14.054628 41281 quip.py:390] difference between Hr and Hr.T: 7.987022399902344e-06 |
| I0222 16:41:14.054739 41281 quip.py:391] max abs of Hr: 3.973365068435669 |
| I0222 16:41:14.054829 41281 quip.py:392] min diag of Lhr: 0.9810859560966492 |
| I0222 16:41:23.919628 41281 misc.py:25] /tmp/q2_temp/10_qkv.pt frob error: 0.14942626655101776 |
| I0222 16:41:23.919768 41281 misc.py:26] /tmp/q2_temp/10_qkv.pt proxy error: 0.0070838346146047115 |
| I0222 16:41:24.474379 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 16:41:27.028351 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 16:41:27.028526 41281 quip.py:389] mean square of Wr: 0.0025412780232727528 |
| I0222 16:41:27.028857 41281 quip.py:390] difference between Hr and Hr.T: 3.0517578125e-05 |
| I0222 16:41:27.028969 41281 quip.py:391] max abs of Hr: 18.033267974853516 |
| I0222 16:41:27.029063 41281 quip.py:392] min diag of Lhr: 1.6702167987823486 |
| I0222 16:41:35.925074 41281 misc.py:25] /tmp/q2_temp/10_o.pt frob error: 0.16179917752742767 |
| I0222 16:41:35.925209 41281 misc.py:26] /tmp/q2_temp/10_o.pt proxy error: 0.030141880735754967 |
| I0222 16:41:36.325430 41281 quantize_decompress_robust.py:194] o done |
| I0222 16:41:48.868936 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 16:41:48.869441 41281 quip.py:389] mean square of Wr: 0.0006970043177716434 |
| I0222 16:41:48.869739 41281 quip.py:390] difference between Hr and Hr.T: 2.86102294921875e-05 |
| I0222 16:41:48.869846 41281 quip.py:391] max abs of Hr: 27.45383644104004 |
| I0222 16:41:48.869956 41281 quip.py:392] min diag of Lhr: 3.0673351287841797 |
| I0222 16:42:07.555672 41281 misc.py:25] /tmp/q2_temp/10_up.pt frob error: 0.1200333759188652 |
| I0222 16:42:07.555805 41281 misc.py:26] /tmp/q2_temp/10_up.pt proxy error: 0.03064199723303318 |
| I0222 16:42:10.737098 41281 quantize_decompress_robust.py:212] up done |
| I0222 16:42:18.531082 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:42:18.531398 41281 quip.py:389] mean square of Wr: 0.0029280027374625206 |
| I0222 16:42:18.534134 41281 quip.py:390] difference between Hr and Hr.T: 1.7881393432617188e-05 |
| I0222 16:42:18.534799 41281 quip.py:391] max abs of Hr: 15.051030158996582 |
| I0222 16:42:18.534882 41281 quip.py:392] min diag of Lhr: 2.742818832397461 |
| I0222 16:42:52.031752 41281 misc.py:25] /tmp/q2_temp/10_down.pt frob error: 0.1174471527338028 |
| I0222 16:42:52.031882 41281 misc.py:26] /tmp/q2_temp/10_down.pt proxy error: 0.05027260631322861 |
| I0222 16:42:53.110753 41281 quantize_decompress_robust.py:226] down done |
| I0222 16:42:53.480092 41281 quantize_decompress_robust.py:79] Saved progress for layer 10 (416 MB) |
| I0222 16:42:53.480386 41281 quantize_decompress_robust.py:239] Layer 10 done in 102.3s [11/50] |
| I0222 16:42:53.480436 41281 quantize_decompress_robust.py:159] === Layer 11/50 === |
| I0222 16:42:56.735137 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 16:42:56.735331 41281 quip.py:389] mean square of Wr: 0.00040413494571112096 |
| I0222 16:42:56.735851 41281 quip.py:390] difference between Hr and Hr.T: 5.841255187988281e-06 |
| I0222 16:42:56.735963 41281 quip.py:391] max abs of Hr: 4.6356587409973145 |
| I0222 16:42:56.736060 41281 quip.py:392] min diag of Lhr: 1.0466105937957764 |
| I0222 16:43:05.921148 41281 misc.py:25] /tmp/q2_temp/11_qkv.pt frob error: 0.14988087117671967 |
| I0222 16:43:05.921280 41281 misc.py:26] /tmp/q2_temp/11_qkv.pt proxy error: 0.009313972666859627 |
| I0222 16:43:06.431988 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 16:43:08.735385 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:43:08.735560 41281 quip.py:389] mean square of Wr: 0.0022105900570750237 |
| I0222 16:43:08.735885 41281 quip.py:390] difference between Hr and Hr.T: 2.0265579223632812e-05 |
| I0222 16:43:08.735999 41281 quip.py:391] max abs of Hr: 15.064866065979004 |
| I0222 16:43:08.736084 41281 quip.py:392] min diag of Lhr: 1.723968505859375 |
| I0222 16:43:17.719820 41281 misc.py:25] /tmp/q2_temp/11_o.pt frob error: 0.1593584567308426 |
| I0222 16:43:17.719948 41281 misc.py:26] /tmp/q2_temp/11_o.pt proxy error: 0.03329264000058174 |
| I0222 16:43:18.035536 41281 quantize_decompress_robust.py:194] o done |
| I0222 16:43:30.097784 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:43:30.098293 41281 quip.py:389] mean square of Wr: 0.0006848288467153907 |
| I0222 16:43:30.098568 41281 quip.py:390] difference between Hr and Hr.T: 2.9087066650390625e-05 |
| I0222 16:43:30.098675 41281 quip.py:391] max abs of Hr: 27.03841781616211 |
| I0222 16:43:30.098755 41281 quip.py:392] min diag of Lhr: 3.008265733718872 |
| I0222 16:43:48.347398 41281 misc.py:25] /tmp/q2_temp/11_up.pt frob error: 0.11977558583021164 |
| I0222 16:43:48.347544 41281 misc.py:26] /tmp/q2_temp/11_up.pt proxy error: 0.030971581116318703 |
| I0222 16:43:51.473072 41281 quantize_decompress_robust.py:212] up done |
| I0222 16:43:59.344011 41281 quip.py:388] mean square of W: 0.9999998807907104 |
| I0222 16:43:59.344322 41281 quip.py:389] mean square of Wr: 0.002602087100967765 |
| I0222 16:43:59.347069 41281 quip.py:390] difference between Hr and Hr.T: 1.7404556274414062e-05 |
| I0222 16:43:59.347732 41281 quip.py:391] max abs of Hr: 13.500384330749512 |
| I0222 16:43:59.347826 41281 quip.py:392] min diag of Lhr: 2.614137649536133 |
| I0222 16:44:33.933761 41281 misc.py:25] /tmp/q2_temp/11_down.pt frob error: 0.115293949842453 |
| I0222 16:44:33.933929 41281 misc.py:26] /tmp/q2_temp/11_down.pt proxy error: 0.05052240192890167 |
| I0222 16:44:35.192301 41281 quantize_decompress_robust.py:226] down done |
| I0222 16:44:35.631998 41281 quantize_decompress_robust.py:79] Saved progress for layer 11 (416 MB) |
| I0222 16:44:35.632364 41281 quantize_decompress_robust.py:239] Layer 11 done in 102.2s [12/50] |
| I0222 16:44:35.632424 41281 quantize_decompress_robust.py:159] === Layer 12/50 === |
| I0222 16:44:39.389546 41281 quip.py:388] mean square of W: 0.9999999403953552 |
| I0222 16:44:39.389754 41281 quip.py:389] mean square of Wr: 0.0004444332735147327 |
| I0222 16:44:39.390125 41281 quip.py:390] difference between Hr and Hr.T: 9.5367431640625e-06 |
| I0222 16:44:39.390240 41281 quip.py:391] max abs of Hr: 6.896420955657959 |
| I0222 16:44:39.390372 41281 quip.py:392] min diag of Lhr: 1.1127594709396362 |
| I0222 16:44:49.318573 41281 misc.py:25] /tmp/q2_temp/12_qkv.pt frob error: 0.14976657927036285 |
| I0222 16:44:49.318718 41281 misc.py:26] /tmp/q2_temp/12_qkv.pt proxy error: 0.010185549035668373 |
| I0222 16:44:49.778600 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 16:44:52.293732 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:44:52.293912 41281 quip.py:389] mean square of Wr: 0.0029015045147389174 |
| I0222 16:44:52.294252 41281 quip.py:390] difference between Hr and Hr.T: 2.574920654296875e-05 |
| I0222 16:44:52.294360 41281 quip.py:391] max abs of Hr: 20.482192993164062 |
| I0222 16:44:52.294458 41281 quip.py:392] min diag of Lhr: 1.8660067319869995 |
| I0222 16:45:02.126383 41281 misc.py:25] /tmp/q2_temp/12_o.pt frob error: 0.2221638709306717 |
| I0222 16:45:02.126549 41281 misc.py:26] /tmp/q2_temp/12_o.pt proxy error: 0.03602280840277672 |
| I0222 16:45:02.526194 41281 quantize_decompress_robust.py:194] o done |
| I0222 16:45:15.002389 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:45:15.002902 41281 quip.py:389] mean square of Wr: 0.0006387734320014715 |
| I0222 16:45:15.003191 41281 quip.py:390] difference between Hr and Hr.T: 2.6226043701171875e-05 |
| I0222 16:45:15.003298 41281 quip.py:391] max abs of Hr: 25.848098754882812 |
| I0222 16:45:15.003398 41281 quip.py:392] min diag of Lhr: 2.95153546333313 |
| I0222 16:45:33.248032 41281 misc.py:25] /tmp/q2_temp/12_up.pt frob error: 0.11878173798322678 |
| I0222 16:45:33.248251 41281 misc.py:26] /tmp/q2_temp/12_up.pt proxy error: 0.03153599053621292 |
| I0222 16:45:36.420688 41281 quantize_decompress_robust.py:212] up done |
| I0222 16:45:44.452214 41281 quip.py:388] mean square of W: 1.000000238418579 |
| I0222 16:45:44.452534 41281 quip.py:389] mean square of Wr: 0.0025197723880410194 |
| I0222 16:45:44.455270 41281 quip.py:390] difference between Hr and Hr.T: 1.5854835510253906e-05 |
| I0222 16:45:44.455924 41281 quip.py:391] max abs of Hr: 12.35795783996582 |
| I0222 16:45:44.456038 41281 quip.py:392] min diag of Lhr: 2.608415365219116 |
| I0222 16:46:17.356106 41281 misc.py:25] /tmp/q2_temp/12_down.pt frob error: 0.11516763269901276 |
| I0222 16:46:17.356356 41281 misc.py:26] /tmp/q2_temp/12_down.pt proxy error: 0.05224791169166565 |
| I0222 16:46:18.519716 41281 quantize_decompress_robust.py:226] down done |
| I0222 16:46:18.899466 41281 quantize_decompress_robust.py:79] Saved progress for layer 12 (416 MB) |
| I0222 16:46:18.899809 41281 quantize_decompress_robust.py:239] Layer 12 done in 103.3s [13/50] |
| I0222 16:46:18.899878 41281 quantize_decompress_robust.py:159] === Layer 13/50 === |
| I0222 16:46:22.252501 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:46:22.252719 41281 quip.py:389] mean square of Wr: 0.000414924172218889 |
| I0222 16:46:22.253224 41281 quip.py:390] difference between Hr and Hr.T: 6.4373016357421875e-06 |
| I0222 16:46:22.253335 41281 quip.py:391] max abs of Hr: 4.396927356719971 |
| I0222 16:46:22.253422 41281 quip.py:392] min diag of Lhr: 1.1128807067871094 |
| I0222 16:46:31.333192 41281 misc.py:25] /tmp/q2_temp/13_qkv.pt frob error: 0.1384318619966507 |
| I0222 16:46:31.333323 41281 misc.py:26] /tmp/q2_temp/13_qkv.pt proxy error: 0.010058000683784485 |
| I0222 16:46:31.861935 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 16:46:34.135975 41281 quip.py:388] mean square of W: 0.9999998807907104 |
| I0222 16:46:34.136147 41281 quip.py:389] mean square of Wr: 0.0032958185765892267 |
| I0222 16:46:34.136460 41281 quip.py:390] difference between Hr and Hr.T: 2.6226043701171875e-05 |
| I0222 16:46:34.136572 41281 quip.py:391] max abs of Hr: 19.98457908630371 |
| I0222 16:46:34.136652 41281 quip.py:392] min diag of Lhr: 2.275784730911255 |
| I0222 16:46:42.538130 41281 misc.py:25] /tmp/q2_temp/13_o.pt frob error: 0.13393618166446686 |
| I0222 16:46:42.538262 41281 misc.py:26] /tmp/q2_temp/13_o.pt proxy error: 0.04339377582073212 |
| I0222 16:46:42.934686 41281 quantize_decompress_robust.py:194] o done |
| I0222 16:46:54.409210 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:46:54.409731 41281 quip.py:389] mean square of Wr: 0.000735440116841346 |
| I0222 16:46:54.410012 41281 quip.py:390] difference between Hr and Hr.T: 2.5272369384765625e-05 |
| I0222 16:46:54.410117 41281 quip.py:391] max abs of Hr: 27.80887794494629 |
| I0222 16:46:54.410203 41281 quip.py:392] min diag of Lhr: 3.272016763687134 |
| I0222 16:47:13.936526 41281 misc.py:25] /tmp/q2_temp/13_up.pt frob error: 0.11728060245513916 |
| I0222 16:47:13.936660 41281 misc.py:26] /tmp/q2_temp/13_up.pt proxy error: 0.03349560499191284 |
| I0222 16:47:17.185861 41281 quantize_decompress_robust.py:212] up done |
| I0222 16:47:24.846943 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:47:24.847278 41281 quip.py:389] mean square of Wr: 0.0015332384500652552 |
| I0222 16:47:24.849979 41281 quip.py:390] difference between Hr and Hr.T: 7.62939453125e-06 |
| I0222 16:47:24.850645 41281 quip.py:391] max abs of Hr: 7.2641987800598145 |
| I0222 16:47:24.850756 41281 quip.py:392] min diag of Lhr: 2.069241523742676 |
| I0222 16:47:57.425176 41281 misc.py:25] /tmp/q2_temp/13_down.pt frob error: 0.11380613595247269 |
| I0222 16:47:57.425319 41281 misc.py:26] /tmp/q2_temp/13_down.pt proxy error: 0.05520528182387352 |
| I0222 16:47:58.509221 41281 quantize_decompress_robust.py:226] down done |
| I0222 16:47:58.887126 41281 quantize_decompress_robust.py:79] Saved progress for layer 13 (416 MB) |
| I0222 16:47:58.887410 41281 quantize_decompress_robust.py:239] Layer 13 done in 100.0s [14/50] |
| I0222 16:47:58.887454 41281 quantize_decompress_robust.py:159] === Layer 14/50 === |
| I0222 16:48:01.729001 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 16:48:01.729201 41281 quip.py:389] mean square of Wr: 0.00047936703776940703 |
| I0222 16:48:01.729519 41281 quip.py:390] difference between Hr and Hr.T: 7.152557373046875e-06 |
| I0222 16:48:01.729631 41281 quip.py:391] max abs of Hr: 5.445004463195801 |
| I0222 16:48:01.729715 41281 quip.py:392] min diag of Lhr: 1.2035819292068481 |
| I0222 16:48:10.522480 41281 misc.py:25] /tmp/q2_temp/14_qkv.pt frob error: 0.1398346871137619 |
| I0222 16:48:10.522630 41281 misc.py:26] /tmp/q2_temp/14_qkv.pt proxy error: 0.011122402735054493 |
| I0222 16:48:10.966032 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 16:48:13.631851 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:48:13.632020 41281 quip.py:389] mean square of Wr: 0.002620685612782836 |
| I0222 16:48:13.632327 41281 quip.py:390] difference between Hr and Hr.T: 2.8133392333984375e-05 |
| I0222 16:48:13.632442 41281 quip.py:391] max abs of Hr: 18.014883041381836 |
| I0222 16:48:13.632519 41281 quip.py:392] min diag of Lhr: 1.9154529571533203 |
| I0222 16:48:22.319983 41281 misc.py:25] /tmp/q2_temp/14_o.pt frob error: 0.14547376334667206 |
| I0222 16:48:22.320106 41281 misc.py:26] /tmp/q2_temp/14_o.pt proxy error: 0.045944999903440475 |
| I0222 16:48:22.628485 41281 quantize_decompress_robust.py:194] o done |
| I0222 16:48:34.949513 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 16:48:34.950019 41281 quip.py:389] mean square of Wr: 0.0005935472436249256 |
| I0222 16:48:34.950295 41281 quip.py:390] difference between Hr and Hr.T: 2.384185791015625e-05 |
| I0222 16:48:34.950400 41281 quip.py:391] max abs of Hr: 22.557226181030273 |
| I0222 16:48:34.950488 41281 quip.py:392] min diag of Lhr: 2.984339714050293 |
| I0222 16:48:55.189741 41281 misc.py:25] /tmp/q2_temp/14_up.pt frob error: 0.11683830618858337 |
| I0222 16:48:55.189880 41281 misc.py:26] /tmp/q2_temp/14_up.pt proxy error: 0.03180290758609772 |
| I0222 16:48:58.363609 41281 quantize_decompress_robust.py:212] up done |
| I0222 16:49:06.736485 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:49:06.736805 41281 quip.py:389] mean square of Wr: 0.0018804811406880617 |
| I0222 16:49:06.739549 41281 quip.py:390] difference between Hr and Hr.T: 1.3113021850585938e-05 |
| I0222 16:49:06.740211 41281 quip.py:391] max abs of Hr: 10.007909774780273 |
| I0222 16:49:06.740298 41281 quip.py:392] min diag of Lhr: 2.236142635345459 |
| I0222 16:49:41.032201 41281 misc.py:25] /tmp/q2_temp/14_down.pt frob error: 0.1165497675538063 |
| I0222 16:49:41.032339 41281 misc.py:26] /tmp/q2_temp/14_down.pt proxy error: 0.05158313736319542 |
| I0222 16:49:42.144870 41281 quantize_decompress_robust.py:226] down done |
| I0222 16:49:42.517798 41281 quantize_decompress_robust.py:79] Saved progress for layer 14 (416 MB) |
| I0222 16:49:42.518104 41281 quantize_decompress_robust.py:239] Layer 14 done in 103.6s [15/50] |
| I0222 16:49:42.518150 41281 quantize_decompress_robust.py:159] === Layer 15/50 === |
| I0222 16:49:46.258142 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:49:46.258339 41281 quip.py:389] mean square of Wr: 0.00048413651529699564 |
| I0222 16:49:46.258691 41281 quip.py:390] difference between Hr and Hr.T: 8.702278137207031e-06 |
| I0222 16:49:46.258800 41281 quip.py:391] max abs of Hr: 4.993661403656006 |
| I0222 16:49:46.258897 41281 quip.py:392] min diag of Lhr: 1.2152736186981201 |
| I0222 16:49:56.323207 41281 misc.py:25] /tmp/q2_temp/15_qkv.pt frob error: 0.1360544115304947 |
| I0222 16:49:56.323324 41281 misc.py:26] /tmp/q2_temp/15_qkv.pt proxy error: 0.013176986947655678 |
| I0222 16:49:56.843209 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 16:49:59.542572 41281 quip.py:388] mean square of W: 0.9999998807907104 |
| I0222 16:49:59.542738 41281 quip.py:389] mean square of Wr: 0.0025247675366699696 |
| I0222 16:49:59.543074 41281 quip.py:390] difference between Hr and Hr.T: 1.6689300537109375e-05 |
| I0222 16:49:59.543186 41281 quip.py:391] max abs of Hr: 18.257862091064453 |
| I0222 16:49:59.543273 41281 quip.py:392] min diag of Lhr: 1.8209843635559082 |
| I0222 16:50:08.526134 41281 misc.py:25] /tmp/q2_temp/15_o.pt frob error: 0.18952539563179016 |
| I0222 16:50:08.526286 41281 misc.py:26] /tmp/q2_temp/15_o.pt proxy error: 0.041246797889471054 |
| I0222 16:50:08.926916 41281 quantize_decompress_robust.py:194] o done |
| I0222 16:50:21.475279 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 16:50:21.475792 41281 quip.py:389] mean square of Wr: 0.000777874025516212 |
| I0222 16:50:21.476074 41281 quip.py:390] difference between Hr and Hr.T: 3.24249267578125e-05 |
| I0222 16:50:21.476180 41281 quip.py:391] max abs of Hr: 33.21556854248047 |
| I0222 16:50:21.476269 41281 quip.py:392] min diag of Lhr: 3.3313608169555664 |
| I0222 16:50:40.146952 41281 misc.py:25] /tmp/q2_temp/15_up.pt frob error: 0.11708223819732666 |
| I0222 16:50:40.147097 41281 misc.py:26] /tmp/q2_temp/15_up.pt proxy error: 0.03148098662495613 |
| I0222 16:50:43.316018 41281 quantize_decompress_robust.py:212] up done |
| I0222 16:50:51.059007 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:50:51.059326 41281 quip.py:389] mean square of Wr: 0.0018741014646366239 |
| I0222 16:50:51.062064 41281 quip.py:390] difference between Hr and Hr.T: 1.0132789611816406e-05 |
| I0222 16:50:51.062735 41281 quip.py:391] max abs of Hr: 9.85262680053711 |
| I0222 16:50:51.062829 41281 quip.py:392] min diag of Lhr: 2.216912269592285 |
| I0222 16:51:25.451926 41281 misc.py:25] /tmp/q2_temp/15_down.pt frob error: 0.11761819571256638 |
| I0222 16:51:25.452059 41281 misc.py:26] /tmp/q2_temp/15_down.pt proxy error: 0.05309982970356941 |
| I0222 16:51:26.649389 41281 quantize_decompress_robust.py:226] down done |
| I0222 16:51:27.028870 41281 quantize_decompress_robust.py:79] Saved progress for layer 15 (416 MB) |
| I0222 16:51:27.029194 41281 quantize_decompress_robust.py:239] Layer 15 done in 104.5s [16/50] |
| I0222 16:51:27.029251 41281 quantize_decompress_robust.py:159] === Layer 16/50 === |
| I0222 16:51:30.971825 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 16:51:30.972032 41281 quip.py:389] mean square of Wr: 0.0004681006248574704 |
| I0222 16:51:30.972584 41281 quip.py:390] difference between Hr and Hr.T: 9.775161743164062e-06 |
| I0222 16:51:30.972699 41281 quip.py:391] max abs of Hr: 5.527798175811768 |
| I0222 16:51:30.972802 41281 quip.py:392] min diag of Lhr: 1.2040826082229614 |
| I0222 16:51:40.126875 41281 misc.py:25] /tmp/q2_temp/16_qkv.pt frob error: 0.13166989386081696 |
| I0222 16:51:40.127041 41281 misc.py:26] /tmp/q2_temp/16_qkv.pt proxy error: 0.01301662903279066 |
| I0222 16:51:40.674262 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 16:51:43.646307 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:51:43.646515 41281 quip.py:389] mean square of Wr: 0.0031152586452662945 |
| I0222 16:51:43.646903 41281 quip.py:390] difference between Hr and Hr.T: 2.86102294921875e-05 |
| I0222 16:51:43.647018 41281 quip.py:391] max abs of Hr: 21.383481979370117 |
| I0222 16:51:43.647134 41281 quip.py:392] min diag of Lhr: 2.1595211029052734 |
| I0222 16:51:52.816121 41281 misc.py:25] /tmp/q2_temp/16_o.pt frob error: 0.1352829933166504 |
| I0222 16:51:52.816249 41281 misc.py:26] /tmp/q2_temp/16_o.pt proxy error: 0.04417947307229042 |
| I0222 16:51:53.123446 41281 quantize_decompress_robust.py:194] o done |
| I0222 16:52:05.049370 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:52:05.049877 41281 quip.py:389] mean square of Wr: 0.0005132529186084867 |
| I0222 16:52:05.050165 41281 quip.py:390] difference between Hr and Hr.T: 2.5272369384765625e-05 |
| I0222 16:52:05.050273 41281 quip.py:391] max abs of Hr: 23.37834358215332 |
| I0222 16:52:05.050372 41281 quip.py:392] min diag of Lhr: 2.7413206100463867 |
| I0222 16:52:23.865857 41281 misc.py:25] /tmp/q2_temp/16_up.pt frob error: 0.11709681153297424 |
| I0222 16:52:23.865985 41281 misc.py:26] /tmp/q2_temp/16_up.pt proxy error: 0.0297560878098011 |
| I0222 16:52:26.987128 41281 quantize_decompress_robust.py:212] up done |
| I0222 16:52:35.481338 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:52:35.481654 41281 quip.py:389] mean square of Wr: 0.0017339019104838371 |
| I0222 16:52:35.484391 41281 quip.py:390] difference between Hr and Hr.T: 1.1444091796875e-05 |
| I0222 16:52:35.485060 41281 quip.py:391] max abs of Hr: 8.75611400604248 |
| I0222 16:52:35.485154 41281 quip.py:392] min diag of Lhr: 2.1059932708740234 |
| I0222 16:53:08.953874 41281 misc.py:25] /tmp/q2_temp/16_down.pt frob error: 0.1193924993276596 |
| I0222 16:53:08.954000 41281 misc.py:26] /tmp/q2_temp/16_down.pt proxy error: 0.04970523715019226 |
| I0222 16:53:10.127648 41281 quantize_decompress_robust.py:226] down done |
| I0222 16:53:10.498789 41281 quantize_decompress_robust.py:79] Saved progress for layer 16 (416 MB) |
| I0222 16:53:10.499107 41281 quantize_decompress_robust.py:239] Layer 16 done in 103.5s [17/50] |
| I0222 16:53:10.499161 41281 quantize_decompress_robust.py:159] === Layer 17/50 === |
| I0222 16:53:14.145819 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:53:14.146024 41281 quip.py:389] mean square of Wr: 0.0004187328740954399 |
| I0222 16:53:14.146498 41281 quip.py:390] difference between Hr and Hr.T: 6.9141387939453125e-06 |
| I0222 16:53:14.146625 41281 quip.py:391] max abs of Hr: 4.684053421020508 |
| I0222 16:53:14.146723 41281 quip.py:392] min diag of Lhr: 1.123374581336975 |
| I0222 16:53:23.232728 41281 misc.py:25] /tmp/q2_temp/17_qkv.pt frob error: 0.13813510537147522 |
| I0222 16:53:23.232852 41281 misc.py:26] /tmp/q2_temp/17_qkv.pt proxy error: 0.010103990323841572 |
| I0222 16:53:23.693306 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 16:53:25.520669 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:53:25.520860 41281 quip.py:389] mean square of Wr: 0.003383580595254898 |
| I0222 16:53:25.521173 41281 quip.py:390] difference between Hr and Hr.T: 2.1457672119140625e-05 |
| I0222 16:53:25.521290 41281 quip.py:391] max abs of Hr: 21.66204261779785 |
| I0222 16:53:25.521377 41281 quip.py:392] min diag of Lhr: 2.2414939403533936 |
| I0222 16:53:34.224634 41281 misc.py:25] /tmp/q2_temp/17_o.pt frob error: 0.15789549052715302 |
| I0222 16:53:34.224757 41281 misc.py:26] /tmp/q2_temp/17_o.pt proxy error: 0.03870975598692894 |
| I0222 16:53:34.531105 41281 quantize_decompress_robust.py:194] o done |
| I0222 16:53:45.964232 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:53:45.964728 41281 quip.py:389] mean square of Wr: 0.0008669139351695776 |
| I0222 16:53:45.965014 41281 quip.py:390] difference between Hr and Hr.T: 3.3855438232421875e-05 |
| I0222 16:53:45.965124 41281 quip.py:391] max abs of Hr: 35.89528274536133 |
| I0222 16:53:45.965221 41281 quip.py:392] min diag of Lhr: 3.381849527359009 |
| I0222 16:54:05.453678 41281 misc.py:25] /tmp/q2_temp/17_up.pt frob error: 0.11662574857473373 |
| I0222 16:54:05.453805 41281 misc.py:26] /tmp/q2_temp/17_up.pt proxy error: 0.030788611620664597 |
| I0222 16:54:08.557298 41281 quantize_decompress_robust.py:212] up done |
| I0222 16:54:17.379848 41281 quip.py:388] mean square of W: 0.9999998807907104 |
| I0222 16:54:17.380161 41281 quip.py:389] mean square of Wr: 0.0008929643663577735 |
| I0222 16:54:17.382905 41281 quip.py:390] difference between Hr and Hr.T: 5.841255187988281e-06 |
| I0222 16:54:17.383572 41281 quip.py:391] max abs of Hr: 4.46522331237793 |
| I0222 16:54:17.383664 41281 quip.py:392] min diag of Lhr: 1.5406675338745117 |
| I0222 16:54:49.129086 41281 misc.py:25] /tmp/q2_temp/17_down.pt frob error: 0.11775645613670349 |
| I0222 16:54:49.129252 41281 misc.py:26] /tmp/q2_temp/17_down.pt proxy error: 0.052612073719501495 |
| I0222 16:54:50.212579 41281 quantize_decompress_robust.py:226] down done |
| I0222 16:54:50.734596 41281 quantize_decompress_robust.py:79] Saved progress for layer 17 (416 MB) |
| I0222 16:54:50.734909 41281 quantize_decompress_robust.py:239] Layer 17 done in 100.2s [18/50] |
| I0222 16:54:50.734955 41281 quantize_decompress_robust.py:159] === Layer 18/50 === |
| I0222 16:54:53.424857 41281 quip.py:388] mean square of W: 0.9999999403953552 |
| I0222 16:54:53.425060 41281 quip.py:389] mean square of Wr: 0.0004240343696437776 |
| I0222 16:54:53.425534 41281 quip.py:390] difference between Hr and Hr.T: 6.67572021484375e-06 |
| I0222 16:54:53.425652 41281 quip.py:391] max abs of Hr: 4.2239508628845215 |
| I0222 16:54:53.425733 41281 quip.py:392] min diag of Lhr: 1.1379287242889404 |
| I0222 16:55:02.134520 41281 misc.py:25] /tmp/q2_temp/18_qkv.pt frob error: 0.1371413618326187 |
| I0222 16:55:02.134654 41281 misc.py:26] /tmp/q2_temp/18_qkv.pt proxy error: 0.011949711479246616 |
| I0222 16:55:02.651694 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 16:55:05.639206 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:55:05.639379 41281 quip.py:389] mean square of Wr: 0.002454286441206932 |
| I0222 16:55:05.639701 41281 quip.py:390] difference between Hr and Hr.T: 3.1948089599609375e-05 |
| I0222 16:55:05.639811 41281 quip.py:391] max abs of Hr: 16.938425064086914 |
| I0222 16:55:05.639893 41281 quip.py:392] min diag of Lhr: 1.7909947633743286 |
| I0222 16:55:14.719856 41281 misc.py:25] /tmp/q2_temp/18_o.pt frob error: 0.16257372498512268 |
| I0222 16:55:14.719975 41281 misc.py:26] /tmp/q2_temp/18_o.pt proxy error: 0.025027725845575333 |
| I0222 16:55:15.034730 41281 quantize_decompress_robust.py:194] o done |
| I0222 16:55:27.751981 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:55:27.752485 41281 quip.py:389] mean square of Wr: 0.0005537284887395799 |
| I0222 16:55:27.752769 41281 quip.py:390] difference between Hr and Hr.T: 2.956390380859375e-05 |
| I0222 16:55:27.752873 41281 quip.py:391] max abs of Hr: 22.718069076538086 |
| I0222 16:55:27.752966 41281 quip.py:392] min diag of Lhr: 2.8445658683776855 |
| I0222 16:55:46.345749 41281 misc.py:25] /tmp/q2_temp/18_up.pt frob error: 0.11743556708097458 |
| I0222 16:55:46.345877 41281 misc.py:26] /tmp/q2_temp/18_up.pt proxy error: 0.028061529621481895 |
| I0222 16:55:49.518701 41281 quantize_decompress_robust.py:212] up done |
| I0222 16:55:57.862697 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:55:57.863005 41281 quip.py:389] mean square of Wr: 0.0010303562739863992 |
| I0222 16:55:57.865754 41281 quip.py:390] difference between Hr and Hr.T: 7.033348083496094e-06 |
| I0222 16:55:57.866416 41281 quip.py:391] max abs of Hr: 5.4190521240234375 |
| I0222 16:55:57.866507 41281 quip.py:392] min diag of Lhr: 1.5889480113983154 |
| I0222 16:56:32.629878 41281 misc.py:25] /tmp/q2_temp/18_down.pt frob error: 0.1220395490527153 |
| I0222 16:56:32.630007 41281 misc.py:26] /tmp/q2_temp/18_down.pt proxy error: 0.04871528223156929 |
| I0222 16:56:33.731996 41281 quantize_decompress_robust.py:226] down done |
| I0222 16:56:34.110359 41281 quantize_decompress_robust.py:79] Saved progress for layer 18 (416 MB) |
| I0222 16:56:34.110685 41281 quantize_decompress_robust.py:239] Layer 18 done in 103.4s [19/50] |
| I0222 16:56:34.110736 41281 quantize_decompress_robust.py:159] === Layer 19/50 === |
| I0222 16:56:36.951416 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:56:36.951629 41281 quip.py:389] mean square of Wr: 0.00040895582060329616 |
| I0222 16:56:36.952130 41281 quip.py:390] difference between Hr and Hr.T: 6.4373016357421875e-06 |
| I0222 16:56:36.952243 41281 quip.py:391] max abs of Hr: 5.07997989654541 |
| I0222 16:56:36.952342 41281 quip.py:392] min diag of Lhr: 1.1201145648956299 |
| I0222 16:56:46.439757 41281 misc.py:25] /tmp/q2_temp/19_qkv.pt frob error: 0.13503634929656982 |
| I0222 16:56:46.439906 41281 misc.py:26] /tmp/q2_temp/19_qkv.pt proxy error: 0.011372024193406105 |
| I0222 16:56:46.862850 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 16:56:49.514500 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 16:56:49.514680 41281 quip.py:389] mean square of Wr: 0.003662597853690386 |
| I0222 16:56:49.514986 41281 quip.py:390] difference between Hr and Hr.T: 3.62396240234375e-05 |
| I0222 16:56:49.515103 41281 quip.py:391] max abs of Hr: 23.219751358032227 |
| I0222 16:56:49.515180 41281 quip.py:392] min diag of Lhr: 2.170038938522339 |
| I0222 16:56:58.025125 41281 misc.py:25] /tmp/q2_temp/19_o.pt frob error: 0.15457911789417267 |
| I0222 16:56:58.025257 41281 misc.py:26] /tmp/q2_temp/19_o.pt proxy error: 0.017111478373408318 |
| I0222 16:56:58.419448 41281 quantize_decompress_robust.py:194] o done |
| I0222 16:57:10.501315 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:57:10.501826 41281 quip.py:389] mean square of Wr: 0.00024055506219156086 |
| I0222 16:57:10.502121 41281 quip.py:390] difference between Hr and Hr.T: 1.1444091796875e-05 |
| I0222 16:57:10.502225 41281 quip.py:391] max abs of Hr: 10.050774574279785 |
| I0222 16:57:10.502308 41281 quip.py:392] min diag of Lhr: 1.813976764678955 |
| I0222 16:57:28.139908 41281 misc.py:25] /tmp/q2_temp/19_up.pt frob error: 0.11812211573123932 |
| I0222 16:57:28.140046 41281 misc.py:26] /tmp/q2_temp/19_up.pt proxy error: 0.026788894087076187 |
| I0222 16:57:31.279144 41281 quantize_decompress_robust.py:212] up done |
| I0222 16:57:39.352813 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 16:57:39.353135 41281 quip.py:389] mean square of Wr: 0.0013120643561705947 |
| I0222 16:57:39.355849 41281 quip.py:390] difference between Hr and Hr.T: 1.0967254638671875e-05 |
| I0222 16:57:39.356501 41281 quip.py:391] max abs of Hr: 7.040231704711914 |
| I0222 16:57:39.356606 41281 quip.py:392] min diag of Lhr: 1.762155532836914 |
| I0222 16:58:12.628087 41281 misc.py:25] /tmp/q2_temp/19_down.pt frob error: 0.12307731807231903 |
| I0222 16:58:12.628246 41281 misc.py:26] /tmp/q2_temp/19_down.pt proxy error: 0.04636365920305252 |
| I0222 16:58:13.710037 41281 quantize_decompress_robust.py:226] down done |
| I0222 16:58:14.069844 41281 quantize_decompress_robust.py:79] Saved progress for layer 19 (416 MB) |
| I0222 16:58:14.070148 41281 quantize_decompress_robust.py:239] Layer 19 done in 100.0s [20/50] |
| I0222 16:58:14.070204 41281 quantize_decompress_robust.py:159] === Layer 20/50 === |
| I0222 16:58:17.032447 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:58:17.032660 41281 quip.py:389] mean square of Wr: 0.00043009844375774264 |
| I0222 16:58:17.033156 41281 quip.py:390] difference between Hr and Hr.T: 7.62939453125e-06 |
| I0222 16:58:17.033272 41281 quip.py:391] max abs of Hr: 5.659342288970947 |
| I0222 16:58:17.033360 41281 quip.py:392] min diag of Lhr: 1.1157407760620117 |
| I0222 16:58:25.722745 41281 misc.py:25] /tmp/q2_temp/20_qkv.pt frob error: 0.14119307696819305 |
| I0222 16:58:25.723007 41281 misc.py:26] /tmp/q2_temp/20_qkv.pt proxy error: 0.009685168974101543 |
| I0222 16:58:26.164272 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 16:58:28.627551 41281 quip.py:388] mean square of W: 1.000000238418579 |
| I0222 16:58:28.627725 41281 quip.py:389] mean square of Wr: 0.0020861211232841015 |
| I0222 16:58:28.628064 41281 quip.py:390] difference between Hr and Hr.T: 2.0265579223632812e-05 |
| I0222 16:58:28.628181 41281 quip.py:391] max abs of Hr: 13.32692813873291 |
| I0222 16:58:28.628271 41281 quip.py:392] min diag of Lhr: 1.7959611415863037 |
| I0222 16:58:36.627185 41281 misc.py:25] /tmp/q2_temp/20_o.pt frob error: 0.18694713711738586 |
| I0222 16:58:36.627327 41281 misc.py:26] /tmp/q2_temp/20_o.pt proxy error: 0.014296190813183784 |
| I0222 16:58:37.022922 41281 quantize_decompress_robust.py:194] o done |
| I0222 16:58:50.618655 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:58:50.619168 41281 quip.py:389] mean square of Wr: 0.0006312636542133987 |
| I0222 16:58:50.619463 41281 quip.py:390] difference between Hr and Hr.T: 2.3365020751953125e-05 |
| I0222 16:58:50.619574 41281 quip.py:391] max abs of Hr: 23.29350471496582 |
| I0222 16:58:50.619679 41281 quip.py:392] min diag of Lhr: 2.837581157684326 |
| I0222 16:59:08.779019 41281 misc.py:25] /tmp/q2_temp/20_up.pt frob error: 0.11763779073953629 |
| I0222 16:59:08.779142 41281 misc.py:26] /tmp/q2_temp/20_up.pt proxy error: 0.029806828126311302 |
| I0222 16:59:11.912597 41281 quantize_decompress_robust.py:212] up done |
| I0222 16:59:19.938528 41281 quip.py:388] mean square of W: 1.0 |
| I0222 16:59:19.938845 41281 quip.py:389] mean square of Wr: 0.0005041625699959695 |
| I0222 16:59:19.941564 41281 quip.py:390] difference between Hr and Hr.T: 3.725290298461914e-06 |
| I0222 16:59:19.942225 41281 quip.py:391] max abs of Hr: 2.6095478534698486 |
| I0222 16:59:19.942321 41281 quip.py:392] min diag of Lhr: 1.1204609870910645 |
| I0222 16:59:52.529647 41281 misc.py:25] /tmp/q2_temp/20_down.pt frob error: 0.11875497549772263 |
| I0222 16:59:52.529780 41281 misc.py:26] /tmp/q2_temp/20_down.pt proxy error: 0.042808882892131805 |
| I0222 16:59:53.622264 41281 quantize_decompress_robust.py:226] down done |
| I0222 16:59:53.986733 41281 quantize_decompress_robust.py:79] Saved progress for layer 20 (416 MB) |
| I0222 16:59:53.987035 41281 quantize_decompress_robust.py:239] Layer 20 done in 99.9s [21/50] |
| I0222 16:59:53.987080 41281 quantize_decompress_robust.py:159] === Layer 21/50 === |
| I0222 16:59:56.820378 41281 quip.py:388] mean square of W: 1.000000238418579 |
| I0222 16:59:56.820580 41281 quip.py:389] mean square of Wr: 0.0003902136813849211 |
| I0222 16:59:56.821148 41281 quip.py:390] difference between Hr and Hr.T: 7.987022399902344e-06 |
| I0222 16:59:56.821260 41281 quip.py:391] max abs of Hr: 4.416016578674316 |
| I0222 16:59:56.821346 41281 quip.py:392] min diag of Lhr: 1.0527969598770142 |
| I0222 17:00:05.638741 41281 misc.py:25] /tmp/q2_temp/21_qkv.pt frob error: 0.14277081191539764 |
| I0222 17:00:05.638870 41281 misc.py:26] /tmp/q2_temp/21_qkv.pt proxy error: 0.008287270553410053 |
| I0222 17:00:06.140955 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:00:08.424009 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:00:08.424177 41281 quip.py:389] mean square of Wr: 0.0020921556279063225 |
| I0222 17:00:08.424489 41281 quip.py:390] difference between Hr and Hr.T: 1.3589859008789062e-05 |
| I0222 17:00:08.424614 41281 quip.py:391] max abs of Hr: 12.474000930786133 |
| I0222 17:00:08.424692 41281 quip.py:392] min diag of Lhr: 1.6426812410354614 |
| I0222 17:00:16.722287 41281 misc.py:25] /tmp/q2_temp/21_o.pt frob error: 0.2727900743484497 |
| I0222 17:00:16.722415 41281 misc.py:26] /tmp/q2_temp/21_o.pt proxy error: 0.025599390268325806 |
| I0222 17:00:17.027273 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:00:29.623398 41281 quip.py:388] mean square of W: 0.9999998807907104 |
| I0222 17:00:29.623905 41281 quip.py:389] mean square of Wr: 0.0008239537128247321 |
| I0222 17:00:29.624176 41281 quip.py:390] difference between Hr and Hr.T: 2.9087066650390625e-05 |
| I0222 17:00:29.624280 41281 quip.py:391] max abs of Hr: 31.794822692871094 |
| I0222 17:00:29.624359 41281 quip.py:392] min diag of Lhr: 3.3802525997161865 |
| I0222 17:00:48.440940 41281 misc.py:25] /tmp/q2_temp/21_up.pt frob error: 0.11743388324975967 |
| I0222 17:00:48.441064 41281 misc.py:26] /tmp/q2_temp/21_up.pt proxy error: 0.031545575708150864 |
| I0222 17:00:51.605269 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:00:59.825983 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:00:59.826306 41281 quip.py:389] mean square of Wr: 0.0006956174620427191 |
| I0222 17:00:59.829028 41281 quip.py:390] difference between Hr and Hr.T: 4.1425228118896484e-06 |
| I0222 17:00:59.829679 41281 quip.py:391] max abs of Hr: 3.466911792755127 |
| I0222 17:00:59.829761 41281 quip.py:392] min diag of Lhr: 1.3381156921386719 |
| I0222 17:01:32.223311 41281 misc.py:25] /tmp/q2_temp/21_down.pt frob error: 0.11643578857183456 |
| I0222 17:01:32.223477 41281 misc.py:26] /tmp/q2_temp/21_down.pt proxy error: 0.047288138419389725 |
| I0222 17:01:33.316757 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:01:33.691017 41281 quantize_decompress_robust.py:79] Saved progress for layer 21 (416 MB) |
| I0222 17:01:33.691325 41281 quantize_decompress_robust.py:239] Layer 21 done in 99.7s [22/50] |
| I0222 17:01:33.691368 41281 quantize_decompress_robust.py:159] === Layer 22/50 === |
| I0222 17:01:37.230617 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:01:37.230823 41281 quip.py:389] mean square of Wr: 0.0003498316218610853 |
| I0222 17:01:37.231148 41281 quip.py:390] difference between Hr and Hr.T: 6.9141387939453125e-06 |
| I0222 17:01:37.231263 41281 quip.py:391] max abs of Hr: 3.5973761081695557 |
| I0222 17:01:37.231358 41281 quip.py:392] min diag of Lhr: 0.9940224885940552 |
| I0222 17:01:46.223536 41281 misc.py:25] /tmp/q2_temp/22_qkv.pt frob error: 0.14523471891880035 |
| I0222 17:01:46.223657 41281 misc.py:26] /tmp/q2_temp/22_qkv.pt proxy error: 0.008231200277805328 |
| I0222 17:01:47.106458 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:01:50.162810 41281 quip.py:388] mean square of W: 0.9999999403953552 |
| I0222 17:01:50.162997 41281 quip.py:389] mean square of Wr: 0.0017089198809117079 |
| I0222 17:01:50.163341 41281 quip.py:390] difference between Hr and Hr.T: 1.7404556274414062e-05 |
| I0222 17:01:50.163450 41281 quip.py:391] max abs of Hr: 11.776627540588379 |
| I0222 17:01:50.163550 41281 quip.py:392] min diag of Lhr: 1.3710404634475708 |
| I0222 17:01:59.015589 41281 misc.py:25] /tmp/q2_temp/22_o.pt frob error: 0.21437162160873413 |
| I0222 17:01:59.015747 41281 misc.py:26] /tmp/q2_temp/22_o.pt proxy error: 0.023401955142617226 |
| I0222 17:01:59.331559 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:02:13.192710 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:02:13.193216 41281 quip.py:389] mean square of Wr: 0.0008141219150274992 |
| I0222 17:02:13.193506 41281 quip.py:390] difference between Hr and Hr.T: 3.4332275390625e-05 |
| I0222 17:02:13.193616 41281 quip.py:391] max abs of Hr: 33.53421401977539 |
| I0222 17:02:13.193724 41281 quip.py:392] min diag of Lhr: 3.4228575229644775 |
| I0222 17:02:32.554421 41281 misc.py:25] /tmp/q2_temp/22_up.pt frob error: 0.11681568622589111 |
| I0222 17:02:32.554546 41281 misc.py:26] /tmp/q2_temp/22_up.pt proxy error: 0.03643285110592842 |
| I0222 17:02:35.587948 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:02:43.429760 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:02:43.430077 41281 quip.py:389] mean square of Wr: 0.0017993975197896361 |
| I0222 17:02:43.432802 41281 quip.py:390] difference between Hr and Hr.T: 7.867813110351562e-06 |
| I0222 17:02:43.433471 41281 quip.py:391] max abs of Hr: 8.722585678100586 |
| I0222 17:02:43.433571 41281 quip.py:392] min diag of Lhr: 2.2337160110473633 |
| I0222 17:03:17.148589 41281 misc.py:25] /tmp/q2_temp/22_down.pt frob error: 0.11162256449460983 |
| I0222 17:03:17.148706 41281 misc.py:26] /tmp/q2_temp/22_down.pt proxy error: 0.05858625844120979 |
| I0222 17:03:18.213953 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:03:18.591563 41281 quantize_decompress_robust.py:79] Saved progress for layer 22 (416 MB) |
| I0222 17:03:18.591874 41281 quantize_decompress_robust.py:239] Layer 22 done in 104.9s [23/50] |
| I0222 17:03:18.591916 41281 quantize_decompress_robust.py:159] === Layer 23/50 === |
| I0222 17:03:21.524380 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:03:21.524584 41281 quip.py:389] mean square of Wr: 0.00035011948784813285 |
| I0222 17:03:21.524921 41281 quip.py:390] difference between Hr and Hr.T: 8.344650268554688e-06 |
| I0222 17:03:21.525035 41281 quip.py:391] max abs of Hr: 4.155845642089844 |
| I0222 17:03:21.525139 41281 quip.py:392] min diag of Lhr: 1.016338586807251 |
| I0222 17:03:30.233879 41281 misc.py:25] /tmp/q2_temp/23_qkv.pt frob error: 0.14121900498867035 |
| I0222 17:03:30.234003 41281 misc.py:26] /tmp/q2_temp/23_qkv.pt proxy error: 0.00782094243913889 |
| I0222 17:03:30.700406 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:03:33.179358 41281 quip.py:388] mean square of W: 0.9999999403953552 |
| I0222 17:03:33.179534 41281 quip.py:389] mean square of Wr: 0.0038940836675465107 |
| I0222 17:03:33.179867 41281 quip.py:390] difference between Hr and Hr.T: 4.3392181396484375e-05 |
| I0222 17:03:33.179981 41281 quip.py:391] max abs of Hr: 25.770116806030273 |
| I0222 17:03:33.180074 41281 quip.py:392] min diag of Lhr: 1.7734248638153076 |
| I0222 17:03:41.916626 41281 misc.py:25] /tmp/q2_temp/23_o.pt frob error: 0.2700197398662567 |
| I0222 17:03:41.916770 41281 misc.py:26] /tmp/q2_temp/23_o.pt proxy error: 0.020714513957500458 |
| I0222 17:03:42.224332 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:03:54.572670 41281 quip.py:388] mean square of W: 1.000000238418579 |
| I0222 17:03:54.573191 41281 quip.py:389] mean square of Wr: 0.0005626058555208147 |
| I0222 17:03:54.573493 41281 quip.py:390] difference between Hr and Hr.T: 1.8596649169921875e-05 |
| I0222 17:03:54.573605 41281 quip.py:391] max abs of Hr: 21.442054748535156 |
| I0222 17:03:54.573725 41281 quip.py:392] min diag of Lhr: 2.7969703674316406 |
| I0222 17:04:15.148527 41281 misc.py:25] /tmp/q2_temp/23_up.pt frob error: 0.11586294323205948 |
| I0222 17:04:15.148699 41281 misc.py:26] /tmp/q2_temp/23_up.pt proxy error: 0.03906916454434395 |
| I0222 17:04:20.375294 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:04:28.723381 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:04:28.723704 41281 quip.py:389] mean square of Wr: 0.003340332070365548 |
| I0222 17:04:28.726450 41281 quip.py:390] difference between Hr and Hr.T: 1.4781951904296875e-05 |
| I0222 17:04:28.727116 41281 quip.py:391] max abs of Hr: 15.906333923339844 |
| I0222 17:04:28.727210 41281 quip.py:392] min diag of Lhr: 3.0973057746887207 |
| I0222 17:05:02.128968 41281 misc.py:25] /tmp/q2_temp/23_down.pt frob error: 0.10940137505531311 |
| I0222 17:05:02.129261 41281 misc.py:26] /tmp/q2_temp/23_down.pt proxy error: 0.06160622835159302 |
| I0222 17:05:03.339493 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:05:03.721719 41281 quantize_decompress_robust.py:79] Saved progress for layer 23 (416 MB) |
| I0222 17:05:03.722069 41281 quantize_decompress_robust.py:239] Layer 23 done in 105.1s [24/50] |
| I0222 17:05:03.722121 41281 quantize_decompress_robust.py:159] === Layer 24/50 === |
| I0222 17:05:06.651004 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:05:06.651205 41281 quip.py:389] mean square of Wr: 0.0003869243373628706 |
| I0222 17:05:06.651730 41281 quip.py:390] difference between Hr and Hr.T: 8.344650268554688e-06 |
| I0222 17:05:06.651842 41281 quip.py:391] max abs of Hr: 5.04913854598999 |
| I0222 17:05:06.651943 41281 quip.py:392] min diag of Lhr: 1.0784428119659424 |
| I0222 17:05:16.246881 41281 misc.py:25] /tmp/q2_temp/24_qkv.pt frob error: 0.13875889778137207 |
| I0222 17:05:16.247055 41281 misc.py:26] /tmp/q2_temp/24_qkv.pt proxy error: 0.008320006541907787 |
| I0222 17:05:16.850558 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:05:19.636351 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:05:19.636525 41281 quip.py:389] mean square of Wr: 0.003038754453882575 |
| I0222 17:05:19.636844 41281 quip.py:390] difference between Hr and Hr.T: 2.765655517578125e-05 |
| I0222 17:05:19.636953 41281 quip.py:391] max abs of Hr: 22.757068634033203 |
| I0222 17:05:19.637040 41281 quip.py:392] min diag of Lhr: 1.9338968992233276 |
| I0222 17:05:28.318843 41281 misc.py:25] /tmp/q2_temp/24_o.pt frob error: 0.18827980756759644 |
| I0222 17:05:28.318978 41281 misc.py:26] /tmp/q2_temp/24_o.pt proxy error: 0.033949825912714005 |
| I0222 17:05:28.622190 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:05:41.010434 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:05:41.010955 41281 quip.py:389] mean square of Wr: 0.0004289076605346054 |
| I0222 17:05:41.011246 41281 quip.py:390] difference between Hr and Hr.T: 1.6450881958007812e-05 |
| I0222 17:05:41.011351 41281 quip.py:391] max abs of Hr: 17.151575088500977 |
| I0222 17:05:41.011448 41281 quip.py:392] min diag of Lhr: 2.340364456176758 |
| I0222 17:05:59.691361 41281 misc.py:25] /tmp/q2_temp/24_up.pt frob error: 0.114356130361557 |
| I0222 17:05:59.691478 41281 misc.py:26] /tmp/q2_temp/24_up.pt proxy error: 0.037416525185108185 |
| I0222 17:06:02.804985 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:06:10.706620 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:06:10.706942 41281 quip.py:389] mean square of Wr: 3.398924309294671e-05 |
| I0222 17:06:10.709659 41281 quip.py:390] difference between Hr and Hr.T: 4.023313522338867e-06 |
| I0222 17:06:10.710326 41281 quip.py:391] max abs of Hr: 0.5305824875831604 |
| I0222 17:06:10.710420 41281 quip.py:392] min diag of Lhr: 0.3197571933269501 |
| I0222 17:06:43.718878 41281 misc.py:25] /tmp/q2_temp/24_down.pt frob error: 0.1177394911646843 |
| I0222 17:06:43.719000 41281 misc.py:26] /tmp/q2_temp/24_down.pt proxy error: 0.004115932155400515 |
| I0222 17:06:44.821460 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:06:45.196323 41281 quantize_decompress_robust.py:79] Saved progress for layer 24 (416 MB) |
| I0222 17:06:45.196653 41281 quantize_decompress_robust.py:239] Layer 24 done in 101.5s [25/50] |
| I0222 17:06:45.196697 41281 quantize_decompress_robust.py:159] === Layer 25/50 === |
| I0222 17:06:48.026671 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:06:48.026865 41281 quip.py:389] mean square of Wr: 0.0004009536642115563 |
| I0222 17:06:48.027399 41281 quip.py:390] difference between Hr and Hr.T: 7.867813110351562e-06 |
| I0222 17:06:48.027519 41281 quip.py:391] max abs of Hr: 4.342444896697998 |
| I0222 17:06:48.027606 41281 quip.py:392] min diag of Lhr: 1.0809084177017212 |
| I0222 17:06:56.932055 41281 misc.py:25] /tmp/q2_temp/25_qkv.pt frob error: 0.15432125329971313 |
| I0222 17:06:56.932199 41281 misc.py:26] /tmp/q2_temp/25_qkv.pt proxy error: 0.00789154227823019 |
| I0222 17:06:57.461075 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:06:59.929625 41281 quip.py:388] mean square of W: 0.9999999403953552 |
| I0222 17:06:59.929793 41281 quip.py:389] mean square of Wr: 0.0018874858506023884 |
| I0222 17:06:59.930109 41281 quip.py:390] difference between Hr and Hr.T: 1.621246337890625e-05 |
| I0222 17:06:59.930222 41281 quip.py:391] max abs of Hr: 11.937150955200195 |
| I0222 17:06:59.930296 41281 quip.py:392] min diag of Lhr: 1.5207430124282837 |
| I0222 17:07:08.419522 41281 misc.py:25] /tmp/q2_temp/25_o.pt frob error: 0.17099541425704956 |
| I0222 17:07:08.419647 41281 misc.py:26] /tmp/q2_temp/25_o.pt proxy error: 0.02450719103217125 |
| I0222 17:07:08.739649 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:07:21.007610 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:07:21.008126 41281 quip.py:389] mean square of Wr: 0.001011477317661047 |
| I0222 17:07:21.008440 41281 quip.py:390] difference between Hr and Hr.T: 3.528594970703125e-05 |
| I0222 17:07:21.008544 41281 quip.py:391] max abs of Hr: 38.90740203857422 |
| I0222 17:07:21.008661 41281 quip.py:392] min diag of Lhr: 3.9351465702056885 |
| I0222 17:07:41.190020 41281 misc.py:25] /tmp/q2_temp/25_up.pt frob error: 0.11443808674812317 |
| I0222 17:07:41.190279 41281 misc.py:26] /tmp/q2_temp/25_up.pt proxy error: 0.033862531185150146 |
| I0222 17:07:44.402023 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:07:52.568332 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:07:52.568653 41281 quip.py:389] mean square of Wr: 0.0030374987982213497 |
| I0222 17:07:52.571382 41281 quip.py:390] difference between Hr and Hr.T: 1.9788742065429688e-05 |
| I0222 17:07:52.572027 41281 quip.py:391] max abs of Hr: 15.30272102355957 |
| I0222 17:07:52.572122 41281 quip.py:392] min diag of Lhr: 2.7528233528137207 |
| I0222 17:08:25.429026 41281 misc.py:25] /tmp/q2_temp/25_down.pt frob error: 0.11858786642551422 |
| I0222 17:08:25.429172 41281 misc.py:26] /tmp/q2_temp/25_down.pt proxy error: 0.045873012393713 |
| I0222 17:08:26.518523 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:08:26.888804 41281 quantize_decompress_robust.py:79] Saved progress for layer 25 (416 MB) |
| I0222 17:08:26.889126 41281 quantize_decompress_robust.py:239] Layer 25 done in 101.7s [26/50] |
| I0222 17:08:26.889174 41281 quantize_decompress_robust.py:159] === Layer 26/50 === |
| I0222 17:08:29.644822 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:08:29.645028 41281 quip.py:389] mean square of Wr: 0.0003847850894089788 |
| I0222 17:08:29.645350 41281 quip.py:390] difference between Hr and Hr.T: 5.364418029785156e-06 |
| I0222 17:08:29.645459 41281 quip.py:391] max abs of Hr: 3.7590694427490234 |
| I0222 17:08:29.645580 41281 quip.py:392] min diag of Lhr: 1.0788096189498901 |
| I0222 17:08:38.722961 41281 misc.py:25] /tmp/q2_temp/26_qkv.pt frob error: 0.1479480266571045 |
| I0222 17:08:38.723113 41281 misc.py:26] /tmp/q2_temp/26_qkv.pt proxy error: 0.008671092800796032 |
| I0222 17:08:39.171325 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:08:41.640631 41281 quip.py:388] mean square of W: 0.9999998807907104 |
| I0222 17:08:41.640809 41281 quip.py:389] mean square of Wr: 0.0020864582620561123 |
| I0222 17:08:41.641138 41281 quip.py:390] difference between Hr and Hr.T: 1.9550323486328125e-05 |
| I0222 17:08:41.641247 41281 quip.py:391] max abs of Hr: 13.19355297088623 |
| I0222 17:08:41.641344 41281 quip.py:392] min diag of Lhr: 1.700335144996643 |
| I0222 17:08:50.921475 41281 misc.py:25] /tmp/q2_temp/26_o.pt frob error: 0.151687890291214 |
| I0222 17:08:50.921630 41281 misc.py:26] /tmp/q2_temp/26_o.pt proxy error: 0.03251472860574722 |
| I0222 17:08:51.331014 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:09:03.950308 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:09:03.950829 41281 quip.py:389] mean square of Wr: 0.0008851393940858543 |
| I0222 17:09:03.951146 41281 quip.py:390] difference between Hr and Hr.T: 3.147125244140625e-05 |
| I0222 17:09:03.951251 41281 quip.py:391] max abs of Hr: 34.79465866088867 |
| I0222 17:09:03.951369 41281 quip.py:392] min diag of Lhr: 3.7499561309814453 |
| I0222 17:09:23.943195 41281 misc.py:25] /tmp/q2_temp/26_up.pt frob error: 0.11408265680074692 |
| I0222 17:09:23.943363 41281 misc.py:26] /tmp/q2_temp/26_up.pt proxy error: 0.035774003714323044 |
| I0222 17:09:27.179316 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:09:35.034271 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:09:35.034583 41281 quip.py:389] mean square of Wr: 0.0027040201239287853 |
| I0222 17:09:35.037302 41281 quip.py:390] difference between Hr and Hr.T: 1.5974044799804688e-05 |
| I0222 17:09:35.037967 41281 quip.py:391] max abs of Hr: 12.880915641784668 |
| I0222 17:09:35.038094 41281 quip.py:392] min diag of Lhr: 2.689634084701538 |
| I0222 17:10:09.327838 41281 misc.py:25] /tmp/q2_temp/26_down.pt frob error: 0.11487426608800888 |
| I0222 17:10:09.327979 41281 misc.py:26] /tmp/q2_temp/26_down.pt proxy error: 0.050968900322914124 |
| I0222 17:10:10.414166 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:10:10.779045 41281 quantize_decompress_robust.py:79] Saved progress for layer 26 (416 MB) |
| I0222 17:10:10.779361 41281 quantize_decompress_robust.py:239] Layer 26 done in 103.9s [27/50] |
| I0222 17:10:10.779409 41281 quantize_decompress_robust.py:159] === Layer 27/50 === |
| I0222 17:10:13.750823 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:10:13.751025 41281 quip.py:389] mean square of Wr: 0.000460948416730389 |
| I0222 17:10:13.751357 41281 quip.py:390] difference between Hr and Hr.T: 4.887580871582031e-06 |
| I0222 17:10:13.751466 41281 quip.py:391] max abs of Hr: 4.478280067443848 |
| I0222 17:10:13.751576 41281 quip.py:392] min diag of Lhr: 1.1955386400222778 |
| I0222 17:10:23.321725 41281 misc.py:25] /tmp/q2_temp/27_qkv.pt frob error: 0.14114713668823242 |
| I0222 17:10:23.321867 41281 misc.py:26] /tmp/q2_temp/27_qkv.pt proxy error: 0.012476939707994461 |
| I0222 17:10:23.771486 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:10:26.444629 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:10:26.444805 41281 quip.py:389] mean square of Wr: 0.0025971585419028997 |
| I0222 17:10:26.445138 41281 quip.py:390] difference between Hr and Hr.T: 2.6226043701171875e-05 |
| I0222 17:10:26.445250 41281 quip.py:391] max abs of Hr: 17.989961624145508 |
| I0222 17:10:26.445344 41281 quip.py:392] min diag of Lhr: 1.8209353685379028 |
| I0222 17:10:35.525554 41281 misc.py:25] /tmp/q2_temp/27_o.pt frob error: 0.1459985375404358 |
| I0222 17:10:35.525715 41281 misc.py:26] /tmp/q2_temp/27_o.pt proxy error: 0.04073497653007507 |
| I0222 17:10:35.928023 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:10:48.195974 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:10:48.196480 41281 quip.py:389] mean square of Wr: 0.0008694106945767999 |
| I0222 17:10:48.196780 41281 quip.py:390] difference between Hr and Hr.T: 2.956390380859375e-05 |
| I0222 17:10:48.196890 41281 quip.py:391] max abs of Hr: 33.30216979980469 |
| I0222 17:10:48.196996 41281 quip.py:392] min diag of Lhr: 3.747452735900879 |
| I0222 17:11:07.390249 41281 misc.py:25] /tmp/q2_temp/27_up.pt frob error: 0.113958440721035 |
| I0222 17:11:07.390402 41281 misc.py:26] /tmp/q2_temp/27_up.pt proxy error: 0.03588837385177612 |
| I0222 17:11:10.584512 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:11:18.551985 41281 quip.py:388] mean square of W: 0.9999998807907104 |
| I0222 17:11:18.552301 41281 quip.py:389] mean square of Wr: 0.0026672170497477055 |
| I0222 17:11:18.555035 41281 quip.py:390] difference between Hr and Hr.T: 1.3828277587890625e-05 |
| I0222 17:11:18.555696 41281 quip.py:391] max abs of Hr: 13.53426742553711 |
| I0222 17:11:18.555795 41281 quip.py:392] min diag of Lhr: 2.723112106323242 |
| I0222 17:11:53.627599 41281 misc.py:25] /tmp/q2_temp/27_down.pt frob error: 0.11346927285194397 |
| I0222 17:11:53.627749 41281 misc.py:26] /tmp/q2_temp/27_down.pt proxy error: 0.05523872748017311 |
| I0222 17:11:54.708082 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:11:55.211985 41281 quantize_decompress_robust.py:79] Saved progress for layer 27 (416 MB) |
| I0222 17:11:55.212314 41281 quantize_decompress_robust.py:239] Layer 27 done in 104.4s [28/50] |
| I0222 17:11:55.212363 41281 quantize_decompress_robust.py:159] === Layer 28/50 === |
| I0222 17:11:58.243222 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:11:58.243430 41281 quip.py:389] mean square of Wr: 0.0003768505994230509 |
| I0222 17:11:58.243750 41281 quip.py:390] difference between Hr and Hr.T: 4.887580871582031e-06 |
| I0222 17:11:58.243860 41281 quip.py:391] max abs of Hr: 3.586416006088257 |
| I0222 17:11:58.243948 41281 quip.py:392] min diag of Lhr: 1.0852382183074951 |
| I0222 17:12:07.818577 41281 misc.py:25] /tmp/q2_temp/28_qkv.pt frob error: 0.14164675772190094 |
| I0222 17:12:07.818699 41281 misc.py:26] /tmp/q2_temp/28_qkv.pt proxy error: 0.009872574359178543 |
| I0222 17:12:08.344794 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:12:10.632931 41281 quip.py:388] mean square of W: 0.9999999403953552 |
| I0222 17:12:10.633096 41281 quip.py:389] mean square of Wr: 0.0028632590547204018 |
| I0222 17:12:10.633416 41281 quip.py:390] difference between Hr and Hr.T: 3.528594970703125e-05 |
| I0222 17:12:10.633533 41281 quip.py:391] max abs of Hr: 22.2597713470459 |
| I0222 17:12:10.633622 41281 quip.py:392] min diag of Lhr: 1.7882893085479736 |
| I0222 17:12:19.933665 41281 misc.py:25] /tmp/q2_temp/28_o.pt frob error: 0.15287306904792786 |
| I0222 17:12:19.933797 41281 misc.py:26] /tmp/q2_temp/28_o.pt proxy error: 0.033465445041656494 |
| I0222 17:12:20.330774 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:12:33.052960 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:12:33.053461 41281 quip.py:389] mean square of Wr: 0.0007448632968589664 |
| I0222 17:12:33.053755 41281 quip.py:390] difference between Hr and Hr.T: 2.574920654296875e-05 |
| I0222 17:12:33.053864 41281 quip.py:391] max abs of Hr: 28.1485595703125 |
| I0222 17:12:33.053969 41281 quip.py:392] min diag of Lhr: 3.472224473953247 |
| I0222 17:12:52.078444 41281 misc.py:25] /tmp/q2_temp/28_up.pt frob error: 0.11427387595176697 |
| I0222 17:12:52.078572 41281 misc.py:26] /tmp/q2_temp/28_up.pt proxy error: 0.03543896973133087 |
| I0222 17:12:55.336194 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:13:03.578626 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:13:03.578940 41281 quip.py:389] mean square of Wr: 0.0019354498945176601 |
| I0222 17:13:03.581669 41281 quip.py:390] difference between Hr and Hr.T: 9.655952453613281e-06 |
| I0222 17:13:03.582336 41281 quip.py:391] max abs of Hr: 9.541924476623535 |
| I0222 17:13:03.582428 41281 quip.py:392] min diag of Lhr: 2.318354845046997 |
| I0222 17:13:38.631607 41281 misc.py:25] /tmp/q2_temp/28_down.pt frob error: 0.11327487230300903 |
| I0222 17:13:38.631731 41281 misc.py:26] /tmp/q2_temp/28_down.pt proxy error: 0.05543941259384155 |
| I0222 17:13:39.719633 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:13:40.092247 41281 quantize_decompress_robust.py:79] Saved progress for layer 28 (416 MB) |
| I0222 17:13:40.092577 41281 quantize_decompress_robust.py:239] Layer 28 done in 104.9s [29/50] |
| I0222 17:13:40.092624 41281 quantize_decompress_robust.py:159] === Layer 29/50 === |
| I0222 17:13:42.935189 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:13:42.935379 41281 quip.py:389] mean square of Wr: 0.00042908976320177317 |
| I0222 17:13:42.935917 41281 quip.py:390] difference between Hr and Hr.T: 5.4836273193359375e-06 |
| I0222 17:13:42.936035 41281 quip.py:391] max abs of Hr: 4.8200507164001465 |
| I0222 17:13:42.936128 41281 quip.py:392] min diag of Lhr: 1.1688055992126465 |
| I0222 17:13:52.319249 41281 misc.py:25] /tmp/q2_temp/29_qkv.pt frob error: 0.1418677121400833 |
| I0222 17:13:52.319367 41281 misc.py:26] /tmp/q2_temp/29_qkv.pt proxy error: 0.012217910028994083 |
| I0222 17:13:52.850383 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:13:55.527287 41281 quip.py:388] mean square of W: 1.000000238418579 |
| I0222 17:13:55.527454 41281 quip.py:389] mean square of Wr: 0.002747674472630024 |
| I0222 17:13:55.527779 41281 quip.py:390] difference between Hr and Hr.T: 2.193450927734375e-05 |
| I0222 17:13:55.527893 41281 quip.py:391] max abs of Hr: 19.11827278137207 |
| I0222 17:13:55.527973 41281 quip.py:392] min diag of Lhr: 1.9584681987762451 |
| I0222 17:14:04.727689 41281 misc.py:25] /tmp/q2_temp/29_o.pt frob error: 0.15286608040332794 |
| I0222 17:14:04.727827 41281 misc.py:26] /tmp/q2_temp/29_o.pt proxy error: 0.03613538667559624 |
| I0222 17:14:05.125868 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:14:17.534055 41281 quip.py:388] mean square of W: 0.9999998807907104 |
| I0222 17:14:17.534556 41281 quip.py:389] mean square of Wr: 0.0007343466859310865 |
| I0222 17:14:17.534841 41281 quip.py:390] difference between Hr and Hr.T: 2.5272369384765625e-05 |
| I0222 17:14:17.534950 41281 quip.py:391] max abs of Hr: 29.751203536987305 |
| I0222 17:14:17.535049 41281 quip.py:392] min diag of Lhr: 3.461754083633423 |
| I0222 17:14:37.483253 41281 misc.py:25] /tmp/q2_temp/29_up.pt frob error: 0.11422370374202728 |
| I0222 17:14:37.483458 41281 misc.py:26] /tmp/q2_temp/29_up.pt proxy error: 0.03624532371759415 |
| I0222 17:14:40.958338 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:14:48.960671 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:14:48.960991 41281 quip.py:389] mean square of Wr: 0.0026003846433013678 |
| I0222 17:14:48.963719 41281 quip.py:390] difference between Hr and Hr.T: 1.9311904907226562e-05 |
| I0222 17:14:48.964387 41281 quip.py:391] max abs of Hr: 13.157620429992676 |
| I0222 17:14:48.964488 41281 quip.py:392] min diag of Lhr: 2.718493700027466 |
| I0222 17:15:23.630494 41281 misc.py:25] /tmp/q2_temp/29_down.pt frob error: 0.11214291304349899 |
| I0222 17:15:23.630880 41281 misc.py:26] /tmp/q2_temp/29_down.pt proxy error: 0.05649726837873459 |
| I0222 17:15:24.708896 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:15:25.190986 41281 quantize_decompress_robust.py:79] Saved progress for layer 29 (416 MB) |
| I0222 17:15:25.191320 41281 quantize_decompress_robust.py:239] Layer 29 done in 105.1s [30/50] |
| I0222 17:15:25.191369 41281 quantize_decompress_robust.py:159] === Layer 30/50 === |
| I0222 17:15:28.498118 41281 quip.py:388] mean square of W: 0.9999998807907104 |
| I0222 17:15:28.498318 41281 quip.py:389] mean square of Wr: 0.00045999360736459494 |
| I0222 17:15:28.498673 41281 quip.py:390] difference between Hr and Hr.T: 4.589557647705078e-06 |
| I0222 17:15:28.498788 41281 quip.py:391] max abs of Hr: 4.503913402557373 |
| I0222 17:15:28.498883 41281 quip.py:392] min diag of Lhr: 1.2180216312408447 |
| I0222 17:15:38.022633 41281 misc.py:25] /tmp/q2_temp/30_qkv.pt frob error: 0.14139600098133087 |
| I0222 17:15:38.022860 41281 misc.py:26] /tmp/q2_temp/30_qkv.pt proxy error: 0.013004481792449951 |
| I0222 17:15:38.471681 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:15:40.927145 41281 quip.py:388] mean square of W: 1.000000238418579 |
| I0222 17:15:40.927313 41281 quip.py:389] mean square of Wr: 0.0023995323572307825 |
| I0222 17:15:40.927651 41281 quip.py:390] difference between Hr and Hr.T: 3.0040740966796875e-05 |
| I0222 17:15:40.927768 41281 quip.py:391] max abs of Hr: 17.130815505981445 |
| I0222 17:15:40.927856 41281 quip.py:392] min diag of Lhr: 1.7757503986358643 |
| I0222 17:15:49.317357 41281 misc.py:25] /tmp/q2_temp/30_o.pt frob error: 0.1993110179901123 |
| I0222 17:15:49.317629 41281 misc.py:26] /tmp/q2_temp/30_o.pt proxy error: 0.04069327190518379 |
| I0222 17:15:49.623142 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:16:01.579092 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:16:01.579595 41281 quip.py:389] mean square of Wr: 0.0007954120519571006 |
| I0222 17:16:01.579888 41281 quip.py:390] difference between Hr and Hr.T: 2.956390380859375e-05 |
| I0222 17:16:01.579999 41281 quip.py:391] max abs of Hr: 31.802156448364258 |
| I0222 17:16:01.580118 41281 quip.py:392] min diag of Lhr: 3.5607335567474365 |
| I0222 17:16:20.987305 41281 misc.py:25] /tmp/q2_temp/30_up.pt frob error: 0.11366003006696701 |
| I0222 17:16:20.987432 41281 misc.py:26] /tmp/q2_temp/30_up.pt proxy error: 0.03695451840758324 |
| I0222 17:16:24.114087 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:16:32.354046 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:16:32.354360 41281 quip.py:389] mean square of Wr: 0.00273866462521255 |
| I0222 17:16:32.357095 41281 quip.py:390] difference between Hr and Hr.T: 1.2159347534179688e-05 |
| I0222 17:16:32.357758 41281 quip.py:391] max abs of Hr: 13.07587718963623 |
| I0222 17:16:32.357864 41281 quip.py:392] min diag of Lhr: 2.8121814727783203 |
| I0222 17:17:06.931069 41281 misc.py:25] /tmp/q2_temp/30_down.pt frob error: 0.11267513781785965 |
| I0222 17:17:06.931223 41281 misc.py:26] /tmp/q2_temp/30_down.pt proxy error: 0.05745983123779297 |
| I0222 17:17:08.013292 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:17:08.526092 41281 quantize_decompress_robust.py:79] Saved progress for layer 30 (416 MB) |
| I0222 17:17:08.526432 41281 quantize_decompress_robust.py:239] Layer 30 done in 103.3s [31/50] |
| I0222 17:17:08.526479 41281 quantize_decompress_robust.py:159] === Layer 31/50 === |
| I0222 17:17:11.727193 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:17:11.727392 41281 quip.py:389] mean square of Wr: 0.00043657151400111616 |
| I0222 17:17:11.727942 41281 quip.py:390] difference between Hr and Hr.T: 4.827976226806641e-06 |
| I0222 17:17:11.728060 41281 quip.py:391] max abs of Hr: 4.4917378425598145 |
| I0222 17:17:11.728151 41281 quip.py:392] min diag of Lhr: 1.2040351629257202 |
| I0222 17:17:21.420633 41281 misc.py:25] /tmp/q2_temp/31_qkv.pt frob error: 0.13261456787586212 |
| I0222 17:17:21.420787 41281 misc.py:26] /tmp/q2_temp/31_qkv.pt proxy error: 0.012482315301895142 |
| I0222 17:17:21.865140 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:17:24.034780 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:17:24.034952 41281 quip.py:389] mean square of Wr: 0.003320118645206094 |
| I0222 17:17:24.035276 41281 quip.py:390] difference between Hr and Hr.T: 3.0994415283203125e-05 |
| I0222 17:17:24.035392 41281 quip.py:391] max abs of Hr: 19.894332885742188 |
| I0222 17:17:24.035486 41281 quip.py:392] min diag of Lhr: 2.249143123626709 |
| I0222 17:17:33.019858 41281 misc.py:25] /tmp/q2_temp/31_o.pt frob error: 0.13322219252586365 |
| I0222 17:17:33.019991 41281 misc.py:26] /tmp/q2_temp/31_o.pt proxy error: 0.04510972648859024 |
| I0222 17:17:33.327937 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:17:45.732456 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:17:45.732969 41281 quip.py:389] mean square of Wr: 0.0007740332512184978 |
| I0222 17:17:45.733281 41281 quip.py:390] difference between Hr and Hr.T: 2.574920654296875e-05 |
| I0222 17:17:45.733391 41281 quip.py:391] max abs of Hr: 31.230993270874023 |
| I0222 17:17:45.733527 41281 quip.py:392] min diag of Lhr: 3.5769412517547607 |
| I0222 17:18:05.242756 41281 misc.py:25] /tmp/q2_temp/31_up.pt frob error: 0.11319014430046082 |
| I0222 17:18:05.242890 41281 misc.py:26] /tmp/q2_temp/31_up.pt proxy error: 0.037764839828014374 |
| I0222 17:18:08.503322 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:18:16.426705 41281 quip.py:388] mean square of W: 1.000000238418579 |
| I0222 17:18:16.427033 41281 quip.py:389] mean square of Wr: 0.001024550525471568 |
| I0222 17:18:16.433408 41281 quip.py:390] difference between Hr and Hr.T: 3.635883331298828e-06 |
| I0222 17:18:16.434059 41281 quip.py:391] max abs of Hr: 4.7151408195495605 |
| I0222 17:18:16.434157 41281 quip.py:392] min diag of Lhr: 1.7297674417495728 |
| I0222 17:18:49.739616 41281 misc.py:25] /tmp/q2_temp/31_down.pt frob error: 0.11272697895765305 |
| I0222 17:18:49.739779 41281 misc.py:26] /tmp/q2_temp/31_down.pt proxy error: 0.057961687445640564 |
| I0222 17:18:50.813889 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:18:51.200330 41281 quantize_decompress_robust.py:79] Saved progress for layer 31 (416 MB) |
| I0222 17:18:51.200691 41281 quantize_decompress_robust.py:239] Layer 31 done in 102.7s [32/50] |
| I0222 17:18:51.200736 41281 quantize_decompress_robust.py:159] === Layer 32/50 === |
| I0222 17:18:54.637285 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:18:54.637497 41281 quip.py:389] mean square of Wr: 0.0004891424905508757 |
| I0222 17:18:54.637972 41281 quip.py:390] difference between Hr and Hr.T: 5.4836273193359375e-06 |
| I0222 17:18:54.638091 41281 quip.py:391] max abs of Hr: 4.569559097290039 |
| I0222 17:18:54.638195 41281 quip.py:392] min diag of Lhr: 1.2738791704177856 |
| I0222 17:19:03.527725 41281 misc.py:25] /tmp/q2_temp/32_qkv.pt frob error: 0.1347595602273941 |
| I0222 17:19:03.527857 41281 misc.py:26] /tmp/q2_temp/32_qkv.pt proxy error: 0.012887375429272652 |
| I0222 17:19:03.966452 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:19:06.226107 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:19:06.226283 41281 quip.py:389] mean square of Wr: 0.002566299866884947 |
| I0222 17:19:06.226593 41281 quip.py:390] difference between Hr and Hr.T: 3.5762786865234375e-05 |
| I0222 17:19:06.226706 41281 quip.py:391] max abs of Hr: 19.261857986450195 |
| I0222 17:19:06.226779 41281 quip.py:392] min diag of Lhr: 1.8419615030288696 |
| I0222 17:19:14.226982 41281 misc.py:25] /tmp/q2_temp/32_o.pt frob error: 0.14896367490291595 |
| I0222 17:19:14.227123 41281 misc.py:26] /tmp/q2_temp/32_o.pt proxy error: 0.04408099874854088 |
| I0222 17:19:14.630250 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:19:26.522253 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:19:26.522762 41281 quip.py:389] mean square of Wr: 0.0007532800664193928 |
| I0222 17:19:26.523060 41281 quip.py:390] difference between Hr and Hr.T: 2.956390380859375e-05 |
| I0222 17:19:26.523169 41281 quip.py:391] max abs of Hr: 30.42262840270996 |
| I0222 17:19:26.523278 41281 quip.py:392] min diag of Lhr: 3.523998260498047 |
| I0222 17:19:45.487827 41281 misc.py:25] /tmp/q2_temp/32_up.pt frob error: 0.11387144774198532 |
| I0222 17:19:45.487959 41281 misc.py:26] /tmp/q2_temp/32_up.pt proxy error: 0.03502656891942024 |
| I0222 17:19:48.572854 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:19:56.422286 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:19:56.422609 41281 quip.py:389] mean square of Wr: 0.0014318223111331463 |
| I0222 17:19:56.425328 41281 quip.py:390] difference between Hr and Hr.T: 9.179115295410156e-06 |
| I0222 17:19:56.425999 41281 quip.py:391] max abs of Hr: 7.120176792144775 |
| I0222 17:19:56.426099 41281 quip.py:392] min diag of Lhr: 1.9806859493255615 |
| I0222 17:20:31.019537 41281 misc.py:25] /tmp/q2_temp/32_down.pt frob error: 0.11631596088409424 |
| I0222 17:20:31.019671 41281 misc.py:26] /tmp/q2_temp/32_down.pt proxy error: 0.05351492017507553 |
| I0222 17:20:32.124217 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:20:32.508285 41281 quantize_decompress_robust.py:79] Saved progress for layer 32 (416 MB) |
| I0222 17:20:32.508616 41281 quantize_decompress_robust.py:239] Layer 32 done in 101.3s [33/50] |
| I0222 17:20:32.508661 41281 quantize_decompress_robust.py:159] === Layer 33/50 === |
| I0222 17:20:35.246803 41281 quip.py:388] mean square of W: 1.000000238418579 |
| I0222 17:20:35.246998 41281 quip.py:389] mean square of Wr: 0.0004958920180797577 |
| I0222 17:20:35.247335 41281 quip.py:390] difference between Hr and Hr.T: 4.76837158203125e-06 |
| I0222 17:20:35.247449 41281 quip.py:391] max abs of Hr: 4.988340854644775 |
| I0222 17:20:35.247549 41281 quip.py:392] min diag of Lhr: 1.2662135362625122 |
| I0222 17:20:45.121744 41281 misc.py:25] /tmp/q2_temp/33_qkv.pt frob error: 0.13161183893680573 |
| I0222 17:20:45.121880 41281 misc.py:26] /tmp/q2_temp/33_qkv.pt proxy error: 0.015083040110766888 |
| I0222 17:20:45.565382 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:20:47.832949 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:20:47.833125 41281 quip.py:389] mean square of Wr: 0.002727937651798129 |
| I0222 17:20:47.833431 41281 quip.py:390] difference between Hr and Hr.T: 1.9073486328125e-05 |
| I0222 17:20:47.833542 41281 quip.py:391] max abs of Hr: 15.422118186950684 |
| I0222 17:20:47.833636 41281 quip.py:392] min diag of Lhr: 1.9348409175872803 |
| I0222 17:20:57.126322 41281 misc.py:25] /tmp/q2_temp/33_o.pt frob error: 0.1720314472913742 |
| I0222 17:20:57.126450 41281 misc.py:26] /tmp/q2_temp/33_o.pt proxy error: 0.0430070199072361 |
| I0222 17:20:57.524224 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:21:10.041915 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:21:10.042413 41281 quip.py:389] mean square of Wr: 0.0008080444531515241 |
| I0222 17:21:10.042699 41281 quip.py:390] difference between Hr and Hr.T: 3.0040740966796875e-05 |
| I0222 17:21:10.042806 41281 quip.py:391] max abs of Hr: 32.16202163696289 |
| I0222 17:21:10.042911 41281 quip.py:392] min diag of Lhr: 3.586087465286255 |
| I0222 17:21:29.251283 41281 misc.py:25] /tmp/q2_temp/33_up.pt frob error: 0.11486393213272095 |
| I0222 17:21:29.251409 41281 misc.py:26] /tmp/q2_temp/33_up.pt proxy error: 0.033745214343070984 |
| I0222 17:21:32.533279 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:21:40.689149 41281 quip.py:388] mean square of W: 0.9999998807907104 |
| I0222 17:21:40.689475 41281 quip.py:389] mean square of Wr: 0.0013455870794132352 |
| I0222 17:21:40.692203 41281 quip.py:390] difference between Hr and Hr.T: 7.450580596923828e-06 |
| I0222 17:21:40.692855 41281 quip.py:391] max abs of Hr: 6.655749320983887 |
| I0222 17:21:40.692955 41281 quip.py:392] min diag of Lhr: 1.9007474184036255 |
| I0222 17:22:15.141525 41281 misc.py:25] /tmp/q2_temp/33_down.pt frob error: 0.11860000342130661 |
| I0222 17:22:15.141695 41281 misc.py:26] /tmp/q2_temp/33_down.pt proxy error: 0.054814908653497696 |
| I0222 17:22:16.337260 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:22:16.863034 41281 quantize_decompress_robust.py:79] Saved progress for layer 33 (416 MB) |
| I0222 17:22:16.863397 41281 quantize_decompress_robust.py:239] Layer 33 done in 104.4s [34/50] |
| I0222 17:22:16.863464 41281 quantize_decompress_robust.py:159] === Layer 34/50 === |
| I0222 17:22:19.553207 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:22:19.553405 41281 quip.py:389] mean square of Wr: 0.00046879606088623405 |
| I0222 17:22:19.553971 41281 quip.py:390] difference between Hr and Hr.T: 5.841255187988281e-06 |
| I0222 17:22:19.554085 41281 quip.py:391] max abs of Hr: 4.572127342224121 |
| I0222 17:22:19.554185 41281 quip.py:392] min diag of Lhr: 1.2187855243682861 |
| I0222 17:22:29.119748 41281 misc.py:25] /tmp/q2_temp/34_qkv.pt frob error: 0.12907011806964874 |
| I0222 17:22:29.119914 41281 misc.py:26] /tmp/q2_temp/34_qkv.pt proxy error: 0.01442529633641243 |
| I0222 17:22:29.595049 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:22:32.645624 41281 quip.py:388] mean square of W: 0.9999999403953552 |
| I0222 17:22:32.645793 41281 quip.py:389] mean square of Wr: 0.003063495736569166 |
| I0222 17:22:32.646117 41281 quip.py:390] difference between Hr and Hr.T: 2.09808349609375e-05 |
| I0222 17:22:32.646226 41281 quip.py:391] max abs of Hr: 18.67108917236328 |
| I0222 17:22:32.646319 41281 quip.py:392] min diag of Lhr: 2.1224029064178467 |
| I0222 17:22:41.816081 41281 misc.py:25] /tmp/q2_temp/34_o.pt frob error: 0.13996702432632446 |
| I0222 17:22:41.816327 41281 misc.py:26] /tmp/q2_temp/34_o.pt proxy error: 0.044251441955566406 |
| I0222 17:22:42.215384 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:22:54.589181 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:22:54.589693 41281 quip.py:389] mean square of Wr: 0.0006898856954649091 |
| I0222 17:22:54.589970 41281 quip.py:390] difference between Hr and Hr.T: 2.9087066650390625e-05 |
| I0222 17:22:54.590076 41281 quip.py:391] max abs of Hr: 30.51664161682129 |
| I0222 17:22:54.590173 41281 quip.py:392] min diag of Lhr: 3.270338773727417 |
| I0222 17:23:12.373351 41281 misc.py:25] /tmp/q2_temp/34_up.pt frob error: 0.11647189408540726 |
| I0222 17:23:12.373485 41281 misc.py:26] /tmp/q2_temp/34_up.pt proxy error: 0.02979610301554203 |
| I0222 17:23:15.588364 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:23:23.638729 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:23:23.639048 41281 quip.py:389] mean square of Wr: 0.001139944070018828 |
| I0222 17:23:23.641774 41281 quip.py:390] difference between Hr and Hr.T: 6.9141387939453125e-06 |
| I0222 17:23:23.642428 41281 quip.py:391] max abs of Hr: 5.599666595458984 |
| I0222 17:23:23.642537 41281 quip.py:392] min diag of Lhr: 1.6894029378890991 |
| I0222 17:23:55.954080 41281 misc.py:25] /tmp/q2_temp/34_down.pt frob error: 0.12437344342470169 |
| I0222 17:23:55.954215 41281 misc.py:26] /tmp/q2_temp/34_down.pt proxy error: 0.04915504902601242 |
| I0222 17:23:57.126434 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:23:57.474847 41281 quantize_decompress_robust.py:79] Saved progress for layer 34 (416 MB) |
| I0222 17:23:57.475196 41281 quantize_decompress_robust.py:239] Layer 34 done in 100.6s [35/50] |
| I0222 17:23:57.475244 41281 quantize_decompress_robust.py:159] === Layer 35/50 === |
| I0222 17:24:00.250132 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:24:00.250338 41281 quip.py:389] mean square of Wr: 0.0004137875512242317 |
| I0222 17:24:00.250682 41281 quip.py:390] difference between Hr and Hr.T: 5.424022674560547e-06 |
| I0222 17:24:00.250790 41281 quip.py:391] max abs of Hr: 4.0358476638793945 |
| I0222 17:24:00.250878 41281 quip.py:392] min diag of Lhr: 1.130812406539917 |
| I0222 17:24:10.029206 41281 misc.py:25] /tmp/q2_temp/35_qkv.pt frob error: 0.13800480961799622 |
| I0222 17:24:10.029341 41281 misc.py:26] /tmp/q2_temp/35_qkv.pt proxy error: 0.010896592400968075 |
| I0222 17:24:10.659619 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:24:13.241034 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:24:13.241210 41281 quip.py:389] mean square of Wr: 0.0030495929531753063 |
| I0222 17:24:13.241533 41281 quip.py:390] difference between Hr and Hr.T: 2.09808349609375e-05 |
| I0222 17:24:13.241647 41281 quip.py:391] max abs of Hr: 18.472253799438477 |
| I0222 17:24:13.241729 41281 quip.py:392] min diag of Lhr: 2.0327236652374268 |
| I0222 17:24:22.327116 41281 misc.py:25] /tmp/q2_temp/35_o.pt frob error: 0.15498891472816467 |
| I0222 17:24:22.327276 41281 misc.py:26] /tmp/q2_temp/35_o.pt proxy error: 0.03586217015981674 |
| I0222 17:24:22.722813 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:24:35.209445 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:24:35.209958 41281 quip.py:389] mean square of Wr: 0.0008130117785185575 |
| I0222 17:24:35.210254 41281 quip.py:390] difference between Hr and Hr.T: 2.765655517578125e-05 |
| I0222 17:24:35.210363 41281 quip.py:391] max abs of Hr: 31.187564849853516 |
| I0222 17:24:35.210470 41281 quip.py:392] min diag of Lhr: 3.4644832611083984 |
| I0222 17:24:53.374293 41281 misc.py:25] /tmp/q2_temp/35_up.pt frob error: 0.11780513823032379 |
| I0222 17:24:53.374452 41281 misc.py:26] /tmp/q2_temp/35_up.pt proxy error: 0.029127346351742744 |
| I0222 17:24:56.486950 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:25:04.448855 41281 quip.py:388] mean square of W: 1.000000238418579 |
| I0222 17:25:04.449175 41281 quip.py:389] mean square of Wr: 0.001131522934883833 |
| I0222 17:25:04.451890 41281 quip.py:390] difference between Hr and Hr.T: 6.9141387939453125e-06 |
| I0222 17:25:04.452554 41281 quip.py:391] max abs of Hr: 5.58210563659668 |
| I0222 17:25:04.452650 41281 quip.py:392] min diag of Lhr: 1.6998432874679565 |
| I0222 17:25:37.331705 41281 misc.py:25] /tmp/q2_temp/35_down.pt frob error: 0.12598654627799988 |
| I0222 17:25:37.331843 41281 misc.py:26] /tmp/q2_temp/35_down.pt proxy error: 0.04986700043082237 |
| I0222 17:25:38.412138 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:25:38.789977 41281 quantize_decompress_robust.py:79] Saved progress for layer 35 (416 MB) |
| I0222 17:25:38.790312 41281 quantize_decompress_robust.py:239] Layer 35 done in 101.3s [36/50] |
| I0222 17:25:38.790359 41281 quantize_decompress_robust.py:159] === Layer 36/50 === |
| I0222 17:25:41.725312 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:25:41.725522 41281 quip.py:389] mean square of Wr: 0.000423346646130085 |
| I0222 17:25:41.726058 41281 quip.py:390] difference between Hr and Hr.T: 6.258487701416016e-06 |
| I0222 17:25:41.726175 41281 quip.py:391] max abs of Hr: 4.142746448516846 |
| I0222 17:25:41.726262 41281 quip.py:392] min diag of Lhr: 1.1216579675674438 |
| I0222 17:25:50.627329 41281 misc.py:25] /tmp/q2_temp/36_qkv.pt frob error: 0.1379663199186325 |
| I0222 17:25:50.627464 41281 misc.py:26] /tmp/q2_temp/36_qkv.pt proxy error: 0.012696616351604462 |
| I0222 17:25:51.152034 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:25:53.545314 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:25:53.545481 41281 quip.py:389] mean square of Wr: 0.002454964444041252 |
| I0222 17:25:53.545822 41281 quip.py:390] difference between Hr and Hr.T: 4.482269287109375e-05 |
| I0222 17:25:53.545936 41281 quip.py:391] max abs of Hr: 20.395843505859375 |
| I0222 17:25:53.546020 41281 quip.py:392] min diag of Lhr: 1.696892499923706 |
| I0222 17:26:02.625650 41281 misc.py:25] /tmp/q2_temp/36_o.pt frob error: 0.15147027373313904 |
| I0222 17:26:02.625782 41281 misc.py:26] /tmp/q2_temp/36_o.pt proxy error: 0.03371457755565643 |
| I0222 17:26:02.978717 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:26:15.212650 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:26:15.213150 41281 quip.py:389] mean square of Wr: 0.0007319668075069785 |
| I0222 17:26:15.213433 41281 quip.py:390] difference between Hr and Hr.T: 3.528594970703125e-05 |
| I0222 17:26:15.213541 41281 quip.py:391] max abs of Hr: 33.9200439453125 |
| I0222 17:26:15.213637 41281 quip.py:392] min diag of Lhr: 3.2932252883911133 |
| I0222 17:26:33.190320 41281 misc.py:25] /tmp/q2_temp/36_up.pt frob error: 0.11990956217050552 |
| I0222 17:26:33.190447 41281 misc.py:26] /tmp/q2_temp/36_up.pt proxy error: 0.025673113763332367 |
| I0222 17:26:36.409729 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:26:44.232091 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:26:44.232403 41281 quip.py:389] mean square of Wr: 0.0012516803108155727 |
| I0222 17:26:44.235142 41281 quip.py:390] difference between Hr and Hr.T: 9.417533874511719e-06 |
| I0222 17:26:44.235806 41281 quip.py:391] max abs of Hr: 7.030608654022217 |
| I0222 17:26:44.235900 41281 quip.py:392] min diag of Lhr: 1.704140067100525 |
| I0222 17:27:18.832325 41281 misc.py:25] /tmp/q2_temp/36_down.pt frob error: 0.13601462543010712 |
| I0222 17:27:18.832487 41281 misc.py:26] /tmp/q2_temp/36_down.pt proxy error: 0.04385872185230255 |
| I0222 17:27:20.017079 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:27:20.536419 41281 quantize_decompress_robust.py:79] Saved progress for layer 36 (416 MB) |
| I0222 17:27:20.536774 41281 quantize_decompress_robust.py:239] Layer 36 done in 101.7s [37/50] |
| I0222 17:27:20.536819 41281 quantize_decompress_robust.py:159] === Layer 37/50 === |
| I0222 17:27:23.335771 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:27:23.335975 41281 quip.py:389] mean square of Wr: 0.0004292965750209987 |
| I0222 17:27:23.336430 41281 quip.py:390] difference between Hr and Hr.T: 6.198883056640625e-06 |
| I0222 17:27:23.336543 41281 quip.py:391] max abs of Hr: 4.333855628967285 |
| I0222 17:27:23.336624 41281 quip.py:392] min diag of Lhr: 1.114458680152893 |
| I0222 17:27:32.032554 41281 misc.py:25] /tmp/q2_temp/37_qkv.pt frob error: 0.1367305964231491 |
| I0222 17:27:32.032691 41281 misc.py:26] /tmp/q2_temp/37_qkv.pt proxy error: 0.012785619124770164 |
| I0222 17:27:32.576230 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:27:34.730069 41281 quip.py:388] mean square of W: 0.9999999403953552 |
| I0222 17:27:34.730244 41281 quip.py:389] mean square of Wr: 0.0024339943192899227 |
| I0222 17:27:34.730554 41281 quip.py:390] difference between Hr and Hr.T: 1.7642974853515625e-05 |
| I0222 17:27:34.730669 41281 quip.py:391] max abs of Hr: 17.29367446899414 |
| I0222 17:27:34.730747 41281 quip.py:392] min diag of Lhr: 1.658444881439209 |
| I0222 17:27:43.025467 41281 misc.py:25] /tmp/q2_temp/37_o.pt frob error: 0.15980425477027893 |
| I0222 17:27:43.025602 41281 misc.py:26] /tmp/q2_temp/37_o.pt proxy error: 0.03499188274145126 |
| I0222 17:27:43.336163 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:27:55.551301 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:27:55.551824 41281 quip.py:389] mean square of Wr: 0.0004940585931763053 |
| I0222 17:27:55.552108 41281 quip.py:390] difference between Hr and Hr.T: 3.528594970703125e-05 |
| I0222 17:27:55.552211 41281 quip.py:391] max abs of Hr: 22.61634635925293 |
| I0222 17:27:55.552312 41281 quip.py:392] min diag of Lhr: 2.5363576412200928 |
| I0222 17:28:14.247578 41281 misc.py:25] /tmp/q2_temp/37_up.pt frob error: 0.12168700993061066 |
| I0222 17:28:14.247714 41281 misc.py:26] /tmp/q2_temp/37_up.pt proxy error: 0.024055708199739456 |
| I0222 17:28:17.463493 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:28:24.609368 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:28:24.609688 41281 quip.py:389] mean square of Wr: 0.0016911693383008242 |
| I0222 17:28:24.612419 41281 quip.py:390] difference between Hr and Hr.T: 1.4781951904296875e-05 |
| I0222 17:28:24.613071 41281 quip.py:391] max abs of Hr: 8.687889099121094 |
| I0222 17:28:24.613159 41281 quip.py:392] min diag of Lhr: 1.9219292402267456 |
| I0222 17:28:57.429668 41281 misc.py:25] /tmp/q2_temp/37_down.pt frob error: 0.14271999895572662 |
| I0222 17:28:57.429808 41281 misc.py:26] /tmp/q2_temp/37_down.pt proxy error: 0.04174881428480148 |
| I0222 17:28:58.515807 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:28:58.894829 41281 quantize_decompress_robust.py:79] Saved progress for layer 37 (416 MB) |
| I0222 17:28:58.895192 41281 quantize_decompress_robust.py:239] Layer 37 done in 98.4s [38/50] |
| I0222 17:28:58.895242 41281 quantize_decompress_robust.py:159] === Layer 38/50 === |
| I0222 17:29:01.427134 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:29:01.427341 41281 quip.py:389] mean square of Wr: 0.0004649522597901523 |
| I0222 17:29:01.427657 41281 quip.py:390] difference between Hr and Hr.T: 5.4836273193359375e-06 |
| I0222 17:29:01.427765 41281 quip.py:391] max abs of Hr: 4.539203643798828 |
| I0222 17:29:01.427869 41281 quip.py:392] min diag of Lhr: 1.125286340713501 |
| I0222 17:29:10.034443 41281 misc.py:25] /tmp/q2_temp/38_qkv.pt frob error: 0.14225123822689056 |
| I0222 17:29:10.034573 41281 misc.py:26] /tmp/q2_temp/38_qkv.pt proxy error: 0.012861372902989388 |
| I0222 17:29:10.465205 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:29:12.212570 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:29:12.212740 41281 quip.py:389] mean square of Wr: 0.002981953090056777 |
| I0222 17:29:12.213059 41281 quip.py:390] difference between Hr and Hr.T: 1.7404556274414062e-05 |
| I0222 17:29:12.213174 41281 quip.py:391] max abs of Hr: 17.114906311035156 |
| I0222 17:29:12.213247 41281 quip.py:392] min diag of Lhr: 2.1424813270568848 |
| I0222 17:29:20.123921 41281 misc.py:25] /tmp/q2_temp/38_o.pt frob error: 0.15805573761463165 |
| I0222 17:29:20.124044 41281 misc.py:26] /tmp/q2_temp/38_o.pt proxy error: 0.04310612753033638 |
| I0222 17:29:20.428670 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:29:31.775577 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:29:31.776129 41281 quip.py:389] mean square of Wr: 0.0007162276306189597 |
| I0222 17:29:31.776437 41281 quip.py:390] difference between Hr and Hr.T: 3.0994415283203125e-05 |
| I0222 17:29:31.776546 41281 quip.py:391] max abs of Hr: 27.901535034179688 |
| I0222 17:29:31.776667 41281 quip.py:392] min diag of Lhr: 2.8261985778808594 |
| I0222 17:29:52.148947 41281 misc.py:25] /tmp/q2_temp/38_up.pt frob error: 0.12117753177881241 |
| I0222 17:29:52.149100 41281 misc.py:26] /tmp/q2_temp/38_up.pt proxy error: 0.026187345385551453 |
| I0222 17:29:55.235634 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:30:03.659318 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:30:03.659649 41281 quip.py:389] mean square of Wr: 0.00100456434302032 |
| I0222 17:30:03.662342 41281 quip.py:390] difference between Hr and Hr.T: 7.808208465576172e-06 |
| I0222 17:30:03.663007 41281 quip.py:391] max abs of Hr: 5.292235851287842 |
| I0222 17:30:03.663133 41281 quip.py:392] min diag of Lhr: 1.5341774225234985 |
| I0222 17:30:38.234049 41281 misc.py:25] /tmp/q2_temp/38_down.pt frob error: 0.13655051589012146 |
| I0222 17:30:38.234185 41281 misc.py:26] /tmp/q2_temp/38_down.pt proxy error: 0.0450197197496891 |
| I0222 17:30:39.400652 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:30:39.754510 41281 quantize_decompress_robust.py:79] Saved progress for layer 38 (416 MB) |
| I0222 17:30:39.754859 41281 quantize_decompress_robust.py:239] Layer 38 done in 100.9s [39/50] |
| I0222 17:30:39.754904 41281 quantize_decompress_robust.py:159] === Layer 39/50 === |
| I0222 17:30:43.626119 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:30:43.626328 41281 quip.py:389] mean square of Wr: 0.0004742085875477642 |
| I0222 17:30:43.626660 41281 quip.py:390] difference between Hr and Hr.T: 6.4373016357421875e-06 |
| I0222 17:30:43.626776 41281 quip.py:391] max abs of Hr: 5.155056953430176 |
| I0222 17:30:43.626874 41281 quip.py:392] min diag of Lhr: 1.108538031578064 |
| I0222 17:30:53.242392 41281 misc.py:25] /tmp/q2_temp/39_qkv.pt frob error: 0.14164818823337555 |
| I0222 17:30:53.242525 41281 misc.py:26] /tmp/q2_temp/39_qkv.pt proxy error: 0.012975811026990414 |
| I0222 17:30:53.743164 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:30:56.704443 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:30:56.704645 41281 quip.py:389] mean square of Wr: 0.0029371781274676323 |
| I0222 17:30:56.705001 41281 quip.py:390] difference between Hr and Hr.T: 1.33514404296875e-05 |
| I0222 17:30:56.705120 41281 quip.py:391] max abs of Hr: 15.638252258300781 |
| I0222 17:30:56.705225 41281 quip.py:392] min diag of Lhr: 2.12136173248291 |
| I0222 17:31:05.321439 41281 misc.py:25] /tmp/q2_temp/39_o.pt frob error: 0.16524794697761536 |
| I0222 17:31:05.321572 41281 misc.py:26] /tmp/q2_temp/39_o.pt proxy error: 0.04239806905388832 |
| I0222 17:31:05.626926 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:31:17.846477 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:31:17.846981 41281 quip.py:389] mean square of Wr: 0.0007747419294901192 |
| I0222 17:31:17.847262 41281 quip.py:390] difference between Hr and Hr.T: 3.3855438232421875e-05 |
| I0222 17:31:17.847370 41281 quip.py:391] max abs of Hr: 31.930252075195312 |
| I0222 17:31:17.847476 41281 quip.py:392] min diag of Lhr: 2.943685531616211 |
| I0222 17:31:36.782448 41281 misc.py:25] /tmp/q2_temp/39_up.pt frob error: 0.12090174853801727 |
| I0222 17:31:36.782681 41281 misc.py:26] /tmp/q2_temp/39_up.pt proxy error: 0.026604142040014267 |
| I0222 17:31:39.895548 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:31:48.131342 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:31:48.131654 41281 quip.py:389] mean square of Wr: 0.0019248364260420203 |
| I0222 17:31:48.134399 41281 quip.py:390] difference between Hr and Hr.T: 1.3232231140136719e-05 |
| I0222 17:31:48.135063 41281 quip.py:391] max abs of Hr: 9.868791580200195 |
| I0222 17:31:48.135162 41281 quip.py:392] min diag of Lhr: 2.197420120239258 |
| I0222 17:32:21.327689 41281 misc.py:25] /tmp/q2_temp/39_down.pt frob error: 0.13046810030937195 |
| I0222 17:32:21.327833 41281 misc.py:26] /tmp/q2_temp/39_down.pt proxy error: 0.05030064284801483 |
| I0222 17:32:22.429401 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:32:22.918584 41281 quantize_decompress_robust.py:79] Saved progress for layer 39 (416 MB) |
| I0222 17:32:22.918936 41281 quantize_decompress_robust.py:239] Layer 39 done in 103.2s [40/50] |
| I0222 17:32:22.918983 41281 quantize_decompress_robust.py:159] === Layer 40/50 === |
| I0222 17:32:26.550603 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:32:26.550807 41281 quip.py:389] mean square of Wr: 0.0005524414591491222 |
| I0222 17:32:26.551135 41281 quip.py:390] difference between Hr and Hr.T: 8.821487426757812e-06 |
| I0222 17:32:26.551241 41281 quip.py:391] max abs of Hr: 5.823266506195068 |
| I0222 17:32:26.551345 41281 quip.py:392] min diag of Lhr: 1.2205114364624023 |
| I0222 17:32:36.124809 41281 misc.py:25] /tmp/q2_temp/40_qkv.pt frob error: 0.14088502526283264 |
| I0222 17:32:36.124941 41281 misc.py:26] /tmp/q2_temp/40_qkv.pt proxy error: 0.01438378170132637 |
| I0222 17:32:36.563655 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:32:39.344657 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:32:39.344820 41281 quip.py:389] mean square of Wr: 0.001978697720915079 |
| I0222 17:32:39.345142 41281 quip.py:390] difference between Hr and Hr.T: 1.7404556274414062e-05 |
| I0222 17:32:39.345253 41281 quip.py:391] max abs of Hr: 12.760272979736328 |
| I0222 17:32:39.345337 41281 quip.py:392] min diag of Lhr: 1.56758713722229 |
| I0222 17:32:48.432387 41281 misc.py:25] /tmp/q2_temp/40_o.pt frob error: 0.16856703162193298 |
| I0222 17:32:48.432525 41281 misc.py:26] /tmp/q2_temp/40_o.pt proxy error: 0.04066724330186844 |
| I0222 17:32:48.824878 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:33:01.361771 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:33:01.362297 41281 quip.py:389] mean square of Wr: 0.0009041193407028913 |
| I0222 17:33:01.362603 41281 quip.py:390] difference between Hr and Hr.T: 3.910064697265625e-05 |
| I0222 17:33:01.362707 41281 quip.py:391] max abs of Hr: 37.02832794189453 |
| I0222 17:33:01.362811 41281 quip.py:392] min diag of Lhr: 3.472799301147461 |
| I0222 17:33:20.985463 41281 misc.py:25] /tmp/q2_temp/40_up.pt frob error: 0.11979907006025314 |
| I0222 17:33:20.985596 41281 misc.py:26] /tmp/q2_temp/40_up.pt proxy error: 0.030246613547205925 |
| I0222 17:33:23.971998 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:33:32.147084 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:33:32.147403 41281 quip.py:389] mean square of Wr: 0.002410479821264744 |
| I0222 17:33:32.150135 41281 quip.py:390] difference between Hr and Hr.T: 1.239776611328125e-05 |
| I0222 17:33:32.150802 41281 quip.py:391] max abs of Hr: 11.594589233398438 |
| I0222 17:33:32.150908 41281 quip.py:392] min diag of Lhr: 2.5257797241210938 |
| I0222 17:34:06.431101 41281 misc.py:25] /tmp/q2_temp/40_down.pt frob error: 0.12315817177295685 |
| I0222 17:34:06.431246 41281 misc.py:26] /tmp/q2_temp/40_down.pt proxy error: 0.0547143816947937 |
| I0222 17:34:07.536672 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:34:07.907965 41281 quantize_decompress_robust.py:79] Saved progress for layer 40 (416 MB) |
| I0222 17:34:07.908330 41281 quantize_decompress_robust.py:239] Layer 40 done in 105.0s [41/50] |
| I0222 17:34:07.908376 41281 quantize_decompress_robust.py:159] === Layer 41/50 === |
| I0222 17:34:11.701276 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:34:11.701483 41281 quip.py:389] mean square of Wr: 0.0005951765342615545 |
| I0222 17:34:11.701854 41281 quip.py:390] difference between Hr and Hr.T: 8.106231689453125e-06 |
| I0222 17:34:11.701965 41281 quip.py:391] max abs of Hr: 5.605987548828125 |
| I0222 17:34:11.702074 41281 quip.py:392] min diag of Lhr: 1.3015254735946655 |
| I0222 17:34:21.222673 41281 misc.py:25] /tmp/q2_temp/41_qkv.pt frob error: 0.1343664526939392 |
| I0222 17:34:21.222804 41281 misc.py:26] /tmp/q2_temp/41_qkv.pt proxy error: 0.014540234580636024 |
| I0222 17:34:21.735688 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:34:24.536676 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:34:24.536859 41281 quip.py:389] mean square of Wr: 0.004308158531785011 |
| I0222 17:34:24.537177 41281 quip.py:390] difference between Hr and Hr.T: 5.7220458984375e-05 |
| I0222 17:34:24.537294 41281 quip.py:391] max abs of Hr: 31.282413482666016 |
| I0222 17:34:24.537392 41281 quip.py:392] min diag of Lhr: 2.184490919113159 |
| I0222 17:34:32.930947 41281 misc.py:25] /tmp/q2_temp/41_o.pt frob error: 0.15963491797447205 |
| I0222 17:34:32.931095 41281 misc.py:26] /tmp/q2_temp/41_o.pt proxy error: 0.035108812153339386 |
| I0222 17:34:33.329403 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:34:45.653675 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:34:45.654185 41281 quip.py:389] mean square of Wr: 0.001043360331095755 |
| I0222 17:34:45.654516 41281 quip.py:390] difference between Hr and Hr.T: 5.7220458984375e-05 |
| I0222 17:34:45.654626 41281 quip.py:391] max abs of Hr: 43.513973236083984 |
| I0222 17:34:45.654743 41281 quip.py:392] min diag of Lhr: 3.71694016456604 |
| I0222 17:35:05.848992 41281 misc.py:25] /tmp/q2_temp/41_up.pt frob error: 0.11783790588378906 |
| I0222 17:35:05.849114 41281 misc.py:26] /tmp/q2_temp/41_up.pt proxy error: 0.03169333562254906 |
| I0222 17:35:09.017364 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:35:17.445218 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:35:17.445534 41281 quip.py:389] mean square of Wr: 0.0017985682934522629 |
| I0222 17:35:17.448257 41281 quip.py:390] difference between Hr and Hr.T: 9.417533874511719e-06 |
| I0222 17:35:17.448923 41281 quip.py:391] max abs of Hr: 8.937530517578125 |
| I0222 17:35:17.449038 41281 quip.py:392] min diag of Lhr: 2.2541558742523193 |
| I0222 17:35:52.729675 41281 misc.py:25] /tmp/q2_temp/41_down.pt frob error: 0.12045308202505112 |
| I0222 17:35:52.729840 41281 misc.py:26] /tmp/q2_temp/41_down.pt proxy error: 0.05715218558907509 |
| I0222 17:35:53.835731 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:35:54.348654 41281 quantize_decompress_robust.py:79] Saved progress for layer 41 (416 MB) |
| I0222 17:35:54.349010 41281 quantize_decompress_robust.py:239] Layer 41 done in 106.4s [42/50] |
| I0222 17:35:54.349057 41281 quantize_decompress_robust.py:159] === Layer 42/50 === |
| I0222 17:35:57.926123 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:35:57.926332 41281 quip.py:389] mean square of Wr: 0.0005745256785303354 |
| I0222 17:35:57.926678 41281 quip.py:390] difference between Hr and Hr.T: 9.059906005859375e-06 |
| I0222 17:35:57.926791 41281 quip.py:391] max abs of Hr: 6.104416370391846 |
| I0222 17:35:57.926898 41281 quip.py:392] min diag of Lhr: 1.3129786252975464 |
| I0222 17:36:07.522517 41281 misc.py:25] /tmp/q2_temp/42_qkv.pt frob error: 0.130647212266922 |
| I0222 17:36:07.522648 41281 misc.py:26] /tmp/q2_temp/42_qkv.pt proxy error: 0.017144549638032913 |
| I0222 17:36:08.057166 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:36:11.127071 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:36:11.127242 41281 quip.py:389] mean square of Wr: 0.003498575184494257 |
| I0222 17:36:11.127576 41281 quip.py:390] difference between Hr and Hr.T: 2.09808349609375e-05 |
| I0222 17:36:11.127695 41281 quip.py:391] max abs of Hr: 20.473690032958984 |
| I0222 17:36:11.127792 41281 quip.py:392] min diag of Lhr: 2.0684096813201904 |
| I0222 17:36:19.430713 41281 misc.py:25] /tmp/q2_temp/42_o.pt frob error: 0.1846236288547516 |
| I0222 17:36:19.430843 41281 misc.py:26] /tmp/q2_temp/42_o.pt proxy error: 0.03632164001464844 |
| I0222 17:36:19.739162 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:36:32.291351 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:36:32.291857 41281 quip.py:389] mean square of Wr: 0.0011085006408393383 |
| I0222 17:36:32.292137 41281 quip.py:390] difference between Hr and Hr.T: 6.29425048828125e-05 |
| I0222 17:36:32.292247 41281 quip.py:391] max abs of Hr: 49.661380767822266 |
| I0222 17:36:32.292353 41281 quip.py:392] min diag of Lhr: 3.9979658126831055 |
| I0222 17:36:52.183720 41281 misc.py:25] /tmp/q2_temp/42_up.pt frob error: 0.1163967177271843 |
| I0222 17:36:52.183851 41281 misc.py:26] /tmp/q2_temp/42_up.pt proxy error: 0.03116638772189617 |
| I0222 17:36:55.419281 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:37:02.836043 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:37:02.836355 41281 quip.py:389] mean square of Wr: 0.001965958159416914 |
| I0222 17:37:02.839095 41281 quip.py:390] difference between Hr and Hr.T: 1.5497207641601562e-05 |
| I0222 17:37:02.839759 41281 quip.py:391] max abs of Hr: 10.044549942016602 |
| I0222 17:37:02.839852 41281 quip.py:392] min diag of Lhr: 2.3650548458099365 |
| I0222 17:37:35.752963 41281 misc.py:25] /tmp/q2_temp/42_down.pt frob error: 0.1204208955168724 |
| I0222 17:37:35.753110 41281 misc.py:26] /tmp/q2_temp/42_down.pt proxy error: 0.056361373513936996 |
| I0222 17:37:36.822703 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:37:37.182615 41281 quantize_decompress_robust.py:79] Saved progress for layer 42 (416 MB) |
| I0222 17:37:37.182983 41281 quantize_decompress_robust.py:239] Layer 42 done in 102.8s [43/50] |
| I0222 17:37:37.183035 41281 quantize_decompress_robust.py:159] === Layer 43/50 === |
| I0222 17:37:40.432783 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:37:40.432976 41281 quip.py:389] mean square of Wr: 0.0006172220455482602 |
| I0222 17:37:40.433305 41281 quip.py:390] difference between Hr and Hr.T: 9.059906005859375e-06 |
| I0222 17:37:40.433413 41281 quip.py:391] max abs of Hr: 6.220371246337891 |
| I0222 17:37:40.433520 41281 quip.py:392] min diag of Lhr: 1.3771648406982422 |
| I0222 17:37:49.333635 41281 misc.py:25] /tmp/q2_temp/43_qkv.pt frob error: 0.1283273547887802 |
| I0222 17:37:49.333761 41281 misc.py:26] /tmp/q2_temp/43_qkv.pt proxy error: 0.018901877105236053 |
| I0222 17:37:49.855780 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:37:51.648132 41281 quip.py:388] mean square of W: 0.9999999403953552 |
| I0222 17:37:51.648317 41281 quip.py:389] mean square of Wr: 0.0019482746720314026 |
| I0222 17:37:51.648663 41281 quip.py:390] difference between Hr and Hr.T: 1.6927719116210938e-05 |
| I0222 17:37:51.648774 41281 quip.py:391] max abs of Hr: 13.314170837402344 |
| I0222 17:37:51.648867 41281 quip.py:392] min diag of Lhr: 1.7113054990768433 |
| I0222 17:38:00.024250 41281 misc.py:25] /tmp/q2_temp/43_o.pt frob error: 0.16441930830478668 |
| I0222 17:38:00.024374 41281 misc.py:26] /tmp/q2_temp/43_o.pt proxy error: 0.04142212122678757 |
| I0222 17:38:00.332549 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:38:11.796174 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:38:11.796725 41281 quip.py:389] mean square of Wr: 0.0011742719216272235 |
| I0222 17:38:11.797019 41281 quip.py:390] difference between Hr and Hr.T: 6.008148193359375e-05 |
| I0222 17:38:11.797132 41281 quip.py:391] max abs of Hr: 50.61994552612305 |
| I0222 17:38:11.797255 41281 quip.py:392] min diag of Lhr: 4.178556442260742 |
| I0222 17:38:31.035335 41281 misc.py:25] /tmp/q2_temp/43_up.pt frob error: 0.11531689018011093 |
| I0222 17:38:31.035495 41281 misc.py:26] /tmp/q2_temp/43_up.pt proxy error: 0.03236864507198334 |
| I0222 17:38:34.117748 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:38:41.464822 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:38:41.465187 41281 quip.py:389] mean square of Wr: 0.0008768062689341605 |
| I0222 17:38:41.468789 41281 quip.py:390] difference between Hr and Hr.T: 7.092952728271484e-06 |
| I0222 17:38:41.469451 41281 quip.py:391] max abs of Hr: 4.7139811515808105 |
| I0222 17:38:41.469555 41281 quip.py:392] min diag of Lhr: 1.5886529684066772 |
| I0222 17:39:16.338649 41281 misc.py:25] /tmp/q2_temp/43_down.pt frob error: 0.12032882869243622 |
| I0222 17:39:16.338775 41281 misc.py:26] /tmp/q2_temp/43_down.pt proxy error: 0.05536403879523277 |
| I0222 17:39:17.406630 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:39:17.781286 41281 quantize_decompress_robust.py:79] Saved progress for layer 43 (416 MB) |
| I0222 17:39:17.781647 41281 quantize_decompress_robust.py:239] Layer 43 done in 100.6s [44/50] |
| I0222 17:39:17.781695 41281 quantize_decompress_robust.py:159] === Layer 44/50 === |
| I0222 17:39:21.556362 41281 quip.py:388] mean square of W: 0.9999998807907104 |
| I0222 17:39:21.556633 41281 quip.py:389] mean square of Wr: 0.00067420385312289 |
| I0222 17:39:21.556974 41281 quip.py:390] difference between Hr and Hr.T: 1.049041748046875e-05 |
| I0222 17:39:21.557086 41281 quip.py:391] max abs of Hr: 7.130764484405518 |
| I0222 17:39:21.557206 41281 quip.py:392] min diag of Lhr: 1.249155044555664 |
| I0222 17:39:31.130285 41281 misc.py:25] /tmp/q2_temp/44_qkv.pt frob error: 0.13009586930274963 |
| I0222 17:39:31.130447 41281 misc.py:26] /tmp/q2_temp/44_qkv.pt proxy error: 0.01729588396847248 |
| I0222 17:39:31.569743 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:39:34.653579 41281 quip.py:388] mean square of W: 0.9999999403953552 |
| I0222 17:39:34.653763 41281 quip.py:389] mean square of Wr: 0.0029798499308526516 |
| I0222 17:39:34.654105 41281 quip.py:390] difference between Hr and Hr.T: 2.288818359375e-05 |
| I0222 17:39:34.654213 41281 quip.py:391] max abs of Hr: 20.548295974731445 |
| I0222 17:39:34.654308 41281 quip.py:392] min diag of Lhr: 2.1319739818573 |
| I0222 17:39:43.718193 41281 misc.py:25] /tmp/q2_temp/44_o.pt frob error: 0.1669449657201767 |
| I0222 17:39:43.718320 41281 misc.py:26] /tmp/q2_temp/44_o.pt proxy error: 0.041543684899806976 |
| I0222 17:39:44.027964 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:39:56.423702 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:39:56.424262 41281 quip.py:389] mean square of Wr: 0.001026699086651206 |
| I0222 17:39:56.424558 41281 quip.py:390] difference between Hr and Hr.T: 5.53131103515625e-05 |
| I0222 17:39:56.424670 41281 quip.py:391] max abs of Hr: 44.78391647338867 |
| I0222 17:39:56.424781 41281 quip.py:392] min diag of Lhr: 3.828829765319824 |
| I0222 17:40:17.350878 41281 misc.py:25] /tmp/q2_temp/44_up.pt frob error: 0.11415864527225494 |
| I0222 17:40:17.351009 41281 misc.py:26] /tmp/q2_temp/44_up.pt proxy error: 0.03461460396647453 |
| I0222 17:40:20.515066 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:40:28.969136 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:40:28.969473 41281 quip.py:389] mean square of Wr: 0.0017801079666242003 |
| I0222 17:40:28.972199 41281 quip.py:390] difference between Hr and Hr.T: 1.5914440155029297e-05 |
| I0222 17:40:28.972851 41281 quip.py:391] max abs of Hr: 9.29697036743164 |
| I0222 17:40:28.972953 41281 quip.py:392] min diag of Lhr: 2.281700849533081 |
| I0222 17:41:04.324376 41281 misc.py:25] /tmp/q2_temp/44_down.pt frob error: 0.1199658066034317 |
| I0222 17:41:04.324568 41281 misc.py:26] /tmp/q2_temp/44_down.pt proxy error: 0.05496646463871002 |
| I0222 17:41:05.504332 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:41:05.857841 41281 quantize_decompress_robust.py:79] Saved progress for layer 44 (416 MB) |
| I0222 17:41:05.858205 41281 quantize_decompress_robust.py:239] Layer 44 done in 108.1s [45/50] |
| I0222 17:41:05.858254 41281 quantize_decompress_robust.py:159] === Layer 45/50 === |
| I0222 17:41:09.534841 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:41:09.535034 41281 quip.py:389] mean square of Wr: 0.0007460000924766064 |
| I0222 17:41:09.535366 41281 quip.py:390] difference between Hr and Hr.T: 1.4066696166992188e-05 |
| I0222 17:41:09.535475 41281 quip.py:391] max abs of Hr: 7.439826965332031 |
| I0222 17:41:09.535603 41281 quip.py:392] min diag of Lhr: 1.5009851455688477 |
| I0222 17:41:19.019804 41281 misc.py:25] /tmp/q2_temp/45_qkv.pt frob error: 0.12372434139251709 |
| I0222 17:41:19.019933 41281 misc.py:26] /tmp/q2_temp/45_qkv.pt proxy error: 0.017686786130070686 |
| I0222 17:41:19.461789 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:41:22.428205 41281 quip.py:388] mean square of W: 0.9999998807907104 |
| I0222 17:41:22.428369 41281 quip.py:389] mean square of Wr: 0.001908112084493041 |
| I0222 17:41:22.428694 41281 quip.py:390] difference between Hr and Hr.T: 3.0040740966796875e-05 |
| I0222 17:41:22.428812 41281 quip.py:391] max abs of Hr: 15.934697151184082 |
| I0222 17:41:22.428894 41281 quip.py:392] min diag of Lhr: 1.5172961950302124 |
| I0222 17:41:31.522227 41281 misc.py:25] /tmp/q2_temp/45_o.pt frob error: 0.177424356341362 |
| I0222 17:41:31.522339 41281 misc.py:26] /tmp/q2_temp/45_o.pt proxy error: 0.02925078198313713 |
| I0222 17:41:31.828866 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:41:44.471624 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:41:44.472123 41281 quip.py:389] mean square of Wr: 0.0009110037353821099 |
| I0222 17:41:44.472407 41281 quip.py:390] difference between Hr and Hr.T: 5.245208740234375e-05 |
| I0222 17:41:44.472515 41281 quip.py:391] max abs of Hr: 39.35268020629883 |
| I0222 17:41:44.472613 41281 quip.py:392] min diag of Lhr: 3.7125744819641113 |
| I0222 17:42:03.284923 41281 misc.py:25] /tmp/q2_temp/45_up.pt frob error: 0.11348067969083786 |
| I0222 17:42:03.285040 41281 misc.py:26] /tmp/q2_temp/45_up.pt proxy error: 0.03381526470184326 |
| I0222 17:42:06.358918 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:42:14.636089 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:42:14.636398 41281 quip.py:389] mean square of Wr: 0.001670267665758729 |
| I0222 17:42:14.639147 41281 quip.py:390] difference between Hr and Hr.T: 1.33514404296875e-05 |
| I0222 17:42:14.639806 41281 quip.py:391] max abs of Hr: 9.368295669555664 |
| I0222 17:42:14.639899 41281 quip.py:392] min diag of Lhr: 2.1838929653167725 |
| I0222 17:42:47.728680 41281 misc.py:25] /tmp/q2_temp/45_down.pt frob error: 0.12386368960142136 |
| I0222 17:42:47.728830 41281 misc.py:26] /tmp/q2_temp/45_down.pt proxy error: 0.051695071160793304 |
| I0222 17:42:48.831911 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:42:49.204843 41281 quantize_decompress_robust.py:79] Saved progress for layer 45 (416 MB) |
| I0222 17:42:49.205203 41281 quantize_decompress_robust.py:239] Layer 45 done in 103.3s [46/50] |
| I0222 17:42:49.205252 41281 quantize_decompress_robust.py:159] === Layer 46/50 === |
| I0222 17:42:52.835701 41281 quip.py:388] mean square of W: 0.9999999403953552 |
| I0222 17:42:52.835890 41281 quip.py:389] mean square of Wr: 0.000675210147164762 |
| I0222 17:42:52.836371 41281 quip.py:390] difference between Hr and Hr.T: 1.1920928955078125e-05 |
| I0222 17:42:52.836485 41281 quip.py:391] max abs of Hr: 7.664760589599609 |
| I0222 17:42:52.836575 41281 quip.py:392] min diag of Lhr: 1.3238660097122192 |
| I0222 17:43:02.622006 41281 misc.py:25] /tmp/q2_temp/46_qkv.pt frob error: 0.1305638998746872 |
| I0222 17:43:02.622130 41281 misc.py:26] /tmp/q2_temp/46_qkv.pt proxy error: 0.015190163627266884 |
| I0222 17:43:03.063101 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:43:06.134177 41281 quip.py:388] mean square of W: 0.9999999403953552 |
| I0222 17:43:06.134345 41281 quip.py:389] mean square of Wr: 0.0016818322474136949 |
| I0222 17:43:06.134665 41281 quip.py:390] difference between Hr and Hr.T: 1.7642974853515625e-05 |
| I0222 17:43:06.134775 41281 quip.py:391] max abs of Hr: 15.07264232635498 |
| I0222 17:43:06.134859 41281 quip.py:392] min diag of Lhr: 1.5468155145645142 |
| I0222 17:43:15.318524 41281 misc.py:25] /tmp/q2_temp/46_o.pt frob error: 0.15807661414146423 |
| I0222 17:43:15.318639 41281 misc.py:26] /tmp/q2_temp/46_o.pt proxy error: 0.03787262365221977 |
| I0222 17:43:15.626783 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:43:27.901609 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:43:27.902108 41281 quip.py:389] mean square of Wr: 0.0007986149284988642 |
| I0222 17:43:27.902387 41281 quip.py:390] difference between Hr and Hr.T: 5.054473876953125e-05 |
| I0222 17:43:27.902498 41281 quip.py:391] max abs of Hr: 38.38069534301758 |
| I0222 17:43:27.902595 41281 quip.py:392] min diag of Lhr: 3.467820167541504 |
| I0222 17:43:46.878246 41281 misc.py:25] /tmp/q2_temp/46_up.pt frob error: 0.11384608596563339 |
| I0222 17:43:46.878369 41281 misc.py:26] /tmp/q2_temp/46_up.pt proxy error: 0.031171593815088272 |
| I0222 17:43:49.999925 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:43:58.135686 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:43:58.135993 41281 quip.py:389] mean square of Wr: 0.0015732520259916782 |
| I0222 17:43:58.138747 41281 quip.py:390] difference between Hr and Hr.T: 1.6570091247558594e-05 |
| I0222 17:43:58.139410 41281 quip.py:391] max abs of Hr: 8.821403503417969 |
| I0222 17:43:58.139500 41281 quip.py:392] min diag of Lhr: 2.0250465869903564 |
| I0222 17:44:31.333377 41281 misc.py:25] /tmp/q2_temp/46_down.pt frob error: 0.13264307379722595 |
| I0222 17:44:31.333499 41281 misc.py:26] /tmp/q2_temp/46_down.pt proxy error: 0.042417991906404495 |
| I0222 17:44:32.409160 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:44:32.765928 41281 quantize_decompress_robust.py:79] Saved progress for layer 46 (416 MB) |
| I0222 17:44:32.766318 41281 quantize_decompress_robust.py:239] Layer 46 done in 103.6s [47/50] |
| I0222 17:44:32.766377 41281 quantize_decompress_robust.py:159] === Layer 47/50 === |
| I0222 17:44:36.433660 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:44:36.433859 41281 quip.py:389] mean square of Wr: 0.0007433627615682781 |
| I0222 17:44:36.434178 41281 quip.py:390] difference between Hr and Hr.T: 1.430511474609375e-05 |
| I0222 17:44:36.434283 41281 quip.py:391] max abs of Hr: 7.961364269256592 |
| I0222 17:44:36.434365 41281 quip.py:392] min diag of Lhr: 1.4590827226638794 |
| I0222 17:44:45.821644 41281 misc.py:25] /tmp/q2_temp/47_qkv.pt frob error: 0.12677960097789764 |
| I0222 17:44:45.821789 41281 misc.py:26] /tmp/q2_temp/47_qkv.pt proxy error: 0.014667291194200516 |
| I0222 17:44:46.249361 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:44:49.233303 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:44:49.233471 41281 quip.py:389] mean square of Wr: 0.0021708174608647823 |
| I0222 17:44:49.233791 41281 quip.py:390] difference between Hr and Hr.T: 2.4318695068359375e-05 |
| I0222 17:44:49.233901 41281 quip.py:391] max abs of Hr: 18.1649169921875 |
| I0222 17:44:49.233982 41281 quip.py:392] min diag of Lhr: 1.5288821458816528 |
| I0222 17:44:58.218040 41281 misc.py:25] /tmp/q2_temp/47_o.pt frob error: 0.17813728749752045 |
| I0222 17:44:58.218153 41281 misc.py:26] /tmp/q2_temp/47_o.pt proxy error: 0.02830180712044239 |
| I0222 17:44:58.524616 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:45:10.609598 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:45:10.610100 41281 quip.py:389] mean square of Wr: 0.0008006872958503664 |
| I0222 17:45:10.610381 41281 quip.py:390] difference between Hr and Hr.T: 6.67572021484375e-05 |
| I0222 17:45:10.610490 41281 quip.py:391] max abs of Hr: 43.03177261352539 |
| I0222 17:45:10.610587 41281 quip.py:392] min diag of Lhr: 3.2841362953186035 |
| I0222 17:45:29.275573 41281 misc.py:25] /tmp/q2_temp/47_up.pt frob error: 0.11482042074203491 |
| I0222 17:45:29.275688 41281 misc.py:26] /tmp/q2_temp/47_up.pt proxy error: 0.028380345553159714 |
| I0222 17:45:32.284625 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:45:40.424976 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:45:40.425284 41281 quip.py:389] mean square of Wr: 0.001203504391014576 |
| I0222 17:45:40.428028 41281 quip.py:390] difference between Hr and Hr.T: 1.2159347534179688e-05 |
| I0222 17:45:40.428689 41281 quip.py:391] max abs of Hr: 6.7389302253723145 |
| I0222 17:45:40.428783 41281 quip.py:392] min diag of Lhr: 1.7179012298583984 |
| I0222 17:46:14.531841 41281 misc.py:25] /tmp/q2_temp/47_down.pt frob error: 0.13468386232852936 |
| I0222 17:46:14.532011 41281 misc.py:26] /tmp/q2_temp/47_down.pt proxy error: 0.03973129764199257 |
| I0222 17:46:15.618096 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:46:15.986444 41281 quantize_decompress_robust.py:79] Saved progress for layer 47 (416 MB) |
| I0222 17:46:15.986818 41281 quantize_decompress_robust.py:239] Layer 47 done in 103.2s [48/50] |
| I0222 17:46:15.986868 41281 quantize_decompress_robust.py:159] === Layer 48/50 === |
| I0222 17:46:19.631489 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:46:19.631689 41281 quip.py:389] mean square of Wr: 0.000754921231418848 |
| I0222 17:46:19.632178 41281 quip.py:390] difference between Hr and Hr.T: 1.71661376953125e-05 |
| I0222 17:46:19.632298 41281 quip.py:391] max abs of Hr: 10.236075401306152 |
| I0222 17:46:19.632393 41281 quip.py:392] min diag of Lhr: 1.4281150102615356 |
| I0222 17:46:29.220347 41281 misc.py:25] /tmp/q2_temp/48_qkv.pt frob error: 0.13410456478595734 |
| I0222 17:46:29.220464 41281 misc.py:26] /tmp/q2_temp/48_qkv.pt proxy error: 0.01385707687586546 |
| I0222 17:46:29.660462 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:46:32.637475 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:46:32.637653 41281 quip.py:389] mean square of Wr: 0.00251777027733624 |
| I0222 17:46:32.637976 41281 quip.py:390] difference between Hr and Hr.T: 3.409385681152344e-05 |
| I0222 17:46:32.638087 41281 quip.py:391] max abs of Hr: 20.66263771057129 |
| I0222 17:46:32.638169 41281 quip.py:392] min diag of Lhr: 1.4765782356262207 |
| I0222 17:46:41.427262 41281 misc.py:25] /tmp/q2_temp/48_o.pt frob error: 0.17184792459011078 |
| I0222 17:46:41.427381 41281 misc.py:26] /tmp/q2_temp/48_o.pt proxy error: 0.026211827993392944 |
| I0222 17:46:41.738264 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:46:53.359834 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:46:53.360346 41281 quip.py:389] mean square of Wr: 0.0008039108361117542 |
| I0222 17:46:53.360644 41281 quip.py:390] difference between Hr and Hr.T: 7.82012939453125e-05 |
| I0222 17:46:53.360753 41281 quip.py:391] max abs of Hr: 45.55986785888672 |
| I0222 17:46:53.360859 41281 quip.py:392] min diag of Lhr: 3.2289552688598633 |
| I0222 17:47:13.248890 41281 misc.py:25] /tmp/q2_temp/48_up.pt frob error: 0.11571403592824936 |
| I0222 17:47:13.249015 41281 misc.py:26] /tmp/q2_temp/48_up.pt proxy error: 0.019106069579720497 |
| I0222 17:47:16.300189 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:47:23.347915 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:47:23.348222 41281 quip.py:389] mean square of Wr: 0.00040265751886181533 |
| I0222 17:47:23.350938 41281 quip.py:390] difference between Hr and Hr.T: 7.808208465576172e-06 |
| I0222 17:47:23.351602 41281 quip.py:391] max abs of Hr: 2.844658613204956 |
| I0222 17:47:23.351728 41281 quip.py:392] min diag of Lhr: 0.9162649512290955 |
| I0222 17:47:57.831313 41281 misc.py:25] /tmp/q2_temp/48_down.pt frob error: 0.1457708328962326 |
| I0222 17:47:57.831446 41281 misc.py:26] /tmp/q2_temp/48_down.pt proxy error: 0.02626357413828373 |
| I0222 17:47:58.911984 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:47:59.387649 41281 quantize_decompress_robust.py:79] Saved progress for layer 48 (416 MB) |
| I0222 17:47:59.387997 41281 quantize_decompress_robust.py:239] Layer 48 done in 103.4s [49/50] |
| I0222 17:47:59.388040 41281 quantize_decompress_robust.py:159] === Layer 49/50 === |
| I0222 17:48:02.138954 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:48:02.139144 41281 quip.py:389] mean square of Wr: 0.0004956302582286298 |
| I0222 17:48:02.139636 41281 quip.py:390] difference between Hr and Hr.T: 1.4781951904296875e-05 |
| I0222 17:48:02.139755 41281 quip.py:391] max abs of Hr: 7.831757068634033 |
| I0222 17:48:02.139852 41281 quip.py:392] min diag of Lhr: 1.1191297769546509 |
| I0222 17:48:11.422336 41281 misc.py:25] /tmp/q2_temp/49_qkv.pt frob error: 0.13562406599521637 |
| I0222 17:48:11.422461 41281 misc.py:26] /tmp/q2_temp/49_qkv.pt proxy error: 0.013013405725359917 |
| I0222 17:48:11.861971 41281 quantize_decompress_robust.py:180] qkv done |
| I0222 17:48:13.744464 41281 quip.py:388] mean square of W: 1.0000001192092896 |
| I0222 17:48:13.744645 41281 quip.py:389] mean square of Wr: 0.00308518810197711 |
| I0222 17:48:13.744988 41281 quip.py:390] difference between Hr and Hr.T: 0.0001049041748046875 |
| I0222 17:48:13.745099 41281 quip.py:391] max abs of Hr: 36.214900970458984 |
| I0222 17:48:13.745178 41281 quip.py:392] min diag of Lhr: 1.334347128868103 |
| I0222 17:48:22.220154 41281 misc.py:25] /tmp/q2_temp/49_o.pt frob error: 0.17522643506526947 |
| I0222 17:48:22.220284 41281 misc.py:26] /tmp/q2_temp/49_o.pt proxy error: 0.010520759969949722 |
| I0222 17:48:22.526140 41281 quantize_decompress_robust.py:194] o done |
| I0222 17:48:33.827397 41281 quip.py:388] mean square of W: 1.0 |
| I0222 17:48:33.827907 41281 quip.py:389] mean square of Wr: 0.0008213515975512564 |
| I0222 17:48:33.828197 41281 quip.py:390] difference between Hr and Hr.T: 0.0001049041748046875 |
| I0222 17:48:33.828306 41281 quip.py:391] max abs of Hr: 52.53691482543945 |
| I0222 17:48:33.828424 41281 quip.py:392] min diag of Lhr: 3.1403679847717285 |
| I0222 17:48:53.650333 41281 misc.py:25] /tmp/q2_temp/49_up.pt frob error: 0.1184532642364502 |
| I0222 17:48:53.650495 41281 misc.py:26] /tmp/q2_temp/49_up.pt proxy error: 0.009319067932665348 |
| I0222 17:48:56.715537 41281 quantize_decompress_robust.py:212] up done |
| I0222 17:49:04.843454 41281 quip.py:388] mean square of W: 1.000000238418579 |
| I0222 17:49:04.843780 41281 quip.py:389] mean square of Wr: 0.00036129806539975107 |
| I0222 17:49:04.846487 41281 quip.py:390] difference between Hr and Hr.T: 1.621246337890625e-05 |
| I0222 17:49:04.847155 41281 quip.py:391] max abs of Hr: 3.3946971893310547 |
| I0222 17:49:04.847256 41281 quip.py:392] min diag of Lhr: 0.7430058717727661 |
| I0222 17:49:39.330766 41281 misc.py:25] /tmp/q2_temp/49_down.pt frob error: 0.17033936083316803 |
| I0222 17:49:39.330888 41281 misc.py:26] /tmp/q2_temp/49_down.pt proxy error: 0.007896381430327892 |
| I0222 17:49:40.419783 41281 quantize_decompress_robust.py:226] down done |
| I0222 17:49:40.784071 41281 quantize_decompress_robust.py:79] Saved progress for layer 49 (416 MB) |
| I0222 17:49:40.784420 41281 quantize_decompress_robust.py:239] Layer 49 done in 101.4s [50/50] |
| I0222 17:49:41.072075 41281 quantize_decompress_robust.py:244] Quantization done in 4516s (75.3min) |
| I0222 17:49:41.072382 41281 quantize_decompress_robust.py:252] All 50 layers done! Assembling final FP16 model... |
|
Loading weights:   0%|          | 0/453 [00:00<?, ?it/s]
Loading weights:   9%|▉         | 41/453 [00:00<00:00, 4581.86it/s, Materializing param=model.layers.4.mlp.gate_proj.weight]
Loading weights: 9%|▉ | 41/453 [00:00<00:00, 4565.08it/s, Materializing param=model.layers.4.mlp.gate_proj.weight]
Loading weights: 9%|▉ | 42/453 [00:00<00:00, 4636.42it/s, Materializing param=model.layers.4.mlp.up_proj.weight]
Loading weights: 9%|▉ | 42/453 [00:00<00:00, 4621.22it/s, Materializing param=model.layers.4.mlp.up_proj.weight]
Loading weights: 9%|▉ | 43/453 [00:00<00:00, 4669.99it/s, Materializing param=model.layers.4.post_attention_layernorm.weight]
Loading weights: 9%|▉ | 43/453 [00:00<00:00, 4653.48it/s, Materializing param=model.layers.4.post_attention_layernorm.weight]
Loading weights: 10%|▉ | 44/453 [00:00<00:00, 4720.05it/s, Materializing param=model.layers.4.self_attn.k_proj.weight]
Loading weights: 10%|▉ | 44/453 [00:00<00:00, 4704.17it/s, Materializing param=model.layers.4.self_attn.k_proj.weight]
Loading weights: 10%|▉ | 45/453 [00:00<00:00, 4770.47it/s, Materializing param=model.layers.4.self_attn.o_proj.weight]
Loading weights: 10%|▉ | 45/453 [00:00<00:00, 4754.97it/s, Materializing param=model.layers.4.self_attn.o_proj.weight]
Loading weights: 10%|█ | 46/453 [00:00<00:00, 4820.68it/s, Materializing param=model.layers.4.self_attn.q_proj.weight]
Loading weights: 10%|█ | 46/453 [00:00<00:00, 4805.31it/s, Materializing param=model.layers.4.self_attn.q_proj.weight]
Loading weights: 10%|█ | 47/453 [00:00<00:00, 4870.95it/s, Materializing param=model.layers.4.self_attn.v_proj.weight]
Loading weights: 10%|█ | 47/453 [00:00<00:00, 4856.19it/s, Materializing param=model.layers.4.self_attn.v_proj.weight]
Loading weights: 11%|█ | 48/453 [00:00<00:00, 4919.76it/s, Materializing param=model.layers.5.input_layernorm.weight]
Loading weights: 11%|█ | 48/453 [00:00<00:00, 4904.42it/s, Materializing param=model.layers.5.input_layernorm.weight]
Loading weights: 11%|█ | 49/453 [00:00<00:00, 4967.27it/s, Materializing param=model.layers.5.mlp.down_proj.weight]
Loading weights: 11%|█ | 49/453 [00:00<00:00, 4952.55it/s, Materializing param=model.layers.5.mlp.down_proj.weight]
Loading weights: 11%|█ | 50/453 [00:00<00:00, 5014.47it/s, Materializing param=model.layers.5.mlp.gate_proj.weight]
Loading weights: 11%|█ | 50/453 [00:00<00:00, 4999.17it/s, Materializing param=model.layers.5.mlp.gate_proj.weight]
Loading weights: 11%|█▏ | 51/453 [00:00<00:00, 5059.35it/s, Materializing param=model.layers.5.mlp.up_proj.weight]
Loading weights: 11%|█▏ | 51/453 [00:00<00:00, 5044.32it/s, Materializing param=model.layers.5.mlp.up_proj.weight]
Loading weights: 11%|█▏ | 52/453 [00:00<00:00, 5105.55it/s, Materializing param=model.layers.5.post_attention_layernorm.weight]
Loading weights: 11%|█▏ | 52/453 [00:00<00:00, 5089.82it/s, Materializing param=model.layers.5.post_attention_layernorm.weight]
Loading weights: 12%|█▏ | 53/453 [00:00<00:00, 4842.78it/s, Materializing param=model.layers.5.self_attn.k_proj.weight]
Loading weights: 12%|█▏ | 53/453 [00:00<00:00, 4826.38it/s, Materializing param=model.layers.5.self_attn.k_proj.weight]
Loading weights: 12%|█▏ | 54/453 [00:00<00:00, 4881.30it/s, Materializing param=model.layers.5.self_attn.o_proj.weight]
Loading weights: 12%|█▏ | 54/453 [00:00<00:00, 4867.14it/s, Materializing param=model.layers.5.self_attn.o_proj.weight]
Loading weights: 12%|█▏ | 55/453 [00:00<00:00, 4922.68it/s, Materializing param=model.layers.5.self_attn.q_proj.weight]
Loading weights: 12%|█▏ | 55/453 [00:00<00:00, 4908.54it/s, Materializing param=model.layers.5.self_attn.q_proj.weight]
Loading weights: 12%|█▏ | 56/453 [00:00<00:00, 4963.36it/s, Materializing param=model.layers.5.self_attn.v_proj.weight]
Loading weights: 12%|█▏ | 56/453 [00:00<00:00, 4948.93it/s, Materializing param=model.layers.5.self_attn.v_proj.weight]
Loading weights: 13%|█▎ | 57/453 [00:00<00:00, 5001.47it/s, Materializing param=model.layers.6.input_layernorm.weight]
Loading weights: 13%|█▎ | 57/453 [00:00<00:00, 4987.39it/s, Materializing param=model.layers.6.input_layernorm.weight]
Loading weights: 13%|█▎ | 58/453 [00:00<00:00, 5040.81it/s, Materializing param=model.layers.6.mlp.down_proj.weight]
Loading weights: 13%|█▎ | 58/453 [00:00<00:00, 5027.79it/s, Materializing param=model.layers.6.mlp.down_proj.weight]
Loading weights: 13%|█▎ | 59/453 [00:00<00:00, 4493.54it/s, Materializing param=model.layers.6.mlp.gate_proj.weight]
Loading weights: 13%|█▎ | 59/453 [00:00<00:00, 4481.91it/s, Materializing param=model.layers.6.mlp.gate_proj.weight]
Loading weights: 13%|█▎ | 60/453 [00:00<00:00, 4531.12it/s, Materializing param=model.layers.6.mlp.up_proj.weight]
Loading weights: 13%|█▎ | 60/453 [00:00<00:00, 4520.70it/s, Materializing param=model.layers.6.mlp.up_proj.weight]
Loading weights: 13%|█▎ | 61/453 [00:00<00:00, 4562.28it/s, Materializing param=model.layers.6.post_attention_layernorm.weight]
Loading weights: 13%|█▎ | 61/453 [00:00<00:00, 4551.08it/s, Materializing param=model.layers.6.post_attention_layernorm.weight]
Loading weights: 14%|█▎ | 62/453 [00:00<00:00, 4598.37it/s, Materializing param=model.layers.6.self_attn.k_proj.weight]
Loading weights: 14%|█▎ | 62/453 [00:00<00:00, 4587.90it/s, Materializing param=model.layers.6.self_attn.k_proj.weight]
Loading weights: 14%|█▍ | 63/453 [00:00<00:00, 4634.26it/s, Materializing param=model.layers.6.self_attn.o_proj.weight]
Loading weights: 14%|█▍ | 63/453 [00:00<00:00, 4623.97it/s, Materializing param=model.layers.6.self_attn.o_proj.weight]
Loading weights: 14%|█▍ | 64/453 [00:00<00:00, 4670.88it/s, Materializing param=model.layers.6.self_attn.q_proj.weight]
Loading weights: 14%|█▍ | 64/453 [00:00<00:00, 4660.50it/s, Materializing param=model.layers.6.self_attn.q_proj.weight]
Loading weights: 14%|█▍ | 65/453 [00:00<00:00, 4705.95it/s, Materializing param=model.layers.6.self_attn.v_proj.weight]
Loading weights: 14%|█▍ | 65/453 [00:00<00:00, 4695.57it/s, Materializing param=model.layers.6.self_attn.v_proj.weight]
Loading weights: 15%|█▍ | 66/453 [00:00<00:00, 4741.27it/s, Materializing param=model.layers.7.input_layernorm.weight]
Loading weights: 15%|█▍ | 66/453 [00:00<00:00, 4730.90it/s, Materializing param=model.layers.7.input_layernorm.weight]
Loading weights: 15%|█▍ | 67/453 [00:00<00:00, 4777.68it/s, Materializing param=model.layers.7.mlp.down_proj.weight]
Loading weights: 15%|█▍ | 67/453 [00:00<00:00, 4767.39it/s, Materializing param=model.layers.7.mlp.down_proj.weight]
Loading weights: 15%|█▌ | 68/453 [00:00<00:00, 4812.42it/s, Materializing param=model.layers.7.mlp.gate_proj.weight]
Loading weights: 15%|█▌ | 68/453 [00:00<00:00, 4801.96it/s, Materializing param=model.layers.7.mlp.gate_proj.weight]
Loading weights: 15%|█▌ | 69/453 [00:00<00:00, 4847.69it/s, Materializing param=model.layers.7.mlp.up_proj.weight]
Loading weights: 15%|█▌ | 69/453 [00:00<00:00, 4837.32it/s, Materializing param=model.layers.7.mlp.up_proj.weight]
Loading weights: 15%|█▌ | 70/453 [00:00<00:00, 4882.78it/s, Materializing param=model.layers.7.post_attention_layernorm.weight]
Loading weights: 15%|█▌ | 70/453 [00:00<00:00, 4871.92it/s, Materializing param=model.layers.7.post_attention_layernorm.weight]
Loading weights: 16%|█▌ | 71/453 [00:00<00:00, 4915.90it/s, Materializing param=model.layers.7.self_attn.k_proj.weight]
Loading weights: 16%|█▌ | 71/453 [00:00<00:00, 4905.62it/s, Materializing param=model.layers.7.self_attn.k_proj.weight]
Loading weights: 16%|█▌ | 72/453 [00:00<00:00, 4950.09it/s, Materializing param=model.layers.7.self_attn.o_proj.weight]
Loading weights: 16%|█▌ | 72/453 [00:00<00:00, 4939.40it/s, Materializing param=model.layers.7.self_attn.o_proj.weight]
Loading weights: 16%|█▌ | 73/453 [00:00<00:00, 4983.47it/s, Materializing param=model.layers.7.self_attn.q_proj.weight]
Loading weights: 16%|█▌ | 73/453 [00:00<00:00, 4973.11it/s, Materializing param=model.layers.7.self_attn.q_proj.weight]
Loading weights: 16%|█▋ | 74/453 [00:00<00:00, 4688.21it/s, Materializing param=model.layers.7.self_attn.v_proj.weight]
Loading weights: 16%|█▋ | 74/453 [00:00<00:00, 4677.83it/s, Materializing param=model.layers.7.self_attn.v_proj.weight]
Loading weights: 17%|█▋ | 75/453 [00:00<00:00, 4717.08it/s, Materializing param=model.layers.8.input_layernorm.weight]
Loading weights: 17%|█▋ | 75/453 [00:00<00:00, 4707.55it/s, Materializing param=model.layers.8.input_layernorm.weight]
Loading weights: 17%|█▋ | 76/453 [00:00<00:00, 4746.24it/s, Materializing param=model.layers.8.mlp.down_proj.weight]
Loading weights: 17%|█▋ | 76/453 [00:00<00:00, 4736.86it/s, Materializing param=model.layers.8.mlp.down_proj.weight]
Loading weights: 17%|█▋ | 77/453 [00:00<00:00, 4743.57it/s, Materializing param=model.layers.8.mlp.gate_proj.weight]
Loading weights: 17%|█▋ | 77/453 [00:00<00:00, 4734.05it/s, Materializing param=model.layers.8.mlp.gate_proj.weight]
Loading weights: 17%|█▋ | 78/453 [00:00<00:00, 4772.72it/s, Materializing param=model.layers.8.mlp.up_proj.weight]
Loading weights: 17%|█▋ | 78/453 [00:00<00:00, 4763.62it/s, Materializing param=model.layers.8.mlp.up_proj.weight]
Loading weights: 17%|█▋ | 79/453 [00:00<00:00, 4801.69it/s, Materializing param=model.layers.8.post_attention_layernorm.weight]
Loading weights: 17%|█▋ | 79/453 [00:00<00:00, 4791.41it/s, Materializing param=model.layers.8.post_attention_layernorm.weight]
Loading weights: 18%|█▊ | 80/453 [00:00<00:00, 4828.60it/s, Materializing param=model.layers.8.self_attn.k_proj.weight]
Loading weights: 18%|█▊ | 80/453 [00:00<00:00, 4819.65it/s, Materializing param=model.layers.8.self_attn.k_proj.weight]
Loading weights: 18%|█▊ | 81/453 [00:00<00:00, 4856.74it/s, Materializing param=model.layers.8.self_attn.o_proj.weight]
Loading weights: 18%|█▊ | 81/453 [00:00<00:00, 4847.87it/s, Materializing param=model.layers.8.self_attn.o_proj.weight]
Loading weights: 18%|█▊ | 82/453 [00:00<00:00, 4885.34it/s, Materializing param=model.layers.8.self_attn.q_proj.weight]
Loading weights: 18%|█▊ | 82/453 [00:00<00:00, 4876.41it/s, Materializing param=model.layers.8.self_attn.q_proj.weight]
Loading weights: 18%|█▊ | 83/453 [00:00<00:00, 4913.44it/s, Materializing param=model.layers.8.self_attn.v_proj.weight]
Loading weights: 18%|█▊ | 83/453 [00:00<00:00, 4904.24it/s, Materializing param=model.layers.8.self_attn.v_proj.weight]
Loading weights: 19%|█▊ | 84/453 [00:00<00:00, 4940.56it/s, Materializing param=model.layers.9.input_layernorm.weight]
Loading weights: 19%|█▊ | 84/453 [00:00<00:00, 4931.44it/s, Materializing param=model.layers.9.input_layernorm.weight]
Loading weights: 19%|█▉ | 85/453 [00:00<00:00, 4967.68it/s, Materializing param=model.layers.9.mlp.down_proj.weight]
Loading weights: 19%|█▉ | 85/453 [00:00<00:00, 4958.08it/s, Materializing param=model.layers.9.mlp.down_proj.weight]
Loading weights: 19%|█▉ | 86/453 [00:00<00:00, 4995.43it/s, Materializing param=model.layers.9.mlp.gate_proj.weight]
Loading weights: 19%|█▉ | 86/453 [00:00<00:00, 4986.39it/s, Materializing param=model.layers.9.mlp.gate_proj.weight]
Loading weights: 19%|█▉ | 87/453 [00:00<00:00, 5023.19it/s, Materializing param=model.layers.9.mlp.up_proj.weight]
Loading weights: 19%|█▉ | 87/453 [00:00<00:00, 5014.15it/s, Materializing param=model.layers.9.mlp.up_proj.weight]
Loading weights: 19%|█▉ | 88/453 [00:00<00:00, 5050.61it/s, Materializing param=model.layers.9.post_attention_layernorm.weight]
Loading weights: 19%|█▉ | 88/453 [00:00<00:00, 5041.44it/s, Materializing param=model.layers.9.post_attention_layernorm.weight]
Loading weights: 20%|█▉ | 89/453 [00:00<00:00, 5077.37it/s, Materializing param=model.layers.9.self_attn.k_proj.weight]
Loading weights: 20%|█▉ | 89/453 [00:00<00:00, 5068.34it/s, Materializing param=model.layers.9.self_attn.k_proj.weight]
Loading weights: 20%|█▉ | 90/453 [00:00<00:00, 5103.73it/s, Materializing param=model.layers.9.self_attn.o_proj.weight]
Loading weights: 20%|█▉ | 90/453 [00:00<00:00, 5094.71it/s, Materializing param=model.layers.9.self_attn.o_proj.weight]
Loading weights: 20%|██ | 91/453 [00:00<00:00, 5130.13it/s, Materializing param=model.layers.9.self_attn.q_proj.weight]
Loading weights: 20%|██ | 91/453 [00:00<00:00, 5121.04it/s, Materializing param=model.layers.9.self_attn.q_proj.weight]
Loading weights: 20%|██ | 92/453 [00:00<00:00, 5125.54it/s, Materializing param=model.layers.9.self_attn.v_proj.weight]
Loading weights: 20%|██ | 92/453 [00:00<00:00, 5115.95it/s, Materializing param=model.layers.9.self_attn.v_proj.weight]
Loading weights: 21%|██ | 93/453 [00:00<00:00, 5148.83it/s, Materializing param=model.layers.10.input_layernorm.weight]
Loading weights: 21%|██ | 93/453 [00:00<00:00, 5139.54it/s, Materializing param=model.layers.10.input_layernorm.weight]
Loading weights: 21%|██ | 94/453 [00:00<00:00, 5086.83it/s, Materializing param=model.layers.10.mlp.down_proj.weight]
Loading weights: 21%|██ | 94/453 [00:00<00:00, 5077.52it/s, Materializing param=model.layers.10.mlp.down_proj.weight]
Loading weights: 21%|██ | 95/453 [00:00<00:00, 5035.88it/s, Materializing param=model.layers.10.mlp.gate_proj.weight]
Loading weights: 21%|██ | 95/453 [00:00<00:00, 5027.17it/s, Materializing param=model.layers.10.mlp.gate_proj.weight]
Loading weights: 21%|██ | 96/453 [00:00<00:00, 5040.35it/s, Materializing param=model.layers.10.mlp.up_proj.weight]
Loading weights: 21%|██ | 96/453 [00:00<00:00, 5031.97it/s, Materializing param=model.layers.10.mlp.up_proj.weight]
Loading weights: 21%|██▏ | 97/453 [00:00<00:00, 5063.19it/s, Materializing param=model.layers.10.post_attention_layernorm.weight]
Loading weights: 21%|██▏ | 97/453 [00:00<00:00, 5054.45it/s, Materializing param=model.layers.10.post_attention_layernorm.weight]
Loading weights: 22%|██▏ | 98/453 [00:00<00:00, 5085.51it/s, Materializing param=model.layers.10.self_attn.k_proj.weight]
Loading weights: 22%|██▏ | 98/453 [00:00<00:00, 5077.16it/s, Materializing param=model.layers.10.self_attn.k_proj.weight]
Loading weights: 22%|██▏ | 99/453 [00:00<00:00, 5107.64it/s, Materializing param=model.layers.10.self_attn.o_proj.weight]
Loading weights: 22%|██▏ | 99/453 [00:00<00:00, 5099.55it/s, Materializing param=model.layers.10.self_attn.o_proj.weight]
Loading weights: 22%|██▏ | 100/453 [00:00<00:00, 5130.27it/s, Materializing param=model.layers.10.self_attn.q_proj.weight]
Loading weights: 22%|██▏ | 100/453 [00:00<00:00, 5122.13it/s, Materializing param=model.layers.10.self_attn.q_proj.weight]
Loading weights: 22%|██▏ | 101/453 [00:00<00:00, 4836.89it/s, Materializing param=model.layers.10.self_attn.v_proj.weight]
Loading weights: 22%|██▏ | 101/453 [00:00<00:00, 4828.62it/s, Materializing param=model.layers.10.self_attn.v_proj.weight]
Loading weights: 23%|██▎ | 102/453 [00:00<00:00, 4857.55it/s, Materializing param=model.layers.11.input_layernorm.weight]
Loading weights: 23%|██▎ | 102/453 [00:00<00:00, 4850.39it/s, Materializing param=model.layers.11.input_layernorm.weight]
Loading weights: 23%|██▎ | 103/453 [00:00<00:00, 4879.58it/s, Materializing param=model.layers.11.mlp.down_proj.weight]
Loading weights: 23%|██▎ | 103/453 [00:00<00:00, 4871.38it/s, Materializing param=model.layers.11.mlp.down_proj.weight]
Loading weights: 23%|██▎ | 104/453 [00:00<00:00, 4900.55it/s, Materializing param=model.layers.11.mlp.gate_proj.weight]
Loading weights: 23%|██▎ | 104/453 [00:00<00:00, 4893.13it/s, Materializing param=model.layers.11.mlp.gate_proj.weight]
Loading weights: 23%|██▎ | 105/453 [00:00<00:00, 4921.41it/s, Materializing param=model.layers.11.mlp.up_proj.weight]
Loading weights: 23%|██▎ | 105/453 [00:00<00:00, 4913.94it/s, Materializing param=model.layers.11.mlp.up_proj.weight]
Loading weights: 23%|██▎ | 106/453 [00:00<00:00, 4942.32it/s, Materializing param=model.layers.11.post_attention_layernorm.weight]
Loading weights: 23%|██▎ | 106/453 [00:00<00:00, 4934.53it/s, Materializing param=model.layers.11.post_attention_layernorm.weight]
Loading weights: 24%|██▎ | 107/453 [00:00<00:00, 4962.19it/s, Materializing param=model.layers.11.self_attn.k_proj.weight]
Loading weights: 24%|██▎ | 107/453 [00:00<00:00, 4954.63it/s, Materializing param=model.layers.11.self_attn.k_proj.weight]
Loading weights: 24%|██▍ | 108/453 [00:00<00:00, 4983.93it/s, Materializing param=model.layers.11.self_attn.o_proj.weight]
Loading weights: 24%|██▍ | 108/453 [00:00<00:00, 4976.71it/s, Materializing param=model.layers.11.self_attn.o_proj.weight]
Loading weights: 24%|██▍ | 109/453 [00:00<00:00, 5005.68it/s, Materializing param=model.layers.11.self_attn.q_proj.weight]
Loading weights: 24%|██▍ | 109/453 [00:00<00:00, 4998.62it/s, Materializing param=model.layers.11.self_attn.q_proj.weight]
Loading weights: 24%|██▍ | 110/453 [00:00<00:00, 5027.50it/s, Materializing param=model.layers.11.self_attn.v_proj.weight]
Loading weights: 24%|██▍ | 110/453 [00:00<00:00, 5020.71it/s, Materializing param=model.layers.11.self_attn.v_proj.weight]
Loading weights: 25%|██▍ | 111/453 [00:00<00:00, 5049.49it/s, Materializing param=model.layers.12.input_layernorm.weight]
Loading weights: 25%|██▍ | 111/453 [00:00<00:00, 5042.38it/s, Materializing param=model.layers.12.input_layernorm.weight]
Loading weights: 25%|██▍ | 112/453 [00:00<00:00, 5071.44it/s, Materializing param=model.layers.12.mlp.down_proj.weight]
Loading weights: 25%|██▍ | 112/453 [00:00<00:00, 5064.49it/s, Materializing param=model.layers.12.mlp.down_proj.weight]
Loading weights: 25%|██▍ | 113/453 [00:00<00:00, 5092.64it/s, Materializing param=model.layers.12.mlp.gate_proj.weight]
Loading weights: 25%|██▍ | 113/453 [00:00<00:00, 5085.53it/s, Materializing param=model.layers.12.mlp.gate_proj.weight]
Loading weights: 25%|██▌ | 114/453 [00:00<00:00, 5072.73it/s, Materializing param=model.layers.12.mlp.up_proj.weight]
Loading weights: 25%|██▌ | 114/453 [00:00<00:00, 5065.26it/s, Materializing param=model.layers.12.mlp.up_proj.weight]
Loading weights: 25%|██▌ | 115/453 [00:00<00:00, 5090.93it/s, Materializing param=model.layers.12.post_attention_layernorm.weight]
Loading weights: 25%|██▌ | 115/453 [00:00<00:00, 5083.63it/s, Materializing param=model.layers.12.post_attention_layernorm.weight]
Loading weights: 26%|██▌ | 116/453 [00:00<00:00, 5109.74it/s, Materializing param=model.layers.12.self_attn.k_proj.weight]
Loading weights: 26%|██▌ | 116/453 [00:00<00:00, 5102.61it/s, Materializing param=model.layers.12.self_attn.k_proj.weight]
Loading weights: 26%|██▌ | 117/453 [00:00<00:00, 5091.71it/s, Materializing param=model.layers.12.self_attn.o_proj.weight]
Loading weights: 26%|██▌ | 117/453 [00:00<00:00, 5084.37it/s, Materializing param=model.layers.12.self_attn.o_proj.weight]
Loading weights: 26%|██▌ | 118/453 [00:00<00:00, 5009.70it/s, Materializing param=model.layers.12.self_attn.q_proj.weight]
Loading weights: 26%|██▌ | 118/453 [00:00<00:00, 5002.35it/s, Materializing param=model.layers.12.self_attn.q_proj.weight]
Loading weights: 26%|██▋ | 119/453 [00:00<00:00, 5027.62it/s, Materializing param=model.layers.12.self_attn.v_proj.weight]
Loading weights: 26%|██▋ | 119/453 [00:00<00:00, 5020.84it/s, Materializing param=model.layers.12.self_attn.v_proj.weight]
Loading weights: 26%|██▋ | 120/453 [00:00<00:00, 5034.22it/s, Materializing param=model.layers.13.input_layernorm.weight]
Loading weights: 26%|██▋ | 120/453 [00:00<00:00, 5027.53it/s, Materializing param=model.layers.13.input_layernorm.weight]
Loading weights: 27%|██▋ | 121/453 [00:00<00:00, 5052.93it/s, Materializing param=model.layers.13.mlp.down_proj.weight]
Loading weights: 27%|██▋ | 121/453 [00:00<00:00, 5046.34it/s, Materializing param=model.layers.13.mlp.down_proj.weight]
Loading weights: 27%|██▋ | 122/453 [00:00<00:00, 5071.21it/s, Materializing param=model.layers.13.mlp.gate_proj.weight]
Loading weights: 27%|██▋ | 122/453 [00:00<00:00, 5064.78it/s, Materializing param=model.layers.13.mlp.gate_proj.weight]
Loading weights: 27%|██▋ | 123/453 [00:00<00:00, 5089.27it/s, Materializing param=model.layers.13.mlp.up_proj.weight]
Loading weights: 27%|██▋ | 123/453 [00:00<00:00, 5082.90it/s, Materializing param=model.layers.13.mlp.up_proj.weight]
Loading weights: 27%|██▋ | 124/453 [00:00<00:00, 5108.32it/s, Materializing param=model.layers.13.post_attention_layernorm.weight]
Loading weights: 27%|██▋ | 124/453 [00:00<00:00, 5101.61it/s, Materializing param=model.layers.13.post_attention_layernorm.weight]
Loading weights: 28%|██▊ | 125/453 [00:00<00:00, 5068.62it/s, Materializing param=model.layers.13.self_attn.k_proj.weight]
Loading weights: 28%|██▊ | 125/453 [00:00<00:00, 5061.67it/s, Materializing param=model.layers.13.self_attn.k_proj.weight]
Loading weights: 28%|██▊ | 126/453 [00:00<00:00, 5085.67it/s, Materializing param=model.layers.13.self_attn.o_proj.weight]
Loading weights: 28%|██▊ | 126/453 [00:00<00:00, 5079.51it/s, Materializing param=model.layers.13.self_attn.o_proj.weight]
Loading weights: 28%|██▊ | 127/453 [00:00<00:00, 5104.12it/s, Materializing param=model.layers.13.self_attn.q_proj.weight]
Loading weights: 28%|██▊ | 127/453 [00:00<00:00, 5097.87it/s, Materializing param=model.layers.13.self_attn.q_proj.weight]
Loading weights: 28%|██▊ | 128/453 [00:00<00:00, 4897.61it/s, Materializing param=model.layers.13.self_attn.v_proj.weight]
Loading weights: 28%|██▊ | 128/453 [00:00<00:00, 4891.27it/s, Materializing param=model.layers.13.self_attn.v_proj.weight]
Loading weights: 28%|██▊ | 129/453 [00:00<00:00, 4913.73it/s, Materializing param=model.layers.14.input_layernorm.weight]
Loading weights: 28%|██▊ | 129/453 [00:00<00:00, 4908.02it/s, Materializing param=model.layers.14.input_layernorm.weight]
Loading weights: 29%|██▊ | 130/453 [00:00<00:00, 4931.44it/s, Materializing param=model.layers.14.mlp.down_proj.weight]
Loading weights: 29%|██▊ | 130/453 [00:00<00:00, 4925.87it/s, Materializing param=model.layers.14.mlp.down_proj.weight]
Loading weights: 29%|██▉ | 131/453 [00:00<00:00, 4949.81it/s, Materializing param=model.layers.14.mlp.gate_proj.weight]
Loading weights: 29%|██▉ | 131/453 [00:00<00:00, 4944.15it/s, Materializing param=model.layers.14.mlp.gate_proj.weight]
Loading weights: 29%|██▉ | 132/453 [00:00<00:00, 4967.15it/s, Materializing param=model.layers.14.mlp.up_proj.weight]
Loading weights: 29%|██▉ | 132/453 [00:00<00:00, 4961.27it/s, Materializing param=model.layers.14.mlp.up_proj.weight]
Loading weights: 29%|██▉ | 133/453 [00:00<00:00, 4984.25it/s, Materializing param=model.layers.14.post_attention_layernorm.weight]
Loading weights: 29%|██▉ | 133/453 [00:00<00:00, 4977.98it/s, Materializing param=model.layers.14.post_attention_layernorm.weight]
Loading weights: 30%|██▉ | 134/453 [00:00<00:00, 5000.86it/s, Materializing param=model.layers.14.self_attn.k_proj.weight]
Loading weights: 30%|██▉ | 134/453 [00:00<00:00, 4994.82it/s, Materializing param=model.layers.14.self_attn.k_proj.weight]
Loading weights: 30%|██▉ | 135/453 [00:00<00:00, 5017.15it/s, Materializing param=model.layers.14.self_attn.o_proj.weight]
Loading weights: 30%|██▉ | 135/453 [00:00<00:00, 5011.20it/s, Materializing param=model.layers.14.self_attn.o_proj.weight]
Loading weights: 30%|███ | 136/453 [00:00<00:00, 5033.85it/s, Materializing param=model.layers.14.self_attn.q_proj.weight]
Loading weights: 30%|███ | 136/453 [00:00<00:00, 5027.94it/s, Materializing param=model.layers.14.self_attn.q_proj.weight]
Loading weights: 30%|███ | 137/453 [00:00<00:00, 5050.54it/s, Materializing param=model.layers.14.self_attn.v_proj.weight]
Loading weights: 30%|███ | 137/453 [00:00<00:00, 5045.13it/s, Materializing param=model.layers.14.self_attn.v_proj.weight]
Loading weights: 30%|███ | 138/453 [00:00<00:00, 5068.47it/s, Materializing param=model.layers.15.input_layernorm.weight]
Loading weights: 30%|███ | 138/453 [00:00<00:00, 5062.84it/s, Materializing param=model.layers.15.input_layernorm.weight]
Loading weights: 31%|███ | 139/453 [00:00<00:00, 5078.73it/s, Materializing param=model.layers.15.mlp.down_proj.weight]
Loading weights: 31%|███ | 139/453 [00:00<00:00, 5072.95it/s, Materializing param=model.layers.15.mlp.down_proj.weight]
Loading weights: 31%|███ | 140/453 [00:00<00:00, 5052.42it/s, Materializing param=model.layers.15.mlp.gate_proj.weight]
Loading weights: 31%|███ | 140/453 [00:00<00:00, 5046.34it/s, Materializing param=model.layers.15.mlp.gate_proj.weight]
Loading weights: 31%|███ | 141/453 [00:00<00:00, 5067.93it/s, Materializing param=model.layers.15.mlp.up_proj.weight]
Loading weights: 31%|███ | 141/453 [00:00<00:00, 5062.03it/s, Materializing param=model.layers.15.mlp.up_proj.weight]
Loading weights: 31%|███▏ | 142/453 [00:00<00:00, 5084.00it/s, Materializing param=model.layers.15.post_attention_layernorm.weight]
Loading weights: 31%|███▏ | 142/453 [00:00<00:00, 5078.11it/s, Materializing param=model.layers.15.post_attention_layernorm.weight]
Loading weights: 32%|███▏ | 143/453 [00:00<00:00, 5099.18it/s, Materializing param=model.layers.15.self_attn.k_proj.weight]
Loading weights: 32%|███▏ | 143/453 [00:00<00:00, 5093.33it/s, Materializing param=model.layers.15.self_attn.k_proj.weight]
Loading weights: 32%|███▏ | 144/453 [00:00<00:00, 5049.45it/s, Materializing param=model.layers.15.self_attn.o_proj.weight]
Loading weights: 32%|███▏ | 144/453 [00:00<00:00, 5043.50it/s, Materializing param=model.layers.15.self_attn.o_proj.weight]
Loading weights: 32%|███▏ | 145/453 [00:00<00:00, 5037.22it/s, Materializing param=model.layers.15.self_attn.q_proj.weight]
Loading weights: 32%|███▏ | 145/453 [00:00<00:00, 5031.60it/s, Materializing param=model.layers.15.self_attn.q_proj.weight]
Loading weights: 32%|███▏ | 146/453 [00:00<00:00, 5005.79it/s, Materializing param=model.layers.15.self_attn.v_proj.weight]
Loading weights: 32%|███▏ | 146/453 [00:00<00:00, 5000.07it/s, Materializing param=model.layers.15.self_attn.v_proj.weight]
Loading weights: 32%|███▏ | 147/453 [00:00<00:00, 5023.04it/s, Materializing param=model.layers.16.input_layernorm.weight]
Loading weights: 32%|███▏ | 147/453 [00:00<00:00, 5017.76it/s, Materializing param=model.layers.16.input_layernorm.weight]
Loading weights: 33%|███▎ | 148/453 [00:00<00:00, 5038.61it/s, Materializing param=model.layers.16.mlp.down_proj.weight]
Loading weights: 33%|███▎ | 148/453 [00:00<00:00, 5033.38it/s, Materializing param=model.layers.16.mlp.down_proj.weight]
Loading weights: 33%|███▎ | 149/453 [00:00<00:00, 5044.69it/s, Materializing param=model.layers.16.mlp.gate_proj.weight]
Loading weights: 33%|███▎ | 149/453 [00:00<00:00, 5039.40it/s, Materializing param=model.layers.16.mlp.gate_proj.weight]
Loading weights: 33%|███▎ | 150/453 [00:00<00:00, 4953.04it/s, Materializing param=model.layers.16.mlp.up_proj.weight]
Loading weights: 33%|███▎ | 150/453 [00:00<00:00, 4947.63it/s, Materializing param=model.layers.16.mlp.up_proj.weight]
Loading weights: 33%|███▎ | 151/453 [00:00<00:00, 4968.15it/s, Materializing param=model.layers.16.post_attention_layernorm.weight]
Loading weights: 33%|███▎ | 151/453 [00:00<00:00, 4962.86it/s, Materializing param=model.layers.16.post_attention_layernorm.weight]
Loading weights: 34%|███▎ | 152/453 [00:00<00:00, 4982.95it/s, Materializing param=model.layers.16.self_attn.k_proj.weight]
Loading weights: 34%|███▎ | 152/453 [00:00<00:00, 4977.74it/s, Materializing param=model.layers.16.self_attn.k_proj.weight]
Loading weights: 34%|███▍ | 153/453 [00:00<00:00, 4997.50it/s, Materializing param=model.layers.16.self_attn.o_proj.weight]
Loading weights: 34%|███▍ | 153/453 [00:00<00:00, 4992.36it/s, Materializing param=model.layers.16.self_attn.o_proj.weight]
Loading weights: 34%|███▍ | 154/453 [00:00<00:00, 4946.19it/s, Materializing param=model.layers.16.self_attn.q_proj.weight]
Loading weights: 34%|███▍ | 154/453 [00:00<00:00, 4941.04it/s, Materializing param=model.layers.16.self_attn.q_proj.weight]
Loading weights: 34%|███▍ | 155/453 [00:00<00:00, 4960.45it/s, Materializing param=model.layers.16.self_attn.v_proj.weight]
Loading weights: 34%|███▍ | 155/453 [00:00<00:00, 4955.50it/s, Materializing param=model.layers.16.self_attn.v_proj.weight]
Loading weights: 34%|███▍ | 156/453 [00:00<00:00, 4975.41it/s, Materializing param=model.layers.17.input_layernorm.weight]
Loading weights: 34%|███▍ | 156/453 [00:00<00:00, 4970.46it/s, Materializing param=model.layers.17.input_layernorm.weight]
Loading weights: 35%|███▍ | 157/453 [00:00<00:00, ~5000it/s, Materializing param=model.layers.17.mlp.down_proj.weight]
[progress-bar redraws condensed: steps 157–301 of 453 materialize model.layers.17 through model.layers.33 parameters (input_layernorm, mlp.{down,gate,up}_proj, post_attention_layernorm, self_attn.{k,o,q,v}_proj) at roughly 5000 it/s]
Loading weights: 66%|██████▋ | 301/453 [00:00<00:00, ~5050it/s, Materializing param=model.layers.33.mlp.down_proj.weight]
Loading weights: 67%|██████▋ | 302/453 [00:00<00:00, 5062.04it/s, Materializing param=model.layers.33.mlp.gate_proj.weight]
Loading weights: 67%|██████▋ | 302/453 [00:00<00:00, 5059.47it/s, Materializing param=model.layers.33.mlp.gate_proj.weight]
Loading weights: 67%|██████▋ | 303/453 [00:00<00:00, 5069.81it/s, Materializing param=model.layers.33.mlp.up_proj.weight]
Loading weights: 67%|██████▋ | 303/453 [00:00<00:00, 5067.28it/s, Materializing param=model.layers.33.mlp.up_proj.weight]
Loading weights: 67%|██████▋ | 304/453 [00:00<00:00, 5045.62it/s, Materializing param=model.layers.33.post_attention_layernorm.weight]
Loading weights: 67%|██████▋ | 304/453 [00:00<00:00, 5042.97it/s, Materializing param=model.layers.33.post_attention_layernorm.weight]
Loading weights: 67%|██████▋ | 305/453 [00:00<00:00, 5052.92it/s, Materializing param=model.layers.33.self_attn.k_proj.weight]
Loading weights: 67%|██████▋ | 305/453 [00:00<00:00, 5050.15it/s, Materializing param=model.layers.33.self_attn.k_proj.weight]
Loading weights: 68%|██████▊ | 306/453 [00:00<00:00, 5060.05it/s, Materializing param=model.layers.33.self_attn.o_proj.weight]
Loading weights: 68%|██████▊ | 306/453 [00:00<00:00, 5057.50it/s, Materializing param=model.layers.33.self_attn.o_proj.weight]
Loading weights: 68%|██████▊ | 307/453 [00:00<00:00, 5067.70it/s, Materializing param=model.layers.33.self_attn.q_proj.weight]
Loading weights: 68%|██████▊ | 307/453 [00:00<00:00, 5065.15it/s, Materializing param=model.layers.33.self_attn.q_proj.weight]
Loading weights: 68%|██████▊ | 308/453 [00:00<00:00, 5075.14it/s, Materializing param=model.layers.33.self_attn.v_proj.weight]
Loading weights: 68%|██████▊ | 308/453 [00:00<00:00, 5072.55it/s, Materializing param=model.layers.33.self_attn.v_proj.weight]
Loading weights: 68%|██████▊ | 309/453 [00:00<00:00, 5082.69it/s, Materializing param=model.layers.34.input_layernorm.weight]
Loading weights: 68%|██████▊ | 309/453 [00:00<00:00, 5080.10it/s, Materializing param=model.layers.34.input_layernorm.weight]
Loading weights: 68%|██████▊ | 310/453 [00:00<00:00, 5050.16it/s, Materializing param=model.layers.34.mlp.down_proj.weight]
Loading weights: 68%|██████▊ | 310/453 [00:00<00:00, 5047.47it/s, Materializing param=model.layers.34.mlp.down_proj.weight]
Loading weights: 69%|██████▊ | 311/453 [00:00<00:00, 5023.16it/s, Materializing param=model.layers.34.mlp.gate_proj.weight]
Loading weights: 69%|██████▊ | 311/453 [00:00<00:00, 5020.62it/s, Materializing param=model.layers.34.mlp.gate_proj.weight]
Loading weights: 69%|██████▉ | 312/453 [00:00<00:00, 5012.34it/s, Materializing param=model.layers.34.mlp.up_proj.weight]
Loading weights: 69%|██████▉ | 312/453 [00:00<00:00, 5009.62it/s, Materializing param=model.layers.34.mlp.up_proj.weight]
Loading weights: 69%|██████▉ | 313/453 [00:00<00:00, 5019.30it/s, Materializing param=model.layers.34.post_attention_layernorm.weight]
Loading weights: 69%|██████▉ | 313/453 [00:00<00:00, 5016.73it/s, Materializing param=model.layers.34.post_attention_layernorm.weight]
Loading weights: 69%|██████▉ | 314/453 [00:00<00:00, 5026.59it/s, Materializing param=model.layers.34.self_attn.k_proj.weight]
Loading weights: 69%|██████▉ | 314/453 [00:00<00:00, 5024.15it/s, Materializing param=model.layers.34.self_attn.k_proj.weight]
Loading weights: 70%|██████▉ | 315/453 [00:00<00:00, 5034.05it/s, Materializing param=model.layers.34.self_attn.o_proj.weight]
Loading weights: 70%|██████▉ | 315/453 [00:00<00:00, 5031.61it/s, Materializing param=model.layers.34.self_attn.o_proj.weight]
Loading weights: 70%|██████▉ | 316/453 [00:00<00:00, 5041.23it/s, Materializing param=model.layers.34.self_attn.q_proj.weight]
Loading weights: 70%|██████▉ | 316/453 [00:00<00:00, 5038.89it/s, Materializing param=model.layers.34.self_attn.q_proj.weight]
Loading weights: 70%|██████▉ | 317/453 [00:00<00:00, 5048.85it/s, Materializing param=model.layers.34.self_attn.v_proj.weight]
Loading weights: 70%|██████▉ | 317/453 [00:00<00:00, 5046.40it/s, Materializing param=model.layers.34.self_attn.v_proj.weight]
Loading weights: 70%|███████ | 318/453 [00:00<00:00, 5056.10it/s, Materializing param=model.layers.35.input_layernorm.weight]
Loading weights: 70%|███████ | 318/453 [00:00<00:00, 5053.67it/s, Materializing param=model.layers.35.input_layernorm.weight]
Loading weights: 70%|███████ | 319/453 [00:00<00:00, 5063.34it/s, Materializing param=model.layers.35.mlp.down_proj.weight]
Loading weights: 70%|███████ | 319/453 [00:00<00:00, 5060.85it/s, Materializing param=model.layers.35.mlp.down_proj.weight]
Loading weights: 71%|███████ | 320/453 [00:00<00:00, 5070.66it/s, Materializing param=model.layers.35.mlp.gate_proj.weight]
Loading weights: 71%|███████ | 320/453 [00:00<00:00, 5067.92it/s, Materializing param=model.layers.35.mlp.gate_proj.weight]
Loading weights: 71%|███████ | 321/453 [00:00<00:00, 5032.53it/s, Materializing param=model.layers.35.mlp.up_proj.weight]
Loading weights: 71%|███████ | 321/453 [00:00<00:00, 5030.04it/s, Materializing param=model.layers.35.mlp.up_proj.weight]
Loading weights: 71%|███████ | 322/453 [00:00<00:00, 5020.34it/s, Materializing param=model.layers.35.post_attention_layernorm.weight]
Loading weights: 71%|███████ | 322/453 [00:00<00:00, 5017.84it/s, Materializing param=model.layers.35.post_attention_layernorm.weight]
Loading weights: 71%|███████▏ | 323/453 [00:00<00:00, 5028.26it/s, Materializing param=model.layers.35.self_attn.k_proj.weight]
Loading weights: 71%|███████▏ | 323/453 [00:00<00:00, 5025.71it/s, Materializing param=model.layers.35.self_attn.k_proj.weight]
Loading weights: 72%|███████▏ | 324/453 [00:00<00:00, 5035.23it/s, Materializing param=model.layers.35.self_attn.o_proj.weight]
Loading weights: 72%|███████▏ | 324/453 [00:00<00:00, 5033.02it/s, Materializing param=model.layers.35.self_attn.o_proj.weight]
Loading weights: 72%|███████▏ | 325/453 [00:00<00:00, 5042.67it/s, Materializing param=model.layers.35.self_attn.q_proj.weight]
Loading weights: 72%|███████▏ | 325/453 [00:00<00:00, 5040.22it/s, Materializing param=model.layers.35.self_attn.q_proj.weight]
Loading weights: 72%|███████▏ | 326/453 [00:00<00:00, 5049.78it/s, Materializing param=model.layers.35.self_attn.v_proj.weight]
Loading weights: 72%|███████▏ | 326/453 [00:00<00:00, 5047.43it/s, Materializing param=model.layers.35.self_attn.v_proj.weight]
Loading weights: 72%|███████▏ | 327/453 [00:00<00:00, 5044.49it/s, Materializing param=model.layers.36.input_layernorm.weight]
Loading weights: 72%|███████▏ | 327/453 [00:00<00:00, 5041.95it/s, Materializing param=model.layers.36.input_layernorm.weight]
Loading weights: 72%|███████▏ | 328/453 [00:00<00:00, 5046.35it/s, Materializing param=model.layers.36.mlp.down_proj.weight]
Loading weights: 72%|███████▏ | 328/453 [00:00<00:00, 5043.89it/s, Materializing param=model.layers.36.mlp.down_proj.weight]
Loading weights: 73%|███████▎ | 329/453 [00:00<00:00, 5053.10it/s, Materializing param=model.layers.36.mlp.gate_proj.weight]
Loading weights: 73%|███████▎ | 329/453 [00:00<00:00, 5050.46it/s, Materializing param=model.layers.36.mlp.gate_proj.weight]
Loading weights: 73%|███████▎ | 330/453 [00:00<00:00, 5056.20it/s, Materializing param=model.layers.36.mlp.up_proj.weight]
Loading weights: 73%|███████▎ | 330/453 [00:00<00:00, 5053.77it/s, Materializing param=model.layers.36.mlp.up_proj.weight]
Loading weights: 73%|███████▎ | 331/453 [00:00<00:00, 5063.07it/s, Materializing param=model.layers.36.post_attention_layernorm.weight]
Loading weights: 73%|███████▎ | 331/453 [00:00<00:00, 5060.56it/s, Materializing param=model.layers.36.post_attention_layernorm.weight]
Loading weights: 73%|███████▎ | 332/453 [00:00<00:00, 5069.75it/s, Materializing param=model.layers.36.self_attn.k_proj.weight]
Loading weights: 73%|███████▎ | 332/453 [00:00<00:00, 5067.30it/s, Materializing param=model.layers.36.self_attn.k_proj.weight]
Loading weights: 74%|███████▎ | 333/453 [00:00<00:00, 5076.47it/s, Materializing param=model.layers.36.self_attn.o_proj.weight]
Loading weights: 74%|███████▎ | 333/453 [00:00<00:00, 5074.09it/s, Materializing param=model.layers.36.self_attn.o_proj.weight]
Loading weights: 74%|███████▎ | 334/453 [00:00<00:00, 5083.30it/s, Materializing param=model.layers.36.self_attn.q_proj.weight]
Loading weights: 74%|███████▎ | 334/453 [00:00<00:00, 5080.96it/s, Materializing param=model.layers.36.self_attn.q_proj.weight]
Loading weights: 74%|███████▍ | 335/453 [00:00<00:00, 5056.03it/s, Materializing param=model.layers.36.self_attn.v_proj.weight]
Loading weights: 74%|███████▍ | 335/453 [00:00<00:00, 5053.52it/s, Materializing param=model.layers.36.self_attn.v_proj.weight]
Loading weights: 74%|███████▍ | 336/453 [00:00<00:00, 5062.65it/s, Materializing param=model.layers.37.input_layernorm.weight]
Loading weights: 74%|███████▍ | 336/453 [00:00<00:00, 5060.35it/s, Materializing param=model.layers.37.input_layernorm.weight]
Loading weights: 74%|███████▍ | 337/453 [00:00<00:00, 5049.44it/s, Materializing param=model.layers.37.mlp.down_proj.weight]
Loading weights: 74%|███████▍ | 337/453 [00:00<00:00, 5046.86it/s, Materializing param=model.layers.37.mlp.down_proj.weight]
Loading weights: 75%|███████▍ | 338/453 [00:00<00:00, 5049.06it/s, Materializing param=model.layers.37.mlp.gate_proj.weight]
Loading weights: 75%|███████▍ | 338/453 [00:00<00:00, 5046.67it/s, Materializing param=model.layers.37.mlp.gate_proj.weight]
Loading weights: 75%|███████▍ | 339/453 [00:00<00:00, 5056.54it/s, Materializing param=model.layers.37.mlp.up_proj.weight]
Loading weights: 75%|███████▍ | 339/453 [00:00<00:00, 5054.37it/s, Materializing param=model.layers.37.mlp.up_proj.weight]
Loading weights: 75%|███████▌ | 340/453 [00:00<00:00, 5055.24it/s, Materializing param=model.layers.37.post_attention_layernorm.weight]
Loading weights: 75%|███████▌ | 340/453 [00:00<00:00, 5052.70it/s, Materializing param=model.layers.37.post_attention_layernorm.weight]
Loading weights: 75%|███████▌ | 341/453 [00:00<00:00, 5062.57it/s, Materializing param=model.layers.37.self_attn.k_proj.weight]
Loading weights: 75%|███████▌ | 341/453 [00:00<00:00, 5060.26it/s, Materializing param=model.layers.37.self_attn.k_proj.weight]
Loading weights: 75%|███████▌ | 342/453 [00:00<00:00, 5047.14it/s, Materializing param=model.layers.37.self_attn.o_proj.weight]
Loading weights: 75%|███████▌ | 342/453 [00:00<00:00, 5044.81it/s, Materializing param=model.layers.37.self_attn.o_proj.weight]
Loading weights: 76%|███████▌ | 343/453 [00:00<00:00, 5052.37it/s, Materializing param=model.layers.37.self_attn.q_proj.weight]
Loading weights: 76%|███████▌ | 343/453 [00:00<00:00, 5050.04it/s, Materializing param=model.layers.37.self_attn.q_proj.weight]
Loading weights: 76%|███████▌ | 344/453 [00:00<00:00, 5058.92it/s, Materializing param=model.layers.37.self_attn.v_proj.weight]
Loading weights: 76%|███████▌ | 344/453 [00:00<00:00, 5056.67it/s, Materializing param=model.layers.37.self_attn.v_proj.weight]
Loading weights: 76%|███████▌ | 345/453 [00:00<00:00, 5038.90it/s, Materializing param=model.layers.38.input_layernorm.weight]
Loading weights: 76%|███████▌ | 345/453 [00:00<00:00, 5036.49it/s, Materializing param=model.layers.38.input_layernorm.weight]
Loading weights: 76%|███████▋ | 346/453 [00:00<00:00, 5045.24it/s, Materializing param=model.layers.38.mlp.down_proj.weight]
Loading weights: 76%|███████▋ | 346/453 [00:00<00:00, 5042.88it/s, Materializing param=model.layers.38.mlp.down_proj.weight]
Loading weights: 77%|███████▋ | 347/453 [00:00<00:00, 5051.66it/s, Materializing param=model.layers.38.mlp.gate_proj.weight]
Loading weights: 77%|███████▋ | 347/453 [00:00<00:00, 5049.38it/s, Materializing param=model.layers.38.mlp.gate_proj.weight]
Loading weights: 77%|███████▋ | 348/453 [00:00<00:00, 5054.10it/s, Materializing param=model.layers.38.mlp.up_proj.weight]
Loading weights: 77%|███████▋ | 348/453 [00:00<00:00, 5051.82it/s, Materializing param=model.layers.38.mlp.up_proj.weight]
Loading weights: 77%|███████▋ | 349/453 [00:00<00:00, 5060.52it/s, Materializing param=model.layers.38.post_attention_layernorm.weight]
Loading weights: 77%|███████▋ | 349/453 [00:00<00:00, 5058.18it/s, Materializing param=model.layers.38.post_attention_layernorm.weight]
Loading weights: 77%|███████▋ | 350/453 [00:00<00:00, 5066.98it/s, Materializing param=model.layers.38.self_attn.k_proj.weight]
Loading weights: 77%|███████▋ | 350/453 [00:00<00:00, 5064.76it/s, Materializing param=model.layers.38.self_attn.k_proj.weight]
Loading weights: 77%|███████▋ | 351/453 [00:00<00:00, 5073.37it/s, Materializing param=model.layers.38.self_attn.o_proj.weight]
Loading weights: 77%|███████▋ | 351/453 [00:00<00:00, 5071.06it/s, Materializing param=model.layers.38.self_attn.o_proj.weight]
Loading weights: 78%|███████▊ | 352/453 [00:00<00:00, 5079.82it/s, Materializing param=model.layers.38.self_attn.q_proj.weight]
Loading weights: 78%|███████▊ | 352/453 [00:00<00:00, 5077.68it/s, Materializing param=model.layers.38.self_attn.q_proj.weight]
Loading weights: 78%|███████▊ | 353/453 [00:00<00:00, 5086.50it/s, Materializing param=model.layers.38.self_attn.v_proj.weight]
Loading weights: 78%|███████▊ | 353/453 [00:00<00:00, 5084.28it/s, Materializing param=model.layers.38.self_attn.v_proj.weight]
Loading weights: 78%|███████▊ | 354/453 [00:00<00:00, 5072.80it/s, Materializing param=model.layers.39.input_layernorm.weight]
Loading weights: 78%|███████▊ | 354/453 [00:00<00:00, 5070.36it/s, Materializing param=model.layers.39.input_layernorm.weight]
Loading weights: 78%|███████▊ | 355/453 [00:00<00:00, 5079.06it/s, Materializing param=model.layers.39.mlp.down_proj.weight]
Loading weights: 78%|███████▊ | 355/453 [00:00<00:00, 5076.85it/s, Materializing param=model.layers.39.mlp.down_proj.weight]
Loading weights: 79%|███████▊ | 356/453 [00:00<00:00, 5075.62it/s, Materializing param=model.layers.39.mlp.gate_proj.weight]
Loading weights: 79%|███████▊ | 356/453 [00:00<00:00, 5073.43it/s, Materializing param=model.layers.39.mlp.gate_proj.weight]
Loading weights: 79%|███████▉ | 357/453 [00:00<00:00, 5082.09it/s, Materializing param=model.layers.39.mlp.up_proj.weight]
Loading weights: 79%|███████▉ | 357/453 [00:00<00:00, 5079.88it/s, Materializing param=model.layers.39.mlp.up_proj.weight]
Loading weights: 79%|███████▉ | 358/453 [00:00<00:00, 5006.94it/s, Materializing param=model.layers.39.post_attention_layernorm.weight]
Loading weights: 79%|███████▉ | 358/453 [00:00<00:00, 5004.60it/s, Materializing param=model.layers.39.post_attention_layernorm.weight]
Loading weights: 79%|███████▉ | 359/453 [00:00<00:00, 5003.16it/s, Materializing param=model.layers.39.self_attn.k_proj.weight]
Loading weights: 79%|███████▉ | 359/453 [00:00<00:00, 5001.06it/s, Materializing param=model.layers.39.self_attn.k_proj.weight]
Loading weights: 79%|███████▉ | 360/453 [00:00<00:00, 5010.52it/s, Materializing param=model.layers.39.self_attn.o_proj.weight]
Loading weights: 79%|███████▉ | 360/453 [00:00<00:00, 5008.41it/s, Materializing param=model.layers.39.self_attn.o_proj.weight]
Loading weights: 80%|███████▉ | 361/453 [00:00<00:00, 5016.98it/s, Materializing param=model.layers.39.self_attn.q_proj.weight]
Loading weights: 80%|███████▉ | 361/453 [00:00<00:00, 5014.87it/s, Materializing param=model.layers.39.self_attn.q_proj.weight]
Loading weights: 80%|███████▉ | 362/453 [00:00<00:00, 5011.25it/s, Materializing param=model.layers.39.self_attn.v_proj.weight]
Loading weights: 80%|███████▉ | 362/453 [00:00<00:00, 5009.05it/s, Materializing param=model.layers.39.self_attn.v_proj.weight]
Loading weights: 80%|████████ | 363/453 [00:00<00:00, 5017.31it/s, Materializing param=model.layers.40.input_layernorm.weight]
Loading weights: 80%|████████ | 363/453 [00:00<00:00, 5015.21it/s, Materializing param=model.layers.40.input_layernorm.weight]
Loading weights: 80%|████████ | 364/453 [00:00<00:00, 5023.73it/s, Materializing param=model.layers.40.mlp.down_proj.weight]
Loading weights: 80%|████████ | 364/453 [00:00<00:00, 5021.60it/s, Materializing param=model.layers.40.mlp.down_proj.weight]
Loading weights: 81%|████████ | 365/453 [00:00<00:00, 5030.02it/s, Materializing param=model.layers.40.mlp.gate_proj.weight]
Loading weights: 81%|████████ | 365/453 [00:00<00:00, 5027.90it/s, Materializing param=model.layers.40.mlp.gate_proj.weight]
Loading weights: 81%|████████ | 366/453 [00:00<00:00, 5036.37it/s, Materializing param=model.layers.40.mlp.up_proj.weight]
Loading weights: 81%|████████ | 366/453 [00:00<00:00, 5034.29it/s, Materializing param=model.layers.40.mlp.up_proj.weight]
Loading weights: 81%|████████ | 367/453 [00:00<00:00, 5032.92it/s, Materializing param=model.layers.40.post_attention_layernorm.weight]
Loading weights: 81%|████████ | 367/453 [00:00<00:00, 5030.74it/s, Materializing param=model.layers.40.post_attention_layernorm.weight]
Loading weights: 81%|████████ | 368/453 [00:00<00:00, 5031.55it/s, Materializing param=model.layers.40.self_attn.k_proj.weight]
Loading weights: 81%|████████ | 368/453 [00:00<00:00, 5029.19it/s, Materializing param=model.layers.40.self_attn.k_proj.weight]
Loading weights: 81%|████████▏ | 369/453 [00:00<00:00, 5038.31it/s, Materializing param=model.layers.40.self_attn.o_proj.weight]
Loading weights: 81%|████████▏ | 369/453 [00:00<00:00, 5036.19it/s, Materializing param=model.layers.40.self_attn.o_proj.weight]
Loading weights: 82%|████████▏ | 370/453 [00:00<00:00, 5044.56it/s, Materializing param=model.layers.40.self_attn.q_proj.weight]
Loading weights: 82%|████████▏ | 370/453 [00:00<00:00, 5042.56it/s, Materializing param=model.layers.40.self_attn.q_proj.weight]
Loading weights: 82%|████████▏ | 371/453 [00:00<00:00, 5045.79it/s, Materializing param=model.layers.40.self_attn.v_proj.weight]
Loading weights: 82%|████████▏ | 371/453 [00:00<00:00, 5043.67it/s, Materializing param=model.layers.40.self_attn.v_proj.weight]
Loading weights: 82%|████████▏ | 372/453 [00:00<00:00, 5051.66it/s, Materializing param=model.layers.41.input_layernorm.weight]
Loading weights: 82%|████████▏ | 372/453 [00:00<00:00, 5049.55it/s, Materializing param=model.layers.41.input_layernorm.weight]
Loading weights: 82%|████████▏ | 373/453 [00:00<00:00, 5057.67it/s, Materializing param=model.layers.41.mlp.down_proj.weight]
Loading weights: 82%|████████▏ | 373/453 [00:00<00:00, 5055.58it/s, Materializing param=model.layers.41.mlp.down_proj.weight]
Loading weights: 83%|████████▎ | 374/453 [00:00<00:00, 5063.83it/s, Materializing param=model.layers.41.mlp.gate_proj.weight]
Loading weights: 83%|████████▎ | 374/453 [00:00<00:00, 5061.74it/s, Materializing param=model.layers.41.mlp.gate_proj.weight]
Loading weights: 83%|████████▎ | 375/453 [00:00<00:00, 5070.53it/s, Materializing param=model.layers.41.mlp.up_proj.weight]
Loading weights: 83%|████████▎ | 375/453 [00:00<00:00, 5068.49it/s, Materializing param=model.layers.41.mlp.up_proj.weight]
Loading weights: 83%|████████▎ | 376/453 [00:00<00:00, 5076.61it/s, Materializing param=model.layers.41.post_attention_layernorm.weight]
Loading weights: 83%|████████▎ | 376/453 [00:00<00:00, 5074.52it/s, Materializing param=model.layers.41.post_attention_layernorm.weight]
Loading weights: 83%|████████▎ | 377/453 [00:00<00:00, 5082.40it/s, Materializing param=model.layers.41.self_attn.k_proj.weight]
Loading weights: 83%|████████▎ | 377/453 [00:00<00:00, 5080.31it/s, Materializing param=model.layers.41.self_attn.k_proj.weight]
Loading weights: 83%|████████▎ | 378/453 [00:00<00:00, 5049.85it/s, Materializing param=model.layers.41.self_attn.o_proj.weight]
Loading weights: 83%|████████▎ | 378/453 [00:00<00:00, 5047.60it/s, Materializing param=model.layers.41.self_attn.o_proj.weight]
Loading weights: 84%|████████▎ | 379/453 [00:00<00:00, 5055.72it/s, Materializing param=model.layers.41.self_attn.q_proj.weight]
Loading weights: 84%|████████▎ | 379/453 [00:00<00:00, 5053.56it/s, Materializing param=model.layers.41.self_attn.q_proj.weight]
Loading weights: 84%|████████▍ | 380/453 [00:00<00:00, 5061.71it/s, Materializing param=model.layers.41.self_attn.v_proj.weight]
Loading weights: 84%|████████▍ | 380/453 [00:00<00:00, 5059.68it/s, Materializing param=model.layers.41.self_attn.v_proj.weight]
Loading weights: 84%|████████▍ | 381/453 [00:00<00:00, 5067.51it/s, Materializing param=model.layers.42.input_layernorm.weight]
Loading weights: 84%|████████▍ | 381/453 [00:00<00:00, 5065.38it/s, Materializing param=model.layers.42.input_layernorm.weight]
Loading weights: 84%|████████▍ | 382/453 [00:00<00:00, 5073.44it/s, Materializing param=model.layers.42.mlp.down_proj.weight]
Loading weights: 84%|████████▍ | 382/453 [00:00<00:00, 5071.37it/s, Materializing param=model.layers.42.mlp.down_proj.weight]
Loading weights: 85%|████████▍ | 383/453 [00:00<00:00, 5079.20it/s, Materializing param=model.layers.42.mlp.gate_proj.weight]
Loading weights: 85%|████████▍ | 383/453 [00:00<00:00, 5077.11it/s, Materializing param=model.layers.42.mlp.gate_proj.weight]
Loading weights: 85%|████████▍ | 384/453 [00:00<00:00, 5071.45it/s, Materializing param=model.layers.42.mlp.up_proj.weight]
Loading weights: 85%|████████▍ | 384/453 [00:00<00:00, 5069.24it/s, Materializing param=model.layers.42.mlp.up_proj.weight]
Loading weights: 85%|████████▍ | 385/453 [00:00<00:00, 5077.24it/s, Materializing param=model.layers.42.post_attention_layernorm.weight]
Loading weights: 85%|████████▍ | 385/453 [00:00<00:00, 5075.09it/s, Materializing param=model.layers.42.post_attention_layernorm.weight]
Loading weights: 85%|████████▌ | 386/453 [00:00<00:00, 5083.10it/s, Materializing param=model.layers.42.self_attn.k_proj.weight]
Loading weights: 85%|████████▌ | 386/453 [00:00<00:00, 5080.96it/s, Materializing param=model.layers.42.self_attn.k_proj.weight]
Loading weights: 85%|████████▌ | 387/453 [00:00<00:00, 5089.04it/s, Materializing param=model.layers.42.self_attn.o_proj.weight]
Loading weights: 85%|████████▌ | 387/453 [00:00<00:00, 5086.92it/s, Materializing param=model.layers.42.self_attn.o_proj.weight]
Loading weights: 86%|████████▌ | 388/453 [00:00<00:00, 5094.62it/s, Materializing param=model.layers.42.self_attn.q_proj.weight]
Loading weights: 86%|████████▌ | 388/453 [00:00<00:00, 5092.60it/s, Materializing param=model.layers.42.self_attn.q_proj.weight]
Loading weights: 86%|████████▌ | 389/453 [00:00<00:00, 5100.55it/s, Materializing param=model.layers.42.self_attn.v_proj.weight]
Loading weights: 86%|████████▌ | 389/453 [00:00<00:00, 5098.57it/s, Materializing param=model.layers.42.self_attn.v_proj.weight]
Loading weights: 86%|████████▌ | 390/453 [00:00<00:00, 5092.74it/s, Materializing param=model.layers.43.input_layernorm.weight]
Loading weights: 86%|████████▌ | 390/453 [00:00<00:00, 5090.76it/s, Materializing param=model.layers.43.input_layernorm.weight]
Loading weights: 86%|████████▋ | 391/453 [00:00<00:00, 5098.85it/s, Materializing param=model.layers.43.mlp.down_proj.weight]
Loading weights: 86%|████████▋ | 391/453 [00:00<00:00, 5096.88it/s, Materializing param=model.layers.43.mlp.down_proj.weight]
Loading weights: 87%|████████▋ | 392/453 [00:00<00:00, 5099.95it/s, Materializing param=model.layers.43.mlp.gate_proj.weight]
Loading weights: 87%|████████▋ | 392/453 [00:00<00:00, 5097.89it/s, Materializing param=model.layers.43.mlp.gate_proj.weight]
Loading weights: 87%|████████▋ | 393/453 [00:00<00:00, 5052.98it/s, Materializing param=model.layers.43.mlp.up_proj.weight]
Loading weights: 87%|████████▋ | 393/453 [00:00<00:00, 5050.82it/s, Materializing param=model.layers.43.mlp.up_proj.weight]
Loading weights: 87%|████████▋ | 394/453 [00:00<00:00, 5051.94it/s, Materializing param=model.layers.43.post_attention_layernorm.weight]
Loading weights: 87%|████████▋ | 394/453 [00:00<00:00, 5049.83it/s, Materializing param=model.layers.43.post_attention_layernorm.weight]
Loading weights: 87%|████████▋ | 395/453 [00:00<00:00, 5057.44it/s, Materializing param=model.layers.43.self_attn.k_proj.weight]
Loading weights: 87%|████████▋ | 395/453 [00:00<00:00, 5055.15it/s, Materializing param=model.layers.43.self_attn.k_proj.weight]
Loading weights: 87%|████████▋ | 396/453 [00:00<00:00, 5060.32it/s, Materializing param=model.layers.43.self_attn.o_proj.weight]
Loading weights: 87%|████████▋ | 396/453 [00:00<00:00, 5058.38it/s, Materializing param=model.layers.43.self_attn.o_proj.weight]
Loading weights: 88%|████████▊ | 397/453 [00:00<00:00, 5067.08it/s, Materializing param=model.layers.43.self_attn.q_proj.weight]
Loading weights: 88%|████████▊ | 397/453 [00:00<00:00, 5065.06it/s, Materializing param=model.layers.43.self_attn.q_proj.weight]
Loading weights: 88%|████████▊ | 398/453 [00:00<00:00, 5072.85it/s, Materializing param=model.layers.43.self_attn.v_proj.weight]
Loading weights: 88%|████████▊ | 398/453 [00:00<00:00, 5071.00it/s, Materializing param=model.layers.43.self_attn.v_proj.weight]
Loading weights: 88%|████████▊ | 399/453 [00:00<00:00, 5078.93it/s, Materializing param=model.layers.44.input_layernorm.weight]
Loading weights: 88%|████████▊ | 399/453 [00:00<00:00, 5077.02it/s, Materializing param=model.layers.44.input_layernorm.weight]
Loading weights: 88%|████████▊ | 400/453 [00:00<00:00, 5084.84it/s, Materializing param=model.layers.44.mlp.down_proj.weight]
Loading weights: 88%|████████▊ | 400/453 [00:00<00:00, 5082.90it/s, Materializing param=model.layers.44.mlp.down_proj.weight]
Loading weights: 89%|████████▊ | 401/453 [00:00<00:00, 5060.62it/s, Materializing param=model.layers.44.mlp.gate_proj.weight]
Loading weights: 89%|████████▊ | 401/453 [00:00<00:00, 5058.36it/s, Materializing param=model.layers.44.mlp.gate_proj.weight]
Loading weights: 89%|████████▊ | 402/453 [00:00<00:00, 5065.98it/s, Materializing param=model.layers.44.mlp.up_proj.weight]
Loading weights: 89%|████████▊ | 402/453 [00:00<00:00, 5063.96it/s, Materializing param=model.layers.44.mlp.up_proj.weight]
Loading weights: 89%|████████▉ | 403/453 [00:00<00:00, 5071.45it/s, Materializing param=model.layers.44.post_attention_layernorm.weight]
Loading weights: 89%|████████▉ | 403/453 [00:00<00:00, 5069.31it/s, Materializing param=model.layers.44.post_attention_layernorm.weight]
Loading weights: 89%|████████▉ | 404/453 [00:00<00:00, 5076.97it/s, Materializing param=model.layers.44.self_attn.k_proj.weight]
Loading weights: 89%|████████▉ | 404/453 [00:00<00:00, 5075.04it/s, Materializing param=model.layers.44.self_attn.k_proj.weight]
Loading weights: 89%|████████▉ | 405/453 [00:00<00:00, 5082.59it/s, Materializing param=model.layers.44.self_attn.o_proj.weight]
Loading weights: 89%|████████▉ | 405/453 [00:00<00:00, 5080.63it/s, Materializing param=model.layers.44.self_attn.o_proj.weight]
Loading weights: 90%|████████▉ | 406/453 [00:00<00:00, 5055.85it/s, Materializing param=model.layers.44.self_attn.q_proj.weight]
Loading weights: 90%|████████▉ | 406/453 [00:00<00:00, 5053.74it/s, Materializing param=model.layers.44.self_attn.q_proj.weight]
Loading weights: 90%|████████▉ | 407/453 [00:00<00:00, 5053.06it/s, Materializing param=model.layers.44.self_attn.v_proj.weight]
Loading weights: 90%|████████▉ | 407/453 [00:00<00:00, 5051.12it/s, Materializing param=model.layers.44.self_attn.v_proj.weight]
Loading weights: 90%|█████████ | 408/453 [00:00<00:00, 5058.70it/s, Materializing param=model.layers.45.input_layernorm.weight]
Loading weights: 90%|█████████ | 408/453 [00:00<00:00, 5056.72it/s, Materializing param=model.layers.45.input_layernorm.weight]
Loading weights: 90%|█████████ | 409/453 [00:00<00:00, 5041.26it/s, Materializing param=model.layers.45.mlp.down_proj.weight]
Loading weights: 90%|█████████ | 409/453 [00:00<00:00, 5039.07it/s, Materializing param=model.layers.45.mlp.down_proj.weight]
Loading weights: 91%|█████████ | 410/453 [00:00<00:00, 5046.50it/s, Materializing param=model.layers.45.mlp.gate_proj.weight]
Loading weights: 91%|█████████ | 410/453 [00:00<00:00, 5044.59it/s, Materializing param=model.layers.45.mlp.gate_proj.weight]
Loading weights: 91%|█████████ | 411/453 [00:00<00:00, 5052.00it/s, Materializing param=model.layers.45.mlp.up_proj.weight]
Loading weights: 91%|█████████ | 411/453 [00:00<00:00, 5050.15it/s, Materializing param=model.layers.45.mlp.up_proj.weight]
Loading weights: 91%|█████████ | 412/453 [00:00<00:00, 5057.79it/s, Materializing param=model.layers.45.post_attention_layernorm.weight]
Loading weights: 91%|█████████ | 413/453 [00:00<00:00, 5063.32it/s, Materializing param=model.layers.45.self_attn.k_proj.weight]
Loading weights: 91%|█████████▏| 414/453 [00:00<00:00, 5068.85it/s, Materializing param=model.layers.45.self_attn.o_proj.weight]
Loading weights: 92%|█████████▏| 415/453 [00:00<00:00, 5074.59it/s, Materializing param=model.layers.45.self_attn.q_proj.weight]
Loading weights: 92%|█████████▏| 416/453 [00:00<00:00, 5048.99it/s, Materializing param=model.layers.45.self_attn.v_proj.weight]
Loading weights: 92%|█████████▏| 417/453 [00:00<00:00, 5053.93it/s, Materializing param=model.layers.46.input_layernorm.weight]
Loading weights: 92%|█████████▏| 418/453 [00:00<00:00, 5059.39it/s, Materializing param=model.layers.46.mlp.down_proj.weight]
Loading weights: 92%|█████████▏| 419/453 [00:00<00:00, 5064.99it/s, Materializing param=model.layers.46.mlp.gate_proj.weight]
Loading weights: 93%|█████████▎| 420/453 [00:00<00:00, 5070.45it/s, Materializing param=model.layers.46.mlp.up_proj.weight]
Loading weights: 93%|█████████▎| 421/453 [00:00<00:00, 5076.07it/s, Materializing param=model.layers.46.post_attention_layernorm.weight]
Loading weights: 93%|█████████▎| 422/453 [00:00<00:00, 5081.47it/s, Materializing param=model.layers.46.self_attn.k_proj.weight]
Loading weights: 93%|█████████▎| 423/453 [00:00<00:00, 5077.53it/s, Materializing param=model.layers.46.self_attn.o_proj.weight]
Loading weights: 94%|█████████▎| 424/453 [00:00<00:00, 5067.53it/s, Materializing param=model.layers.46.self_attn.q_proj.weight]
Loading weights: 94%|█████████▍| 425/453 [00:00<00:00, 5072.89it/s, Materializing param=model.layers.46.self_attn.v_proj.weight]
Loading weights: 94%|█████████▍| 426/453 [00:00<00:00, 5078.37it/s, Materializing param=model.layers.47.input_layernorm.weight]
Loading weights: 94%|█████████▍| 427/453 [00:00<00:00, 5083.85it/s, Materializing param=model.layers.47.mlp.down_proj.weight]
Loading weights: 94%|█████████▍| 428/453 [00:00<00:00, 5089.27it/s, Materializing param=model.layers.47.mlp.gate_proj.weight]
Loading weights: 95%|█████████▍| 429/453 [00:00<00:00, 5087.15it/s, Materializing param=model.layers.47.mlp.up_proj.weight]
Loading weights: 95%|█████████▍| 430/453 [00:00<00:00, 5070.06it/s, Materializing param=model.layers.47.post_attention_layernorm.weight]
Loading weights: 95%|█████████▌| 431/453 [00:00<00:00, 5070.54it/s, Materializing param=model.layers.47.self_attn.k_proj.weight]
Loading weights: 95%|█████████▌| 432/453 [00:00<00:00, 5075.49it/s, Materializing param=model.layers.47.self_attn.o_proj.weight]
Loading weights: 96%|█████████▌| 433/453 [00:00<00:00, 5080.53it/s, Materializing param=model.layers.47.self_attn.q_proj.weight]
Loading weights: 96%|█████████▌| 434/453 [00:00<00:00, 5085.74it/s, Materializing param=model.layers.47.self_attn.v_proj.weight]
Loading weights: 96%|█████████▌| 435/453 [00:00<00:00, 5091.13it/s, Materializing param=model.layers.48.input_layernorm.weight]
Loading weights: 96%|█████████▌| 436/453 [00:00<00:00, 5096.53it/s, Materializing param=model.layers.48.mlp.down_proj.weight]
Loading weights: 96%|█████████▋| 437/453 [00:00<00:00, 5100.56it/s, Materializing param=model.layers.48.mlp.gate_proj.weight]
Loading weights: 97%|█████████▋| 438/453 [00:00<00:00, 5092.86it/s, Materializing param=model.layers.48.mlp.up_proj.weight]
Loading weights: 97%|█████████▋| 439/453 [00:00<00:00, 5098.21it/s, Materializing param=model.layers.48.post_attention_layernorm.weight]
Loading weights: 97%|█████████▋| 440/453 [00:00<00:00, 5103.05it/s, Materializing param=model.layers.48.self_attn.k_proj.weight]
Loading weights: 97%|█████████▋| 441/453 [00:00<00:00, 5081.09it/s, Materializing param=model.layers.48.self_attn.o_proj.weight]
Loading weights: 98%|█████████▊| 442/453 [00:00<00:00, 5086.00it/s, Materializing param=model.layers.48.self_attn.q_proj.weight]
Loading weights: 98%|█████████▊| 443/453 [00:00<00:00, 5091.15it/s, Materializing param=model.layers.48.self_attn.v_proj.weight]
Loading weights: 98%|█████████▊| 444/453 [00:00<00:00, 5096.04it/s, Materializing param=model.layers.49.input_layernorm.weight]
Loading weights: 98%|█████████▊| 445/453 [00:00<00:00, 5066.82it/s, Materializing param=model.layers.49.mlp.down_proj.weight]
Loading weights: 98%|█████████▊| 446/453 [00:00<00:00, 5073.22it/s, Materializing param=model.layers.49.mlp.gate_proj.weight]
Loading weights: 99%|█████████▊| 447/453 [00:00<00:00, 5079.84it/s, Materializing param=model.layers.49.mlp.up_proj.weight]
Loading weights: 99%|█████████▉| 448/453 [00:00<00:00, 5086.59it/s, Materializing param=model.layers.49.post_attention_layernorm.weight]
Loading weights: 99%|█████████▉| 449/453 [00:00<00:00, 5092.87it/s, Materializing param=model.layers.49.self_attn.k_proj.weight]
Loading weights: 99%|█████████▉| 450/453 [00:00<00:00, 5099.53it/s, Materializing param=model.layers.49.self_attn.o_proj.weight]
Loading weights: 100%|█████████▉| 451/453 [00:00<00:00, 5106.09it/s, Materializing param=model.layers.49.self_attn.q_proj.weight]
Loading weights: 100%|█████████▉| 452/453 [00:00<00:00, 5112.74it/s, Materializing param=model.layers.49.self_attn.v_proj.weight]
Loading weights: 100%|██████████| 453/453 [00:00<00:00, 5113.53it/s, Materializing param=model.norm.weight] |
| I0222 17:49:41.686670 41281 quantize_decompress_robust.py:270] Loaded layer 0 |
| I0222 17:49:41.931416 41281 quantize_decompress_robust.py:270] Loaded layer 1 |
| I0222 17:49:42.159745 41281 quantize_decompress_robust.py:270] Loaded layer 2 |
| I0222 17:49:42.380160 41281 quantize_decompress_robust.py:270] Loaded layer 3 |
| I0222 17:49:42.606572 41281 quantize_decompress_robust.py:270] Loaded layer 4 |
| I0222 17:49:42.839407 41281 quantize_decompress_robust.py:270] Loaded layer 5 |
| I0222 17:49:43.075967 41281 quantize_decompress_robust.py:270] Loaded layer 6 |
| I0222 17:49:43.309864 41281 quantize_decompress_robust.py:270] Loaded layer 7 |
| I0222 17:49:43.545372 41281 quantize_decompress_robust.py:270] Loaded layer 8 |
| I0222 17:49:43.779607 41281 quantize_decompress_robust.py:270] Loaded layer 9 |
| I0222 17:49:44.008318 41281 quantize_decompress_robust.py:270] Loaded layer 10 |
| I0222 17:49:44.227247 41281 quantize_decompress_robust.py:270] Loaded layer 11 |
| I0222 17:49:44.460201 41281 quantize_decompress_robust.py:270] Loaded layer 12 |
| I0222 17:49:44.697040 41281 quantize_decompress_robust.py:270] Loaded layer 13 |
| I0222 17:49:44.924147 41281 quantize_decompress_robust.py:270] Loaded layer 14 |
| I0222 17:49:45.152987 41281 quantize_decompress_robust.py:270] Loaded layer 15 |
| I0222 17:49:45.381372 41281 quantize_decompress_robust.py:270] Loaded layer 16 |
| I0222 17:49:45.618381 41281 quantize_decompress_robust.py:270] Loaded layer 17 |
| I0222 17:49:45.846254 41281 quantize_decompress_robust.py:270] Loaded layer 18 |
| I0222 17:49:46.083225 41281 quantize_decompress_robust.py:270] Loaded layer 19 |
| I0222 17:49:46.318634 41281 quantize_decompress_robust.py:270] Loaded layer 20 |
| I0222 17:49:46.555366 41281 quantize_decompress_robust.py:270] Loaded layer 21 |
| I0222 17:49:46.792297 41281 quantize_decompress_robust.py:270] Loaded layer 22 |
| I0222 17:49:47.016160 41281 quantize_decompress_robust.py:270] Loaded layer 23 |
| I0222 17:49:47.251988 41281 quantize_decompress_robust.py:270] Loaded layer 24 |
| I0222 17:49:47.480787 41281 quantize_decompress_robust.py:270] Loaded layer 25 |
| I0222 17:49:47.705153 41281 quantize_decompress_robust.py:270] Loaded layer 26 |
| I0222 17:49:47.939575 41281 quantize_decompress_robust.py:270] Loaded layer 27 |
| I0222 17:49:48.172534 41281 quantize_decompress_robust.py:270] Loaded layer 28 |
| I0222 17:49:48.419337 41281 quantize_decompress_robust.py:270] Loaded layer 29 |
| I0222 17:49:48.669432 41281 quantize_decompress_robust.py:270] Loaded layer 30 |
| I0222 17:49:48.910189 41281 quantize_decompress_robust.py:270] Loaded layer 31 |
| I0222 17:49:49.132575 41281 quantize_decompress_robust.py:270] Loaded layer 32 |
| I0222 17:49:49.354174 41281 quantize_decompress_robust.py:270] Loaded layer 33 |
| I0222 17:49:49.587378 41281 quantize_decompress_robust.py:270] Loaded layer 34 |
| I0222 17:49:49.819832 41281 quantize_decompress_robust.py:270] Loaded layer 35 |
| I0222 17:49:50.053029 41281 quantize_decompress_robust.py:270] Loaded layer 36 |
| I0222 17:49:50.281270 41281 quantize_decompress_robust.py:270] Loaded layer 37 |
| I0222 17:49:50.519028 41281 quantize_decompress_robust.py:270] Loaded layer 38 |
| I0222 17:49:50.768373 41281 quantize_decompress_robust.py:270] Loaded layer 39 |
| I0222 17:49:51.015100 41281 quantize_decompress_robust.py:270] Loaded layer 40 |
| I0222 17:49:51.276346 41281 quantize_decompress_robust.py:270] Loaded layer 41 |
| I0222 17:49:51.540829 41281 quantize_decompress_robust.py:270] Loaded layer 42 |
| I0222 17:49:51.778579 41281 quantize_decompress_robust.py:270] Loaded layer 43 |
| I0222 17:49:52.019993 41281 quantize_decompress_robust.py:270] Loaded layer 44 |
| I0222 17:49:52.263155 41281 quantize_decompress_robust.py:270] Loaded layer 45 |
| I0222 17:49:52.504542 41281 quantize_decompress_robust.py:270] Loaded layer 46 |
| I0222 17:49:52.744200 41281 quantize_decompress_robust.py:270] Loaded layer 47 |
| I0222 17:49:52.980363 41281 quantize_decompress_robust.py:270] Loaded layer 48 |
| I0222 17:49:53.217294 41281 quantize_decompress_robust.py:270] Loaded layer 49 |
| I0222 17:49:53.217360 41281 quantize_decompress_robust.py:272] Saving to /dev/shm/variant-c-fp16... |
|
Writing model shards: 0%| | 0/1 [00:00<?, ?it/s]
Writing model shards: 100%|██████████| 1/1 [00:21<00:00, 21.04s/it] |
| I0222 17:50:14.323399 41281 quantize_decompress_robust.py:276] DONE - model saved to /dev/shm/variant-c-fp16 |