0 (string, 12 distinct values) | 1 (float64, range 0–4.34k) |
|---|---|
megatron.core.transformer.attention.forward.qkv | 219.756134 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.14352 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.106944 |
megatron.core.transformer.attention.forward.core_attention | 4114.963379 |
megatron.core.transformer.attention.forward.linear_proj | 3.5624 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 4340.217773 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 416.34491 |
megatron.core.transformer.mlp.forward.linear_fc1 | 4.250048 |
megatron.core.transformer.mlp.forward.activation | 140.918304 |
megatron.core.transformer.mlp.forward.linear_fc2 | 1.436672 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 147.631393 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.177056 |
megatron.core.transformer.attention.forward.qkv | 0.949856 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.099808 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.099712 |
megatron.core.transformer.attention.forward.core_attention | 3061.956299 |
megatron.core.transformer.attention.forward.linear_proj | 0.17952 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 3063.664551 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067712 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.760032 |
megatron.core.transformer.mlp.forward.activation | 0.093056 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.693376 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.567904 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068192 |
megatron.core.transformer.attention.forward.qkv | 0.342624 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.005216 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005344 |
megatron.core.transformer.attention.forward.core_attention | 26.465343 |
megatron.core.transformer.attention.forward.linear_proj | 0.18096 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 27.031616 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067936 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.761568 |
megatron.core.transformer.mlp.forward.activation | 0.092864 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.69456 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.570848 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067328 |
megatron.core.transformer.attention.forward.qkv | 0.342656 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.005376 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
megatron.core.transformer.attention.forward.core_attention | 20.799105 |
megatron.core.transformer.attention.forward.linear_proj | 0.180352 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 21.364576 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.06768 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.761696 |
megatron.core.transformer.mlp.forward.activation | 0.094144 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.695008 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.571904 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067936 |
megatron.core.transformer.attention.forward.qkv | 0.344032 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.005568 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
megatron.core.transformer.attention.forward.core_attention | 54.164352 |
megatron.core.transformer.attention.forward.linear_proj | 0.18112 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 54.7328 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067808 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.761536 |
megatron.core.transformer.mlp.forward.activation | 0.092192 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.696032 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.570784 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067648 |
megatron.core.transformer.attention.forward.qkv | 0.344768 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.005248 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005376 |
megatron.core.transformer.attention.forward.core_attention | 15.286432 |
megatron.core.transformer.attention.forward.linear_proj | 0.180256 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 15.854208 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067936 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.762144 |
megatron.core.transformer.mlp.forward.activation | 0.092896 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.697568 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.573536 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.06864 |
megatron.core.transformer.attention.forward.qkv | 0.34416 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.005376 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00528 |
megatron.core.transformer.attention.forward.core_attention | 38.563297 |
megatron.core.transformer.attention.forward.linear_proj | 0.180352 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 39.130978 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067168 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.760384 |
megatron.core.transformer.mlp.forward.activation | 0.093536 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.69472 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.569824 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.068224 |
megatron.core.transformer.attention.forward.qkv | 0.343232 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.004928 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.005312 |
megatron.core.transformer.attention.forward.core_attention | 21.166752 |
megatron.core.transformer.attention.forward.linear_proj | 0.180192 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 21.732992 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.067872 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.762048 |
megatron.core.transformer.mlp.forward.activation | 0.092928 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.693696 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.57056 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.067872 |
megatron.core.transformer.attention.forward.qkv | 0.345984 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.00528 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.0056 |
megatron.core.transformer.attention.forward.core_attention | 10.870336 |
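
Each row above pairs a fully qualified forward-pass label with a per-call latency. As a rough, hypothetical sketch of how such per-module timings could be gathered (the `timed` helper and `timings` registry below are illustrative assumptions, not Megatron-Core API), one can wrap each forward callable and record its wall-clock duration under its dotted label:

```python
# Hypothetical sketch of per-module forward timing collection.
# `timed` and `timings` are illustrative names, not Megatron-Core API.
import time
from collections import defaultdict

# Maps a dotted label to the list of elapsed durations (milliseconds).
timings = defaultdict(list)

def timed(label, fn):
    """Wrap fn so each call's wall-clock duration is recorded under label."""
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        timings[label].append((time.perf_counter() - start) * 1000.0)
        return result
    return wrapper

# Example: time a stand-in for the qkv projection.
qkv = timed("megatron.core.transformer.attention.forward.qkv",
            lambda x: [v * 2 for v in x])
qkv([1, 2, 3])

# Emit rows in the same "label | value |" shape as the table above.
for label, samples in timings.items():
    print(f"{label} | {sum(samples) / len(samples):.6f} |")
```

In a real profiler one would likely attach such wrappers via framework hooks and synchronize the device before reading the clock, so that asynchronous kernel launches are not undercounted.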