name,prefill_H100,prefill_MI300x,abs_diff
Attention,5.598502664353409,16.355703363920924,10.757200699567516
Llama3RotaryEmbedding,1.6660966986441785,4.529713012372985,2.863616313728807
LogitsProcessor,3.851259651293784,1.2953072618989923,-2.5559523893947915
"MergedColumnParallelLinear(weight=bfloat16[28672, 4096])",36.71509967192245,33.48230080715403,-3.2327988647684194
"QKVParallelLinear(weight=bfloat16[6144, 4096])",9.863807294141058,9.283515162067658,-0.5802921320733994
RMSNorm(weight=bfloat16[4096]) <- LlamaDecoderLayer,3.166628119696359,3.353858844750323,0.187230725053964
RMSNorm(weight=bfloat16[4096]) <- LlamaForCausalLM,,0.053253468498005566,
"RowParallelLinear(weight=bfloat16[4096, 14336])",25.70113205093823,20.549732971434175,-5.151399079504056
"RowParallelLinear(weight=bfloat16[4096, 4096])",7.7274924679659325,7.3817219772250535,-0.34577049074087896
Sampler,1.3505386801824866,1.1930674641142165,-0.15747121606827008
SiluAndMul,4.202015117001586,2.4797439544929962,-1.7222711625085894
others,0.050253827955756426,0.04208171207060033,-0.008172115885156095
vocab_embed_ops,0.1071737559047699,,