name,prefill_H100,prefill_MI300x,abs_diff
Attention,5.725389884282574,21.13655746796299,15.411167583680417
Llama3RotaryEmbedding,1.7857420562193158,3.1890826435170294,1.4033405872977136
LogitsProcessor,1.017143181316883,0.35926109541711065,-0.6578820858997723
"MergedColumnParallelLinear(weight=bfloat16[28672, 4096])",43.29349359545625,35.02980749731597,-8.263686098140276
"QKVParallelLinear(weight=bfloat16[6144, 4096])",9.751519197825727,8.501444341490565,-1.2500748563351625
RMSNorm(weight=bfloat16[4096]) <- LlamaDecoderLayer,2.87586431034889,2.772191898445982,-0.10367241190290821
"RowParallelLinear(weight=bfloat16[4096, 14336])",21.449999372013618,19.56647746212289,-1.883521909890728
"RowParallelLinear(weight=bfloat16[4096, 4096])",7.4886429722468115,5.775504701351854,-1.7131382708949578
Sampler,0.36281806980696546,0.3658306067798142,0.003012536972848756
SiluAndMul,6.082842551397157,3.1813915450811434,-2.9014510063160133
others,0.042990959395247705,0.04396077893029697,0.0009698195350492655
vocab_embed_ops,0.12355384969054385,0.07848996158435963,-0.045063888106184224