GLM5 - Kernel

#4
by rahul7star - opened

Hey guys, does this write-up make sense on why we use Flash Attention?
https://huggingface.co/rahul7star/LLM-Brain/blob/main/Kernels-GLM5.md
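For anyone skimming the thread: the core argument for Flash Attention is that naive attention materializes the full (N, N) score matrix, which costs O(N²) memory and memory bandwidth, while Flash Attention processes keys/values in blocks with an online softmax so only O(N·d) working memory is needed. Here is a minimal NumPy sketch of that idea (not the actual kernel — function names and block size are illustrative, and the real implementation fuses this into a single GPU kernel):

```python
import numpy as np

def naive_attention(Q, K, V):
    # Materializes the full (N, N) score matrix: O(N^2) memory.
    S = Q @ K.T / np.sqrt(Q.shape[-1])
    P = np.exp(S - S.max(axis=-1, keepdims=True))
    P /= P.sum(axis=-1, keepdims=True)
    return P @ V

def flash_style_attention(Q, K, V, block=4):
    # Processes K/V in blocks with an online softmax, so only a
    # (N, block) tile of scores exists at any time -- the core idea
    # behind FlashAttention, sketched here for illustration.
    N, d = Q.shape
    scale = 1.0 / np.sqrt(d)
    O = np.zeros((N, d))
    m = np.full(N, -np.inf)   # running row max (for numerical stability)
    l = np.zeros(N)           # running softmax denominator
    for j in range(0, N, block):
        S = (Q @ K[j:j + block].T) * scale       # only an (N, block) tile
        m_new = np.maximum(m, S.max(axis=-1))
        P = np.exp(S - m_new[:, None])
        correction = np.exp(m - m_new)           # rescale old partial sums
        l = l * correction + P.sum(axis=-1)
        O = O * correction[:, None] + P @ V[j:j + block]
        m = m_new
    return O / l[:, None]

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((8, 16)) for _ in range(3))
print(np.allclose(naive_attention(Q, K, V), flash_style_attention(Q, K, V)))
```

Both paths give the same output; the win is that the blocked version never holds the full score matrix, which is what lets the real kernel keep everything in fast on-chip SRAM.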