---
library_name: transformers
tags:
- kv-cache
- fp8
---

# Llama-3.3-70B-Instruct-QKV-Cache-FP8-Per-Tensor

This model card is currently a placeholder. Per the repository name and tags, it appears to hold a variant of Llama-3.3-70B-Instruct whose query/key/value (KV) cache is quantized to FP8 with per-tensor scaling.
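While the card is otherwise empty, the naming convention implies one FP8 scale per cached tensor. The sketch below illustrates the per-tensor scaling idea in pure Python; it is not this repository's implementation, and real inference kernels cast to the 8-bit E4M3 bit format rather than merely clipping to its dynamic range as done here.

```python
# Illustrative sketch of per-tensor FP8 (E4M3) scaling for a KV-cache tensor.
# Assumption: a single scale per tensor, chosen so the largest magnitude maps
# to E4M3's largest finite value. Real kernels perform an actual E4M3 cast.

E4M3_MAX = 448.0  # largest finite value representable in FP8 E4M3


def per_tensor_scale(values):
    """One scale for the whole tensor: amax / E4M3_MAX (1.0 for all-zero input)."""
    amax = max(abs(v) for v in values)
    return amax / E4M3_MAX if amax > 0 else 1.0


def quant_dequant(values):
    """Scale into E4M3 range, clip, then scale back (round-trip sketch)."""
    s = per_tensor_scale(values)
    return [max(-E4M3_MAX, min(E4M3_MAX, v / s)) * s for v in values]


kv_slice = [0.02, -1.5, 3.2, -0.7]
print(per_tensor_scale(kv_slice))  # amax 3.2 divided by 448.0
print(quant_dequant(kv_slice))
```

Because the scale is chosen from the tensor's own maximum, the round trip here is lossless up to floating-point error; the precision loss in a real FP8 cache comes from the E4M3 cast itself, which this sketch does not model.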