DEFAULT_stage:
DEFAULT_modifiers:
QuantizationModifier:
ignore: [lm_head, 're:.*block_sparse_moe.gate']
targets: [Linear]
scheme: FP8_DYNAMIC
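
This recipe applies dynamic FP8 quantization to all `Linear` layers while skipping the `lm_head` and any MoE router gates matched by the `re:`-prefixed regex. A minimal sketch of how those `ignore` entries could be interpreted (exact-name match by default, regex when prefixed with `re:`), assuming matching semantics along the lines of llm-compressor's; the layer names below are hypothetical Mixtral-style module paths:

```python
import re

# Entries from the recipe's ignore list: a plain name matches exactly,
# while a "re:"-prefixed entry is treated as a regular expression.
IGNORE = ["lm_head", "re:.*block_sparse_moe.gate"]

def is_ignored(module_name: str) -> bool:
    """Return True if a module should be excluded from quantization."""
    for pattern in IGNORE:
        if pattern.startswith("re:"):
            if re.match(pattern[3:], module_name):
                return True
        elif module_name == pattern:
            return True
    return False

print(is_ignored("lm_head"))                               # True: exact match
print(is_ignored("model.layers.0.block_sparse_moe.gate"))  # True: regex match
print(is_ignored("model.layers.0.self_attn.q_proj"))       # False: quantized
```

Excluding the sparse-MoE gates is a common choice because router logits are small but numerically sensitive, so keeping them in higher precision costs little while protecting expert-routing quality.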