aws-neuron / optimum-neuron-cache
28 likes · maintained by AWS Inferentia and Trainium (159 followers)
License: apache-2.0
optimum-neuron-cache / inference-cache-config / trn2 @ b4bfd96
4.43 kB · 3 contributors · History: 2 commits
Latest commit: f8538f0 (verified) by dacorvo (HF Staff), 18 days ago: "use longer sequence length for llama3 on trn2"
File           | Size      | Last commit message                           | Last updated
llama3.json    | 2.58 kB   | use longer sequence length for llama3 on trn2 | 18 days ago
llama4.json    | 1.1 kB    | add trn2 cached configs subdirectory          | 4 months ago
qwen3-moe.json | 751 Bytes | add trn2 cached configs subdirectory          | 4 months ago