aws-neuron / optimum-neuron-cache
License: apache-2.0
7715c92
optimum-neuron-cache / neuronxcc-2.13.66.0+6dfecc895 / 0_REGISTRY / 0.0.22 / inference / llama
6 contributors
History: 73 commits
htokoyo   Synchronizing local compiler cache.   5b590b1   verified   9 months ago
01-ai             Synchronizing local compiler cache.   10 months ago
HuggingFaceTB     Synchronizing local compiler cache.   10 months ago
LargeWorldModel   Synchronizing local compiler cache.   10 months ago
abacusai          Synchronizing local compiler cache.   10 months ago
defog             Synchronizing local compiler cache.   10 months ago
elyza             Synchronizing local compiler cache.   10 months ago
gorilla-llm       Synchronizing local compiler cache.   10 months ago
ibm               Synchronizing local compiler cache.   10 months ago
llm-jp            Synchronizing local compiler cache.   9 months ago
m-a-p             Synchronizing local compiler cache.   10 months ago
meta-llama        Remove cached bf16 Llama3 entry       10 months ago