- Downloads last month: 17
- Format: Safetensors
- Model size: 8.02B params
- Tensor types: BF16, F8_E4M3
This model is not currently available via any of the supported Inference Providers, and it cannot be deployed to the HF Inference API because the repository has no library tag.

Model tree for nm-testing/Ministral-8B-Instruct-2410-FP8-dynamic

- Quantized (42): this model