Kquant03/MistralTrix-4x9B-ERP-GGUF
Tags: GGUF · Merge · Inference Endpoints
arXiv: 2101.03961
License: apache-2.0
Branch: main — 1 contributor, 7 commits
Latest commit: Update README.md by Kquant03 (586afbf, 10 months ago)
File                     Size      LFS   Last commit        Uploaded
.gitattributes           1.98 kB   -     Upload 8 files     10 months ago
README.md                6.78 kB   -     Update README.md   10 months ago
ggml-model-q2_k.gguf     10 GB     LFS   Upload 8 files     10 months ago
ggml-model-q3_k_m.gguf   13.1 GB   LFS   Upload 8 files     10 months ago
ggml-model-q4_0.gguf     17 GB     LFS   Upload 8 files     10 months ago
ggml-model-q4_k_m.gguf   17 GB     LFS   Upload 8 files     10 months ago
ggml-model-q5_0.gguf     20.7 GB   LFS   Upload 8 files     10 months ago
ggml-model-q5_k_m.gguf   20.7 GB   LFS   Upload 8 files     10 months ago
ggml-model-q6_k.gguf     24.7 GB   LFS   Upload 8 files     10 months ago
ggml-model-q8_0.gguf     32 GB     LFS   Upload 8 files     10 months ago

All files are marked "Safe" by the Hugging Face file scanner.
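Each quantization above can be downloaded on its own and loaded with any llama.cpp-compatible runtime. The following is a minimal sketch, assuming the huggingface_hub and llama-cpp-python packages are installed and that the ~17 GB Q4_K_M file fits on local disk; the prompt text is illustrative and not part of the model card.

from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch one quantized GGUF file from this repo into the local Hugging Face cache.
model_path = hf_hub_download(
    repo_id="Kquant03/MistralTrix-4x9B-ERP-GGUF",
    filename="ggml-model-q4_k_m.gguf",
)

# Load the GGUF with the llama.cpp bindings and run a short completion.
llm = Llama(model_path=model_path, n_ctx=4096)
output = llm("Write a short greeting.", max_tokens=64)
print(output["choices"][0]["text"])

Picking a smaller quantization (e.g. ggml-model-q2_k.gguf at 10 GB) trades output quality for lower memory use; the larger q6_k and q8_0 files are closer to the unquantized weights but need correspondingly more RAM or VRAM.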