pmysl / c4ai-command-r-plus-GGUF
Text Generation · GGUF · Inference Endpoints · conversational
License: cc-by-nc-4.0
c4ai-command-r-plus-GGUF / imatrix (revision 13a2771)
1 contributor · History: 24 commits
Latest commit: pmysl · BPE pre-tokenization for IQ2_S variant · 13a2771 · 5 months ago
All files are tracked with Git LFS and carry the Hugging Face "Safe" scan badge.

| File | Size | Last commit | Last updated |
|---|---|---|---|
| command-r-plus-IQ1_M.gguf | 25.2 GB | Add chat template to IQ1_M variant | 7 months ago |
| command-r-plus-IQ1_S.gguf | 23.2 GB | Add chat template to IQ1_S variant | 7 months ago |
| command-r-plus-IQ2_M.gguf | 36 GB | Add chat template to IQ2_M variant | 7 months ago |
| command-r-plus-IQ2_S.gguf | 33.3 GB | BPE pre-tokenization for IQ2_S variant | 5 months ago |
| command-r-plus-IQ2_XS.gguf | 31.6 GB | BPE pre-tokenization for IQ2_XS variant | 5 months ago |
| command-r-plus-IQ2_XXS.gguf | 28.6 GB | BPE pre-tokenization for IQ2_XXS variant | 5 months ago |
| command-r-plus-IQ3_M.gguf | 47.7 GB | BPE pre-tokenization for IQ3_M variant | 5 months ago |
| command-r-plus-IQ3_S.gguf | 46 GB | BPE pre-tokenization for IQ3_S variant | 5 months ago |
| command-r-plus-IQ3_XS.gguf | 43.6 GB | BPE pre-tokenization for IQ3_XS variant | 5 months ago |
| command-r-plus-IQ3_XXS.gguf | 40.7 GB | BPE pre-tokenization for IQ3_XXS variant | 5 months ago |
| command-r-plus-IQ4_NL-00001-of-00002.gguf | 31.8 GB | BPE pre-tokenization for IQ4_NL variant | 5 months ago |
| command-r-plus-IQ4_NL-00002-of-00002.gguf | 27.5 GB | Add imatrix variants | 7 months ago |
| command-r-plus-IQ4_XS-00001-of-00002.gguf | 30.2 GB | BPE pre-tokenization for IQ4_XS variant | 5 months ago |
| command-r-plus-IQ4_XS-00002-of-00002.gguf | 26 GB | Add imatrix variants | 7 months ago |
| command-r-plus-Q2_K_S.gguf | 36.6 GB | BPE pre-tokenization for Q2_K_S variant | 5 months ago |
| command-r-plus-f16-c2048-groups_merged-imatrix.dat | 27.5 MB | Add imatrix variants | 7 months ago |
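To use one of these quantizations, download the corresponding .gguf file; for the split variants (IQ4_NL and IQ4_XS) download both shards and point your GGUF runtime at the first shard, which llama.cpp-style loaders use to discover the remaining parts in the same directory. Below is a minimal sketch using the huggingface_hub Python client. The choice of the IQ2_M and IQ4_XS files, the `models` output directory, and the assumption that the files live under the `imatrix/` folder shown in the breadcrumb above are illustrative, not requirements of this repository.

```python
from huggingface_hub import hf_hub_download

# Download a single-file quantization (IQ2_M is chosen here purely as an example).
# Assumption: the files sit in the "imatrix" folder of the repo, as the
# breadcrumb above suggests, so the filename includes that folder prefix.
local_path = hf_hub_download(
    repo_id="pmysl/c4ai-command-r-plus-GGUF",
    filename="imatrix/command-r-plus-IQ2_M.gguf",
    local_dir="models",  # assumption: local target directory
)
print(f"Model downloaded to: {local_path}")

# For split quantizations (e.g. IQ4_XS), fetch every shard; GGUF loaders
# derived from llama.cpp take the path of the first shard and pick up the
# rest automatically when they are in the same directory.
for part in ("00001", "00002"):
    hf_hub_download(
        repo_id="pmysl/c4ai-command-r-plus-GGUF",
        filename=f"imatrix/command-r-plus-IQ4_XS-{part}-of-00002.gguf",
        local_dir="models",
    )
```

The downloaded file can then be loaded with any GGUF-aware runtime, for example llama.cpp or its Python bindings.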