wolfram/miquliz-120b-v2.0-GGUF (28 likes)

Tags: Transformers · GGUF · 5 languages · mergekit · Merge · Inference Endpoints · conversational · arxiv:2203.05482
Files and versions
miquliz-120b-v2.0-GGUF
1 contributor · History: 14 commits

Latest commit by wolfram: e27057f034b6ea2137b2087767feb372a125941c2ce1742a550007ed9751513d · 2c86442 (verified) · 11 months ago
| File | Size | Storage | Last commit message | Last updated |
| --- | --- | --- | --- | --- |
| .gitattributes | 2.14 kB | | e27057f034b6ea2137b2087767feb372a125941c2ce1742a550007ed9751513d | 11 months ago |
| README.md | 30.4 kB | | Update README.md | 12 months ago |
| miquliz-120b-v2.0.IQ1_S.gguf | 25.2 GB | LFS | fbe56ddbaa35565bdd8c74f04749c3136a38b968251093bbe587d2e884a9aa4b | 11 months ago |
| miquliz-120b-v2.0.IQ2_XS.gguf | 35.4 GB | LFS | a4b56e8c294faecd13d5f45e857854285620e1eb62d38d0753b156463d717d42 | 11 months ago |
| miquliz-120b-v2.0.IQ2_XXS.gguf | 31.8 GB | LFS | e27057f034b6ea2137b2087767feb372a125941c2ce1742a550007ed9751513d | 11 months ago |
| miquliz-120b-v2.0.IQ3_XXS.gguf | 49 GB | LFS | Upload folder using huggingface_hub | about 1 year ago |
| miquliz-120b-v2.0.Q2_K.gguf | 44.2 GB | LFS | Upload folder using huggingface_hub | about 1 year ago |
| miquliz-120b-v2.0.Q4_K_M.gguf-split-a | 50 GB | LFS | Upload folder using huggingface_hub | about 1 year ago |
| miquliz-120b-v2.0.Q4_K_M.gguf-split-b | 22.1 GB | LFS | Upload folder using huggingface_hub | about 1 year ago |
| miquliz-120b-v2.0.Q5_K_M.gguf-split-a | 50 GB | LFS | Upload folder using huggingface_hub | about 1 year ago |
| miquliz-120b-v2.0.Q5_K_M.gguf-split-b | 35 GB | LFS | Upload folder using huggingface_hub | about 1 year ago |
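The quants above can be fetched programmatically with the same `huggingface_hub` library mentioned in the upload commit messages. Below is a minimal sketch, not the official instructions for this repo: it downloads one of the listed single-file quants with `hf_hub_download`, and then rejoins the two-part Q4_K_M quant on the assumption that the `-split-a`/`-split-b` files are plain byte splits made to fit under the per-file size limit, so simple concatenation restores the original `.gguf`. The output path and chunk size are illustrative choices, not values taken from the repo.

```python
from pathlib import Path

from huggingface_hub import hf_hub_download

REPO_ID = "wolfram/miquliz-120b-v2.0-GGUF"

# Single-file quant: downloaded (and cached) as one .gguf, ready for llama.cpp.
iq3_path = hf_hub_download(repo_id=REPO_ID, filename="miquliz-120b-v2.0.IQ3_XXS.gguf")
print("IQ3_XXS downloaded to:", iq3_path)

# Split quant: download both parts, then concatenate them in order.
# Assumption: split-a/split-b are raw byte splits, so byte-wise concatenation
# reproduces the original miquliz-120b-v2.0.Q4_K_M.gguf file.
parts = [
    hf_hub_download(repo_id=REPO_ID, filename="miquliz-120b-v2.0.Q4_K_M.gguf-split-a"),
    hf_hub_download(repo_id=REPO_ID, filename="miquliz-120b-v2.0.Q4_K_M.gguf-split-b"),
]

out_path = Path("miquliz-120b-v2.0.Q4_K_M.gguf")  # hypothetical local output name
with out_path.open("wb") as joined:
    for part in parts:
        with open(part, "rb") as src:
            # Copy in 64 MiB chunks to avoid loading tens of GB into memory.
            while chunk := src.read(64 * 1024 * 1024):
                joined.write(chunk)

print("Joined file:", out_path, out_path.stat().st_size, "bytes")
```

The joined file size should equal the sum of the two parts (roughly 50 GB + 22.1 GB for Q4_K_M per the table above); if it does not, the download or concatenation was incomplete.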