Refer to the Provided Files table below to see what files use which methods, and how.

| Name | Quant method | Bits | Size | Max RAM required | Use case |
| ---- | ---- | ---- | ---- | ---- | ----- |
| [falcon-180b-chat.Q5_K_M.gguf-split-d](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q5_K_M.gguf-split-d) | Q5_K_M | 5 | 0.00 GB| 2.50 GB | large, very low quality loss - recommended |
| [falcon-180b-chat.Q4_0.gguf-split-a](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q4_0.gguf-split-a) | Q4_0 | 4 | 33.83 GB| 36.33 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [falcon-180b-chat.Q4_0.gguf-split-b](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q4_0.gguf-split-b) | Q4_0 | 4 | 33.83 GB| 36.33 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [falcon-180b-chat.Q4_0.gguf-split-c](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q4_0.gguf-split-c) | Q4_0 | 4 | 33.83 GB| 36.33 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [falcon-180b-chat.Q4_K_S.gguf-split-a](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q4_K_S.gguf-split-a) | Q4_K_S | 4 | 33.83 GB| 36.33 GB | small, greater quality loss |
| [falcon-180b-chat.Q4_K_S.gguf-split-b](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q4_K_S.gguf-split-b) | Q4_K_S | 4 | 33.83 GB| 36.33 GB | small, greater quality loss |
| [falcon-180b-chat.Q4_K_S.gguf-split-c](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q4_K_S.gguf-split-c) | Q4_K_S | 4 | 33.83 GB| 36.33 GB | small, greater quality loss |
| [falcon-180b-chat.Q4_K_M.gguf-split-a](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q4_K_M.gguf-split-a) | Q4_K_M | 4 | 36.16 GB| 38.66 GB | medium, balanced quality - recommended |
| [falcon-180b-chat.Q4_K_M.gguf-split-b](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q4_K_M.gguf-split-b) | Q4_K_M | 4 | 36.16 GB| 38.66 GB | medium, balanced quality - recommended |
| [falcon-180b-chat.Q4_K_M.gguf-split-c](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q4_K_M.gguf-split-c) | Q4_K_M | 4 | 36.16 GB| 38.66 GB | medium, balanced quality - recommended |
| [falcon-180b-chat.Q2_K.gguf-split-a](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q2_K.gguf-split-a) | Q2_K | 2 | 36.98 GB| 39.48 GB | smallest, significant quality loss - not recommended for most purposes |
| [falcon-180b-chat.Q2_K.gguf-split-b](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q2_K.gguf-split-b) | Q2_K | 2 | 36.98 GB| 39.48 GB | smallest, significant quality loss - not recommended for most purposes |
| [falcon-180b-chat.Q3_K_S.gguf-split-a](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q3_K_S.gguf-split-a) | Q3_K_S | 3 | 38.88 GB| 41.38 GB | very small, high quality loss |
| [falcon-180b-chat.Q3_K_S.gguf-split-b](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q3_K_S.gguf-split-b) | Q3_K_S | 3 | 38.88 GB| 41.38 GB | very small, high quality loss |
| [falcon-180b-chat.Q5_0.gguf-split-a](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q5_0.gguf-split-a) | Q5_0 | 5 | 41.27 GB| 43.77 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [falcon-180b-chat.Q5_0.gguf-split-b](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q5_0.gguf-split-b) | Q5_0 | 5 | 41.27 GB| 43.77 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [falcon-180b-chat.Q5_0.gguf-split-c](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q5_0.gguf-split-c) | Q5_0 | 5 | 41.27 GB| 43.77 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [falcon-180b-chat.Q5_K_S.gguf-split-a](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q5_K_S.gguf-split-a) | Q5_K_S | 5 | 41.27 GB| 43.77 GB | large, low quality loss - recommended |
| [falcon-180b-chat.Q5_K_S.gguf-split-b](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q5_K_S.gguf-split-b) | Q5_K_S | 5 | 41.27 GB| 43.77 GB | large, low quality loss - recommended |
| [falcon-180b-chat.Q5_K_S.gguf-split-c](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q5_K_S.gguf-split-c) | Q5_K_S | 5 | 41.27 GB| 43.77 GB | large, low quality loss - recommended |
| [falcon-180b-chat.Q3_K_M.gguf-split-a](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q3_K_M.gguf-split-a) | Q3_K_M | 3 | 42.59 GB| 45.09 GB | very small, high quality loss |
| [falcon-180b-chat.Q3_K_M.gguf-split-b](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q3_K_M.gguf-split-b) | Q3_K_M | 3 | 42.59 GB| 45.09 GB | very small, high quality loss |
| [falcon-180b-chat.Q5_K_M.gguf-split-a](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q5_K_M.gguf-split-a) | Q5_K_M | 5 | 43.66 GB| 46.16 GB | large, very low quality loss - recommended |
| [falcon-180b-chat.Q5_K_M.gguf-split-b](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q5_K_M.gguf-split-b) | Q5_K_M | 5 | 43.66 GB| 46.16 GB | large, very low quality loss - recommended |
| [falcon-180b-chat.Q5_K_M.gguf-split-c](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q5_K_M.gguf-split-c) | Q5_K_M | 5 | 43.66 GB| 46.16 GB | large, very low quality loss - recommended |
| [falcon-180b-chat.Q3_K_L.gguf-split-a](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q3_K_L.gguf-split-a) | Q3_K_L | 3 | 45.99 GB| 48.49 GB | small, substantial quality loss |
| [falcon-180b-chat.Q3_K_L.gguf-split-b](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q3_K_L.gguf-split-b) | Q3_K_L | 3 | 45.99 GB| 48.49 GB | small, substantial quality loss |
| [falcon-180b-chat.Q8_0.gguf-split-a](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q8_0.gguf-split-a) | Q8_0 | 8 | 47.69 GB| 50.19 GB | very large, extremely low quality loss - not recommended |
| [falcon-180b-chat.Q8_0.gguf-split-b](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q8_0.gguf-split-b) | Q8_0 | 8 | 47.69 GB| 50.19 GB | very large, extremely low quality loss - not recommended |
| [falcon-180b-chat.Q8_0.gguf-split-c](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q8_0.gguf-split-c) | Q8_0 | 8 | 47.69 GB| 50.19 GB | very large, extremely low quality loss - not recommended |
| [falcon-180b-chat.Q8_0.gguf-split-d](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q8_0.gguf-split-d) | Q8_0 | 8 | 47.69 GB| 50.19 GB | very large, extremely low quality loss - not recommended |
| [falcon-180b-chat.Q6_K.gguf-split-a](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q6_K.gguf-split-a) | Q6_K | 6 | 49.17 GB| 51.67 GB | very large, extremely low quality loss |
| [falcon-180b-chat.Q6_K.gguf-split-b](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q6_K.gguf-split-b) | Q6_K | 6 | 49.17 GB| 51.67 GB | very large, extremely low quality loss |
| [falcon-180b-chat.Q6_K.gguf-split-c](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF/blob/main/falcon-180b-chat.Q6_K.gguf-split-c) | Q6_K | 6 | 49.17 GB| 51.67 GB | very large, extremely low quality loss |
| falcon-180b-chat.Q2_K.gguf | Q2_K | 2 | 73.97 GB| 76.47 GB | smallest, significant quality loss - not recommended for most purposes |
| falcon-180b-chat.Q3_K_S.gguf | Q3_K_S | 3 | 77.77 GB| 80.27 GB | very small, high quality loss |
| falcon-180b-chat.Q3_K_M.gguf | Q3_K_M | 3 | 85.18 GB| 87.68 GB | very small, high quality loss |
| falcon-180b-chat.Q5_0.gguf | Q5_0 | 5 | 123.80 GB| 126.30 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| falcon-180b-chat.Q5_K_S.gguf | Q5_K_S | 5 | 123.80 GB| 126.30 GB | large, low quality loss - recommended |
| falcon-180b-chat.Q5_K_M.gguf | Q5_K_M | 5 | 130.99 GB| 133.49 GB | large, very low quality loss - recommended |
| falcon-180b-chat.Q6_K.gguf | Q6_K | 6 | 147.52 GB| 150.02 GB | very large, extremely low quality loss |
| falcon-180b-chat.Q8_0.gguf | Q8_0 | 8 | 190.76 GB| 193.26 GB | very large, extremely low quality loss - not recommended |

**Note**: the above RAM figures assume no GPU offloading. If layers are offloaded to the GPU, this will reduce RAM usage and use VRAM instead.
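
As an illustration, partial GPU offload with `llama.cpp` might look like the following sketch (this assumes a GPU-enabled `llama.cpp` build and its `main` example binary; `-ngl 35` is an arbitrary example value - raise or lower it to match however many layers fit in your VRAM):

```
# Offload 35 layers to the GPU; the remaining layers stay in system RAM
./main -m falcon-180b-chat.Q5_K_M.gguf -ngl 35 -c 2048 -e -p "User: Hello, who are you?\nAssistant:"
```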

### Q6_K and Q8_0 files are split and require joining

**Note:** HF does not support uploading files larger than 50GB. Therefore I have uploaded the Q6_K and Q8_0 files as split files.

<details>
<summary>Click for instructions regarding Q6_K and Q8_0 files</summary>

### q6_K
Please download:
* `falcon-180b-chat.Q6_K.gguf-split-a`
* `falcon-180b-chat.Q6_K.gguf-split-b`
* `falcon-180b-chat.Q6_K.gguf-split-c`

### q8_0
Please download:
* `falcon-180b-chat.Q8_0.gguf-split-a`
* `falcon-180b-chat.Q8_0.gguf-split-b`
* `falcon-180b-chat.Q8_0.gguf-split-c`
* `falcon-180b-chat.Q8_0.gguf-split-d`
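
If you prefer to fetch the split parts from the command line, here is a sketch using `huggingface-cli` (this assumes the `huggingface_hub` package is installed, e.g. via `pip3 install huggingface-hub`):

```
# Download all four Q8_0 parts into the current directory
huggingface-cli download TheBloke/Falcon-180B-Chat-GGUF falcon-180b-chat.Q8_0.gguf-split-a falcon-180b-chat.Q8_0.gguf-split-b falcon-180b-chat.Q8_0.gguf-split-c falcon-180b-chat.Q8_0.gguf-split-d --local-dir .
```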

To join the files, do the following:

Linux and macOS:
```
cat falcon-180b-chat.Q6_K.gguf-split-* > falcon-180b-chat.Q6_K.gguf && rm falcon-180b-chat.Q6_K.gguf-split-*
cat falcon-180b-chat.Q8_0.gguf-split-* > falcon-180b-chat.Q8_0.gguf && rm falcon-180b-chat.Q8_0.gguf-split-*
```
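
As an optional sanity check, the joined files should match the sizes given in the table above (147.52 GB for Q6_K, 190.76 GB for Q8_0):

```
ls -lh falcon-180b-chat.Q6_K.gguf falcon-180b-chat.Q8_0.gguf
```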

Windows command line:
```
COPY /B falcon-180b-chat.Q6_K.gguf-split-a + falcon-180b-chat.Q6_K.gguf-split-b + falcon-180b-chat.Q6_K.gguf-split-c falcon-180b-chat.Q6_K.gguf
del falcon-180b-chat.Q6_K.gguf-split-a falcon-180b-chat.Q6_K.gguf-split-b falcon-180b-chat.Q6_K.gguf-split-c

COPY /B falcon-180b-chat.Q8_0.gguf-split-a + falcon-180b-chat.Q8_0.gguf-split-b + falcon-180b-chat.Q8_0.gguf-split-c + falcon-180b-chat.Q8_0.gguf-split-d falcon-180b-chat.Q8_0.gguf
del falcon-180b-chat.Q8_0.gguf-split-a falcon-180b-chat.Q8_0.gguf-split-b falcon-180b-chat.Q8_0.gguf-split-c falcon-180b-chat.Q8_0.gguf-split-d
```
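
The equivalent size check on Windows:

```
dir falcon-180b-chat.Q6_K.gguf falcon-180b-chat.Q8_0.gguf
```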

</details>