Update README.md
README.md CHANGED
@@ -1,6 +1,6 @@
 4bit AWQ Quantized Version of [parlance-labs/hc-mistral-alpaca-merged](https://huggingface.co/parlance-labs/hc-mistral-alpaca-merged)
 
-
+This is how to use [AutoAWQ](https://github.com/casper-hansen/AutoAWQ/tree/main) to quantize the model.
 
 ```python
 from awq import AutoAWQForCausalLM
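The hunk above only shows the first line of the README's quantization snippet. As context, here is a minimal sketch of how such a model is typically quantized with AutoAWQ, using its documented `from_pretrained` / `quantize` / `save_quantized` calls; the output path and the `quant_config` values are assumptions for illustration, not taken from the original file.

```python
# Hedged sketch of an AutoAWQ 4-bit quantization flow.
# Paths and quant_config values are illustrative assumptions.
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

model_path = "parlance-labs/hc-mistral-alpaca-merged"  # source (full-precision) model
quant_path = "hc-mistral-alpaca-merged-awq"            # assumed output directory
quant_config = {"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM"}

# Load the full-precision model and its tokenizer.
model = AutoAWQForCausalLM.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

# Run AWQ calibration and quantize the weights to 4 bits.
model.quantize(tokenizer, quant_config=quant_config)

# Save the quantized weights and tokenizer.
model.save_quantized(quant_path)
tokenizer.save_pretrained(quant_path)
```

The saved checkpoint can then be loaded for inference with `AutoAWQForCausalLM.from_quantized(quant_path)`.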