Fix broken link in README
README.md (changed)
@@ -69,7 +69,7 @@ for seq in sequences:
 
 💥 **Falcon LLMs require PyTorch 2.0 for use with `transformers`!**
 
-For fast inference with Falcon, check-out [Text Generation Inference](https://github.com/huggingface/text-generation-inference)! Read more in this [blogpost](
+For fast inference with Falcon, check-out [Text Generation Inference](https://github.com/huggingface/text-generation-inference)! Read more in this [blogpost](https://huggingface.co/blog/falcon).
 
 You will need **at least 16GB of memory** to swiftly run inference with Falcon-7B-Instruct.
 
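The hunk context (`for seq in sequences:`) comes from the model card's `transformers` inference snippet just above the edited paragraph. For reference, a minimal sketch of that kind of pipeline usage is shown below; the model id `tiiuae/falcon-7b-instruct`, the prompt, and the generation parameters are assumptions for illustration, not the card's exact values.

```python
# Minimal sketch of running Falcon-7B-Instruct with the transformers pipeline.
# Assumes PyTorch 2.0+, roughly 16GB of memory, and the model id below (an assumption).
from transformers import AutoTokenizer, pipeline
import torch

model = "tiiuae/falcon-7b-instruct"  # assumed model id

tokenizer = AutoTokenizer.from_pretrained(model)
generator = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,  # halves memory use vs. float32
    device_map="auto",           # spread layers across available GPU(s)/CPU
)

sequences = generator(
    "Write a short poem about falcons.",  # placeholder prompt
    max_new_tokens=100,
    do_sample=True,
    top_k=10,
    num_return_sequences=1,
    eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
    print(seq["generated_text"])
```

For heavier serving workloads, the Text Generation Inference server linked in the diff is the path the card points to for fast inference.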