You can use the gguf model with llama.cpp on a CPU-only machine.
However, the gguf model may be a little slower than GPTQ, especially for long text.
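As a rough sketch, a CPU-only run with llama.cpp looks like the following (assuming llama.cpp builds cleanly on your machine; `model.gguf` is a placeholder for the downloaded gguf file name):

```shell
# Build llama.cpp from source and run the gguf model on CPU.
# "./models/model.gguf" is a placeholder path, not the actual file name.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make
./main -m ./models/model.gguf -n 128 -p "Your prompt here"
```

The `-n` flag limits the number of generated tokens; longer prompts and outputs are where the speed gap versus GPTQ is most noticeable.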
# How to run.

## Local PC
You can use [text-generation-webui](https://github.com/oobabooga/text-generation-webui) to run this model fast (about 16 tokens/s on my RTX 3060) on your local PC.

An explanation of [how to install text-generation-webui in Japanese is here](https://webbigdata.jp/post-19926/).
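For reference, a minimal manual setup sketch (the repository's one-click installers are an alternative; the `--model` value is a placeholder for the downloaded model directory):

```shell
# Manual install of text-generation-webui and launch with a local model.
# "your-model-dir" is a placeholder for the model folder under text-generation-webui/models.
git clone https://github.com/oobabooga/text-generation-webui
cd text-generation-webui
pip install -r requirements.txt
python server.py --model your-model-dir
```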
### colab with GUI
You can try this model interactively in the free version of Colab.

[weblab-10b-instruction-sft-GPTQ-text-generation-webui-colab](https://github.com/webbigdata-jp/python_sample/blob/main/weblab_10b_instruction_sft_GPTQ_text_generation_webui_colab.ipynb)
![text-generation-webui-sample](./text-generation-webui-colab-sample.png "text-generation-webui-colab")

### colab simple sample code
Currently, the model may behave differently on a local PC and on Colab. On Colab, the model may not respond if you include instructional prompts.

[Colab Sample script](https://github.com/webbigdata-jp/python_sample/blob/main/weblab_10b_instruction_sft_GPTQ_sample.ipynb)
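The notebook above covers the details; as a hedged sketch of how a GPTQ model is typically loaded with the `auto-gptq` library (the repository id below is an assumption based on this model's name, and a CUDA GPU is required):

```python
# Sketch: load a GPTQ-quantized model with auto-gptq and generate text.
# Requires a CUDA GPU; the repository id is an assumption, not confirmed by this README.
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

repo = "webbigdata/weblab-10b-instruction-sft-GPTQ"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoGPTQForCausalLM.from_quantized(repo, use_safetensors=True, device="cuda:0")

inputs = tokenizer("こんにちは", return_tensors="pt").to("cuda:0")
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

If the model stops responding on Colab, try a plain prompt without instruction-style framing, as noted above.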