This model has only been tested on RyzenAI for Windows 11. It does not work in Linux.
- [amd/RyzenAI-SW 1.2](https://github.com/amd/RyzenAI-SW) was announced on July 29, 2024. This sample is for [amd/RyzenAI-SW 1.1](https://github.com/amd/RyzenAI-SW/tree/1.1). Please note that the folder and script contents have been completely changed between the two releases.
2024/08/04
- This model was created with the 1.1 driver, but it has been confirmed to work with 1.2. Please check the [setup for 1.2 driver](https://huggingface.co/dahara1/llama-translate-amd-npu).
### setup for 1.1 driver
In a cmd window:
if __name__ == "__main__":
```
### setup for 1.2 driver
The 1.2 setup may not work even if you follow the official instructions, so here are some tips for getting it running.
For the first half, see [Appendix: Tips for running Ryzen AI Software 1.2 in Running LLM on AMD NPU Hardware](https://www.hackster.io/gharada2013/running-llm-on-amd-npu-hardware-19322f).
Then,
- Uninstall VC 2019
I'm not sure whether this was the cause, but compilation sometimes failed while VC 2019 was installed.
- Delete the previous virtual environment for 1.1
This may not be necessary, but do it just to be sure.
- Follow the instructions in [LLMs on RyzenAI with Pytorch](https://github.com/amd/RyzenAI-SW/blob/main/example/transformers/models/llm/docs/README.md) and create a conda environment.
After creating the Z drive and compiling, delete the Z drive before running the script; otherwise the script fails with a module-not-found error.
- Add PYTHONPATH manually in your CMD window.
```
set PYTHONPATH=%PYTHONPATH%;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\tools;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\ops\python;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\models\llm\chatglm3;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\ext\llm-awq\awq\quantize;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\ext\smoothquant\smoothquant;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\ext\llm-awq\awq\utils
```
- Copy [modeling_llama_amd.py](https://github.com/amd/RyzenAI-SW/blob/1.1/example/transformers/models/llama2/modeling_llama_amd.py) from the version 1.1 tree.
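
The PYTHONPATH and module-not-found tips above can be sanity-checked with a short script. This is only a sketch: the temporary directories stand in for the RyzenAI-SW paths in the `set PYTHONPATH` command, since `PYTHONPATH` entries are added to Python's import search path and a missing entry is exactly what produces the module-not-found error.

```python
import os
import subprocess
import sys
import tempfile

# Throwaway directories standing in for the RyzenAI-SW tools/ops directories.
d1 = tempfile.mkdtemp()
d2 = tempfile.mkdtemp()

# Mirror the `set PYTHONPATH=...` step for a child interpreter.
env = dict(os.environ)
env["PYTHONPATH"] = os.pathsep.join([d1, d2])

# The child lists its import search path; both entries must be present,
# otherwise imports from those directories raise ModuleNotFoundError.
out = subprocess.run(
    [sys.executable, "-c", r"import sys; print('\n'.join(sys.path))"],
    env=env, capture_output=True, text=True,
).stdout
print(d1 in out and d2 in out)  # expected: True
```

If a module still cannot be found after setting PYTHONPATH, check that none of the entries point at the deleted Z drive.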
Please set the runtime type appropriate for 1.2. There are no changes to the model download or the sample scripts.
Good luck.
![chat_image](alma-v3.png)