Update README.md
README.md
CHANGED
@@ -15,11 +15,15 @@ The following sample assumes that the setup on the above page has been completed
This model has only been tested on RyzenAI for Windows 11. It does not work in Linux environments such as WSL.

2024/07/30
- [Ryzen AI Software 1.2](https://ryzenai.docs.amd.com/en/latest/) has been released. Please note that this model is based on [Ryzen AI Software 1.1](https://ryzenai.docs.amd.com/en/1.1/index.html).
- [amd/RyzenAI-SW 1.2](https://github.com/amd/RyzenAI-SW) was announced on July 29, 2024. This sample is for [amd/RyzenAI-SW 1.1](https://github.com/amd/RyzenAI-SW/tree/1.1). Please note that the folder and script contents have been completely changed.

2024/08/04
- This model was created with the 1.1 driver, but it has been confirmed to work with the 1.2 driver as well. See the setup for the 1.2 driver below.

### Setup for the 1.1 driver
In a cmd window:
```
conda activate ryzenai-transformers
@@ -114,6 +118,34 @@ if __name__ == "__main__":
    print(translation("Translate English to Japanese.", "Join me, and together we can rule the galaxy as father and son."))
```

### Setup for the 1.2 driver

The 1.2 setup may not work even if you follow the instructions, so here are some tips on getting it to run.
For the first half, see [Appendix: Tips for running Ryzen AI Software 1.2 in Running LLM on AMD NPU Hardware](https://www.hackster.io/gharada2013/running-llm-on-amd-npu-hardware-19322f).

Then:
- Uninstall VC 2019.
  I am not sure whether this is the cause, but compilation sometimes failed while VC 2019 was installed.
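
A quick way to see which 2019-era Visual Studio / VC components are actually installed before removing them (the removal itself goes through the Visual Studio Installer or Apps & Features); `winget` availability is assumed here:

```
REM List installed packages whose names mention 2019 (VS 2019, VC++ 2019 runtimes, etc.).
winget list 2019
```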

- Delete the previous virtual environment for 1.1.
  This may not be necessary, but do it just to be sure.
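
A minimal sketch for removing the old environment from a cmd window, assuming it was created under the name `ryzenai-transformers` used earlier in this README (adjust the name if yours differs):

```
REM Confirm the name of the old 1.1 environment first.
conda env list

REM Remove that environment and all of its packages.
conda remove --name ryzenai-transformers --all
```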

- Follow the instructions in [LLMs on RyzenAI with Pytorch](https://github.com/amd/RyzenAI-SW/blob/main/example/transformers/models/llm/docs/README.md) and create the conda environment.
  After creating the Z: drive and compiling, delete the Z: drive before running the script; otherwise an error will occur because the modules cannot be found.
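
How the Z: drive was created depends on how you followed those instructions; assuming it was mapped with `subst`, it can be removed like this before running the script:

```
REM Show current drive substitutions.
subst

REM Remove the Z: mapping (only applies if Z: was created with subst).
subst Z: /D
```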

- Add PYTHONPATH manually in your cmd window:
```
set PYTHONPATH=%PYTHONPATH%;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\tools;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\ops\python;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\models\llm\chatglm3;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\ext\llm-awq\awq\quantize;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\ext\smoothquant\smoothquant;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\ext\llm-awq\awq\utils
```
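
The one-liner above is easy to mistype, so here is an equivalent sketch that builds the same PYTHONPATH in smaller steps (the ROOT helper variable is only for illustration) and then checks that Python sees the added entries:

```
REM Helper variable for the transformers example tree (illustration only).
set ROOT=<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers

set PYTHONPATH=%PYTHONPATH%;%ROOT%\tools;%ROOT%\ops\python
set PYTHONPATH=%PYTHONPATH%;%ROOT%\models\llm\chatglm3;%ROOT%\ext\llm-awq\awq\quantize
set PYTHONPATH=%PYTHONPATH%;%ROOT%\ext\smoothquant\smoothquant;%ROOT%\ext\llm-awq\awq\utils

REM Quick check that the entries show up on Python's module search path.
python -c "import sys; print('\n'.join(sys.path))"
```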

- Copy [modeling_llama_amd.py](https://github.com/amd/RyzenAI-SW/blob/1.1/example/transformers/models/llama2/modeling_llama_amd.py) from the version 1.1 tree.
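
If you no longer have a 1.1 checkout, one way to grab just that file is shown below; the raw URL is derived from the link above, and the destination (the folder you run the sample script from) is an assumption:

```
REM Download modeling_llama_amd.py from the 1.1 tag of amd/RyzenAI-SW
REM into the current directory (assumed to be where the sample script runs).
curl -L -o modeling_llama_amd.py https://raw.githubusercontent.com/amd/RyzenAI-SW/1.1/example/transformers/models/llama2/modeling_llama_amd.py
```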

Please set the appropriate runtime type for 1.2. There are no changes to the model download or the sample scripts.
Good luck.

![chat_image](alma-v3.png)

## Acknowledgements