zamroni111 committed on
Commit 3e58e6b · verified · 1 Parent(s): cd846c2

Update README.md

Files changed (1): README.md +6 -6
README.md CHANGED

@@ -38,7 +38,7 @@ In the tokenizer_config.json, the "unk_token" value is changed from null to ""
 <img src="https://zci.sourceforge.io/epub/dsllama31.png">
 
 ### Model Description
-meta-llama/Meta-Llama-3.1-8B-Instruct quantized to ONNX GenAI INT4 with Microsoft DirectML optimization<br>
+deepseek-ai/DeepSeek-R1-Distill-Llama-8B quantized to ONNX GenAI INT4 with Microsoft DirectML optimization<br>
 https://onnxruntime.ai/docs/genai/howto/install.html#directml
 
 Created using ONNX Runtime GenAI's builder.py<br>
@@ -58,19 +58,19 @@ It might not be working in ONNX execution provider other than DmlExecutionProvid
 The needed python scripts are included in this repository
 
 Prerequisites:<br>
-1. Install Python 3.10 from Windows Store:<br>
-https://apps.microsoft.com/detail/9pjpw5ldxlz5?hl=en-us&gl=US
+1. Install Python 3.11 from Windows Store:<br>
+https://apps.microsoft.com/search/publisher?name=Python+Software+Foundation
 
-2. Open command line cmd.exe
+3. Open command line cmd.exe
 
-3. Create python virtual environment, activate the environment then install onnxruntime-genai-directml<br>
+4. Create python virtual environment, activate the environment then install onnxruntime-genai-directml<br>
 mkdir c:\temp<br>
 cd c:\temp<br>
 python -m venv dmlgenai<br>
 dmlgenai\Scripts\activate.bat<br>
 pip install onnxruntime-genai-directml
 
-4. Use the onnxgenairun.py to get chat interface.<br>
+5. Use the onnxgenairun.py to get chat interface.<br>
 It is modified version of "https://github.com/microsoft/onnxruntime-genai/blob/main/examples/python/phi3-qa.py".<br>
 The modification makes the text output changes to new line after "., :, and ;" to make the output easier to be read.
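
The "Created using ONNX Runtime GenAI's builder.py" line in the diff corresponds to an invocation along these lines. This is a hedged sketch only: the source model name is taken from the diff, but the output path is a placeholder and the exact flags used for this commit are not recorded here.

```shell
:: Sketch of an ONNX Runtime GenAI model-builder run (assumed, not taken from the commit):
:: quantize the DeepSeek distill to INT4 with the DirectML execution provider.
:: The output directory name below is a placeholder.
python -m onnxruntime_genai.models.builder ^
    -m deepseek-ai/DeepSeek-R1-Distill-Llama-8B ^
    -o .\deepseek-r1-distill-llama-8b-int4-dml ^
    -p int4 ^
    -e dml
```

Run inside the activated `dmlgenai` virtual environment from the prerequisites, since the builder ships with the `onnxruntime-genai-directml` package.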
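
The readability tweak described in the new step 5 (a line break after ".", ":" and ";") can be sketched in plain Python. `break_after_punct` is a hypothetical name for illustration, not the actual code in onnxgenairun.py, which applies the idea to tokens as they stream out.

```python
def break_after_punct(text: str) -> str:
    """Insert a newline after '.', ':' and ';' so long model output
    is easier to read (sketch of the onnxgenairun.py modification)."""
    out = []
    for ch in text:
        out.append(ch)
        if ch in ".:;":
            out.append("\n")
    return "".join(out)

print(break_after_punct("First point. Second point; done:"))
```

In the streaming chat loop the same check would run on each decoded token chunk rather than on the full string.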