pooja-ganesh committed
Commit: ac621b8
Parent(s): 7a66971

Update README.md

Files changed (1): README.md (+2 -2)
README.md CHANGED
@@ -17,9 +17,9 @@ tags:
  - ## Introduction
  This model was created using Quark Quantization, followed by OGA Model Builder, and finalized with post-processing for NPU deployment.
  - ## Quantization Strategy
- - AWQ / Group 128 / Asymmetric / BF16 activations
+ - AWQ / Group 128 / Asymmetric / BF16 activations / UINT4 Weights
  - ## Quick Start
- For quickstart, refer to AMD [RyzenAI-SW-EA](https://account.amd.com/en/member/ryzenai-sw-ea.html)
+ For quickstart, refer to npu-llm-artifacts_1.3.0.zip available in [RyzenAI-SW-EA](https://account.amd.com/en/member/ryzenai-sw-ea.html)

  #### Evaluation scores
  The perplexity measurement is run on the wikitext-2-raw-v1 (raw data) dataset provided by Hugging Face. Perplexity score measured for prompt length 2k is 6.988726.
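
The quantization strategy line in the diff above can be illustrated with a small, generic sketch of group-wise (group size 128) asymmetric UINT4 weight quantization. This is plain NumPy for illustration only; it is not Quark's API, and the AWQ activation-aware scale search that precedes quantization is omitted.

```python
# Illustrative sketch only: group-wise asymmetric UINT4 quantization (group size 128).
# Not Quark's implementation; the AWQ scale-search step is omitted.
import numpy as np

def quantize_uint4_groups(w: np.ndarray, group_size: int = 128):
    """Quantize a flat FP32 weight tensor to UINT4 with per-group scale and zero point."""
    w = w.reshape(-1, group_size)
    w_min = w.min(axis=1, keepdims=True)
    w_max = w.max(axis=1, keepdims=True)
    scale = np.maximum((w_max - w_min) / 15.0, 1e-8)      # UINT4 range is [0, 15]
    zero_point = np.clip(np.round(-w_min / scale), 0, 15)  # asymmetric: per-group zero point
    q = np.clip(np.round(w / scale) + zero_point, 0, 15).astype(np.uint8)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    return (q.astype(np.float32) - zero_point) * scale

w = np.random.randn(4096).astype(np.float32)
q, scale, zp = quantize_uint4_groups(w)
print("max abs reconstruction error:", np.abs(dequantize(q, scale, zp).reshape(-1) - w).max())
```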
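
For reference, a wikitext-2-raw-v1 perplexity at a fixed prompt length is typically obtained by chunking the concatenated test split into fixed-length windows and exponentiating the mean token-level loss. The sketch below shows that generic procedure with Hugging Face transformers/datasets; the model id is a placeholder, and this is not the NPU/OGA evaluation flow used to produce the 6.988726 score reported above.

```python
# Generic chunked-perplexity sketch (2k window) on wikitext-2-raw-v1.
# Placeholder model id; not the actual NPU evaluation pipeline for this checkpoint.
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "facebook/opt-125m"  # placeholder: substitute the checkpoint under evaluation
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)
model.eval()

# Tokenize the raw test split as one long token stream.
test = load_dataset("wikitext", "wikitext-2-raw-v1", split="test")
ids = tokenizer("\n\n".join(test["text"]), return_tensors="pt").input_ids

seq_len = 2048  # "prompt length 2k"
nlls = []
for start in range(0, ids.size(1) - seq_len, seq_len):
    chunk = ids[:, start : start + seq_len]
    with torch.no_grad():
        # labels == input_ids: the model returns the mean next-token NLL for the chunk
        loss = model(chunk, labels=chunk).loss
    nlls.append(loss.float())

print("perplexity:", torch.exp(torch.stack(nlls).mean()).item())
```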