CyberNative committed
Commit 0404c95
1 Parent(s): 3932ee9

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -8,7 +8,7 @@ license: llama2
  CyberBase 13b 8k *base model* - (llama-2-13b - lmsys/vicuna-13b-v1.5-16k)

  # Base cybersecurity model for future fine-tuning; it is not recommended to use it on its own.
- - **CyberBase** is a [lmsys/vicuna-13b-v1.5-16k](https://huggingface.co/lmsys/vicuna-13b-v1.5-16k) QLoRA fine-tuned on [CyberNative/github_cybersecurity_READMEs](https://huggingface.co/datasets/CyberNative/github_cybersecurity_READMEs)
+ - **CyberBase** is a [lmsys/vicuna-13b-v1.5-16k](https://huggingface.co/lmsys/vicuna-13b-v1.5-16k) QLoRA fine-tuned on [CyberNative/github_cybersecurity_READMEs](https://huggingface.co/datasets/CyberNative/github_cybersecurity_READMEs) with a single 3090.
  - It might, therefore, inherit the [prompt template of FastChat](https://github.com/lm-sys/FastChat/blob/main/docs/vicuna_weights_version.md#prompt-template)
  - **sequence_len:** 8192
  - **lora_r:** 128
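
Since the card notes that CyberBase may inherit the FastChat/Vicuna v1.5 prompt template, a minimal inference sketch follows. The repo id `CyberNative/CyberBase-13b`, the exact template wording, and the generation settings are assumptions for illustration, not taken from this commit.

```python
# Minimal inference sketch. Assumptions: a Vicuna-v1.5-style prompt template and
# a hypothetical repo id "CyberNative/CyberBase-13b" -- check the actual model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CyberNative/CyberBase-13b"  # assumption: the real repo id may differ
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# FastChat / Vicuna v1.5 conversation format the card says the model may inherit.
prompt = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions. "
    "USER: Explain what a buffer overflow is. ASSISTANT:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```

Given the listed **sequence_len** of 8192, prompts up to roughly 8k tokens should fit in context; longer inputs would need truncation.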