Commit 48041b9 by KingNish
Parent(s): 191be9d

Update README.md

Files changed (1): README.md (+6 −4)
README.md CHANGED
@@ -7,11 +7,13 @@ tags:
 base_model:
 - KingNish/CodeMaster-v1-7b
 - KingNish/CodeMaster-v1-7b
+license: mit
+pipeline_tag: text-generation
 ---
 
-# CodeMaster-v1-10b
+# CodeMaster-v1-9b
 
-CodeMaster-v1-10b is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
+CodeMaster-v1-9b is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
 * [KingNish/CodeMaster-v1-7b](https://huggingface.co/KingNish/CodeMaster-v1-7b)
 * [KingNish/CodeMaster-v1-7b](https://huggingface.co/KingNish/CodeMaster-v1-7b)
 
@@ -38,7 +40,7 @@ from transformers import AutoTokenizer
 import transformers
 import torch
 
-model = "KingNish/CodeMaster-v1-10b"
+model = "KingNish/CodeMaster-v1-9b"
 messages = [{"role": "user", "content": "What is a large language model?"}]
 
 tokenizer = AutoTokenizer.from_pretrained(model)
@@ -50,6 +52,6 @@ pipeline = transformers.pipeline(
 device_map="auto",
 )
 
-outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
+outputs = pipeline(prompt, max_new_tokens=8192, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
 print(outputs[0]["generated_text"])
 ```
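
For reference, the hunks above patch the README's inference snippet but omit the lines between them (the chat-template prompt and the pipeline construction). The sketch below reassembles the full snippet after this commit; the omitted lines are assumed from the common LazyMergekit README template rather than taken from the diff itself.

```python
# Sketch of the full usage snippet the hunks above patch. Only the lines
# visible in the diff come from this commit; the prompt construction and the
# pipeline() arguments other than device_map are assumed from the standard
# LazyMergekit README template.
from transformers import AutoTokenizer
import transformers
import torch

model = "KingNish/CodeMaster-v1-9b"
messages = [{"role": "user", "content": "What is a large language model?"}]

tokenizer = AutoTokenizer.from_pretrained(model)
# Assumed: render the chat messages into a single prompt string.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
# Assumed pipeline construction; device_map="auto" appears in the diff context.
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

# This commit raises max_new_tokens from 256 to 8192.
outputs = pipeline(prompt, max_new_tokens=8192, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```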