gmonsoon committed on
Commit f0a3eb9 · verified · 1 parent: 66fc012

Update README.md

Files changed (1): README.md +9 -59
README.md CHANGED
@@ -1,64 +1,14 @@
  ---
- tags:
- - merge
- - mergekit
- - lazymergekit
- - abacaj/phi-2-super
- - abacaj/phi-2-super
- - abacaj/phi-2-super
- - abacaj/phi-2-super
- - abacaj/phi-2-super
- - abacaj/phi-2-super
- - abacaj/phi-2-super
- base_model:
- - abacaj/phi-2-super
- - abacaj/phi-2-super
- - abacaj/phi-2-super
- - abacaj/phi-2-super
- - abacaj/phi-2-super
- - abacaj/phi-2-super
- - abacaj/phi-2-super
+ license: apache-2.0
+ language:
+ - en
+ pipeline_tag: text-generation
  ---
 
- # phi-2-medium
-
- phi-2-medium is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
- * [abacaj/phi-2-super](https://huggingface.co/abacaj/phi-2-super)
- * [abacaj/phi-2-super](https://huggingface.co/abacaj/phi-2-super)
- * [abacaj/phi-2-super](https://huggingface.co/abacaj/phi-2-super)
- * [abacaj/phi-2-super](https://huggingface.co/abacaj/phi-2-super)
- * [abacaj/phi-2-super](https://huggingface.co/abacaj/phi-2-super)
- * [abacaj/phi-2-super](https://huggingface.co/abacaj/phi-2-super)
- * [abacaj/phi-2-super](https://huggingface.co/abacaj/phi-2-super)
-
- ## 🧩 Configuration
-
- ```yaml
- dtype: bfloat16
- merge_method: passthrough
- slices:
- - sources:
-   - layer_range: [0, 8]
-     model: abacaj/phi-2-super
- - sources:
-   - layer_range: [4, 12]
-     model: abacaj/phi-2-super
- - sources:
-   - layer_range: [8, 16]
-     model: abacaj/phi-2-super
- - sources:
-   - layer_range: [12, 20]
-     model: abacaj/phi-2-super
- - sources:
-   - layer_range: [16, 24]
-     model: abacaj/phi-2-super
- - sources:
-   - layer_range: [20, 28]
-     model: abacaj/phi-2-super
- - sources:
-   - layer_range: [24, 32]
-     model: abacaj/phi-2-super
- ```
+ license: apache-2.0
+ language:
+ - en
+ ---
 
  ## 💻 Usage
 
@@ -69,7 +19,7 @@ from transformers import AutoTokenizer
  import transformers
  import torch
 
- model = "gmonsoon/phi-2-medium"
+ model = "gmonsoon/Delta-4B-Base"
  messages = [{"role": "user", "content": "What is a large language model?"}]
 
  tokenizer = AutoTokenizer.from_pretrained(model)
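As committed, the new README opens with two back-to-back YAML metadata blocks (the second repeats `license` and `language`); the Hub parses only the first block it finds. A single consolidated front matter carrying the same metadata would look like this (a sketch, not part of the commit):

```yaml
---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
---
```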
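The removed passthrough configuration explains why a merge of a single 2.7B model could be billed as a larger ("medium"/4B-class) model: passthrough concatenates the listed layer slices, so the merged depth is the sum of the slice widths, not the union of the ranges. A minimal sketch of that arithmetic (plain Python, no mergekit needed; the base depth of 32 decoder layers is phi-2's published configuration):

```python
# Layer slices from the deleted mergekit passthrough config: seven 8-layer
# windows over abacaj/phi-2-super, each overlapping the previous one by 4.
slices = [(0, 8), (4, 12), (8, 16), (12, 20), (16, 24), (20, 28), (24, 32)]

# passthrough stacks the slices verbatim, so the merged model's depth is
# the sum of the slice widths (duplicated layers count every time).
merged_depth = sum(end - start for start, end in slices)
base_depth = 32  # phi-2 has 32 decoder layers

print(merged_depth)               # 56
print(merged_depth / base_depth)  # 1.75x the base depth
```

The overlapping windows mean roughly half of the layers appear twice in the merged stack, which is what pushes the parameter count from ~2.7B toward the ~4B implied by the new model name.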