jpacifico committed
Commit 0ab2cd1
Parent(s): ef0b0a5

Update README.md

Files changed (1):
  1. README.md +13 -31
README.md CHANGED
@@ -2,42 +2,24 @@
 tags:
 - merge
 - mergekit
-- lazymergekit
-- bofenghuang/vigostral-7b-chat
-- jpacifico/French-Alpaca-7B-Instruct-beta
+- french
+- french-alpaca
 base_model:
-- bofenghuang/vigostral-7b-chat
 - jpacifico/French-Alpaca-7B-Instruct-beta
+- bofenghuang/vigostral-7b-chat
 ---
 
-# French-Vigalpaca-7B-ties
-
-French-Vigalpaca-7B-ties is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
-* [bofenghuang/vigostral-7b-chat](https://huggingface.co/bofenghuang/vigostral-7b-chat)
-* [jpacifico/French-Alpaca-7B-Instruct-beta](https://huggingface.co/jpacifico/French-Alpaca-7B-Instruct-beta)
-
-## 🧩 Configuration
-
-```yaml
-models:
-  - model: mistralai/Mistral-7B-Instruct-v0.2
-    # no parameters necessary for base model
-  - model: bofenghuang/vigostral-7b-chat
-    parameters:
-      density: 0.5
-      weight: 0.5
-  - model: jpacifico/French-Alpaca-7B-Instruct-beta
-    parameters:
-      density: 0.5
-      weight: 0.3
-merge_method: ties
-base_model: mistralai/Mistral-7B-Instruct-v0.2
-parameters:
-  normalize: true
-dtype: float16
+# French-Vigalpaca-7B-slerp
+
+French-Vigalpaca-7B-slerp is a merge of the following models:
+* jpacifico/French-Alpaca-7B-Instruct-beta
+* bofenghuang/vigostral-7b-chat
+
+Base model: jpacifico/French-Alpaca-7B-Instruct-beta
+
 ```
 
-## 💻 Usage
+## Usage
 
 ```python
 !pip install -qU transformers accelerate
@@ -46,7 +28,7 @@ from transformers import AutoTokenizer
 import transformers
 import torch
 
-model = "jpacifico/French-Vigalpaca-7B-ties"
+model = "jpacifico/French-Vigalpaca-7B-slerp"
 messages = [{"role": "user", "content": "What is a large language model?"}]
 
 tokenizer = AutoTokenizer.from_pretrained(model)
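The removed configuration used mergekit's `ties` merge method, which trims each fine-tuned model's task vector to its largest-magnitude entries, elects a sign per parameter, and averages the values that agree with that sign. For intuition only, a toy numpy sketch of those steps (not mergekit's actual implementation):

```python
import numpy as np

def ties_merge(task_vectors, density=0.5):
    """Toy TIES-style merge: trim, elect signs, disjoint mean.

    task_vectors: list of 1-D arrays (fine-tuned minus base weights).
    density: fraction of largest-magnitude entries kept per vector
    (plays the role of `density: 0.5` in the removed YAML).
    Illustration only -- not mergekit's implementation.
    """
    trimmed = []
    for tv in task_vectors:
        k = max(1, int(round(density * tv.size)))
        cutoff = np.sort(np.abs(tv))[-k]           # k-th largest magnitude
        trimmed.append(np.where(np.abs(tv) >= cutoff, tv, 0.0))
    trimmed = np.stack(trimmed)
    elected = np.sign(trimmed.sum(axis=0))         # elected sign per parameter
    agree = (np.sign(trimmed) == elected) & (trimmed != 0)
    counts = np.maximum(agree.sum(axis=0), 1)      # avoid division by zero
    return (trimmed * agree).sum(axis=0) / counts  # mean of agreeing entries

merged = ties_merge([np.array([0.9, -0.1, 0.5, 0.0]),
                     np.array([0.8, 0.7, -0.6, 0.1])])
# merged ≈ [0.85, 0.7, 0.5, 0.0]
```

The `weight` values in the removed YAML would additionally scale each task vector before merging; the sketch omits that for brevity.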
 
The updated README.md after this commit:

---
tags:
- merge
- mergekit
- french
- french-alpaca
base_model:
- jpacifico/French-Alpaca-7B-Instruct-beta
- bofenghuang/vigostral-7b-chat
---

# French-Vigalpaca-7B-slerp

French-Vigalpaca-7B-slerp is a merge of the following models:
* jpacifico/French-Alpaca-7B-Instruct-beta
* bofenghuang/vigostral-7b-chat

Base model: jpacifico/French-Alpaca-7B-Instruct-beta

## Usage

```python
!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "jpacifico/French-Vigalpaca-7B-slerp"
messages = [{"role": "user", "content": "What is a large language model?"}]

tokenizer = AutoTokenizer.from_pretrained(model)
```
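The new model's name indicates a `slerp` merge, i.e. spherical linear interpolation between the two models' weight tensors. A minimal sketch of slerp on flattened tensors, assuming numpy (mergekit's real implementation works per tensor and supports layer-dependent interpolation factors):

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between flattened tensors a and b.

    Toy sketch of the technique behind `merge_method: slerp`.
    """
    a_n = a / (np.linalg.norm(a) + eps)
    b_n = b / (np.linalg.norm(b) + eps)
    dot = np.clip(np.dot(a_n, b_n), -1.0, 1.0)
    theta = np.arccos(dot)                  # angle between the two directions
    if theta < eps:                         # nearly parallel: fall back to lerp
        return (1.0 - t) * a + t * b
    s = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / s) * a + (np.sin(t * theta) / s) * b

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])
mid = slerp(0.5, x, y)                      # midpoint on the arc, ≈ [0.707, 0.707]
```

Unlike plain linear interpolation, slerp follows the arc between the two weight directions, which preserves the interpolated tensor's norm better when the endpoints point in different directions.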