---
license: apache-2.0
language:
- fr
- en
- zh
widget:
- text: "<s> [|User|] Comment faire un bon plat ? </s>[|Assistant|]"
---
Merging models to make a Potato. I'm not sure about it yet and might delete it later.

This is a merge of MiniMerlin via task arithmetic, using mergekit.
There was no goal beyond the merge itself, but the outcome is interesting. I may need to fine-tune it further.
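
Task arithmetic builds a merge by taking the parameter difference between a fine-tuned model and its base (the "task vector") and adding it back onto the base, scaled. The snippet below is a minimal conceptual sketch of that idea in plain PyTorch, not the actual mergekit pipeline; the model ids and the `weight` scaling factor are placeholders.

```python
# Conceptual sketch of task arithmetic in plain PyTorch (not the mergekit pipeline).
# The model ids and the scaling weight are placeholders, not the recipe used here.
import torch
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("base-model-id", torch_dtype=torch.bfloat16)
tuned = AutoModelForCausalLM.from_pretrained("fine-tuned-model-id", torch_dtype=torch.bfloat16)

weight = 1.0  # scaling applied to the task vector
merged_state = base.state_dict()
tuned_state = tuned.state_dict()

for name, base_param in merged_state.items():
    if not torch.is_floating_point(base_param):
        continue  # leave integer buffers untouched
    # Task vector = fine-tuned weights minus base weights; add it back, scaled.
    merged_state[name] = base_param + weight * (tuned_state[name] - base_param)

base.load_state_dict(merged_state)
base.save_pretrained("merged-model")
```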

Fine-tuned on more French data (Merlin).
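
The fine-tuning pass itself isn't documented in this card. For illustration, here is a minimal LoRA sketch with peft and the transformers Trainer of what such a pass could look like; the base model id, dataset, and hyperparameters are assumptions, not the actual training recipe.

```python
# Hypothetical LoRA fine-tuning sketch; the dataset, base model id and hyperparameters
# are placeholders, not the settings actually used for this model.
import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base_id = "teilomillet/MiniMerlin-3B"  # assumed base model, replace as needed
tokenizer = AutoTokenizer.from_pretrained(base_id)
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16)
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"))

# Any French instruction dataset with a "text" column formatted in the
# "<s> [|User|] ... </s>[|Assistant|] ... </s>" chat style would fit here.
dataset = load_dataset("some/french-dataset", split="train")
tokenized = dataset.map(
    lambda ex: tokenizer(ex["text"], truncation=True, max_length=1024),
    remove_columns=dataset.column_names,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="merlin-ft", per_device_train_batch_size=2,
                           num_train_epochs=1, learning_rate=2e-4, bf16=True),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("merlin-ft-adapter")  # saves only the LoRA adapter weights
```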

I think this is the best French 3B model. Try it.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Load the merged model in bfloat16 and let accelerate place it on available devices.
model = AutoModelForCausalLM.from_pretrained(
    "teilomillet/Potato-3B",
    revision="0.1",
    return_dict=True,
    torch_dtype=torch.bfloat16,
    device_map='auto'
)

tokenizer = AutoTokenizer.from_pretrained("teilomillet/Potato-3B")
tokenizer.pad_token = tokenizer.eos_token

# The prompt follows the [|User|] ... </s>[|Assistant|] chat format shown in the widget above.
text = "[|User|] Comment faire un bon plat ? </s>[|Assistant|]"
inputs = tokenizer(text, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=800)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```



#merge