notbdq committed
Commit 7cbbe85 (1 parent: b4c0583)

Update README.md

Files changed (1): README.md (+31 -2)

README.md CHANGED
@@ -18,9 +18,38 @@ datasets:
 
 - This model is a fine-tuned mistral-7b-instruct-v0.2 with the merve/turkish_instructions dataset.
 
 - example inference code:
 
 ```python
-def greet(name):
-    return f"Hello, {name}!"
 ```
 
 
 - This model is a fine-tuned mistral-7b-instruct-v0.2 with the merve/turkish_instructions dataset.
 
+ - Instruct format (Alpaca-style; in English: "Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request."):
+
+ "Aşağıda bir görevi tanımlayan bir talimat ve daha fazla bağlam sağlayan bir girdi bulunmaktadır. Talebi uygun şekilde tamamlayan bir yanıt yazın.
+
+ ### Talimat:
+ {}
+
+ ### Girdi:
+ {}
+
+ ### Yanıt:
+ {}"
+
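The template above is a plain format string with three slots (instruction, input, response). A minimal sketch of filling it with `str.format` — the instruction value here is illustrative, and the input and response slots are left empty for the model to complete:

```python
# Alpaca-style Turkish instruction template from the model card
TEMPLATE = """Aşağıda bir görevi tanımlayan bir talimat ve daha fazla bağlam sağlayan bir girdi bulunmaktadır. Talebi uygun şekilde tamamlayan bir yanıt yazın.

### Talimat:
{}

### Girdi:
{}

### Yanıt:
{}"""

# Fill the three slots: instruction, (optional) input, and an empty
# response slot that the model is expected to continue from.
prompt = TEMPLATE.format("Yapay zeka nasıl bulundu?", "", "")
print(prompt)
```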
 - example inference code:
 
 ```python
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ device = "cuda"  # the device to load the model onto
+
+ model = AutoModelForCausalLM.from_pretrained("notbdq/mistral-turkish-v2")
+ tokenizer = AutoTokenizer.from_pretrained("notbdq/mistral-turkish-v2")
+
+ messages = [
+     {"role": "user", "content": "Yapay zeka nasıl bulundu?"},  # "How was AI discovered?"
+ ]
+
+ # Tokenize with the model's chat template, then move model and inputs to the GPU
+ encodeds = tokenizer.apply_chat_template(messages, return_tensors="pt")
+ model_inputs = encodeds.to(device)
+ model.to(device)
+
+ generated_ids = model.generate(model_inputs, max_new_tokens=1000, do_sample=True)
+ decoded = tokenizer.batch_decode(generated_ids)
+ print(decoded[0])
 ```
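When prompting with the instruct format above instead of the chat template, the decoded output contains the whole filled template, so the answer has to be cut out after the final `### Yanıt:` marker. A minimal sketch — the sample string below is illustrative, not a real model generation:

```python
def extract_response(generated: str) -> str:
    """Return the text after the last '### Yanıt:' marker,
    with any trailing end-of-sequence token stripped."""
    response = generated.rsplit("### Yanıt:", 1)[-1]
    return response.replace("</s>", "").strip()

# Illustrative decoded output in the instruct format (not real model output)
sample = (
    "### Talimat:\nYapay zeka nasıl bulundu?\n\n"
    "### Girdi:\n\n"
    "### Yanıt:\nYapay zeka terimi 1956 Dartmouth konferansında ortaya atıldı.</s>"
)
print(extract_response(sample))
```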