taidopurason committed
Commit d7e6c18
1 Parent(s): ba605e4

Added a working example for using the model

Files changed (1)
  1. README.md +22 -1
README.md CHANGED
@@ -19,7 +19,28 @@ Alpaca-est is an instruction dataset generated for Estonian with *gpt-3.5-turbo-
 
 ### Using the model
 
-Using the model in a conversational pipeline:
+
+Using the model in a text-generation pipeline:
+```
+from transformers import pipeline
+import torch
+
+pipe = pipeline("text-generation", model="tartuNLP/Llammas", torch_dtype=torch.bfloat16, device_map="auto")
+
+messages = [
+    {"role": "user", "content": "Tere!"},
+    {"role": "assistant", "content": "Tere! Kas saaksin teid kuidagi aidata?"},
+    {"role": "user", "content": "Kuidas alustada kirja kirjutamist?"}
+]
+
+prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
+outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.6, top_k=50, top_p=0.9)
+print(outputs[0]["generated_text"][len(prompt):])
+```
+
+
+Using the model in a conversational pipeline (works with transformers==4.36.2, issues with output in newer versions):
 ```
 from transformers import pipeline, Conversation
 import torch
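The added README text notes that the conversational example works with transformers==4.36.2 but has output issues in newer versions. One way to reproduce the working setup (a suggestion for readers, not part of this commit) is to pin the dependency before running the example:

```shell
# Pin transformers to the version the commit says the conversational pipeline works with
pip install transformers==4.36.2
```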