anakin87 committed
Commit de9b21f • 1 Parent(s): eb9c286

more eval + refinements to code example

Files changed (1)
  1. README.md +15 -3
README.md CHANGED
@@ -31,13 +31,25 @@ Check out the [📖 full walkthrough article](https://huggingface.co/blog/anakin
 
 ## 🏆 Evaluation
 
+*Open ITA LLM Leaderboard*
+
 | Model | Parameters | Average | MMLU_IT | ARC_IT | HELLASWAG_IT |
 | ------------------------------------- | ---------- | ------- | ------- | ------ | ------------ |
 | **anakin87/Phi-3.5-mini-ITA** | **3.82 B** |**57.67** | 59.93 | 51.5 | 61.57 |
 | meta-llama/Meta-Llama-3.1-8B-Instruct | 8.03 B | 56.97 | 58.43 | 48.42 | 64.07 |
 | microsoft/Phi-3.5-mini-instruct | 3.82 B | 56.82 | 60.03 | 49.19 | 61.25 |
 
-For a detailed comparison of model performance, check out the [Leaderboard for Italian Language Models](https://huggingface.co/spaces/FinancialSupport/open_ita_llm_leaderboard).
+[Details](https://huggingface.co/spaces/mii-llm/open_ita_llm_leaderboard)
+
+*Pinocchio ITA Leaderboard*
+
+| Model | Parameters | Average |
+| ------------------------------------- | ---------- | ------- |
+| **anakin87/Phi-3.5-mini-ITA** | **3.82 B** | **57.95** |
+| meta-llama/Meta-Llama-3.1-8B-Instruct | 8.03 B | 56.93 |
+
+[Details](https://huggingface.co/spaces/mii-llm/pinocchio_ita_leaderboard)
+
 
 ## 🎮 Model in action
 ### Demo
@@ -54,7 +66,7 @@ Read [this discussion](https://huggingface.co/microsoft/Phi-3.5-mini-instruct/di
 ```python
 # pip install transformers accelerate
 import torch
-from transformers import pipeline
+from transformers import pipeline, AutoModelForCausalLM, AutoTokenizer
 
 model_id="anakin87/Phi-3.5-mini-ITA"
 
@@ -71,7 +83,7 @@ pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
 
 user_input = "Puoi spiegarmi brevemente la differenza tra imperfetto e passato prossimo in italiano e quando si usano?"
 messages = [{"role": "user", "content": user_input}]
-outputs = pipe(prompt, max_new_tokens=500, do_sample=True, temperature=0.001)
+outputs = pipe(user_input, max_new_tokens=500, do_sample=True, temperature=0.001)
 print(outputs[0]["generated_text"])
 ```
 
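
For reference, here is how the refined snippet would read end to end after this commit — a minimal sketch, not the authoritative model card text: the `from_pretrained()` loading arguments (`torch_dtype`, `device_map`) are assumptions, since the diff only shows the imports, the pipeline construction, and the generation call.

```python
# Sketch of the full example as it would read after this commit.
# The from_pretrained() arguments below are assumptions: the diff only shows
# the imports, the pipeline construction, and the generation call.
import torch
from transformers import pipeline, AutoModelForCausalLM, AutoTokenizer

model_id = "anakin87/Phi-3.5-mini-ITA"

# Assumed loading step (dtype and device placement are not shown in the diff)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)

user_input = "Puoi spiegarmi brevemente la differenza tra imperfetto e passato prossimo in italiano e quando si usano?"
messages = [{"role": "user", "content": user_input}]
# As in the commit, the raw string is passed; passing `messages` instead would
# apply the model's chat template.
outputs = pipe(user_input, max_new_tokens=500, do_sample=True, temperature=0.001)
print(outputs[0]["generated_text"])
```

The near-zero `temperature=0.001` combined with `do_sample=True` keeps the sampling code path while making generation effectively deterministic.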