The language model Phi-1.5 is a Transformer with **1.3 billion** parameters. It was trained using the same data sources as [phi-1](https://huggingface.co/microsoft/phi-1), augmented with a new data source consisting of various synthetic NLP texts. When assessed against benchmarks testing common sense, language understanding, and logical reasoning, Phi-1.5 demonstrates nearly state-of-the-art performance among models with fewer than 10 billion parameters.

We've trained Microsoft Research's phi-1.5, a 1.3B-parameter model, on multi-turn conversation datasets.

## How to Use

Phi-1.5 has been integrated in `transformers` version 4.37.0.
The current `transformers` version can be verified with: `pip list | grep transformers`.
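
If you prefer to check from Python rather than the shell, a minimal sanity check (not part of the original card) could look like this:

```python
# Minimal version check, assuming transformers (and its packaging dependency) are installed.
import transformers
from packaging import version

assert version.parse(transformers.__version__) >= version.parse("4.37.0"), (
    f"Found transformers {transformers.__version__}; please upgrade to 4.37.0 or newer."
)
print(f"transformers {transformers.__version__} is OK.")
```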

## Example

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Run everything on the GPU by default.
torch.set_default_device("cuda")

model = AutoModelForCausalLM.from_pretrained("voidful/phi-1_5_chat", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("voidful/phi-1_5_chat", trust_remote_code=True)

# Build a multi-turn prompt with FastChat's conversation template.
from fastchat.conversation import get_conv_template

conv = get_conv_template('mistral')
conv.append_message(conv.roles[0], "how to make a keyboard?")
conv.append_message(conv.roles[1], """To make a keyboard, you will need the following materials:

1. A set of keys, such as those found on a standard computer keyboard.
2. A circuit board, such as a MOSFET (Metal-Oxide-Semiconductor-Fluorescent) board.
3. A power supply to provide power to the circuit board.
4. A case to enclose the circuit board and protect it from damage.
5. A soldering iron and solder to connect the components together.

Here is a basic outline of the steps involved in making a keyboard:

1. Connect the circuit board to the power supply and the case.
2. Connect the MOSFETs to the power supply and the case, using a soldering iron and solder.
3. Connect the keys to the circuit board, using a soldering iron and solder.
4. Test the keyboard to ensure that it works properly.
5. Package the keyboard in a box or case for shipping or sale.

Note that this is just a basic outline, and there are many additional steps and considerations that will depend on the specific design and requirements of your keyboard.</s>""")
conv.append_message(conv.roles[0], "where to buy the circuit?")
prompt = conv.get_prompt()
inputs = tokenizer(prompt, return_tensors="pt", return_attention_mask=True)

# Stop generating once the model has emitted the end-of-turn sequence and starts a new [INST] block.
from transformers import StoppingCriteria

class EosListStoppingCriteria(StoppingCriteria):
    def __init__(self, eos_sequence=tokenizer.encode(".</s>[INST]")):
        self.eos_sequence = eos_sequence

    def __call__(self, input_ids: torch.LongTensor, scores: torch.FloatTensor, **kwargs) -> bool:
        last_ids = input_ids[:, -len(self.eos_sequence):].tolist()
        return self.eos_sequence in last_ids

outputs = model.generate(**inputs, max_length=1024, stopping_criteria=[EosListStoppingCriteria()])
# Decode only the newly generated tokens, i.e. everything after the prompt.
text = tokenizer.batch_decode(outputs[:, inputs.input_ids.shape[-1]:])[0]
print(text)
```

### Result

```
There are many places where you can buy a MOSFET (Metal-Oxide-Semiconductor-Fluorescent) board for your keyboard. Here are a few options:

1. Hardware stores: Many hardware stores carry a variety of MOSFET boards, as well as other components and tools that you may need.
2. Online retailers: There are many online retailers that sell MOSFET boards, such as Amazon, Best Buy, and Walmart.
3. Specialty stores: If you are looking for a specific type of MOSFET board that is not available at the above retailers, you may want to check with a specialty store that specializes in electronics components.
4. Custom manufacturers: If you need a specific type of MOSFET board that is not available from the above retailers, you may want to consider ordering it from a custom manufacturer.

It's worth noting that MOSFET boards are relatively inexpensive and widely available, so you should be able to find a suitable board at a reasonable price. However, the specific board you choose will depend on your specific design and requirements.
```
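
The example above stops after a single generation. To keep the dialogue going, one possible pattern, not shown in the original card and sketched here with a hypothetical follow-up question, is to append the decoded reply back into the same FastChat conversation template and generate again:

```python
# Illustrative continuation of the example above (assumes model, tokenizer, conv,
# text and EosListStoppingCriteria are still in scope). The follow-up question is made up.

# The decoded reply may end with the stop sequence itself (e.g. "</s>[INST]"),
# so trim template artifacts before feeding it back into the conversation.
reply = text.split("</s>")[0].strip() + "</s>"
conv.append_message(conv.roles[1], reply)
conv.append_message(conv.roles[0], "which soldering iron should I get?")

inputs = tokenizer(conv.get_prompt(), return_tensors="pt", return_attention_mask=True)
outputs = model.generate(**inputs, max_length=1024, stopping_criteria=[EosListStoppingCriteria()])
print(tokenizer.batch_decode(outputs[:, inputs.input_ids.shape[-1]:])[0])
```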