The language model Phi-1.5 is a Transformer with **1.3 billion** parameters.
We've trained Microsoft Research's phi-1.5, a 1.3B-parameter model, on multi-turn conversation datasets at context lengths of up to 32k tokens, extended to 128k.
## How to Use
Phi-1.5 has been integrated into `transformers` since version 4.37.0. If you are using a lower version, ensure that you do the following:
* When loading the model, ensure that `trust_remote_code=True` is passed as an argument to the `from_pretrained()` function.
The current `transformers` version can be verified with: `pip list | grep transformers`.
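The same check can be done from Python without importing the package, using only the standard library (a minimal sketch; `importlib.metadata` is stdlib in Python 3.8+):

```python
from importlib.metadata import version, PackageNotFoundError

# Look up the installed transformers version from package metadata,
# without importing transformers itself.
try:
    print("transformers", version("transformers"))
except PackageNotFoundError:
    print("transformers is not installed")
```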
## Example
```python
import torch
```
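A fuller version of the example above, loading the model and generating a reply, might look as follows. This is a hedged sketch: the repository id below is a placeholder for this model card's actual id, and the prompt is illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder -- substitute the id of this model repository.
model_id = "microsoft/phi-1_5"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float32,
    trust_remote_code=True,  # required on transformers < 4.37.0
)

# Tokenize a prompt and generate a continuation.
inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```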