Update README.md
README.md
CHANGED
@@ -8,7 +8,7 @@ license: mit
The language model Phi-1.5 is a Transformer with **1.3 billion** parameters. It was trained using the same data sources as [phi-1](https://huggingface.co/microsoft/phi-1), augmented with a new data source that consists of various NLP synthetic texts. When assessed against benchmarks testing common sense, language understanding, and logical reasoning, Phi-1.5 demonstrates nearly state-of-the-art performance among models with fewer than 10 billion parameters.
- We've trained Microsoft Research's phi-1.5, 1.3B parameter model with multi-turn conversation datasets and extended to
+ We've trained Microsoft Research's phi-1.5, a 1.3B-parameter model, on multi-turn conversation datasets with context lengths of at most 32k tokens, and extended the context window to 128k.
## How to Use
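
The diff above leaves the **How to Use** section empty, so here is a minimal usage sketch with Hugging Face `transformers`. The repo id `your-org/phi-1_5-multiturn-128k` is a hypothetical placeholder for this fine-tuned checkpoint (the card does not name one), and the dtype, device, and prompt format are assumptions rather than the authors' documented setup.

```python
# Minimal usage sketch (assumptions: repo id, fp16 weights, single GPU).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/phi-1_5-multiturn-128k"  # hypothetical placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption: half precision to fit consumer GPUs
    device_map="auto",
)

# A simple multi-turn prompt; the exact chat template used during
# fine-tuning is not specified in this card, so plain text is used here.
prompt = "User: Summarize the plot of Hamlet in two sentences.\nAssistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that prompts approaching the extended 128k context grow memory use quickly; long-context inference will typically need more than a single consumer GPU.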