HoangHa committed
Commit
89f3e10
1 Parent(s): f934bf9

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -4,7 +4,7 @@ license: ms-pl

  ## Overview

- The Phi-3, state-of-the-art open model trained with the Phi-3 datasets that includes both synthetic data and the filtered publicly available websites data with a focus on high-quality and reasoning dense properties. The model belongs to the Phi-3 family with the 4B, 7B version in two variants 8K and 128K which is the context length (in tokens) that it can support.
+ The [Phi-3](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct), state-of-the-art open model trained with the Phi-3 datasets that includes both synthetic data and the filtered publicly available websites data with a focus on high-quality and reasoning dense properties. The model belongs to the Phi-3 family with the 4B, 7B version in two variants 8K and 128K which is the context length (in tokens) that it can support.

  ## Variants

@@ -35,5 +35,5 @@ The Phi-3, state-of-the-art open model trained with the Phi-3 datasets that incl

  - **Author:** Microsoft
  - **Converter:** [Homebrew](https://www.homebrew.ltd/)
- - **Original License:** [Licence](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct/blob/main/LICENSE)
+ - **Original License:** [License](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct/blob/main/LICENSE)
  - **Papers:** [Phi-3 Technical Report](https://arxiv.org/abs/2404.14219)