pharaouk committed
Commit 6c49320
1 Parent(s): b5d96a7

Update README.md

Files changed (1): README.md (+1, -1)
README.md CHANGED
@@ -7,7 +7,7 @@ license_link: LICENSE
 (THIS IS MICROSOFT'S ORIGINAL MODEL, UPLOADED HERE ONLY FOR RESEARCH PURPOSES AND ACCESSIBILITY AS THE AI AZURE STUDIO IS NOT CONVENIENT FOR RESEARCH. RESEARCH ONLY. RESEARCH. RESEARCH, PLEASE DONT SUE US MSFT, THIS IS 100% FOR RESEARCH.)


-Here is Microsoft's official Phi-2 repo: https://huggingface.co/microsoft/phi-2
+**Here is Microsoft's official Phi-2 repo:** https://huggingface.co/microsoft/phi-2

 Microsoft Phi-2
 Phi-2 is a language model with 2.7 billion parameters. It was trained using the same data sources as Phi-1, augmented with a new data source consisting of various synthetic NLP texts and filtered websites (selected for safety and educational value). When assessed against benchmarks testing common sense, language understanding, and logical reasoning, Phi-2 showed nearly state-of-the-art performance among models with fewer than 10 billion parameters.
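
Since the point of this mirror is research accessibility, here is a minimal sketch of loading Phi-2 with the Hugging Face transformers library. The repo id `microsoft/phi-2` comes from the link above; the dtype, device placement, and prompt wording are illustrative assumptions, not part of the original README:

```python
# Minimal sketch: load Phi-2 from the official repo linked above and run one generation.
# Assumes `transformers` and `torch` are installed; `device_map="auto"` also needs `accelerate`.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-2"  # official repo from the link in the README

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # keep the checkpoint's native dtype where possible
    device_map="auto",   # place weights on GPU if one is available
)

# "Instruct:/Output:" mirrors the QA prompt format shown on the official model card.
prompt = "Instruct: Explain what a language model is.\nOutput:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=100)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The `max_new_tokens` value and prompt are placeholders; plain free-form prompts also work with the base model, per the official model card.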