stellaathena committed 058e8e2 (1 parent: 40fb054): Update README.md

README.md CHANGED
@@ -15,7 +15,7 @@ datasets:
 
 ## Model Description
 
-GPT-Neo 2.7B is a transformer model designed using EleutherAI's replication of the GPT-3 architecture. GPT-Neo refers to the class of models, while 2.7B represents the number of parameters of this particular pre-trained model.
+GPT-Neo 2.7B is a transformer model designed using EleutherAI's replication of the GPT-3 architecture. GPT-Neo refers to the class of models, while 2.7B represents the number of parameters of this particular pre-trained model. This model is the same size as OpenAI's "Ada" model.
 
 ## Training data
 
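For context on the model described in the changed paragraph, here is a minimal sketch of how such a checkpoint could be loaded for text generation with the Hugging Face `transformers` pipeline. The repository id `EleutherAI/gpt-neo-2.7B`, the prompt, and the sampling settings are illustrative assumptions and are not part of this commit.

```python
# Minimal sketch: load a GPT-Neo 2.7B checkpoint and sample a short continuation.
# The model id "EleutherAI/gpt-neo-2.7B" is assumed here for illustration.
from transformers import pipeline

generator = pipeline("text-generation", model="EleutherAI/gpt-neo-2.7B")

prompt = "EleutherAI has"
# do_sample=True samples from the model's distribution rather than decoding greedily.
outputs = generator(prompt, do_sample=True, max_length=50)
print(outputs[0]["generated_text"])
```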