---
tags:
- Text Generation
- PyTorch
- Transformers
- gpt_neo
- text generation
---

## Pretrained Model Description: GPT-Neo (Open Source Version of GPT-3)
Generative Pre-trained Transformer 3 (GPT-3) is an autoregressive language model that uses deep learning to produce human-like text.
It is the third-generation language prediction model in the GPT-n series (and the successor to GPT-2), created by OpenAI.

GPT-Neo (125M) is a transformer model designed using EleutherAI's replication of the GPT-3 architecture. GPT-Neo refers to the class of models, while 125M represents the number of parameters of this particular pre-trained model.
It was first released in this [repository](https://github.com/EleutherAI/gpt-neo).
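
The snippet below is a minimal usage sketch (not part of the original release) showing how the base, pre-trained GPT-Neo 125M checkpoint can be loaded for text generation with the Transformers `pipeline` API; the prompt and sampling settings are only illustrative.

```python
from transformers import pipeline

# Load the base (not fine-tuned) GPT-Neo 125M checkpoint from the Hugging Face Hub
generator = pipeline("text-generation", model="EleutherAI/gpt-neo-125M")

# Generate a short sampled continuation of an example prompt
prompt = "In recent years, deep learning has"
outputs = generator(prompt, max_length=50, do_sample=True, temperature=0.9)
print(outputs[0]["generated_text"])
```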

## Fine-tuned Model Description: GPT-Neo (125M) fine-tuned on Multi-XScience
The open-source version of GPT-3, GPT-Neo (125M), has been fine-tuned on a dataset called "Multi-XScience": [Multi-XScience_Repository](https://github.com/yaolu/Multi-XScience)
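
As a minimal usage sketch, the fine-tuned checkpoint hosted in this repository can be loaded like any causal language model in Transformers; the `model_id` below is a placeholder (replace it with this repository's actual Hub id), and the prompt and decoding settings are only illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: replace with this repository's actual Hugging Face Hub id
model_id = "your-username/gpt-neo-125M-finetuned-multi-xscience"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Sample a scientific-style continuation from an example prompt
prompt = "We propose a novel approach to summarizing related work in scientific papers"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_length=120,
        do_sample=True,
        top_p=0.95,
        temperature=0.8,
        pad_token_id=tokenizer.eos_token_id,  # GPT-Neo has no pad token by default
    )
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```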

I deployed the model as a web app built with Anvil, using Google's "Material Design" theme: [Abir Scientific text Generator](https://abir-scientific-text-generator.anvil.app/)

By fine-tuning GPT-Neo on the Multi-XScience dataset, the model is now able to generate scientific texts (even better than GPT-J (6B)).
Here's a demonstration video: [Real-time demonstration video](https://www.youtube.com/watch?v=XP8uZfnCYQI)