Are you frequently asked Google-able trivia questions and annoyed by it? Well, this is the model for you! Ballpark Trivia Bot answers any trivia question with something that sounds plausible but is probably not 100% correct. One might say... the answers are in the right ballpark.
```
how many varieties of eggplant are there?

person beta:
about 4,000
```

## Training
This text-generation model is GPT-2-L (774M parameters). It was first fine-tuned on [Wizard of Wikipedia](https://parl.ai/projects/wizard_of_wikipedia/) for 40k steps with 34 of its 36 transformer layers frozen, and then fine-tuned for another 40k steps on a parsed variant of [Natural Questions](https://ai.google.com/research/NaturalQuestions) (**also** with 34/36 layers frozen), accidentally producing this model.
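The layer-freezing setup described above can be sketched in plain PyTorch. The 36-block stack below is a toy stand-in for GPT-2-L's transformer layers: the block type and sizes are illustrative assumptions, and only the 34-of-36 freeze ratio comes from this card.

```python
import torch.nn as nn

# Toy stand-in for GPT-2-L: 36 "transformer blocks" (modelled here as simple
# linear layers for illustration) plus a language-model head.
blocks = nn.ModuleList([nn.Linear(16, 16) for _ in range(36)])
lm_head = nn.Linear(16, 16)

# Freeze the first 34 of 36 blocks, as described above, so only the last two
# blocks (and the head) receive gradient updates during fine-tuning.
for block in blocks[:34]:
    for p in block.parameters():
        p.requires_grad = False

# Only trainable parameters are handed to the optimizer.
trainable = [p for b in blocks for p in b.parameters() if p.requires_grad]
print(len(trainable))  # 4: weight + bias for each of the 2 unfrozen blocks
```

Freezing most of the stack keeps the base model's general language ability intact while the top layers adapt to the conversational/QA data.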