crumb committed on
Commit dd2ef0f
1 Parent(s): 47ffd5c

Update README.md

Files changed (1)
  1. README.md +29 -0
README.md CHANGED
@@ -30,6 +30,35 @@ The resulting model achieves a puplexity of 339.38, making it competative with C
 
 (metric explanation here: https://twitter.com/aicrumb/status/1650350363898265601 , tldr it's a joke)
 
+ To demonstrate how GPT2(023) is aware of recent events, let’s take a look at an example:
+
+ ```
+ # About Covid-19
+ - -
+ The Covid-19
+ ```
+
+ The model completes the text as:
+
+ ```
+ # About Covid-19
+ - -
+ The Covid-19 pandemic is the worldwide pandemic that has left thousands of people unable to enter and work in or continue their normal daily normal life. In this brief post, we examine three of the main factors that have accelerated the pandemic and predict the path the pandemic will take through the rest of the world.
+ ```
+
+ As you can see, GPT2(023) can generate coherent, relevant text about the Covid-19 pandemic, showcasing its awareness of recent events. However, it struggles with subjects that were not well represented in its training data. Since only 2.23 billion tokens were used during finetuning, the model may have missed many recent events, one of them being the latest US election.
+
+ Given text in a question-and-answer format:
+
+ ```
+ Q: Who is the last president?
+ A: Donald Trump
+
+ Q: Who is the most recent president?
+ A:
+ ```
+
+ The model completes the text with: `Barack Obama`
 
 ### Model description
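
Completions like the ones shown in this diff can be reproduced with the transformers text-generation pipeline. The sketch below is a minimal example, not part of this commit: the repo id `crumb/gpt2023` and the generation settings are assumptions, and sampled outputs will differ from run to run.

```
# Minimal sketch: reproducing prompt completions like those quoted in the README diff.
# The repo id "crumb/gpt2023" and the generation settings below are assumptions.
from transformers import pipeline

generator = pipeline("text-generation", model="crumb/gpt2023")

# Freeform completion, as in the Covid-19 example (sampling, so output varies).
prompt = "# About Covid-19\n- -\nThe Covid-19"
print(generator(prompt, max_new_tokens=64, do_sample=True, temperature=0.7)[0]["generated_text"])

# Question-and-answer style prompt, as in the "most recent president" example.
qa_prompt = "Q: Who is the last president?\nA: Donald Trump\n\nQ: Who is the most recent president?\nA:"
print(generator(qa_prompt, max_new_tokens=5, do_sample=False)[0]["generated_text"])
```

Greedy decoding (`do_sample=False`) makes the short factual completion deterministic, while the freeform example uses sampling, so it will not match the quoted output exactly.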