MartialTerran committed on
Commit eccf472
1 Parent(s): c8f2523

Update Gettysburg_GPT2_v1.4.2.py

Files changed (1)
  1. Gettysburg_GPT2_v1.4.2.py +5 -0
Gettysburg_GPT2_v1.4.2.py CHANGED
@@ -24,6 +24,11 @@ class ToyGPT2(nn.Module):
  x = tok_emb + positional_embeddings # (B,T,C)
  """

+ # Even without using any positional encoding at all, e.g.,
+ # x = tok_emb (B,T,C)
+ # this toy model can still memorize and sequence the original Dataset verbatim:
+ #Response: four score and seven years ago our fathers brought forth , on this continent , a new nation , conceived in liberty , and dedicated to the proposition that all men are created equal . now we are engaged in a great civil war , testing whether that nation , or any nation so conceived , and so dedicated , can long endure . we are met on a great battle - field of that war . we have come to dedicate a portion of that field , as a final resting - place for those who here gave their lives , that that nation might live . it is altogether fitting and proper that we should do this . but , in a larger sense , we cannot dedicate , we cannot consecrate - we cannot hallow - this ground . the brave men , living and dead , who struggled here , have consecrated it far above our poor power to add or detract . the world will little note , nor long remember what we say here , but it can never forget what they did here . it is for us the living , rather , to be dedicated here to the unfinished work which they who fought here have thus far so nobly advanced . it is rather for us to be here dedicated to the great task remaining before us - that from these honored dead we take increased devotion to that cause for which they here gave the last full measure of devotion - that we here highly resolve that these dead shall not have died in vain - that this nation , under god , shall have a new birth of freedom , and that government of the people , by the people , for the people , shall not perish from the earth . apple blossoms cream dumpling ebony flower guppy heart india jewels kale loon minnow neon orange pudding quaker radish swimming turtle ultrared velvet wipped xenial yellow zebra the
+
  # Added into v1.4.2.py

  # Dynamic vocab_size Update: After the Tokenizer is created in def Main, the calculated tokenizer.vocab_size is assigned back to hyperparameters["vocab_size"]: hyperparameters["vocab_size"] = tokenizer.vocab_size. This guarantees that the ToyGPT2 model is constructed before training with the correct vocabulary size derived from the Dataset (Gettysburg Address) plus special tokens, and it helps prevent runtime errors when the user changes the Dataset and the model vocabulary ends up with a different number of tokens. [If hyperparameters["vocab_size"] is any larger than the number of Tokenizer-defined tokens, the as-initialized output layer might select an undefined token as the next token and then produce an out-of-range crash.] Checkpoints (when enabled within def Training) will be saved to include the updated hyperparameters["vocab_size"]. A future feature, not yet implemented, should automatically write hyperparameters["vocab_size"] into the filename of the saved checkpoint.
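
The added comment claims the toy model can reproduce the Dataset verbatim even when the positional embeddings are dropped from the embedding sum. Below is a minimal, hypothetical sketch of that ablation: only the x = tok_emb + positional_embeddings # (B,T,C) line is quoted from the file; the surrounding module and forward() structure are assumptions for illustration, not the actual ToyGPT2 implementation.

    # Hypothetical sketch, not the actual ToyGPT2 code: only the embedding-sum
    # line is taken from Gettysburg_GPT2_v1.4.2.py; everything else is assumed.
    import torch
    import torch.nn as nn

    class ToyGPT2Sketch(nn.Module):
        def __init__(self, vocab_size, block_size, n_embd, use_positional=True):
            super().__init__()
            self.use_positional = use_positional
            self.token_embedding = nn.Embedding(vocab_size, n_embd)
            self.position_embedding = nn.Embedding(block_size, n_embd)
            self.lm_head = nn.Linear(n_embd, vocab_size)

        def forward(self, idx):                        # idx: (B, T) token indices
            B, T = idx.shape
            tok_emb = self.token_embedding(idx)        # (B, T, C)
            if self.use_positional:
                pos = torch.arange(T, device=idx.device)
                positional_embeddings = self.position_embedding(pos)   # (T, C)
                x = tok_emb + positional_embeddings    # (B,T,C)  <- line quoted in the diff
            else:
                x = tok_emb                            # (B,T,C)  <- ablation described in the added comment
            return self.lm_head(x)                     # (B, T, vocab_size) next-token logits

One plausible reading of the result: with a single short training text such as the Gettysburg Address, most next-token transitions are already unambiguous from the token identities alone, so positional information adds little for verbatim recall; the observation is expected for this Dataset rather than a general property of the architecture.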
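
The Dynamic vocab_size Update note describes a single assignment made in def Main after the Tokenizer is built. The sketch below illustrates that flow under stated assumptions: the tokenizer class, the other hyperparameter entries, and the commented-out ToyGPT2 constructor call are hypothetical stand-ins; only the hyperparameters["vocab_size"] = tokenizer.vocab_size assignment comes from the commit.

    # Hypothetical sketch of the dynamic vocab_size update; the tokenizer and
    # hyperparameter defaults shown here are stand-ins for illustration only.
    class SimpleWordTokenizer:
        def __init__(self, text, special_tokens=("<pad>", "<eos>")):
            words = sorted(set(text.lower().split()))
            self.vocab = list(special_tokens) + words   # Dataset words plus special tokens
            self.vocab_size = len(self.vocab)

    def main():
        hyperparameters = {"vocab_size": 0, "block_size": 64, "n_embd": 32}
        dataset_text = "four score and seven years ago our fathers brought forth ..."
        tokenizer = SimpleWordTokenizer(dataset_text)

        # Dynamic vocab_size Update: overwrite the placeholder with the value
        # actually derived from the Dataset plus special tokens.
        hyperparameters["vocab_size"] = tokenizer.vocab_size

        # model = ToyGPT2(**hyperparameters)  # constructor signature assumed; the model
        #                                     # is built only after the update above
        print(hyperparameters["vocab_size"])

    if __name__ == "__main__":
        main()

Constructing the model only after this assignment keeps the output layer's row count equal to the number of Tokenizer-defined tokens, which is the out-of-range failure mode the note warns about.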