IMISLab committed
Commit 0e46432
1 Parent(s): 1094ba2

Update README.md

Files changed (1)
  1. README.md +0 -1
README.md CHANGED
@@ -68,7 +68,6 @@ We trained `google/umt5-small` [300 million parameters (~1.20 GB)] on the GreekS
 * Total training epochs = 10
 * AdamW optimizer (e = 1e−8, β1 = 0.9 and β2 = 0.0999)
 * Learning rate = 3e−4
-* Linear weight decay
 * No warmup steps
 * 32-bit floating precision
 * Tokenization
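
For readers who want to map the bullets in this hunk onto code, here is a minimal sketch (not the authors' training script) using PyTorch and Hugging Face Transformers. The optimizer arguments mirror the README bullets, with the β2 value copied verbatim; the dataset loading, batching, and training loop are omitted and left as assumptions.

```python
# Minimal sketch of the listed hyperparameters; everything beyond the
# optimizer settings (data, training loop) is an assumption, not the
# authors' actual pipeline.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "google/umt5-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)  # fp32 weights by default

# AdamW with the README's settings: eps = 1e-8, betas = (0.9, 0.0999),
# learning rate 3e-4. No warmup and, per this commit, no weight-decay
# schedule is configured here.
optimizer = torch.optim.AdamW(
    model.parameters(),
    lr=3e-4,
    eps=1e-8,
    betas=(0.9, 0.0999),
)

num_epochs = 10  # total training epochs from the README
```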