Tokenizer: a padding token is added (vocabulary size = 32,001).
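For illustration, here is a minimal sketch of how such a padding token can be added with Hugging Face transformers (the base model name and the "[PAD]" string are assumptions, not details from the source); the embedding matrix has to be resized to match the new vocabulary size:

```python
# Minimal sketch (assumed setup, not the authors' code): add a [PAD] token to a
# LLaMA-style tokenizer and resize the model's embeddings accordingly.
from transformers import AutoModelForCausalLM, AutoTokenizer

base_model = "huggyllama/llama-13b"  # illustrative base model

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)

# LLaMA's tokenizer has 32,000 tokens and no pad token by default;
# adding one brings the vocabulary to 32,001.
tokenizer.add_special_tokens({"pad_token": "[PAD]"})
model.resize_token_embeddings(len(tokenizer))

print(len(tokenizer))  # 32001
```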
Examples are packed into a single sequence to maximize the use of the maximum sequence length (2,048 tokens) and get batches of uniform length.
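As a rough sketch of what this packing can look like (an assumed greedy scheme, not the original implementation), tokenized examples are concatenated until the next one would overflow the 2,048-token budget, at which point the sequence is flushed:

```python
# Illustrative greedy packing (assumed scheme): concatenate tokenized examples
# into sequences of at most max_length tokens so training batches have a
# near-uniform length. Shorter sequences are padded later with the [PAD] token;
# truncation of over-long examples is omitted here.
def pack_examples(tokenized_examples, eos_token_id, max_length=2048):
    packed, buffer = [], []
    for ids in tokenized_examples:
        ids = ids + [eos_token_id]          # mark the end of each example
        if buffer and len(buffer) + len(ids) > max_length:
            packed.append(buffer)           # flush the current sequence
            buffer = []
        buffer.extend(ids)
    if buffer:
        packed.append(buffer)
    return packed
```

Packing keeps most sequences close to the 2,048-token limit, which wastes far fewer tokens on padding than padding each example individually.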
Trained for 160 hours on 20 x A100 GPUs (4 epochs) on the ChatGPT-generated data, plus 40 hours on the GPT-4-generated data.
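Assuming the same 20 GPUs for both stages, that works out to 160 h × 20 = 3,200 A100 GPU-hours for the ChatGPT-generated data plus 40 h × 20 = 800 GPU-hours for the GPT-4-generated data, or about 4,000 GPU-hours in total.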
Open-ended generation: significantly better than Vicuna.
AGIEval: doesn't perform as well as ChatGPT.
BigBench-Hard: on par with ChatGPT.