pre-training code sharing

#32
by Eric2333 - opened

Hi,

Many thanks for your great work!

I'm wondering whether there are any plans to share the pre-training and validation code, or parts of it.
I am aware that the paper already describes your implementation and that many open resources are available,
but access to some of the details would be very helpful.

Best regards!

Hi Eric2333,

The code has always been available, since I ran it with the Hugging Face example scripts. See the full documentation and code here: https://github.com/huggingface/transformers/tree/main/examples/pytorch/language-modeling
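For context, the linked `run_mlm.py` example script uses `DataCollatorForLanguageModeling` to apply BERT-style masking during pre-training. Below is a minimal pure-Python sketch of that masking rule, not the library's actual implementation; the token ids are illustrative (103 and 30522 are the `[MASK]` id and vocabulary size of `bert-base-uncased`), and the `mask_tokens` helper name is made up for this example:

```python
import random

MASK_ID = 103       # illustrative: [MASK] id in bert-base-uncased
VOCAB_SIZE = 30522  # illustrative: bert-base-uncased vocabulary size

def mask_tokens(input_ids, mlm_probability=0.15, seed=0):
    """BERT-style masking: select ~15% of positions; of those,
    80% become [MASK], 10% become a random token, 10% stay
    unchanged. Labels are -100 everywhere else, so the loss is
    only computed on the selected positions."""
    rng = random.Random(seed)
    input_ids = list(input_ids)
    labels = [-100] * len(input_ids)
    for i, tok in enumerate(input_ids):
        if rng.random() < mlm_probability:
            labels[i] = tok  # remember the original token
            r = rng.random()
            if r < 0.8:
                input_ids[i] = MASK_ID
            elif r < 0.9:
                input_ids[i] = rng.randrange(VOCAB_SIZE)
            # else: keep the original token unchanged
    return input_ids, labels
```

In the actual script this logic is handled by the collator, so a pre-training run only needs a tokenizer, a dataset, and the standard `Trainer` arguments.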

Hope this helps
Noelia

Thanks for your reply!

I didn't know an open script was used; I assumed pre-training was built on Transformers and the Trainer, but in a custom script of your own.
It helps a lot!

Best regards
Eric

Eric2333 changed discussion status to closed
