New model card for distilgpt2

#1
by Marissa - opened

Expanded model card for distilgpt2, including information about training, use, limitations, emissions, and evaluation. Also added metadata for evaluation results.
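For context, evaluation metadata on the Hub is expressed as `model-index` YAML front matter at the top of the model card. A minimal sketch of the pattern (the metric value shown is illustrative, not necessarily what this PR adds):

```yaml
model-index:
- name: distilgpt2
  results:
  - task:
      type: text-generation
    dataset:
      type: wikitext
      name: WikiText-103
    metrics:
    - type: perplexity
      value: 21.1  # illustrative placeholder value
```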

That looks great to me!
Thanks a lot @Marissa

What does the author think @VictorSanh - good for merge?

LGTM too!

Alright, let's merge it - hope that's fine for you @VictorSanh! Otherwise we can still revert afterwards (#feature request haha @julien-c ) :-)

patrickvonplaten changed pull request status to merged
DistilBERT community org

lgtm!

only thing i would note: **Hours used:** 8 -> it was actually on the order of 1 week on a node with 8 16GB V100s

would you like to change that @Marissa ? :)


can we wait for a little bit more feedback from the team before merging, next time?

In particular, not a huge fan of the collapsible sections UX-wise. What do others think?

I don't like collapsible sections either: they can be useful for certain very long sections, but not everywhere like this, and the main content should not be collapsed.
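For reference, the collapsible sections under discussion are typically written in the model card's markdown with HTML `<details>` tags, e.g.:

```html
<details>
<summary>Click to expand</summary>

Long section content that is hidden by default.

</details>
```

Content placed inside `<details>` is hidden until the reader clicks the `<summary>` line, which is why reviewers here argue it should be reserved for very long supplementary sections rather than the main content.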

Sounds good, so a general review policy might help here @julien-c

thanks for the feedback -- will create a new PR with those changes!
