Library for `attention_head`
#1
by
ImaneBt
Hi there :wave:
Which library did you install to be able to do `from attention_head import AttentionHead, Head, MultiHeadAttention, TransFormerBlock`?
Thanks
I wrote them myself; they should be in the repo when pulled. You can use these as a reference as well: https://github.com/karpathy/minGPT
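For anyone landing here later, a single attention head of the kind minGPT implements is small enough to sketch from scratch. This is a hedged NumPy sketch of causal scaled dot-product attention, not the actual code from this repo; the function and weight names (`attention_head`, `Wq`, `Wk`, `Wv`) are made up for illustration:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_head(x, Wq, Wk, Wv):
    """One causal self-attention head.

    x: (T, d_model) token embeddings; Wq/Wk/Wv: (d_model, d_head) projections.
    Returns (T, d_head).
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])          # (T, T) scaled similarities
    # causal mask: each position attends only to itself and earlier positions
    T = x.shape[0]
    causal = np.tril(np.ones((T, T), dtype=bool))
    scores = np.where(causal, scores, -np.inf)
    return softmax(scores) @ v                       # weighted sum of values

rng = np.random.default_rng(0)
T, d_model, d_head = 4, 8, 2
x = rng.normal(size=(T, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_head)) for _ in range(3))
out = attention_head(x, Wq, Wk, Wv)
print(out.shape)  # (4, 2)
```

A multi-head version just runs several such heads in parallel and concatenates their outputs; minGPT does this with a single batched projection for efficiency.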