Add layer norm usage for Transformers.js

#11 opened by Xenova (HF staff)

This produces the same output as the Python version:

// [
//   [-0.00518727907910943,   0.06514579057693481,  -0.21559129655361176, ...],
//   [-0.008253306150436401,  0.005108598619699478,   -0.22179779410362244, ...],
// ]
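The values above come from applying layer norm to the raw embeddings before Matryoshka truncation and L2 normalization. As a rough sketch of that post-processing math in plain JavaScript (toy values and helper names of my own, not the Transformers.js API itself):

```javascript
// Layer norm over one embedding row: subtract the mean, divide by the
// standard deviation (eps keeps the division numerically stable).
function layerNorm(row, eps = 1e-5) {
  const mean = row.reduce((a, b) => a + b, 0) / row.length;
  const variance = row.reduce((a, b) => a + (b - mean) ** 2, 0) / row.length;
  const std = Math.sqrt(variance + eps);
  return row.map((x) => (x - mean) / std);
}

// L2-normalize so downstream cosine similarity reduces to a dot product.
function l2Normalize(row) {
  const norm = Math.sqrt(row.reduce((a, b) => a + b * b, 0));
  return row.map((x) => x / norm);
}

// Toy example: layer norm, truncate to the Matryoshka dimension, normalize.
const matryoshkaDim = 2; // e.g. 512 in practice
const embedding = [1.0, 2.0, 3.0, 4.0];
const processed = l2Normalize(layerNorm(embedding).slice(0, matryoshkaDim));
console.log(processed);
```

In the library itself the same steps are chained on the pipeline's output tensor; the sketch just makes the arithmetic explicit.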

Relevant discussion: https://huggingface.co/nomic-ai/nomic-embed-text-v1.5/discussions/4#65cce8d0c52afc14ceac26c2

zpn changed pull request status to merged

zpn (Nomic AI org): thanks @Xenova!
