mike dikka

fgdrfgrgrdgdr

AI & ML interests

None yet

Recent Activity

liked a model 15 minutes ago
Comfy-Org/HunyuanVideo_repackaged
liked a model 15 minutes ago
tencent/HunyuanVideo-PromptRewrite
liked a model 15 minutes ago
tencent/HunyuanVideo

Organizations

None yet

fgdrfgrgrdgdr's activity

reacted to tomaarsen's post with ❤️ 3 days ago
That didn't take long! Nomic AI has finetuned the new ModernBERT-base encoder model into a strong embedding model for search, classification, clustering and more!

Details:
πŸ€– Based on ModernBERT-base with 149M parameters.
πŸ“Š Outperforms both nomic-embed-text-v1 and nomic-embed-text-v1.5 on MTEB!
🏎️ Immediate FA2 and unpadding support for super efficient inference.
πŸͺ† Trained with Matryoshka support, i.e. 2 valid output dimensionalities: 768 and 256.
➑️ Maximum sequence length of 8192 tokens!
2️⃣ Trained in 2 stages: unsupervised contrastive data -> high-quality labeled datasets.
βž• Integrated in Sentence Transformers, Transformers, LangChain, LlamaIndex, Haystack, etc.
πŸ›οΈ Apache 2.0 licensed: fully commercially permissible

Try it out here: nomic-ai/modernbert-embed-base

Very nice work by Zach Nussbaum and colleagues at Nomic AI.