Commit 1d6119f by Xenova (HF staff)
Parent: d1946e5

Add transformers.js tag and example code

Files changed (1):
  1. README.md +31 -0
README.md CHANGED
@@ -1,6 +1,7 @@
  ---
  tags:
  - mteb
+ - transformers.js
  model-index:
  - name: mxbai-embed-2d-large-v1
    results:
@@ -2693,6 +2694,36 @@ print('similarities:', similarities)

  You’ll be able to use the models through our API as well. The API is coming soon and will have some exciting features. Stay tuned!

+ ### Transformers.js
+
+
+ If you haven't already, you can install the [Transformers.js](https://huggingface.co/docs/transformers.js) JavaScript library from [NPM](https://www.npmjs.com/package/@xenova/transformers) using:
+ ```bash
+ npm i @xenova/transformers
+ ```
+
+ You can then use the model to compute embeddings as follows:
+
+ ```js
+ import { pipeline, cos_sim } from '@xenova/transformers';
+
+ // Create a feature-extraction pipeline
+ const extractor = await pipeline('feature-extraction', 'mixedbread-ai/mxbai-embed-2d-large-v1', {
+     quantized: false, // (Optional) remove this line to use the 8-bit quantized model
+ });
+
+ // Compute sentence embeddings (with `cls` pooling)
+ const sentences = ['Who is german and likes bread?', 'Everybody in Germany.' ];
+ const output = await extractor(sentences, { pooling: 'cls' });
+
+ // Set embedding size and truncate embeddings
+ const new_embedding_size = 768;
+ const truncated = output.slice(null, [0, new_embedding_size]);
+
+ // Compute cosine similarity
+ console.log(cos_sim(truncated[0].data, truncated[1].data)); // 0.6979532021425204
+ ```
+
  ## Evaluation

  Please find more information in our [blog post](https://mixedbread.ai/blog/mxbai-embed-2d-large-v1).
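
For context on the added snippet: `cos_sim` is the cosine-similarity helper imported from `@xenova/transformers`, applied here to the truncated embeddings. The sketch below shows the same comparison written out by hand; the `cosineSimilarity` function is illustrative only and not part of the library.

```js
// Cosine similarity between two vectors: dot(a, b) / (||a|| * ||b||).
// Illustrative helper only; the diff above uses cos_sim from @xenova/transformers.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; ++i) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Example usage with the truncated embeddings from the snippet above:
// cosineSimilarity(truncated[0].data, truncated[1].data);
```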