sionic committed
Commit 6f8bdfb
Parent: f54cf3c

Update README.md to add information on how to access embedding API v2

Files changed (1): README.md (+108, -2)
README.md CHANGED
@@ -2606,15 +2606,120 @@ model-index:

Sionic AI delivers more accessible and cost-effective AI technology, addressing various needs to boost productivity and drive innovation.

- The Large Language Model (LLM) is not for research and experimentation. We offer solutions that leverage LLM to add value to your business. Anyone can easily train and control AI.
+ The Large Language Model (LLM) is not for research and experimentation.
+ We offer solutions that leverage LLMs to add value to your business.
+ Anyone can easily train and control AI.

  ## How to get embeddings

- We are working on releasing v2 API. In the meantime, you can try our embedding API v1. Please visit [here](https://huggingface.co/sionic-ai/sionic-ai-v1).
+ Currently, we offer a beta version of our embedding APIs.
+ To get embeddings, send your text to the API endpoint.
+ You can send either a single sentence or multiple sentences.
+ The embeddings corresponding to your inputs will be returned.
+
+ API endpoint: https://api.sionic.ai/v2/embedding
+
+ ### Command Line Example
+ Request:
+ ```shell
+ curl https://api.sionic.ai/v2/embedding \
+   -H "Content-Type: application/json" \
+   -d '{
+     "inputs": ["first query", "second query", "third query"]
+   }'
+ ```
+
+ Response:
+ ```json
+ {
+   "embedding": [
+     [
+       0.5567971,
+       -1.1578958,
+       -0.7148851,
+       -0.2326297,
+       0.4394867,
+       ...
+     ],
+     [
+       0.5049863,
+       -0.8253384,
+       -1.0041373,
+       -0.6503708,
+       0.5007141,
+       ...
+     ],
+     [
+       0.6059823,
+       -1.0369557,
+       -0.6705063,
+       -0.4467056,
+       0.8618057,
+       ...
+     ]
+   ]
+ }
+ ```
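+ The response is a single JSON object whose `embedding` field holds one vector per input, in the same order as the `inputs` array.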
+
+ ### Python Code Example
+ Get embeddings by calling the embedding API directly.
+
+ ```python
+ from typing import List
+ import numpy as np
+ import requests
+
+ def get_embedding(queries: List[str], url: str) -> np.ndarray:
+     # POST the queries to the embedding endpoint and parse the returned vectors.
+     response = requests.post(url=url, json={'inputs': queries})
+     return np.asarray(response.json()['embedding'], dtype=np.float32)
+
+ url = "https://api.sionic.ai/v2/embedding"
+ inputs1 = ["first query", "second query"]
+ inputs2 = ["third query", "fourth query"]
+ embedding1 = get_embedding(inputs1, url=url)
+ embedding2 = get_embedding(inputs2, url=url)
+ # Normalize each row to unit length so the matrix product yields cosine similarities.
+ cos_similarity = (embedding1 / np.linalg.norm(embedding1, axis=1, keepdims=True)) @ (embedding2 / np.linalg.norm(embedding2, axis=1, keepdims=True)).T
+ print(cos_similarity)
+ ```
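+ Because each embedding is divided by its own row-wise L2 norm, every entry of `cos_similarity` is the cosine similarity between one sentence from `inputs1` and one from `inputs2`.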
+
+ Use the pre-defined [SionicEmbeddingModel](https://huggingface.co/sionic-ai/sionic-ai-v2/blob/main/model_api.py) to obtain embeddings.
+
+ ```python
+ from model_api import SionicEmbeddingModel
+ import numpy as np
+
+ inputs1 = ["first query", "second query"]
+ inputs2 = ["third query", "fourth query"]
+ model = SionicEmbeddingModel(url="https://api.sionic.ai/v2/embedding",
+                              dimension=3072)
+ embedding1 = model.encode(inputs1)
+ embedding2 = model.encode(inputs2)
+ # Row-wise normalization again turns the matrix product into pairwise cosine similarities.
+ cos_similarity = (embedding1 / np.linalg.norm(embedding1, axis=1, keepdims=True)) @ (embedding2 / np.linalg.norm(embedding2, axis=1, keepdims=True)).T
+ print(cos_similarity)
+ ```
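+ The `dimension` argument should match the size of the vectors the endpoint returns (3072 in the v2 examples here).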
+ We apply an instruction when encoding short queries for retrieval tasks.
+ With `encode_queries()`, the instruction is prefixed to each query before encoding, as in the following example.
+ The recommended instruction for both the v1 and v2 models is `"query: "`.
+
+ ```python
+ from model_api import SionicEmbeddingModel
+ import numpy as np
+
+ query = ["first query", "second query"]
+ passage = ["This is a passage related to the first query", "This is a passage related to the second query"]
+ model = SionicEmbeddingModel(url="https://api.sionic.ai/v2/embedding",
+                              instruction="query: ",
+                              dimension=3072)
+ # encode_queries() prefixes the instruction to each query; encode_corpus() encodes the passages.
+ query_embedding = model.encode_queries(query)
+ passage_embedding = model.encode_corpus(passage)
+ cos_similarity = (query_embedding / np.linalg.norm(query_embedding, axis=1, keepdims=True)) @ (passage_embedding / np.linalg.norm(passage_embedding, axis=1, keepdims=True)).T
+ print(cos_similarity)
+ ```
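+
+ For reference, a client with the same calling surface can be sketched from the request shape shown above. This is only an illustrative sketch, not the actual `model_api.py`; the class name and error handling are assumptions.
+
+ ```python
+ from typing import List
+ import numpy as np
+ import requests
+
+ class MinimalEmbeddingClient:
+     """Illustrative sketch of a client like SionicEmbeddingModel (not the official model_api.py)."""
+
+     def __init__(self, url: str, dimension: int, instruction: str = ""):
+         self.url = url
+         self.dimension = dimension
+         self.instruction = instruction
+
+     def encode(self, texts: List[str]) -> np.ndarray:
+         # POST the texts in the {"inputs": [...]} shape used by the v2 endpoint.
+         response = requests.post(self.url, json={"inputs": texts})
+         response.raise_for_status()
+         embedding = np.asarray(response.json()["embedding"], dtype=np.float32)
+         assert embedding.shape == (len(texts), self.dimension)
+         return embedding
+
+     def encode_queries(self, queries: List[str]) -> np.ndarray:
+         # Prefix the instruction (e.g. "query: ") to each query, as described above.
+         return self.encode([self.instruction + query for query in queries])
+
+     def encode_corpus(self, passages: List[str]) -> np.ndarray:
+         # Passages are encoded without the query instruction.
+         return self.encode(passages)
+ ```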

## Massive Text Embedding Benchmark (MTEB) Evaluation

Both versions of Sionic AI's embeddings show state-of-the-art performance on the MTEB!
+ You can find the code to evaluate MTEB datasets with the Sionic embedding APIs [here](https://huggingface.co/sionic-ai/sionic-ai-v2/blob/main/mteb_evaluate.py); a minimal sketch of such an evaluation follows the table below.

| Model Name | Dimension | Sequence Length | Average (56) |
|:-----------------------------------------------------------------------:|:---------:|:---------------:|:------------:|
@@ -2623,3 +2728,4 @@ Both versions of Sionic AI's embedding show the state-of-the-art performances on
| [bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 1024 | 512 | 64.23 |
| [gte-large-en](https://huggingface.co/barisaydin/gte-large) | 1024 | 512 | 63.13 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings/types-of-embedding-models) | 1536 | 8191 | 60.99 |
+
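+
+ As referenced above, a minimal sketch of plugging an API-backed model into the `mteb` package might look as follows. The wrapper class and the single example task are illustrative assumptions; the linked `mteb_evaluate.py` is the reference implementation.
+
+ ```python
+ from typing import List
+ import numpy as np
+ import requests
+ from mteb import MTEB
+
+ class SionicAPIModel:
+     """Wraps the embedding API in the encode() interface MTEB expects (illustrative sketch)."""
+
+     def __init__(self, url: str = "https://api.sionic.ai/v2/embedding", batch_size: int = 32):
+         self.url = url
+         self.batch_size = batch_size
+
+     def encode(self, sentences: List[str], **kwargs) -> np.ndarray:
+         # Embed the sentences in batches to keep each request small.
+         vectors = []
+         for start in range(0, len(sentences), self.batch_size):
+             batch = sentences[start:start + self.batch_size]
+             response = requests.post(self.url, json={"inputs": batch})
+             response.raise_for_status()
+             vectors.extend(response.json()["embedding"])
+         return np.asarray(vectors, dtype=np.float32)
+
+ # Run one small task as a smoke test; the reported Average (56) spans 56 MTEB tasks.
+ evaluation = MTEB(tasks=["Banking77Classification"])
+ evaluation.run(SionicAPIModel(), output_folder="results/sionic-v2")
+ ```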