Update README.md to add a description of accessing Sionic AI's Embedding API v1
language:
- en
library_name: transformers
---

# Sionic AI Embedding API v1

## About Sionic AI

Homepage: https://sionic.ai/

Sionic AI delivers more accessible and cost-effective AI technology, addressing a wide range of needs to boost productivity and drive innovation.

Large Language Models (LLMs) are not only for research and experimentation. We offer solutions that leverage LLMs to add value to your business, so that anyone can easily train and control AI.

You can try our product [here](https://www.s9m.ai) for free!

## How to get embeddings

To get embeddings, call the API endpoint with your text. You can send either a single sentence or multiple sentences, and the embeddings corresponding to your inputs will be returned.

API Endpoint: https://api.sionic.ai/v1/embedding

Example request:
```shell
curl https://api.sionic.ai/v1/embedding \
  -H "Content-Type: application/json" \
  -d '{
    "inputs": ["first query", "second query", "third query"]
  }'
```

Example response:
```json
{
  "embedding": [
    [
      0.1380517,
      0.0749767,
      -0.0600897,
      0.6106221,
      -0.3284067,
      ...
    ],
    [
      -0.0237823,
      -0.103611,
      -0.0491666,
      0.671397,
      -0.8827474,
      ...
    ],
    [
      0.0137392,
      -0.1101281,
      -0.2256125,
      0.7899137,
      -0.8847492,
      ...
    ]
  ]
}
```
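
The same call can be made from Python. The sketch below uses only the standard library and assumes the request and response shapes shown in the curl example; the function names (`build_request`, `get_embeddings`) are illustrative, not part of any official SDK.

```python
import json
import urllib.request

API_URL = "https://api.sionic.ai/v1/embedding"

def build_request(texts):
    """Build the JSON payload expected by the endpoint (see the curl example)."""
    return json.dumps({"inputs": list(texts)}).encode("utf-8")

def get_embeddings(texts):
    """POST the input sentences and return one embedding vector per input."""
    req = urllib.request.Request(
        API_URL,
        data=build_request(texts),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embedding"]

# Usage (performs a network call):
#   vectors = get_embeddings(["first query", "second query"])
#   # vectors is a list with one embedding per input sentence
```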

## Massive Text Embedding Benchmark (MTEB) Evaluation

Both versions of Sionic AI's embedding models achieve state-of-the-art performance on the MTEB!
You can find the code to evaluate the MTEB datasets with the v1 embedding [here](https://huggingface.co/sionic-ai/sionic-ai-v1/blob/main/mteb_evaluate.py).

| Model Name | Dimension | Sequence Length | Average (56) |
|:-----------------------------------------------------------------------:|:---------:|:---------------:|:------------:|
| [sionic-ai/sionic-ai-v2](https://huggingface.co/sionic-ai/sionic-ai-v2) | 3072 | 512 | **65.23** |
| [sionic-ai/sionic-ai-v1](https://huggingface.co/sionic-ai/sionic-ai-v1) | 2048 | 512 | 64.92 |
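
Since the API returns raw vectors, similarity between sentences can be computed client-side. A minimal sketch with plain cosine similarity follows; the sample vectors are the truncated values from the example response above, so the numbers are illustrative only.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Truncated vectors from the example response above (illustrative only).
v1 = [0.1380517, 0.0749767, -0.0600897, 0.6106221, -0.3284067]
v2 = [-0.0237823, -0.103611, -0.0491666, 0.671397, -0.8827474]
print(round(cosine_similarity(v1, v2), 4))
```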