---
license: apache-2.0
datasets:
- fine-tuned/SciFact-512-192-gpt-4o-2024-05-13-986812
- allenai/c4
language:
- en
pipeline_tag: feature-extraction
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- mteb
---

This model is a fine-tuned version of [**BAAI/bge-m3**](https://huggingface.co/BAAI/bge-m3), trained on the `fine-tuned/SciFact-512-192-gpt-4o-2024-05-13-986812` dataset for sentence-embedding and similarity tasks.

## How to Use

This model produces dense sentence embeddings that can be integrated into your NLP pipeline for tasks such as semantic search, sentence similarity, clustering, and retrieval. Here's a simple example to get you started:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim

# Load the fine-tuned model from the Hugging Face Hub
model = SentenceTransformer(
    'fine-tuned/SciFact-512-192-gpt-4o-2024-05-13-986812',
    trust_remote_code=True
)

# Encode two texts and compare them with cosine similarity
embeddings = model.encode([
    'first text to embed',
    'second text to embed'
])
print(cos_sim(embeddings[0], embeddings[1]))
```
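
Since the card lists a SciFact-based dataset, a typical use is retrieval-style ranking: embed a small corpus once, then score a query against it by cosine similarity. The sketch below reuses the same loading code; the corpus sentences, the query, and the printed ranking are purely illustrative and not part of the original card.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim

model = SentenceTransformer(
    'fine-tuned/SciFact-512-192-gpt-4o-2024-05-13-986812',
    trust_remote_code=True
)

# Illustrative corpus and query (placeholder texts, not from the model card)
corpus = [
    'Aspirin reduces the risk of cardiovascular events.',
    'The mitochondrion is the powerhouse of the cell.',
    'Transformers use self-attention to model token interactions.',
]
query = 'Which organelle produces most of the cellular energy?'

# Embed the corpus once, then embed each query at search time
corpus_embeddings = model.encode(corpus)
query_embedding = model.encode(query)

# Cosine similarity between the query and every corpus sentence
scores = cos_sim(query_embedding, corpus_embeddings)[0]

# Print corpus sentences from most to least similar
for idx in scores.argsort(descending=True).tolist():
    print(f'{scores[idx].item():.4f}  {corpus[idx]}')
```

For larger collections, the corpus embeddings can be computed once and cached or stored in a vector index rather than re-encoded for every query.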