Instructions for using YakovElm/Hyperledger20SetFitModel_balance_ratio_2 with libraries, inference providers, notebooks, and local apps.

## Libraries

### sentence-transformers

How to use YakovElm/Hyperledger20SetFitModel_balance_ratio_2 with sentence-transformers:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("YakovElm/Hyperledger20SetFitModel_balance_ratio_2")
sentences = [
    "The weather is lovely today.",
    "It's so sunny outside!",
    "He drove to the stadium.",
]
embeddings = model.encode(sentences)
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)  # [3, 3]
```

### setfit

How to use YakovElm/Hyperledger20SetFitModel_balance_ratio_2 with setfit:

```python
from setfit import SetFitModel

model = SetFitModel.from_pretrained("YakovElm/Hyperledger20SetFitModel_balance_ratio_2")
```

## Notebooks

- Google Colab
- Kaggle
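The `model.similarity` call above computes pairwise similarity between the embeddings (cosine similarity by default in sentence-transformers), which is why a list of three sentences yields a `[3, 3]` matrix. A minimal pure-Python sketch of that computation, using toy 2-dimensional vectors in place of real model embeddings:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def similarity_matrix(embeddings):
    # Pairwise similarities, shape [n, n], analogous to model.similarity
    return [[cosine_similarity(a, b) for b in embeddings] for a in embeddings]

# Toy 2-dimensional "embeddings" for three sentences
emb = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]]
sims = similarity_matrix(emb)
print(len(sims), len(sims[0]))  # 3 3
```

Each diagonal entry is 1.0 (a vector compared with itself), and similar sentences score close to 1 while unrelated ones score near 0.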
metadata

```yaml
license: apache-2.0
tags:
- setfit
- sentence-transformers
- text-classification
pipeline_tag: text-classification
```
# YakovElm/Hyperledger20SetFitModel_balance_ratio_2
This is a SetFit model that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves:
- Fine-tuning a Sentence Transformer with contrastive learning.
- Training a classification head with features from the fine-tuned Sentence Transformer.
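The contrastive fine-tuning step in SetFit works on pairs of sentences labeled by whether they share a class. A simplified sketch of how such pairs can be generated from a handful of labeled examples (`generate_contrastive_pairs` is a hypothetical helper written for illustration, not part of the setfit API):

```python
from itertools import combinations

def generate_contrastive_pairs(texts, labels):
    """Build (text_a, text_b, similarity) triples: 1.0 when the two texts
    share a label (positive pair), 0.0 otherwise (negative pair)."""
    pairs = []
    for i, j in combinations(range(len(texts)), 2):
        pairs.append((texts[i], texts[j], 1.0 if labels[i] == labels[j] else 0.0))
    return pairs

examples = ["great movie", "loved it", "terrible film", "awful plot"]
labels = [1, 1, 0, 0]
pairs = generate_contrastive_pairs(examples, labels)
# 6 pairs total from 4 examples: 2 positive, 4 negative
```

Training the Sentence Transformer on these pairs pulls same-class embeddings together and pushes different-class embeddings apart, which is what makes the few-shot classification head effective.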
## Usage

To use this model for inference, first install the SetFit library:

```shell
python -m pip install setfit
```
You can then run inference as follows:
```python
from setfit import SetFitModel

# Download from the Hub
model = SetFitModel.from_pretrained("YakovElm/Hyperledger20SetFitModel_balance_ratio_2")

# Run inference
preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"])
```
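Under the hood, each prediction is a two-stage pass: embed the input with the fine-tuned Sentence Transformer, then apply the trained classification head to the embedding. A toy sketch of that flow, with a hypothetical hand-rolled embedding function and logistic-regression head standing in for the real model components:

```python
import math

def predict(text, embed, weights, bias):
    """Two-stage SetFit-style inference sketch: embed the text, then
    apply a logistic-regression classification head to the embedding."""
    x = embed(text)
    logit = sum(w * xi for w, xi in zip(weights, x)) + bias
    prob = 1.0 / (1.0 + math.exp(-logit))  # sigmoid
    return 1 if prob >= 0.5 else 0

# Toy "embedding": counts of a few positive and negative cue words
def toy_embed(text):
    pos = sum(w in text for w in ("loved", "great"))
    neg = sum(w in text for w in ("worst", "terrible"))
    return [float(pos), float(neg)]

weights, bias = [2.0, -2.0], 0.0
print(predict("i loved the spiderman movie!", toy_embed, weights, bias))    # 1
print(predict("pineapple on pizza is the worst", toy_embed, weights, bias)) # 0
```

In the real model, `embed` is the fine-tuned transformer and the head's weights come from the second training stage described above.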
## BibTeX entry and citation info

```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
  doi = {10.48550/ARXIV.2209.11055},
  url = {https://arxiv.org/abs/2209.11055},
  author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
  keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences},
  title = {Efficient Few-Shot Learning Without Prompts},
  publisher = {arXiv},
  year = {2022},
  copyright = {Creative Commons Attribution 4.0 International}
}
```