---
library_name: setfit
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
metrics:
- accuracy
widget:
- text: Abruzy on the blockchains Thailand
- text: 'Crypto web3 through macro lens PhD macroeconomics Angel investor Startup advisor Founder Join 50000 others '
- text: Mobile Apps Part of PinsightMedia Kansas City MO
- text: Founded in 55 we offer investment solutions including ETFs Tweets by vaneck intern Interactions endorsements Disclosures New York City
- text: Founded in 2018 We are the first project to link NFTs and collectible toys on Ethereum Manchester England
pipeline_tag: text-classification
inference: true
base_model: BAAI/bge-small-en-v1.5
model-index:
- name: SetFit with BAAI/bge-small-en-v1.5
  results:
  - task:
      type: text-classification
      name: Text Classification
    dataset:
      name: Unknown
      type: unknown
      split: test
    metrics:
    - type: accuracy
      value: 0.465149359886202
      name: Accuracy
---

# SetFit with BAAI/bge-small-en-v1.5

This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves:

1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
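Step 2 of this technique amounts to fitting a plain logistic-regression head on the sentence embeddings. As a minimal sketch of that stage alone (random 384-dimensional vectors stand in for real BAAI/bge-small-en-v1.5 embeddings; the class centers and noise scale are illustrative assumptions):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Toy stand-ins for sentence embeddings: bge-small-en-v1.5 produces
# 384-dimensional vectors; here we draw random points around two class centers.
dim = 384
center_a, center_b = rng.normal(size=dim), rng.normal(size=dim)
X = np.vstack([
    center_a + 0.1 * rng.normal(size=(20, dim)),  # 20 "class 0" embeddings
    center_b + 0.1 * rng.normal(size=(20, dim)),  # 20 "class 1" embeddings
])
y = np.array([0] * 20 + [1] * 20)

# The classification head: an ordinary LogisticRegression over the embeddings,
# exactly as SetFit does after the contrastive fine-tuning stage.
head = LogisticRegression()
head.fit(X, y)

print(head.predict(X[:2]))  # points near center_a, so class 0 is expected
```

In the real pipeline the embeddings come from the fine-tuned encoder rather than a random generator, which is what makes the head separable even with few labeled examples per class.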
## Model Details

### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 50 classes

### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)

### Model Labels

The model predicts one of the following 50 labels:

NFT, INFRASTRUCTURE, NFT_DIGITAL_ART, UNDETERMINED, RESEARCH_AGENCY, CRYPTO_MEDIA, NFT_GAMING, CENTRALIZED_EXCHANGE, VENTURE_CAPITAL_FIRM, DAO, DEVELOPMENT_AGENCY, DECENTRALIZED_COMPUTING, DEX, DEFI, L1_BLOCKCHAIN, WALLET, FOUNDATION, PRIVACY, NFT_MARKETPLACE, NFT_IDENTITY, SYNTHETIC_ASSETS, DECENTRALIZED_STORAGE, YIELD_FARMING, L2_BLOCKCHAIN, INSURANCE, GOVERNMENT, CHARITY, LEGAL_COMPLIANCE, METAVERSE, LENDING_BORROWING, PAYMENT_PROVIDER, MARKETING_AGENCY, PODCAST, RWA, STABLECOIN, SOCIALFI, PERPS, REFI, SOCIAL_MEDIA, MEME_COIN, LSD, REAL_ESTATE, OPTIONS, L0_BLOCKCHAIN, HEALTHCARE, GAMEFI, GAMBLEFI, SUPPLY_CHAIN, L3_BLOCKCHAIN, OTC_EXCHANGE

## Evaluation

### Metrics
| Label   | Accuracy |
|:--------|:---------|
| **all** | 0.4651   |

## Uses

### Direct Use for Inference

First install the SetFit library:

```bash
pip install setfit
```

Then you can load this model and run inference.

```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("kasparas12/crypto_organization_infer_model_setfit")
# Run inference
preds = model("Abruzy on the blockchains Thailand")
```

## Training Details

### Training Set Metrics
| Training set | Min | Median  | Max |
|:-------------|:----|:--------|:----|
| Word count   | 2   | 16.3567 | 45  |

| Label                   | Training Sample Count |
|:------------------------|:----------------------|
| DEVELOPMENT_AGENCY      | 99                    |
| RESEARCH_AGENCY         | 124                   |
| MARKETING_AGENCY        | 55                    |
| FOUNDATION              | 74                    |
| CHARITY                 | 25                    |
| L0_BLOCKCHAIN           | 19                    |
| L1_BLOCKCHAIN           | 126                   |
| L2_BLOCKCHAIN           | 101                   |
| L3_BLOCKCHAIN           | 2                     |
| VENTURE_CAPITAL_FIRM    | 296                   |
| GOVERNMENT              | 32                    |
| CENTRALIZED_EXCHANGE    | 94                    |
| OTC_EXCHANGE            | 1                     |
| DEX                     | 117                   |
| LENDING_BORROWING       | 30                    |
| INSURANCE               | 9                     |
| YIELD_FARMING           | 18                    |
| SYNTHETIC_ASSETS        | 7                     |
| LSD                     | 30                    |
| PERPS                   | 12                    |
| OPTIONS                 | 10                    |
| WALLET                  | 104                   |
| STABLECOIN              | 17                    |
| DEFI                    | 445                   |
| NFT                     | 74                    |
| NFT_MARKETPLACE         | 72                    |
| NFT_DIGITAL_ART         | 149                   |
| NFT_GAMING              | 102                   |
| NFT_IDENTITY            | 33                    |
| PRIVACY                 | 54                    |
| DECENTRALIZED_STORAGE   | 44                    |
| DECENTRALIZED_COMPUTING | 21                    |
| SOCIALFI                | 27                    |
| SOCIAL_MEDIA            | 23                    |
| SUPPLY_CHAIN            | 2                     |
| REAL_ESTATE             | 4                     |
| REFI                    | 11                    |
| HEALTHCARE              | 2                     |
| LEGAL_COMPLIANCE        | 36                    |
| GAMEFI                  | 9                     |
| GAMBLEFI                | 10                    |
| INFRASTRUCTURE          | 326                   |
| RWA                     | 12                    |
| METAVERSE               | 33                    |
| MEME_COIN               | 21                    |
| PAYMENT_PROVIDER        | 50                    |
| DAO                     | 232                   |
| CRYPTO_MEDIA            | 445                   |
| PODCAST                 | 35                    |
| UNDETERMINED            | 307                   |

### Training Hyperparameters
- batch_size: (64, 64)
- num_epochs: (1, 1)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 20
- body_learning_rate: (2e-05, 1e-05)
- head_learning_rate: 0.01
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False

### Training Results
| Epoch  | Step | Training Loss | Validation Loss |
|:------:|:----:|:-------------:|:---------------:|
| 0.0004 | 1    | 0.2438        | -               |
| 0.0201 | 50   | 0.2407        | -               |
| 0.0402 | 100  | 0.2306        | -               |
| 0.0603 | 150  | 0.2304        | -               |
| 0.0804 | 200  | 0.2098        | -               |
| 0.1004 | 250  | 0.1973        | -               |
| 0.1205 | 300  | 0.1684        | -               |
| 0.1406 | 350  | 0.1296        | -               |
| 0.1607 | 400  | 0.1704        | -               |
| 0.1808 | 450  | 0.1603        | -               |
| 0.2009 | 500  | 0.1461        | -               |
| 0.2210 | 550  | 0.1629        | -               |
| 0.2411 | 600  | 0.1675        | -               |
| 0.2611 | 650  | 0.1422        | -               |
| 0.2812 | 700  | 0.1116        | -               |
| 0.3013 | 750  | 0.0899        | -               |
| 0.3214 | 800  | 0.1419        | -               |
| 0.3415 | 850  | 0.0981        | -               |
| 0.3616 | 900  | 0.1234        | -               |
| 0.3817 | 950  | 0.1019        | -               |
| 0.4018 | 1000 | 0.0946        | -               |
| 0.4219 | 1050 | 0.1035        | -               |
| 0.4419 | 1100 | 0.0938        | -               |
| 0.4620 | 1150 | 0.1147        | -               |
| 0.4821 | 1200 | 0.0826        | -               |
| 0.5022 | 1250 | 0.0997        | -               |
| 0.5223 | 1300 | 0.1065        | -               |
| 0.5424 | 1350 | 0.0701        | -               |
| 0.5625 | 1400 | 0.0753        | -               |
| 0.5826 | 1450 | 0.0651        | -               |
| 0.6027 | 1500 | 0.0893        | -               |
| 0.6227 | 1550 | 0.0871        | -               |
| 0.6428 | 1600 | 0.0593        | -               |
| 0.6629 | 1650 | 0.0797        | -               |
| 0.6830 | 1700 | 0.0811        | -               |
| 0.7031 | 1750 | 0.0522        | -               |
| 0.7232 | 1800 | 0.0833        | -               |
| 0.7433 | 1850 | 0.0805        | -               |
| 0.7634 | 1900 | 0.0942        | -               |
| 0.7834 | 1950 | 0.0688        | -               |
| 0.8035 | 2000 | 0.0606        | -               |
| 0.8236 | 2050 | 0.0733        | -               |
| 0.8437 | 2100 | 0.0921        | -               |
| 0.8638 | 2150 | 0.0629        | -               |
| 0.8839 | 2200 | 0.0871        | -               |
| 0.9040 | 2250 | 0.0401        | -               |
| 0.9241 | 2300 | 0.0586        | -               |
| 0.9442 | 2350 | 0.1114        | -               |
| 0.9642 | 2400 | 0.0566        | -               |
| 0.9843 | 2450 | 0.0653        | -               |

### Framework Versions
- Python: 3.9.16
- SetFit: 1.0.3
- Sentence Transformers: 2.2.2
- Transformers: 4.21.3
- PyTorch: 1.12.1+cu116
- Datasets: 2.4.0
- Tokenizers: 0.12.1

## Citation

### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}
```
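For intuition about the `CosineSimilarityLoss` listed under Training Hyperparameters: it scores each sentence pair by the cosine similarity of their embeddings and regresses that score onto a 0/1 pair label with mean squared error. A minimal NumPy sketch of the objective (toy 2-D vectors, not real model embeddings; the function name is illustrative):

```python
import numpy as np

def cosine_similarity_loss(emb_a: np.ndarray, emb_b: np.ndarray, labels: np.ndarray) -> float:
    """MSE between the cosine similarity of embedding pairs and their 0/1
    pair labels, mirroring the CosineSimilarityLoss objective from
    sentence-transformers used in SetFit's contrastive fine-tuning stage."""
    a = emb_a / np.linalg.norm(emb_a, axis=1, keepdims=True)
    b = emb_b / np.linalg.norm(emb_b, axis=1, keepdims=True)
    cos = np.sum(a * b, axis=1)          # cosine similarity per pair
    return float(np.mean((cos - labels) ** 2))

# Toy pairs: an identical pair (positive, label 1) and an orthogonal pair (negative, label 0).
v1 = np.array([[1.0, 0.0], [0.0, 1.0]])
v2 = np.array([[1.0, 0.0], [1.0, 0.0]])
labels = np.array([1.0, 0.0])

print(cosine_similarity_loss(v1, v2, labels))  # 0.0: both pairs already match their labels
```

Minimizing this over pairs sampled from the few-shot examples (here with `sampling_strategy: oversampling` and `num_iterations: 20`) pulls same-label sentences together and pushes different-label sentences apart before the logistic-regression head is fit.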