Instructions for using tinycompany/ShawtyIsBad-e5-large with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use tinycompany/ShawtyIsBad-e5-large with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="tinycompany/ShawtyIsBad-e5-large")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("tinycompany/ShawtyIsBad-e5-large")
model = AutoModelForCausalLM.from_pretrained("tinycompany/ShawtyIsBad-e5-large")
```

- Notebooks
- Google Colab
- Kaggle
- Local Apps
- vLLM
How to use tinycompany/ShawtyIsBad-e5-large with vLLM:
Install from pip and serve the model:

```shell
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "tinycompany/ShawtyIsBad-e5-large"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "tinycompany/ShawtyIsBad-e5-large",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```

Use Docker:

```shell
docker model run hf.co/tinycompany/ShawtyIsBad-e5-large
```
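The curl call above can also be made from Python. This is a minimal sketch, assuming a vLLM server is already listening on `localhost:8000` as started above; the helper name `build_completion_request` is illustrative, not part of any library:

```python
import json

# Assumed endpoint of a locally running vLLM server (see `vllm serve` above).
VLLM_URL = "http://localhost:8000/v1/completions"

def build_completion_request(model: str, prompt: str,
                             max_tokens: int = 512,
                             temperature: float = 0.5) -> dict:
    """Build the JSON body for an OpenAI-compatible /v1/completions call."""
    return {
        "model": model,
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

payload = build_completion_request("tinycompany/ShawtyIsBad-e5-large",
                                   "Once upon a time,")
body = json.dumps(payload).encode("utf-8")
print(body.decode("utf-8"))

# Once the server is running, the same payload can be POSTed, e.g.:
# import urllib.request
# req = urllib.request.Request(
#     VLLM_URL, data=body, headers={"Content-Type": "application/json"})
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["text"])
```

The body mirrors the `--data` argument of the curl example, so the two calls are interchangeable.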
- SGLang
How to use tinycompany/ShawtyIsBad-e5-large with SGLang:
Install from pip and serve the model:

```shell
# Install SGLang from pip:
pip install sglang

# Start the SGLang server:
python3 -m sglang.launch_server \
  --model-path "tinycompany/ShawtyIsBad-e5-large" \
  --host 0.0.0.0 \
  --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "tinycompany/ShawtyIsBad-e5-large",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```

Use Docker images:

```shell
docker run --gpus all \
  --shm-size 32g \
  -p 30000:30000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  --env "HF_TOKEN=<secret>" \
  --ipc=host \
  lmsysorg/sglang:latest \
  python3 -m sglang.launch_server \
    --model-path "tinycompany/ShawtyIsBad-e5-large" \
    --host 0.0.0.0 \
    --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "tinycompany/ShawtyIsBad-e5-large",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```

- Docker Model Runner
How to use tinycompany/ShawtyIsBad-e5-large with Docker Model Runner:
```shell
docker model run hf.co/tinycompany/ShawtyIsBad-e5-large
```
Metadata:

```yaml
library_name: transformers
license: mit
```
Mean Perplexity: 1425.6071816176866
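The perplexity figure above is reported without its evaluation setup. Mean perplexity is commonly computed as the average, over evaluation sequences, of exp of the per-token mean negative log-likelihood; the sketch below assumes that definition (the averaging scheme is an assumption, not taken from this card):

```python
import math

def mean_perplexity(nll_per_sequence):
    """Average, over sequences, of exp(per-token mean negative log-likelihood)."""
    return sum(math.exp(nll) for nll in nll_per_sequence) / len(nll_per_sequence)

# Example: two sequences with per-token mean NLLs of ln(4) and ln(2)
# have perplexities 4 and 2, giving a mean perplexity of 3.
print(mean_perplexity([math.log(4), math.log(2)]))
```

Note that averaging per-sequence perplexities (as here) and exponentiating a corpus-level mean NLL give different numbers; which one a reported figure uses should be checked against the evaluation code.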
| # | Tokenizer | English | Hindi | Tamil | Bengali | Malayalam | Telugu | Gujarati | Punjabi | Code_Python | Code_Java | C++ | Math |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | deepseek-ai/DeepSeek-R1 (128k) | 338874 | 22855 | 48957 | 39617 | 73928 | 40345 | 101020 | 79172 | 5231 | 2224 | 7055 | 5376 |
| 1 | unsloth/phi-4 (100k) | 308645 | 40456 | 59750 | 116122 | 149889 | 48689 | 118335 | 87413 | 4809 | 2110 | 6529 | 5573 |
| 2 | deepseek-ai/DeepSeek-R1-Distill-Llama-8B (128k) | 308512 | 21110 | 59625 | 115138 | 149883 | 48661 | 118061 | 86765 | 4809 | 2111 | 6530 | 5574 |
| 3 | unsloth/gemma-2-9b-it (256k) | 323335 | 15916 | 53913 | 53402 | 57219 | 47610 | 107925 | 87222 | 5948 | 2569 | 8639 | 5871 |
| 4 | Ornaments/72k-Bilingual-BBPE-TK-SPM (72k) | 366710 | 11447 | 61408 | 94191 | 97207 | 50229 | 117874 | 90045 | 8201 | 4000 | 13706 | 5585 |
| 5 | Ornaments/72k-Bilingual-BBPE-TK-SPM-Identity (72k) | 330830 | 10318 | 59089 | 93740 | 92655 | 44975 | 109411 | 87922 | 7819 | 3743 | 12953 | 5253 |
| 6 | Ornaments/72k-TK-BBPE-HF (72k) | 321274 | 10813 | 67585 | 159985 | 193813 | 55654 | 134397 | 97063 | 5225 | 2263 | 7090 | 5150 |
| 7 | nvidia/Nemotron-4-Mini-Hindi-4B-Instruct (256k) | 332271 | 14327 | 55473 | 36615 | 45783 | 48270 | 160115 | 117174 | 6186 | 2732 | 8861 | 6136 |
| 8 | sarvamai/OpenHathi-7B-Hi-v0.1-Base (48k) | 370133 | 15633 | 67845 | 120340 | 105953 | 68315 | 159122 | 113817 | 6595 | 2792 | 9233 | 6223 |
| 9 | sarvamai/sarvam-1 (68k) | 385386 | 11257 | 61396 | 27348 | 31822 | 51463 | 119666 | 103344 | 7331 | 3068 | 9724 | 6864 |
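The per-language figures above appear to be total token counts produced by each tokenizer over a shared corpus (lower means the tokenizer is more efficient for that language). A minimal sketch of that measurement; the corpus and the whitespace stand-in tokenizer are illustrative, and with a real Hugging Face tokenizer `encode` would be `AutoTokenizer.from_pretrained(...).encode`:

```python
def total_token_count(encode, texts):
    """Sum the number of tokens `encode` produces over a list of texts."""
    return sum(len(encode(t)) for t in texts)

# Stand-in tokenizer: whitespace split (a real run would use a subword
# tokenizer such as the ones compared in the table above).
corpus = ["Once upon a time,", "there was a tokenizer benchmark."]
print(total_token_count(str.split, corpus))  # → 9
```

Running the same corpus through each tokenizer and comparing the totals per language yields a table of the shape shown above.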