Instructions to use Sebastianpinar/lora2-39 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Sebastianpinar/lora2-39 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("image-classification", model="Sebastianpinar/lora2-39")
pipe("https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/parrots.png")

# Load model directly
from transformers import AutoImageProcessor, AutoModelForImageClassification

processor = AutoImageProcessor.from_pretrained("Sebastianpinar/lora2-39")
model = AutoModelForImageClassification.from_pretrained("Sebastianpinar/lora2-39")
```
- Notebooks
- Google Colab
- Kaggle
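The direct-load snippet above stops after loading the processor and model. A minimal sketch of running inference with them might look like the following (the image URL is the parrots example image from the Hugging Face documentation dataset; the predicted labels depend on this model's `id2label` config):

```python
import requests
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "Sebastianpinar/lora2-39"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

url = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/parrots.png"
image = Image.open(requests.get(url, stream=True).raw)

# Preprocess the image and run a forward pass without tracking gradients
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# The highest-scoring class index maps to a label via the model config
pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```

This mirrors what the `pipeline` helper does internally: preprocessing, a forward pass, and an argmax over the logits.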
- Xet hash: fd5de4059b0eefafc84bd6207a5b718bb368ab15361624d79c4ca8ab6897e88f
- Size of remote file: 4.09 kB
- SHA256: 5ce888f454deab98f12322bdf66521eccfb07794a7339248ab07864691cfbcc9
Xet efficiently stores large files inside Git by intelligently splitting them into unique chunks, accelerating uploads and downloads.