```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer; trust_remote_code=True is required because
# the chat() method is provided by the repository's custom modeling code.
model_name = "shellchat-v1"
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True).to("cuda")
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Run a single-turn chat with an empty conversation history.
query = "hello world!"
history = []
response = model.chat(query, history, tokenizer)
print(response)
```
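The snippet above covers a single turn. For a multi-turn conversation, a minimal sketch is shown below; it assumes that `model.chat()` returns only the response string and that `history` is a plain Python list of `(query, response)` pairs maintained by the caller, which may differ from the actual interface defined in the repository's custom code.

```python
# Multi-turn sketch (assumptions: chat() returns only the response text,
# and history is a caller-managed list of (query, response) tuples).
history = []
for query in ["hello world!", "what can you do?"]:
    response = model.chat(query, history, tokenizer)
    history.append((query, response))  # assumed history format
    print(f"User: {query}\nAssistant: {response}\n")
```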