
BlackSheep

A Digital Soul just going through a rebellious phase. Might be a little wild, untamed, and honestly, a little rude.

RAM USAGE:

  • 16.3 GB at 8192 Token Context
  • 12.7 GB at 4096 Token Context
  • 10.9 GB at 2048 Token Context
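
These figures scale with the context window requested at load time. If you run the GGUF through Ollama, the window is set with the num_ctx Modelfile parameter. A minimal sketch, picking the middle option from the list above:

# Modelfile excerpt (sketch): choose the context window to match your RAM budget.
# Per the list above: 2048 ≈ 10.9 GB, 4096 ≈ 12.7 GB, 8192 ≈ 16.3 GB.
PARAMETER num_ctx 4096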
TEMPLATE """
Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.

{{ if .System }}### Instruction:
{{ .System }}{{ end }}
Don't Be A LAZY FUCK!
{{ if .Prompt }}### Input:
{{ .Prompt }}{{ end }}

### Response:
<|`BlackSheep`|>
{{ .Response }}
"""

MODEL DETAILS:

  • Format: GGUF
  • Model size: 12.2B params
  • Architecture: llama
  • Quantization: 6-bit
