text-generation-inference random tokens

#3
by dengie - opened

I tried to serve the model with TGI (text-generation-inference). Regardless of the prompt, the model returned only random tokens (mostly 'O' and '\n'). I also tried setting the "--trust-remote-code" flag, but it made no difference.
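
For reference, a minimal sketch of the kind of request that triggers this, assuming a local TGI server on port 8080 and the standard /generate route (the port and prompt here are placeholders, not my exact setup):

```python
import requests

# Query a local text-generation-inference server via its /generate route.
# The host, port, prompt, and max_new_tokens value are placeholders.
response = requests.post(
    "http://localhost:8080/generate",
    json={
        "inputs": "What is the capital of Germany?",
        "parameters": {"max_new_tokens": 50},
    },
    timeout=60,
)

# Expected: a coherent completion; observed: mostly 'O' and '\n' tokens.
print(response.json()["generated_text"])
```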
