How to get only the model's answer as output

#18
by ibrim - opened

The model returns my prompt along with its generated output. How can I make it print only the answer?
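
With decoder-only models like MPT, `generate()` returns the prompt tokens followed by the new tokens, so you can slice the prompt off before decoding. A minimal sketch, assuming an MPT chat checkpoint loaded with `trust_remote_code=True` (the model id and prompt here are just for illustration):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative checkpoint; any MPT model works the same way.
model_name = "mosaicml/mpt-7b-chat"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)

prompt = "What is the capital of France?"
inputs = tokenizer(prompt, return_tensors="pt")

output_ids = model.generate(**inputs, max_new_tokens=64)

# generate() echoes the prompt tokens first, so skip the first
# input_ids.shape[1] tokens to keep only the newly generated answer.
answer_ids = output_ids[0][inputs["input_ids"].shape[1]:]
answer = tokenizer.decode(answer_ids, skip_special_tokens=True)
print(answer)
```

If you are using the `text-generation` pipeline instead of calling `generate()` directly, passing `return_full_text=False` should have the same effect of dropping the prompt from the output.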
