model output

#86
by foxsilverfox - opened

Hello,

Is it normal in text generation that the output prediction also contains the input?

Thx

OpenAI community org

Hey @foxsilverfox , that depends on the code you use to generate text.
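
For illustration, a minimal sketch (assuming a decoder-only model such as GPT-2, not code from this thread) of why the prompt can show up in the output: generate() returns the prompt token IDs followed by the newly generated ones, and the pipeline decodes the whole sequence by default unless told otherwise.

from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Hello, my name is", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20)

# output_ids[0] starts with the prompt tokens; slice them off to keep only the continuation.
new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))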

Hi @lysandre . Well, I have a dataset with input columns and one output column. I prepared my prompts and trained the model, and I am using text generation for inference. I am using the code below to generate text, and I replace the input (the prompt) with an empty string in the output. Is this the normal thing to do?
from transformers import pipeline, set_seed

def generate_predictions(model_name, tokenizer, prompts):
    # Build a text-generation pipeline from the fine-tuned model and tokenizer.
    generator = pipeline('text-generation', model=model_name, tokenizer=tokenizer)
    set_seed(42)
    predictions = []

    for prompt in prompts:
        generated_text = generator(prompt, max_length=250, num_return_sequences=1)
        # Strip the prompt from the returned text, keeping only the continuation.
        modified_text = generated_text[0]['generated_text'].replace(prompt, '')
        predictions.append(modified_text)

    return predictions
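
As an alternative to stripping the prompt manually, a sketch of a variant of the function above (same hypothetical signature, not the original code): the text-generation pipeline accepts return_full_text=False, which returns only the newly generated text, so no .replace() is needed.

from transformers import pipeline, set_seed

def generate_predictions(model_name, tokenizer, prompts):
    generator = pipeline('text-generation', model=model_name, tokenizer=tokenizer)
    set_seed(42)
    predictions = []
    for prompt in prompts:
        outputs = generator(
            prompt,
            max_length=250,
            num_return_sequences=1,
            return_full_text=False,  # drop the prompt, keep only the continuation
        )
        predictions.append(outputs[0]['generated_text'])
    return predictions

One caveat with the .replace(prompt, '') approach: it can also delete unintended matches if the prompt text happens to reappear in the generated continuation.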
