Can several different prompts be handled together?
chat1=[{'role':'user','content':'write a story'}]
chat2=[{'role':'user','content':'write another story'}]
Can I combine them into a list of chats so the model can handle them together?
This should be possible: apply the chat template from the tokenizer to each of the chats separately, then send the tokenized chats to the model as a batch.
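A minimal sketch of the two steps just described: render each chat as a single prompt string, then gather the prompts into a batch. The template format and helper below are made up for illustration; in practice you would call `tokenizer.apply_chat_template(chat)` from the transformers library instead.

```python
def apply_toy_template(chat):
    """Render a chat (a list of {'role', 'content'} dicts) as one prompt string.
    Hypothetical template markers; real chat templates are model-specific."""
    return "".join(f"<|{turn['role']}|>{turn['content']}<|end|>" for turn in chat)

chat1 = [{"role": "user", "content": "write a story"}]
chat2 = [{"role": "user", "content": "write another story"}]

# Step 1: template each chat separately.
prompts = [apply_toy_template(c) for c in (chat1, chat2)]

# Step 2: the list of prompt strings is what you hand to the tokenizer
# as a batch (e.g. tokenizer(prompts, padding=True, return_tensors="pt")).
print(prompts[0])
```

The key point is that templating happens per chat, while tokenization and padding happen once over the whole list.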
Thank you! But I am a new learner. Could you do me a favor and explain it in more detail? Thank you!
input1 = tokenizer.encode(prompt1, add_special_tokens=False, return_tensors="pt")
input2 = tokenizer.encode(prompt2, add_special_tokens=False, return_tensors="pt")
Then, since input1 and input2 are not the same length, should I pad the shorter one with zeros so that they share the same length? Then I would concatenate them and send the batch to the model to generate answers?
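Roughly, yes: the shorter sequence is padded so the batch is rectangular, and an attention mask tells the model which positions are padding. Below is a pure-Python sketch of that batching step, with made-up token ids and a pad id of 0 as an assumption; a real tokenizer does this for you via `tokenizer(prompts, padding=True, return_tensors="pt")`, which also returns the matching `attention_mask`. For decoder-only generation, padding usually goes on the left (`tokenizer.padding_side = "left"`) so every prompt ends at the same position.

```python
PAD_ID = 0  # assumption: the model's pad token id

def left_pad_batch(sequences, pad_id=PAD_ID):
    """Left-pad variable-length id lists to a common length.
    Returns the padded ids and an attention mask (0 = padding, 1 = real token)."""
    max_len = max(len(seq) for seq in sequences)
    ids, mask = [], []
    for seq in sequences:
        n_pad = max_len - len(seq)
        ids.append([pad_id] * n_pad + seq)
        mask.append([0] * n_pad + [1] * len(seq))
    return ids, mask

input1 = [15, 342, 87]           # hypothetical ids for prompt1
input2 = [15, 342, 87, 911, 4]   # hypothetical ids for prompt2
batch_ids, attention_mask = left_pad_batch([input1, input2])
```

You would then pass both the padded ids and the attention mask to `model.generate(input_ids=..., attention_mask=...)` so the model ignores the pad positions.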