ValueError: The following `model_kwargs` are not used by the model: ['return_full_text']

#4 by Nightbird07 - opened

I have tried both removing and adding `return_full_text`, but the error is still there.



import streamlit as st
from transformers import pipeline
from langchain.llms import HuggingFacePipeline

@st.cache_resource
def llm_pipeline():
    # base_model and tokenizer are loaded earlier in the app
    pipe = pipeline(
        'text2text-generation',
        model=base_model,
        tokenizer=tokenizer,
        max_length=256,
        do_sample=True,
        temperature=0.3,
        top_p=0.95,
    )
    local_llm = HuggingFacePipeline(pipeline=pipe)
    return local_llm
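
As far as I can tell, the error only shows up once the model is actually invoked (that is when transformers validates the generation kwargs), so here is a minimal sketch of how the cached LLM gets called; the prompt text is just a placeholder, not my real app code.

# Minimal sketch of invoking the cached LLM; the prompt below is a
# placeholder, not the actual app code.
local_llm = llm_pipeline()

# The ValueError about `return_full_text` shows up here, once the
# underlying pipeline runs generate().
output = local_llm("Summarize the following text: Streamlit caches the model with st.cache_resource ...")
print(output)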
