gemma-7b-it doesn't answer for some questions and returns '\n'

#55
by mudogruer - opened

The model sometimes doesn't generate any answer. I updated the transformers library to 4.38.1, but nothing changed.

[Screenshot attached: Screenshot 2024-02-26 000015.png]

Google org

Hey, Surya from the Gemma team here -- are you using the right formatting template and control tokens? What are your sampling settings?
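For reference, the instruction-tuned Gemma checkpoints expect each turn wrapped in <start_of_turn> / <end_of_turn> markers. A minimal sketch of building such a prompt with the tokenizer's built-in chat template (assuming transformers >= 4.38):

from transformers import AutoTokenizer

# Let the tokenizer produce the Gemma prompt format for a single user turn.
tokenizer = AutoTokenizer.from_pretrained("google/gemma-7b-it")
chat = [{"role": "user", "content": "Which is the heaviest element?"}]
prompt = tokenizer.apply_chat_template(
    chat, tokenize=False, add_generation_prompt=True
)
print(prompt)
# Expected shape:
# <bos><start_of_turn>user
# Which is the heaviest element?<end_of_turn>
# <start_of_turn>model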

Hey, thanks for responding. Here are the settings:

from transformers import pipeline

pipe = pipeline(
    task="text-generation",
    model=fine_tuned_model,  # the fine-tuned Gemma model loaded earlier
    tokenizer=tokenizer,
    eos_token_id=model.config.eos_token_id,
    max_new_tokens=30,
)

def Sequence(prompt):
    sequence = pipe(
        prompt,
        max_new_tokens=50,  # overrides the 30 set on the pipeline
        temperature=0.5,
    )
    return sequence

prompt = ""+"Which is the heaviest element?"+""
print(Sequence(prompt))

Google org

This may not be using the right formatting such as <start_of_turn>, <end_of_turn> etc. -- does your prompt include those?
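Concretely, a single-turn prompt for the -it model should look roughly like this (using the question from above as the example):

# Manual Gemma single-turn prompt, ending with an open model turn.
prompt = (
    "<start_of_turn>user\n"
    "Which is the heaviest element?<end_of_turn>\n"
    "<start_of_turn>model\n"
)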

I guess it does.

def generate_prompt(sample):
    full_prompt = f"""{sample['question']}
{sample['correct_answer']}
"""
    return {"text": full_prompt}

It generates answers for about 40% of the prompts, but mostly it doesn't, and it isn't the exact same prompts that go unanswered. I created a loop over 200 prompts, ran it 3 times, and got the same results.
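Would wrapping each sample in the control tokens help? A hypothetical variant of generate_prompt, sketched under that assumption (field names taken from the snippet above):

def generate_prompt(sample):
    # Hypothetical: wrap question and answer in Gemma's turn markers so the
    # fine-tuned model sees the same format it will later be prompted with.
    full_prompt = (
        "<start_of_turn>user\n"
        f"{sample['question']}<end_of_turn>\n"
        "<start_of_turn>model\n"
        f"{sample['correct_answer']}<end_of_turn>\n"
    )
    return {"text": full_prompt}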

Hello @suryabhupa ,

I am trying to use the Gemma model for generation with LangChain and the HF text-generation pipeline, and the output is far from expectations:

Code:

import transformers
from langchain_community.llms.huggingface_pipeline import HuggingFacePipeline

model_id = 'google/gemma-7b-it'
tokenizer = transformers.AutoTokenizer.from_pretrained(model_id, token=access_token,
    cache_dir='./../hf_models/')

model = transformers.AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    device_map='auto',
    token=access_token
)

generate_text = transformers.pipeline(
    model=model,
    tokenizer=tokenizer,
    # return_full_text=True,
    task="text-generation",
    generation_config=generation_config,
)

llm = HuggingFacePipeline(pipeline=generate_text)

prompt = '''<start_of_turn>user
Explain why the sky is blue<end_of_turn>
<start_of_turn>model'''
output = llm.invoke(prompt)
print(output)

Output:

The sky is blue because of the scattering of of particles particles particles.

When light particles scattering scattering scattering particles particles scattering scattering scattering scattering scattering [... the word "scattering" repeats for several hundred more tokens ...]

Am I doing something incorrectly? What would be the right way? Could you please share your comments?

Google org

Hey @akshayparakh25, thanks for raising this; that is certainly not what the expected outputs should look like! @osanseviero, this may be a problem on the HF side -- does this look like the intended use case?

Google org

Hi there! You're using a generation_config in your code example but are not sharing its value. This can significantly impact the generations, so I think the chosen values might be leading to this.
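For comparison, here is a hypothetical generation_config that usually avoids this kind of degenerate repetition; the values below are illustrative, not the actual settings from the snippet above:

from transformers import GenerationConfig

# Illustrative values only; repetition loops like the output above often
# come from greedy decoding with no sampling or repetition penalty.
generation_config = GenerationConfig(
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
    repetition_penalty=1.1,
)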

@akshayparakh25 When I use the base model, I get the same weird response. When I try it with the -it (instruction-tuned) model, the problem goes away. cc @osanseviero @suryabhupa
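In code terms, the difference is just the checkpoint id:

model_id = "google/gemma-7b-it"  # instruction-tuned; trained on chat-formatted turns
# model_id = "google/gemma-7b"   # base model; not tuned on <start_of_turn> turns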

I'm having the same issue as you after fine-tuning the model with QLoRA. I also faced the same issue when I tried to use Gemma-7b: it gave me the same behaviour, along with another weird behaviour that I mentioned here: https://huggingface.co/google/gemma-7b/discussions/91
