Bloom3B set_seed

#13
by JoseGris - opened

Hi,

I am trying to get different results with BLOOM while always using the same prompt.

In some Colabs I have seen that people use the following set_seed:

from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed
set_seed(X)

But every time I change X, I get the same result.

How can I change the seed? Is there any way to get a different result every time I run the same prompt?

Best regards and thanks in advance,
Jose

BigScience Workshop org

How are you generating text? Can you share a code snippet please?

BigScience Workshop org

I believe the do_sample parameter defaults to False. Try setting it to True in your call to generate; otherwise you're doing greedy decoding, which is deterministic and unaffected by random seeds.
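
For example, something along these lines (just a minimal sketch, assuming the bigscience/bloom-3b checkpoint; the prompt is only a placeholder):

from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed

tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-3b")
model = AutoModelForCausalLM.from_pretrained("bigscience/bloom-3b")

set_seed(42)  # changing the seed changes the output once sampling is enabled

inputs = tokenizer("The weather today is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,  # sample from the distribution instead of greedy decoding
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))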

Thanks a lot cakiki, but it seems to do the same with do_sample=True:

[Screenshot: generate() call with do_sample=True but top_k=1]

BigScience Workshop org

I think that's because you're setting top_k to 1, which basically means greedy decoding since there's only one (k=1) token to sample from. Can you try with top_k set to something like 100? Better yet, try using top_p instead of top_k.
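
Something like this (a sketch reusing the model, tokenizer, and inputs from above; the values 100 and 0.9 are just examples):

set_seed(123)

# top-k: sample from the 100 most likely tokens at each step
out_top_k = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_k=100)

# top-p (nucleus) sampling: sample from the smallest set of tokens whose
# cumulative probability exceeds 0.9 (top_k=0 disables the top-k filter)
out_top_p = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9, top_k=0)

print(tokenizer.decode(out_top_k[0], skip_special_tokens=True))
print(tokenizer.decode(out_top_p[0], skip_special_tokens=True))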

Thanks cakiki, with top_k=50 and do_sample=True it works perfectly!

[Screenshot: working generation code with do_sample=True and top_k=50]
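
For reference, a rough sketch of that kind of setup (not the exact code from the screenshot), which prints a different completion for each seed:

for seed in (0, 1, 2):
    set_seed(seed)
    out = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_k=50)
    print(f"seed={seed}: {tokenizer.decode(out[0], skip_special_tokens=True)}")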

BigScience Workshop org

Nice!

cakiki changed discussion status to closed
