How to provide the prompt

#1
by ShanGao - opened

Dear Authors,

Thanks for sharing the model. I read your manuscript, and it looks like a very interesting model. I played with it a bit but didn't get the expected results. I tried several different enzymes, and each time the generated sequences were very different from the prompt enzyme but similar to GFP, regardless of which enzyme I tried to generate.

Here is how I called it:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

device = "cuda" if torch.cuda.is_available() else "cpu"

model = AutoModelForCausalLM.from_pretrained("nferruz/1.24.3.1").to(device)
tokenizer = AutoTokenizer.from_pretrained("nferruz/1.24.3.1")
my_pipe = pipeline('text-generation', model=model, tokenizer=tokenizer, device=device)
prompt = "2.3.1.4"  # example prompt value; also tried others
sequences = my_pipe(prompt, repetition_penalty=1.2, top_k=9, top_p=1.0, temperature=1.0, ...)

Please advise.

Thanks!

Hi, this model is fine-tuned on the 1.24.3.1 class, so it will only output sequences from that class. If you'd like to generate sequences for a different enzyme, please use ZymCTRL: https://huggingface.co/AI4PD/ZymCTRL
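In case it helps, here is a minimal sketch of how a call to ZymCTRL could look, assuming the prompt is the bare EC number as in your snippet. The checkpoint name comes from the link above; the generation settings and the `strip_prompt` helper are illustrative assumptions, not the official usage:

```python
def generate_for_ec(ec_number: str, n: int = 2):
    """Generate candidate enzyme sequences conditioned on an EC number.

    Assumes ZymCTRL accepts the EC number itself as the prompt;
    generation settings here are illustrative, not prescriptive.
    """
    from transformers import pipeline  # heavy import, kept local

    pipe = pipeline("text-generation", model="AI4PD/ZymCTRL")
    outputs = pipe(
        ec_number,              # e.g. "2.3.1.4" -- the EC class to target
        num_return_sequences=n,
        do_sample=True,
        top_k=9,
        repetition_penalty=1.2,
    )
    return [strip_prompt(o["generated_text"], ec_number) for o in outputs]

def strip_prompt(text: str, prompt: str) -> str:
    """Drop the EC-number prompt and any whitespace from the raw output,
    leaving only the amino-acid sequence (hypothetical post-processing)."""
    seq = text[len(prompt):] if text.startswith(prompt) else text
    return "".join(seq.split())
```

Then `generate_for_ec("2.3.1.4")` would return a list of cleaned candidate sequences for that class.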

Best
Noelia

nferruz changed discussion status to closed
