pete88b committed on
Commit
d514fca
1 Parent(s): eb0d815

Use BioGptModel to get features


We can't use `BioGptForCausalLM` to get the features, as it returns logits for next-token prediction. I'm pretty sure we want `BioGptModel` to give us `last_hidden_state`.
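
For reference, a minimal sketch of the intended feature-extraction usage (assuming a `transformers` version that ships BioGPT support); the per-token features come from `output.last_hidden_state`:

```python
# Sketch: BioGptModel returns hidden states suitable for feature extraction,
# whereas BioGptForCausalLM returns next-token logits.
from transformers import BioGptTokenizer, BioGptModel

tokenizer = BioGptTokenizer.from_pretrained("microsoft/biogpt")
model = BioGptModel.from_pretrained("microsoft/biogpt")

encoded_input = tokenizer("Replace me by any text you'd like.", return_tensors="pt")
output = model(**encoded_input)
features = output.last_hidden_state  # shape: (batch_size, sequence_length, hidden_size)
```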

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -30,9 +30,9 @@ set a seed for reproducibility:
 Here is how to use this model to get the features of a given text in PyTorch:
 
 ```python
-from transformers import BioGptTokenizer, BioGptForCausalLM
+from transformers import BioGptTokenizer, BioGptModel
 tokenizer = BioGptTokenizer.from_pretrained("microsoft/biogpt")
-model = BioGptForCausalLM.from_pretrained("microsoft/biogpt")
+model = BioGptModel.from_pretrained("microsoft/biogpt")
 text = "Replace me by any text you'd like."
 encoded_input = tokenizer(text, return_tensors='pt')
 output = model(**encoded_input)