Use BioGptModel to get features
We can't use `BioGptForCausalLM` to get the features, as it returns logits for next-token prediction. I'm pretty sure we want `BioGptModel`, which gives us `last_hidden_state`.
README.md (CHANGED)

@@ -30,9 +30,9 @@ set a seed for reproducibility:
 Here is how to use this model to get the features of a given text in PyTorch:

 ```python
-from transformers import BioGptTokenizer, BioGptForCausalLM
+from transformers import BioGptTokenizer, BioGptModel
 tokenizer = BioGptTokenizer.from_pretrained("microsoft/biogpt")
-model = BioGptForCausalLM.from_pretrained("microsoft/biogpt")
+model = BioGptModel.from_pretrained("microsoft/biogpt")
 text = "Replace me by any text you'd like."
 encoded_input = tokenizer(text, return_tensors='pt')
 output = model(**encoded_input)
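For reviewers, here is a minimal sanity-check sketch (not part of the diff) contrasting the two heads: `BioGptModel` returns `last_hidden_state`, the per-token features this change is after, while `BioGptForCausalLM` returns next-token `logits` over the vocabulary. Model and tokenizer names follow the snippet above.

```python
import torch
from transformers import BioGptTokenizer, BioGptModel, BioGptForCausalLM

tokenizer = BioGptTokenizer.from_pretrained("microsoft/biogpt")
encoded_input = tokenizer("Replace me by any text you'd like.", return_tensors="pt")

feature_model = BioGptModel.from_pretrained("microsoft/biogpt")
lm_model = BioGptForCausalLM.from_pretrained("microsoft/biogpt")

with torch.no_grad():
    features = feature_model(**encoded_input)
    lm_out = lm_model(**encoded_input)

# (batch, seq_len, hidden_size) -- per-token features from BioGptModel
print(features.last_hidden_state.shape)
# (batch, seq_len, vocab_size) -- next-token logits from BioGptForCausalLM, not features
print(lm_out.logits.shape)
```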