---
license: apache-2.0
datasets:
- grantg123/deductiveinductivereasoning
language:
- en
---
Install the `simpletransformers` library and confirm the installed version:

```shell
pip install simpletransformers
pip freeze | grep simpletransformers
```

This model is based on `bert-base-uncased`. Load the tokenizer and model with the `transformers` library, then run an example input through the encoder:

```python
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

input_text = "You are hurting my feelings. You must be mean."
tokenized_input = tokenizer(input_text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**tokenized_input)

hidden_states = outputs.last_hidden_state  # per-token embeddings, shape (1, seq_len, 768)
pooled_output = outputs.pooler_output      # pooled [CLS] embedding, shape (1, 768)
```
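Before the model sees the text, the tokenizer splits each word into WordPiece sub-tokens against BERT's vocabulary. As a rough illustration of that step only, here is a toy WordPiece-style tokenizer using greedy longest-prefix matching over a tiny hand-made vocabulary (the vocabulary and the resulting splits are illustrative, not the real `bert-base-uncased` vocabulary):

```python
import re

# Tiny illustrative vocabulary; "##" marks a continuation sub-token.
# The real bert-base-uncased vocabulary has ~30k entries.
VOCAB = {"[CLS]", "[SEP]", "[UNK]", "you", "are", "hurt", "##ing", "my",
         "feel", "##ings", "must", "be", "mean", "."}

def wordpiece(word):
    """Split one lowercase word into sub-tokens by greedy longest-prefix match."""
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        prefix = "##" if start > 0 else ""
        while end > start and prefix + word[start:end] not in VOCAB:
            end -= 1
        if end == start:  # no sub-token matched: whole word becomes [UNK]
            return ["[UNK]"]
        pieces.append(prefix + word[start:end])
        start = end
    return pieces

def tokenize(text):
    """Lowercase, split into words and punctuation, WordPiece each word,
    and add the special tokens BERT expects."""
    words = re.findall(r"[a-z]+|[^\sa-z]", text.lower())
    tokens = ["[CLS]"]
    for w in words:
        tokens.extend(wordpiece(w))
    tokens.append("[SEP]")
    return tokens

print(tokenize("You are hurting my feelings. You must be mean."))
# ['[CLS]', 'you', 'are', 'hurt', '##ing', 'my', 'feel', '##ings', '.',
#  'you', 'must', 'be', 'mean', '.', '[SEP]']
```

Note how "hurting" and "feelings" are split into a root plus a `##` continuation piece; the real tokenizer does the same kind of splitting, then maps each sub-token to an integer ID for the model.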