---
language:
- en
datasets:
- m-newhauser/senator-tweets
widget:
- text: "[start]"
  example_title: "Starting Notation"
---

# Phi-2 Senator Tweets

[Phi-2](https://huggingface.co/microsoft/phi-2) finetuned on the [Senator Tweets](https://huggingface.co/datasets/m-newhauser/senator-tweets) dataset.

The starting token is `[start]` and the ending token is `[end]`.

Example:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the finetuned model and the base Phi-2 tokenizer
model = AutoModelForCausalLM.from_pretrained("Realluke/phi-2-senator-tweets", torch_dtype="auto", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-2", trust_remote_code=True)

# Prompt with the starting token and generate a tweet
inputs = tokenizer("[start]", return_tensors="pt", return_attention_mask=False)
outputs = model.generate(**inputs, max_length=200)
text = tokenizer.batch_decode(outputs)[0]

print(text)
```
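Generation does not necessarily stop at the ending token, so the decoded string may run past `[end]`. Below is a minimal sketch for trimming the decoded `text` from the example above to the span between the markers; the `extract_tweet` helper is a hypothetical name and uses plain string handling only.

```python
# Hypothetical helper: trim decoded output to the tweet between [start] and [end].
# Assumes the markers appear literally in the decoded string.
def extract_tweet(text: str) -> str:
    start = text.find("[start]")
    if start != -1:
        text = text[start + len("[start]"):]
    end = text.find("[end]")
    if end != -1:
        text = text[:end]
    return text.strip()

print(extract_tweet(text))
```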

## Model Details

### Model Description

- **Steps:** 750
- **Finetuning Examples:** 1000
- **GPU:** NVIDIA Tesla T4
- **GPU Hours:** 2