---
datasets:
- openagi-project/OpenAGI-set-dpo-v0.1
base_model: freecs/ThetaWave-7B-v0.1
license: apache-2.0
---

# OpenAGI-7B-v0.1

This model was DPO-tuned (Direct Preference Optimization) on a small set of GPT-4-generated responses.
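For intuition, DPO trains the policy to prefer the chosen response over the rejected one relative to a frozen reference model, using a logistic loss on the log-probability margin. The sketch below computes that per-pair loss with illustrative numbers; it is not this model's actual training code, and the log-probabilities shown are made up for the example.

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """DPO loss for one preference pair.

    Each argument is the summed log-probability of a full response
    under the trainable policy or the frozen reference model.
    """
    # How much more the policy likes the chosen response than the reference does
    chosen_logratio = policy_chosen_logp - ref_chosen_logp
    # ...and the same for the rejected response
    rejected_logratio = policy_rejected_logp - ref_rejected_logp
    # -log(sigmoid(beta * margin)): small when the policy already
    # separates chosen from rejected more strongly than the reference
    margin = beta * (chosen_logratio - rejected_logratio)
    return -math.log(1.0 / (1.0 + math.exp(-margin)))

# Illustrative (made-up) log-probs: the policy slightly favors the chosen response
print(round(dpo_loss(-10.0, -14.0, -11.0, -12.0), 4))  # → 0.5544
```

A larger `beta` sharpens the loss: with the same margin, `beta=1.0` yields a much smaller loss than `beta=0.1`, because the policy's preference gap counts for more.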

Give it a try:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda"  # the device to load the model onto

model = AutoModelForCausalLM.from_pretrained("openagi-project/OpenAGI-7B-v0.1")
tokenizer = AutoTokenizer.from_pretrained("openagi-project/OpenAGI-7B-v0.1")

messages = [
    {"role": "user", "content": "Who are you?"},
]

# Format the chat with the model's template and append the assistant turn prompt
model_inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(device)
model.to(device)

generated_ids = model.generate(model_inputs, max_new_tokens=1000, do_sample=True)
decoded = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
print(decoded[0])
```

> *"My goal as the founder of FreeCS.org is to establish an Open-Source AI Research Lab driven by its Community. Currently, I am the sole contributor at FreeCS.org. If you share our vision, we welcome you to join our community and contribute to our mission at [freecs.org/#community](https://freecs.org/#community)."*
>
> [GR](https://twitter.com/gr_username)


If you'd like to support this project, please consider making a [donation](https://freecs.org/donate).