---
license: other
language:
- en
pipeline_tag: conversational
inference: false
tags:
- AI
- ConversationalAI
---

<h1 style="text-align: center">LLmRa-1.3B-V2</h1>
<h2 style="text-align: center">A conversational Open Pre-trained Transformer Language Model fine-tune.</h2>

**LLmRa 1.3B-V2** is a proof-of-concept fine-tune of [facebook/opt-1.3b](https://huggingface.co/facebook/opt-1.3b) optimized for dialogue.

**Disclaimer:** NSFW data was included in the fine-tuning of this model. Although SFW inputs will usually result in SFW outputs, you are advised to **chat at your own risk.**

**Warning:** This model is **NOT** suitable for use by minors. **It will output X-rated content under certain circumstances.**

**This model was fine-tuned on the LLmRa-100K conversational dataset (small version).**

---

## Usage Format

To use the model effectively, follow this structured prompt format for text-based conversations:

**1. Initialization**

Here is how you can define the personality of the language model:

```
<|system|>[Persona]
```

- **Persona**: You can define a specific persona or context for the AI, but it's optional. It can be a character, a role, or just a style of interaction.

**2. User Input**

```
<|user|>[User input]<|model|>
```
- Users can start the conversation by entering their message after the `<|user|>` tag and closing it with `<|model|>`; a short helper that assembles this string is sketched below.
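
Putting the tags together, a prompt is just one string. Here is a minimal Python sketch of a helper that assembles a single turn in this format; the function name and its arguments are illustrative, not part of the model's API:

```python
def build_prompt(user_input: str, persona: str = "") -> str:
    """Assemble a single-turn prompt in the <|system|>/<|user|>/<|model|> format."""
    prompt = f"<|system|>{persona}" if persona else ""
    return prompt + f"<|user|>{user_input}<|model|>"

# Persona plus one user message
print(build_prompt(
    "Tell me about the history of artificial intelligence.",
    persona="I'm here to provide information and assistance on a wide range of topics.",
))
```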

---

### Example Usage:

Here's an example of how to start a conversation with the AI:

```
<|system|>I'm here to provide information and assistance on a wide range of topics.
<|model|>Hello! Welcome to our AI-powered assistant. How can I assist you today?
<|user|>Tell me about the history of artificial intelligence.
<|model|>
```

Continue the conversation as needed. This structured format helps maintain a smooth and engaging interaction with the AI.
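
When driving the model from code, the conversation history can be kept as a plain string and extended turn by turn. A minimal sketch, assuming you simply concatenate the tags shown above (the helper name and the sample reply text are illustrative):

```python
# Running prompt string, starting from the persona
history = "<|system|>I'm here to provide information and assistance on a wide range of topics."

def add_turn(history: str, user_message: str, model_reply: str) -> str:
    """Append one completed user/model exchange to the running prompt."""
    return history + f"<|user|>{user_message}<|model|>{model_reply}"

history = add_turn(
    history,
    "Tell me about the history of artificial intelligence.",
    "AI research took shape as an academic field in the 1950s ...",  # earlier model reply
)

# The next request appends a new user message and ends with <|model|>,
# so the model continues from that point.
next_prompt = history + "<|user|>Who coined the term?<|model|>"
```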

You are not required to include `User`; you can change it to your preferred name or leave it blank. You may also add the AI's name, for example:

```
<|user|>YourNameHere: Hello.<|model|>CharacterName:
```

You can also use this instruct prompt example:

```
<|system|>What is one plus one?<|model|>
```

## Loading The Model

To load the model and interact with it, use the Python code below:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

# Download the model and tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained('L-R/LLmRa-1.3B-V2')
tokenizer = AutoTokenizer.from_pretrained('L-R/LLmRa-1.3B-V2')

# Build a text-generation pipeline around the model
pipe = pipeline(task="text-generation", model=model, tokenizer=tokenizer, max_length=100)

input_question = 'QUESTION HERE'

# Wrap the question in the prompt format the model was trained on
question_formatted = f'<|system|>{input_question}<|model|>'

result = pipe(question_formatted)

# Strip the prompt from the output and print only the model's reply
print(f"[model]: {result[0]['generated_text'][len(question_formatted):]}")
```
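
For chat-style use you may prefer sampling over greedy decoding. A minimal sketch reusing the `pipe` defined above; the sampling values are illustrative starting points, not settings recommended by the model card:

```python
# Sampling usually gives more varied replies than greedy decoding.
# temperature / top_p values here are only a starting point, tune to taste.
result = pipe(
    question_formatted,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)

print(f"[model]: {result[0]['generated_text'][len(question_formatted):]}")
```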

## Known issues

The model sometimes fails to follow instructions.