---
library_name: transformers
license: other
datasets:
- mlabonne/orpo-dpo-mix-40k
- Open-Orca/SlimOrca-Dedup
- jondurbin/airoboros-3.2
- microsoft/orca-math-word-problems-200k
- m-a-p/Code-Feedback
- MaziyarPanahi/WizardLM_evol_instruct_V2_196k
base_model: meta-llama/Meta-Llama-3-8B
---

# llama-3-neural-chat-v1-8b

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6437292ecd93f4c9a34b0d47/6XQuhjWNr6C4RbU9f1k99.png)



## Model Details

### Model Description

I fine-tuned Llama 3 8B using an approach similar to Intel's Neural Chat language model, with slightly modified data sources to make it stronger in coding, math, and writing. Training consisted of a supervised fine-tuning (SFT) stage followed by direct preference optimization (DPO); a sketch of that recipe follows the details list below.

- **Developed by:** Locutusque
- **Model type:** Built with Meta Llama 3
- **Language(s) (NLP):** Primarily English, following the Llama 3 base model
- **License:** [Meta Llama 3 Community License](https://huggingface.co/meta-llama/Meta-Llama-3-8B/blob/main/LICENSE)
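
A minimal sketch of the two-stage SFT-then-DPO recipe using TRL is shown below. The checkpoint paths, trainer arguments, and dataset handling are illustrative assumptions (exact signatures vary by TRL version), not the actual training script:

```python
# Illustrative sketch only: the real training script, preprocessing, and
# hyperparameters are not published, and TRL signatures vary by version.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer, DPOConfig, DPOTrainer

# Stage 1: supervised fine-tuning (SFT) on instruction data, assuming the
# dataset has already been converted to TRL's conversational format.
sft = SFTTrainer(
    model="meta-llama/Meta-Llama-3-8B",
    train_dataset=load_dataset("Open-Orca/SlimOrca-Dedup", split="train"),
    args=SFTConfig(output_dir="llama-3-8b-sft"),
)
sft.train()
sft.save_model("llama-3-8b-sft")

# Stage 2: direct preference optimization (DPO) on chosen/rejected pairs,
# starting from the SFT checkpoint.
dpo = DPOTrainer(
    model="llama-3-8b-sft",
    train_dataset=load_dataset("mlabonne/orpo-dpo-mix-40k", split="train"),
    args=DPOConfig(output_dir="llama-3-neural-chat-v1-8b"),
)
dpo.train()
```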

## Uses

This model is tuned for strong performance in writing and coding tasks.

## Training Data
- Open-Orca/SlimOrca-Dedup
- jondurbin/airoboros-3.2
- microsoft/orca-math-word-problems-200k
- m-a-p/Code-Feedback
- MaziyarPanahi/WizardLM_evol_instruct_V2_196k
- mlabonne/orpo-dpo-mix-40k

### Direct Use

Conversational AI; the model can be used as a chat assistant without further fine-tuning.
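
A minimal usage sketch, assuming this card's repository id is `Locutusque/llama-3-neural-chat-v1-8b` and a recent transformers release that accepts chat-format pipeline inputs:

```python
import torch
from transformers import pipeline

# Load the chat model; device_map="auto" places it on available hardware.
pipe = pipeline(
    "text-generation",
    model="Locutusque/llama-3-neural-chat-v1-8b",
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Write a Python function that reverses a string."},
]

# The pipeline applies the model's chat template; the returned conversation
# ends with the assistant's reply.
outputs = pipe(messages, max_new_tokens=256)
print(outputs[0]["generated_text"][-1]["content"])
```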

## Evaluations

TBD.