---
license: apache-2.0
language:
- en
library_name: transformers
---

# OpenLLaMA-3B-Chat: A Chat Model Built on an Open Reproduction of LLaMA

## Training Traces

[wandb](https://wandb.ai/autoai-org/fmengine/runs/3ddwtzyl/overview?workspace=user-xzyaoi)

## Prompt Format

```
<human>: Who is Alan Turing?<|endoftext|><assistant>:
```
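The prompt template above can be built programmatically before passing it to the model. Below is a minimal sketch; the helper name `build_prompt` is hypothetical and simply reproduces the template shown above, with `<|endoftext|>` as the separator token.

```python
def build_prompt(question: str, eos_token: str = "<|endoftext|>") -> str:
    """Format a user question into the chat template expected by the model."""
    return f"<human>: {question}{eos_token}<assistant>:"

prompt = build_prompt("Who is Alan Turing?")
print(prompt)
# <human>: Who is Alan Turing?<|endoftext|><assistant>:
```

The resulting string can then be tokenized and passed to the model, for example via the `transformers` `text-generation` pipeline, with generation stopped at the `<|endoftext|>` token.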


## Reference

If you find OpenLLaMA-3B-Chat useful in your research or applications, please cite it using the following BibTeX entries:
```
@software{Yao_FMEngine_Library_for_2023,
  author = {Yao, Xiaozhe},
  doi = {10.5281/zenodo.8314779},
  month = sep,
  title = {{FMEngine: Library for Training Foundation Models}},
  url = {https://github.com/eth-easl/fmengine},
  version = {0.0.1},
  year = {2023}
}
@software{openlm2023openllama,
  author = {Geng, Xinyang and Liu, Hao},
  title = {OpenLLaMA: An Open Reproduction of LLaMA},
  month = may,
  year = 2023,
  url = {https://github.com/openlm-research/open_llama}
}
@software{together2023redpajama,
  author = {Together Computer},
  title = {RedPajama-Data: An Open Source Recipe to Reproduce LLaMA training dataset},
  month = apr,
  year = 2023,
  url = {https://github.com/togethercomputer/RedPajama-Data}
}
@article{touvron2023llama,
  title={Llama: Open and efficient foundation language models},
  author={Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and others},
  journal={arXiv preprint arXiv:2302.13971},
  year={2023}
}
```

## Limitations and Bias
As with all language models, `openllama-3b-chat` may generate factually incorrect or biased content. Keep this in mind when using the model, and review its outputs before relying on them.