---
license: apache-2.0
datasets:
- nampdn-ai/tiny-textbooks
library_name: transformers
tags:
- general
language:
- en
pipeline_tag: text-generation
metrics:
- accuracy
---
# LLaMA-2-7b-Tinytext
![LLaMA-2-7b-Tinytext cover image](https://cdn.midjourney.com/3ded8b9e-a1e1-4893-8dff-26d82d81db22/0_1.png)
## Model Details
Llama2-7b-tinytext is a language model fine-tuned on top of TinyPixel/Llama-2-7B-bf16-sharded using the nampdn-ai/tiny-textbooks dataset.
### Model Description
An English, decoder-only causal language model for general text generation, produced by fine-tuning the base model listed below on nampdn-ai/tiny-textbooks. A hedged sketch of how a comparable fine-tune could be set up follows the list.
- **Developed by:** Collin Heenan
- **Model type:** Decoder-only causal language model (text generation)
- **Language(s) (NLP):** English
- **Finetuned from model:** TinyPixel/Llama-2-7B-bf16-sharded
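
The exact training recipe is not documented on this card. The following is only a rough sketch of how a comparable fine-tune could be set up with the `transformers` Trainer and the `datasets` library; the hyperparameters, the dataset column name, and the choice of full fine-tuning (rather than, e.g., LoRA) are all assumptions, not the author's recipe.

```python
# Illustrative sketch only: fine-tune the base model on tiny-textbooks.
# Hyperparameters and the dataset column name are assumptions, not the card author's setup.
import torch
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

base_id = "TinyPixel/Llama-2-7B-bf16-sharded"
tokenizer = AutoTokenizer.from_pretrained(base_id)
tokenizer.pad_token = tokenizer.eos_token  # Llama tokenizers ship without a pad token

model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16)

dataset = load_dataset("nampdn-ai/tiny-textbooks", split="train")

def tokenize(batch):
    # Assumes the dataset exposes a "text" column; adjust the field name if it differs.
    return tokenizer(batch["text"], truncation=True, max_length=1024)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="llama2-7b-tinytext",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=16,
        num_train_epochs=1,
        bf16=True,
        logging_steps=50,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```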
### Model Sources
- **Repository:** [More Information Needed]
- **Paper:** [More Information Needed]
- **Demo:** [More Information Needed]
## Uses
The model is intended for general-purpose English text generation, such as continuing prompts or drafting short explanatory passages in the style of its textbook-like fine-tuning data.
### Direct Use
The model can be used directly for text generation with the `transformers` library; a hedged usage sketch follows.
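
Below is a minimal inference sketch with `transformers`. The Hub repository ID is a placeholder assumption (this card does not state the exact ID), and the prompt and generation settings are illustrative only.

```python
# Minimal inference sketch for Llama2-7b-tinytext.
# NOTE: the repo ID below is a placeholder; replace it with the actual Hub ID of this model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-namespace/Llama2-7b-tinytext"  # placeholder, not confirmed by this card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # load in the checkpoint's native precision
    device_map="auto",    # requires the `accelerate` package
)

prompt = "Photosynthesis is the process by which"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```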
## Bias, Risks, and Limitations
As a fine-tune of a Llama-2-7B base model, this model inherits the biases and limitations of its pretraining and fine-tuning data. It can generate inaccurate, outdated, or harmful content, so outputs should be reviewed before use in any downstream application.