---
library_name: transformers
license: llama3
datasets:
- saheedniyi/Nairaland_v1_instruct_512QA
language:
- en
pipeline_tag: text-generation
---
# Model Card for Llama3-8b-Naija_v1
<!-- Provide a quick summary of what the model is/does. -->
Excited to announce the release of *Llama3-8b-Naija_v1*, a finetuned version of Meta-Llama-3-8B trained on a *question-answer* dataset scraped from [Nairaland](https://www.nairaland.com/).
The model was built in an attempt to "Nigerialize" Llama-3, giving it a Nigerian-like conversational style.
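The training data is available on the Hub and can be inspected with the 🤗 `datasets` library. A minimal sketch is shown below; the `train` split name and the record schema are assumptions, so check the dataset card for the actual column names.
```python
from datasets import load_dataset

# Load the Nairaland question-answer dataset used for finetuning
# (the "train" split is an assumption; see the dataset card)
dataset = load_dataset("saheedniyi/Nairaland_v1_instruct_512QA", split="train")

# Inspect one record to see the instruction/answer fields
print(dataset[0])
```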
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [Saheedniyi](https://linkedin.com/in/azeez-saheed)
- **Language(s) (NLP):** English, Pidgin English
- **License:** [META LLAMA 3 COMMUNITY LICENSE AGREEMENT](https://huggingface.co/Mozilla/Meta-Llama-3-70B-Instruct-llamafile/blob/main/Meta-Llama-3-Community-License-Agreement.txt)
- **Finetuned from model:** [meta-llama/Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B)
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** [GitHub](https://github.com/saheedniyi02)
- **Demo:** [Colab Notebook](https://colab.research.google.com/drive/1IGe7yR3ShU59dxVDmYOSYYxtxBYlcIcP?authuser=3)
## How to Get Started with the Model
Use the code below to get started with the model.
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the finetuned model and tokenizer from the Hub
model_id = "saheedniyi/Llama3-8b-Naija_v1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

input_text = "What's the latest news on Nairaland?"
inputs = tokenizer(input_text, return_tensors="pt").to(model.device)

# Generate a response; adjust max_new_tokens to control answer length
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
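Alternatively, the high-level `pipeline` API (matching the `text-generation` pipeline tag above) can be used for quick experiments. This is a minimal sketch; the sampling settings are only illustrative and not tuned values from the original training.
```python
from transformers import pipeline

# Build a text-generation pipeline around the finetuned model
generator = pipeline("text-generation", model="saheedniyi/Llama3-8b-Naija_v1")

# Illustrative generation settings; tune max_new_tokens and temperature as needed
output = generator(
    "What's the latest news on Nairaland?",
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)
print(output[0]["generated_text"])
```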