# Cosmos LLaMa
This model is a fully fine-tuned version of the LLaMA-3 8B model, trained on a 30GB Turkish dataset.
The Cosmos LLaMa is designed for text generation tasks, providing the ability to continue a given text snippet in a coherent and contextually relevant manner. Due to the diverse nature of the training data, which includes websites, books, and other text sources, this model can exhibit biases. Users should be aware of these biases and use the model responsibly.
## Example Usage
Here is an example of how to use the model in Colab:
```python
!pip install -U accelerate bitsandbytes
```
```python
import torch
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM
from transformers import BitsAndBytesConfig

model_name = "ytu-ce-cosmos/Turkish-Llama-8b-v0.1"

# 8-bit quantization; fp32 CPU offload lets layers that do not fit
# on the GPU fall back to CPU memory.
bnb_config = BitsAndBytesConfig(
    load_in_8bit=True,
    llm_int8_enable_fp32_cpu_offload=True,
)

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",
    torch_dtype=torch.bfloat16,
    quantization_config=bnb_config,
)
```
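As a rough, back-of-the-envelope illustration of why 8-bit loading is used above (the parameter count is an assumption based on the model name, and the figures ignore activations, KV cache, and quantization overhead): an 8B-parameter model needs about 2 bytes per weight in bfloat16 but only 1 byte in int8, so quantization roughly halves the weight memory.

```python
# Approximate weight memory for an ~8B-parameter model (illustrative only).
params = 8_000_000_000

bf16_gib = params * 2 / 1024**3   # 2 bytes per weight in bfloat16
int8_gib = params * 1 / 1024**3   # 1 byte per weight in int8

print(f"bfloat16: ~{bf16_gib:.1f} GiB, int8: ~{int8_gib:.1f} GiB")
# → bfloat16: ~14.9 GiB, int8: ~7.5 GiB
```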
```python
# Build the generation pipeline around the already-loaded model.
# Device placement was handled by device_map="auto" in from_pretrained,
# so it is not passed again here.
text_generator = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    temperature=0.3,
    repetition_penalty=1.1,
    top_p=0.9,
    max_length=610,
    do_sample=True,
    return_full_text=False,
    min_new_tokens=32
)
```
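For intuition about the sampling parameters above: `temperature` rescales the logits before the softmax (values below 1 sharpen the distribution), and `top_p` keeps only the smallest set of tokens whose cumulative probability reaches the threshold. A toy sketch with made-up logits (not tied to this model's vocabulary) might look like:

```python
import math

def sample_filter(logits, temperature=0.3, top_p=0.9):
    """Toy illustration of temperature scaling + nucleus (top-p) filtering."""
    # Temperature: divide logits before softmax; <1 sharpens the distribution.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Nucleus: keep the most probable tokens until cumulative mass >= top_p.
    order = sorted(range(len(probs)), key=lambda i: -probs[i])
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    return kept  # token indices the sampler may draw from

# With a sharp temperature of 0.3, the nucleus collapses to the top token.
print(sample_filter([2.0, 1.0, 0.5, 0.1]))
# → [0]
```

At `temperature=1.0` the same logits keep three candidate tokens, which is why a low temperature makes continuations more deterministic.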
```python
# Prompt: "Write 3 observations about artificial intelligence."
text = """Yapay zeka hakkında 3 tespit yaz.\n"""

r = text_generator(text)

print(r[0]['generated_text'])

"""
Example output:

1. Yapay Zeka (AI), makinelerin insan benzeri bilişsel işlevleri gerçekleştirmesini sağlayan bir teknoloji alanıdır.

2. Yapay zekanın geliştirilmesi ve uygulanması, sağlık hizmetlerinden eğlenceye kadar çeşitli sektörlerde çok sayıda fırsat sunmaktadır.

3. Yapay zeka teknolojisinin potansiyel faydaları önemli olsa da mahremiyet, işten çıkarma ve etik hususlar gibi konularla ilgili endişeler de var.
"""
```
# Acknowledgments
- Thanks to the generous support from the Hugging Face team, it is possible to download models from their S3 storage 🤗
### Contact
COSMOS AI Research Group, Yildiz Technical University Computer Engineering Department <br>
https://cosmos.yildiz.edu.tr/ <br>
cosmos@yildiz.edu.tr