New, adapter-based instructions
README.md CHANGED
@@ -222,7 +222,7 @@ datasets:
 
 # obvr/Llama-3-Lluca-8B
 
-> ! Warning: The fusing of the model seems to have not applied properly. Please use the adapters directly.
+> ! Warning: The fusing of the model seems to have not applied properly. Please use the adapters directly. The example has been updated to show this process.
 
 [LLuca](https://huggingface.co/obvr/Llama-3-Lluca-8B) is a fine-tuned LLaMa 3 model based on [NousResearch/Meta-Llama-3-8B](https://huggingface.co/NousResearch/Meta-Llama-3-8B). It is targeting financial sentiment analysis using FinGPT's dataset.
 Training was done on an M1 Ultra using MLX **0.13.0**.
@@ -231,12 +231,21 @@ Training was done on an M1 Ultra using MLX **0.13.0**.
 
 ## Use with mlx
 
 ```bash
-pip install mlx-lm
+pip install mlx-lm huggingface_hub
 ```
 
 ```python
+
 from mlx_lm import load, generate
+from huggingface_hub import snapshot_download
+
+tokenizer_config = {"trust_remote_code": True, "eos_token": "<|eot_id|>"}
+adapter_path = snapshot_download(repo_id="obvr/Llama-3-Lluca-8B", allow_patterns=["adapters/adapter_config.json", "adapters/adapters.safetensors"]) + "/adapters"
+llm_model, tokenizer = load("NousResearch/Meta-Llama-3-8B-Instruct", adapter_path=adapter_path, tokenizer_config=tokenizer_config)
+
+prompt = "What is the sentiment of this news? Please choose an answer from {strong negative/moderately negative/mildly negative/neutral/mildly positive/moderately positive/strong positive}. Reports were released yesterday evening saying that Amazon.com, Inc. (NASDAQ:AMZN) is planning on laying off roughly 10,000 of its corporate and tech employees, starting this week."
+messages = [{"role": "user", "content": prompt}]
+prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
 
-
-response = generate(model,tokenizer, prompt="What is the sentiment of this news? Please choose an answer from {strong negative/moderately negative/mildly negative/neutral/mildly positive/moderately positive/strong positive}. Reports were released yesterday evening saying that Amazon.com, Inc. (NASDAQ:AMZN) is planning on laying off roughly 10,000 of its corporate and tech employees, starting this week.", verbose=True)
+response = generate(llm_model, tokenizer, prompt=prompt, verbose=True)
 ```
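
The prompt in the updated example asks the model to answer with one of seven fixed sentiment labels, which maps naturally onto a numeric scale when post-processing responses. The helper below is not part of the model card — the function name and the -3..3 scoring are illustrative assumptions — but it sketches how one might turn the model's textual answer into a score:

```python
# Hypothetical post-processing helper (not from the model card):
# maps the seven labels the prompt asks for onto a score in [-3, 3].
LABEL_SCORES = {
    "strong negative": -3,
    "moderately negative": -2,
    "mildly negative": -1,
    "neutral": 0,
    "mildly positive": 1,
    "moderately positive": 2,
    "strong positive": 3,
}

def sentiment_score(response: str) -> int:
    """Return the score of the first known label found in `response`.

    Longer labels are checked first, so a partial match cannot shadow
    a more specific one. Raises ValueError if no label is present.
    """
    text = response.lower()
    for label in sorted(LABEL_SCORES, key=len, reverse=True):
        if label in text:
            return LABEL_SCORES[label]
    raise ValueError(f"no sentiment label found in: {response!r}")
```

Matching by substring keeps the helper robust to the model wrapping its answer in a sentence rather than emitting the bare label.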
|