---
library_name: transformers
license: apache-2.0
language:
- en
pipeline_tag: text-generation
base_model:
- explorewithai/Loxa-3B
model_size:
- 4.75B
tags:
- text-generation
- conversational
- language-model
- cpu
widget:
- text: "Hello"
---
# Model Card for Loxa
**Model Name:** Loxa-3B
**Model Family:** Loxa
**Creator:** AIFRAME
**Description:** Loxa-3B is a language model designed to run efficiently on CPU-only hardware, particularly the Raspberry Pi 4 and 5 (8GB+ RAM). It handles math, code, chat, general assistance, science, and formal conversation, achieving 92% total accuracy.
**Capabilities:**
* **Mathematics:** Solves problems, performs calculations, explains concepts.
* **Code:** Generates code, understands/debugs existing code, provides explanations.
* **Chat:** Engages in conversations, provides informative and helpful responses.
* **Help:** Offers guidance and clear explanations across various topics.
* **Science:** Discusses scientific topics, explains phenomena, provides insights.
* **Formal Conversations:** Maintains formal etiquette and respectful language.
**Performance:**
* **Accuracy:** 92% total accuracy.
* **Resource Usage:** Optimized for Raspberry Pi 4/5 (8GB+ RAM). Consult documentation for detailed metrics.
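To see why an 8GB board is a plausible target, the weight footprint can be estimated as parameters × bytes per parameter. The sketch below is illustrative only: it assumes a round ~3B parameter count and ignores activation, KV-cache, and runtime overhead, so real usage will be higher.

```python
def approx_model_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Rough weight-only memory estimate: parameters x bytes per parameter, in GiB."""
    return n_params * bytes_per_param / 1024**3

# Illustrative figures for a ~3B-parameter model (exact counts vary by checkpoint).
n = 3e9
for name, bpp in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{name}: ~{approx_model_memory_gb(n, bpp):.1f} GiB")
```

Under these assumptions, fp32 weights alone (~11 GiB) would not fit in 8GB of RAM, while fp16 (~5.6 GiB) or a quantized variant would; this is consistent with the CPU/embedded positioning above.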
**Intended Use:** Educational purposes, personal projects, embedded systems, resource-constrained environments.
**Limitations:**
* May produce incorrect or nonsensical outputs; exercise caution for critical tasks.
* Performance may degrade with long or complex inputs.
* See the documentation for details on known limitations and biases.
**How to Use:** See accompanying documentation for installation and usage instructions.
## Code Example
```python
# Use a pipeline as a high-level helper
from transformers import pipeline
messages = [
{"role": "user", "content": "Who are you?"},
]
# Load the model into a text-generation pipeline (runs on CPU if no GPU is available)
pipe = pipeline("text-generation", model="explorewithai/Loxa-3B")
result = pipe(messages)
print(result)
```