---
license: apache-2.0
---
# Yi-1.5-6B-Chat-Math
**Yi-1.5-6B-Chat-Math** is a language model fine-tuned from **Yi-1.5** to excel at mathematical problem-solving and related tasks. It handles a wide range of mathematical queries, from algebra and geometry to calculus and beyond, and serves as a powerful tool for students, educators, and researchers alike.
## Features
- **Mathematical Problem Solving**: Accurately solves various types of mathematical problems, including but not limited to algebra, geometry, and calculus.
- **Formula Derivation**: Assists in deriving and explaining mathematical formulas to enhance understanding of complex concepts.
- **Multilingual Support**: Capable of handling mathematical queries in multiple languages, enhancing accessibility for a diverse user base.
- **Custom Fine-Tuning**: Trained on a custom dataset (see below) to ensure high performance and reliability in mathematical contexts.
## Dataset
The model has been fine-tuned using a custom dataset tailored for advanced mathematical tasks. The dataset is openly available for research and development purposes.
- **Dataset Name**: Advanced-Math
- **Access Link**: [Advanced-Math Dataset](https://huggingface.co/datasets/haijian06/Advanced-Math)
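If you want to inspect the dataset, it can be loaded with the `datasets` library. The snippet below is a minimal sketch; it does not assume a particular split or field layout, so check the dataset card for the exact schema.

```python
from datasets import load_dataset

# Download the Advanced-Math dataset from the Hugging Face Hub
dataset = load_dataset("haijian06/Advanced-Math")

# Show the available splits and peek at the first record of the first split
print(dataset)
first_split = list(dataset.keys())[0]
print(dataset[first_split][0])
```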
## Installation
To get started with **Yi-1.5-6B-Chat-Math**, ensure you have the necessary dependencies installed (`accelerate` is needed for the `device_map="auto"` option used in the example below):
```bash
pip install transformers torch accelerate
```
## Usage
Below is a simple example demonstrating how to use the model for solving a mathematical equation:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
# Load the tokenizer and model
tokenizer = AutoTokenizer.from_pretrained("haijian06/Yi-1.5-6B-Chat-Math")
model = AutoModelForCausalLM.from_pretrained("haijian06/Yi-1.5-6B-Chat-Math", torch_dtype=torch.float16, device_map="auto")
input_text = "Solve the equation x^2 - 5x + 6 = 0 Let's solve this step-by-step:"
inputs = tokenizer(input_text, return_tensors="pt").to(model.device)
with torch.no_grad():
outputs = model.generate(
**inputs,
max_new_tokens=200,
do_sample=True,
temperature=0.7,
top_p=0.95,
)
answer = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(answer)
```
**Model answer:**
```
Solve the equation x^2 - 5 x + 6 = 0 Let's solve this step-by-step:
Step 1: Factor the equation
The equation can be factored as follows:
x^2 - 5x + 6 = 0
(x - 2)(x - 3) = 0
Step 2: Apply the zero product property
If the product of two numbers is zero, then at least one of the numbers must be zero.
So, either (x - 2) = 0 or (x - 3) = 0
Step 3: Solve for x
If (x - 2) = 0, then x = 2
If (x - 3) = 0, then x = 3
So, the solutions are x = 2 and x = 3.
Answer: 2, 3
```
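Because this is a chat model, prompts can also be formatted with the tokenizer's chat template, which likewise works for non-English queries. The snippet below is a minimal sketch that reuses the `tokenizer` and `model` loaded above and assumes the tokenizer ships a chat template (as Yi-1.5 chat models typically do):

```python
# Format a chat-style prompt (here, a math question asked in Chinese)
messages = [
    {"role": "user", "content": "请解方程 x^2 - 5x + 6 = 0，并给出详细步骤。"}
]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model.generate(
        **inputs,
        max_new_tokens=300,
        do_sample=True,
        temperature=0.7,
        top_p=0.95,
    )

# Decode only the newly generated tokens (skip the prompt portion)
answer = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(answer)
```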
## Contributing
Contributions are welcome! Whether you have suggestions for improvements, bug reports, or code contributions, feel free to open an issue or submit a pull request on GitHub.
## License
This project is licensed under the [Apache-2.0 License](https://www.apache.org/licenses/LICENSE-2.0).
## Contact
For more information, support, or inquiries, please visit my GitHub profile:
- **GitHub**: [https://github.com/Haijian06](https://github.com/Haijian06)
---