---
library_name: peft
base_model: mistralai/Mistral-7B-v0.1
datasets:
- ise-uiuc/Magicoder-OSS-Instruct-75K
---
# Model Card for mistral-7b-magicoder-lora-r8
Trained with [Ludwig.ai](https://ludwig.ai) and [Predibase](https://predibase.com)!
Given a programming problem and a target language, this adapter generates a solution in that language.
Try it in [LoRAX](https://github.com/predibase/lorax):
```python
from lorax import Client
client = Client("http://<your_endpoint>")
problem = "<your programming problem>"
lang = "<your programming language>"
prompt = f"""
Below is a programming problem, paired with a language in which the solution
should be written. Write a solution in the provided language that appropriately
solves the programming problem.
### Problem: {problem}
### Language: {lang}
### Solution:
"""
adapter_id = "tgaddair/mistral-7b-magicoder-lora-r8"
resp = client.generate(prompt, max_new_tokens=64, adapter_id=adapter_id)
print(resp.generated_text)
```
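The adapter can also be loaded directly on top of the base model with 🤗 PEFT, without a LoRAX deployment. A minimal sketch, assuming `transformers` and `peft` are installed; the fp16/`device_map` settings below are illustrative assumptions, not part of this card:

```python
# Minimal sketch: attach the LoRA adapter to the base model with PEFT.
# The dtype/device settings are illustrative assumptions.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_model_id = "mistralai/Mistral-7B-v0.1"
adapter_id = "tgaddair/mistral-7b-magicoder-lora-r8"

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
model = AutoModelForCausalLM.from_pretrained(
    base_model_id, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(model, adapter_id)  # load the LoRA weights

prompt = "..."  # use the same prompt template as above
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```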
## Model Details
### Model Description
Ludwig config (v0.9.3):
```yaml
model_type: llm
input_features:
  - name: prompt
    type: text
    preprocessing:
      max_sequence_length: null
    column: prompt
output_features:
  - name: solution
    type: text
    preprocessing:
      max_sequence_length: null
    column: solution
prompt:
  template: >-
    Below is a programming problem, paired with a language in which the solution
    should be written. Write a solution in the provided language that appropriately
    solves the programming problem.

    ### Problem: {problem}

    ### Language: {lang}

    ### Solution:
preprocessing:
  split:
    type: fixed
    column: split
  global_max_sequence_length: 2048
adapter:
  type: lora
generation:
  max_new_tokens: 64
trainer:
  type: finetune
  epochs: 1
  optimizer:
    type: paged_adam
  batch_size: 1
  eval_steps: 100
  learning_rate: 0.0002
  eval_batch_size: 2
  steps_per_checkpoint: 1000
  learning_rate_scheduler:
    decay: cosine
    warmup_fraction: 0.03
  gradient_accumulation_steps: 16
  enable_gradient_checkpointing: true
base_model: mistralai/Mistral-7B-v0.1
quantization:
  bits: 4
```
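To reproduce the fine-tune, this config can be passed to Ludwig's Python API. A minimal sketch, assuming the config above is saved as `config.yaml`; the Magicoder dataset provides `lang`, `problem`, and `solution` columns, and the `split` column added below is an assumption to satisfy the config's fixed-split preprocessing:

```python
# Minimal sketch: run the QLoRA fine-tune defined by the config above.
from datasets import load_dataset
from ludwig.api import LudwigModel

# Load the instruction dataset used for training into a DataFrame.
df = load_dataset("ise-uiuc/Magicoder-OSS-Instruct-75K", split="train").to_pandas()
df["split"] = 0  # assumption: mark all rows as training data (0 = train)

model = LudwigModel(config="config.yaml")
results = model.train(dataset=df)
```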