---
library_name: peft
base_model: mistralai/Mistral-7B-v0.1
pipeline_tag: text-generation
---
Description: Coding tasks in multiple languages\
Original dataset: https://huggingface.co/datasets/ise-uiuc/Magicoder-OSS-Instruct-75K

---

Try querying this adapter for free in Lora Land at https://predibase.com/lora-land!\
The adapter_category is STEM and the name is Code Generation (magicoder).

---
Sample input: Below is a programming problem, paired with a language in which the solution should be written. Write a solution in the provided language that appropriately solves the programming problem.\n\n### Problem:
def strlen(string: str) -> int:
""" Return length of given string
>>> strlen('')
0
>>> strlen('abc')
3
"""
\n\n### Language: python\n\n### Solution: \
---

Sample output:
```python
def strlen(string: str) -> int:
    return len(string)
```

---
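The sample input above follows a fixed `### Problem` / `### Language` / `### Solution` template. As a minimal sketch, a small helper can assemble prompts in that shape; the function name `build_prompt` is illustrative and not part of the adapter or any library API:

```python
# Illustrative helper (not provided by the adapter) that assembles a prompt
# following the template shown in the sample input above.
def build_prompt(problem: str, language: str) -> str:
    header = (
        "Below is a programming problem, paired with a language in which "
        "the solution should be written. Write a solution in the provided "
        "language that appropriately solves the programming problem."
    )
    return (
        f"{header}\n\n### Problem:\n{problem}"
        f"\n\n### Language: {language}\n\n### Solution: "
    )

prompt = build_prompt('def strlen(string: str) -> int:\n    """..."""', "python")
```

The trailing `### Solution: ` marker cues the model to begin generating the answer directly after the prompt.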
Try using this adapter yourself!
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"
peft_model_id = "predibase/magicoder"

# Load the base model and its tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Attach the LoRA adapter to the base model
model.load_adapter(peft_model_id)
```