
Description: Coding tasks in multiple languages
Original dataset: https://huggingface.co/datasets/ise-uiuc/Magicoder-OSS-Instruct-75K
---
Try querying this adapter for free in Lora Land at https://predibase.com/lora-land!
Adapter category: STEM. Adapter name: Code Generation (magicoder).
---
Sample input: Below is a programming problem, paired with a language in which the solution should be written. Write a solution in the provided language that appropriately solves the programming problem.\n\n### Problem:

def strlen(string: str) -> int:\n    """ Return length of given string\n    >>> strlen('')\n    0\n    >>> strlen('abc')\n    3\n    """\n\n### Language: python\n\n### Solution:
---
Sample output: python\ndef strlen(string: str) -> int:\n    return len(string)
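
The sample input above follows the adapter's prompt template: problem, target language, then a Solution header. Below is a minimal sketch of a helper that assembles that prompt; the function name build_prompt is illustrative and not part of the adapter or its dataset:

def build_prompt(problem: str, language: str) -> str:
    # Assemble the Problem / Language / Solution template shown in the sample input above.
    return (
        "Below is a programming problem, paired with a language in which the solution "
        "should be written. Write a solution in the provided language that appropriately "
        "solves the programming problem.\n\n"
        f"### Problem:\n{problem}\n\n"
        f"### Language: {language}\n\n"
        "### Solution:\n"
    )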
---
Try using this adapter yourself!

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"
peft_model_id = "predibase/magicoder"

model = AutoModelForCausalLM.from_pretrained(model_id)  # load the base Mistral-7B model
model.load_adapter(peft_model_id)  # attach the LoRA adapter (requires the peft package)
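
To generate a completion once the adapter is loaded, here is a minimal sketch that reuses the illustrative build_prompt helper defined above; max_new_tokens and the decoding settings are assumptions, not values from this card:

tokenizer = AutoTokenizer.from_pretrained(model_id)

prompt = build_prompt(
    'def strlen(string: str) -> int:\n    """ Return length of given string """',
    "python",
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)  # max_new_tokens is an illustrative choice
# Strip the prompt tokens and print only the generated solution
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))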