---
base_model: mistralai/Mistral-Nemo-Instruct-2407
datasets:
- iamtarun/code_instructions_120k_alpaca
language:
- en
library_name: peft
license: apache-2.0
pipeline_tag: text-generation
---
# Aurora-12B
Aurora-12B is a 12-billion-parameter language model fine-tuned from the Mistral-Nemo-Instruct-2407 base model for code generation and code understanding tasks. It was trained on the iamtarun/code_instructions_120k_alpaca dataset to follow code-related instructions and produce working solutions.
## Overview
Aurora-12B is aimed at developers and researchers who need code completion, code explanation, and general programming assistance. It handles a wide variety of programming languages and aims to provide contextually relevant suggestions.
## Features
- Code Generation: Generate code snippets in various programming languages.
- Code Completion: Complete partial code fragments.
- Code Understanding: Explain code functionality and provide comments.
- Instruction Following: Follow and execute code-related instructions with high accuracy.
## Model Details
- Base Model: Mistral-Nemo-Instruct-2407
- Fine-Tuned On: iamtarun/code_instructions_120k_alpaca
- Parameters: 12 billion
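## Usage
Because the card declares `library_name: peft`, the released weights are presumably a PEFT adapter applied on top of the base model. Below is a minimal inference sketch under that assumption; the adapter repository id is a placeholder, and the dtype/device settings should be adjusted to your hardware.

```python
# Minimal inference sketch (assumption: Aurora-12B is distributed as a PEFT adapter;
# "your-namespace/Aurora-12B" is a placeholder for the actual adapter repository id).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE_ID = "mistralai/Mistral-Nemo-Instruct-2407"
ADAPTER_ID = "your-namespace/Aurora-12B"  # placeholder, replace with the real repo id

tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
base_model = AutoModelForCausalLM.from_pretrained(
    BASE_ID,
    torch_dtype=torch.bfloat16,  # adjust to your hardware
    device_map="auto",
)
model = PeftModel.from_pretrained(base_model, ADAPTER_ID)

# Mistral-Nemo-Instruct ships a chat template, so format the request as a chat turn.
messages = [
    {"role": "user", "content": "Write a Python function that reverses a linked list."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(base_model.device)

output_ids = model.generate(input_ids, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```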
## Dataset
The model was fine-tuned on the iamtarun/code_instructions_120k_alpaca dataset, which contains roughly 120,000 code instructions paired with solutions in an Alpaca-style format. The dataset's variety of tasks and languages helps the model handle a broad range of coding scenarios; a sketch of the prompt format is shown below.
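Given the dataset's name, its examples presumably follow the standard Alpaca instruction/input/output layout. The template below is an illustrative assumption rather than a confirmed training template; check the dataset card before relying on an exact string match.

```python
# Hypothetical Alpaca-style prompt builder; the exact template used during
# fine-tuning is an assumption, so verify it against the dataset card.
def build_prompt(instruction: str, context: str = "") -> str:
    if context:
        return (
            "Below is an instruction that describes a task, paired with an input that "
            "provides further context. Write a response that appropriately completes "
            "the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{context}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

print(build_prompt("Write a SQL query that returns all users older than 30."))
```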