---
license: apache-2.0
datasets:
  - iamtarun/python_code_instructions_18k_alpaca
language:
  - en
library_name: peft
pipeline_tag: text2text-generation
tags:
  - code
---

Here's a brief description of my project.

## Table of Contents

- [Introduction](#introduction)

## Introduction

colab_code_generator_FT_code_gen_UT is an instruction-following large language model fine-tuned from 'Salesforce/codegen-350M-mono' (a base model licensed for commercial use), trained on Google Colab Pro with a T4 GPU. Code Generator_UT was fine-tuned on ~19k instruction/response records from 'iamtarun/python_code_instructions_18k_alpaca'.
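
Below is a minimal loading-and-generation sketch, not a definitive recipe. It assumes the PEFT adapter is published under the repo id `01GangaPutraBheeshma/colab_code_generator_FT_code_gen_UT` (inferred from this model card; adjust if your copy lives elsewhere) and that prompts follow the Alpaca instruction format used by the fine-tuning dataset.

```python
# Minimal usage sketch for the fine-tuned code generator.
# Assumption: adapter_id below is inferred from this model card and may differ.
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

base_id = "Salesforce/codegen-350M-mono"
adapter_id = "01GangaPutraBheeshma/colab_code_generator_FT_code_gen_UT"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id)

# Attach the fine-tuned PEFT adapter on top of the base causal LM.
model = PeftModel.from_pretrained(base_model, adapter_id)

# Alpaca-style prompt, matching the format of the fine-tuning dataset.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWrite a Python function that reverses a string.\n\n"
    "### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since the base model is a causal LM, layering the adapter onto `AutoModelForCausalLM` with `PeftModel.from_pretrained` is the standard PEFT pattern; if you need faster inference, `model.merge_and_unload()` folds the adapter weights into the base model.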