---
license: bigcode-openrail-m
datasets:
- bigcode/the-stack-dedup
pipeline_tag: text-generation
tags:
- code
---
Santacoder finetuned on Shadertoys for 1000 steps with a batch size of 2 and a full sequence length of 2048. The original finetuning script can be found here; an adapted version will follow soon.
The main purpose of this model is to explore whether finetuning improves performance on ShaderEval; results to follow.
The license is carried over from the base model, and the finetuning dataset carries the same license.
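A minimal usage sketch with `transformers`, along the lines of the base Santacoder card. The checkpoint id below is a placeholder for this finetuned model's repo id (an assumption, substitute the actual one); `trust_remote_code=True` is required because Santacoder uses a custom model implementation.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Placeholder: replace with this finetuned checkpoint's actual repo id.
checkpoint = "bigcode/santacoder"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, trust_remote_code=True)

# A typical Shadertoy entry point as the prompt.
prompt = "void mainImage( out vec4 fragColor, in vec2 fragCoord ) {"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
completion = tokenizer.decode(outputs[0])
print(completion)
```

Generation settings (sampling, temperature, stop criteria) are left at their defaults here; tune them for your use case.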