---
license: openrail
datasets:
  - bigcode/the-stack-dedup
library_name: transformers
tags:
  - code_generation
  - R programming
  - sas
  - santacoder
---

# Statscoder

This model is a fine-tuned version of bigcode/santacoder, trained on R and SAS language repositories from the Stack dataset (bigcode/the-stack-dedup).
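
Below is a minimal usage sketch with the `transformers` library. It assumes the model is published under the `infinitylogesh/statscoder` model ID (adjust if the hosted name differs); SantaCoder-derived checkpoints require `trust_remote_code=True`.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical model ID for this checkpoint; replace with the actual hub path if different.
checkpoint = "infinitylogesh/statscoder"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, trust_remote_code=True)

# Complete an R snippet: read a CSV and start fitting a linear model.
prompt = "df <- read.csv('data.csv')\nfit <- lm("
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=48)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```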

## Training procedure

The model was fine-tuned using code adapted from loubnabnl/santacoder-finetuning, modified to handle multiple subsets of the dataset.

The following hyperparameters were used during training (an illustrative mapping to `TrainingArguments` follows the list):

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- optimizer: adafactor
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 100
- training_steps: 1600
- seq_length: 1024
- fp16: disabled
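
As an illustration only, the hyperparameters above would roughly map onto `transformers.TrainingArguments` as sketched below. The actual run used the adapted santacoder-finetuning script, not this exact snippet, and `seq_length` is handled there during tokenization/packing rather than by `TrainingArguments`. The output directory is hypothetical.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="statscoder-finetune",   # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,
    optim="adafactor",
    lr_scheduler_type="cosine",
    warmup_steps=100,
    max_steps=1600,
    seed=42,
    fp16=False,                          # training ran without fp16
)
```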