---
library_name: transformers
license: openrail
datasets:
- nuprl/MultiPL-T
---


# MultiPL-T DeepSeekCoder-33b-Base

This repository holds a fine-tune of [DeepSeekCoder-33b-base](https://huggingface.co/deepseek-ai/deepseek-coder-33b-base) on MultiPL-T Racket.
There is a checkpoint for each epoch; examine the commit messages to determine the language and checkpoint, as sketched below.
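
A minimal sketch for browsing those commit messages with the `huggingface_hub` client (the repository id below is a placeholder; substitute this repository's actual id):

```python
from huggingface_hub import list_repo_commits

# Placeholder repository id; substitute this repository's actual id.
for commit in list_repo_commits("nuprl/MultiPL-T-DeepSeekCoder-33b"):
    # Each commit message identifies the language and epoch of a checkpoint.
    print(commit.commit_id[:7], commit.title)
```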


For more information on the training process, see the MultiPL-T paper:

```
@misc{cassano:multipl-t,
      title={Knowledge Transfer from High-Resource to Low-Resource Programming Languages for Code LLMs}, 
      author={Federico Cassano and John Gouwar and Francesca Lucchetti and Claire Schlesinger and Anders Freeman and Carolyn Jane Anderson and Molly Q Feldman and Michael Greenberg and Abhinav Jangda and Arjun Guha},
      year={2024},
      eprint={2308.09895},
      archivePrefix={arXiv},
      primaryClass={cs.PL}
}
```

For usage instructions, see the model card for the original model. Replace the model name with the name of this repository, and set `revision=COMMIT_HASH` to load a specific checkpoint. A minimal sketch (the repository id and commit hash below are placeholders; substitute this repository's actual id and a hash from its commit history):
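
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholders: use this repository's actual id and a commit hash from its
# history (there is one checkpoint per epoch).
MODEL = "nuprl/MultiPL-T-DeepSeekCoder-33b"
REVISION = "COMMIT_HASH"

tokenizer = AutoTokenizer.from_pretrained(MODEL, revision=REVISION)
model = AutoModelForCausalLM.from_pretrained(MODEL, revision=REVISION)

# Complete a Racket definition, the language this fine-tune targets.
prompt = ";; Return the sum of a list of numbers.\n(define (sum-list xs)"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```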