---
language: fr
license: mit
tags:
- causal-lm
- fr 
datasets:
- c4
- The Pile
---

### Quantized Cedille/fr-boris with 8-bit weights


This is a version of Cedille's GPT-J (fr-boris), a 6-billion-parameter French language model, modified so that you can generate text **and fine-tune the model in Colab or on an equivalent desktop GPU (e.g. a single 1080Ti)**. Inspired by [GPT-J 8bit](https://huggingface.co/hivemind/gpt-j-6B-8bit).

Here's how to run it: [![colab](https://camo.githubusercontent.com/84f0493939e0c4de4e6dbe113251b4bfb5353e57134ffd9fcab6b8714514d4d1/68747470733a2f2f636f6c61622e72657365617263682e676f6f676c652e636f6d2f6173736574732f636f6c61622d62616467652e737667)](https://colab.research.google.com/drive/1lMja-CPc0vm5_-gXNXAWU-9c0nom7vZ9)
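
For intuition about what "8-bit weights" means here, the sketch below shows a simple per-row int8 quantize/dequantize round trip in plain PyTorch. It is only an illustration: the actual checkpoint follows the blockwise dynamic quantization used in the GPT-J 8bit work linked above, and the helper names (`quantize_rows`, `dequantize_rows`) are hypothetical.

```python
import torch

def quantize_rows(weight: torch.Tensor):
    # Store each row as int8 plus one float scale; roughly 4x smaller than fp32.
    scale = weight.abs().max(dim=1, keepdim=True).values / 127.0
    q = torch.clamp((weight / scale).round(), -127, 127).to(torch.int8)
    return q, scale

def dequantize_rows(q: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
    # Recover an approximate float weight for use in the forward pass.
    return q.to(torch.float32) * scale

w = torch.randn(1024, 1024)           # stand-in for one linear layer's weight
q, scale = quantize_rows(w)
w_approx = dequantize_rows(q, scale)
print((w - w_approx).abs().max())     # small quantization error
```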

The model can be loaded with the standard `GPTJForCausalLM` class:
```python
from transformers import GPTJForCausalLM

# Downloads the 8-bit checkpoint from the Hugging Face Hub.
model = GPTJForCausalLM.from_pretrained("gustavecortal/fr-boris-8bit")
```
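
For reference, here is a minimal generation sketch. The prompt and sampling parameters are illustrative, and it assumes the tokenizer files ship with the checkpoint so `AutoTokenizer` can load them:

```python
import torch
from transformers import AutoTokenizer, GPTJForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gustavecortal/fr-boris-8bit")
model = GPTJForCausalLM.from_pretrained("gustavecortal/fr-boris-8bit")

# Illustrative French prompt; tune max_new_tokens / temperature to taste.
prompt = "La Tour Eiffel se trouve à"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=40, do_sample=True, temperature=0.9)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```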

## fr-boris

Boris is a 6B-parameter autoregressive language model based on the GPT-J architecture and trained using the [mesh-transformer-jax](https://github.com/kingoflolz/mesh-transformer-jax) codebase.

Boris was trained on around 78B tokens of French text from the [C4](https://huggingface.co/datasets/c4) dataset.

## Links

* [Cedille](https://en.cedille.ai/)
* [Hivemind](https://training-transformers-together.github.io/)
* [Gustave Cortal](https://twitter.com/gustavecortal)