gustavecortal committed
Commit c7e1806
1 parent: 75eba8f

Create README.md

Files changed (1)
  1. README.md +29 -0 (added)
---
language: fr
license: mit
tags:
- causal-lm
- fr
datasets:
- c4
- The Pile
---

### Quantized Cedille/fr-boris with 8-bit weights

This is a version of Cedille's GPT-J (fr-boris), a 6-billion-parameter model, modified so that you can generate from **and fine-tune the model in Colab or on an equivalent desktop GPU (e.g. a single 1080Ti)**.
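The idea behind such 8-bit checkpoints is to store the large weight matrices as int8 and dequantize them on the fly during the forward pass, cutting weight memory roughly 4x versus float32. Below is a minimal, illustrative sketch of per-row absmax quantization; the exact scheme used by this checkpoint may differ, and the function names here are placeholders, not part of any library API.

```python
import numpy as np

def quantize_8bit(w: np.ndarray):
    """Quantize a float32 weight matrix to int8 with one scale per row (absmax)."""
    scale = np.abs(w).max(axis=1, keepdims=True) / 127.0  # map [-absmax, absmax] -> [-127, 127]
    scale[scale == 0] = 1.0                               # avoid division by zero on all-zero rows
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale.astype(np.float32)

def dequantize_8bit(q: np.ndarray, scale: np.ndarray) -> np.ndarray:
    """Recover an approximate float32 matrix from int8 weights and per-row scales."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 256)).astype(np.float32)
q, s = quantize_8bit(w)
w_hat = dequantize_8bit(q, s)

print(q.nbytes / w.nbytes)                    # 0.25 -> weights take 4x less memory
print(float(np.abs(w - w_hat).max()) < 0.05)  # rounding error is bounded by scale/2
```

Applied per linear layer, with dequantization happening inside the forward pass, this is what lets a 6B-parameter model generate and fine-tune within the memory of a single consumer GPU.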

Here's how to run it: [![colab](https://camo.githubusercontent.com/84f0493939e0c4de4e6dbe113251b4bfb5353e57134ffd9fcab6b8714514d4d1/68747470733a2f2f636f6c61622e72657365617263682e676f6f676c652e636f6d2f6173736574732f636f6c61622d62616467652e737667)](https://colab.research.google.com/drive/1ft6wQU0BhqG5PRlwgaZJv2VukKKjU4Es)

## fr-boris

Boris is a 6B-parameter autoregressive language model based on the GPT-J architecture and trained using the [mesh-transformer-jax](https://github.com/kingoflolz/mesh-transformer-jax) codebase.

Boris was trained on around 78B tokens of French text from the [C4](https://huggingface.co/datasets/c4) dataset.

## Links

* [Cedille](https://en.cedille.ai/)
* [Hivemind](https://training-transformers-together.github.io/)
* [My Twitter](https://twitter.com/gustavecortal)