---
license: mit
datasets:
- ccnet-fr
language:
- fr
tags:
- pagnol
---

# PAGnol: An Extra-Large French Generative Model

Paper: [arXiv](https://arxiv.org/abs/2110.08554), [ACL Anthology](https://aclanthology.org/2022.lrec-1.455/)

Code: [GitHub](https://github.com/lightonai/lairgpt)

PAGnol is a collection of large French language models geared towards free-form text generation. With up to 1.5 billion parameters, PAGnol is based on the [GPT](https://arxiv.org/abs/2005.14165) architecture. It is the first language model trained by [LightOn](https://lighton.ai/), in cooperation with the [ALMAnaCH team of Inria](http://almanach.inria.fr/index-en.html).

These models were trained in early 2021, following the [scaling laws](https://arxiv.org/abs/2001.08361) known at the time, and using the exact same training data as the [CamemBERT](https://camembert-model.fr/) model trained on [CCNet](https://github.com/facebookresearch/cc_net). We make them available for reproducibility purposes.
They do not constitute the current state of the art, nor do they aim to.

PAGnol was built by [Julien Launay](https://lolo.science/), E.L. Tommasone, [Baptiste Pannier](https://www.linkedin.com/in/baptiste-pannier-b30758154/), [François Boniface](https://www.linkedin.com/in/fran%c3%a7ois-boniface-26313610b/), [Amélie Chatelain](https://www.instagram.com/amelietabatta/), [Iacopo Poli](https://twitter.com/iacopo_poli), and [Djamé Seddah](http://pauillac.inria.fr/~seddah/). It is named after Marcel Pagnol (with PAG standing for *pré-apprentissage génératif*, French for generative pre-training), and was trained on the IDRIS Jean Zay supercomputer thanks to a GENCI allocation.

The models were converted to the Hugging Face format by [Wissam Antoun](https://wissamantoun.com) ([ALMAnaCH](http://almanach.inria.fr/index-en.html)'s PhD student, co-supervised by [Benoît Sagot](https://pauillac.inria.fr/~sagot/) and [Djamé Seddah](http://pauillac.inria.fr/~seddah/)).

# Usage

### Using PAGnol with Hugging Face
```python
from transformers import pipeline

# trust_remote_code=True is required to load PAGnol's custom model code from the Hub
generator = pipeline('text-generation', model='lightonai/pagnol-medium', trust_remote_code=True)

output = generator(
    "Salut PAGnol, comment ça va ?",
    max_length=50,
    do_sample=True,
    temperature=0.7,
)[0]["generated_text"]

>>> "Très bien! Les jours d’été sont là ! Bientôt les premiers festivals..."
```
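
The pipeline above wraps tokenization and decoding; the same generation can also be done with the lower-level API. Below is a minimal sketch, assuming the converted checkpoints work with the standard `AutoTokenizer` and `AutoModelForCausalLM` auto classes (the custom model code on the Hub still requires `trust_remote_code=True`):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: the converted checkpoints load through the standard
# auto classes; trust_remote_code=True pulls PAGnol's custom code.
tokenizer = AutoTokenizer.from_pretrained("lightonai/pagnol-medium", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("lightonai/pagnol-medium", trust_remote_code=True)

# Tokenize the prompt and sample a continuation with the same
# settings as the pipeline example above.
inputs = tokenizer("Salut PAGnol, comment ça va ?", return_tensors="pt")
outputs = model.generate(**inputs, max_length=50, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
Working at this level exposes the full range of decoding options of `generate` (e.g. `top_k`, `top_p`, `repetition_penalty`) beyond what the pipeline example shows.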

### Using PAGnol with lairgpt
Head over to our [GitHub repository](https://github.com/lightonai/lairgpt) to access our PyTorch inference code.
Using PAGnol is as simple as running the following code:
```python
from lairgpt.models import PAGnol

# load the medium (355M-parameter) checkpoint
pagnol = PAGnol.medium()
pagnol("Salut PAGnol, comment ça va ?")

>>> "Très bien! Les jours d’été sont là ! Bientôt les premiers festivals..."
```

# License
PAGnol is made available under the MIT license: by downloading the models listed below, you agree to the terms of the MIT license agreement. Under no circumstances will LightOn and/or Inria be held responsible or liable in any way for any claims, damages, losses, expenses, costs or liabilities whatsoever (including, without limitation, any direct or indirect damages for loss of profits, business interruption or loss of information) resulting or arising directly or indirectly from your use of or inability to use PAGnol.

# Available Models
- [`lightonai/pagnol-small`](https://huggingface.co/lightonai/pagnol-small): 125M parameters
- [`lightonai/pagnol-medium`](https://huggingface.co/lightonai/pagnol-medium): 355M parameters
- [`lightonai/pagnol-large`](https://huggingface.co/lightonai/pagnol-large): 773M parameters
- [`lightonai/pagnol-xl`](https://huggingface.co/lightonai/pagnol-xl): 1.5B parameters
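
The parameter counts above can be checked directly. A minimal sketch, assuming each checkpoint loads through `AutoModelForCausalLM` with `trust_remote_code=True` as in the usage example:
```python
from transformers import AutoModelForCausalLM

# Instantiate each PAGnol checkpoint and count its parameters.
# Note: this downloads every model, including the 1.5B-parameter XL.
for name in ["pagnol-small", "pagnol-medium", "pagnol-large", "pagnol-xl"]:
    model = AutoModelForCausalLM.from_pretrained(f"lightonai/{name}", trust_remote_code=True)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{name}: {n_params / 1e6:.0f}M parameters")
```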

# Citation
```bibtex
@inproceedings{launay-etal-2022-pagnol,
    title = "{PAG}nol: An Extra-Large {F}rench Generative Model",
    author = "Launay, Julien  and
      Tommasone, E.l.  and
      Pannier, Baptiste  and
      Boniface, Fran{\c{c}}ois  and
      Chatelain, Am{\'e}lie  and
      Cappelli, Alessandro  and
      Poli, Iacopo  and
      Seddah, Djam{\'e}",
    editor = "Calzolari, Nicoletta  and
      B{\'e}chet, Fr{\'e}d{\'e}ric  and
      Blache, Philippe  and
      Choukri, Khalid  and
      Cieri, Christopher  and
      Declerck, Thierry  and
      Goggi, Sara  and
      Isahara, Hitoshi  and
      Maegaard, Bente  and
      Mariani, Joseph  and
      Mazo, H{\'e}l{\`e}ne  and
      Odijk, Jan  and
      Piperidis, Stelios",
    booktitle = "Proceedings of the Thirteenth Language Resources and Evaluation Conference",
    month = jun,
    year = "2022",
    address = "Marseille, France",
    publisher = "European Language Resources Association",
    url = "https://aclanthology.org/2022.lrec-1.455",
    pages = "4275--4284",
}
```
# Contact
- For research enquiries: pagnol@lighton.ai
- For business enquiries: customer.relations@lighton.ai