versae committed on
Commit ba7767f
1 Parent(s): a4b1fcf

Update README.md

Files changed (1)
  1. README.md +189 -73
README.md CHANGED
@@ -28,76 +28,192 @@ metrics:
  library_name: transformers
  ---
 
- # ALBERTI
-
- ALBERTI is a set of two BERT-based multilingual models for poetry: one for verses and another for stanzas. This model has been further trained with the PULPO corpus for verses using [Flax](https://github.com/google/flax), including training scripts.
-
- See https://arxiv.org/abs/2307.01387.
-
- This is part of the
- [Flax/Jax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organised by [HuggingFace](https://huggingface.co/), with TPU usage sponsored by Google.
-
- ## PULPO
-
- PULPO, the Prolific Unannotated Literary Poetry Corpus, is a set of multilingual corpora of verses and stanzas with over 72M words.
-
- The following corpora have been downloaded using the [Averell](https://github.com/linhd-postdata/averell/) tool, developed by the [POSTDATA](https://postdata.linhd.uned.es/) team:
-
- ### Spanish
- - [Disco v3](https://github.com/pruizf/disco)
- - [Corpus of Spanish Golden-Age Sonnets](https://github.com/bncolorado/CorpusSonetosSigloDeOro)
- - [Corpus general de poesía lírica castellana del Siglo de Oro](https://github.com/bncolorado/CorpusGeneralPoesiaLiricaCastellanaDelSigloDeOro)
- - [Gongocorpus](https://github.com/linhd-postdata/gongocorpus) - [source](http://obvil.sorbonne-universite.site/corpus/gongora/gongora_obra-poetica)
- ### English
- - [Eighteenth-Century Poetry Archive (ECPA)](https://github.com/alhuber1502/ECPA)
- - [For better for verse](https://github.com/waynegraham/for_better_for_verse)
- ### French
- - [Métrique en Ligne](https://crisco2.unicaen.fr/verlaine/index.php?navigation=accueil) - [source](https://github.com/linhd-postdata/metrique-en-ligne)
- ### Italian
- - [Biblioteca italiana](https://github.com/linhd-postdata/biblioteca_italiana) - [source](http://www.bibliotecaitaliana.it/)
- ### Czech
- - [Corpus of Czech Verse](https://github.com/versotym/corpusCzechVerse)
- ### Portuguese
- - [Stichotheque](https://gitlab.com/stichotheque/stichotheque-pt)
-
- Also, we obtained the following corpora from these sources:
- ### Spanish
- - [Poesi.as](https://github.com/linhd-postdata/poesi.as) - [source](http://www.poesi.as/)
- ### English
- - [A Gutenberg Poetry Corpus](https://github.com/aparrish/gutenberg-poetry-corpus)
- ### Arabic
- - [Arabic Poetry dataset](https://www.kaggle.com/ahmedabelal/arabic-poetry)
- ### Chinese
- - [THU Chinese Classical Poetry Corpus](https://github.com/THUNLP-AIPoet/Datasets/tree/master/CCPC)
- ### Finnish
- - [SKVR](https://github.com/sks190/SKVR)
- ### German
- - [TextGrid Poetry Corpus](https://github.com/linhd-postdata/textgrid-poetry) - [source](https://textgrid.de/en/digitale-bibliothek)
- - [German Rhyme Corpus](https://github.com/tnhaider/german-rhyme-corpus)
- ### Hungarian
- - [verskorpusz](https://github.com/ELTE-DH/verskorpusz)
- ### Portuguese
- - [Poems in Portuguese](https://www.kaggle.com/oliveirasp6/poems-in-portuguese)
- ### Russian
- - [19 000 Russian poems](https://www.kaggle.com/grafstor/19-000-russian-poems)
-
- ## Team members
-
- - Álvaro Pérez ([alvp](https://huggingface.co/alvp))
- - Javier de la Rosa ([versae](https://huggingface.co/versae))
- - Aitor Díaz ([aitordiaz](https://huggingface.co/aitordiaz))
- - Elena González-Blanco
- - Salvador Ros ([salva](https://huggingface.co/salva))
-
- ## Useful links
-
- - [Community Week timeline](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104#summary-timeline-calendar-6)
- - [Community Week README](https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md)
- - [Community Week thread](https://discuss.huggingface.co/t/bertin-pretrain-roberta-large-from-scratch-in-spanish/7125)
- - [Community Week channel](https://discord.com/channels/858019234139602994/859113060068229190)
- - [Masked Language Modelling example scripts](https://github.com/huggingface/transformers/tree/master/examples/flax/language-modeling)
- - [Model Repository](https://huggingface.co/flax-community/alberti-bert-base-multilingual-cased/)
-
- ## Acknowledgments
-
- This project would not have been possible without the infrastructure and resources provided by HuggingFace and Google Cloud. Moreover, we want to thank the POSTDATA Project (ERC-StG-679528) and the Computational Literary Studies Infrastructure (CLS INFRA No. 101004984) of the European Union's Horizon 2020 research and innovation programme for their support and time allowance.
+ # Model Card for Aʟʙᴇʀᴛɪ
+
+ Aʟʙᴇʀᴛɪ is the first multilingual domain-specific language model for poetry analysis.
+
+ ## Model Details
+
+ ### Model Description
+
+ As a pre-trained language model, Aʟʙᴇʀᴛɪ is trained with the masked language modeling objective on top of [multilingual BERT](https://huggingface.co/bert-base-multilingual-cased), and therefore **it needs to be fine-tuned for specific tasks**.
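+
+ Since the base model only ships with the masked-language-modeling head, a sketch like the one below shows what adapting it to a downstream task could look like. It is illustrative only: the binary classification task, toy data, and hyperparameters are made up, and the model id is assumed from the repository link in the previous version of this README.
+
+ ```python
+ # Hypothetical fine-tuning sketch: Aʟʙᴇʀᴛɪ is a pre-trained encoder, so a
+ # task head (here, a toy binary sequence classifier) is trained on top of it.
+ from datasets import Dataset
+ from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
+                           Trainer, TrainingArguments)
+
+ model_id = "flax-community/alberti-bert-base-multilingual-cased"  # assumed id
+ tokenizer = AutoTokenizer.from_pretrained(model_id)
+ model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)
+
+ # Made-up verse/prose examples, purely for illustration.
+ data = Dataset.from_dict({
+     "text": ["En tanto que de rosa y azucena", "The invoice is attached below."],
+     "label": [1, 0],
+ }).map(lambda batch: tokenizer(batch["text"], truncation=True,
+                                padding="max_length", max_length=32),
+        batched=True)
+
+ trainer = Trainer(
+     model=model,
+     args=TrainingArguments(output_dir="alberti-finetuned", num_train_epochs=1,
+                            per_device_train_batch_size=2),
+     train_dataset=data,
+ )
+ trainer.train()
+ ```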
+
+ - **Developed by:** [Javier de la Rosa](https://huggingface.co/versae) and [Álvaro Pérez Pozo](https://huggingface.co/alvp)
+ - **Shared by:** [Javier de la Rosa](https://huggingface.co/versae)
+ - **Model type:** `bert`
+ - **Language(s) (NLP):** Spanish, French, Italian, Czech, Portuguese, English, Arabic, Finnish, German, Russian, Hungarian, Chinese
+ - **License:** [CC-BY 4.0](https://creativecommons.org/licenses/by/4.0/)
+ - **Finetuned from model:** [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased)
+
+ ### Model Sources [optional]
+
+ <!-- Provide the basic links for the model. -->
+
+ - **Paper:** https://arxiv.org/abs/2307.01387
+ - **Demo:** https://huggingface.co/spaces/linhd-postdata/alberti
+
+ ## Uses
+
+ <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
+
+ ### Direct Use
+
+ <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
+
+ [More Information Needed]
+
+ ### Downstream Use [optional]
+
+ <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
+
+ [More Information Needed]
+
+ ### Out-of-Scope Use
+
+ <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
+
+ [More Information Needed]
+
+ ## Bias, Risks, and Limitations
+
+ <!-- This section is meant to convey both technical and sociotechnical limitations. -->
+
+ [More Information Needed]
+
+ ### Recommendations
+
+ <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
+
+ Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
+
+ ## How to Get Started with the Model
+
+ Use the code below to get started with the model.
+
+ [More Information Needed]
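+
+ Until an official snippet lands here, a minimal sketch (assumptions: the model id from the previous version of this README, and the fill-mask task that the masked-language-modeling objective supports out of the box):
+
+ ```python
+ # Hypothetical usage sketch: the pre-trained model can fill a masked word in
+ # a verse via the fill-mask pipeline; other tasks require fine-tuning first.
+ from transformers import pipeline
+
+ fill_mask = pipeline(
+     "fill-mask",
+     model="flax-community/alberti-bert-base-multilingual-cased",  # assumed id
+ )
+
+ for pred in fill_mask("En tanto que de rosa y [MASK]"):
+     print(pred["token_str"], round(pred["score"], 3))
+ ```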
+
+ ## Training Details
+
+ ### Training Data
+
+ <!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
+
+ [More Information Needed]
+
+ ### Training Procedure
+
+ <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
+
+ #### Preprocessing [optional]
+
+ [More Information Needed]
+
+ #### Training Hyperparameters
+
+ - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
+
+ #### Speeds, Sizes, Times [optional]
+
+ <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
+
+ [More Information Needed]
+
+ ## Evaluation
+
+ <!-- This section describes the evaluation protocols and provides the results. -->
+
+ ### Testing Data, Factors & Metrics
+
+ #### Testing Data
+
+ <!-- This should link to a Data Card if possible. -->
+
+ [More Information Needed]
+
+ #### Factors
+
+ <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
+
+ [More Information Needed]
+
+ #### Metrics
+
+ <!-- These are the evaluation metrics being used, ideally with a description of why. -->
+
+ [More Information Needed]
+
+ ### Results
+
+ [More Information Needed]
+
+ #### Summary
+
+ ## Model Examination [optional]
+
+ <!-- Relevant interpretability work for the model goes here -->
+
+ [More Information Needed]
+
+ ## Environmental Impact
+
+ <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
+
+ Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700); a back-of-the-envelope sketch follows the list below.
+
+ - **Hardware Type:** [More Information Needed]
+ - **Hours used:** [More Information Needed]
+ - **Cloud Provider:** [More Information Needed]
+ - **Compute Region:** [More Information Needed]
+ - **Carbon Emitted:** [More Information Needed]
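+
+ As a stopgap while the fields above are unfilled, the calculator's core estimate reduces to average power draw × training time × regional carbon intensity. A sketch with placeholder numbers:
+
+ ```python
+ # Back-of-the-envelope CO2eq estimate in the spirit of Lacoste et al. (2019).
+ # Every value is a placeholder; the real hardware, hours, and region for this
+ # model are still marked "More Information Needed" above.
+ power_kw = 0.28          # placeholder: average device power draw (kW)
+ hours = 100.0            # placeholder: total training time (h)
+ kg_co2_per_kwh = 0.4     # placeholder: grid carbon intensity (kg CO2eq/kWh)
+
+ print(f"{power_kw * hours * kg_co2_per_kwh:.1f} kg CO2eq")
+ ```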
+
+ ## Technical Specifications [optional]
+
+ ### Model Architecture and Objective
+
+ [More Information Needed]
+
+ ### Compute Infrastructure
+
+ [More Information Needed]
+
+ #### Hardware
+
+ [More Information Needed]
+
+ #### Software
+
+ [More Information Needed]
+
+ ## Citation [optional]
+
+ <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
+
+ **BibTeX:**
+
+ [More Information Needed]
+
+ **APA:**
+
+ [More Information Needed]
+
+ ## Glossary [optional]
+
+ <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
+
+ [More Information Needed]
+
+ ## More Information [optional]
+
+ [More Information Needed]
+
+ ## Model Card Authors [optional]
+
+ [More Information Needed]
+
+ ## Model Card Contact
+
+ [More Information Needed]
+