widget:
- text: "Las [MASK] son adictivas."
---

<img src="o.svg" align="left" alt="logo" width="40" style="margin-right: 5px;" />

LudoBETO is a domain adaptation of a [Spanish BERT](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased) language model. <br clear="left"/> It was adapted to the pathological gambling domain with a corpus extracted from a specialised [forum](https://www.ludopatia.org/web/index_es.htm). Using an LLM, we automatically compiled a lexical resource that guides the masking process of the language model, helping it pay more attention to words related to pathological gambling.
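The lexicon-guided masking described above can be sketched as follows. This is a minimal illustration, not the exact procedure used for LudoBETO: the lexicon entries, the `fallback_rate`, and the always-mask policy for domain terms are assumptions (standard BERT pretraining masks roughly 15% of tokens at random).

```python
import random

# Hypothetical excerpt of the LLM-compiled gambling lexicon (illustration only).
GAMBLING_LEXICON = {"apuestas", "casino", "tragaperras", "deudas", "recaída"}

def targeted_masking(tokens, lexicon, mask_token="[MASK]", fallback_rate=0.15, seed=0):
    """Mask domain terms preferentially; otherwise fall back to random masking."""
    rng = random.Random(seed)
    masked = []
    for tok in tokens:
        if tok.lower() in lexicon:
            masked.append(mask_token)   # domain term: always masked in this sketch
        elif rng.random() < fallback_rate:
            masked.append(mask_token)   # BERT-style random masking for other tokens
        else:
            masked.append(tok)
    return masked

tokens = "Las apuestas son adictivas".split()
print(targeted_masking(tokens, GAMBLING_LEXICON))
# → ['Las', '[MASK]', 'son', 'adictivas']
```

Biasing the mask distribution toward lexicon terms forces the model to predict domain vocabulary from context, which is the mechanism the paragraph above relies on.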
To train the model we used a batch size of 8, the Adam optimizer with a learning rate of 2e-5, and cross-entropy as the loss function. We trained for four epochs on an NVIDIA GeForce RTX 4070 12 GB GPU.
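The reported hyperparameters can be sketched as a minimal PyTorch training loop. The tiny linear layer here is only a stand-in for BETO's masked-LM head, and the random tensors stand in for a tokenized, masked batch; real fine-tuning would load the model with `transformers` and feed the forum corpus.

```python
import torch
from torch import nn

# Hyperparameters reported in this model card.
BATCH_SIZE = 8
LEARNING_RATE = 2e-5
EPOCHS = 4

# Stand-in for the masked-LM head (hypothetical sizes, illustration only).
vocab_size, hidden = 100, 16
model = nn.Linear(hidden, vocab_size)
optimizer = torch.optim.Adam(model.parameters(), lr=LEARNING_RATE)
loss_fn = nn.CrossEntropyLoss()  # cross-entropy over the vocabulary

# Random tensors standing in for one masked batch of the corpus.
inputs = torch.randn(BATCH_SIZE, hidden)
labels = torch.randint(0, vocab_size, (BATCH_SIZE,))
for epoch in range(EPOCHS):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), labels)  # predict the masked tokens
    loss.backward()
    optimizer.step()
```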