---
license: apache-2.0
widget:
- text: "Las [MASK] son adictivas."
---

logo

LudoBETO is a domain adaptation of a [Spanish BERT](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased) language model.
It was adapted to the pathological gambling domain with a corpus extracted from a specialised [forum](https://www.ludopatia.org/web/index_es.htm). We automatically compiled a lexical resource with an LLM to guide the masking process of the language model and, therefore, help it pay more attention to words related to pathological gambling. For training we used a batch size of 8, the Adam optimizer with a learning rate of 2e-5, and cross-entropy as the loss function. We trained the model for four epochs on an NVIDIA GeForce RTX 4070 GPU with 12 GB of memory.

## Usage

```python
from transformers import pipeline

pipe = pipeline("fill-mask", model="citiusLTL/ludoBETO")

# Returns the top predictions for the masked token, each with a score,
# the predicted token and the filled-in sequence.
predictions = pipe("Las [MASK] son adictivas.")
print(predictions)
```

## Load model directly

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("citiusLTL/ludoBETO")
model = AutoModelForMaskedLM.from_pretrained("citiusLTL/ludoBETO")
```
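
## Training sketch

The training recipe described above (masked-language-model fine-tuning of the Spanish BERT checkpoint on the forum corpus, batch size 8, learning rate 2e-5, four epochs) roughly corresponds to the standard Hugging Face MLM fine-tuning loop. The sketch below is a minimal illustration under those assumptions: the corpus file name `ludopatia_corpus.txt` is hypothetical, and the lexicon-guided masking step is stood in for by the default random-masking collator, so this is not the exact procedure used to produce ludoBETO.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

base = "dccuchile/bert-base-spanish-wwm-cased"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForMaskedLM.from_pretrained(base)

# Hypothetical plain-text dump of the forum corpus, one post per line.
dataset = load_dataset("text", data_files={"train": "ludopatia_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# Standard random masking; the lexicon-guided masking described above would
# replace this collator with one that favours gambling-related terms.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="ludoBETO",
    per_device_train_batch_size=8,   # batch size 8
    learning_rate=2e-5,              # lr 2e-5 (Trainer defaults to AdamW, an Adam variant)
    num_train_epochs=4,              # four epochs
)

# The masked-LM head computes the cross-entropy loss internally
# whenever the collator supplies `labels`.
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
```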