---
license: apache-2.0
---
# Unam_tesis_beto_finnetuning: UNAM's thesis classification with BETO

This model was created by fine-tuning the pre-trained Spanish-language model [BETO](https://huggingface.co/dccuchile/bert-base-spanish-wwm-uncased) with the PyTorch framework, training on a set of theses from the National Autonomous University of Mexico ([UNAM](https://tesiunam.dgb.unam.mx/F?func=find-b-0&local_base=TES01)). Given the text of a thesis, the model classifies it into one of five possible degree programs at UNAM: Psychology, Law, Pharmaceutical-Biological Chemistry (Química Farmacéutico Biológica), Actuarial Science (Actuaría), or Economics.
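The model card does not state which output index corresponds to which degree program. A minimal sketch, assuming the labels follow the order listed above and the default `LABEL_i` naming used by Transformers (the mapping and helper below are hypothetical, for illustration only):

```python
# Hypothetical mapping from output label index to degree program.
# The actual index order used during training is an assumption based
# on the order the careers are listed in this model card.
ID2CAREER = {
    0: "Psicología",
    1: "Derecho",
    2: "Química Farmacéutico Biológica",
    3: "Actuaría",
    4: "Economía",
}

def label_to_career(label: str) -> str:
    """Convert a pipeline label such as 'LABEL_2' to a career name."""
    index = int(label.split("_")[-1])
    return ID2CAREER[index]
```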
## Example of use
For further details on how to use unam_tesis_beto_finnetuning, you can visit the Hugging Face Transformers library documentation, starting with the Quickstart section. The unam_tesis model can be loaded simply as 'inoid/unam_tesis_beto_finnetuning' with the Transformers library. An example of how to download and use the model on this page can be found in this colab notebook.

```python
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    TextClassificationPipeline,
)

tokenizer = AutoTokenizer.from_pretrained('inoid/unam_tesis_beto_finnetuning', use_fast=False)
model = AutoModelForSequenceClassification.from_pretrained(
    'inoid/unam_tesis_beto_finnetuning', num_labels=5, output_attentions=False,
    output_hidden_states=False)

# Build a text-classification pipeline that returns the score for every label
pipe = TextClassificationPipeline(model=model, tokenizer=tokenizer, return_all_scores=True)

classificationResult = pipe("El objetivo de esta tesis es elaborar un estudio de las condiciones asociadas al aprendizaje desde casa")
```
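With `return_all_scores=True`, the pipeline returns one `{'label': ..., 'score': ...}` dict per class for each input. A small helper to extract the top prediction (hypothetical, not part of the model card; the scores below are made up for illustration) might look like:

```python
def top_prediction(scores):
    """Return the (label, score) pair with the highest score."""
    best = max(scores, key=lambda d: d["score"])
    return best["label"], best["score"]

# Made-up scores, shaped like one element of the pipeline's output:
example_scores = [
    {"label": "LABEL_0", "score": 0.05},
    {"label": "LABEL_1", "score": 0.10},
    {"label": "LABEL_2", "score": 0.70},
    {"label": "LABEL_3", "score": 0.05},
    {"label": "LABEL_4", "score": 0.10},
]
label, score = top_prediction(example_scores)  # label == "LABEL_2"
```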
## Citation

[UNAM theses classification with BETO fine-tuning](https://huggingface.co/hackathon-pln-es/unam_tesis_BETO_finnetuning)
To cite this resource in a publication please use the following:

```
@inproceedings{SpanishNLPHackaton2022,
    title={UNAM's Tesis with BETO finetuning classify},
    author={Cañete, Isahías and López, Dionis and Clavel, Yisell and López López, Ximena Yeraldin},
    booktitle={Somos NLP Hackathon 2022},
    year={2022}
}
```