clavel committed on
Commit 0804622
1 Parent(s): e354dfc

Update README.md

Files changed (1):
  1. README.md +22 -19

README.md CHANGED
@@ -1,20 +1,22 @@
  ---
- annotations_creators:
- - MajorIsaiah
- - Ximyer
- - clavel
- - inoid
- language_creators: [found]
- languages: [es]
- multilinguality: [monolingual]
- pretty_name: ''
- size_categories:
- - n=200
- source_datasets: [unam_tesis]
- task_categories: [text-classification]
- task_ids: [language-modeling ]
- license: apache-2.0
- ---
  # Unam_tesis_beto_finnetuning: Unam's thesis classification with BETO

  This model is created from the finetuning of the pre-model
@@ -25,7 +27,7 @@ possible careers at the UNAM.

  ## Training Dataset

- 1000 documents (Thesis introduction, Author´s first name, Author´s last name, Thesis title, Year, Career )

  | Careers | Size |
  |--------------|----------------------|
@@ -39,6 +41,7 @@ possible careers at the UNAM.

  ## Example of use

  For further details on how to use unam_tesis_beto_finnetuning you can visit the Huggingface Transformers library, starting with the Quickstart section. Unam_tesis models can be accessed simply as 'hackathon-pln-e/unam_tesis_beto_finnetuning' by using the Transformers library. An example of how to download and use the models on this page can be found in this colab notebook.

  ```python

  tokenizer = AutoTokenizer.from_pretrained('hiiamsid/BETO_es_binary_classification', use_fast=False)
@@ -53,13 +56,13 @@ For further details on how to use unam_tesis_beto_finnetuning you can visit the

  To cite this resource in a publication please use the following:

-
  ## Citation

- [UNAM's Tesis with BETO finetuning classify ](https://huggingface.co/hackathon-pln-es/unam_tesis_BETO_finnetuning)

  To cite this resource in a publication please use the following:

  ```
  @inproceedings{SpanishNLPHackaton2022,
  title={UNAM's Theses with BETO fine-tuning classify },
 
+ ---
+ annotations_creators: "MajorIsaiah, Ximyer, clavel, inoid,"
+ language_creators: found
+ languages:
+ - es
+ license:
+ - apache-2.0
+ multilinguality:
+ - monolingual
+ size_categories:
+ - n=200
+ source_datasets:
+ - unam_tesis
+ task_categories:
+ - text-classification
+ task_ids:
+ - language-modeling
  ---
+
  # Unam_tesis_beto_finnetuning: Unam's thesis classification with BETO

  This model is created from the finetuning of the pre-model

  ## Training Dataset

+ 1000 documents (Thesis introduction, Author's first name, Author's last name, Thesis title, Year, Career)

  | Careers | Size |
  |--------------|----------------------|

  ## Example of use

  For further details on how to use unam_tesis_beto_finnetuning you can visit the Huggingface Transformers library, starting with the Quickstart section. Unam_tesis models can be accessed simply as 'hackathon-pln-e/unam_tesis_beto_finnetuning' by using the Transformers library. An example of how to download and use the models on this page can be found in this colab notebook.
+
  ```python

  tokenizer = AutoTokenizer.from_pretrained('hiiamsid/BETO_es_binary_classification', use_fast=False)
 

  To cite this resource in a publication please use the following:

  ## Citation

+ [UNAM's Tesis with BETO finetuning classify](https://huggingface.co/hackathon-pln-es/unam_tesis_BETO_finnetuning)

  To cite this resource in a publication please use the following:
+
  ```
  @inproceedings{SpanishNLPHackaton2022,
  title={UNAM's Theses with BETO fine-tuning classify },
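
The usage snippet in the hunk above is truncated by the diff; the post-processing step it leads up to can be sketched as below. This is a minimal sketch, not the author's code: the career label list is an assumption (the diff shows only a truncated Careers table), and the commented model calls assume the standard Transformers sequence-classification API.

```python
# Sketch of mapping the classifier's output logits to a career label.
# ASSUMPTION: label order and career names are illustrative, not confirmed
# by the diff above.
CAREERS = ['Actuaría', 'Derecho', 'Economía', 'Psicología',
           'Química Farmacéutico Biológica']

def career_from_logits(logits):
    """Return the career whose logit is highest (argmax over class scores)."""
    return CAREERS[max(range(len(logits)), key=lambda i: logits[i])]

# In practice the logits would come from the fine-tuned model, e.g.:
#   from transformers import AutoTokenizer, AutoModelForSequenceClassification
#   tokenizer = AutoTokenizer.from_pretrained(
#       'hiiamsid/BETO_es_binary_classification', use_fast=False)
#   model = AutoModelForSequenceClassification.from_pretrained(
#       'hackathon-pln-es/unam_tesis_beto_finnetuning')
#   logits = model(**tokenizer(text, return_tensors='pt',
#                              truncation=True)).logits[0].tolist()
print(career_from_logits([0.1, 2.3, 0.4, 0.2, 0.5]))  # prints Derecho
```

The helper works on a plain list so it applies equally to `logits.tolist()` from PyTorch or to scores from a `pipeline('text-classification', ...)` call.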