Update README.md
README.md (CHANGED)
## Model description

In this project, we fine-tuned mT5-small, a multilingual variant of T5 pre-trained on a Common Crawl-based dataset (mC4) covering 101 languages.

The model was fine-tuned on the electric patent corpus using transfer learning, data augmentation, and hyperparameter tuning.
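
For orientation, the sketch below shows what a typical mT5-small fine-tuning setup looks like with Hugging Face `transformers`. The base checkpoint `google/mt5-small` is the public model; the dataset fields, sample text, and training values here are placeholders, not the exact configuration used for this model (see the hyperparameters section below).

```python
# Hedged sketch of the general fine-tuning recipe, not the exact script used
# for this model. Dataset fields and values are illustrative placeholders.
from datasets import Dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google/mt5-small")

# Toy stand-in for the electric patent corpus (the real data is not shown here).
train_data = Dataset.from_dict({
    "text": ["A rotor assembly for an electric machine, comprising ..."],
    "summary": ["Rotor assembly for an electric machine."],
})

def preprocess(batch):
    # Tokenize patent text as the encoder input and the summary as the target.
    inputs = tokenizer(batch["text"], max_length=512, truncation=True)
    labels = tokenizer(text_target=batch["summary"], max_length=64, truncation=True)
    inputs["labels"] = labels["input_ids"]
    return inputs

train_data = train_data.map(preprocess, batched=True, remove_columns=["text", "summary"])

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(
        output_dir="mt5-small-electric-patents",  # placeholder output path
        per_device_train_batch_size=8,            # illustrative value
        num_train_epochs=3,                       # illustrative value
    ),
    train_dataset=train_data,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```
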
## Intended uses & limitations

The fine-tuned model showed significant improvements on electric-patent-specific tasks compared to the original pre-trained checkpoint.

Note: This model is aimed at researchers working with electric patents; because it was fine-tuned on electric patents, it can be applied to related NLP problems in electric patent research.
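
As a usage sketch, the model can be loaded through the standard `transformers` pipeline API. The model id below is a placeholder; substitute this repository's actual id on the Hub.

```python
from transformers import pipeline

# Placeholder model id; replace with this repository's id on the Hub.
summarizer = pipeline("summarization", model="your-username/mt5-small-electric-patents")

claim = "A stator for an electric motor, the stator comprising a laminated core ..."
print(summarizer(claim, max_length=64)[0]["summary_text"])
```
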
## Training and evaluation data

A subset of electric patents was used to fine-tune the model.

The fine-tuned model was evaluated with the ROUGE metric on patent-domain natural language processing tasks, including named entity recognition and summarization.
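
ROUGE scores can be reproduced with the Hugging Face `evaluate` library; the prediction and reference strings below are toy examples, not outputs from the actual evaluation set.

```python
import evaluate

rouge = evaluate.load("rouge")

# Toy example pair; the real evaluation data is not reproduced here.
predictions = ["Rotor assembly for an electric machine."]
references = ["A rotor assembly for electric machines is disclosed."]

# Returns rouge1 / rouge2 / rougeL / rougeLsum scores.
print(rouge.compute(predictions=predictions, references=references))
```
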
## Training procedure

As described above, fine-tuning started from the pre-trained mT5-small checkpoint (transfer learning) and combined data augmentation with hyperparameter tuning.
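
The card does not specify which augmentation scheme was used; purely as an illustration of the kind of technique involved, a simple token-dropout augmenter could look like this:

```python
import random

def word_dropout(text: str, p: float = 0.1) -> str:
    """Randomly drop a fraction p of words; a generic text-augmentation example."""
    words = text.split()
    kept = [w for w in words if random.random() > p]
    # Fall back to the original text if everything was dropped.
    return " ".join(kept) if kept else text

print(word_dropout("A rotor assembly for an electric machine.", p=0.2))
```
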
### Training hyperparameters
The following hyperparameters were used during training: