IProject-10 committed
Commit ce4b09b
Parent: a5b077e

Update README.md

Files changed (1):
  1. README.md +97 -15
README.md CHANGED
@@ -10,30 +10,109 @@ model-index:
  results: []
  language:
  - en
  ---

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->

- # xlm-roberta-base-finetuned-squad2
-
- This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the squad_v2 dataset.
- It achieves the following results on the evaluation set:
- - Loss: 0.9802
-
  ## Model description

- More information needed
-
- ## Intended uses & limitations

- More information needed

- ## Training and evaluation data
-
- More information needed

- ## Training procedure

  ### Training hyperparameters

@@ -54,7 +133,10 @@ The following hyperparameters were used during training:
  | 0.8013 | 2.0 | 16666 | 0.8910 |
  | 0.5918 | 3.0 | 24999 | 0.9802 |

-
  ### Framework versions

  - Transformers 4.31.0
@@ -10,30 +10,109 @@ model-index:
  results: []
  language:
  - en
+ - ar
+ - de
+ - el
+ - es
+ - hi
+ - ro
+ - ru
+ - th
+ - tr
+ - vi
+ - zh
+ metrics:
+ - exact_match
+ - f1
+ pipeline_tag: question-answering
  ---

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->

  ## Model description

+ XLM-RoBERTa is a multilingual version of RoBERTa developed by Facebook AI. It is pre-trained on 2.5 TB of filtered CommonCrawl data covering 100 languages.
+ It is an extension of RoBERTa, which is itself a variant of the BERT model. XLM-RoBERTa is designed to handle multiple languages and demonstrates strong performance across a wide range of tasks, making it highly useful for multilingual natural language processing (NLP) applications.
+
+ **Language model:** xlm-roberta-base
+ **Languages:** multilingual (the tags above list en, ar, de, el, es, hi, ro, ru, th, tr, vi, zh)
+ **Downstream task:** question answering
+ **Training data:** SQuAD 2.0 training set
+ **Evaluation data:** SQuAD 2.0 validation set
+ **Hardware accelerator:** Tesla T4 GPU

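The examples in the next section use the high-level `pipeline` API. For readers who want more control, here is a minimal sketch, not part of the original card, that loads the checkpoint explicitly and decodes the answer span by hand; the repository id is assumed from this card's title:

```python
# Sketch: explicit model/tokenizer loading instead of pipeline().
# The checkpoint id "IProject-10/xlm-roberta-base-finetuned-squad2" is assumed
# from this card's title.
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_name = "IProject-10/xlm-roberta-base-finetuned-squad2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

question = "What is the height of the Statue of Unity?"
context = "The Statue of Unity is the world's tallest statue, with a height of 182 metres (597 feet)."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Take the most likely start/end token positions and decode that span.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])
print(answer)
```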
+ ## Intended uses & limitations
+
+ For question answering in English:
+
+ ```python
+ # pip install transformers
+ from transformers import pipeline
+
+ model_checkpoint = "IProject-10/xlm-roberta-base-finetuned-squad2"
+ question_answerer = pipeline("question-answering", model=model_checkpoint)
+
+ context = """
+ The Statue of Unity is the world's tallest statue, with a height of 182 metres (597 feet), located near Kevadia in the state of Gujarat, India.
+ """
+
+ question = "What is the height of the Statue of Unity?"
+ question_answerer(question=question, context=context)
+ ```
+ For question answering in Hindi:
+
+ ```python
+ # pip install transformers
+ from transformers import pipeline
+
+ model_checkpoint = "IProject-10/xlm-roberta-base-finetuned-squad2"
+ question_answerer = pipeline("question-answering", model=model_checkpoint)
+
+ # Context (Hindi): "The Statue of Unity is the world's tallest statue, with a height
+ # of 182 metres (597 feet), located near Kevadia in the state of Gujarat, India."
+ context = """
+ स्टैच्यू ऑफ यूनिटी दुनिया की सबसे ऊंची प्रतिमा है, जिसकी ऊंचाई 182 मीटर (597 फीट) है, जो भारत के गुजरात राज्य में केवडिया के पास स्थित है।
+ """
+
+ # Question (Hindi): "What is the height of the Statue of Unity?"
+ question = "स्टैच्यू ऑफ यूनिटी की ऊंचाई कितनी है?"
+ question_answerer(question=question, context=context)
+ ```
+
+ For question answering in Spanish:
+
+ ```python
+ # pip install transformers
+ from transformers import pipeline
+
+ model_checkpoint = "IProject-10/xlm-roberta-base-finetuned-squad2"
+ question_answerer = pipeline("question-answering", model=model_checkpoint)
+
+ # Context (Spanish): "The Statue of Unity is the world's tallest statue, with a height
+ # of 182 metres (597 feet), located near Kevadia in the state of Gujarat, India."
+ context = """
+ La Estatua de la Unidad es la estatua más alta del mundo, con una altura de 182 metros (597 pies), ubicada cerca de Kevadia en el estado de Gujarat, India.
+ """
+
+ # Question (Spanish): "What is the height of the Statue of Unity?"
+ question = "¿Cuál es la altura de la estatua de la Unidad?"
+ question_answerer(question=question, context=context)
+ ```
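As a usage note: each `question_answerer` call returns a dictionary with `score`, `start`, `end`, and `answer` keys, for example `{'score': 0.97, 'start': 57, 'end': 78, 'answer': '182 metres (597 feet)'}` (values illustrative, not actual model output).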
+
+ ## Results
+
+ Evaluation on SQuAD 2.0 validation dataset:
+
+ ```
+ exact: 75.51587635812348,
+ f1: 78.7328391907263,
+ total: 11873,
+ HasAns_exact: 73.00944669365722,
+ HasAns_f1: 79.45259779208723,
+ HasAns_total: 5928,
+ NoAns_exact: 78.01513877207738,
+ NoAns_f1: 78.01513877207738,
+ NoAns_total: 5945,
+ best_exact: 75.51587635812348,
+ best_exact_thresh: 0.999241054058075,
+ best_f1: 78.73283919072665,
+ best_f1_thresh: 0.999241054058075,
+ total_time_in_seconds: 218.97641910400125,
+ samples_per_second: 54.220450076686134,
+ latency_in_seconds: 0.018443225730986376
+ ```
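These fields follow the output format of the official SQuAD 2.0 evaluation script. As a sketch, not from the original card, metrics in this format can be produced with the `evaluate` library; the example id and answer strings below are hypothetical placeholders:

```python
# Sketch: computing SQuAD 2.0 metrics with the `evaluate` library.
# The id and answer strings are hypothetical placeholders.
import evaluate

squad_v2_metric = evaluate.load("squad_v2")

predictions = [{
    "id": "example-1",                 # hypothetical example id
    "prediction_text": "182 metres",
    "no_answer_probability": 0.0,      # SQuAD 2.0 also scores unanswerable questions
}]
references = [{
    "id": "example-1",
    "answers": {"text": ["182 metres"], "answer_start": [57]},
}]

print(squad_v2_metric.compute(predictions=predictions, references=references))
# Produces keys such as exact, f1, HasAns_exact, NoAns_f1, best_exact, ...
```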

  ### Training hyperparameters

@@ -54,7 +133,10 @@ The following hyperparameters were used during training:
  | 0.8013 | 2.0 | 16666 | 0.8910 |
  | 0.5918 | 3.0 | 24999 | 0.9802 |

+ This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the squad_v2 dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.9802
+
  ### Framework versions

  - Transformers 4.31.0
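For context on the hyperparameters section above, here is a minimal fine-tuning sketch. Only the 3 training epochs are implied by the loss table (evaluation rows at epochs 1.0 through 3.0); the learning rate, batch size, and weight decay are illustrative assumptions, not values taken from this card:

```python
# Sketch: the general shape of fine-tuning xlm-roberta-base on SQuAD 2.0 with
# transformers 4.31.0. Values marked "placeholder" are illustrative assumptions,
# not this card's actual settings.
from datasets import load_dataset
from transformers import (AutoModelForQuestionAnswering, AutoTokenizer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForQuestionAnswering.from_pretrained("xlm-roberta-base")
squad_v2 = load_dataset("squad_v2")  # validation split has 11,873 examples

args = TrainingArguments(
    output_dir="xlm-roberta-base-finetuned-squad2",
    evaluation_strategy="epoch",     # one eval per epoch, as in the table above
    num_train_epochs=3,              # implied by the table's epochs 1.0-3.0
    learning_rate=2e-5,              # placeholder
    per_device_train_batch_size=16,  # placeholder
    weight_decay=0.01,               # placeholder
)

# Actual training additionally requires tokenizing question/context pairs and
# mapping character-level answer spans to token positions before passing the
# processed dataset and `args` to a Trainer; that preprocessing is omitted here.
```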