sjrhuschlee committed
Commit b360ddf
1 Parent(s): fe4e3d2

Update README.md

Files changed (1)
  1. README.md +50 -16
README.md CHANGED
@@ -20,7 +20,7 @@ This is the [deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base)
  **Training data:** SQuAD 2.0
  **Eval data:** SQuAD 2.0
  **Code:** See [an example QA pipeline on Haystack](https://haystack.deepset.ai/tutorials/first-qa-system)
- **Infrastructure**:
+ **Infrastructure**: A10G Tensor Core GPU

  ## Hyperparameters

@@ -36,19 +36,53 @@ doc_stride = 128
  max_query_length = 64
  ```

- ```
- {'EM': 81.90853196327804,
- 'f1': 86.86782434367508,
- 'top_n_accuracy': 97.75120020213932,
- 'top_n': 4,
- 'EM_text_answer': 75.94466936572199,
- 'f1_text_answer': 85.87747611883509,
- 'top_n_accuracy_text_answer': 95.49595141700405,
- 'top_n_EM_text_answer': 79.21727395411607,
- 'top_n_f1_text_answer': 90.0904505126207,
- 'Total_text_answer': 5928,
- 'EM_no_answer': 87.85534062237174,
- 'f1_no_answer': 87.85534062237174,
- 'top_n_accuracy_no_answer': 100.0,
- 'Total_no_answer': 5945}
- ```
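The block removed above reported the model's SQuAD 2.0 evaluation results (exact match, F1, and top-n accuracy, split into questions with text answers and unanswerable ones). As a reference only, and not part of this commit, here is a minimal sketch of how SQuAD 2.0-style EM/F1 can be computed with the Hugging Face `evaluate` library; the prediction and reference records are invented for illustration.

```python
# Hedged sketch, not from the commit: SQuAD 2.0 EM/F1 via the `evaluate` library.
# The prediction/reference records below are invented for illustration.
import evaluate

squad_v2_metric = evaluate.load("squad_v2")

predictions = [
    # SQuAD 2.0 predictions carry a no-answer probability for unanswerable questions.
    {"id": "q1", "prediction_text": "SQuAD 2.0", "no_answer_probability": 0.0},
    {"id": "q2", "prediction_text": "", "no_answer_probability": 1.0},
]
references = [
    {"id": "q1", "answers": {"text": ["SQuAD 2.0"], "answer_start": [0]}},
    {"id": "q2", "answers": {"text": [], "answer_start": []}},  # unanswerable question
]

print(squad_v2_metric.compute(predictions=predictions, references=references))
```

The returned dictionary includes overall as well as answerable/unanswerable ("HasAns"/"NoAns") exact-match and F1 scores, mirroring the text-answer/no-answer split in the removed block.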
+ ## Usage
+
+ ### In Haystack
+ Haystack is an NLP framework by deepset. You can use this model in a Haystack pipeline to do question answering at scale (over many documents). To load the model in [Haystack](https://github.com/deepset-ai/haystack/):
+ ```python
+ from haystack.nodes import FARMReader, TransformersReader
+
+ reader = FARMReader(model_name_or_path="deepset/deberta-v3-base-squad2")
+ # or
+ reader = TransformersReader(model_name_or_path="deepset/deberta-v3-base-squad2", tokenizer="deepset/deberta-v3-base-squad2")
+ ```
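As a rough usage illustration only (not part of this commit, and assuming the Haystack 1.x API, where `FARMReader` lives in `haystack.nodes` and `Document` in `haystack.schema`), the loaded reader can be queried directly against a handful of in-memory documents:

```python
# Hedged sketch (Haystack 1.x assumed); the document texts are invented for illustration.
from haystack.nodes import FARMReader
from haystack.schema import Document

reader = FARMReader(model_name_or_path="deepset/deberta-v3-base-squad2")

docs = [
    Document(content="Haystack is an open-source NLP framework built by deepset."),
    Document(content="The deberta-v3-base model was fine-tuned on SQuAD 2.0 for extractive QA."),
]

# Run extractive QA over the documents and print the best answers with their scores.
result = reader.predict(query="Who builds Haystack?", documents=docs, top_k=2)
for answer in result["answers"]:
    print(answer.answer, answer.score)
```

In a full Haystack 1.x pipeline, the reader is typically combined with a retriever (for example via `ExtractiveQAPipeline`) so that only the most relevant documents are passed to the model.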
+
+ ### In Transformers
+ ```python
+ from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline
+
+ model_name = "deepset/deberta-v3-base-squad2"
+
+ # a) Get predictions with the question-answering pipeline
+ nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)
+ QA_input = {
+     'question': 'Why is model conversion important?',
+     'context': 'The option to convert models between FARM and transformers gives freedom to the user and lets people easily switch between frameworks.'
+ }
+ res = nlp(QA_input)
+
+ # b) Load the model and tokenizer directly
+ model = AutoModelForQuestionAnswering.from_pretrained(model_name)
+ tokenizer = AutoTokenizer.from_pretrained(model_name)
+ ```
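For completeness, and purely as an illustrative sketch that is not part of this commit, the same prediction can be reproduced without the pipeline by running the loaded model and tokenizer by hand: extractive QA picks the span between the highest-scoring start and end tokens, and for SQuAD 2.0 models a span that collapses onto the special start-of-sequence token is conventionally read as "no answer".

```python
# Hedged sketch of manual inference with the model and tokenizer loaded above;
# the question/context strings are just the example from the model card.
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_name = "deepset/deberta-v3-base-squad2"
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

question = "Why is model conversion important?"
context = ("The option to convert models between FARM and transformers gives freedom "
           "to the user and lets people easily switch between frameworks.")

# Encode the question/context pair and run a forward pass.
inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Most likely answer span: argmax over start and end logits, then decode those tokens.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0][start:end + 1], skip_special_tokens=True)
print(answer)
```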
+
+ ## About us
+ <div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
+ <div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
+ <img alt="" src="https://huggingface.co/spaces/deepset/README/resolve/main/haystack-logo-colored.svg" class="w-40"/>
+ </div>
+ <div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
+ <img alt="" src="https://huggingface.co/spaces/deepset/README/resolve/main/deepset-logo-colored.svg" class="w-40"/>
+ </div>
+ </div>
+ [deepset](http://deepset.ai/) is the company behind the open-source NLP framework [Haystack](https://haystack.deepset.ai/), which is designed to help you build production-ready NLP systems that use question answering, summarization, ranking, etc.
+
+ Some of our other work:
+ - [Distilled roberta-base-squad2 (aka "tinyroberta-squad2")](https://huggingface.co/deepset/tinyroberta-squad2)
+ - [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
+ - [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
+
+ ## Get in touch and join the Haystack community
+
+ <p>For more info on Haystack, visit our <strong><a href="https://github.com/deepset-ai/haystack">GitHub</a></strong> repo and <strong><a href="https://haystack.deepset.ai">Documentation</a></strong>.
+
+ We also have a <strong><a class="h-7" href="https://haystack.deepset.ai/community/join"><img alt="slack" class="h-7 inline-block m-0" style="margin: 0" src="https://huggingface.co/spaces/deepset/README/resolve/main/Slack_RGB.png"/>community open to everyone!</a></strong></p>
+
+ [Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Slack](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)