changyeop2 committed on
Commit 9660cb4
1 Parent(s): 3aa3bad

add readme

Files changed (1): README.md (+49, −0)
README.md ADDED
---
datasets:
- KLUE-MRC
license: cc-by-sa-4.0
---

# bert-base for QA

NOTE: You can try the model through the [Ainize DEMO](https://main-klue-mrc-bert-scy6500.endpoint.ainize.ai/), and you can call the API through the [Ainize API](https://ainize.ai/scy6500/KLUE-MRC-BERT?branch=main).

## Overview

**Language model:** klue/bert-base
**Language:** Korean
**Downstream task:** Extractive QA
**Training data:** KLUE-MRC
**Eval data:** KLUE-MRC
**Code:** See [Ainize Workspace]()

## Usage

### In Transformers

```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./mrc-bert-base")
model = AutoModelForQuestionAnswering.from_pretrained("./mrc-bert-base")

context = "your context"
question = "your question"

# return_tensors="pt" is required so the encodings can be fed
# to the model directly as PyTorch tensors.
encodings = tokenizer(context, question, max_length=512, truncation=True,
                      padding="max_length", return_token_type_ids=False,
                      return_tensors="pt")

input_ids = encodings["input_ids"]
attention_mask = encodings["attention_mask"]

pred = model(input_ids, attention_mask=attention_mask)

start_logits, end_logits = pred.start_logits, pred.end_logits

# The answer span runs from the most likely start token
# to the most likely end token.
token_start_index = start_logits.argmax(dim=-1).item()
token_end_index = end_logits.argmax(dim=-1).item()

pred_ids = input_ids[0][token_start_index : token_end_index + 1]

prediction = tokenizer.decode(pred_ids)
```
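The span-selection step at the end (argmax over the start/end logits, then slicing the input ids) can be tried in isolation, without downloading the model. A minimal sketch with made-up logits and hypothetical token ids standing in for real model output:

```python
import numpy as np

# Dummy start/end logits for a 10-token sequence
# (in practice these come from the QA head of the model).
start_logits = np.array([0.1, 0.2, 5.0, 0.3, 0.1, 0.0, 0.2, 0.1, 0.0, 0.1])
end_logits = np.array([0.0, 0.1, 0.2, 0.3, 4.0, 0.1, 0.0, 0.2, 0.1, 0.0])

# The predicted answer span runs between the two argmaxes.
token_start_index = int(start_logits.argmax())
token_end_index = int(end_logits.argmax())

# Hypothetical token ids for the same sequence; the slice is
# what tokenizer.decode() would turn back into answer text.
input_ids = list(range(100, 110))
pred_ids = input_ids[token_start_index : token_end_index + 1]
print(token_start_index, token_end_index, pred_ids)  # 2 4 [102, 103, 104]
```

Note that this greedy argmax can pick an end index before the start index; production QA pipelines score all valid (start ≤ end) pairs instead.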

## About us

[Teachable NLP](https://ainize.ai/teachable-nlp) - Train NLP models with your own text without writing any code
[Ainize](https://ainize.ai/) - Deploy ML projects using free GPUs