import gradio as gr
import torch
from transformers import BertForQuestionAnswering, BertTokenizerFast

# Use the GPU when one is available.
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# The fine-tuned QA model reuses the stock bert-base-uncased tokenizer.
tokenizer = BertTokenizerFast.from_pretrained('bert-base-uncased')
model = BertForQuestionAnswering.from_pretrained("CountingMstar/ai-tutor-bert-model").to(device)
model.eval()  # inference only; disables dropout
def get_prediction(context, question):
    # Encode the (question, context) pair; truncate to BERT's 512-token limit.
    inputs = tokenizer.encode_plus(question, context, return_tensors='pt',
                                   truncation=True).to(device)
    with torch.no_grad():
        outputs = model(**inputs)
    # The model scores every token as a possible answer start/end;
    # take the argmax of each and decode the span between them.
    answer_start = torch.argmax(outputs.start_logits)
    answer_end = torch.argmax(outputs.end_logits) + 1
    answer = tokenizer.convert_tokens_to_string(
        tokenizer.convert_ids_to_tokens(inputs['input_ids'][0][answer_start:answer_end])
    )
    return answer
def submit(context, question):
    # Gradio callback: returns the extracted answer for the given inputs.
    return get_prediction(context, question)
examples_text = [
["A large language model (LLM) is a type of language model notable for its ability to achieve general-purpose language understanding and generation. LLMs acquire these abilities by using massive amounts of data to learn billions of parameters during training and consuming large computational resources during their training and operation.[1] LLMs are artificial neural networks (mainly transformers[2]) and are (pre-)trained using self-supervised learning and semi-supervised learning.","What is large language model?"],
["Feature engineering or feature extraction or feature discovery is the process of extracting features (characteristics, properties, attributes) from raw data. Due to deep learning networks, such as convolutional neural networks, that are able to learn features by themselves, domain-specific-based feature engineering has become obsolete for vision and speech processing. Other examples of features in physics include the construction of dimensionless numbers such as Reynolds number in fluid dynamics; then Nusselt number in heat transfer; Archimedes number in sedimentation; construction of first approximations of the solution such as analytical strength of materials solutions in mechanics, etc.", "What is Feature engineering?"],
["It calculates soft weights for each word, more precisely for its embedding, in the context window. It can do it either in parallel (such as in transformers) or sequentially (such as recurrent neural networks). Soft weights can change during each runtime, in contrast to hard weights, which are (pre-)trained and fine-tuned and remain frozen afterwards. Attention was developed to address the weaknesses of recurrent neural networks, where words in a sentence are slowly processed one at a time. Machine learning-based attention is a mechanism mimicking cognitive attention. Recurrent neural networks favor more recent words at the end of a sentence while earlier words fade away in volatile neural activations. Attention gives all words equal access to any part of a sentence in a faster parallel scheme and no longer suffers the wait time of serial processing. Earlier uses attached this mechanism to a serial recurrent neural network's language translation system (below), but later uses in Transformers large language models removed the recurrent neural network and relied heavily on the faster parallel attention scheme.", "What is Attention mechanism?"]
]
# Input components; gr.Interface takes a list of components directly,
# so no gr.Row container is needed.
input_textbox = gr.Textbox(label="Context", placeholder="Enter context here")
question_textbox = gr.Textbox(label="Question", placeholder="Enter question here")
markdown_text = """
# AI Tutor BERT
This model is a BERT model fine-tuned on artificial intelligence (AI) terms and their explanations.
## Model
https://huggingface.co/bert-base-uncased
For the model we used BERT, developed by Google and among the best-known natural language processing models; see the site above for details. Befitting a tutor whose main job is answering questions, we used the BERT variant specialized for question answering.
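
For reference, here is a minimal sketch of querying the fine-tuned model outside this demo with the standard `transformers` question-answering pipeline. Pairing it with the stock `bert-base-uncased` tokenizer is an assumption carried over from this app, not something the model card states:

```python
# Sketch: direct use of the fine-tuned QA model via the transformers
# pipeline; the tokenizer pairing is an assumption from this app.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="CountingMstar/ai-tutor-bert-model",
    tokenizer="bert-base-uncased",
)

result = qa(
    question="What is large language model?",
    context="A large language model (LLM) is a type of language model "
            "notable for its ability to achieve general-purpose language "
            "understanding and generation.",
)
print(result["answer"], result["score"])
```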
## Dataset
### Wikipedia
https://en.wikipedia.org/wiki/Main_Page
### activeloop
https://www.activeloop.ai/resources/glossary/arima-models/
### Adrien Beaulieu
https://product.house/100-ai-glossary-terms-explained-to-the-rest-of-us/
The training dataset consists of three parts: AI-related contexts, questions, and answers. The answer (ground-truth) data is contained within the context data, and the data was augmented by shuffling the sentence order of each context. Each question is the AI term the passage is about; the examples above should make the format easy to follow. There are about 3,300 examples in total, stored as pickle files in the data folder. They were extracted and processed from the HTML of Wikipedia and the other sites listed above.
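
As an illustration only (the real pickle files in the data folder may use a different schema), one record and the sentence-shuffling augmentation described above could look like this:

```python
# Hypothetical record layout and augmentation sketch; the answer span
# lives inside the context, and shuffling whole sentences preserves it.
import random

record = {
    "context": ("Feature engineering is the process of extracting features "
                "from raw data. It precedes model training. Good features "
                "often matter more than model choice."),
    "question": "What is Feature engineering?",
    "answer": "the process of extracting features from raw data",
}

def shuffle_sentences(context: str) -> str:
    # Split on sentence boundaries, reorder, and rejoin.
    sentences = [s.strip().rstrip(".") for s in context.split(".") if s.strip()]
    random.shuffle(sentences)
    return ". ".join(sentences) + "."

augmented = dict(record, context=shuffle_sentences(record["context"]))
assert record["answer"] in augmented["context"]
```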
## How to use
Input examples are provided under 'Examples'. Enter a related passage in 'Context' and the term you want defined in 'Question', then press the 'Submit' button; an explanation of that term will appear.
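
If you prefer to call the demo programmatically rather than through the UI, a sketch with `gradio_client` might look like the following (the Space id here is hypothetical; substitute the actual one):

```python
# Sketch only: programmatic access via gradio_client.
from gradio_client import Client

client = Client("CountingMstar/AI-Tutor-BERT")  # hypothetical Space id
answer = client.predict(
    "A large language model (LLM) is a type of language model ...",  # Context
    "What is large language model?",  # Question
)
print(answer)
```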
"""
iface = gr.Interface(
    fn=submit,
    inputs=[input_textbox, question_textbox],
    outputs=gr.Textbox(label="Answer"),
    examples=examples_text,
    title="BERT Question Answering",
    article=markdown_text,  # show the write-up below the demo
)
iface.launch()