CountingMstar committed: Update README.md
Commit 9dab2e1 · 1 parent: 1dada4c
README.md CHANGED
@@ -15,6 +15,17 @@ This model is a BERT model fine-tuned on artificial intelligence (AI) related te
 
 With the increasing interest in artificial intelligence, many people are taking AI-related courses and working on AI projects. However, even as a graduate student in artificial intelligence, I find it hard to come across useful resources that are easy for AI beginners to understand. Furthermore, personalized lessons tailored to individual levels and fields are often lacking, making it difficult for many people to start learning about artificial intelligence. To address these challenges, our team has created a language model that plays the role of a tutor in the field of AI terminology. Details about the model type, training dataset, and usage are explained below, so please read them carefully and be sure to try it out.
 
+
+## How to use?
+
+
+<img src="https://github.com/CountingMstar/AI_BERT/assets/90711707/45afcd24-7ef9-4149-85d4-2236e23fbf69" width="1400" height="700"/>
+https://huggingface.co/spaces/pseudolab/AI_Tutor_BERT
+
+
+As shown above, you can input a passage (context) related to artificial intelligence and a question about a term. Upon pressing "Submit," you will receive the corresponding explanation and answer on the right side.
+
+
 ## Model
 https://huggingface.co/bert-base-uncased
 
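The added section above describes the Space's interaction: you supply an AI-related passage as context plus a question about a term, and the model returns an answer. For readers who want to try the same interaction locally, here is a minimal sketch using the Hugging Face transformers question-answering pipeline. Because this diff does not name the fine-tuned checkpoint, the sketch loads the bert-base-uncased base model from the Model section as a stand-in; swapping in the project's fine-tuned weights is assumed for meaningful answers.

```python
# Minimal local sketch of the context + question interaction described above.
# NOTE: bert-base-uncased is only the base model named in the Model section;
# the fine-tuned checkpoint's location is not given in this diff, so point
# `model` at the project's fine-tuned weights for real use.
from transformers import pipeline

qa = pipeline("question-answering", model="bert-base-uncased")

context = (
    "Reinforcement learning is a machine learning paradigm in which an agent "
    "learns to act by maximizing a reward signal received from its environment."
)
question = "What is reinforcement learning?"

print(qa(question=question, context=context))
# Returns a dict like {'score': ..., 'start': ..., 'end': ..., 'answer': '...'}
```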
@@ -85,13 +96,6 @@ I used 10 epochs for training, and I employed the Adam optimizer with a learning
 The results, as shown in the graphs above, indicate that at the last epoch the loss is 6.917126256477786 and the accuracy is 0.9819078947368421, demonstrating that the model has been trained quite effectively.
 
 
-## How to use?
-https://github.com/CountingMstar/AI_BERT/blob/main/MY_AI_BERT_final.ipynb
-
-
-You can load the trained model through the training process described above and use it as needed.
-
-
 Thank you.
 
 
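The hunk header above references the training setup (10 epochs, the Adam optimizer) and the removed lines point to the notebook that produces the trained model. As a rough, hedged illustration only, a fine-tuning step for a BERT question-answering head might look like the sketch below; the learning rate, toy example, answer span, and save path are assumptions, not values taken from the project's notebook.

```python
# Rough sketch of a BERT question-answering fine-tuning setup along the lines the
# README describes (Adam optimizer, 10 epochs). Learning rate, toy example, answer
# span, and save path are illustrative assumptions, not values from the notebook.
import torch
from torch.optim import Adam
from transformers import BertForQuestionAnswering, BertTokenizerFast

model = BertForQuestionAnswering.from_pretrained("bert-base-uncased")
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
optimizer = Adam(model.parameters(), lr=3e-5)  # assumed learning rate

# One toy SQuAD-style example standing in for the real AI-terminology dataset.
context = "Backpropagation is the algorithm used to compute gradients in neural networks."
question = "What is backpropagation?"
enc = tokenizer(question, context, return_tensors="pt", truncation=True)
start_idx, end_idx = 10, 14  # toy placeholder answer span, not a real annotation

model.train()
for epoch in range(10):  # the README reports 10 training epochs
    optimizer.zero_grad()
    outputs = model(
        **enc,
        start_positions=torch.tensor([start_idx]),
        end_positions=torch.tensor([end_idx]),
    )
    outputs.loss.backward()
    optimizer.step()

model.save_pretrained("ai_bert_finetuned")  # reload later with from_pretrained()
```

The saved directory can then be reloaded with BertForQuestionAnswering.from_pretrained(), which is one concrete reading of the removed line "You can load the trained model through the training process described above and use it as needed."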
@@ -107,6 +111,7 @@ Thank you.
 
 
 <img src="https://github.com/CountingMstar/AI_BERT/assets/90711707/45afcd24-7ef9-4149-85d4-2236e23fbf69" width="1400" height="700"/>
+https://huggingface.co/spaces/pseudolab/AI_Tutor_BERT
 
 
 As shown in the figure above, enter an AI-related passage (context) and a question about a term, press Submit, and an explanatory answer for that term will appear on the right.
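For completeness, the hosted Space linked in the hunk above can also be queried programmatically rather than through the Submit button. The sketch below uses the gradio_client package; the endpoint name and argument order are assumptions, so check client.view_api() for the Space's actual signature.

```python
# Hypothetical programmatic access to the hosted Space instead of the web UI.
# The endpoint name and argument order are assumptions; inspect view_api() first.
from gradio_client import Client

client = Client("pseudolab/AI_Tutor_BERT")
print(client.view_api())  # shows the Space's real endpoints and parameters

context = "A transformer is a neural network architecture built around self-attention."
question = "What is a transformer?"

# Assumes a single endpoint taking (context, question); adjust to match view_api().
result = client.predict(context, question, api_name="/predict")
print(result)
```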