Rifky committed
Commit b819f62
1 Parent(s): edce5d9

Update README.md

Files changed (1)
  1. README.md +15 -18
README.md CHANGED
@@ -1,26 +1,23 @@
  ---
- language: id
- tags:
- - indobert
- - indolem
  license: apache-2.0
  datasets:
- - 220M words (IndoWiki, IndoWC, News)
- - Squad 2.0 (Indonesian translated)
- widget:
- - text: kapan pangeran diponegoro lahir?
-   context: Pangeran Harya Dipanegara (atau biasa dikenal dengan nama Pangeran Diponegoro,
-     lahir di Ngayogyakarta Hadiningrat, 11 November 1785 – meninggal di Makassar,
-     Hindia Belanda, 8 Januari 1855 pada umur 69 tahun) adalah salah seorang pahlawan
-     nasional Republik Indonesia, yang memimpin Perang Diponegoro atau Perang Jawa
-     selama periode tahun 1825 hingga 1830 melawan pemerintah Hindia Belanda. Sejarah
-     mencatat, Perang Diponegoro atau Perang Jawa dikenal sebagai perang yang menelan
-     korban terbanyak dalam sejarah Indonesia, yakni 8.000 korban serdadu Hindia Belanda,
-     7.000 pribumi, dan 200 ribu orang Jawa serta kerugian materi 25 juta Gulden.
+ - rajpurkar/squad_v2
+ language:
+ - id
+ metrics:
+ - f1
+ base_model:
+ - indolem/indobert-base-uncased
  ---
  [Github](https://github.com/rifkybujana/IndoBERT-QA)

- This project is part of my research with my friend Muhammad Fajrin Buyang Daffa entitled "Teman Belajar : Asisten Digital Pelajar SMA Negeri 28 Jakarta dalam Membaca" for KOPSI (Kompetisi Penelitian Siswa Indonesia/Indonesian Student Research Competition).
+ **Notice of Attribution Clarification**
+
+ This is to clarify that **Muhammad Fajrin Buyang Daffa** is not, and has never been, a part of this project. They have made no contributions to this repository, and as such, shall not be given any attribution in relation to this work.
+
+ For further inquiries, please contact rifky@genta.tech.
+
+ > This project is part of my research entitled "Teman Belajar : Asisten Digital Pelajar SMA Negeri 28 Jakarta dalam Membaca" for KOPSI (Kompetisi Penelitian Siswa Indonesia/Indonesian Student Research Competition).

  ## indoBERT Base-Uncased fine-tuned on Translated Squad v2.0
  [IndoBERT](https://huggingface.co/indolem/indobert-base-uncased) trained by [IndoLEM](https://indolem.github.io/) and fine-tuned on [Translated SQuAD 2.0](https://github.com/Wikidepia/indonesian_datasets/tree/master/question-answering/squad) for **Q&A** downstream task.
@@ -81,4 +78,4 @@ qa_pipeline({
  ```

  ### Reference
- <a id="1">[1]</a>Fajri Koto and Afshin Rahimi and Jey Han Lau and Timothy Baldwin. 2020. IndoLEM and IndoBERT: A Benchmark Dataset and Pre-trained Language Model for Indonesian NLP. Proceedings of the 28th COLING.
+ <a id="1">[1]</a> Fajri Koto, Afshin Rahimi, Jey Han Lau, and Timothy Baldwin. 2020. IndoLEM and IndoBERT: A Benchmark Dataset and Pre-trained Language Model for Indonesian NLP. In Proceedings of the 28th COLING.
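
The second hunk header above references the README's `qa_pipeline({ ... })` usage block, which is not itself shown in this diff. As a minimal sketch of that kind of usage with the Transformers question-answering pipeline: the Hub model id `Rifky/Indobert-QA` is an assumption (only the GitHub repository is linked in the diff), and the question/context are reused from the widget example removed in this commit.

```python
# Minimal sketch, not taken from the diff: querying the fine-tuned checkpoint with
# the Transformers question-answering pipeline. The Hub id "Rifky/Indobert-QA" is
# an assumed placeholder; substitute the actual model id for this repository.
from transformers import pipeline

qa_pipeline = pipeline(
    "question-answering",
    model="Rifky/Indobert-QA",
    tokenizer="Rifky/Indobert-QA",
)

# Question and context reused from the widget example removed in this commit.
result = qa_pipeline({
    "question": "kapan pangeran diponegoro lahir?",
    "context": (
        "Pangeran Harya Dipanegara (atau biasa dikenal dengan nama Pangeran "
        "Diponegoro, lahir di Ngayogyakarta Hadiningrat, 11 November 1785) adalah "
        "salah seorang pahlawan nasional Republik Indonesia."
    ),
})
print(result)  # e.g. {"score": ..., "start": ..., "end": ..., "answer": "11 November 1785"}
```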