# ReviewBERT

BERT (post-)trained on review corpora to understand sentiment, opinions, and various e-commerce aspects.

`BERT-DK_laptop` is trained on a 100MB laptop corpus under `Electronics/Computers & Accessories/Laptops`.
`BERT-PT_*` additionally uses SQuAD 1.1.

## Model Description

The base model is `BERT-base-uncased`, trained on Wikipedia and BookCorpus.
The models here are post-trained on the [Amazon Dataset](http://jmcauley.ucsd.edu/data/amazon/) and the [Yelp Dataset](https://www.yelp.com/dataset/challenge/).
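
The card itself does not include post-training code. As a rough sketch only (not the authors' actual pipeline, which also uses SQuAD 1.1 for the `BERT-PT_*` variants), domain post-training amounts to continuing BERT's masked-LM objective on review text; the sentences and settings below are illustrative:

```python
from transformers import (BertForMaskedLM, BertTokenizerFast,
                          DataCollatorForLanguageModeling)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Illustrative sentences standing in for the Amazon/Yelp review corpora.
reviews = [
    "the keyboard is comfortable and the screen is sharp.",
    "battery drains quickly when gaming.",
]
batch = tokenizer(reviews, padding=True, truncation=True, return_tensors="pt")

# Standard BERT masked-LM objective: randomly mask 15% of the tokens.
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)
masked = collator([{"input_ids": ids} for ids in batch["input_ids"]])

outputs = model(input_ids=masked["input_ids"],
                attention_mask=batch["attention_mask"],
                labels=masked["labels"])
outputs.loss.backward()  # one continued-pretraining step (optimizer omitted)
```

A real run would loop this over the full review corpora with an optimizer and learning-rate schedule; see the paper for the actual setup.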

## Instructions

Loading the post-trained weights is as simple as, e.g.:

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("activebus/BERT-PT_laptop")
model = AutoModel.from_pretrained("activebus/BERT-PT_laptop")
```
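
As a short usage sketch (an addition to the original card, with a made-up review sentence), the loaded model produces contextual token representations:

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("activebus/BERT-PT_laptop")
model = AutoModel.from_pretrained("activebus/BERT-PT_laptop")

# Encode a hypothetical laptop review and run a forward pass.
inputs = tokenizer("The battery life of this laptop is great.",
                   return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Contextual token embeddings, shape (batch, seq_len, hidden_size); these can
# serve as features for downstream review comprehension or ABSA models.
print(outputs.last_hidden_state.shape)
```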

## Evaluation Results

For evaluation results, see our [NAACL paper](https://www.aclweb.org/anthology/N19-1242.pdf).

## Citation

If you find this work useful, please cite as follows.

```bibtex
@inproceedings{xu_bert2019,
    title = "BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis",
    author = "Xu, Hu and Liu, Bing and Shu, Lei and Yu, Philip S.",
    booktitle = "Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics",
    month = "jun",
    year = "2019",
}
```